Science.gov

Sample records for peeling wavelet denoising

  1. Birdsong Denoising Using Wavelets

    PubMed Central

    Priyadarshani, Nirosha; Marsland, Stephen; Castro, Isabel; Punchihewa, Amal

    2016-01-01

    Automatic recording of birdsong is becoming the preferred way to monitor and quantify bird populations worldwide. Programmable recorders allow recordings to be obtained at all times of day and year for extended periods of time. Consequently, there is a critical need for robust automated birdsong recognition. One prominent obstacle to achieving this is low signal to noise ratio in unattended recordings. Field recordings are often very noisy: birdsong is only one component in a recording, which also includes noise from the environment (such as wind and rain), other animals (including insects), and human-related activities, as well as noise from the recorder itself. We describe a method of denoising using a combination of the wavelet packet decomposition and band-pass or low-pass filtering, and present experiments that demonstrate an order of magnitude improvement in noise reduction over natural noisy bird recordings. PMID:26812391
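
    A minimal sketch of the kind of pipeline described above (wavelet packet decomposition with thresholding followed by band-pass filtering), assuming NumPy, PyWavelets and SciPy. The wavelet, decomposition depth, universal threshold and the 1-8 kHz pass band are illustrative assumptions, not the authors' actual settings.

```python
import numpy as np
import pywt
from scipy.signal import butter, sosfiltfilt

def denoise_birdsong(x, fs, wavelet="db10", level=5, band=(1000.0, 8000.0)):
    """Wavelet-packet thresholding followed by band-pass filtering (sketch)."""
    # Wavelet packet decomposition of the recording
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
    # Noise scale estimated from the finest detail coefficients of a plain DWT
    d1 = pywt.wavedec(x, wavelet, level=1)[-1]
    sigma = np.median(np.abs(d1)) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))          # universal threshold (assumption)
    # Soft-threshold every leaf node, then reconstruct
    for node in wp.get_level(level, order="natural"):
        node.data = pywt.threshold(node.data, thr, mode="soft")
    y = wp.reconstruct(update=True)[: len(x)]
    # Band-pass filter to a typical birdsong band (assumption)
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, y)
```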

  2. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    NASA Astrophysics Data System (ADS)

    Fan, W. J.; Lu, Y.

    2006-10-01

    Wavelet denoising is studied to improve the extraction of Fourier information from optical aperture synthesis (OAS) objects. Translation-invariant wavelet denoising, built on Donoho's wavelet soft-threshold denoising, is investigated to remove the pseudo-Gibbs artifacts produced by soft thresholding. Extraction of OAS object information based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising improves the precision and repeatability of object information extracted from the interferogram, and that extraction based on translation-invariant wavelet denoising outperforms extraction based on soft-threshold wavelet denoising alone.
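
    The following sketch illustrates the translation-invariant idea the abstract relies on: soft-threshold denoising (Donoho's universal threshold) averaged over circular shifts of the signal (cycle spinning), which suppresses pseudo-Gibbs oscillations. It assumes PyWavelets; the wavelet and the number of shifts are illustrative choices.

```python
import numpy as np
import pywt

def soft_threshold_denoise(x, wavelet="sym8", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise scale from finest details
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))           # Donoho universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def cycle_spin_denoise(x, shifts=16, **kwargs):
    # Average the denoised results over circular shifts to approximate shift invariance.
    acc = np.zeros(len(x), dtype=float)
    for s in range(shifts):
        acc += np.roll(soft_threshold_denoise(np.roll(x, s), **kwargs), -s)
    return acc / shifts
```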

  3. Time Difference of Arrival (TDOA) Estimation Using Wavelet Based Denoising

    DTIC Science & Technology

    1999-03-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Time Difference of Arrival (TDOA) Estimation Using Wavelet Based Denoising, by Unal Aktas. ... difference of arrival (TDOA) method. The wavelet transform is used to increase the accuracy of TDOA estimation. Several denoising techniques based on ...

  4. Parallel object-oriented, denoising system using wavelet multiresolution analysis

    DOEpatents

    Kamath, Chandrika; Baldwin, Chuck H.; Fodor, Imola K.; Tang, Nu A.

    2005-04-12

    The present invention provides a data de-noising system utilizing multiple processors and wavelet denoising techniques. Data is read and displayed in different formats. The data is partitioned into regions, and the regions are distributed onto the processors. Communication requirements among the processors are determined according to the wavelet denoising technique and the partitioning of the data. The data is transformed onto different multiresolution levels with the wavelet transform, according to the denoising technique and the communication requirements, producing transformed data containing wavelet coefficients. After denoising, the data is transformed back into its original format for reading and display.

  5. Optimal wavelet denoising for smart biomonitor systems

    NASA Astrophysics Data System (ADS)

    Messer, Sheila R.; Agzarian, John; Abbott, Derek

    2001-03-01

    Future smart-systems promise many benefits for biomedical diagnostics. The ideal is for simple portable systems that display and interpret information from smart integrated probes or MEMS-based devices. In this paper, we will discuss a step towards this vision with a heart bio-monitor case study. An electronic stethoscope is used to record heart sounds and the problem of extracting noise from the signal is addressed via the use of wavelets and averaging. In our example of heartbeat analysis, phonocardiograms (PCGs) have many advantages in that they may be replayed and analysed for spectral and frequency information. Many sources of noise may pollute a PCG including foetal breath sounds if the subject is pregnant, lung and breath sounds, environmental noise and noise from contact between the recording device and the skin. Wavelets can be employed to denoise the PCG. The signal is decomposed by a discrete wavelet transform. Due to the efficient decomposition of heart signals, their wavelet coefficients tend to be much larger than those due to noise. Thus, coefficients below a certain level are regarded as noise and are thresholded out. The signal can then be reconstructed without significant loss of information in the signal. The questions that this study attempts to answer are which wavelet families, levels of decomposition, and thresholding techniques best remove the noise in a PCG. The use of averaging in combination with wavelet denoising is also addressed. Possible applications of the Hilbert Transform to heart sound analysis are discussed.
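
    A minimal sketch of the decompose / threshold / reconstruct loop described above, combined with simple averaging of heartbeat segments. It assumes PyWavelets; the wavelet family, decomposition depth and hard-threshold rule are illustrative, not the study's chosen settings.

```python
import numpy as np
import pywt

def denoise_pcg(beat, wavelet="db6", level=5):
    """Denoise one phonocardiogram segment by wavelet coefficient thresholding."""
    coeffs = pywt.wavedec(beat, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(beat)))
    # Keep large coefficients (signal) and zero out the small ones (noise).
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(beat)]

def denoise_and_average(beats):
    # 'beats' is a list of equal-length heartbeat segments from the stethoscope.
    return np.mean([denoise_pcg(b) for b in beats], axis=0)
```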

  6. Denoising solar radiation data using coiflet wavelets

    SciTech Connect

    Karim, Samsul Ariffin Abdul; Janier, Josefina B.; Muthuvalu, Mohana Sundaram; Hasan, Mohammad Khatim; Sulaiman, Jumat; Ismail, Mohd Tahir

    2014-10-24

    Signal denoising and smoothing play an important role in processing signals obtained from experiments or observational data collection. Collected data are usually a mixture of the true signal and some error or noise, which may come from the measurement apparatus or from human error in handling the data. Before the data are used for further processing, this unwanted noise needs to be filtered out, and the wavelet transform is one of the efficient methods for doing so. Because the received solar radiation data fluctuate over time, they contain unwanted oscillations, i.e. noise, that must be removed before the data are used to develop a mathematical model. To apply denoising with the wavelet transform (WT), threshold values need to be calculated. In this paper a new thresholding approach is proposed, using the coiflet2 wavelet with variation diminishing 4. Numerical results show clearly that the new thresholding approach gives better results than the existing approach based on a global thresholding value.

  7. Musculoskeletal ultrasound image denoising using Daubechies wavelets

    NASA Astrophysics Data System (ADS)

    Gupta, Rishu; Elamvazuthi, I.; Vasant, P.

    2012-11-01

    Among the various existing medical imaging modalities, ultrasound promises a bright future because of its ready availability and its use of non-ionizing radiation. In this paper we attempt to denoise ultrasound images using Daubechies wavelets and analyze the results with peak signal to noise ratio (PSNR) and the coefficient of correlation as performance indices. Daubechies wavelets of order 1 to 6 are applied to four different ultrasound bone fracture images at decomposition levels 1 to 3. The resulting images are shown for visual inspection, and the PSNR and coefficient of correlation values are plotted for quantitative analysis.
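
    For reference, the two performance indices used above are commonly computed as follows (a sketch assuming 8-bit images and NumPy):

```python
import numpy as np

def psnr(reference, denoised, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((reference.astype(float) - denoised.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def correlation_coefficient(reference, denoised):
    """Pearson correlation between the original and denoised images."""
    a = reference.astype(float).ravel()
    b = denoised.astype(float).ravel()
    return np.corrcoef(a, b)[0, 1]
```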

  8. Image denoising based on wavelet cone of influence analysis

    NASA Astrophysics Data System (ADS)

    Pang, Wei; Li, Yufeng

    2009-11-01

    Donoho et al. have proposed a method for denoising by thresholding based on the wavelet transform, and the application of their method to image denoising has been extremely successful. However, the method assumes the noise is additive white Gaussian noise, so it is not effective against impulse noise. In this paper, a new image denoising algorithm based on analysis of the wavelet cone of influence (COI) is proposed, which can effectively remove impulse noise and preserve image edges via the undecimated discrete wavelet transform (UDWT). Furthermore, combined with traditional wavelet thresholding denoising, it can also suppress a wider range of noise types, such as Gaussian noise, impulse noise, Poisson noise and other mixed noise. Experimental results illustrate the advantages of this method.

  9. Terahertz digital holography image denoising using stationary wavelet transform

    NASA Astrophysics Data System (ADS)

    Cui, Shan-Shan; Li, Qi; Chen, Guanghao

    2015-04-01

    Terahertz (THz) holography is a frontier technology in the terahertz imaging field. However, reconstructed images of holograms are inherently affected by speckle noise, on account of the coherent nature of light scattering. The stationary wavelet transform (SWT) is an effective tool for speckle noise removal. In this paper, two SWT-based algorithms originally developed for despeckling SAR images, one using threshold estimation and the other a smoothing operation, are applied to THz images. The denoised images are then quantitatively assessed by the speckle index. Experimental results show that the stationary wavelet transform offers better denoising performance and image detail preservation than the discrete wavelet transform. For the threshold-estimation approach, higher decomposition levels are needed for a better denoising result. The smoothing operation combined with the stationary wavelet transform gives the best denoising effect at a single decomposition level, with 5×5 average filtering.
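
    A sketch of threshold-based denoising in the stationary (undecimated) wavelet domain, the transform used above, assuming PyWavelets. The wavelet, level count and universal threshold are illustrative assumptions; the paper's SAR-derived threshold estimation and 5×5 smoothing step are not reproduced here.

```python
import numpy as np
import pywt

def swt2_denoise(image, wavelet="db4", level=2):
    """Soft-threshold the detail sub-bands of a 2-D stationary wavelet transform."""
    img = image.astype(float)
    # Note: pywt.swt2 requires image dimensions divisible by 2**level.
    coeffs = pywt.swt2(img, wavelet, level=level)
    # Noise scale estimated from the finest diagonal band of a single-level DWT.
    sigma = np.median(np.abs(pywt.dwt2(img, wavelet)[1][2])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(img.size))
    denoised = [(approx, tuple(pywt.threshold(d, thr, mode="soft") for d in details))
                for approx, details in coeffs]
    return pywt.iswt2(denoised, wavelet)
```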

  10. Image denoising with the dual-tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Yaseen, Alauldeen S.; Pavlova, Olga N.; Pavlov, Alexey N.; Hramov, Alexander E.

    2016-04-01

    The purpose of this study is to compare image denoising techniques based on real and complex wavelet transforms. Possibilities provided by the classical discrete wavelet transform (DWT) with hard and soft thresholding are considered, and the influences of the wavelet basis and of image resizing are discussed. The quality of image denoising for the standard 2-D DWT and the dual-tree complex wavelet transform (DT-CWT) is studied. It is shown that the DT-CWT outperforms the 2-D DWT with an appropriate selection of the threshold level.
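
    The two shrinkage rules being compared are, written out explicitly (a sketch; the study's threshold selection is not reproduced):

```python
import numpy as np

def hard_threshold(w, t):
    # Keep-or-kill: coefficients below the threshold are set to zero, others pass unchanged.
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    # Kill below the threshold and shrink the survivors toward zero by t.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)
```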

  11. Undecimated Wavelet Transforms for Image De-noising

    SciTech Connect

    Gyaourova, A; Kamath, C; Fodor, I K

    2002-11-19

    A few different approaches exist for computing the undecimated wavelet transform. In this work we construct three undecimated schemes and evaluate their performance for image noise reduction. We use standard wavelet-based de-noising techniques and compare the performance of our algorithms with the original undecimated wavelet transform, as well as with the decimated wavelet transform. Our experiments show that the proposed algorithms achieve a better ratio of noise removal to blurring.

  12. Wavelet Denoising of Mobile Radiation Data

    SciTech Connect

    Campbell, D B

    2008-10-31

    The FY08 phase of this project investigated the merits of video fusion as a method for mitigating the false alarms encountered by vehicle borne detection systems in an effort to realize performance gains associated with wavelet denoising. The fusion strategy exploited the significant correlations which exist between data obtained from radiation detectors and video systems with coincident fields of view. The additional information provided by optical systems can greatly increase the capabilities of these detection systems by reducing the burden of false alarms and through the generation of actionable information. The investigation into the use of wavelet analysis techniques as a means of filtering the gross-counts signal obtained from moving radiation detectors showed promise for vehicle borne systems. However, the applicability of these techniques to man-portable systems is limited due to minimal gains in performance over the rapid feedback available to system operators under walking conditions. Furthermore, the fusion of video holds significant promise for systems operating from vehicles or systems organized into stationary arrays; however, the added complexity and hardware required by this technique renders it infeasible for man-portable systems.

  13. Minimum risk wavelet shrinkage operator for Poisson image denoising.

    PubMed

    Cheng, Wu; Hirakawa, Keigo

    2015-05-01

    The pixel values of images taken by an image sensor are said to be corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients, with the modeling of the coefficients enabled by Skellam distribution analysis. We extend these results by solving for shrinkage operators for the Skellam distribution that minimize the risk functional in the multiscale Poisson image denoising setting. The resulting minimum risk shrinkage operator produces denoised wavelet coefficients with minimum attainable L2 error.

  14. Nonlocal hierarchical dictionary learning using wavelets for image denoising.

    PubMed

    Yan, Ruomei; Shao, Ling; Liu, Yan

    2013-12-01

    Exploiting the sparsity within representation models for images is critical for image denoising. The best currently available denoising methods take advantage of the sparsity from image self-similarity, pre-learned, and fixed representations. Most of these methods, however, still have difficulties in tackling high noise levels or noise models other than Gaussian. In this paper, the multiresolution structure and sparsity of wavelets are employed by nonlocal dictionary learning in each decomposition level of the wavelets. Experimental results show that our proposed method outperforms two state-of-the-art image denoising algorithms on higher noise levels. Furthermore, our approach is more adaptive to the less extensively researched uniform noise.

  15. Image denoising based on wavelets and multifractals for singularity detection.

    PubMed

    Zhong, Junmei; Ning, Ruola

    2005-10-01

    This paper presents a very efficient algorithm for image denoising based on wavelets and multifractals for singularity detection. A challenge of image denoising is how to preserve the edges of an image when reducing noise. By modeling the intensity surface of a noisy image as statistically self-similar multifractal processes and taking advantage of the multiresolution analysis with wavelet transform to exploit the local statistical self-similarity at different scales, the pointwise singularity strength value characterizing the local singularity at each scale was calculated. By thresholding the singularity strength, wavelet coefficients at each scale were classified into two categories: the edge-related and regular wavelet coefficients and the irregular coefficients. The irregular coefficients were denoised using an approximate minimum mean-squared error (MMSE) estimation method, while the edge-related and regular wavelet coefficients were smoothed using the fuzzy weighted mean (FWM) filter aiming at preserving the edges and details when reducing noise. Furthermore, to make the FWM-based filtering more efficient for noise reduction at the lowest decomposition level, the MMSE-based filtering was performed as the first pass of denoising followed by performing the FWM-based filtering. Experimental results demonstrated that this algorithm could achieve both good visual quality and high PSNR for the denoised images.

  16. Wavelet-based denoising using local Laplace prior

    NASA Astrophysics Data System (ADS)

    Rabbani, Hossein; Vafadust, Mansur; Selesnick, Ivan

    2007-09-01

    Although wavelet-based image denoising is a powerful tool for image processing applications, relatively few publications have so far addressed wavelet-based video denoising. The main reason is that standard 3-D data transforms do not provide useful representations with good energy compaction for most video data. For example, the multi-dimensional standard separable discrete wavelet transform (M-D DWT) mixes orientations and motions in its subbands and produces checkerboard artifacts. So, instead of M-D DWT, oriented transforms such as the multi-dimensional complex wavelet transform (M-D DCWT) are usually proposed for video processing. In this paper we use a Laplace distribution with local variance to model the statistical properties of noise-free wavelet coefficients. This distribution is able to simultaneously model the heavy-tailed and intrascale dependency properties of wavelets. Using this model, simple shrinkage functions are obtained employing maximum a posteriori (MAP) and minimum mean squared error (MMSE) estimators. These shrinkage functions are proposed for video denoising in the DCWT domain. The simulation results show that this simple denoising method has impressive performance, both visually and quantitatively.
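
    Under a zero-mean Laplacian prior with locally estimated variance and additive Gaussian noise, the MAP estimator reduces to soft thresholding with a spatially varying threshold sqrt(2)·sigma_n²/sigma_local. The sketch below illustrates that rule; the window size and the local variance estimator are assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def laplace_map_shrink(w, sigma_n, window=7):
    """MAP shrinkage of wavelet coefficients w under a local Laplacian prior (sketch)."""
    # Local signal variance: local power of the noisy coefficients minus the noise variance.
    local_power = uniform_filter(w * w, size=window)
    sigma_local = np.sqrt(np.maximum(local_power - sigma_n ** 2, 1e-12))
    thr = np.sqrt(2.0) * sigma_n ** 2 / sigma_local       # spatially varying threshold
    return np.sign(w) * np.maximum(np.abs(w) - thr, 0.0)  # soft thresholding
```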

  17. Oriented wavelet transform for image compression and denoising.

    PubMed

    Chappelier, Vivien; Guillemot, Christine

    2006-10-01

    In this paper, we introduce a new transform for image processing, based on wavelets and the lifting paradigm. The lifting steps of a unidimensional wavelet are applied along a local orientation defined on a quincunx sampling grid. To maximize energy compaction, the orientation minimizing the prediction error is chosen adaptively. A fine-grained multiscale analysis is provided by iterating the decomposition on the low-frequency band. In the context of image compression, the multiresolution orientation map is coded using a quad tree. The rate allocation between the orientation map and wavelet coefficients is jointly optimized in a rate-distortion sense. For image denoising, a Markov model is used to extract the orientations from the noisy image. As long as the map is sufficiently homogeneous, interesting properties of the original wavelet are preserved such as regularity and orthogonality. Perfect reconstruction is ensured by the reversibility of the lifting scheme. The mutual information between the wavelet coefficients is studied and compared to the one observed with a separable wavelet transform. The rate-distortion performance of this new transform is evaluated for image coding using state-of-the-art subband coders. Its performance in a denoising application is also assessed against the performance obtained with other transforms or denoising methods.

  18. Impedance cardiography signal denoising using discrete wavelet transform.

    PubMed

    Chabchoub, Souhir; Mansouri, Sofienne; Salah, Ridha Ben

    2016-09-01

    Impedance cardiography (ICG) is a non-invasive technique for diagnosing cardiovascular diseases. During acquisition, the ICG signal is often affected by several kinds of noise which distort the determination of the hemodynamic parameters. As a result, doctors cannot recognize the ICG waveform correctly and the diagnosis of cardiovascular diseases becomes inaccurate. The aim of this work is to choose the most suitable method for denoising the ICG signal. Different wavelet families are used to denoise the ICG signal: the Haar, Daubechies (db2, db4, db6, and db8), Symlet (sym2, sym4, sym6, sym8) and Coiflet (coif2, coif3, coif4, coif5) families are tested and evaluated in order to select the most suitable denoising method. The wavelet family with the best performance is then compared with two other denoising methods, one based on Savitzky-Golay filtering and the other on median filtering. Each method is evaluated by means of the signal to noise ratio (SNR), the root mean square error (RMSE) and the percent root mean square difference (PRD). The results show that the Daubechies wavelet db8 has superior noise-reduction performance compared with the other methods.
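
    The three evaluation metrics named above are commonly defined as follows for a clean reference signal x and a denoised signal y (a sketch using NumPy):

```python
import numpy as np

def snr_db(x, y):
    """Signal to noise ratio of the denoised signal y against the reference x, in dB."""
    return 10.0 * np.log10(np.sum(x ** 2) / np.sum((x - y) ** 2))

def rmse(x, y):
    """Root mean square error."""
    return np.sqrt(np.mean((x - y) ** 2))

def prd_percent(x, y):
    """Percent root mean square difference."""
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2))
```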

  19. Multitaper Spectral Analysis and Wavelet Denoising Applied to Helioseismic Data

    NASA Technical Reports Server (NTRS)

    Komm, R. W.; Gu, Y.; Hill, F.; Stark, P. B.; Fodor, I. K.

    1999-01-01

    Estimates of solar normal mode frequencies from helioseismic observations can be improved by using Multitaper Spectral Analysis (MTSA) to estimate spectra from the time series, then using wavelet denoising of the log spectra. MTSA leads to a power spectrum estimate with reduced variance and better leakage properties than the conventional periodogram. Under the assumption of stationarity and mild regularity conditions, the log multitaper spectrum has a statistical distribution that is approximately Gaussian, so wavelet denoising is asymptotically an optimal method to reduce the noise in the estimated spectra. We find that a single m-ν spectrum benefits greatly from MTSA followed by wavelet denoising, and that wavelet denoising by itself can be used to improve m-averaged spectra. We compare estimates using two different 5-taper estimates (Slepian and sine tapers) and the periodogram estimate, for GONG time series at selected angular degrees l. We compare those three spectra with and without wavelet denoising, both visually and in terms of the mode parameters estimated from the pre-processed spectra using the GONG peak-fitting algorithm. The two multitaper estimates give equivalent results. The number of modes fitted well by the GONG algorithm is 20% to 60% larger (depending on l and the temporal frequency) when applied to the multitaper estimates than when applied to the periodogram. The estimated mode parameters (frequency, amplitude and width) are comparable for the three power spectrum estimates, except for modes with very small mode widths (a few frequency bins), where the multitaper spectra broadened the modes compared with the periodogram. We tested the influence of the number of tapers used and found that narrow modes at low n values are broadened to the extent that they can no longer be fit if the number of tapers is too large. For helioseismic time series of this length and temporal resolution, the optimal number of tapers is less than 10.

  20. Denoising and robust nonlinear wavelet analysis

    NASA Astrophysics Data System (ADS)

    Bruce, Andrew G.; Donoho, David L.; Gao, Hong-Ye; Martin, R. D.

    1994-03-01

    In a series of papers, Donoho and Johnstone develop a powerful theory based on wavelets for extracting non-smooth signals from noisy data. Several nonlinear smoothing algorithms are presented which provide high performance for removing Gaussian noise from a wide range of spatially inhomogeneous signals. However, like other methods based on the linear wavelet transform, these algorithms are very sensitive to certain types of non-Gaussian noise, such as outliers. In this paper, we develop outlier resistant wavelet transforms. In these transforms, outliers and outlier patches are localized to just a few scales. By using the outlier resistant wavelet transform, we improve upon the Donoho and Johnstone nonlinear signal extraction methods. The outlier resistant wavelet algorithms are included with the 'S+WAVELETS' object-oriented toolkit for wavelet analysis.

  1. Region-based image denoising through wavelet and fast discrete curvelet transform

    NASA Astrophysics Data System (ADS)

    Gu, Yanfeng; Guo, Yan; Liu, Xing; Zhang, Ye

    2008-10-01

    Image denoising has always been one of the important research topics in the image processing field. In this paper, the fast discrete curvelet transform (FDCT) and the undecimated wavelet transform (UDWT) are combined for image denoising. A noisy image is first denoised by FDCT and UDWT separately. The image is then divided into an edge region and non-edge regions. After that, a wavelet transform is performed on the images denoised by FDCT and UDWT respectively. Finally, the resultant image is fused by taking the edge-region wavelet coefficients of the image denoised by FDCT and the non-edge-region wavelet coefficients of the image denoised by UDWT. The proposed method is validated through numerical experiments on standard test images. The experimental results show that the proposed algorithm outperforms wavelet-based and curvelet-based image denoising methods and preserves linear features well.

  2. Study on Underwater Image Denoising Algorithm Based on Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Jian, Sun; Wen, Wang

    2017-02-01

    This paper analyzes the application of MATLAB in underwater image processing. The transmission characteristics of the underwater laser light signal and the kinds of underwater noise are described, and the common noise suppression algorithms (Wiener filtering, median filtering and average filtering) are introduced. The advantages and disadvantages of each algorithm with respect to image sharpness and edge protection are then compared. A hybrid filter algorithm based on the wavelet transform is proposed that can be used for color image denoising. Finally, the PSNR and NMSE of each algorithm are given, comparing their denoising ability.

  3. Examining Alternatives to Wavelet Denoising for Astronomical Source Finding

    NASA Astrophysics Data System (ADS)

    Jurek, R.; Brown, S.

    2012-08-01

    The Square Kilometre Array and its pathfinders ASKAP and MeerKAT will produce prodigious amounts of data that necessitate automated source finding. The performance of automated source finders can be improved by pre-processing a dataset. In preparation for the WALLABY and DINGO surveys, we have used a test HI datacube constructed from actual Westerbork Telescope noise and WHISP HI galaxies to test the real world improvement of linear smoothing, the Duchamp source finder's wavelet denoising, iterative median smoothing and mathematical morphology subtraction, on intensity threshold source finding of spectral line datasets. To compare these pre-processing methods we have generated completeness-reliability performance curves for each method and a range of input parameters. We find that iterative median smoothing produces the best source finding results for ASKAP HI spectral line observations, but wavelet denoising is a safer pre-processing technique. In this paper we also present our implementations of iterative median smoothing and mathematical morphology subtraction.

  4. Morphology of the Galaxy Distribution from Wavelet Denoising

    NASA Astrophysics Data System (ADS)

    Martínez, Vicent J.; Starck, Jean-Luc; Saar, Enn; Donoho, David L.; Reynolds, Simon C.; de la Cruz, Pablo; Paredes, Silvestre

    2005-11-01

    We have developed a method based on wavelets to obtain the true underlying smooth density from a point distribution. The goal has been to reconstruct the density field in an optimal way, ensuring that the morphology of the reconstructed field reflects the true underlying morphology of the point field, which, as the galaxy distribution, has a genuinely multiscale structure, with near-singular behavior on sheets, filaments, and hot spots. If the discrete distributions are smoothed using Gaussian filters, the morphological properties tend to be closer to those expected for a Gaussian field. The use of wavelet denoising provides us with a unique and more accurate morphological description.

  5. Optimization of dynamic measurement of receptor kinetics by wavelet denoising.

    PubMed

    Alpert, Nathaniel M; Reilhac, Anthonin; Chio, Tat C; Selesnick, Ivan

    2006-04-01

    The most important technical limitation affecting dynamic measurements with PET is low signal-to-noise ratio (SNR). Several reports have suggested that wavelet processing of receptor kinetic data in the human brain can improve the SNR of parametric images of binding potential (BP). However, it is difficult to fully assess these reports because objective standards have not been developed to measure the tradeoff between accuracy (e.g. degradation of resolution) and precision. This paper employs a realistic simulation method that includes all major elements affecting image formation. The simulation was used to derive an ensemble of dynamic PET ligand (11C-raclopride) experiments that was subjected to wavelet processing. A method for optimizing wavelet denoising is presented and used to analyze the simulated experiments. Using optimized wavelet denoising, SNR of the four-dimensional PET data increased by about a factor of two and SNR of three-dimensional BP maps increased by about a factor of 1.5. Analysis of the difference between the processed and unprocessed means for the 4D concentration data showed that more than 80% of voxels in the ensemble mean of the wavelet processed data deviated by less than 3%. These results show that a 1.5x increase in SNR can be achieved with little degradation of resolution. This corresponds to injecting about twice the radioactivity, a maneuver that is not possible in human studies without saturating the PET camera and/or exposing the subject to more than permitted radioactivity.

  6. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good quality electrocardiogram (ECG) signals are utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be contaminated by various noises such as baseline wander, power line interference, and electromagnetic interference during gathering and recording. As ECG signals are non-stationary physiological signals, the wavelet transform has proved to be an effective tool for discarding noise from corrupted signals. A new compromising threshold function, a sigmoid function-based thresholding scheme, is adopted for processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages for noise reduction in ECG signals. It overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising is shown to be more efficient than existing algorithms for ECG signal denoising. The signal to noise ratio, mean square error, and percent root mean square difference are calculated as quantitative tools to verify the denoising performance. The experimental results reveal that the P, Q, R, and S waves of the ECG signals after denoising coincide with those of the original ECG signals when the proposed method is employed.
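
    The abstract does not give the exact sigmoid-based function, but one plausible form of such a compromise is sketched below: a smooth gate passes large coefficients almost unchanged (avoiding the fixed bias of soft thresholding) and drives small ones to zero without the jump at ±T of hard thresholding. The function used in the paper may differ.

```python
import numpy as np

def sigmoid_threshold(w, t, k=10.0):
    """Smooth compromise between hard and soft thresholding (illustrative form)."""
    # k controls how sharp the transition around |w| = t is.
    gate = 1.0 / (1.0 + np.exp(-k * (np.abs(w) - t)))
    return w * gate
```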

  7. Photoacoustic signals denoising of the glucose aqueous solutions using an improved wavelet threshold method

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Xiong, Zhihua

    2016-10-01

    Denoising of the photoacoustic signals of glucose is one of the most important steps in fruit quality identification, because the real-time photoacoustic signals of glucose are easily contaminated by various noises. To remove the noise and other useless information, an improved wavelet threshold function is proposed. Compared with the traditional hard and soft wavelet threshold functions, the improved wavelet threshold function can overcome the pseudo-oscillation effect in the denoised photoacoustic signals, owing to its continuity, and the error between the denoised signals and the original signals is decreased. To validate the feasibility of denoising with the improved wavelet threshold function, simulation experiments based on MATLAB programming were performed. In the simulation experiments, a standard test signal was used, and three other denoising methods were compared with the improved wavelet threshold function. The signal-to-noise ratio (SNR) and the root-mean-square error (RMSE) were used to evaluate the denoising performance. The experimental results demonstrate that the SNR of the improved wavelet threshold function is the largest and its RMSE is the smallest, which verifies that denoising with the improved wavelet threshold function is feasible. Finally, the improved wavelet threshold denoising was used to remove the noise from the photoacoustic signals of the glucose solutions, with very good denoising effect. Therefore, the improved wavelet threshold denoising proposed in this paper has potential value in the field of denoising photoacoustic signals.

  8. Adaptively wavelet-based image denoising algorithm with edge preserving

    NASA Astrophysics Data System (ADS)

    Tan, Yihua; Tian, Jinwen; Liu, Jian

    2006-02-01

    A new wavelet-based image denoising algorithm, which exploits the edge information hidden in the corrupted image, is presented. First, a Canny-like edge detector identifies the edges in each subband. Second, wavelet coefficients in neighboring scales are multiplied to suppress the noise while magnifying the edge information, and the result is used to exclude fake edges; isolated edge pixels are also identified as noise. Then, instead of thresholding, a local window filter in the wavelet domain is used to remove noise, with the variance estimation elaborated to make use of the edge information. This method adapts to local image details and achieves better performance than state-of-the-art methods.

  9. Energy-Based Wavelet De-Noising of Hydrologic Time Series

    PubMed Central

    Sang, Yan-Fang; Liu, Changming; Wang, Zhonggen; Wen, Jun; Shang, Lunyu

    2014-01-01

    De-noising is a substantial issue in hydrologic time series analysis, but it is a difficult task because of the shortcomings of existing methods. In this paper an energy-based wavelet de-noising method is proposed. It removes noise by comparing the energy distribution of the series with a background energy distribution established from a Monte Carlo test. Differing from the wavelet threshold de-noising (WTD) method, which is based on thresholding wavelet coefficients, the proposed method is based on the energy distribution of the series. It can distinguish noise from deterministic components in a series, and the uncertainty of the de-noising result can be quantitatively estimated using a proper confidence interval, which WTD cannot do. Analysis of both synthetic and observed series verified the comparable power of the proposed method and WTD, but the de-noising process of the former is more easily operable. The results also indicate the influence of three key factors (wavelet choice, decomposition level choice and noise content) on wavelet de-noising. The wavelet should be carefully chosen when using the proposed method. The suitable decomposition level for wavelet de-noising should correspond to the deterministic sub-signal of the series with the smallest temporal scale. If too much noise is included in a series, an accurate de-noising result cannot be obtained by the proposed method or by WTD, but such a series would show purely random rather than autocorrelated behavior, so de-noising is no longer needed. PMID:25360533

  11. Denoising portal images by means of wavelet techniques

    NASA Astrophysics Data System (ADS)

    Gonzalez Lopez, Antonio Francisco

    Portal images are used in radiotherapy for the verification of patient positioning. The distinguishing feature of this image type lies in its formation process: the same beam used for patient treatment is used for image formation. The high energy of the photons used in radiotherapy strongly limits the quality of portal images: low contrast between tissues, low spatial resolution and low signal to noise ratio. This Thesis studies the enhancement of these images, in particular the denoising of portal images. The statistical properties of portal images and noise are studied: power spectra, statistical dependencies between image and noise, and marginal, joint and conditional distributions in the wavelet domain. Various denoising methods are then applied to noisy portal images. Methods operating in the wavelet domain are the basis of this Thesis; in addition, the Wiener filter and the non-local means (NLM) filter, operating in the image domain, are used as a reference. Other topics studied in this Thesis are spatial resolution, wavelet processing and image processing in dosimetry for radiotherapy. In this regard, the spatial resolution of portal imaging systems is studied; a new method for determining the spatial resolution of imaging equipment in digital radiology is presented; the calculation of the power spectrum in the wavelet domain is studied; the reduction of uncertainty in film dosimetry is investigated; a method for the dosimetry of small radiation fields with radiochromic film is presented; the optimal signal resolution in the digitization of films is determined as a function of the noise level and the quantization step; and the useful optical density range of a densitometric system is set as a function of the required uncertainty level. The marginal distributions of portal images are similar to those of natural images. This also applies to the statistical relationships between wavelet coefficients, intra-band and inter-band. These facts result in a better

  12. ECG signal denoising via empirical wavelet transform.

    PubMed

    Singh, Omkar; Sunkaria, Ramesh Kumar

    2016-12-29

    This paper presents new methods for baseline wander correction and powerline interference reduction in electrocardiogram (ECG) signals using the empirical wavelet transform (EWT). During data acquisition of the ECG signal, various noise sources such as powerline interference, baseline wander and muscle artifacts contaminate the information-bearing ECG signal. For better analysis and interpretation, the ECG signal must be free of noise. In the present work, a new approach is used to filter baseline wander and power line interference from the ECG signal. The technique utilized is the empirical wavelet transform, a new method used to compute the building modes of a given signal. Its performance as a filter is compared to standard linear filters and empirical mode decomposition. The results show that the EWT delivers better performance.

  13. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on-board low Size, Weight, and Power (SWaP) platforms, must provide high- performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, place a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve modeling fidelity of a previously-developed, computationally-efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWAP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM) which implements a bivariate wavelet shrinkage denoising algorithm that exploits interscale dependency between wavelet coefficients. We formulate optimization problems for parameters controlling deadzone size which leads to improved denoising performance. Two formulations are provided; one with a simple, closed form solution which we use for numerical result generation, and the second as an integral equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery, and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate denoising performance improvement when using the enhanced modeling over performance obtained with the baseline SSM model.
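
    For context, the baseline Sendur-Selesnick bivariate shrinkage rule that the deadzone modeling refines shrinks a child coefficient w1 using its parent w2 from the next coarser scale. The sketch below shows that baseline rule only; the paper's optimized deadzone parameters are not reproduced. sigma_n is the noise standard deviation and sigma a locally estimated signal standard deviation.

```python
import numpy as np

def bivariate_shrink(w1, w2, sigma_n, sigma):
    """Baseline bivariate shrinkage of child coefficient w1 given parent w2 (sketch)."""
    r = np.sqrt(w1 ** 2 + w2 ** 2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n ** 2 / sigma, 0.0) / np.maximum(r, 1e-12)
    return gain * w1
```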

  14. Denoising method of heart sound signals based on self-construct heart sound wavelet

    NASA Astrophysics Data System (ADS)

    Cheng, Xiefeng; Zhang, Zheng

    2014-08-01

    In the field of heart sound signal denoising, the wavelet transform has become one of the most effective measures. The wavelet basis is usually selected from the well-known orthogonal db series or biorthogonal bior series wavelets. In this paper we present a self-constructed wavelet basis which is suitable for heart sound denoising, and analyze its construction method and features in detail according to the characteristics of heart sounds and the evaluation criteria of signal denoising. The experimental results show that the heart sound wavelet can effectively filter out the noise in heart sound signals while preserving the main characteristics of the signal. Compared with the traditional wavelets, it gives a higher signal-to-noise ratio, lower mean square error and better denoising effect.

  15. Image denoising using principal component analysis in the wavelet domain

    NASA Astrophysics Data System (ADS)

    Bacchelli, Silvia; Papi, Serena

    2006-05-01

    In this work we describe a method for removing Gaussian noise from digital images, based on the combination of the wavelet packet transform and the principal component analysis. In particular, since the aim of denoising is to retain the energy of the signal while discarding the energy of the noise, our basic idea is to construct powerful tailored filters by applying the Karhunen-Loeve transform in the wavelet packet domain, thus obtaining a compaction of the signal energy into a few principal components, while the noise is spread over all the transformed coefficients. This allows us to act with a suitable shrinkage function on these new coefficients, removing the noise without blurring the edges and the important characteristics of the images. The results of a large numerical experimentation encourage us to keep going in this direction with our studies.

  16. Wavelet denoising of multiframe optical coherence tomography data

    PubMed Central

    Mayer, Markus A.; Borsdorf, Anja; Wagner, Martin; Hornegger, Joachim; Mardin, Christian Y.; Tornow, Ralf P.

    2012-01-01

    We introduce a novel speckle noise reduction algorithm for OCT images. Contrary to present approaches, the algorithm does not rely on simple averaging of multiple image frames or denoising on the final averaged image. Instead it uses wavelet decompositions of the single frames for a local noise and structure estimation. Based on this analysis, the wavelet detail coefficients are weighted, averaged and reconstructed. At a signal-to-noise gain at about 100% we observe only a minor sharpness decrease, as measured by a full-width-half-maximum reduction of 10.5%. While a similar signal-to-noise gain would require averaging of 29 frames, we achieve this result using only 8 frames as input to the algorithm. A possible application of the proposed algorithm is preprocessing in retinal structure segmentation algorithms, to allow a better differentiation between real tissue information and unwanted speckle noise. PMID:22435103

  17. Study on glucose photoacoustic signals denoising based on a modified wavelet shift-invariance thresholding method

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong

    2016-11-01

    To improve the denoising of glucose photoacoustic signals, a modified wavelet thresholding algorithm combined with shift invariance is used in this paper. To verify the feasibility of the modified wavelet shift-invariant threshold denoising algorithm, simulation experiments were performed. The results show that the denoising effect of the modified wavelet shift-invariant thresholding algorithm is better than that of the other methods, because its signal-to-noise ratio is the largest and its root-mean-square error is the smallest. Finally, the modified wavelet shift-invariant threshold denoising was used to remove the noise from photoacoustic signals of glucose aqueous solutions.

  18. Dual tree complex wavelet transform based denoising of optical microscopy images.

    PubMed

    Bal, Ufuk

    2012-12-01

    Photon shot noise is the main noise source of optical microscopy images and can be modeled by a Poisson process. Several discrete wavelet transform based methods have been proposed in the literature for denoising images corrupted by Poisson noise. However, the discrete wavelet transform (DWT) has disadvantages such as shift variance, aliasing, and lack of directional selectivity. To overcome these problems, a dual tree complex wavelet transform is used in our proposed denoising algorithm. Our denoising algorithm is based on the assumption that for the Poisson noise case threshold values for wavelet coefficients can be estimated from the approximation coefficients. Our proposed method was compared with one of the state of the art denoising algorithms. Better results were obtained by using the proposed algorithm in terms of image quality metrics. Furthermore, the contrast enhancement effect of the proposed method on collagen fiber images is examined. Our method allows fast and efficient enhancement of images obtained under low light intensity conditions.

  19. Obtaining single stimulus evoked potentials with wavelet denoising

    NASA Astrophysics Data System (ADS)

    Quian Quiroga, R.

    2000-11-01

    We present a method for the analysis of electroencephalograms (EEG). In particular, small signals due to stimulation, so-called evoked potentials (EPs), have to be detected in the background EEG. This is achieved by using a denoising implementation based on the wavelet decomposition. One recording of visual evoked potentials, and recordings of auditory evoked potentials from four subjects corresponding to different age groups are analyzed. We find higher variability in older individuals. Moreover, since the EPs are identified at the single stimulus level (without need of ensemble averaging), this will allow the calculation of better resolved averages. Since the method is parameter free (i.e. it does not need to be adapted to the particular characteristics of each recording), implementations in clinical settings are imaginable.

  20. Application of the dual-tree complex wavelet transform in biomedical signal denoising.

    PubMed

    Wang, Fang; Ji, Zhong

    2014-01-01

    In biomedical signal processing, Gibbs oscillation and severe frequency aliasing may occur when using the traditional discrete wavelet transform (DWT). Herein, a new denoising algorithm based on the dual-tree complex wavelet transform (DTCWT) is presented. Electrocardiogram (ECG) signals and heart sound signals are denoised based on the DTCWT. The results prove that the DTCWT is efficient. The signal-to-noise ratio (SNR) and the mean square error (MSE) are used to compare the denoising effect. Results of the paired samples t-test show that the new method can remove noise more thoroughly and better retain the boundary and texture of the signal.

  1. De-noising of digital image correlation based on stationary wavelet transform

    NASA Astrophysics Data System (ADS)

    Guo, Xiang; Li, Yulong; Suo, Tao; Liang, Jin

    2017-03-01

    In this paper, a stationary wavelet transform (SWT) based method is proposed to de-noise digital images affected by light noise, and the SWT de-noising algorithm is presented after an analysis of the light noise. Using the de-noising algorithm, the method is demonstrated to be capable of providing accurate DIC measurements in a light-noise environment. Verification, comparative and realistic experiments were conducted using this method. The results indicate that the de-noising method can be applied to full-field strain measurement under light interference with high accuracy and stability.

  2. Biomedical image and signal de-noising using dual tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Rizi, F. Yousefi; Noubari, H. Ahmadi; Setarehdan, S. K.

    2011-10-01

    The dual tree complex wavelet transform (DTCWT) is a form of discrete wavelet transform which generates complex coefficients by using a dual tree of wavelet filters to obtain their real and imaginary parts. The purpose of de-noising is to reduce the noise level and improve the signal to noise ratio (SNR) without distorting the signal or image. This paper proposes a method for removing white Gaussian noise from ECG signals and biomedical images. The discrete wavelet transform (DWT) is very valuable in a wide scope of de-noising problems. However, it has limitations such as oscillations of the coefficients at a singularity, lack of directional selectivity in higher dimensions, aliasing and consequent shift variance. The complex wavelet transform (CWT) strategy that we focus on in this paper is Kingsbury's and Selesnick's dual tree CWT (DTCWT), which outperforms the critically decimated DWT in a range of applications, such as de-noising. Each complex wavelet is oriented along one of six possible directions, and the magnitude of each complex wavelet has a smooth bell shape. In the final part of this paper, we present biomedical image and signal de-noising by means of thresholding the magnitude of the wavelet coefficients.

  3. An NMR log echo data de-noising method based on the wavelet packet threshold algorithm

    NASA Astrophysics Data System (ADS)

    Meng, Xiangning; Xie, Ranhong; Li, Changxi; Hu, Falong; Li, Chaoliu; Zhou, Cancan

    2015-12-01

    To improve the de-noising effects of low signal-to-noise ratio (SNR) nuclear magnetic resonance (NMR) log echo data, this paper applies the wavelet packet threshold algorithm to the data. The principle of the algorithm is elaborated in detail. By comparing the properties of a series of wavelet packet bases and the relevance between them and the NMR log echo train signal, ‘sym7’ is found to be the optimal wavelet packet basis of the wavelet packet threshold algorithm to de-noise the NMR log echo train signal. A new method is presented to determine the optimal wavelet packet decomposition scale; this is within the scope of its maximum, using the modulus maxima and the Shannon entropy minimum standards to determine the global and local optimal wavelet packet decomposition scales, respectively. The results of applying the method to the simulated and actual NMR log echo data indicate that compared with the wavelet threshold algorithm, the wavelet packet threshold algorithm, which shows higher decomposition accuracy and better de-noising effect, is much more suitable for de-noising low SNR-NMR log echo data.
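
    A simplified sketch of wavelet-packet threshold de-noising of an echo train with 'sym7', using the Shannon entropy of the packet coefficients to pick a decomposition level. This is only an illustration of the scale-selection idea, assuming PyWavelets; the paper's modulus-maxima criterion and its specific threshold rule are not reproduced.

```python
import numpy as np
import pywt

def shannon_entropy(values):
    p = values ** 2
    p = p[p > 0] / np.sum(p)
    return -np.sum(p * np.log(p))

def wp_denoise_echo(echo, wavelet="sym7", max_level=6):
    """Wavelet-packet soft-threshold de-noising with entropy-guided level choice (sketch)."""
    # Pick the level whose packet coefficients have minimum Shannon entropy.
    entropies = []
    for lev in range(1, max_level + 1):
        wp = pywt.WaveletPacket(echo, wavelet, maxlevel=lev)
        coeffs = np.concatenate([n.data for n in wp.get_level(lev, order="natural")])
        entropies.append(shannon_entropy(coeffs))
    best = int(np.argmin(entropies)) + 1
    # Threshold every leaf node at the chosen level and reconstruct.
    wp = pywt.WaveletPacket(echo, wavelet, maxlevel=best)
    sigma = np.median(np.abs(pywt.wavedec(echo, wavelet, level=1)[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(echo)))
    for node in wp.get_level(best, order="natural"):
        node.data = pywt.threshold(node.data, thr, mode="soft")
    return wp.reconstruct(update=True)[: len(echo)]
```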

  4. Wavelet-based fMRI analysis: 3-D denoising, signal separation, and validation metrics.

    PubMed

    Khullar, Siddharth; Michael, Andrew; Correa, Nicolle; Adali, Tulay; Baum, Stefi A; Calhoun, Vince D

    2011-02-14

    We present a novel integrated wavelet-domain based framework (w-ICA) for 3-D denoising functional magnetic resonance imaging (fMRI) data followed by source separation analysis using independent component analysis (ICA) in the wavelet domain. We propose the idea of a 3-D wavelet-based multi-directional denoising scheme where each volume in a 4-D fMRI data set is sub-sampled using the axial, sagittal and coronal geometries to obtain three different slice-by-slice representations of the same data. The filtered intensity value of an arbitrary voxel is computed as an expected value of the denoised wavelet coefficients corresponding to the three viewing geometries for each sub-band. This results in a robust set of denoised wavelet coefficients for each voxel. Given the de-correlated nature of these denoised wavelet coefficients, it is possible to obtain more accurate source estimates using ICA in the wavelet domain. The contributions of this work can be realized as two modules: First, in the analysis module we combine a new 3-D wavelet denoising approach with signal separation properties of ICA in the wavelet domain. This step helps obtain an activation component that corresponds closely to the true underlying signal, which is maximally independent with respect to other components. Second, we propose and describe two novel shape metrics for post-ICA comparisons between activation regions obtained through different frameworks. We verified our method using simulated as well as real fMRI data and compared our results against the conventional scheme (Gaussian smoothing+spatial ICA: s-ICA). The results show significant improvements based on two important features: (1) preservation of shape of the activation region (shape metrics) and (2) receiver operating characteristic curves. It was observed that the proposed framework was able to preserve the actual activation shape in a consistent manner even for very high noise levels in addition to significant reduction in false

  5. Denoising of PET images by combining wavelets and curvelets for improved preservation of resolution and quantitation.

    PubMed

    Le Pogam, A; Hanzouli, H; Hatt, M; Cheze Le Rest, C; Visvikis, D

    2013-12-01

    Denoising of Positron Emission Tomography (PET) images is a challenging task due to the inherent low signal-to-noise ratio (SNR) of the acquired data. A pre-processing denoising step may facilitate and improve the results of further steps such as segmentation, quantification or textural features characterization. Different recent denoising techniques have been introduced and most state-of-the-art methods are based on filtering in the wavelet domain. However, the wavelet transform suffers from some limitations due to its non-optimal processing of edge discontinuities. More recently, a new multi scale geometric approach has been proposed, namely the curvelet transform. It extends the wavelet transform to account for directional properties in the image. In order to address the issue of resolution loss associated with standard denoising, we considered a strategy combining the complementary wavelet and curvelet transforms. We compared different figures of merit (e.g. SNR increase, noise decrease in homogeneous regions, resolution loss, and intensity bias) on simulated and clinical datasets with the proposed combined approach and the wavelet-only and curvelet-only filtering techniques. The three methods led to an increase of the SNR. Regarding the quantitative accuracy however, the wavelet and curvelet only denoising approaches led to larger biases in the intensity and the contrast than the proposed combined algorithm. This approach could become an alternative solution to filters currently used after image reconstruction in clinical systems such as the Gaussian filter.

  6. [Near infrared spectra (NIR) analysis of octane number by wavelet denoising-derivative method].

    PubMed

    Tian, Gao-you; Yuan, Hong-fu; Chu, Xiao-li; Liu, Hui-ying; Lu, Wan-zhen

    2005-04-01

    Derivatives can correct baseline effects but also increase the noise level. The wavelet transform has been proven an efficient tool for de-noising. This paper addresses the application of the wavelet transform and derivatives in the NIR analysis of octane number (RON). The derivative parameters, as well as their effects on the noise level and analytic accuracy of RON, have been studied in detail. The results show that the derivative can correct baseline effects and increase the analytic accuracy. Noise in the derivative spectra is highly detrimental to the analysis of RON. Wavelet de-noising can increase the S/N and improve the analytical accuracy.
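
    A minimal sketch of the de-noise-then-differentiate idea on a synthetic NIR-like spectrum, assuming a db8 wavelet, soft universal thresholding and a Savitzky-Golay first derivative; the paper's actual derivative parameters and wavelet settings are not reproduced here.

```python
# Sketch: wavelet de-noising before differentiation keeps the derivative from
# amplifying high-frequency noise (illustrative parameters only).
import numpy as np
import pywt
from scipy.signal import savgol_filter

def wavelet_denoise(spectrum, wavelet="db8", level=4):
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate (MAD)
    thr = sigma * np.sqrt(2.0 * np.log(len(spectrum)))   # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

# Simulated spectrum: one absorption band on a sloping baseline plus noise.
x = np.linspace(0, 1, 700)
spectrum = np.exp(-((x - 0.5) ** 2) / 0.005) + 0.5 * x + 0.02 * np.random.randn(x.size)

denoised = wavelet_denoise(spectrum)
# The first derivative removes the linear baseline component.
first_derivative = savgol_filter(denoised, window_length=15, polyorder=2, deriv=1)
```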

  7. Denoising performance of modified dual-tree complex wavelet transform for processing quadrature embolic Doppler signals.

    PubMed

    Serbes, Gorkem; Aydin, Nizamettin

    2014-01-01

    Quadrature signals are dual-channel signals obtained from systems employing quadrature demodulation. Embolic Doppler ultrasound signals obtained from stroke-prone patients using Doppler ultrasound systems are quadrature signals caused by emboli, which are particles bigger than red blood cells within the circulatory system. Detection of emboli is an important step in diagnosing stroke. The most widely used parameter in the detection of emboli is the embolic signal-to-background signal ratio. Therefore, in order to increase this ratio, denoising techniques are employed in detection systems. The discrete wavelet transform has been used for denoising of embolic signals, but it lacks the shift-invariance property. Instead, the dual-tree complex wavelet transform, which has a near-shift-invariance property, can be used. However, it is computationally expensive as two wavelet trees are required. The recently proposed modified dual-tree complex wavelet transform, which reduces the computational complexity, can also be used. In this study, the denoising performance of this method is extensively evaluated and compared with the others by using simulated and real quadrature signals. The quantitative results demonstrate that the modified dual-tree-complex-wavelet-transform-based denoising outperforms the conventional discrete wavelet transform at the same level of computational complexity and exhibits almost equal performance to the dual-tree complex wavelet transform at almost half the computational cost.

  8. Study on an improved wavelet shift-invariant threshold denoising for pulsed laser induced glucose photoacoustic signals

    NASA Astrophysics Data System (ADS)

    Wang, Zhengzi; Ren, Zhong; Liu, Guodong

    2015-10-01

    Noninvasive measurement of blood glucose concentration has become a research hotspot worldwide because it is convenient, rapid and non-destructive. Blood glucose concentration monitoring based on the photoacoustic technique has attracted much attention because the detected signals are ultrasonic rather than optical. However, during acquisition the glucose photoacoustic signals are inevitably polluted by factors such as the pulsed laser, electronic noise and ambient noise. These disturbances impact the measurement accuracy of the glucose concentration, so denoising of the glucose photoacoustic signals is a key step. In this paper, a wavelet shift-invariant threshold denoising method is improved, and a novel wavelet threshold function is proposed. For the novel wavelet threshold function, two threshold values and two different factors are set; the function is continuous with high-order derivatives and can be regarded as a compromise between wavelet soft-threshold and hard-threshold denoising. Simulation results illustrate that, compared with other wavelet threshold denoising methods, the improved wavelet shift-invariant threshold denoising has a higher signal-to-noise ratio (SNR) and a smaller root-mean-square error (RMSE), and achieves a better overall denoising effect. Therefore, the improved method has potential value for the denoising of glucose photoacoustic signals.
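
    The paper's specific dual-threshold function is not reproduced here, but the hedged sketch below shows the general form of a smooth threshold that behaves like soft thresholding near the threshold and approaches hard thresholding for large coefficients, which is the kind of soft/hard compromise the abstract describes; the exponential shape and the parameter alpha are illustrative choices.

```python
# Hedged illustration of a smooth soft/hard compromise threshold (an example
# function, not the threshold proposed in the paper).
import numpy as np
import pywt

def smooth_threshold(w, thr, alpha=2.0):
    """Zero below thr; shrinks coefficients just above thr (soft-like) and
    leaves large coefficients nearly untouched (hard-like)."""
    shrink = 1.0 - np.exp(-alpha * np.maximum(np.abs(w) - thr, 0.0) / thr)
    return w * shrink

def denoise(signal, wavelet="sym6", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # MAD noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))      # universal threshold
    coeffs[1:] = [smooth_threshold(c, thr) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```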

  9. Denoising embolic Doppler ultrasound signals using Dual Tree Complex Discrete Wavelet Transform.

    PubMed

    Serbes, Gorkem; Aydin, Nizamettin

    2010-01-01

    Early and accurate detection of asymptomatic emboli is important for monitoring of preventive therapy in stroke-prone patients. One of the problems in the detection of emboli is the identification of an embolic signal caused by very small emboli. The amplitude of the embolic signal may be so small that advanced processing methods are required to distinguish these signals from Doppler signals arising from red blood cells. In this study, instead of the conventional discrete wavelet transform, the Dual Tree Complex Discrete Wavelet Transform was used for denoising embolic signals, and the performances of both approaches were compared. Unlike the conventional discrete wavelet transform, the complex discrete wavelet transform is a shift-invariant transform with limited redundancy. Results demonstrate that the Dual Tree Complex Discrete Wavelet Transform based denoising outperforms conventional discrete wavelet denoising: an improvement of approximately 8 dB is obtained with the Dual Tree Complex Discrete Wavelet Transform, compared with less than 5 dB for the conventional Discrete Wavelet Transform.

  10. Evaluation of Wavelet Denoising Methods for Small-Scale Joint Roughness Estimation Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Bitenc, M.; Kieffer, D. S.; Khoshelham, K.

    2015-08-01

    The precision of Terrestrial Laser Scanning (TLS) data depends mainly on the inherent random range error, which hinders extraction of small details from TLS measurements. New post-processing algorithms have been developed that reduce or eliminate the noise and therefore enable modelling details at a smaller scale than one would traditionally expect. The aim of this research is to find the optimum denoising method such that the corrected TLS data provide a reliable estimation of small-scale rock joint roughness. Two wavelet-based denoising methods are considered, namely the Discrete Wavelet Transform (DWT) and the Stationary Wavelet Transform (SWT), in combination with different thresholding procedures. The question is which technique provides more accurate roughness estimates considering (i) the wavelet transform (SWT or DWT), (ii) the thresholding method (fixed-form or penalised low) and (iii) the thresholding mode (soft or hard). The performance of the denoising methods is tested by two analyses, namely method noise and method sensitivity to noise. The reference data are precise Advanced TOpometric Sensor (ATOS) measurements obtained on a 20 × 30 cm rock joint sample, which are corrupted by different levels of noise for the second analysis. With such controlled-noise experiments it is possible to evaluate the methods' performance for different amounts of noise that might be present in TLS data. Qualitative visual checks of denoised surfaces and quantitative parameters such as grid height and roughness are considered in a comparative analysis of the denoising methods. Results indicate that the preferred method for realistic roughness estimation is DWT with penalised low hard thresholding.
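
    A small sketch of the comparison on a synthetic roughness profile, contrasting decimated (DWT) and stationary (SWT) wavelet denoising with hard thresholding; the wavelet (sym8), level and universal threshold are placeholders rather than the study's settings.

```python
# Sketch comparing decimated (DWT) and stationary (SWT) wavelet denoising of a
# 1-D roughness-like profile corrupted with simulated range noise.
import numpy as np
import pywt

rng = np.random.default_rng(1)
n = 1024                                             # SWT needs length divisible by 2**level
profile = np.cumsum(rng.normal(scale=0.05, size=n))  # synthetic rough profile
noisy = profile + rng.normal(scale=0.1, size=n)

def universal_threshold(detail, n):
    sigma = np.median(np.abs(detail)) / 0.6745
    return sigma * np.sqrt(2.0 * np.log(n))

# Decimated DWT, hard thresholding.
coeffs = pywt.wavedec(noisy, "sym8", level=4)
thr = universal_threshold(coeffs[-1], n)
coeffs[1:] = [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
dwt_denoised = pywt.waverec(coeffs, "sym8")[:n]

# Stationary (undecimated) WT, hard thresholding of the detail bands.
swt_coeffs = pywt.swt(noisy, "sym8", level=4)
thr = universal_threshold(swt_coeffs[-1][1], n)      # finest-level detail band
swt_coeffs = [(ca, pywt.threshold(cd, thr, mode="hard")) for ca, cd in swt_coeffs]
swt_denoised = pywt.iswt(swt_coeffs, "sym8")

for name, est in [("DWT", dwt_denoised), ("SWT", swt_denoised)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - profile) ** 2)))
```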

  11. Evaluation of Effectiveness of Wavelet Based Denoising Schemes Using ANN and SVM for Bearing Condition Classification

    PubMed Central

    G. S., Vijay; H. S., Kumar; Pai P., Srinivasa; N. S., Sriram; Rao, Raj B. K. N.

    2012-01-01

    The wavelet based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper seven wavelet based denoising schemes have been evaluated based on the performance of the Artificial Neural Network (ANN) and the Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating the defective bearing vibration signal with Gaussian noise was subjected to these denoising schemes, and the best scheme based on the SNR and the RMSE was identified. In the second part, the vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to these denoising schemes. Several time and frequency domain features were extracted from the denoised signals, out of which a few sensitive features were selected using Fisher's Criterion (FC). Extracted features were used to train and test the ANN and the SVM. The best denoising scheme identified, based on the classification performances of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal. PMID:23213323
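
    The following hedged sketch illustrates only the classification stage: simple time-domain features computed from (already denoised) vibration segments feed an SVM. The synthetic data, feature set and SVM parameters are illustrative and are not the features selected by Fisher's Criterion in the paper.

```python
# Toy bearing-condition classification: time-domain features + SVM.
import numpy as np
from scipy.stats import kurtosis
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def features(segment):
    rms = np.sqrt(np.mean(segment ** 2))
    return [rms,
            kurtosis(segment),
            np.max(np.abs(segment)) / rms,   # crest factor
            np.std(segment)]

rng = np.random.default_rng(0)
X, y = [], []
for label, impulse_rate in enumerate([0.0, 0.02, 0.05, 0.1]):   # four conditions
    for _ in range(50):
        seg = rng.normal(size=2048)
        spikes = rng.random(2048) < impulse_rate                 # simulated defect impacts
        seg[spikes] += rng.normal(scale=5.0, size=spikes.sum())
        X.append(features(seg))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(np.array(X), np.array(y),
                                                    test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=10.0).fit(scaler.transform(X_train), y_train)
print("Test accuracy:", clf.score(scaler.transform(X_test), y_test))
```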

  12. Wavelet-Based Adaptive Denoising of Phonocardiographic Records

    DTIC Science & Technology

    2007-11-02

    The signal at each dyadic scale is decomposed into an approximation a and details d using the biorthogonal filters h and g corresponding to the selected mother wavelet. A Coiflet-2 wavelet (four vanishing moments) was used as the mother wavelet, and the two channels were wavelet decomposed up to the 9th level (i = 0, 1 ... 8).

  13. Image denoising using trivariate shrinkage filter in the wavelet domain and joint bilateral filter in the spatial domain.

    PubMed

    Yu, Hancheng; Zhao, Li; Wang, Haixian

    2009-10-01

    This correspondence proposes an efficient algorithm for removing Gaussian noise from corrupted images by incorporating a wavelet-based trivariate shrinkage filter with a spatial-based joint bilateral filter. In the wavelet domain, the wavelet coefficients are modeled as a trivariate Gaussian distribution, taking into account the statistical dependencies among intrascale wavelet coefficients, and then a trivariate shrinkage filter is derived by using the maximum a posteriori (MAP) estimator. Although wavelet-based methods are efficient in image denoising, they are prone to producing salient artifacts such as low-frequency noise and edge ringing which relate to the structure of the underlying wavelet. On the other hand, most spatial-based algorithms produce much higher quality denoised images with fewer artifacts. However, they are usually too computationally demanding. In order to reduce the computational cost, we develop an efficient joint bilateral filter by using the wavelet denoising result rather than directly processing the noisy image in the spatial domain. This filter can suppress the noise while preserving image details with small computational cost. Extension to color image denoising is also presented. We compare our denoising algorithm with other denoising techniques in terms of PSNR and visual quality. The experimental results indicate that our algorithm is competitive with other denoising techniques.

  14. Implemented Wavelet Packet Tree based Denoising Algorithm in Bus Signals of a Wearable Sensorarray

    NASA Astrophysics Data System (ADS)

    Schimmack, M.; Nguyen, S.; Mercorelli, P.

    2015-11-01

    This paper introduces a thermosensing embedded system with a sensor bus that uses wavelets for the purposes of noise location and denoising. Following the filter bank principle, the measured signal is separated into two bands, low and high frequency. The proposed algorithm identifies the defined noise in these two bands. With the Wavelet Packet Transform, a form of the Discrete Wavelet Transform, it is able to decompose and reconstruct bus input signals of a sensor network. Using a seminorm, the noise of a sequence can be detected and located, so that the wavelet basis can be rearranged. This in particular allows for the elimination of incoherent parts that make up the unavoidable measurement noise of the bus signals. The proposed method was built based on wavelet algorithms from the WaveLab 850 library of Stanford University (USA). This work gives an insight into the workings of the Wavelet Transformation.
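
    A minimal Wavelet Packet Transform sketch in Python (the paper's implementation builds on the WaveLab 850 library); the energy-based cutoff below is a crude stand-in for the seminorm-based noise location described in the abstract.

```python
# Wavelet packet decomposition of a bus-like signal, zeroing low-energy leaves.
import numpy as np
import pywt

rng = np.random.default_rng(2)
bus_signal = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 512)) + 0.3 * rng.normal(size=512)

wp = pywt.WaveletPacket(data=bus_signal, wavelet="db4", mode="symmetric", maxlevel=3)

# Zero out terminal nodes whose energy (a crude "seminorm") falls below a cutoff,
# i.e. treat them as incoherent measurement noise.
nodes = wp.get_level(3, order="natural")
energies = np.array([np.sum(node.data ** 2) for node in nodes])
cutoff = 0.05 * energies.max()
for node, energy in zip(nodes, energies):
    if energy < cutoff:
        wp[node.path] = np.zeros_like(node.data)

denoised = wp.reconstruct(update=False)[: len(bus_signal)]
```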

  15. Using wavelet denoising and mathematical morphology in the segmentation technique applied to blood cells images.

    PubMed

    Boix, Macarena; Cantó, Begoña

    2013-04-01

    Accurate image segmentation is used in medical diagnosis since this technique is a noninvasive pre-processing step for biomedical treatment. In this work we present an efficient segmentation method for medical image analysis; in particular, blood cells can be segmented with this method. For that, we combine the wavelet transform with morphological operations. Moreover, the wavelet thresholding technique is used to eliminate the noise and prepare the image for suitable segmentation. In wavelet denoising we determine the best wavelet that shows a segmentation with the largest area in the cell. We study different wavelet families and conclude that the wavelet db1 is the best and can serve for subsequent work on blood pathologies. The proposed method generates good results when applied to several images. Finally, the proposed algorithm, implemented in the MATLAB environment, is verified on selected blood cell images.

  16. Total variation versus wavelet-based methods for image denoising in fluorescence lifetime imaging microscopy.

    PubMed

    Chang, Ching-Wei; Mycek, Mary-Ann

    2012-05-01

    We report the first application of wavelet-based denoising (noise removal) methods to time-domain box-car fluorescence lifetime imaging microscopy (FLIM) images and compare the results to novel total variation (TV) denoising methods. Methods were tested first on artificial images and then applied to low-light live-cell images. Relative to undenoised images, TV methods could improve lifetime precision up to 10-fold in artificial images, while preserving the overall accuracy of lifetime and amplitude values of a single-exponential decay model and improving local lifetime fitting in live-cell images. Wavelet-based methods were at least 4-fold faster than TV methods, but could introduce significant inaccuracies in recovered lifetime values. The denoising methods discussed can potentially enhance a variety of FLIM applications, including live-cell, in vivo animal, or endoscopic imaging studies, especially under challenging imaging conditions such as low-light or fast video-rate imaging.
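
    For orientation, the sketch below contrasts total-variation and wavelet denoising on a synthetic low-photon-count image using scikit-image; the noise model, weights and wavelet are assumptions and do not correspond to the FLIM data or parameters of the study.

```python
# Side-by-side TV vs. wavelet denoising on a synthetic low-light image.
import numpy as np
from skimage.restoration import denoise_tv_chambolle, denoise_wavelet

rng = np.random.default_rng(3)
truth = np.zeros((128, 128))
truth[32:96, 32:96] = 10.0                          # a bright cell-like region
noisy = rng.poisson(truth + 2.0).astype(float)      # low-light (Poisson) noise

tv = denoise_tv_chambolle(noisy, weight=1.0)
wav = denoise_wavelet(noisy, wavelet="db2", mode="soft", rescale_sigma=True)

for name, est in [("TV", tv), ("wavelet", wav)]:
    print(name, "RMSE:", np.sqrt(np.mean((est - truth - 2.0) ** 2)))
```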

  17. Low-dose computed tomography image denoising based on joint wavelet and sparse representation.

    PubMed

    Ghadrdan, Samira; Alirezaie, Javad; Dillenseger, Jean-Louis; Babyn, Paul

    2014-01-01

    Image denoising and signal enhancement are the most challenging issues in low dose computed tomography (CT) imaging. Sparse representational methods have shown initial promise for these applications. In this work we present a wavelet based sparse representation denoising technique utilizing dictionary learning and clustering. By using wavelets we extract the most suitable features in the images to obtain accurate dictionary atoms for the denoising algorithm. To achieve improved results we also lower the number of clusters, which reduces computational complexity. In addition, a single image noise level estimation is developed to update the cluster centers at higher PSNRs. Our results, along with the computational efficiency of the proposed algorithm, clearly demonstrate the improvement of the proposed algorithm over other clustering based sparse representation (CSR) and K-SVD methods.

  18. Hardware Design and Implementation of a Wavelet De-Noising Procedure for Medical Signal Preprocessing

    PubMed Central

    Chen, Szi-Wen; Chen, Yuan-Ho

    2015-01-01

    In this paper, a discrete wavelet transform (DWT) based de-noising method and its application to noise reduction in medical signal preprocessing are introduced. This work focuses on the hardware realization of a real-time wavelet de-noising procedure. The proposed de-noising circuit mainly consists of three modules: a DWT, a thresholding, and an inverse DWT (IDWT) modular circuit. We also proposed a novel adaptive thresholding scheme and incorporated it into our wavelet de-noising procedure. Performance was then evaluated on the architectural designs of both the software and the hardware. In addition, the de-noising circuit was also implemented by downloading the Verilog codes to a field programmable gate array (FPGA) based platform so that its ability in noise reduction may be further validated in actual practice. Simulation experiment results produced by applying a set of simulated noise-contaminated electrocardiogram (ECG) signals to the de-noising circuit showed that the circuit could not only desirably meet the requirement of real-time processing, but also achieve satisfactory performance for noise reduction, while the sharp features of the ECG signals are well preserved. The proposed de-noising circuit was further synthesized using the Synopsys Design Compiler with an Artisan Taiwan Semiconductor Manufacturing Company (TSMC, Hsinchu, Taiwan) 40 nm standard cell library. The integrated circuit (IC) synthesis simulation results showed that the proposed design can achieve a clock frequency of 200 MHz with a power consumption of only 17.4 mW when operated at 200 MHz. PMID:26501290

  19. Hardware design and implementation of a wavelet de-noising procedure for medical signal preprocessing.

    PubMed

    Chen, Szi-Wen; Chen, Yuan-Ho

    2015-10-16

    In this paper, a discrete wavelet transform (DWT) based de-noising method and its application to noise reduction in medical signal preprocessing are introduced. This work focuses on the hardware realization of a real-time wavelet de-noising procedure. The proposed de-noising circuit mainly consists of three modules: a DWT, a thresholding, and an inverse DWT (IDWT) modular circuit. We also proposed a novel adaptive thresholding scheme and incorporated it into our wavelet de-noising procedure. Performance was then evaluated on the architectural designs of both the software and the hardware. In addition, the de-noising circuit was also implemented by downloading the Verilog codes to a field programmable gate array (FPGA) based platform so that its ability in noise reduction may be further validated in actual practice. Simulation experiment results produced by applying a set of simulated noise-contaminated electrocardiogram (ECG) signals to the de-noising circuit showed that the circuit could not only desirably meet the requirement of real-time processing, but also achieve satisfactory performance for noise reduction, while the sharp features of the ECG signals are well preserved. The proposed de-noising circuit was further synthesized using the Synopsys Design Compiler with an Artisan Taiwan Semiconductor Manufacturing Company (TSMC, Hsinchu, Taiwan) 40 nm standard cell library. The integrated circuit (IC) synthesis simulation results showed that the proposed design can achieve a clock frequency of 200 MHz with a power consumption of only 17.4 mW when operated at 200 MHz.

  20. The Differential Pressure Signal De-noised by Domain Transform Combined with Wavelet Threshold

    NASA Astrophysics Data System (ADS)

    Zhang, Yuhao; Wang, Haihui; Li, Chao

    2017-01-01

    In the process of estimating the thrust of an aircraft engine, a major problem is that the differential pressure signal exhibits large fluctuations. To deal with this problem, we develop an effective and robust adaptive de-noising algorithm based on the domain transform combined with the wavelet transform (D-WT). First, the domain transform is applied to the signal, the transformed signal is then sampled, and finally wavelet threshold denoising is performed on the signal. Compared with traditional wavelet transforms, the D-WT method filters the noise effectively and keeps more details.

  1. The Application of Wavelet-Domain Hidden Markov Tree Model in Diabetic Retinal Image Denoising.

    PubMed

    Cui, Dong; Liu, Minmin; Hu, Lei; Liu, Keju; Guo, Yongxin; Jiao, Qing

    2015-01-01

    The wavelet-domain Hidden Markov Tree Model can properly describe the dependence and correlation of fundus angiographic images' wavelet coefficients among scales. Based on the construction of Hidden Markov Tree Models and Gaussian Mixture Models for fundus angiographic images, this paper applies the expectation-maximization algorithm to estimate the wavelet coefficients of the original fundus angiographic images and Bayesian estimation to achieve the goal of fundus angiographic image denoising. As shown in the experimental results, compared with other algorithms such as the mean filter and the median filter, this method effectively improved the peak signal-to-noise ratio of the fundus angiographic images after denoising and preserved the details of vascular edges in the images.

  2. Speech signal denoising with wavelet-transforms and the mean opinion score characterizing the filtering quality

    NASA Astrophysics Data System (ADS)

    Yaseen, Alauldeen S.; Pavlov, Alexey N.; Hramov, Alexander E.

    2016-03-01

    Speech signal processing is widely used to reduce noise impact in acquired data. During the last decades, wavelet-based filtering techniques are often applied in communication systems due to their advantages in signal denoising as compared with Fourier-based methods. In this study we consider applications of a 1-D double density complex wavelet transform (1D-DDCWT) and compare the results with the standard 1-D discrete wavelet-transform (1DDWT). The performances of the considered techniques are compared using the mean opinion score (MOS) being the primary metric for the quality of the processed signals. A two-dimensional extension of this approach can be used for effective image denoising.

  3. Denoising and robust non-linear wavelet analysis

    NASA Astrophysics Data System (ADS)

    Bruce, Andrew G.; Donoho, David L.; Gao, Hong-Ye; Martin, R. D.

    1994-04-01

    In a series of papers, Donoho and Johnstone develop a powerful theory based on wavelets for extracting non-smooth signals from noisy data. Several nonlinear smoothing algorithms are presented which provide high performance for removing Gaussian noise from a wide range of spatially inhomogeneous signals. However, like other methods based on the linear wavelet transform, these algorithms are very sensitive to certain types of non-Gaussian noise, such as outliers. In this paper, we develop outlier-resistant wavelet transforms. In these transforms, outliers and outlier patches are localized to just a few scales. By using the outlier-resistant wavelet transforms, we improve upon the Donoho and Johnstone nonlinear signal extraction methods. The outlier-resistant wavelet algorithms are included with the S+Wavelets object-oriented toolkit for wavelet analysis.

  4. Sample entropy-based adaptive wavelet de-noising approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Singh, Vijay P.; Shang, Xiaosan; Ding, Hao; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Chen, Yuanfang; Chen, Xi; Wang, Shicheng; Wang, Zhenlong

    2014-07-01

    De-noising meteorologic and hydrologic time series is important to improve the accuracy and reliability of extraction, analysis, simulation, and forecasting. A hybrid approach, combining sample entropy and wavelet de-noising method, is developed to separate noise from original series and is named as AWDA-SE (adaptive wavelet de-noising approach using sample entropy). The AWDA-SE approach adaptively determines the threshold for wavelet analysis. Two kinds of meteorologic and hydrologic data sets, synthetic data set and 3 representative field measured data sets (one is the annual rainfall data of Jinan station and the other two are annual streamflow series from two typical stations in China, Yingluoxia station on the Heihe River, which is little affected by human activities, and Lijin station on the Yellow River, which is greatly affected by human activities), are used to illustrate the approach. The AWDA-SE approach is compared with three conventional de-noising methods, including fixed-form threshold algorithm, Stein unbiased risk estimation algorithm, and minimax algorithm. Results show that the AWDA-SE approach separates effectively the signal and noise of the data sets and is found to be better than the conventional methods. Measures of assessment standards show that the developed approach can be employed to investigate noisy and short time series and can also be applied to other areas.

  5. Automated wavelet denoising of photoacoustic signals for burn-depth image reconstruction

    NASA Astrophysics Data System (ADS)

    Holan, Scott H.; Viator, John A.

    2007-02-01

    Photoacoustic image reconstruction involves dozens or perhaps hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves induced by irradiating a sample with laser light are used to produce an image of the acoustic source. Each of these point measurements must undergo some signal processing, such as denoising and system deconvolution. In order to efficiently process the numerous signals acquired for photoacoustic imaging, we have developed an automated wavelet algorithm for processing signals generated in a burn injury phantom. We used the discrete wavelet transform to denoise photoacoustic signals generated in an optically turbid phantom containing whole blood. The denoising used universal level independent thresholding, as developed by Donoho and Johnstone. The entire signal processing technique was automated so that no user intervention was needed to reconstruct the images. The signals were backprojected using the automated wavelet processing software and showed reconstruction using denoised signals improved image quality by 21%, using a relative 2-norm difference scheme.
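
    The level-independent universal ("VisuShrink") thresholding mentioned above can be sketched as follows; the wavelet, level and synthetic photoacoustic transient are placeholders, not the study's processing chain.

```python
# Automated universal-threshold (VisuShrink) denoising of a 1-D signal:
# no user-tuned parameters beyond the wavelet family and level.
import numpy as np
import pywt

def visu_shrink(signal, wavelet="sym4", level=6):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise from finest details
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))   # one threshold for all levels
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Synthetic bipolar photoacoustic-like transient buried in noise.
t = np.linspace(0, 1, 4096)
clean = np.exp(-((t - 0.3) / 0.01) ** 2) - np.exp(-((t - 0.32) / 0.01) ** 2)
noisy = clean + 0.2 * np.random.randn(t.size)
denoised = visu_shrink(noisy)
```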

  6. Exploring the impact of wavelet-based denoising in the classification of remote sensing hyperspectral images

    NASA Astrophysics Data System (ADS)

    Quesada-Barriuso, Pablo; Heras, Dora B.; Argüello, Francisco

    2016-10-01

    The classification of remote sensing hyperspectral images for land cover applications is a very active research topic. In the case of supervised classification, Support Vector Machines (SVMs) play a dominant role. Recently, the Extreme Learning Machine (ELM) algorithm has also been extensively used. The classification scheme previously published by the authors, called WT-EMP, introduces spatial information into the classification process by means of an Extended Morphological Profile (EMP) that is created from features extracted by wavelets. In addition, the hyperspectral image is denoised in the 2-D spatial domain, also using wavelets, and it is joined to the EMP via a stacked vector. In this paper, the scheme is improved to achieve two goals. The first is to reduce the classification time while preserving the accuracy of the classification by using ELM instead of SVM. The second is to improve the accuracy results by performing not only a 2-D denoising for every spectral band, but also an additional prior 1-D spectral-signature denoising applied to each pixel vector of the image. For each denoising, the image is transformed by applying a 1-D or 2-D wavelet transform, and then NeighShrink thresholding is applied. Improvements in terms of classification accuracy are obtained, especially for images with close regions in the classification reference map, because in these cases the accuracy of the classification at the edges between classes is more relevant.

  7. Improved 3D wavelet-based de-noising of fMRI data

    NASA Astrophysics Data System (ADS)

    Khullar, Siddharth; Michael, Andrew M.; Correa, Nicolle; Adali, Tulay; Baum, Stefi A.; Calhoun, Vince D.

    2011-03-01

    Functional MRI (fMRI) data analysis deals with the problem of detecting very weak signals in very noisy data. Smoothing with a Gaussian kernel is often used to decrease noise at the cost of losing spatial specificity. We present a novel wavelet-based 3-D technique to remove noise in fMRI data while preserving the spatial features in the component maps obtained through group independent component analysis (ICA). Each volume is decomposed into eight volumetric sub-bands using a separable 3-D stationary wavelet transform. Each of the detail sub-bands are then treated through the main denoising module. This module facilitates computation of shrinkage factors through a hierarchical framework. It utilizes information iteratively from the sub-band at next higher level to estimate denoised coefficients at the current level. These de-noised sub-bands are then reconstructed back to the spatial domain using an inverse wavelet transform. Finally, the denoised group fMRI data is analyzed using ICA where the data is decomposed in to clusters of functionally correlated voxels (spatial maps) as indicators of task-related neural activity. The proposed method enables the preservation of shape of the actual activation regions associated with the BOLD activity. In addition it is able to achieve high specificity as compared to the conventionally used FWHM (full width half maximum) Gaussian kernels for smoothing fMRI data.

  8. NOTE: Automated wavelet denoising of photoacoustic signals for circulating melanoma cell detection and burn image reconstruction

    NASA Astrophysics Data System (ADS)

    Holan, Scott H.; Viator, John A.

    2008-06-01

    Photoacoustic image reconstruction may involve hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves induced by irradiating a biological sample with laser light are used to produce an image of the acoustic source. Each of these measurements must undergo some signal processing, such as denoising or system deconvolution. In order to process the numerous signals, we have developed an automated wavelet algorithm for denoising signals. We appeal to the discrete wavelet transform for denoising photoacoustic signals generated in a dilute melanoma cell suspension and in thermally coagulated blood. We used 5, 9, 45 and 270 melanoma cells in the laser beam path as test concentrations. For the burn phantom, we used coagulated blood in 1.6 mm silicon tube submerged in Intralipid. Although these two targets were chosen as typical applications for photoacoustic detection and imaging, they are of independent interest. The denoising employs level-independent universal thresholding. In order to accommodate nonradix-2 signals, we considered a maximal overlap discrete wavelet transform (MODWT). For the lower melanoma cell concentrations, as the signal-to-noise ratio approached 1, denoising allowed better peak finding. For coagulated blood, the signals were denoised to yield a clean photoacoustic resulting in an improvement of 22% in the reconstructed image. The entire signal processing technique was automated so that minimal user intervention was needed to reconstruct the images. Such an algorithm may be used for image reconstruction and signal extraction for applications such as burn depth imaging, depth profiling of vascular lesions in skin and the detection of single cancer cells in blood samples.

  9. Noise reduction of time domain electromagnetic data: Application of a combined wavelet denoising method

    NASA Astrophysics Data System (ADS)

    Ji, Yanju; Li, Dongsheng; Yuan, Guiyang; Lin, Jun; Du, Shangyu; Xie, Lijun; Wang, Yuan

    2016-06-01

    A denoising method based on wavelet analysis is presented for the removal of noise (background noise and random spikes) from time domain electromagnetic (TEM) data. This method includes two signal processing technologies: the wavelet threshold method and the stationary wavelet transform. First, the wavelet threshold method is used for the removal of background noise from the TEM data. Then, the data are divided into a series of details and approximations by using the stationary wavelet transform. The random spikes in the details are identified by zero reference data and an adaptive energy detector. Next, the corresponding details are processed to suppress the random spikes. The denoised TEM data are reconstructed via the inverse stationary wavelet transform using the processed details at each level and the approximations at the highest level. The proposed method has been verified using synthetic TEM data, whose signal-to-noise ratio is increased from 10.97 dB to 24.37 dB. The method is also applied to the noise suppression of field data collected at Hengsha island, China. The section image results show that the noise is suppressed effectively and the resolution of the deep anomaly is clearly improved.
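
    A simplified sketch of the two-stage idea: universal-threshold denoising for the background noise, then spike suppression in the stationary-wavelet detail bands. A plain MAD-based detector stands in for the paper's zero-reference data and adaptive energy detector.

```python
# Two-stage TEM-style denoising sketch: background removal, then spike suppression.
import numpy as np
import pywt

def remove_background(sig, wavelet="db6", level=4):
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(sig)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(sig)]

def suppress_spikes(sig, wavelet="db6", level=3, k=6.0):
    coeffs = pywt.swt(sig, wavelet, level=level)       # list of (cA, cD) pairs
    cleaned = []
    for ca, cd in coeffs:
        mad = np.median(np.abs(cd - np.median(cd))) / 0.6745
        cd = np.where(np.abs(cd) > k * mad, 0.0, cd)   # zero spike-dominated samples
        cleaned.append((ca, cd))
    return pywt.iswt(cleaned, wavelet)

n = 2048                                               # divisible by 2**level for SWT
decay = np.exp(-np.linspace(0, 6, n))                  # idealised TEM decay curve
noisy = decay + 0.01 * np.random.randn(n)
noisy[500] += 0.5                                      # injected random spike
denoised = suppress_spikes(remove_background(noisy))
```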

  10. Denoising approach for remote sensing image based on anisotropic diffusion and wavelet transform algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Xiaojun; Lai, Weidong

    2011-08-01

    In this paper, a combined method is put forward for an ASTER image, using a wavelet filter to attenuate the noise and an anisotropic diffusion PDE (Partial Differential Equation) to further recover image contrast. The model is verified against different noise backgrounds, since remote sensing images usually contain salt-and-pepper, Gaussian and speckle noise. Considering the characteristics of noise in the wavelet domain, a wavelet filter with a Bayesian estimation threshold is applied to recover image contrast from the blurred background. The proposed PDE performs anisotropic diffusion in the orthogonal direction, thus preserving edges during the further denoising process. Simulations indicate that the combined algorithm recovers the blurred image from speckle and Gaussian noise backgrounds more effectively than the wavelet-only denoising method, and the denoising effect is also distinct when the salt-and-pepper noise has low intensity. The combined algorithm proposed in this article can be integrated into remote sensing image analysis to obtain higher accuracy for environmental interpretation and pattern recognition.
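
    The sketch below chains a BayesShrink-style wavelet stage with a few Perona-Malik anisotropic-diffusion iterations, which is one plausible reading of the combination described above; the diffusion coefficients, wavelet and thresholds are all assumptions.

```python
# Wavelet denoising followed by Perona-Malik anisotropic diffusion (illustrative).
import numpy as np
import pywt

def wavelet_stage(img, wavelet="db2", level=2):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    def bayes_thr(band):                               # BayesShrink-style threshold
        var_x = max(np.var(band) - sigma ** 2, 1e-12)
        return sigma ** 2 / np.sqrt(var_x)
    new = [coeffs[0]] + [tuple(pywt.threshold(b, bayes_thr(b), "soft") for b in d)
                         for d in coeffs[1:]]
    return pywt.waverec2(new, wavelet)[: img.shape[0], : img.shape[1]]

def anisotropic_diffusion(img, n_iter=10, kappa=20.0, gamma=0.15):
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # Differences toward the four neighbours.
        dn = np.roll(img, -1, 0) - img
        ds = np.roll(img, 1, 0) - img
        de = np.roll(img, -1, 1) - img
        dw = np.roll(img, 1, 1) - img
        # Edge-stopping conduction coefficients (Perona-Malik, exponential form).
        img += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return img

# Usage sketch: result = anisotropic_diffusion(wavelet_stage(noisy_band))
```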

  11. Application of the discrete torus wavelet transform to the denoising of magnetic resonance images of uterine and ovarian masses

    NASA Astrophysics Data System (ADS)

    Sarty, Gordon E.; Atkins, M. Stella; Olatunbosun, Femi; Chizen, Donna; Loewy, John; Kendall, Edward J.; Pierson, Roger A.

    1999-10-01

    A new numerical wavelet transform, the discrete torus wavelet transform, is described and an application is given to the denoising of abdominal magnetic resonance imaging (MRI) data. The discrete torus wavelet transform is an undecimated wavelet transform which is computed using a discrete Fourier transform and multiplication instead of by direct convolution in the image domain. This approach leads to a decomposition of the image onto frames in the space of square summable functions on the discrete torus, l2(T2). The new transform was compared to the traditional decimated wavelet transform in its ability to denoise MRI data. By using denoised images as the basis for the computation of a nuclear magnetic resonance spin-spin relaxation-time map through least squares curve fitting, an error map was generated that was used to assess the performance of the denoising algorithms. The discrete torus wavelet transform outperformed the traditional wavelet transform in 88% of the T2 error map denoising tests with phantoms and gynecologic MRI images.

  12. A joint inter- and intrascale statistical model for Bayesian wavelet based image denoising.

    PubMed

    Pizurica, Aleksandra; Philips, Wilfried; Lemahieu, Ignace; Acheroy, Marc

    2002-01-01

    This paper presents a new wavelet-based image denoising method, which extends a "geometrical" Bayesian framework. The new method combines three criteria for distinguishing supposedly useful coefficients from noise: coefficient magnitudes, their evolution across scales and spatial clustering of large coefficients near image edges. These three criteria are combined in a Bayesian framework. The spatial clustering properties are expressed in a prior model. The statistical properties concerning coefficient magnitudes and their evolution across scales are expressed in a joint conditional model. The three main novelties with respect to related approaches are (1) the interscale-ratios of wavelet coefficients are statistically characterized and different local criteria for distinguishing useful coefficients from noise are evaluated, (2) a joint conditional model is introduced, and (3) a novel anisotropic Markov random field prior model is proposed. The results demonstrate an improved denoising performance over related earlier techniques.

  13. Synergistic Effects of Phase Folding and Wavelet Denoising with Applications in Light Curve Analysis

    DTIC Science & Technology

    2016-09-15

    The introduction discusses Moore's law and other exponential growth patterns in technology, as well as the period of exponential growth the universe underwent shortly after the big bang. The second section discusses the exponential growth of the solution space which accompanies the linear growth of components.

  14. Wavelet transform-based methods for denoising of Coulter counter signals

    NASA Astrophysics Data System (ADS)

    Jagtiani, Ashish V.; Sawant, Rupesh; Carletta, Joan; Zhe, Jiang

    2008-06-01

    A process based on discrete wavelet transforms is developed for denoising and baseline correction of measured signals from Coulter counters. Given signals from a particular Coulter counting experiment, which detect the passage of particles through a fluid-filled microchannel, the process uses a cross-validation procedure to pick appropriate parameters for signal denoising; these parameters include the choice of the particular wavelet, the number of levels of decomposition, the threshold value and the threshold strategy. The process is demonstrated on simulated and experimental single-channel data obtained from a particular multi-channel Coulter counter. For these example experimental signals from 20 µm polymethacrylate and Cottonwood/Eastern Deltoid pollen particles and the simulated signals, denoising is aimed at removing Gaussian white noise, 60 Hz power line interference and low frequency baseline drift. The process can be easily adapted for other Coulter counters and other sources of noise. Overall, wavelets are presented as a tool to aid in accurate detection of particles in Coulter counters.
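
    An illustrative parameter search over wavelet, level and threshold scaling for a simulated Coulter-counter pulse train; the paper's cross-validation procedure is replaced here by a simple RMSE score against the known clean signal, and baseline drift and power-line interference are omitted.

```python
# Grid search over denoising parameters on a simulated pulse train.
import itertools
import numpy as np
import pywt

rng = np.random.default_rng(4)
n = 4096
clean = np.zeros(n)
for centre in rng.integers(200, n - 200, size=8):      # particle passage pulses
    clean += np.exp(-((np.arange(n) - centre) / 15.0) ** 2)
noisy = clean + 0.15 * rng.normal(size=n)               # white noise only

def denoise(sig, wavelet, level, scale):
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = scale * sigma * np.sqrt(2.0 * np.log(len(sig)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(sig)]

best = min(
    itertools.product(["db4", "sym6", "coif3"], [3, 4, 5, 6], [0.5, 1.0, 1.5]),
    key=lambda p: np.sqrt(np.mean((denoise(noisy, *p) - clean) ** 2)),
)
print("best (wavelet, level, threshold scale):", best)
```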

  15. Denoising of arterial and venous Doppler signals using discrete wavelet transform: effect on clinical parameters.

    PubMed

    Tokmakçi, Mahmut; Erdoğan, Nuri

    2009-05-01

    In this paper, the effects of a wavelet transform based denoising strategy on clinical Doppler parameters are analyzed. The study scheme included: (a) Acquisition of arterial and venous Doppler signals by sampling the audio output of an ultrasound scanner from 20 healthy volunteers, (b) Noise reduction via decomposition of the signals through discrete wavelet transform, (c) Spectral analysis of noisy and noise-free signals with short time Fourier transform, (d) Curve fitting to spectrograms, (e) Calculation of clinical Doppler parameters, (f) Statistical comparison of parameters obtained from noisy and noise-free signals. The decomposition level was selected as the highest level at which the maximum power spectral density and its corresponding frequency were preserved. In all subjects, noise-free spectrograms had smoother trace with less ripples. In both arterial and venous spectrograms, denoising resulted in a significant decrease in the maximum (systolic) and mean frequency, with no statistical difference in the minimum (diastolic) frequency. In arterial signals, this leads to a significant decrease in the calculated parameters such as Systolic/Diastolic Velocity Ratio, Resistivity Index, Pulsatility Index and Acceleration Time. Acceleration Index did not change significantly. Despite a successful denoising, the effects of wavelet decomposition on high frequency components in the Doppler signal should be challenged by comparison with reference data, or, through clinical investigations.

  16. Denoising algorithm based on edge extraction and wavelet transform in digital holography

    NASA Astrophysics Data System (ADS)

    Zhang, Ming; Sang, Xin-zhu; Leng, Jun-min; Cao, Xue-mei

    2013-08-01

    Digital holography is a coherent imaging method and is inevitably affected by many factors in the recording process. One of the dominant problems is speckle noise, which is essentially nonlinear multiplicative noise related to the signal, and it is therefore more difficult to remove than additive noise. This noise pollution lowers the resolution of the reconstructed image. A new solution for suppressing speckle noise in digital holograms is presented, which combines the Canny filtering algorithm with a wavelet threshold denoising algorithm. The Canny filter is used to obtain the edge detail, and the wavelet transform performs the denoising. In order to suppress speckle effectively and retain the image details as much as possible, the Neyman-Pearson (N-P) criterion is introduced to estimate the wavelet coefficients at every scale. An improved threshold function is proposed, whose curve is smoother. The reconstructed image is achieved by merging the denoised image with the edge details. Experimental results and performance parameters of the proposed algorithm are discussed and compared with other methods, which show that the presented approach can not only effectively eliminate speckle noise, but also retain useful signals and edge information simultaneously.
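
    A rough sketch of the edge-preserving idea: extract an edge map first, wavelet-denoise the image, then paste the original edge pixels back. The Neyman-Pearson coefficient estimation and the improved threshold function of the paper are not reproduced; the Canny parameters and universal threshold are assumptions.

```python
# Edge map + wavelet denoising, then edge pixels restored from the original.
import numpy as np
import pywt
from skimage.feature import canny

def denoise_keep_edges(img, wavelet="sym4", level=2, edge_sigma=2.0):
    edges = canny(img, sigma=edge_sigma)                   # boolean edge map
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(img.size))
    new = [coeffs[0]] + [tuple(pywt.threshold(b, thr, "soft") for b in d)
                         for d in coeffs[1:]]
    smooth = pywt.waverec2(new, wavelet)[: img.shape[0], : img.shape[1]]
    merged = smooth.copy()
    merged[edges] = img[edges]                             # restore edge detail
    return merged
```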

  17. Fault Detection of a Roller-Bearing System through the EMD of a Wavelet Denoised Signal

    PubMed Central

    Ahn, Jong-Hyo; Kwak, Dae-Ho; Koh, Bong-Hwan

    2014-01-01

    This paper investigates fault detection of a roller bearing system using a wavelet denoising scheme and proper orthogonal value (POV) of an intrinsic mode function (IMF) covariance matrix. The IMF of the bearing vibration signal is obtained through empirical mode decomposition (EMD). The signal screening process in the wavelet domain eliminates noise-corrupted portions that may lead to inaccurate prognosis of bearing conditions. We segmented the denoised bearing signal into several intervals, and decomposed each of them into IMFs. The first IMF of each segment is collected to become a covariance matrix for calculating the POV. We show that covariance matrices from healthy and damaged bearings exhibit different POV profiles, which can be a damage-sensitive feature. We also illustrate the conventional approach of feature extraction, of observing the kurtosis value of the measured signal, to compare the functionality of the proposed technique. The study demonstrates the feasibility of wavelet-based de-noising, and shows through laboratory experiments that tracking the proper orthogonal values of the covariance matrix of the IMF can be an effective and reliable measure for monitoring bearing fault. PMID:25196008

  18. The EM Method in a Probabilistic Wavelet-Based MRI Denoising

    PubMed Central

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed, and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method to address this problem. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need to use an estimator of the noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images. PMID:26089959

  19. The EM Method in a Probabilistic Wavelet-Based MRI Denoising.

    PubMed

    Martin-Fernandez, Marcos; Villullas, Sergio

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed, and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method to address this problem. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need to use an estimator of the noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images.

  20. Application of time-resolved glucose concentration photoacoustic signals based on an improved wavelet denoising

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-10-01

    Real-time monitoring of blood glucose concentration (BGC) is an important procedure in controlling diabetes mellitus and preventing complications for diabetic patients. Noninvasive measurement of BGC has become a research hotspot because it avoids physical and psychological harm. Photoacoustic spectroscopy is a well-established, hybrid and alternative technique used to determine the BGC. According to the theory of the photoacoustic technique, the blood is irradiated by a pulsed laser with nanosecond repetition time and micro-joule power; photoacoustic signals containing the BGC information are generated by the thermo-elastic mechanism, and the BGC level can then be interpreted from the photoacoustic signal via data analysis. In practice, however, the time-resolved photoacoustic signals of BGC are polluted by a variety of noises, e.g., interference from background sounds and the multiple components of blood. The quality of the BGC photoacoustic signal directly impacts the precision of the BGC measurement, so an improved wavelet denoising method was proposed to eliminate the noise contained in BGC photoacoustic signals. To overcome the shortcomings of traditional wavelet threshold denoising, an improved dual-threshold wavelet function was proposed in this paper. Simulation results illustrated that the denoising result of this improved wavelet method was better than that of the traditional soft and hard threshold functions. To verify the feasibility of the improved function, actual photoacoustic BGC signals were tested; the results demonstrated that the signal-to-noise ratio (SNR) with the improved function increases by about 40-80%, and the root-mean-square error (RMSE) decreases by about 38.7-52.8%.

  1. Comparison of a discrete wavelet transform method and a modified undecimated discrete wavelet transform method for denoising of mammograms.

    PubMed

    Matsuyama, Eri; Tsai, Du-Yih; Lee, Yongbum; Takahashi, Noriyuki

    2013-01-01

    The purpose of this study was to evaluate the performance of a conventional discrete wavelet transform (DWT) method and a modified undecimated discrete wavelet transform (M-UDWT) method applied to mammographic image denoising. Mutual information, mean square error, and signal to noise ratio were used as image quality measures of images processed by the two methods. We examined the performance of the two methods with visual perceptual evaluation. A two-tailed F test was used to measure statistical significance. The difference between the M-UDWT processed images and the conventional DWT-method processed images was statistically significant (P<0.01). The authors confirmed the superiority and effectiveness of the M-UDWT method. The results of this study suggest the M-UDWT method may provide better image quality as compared to the conventional DWT.

  2. Radiation dose reduction in computed tomography (CT) using a new implementation of wavelet denoising in low tube current acquisitions

    NASA Astrophysics Data System (ADS)

    Tao, Yinghua; Brunner, Stephen; Tang, Jie; Speidel, Michael; Rowley, Howard; VanLysel, Michael; Chen, Guang-Hong

    2011-03-01

    Radiation dose reduction remains at the forefront of research in computed tomography. X-ray tube parameters such as tube current can be lowered to reduce dose; however, images become prohibitively noisy when the tube current is too low. Wavelet denoising is one of many noise reduction techniques. However, traditional wavelet techniques have the tendency to create an artificial noise texture, due to the nonuniform denoising across the image, which is undesirable from a diagnostic perspective. This work presents a new implementation of wavelet denoising that is able to achieve noise reduction, while still preserving spatial resolution. Further, the proposed method has the potential to improve those unnatural noise textures. The technique was tested on both phantom and animal datasets (Catphan phantom and time-resolved swine heart scan) acquired on a GE Discovery VCT scanner. A number of tube currents were used to investigate the potential for dose reduction.

  3. Prognostics of Lithium-Ion Batteries Based on Wavelet Denoising and DE-RVM

    PubMed Central

    Zhang, Chaolong; He, Yigang; Yuan, Lifeng; Xiang, Sheng; Wang, Jinping

    2015-01-01

    Lithium-ion batteries are widely used in many electronic systems. Therefore, it is significantly important to estimate the lithium-ion battery's remaining useful life (RUL), yet very difficult. One important reason is that the measured battery capacity data are often subject to the different levels of noise pollution. In this paper, a novel battery capacity prognostics approach is presented to estimate the RUL of lithium-ion batteries. Wavelet denoising is performed with different thresholds in order to weaken the strong noise and remove the weak noise. Relevance vector machine (RVM) improved by differential evolution (DE) algorithm is utilized to estimate the battery RUL based on the denoised data. An experiment including battery 5 capacity prognostics case and battery 18 capacity prognostics case is conducted and validated that the proposed approach can predict the trend of battery capacity trajectory closely and estimate the battery RUL accurately. PMID:26413090

  4. Denoising of X-ray pulsar observed profile in the undecimated wavelet domain

    NASA Astrophysics Data System (ADS)

    Xue, Meng-fan; Li, Xiao-ping; Fu, Ling-zhong; Liu, Xiu-ping; Sun, Hai-feng; Shen, Li-rong

    2016-01-01

    The low intensity of the X-ray pulsar signal and the strong X-ray background radiation lead to a low signal-to-noise ratio (SNR) of the X-ray pulsar observed profile obtained through epoch folding, especially when the observation time is not long enough. This signifies the necessity of denoising the observed profile. In this paper, the statistical characteristics of the X-ray pulsar signal are studied, and a signal-dependent noise model is established for the observed profile. Based on this, a profile noise reduction method is developed by performing a local linear minimum mean square error filtering in the un-decimated wavelet domain. The detail wavelet coefficients are rescaled by multiplying their amplitudes by a locally adaptive factor, which is the local variance ratio of the noiseless coefficients to the noisy ones. All the nonstationary statistics needed in the algorithm are calculated from the observed profile, without a priori information. The results of experiments, carried out on simulated data obtained by the ground-based simulation system and real data obtained by the Rossi X-Ray Timing Explorer satellite, indicate that the proposed method is excellent in both noise suppression and preservation of peak sharpness, and it also clearly outperforms four widely accepted and used wavelet denoising methods, in terms of SNR, Pearson correlation coefficient and root mean square error.
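
    The locally adaptive rescaling described above can be sketched as a linear minimum mean square error gain applied to undecimated-wavelet detail coefficients, as below; the window size, wavelet and MAD-based noise-variance estimate are assumptions rather than the paper's estimators.

```python
# LLMMSE-style shrinkage of stationary-wavelet detail coefficients:
# each coefficient is scaled by max(var_local - var_noise, 0) / var_local.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter1d

def llmmse_denoise(profile, wavelet="sym5", level=3, window=33):
    coeffs = pywt.swt(profile, wavelet, level=level)
    out = []
    for ca, cd in coeffs:
        noise_var = (np.median(np.abs(cd - np.median(cd))) / 0.6745) ** 2
        local_var = uniform_filter1d(cd ** 2, size=window)      # local second moment
        gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
        out.append((ca, cd * gain))
    return pywt.iswt(out, wavelet)

# Example on a folded pulsar-like profile (length must suit the SWT level).
phase = np.linspace(0, 1, 1024)
template = np.exp(-((phase - 0.5) / 0.02) ** 2)
observed = template + 0.2 * np.random.randn(phase.size)
denoised = llmmse_denoise(observed)
```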

  5. Novel wavelet threshold denoising method in axle press-fit zone ultrasonic detection

    NASA Astrophysics Data System (ADS)

    Peng, Chaoyong; Gao, Xiaorong; Peng, Jianping; Wang, Ai

    2017-02-01

    Axles are important part of railway locomotives and vehicles. Periodic ultrasonic inspection of axles can effectively detect and monitor axle fatigue cracks. However, in the axle press-fit zone, the complex interface contact condition reduces the signal-noise ratio (SNR). Therefore, the probability of false positives and false negatives increases. In this work, a novel wavelet threshold function is created to remove noise and suppress press-fit interface echoes in axle ultrasonic defect detection. The novel wavelet threshold function with two variables is designed to ensure the precision of optimum searching process. Based on the positive correlation between the correlation coefficient and SNR and with the experiment phenomenon that the defect and the press-fit interface echo have different axle-circumferential correlation characteristics, a discrete optimum searching process for two undetermined variables in novel wavelet threshold function is conducted. The performance of the proposed method is assessed by comparing it with traditional threshold methods using real data. The statistic results of the amplitude and the peak SNR of defect echoes show that the proposed wavelet threshold denoising method not only maintains the amplitude of defect echoes but also has a higher peak SNR.

  6. Single-trial evoked potentials study by combining wavelet denoising and principal component analysis methods.

    PubMed

    Zou, Ling; Zhang, Yingchun; Yang, Laurence T; Zhou, Renlai

    2010-02-01

    The authors have developed a new approach by combining the wavelet denoising and principal component analysis methods to reduce the number of required trials for efficient extraction of brain evoked-related potentials (ERPs). Evoked-related potentials were initially extracted using wavelet denoising to enhance the signal-to-noise ratio of raw EEG measurements. Principal components of ERPs accounting for 80% of the total variance were extracted as part of the subspace of the ERPs. Finally, the ERPs were reconstructed from the selected principal components. Computer simulation results showed that the combined approach provided estimations with higher signal-to-noise ratio and lower root mean squared error than each of them alone. The authors further tested this proposed approach in single-trial ERPs extraction during an emotional process and brain responses analysis to emotional stimuli. The experimental results also demonstrated the effectiveness of this combined approach in ERPs extraction and further supported the view that emotional stimuli are processed more intensely.
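
    A hedged sketch of the combined pipeline: per-trial wavelet denoising followed by PCA across trials keeping components that explain 80% of the variance; the wavelet, threshold and synthetic evoked response are placeholders.

```python
# Wavelet denoising of each trial, then PCA-based subspace reconstruction.
import numpy as np
import pywt
from sklearn.decomposition import PCA

def denoise_trial(trial, wavelet="db4", level=5):
    coeffs = pywt.wavedec(trial, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(trial)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(trial)]

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 512)
erp = np.exp(-((t - 0.3) / 0.05) ** 2)                   # idealised evoked response
trials = np.array([erp + rng.normal(scale=1.0, size=t.size) for _ in range(60)])

denoised_trials = np.array([denoise_trial(tr) for tr in trials])
pca = PCA(n_components=0.80)                              # keep 80% of total variance
scores = pca.fit_transform(denoised_trials)
reconstructed = pca.inverse_transform(scores)             # ERP-subspace estimate
single_trial_erp = reconstructed.mean(axis=0)
```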

  7. Ultrasound imaging and characterization of biofilms based on wavelet de-noised radiofrequency data.

    PubMed

    Vaidya, Kunal; Osgood, Robert; Ren, Dabin; Pichichero, Michael E; Helguera, María

    2014-03-01

    The ability to non-invasively image and characterize bacterial biofilms in children during nasopharyngeal colonization with potential otopathogens and during acute otitis media would represent a significant advance. We sought to determine if quantitative high-frequency ultrasound techniques could be used to achieve that goal. Systematic time studies of bacterial biofilm formation were performed on three preparations of an isolated Haemophilus influenzae (NTHi) strain, a Streptococcus pneumoniae (Sp) strain and a combination of H. influenzae and S. pneumoniae (NTHi + Sp) in an in vitro environment. The process of characterization included conditioning of the acquired radiofrequency data obtained with a 15-MHz focused, piston transducer by using a seven-level wavelet decomposition scheme to de-noise the individual A-lines acquired. All subsequent spectral parameter estimations were done on the wavelet de-noised radiofrequency data. Various spectral parameters (peak frequency shift, bandwidth reduction and integrated backscatter coefficient) were recorded. These parameters were successfully used to map the progression of the biofilms in time and to differentiate between single- and multiple-species biofilms. Results were compared with those for confocal microscopy and theoretical evaluation of form factor. We conclude that high-frequency ultrasound may prove a useful modality to detect and characterize bacterial biofilms in humans as they form on tissues and plastic materials.

  8. Regression model-based predictions of diel, diurnal and nocturnal dissolved oxygen dynamics after wavelet denoising of noisy time series

    NASA Astrophysics Data System (ADS)

    Evrendilek, F.; Karakaya, N.

    2014-06-01

    Continuous time-series measurements of diel dissolved oxygen (DO) through online sensors are vital to better understanding and management of the metabolism of lake ecosystems, but are prone to noise. Discrete wavelet transforms (DWT) with the orthogonal Symmlet and the semiorthogonal Chui-Wang B-spline were compared for denoising diel, daytime and nighttime dynamics of DO, water temperature, pH, and chlorophyll-a. Predictive efficacies of multiple non-linear regression (MNLR) models of DO dynamics were evaluated with or without DWT denoising of either the response variable alone or of all the response and explanatory variables. The combined use of B-spline-based denoising of all the variables and of temporally partitioned data improved the predictive power and reduced the errors of the MNLR models more than the use of Symmlet DWT denoising of DO only or of all the variables, with or without the temporal partitioning.
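
    As a rough illustration of the denoising step, assuming PyWavelets, with 'bior3.5' standing in for the semiorthogonal Chui-Wang B-spline (which is not shipped with the package) and a hypothetical input file:

        import numpy as np
        import pywt

        def dwt_denoise(x, wavelet, level=5):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest details
            thr = sigma * np.sqrt(2.0 * np.log(x.size))
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:x.size]

        do = np.loadtxt('do_diel_series.txt')        # hypothetical noisy dissolved-oxygen series
        do_symmlet = dwt_denoise(do, 'sym8')         # orthogonal Symmlet
        do_bspline = dwt_denoise(do, 'bior3.5')      # biorthogonal spline as a stand-in for the B-spline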

  9. Retinal optical coherence tomography image enhancement via shrinkage denoising using double-density dual-tree complex wavelet transform.

    PubMed

    Chitchian, Shahab; Mayer, Markus A; Boretsky, Adam R; van Kuijk, Frederik J; Motamedi, Massoud

    2012-11-01

    Image enhancement of retinal structures in optical coherence tomography (OCT) scans through denoising has the potential to aid in the diagnosis of several eye diseases. In this paper, a locally adaptive denoising algorithm using the double-density dual-tree complex wavelet transform, a combination of the double-density wavelet transform and the dual-tree complex wavelet transform, is applied to reduce speckle noise in OCT images of the retina. The algorithm overcomes the limitations of the commonly used multiple-frame averaging technique, namely the limited number of frames that can be recorded due to eye movements, by providing comparable image quality in roughly an order of magnitude less acquisition time than the averaging method. In addition, improvements in image quality metrics and a 5 dB increase in the signal-to-noise ratio are attained.

  10. Retinal optical coherence tomography image enhancement via shrinkage denoising using double-density dual-tree complex wavelet transform

    PubMed Central

    Mayer, Markus A.; Boretsky, Adam R.; van Kuijk, Frederik J.; Motamedi, Massoud

    2012-01-01

    Image enhancement of retinal structures in optical coherence tomography (OCT) scans through denoising has the potential to aid in the diagnosis of several eye diseases. In this paper, a locally adaptive denoising algorithm using the double-density dual-tree complex wavelet transform, a combination of the double-density wavelet transform and the dual-tree complex wavelet transform, is applied to reduce speckle noise in OCT images of the retina. The algorithm overcomes the limitations of the commonly used multiple-frame averaging technique, namely the limited number of frames that can be recorded due to eye movements, by providing comparable image quality in roughly an order of magnitude less acquisition time than the averaging method. In addition, improvements in image quality metrics and a 5 dB increase in the signal-to-noise ratio are attained. PMID:23117804

  11. The application of wavelet shrinkage denoising to magnetic Barkhausen noise measurements

    SciTech Connect

    Thomas, James

    2014-02-18

    The application of Magnetic Barkhausen Noise (MBN) as a non-destructive method of defect detection has proliferated throughout the manufacturing community. Instrument technology and measurement methodology have matured commensurately as applications have moved from the R and D labs to the fully automated manufacturing environment. These new applications present a new set of challenges, including a bevy of error sources. A significant obstacle in many industrial applications is a decrease in signal-to-noise ratio due to (i) environmental EMI and (ii) compromises in sensor design for the purposes of automation. The stochastic nature of MBN presents a challenge to any method of noise reduction. An application of wavelet shrinkage denoising is proposed as a method of decreasing extraneous noise in MBN measurements. The method is tested and yields marked improvement on measurements subject to EMI, grounding noise, and even measurements made under ideal conditions.

  12. A cross-correlation based fiber optic white-light interferometry with wavelet transform denoising

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Jiang, Yi; Ding, Wenhui; Gao, Ran

    2013-09-01

    A fiber optic white-light interferometry method based on cross-correlation calculation is presented. The detected white-light spectrum signal of a fiber optic extrinsic Fabry-Perot interferometric (EFPI) sensor is first decomposed by the discrete wavelet transform for denoising before the cavity length of the EFPI sensor is interrogated. In the measurement experiment, a cross-correlation algorithm with multiple-level calculations is performed both to achieve high measurement resolution and to improve the efficiency of the measurement. The experimental results show that the variation range of the measurement results was 1.265 nm, and the standard deviation reached 0.375 nm when an EFPI sensor with a cavity length of 1500 μm was interrogated.

  13. A blind detection scheme based on modified wavelet denoising algorithm for wireless optical communications

    NASA Astrophysics Data System (ADS)

    Li, Ruijie; Dang, Anhong

    2015-10-01

    This paper investigates a detection scheme without channel state information for wireless optical communication (WOC) systems in a turbulence-induced fading channel. The proposed scheme can effectively diminish the additive noise caused by background radiation and the photodetector, as well as the intensity scintillation caused by turbulence. The additive noise can be mitigated significantly using the modified wavelet threshold denoising algorithm, and the intensity scintillation can then be attenuated by exploiting the temporal correlation of the WOC channel. Moreover, to improve the performance beyond that of the maximum likelihood decision, the maximum a posteriori probability (MAP) criterion is considered. Compared with a conventional blind detection algorithm, simulation results show that the proposed detection scheme can improve the signal-to-noise ratio (SNR) performance by about 4.38 dB when the bit error rate and scintillation index (SI) are 1×10⁻⁶ and 0.02, respectively.

  14. Enhancing P300 Wave of BCI Systems Via Negentropy in Adaptive Wavelet Denoising.

    PubMed

    Vahabi, Z; Amirfattahi, R; Mirzaei, Ar

    2011-07-01

    Brain Computer Interface (BCI) is a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting or repairing human cognitive or sensory-motor functions. Separating EEG trials into target and non-target ones based on the presence of the P300 signal is a difficult task, mainly due to their naturally low signal-to-noise ratio. In this paper a new algorithm is introduced to enhance EEG signals and improve their SNR. Our denoising method is based on multi-resolution analysis via Independent Component Analysis (ICA) fundamentals. We suggest combining negentropy, as a signal feature, with subband information from the wavelet transform. The proposed method is tested on the dataset from BCI Competition 2003 and gives results that compare favorably.

  15. Non parametric denoising methods based on wavelets: Application to electron microscopy images in low exposure time

    SciTech Connect

    Soumia, Sid Ahmed; Messali, Zoubeida; Ouahabi, Abdeldjalil; Trepout, Sylvain; Messaoudi, Cedric; Marco, Sergio

    2015-01-13

    The 3D reconstruction of Cryo-Transmission Electron Microscopy (Cryo-TEM) and Energy Filtering TEM (EFTEM) images is hampered by the noisy nature of these images, which makes their alignment difficult. This noise arises from the interaction between the frozen hydrated biological samples and the electron beam when the specimen is exposed to radiation at a high exposure time. This sensitivity to the electron beam has led specialists to acquire the specimen projection images at very low exposure times, which results in a new problem: an extremely low signal-to-noise ratio (SNR). This paper investigates the problem of denoising TEM images acquired at very low exposure time. Our main objective is to enhance the quality of TEM images to improve the alignment process, which will in turn improve the three-dimensional tomography reconstructions. We have run multiple tests on TEM images acquired at different exposure times of 0.5 s, 0.2 s, 0.1 s and 1 s (i.e. with different values of SNR) and containing gold beads to help in the assessment step. We herein propose a framework to combine multiple noisy copies of the TEM images. The framework is based on four different denoising methods: soft and hard wavelet thresholding, the bilateral filter as a non-linear technique able to preserve edges, and a Bayesian approach in the wavelet domain, in which context modeling is used to estimate the parameter for each coefficient. To ensure a high signal-to-noise ratio, we verified that we were using the appropriate wavelet family at the appropriate level, and chose the "sym8" wavelet at level 3 as the most appropriate setting. For the bilateral filtering, many tests were done in order to determine the proper filter parameters, represented by the size of the filter, the range parameter and the

  16. Non parametric denoising methods based on wavelets: Application to electron microscopy images in low exposure time

    NASA Astrophysics Data System (ADS)

    Soumia, Sid Ahmed; Messali, Zoubeida; Ouahabi, Abdeldjalil; Trepout, Sylvain; Messaoudi, Cedric; Marco, Sergio

    2015-01-01

    The 3D reconstruction of Cryo-Transmission Electron Microscopy (Cryo-TEM) and Energy Filtering TEM (EFTEM) images is hampered by the noisy nature of these images, which makes their alignment difficult. This noise arises from the interaction between the frozen hydrated biological samples and the electron beam when the specimen is exposed to radiation at a high exposure time. This sensitivity to the electron beam has led specialists to acquire the specimen projection images at very low exposure times, which results in a new problem: an extremely low signal-to-noise ratio (SNR). This paper investigates the problem of denoising TEM images acquired at very low exposure time. Our main objective is to enhance the quality of TEM images to improve the alignment process, which will in turn improve the three dimensional tomography reconstructions. We have run multiple tests on TEM images acquired at different exposure times of 0.5 s, 0.2 s, 0.1 s and 1 s (i.e. with different values of SNR) and containing gold beads to help in the assessment step. We herein propose a framework to combine multiple noisy copies of the TEM images. The framework is based on four different denoising methods: soft and hard wavelet thresholding, the bilateral filter as a non-linear technique able to preserve edges, and a Bayesian approach in the wavelet domain, in which context modeling is used to estimate the parameter for each coefficient. To ensure a high signal-to-noise ratio, we verified that we were using the appropriate wavelet family at the appropriate level, and chose the "sym8" wavelet at level 3 as the most appropriate setting. For the bilateral filtering, many tests were done in order to determine the proper filter parameters, represented by the size of the filter, the range parameter and the

  17. Enhancement of signal denoising and multiple fault signatures detecting in rotating machinery using dual-tree complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Wang, Yanxue; He, Zhengjia; Zi, Yanyang

    2010-01-01

    In order to enhance the desired features related to a particular type of machine fault, a technique based on the dual-tree complex wavelet transform (DTCWT) is proposed in this paper. Numerical simulations demonstrate that the DTCWT enjoys better shift invariance and reduced spectral aliasing than the second-generation wavelet transform (SGWT) and empirical mode decomposition. These advantages of the DTCWT arise from the relationship between its two dual-tree wavelet basis functions, instead of the matching of a single wavelet basis function to the signal being analyzed. Since noise inevitably exists in the measured signals, an enhanced vibration-signal denoising algorithm incorporating DTCWT with NeighCoeff shrinkage is also developed. Denoising results for vibration signals from a cracked gear indicate that the proposed denoising method can effectively remove noise and retain the valuable information as much as possible, compared with the DWT- and SGWT-based NeighCoeff shrinkage denoising methods. As is well known, extraction of comprehensive signatures embedded in the vibration signals is of practical importance to clearly identify the roots of the fault, especially for combined faults. In the case of multiple-feature detection, diagnosis results for rolling element bearings with combined faults and for actual industrial equipment confirm that the proposed DTCWT-based method is a powerful and versatile tool and consistently outperforms SGWT and the fast kurtogram, which are widely used recently. Moreover, the proposed method is well suited for on-line surveillance and diagnosis due to its good robustness and efficient algorithm.
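
    A minimal sketch of DTCWT denoising with NeighCoeff-style neighbouring-coefficient shrinkage, assuming the third-party Python package dtcwt; the number of levels, the NeighShrink-style threshold and the input file are illustrative assumptions, not the authors' implementation:

        import numpy as np
        import dtcwt                                  # third-party dual-tree complex wavelet package

        def neigh_shrink(band, sigma):
            # shrink each complex coefficient using the energy of a 3-sample neighbourhood
            mag2 = np.abs(band) ** 2
            s2 = mag2 + np.roll(mag2, 1, axis=0) + np.roll(mag2, -1, axis=0)
            lam2 = 2.0 * sigma ** 2 * np.log(band.shape[0])          # universal threshold, squared
            return band * np.maximum(0.0, 1.0 - lam2 / np.maximum(s2, 1e-12))

        x = np.loadtxt('gear_vibration.txt')          # hypothetical vibration signal
        t1d = dtcwt.Transform1d()
        pyr = t1d.forward(x, nlevels=5)

        sigma = np.median(np.abs(pyr.highpasses[0])) / 0.6745        # rough noise level from finest subband
        shrunk = tuple(neigh_shrink(h, sigma) for h in pyr.highpasses)
        denoised = t1d.inverse(dtcwt.Pyramid(pyr.lowpass, shrunk)).ravel()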

  18. Denoising of magnetotelluric signals by polarization analysis in the discrete wavelet domain

    NASA Astrophysics Data System (ADS)

    Carbonari, R.; D'Auria, L.; Di Maio, R.; Petrillo, Z.

    2017-03-01

    Magnetotellurics (MT) is one of the prominent geophysical methods for deep underground exploration and is thus appropriate for applications in petroleum and geothermal research. However, it is not completely reliable when applied in areas characterized by intense urbanization, as the presence of cultural noise may significantly affect the MT impedance tensor estimates and, consequently, the apparent resistivity values that describe the electrical behaviour of the investigated buried structures. The development of denoising techniques for MT data is thus one of the main objectives in making magnetotellurics reliable even in urban or industrialized environments. In this work we propose an algorithm for filtering MT data affected by temporally localized noise. It exploits the discrete wavelet transform (DWT), which, by operating in both the time and frequency domains, allows detection of transient components of the MT signal, likely due to disturbances of anthropic nature. The implemented filter relies on an estimate of the ellipticity of the polarized MT wave. The application of the filter to synthetic and field MT data has proven its ability to detect and remove cultural noise, thus providing smoother apparent resistivity curves than those obtained using the raw signals.

  19. A new structure of 3D dual-tree discrete wavelet transforms and applications to video denoising and coding

    NASA Astrophysics Data System (ADS)

    Shi, Fei; Wang, Beibei; Selesnick, Ivan W.; Wang, Yao

    2006-01-01

    This paper introduces an anisotropic decomposition structure for the recently introduced 3-D dual-tree discrete wavelet transform (DDWT), and explores its applications to video denoising and coding. The 3-D DDWT is an attractive video representation because it isolates motion along different directions in separate subbands, and thus leads to sparse video decompositions. Our previous investigation shows that the 3-D DDWT, compared to the standard discrete wavelet transform (DWT), complies better with statistical models based on sparsity assumptions, and gives better visual and numerical results when used in statistical denoising algorithms. Our research on video compression also shows that even with 4:1 redundancy, the 3-D DDWT needs fewer coefficients to achieve the same coding quality (in PSNR) when the iterative projection-based noise shaping scheme proposed by Kingsbury is applied. The proposed anisotropic DDWT extends the superiority of the isotropic DDWT with more directional subbands without adding to the redundancy. Unlike the original 3-D DDWT, which applies dyadic decomposition along all three directions and produces isotropic frequency spacing, it has a non-uniform tiling of the frequency space. By applying this structure, we can improve the denoising results, and the number of significant coefficients can be reduced further, which is beneficial for video coding.

  20. A hybrid fault diagnosis method based on second generation wavelet de-noising and local mean decomposition for rotating machinery.

    PubMed

    Liu, Zhiwen; He, Zhengjia; Guo, Wei; Tang, Zhangchun

    2016-03-01

    In order to extract fault features of large-scale power equipment from strong background noise, a hybrid fault diagnosis method based on second-generation wavelet de-noising (SGWD) and local mean decomposition (LMD) is proposed in this paper. In this method, a de-noising algorithm based on the second-generation wavelet transform (SGWT) using neighboring coefficients is employed as a pretreatment to remove noise from rotating machinery vibration signals, by virtue of its good effect in enhancing the signal-to-noise ratio (SNR). Then, the LMD method is used to decompose the de-noised signals into several product functions (PFs). The PF corresponding to the faulty feature signal is selected according to the correlation coefficient criterion. Finally, the frequency spectrum is analyzed by applying the FFT to the selected PF. The proposed method is applied to analyze vibration signals collected from an experimental gearbox and a real locomotive rolling bearing. The results demonstrate that the proposed method has better performance, such as higher SNR and faster convergence, than the normal LMD method.

  1. A Neuro-Fuzzy Inference System Combining Wavelet Denoising, Principal Component Analysis, and Sequential Probability Ratio Test for Sensor Monitoring

    SciTech Connect

    Na, Man Gyun; Oh, Seungrohk

    2002-11-15

    A neuro-fuzzy inference system combined with the wavelet denoising, principal component analysis (PCA), and sequential probability ratio test (SPRT) methods has been developed to monitor a relevant sensor using the information of other sensors. The parameters of the neuro-fuzzy inference system that estimates the relevant sensor signal are optimized by a genetic algorithm and a least-squares algorithm. The wavelet denoising technique was applied to remove noise components from the input signals to the neuro-fuzzy system. By reducing the dimension of the input space of the neuro-fuzzy system without losing a significant amount of information, the PCA was used to reduce the time necessary to train the neuro-fuzzy system, simplify the structure of the neuro-fuzzy inference system, and ease the selection of the input signals to the neuro-fuzzy system. Using the residual signals between the estimated signals and the measured signals, the SPRT is applied to detect whether the sensors are degraded or not. The proposed sensor-monitoring algorithm was verified through applications to the pressurizer water level, the pressurizer pressure, and the hot-leg temperature sensors in pressurized water reactors.
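
    The SPRT stage on the residuals can be sketched as follows; the Gaussian residual model, the alternative mean m1, the error rates and the placeholder data are assumptions for illustration only:

        import numpy as np

        def sprt(residuals, sigma, m1, alpha=0.01, beta=0.01):
            """Sequential probability ratio test on sensor residuals.
            H0: residual mean = 0 (healthy); H1: residual mean = m1 (degraded)."""
            upper = np.log((1.0 - beta) / alpha)      # crossing this accepts H1 (degraded)
            lower = np.log(beta / (1.0 - alpha))      # crossing this accepts H0 (healthy)
            llr = 0.0
            for i, r in enumerate(residuals):
                llr += (m1 / sigma ** 2) * (r - m1 / 2.0)   # Gaussian log-likelihood ratio increment
                if llr >= upper:
                    return 'degraded', i
                if llr <= lower:
                    return 'healthy', i
            return 'undecided', len(residuals) - 1

        residuals = np.random.normal(0.0, 0.1, 500)   # placeholder: measured minus estimated sensor signal
        print(sprt(residuals, sigma=0.1, m1=0.3))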

  2. Single-trial extraction of cognitive evoked potentials by combination of third-order correlation and wavelet denoising.

    PubMed

    Zhang, Z; Tian, X

    2005-01-01

    The application of a recently proposed denoising implementation for obtaining cognitive evoked potentials (CEPs) at the single-trial level is shown. The aim of this investigation is to develop a technique for extracting CEPs by combining the third-order correlation and wavelet denoising methods. First, the noisy CEPs were passed through a finite impulse response filter whose impulse response is matched to the shape of the noise-free signal. It was shown that it is possible to estimate the filter impulse response on the basis of a selected third-order correlation slice (TOCS) of the input noisy CEPs. Second, the output from the third-order correlation filter is decomposed with bi-orthogonal splines at 5 levels, and the CEPs are reconstructed from the final wavelet approximation a5. We study its performance on simulated data as well as on cognitive evoked potentials of normal rats and Alzheimer's disease (AD) model rats. For the simulated data, the method gives a significantly better reconstruction of the single-trial cognitive evoked potential responses, and we obtain a significantly better estimation of the amplitudes and latencies of the simulated CEPs. For the real data, the method clearly improves the visualization of single-trial CEPs. This allows the calculation of better averages as well as the study of systematic or unsystematic variations between trials.

  3. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models

    PubMed Central

    2016-01-01

    The motivation behind this research is to innovatively combine new methods like wavelet, principal component analysis (PCA), and artificial neural network (ANN) approaches to analyze trade in today’s increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to deduct the same noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong’s Hang Seng futures, Japan’s NIKKEI 225 futures, Singapore’s MSCI futures, South Korea’s KOSPI 200 futures, and Taiwan’s TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis. PMID:27248692

  4. Imaging reconstruction based on improved wavelet denoising combined with parallel-beam filtered back-projection algorithm

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2012-11-01

    Image reconstruction is a key step in medical imaging (MI), and the performance of the reconstruction algorithm determines the quality and resolution of the reconstructed image. Although several algorithms have been used, the filtered back-projection (FBP) algorithm is still the classical and commonly used algorithm in clinical MI. In the FBP algorithm, filtering of the original projection data is a key step in overcoming artifacts in the reconstructed image. Since the simple use of classical filters, such as the Shepp-Logan (SL) and Ram-Lak (RL) filters, has some drawbacks and limitations in practice, especially for projection data polluted by non-stationary random noise, an improved wavelet denoising combined with the parallel-beam FBP algorithm is used in this paper to enhance the quality of the reconstructed image. In the experiments, the reconstruction results of the improved wavelet denoising were compared with those of other methods (direct FBP, mean-filter-combined FBP and median-filter-combined FBP). To determine the optimum reconstruction, different algorithms and different wavelet bases combined with three filters were tested. Experimental results show that the reconstruction quality of the improved FBP algorithm is better than that of the others. Comparing the results of the different algorithms based on two evaluation standards, i.e. mean-square error (MSE) and peak signal-to-noise ratio (PSNR), it was found that the reconstruction based on the db2 wavelet and the Hanning filter at decomposition scale 2 was best: its MSE was lower and its PSNR higher than the others. Therefore, this improved FBP algorithm has potential value in medical imaging.
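
    A rough Python analogue of the pipeline (wavelet denoising of the projections followed by filtered back-projection), assuming PyWavelets and scikit-image; the phantom, noise level and threshold rule are illustrative, though db2 and the Hann/Hanning filter at level 2 echo the settings reported above:

        import numpy as np
        import pywt
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        phantom = shepp_logan_phantom()
        theta = np.linspace(0.0, 180.0, max(phantom.shape), endpoint=False)
        sino = radon(phantom, theta=theta)
        noisy = sino + np.random.normal(0.0, 0.05 * sino.max(), sino.shape)

        # wavelet denoising of the noisy projections (db2, 2 levels, universal soft threshold)
        coeffs = pywt.wavedec2(noisy, 'db2', level=2)
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(noisy.size))
        coeffs = [coeffs[0]] + [tuple(pywt.threshold(c, thr, mode='soft') for c in d) for d in coeffs[1:]]
        sino_dn = pywt.waverec2(coeffs, 'db2')[:noisy.shape[0], :noisy.shape[1]]

        recon = iradon(sino_dn, theta=theta, filter_name='hann')     # parallel-beam FBP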

  5. Analysis of hydrological trend for radioactivity content in bore-hole water samples using wavelet based denoising.

    PubMed

    Paul, Sabyasachi; Suman, V; Sarkar, P K; Ranade, A K; Pulhani, V; Dafauti, S; Datta, D

    2013-08-01

    A wavelet-transform-based denoising methodology has been applied to detect the presence of any discernible trend in (137)Cs and (90)Sr activity levels in bore-hole water samples collected four times a year over a period of eight years, from 2002 to 2009, in the vicinity of typical nuclear facilities inside the restricted access zones. The conventional non-parametric methods, viz. Mann-Kendall and Spearman rho, along with linear regression, when applied to detect a linear trend in the time-series data, do not yield conclusive results at a confidence level of 95% for most of the samples. The stationary-wavelet-based hard-thresholding data pruning method, with Haar as the analyzing wavelet, was applied to remove the noise present in the same data. Results indicate that the confidence level of the established trend improves significantly after pre-processing, to more than 98%, compared with the conventional non-parametric methods applied to the direct measurements.
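
    The Mann-Kendall test before and after a Haar stationary-wavelet hard-threshold step can be sketched as below (PyWavelets and SciPy assumed; the file name, decomposition level and threshold are placeholders, and ties are ignored in the variance formula):

        import numpy as np
        import pywt
        from scipy.stats import norm

        def mann_kendall(x):
            # Mann-Kendall S statistic and two-sided p-value (ties ignored)
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var) if s != 0 else 0.0
            return s, 2.0 * norm.sf(abs(z))

        activity = np.loadtxt('cs137_borehole.txt')      # hypothetical 32 quarterly values
        # stationary wavelet (Haar) hard-threshold denoising; length must be a multiple of 2**level
        coeffs = pywt.swt(activity, 'haar', level=2)
        thr = np.median(np.abs(coeffs[-1][1])) / 0.6745 * np.sqrt(2 * np.log(activity.size))
        coeffs = [(ca, pywt.threshold(cd, thr, mode='hard')) for ca, cd in coeffs]
        smoothed = pywt.iswt(coeffs, 'haar')

        print('raw     :', mann_kendall(activity))
        print('denoised:', mann_kendall(smoothed))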

  6. Wavelet-based denoising of the Fourier metric in real-time wavefront correction for single molecule localization microscopy

    NASA Astrophysics Data System (ADS)

    Tehrani, Kayvan Forouhesh; Mortensen, Luke J.; Kner, Peter

    2016-03-01

    Wavefront sensorless schemes for correction of aberrations induced by biological specimens require a time invariant property of an image as a measure of fitness. Image intensity cannot be used as a metric for Single Molecule Localization (SML) microscopy because the intensity of blinking fluorophores follows exponential statistics. Therefore a robust intensity-independent metric is required. We previously reported a Fourier Metric (FM) that is relatively intensity independent. The Fourier metric has been successfully tested on two machine learning algorithms, a Genetic Algorithm and Particle Swarm Optimization, for wavefront correction about 50 μm deep inside the Central Nervous System (CNS) of Drosophila. However, since the spatial frequencies that need to be optimized fall into regions of the Optical Transfer Function (OTF) that are more susceptible to noise, adding a level of denoising can improve performance. Here we present wavelet-based approaches to lower the noise level and produce a more consistent metric. We compare performance of different wavelets such as Daubechies, Bi-Orthogonal, and reverse Bi-orthogonal of different degrees and orders for pre-processing of images.

  7. Classical low-pass filter and real-time wavelet-based denoising technique implemented on a DSP: a comparison study

    NASA Astrophysics Data System (ADS)

    Dolabdjian, Ch.; Fadili, J.; Huertas Leyva, E.

    2002-11-01

    We have implemented a real-time numerical denoising algorithm, using the Discrete Wavelet Transform (DWT), on a TMS320C3x Digital Signal Processor (DSP). We also compared, from theoretical and practical viewpoints, this post-processing approach to a more classical low-pass filter. This comparison was carried out using an ECG-type signal (ElectroCardiogram). The denoising approach is an elegant and extremely fast alternative to the class of classical linear filters. It is particularly adapted to non-stationary signals such as those encountered in biological applications. The denoising substantially improves detection of such signals compared with Fourier-based techniques. This processing step is a vital element in our acquisition chain using high-sensitivity magnetic sensors. It should enhance detection of cardiac-type magnetic signals or of magnetic particles in movement.
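
    The comparison described above can be reproduced in outline with SciPy and PyWavelets on a synthetic ECG-like pulse train; the sampling rate, cut-off, wavelet and noise level are arbitrary illustrative choices:

        import numpy as np
        import pywt
        from scipy import signal

        fs = 360.0
        t = np.arange(0, 10, 1 / fs)
        clean = np.zeros_like(t)
        clean[::int(fs)] = 1.0                                   # crude 1 Hz "QRS-like" impulses
        clean = np.convolve(clean, signal.windows.gaussian(25, 3), mode='same')
        noisy = clean + 0.1 * np.random.randn(t.size)

        # (a) classical 4th-order Butterworth low-pass at 40 Hz
        b, a = signal.butter(4, 40.0, btype='low', fs=fs)
        lp = signal.filtfilt(b, a, noisy)

        # (b) DWT soft-threshold denoising (sym6, 5 levels, universal threshold)
        coeffs = pywt.wavedec(noisy, 'sym6', level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(noisy.size))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        wav = pywt.waverec(coeffs, 'sym6')[:noisy.size]

        for name, est in [('low-pass', lp), ('wavelet', wav)]:
            print(name, 'RMSE:', np.sqrt(np.mean((est - clean) ** 2)))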

  8. A wavelet-based estimator of the degrees of freedom in denoised fMRI time series for probabilistic testing of functional connectivity and brain graphs.

    PubMed

    Patel, Ameera X; Bullmore, Edward T

    2016-11-15

    Connectome mapping using techniques such as functional magnetic resonance imaging (fMRI) has become a focus of systems neuroscience. There remain many statistical challenges in analysis of functional connectivity and network architecture from BOLD fMRI multivariate time series. One key statistic for any time series is its (effective) degrees of freedom, df, which will generally be less than the number of time points (or nominal degrees of freedom, N). If we know the df, then probabilistic inference on other fMRI statistics, such as the correlation between two voxel or regional time series, is feasible. However, we currently lack good estimators of df in fMRI time series, especially after the degrees of freedom of the "raw" data have been modified substantially by denoising algorithms for head movement. Here, we used a wavelet-based method both to denoise fMRI data and to estimate the (effective) df of the denoised process. We show that seed voxel correlations corrected for locally variable df could be tested for false positive connectivity with better control over Type I error and greater specificity of anatomical mapping than probabilistic connectivity maps using the nominal degrees of freedom. We also show that wavelet despiked statistics can be used to estimate all pairwise correlations between a set of regional nodes, assign a P value to each edge, and then iteratively add edges to the graph in order of increasing P. These probabilistically thresholded graphs are likely more robust to regional variation in head movement effects than comparable graphs constructed by thresholding correlations. Finally, we show that time-windowed estimates of df can be used for probabilistic connectivity testing or dynamic network analysis so that apparent changes in the functional connectome are appropriately corrected for the effects of transient noise bursts. Wavelet despiking is both an algorithm for fMRI time series denoising and an estimator of the (effective) df of denoised

  9. Steady-state sweep visual evoked potential processing denoised by wavelet transform

    NASA Astrophysics Data System (ADS)

    Weiderpass, Heinar A.; Yamamoto, Jorge F.; Salomão, Solange R.; Berezovsky, Adriana; Pereira, Josenilson M.; Sacai, Paula Y.; de Oliveira, José P.; Costa, Marcio A.; Burattini, Marcelo N.

    2008-03-01

    Visual evoked potential (VEP) is a very small electrical signal originating in the visual cortex in response to periodic visual stimulation. Sweep-VEP is a modified VEP procedure used to measure grating visual acuity in non-verbal and preverbal patients. This biopotential is buried in a large amount of electroencephalographic (EEG) noise and movement-related artifact. The signal-to-noise ratio (SNR) plays a dominant role in determining both systematic and statistical errors. The purpose of this study is to present a method based on the wavelet transform technique for filtering and extracting the steady-state sweep-VEP. Counter-phase sine-wave luminance gratings modulated at 6 Hz were used as stimuli to determine sweep-VEP grating acuity thresholds. The amplitude and phase of the second-harmonic (12 Hz) pattern-reversal response were analyzed using the fast Fourier transform after the wavelet filtering. The wavelet transform method was used to decompose the VEP signal into wavelet coefficients by a discrete wavelet analysis and to determine which coefficients carry significant activity at the corresponding frequency. In a subsequent step only significant coefficients were retained and the remaining ones were set to zero, allowing reconstruction of the VEP signal. This procedure filtered out other frequencies that were considered noise. Numerical simulations and analyses of human VEP data showed that this method provides a higher SNR than the classical recursive least squares (RLS) method. An additional advantage was a more appropriate phase analysis, showing more realistic second-harmonic amplitude values during phase breaks.
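
    A simplified sketch of the processing chain (wavelet coefficient selection followed by FFT extraction of the 12 Hz second harmonic), assuming PyWavelets; the sampling rate, wavelet and the thresholding rule used here to decide which coefficients are "significant" are placeholders, since the record does not state the authors' significance criterion:

        import numpy as np
        import pywt

        fs = 600.0                               # assumed EEG sampling rate
        f_stim = 6.0                             # stimulus reversal frequency
        eeg = np.loadtxt('sweep_vep_epoch.txt')  # hypothetical recorded epoch

        # keep only wavelet coefficients with magnitude above a noise-based threshold; zero the rest
        coeffs = pywt.wavedec(eeg, 'db4', level=6)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(eeg.size))
        coeffs = [coeffs[0]] + [np.where(np.abs(c) > thr, c, 0.0) for c in coeffs[1:]]
        vep = pywt.waverec(coeffs, 'db4')[:eeg.size]

        # second-harmonic (12 Hz) amplitude and phase from the FFT of the filtered signal
        spec = np.fft.rfft(vep) / vep.size
        freqs = np.fft.rfftfreq(vep.size, d=1 / fs)
        k = np.argmin(np.abs(freqs - 2 * f_stim))
        amp, phase = 2 * np.abs(spec[k]), np.angle(spec[k])
        print(f'2F response: amplitude={amp:.3g}, phase={phase:.3g} rad')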

  10. Application of a Multidimensional Wavelet Denoising Algorithm for the Detection and Characterization of Astrophysical Sources of Gamma Rays

    SciTech Connect

    Digel, S.W.; Zhang, B.; Chiang, J.; Fadili, J.M.; Starck, J.-L.; /Saclay /Stanford U., Statistics Dept.

    2005-12-02

    Zhang, Fadili, & Starck have recently developed a denoising procedure for Poisson data that offers advantages over other methods of intensity estimation in multiple dimensions. Their procedure, which is nonparametric, is based on thresholding wavelet coefficients. The restoration algorithm applied after thresholding provides good conservation of source flux. We present an investigation of the procedure of Zhang et al. for the detection and characterization of astrophysical sources of high-energy gamma rays, using realistic simulated observations with the Large Area Telescope (LAT). The LAT is to be launched in late 2007 on the Gamma-ray Large Area Space Telescope mission. Source detection in the LAT data is complicated by the low fluxes of point sources relative to the diffuse celestial background, the limited angular resolution, and the tremendous variation of that resolution with energy (from tens of degrees at ~30 MeV to 0.1° at 10 GeV). The algorithm is very fast relative to traditional likelihood model fitting, and permits immediate estimation of spectral properties. Astrophysical sources of gamma rays, especially active galaxies, are typically quite variable, and our current work may lead to a reliable method to quickly characterize the flaring properties of newly-detected sources.

  11. Rolling element bearing defect diagnosis under variable speed operation through angle synchronous averaging of wavelet de-noised estimate

    NASA Astrophysics Data System (ADS)

    Mishra, C.; Samantaray, A. K.; Chakraborty, G.

    2016-05-01

    Rolling element bearings are widely used in rotating machines and their faults can lead to excessive vibration levels and/or complete seizure of the machine. Under special operating conditions such as non-uniform or low speed shaft rotation, the available fault diagnosis methods cannot be applied for bearing fault diagnosis with full confidence. Fault symptoms in such operating conditions cannot be easily extracted through usual measurement and signal processing techniques. A typical example is a bearing in heavy rolling mill with variable load and disturbance from other sources. In extremely slow speed operation, variation in speed due to speed controller transients or external disturbances (e.g., varying load) can be relatively high. To account for speed variation, instantaneous angular position instead of time is used as the base variable of signals for signal processing purposes. Even with time synchronous averaging (TSA) and well-established methods like envelope order analysis, rolling element faults in rolling element bearings cannot be easily identified during such operating conditions. In this article we propose to use order tracking on the envelope of the wavelet de-noised estimate of the short-duration angle synchronous averaged signal to diagnose faults in rolling element bearing operating under the stated special conditions. The proposed four-stage sequential signal processing method eliminates uncorrelated content, avoids signal smearing and exposes only the fault frequencies and its harmonics in the spectrum. We use experimental data

  12. Evaluation of Wavelet and Non-Local Mean Denoising of Terrestrial Laser Scanning Data for Small-Scale Joint Roughness Estimation

    NASA Astrophysics Data System (ADS)

    Bitenc, M.; Kieffer, D. S.; Khoshelham, K.

    2016-06-01

    Terrestrial Laser Scanning (TLS) is a well-known remote sensing tool that enables precise 3D acquisition of surface morphology from distances of a few meters to a few kilometres. The morphological representations obtained are important in engineering geology and rock mechanics, where surface morphology details are of particular interest in rock stability problems and engineering construction. The actual size of the discernible surface detail depends on the instrument range error (noise effect) and the effective data resolution (smoothing effect). Range error can be (partly) removed by applying a denoising method. Based on the positive results from previous studies, two denoising methods, namely the 2D wavelet transform (WT) and the non-local mean (NLM), are tested here, with the goal of obtaining roughness estimations that are suitable in the context of rock engineering practice. Both methods are applied in two variants: conventional Discrete WT (DWT) and Stationary WT (SWT), classic NLM (NLM) and probabilistic NLM (PNLM). The noise effect and denoising performance are studied in relation to the TLS effective data resolution. Analyses are performed on reference data acquired by a highly precise Advanced TOpometric Sensor (ATOS) on a 20x30 cm rock joint sample. The roughness ratio is computed by comparing the noisy and denoised surfaces to the original ATOS surface. The roughness ratio indicates the success of all denoising methods. In addition, it shows that SWT oversmooths the surface and that the performance of the DWT, NLM and PNLM varies with the noise level and data resolution.

  13. Improving the performance of the prony method using a wavelet domain filter for MRI denoising.

    PubMed

    Jaramillo, Rodney; Lentini, Marianela; Paluszny, Marco

    2014-01-01

    The Prony methods are used for exponential fitting. We use a variant of the Prony method for abnormal brain tissue detection in sequences of T2-weighted magnetic resonance images. Here, MR images are considered to be affected only by Rician noise, and a new wavelet domain bilateral filtering process is implemented to reduce the noise in the images. This filter is a modification of Kazubek's algorithm, and we use synthetic images to show the ability of the new procedure to suppress noise and compare its performance with respect to the original filter, using quantitative and qualitative criteria. The tissue classification process is illustrated using a real sequence of T2 MR images, and the filter is applied to each image before using the variant of the Prony method.

  14. Wavelet De-noising of GNSS Based Bridge Health Monitoring Data

    NASA Astrophysics Data System (ADS)

    Ogundipe, Oluropo; Lee, Jae Kang; Roberts, Gethin Wyn

    2014-11-01

    GNSS signal multipath occurs when the GNSS signal reflects off objects in the antenna environment and arrives at the antenna via multiple paths. A bridge environment is one that is prone to multipath, with the bridge structure as well as passing vehicles providing static and dynamic sources of multipath. In this paper, the Wavelet Transform (WT) is applied to bridge data collected on the Machang cable-stayed bridge in Korea. The WT algorithm was applied to the GNSS-derived bridge deflection data at the mid-span. Up to 41% improvement in RMS was observed after wavelet shrinkage de-noising was applied. Application of this algorithm to the torsion data also showed significant improvement, with the residual average and RMS decreased by 40% and 45%, respectively. This method enabled the generation of more accurate information for bridge health monitoring systems in terms of the analysis of frequency, mode shape and three-dimensional deflections.

  15. Improving the Performance of the Prony Method Using a Wavelet Domain Filter for MRI Denoising

    PubMed Central

    Lentini, Marianela; Paluszny, Marco

    2014-01-01

    The Prony methods are used for exponential fitting. We use a variant of the Prony method for abnormal brain tissue detection in sequences of T2 weighted magnetic resonance images. Here, MR images are considered to be affected only by Rician noise, and a new wavelet domain bilateral filtering process is implemented to reduce the noise in the images. This filter is a modification of Kazubek's algorithm and we use synthetic images to show the ability of the new procedure to suppress noise and compare its performance with respect to the original filter, using quantitative and qualitative criteria. The tissue classification process is illustrated using a real sequence of T2 MR images, and the filter is applied to each image before using the variant of the Prony method. PMID:24834108

  16. A de-noising algorithm based on wavelet threshold-exponential adaptive window width-fitting for ground electrical source airborne transient electromagnetic signal

    NASA Astrophysics Data System (ADS)

    Ji, Yanju; Li, Dongsheng; Yu, Mingmei; Wang, Yuan; Wu, Qiong; Lin, Jun

    2016-05-01

    The ground electrical source airborne transient electromagnetic system (GREATEM) on an unmanned aircraft offers considerable prospecting depth, lateral resolution and detection efficiency. In recent years it has become an important technical means of rapid resource exploration. However, GREATEM data are extremely vulnerable to stationary white noise and non-stationary electromagnetic noise (sferics noise, aircraft engine noise and other human-made electromagnetic noise). These noises degrade the imaging quality for data interpretation. Based on the characteristics of the GREATEM data and the major noises, we propose a de-noising algorithm utilizing the wavelet threshold method and exponential adaptive window width-fitting. Firstly, the white noise in the measured data is filtered using the wavelet threshold method. Then, the data are segmented using a data window whose step length follows even logarithmic intervals. The data polluted by electromagnetic noise are identified within each window based on the discriminating principle of energy detection, and the attenuation characteristics of the data slope are extracted. Finally, an exponential fitting algorithm is adopted to fit the attenuation curve of each window, and the data polluted by non-stationary electromagnetic noise are replaced with their fitting results. Thus the non-stationary electromagnetic noise can be effectively removed. The proposed algorithm is verified on synthetic and real GREATEM signals. The results show that in the GREATEM signal, stationary white noise and non-stationary electromagnetic noise can be effectively filtered using the wavelet threshold-exponential adaptive window width-fitting algorithm, which enhances the imaging quality.
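
    A compressed sketch of the two-stage idea (wavelet thresholding, then per-window energy detection and exponential re-fitting), assuming PyWavelets and SciPy; the window edges, energy criterion, decay model and file names are illustrative assumptions rather than the published algorithm:

        import numpy as np
        import pywt
        from scipy.optimize import curve_fit

        def expdecay(t, a, tau, c):
            return a * np.exp(-t / tau) + c

        t = np.loadtxt('greatem_time.txt')        # hypothetical sample times (s), assumed strictly positive
        v = np.loadtxt('greatem_signal.txt')      # hypothetical transient response

        # step 1: wavelet soft thresholding to suppress stationary white noise
        coeffs = pywt.wavedec(v, 'db8', level=6)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(v.size))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        v = pywt.waverec(coeffs, 'db8')[:v.size]

        # step 2: segment into log-spaced windows and replace noisy windows by exponential fits
        edges = np.unique(np.logspace(np.log10(t[0]), np.log10(t[-1]), 20))
        for lo, hi in zip(edges[:-1], edges[1:]):
            idx = (t >= lo) & (t < hi)
            if idx.sum() < 5:
                continue
            energy = np.mean(v[idx] ** 2)
            if energy > 4 * np.median(v ** 2):                 # crude placeholder energy-detection criterion
                p, _ = curve_fit(expdecay, t[idx], v[idx],
                                 p0=(v[idx][0], hi - lo, 0.0), maxfev=5000)
                v[idx] = expdecay(t[idx], *p)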

  17. Combined self-learning based single-image super-resolution and dual-tree complex wavelet transform denoising for medical images

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Ye, Xujiong; Slabaugh, Greg; Keegan, Jennifer; Mohiaddin, Raad; Firmin, David

    2016-03-01

    In this paper, we propose a novel self-learning based single-image super-resolution (SR) method, which is coupled with dual-tree complex wavelet transform (DTCWT) based denoising to better recover high-resolution (HR) medical images. Unlike previous methods, this self-learning based SR approach enables us to reconstruct HR medical images from a single low-resolution (LR) image without extra training on HR image datasets in advance. The relationships between the given image and its scaled down versions are modeled using support vector regression with sparse coding and dictionary learning, without explicitly assuming reoccurrence or self-similarity across image scales. In addition, we perform DTCWT based denoising to initialize the HR images at each scale instead of simple bicubic interpolation. We evaluate our method on a variety of medical images. Both quantitative and qualitative results show that the proposed approach outperforms bicubic interpolation and state-of-the-art single-image SR methods while effectively removing noise.

  18. An 8.12 μW wavelet denoising chip for PPG detection and portable heart rate monitoring in 0.18 μm CMOS

    NASA Astrophysics Data System (ADS)

    Xiang, Li; Xu, Zhang; Peng, Li; Xiaohui, Hu; Hongda, Chen

    2016-05-01

    A low-power wavelet denoising chip for photoplethysmography (PPG) detection and portable heart rate monitoring is presented. To eliminate noise and improve detection accuracy, the Haar wavelet (HWT) is chosen as the processing tool. An optimized finite impulse response structure is proposed to lower the computational complexity of the proposed algorithm, which helps reduce the power consumption of the proposed chip. The modulus maxima pair location module is designed to accurately locate the PPG peaks. A clock control unit is designed to further reduce the power consumption of the proposed chip. Fabricated with the 0.18 μm N-well CMOS 1P6M technology, the power consumption of the proposed chip is only 8.12 μW with a 1 V supply voltage. Validated with PPG signals from multiparameter intelligent monitoring in intensive care databases and with signals acquired by the wrist photoelectric volume detection front end, the proposed chip can accurately detect PPG signals. The average sensitivity and positive prediction are 99.91% and 100%, respectively.

  19. Wavelet denoising in voxel-based parametric estimation of small animal PET images: a systematic evaluation of spatial constraints and noise reduction algorithms.

    PubMed

    Su, Yi; Shoghi, Kooresh I

    2008-11-07

    Voxel-based estimation of PET images, generally referred to as parametric imaging, can provide invaluable information about the heterogeneity of an imaging agent in a given tissue. Due to high level of noise in dynamic images, however, the estimated parametric image is often noisy and unreliable. Several approaches have been developed to address this challenge, including spatial noise reduction techniques, cluster analysis and spatial constrained weighted nonlinear least-square (SCWNLS) methods. In this study, we develop and test several noise reduction techniques combined with SCWNLS using simulated dynamic PET images. Both spatial smoothing filters and wavelet-based noise reduction techniques are investigated. In addition, 12 different parametric imaging methods are compared using simulated data. With the combination of noise reduction techniques and SCWNLS methods, more accurate parameter estimation can be achieved than with either of the two techniques alone. A less than 10% relative root-mean-square error is achieved with the combined approach in the simulation study. The wavelet denoising based approach is less sensitive to noise and provides more accurate parameter estimation at higher noise levels. Further evaluation of the proposed methods is performed using actual small animal PET datasets. We expect that the proposed method would be useful for cardiac, neurological and oncologic applications.

  20. Wavelets

    NASA Astrophysics Data System (ADS)

    DeVore, Ronald A.; Lucier, Bradley J.

    The subject of 'wavelets' is expanding at such a tremendous rate that it is impossible to give, within these few pages, a complete introduction to all aspects of its theory. We hope, however, to allow the reader to become sufficiently acquainted with the subject to understand, in part, the enthusiasm of its proponents toward its potential application to various numerical problems. Furthermore, we hope that our exposition can guide the reader who wishes to make more serious excursions into the subject. Our viewpoint is biased by our experience in approximation theory and data compression; we warn the reader that there are other viewpoints that are either not represented here or discussed only briefly. For example, orthogonal wavelets were developed primarily in the context of signal processing, an application upon which we touch only indirectly. However, there are several good expositions (e.g. Daubechies (1990) and Rioul and Vetterli (1991)) of this application. A discussion of wavelet decompositions in the context of Littlewood-Paley theory can be found in the monograph of Frazier et al. (1991). We shall also not attempt to give a complete discussion of the history of wavelets. Historical accounts can be found in the book of Meyer (1990) and the introduction of the article of Daubechies (1990). We shall try to give sufficient historical commentary in the course of our presentation to provide some feeling for the subject's development.

  1. Study on torpedo fuze signal denoising method based on WPT

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Sun, Changcun; Zhang, Tao; Ren, Zhiliang

    2013-07-01

    Torpedo fuze signal denoising is an important step in ensuring reliable operation of the fuze. Based on the good characteristics of the wavelet packet transform (WPT) in signal denoising, the paper uses the wavelet packet transform to denoise the fuze signal under complex background interference, and a simulation of the denoising results is performed in Matlab. The simulation results show that the WPT denoising method can effectively eliminate the background noise present in the torpedo fuze target signal, with higher precision and less distortion, thereby improving the reliability of torpedo fuze operation.
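
    A minimal wavelet packet denoising sketch in Python with PyWavelets (the original work is in Matlab); the wavelet, decomposition depth, threshold and input file are illustrative assumptions:

        import numpy as np
        import pywt

        x = np.loadtxt('fuze_echo.txt')          # hypothetical fuze signal with background interference

        wp = pywt.WaveletPacket(data=x, wavelet='db4', mode='symmetric', maxlevel=4)
        leaves = wp.get_level(4, order='natural')                 # terminal nodes of the packet tree
        sigma = np.median(np.abs(leaves[-1].data)) / 0.6745       # rough noise estimate
        thr = sigma * np.sqrt(2 * np.log(x.size))
        for node in leaves:
            node.data = pywt.threshold(node.data, thr, mode='soft')
        denoised = wp.reconstruct(update=False)[:x.size]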

  2. Working memory load-dependent spatio-temporal activity of single-trial P3 response detected with an adaptive wavelet denoiser.

    PubMed

    Zhang, Qiushi; Yang, Xueqian; Yao, Li; Zhao, Xiaojie

    2017-03-27

    Working memory (WM) refers to the holding and manipulation of information during cognitive tasks. Its underlying neural mechanisms have been explored through both functional magnetic resonance imaging (fMRI) and electroencephalography (EEG). Trial-by-trial coupling of simultaneously collected EEG and fMRI signals has become an important and promising approach to study the spatio-temporal dynamics of such cognitive processes. Previous studies have demonstrated a modulation effect of the WM load on both the BOLD response in certain brain areas and the amplitude of P3. However, much remains to be explored regarding the WM load-dependent relationship between the amplitude of ERP components and cortical activities, and the low signal-to-noise ratio (SNR) of the EEG signal still poses a challenge to performing single-trial analyses. In this paper, we investigated the spatio-temporal activities of P3 during an n-back verbal WM task by introducing an adaptive wavelet denoiser into the extraction of single-trial P3 features and using general linear model (GLM) to integrate simultaneously collected EEG and fMRI data. Our results replicated the modulation effect of the WM load on the P3 amplitude. Additionally, the activation of single-trial P3 amplitudes was detected in multiple brain regions, including the insula, the cuneus, the lingual gyrus (LG), and the middle occipital gyrus (MOG). Moreover, we found significant correlations between P3 features and behavioral performance. These findings suggest that the single-trial integration of simultaneous EEG and fMRI signals may provide new insights into classical cognitive functions.

  3. Comparison of automatic denoising methods for phonocardiograms with extraction of signal parameters via the Hilbert Transform

    NASA Astrophysics Data System (ADS)

    Messer, Sheila R.; Agzarian, John; Abbott, Derek

    2001-05-01

    Phonocardiograms (PCGs) have many advantages over traditional auscultation (listening to the heart) because they may be replayed, may be analyzed for spectral and frequency content, and frequencies inaudible to the human ear may be recorded. However, various sources of noise may pollute a PCG, including lung sounds, environmental noise and noise generated from contact between the recording device and the skin. Because PCG signals are known to be nonlinear and it is often not possible to determine their noise content, traditional de-noising methods may not be effectively applied. However, other methods, including wavelet de-noising, wavelet packet de-noising and averaging, can be employed to de-noise the PCG. This study examines and compares these de-noising methods, addressing such questions as which de-noising method gives a better SNR, how much signal information is lost as a result of the de-noising process, and the appropriate uses of the different methods, down to such specifics as which wavelets and decomposition levels give the best results in wavelet and wavelet packet de-noising. In general, the wavelet and wavelet packet de-noising performed roughly equally, with optimal de-noising occurring at 3-5 levels of decomposition. Averaging also proved a highly useful de-noising technique; however, in some cases averaging is not appropriate. The Hilbert Transform is used to illustrate the results of the de-noising process and to extract instantaneous features including instantaneous amplitude, frequency, and phase.
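
    The de-noise-then-Hilbert step can be sketched with PyWavelets and SciPy; the sampling rate, wavelet, level and file name are illustrative assumptions:

        import numpy as np
        import pywt
        from scipy.signal import hilbert

        fs = 4000.0                                # assumed PCG sampling rate
        pcg = np.loadtxt('pcg_recording.txt')      # hypothetical phonocardiogram samples

        # wavelet de-noising (soft threshold, 4 decomposition levels)
        coeffs = pywt.wavedec(pcg, 'db6', level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(pcg.size))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
        clean = pywt.waverec(coeffs, 'db6')[:pcg.size]

        # instantaneous amplitude, phase and frequency from the analytic signal
        analytic = hilbert(clean)
        inst_amp = np.abs(analytic)
        inst_phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)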

  4. Wavelets and Scattering

    DTIC Science & Technology

    1994-07-29

    Douglas (MDA). This has been extended to the use of local SVD methods and the use of wavelet packets to provide a controlled sparsening. The goal is to be...possibilities for segmenting, compression and denoising signals and one of us (GVW) is using these wavelets to study edge sets with Prof. B. Jawerth. The

  5. A wavelet phase filter for emission tomography

    SciTech Connect

    Olsen, E.T.; Lin, B.

    1995-07-01

    The presence of a high level of noise is characteristic of some tomographic imaging techniques such as positron emission tomography (PET). Wavelet methods can smooth out noise while preserving significant features of images. Mallat et al. proposed a wavelet based denoising scheme exploiting wavelet modulus maxima, but the scheme is sensitive to noise. In this study, the authors explore the properties of wavelet phase, with a focus on reconstruction of emission tomography images. Specifically, they show that the wavelet phase of regular Poisson noise under a Haar-type wavelet transform converges in distribution to a random variable uniformly distributed on [0, 2π). They then propose three wavelet-phase-based denoising schemes which exploit this property: edge tracking, local phase variance thresholding, and scale phase variation thresholding. Some numerical results are also presented. The numerical experiments indicate that wavelet phase techniques show promise for wavelet based denoising methods.

  6. Remote sensing image denoising by using discrete multiwavelet transform techniques

    NASA Astrophysics Data System (ADS)

    Wang, Haihui; Wang, Jun; Zhang, Jian

    2006-01-01

    In this paper, we present a new image denoising method using the GHM discrete multiwavelet transform. Developments in wavelet theory have given rise to the wavelet thresholding method for extracting a signal from noisy data, and signal denoising via wavelet thresholding has become popular. Multiwavelets have recently been introduced; they offer simultaneous orthogonality, symmetry and short support. This property makes multiwavelets more suitable for various image processing applications, especially denoising. The method is based on thresholding of multiwavelet coefficients, analogous to thresholding in the standard scalar orthogonal wavelet transform, and it takes into account the covariance structure of the transform. Denoising of images is carried out by thresholding the multiwavelet coefficients that result from preprocessing followed by the discrete multiwavelet transform. The form of the threshold is carefully formulated and is the key to the excellent results obtained in the extensive numerical simulations of image denoising. We apply the multiwavelet-based method to remote sensing image denoising. The multiwavelet transform technique is a relatively new method, and it has the advantage over other techniques that it distorts the spectral characteristics of the image less during denoising. The experimental results show that multiwavelet-based image denoising schemes outperform wavelet-based methods both subjectively and objectively.

  7. Applications of discrete multiwavelet techniques to image denoising

    NASA Astrophysics Data System (ADS)

    Wang, Haihui; Peng, Jiaxiong; Wu, Wei; Ye, Bin

    2003-09-01

    In this paper, we present a new method using the 2-D discrete multiwavelet transform for image denoising. Developments in wavelet theory have given rise to the wavelet thresholding method for extracting a signal from noisy data, and signal denoising via wavelet thresholding has been popularized. Multiwavelets have recently been introduced; they offer simultaneous orthogonality, symmetry and short support, which makes them more suitable for various image processing applications, especially denoising. The method is based on thresholding of multiwavelet coefficients arising from the standard scalar orthogonal wavelet transform and takes into account the covariance structure of the transform. Denoising of images via thresholding of the multiwavelet coefficients resulting from preprocessing and the discrete multiwavelet transform is carried out by treating the output in this paper. The form of the threshold is carefully formulated and is the key to the excellent results obtained in the extensive numerical simulations of image denoising. The performance of multiwavelets is compared with that of scalar wavelets. Simulations reveal that multiwavelet-based image denoising schemes outperform wavelet-based methods both subjectively and objectively.

  8. Chemical Peel

    MedlinePlus

    ... be done at different depths — light, medium or deep — depending on your desired results. Each type of ... chemical peel after 12 months to maintain results. Deep chemical peel. A deep chemical peel removes skin ...

  9. Wavelet despiking of fractographs

    NASA Astrophysics Data System (ADS)

    Aubry, Jean-Marie; Saito, Naoki

    2000-12-01

    Fractographs are elevation maps of the fracture zone of some broken material. The technique employed to create these maps often introduces noise composed of positive or negative 'spikes' that must be removed before further analysis. Since the roughness of these maps contains useful information, it must be preserved. Consequently, conventional denoising techniques cannot be employed. We use continuous and discrete wavelet transforms of these images, and the properties of wavelet coefficients related to pointwise Hoelder regularity, to detect and remove the spikes.

  10. Spherical 3D isotropic wavelets

    NASA Astrophysics Data System (ADS)

    Lanusse, F.; Rassat, A.; Starck, J.-L.

    2012-04-01

    Context. Future cosmological surveys will provide 3D large scale structure maps with large sky coverage, for which a 3D spherical Fourier-Bessel (SFB) analysis in spherical coordinates is natural. Wavelets are particularly well-suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist to analyse spherical 3D data. Aims: The aim of this paper is to present a new formalism for a spherical 3D isotropic wavelet, i.e. one based on the SFB decomposition of a 3D field and accompany the formalism with a public code to perform wavelet transforms. Methods: We describe a new 3D isotropic spherical wavelet decomposition based on the undecimated wavelet transform (UWT) described in Starck et al. (2006). We also present a new fast discrete spherical Fourier-Bessel transform (DSFBT) based on both a discrete Bessel transform and the HEALPIX angular pixelisation scheme. We test the 3D wavelet transform and as a toy-application, apply a denoising algorithm in wavelet space to the Virgo large box cosmological simulations and find we can successfully remove noise without much loss to the large scale structure. Results: We have described a new spherical 3D isotropic wavelet transform, ideally suited to analyse and denoise future 3D spherical cosmological surveys, which uses a novel DSFBT. We illustrate its potential use for denoising using a toy model. All the algorithms presented in this paper are available for download as a public code called MRS3D at http://jstarck.free.fr/mrs3d.html

  11. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    PubMed

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch of tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmoothes real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising for real image sensor data is indeed improved by this new technique.
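
    The abstract contrasts noise behavior in the pixel, wavelet and variance-stabilization domains. The sketch below shows only the classical Anscombe variance-stabilizing transform for pure Poisson noise, followed by a placeholder Gaussian-domain smoother; it is not the paper's Poisson-mixture model, and the test image and filter are assumptions made for illustration.

        # Hedged sketch: Anscombe variance stabilization for Poisson noise (not the
        # paper's Poisson-mixture model); after the VST the noise is roughly N(0, 1).
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def anscombe(x):
            return 2.0 * np.sqrt(x + 3.0 / 8.0)

        def inverse_anscombe(y):
            # Simple algebraic inverse; unbiased inverses exist but are omitted here.
            return (y / 2.0) ** 2 - 3.0 / 8.0

        rng = np.random.default_rng(0)
        clean = 20.0 * np.ones((64, 64))
        clean[16:48, 16:48] = 60.0                        # toy image with a bright square
        noisy = rng.poisson(clean).astype(float)          # photon-limited (Poisson) noise

        stabilized = anscombe(noisy)                      # approximately unit-variance Gaussian noise
        smoothed = gaussian_filter(stabilized, sigma=1.5) # placeholder Gaussian-domain denoiser
        denoised = inverse_anscombe(smoothed)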

  12. Adaptive Fourier decomposition based ECG denoising.

    PubMed

    Wang, Ze; Wan, Feng; Wong, Chi Man; Zhang, Liming

    2016-10-01

    A novel ECG denoising method is proposed based on the adaptive Fourier decomposition (AFD). The AFD decomposes a signal according to its energy distribution, thereby making this algorithm suitable for separating the pure ECG signal and noise with overlapping frequency ranges but different energy distributions. A stop criterion for the iterative decomposition process in the AFD is calculated on the basis of the estimated signal-to-noise ratio (SNR) of the noisy signal. The proposed AFD-based method is validated on a synthetic ECG signal generated with an ECG model and on real ECG signals from the MIT-BIH Arrhythmia Database, both with additive Gaussian white noise. Simulation results show that the proposed method performs better on denoising and QRS detection compared with major ECG denoising schemes based on the wavelet transform, the Stockwell transform, the empirical mode decomposition, and the ensemble empirical mode decomposition.

  13. Determination and Visualization of pH Values in Anaerobic Digestion of Water Hyacinth and Rice Straw Mixtures Using Hyperspectral Imaging with Wavelet Transform Denoising and Variable Selection

    PubMed Central

    Zhang, Chu; Ye, Hui; Liu, Fei; He, Yong; Kong, Wenwen; Sheng, Kuichuan

    2016-01-01

    Biomass energy represents a huge supplement for meeting current energy demands. A hyperspectral imaging system covering the spectral range of 874–1734 nm was used to determine the pH value of anaerobic digestion liquid produced by water hyacinth and rice straw mixtures used for methane production. Wavelet transform (WT) was used to reduce noises of the spectral data. Successive projections algorithm (SPA), random frog (RF) and variable importance in projection (VIP) were used to select 8, 15 and 20 optimal wavelengths for the pH value prediction, respectively. Partial least squares (PLS) and a back propagation neural network (BPNN) were used to build the calibration models on the full spectra and the optimal wavelengths. As a result, BPNN models performed better than the corresponding PLS models, and SPA-BPNN model gave the best performance with a correlation coefficient of prediction (rp) of 0.911 and root mean square error of prediction (RMSEP) of 0.0516. The results indicated the feasibility of using hyperspectral imaging to determine pH values during anaerobic digestion. Furthermore, a distribution map of the pH values was achieved by applying the SPA-BPNN model. The results in this study would help to develop an on-line monitoring system for biomass energy producing process by hyperspectral imaging. PMID:26901202

  14. Determination and Visualization of pH Values in Anaerobic Digestion of Water Hyacinth and Rice Straw Mixtures Using Hyperspectral Imaging with Wavelet Transform Denoising and Variable Selection.

    PubMed

    Zhang, Chu; Ye, Hui; Liu, Fei; He, Yong; Kong, Wenwen; Sheng, Kuichuan

    2016-02-18

    Biomass energy represents a huge supplement for meeting current energy demands. A hyperspectral imaging system covering the spectral range of 874-1734 nm was used to determine the pH value of anaerobic digestion liquid produced by water hyacinth and rice straw mixtures used for methane production. Wavelet transform (WT) was used to reduce noises of the spectral data. Successive projections algorithm (SPA), random frog (RF) and variable importance in projection (VIP) were used to select 8, 15 and 20 optimal wavelengths for the pH value prediction, respectively. Partial least squares (PLS) and a back propagation neural network (BPNN) were used to build the calibration models on the full spectra and the optimal wavelengths. As a result, BPNN models performed better than the corresponding PLS models, and SPA-BPNN model gave the best performance with a correlation coefficient of prediction (rp) of 0.911 and root mean square error of prediction (RMSEP) of 0.0516. The results indicated the feasibility of using hyperspectral imaging to determine pH values during anaerobic digestion. Furthermore, a distribution map of the pH values was achieved by applying the SPA-BPNN model. The results in this study would help to develop an on-line monitoring system for biomass energy producing process by hyperspectral imaging.

  15. Simultaneous denoising and compression of multispectral images

    NASA Astrophysics Data System (ADS)

    Hagag, Ahmed; Amin, Mohamed; Abd El-Samie, Fathi E.

    2013-01-01

    A new technique for denoising and compression of multispectral satellite images to remove the effect of noise on the compression process is presented. One type of multispectral images has been considered: Landsat Enhanced Thematic Mapper Plus. The discrete wavelet transform (DWT), the dual-tree DWT, and a simple Huffman coder are used in the compression process. Simulation results show that the proposed technique is more effective than other traditional compression-only techniques.

  16. Wavelet transform of neural spike trains

    NASA Astrophysics Data System (ADS)

    Kim, Youngtae; Jung, Min Whan; Kim, Yunbok

    2000-02-01

    Wavelet transform of neural spike trains recorded with a tetrode in the rat primary somatosensory cortex is described. Continuous wavelet transform (CWT) of the spike train clearly shows singularities hidden in the noisy or chaotic spike trains. A multiresolution analysis of the spike train is also carried out using discrete wavelet transform (DWT) for denoising and approximating at different time scales. Results suggest that this multiscale shape analysis can be a useful tool for classifying the spike trains.

  17. Random seismic noise attenuation using the Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Aliouane, L.; Ouadfeul, S.; Boudella, A.; Eladj, S.

    2012-04-01

    In this paper we propose a technique for random noise attenuation in seismic data using the discrete and continuous wavelet transforms. First, the discrete wavelet transform (DWT) is applied to denoise the seismic data; this step is based on a threshold applied to the modulus of the DWT. We then calculate the continuous wavelet transform of the denoised seismogram, and the final denoised seismogram is given by the continuous wavelet transform coefficients at the low scales. Application to a synthetic seismogram shows the robustness of the proposed tool for random noise attenuation. We have also applied this idea to real seismic data from a vertical seismic profile acquired in Algeria. Keywords: Seismic data, denoising, DWT, CWT, random noise.

  18. Classification of Underwater Signals Using Wavelet-Based Decompositions

    DTIC Science & Technology

    1998-06-01

    proposed by Learned and Willsky [21], uses the SVD information obtained from the power mapping, the second one selects the most within-a-class...34 SPIE, Vol. 2242, pp. 792-802, Wavelet Applications, 1994 [14] R. Coifman and D. Donoho, "Translation-Invariant Denoising ," Internal Report...J. Barsanti, Jr., Denoising of Ocean Acoustic Signals Using Wavelet-Based Techniques, MSEE Thesis, Naval Postgraduate School, Monterey, California

  19. Color Image Denoising via Discriminatively Learned Iterative Shrinkage.

    PubMed

    Sun, Jian; Sun, Jian; Xu, Zongben

    2015-11-01

    In this paper, we propose a novel model, a discriminatively learned iterative shrinkage (DLIS) model, for color image denoising. The DLIS is a generalization of wavelet shrinkage by iteratively performing shrinkage over patch groups and whole image aggregation. We discriminatively learn the shrinkage functions and basis from the training pairs of noisy/noise-free images, which can adaptively handle different noise characteristics in luminance/chrominance channels, and the unknown structured noise in real-captured color images. Furthermore, to remove the splotchy real color noises, we design a Laplacian pyramid-based denoising framework to progressively recover the clean image from the coarsest scale to the finest scale by the DLIS model learned from the real color noises. Experiments show that our proposed approach can achieve the state-of-the-art denoising results on both synthetic denoising benchmark and real-captured color images.

  20. Study on De-noising Technology of Radar Life Signal

    NASA Astrophysics Data System (ADS)

    Yang, Xiu-Fang; Wang, Lian-Huan; Ma, Jiang-Fei; Wang, Pei-Pei

    2016-05-01

    Radar detection is a novel life-detection technology that can be applied to medical monitoring, anti-terrorism, disaster relief, street fighting, etc. As the radar life signal is very weak, it is often submerged in noise. Because of the non-stationarity and randomness of these clutter signals, it is necessary to denoise efficiently before extracting and separating the useful signal. This paper improves the theoretical continuous-wave model of the radar life signal, performs de-noising by introducing the lifting wavelet transform, and determines the best threshold function by comparing the de-noising effects of different threshold functions. The results indicate that both the SNR and MSE of the signal are better than with traditional methods when the lifting wavelet transform and a new improved soft-threshold de-noising function are used.

  1. A multiscale products technique for denoising of DNA capillary electrophoresis signals

    NASA Astrophysics Data System (ADS)

    Gao, Qingwei; Lu, Yixiang; Sun, Dong; Zhang, Dexiang

    2013-06-01

    Since noise degrades the accuracy and precision of DNA capillary electrophoresis (CE) analysis, signal denoising is thus important to facilitate the postprocessing of CE data. In this paper, a new denoising algorithm based on dyadic wavelet transform using multiscale products is applied for the removal of the noise in the DNA CE signal. The adjacent scale wavelet coefficients are first multiplied to amplify the significant features of the CE signal while diluting noise. Then, noise is suppressed by applying a multiscale threshold to the multiscale products instead of directly to the wavelet coefficients. Finally, the noise-free CE signal is recovered from the thresholded coefficients by using inverse dyadic wavelet transform. We compare the performance of the proposed algorithm with other denoising methods applied to the synthetic CE and real CE signals. Experimental results show that the new scheme achieves better removal of noise while preserving the shape of peaks corresponding to the analytes in the sample.
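
    A minimal sketch of the multiscale-products idea is given below: detail coefficients at adjacent scales are multiplied, and a coefficient is kept only where the product (which amplifies peaks and dilutes noise) exceeds a threshold. The stationary wavelet transform from PyWavelets stands in for the dyadic wavelet transform used in the paper, and the wavelet, depth and threshold constant are illustrative assumptions.

        # Hedged sketch of multiscale-products thresholding with the stationary wavelet
        # transform; parameter choices are assumptions, not the authors' settings.
        import numpy as np
        import pywt

        def multiscale_product_denoise(x, wavelet="sym4", level=4, k=2.0):
            coeffs = pywt.swt(x, wavelet, level=level)           # list of (cA_j, cD_j) pairs
            details = [cD for _, cD in coeffs]
            cleaned = []
            for j in range(level):
                neighbour = details[j + 1] if j + 1 < level else details[j - 1]
                product = details[j] * neighbour                 # adjacent-scale product
                thr = k * np.mean(np.abs(product))               # threshold applied to the product
                cleaned.append(np.where(np.abs(product) > thr, details[j], 0.0))
            new_coeffs = [(cA, cD) for (cA, _), cD in zip(coeffs, cleaned)]
            return pywt.iswt(new_coeffs, wavelet)

        # Toy electropherogram-like trace: two Gaussian peaks plus noise
        t = np.linspace(0.0, 1.0, 1024)                          # length must be divisible by 2**level
        clean = np.exp(-((t - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((t - 0.7) / 0.015) ** 2)
        noisy = clean + 0.05 * np.random.randn(t.size)
        denoised = multiscale_product_denoise(noisy)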

  2. Progressive image denoising.

    PubMed

    Knaus, Claude; Zwicker, Matthias

    2014-07-01

    Image denoising continues to be an active research topic. Although state-of-the-art denoising methods are numerically impressive and approach theoretical limits, they suffer from visible artifacts. While they produce acceptable results for natural images, human eyes are less forgiving when viewing synthetic images. At the same time, current methods are becoming more complex, making analysis and implementation difficult. We propose image denoising as a simple physical process, which progressively reduces noise by deterministic annealing. The results of our implementation are numerically and visually excellent. We further demonstrate that our method is particularly suited for synthetic images. Finally, we offer a new perspective on image denoising using robust estimators.

  3. Maximally Localized Radial Profiles for Tight Steerable Wavelet Frames.

    PubMed

    Pad, Pedram; Uhlmann, Virginie; Unser, Michael

    2016-03-22

    A crucial component of steerable wavelets is the radial profile of the generating function in the frequency domain. In this work, we present an infinite-dimensional optimization scheme that helps us find the optimal profile for a given criterion over the space of tight frames. We consider two classes of criteria that measure the localization of the wavelet. The first class specifies the spatial localization of the wavelet profile, and the second that of the resulting wavelet coefficients. From these metrics and the proposed algorithm, we construct tight wavelet frames that are optimally localized and provide their analytical expression. In particular, one of the considered criteria allows us to recover the popular Simoncelli wavelet profile. Finally, the investigation of local orientation estimation, image reconstruction from detected contours in the wavelet domain, and denoising indicates that optimizing wavelet localization improves the performance of steerable wavelets, since our new wavelets outperform the traditional ones.

  4. Maximally Localized Radial Profiles for Tight Steerable Wavelet Frames.

    PubMed

    Pad, Pedram; Uhlmann, Virginie; Unser, Michael

    2016-05-01

    A crucial component of steerable wavelets is the radial profile of the generating function in the frequency domain. In this paper, we present an infinite-dimensional optimization scheme that helps us find the optimal profile for a given criterion over the space of tight frames. We consider two classes of criteria that measure the localization of the wavelet. The first class specifies the spatial localization of the wavelet profile, and the second that of the resulting wavelet coefficients. From these metrics and the proposed algorithm, we construct tight wavelet frames that are optimally localized and provide their analytical expression. In particular, one of the considered criteria allows us to recover the popular Simoncelli wavelet profile. Finally, the investigation of local orientation estimation, image reconstruction from detected contours in the wavelet domain, and denoising indicates that optimizing wavelet localization improves the performance of steerable wavelets, since our new wavelets outperform the traditional ones.

  5. EEG Signal Decomposition and Improved Spectral Analysis Using Wavelet Transform

    DTIC Science & Technology

    2001-10-25

    research and medical applications. Wavelet transform (WT) is a new multi-resolution time-frequency analysis method. WT possesses localization feature both... wavelet transform, the EEG signals are successfully decomposed and denoised. In this paper we also use a 'quasi-detrending' method for classification of EEG

  6. Denoising of single-trial matrix representations using 2D nonlinear diffusion filtering.

    PubMed

    Mustaffa, I; Trenado, C; Schwerdtfeger, K; Strauss, D J

    2010-01-15

    In this paper we present a novel application of denoising by means of nonlinear diffusion filters (NDFs). NDFs have been successfully applied in image processing and computer vision, particularly in image denoising, smoothing, segmentation, and restoration. We apply two types of NDFs for the denoising of evoked responses in single trials in matrix form: the nonlinear isotropic and the anisotropic diffusion filters. We show that by means of NDFs we are able to denoise the evoked potentials, resulting in a better extraction of physiologically relevant morphological features over the ongoing experiment. This technique offers the advantage of translation invariance in comparison to other well-known methods, e.g., wavelet denoising based on maximally decimated filter banks, due to an adaptive diffusion feature. We compare the proposed technique with a wavelet denoising scheme that had been introduced before for evoked responses. It is concluded that NDFs represent a promising and useful approach in the denoising of event related potentials. Novel NDF applications to the denoising of single trials of auditory brain responses (ABRs) and of transcranial magnetic stimulation (TMS) evoked electroencephalographic responses are presented in this paper.
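
    For readers unfamiliar with nonlinear diffusion filtering, the sketch below implements a basic Perona-Malik-type isotropic NDF on a 2D single-trial matrix. It is only a generic illustration of this filter class; the conductance function, step size and iteration count are assumptions rather than the authors' settings.

        # Hedged sketch: Perona-Malik-type nonlinear diffusion on a 2D matrix.
        import numpy as np

        def perona_malik(img, n_iter=20, kappa=0.5, dt=0.2):
            u = img.astype(float).copy()
            for _ in range(n_iter):
                # Differences toward the four neighbours (periodic boundaries via roll)
                dn = np.roll(u, -1, axis=0) - u
                ds = np.roll(u, 1, axis=0) - u
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                # Edge-stopping conductance: small where local gradients are large
                cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
                ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
                u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
            return u

        # Toy single-trial matrix: a smooth evoked response plus noise
        trials, samples = 64, 256
        response = np.outer(np.ones(trials), np.sin(np.linspace(0.0, np.pi, samples)))
        denoised_matrix = perona_malik(response + 0.5 * np.random.randn(trials, samples))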

  7. Discrete multiscale wavelet shrinkage and integrodifferential equations

    NASA Astrophysics Data System (ADS)

    Didas, S.; Steidl, G.; Weickert, J.

    2008-04-01

    We investigate the relation between discrete wavelet shrinkage and integrodifferential equations in the context of simplification and denoising of one-dimensional signals. In the continuous setting, strong connections between these two approaches were discovered in [6]. The key observation is that the wavelet transform can be understood as a derivative operator applied after convolution with a smoothing kernel. In this paper, we extend these ideas to the practically relevant discrete setting with both orthogonal and biorthogonal wavelets. In the discrete case, the behaviour of the smoothing kernels for different scales requires additional investigation. The results of discrete multiscale wavelet shrinkage and related discrete versions of integrodifferential equations are compared with respect to their denoising quality by numerical experiments.

  8. Wavelets in medical imaging

    SciTech Connect

    Zahra, Noor e; Sevindir, Huliya A.; Aslan, Zafar; Siddiqi, A. H.

    2012-07-17

    The aim of this study is to provide emerging applications of wavelet methods to medical signals and images, such as the electrocardiogram, electroencephalogram, functional magnetic resonance imaging, computer tomography, X-ray and mammography. Interpretation of these signals and images is quite important. Nowadays wavelet methods have a significant impact on the science of medical imaging and on the diagnosis of disease and screening protocols. Based on our initial investigations, future directions include neurosurgical planning and improved assessment of risk for individual patients, improved assessment and strategies for the treatment of chronic pain, improved seizure localization, and improved understanding of the physiology of neurological disorders. We look ahead to these and other emerging applications as the benefits of this technology become incorporated into current and future patient care. In this chapter, analysis and denoising of the EEG, one of the most important biomedical signals, is carried out by applying the Fourier transform and the wavelet transform. The presence of rhythm, template matching, and correlation is discussed using various methods. The energy of the EEG signal is used to detect seizures in an epileptic patient. We have also performed denoising of EEG signals by SWT.

  9. Wavelets in medical imaging

    NASA Astrophysics Data System (ADS)

    Zahra, Noor e.; Sevindir, Hulya Kodal; Aslan, Zafer; Siddiqi, A. H.

    2012-07-01

    The aim of this study is to provide emerging applications of wavelet methods to medical signals and images, such as the electrocardiogram, electroencephalogram, functional magnetic resonance imaging, computer tomography, X-ray and mammography. Interpretation of these signals and images is quite important. Nowadays wavelet methods have a significant impact on the science of medical imaging and on the diagnosis of disease and screening protocols. Based on our initial investigations, future directions include neurosurgical planning and improved assessment of risk for individual patients, improved assessment and strategies for the treatment of chronic pain, improved seizure localization, and improved understanding of the physiology of neurological disorders. We look ahead to these and other emerging applications as the benefits of this technology become incorporated into current and future patient care. In this chapter, analysis and denoising of the EEG, one of the most important biomedical signals, is carried out by applying the Fourier transform and the wavelet transform. The presence of rhythm, template matching, and correlation is discussed using various methods. The energy of the EEG signal is used to detect seizures in an epileptic patient. We have also performed denoising of EEG signals by SWT.

  10. Frames-Based Denoising in 3D Confocal Microscopy Imaging.

    PubMed

    Konstantinidis, Ioannis; Santamaria-Pang, Alberto; Kakadiaris, Ioannis

    2005-01-01

    In this paper, we propose a novel denoising method for 3D confocal microscopy data based on robust edge detection. Our approach relies on the construction of a non-separable frame system in 3D that incorporates the Sobel operator in dual spatial directions. This multidirectional set of digital filters is capable of robustly detecting edge information by ensemble thresholding of the filtered data. We demonstrate the application of our method to both synthetic and real confocal microscopy data by comparing it to denoising methods based on separable 3D wavelets and 3D median filtering, and report very encouraging results.

  11. A new performance evaluation scheme for jet engine vibration signal denoising

    NASA Astrophysics Data System (ADS)

    Sadooghi, Mohammad Saleh; Esmaeilzadeh Khadem, Siamak

    2016-08-01

    Denoising of a cargo-plane jet engine compressor vibration signal is investigated in this article. The discrete wavelet transform and two families of thresholding, Donoho-Johnston and parameter-method thresholding, are applied to the vibration signal. Eighty-four combinations of wavelet thresholding and mother wavelet are evaluated. A new performance evaluation scheme for optimal selection of the mother wavelet and thresholding method combination is proposed in this paper, which makes a trade-off among four performance criteria: signal-to-noise ratio, percentage root mean square difference, cross-correlation, and mean square error. The Dmeyer mother wavelet (dmey) combined with Rigorous SURE thresholding has the maximum trade-off value and was selected as the most appropriate combination for denoising the signal. It was shown that an inappropriate combination leads to loss of data. The higher performance of the proposed trade-off with respect to the other criteria was also demonstrated graphically.
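
    The four criteria named in the abstract are straightforward to compute; the sketch below evaluates them for a reference signal and a candidate de-noised signal. How the paper combines them into a single trade-off score is not reproduced here, so the example stops at the raw metrics.

        # Hedged sketch: SNR, percentage RMS difference (PRD), cross-correlation and MSE
        # for a denoised signal against a reference; the trade-off scoring is omitted.
        import numpy as np

        def denoising_metrics(reference, denoised):
            err = reference - denoised
            mse = np.mean(err ** 2)
            snr = 10.0 * np.log10(np.sum(reference ** 2) / np.sum(err ** 2))
            prd = 100.0 * np.sqrt(np.sum(err ** 2) / np.sum(reference ** 2))
            xcorr = np.corrcoef(reference, denoised)[0, 1]
            return {"SNR_dB": snr, "PRD_percent": prd, "xcorr": xcorr, "MSE": mse}

        t = np.linspace(0.0, 1.0, 2048)
        reference = np.sin(2 * np.pi * 25 * t)
        candidate = reference + 0.05 * np.random.randn(t.size)   # stand-in for a denoised output
        print(denoising_metrics(reference, candidate))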

  12. Fast wavelet estimation of weak biosignals.

    PubMed

    Causevic, Elvir; Morley, Robert E; Wickerhauser, M Victor; Jacquin, Arnaud E

    2005-06-01

    Wavelet-based signal processing has become commonplace in the signal processing community over the past decade and wavelet-based software tools and integrated circuits are now commercially available. One of the most important applications of wavelets is in removal of noise from signals, called denoising, accomplished by thresholding wavelet coefficients in order to separate signal from noise. Substantial work in this area was summarized by Donoho and colleagues at Stanford University, who developed a variety of algorithms for conventional denoising. However, conventional denoising fails for signals with low signal-to-noise ratio (SNR). Electrical signals acquired from the human body, called biosignals, commonly have below 0 dB SNR. Synchronous linear averaging of a large number of acquired data frames is universally used to increase the SNR of weak biosignals. A novel wavelet-based estimator is presented for fast estimation of such signals. The new estimation algorithm provides a faster rate of convergence to the underlying signal than linear averaging. The algorithm is implemented for processing of auditory brainstem response (ABR) and of auditory middle latency response (AMLR) signals. Experimental results with both simulated data and human subjects demonstrate that the novel wavelet estimator achieves superior performance to that of linear averaging.

  13. Wavelet filtering of chaotic data

    NASA Astrophysics Data System (ADS)

    Grzesiak, M.

    A satisfactory method of removing noise from experimental chaotic data is still an open problem. Normally it is necessary to assume certain properties of the noise and of the dynamics one wants to extract from a time series. The wavelet-based method of denoising time series originating from low-dimensional dynamical systems and polluted by Gaussian white noise is considered. Its efficiency is investigated by comparing the correlation dimension of clean and noisy data generated for some well-known dynamical systems. The wavelet method is contrasted with the singular value decomposition (SVD) and finite impulse response (FIR) filter methods.

  14. Wavelet methods in data mining

    NASA Astrophysics Data System (ADS)

    Manchanda, P.

    2012-07-01

    Data mining (knowledge discovery in databases) is a comparatively new interdisciplinary field developed by the joint efforts of mathematicians, statisticians, computer scientists and engineers. There are twelve important ingredients of this field, along with their applications to real-world problems. In this chapter, we review the application of wavelet methods to data mining, particularly denoising, dimension reduction, similarity search, feature extraction and prediction. Meteorological data from Saudi Arabia and stock market data from India are considered for illustration.

  15. Nonlinear denoising of transient signals with application to event-related potentials

    NASA Astrophysics Data System (ADS)

    Effern, A.; Lehnertz, K.; Schreiber, T.; Grunwald, T.; David, P.; Elger, C. E.

    2000-06-01

    We present a new wavelet-based method for the denoising of event-related potentials (ERPs), employing techniques recently developed for the paradigm of deterministic chaotic systems. The denoising scheme has been constructed to be appropriate for short and transient time sequences using circular state space embedding. Its effectiveness was successfully tested on simulated signals as well as on ERPs recorded from within a human brain. The method enables the study of individual ERPs against strong ongoing brain electrical activity.

  16. Higher-density dyadic wavelet transform and its application

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Tang, Baoping; Wang, Jiaxu

    2010-04-01

    This paper proposes a higher-density dyadic wavelet transform with two generators, whose corresponding wavelet filters are band-pass and high-pass. The wavelet coefficients at each scale in this case have the same length as the signal. This leads to a new redundant dyadic wavelet transform, which is strictly shift invariant and further increases the sampling in the time dimension. We describe the definition of the higher-density dyadic wavelet transform and discuss the condition for perfect reconstruction of the signal from its wavelet coefficients. A fast implementation algorithm for the proposed transform is given as well. Compared with the higher-density discrete wavelet transform, the proposed transform is shift invariant. Applications to signal denoising indicate that the proposed wavelet transform has better denoising performance than other commonly used wavelet transforms. Finally, various typical wavelet transforms are applied to analyze the vibration signals of two faulty roller bearings; the results show that the proposed wavelet transform extracts the fault characteristics of the roller bearings more effectively than the other wavelet transforms.

  17. Optical Wavelet Signals Processing and Multiplexing

    NASA Astrophysics Data System (ADS)

    Cincotti, Gabriella; Moreolo, Michela Svaluto; Neri, Alessandro

    2005-12-01

    We present compact integrable architectures to perform the discrete wavelet transform (DWT) and the wavelet packet (WP) decomposition of an optical digital signal, and we show that the combined use of planar lightwave circuits (PLC) technology and multiresolution analysis (MRA) can add flexibility to current multiple access optical networks. We furnish the design guidelines to synthesize wavelet filters as two-port lattice-form planar devices, and we give some examples of optical signal denoising and compression/decompression techniques in the wavelet domain. Finally, we present a fully optical wavelet packet division multiplexing (WPDM) scheme where data signals are waveform-coded onto wavelet atom functions for transmission, and numerically evaluate its performances.

  18. Anisotropic Nonlocal Means Denoising

    DTIC Science & Technology

    2011-11-26

    match the nuanced edges and textures of real-world images remains open, since we have considered only brutal binary images here. Finally, while NLM...computer vision. Denoising algorithms have evolved from the classical linear and median filters to more modern schemes like total variation denoising...underlying image gradients outperforms NLM by a significant margin.

  19. Optical Planar Discrete Fourier and Wavelet Transforms

    NASA Astrophysics Data System (ADS)

    Cincotti, Gabriella; Moreolo, Michela Svaluto; Neri, Alessandro

    2007-10-01

    We present all-optical architectures to perform discrete wavelet transform (DWT), wavelet packet (WP) decomposition and discrete Fourier transform (DFT) using planar lightwave circuits (PLC) technology. Any compact-support wavelet filter can be implemented as an optical planar two-port lattice-form device, and different subband filtering schemes are possible to denoise or multiplex optical signals. We consider both parallel and serial input cases. We design a multiport encoder/decoder that is able to generate/process optical codes simultaneously and a flexible logarithmic wavelength multiplexer with flat-top profile and reduced crosstalk.

  20. Automatic denoising of single-trial evoked potentials.

    PubMed

    Ahmadi, Maryam; Quian Quiroga, Rodrigo

    2013-02-01

    We present an automatic denoising method based on the wavelet transform to obtain single trial evoked potentials. The method is based on the inter- and intra-scale variability of the wavelet coefficients and their deviations from baseline values. The performance of the method is tested with simulated event related potentials (ERPs) and with real visual and auditory ERPs. For the simulated data the presented method gives a significant improvement in the observation of single trial ERPs as well as in the estimation of their amplitudes and latencies, in comparison with a standard denoising technique (Donoho's thresholding) and in comparison with the noisy single trials. For the real data, the proposed method largely filters the spontaneous EEG activity, thus helping the identification of single trial visual and auditory ERPs. The proposed method provides a simple, automatic and fast tool that allows the study of single trial responses and their correlations with behavior.

  1. Detecting the BAO using Discrete Wavelet Packets

    NASA Astrophysics Data System (ADS)

    Garcia, Noel Anthony; Wu, Yunyun; Kadowaki, Kevin; Pando, Jesus

    2017-01-01

    We use wavelet packets to investigate the clustering of matter on galactic scales in search of the Baryon Acoustic Oscillations. We do so in two ways. We develop a wavelet packet approach to measure the power spectrum and apply this method to the CMASS galaxy catalogue from the Sloan Digital Sky Survey (SDSS). We compare the resulting power spectrum to published BOSS results by measuring a parameter β that compares our wavelet detected oscillations to the results from the SDSS collaboration. We find that β=1 indicating that our wavelet packet methods are detecting the BAO at a similar level as traditional Fourier techniques. We then use wavelet packets to decompose, denoise, and then reconstruct the galaxy density field. Using this denoised field, we compute the standard two-point correlation function. We are able to successfully detect the BAO at r ≈ 105 h-1 Mpc in line with previous SDSS results. We conclude that wavelet packets do reproduce the results of the key clustering statistics computed by other means. The wavelet packets show distinct advantages in suppressing high frequency noise and in keeping information localized.
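
    A minimal example of a wavelet-packet "power spectrum" is sketched below: the signal is decomposed to a fixed depth and the energy in each frequency-ordered leaf node is taken as the band power. This only illustrates the general idea behind such an estimator; the wavelet, depth and normalisation are assumptions and are unrelated to the CMASS pipeline itself.

        # Hedged sketch: band energies from a wavelet packet decomposition as a crude
        # power-spectrum estimate; parameters are illustrative assumptions.
        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        signal = np.cos(2 * np.pi * 60 * np.linspace(0.0, 1.0, 4096)) + 0.5 * rng.standard_normal(4096)

        level = 6
        wp = pywt.WaveletPacket(data=signal, wavelet="db4", maxlevel=level)
        leaves = wp.get_level(level, order="freq")        # leaf nodes ordered by frequency band
        band_power = np.array([np.sum(node.data ** 2) for node in leaves])
        band_power /= band_power.sum()                    # normalised energy per frequency band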

  2. Parametric surface denoising

    NASA Astrophysics Data System (ADS)

    Kakadiaris, Ioannis A.; Konstantinidis, Ioannis; Papadakis, Manos; Ding, Wei; Shen, Lixin

    2005-08-01

    Three dimensional (3D) surfaces can be sampled parametrically in the form of range image data. Smoothing/denoising of such raw data is usually accomplished by adapting techniques developed for intensity image processing, since both range and intensity images comprise parametrically sampled geometry and appearance measurements, respectively. We present a transform-based algorithm for surface denoising, motivated by our previous work on intensity image denoising, which utilizes a non-separable Parseval frame and an ensemble thresholding scheme. The frame is constructed from separable (tensor) products of a piecewise linear spline tight frame and incorporates the weighted average operator and the Sobel operators in directions that are integer multiples of 45°. We compare the performance of this algorithm with other transform-based methods from the recent literature. Our results indicate that such transform methods are suited to the task of smoothing range images.

  3. Research of Gear Fault Detection in Morphological Wavelet Domain

    NASA Astrophysics Data System (ADS)

    Hong, Shi; Fang-jian, Shan; Bo, Cong; Wei, Qiu

    2016-02-01

    To extract mutation information from gear fault signals and achieve a valid fault diagnosis, a gear fault diagnosis method based on the morphological mean wavelet transform was designed. The morphological mean wavelet transform is a linear wavelet in the framework of morphological wavelets. Decomposing a gear fault signal with this transform produces signal synthesis operators and detail synthesis operators. The signal synthesis operators stay close to the original signal, while the detail synthesis operators contain the fault impact signal or interference signal, which can thus be captured. The simulation experiment results indicate that, compared with the Fourier transform, the morphological mean wavelet transform method can perform time-frequency analysis of the original signal and effectively capture the positions where impact signals appear; and compared with the traditional linear wavelet transform, it has a simple structure, easy realization, sensitivity to local signal extrema and high denoising ability, so it is better adapted to real-time gear fault detection.

  4. Multiscale image blind denoising.

    PubMed

    Lebrun, Marc; Colom, Miguel; Morel, Jean-Michel

    2015-10-01

    Arguably several thousand papers are dedicated to image denoising. Most papers assume a fixed noise model, mainly white Gaussian or Poissonian. This assumption is only valid for raw images. Yet, in most images handled by the public and even by scientists, the noise model is imperfectly known or unknown. End users only have access to the result of a complex image processing chain performed by uncontrolled hardware and software (and sometimes by chemical means). For such images, recent progress in noise estimation makes it possible to estimate from a single image a noise model that is simultaneously signal and frequency dependent. We propose here a multiscale denoising algorithm adapted to this broad noise model. This leads to a blind denoising algorithm which we demonstrate on real JPEG images and on scans of old photographs for which the formation model is unknown. The consistency of this algorithm is also verified on simulated distorted images. This algorithm is finally compared with the unique state-of-the-art previous blind denoising method.

  5. Nonlinear Image Denoising Methodologies

    DTIC Science & Technology

    2002-05-01

    In this thesis, our approach to denoising is first based on a controlled nonlinear stochastic random walk to achieve a scale-space analysis (as in... stochastic treatment or interpretation of the diffusion. In addition, unless a specific stopping time is known to be adequate, the resulting evolution

  6. A new adaptive algorithm for image denoising based on curvelet transform

    NASA Astrophysics Data System (ADS)

    Chen, Musheng; Cai, Zhishan

    2013-10-01

    The purpose of this paper is to study a method of denoising images corrupted with additive white Gaussian noise. The application of the time-invariant discrete curvelet transform to noise reduction is considered. In the curvelet transform, the frame elements are indexed by scale, orientation and location parameters, and the transform is designed to represent edges and singularities along curved paths more efficiently than the wavelet transform. Therefore, the curvelet transform can give better results than wavelet methods in image denoising. In general, image denoising imposes a compromise between noise reduction and the preservation of significant image details. To achieve a good performance in this respect, an efficient and adaptive image denoising method based on the curvelet transform is presented in this paper. First, the noisy image is decomposed into many levels to obtain different frequency sub-bands by the curvelet transform. Second, efficient and adaptive threshold estimation based on generalized Gaussian distribution modeling of the sub-band coefficients is used to remove the noisy coefficients; the threshold estimate is obtained by analyzing the standard deviation and threshold. Finally, the multi-scale decomposition is inverted to reconstruct the denoised image. To prove the performance of the proposed method, the results are compared with existing algorithms such as hard and soft thresholding based on wavelets. The simulation results on several test images indicate that the proposed method outperforms the other methods in peak signal-to-noise ratio and better preserves edge information visually. The results also suggest that the curvelet transform can achieve better performance than the wavelet transform in image denoising.
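
    The adaptive threshold described above (derived from a generalized Gaussian model of sub-band statistics) is commonly realised by a BayesShrink-type rule. The sketch below applies such a rule sub-band by sub-band, but uses an ordinary 2D wavelet transform as a stand-in for the curvelet transform, so it should be read as a generic illustration rather than the paper's algorithm.

        # Hedged sketch: BayesShrink-style adaptive sub-band soft thresholding, with a 2D
        # wavelet transform standing in for the curvelet transform used in the paper.
        import numpy as np
        import pywt

        def bayes_shrink_denoise(img, wavelet="db4", level=3):
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            sigma_n = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise from finest diagonal band
            new_coeffs = [coeffs[0]]
            for bands in coeffs[1:]:
                shrunk = []
                for band in bands:
                    sigma_y2 = np.mean(band ** 2)
                    sigma_x = np.sqrt(max(sigma_y2 - sigma_n ** 2, 1e-12))
                    thr = sigma_n ** 2 / sigma_x                   # per-sub-band adaptive threshold
                    shrunk.append(pywt.threshold(band, thr, mode="soft"))
                new_coeffs.append(tuple(shrunk))
            return pywt.waverec2(new_coeffs, wavelet)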

  7. Developing an efficient technique for satellite image denoising and resolution enhancement for improving classification accuracy

    NASA Astrophysics Data System (ADS)

    Thangaswamy, Sree Sharmila; Kadarkarai, Ramar; Thangaswamy, Sree Renga Raja

    2013-01-01

    Satellite images are corrupted by noise during image acquisition and transmission. The removal of noise from the image by attenuating the high-frequency image components removes important details as well. In order to retain the useful information, improve the visual appearance, and accurately classify an image, an effective denoising technique is required. We discuss three important steps such as image denoising, resolution enhancement, and classification for improving accuracy in a noisy image. An effective denoising technique, hybrid directional lifting, is proposed to retain the important details of the images and improve visual appearance. The discrete wavelet transform based interpolation is developed for enhancing the resolution of the denoised image. The image is then classified using a support vector machine, which is superior to other neural network classifiers. The quantitative performance measures such as peak signal to noise ratio and classification accuracy show the significance of the proposed techniques.

  8. Wavelet-based target detection using multiscale directional analysis

    NASA Astrophysics Data System (ADS)

    Chambers, Bradley J.; Reynolds, William D., Jr.; Campbell, Derrick S.; Fennell, Darius K.; Ansari, Rashid

    2007-04-01

    Efficient processing of imagery derived from remote sensing systems has become ever more important due to increasing data sizes, rates, and bit depths. This paper proposes a target detection method that uses a special class of wavelets based on highly frequency-selective directional filter banks. The approach helps isolate object features in different directional filter output components. These components lend themselves well to the application of powerful denoising and edge detection procedures in the wavelet domain. Edge information is derived from directional wavelet decompositions to detect targets of known dimension in electro optical imagery. Results of successful detection of objects using the proposed method are presented in the paper. The approach highlights many of the benefits of working with directional wavelet analysis for image denoising and detection.

  9. Design Methodology of a New Wavelet Basis Function for Fetal Phonocardiographic Signals

    PubMed Central

    Chourasia, Vijay S.; Tiwari, Anil Kumar

    2013-01-01

    Fetal phonocardiography (fPCG) based antenatal care system is economical and has a potential to use for long-term monitoring due to noninvasive nature of the system. The main limitation of this technique is that noise gets superimposed on the useful signal during its acquisition and transmission. Conventional filtering may result into loss of valuable diagnostic information from these signals. This calls for a robust, versatile, and adaptable denoising method applicable in different operative circumstances. In this work, a novel algorithm based on wavelet transform has been developed for denoising of fPCG signals. Successful implementation of wavelet theory in denoising is heavily dependent on selection of suitable wavelet basis function. This work introduces a new mother wavelet basis function for denoising of fPCG signals. The performance of newly developed wavelet is found to be better when compared with the existing wavelets. For this purpose, a two-channel filter bank, based on characteristics of fPCG signal, is designed. The resultant denoised fPCG signals retain the important diagnostic information contained in the original fPCG signal. PMID:23766693

  10. Blind source separation of multichannel electroencephalogram based on wavelet transform and ICA

    NASA Astrophysics Data System (ADS)

    You, Rong-Yi; Chen, Zhong

    2005-11-01

    A combination of the wavelet transform and independent component analysis (ICA) was employed for blind source separation (BSS) of multichannel electroencephalogram (EEG) recordings. After denoising the original signals by the discrete wavelet transform, high-frequency components of some noises and artifacts were removed from the original signals. The denoised signals were then reconstructed for ICA, so that the drawback that ICA cannot distinguish noise from source signals can be overcome effectively. Practical processing results showed that this method is an effective way to perform BSS of multichannel EEG. The method is in essence a combination of the wavelet transform with an adaptive neural network, so it is also useful for BSS of other complex signals.
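
    The two-stage idea (wavelet de-noising per channel, then ICA on the reconstructed channels) can be sketched with PyWavelets and scikit-learn's FastICA as below; the random data, wavelet, level and threshold rule are placeholders and not the authors' recordings or settings.

        # Hedged sketch: per-channel DWT denoising followed by FastICA for BSS.
        import numpy as np
        import pywt
        from sklearn.decomposition import FastICA

        def dwt_denoise(channel, wavelet="db4", level=4):
            coeffs = pywt.wavedec(channel, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2.0 * np.log(channel.size))
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: channel.size]

        rng = np.random.default_rng(1)
        eeg = rng.standard_normal((8, 4096))                    # stand-in for 8-channel EEG

        denoised = np.vstack([dwt_denoise(ch) for ch in eeg])   # stage 1: remove high-frequency noise
        ica = FastICA(n_components=8, random_state=0)
        sources = ica.fit_transform(denoised.T)                 # stage 2: blind source separation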

  11. Coherent noise removal in seismic data with dual-tree M-band wavelets

    NASA Astrophysics Data System (ADS)

    Duval, Laurent; Chaux, Caroline; Ker, Stéphan

    2007-09-01

    Seismic data and their complexity still challenge signal processing algorithms in several applications. The advent of wavelet transforms has allowed improvements in tackling denoising problems. We propose here coherent noise filtering in seismic data with the dual-tree M-band wavelet transform. They offer the possibility to decompose data locally with improved multiscale directions and frequency bands. Denoising is performed in a deterministic fashion in the directional subbands, depending of the coherent noise properties. Preliminary results show that they consistently better preserve seismic signal of interest embedded in highly energetic directional noises than discrete critically sampled and redundant separable wavelet transforms.

  12. Noise reduction in LOS wind velocity of Doppler lidar using discrete wavelet analysis

    NASA Astrophysics Data System (ADS)

    Wu, Songhua; Liu, Zhishen; Sun, Dapeng

    2003-12-01

    The line-of-sight (LOS) wind velocity can be determined from incoherent Doppler lidar backscattering signals. Noise and interference in the measurement greatly degrade the inversion accuracy. In this paper, we apply a discrete wavelet denoising method using biorthogonal wavelets and adopt a distance-dependent thresholds algorithm to improve the accuracy of wind velocity measurement by incoherent Doppler lidar. The noisy simulated data are processed and compared with the true LOS wind velocity, and the results are evaluated in terms of both the standard deviation and the correlation coefficient. The results suggest that wavelet denoising with distance-dependent thresholds can considerably reduce the noise and interfering turbulence in wind lidar measurements.

  13. Median Modified Wiener Filter for nonlinear adaptive spatial denoising of protein NMR multidimensional spectra.

    PubMed

    Cannistraci, Carlo Vittorio; Abbas, Ahmed; Gao, Xin

    2015-01-26

    Denoising multidimensional NMR-spectra is a fundamental step in NMR protein structure determination. The state-of-the-art method uses wavelet-denoising, which may suffer when applied to non-stationary signals affected by Gaussian-white-noise mixed with strong impulsive artifacts, like those in multi-dimensional NMR-spectra. Regrettably, Wavelet's performance depends on a combinatorial search of wavelet shapes and parameters; and multi-dimensional extension of wavelet-denoising is highly non-trivial, which hampers its application to multidimensional NMR-spectra. Here, we endorse a diverse philosophy of denoising NMR-spectra: less is more! We consider spatial filters that have only one parameter to tune: the window-size. We propose, for the first time, the 3D extension of the median-modified-Wiener-filter (MMWF), an adaptive variant of the median-filter, and also its novel variation named MMWF*. We test the proposed filters and the Wiener-filter, an adaptive variant of the mean-filter, on a benchmark set that contains 16 two-dimensional and three-dimensional NMR-spectra extracted from eight proteins. Our results demonstrate that the adaptive spatial filters significantly outperform their non-adaptive versions. The performance of the new MMWF* on 2D/3D-spectra is even better than wavelet-denoising. Noticeably, MMWF* produces stable high performance almost invariant for diverse window-size settings: this signifies a consistent advantage in the implementation of automatic pipelines for protein NMR-spectra analysis.
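
    One plausible reading of the median-modified Wiener filter is a local adaptive Wiener filter in which the local mean is replaced by the local median; the 2D sketch below follows that reading, with the window size and the global noise-variance estimate as assumptions (the paper itself works on 2D/3D NMR spectra and tunes only the window size).

        # Hedged sketch: a median-modified local Wiener filter (local mean replaced by
        # the local median); not the authors' exact implementation.
        import numpy as np
        from scipy.ndimage import median_filter, uniform_filter

        def median_modified_wiener(img, window=5, noise_var=None):
            local_med = median_filter(img, size=window)
            local_mean = uniform_filter(img, size=window)
            local_sqr_mean = uniform_filter(img ** 2, size=window)
            local_var = np.maximum(local_sqr_mean - local_mean ** 2, 1e-12)
            if noise_var is None:
                noise_var = np.mean(local_var)            # crude global noise estimate
            gain = np.maximum(local_var - noise_var, 0.0) / local_var
            return local_med + gain * (img - local_med)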

  14. Median Modified Wiener Filter for nonlinear adaptive spatial denoising of protein NMR multidimensional spectra

    PubMed Central

    Cannistraci, Carlo Vittorio; Abbas, Ahmed; Gao, Xin

    2015-01-01

    Denoising multidimensional NMR-spectra is a fundamental step in NMR protein structure determination. The state-of-the-art method uses wavelet-denoising, which may suffer when applied to non-stationary signals affected by Gaussian-white-noise mixed with strong impulsive artifacts, like those in multi-dimensional NMR-spectra. Regrettably, Wavelet's performance depends on a combinatorial search of wavelet shapes and parameters; and multi-dimensional extension of wavelet-denoising is highly non-trivial, which hampers its application to multidimensional NMR-spectra. Here, we endorse a diverse philosophy of denoising NMR-spectra: less is more! We consider spatial filters that have only one parameter to tune: the window-size. We propose, for the first time, the 3D extension of the median-modified-Wiener-filter (MMWF), an adaptive variant of the median-filter, and also its novel variation named MMWF*. We test the proposed filters and the Wiener-filter, an adaptive variant of the mean-filter, on a benchmark set that contains 16 two-dimensional and three-dimensional NMR-spectra extracted from eight proteins. Our results demonstrate that the adaptive spatial filters significantly outperform their non-adaptive versions. The performance of the new MMWF* on 2D/3D-spectra is even better than wavelet-denoising. Noticeably, MMWF* produces stable high performance almost invariant for diverse window-size settings: this signifies a consistent advantage in the implementation of automatic pipelines for protein NMR-spectra analysis. PMID:25619991

  15. Median-modified Wiener filter provides efficient denoising, preserving spot edge and morphology in 2-DE image processing.

    PubMed

    Cannistraci, Carlo V; Montevecchi, Franco M; Alessio, Massimo

    2009-11-01

    Denoising is a fundamental early stage in 2-DE image analysis strongly influencing spot detection or pixel-based methods. A novel nonlinear adaptive spatial filter (median-modified Wiener filter, MMWF), is here compared with five well-established denoising techniques (Median, Wiener, Gaussian, and Polynomial-Savitzky-Golay filters; wavelet denoising) to suggest, by means of fuzzy sets evaluation, the best denoising approach to use in practice. Although median filter and wavelet achieved the best performance in spike and Gaussian denoising respectively, they are unsuitable for contemporary removal of different types of noise, because their best setting is noise-dependent. Vice versa, MMWF that arrived second in each single denoising category, was evaluated as the best filter for global denoising, being its best setting invariant of the type of noise. In addition, median filter eroded the edge of isolated spots and filled the space between close-set spots, whereas MMWF because of a novel filter effect (drop-off-effect) does not suffer from erosion problem, preserves the morphology of close-set spots, and avoids spot and spike fuzzyfication, an aberration encountered for Wiener filter. In our tests, MMWF was assessed as the best choice when the goal is to minimize spot edge aberrations while removing spike and Gaussian noise.

  16. Fast Translation Invariant Multiscale Image Denoising.

    PubMed

    Li, Meng; Ghosal, Subhashis

    2015-12-01

    Translation invariant (TI) cycle spinning is an effective method for removing artifacts from images. However, for a method using O(n) time, exact TI cycle spinning by averaging all possible circulant shifts requires O(n^2) time, where n is the number of pixels, and is therefore not feasible in practice. The existing literature has investigated efficient algorithms to calculate the TI version of some denoising approaches, such as the Haar wavelet. Multiscale methods, especially those based on likelihood decomposition, such as penalized likelihood estimators and Bayesian methods, have become popular in image processing because of their effectiveness in denoising images. As far as we know, there is no systematic investigation of the TI calculation corresponding to general multiscale approaches. In this paper, we propose a fast TI (FTI) algorithm and a more general k-TI algorithm allowing TI for the last k scales of the image, which are applicable to general d-dimensional images (d = 2, 3, …) with either Gaussian or Poisson noise. The proposed FTI leads to the exact TI estimation but requires only O(n log2 n) time. The proposed k-TI can achieve almost the same performance as the exact TI estimation, but requires even less time. We achieve this by exploiting the regularity present in the multiscale structure, which is justified theoretically. The proposed FTI and k-TI are generic in that they are applicable to any smoothing technique based on the multiscale structure. We demonstrate the FTI and k-TI algorithms on some recently proposed state-of-the-art methods for both Poisson- and Gaussian-noised images. Both simulations and a real data application confirm the appealing performance of the proposed algorithms. MATLAB toolboxes are accessible online to reproduce the results and can be implemented for general multiscale denoising approaches provided by the users.
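
    For contrast with the fast algorithm described above, the naive translation-invariant estimator can be written directly: denoise every circular shift of the signal, undo the shift, and average. The sketch below does this in 1D with a simple Haar soft-shrinkage as the inner denoiser; it is the slow baseline, not the proposed FTI/k-TI algorithms.

        # Hedged sketch: exact TI cycle spinning by brute force (the slow baseline).
        import numpy as np
        import pywt

        def haar_shrink(x, thr):
            coeffs = pywt.wavedec(x, "haar")
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, "haar")[: x.size]

        def cycle_spin_denoise(x, thr, n_shifts=None):
            n_shifts = x.size if n_shifts is None else n_shifts  # all shifts = exact TI estimate
            acc = np.zeros(x.size)
            for s in range(n_shifts):
                shifted = np.roll(x, s)
                acc += np.roll(haar_shrink(shifted, thr), -s)    # undo the shift before averaging
            return acc / n_shifts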

  17. [Wavelet NeighShrink method for grid texture removal in image of solar radio bursts].

    PubMed

    Zhao, Rui-zhen; Hu, Zhan-yi

    2007-01-01

    The data received from solar bursts contain a lot of noise, which makes further processing more difficult. To remove the noise and enhance the image, we studied the properties of the NeighShrink threshold function and analyzed the influence of neighborhood window size on the denoising result, on the basis of which a new wavelet NeighShrink square root method for image denoising is presented. Firstly, each channel of the solar burst image is normalized, which can, to some extent, remove the horizontal grid texture in the image. Secondly, the preprocessed image is decomposed by wavelet transform, and the obtained wavelet coefficients are thresholded by NeighShrink square root method. Finally, the denoised image is reconstructed by inverse wavelet transform. The experimental results show that the presented method is effective in noise removal and image enhancement.
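
    The standard NeighShrink rule shrinks each detail coefficient by a factor computed from the energy of its local neighbourhood; the square-root variant studied in the paper modifies this rule and is not reproduced here. The sketch below implements only the classical rule, with the window size and universal threshold as assumptions.

        # Hedged sketch: classical NeighShrink thresholding of 2D wavelet detail bands.
        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter

        def neighshrink_band(band, lam, window=3):
            s2 = uniform_filter(band ** 2, size=window) * window ** 2   # neighbourhood energy
            beta = np.maximum(1.0 - lam ** 2 / np.maximum(s2, 1e-12), 0.0)
            return beta * band

        def neighshrink_denoise(img, wavelet="db4", level=3, window=3):
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
            lam = sigma * np.sqrt(2.0 * np.log(img.size))               # universal threshold
            new_coeffs = [coeffs[0]]
            for bands in coeffs[1:]:
                new_coeffs.append(tuple(neighshrink_band(b, lam, window) for b in bands))
            return pywt.waverec2(new_coeffs, wavelet)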

  18. Studies on filtered back-projection imaging reconstruction based on a modified wavelet threshold function

    NASA Astrophysics Data System (ADS)

    Wang, Zhengzi; Ren, Zhong; Liu, Guodong

    2016-10-01

    In this paper, the wavelet threshold denoising method was incorporated into the filtered back-projection algorithm for image reconstruction. To overcome the drawbacks of the traditional soft- and hard-threshold functions, a modified wavelet threshold function was proposed. The modified wavelet threshold function has two threshold values and two variants. To verify the feasibility of the modified wavelet threshold function, standard test experiments were performed on the MATLAB software platform. Experimental results show that the filtered back-projection reconstruction algorithm based on the modified wavelet threshold function achieves a better reconstruction because the function is more flexible.
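    The proposed two-threshold function itself is not reproduced in the entry; as a stand-in, the sketch below shows the classical 'firm' (semi-soft) shrinkage, a well-known compromise between the hard and soft rules that also uses two threshold values.

        import numpy as np

        def firm_threshold(w, lam1, lam2):
            # Two-threshold shrinkage (requires lam2 > lam1):
            #   |w| <= lam1        -> 0
            #   lam1 < |w| <= lam2 -> sign(w) * lam2 * (|w| - lam1) / (lam2 - lam1)
            #   |w| >  lam2        -> w (kept unchanged, as in hard thresholding)
            w = np.asarray(w, dtype=float)
            a = np.abs(w)
            mid = np.sign(w) * lam2 * (a - lam1) / (lam2 - lam1)
            return np.where(a <= lam1, 0.0, np.where(a <= lam2, mid, w))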

  19. Quantum Boolean image denoising

    NASA Astrophysics Data System (ADS)

    Mastriani, Mario

    2015-05-01

    A quantum Boolean image processing methodology is presented in this work, with special emphasis on image denoising. A new approach for internal image representation is outlined together with two new interfaces: classical-to-quantum and quantum-to-classical. The new quantum Boolean image denoising, called the quantum Boolean mean filter, works exclusively with computational basis states (CBS). To achieve this, we first decompose the image into its three color components, i.e., red, green and blue. Then, we get the bitplanes for each color, e.g., 8 bits per pixel, i.e., 8 bitplanes per color. From then on, we work exclusively with the bitplane corresponding to the most significant bit (MSB) of each color. After a classical-to-quantum interface (which includes a classical inverter), we have a quantum Boolean version of the image within the quantum machine. This methodology allows us to avoid the problem of quantum measurement, which alters the measured state except in the case of CBS; this observation also extends to quantum algorithms outside image processing. After filtering of the inverted version of the MSB (inside the quantum machine), the result passes through a quantum-to-classical interface (which involves another classical inverter), each color component is then reassembled, and the final filtered image is obtained. Finally, we discuss the most appropriate metrics for image denoising in a set of experimental results.

  20. 3D Wavelet-Based Filter and Method

    DOEpatents

    Moss, William C.; Haase, Sebastian; Sedat, John W.

    2008-08-12

    A 3D wavelet-based filter for visualizing and locating structural features of a user-specified linear size in 2D or 3D image data. The only input parameter is a characteristic linear size of the feature of interest, and the filter output contains only those regions that are correlated with the characteristic size, thus denoising the image.

  1. The use of ensemble empirical mode decomposition as a novel denoising technique

    NASA Astrophysics Data System (ADS)

    Gaci, Said; Hachay, Olga; Zaourar, Naima

    2016-04-01

    Denoising is of high importance in geophysical data processing. This paper suggests a new denoising technique based on Ensemble Empirical Mode Decomposition (EEMD), which has been compared with discrete wavelet transform (DWT) thresholding. Firstly, both methods were implemented on synthetic signals with diverse waveforms ('blocks', 'heavy sine', 'Doppler', and 'mishmash'). The EEMD denoising method proved to be the most efficient for the 'blocks', 'heavy sine' and 'mishmash' signals for all the considered signal-to-noise ratio (SNR) values. However, the results obtained using DWT thresholding are the most reliable for the 'Doppler' signal, and the difference between the mean square error (MSE) values calculated for the two methods is slight and decreases as the SNR value decreases. Secondly, the denoising methods were applied to real seismic traces recorded in the Algerian Sahara. It is shown that the proposed technique outperforms DWT thresholding. In conclusion, the EEMD technique can provide a powerful tool for denoising seismic signals. Keywords: Ensemble Empirical Mode Decomposition (EEMD), Discrete wavelet transform (DWT), seismic signal.

  2. Relevant modes selection method based on Spearman correlation coefficient for laser signal denoising using empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Duan, Yabo; Song, Chengtian

    2016-12-01

    Empirical mode decomposition (EMD) is a recently proposed nonlinear and nonstationary laser signal denoising method. A noisy signal is broken down using EMD into oscillatory components that are called intrinsic mode functions (IMFs). Thresholding-based denoising and correlation-based partial reconstruction of IMFs are the two main research directions for EMD-based denoising. Similar to other decomposition-based denoising approaches, EMD-based denoising methods require a reliable threshold to determine which IMFs are noise components and which IMFs are noise-free components. In this work, we propose a new approach in which each IMF is first denoised using EMD interval thresholding (EMD-IT), and then a robust thresholding process based on Spearman correlation coefficient is used for relevant modes selection. The proposed method tackles the problem using a thresholding-based denoising approach coupled with partial reconstruction of the relevant IMFs. Other traditional denoising methods, including correlation-based EMD partial reconstruction (EMD-Correlation), discrete Fourier transform and wavelet-based methods, are investigated to provide a comparison with the proposed technique. Simulation and test results demonstrate the superior performance of the proposed method when compared with the other methods.
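    A minimal sketch of the relevant-modes selection step is given below, assuming the IMFs have already been produced by some EMD implementation; the keep-rule shown (retain modes whose Spearman correlation with the observed signal is at least a fraction of the maximum) is illustrative, not the paper's derived threshold, and the EMD interval-thresholding stage is omitted.

        import numpy as np
        from scipy.stats import spearmanr

        def select_relevant_imfs(signal, imfs, rho_frac=0.5):
            # imfs: array of shape (n_imfs, n_samples) from any EMD implementation.
            imfs = np.asarray(imfs)
            rhos = []
            for imf in imfs:
                rho, _ = spearmanr(signal, imf)
                rhos.append(abs(rho))
            rhos = np.array(rhos)
            keep = rhos >= rho_frac * rhos.max()        # illustrative selection rule
            partial = imfs[keep].sum(axis=0)            # partial reconstruction of relevant modes
            return partial, keep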

  3. A comparison of Monte Carlo dose calculation denoising techniques

    NASA Astrophysics Data System (ADS)

    El Naqa, I.; Kawrakow, I.; Fippel, M.; Siebers, J. V.; Lindsay, P. E.; Wickerhauser, M. V.; Vicic, M.; Zakarian, K.; Kauffmann, N.; Deasy, J. O.

    2005-03-01

    Recent studies have demonstrated that Monte Carlo (MC) denoising techniques can reduce MC radiotherapy dose computation time significantly by preferentially eliminating statistical fluctuations ('noise') through smoothing. In this study, we compare new and previously published approaches to MC denoising, including 3D wavelet threshold denoising with sub-band adaptive thresholding, content adaptive mean-median-hybrid (CAMH) filtering, locally adaptive Savitzky-Golay curve-fitting (LASG), anisotropic diffusion (AD) and an iterative reduction of noise (IRON) method formulated as an optimization problem. Several challenging phantom and computed-tomography-based MC dose distributions with varying levels of noise formed the test set. Denoising effectiveness was measured in three ways: by improvements in the mean-square-error (MSE) with respect to a reference (low noise) dose distribution; by the maximum difference from the reference distribution and by the 'Van Dyk' pass/fail criteria of either adequate agreement with the reference image in low-gradient regions (within 2% in our case) or, in high-gradient regions, a distance-to-agreement-within-2% of less than 2 mm. Results varied significantly based on the dose test case: greater reductions in MSE were observed for the relatively smoother phantom-based dose distribution (up to a factor of 16 for the LASG algorithm); smaller reductions were seen for an intensity modulated radiation therapy (IMRT) head and neck case (typically, factors of 2-4). Although several algorithms reduced statistical noise for all test geometries, the LASG method had the best MSE reduction for three of the four test geometries, and performed the best for the Van Dyk criteria. However, the wavelet thresholding method performed better for the head and neck IMRT geometry and also decreased the maximum error more effectively than LASG. In almost all cases, the evaluated methods provided acceleration of MC results towards statistically more accurate

  4. The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images

    SciTech Connect

    Pita-Machado, Reinado; Perez-Diaz, Marlen; Lorenzo-Ginori, Juan V.; Bravo-Pino, Rolando

    2014-11-07

    Wavelet transform based de-noising, like wavelet shrinkage, gives good results in CT, and this procedure affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Methods that work in sinogram space do not have this problem, because at that point they always work over a known noise distribution. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise that are not eliminated during the reconstruction procedure. This can lead to false-positive evaluations. The purpose of the present work is to compare different wavelet shrinkage de-noising filters applied in sinogram space to reduce noise, particularly in images of the posterior fossa within CT scans. This work describes an experimental search for the best wavelets to reduce Poisson noise in Computed Tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.

  5. Study on the FOG's signal based on wavelet

    NASA Astrophysics Data System (ADS)

    Tang, Ji-qiang; Fang, Jian-cheng; Zhang, Yan-shun

    2006-11-01

    To study the fiber optic gyro (FOG) signal using wavelets, this paper examines the FOG signal drift model and the properties of wavelet-analyzed noise, and introduces the wavelet filtering method, wavelet basis selection, soft- and hard-threshold de-noising algorithms, and compulsive filtering based on the Haar wavelet. The filtering results of the soft and hard threshold methods for the same db4 wavelet basis with the same Donoho threshold values, and the results of compulsive filtering based on the Haar and db4 wavelets, are also presented. The following main conclusions are reached from this analysis: the larger the decomposition scale, the better the filtering effect; the soft-threshold filtering effect is better than that of hard-threshold filtering, at the cost of extra computation, when the threshold value is the same; the zero shift of compulsive filtering is smallest when both the wavelet and the decomposition scale are the same for these filtering methods; and for compulsive filtering with the same decomposition scale, the filtering effect of the Haar wavelet is better than that of db4 and requires less computation. Finally, the authors point out that applying compulsive filtering with the Haar wavelet basis and a suitable decomposition scale to FOG signal processing is helpful for FOG design and manufacturing.

  6. A comparison of wavelet analysis techniques in digital holograms

    NASA Astrophysics Data System (ADS)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.

  7. A fast non-local image denoising algorithm

    NASA Astrophysics Data System (ADS)

    Dauwe, A.; Goossens, B.; Luong, H. Q.; Philips, W.

    2008-02-01

    In this paper we propose several improvements to the original non-local means algorithm introduced by Buades et al., which obtains state-of-the-art denoising results. The strength of this algorithm is to exploit the repetitive character of the image in order to denoise it, unlike conventional denoising algorithms, which typically operate in a local neighbourhood. Due to the enormous number of weight computations, the original algorithm has a high computational cost. Image quality can be improved over the original algorithm by ignoring the contributions from dissimilar windows: even though their individual weights are very small, the estimated pixel value can be severely biased by the accumulation of many small contributions. This adverse influence of dissimilar windows is eliminated by setting their corresponding weights to zero. Using a preclassification based on the first three statistical moments, only contributions from similar neighbourhoods are computed. To decide whether a window is similar or dissimilar, we derive thresholds for images corrupted with additive white Gaussian noise. Our accelerated approach is further optimized by taking advantage of the symmetry in the weights, which roughly halves the computation time, and by using a lookup table to speed up the weight computations. Compared to the original algorithm, our proposed method produces images with increased PSNR and better visual quality in less computation time. Our proposed method even outperforms state-of-the-art wavelet denoising techniques in both visual quality and PSNR values for images containing many repetitive structures, such as textures: the denoised images are much sharper and contain fewer artifacts. The proposed optimizations can also be applied in other image processing tasks which employ the concept of repetitive structures, such as intra-frame super-resolution or detection of digital image forgery.
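    The moment-based preclassification can be sketched as follows; the patch size and tolerances are placeholders (the paper derives its thresholds for additive white Gaussian noise), and the accelerated weight loop itself (symmetry, lookup tables) is not shown.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def patch_moments(img, patch=7):
            # First three local statistical moments used for window preclassification.
            img = np.asarray(img, dtype=float)
            m1 = uniform_filter(img, patch)                      # local mean
            m2 = uniform_filter((img - m1) ** 2, patch)          # local variance
            m3 = uniform_filter((img - m1) ** 3, patch)          # local third central moment
            return m1, m2, m3

        def windows_similar(moments, p, q, tol=(0.5, 0.5, 0.5)):
            # Inside the NLM loop, compute the weight between pixels p and q only if all
            # three moment differences stay below the (illustrative) tolerances; otherwise
            # the corresponding weight is simply set to zero.
            return all(abs(m[p] - m[q]) <= t for m, t in zip(moments, tol))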

  8. Generalized total variation-based MRI Rician denoising model with spatially adaptive regularization parameters.

    PubMed

    Liu, Ryan Wen; Shi, Lin; Huang, Wenhua; Xu, Jing; Yu, Simon Chun Ho; Wang, Defeng

    2014-07-01

    Magnetic resonance imaging (MRI) is an outstanding medical imaging modality but the quality often suffers from noise pollution during image acquisition and transmission. The purpose of this study is to enhance image quality using feature-preserving denoising method. In current literature, most existing MRI denoising methods did not simultaneously take the global image prior and local image features into account. The denoising method proposed in this paper is implemented based on an assumption of spatially varying Rician noise map. A two-step wavelet-domain estimation method is developed to extract the noise map. Following a Bayesian modeling approach, a generalized total variation-based MRI denoising model is proposed based on global hyper-Laplacian prior and Rician noise assumption. The proposed model has the properties of backward diffusion in local normal directions and forward diffusion in local tangent directions. To further improve the denoising performance, a local variance estimator-based method is introduced to calculate the spatially adaptive regularization parameters related to local image features and spatially varying noise map. The main benefit of the proposed method is that it takes full advantage of the global MR image prior and local image features. Numerous experiments have been conducted on both synthetic and real MR data sets to compare our proposed model with some state-of-the-art denoising methods. The experimental results have demonstrated the superior performance of our proposed model in terms of quantitative and qualitative image quality evaluations.

  9. Optimal FPGA implementation of CL multiwavelets architecture for signal denoising application

    NASA Astrophysics Data System (ADS)

    Mohan Kumar, B.; Vidhya Lavanya, R.; Sumesh, E. P.

    2013-03-01

    The wavelet transform is considered one of the efficient transforms of this decade for real-time signal processing. Due to implementation constraints, scalar wavelets cannot simultaneously possess properties such as compact support, regularity, orthogonality and symmetry, which are desirable qualities for achieving a good signal-to-noise ratio (SNR) in signal denoising. This led to a new class of wavelets called 'multiwavelets', which possess more than one scaling and wavelet filter. The architecture implementation of multiwavelets is an emerging area of research. In real time, signals are in scalar form, which demands a scalar processing architecture, but the conventional Donovan-Geronimo-Hardin-Massopust (DGHM) and Chui-Lian (CL) multiwavelets are vectored and also unbalanced. In this article, the vectored multiwavelet transforms are converted into scalar form and the corresponding architecture is implemented on an FPGA (Field Programmable Gate Array) for a signal denoising application. The architecture is compared with the DGHM multiwavelet architecture in terms of several objective and performance measures. The CL multiwavelet architecture is further optimised for best performance by using DSP48Es. The results show that the CL multiwavelet architecture is better suited for the signal denoising application.

  10. Periodized wavelets

    SciTech Connect

    Schlossnagle, G.; Restrepo, J.M.; Leaf, G.K.

    1993-12-01

    The properties of periodized Daubechies wavelets on [0,1] are detailed and contrasted against their counterparts which form a basis for L{sup 2}(R). Numerical examples illustrate the analytical estimates for convergence and demonstrate by comparison with Fourier spectral methods the superiority of wavelet projection methods for approximations. The analytical solution to inner products of periodized wavelets and their derivatives, which are known as connection coefficients, is presented, and several tabulated values are included.
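    For readers who want to experiment, the 'periodization' mode of PyWavelets implements a circular (periodized) Daubechies transform in the same spirit; the short sketch below checks the non-redundant coefficient count and perfect reconstruction for a signal sampled on [0, 1).

        import numpy as np
        import pywt

        n = 256
        t = np.arange(n) / n
        x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 11 * t)

        coeffs = pywt.wavedec(x, "db4", mode="periodization", level=4)
        assert sum(c.size for c in coeffs) == n          # non-redundant periodized representation
        x_rec = pywt.waverec(coeffs, "db4", mode="periodization")
        print(np.max(np.abs(x - x_rec)))                 # ~1e-15: perfect reconstruction up to round-off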

  11. Global Image Denoising.

    PubMed

    Talebi, Hossein; Milanfar, Peyman

    2014-02-01

    Most existing state-of-the-art image denoising algorithms are based on exploiting similarity between a relatively modest number of patches. These patch-based methods are strictly dependent on patch matching, and their performance is hamstrung by the ability to reliably find sufficiently similar patches. As the number of patches grows, a point of diminishing returns is reached where the performance improvement due to more patches is offset by the lower likelihood of finding sufficiently close matches. The net effect is that while patch-based methods, such as BM3D, are excellent overall, they are ultimately limited in how well they can do on (larger) images with increasing complexity. In this paper, we address these shortcomings by developing a paradigm for truly global filtering where each pixel is estimated from all pixels in the image. Our objectives in this paper are two-fold. First, we give a statistical analysis of our proposed global filter, based on a spectral decomposition of its corresponding operator, and we study the effect of truncation of this spectral decomposition. Second, we derive an approximation to the spectral (principal) components using the Nyström extension. Using these, we demonstrate that this global filter can be implemented efficiently by sampling a fairly small percentage of the pixels in the image. Experiments illustrate that our strategy can effectively globalize any existing denoising filters to estimate each pixel using all pixels in the image, hence improving upon the best patch-based methods.

  12. Filtered back-projection reconstruction of photo-acoustic imaging based on a modified wavelet threshold function

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2016-10-01

    In this study, the filtered back-projection algorithm was used to reconstruct photoacoustic images. To improve the quality of the reconstructed image, the wavelet threshold denoising method was combined with the filtered back-projection reconstruction algorithm, and a modified wavelet threshold function was proposed. To verify the feasibility of the modified wavelet threshold function, simulation experiments on a standard test phantom were performed using three different wavelet threshold functions. Compared with the soft- and hard-threshold functions, the modified wavelet threshold function has a better denoising and reconstruction effect. Moreover, the peak signal-to-noise ratio (PSNR) value of the modified function is the largest, and its mean root square error (MRSE) value is less than that of the other two. Therefore, the filtered back-projection reconstruction algorithm combined with the modified wavelet threshold function has potential value in the reconstruction of photoacoustic images.

  13. Contrast Sensitivity of the Wavelet, Dual Tree Complex Wavelet, Curvelet and Steerable Pyramid Transforms.

    PubMed

    Hill, Paul; Achim, Alin; Al-Mualla, Mohammed Ebrahim; Bull, David

    2016-04-11

    Accurate estimation of the contrast sensitivity of the human visual system is crucial for perceptually based image processing in applications such as compression, fusion and denoising. Conventional Contrast Sensitivity Functions (CSFs) have been obtained using fixed sized Gabor functions. However, the basis functions of multiresolution decompositions such as wavelets often resemble Gabor functions but are of variable size and shape. Therefore to use conventional contrast sensitivity functions in such cases is not appropriate. We have therefore conducted a set of psychophysical tests in order to obtain the contrast sensitivity function for a range of multiresolution transforms: the Discrete Wavelet Transform (DWT), the Steerable Pyramid, the Dual-Tree Complex Wavelet Transform (DT-CWT) and the Curvelet Transform. These measures were obtained using contrast variation of each transforms' basis functions in a 2AFC experiment combined with an adapted version of the QUEST psychometric function method. The results enable future image processing applications that exploit these transforms such as signal fusion, super-resolution processing, denoising and motion estimation, to be perceptually optimised in a principled fashion. The results are compared to an existing vision model (HDR-VDP2) and are used to show quantitative improvements within a denoising application compared to using conventional CSF values.

  14. Denoising for 3-d photon-limited imaging data using nonseparable filterbanks.

    PubMed

    Santamaria-Pang, Alberto; Bildea, Teodor Stefan; Tan, Shan; Kakadiaris, Ioannis A

    2008-12-01

    In this paper, we present a novel frame-based denoising algorithm for photon-limited 3-D images. We first construct a new 3-D nonseparable filterbank by adding elements to an existing frame in a structurally stable way. In contrast with the traditional 3-D separable wavelet system, the new filterbank is capable of using edge information in multiple directions. We then propose a data-adaptive hysteresis thresholding algorithm based on this new 3-D nonseparable filterbank. In addition, we develop a new validation strategy for denoising of photon-limited images containing sparse structures, such as neurons (the structure of interest is less than 5% of total volume). The validation method, based on tubular neighborhoods around the structure, is used to determine the optimal threshold of the proposed denoising algorithm. We compare our method with other state-of-the-art methods and report very encouraging results on applications utilizing both synthetic and real data.

  15. Comparison of de-noising techniques for FIRST images

    SciTech Connect

    Fodor, I K; Kamath, C

    2001-01-22

    Data obtained through scientific observations are often contaminated by noise and artifacts from various sources. As a result, a first step in mining these data is to isolate the signal of interest by minimizing the effects of the contaminations. Once the data has been cleaned or de-noised, data mining can proceed as usual. In this paper, we describe our work in denoising astronomical images from the Faint Images of the Radio Sky at Twenty-Centimeters (FIRST) survey. We are mining this survey to detect radio-emitting galaxies with a bent-double morphology. This task is made difficult by the noise in the images caused by the processing of the sensor data. We compare three different approaches to de-noising: thresholding of wavelet coefficients advocated in the statistical community, traditional filtering methods used in the image processing community, and a simple thresholding scheme proposed by FIRST astronomers. While each approach has its merits and pitfalls, we found that for our purpose, the simple thresholding scheme worked relatively well for the FIRST dataset.

  16. Research on infrared-image denoising algorithm based on the noise analysis of the detector

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Zhou, Xiaodong; Shen, Tongsheng; Han, Yanli

    2005-01-01

    Since conventional denoising algorithms do not take the characteristics of a specific detector into account, they are not very effective at removing the various noises contained in low signal-to-noise-ratio infrared images. In this paper, a new approach to infrared image denoising is proposed, based on a noise analysis of the detector, using an L-model infrared multi-element detector as an example. According to the noise analysis of this detector, the emphasis is placed on how to filter white noise and fractal noise in the preprocessing phase. Wavelet analysis is a good tool for analyzing 1/f processes: a 1/f process can be viewed approximately as white noise, since its wavelet coefficients are stationary and uncorrelated. Therefore, if the wavelet transform is adopted, the problem of removing both white noise and fractal noise reduces to the single problem of removing white noise. To address this problem, a new wavelet-domain adaptive Wiener filtering algorithm is presented. From the viewpoint of quantitative and qualitative analyses, the filtering effect of our method is compared in detail with those of the traditional median filter, mean filter and wavelet thresholding algorithm. The results show that our method can reduce various noises effectively and noticeably raise the signal-to-noise ratio.

  17. A hybrid spatial-spectral denoising method for infrared hyperspectral images using 2DPCA

    NASA Astrophysics Data System (ADS)

    Huang, Jun; Ma, Yong; Mei, Xiaoguang; Fan, Fan

    2016-11-01

    The traditional noise reduction methods for 3-D infrared hyperspectral images typically operate independently in either the spatial or spectral domain, and such methods overlook the relationship between the two domains. To address this issue, we propose a hybrid spatial-spectral method in this paper to link both domains. First, principal component analysis and bivariate wavelet shrinkage are performed in the 2-D spatial domain. Second, 2-D principal component analysis transformation is conducted in the 1-D spectral domain to separate the basic components from detail ones. The energy distribution of noise is unaffected by orthogonal transformation; therefore, the signal-to-noise ratio of each component is used as a criterion to determine whether a component should be protected from over-denoising or denoised with certain 1-D denoising methods. This study implements the 1-D wavelet shrinking threshold method based on Stein's unbiased risk estimator, and the quantitative results on publicly available datasets demonstrate that our method can improve denoising performance more effectively than other state-of-the-art methods can.

  18. Acral peeling skin syndrome.

    PubMed

    Hashimoto, K; Hamzavi, I; Tanaka, K; Shwayder, T

    2000-12-01

    Peeling skin syndrome is a rare autosomal recessive disease characterized by widespread painless peeling of the skin in superficial sheets. We describe a 34-year-old man with a lifelong history of spontaneous asymptomatic peeling skin limited to the acral surfaces. This patient probably represents a localized variant of peeling skin syndrome, which has previously been described as a generalized condition. Light and electron microscopic studies of biopsy specimens taken before and after immersion in water were performed. It was concluded that this patient has abnormal keratohyalin granules and inadequate aggregation of keratin filaments that caused the separation of the epidermis in the stratum corneum through the clear zone. Alternatively, unknown keratin species expressed in the clear zone may also cause the abnormality. (J Am Acad Dermatol 2000;43:1112-9.).

  19. Chemistry with a Peel.

    ERIC Educational Resources Information Center

    Borer, Londa; Larsen, Eric

    1997-01-01

    Presents experiments that introduce natural product chemistry into high school classrooms. In the laboratory activities, students isolate and analyze the oil in orange peels. Students also perform a steam distillation and learn about terpenes. (DDR)

  20. Electrocardiogram Signal Denoising Using Extreme-Point Symmetric Mode Decomposition and Nonlocal Means

    PubMed Central

    Tian, Xiaoying; Li, Yongshuai; Zhou, Huan; Li, Xiang; Chen, Lisha; Zhang, Xuming

    2016-01-01

    Electrocardiogram (ECG) signals contain a great deal of essential information which can be utilized by physicians for the diagnosis of heart diseases. Unfortunately, ECG signals are inevitably corrupted by noise, which severely affects the accuracy of cardiovascular disease diagnosis. Existing ECG signal denoising methods based on wavelet shrinkage, empirical mode decomposition and nonlocal means (NLM) cannot provide sufficient noise reduction and detail preservation, especially under heavy noise corruption. To address this problem, we have proposed a hybrid ECG signal denoising scheme combining extreme-point symmetric mode decomposition (ESMD) with NLM. In the proposed method, the noisy ECG signal is first decomposed into several intrinsic mode functions (IMFs) and an adaptive global mean using ESMD. Then, the first several IMFs are filtered by the NLM method according to their frequency, with the QRS complex detected from these IMFs as the dominant feature of the ECG signal, while the remaining IMFs are left unprocessed. The denoised IMFs and unprocessed IMFs are combined to produce the final denoised ECG signal. Experiments on both simulated ECG signals and real ECG signals from the MIT-BIH database demonstrate that the proposed method can suppress noise in ECG signals effectively while preserving the details very well, and it outperforms several state-of-the-art ECG signal denoising methods in terms of signal-to-noise ratio (SNR), root mean squared error (RMSE), percent root mean square difference (PRD) and mean opinion score (MOS) error index. PMID:27681729

  1. Weak transient fault feature extraction based on an optimized Morlet wavelet and kurtosis

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Xing, Jianfeng; Mao, Yongfang

    2016-08-01

    Aimed at solving the key problem in weak transient detection, the present study proposes a new transient feature extraction approach using the optimized Morlet wavelet transform, kurtosis index and soft-thresholding. Firstly, a fast optimization algorithm based on the Shannon entropy is developed to obtain the optimized Morlet wavelet parameter. Compared to the existing Morlet wavelet parameter optimization algorithm, this algorithm has lower computation complexity. After performing the optimized Morlet wavelet transform on the analyzed signal, the kurtosis index is used to select the characteristic scales and obtain the corresponding wavelet coefficients. From the time-frequency distribution of the periodic impulsive signal, it is found that the transient signal can be reconstructed by the wavelet coefficients at several characteristic scales, rather than the wavelet coefficients at just one characteristic scale, so as to improve the accuracy of transient detection. Due to the noise influence on the characteristic wavelet coefficients, the adaptive soft-thresholding method is applied to denoise these coefficients. With the denoised wavelet coefficients, the transient signal can be reconstructed. The proposed method was applied to the analysis of two simulated signals, and the diagnosis of a rolling bearing fault and a gearbox fault. The superiority of the method over the fast kurtogram method was verified by the results of simulation analysis and real experiments. It is concluded that the proposed method is extremely suitable for extracting the periodic impulsive feature from strong background noise.
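    The kurtosis-based selection of characteristic scales can be sketched as follows; the Shannon-entropy optimisation of the Morlet parameter is not reproduced (the fixed 'morl' wavelet of PyWavelets is used instead), and the number of retained scales is an illustrative choice.

        import numpy as np
        import pywt
        from scipy.stats import kurtosis

        def characteristic_scales(x, scales, n_keep=3, wavelet="morl"):
            # Continuous Morlet transform; rows of coefs correspond to scales.
            coefs, _ = pywt.cwt(x, scales, wavelet)
            k = kurtosis(coefs, axis=1)                  # impulsiveness of each scale
            best = np.argsort(k)[-n_keep:]               # indices of the characteristic scales
            return best, coefs[best]

        # The selected rows can then be soft-thresholded, e.g.
        #   pywt.threshold(row, sigma * np.sqrt(2 * np.log(row.size)), mode="soft"),
        # before approximating the transient from these characteristic scales.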

  2. Discrete directional wavelet bases for image compression

    NASA Astrophysics Data System (ADS)

    Dragotti, Pier L.; Velisavljevic, Vladan; Vetterli, Martin; Beferull-Lozano, Baltasar

    2003-06-01

    The application of the wavelet transform in image processing is most frequently based on a separable construction: lines and columns of an image are treated independently, and the basis functions are simply products of the corresponding one-dimensional functions. Such a method keeps the design and computation simple, but is not capable of properly capturing all the properties of an image. In this paper, a new truly separable discrete multi-directional transform is proposed with a subsampling method based on lattice theory. Alternatively, the subsampling can be omitted, which leads to a multi-directional frame. This transform can be applied in many areas such as denoising, non-linear approximation and compression. The results on non-linear approximation and denoising show very interesting gains compared to the standard two-dimensional analysis.

  3. The chemical peel.

    PubMed

    Peters, W

    1991-06-01

    Chemical peeling of facial skin has become a valuable adjunct in the armamentarium of the facial aesthetic surgeon. Among the various techniques available, phenol solutions are the most commonly used. Peeling produces a controlled, partial-thickness chemical burn of the epidermis and the outer dermis. Several techniques are available to "fine tune" the depth of the peel. Regeneration of peeled skin results in a fresh, orderly, organized epidermis. In the dermis, a new 2- to 3-mm band of dense, compact, orderly collagen is formed between the epidermis and the underlying damaged dermis, which results in effective ablation of the fine wrinkles in the skin and a reduction of pigmentation. These clinical and histological changes are long lasting (15-20 years) and may be permanent in some patients. Because of the metabolism and systemic complications of phenol, patient selection should involve systemic evaluation of liver, renal, and cardiac function, as well as an evaluation of the skin quality and medication status of the patient. Because of potential cardiac arrhythmias, peeling must be performed in a medically supervised environment, with continuous cardiac monitoring. The local complications of peeling include pigmentation changes, scarring, milia, ectropion, infection, activation of herpes simplex, and toxic shock syndrome.

  4. Wavelet applied to computer vision in astrophysics

    NASA Astrophysics Data System (ADS)

    Bijaoui, Albert; Slezak, Eric; Traina, Myriam

    2004-02-01

    Multiscale analyses can be provided by applying wavelet transforms. For image processing purposes, we applied algorithms which provide a quasi-isotropic vision. For a uniform noisy image, a wavelet coefficient W has a probability density function (PDF) p(W) which depends on the noise statistics. The PDF was determined for many noise statistics: Gaussian, Poisson, Rayleigh and exponential. For CCD observations, the Anscombe transform was generalized to mixed Gaussian+Poisson noise. From the discrete wavelet transform, a set of significant wavelet coefficients (SSWC) is obtained. Many applications have been derived, such as denoising and deconvolution. Our main application is the decomposition of the image into objects, i.e. the vision. At each scale an image labelling is performed in the SSWC. An interscale graph linking the fields of significant pixels is then obtained. The objects are identified using this graph. The wavelet coefficients of the tree related to a given object allow one to reconstruct its image by a classical inverse method. This vision model has been applied to astronomical images, improving the analysis of complex structures.

  5. Sonar target enhancement by shrinkage of incoherent wavelet coefficients.

    PubMed

    Hunter, Alan J; van Vossen, Robbert

    2014-01-01

    Background reverberation can obscure useful features of the target echo response in broadband low-frequency sonar images, adversely affecting detection and classification performance. This paper describes a resolution and phase-preserving means of separating the target response from the background reverberation noise using a coherence-based wavelet shrinkage method proposed recently for de-noising magnetic resonance images. The algorithm weights the image wavelet coefficients in proportion to their coherence between different looks under the assumption that the target response is more coherent than the background. The algorithm is demonstrated successfully on experimental synthetic aperture sonar data from a broadband low-frequency sonar developed for buried object detection.
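    The coherence-weighting idea can be illustrated on two real-valued looks with an ordinary decimated wavelet transform; this is only a rough sketch, since the paper's method is phase-preserving and operates on complex multi-look synthetic aperture sonar data.

        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter

        def local_coherence(c1, c2, win=5):
            # Local normalised correlation of two looks' wavelet coefficients, in [0, 1].
            num = np.abs(uniform_filter(c1 * c2, win))
            den = np.sqrt(uniform_filter(c1 ** 2, win) * uniform_filter(c2 ** 2, win)) + 1e-12
            return num / den

        def two_look_shrink(look1, look2, wavelet="db4", level=3, win=5):
            l1 = np.asarray(look1, dtype=float)
            l2 = np.asarray(look2, dtype=float)
            a = pywt.wavedec2(l1, wavelet, level=level)
            b = pywt.wavedec2(l2, wavelet, level=level)
            out = [(a[0] + b[0]) / 2]
            for det_a, det_b in zip(a[1:], b[1:]):
                # Coherent (target-like) coefficients are kept, incoherent reverberation shrinks.
                out.append(tuple(local_coherence(x, y, win) * (x + y) / 2
                                 for x, y in zip(det_a, det_b)))
            return pywt.waverec2(out, wavelet)[:l1.shape[0], :l1.shape[1]]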

  6. Speckle Suppression in Ultrasonic Images Based on Undecimated Wavelets

    NASA Astrophysics Data System (ADS)

    Argenti, Fabrizio; Torricelli, Gionatan

    2003-12-01

    An original method to denoise ultrasonic images affected by speckle is presented. Speckle is modeled as a signal-dependent noise corrupting the image. Noise reduction is approached as a Wiener-like filtering performed in a shift-invariant wavelet domain by means of an adaptive rescaling of the coefficients of an undecimated octave decomposition. The scaling factor of each coefficient is calculated from local statistics of the degraded image, the parameters of the noise model, and the wavelet filters. Experimental results demonstrate that excellent background smoothing as well as preservation of edge sharpness and fine details can be obtained.
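    The Wiener-like rescaling of undecimated wavelet coefficients can be sketched with PyWavelets' stationary wavelet transform; note that the sketch treats the noise as additive with a single variance, whereas the paper uses a signal-dependent speckle model and scaling factors derived from the noise parameters and the wavelet filters.

        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter

        def swt_wiener_denoise(img, wavelet="db2", level=2, win=7, noise_var=None):
            # Image sides must be divisible by 2**level for pywt.swt2.
            img = np.asarray(img, dtype=float)
            if noise_var is None:
                # Rough noise estimate from a one-level diagonal detail band (MAD rule).
                d1 = pywt.swt2(img, wavelet, level=1)[0][1][2]
                noise_var = (np.median(np.abs(d1)) / 0.6745) ** 2

            def shrink(c):
                local_var = uniform_filter(c ** 2, win)              # coefficients are ~zero-mean
                gain = np.clip(local_var - noise_var, 0, None) / np.maximum(local_var, 1e-12)
                return gain * c                                      # Wiener-like rescaling

            coeffs = pywt.swt2(img, wavelet, level=level)
            coeffs = [(cA, tuple(shrink(d) for d in details)) for cA, details in coeffs]
            return pywt.iswt2(coeffs, wavelet)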

  7. Adaptive denoising and multiscale detection of the V wave in brainstem auditory evoked potentials.

    PubMed

    Popescu, M; Papadimitriou, S; Karamitsos, D; Bezerianos, A

    1999-01-01

    This paper describes a wavelet-transform-based system for the V wave identification in brainstem auditory evoked potentials (BAEP). The system combines signal denoising and rule-based localization modules. The signal denoising module has the potential of effective noise reduction after signal averaging. It analyses adaptively the evolution of the wavelet transform maxima across scales. The singularities of the signal create wavelet maxima with different properties from those of the induced noise. A non-linear filtering process implemented with a neural network extracts out the noise-induced maxima. The filtered wavelet details are subsequently analysed by the rule-based localization module for the automatic identification of the V wave. In the first phase, it implements a set of statistical observations as well as heuristic criteria used by human experts in order to classify the IV-V complex. At the second phase, using a multiscale focusing algorithm, the IV and V waves are positioned on the BAEP signal. Our experiments revealed that the system provides accurate results even for signals exhibiting unclear IV-V complexes.

  8. A de-noising algorithm to improve SNR of segmented gamma scanner for spectrum analysis

    NASA Astrophysics Data System (ADS)

    Li, Huailiang; Tuo, Xianguo; Shi, Rui; Zhang, Jinzhao; Henderson, Mark Julian; Courtois, Jérémie; Yan, Minhao

    2016-05-01

    An improved threshold shift-invariant wavelet transform de-noising algorithm for high-resolution gamma-ray spectroscopy is proposed, which optimizes the threshold function of the wavelet transform and reduces pseudo-Gibbs artificial fluctuations in the signal. This algorithm was applied to a segmented gamma scanning system with large samples, in which high continuum levels caused by Compton scattering are routinely encountered. De-noising of gamma-ray spectra measured by the segmented gamma scanning system was evaluated with the improved, the shift-invariant and the traditional wavelet transform algorithms. The improved wavelet transform method significantly enhanced the figure of merit, the root mean square error, the peak area, and the sample attenuation correction in the segmented gamma scanning assays. Spectrum analysis also showed that the gamma energy spectrum can be viewed as a superposition of a low-frequency signal and high-frequency noise. Moreover, a smoothed spectrum can be appropriate for straightforward automated quantitative analysis.

  9. Simultaneous Fusion and Denoising of Panchromatic and Multispectral Satellite Images

    NASA Astrophysics Data System (ADS)

    Ragheb, Amr M.; Osman, Heba; Abbas, Alaa M.; Elkaffas, Saleh M.; El-Tobely, Tarek A.; Khamis, S.; Elhalawany, Mohamed E.; Nasr, Mohamed E.; Dessouky, Moawad I.; Al-Nuaimy, Waleed; Abd El-Samie, Fathi E.

    2012-12-01

    To identify objects in satellite images, multispectral (MS) images with high spectral resolution and low spatial resolution, and panchromatic (Pan) images with high spatial resolution and low spectral resolution need to be fused. Several fusion methods such as the intensity-hue-saturation (IHS), the discrete wavelet transform, the discrete wavelet frame transform (DWFT), and the principal component analysis have been proposed in recent years to obtain images with both high spectral and spatial resolutions. In this paper, a hybrid fusion method for satellite images comprising both the IHS transform and the DWFT is proposed. This method tries to achieve the highest possible spectral and spatial resolutions with as small distortion in the fused image as possible. A comparison study between the proposed hybrid method and the traditional methods is presented in this paper. Different MS and Pan images from Landsat-5, Spot, Landsat-7, and IKONOS satellites are used in this comparison. The effect of noise on the proposed hybrid fusion method as well as the traditional fusion methods is studied. Experimental results show the superiority of the proposed hybrid method to the traditional methods. The results show also that a wavelet denoising step is required when fusion is performed at low signal-to-noise ratios.

  10. Phase-preserving speckle reduction based on soft thresholding in quaternion wavelet domain

    NASA Astrophysics Data System (ADS)

    Liu, Yipeng; Jin, Jing; Wang, Qiang; Shen, Yi

    2012-10-01

    Speckle reduction is a difficult task for ultrasound image processing because of low resolution and contrast. As a novel tool of image analysis, quaternion wavelet (QW) has some superior properties compared to discrete wavelets, such as nearly shift-invariant wavelet coefficients and phase-based texture presentation. We aim to exploit the excellent performance of speckle reduction in quaternion wavelet domain based on the soft thresholding method. First, we exploit the characteristics of magnitude and phases in quaternion wavelet transform (QWT) to the denoising application, and find that the QWT phases of the images are little influenced by the noises. Then we model the QWT magnitude using the Rayleigh distribution, and derive the thresholding criterion. Furthermore, we conduct several experiments on synthetic speckle images and real ultrasound images. The performance of the proposed speckle reduction algorithm, using QWT with soft thresholding, demonstrates superiority to those using discrete wavelet transform and classical algorithms.

  11. Applications of continuous and orthogonal wavelet transforms to MHD and plasma turbulence

    NASA Astrophysics Data System (ADS)

    Farge, Marie; Schneider, Kai

    2016-10-01

    Wavelet analysis and compression tools are presented and different applications to study MHD and plasma turbulence are illustrated. We use the continuous and the orthogonal wavelet transform to develop several statistical diagnostics based on the wavelet coefficients. We show how to extract coherent structures out of fully developed turbulent flows using wavelet-based denoising and describe multiscale numerical simulation schemes using wavelets. Several examples for analyzing, compressing and computing one, two and three dimensional turbulent MHD or plasma flows are presented. Details can be found in M. Farge and K. Schneider. Wavelet transforms and their applications to MHD and plasma turbulence: A review. Support by the French Research Federation for Fusion Studies within the framework of the European Fusion Development Agreement (EFDA) is thankfully acknowledged.

  12. Superficial chemical peels.

    PubMed

    Zakopoulou, N; Kontochristopoulos, G

    2006-09-01

    Superficial chemical peeling (SCP) involves the application of a peeling agent to the skin, resulting in destruction of part or all of the epidermis. SCP is mainly recommended for facial rejuvenation, photoaging and superficial rhytides, pigmentary dyschromias and acne. It can be used on all Fitzpatrick skin types, no sedation is needed, and the desquamation is usually well accepted. Overpeel and complications are very rare. The most commonly used SCP agents are glycolic acid 20-70%, trichloroacetic acid 10-35%, Jessner's solution, salicylic acid, pyruvic acid, resorcinol 30-50% preparations, and solid carbon dioxide. The careful selection of patients is critical for the outcome of a SCP and contraindications must be seriously considered. The peel procedure is generally common for all SCP agents but a good knowledge of the specific characters of each agent is of great importance in order to decide which to use for each individual patient.

  13. Gap Measurement of Point Machine Using Adaptive Wavelet Threshold and Mathematical Morphology

    PubMed Central

    Xu, Tianhua; Wang, Guang; Wang, Haifeng; Yuan, Tangming; Zhong, Zhiwang

    2016-01-01

    A point machine's gap is an important indication of its health status. An edge detection algorithm is proposed to measure and calculate a point machine's gap from the gap image captured by CCD plane arrays. This algorithm integrates adaptive wavelet-based image denoising, locally adaptive image binarization, and mathematical morphology technologies. The adaptive wavelet-based image denoising obtains not only an optimal denoising threshold, but also unblurred edges. Locally adaptive image binarization has the advantage of overcoming the local intensity variation in gap images. Mathematical morphology may suppress speckle spots caused by reflective metal surfaces in point machines. Subjective and objective evaluations of the proposed method are presented using point machine gap images from a railway corporation in China. The performance of the proposed method has also been compared with that of conventional edge detection methods, and the results show that the former outperforms the latter. PMID:27898042
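    A minimal pipeline in the spirit of the entry — wavelet denoising, locally adaptive binarisation against the local mean, then morphological opening to suppress speckle from reflective metal surfaces — is sketched below; window sizes, the offset and the threshold rule are placeholders, not the authors' tuned parameters.

        import numpy as np
        import pywt
        from scipy.ndimage import uniform_filter, binary_opening

        def preprocess_gap_image(img, wavelet="db4", level=2, win=31, offset=5.0):
            img = np.asarray(img, dtype=float)
            # 1. Wavelet denoising with a universal soft threshold.
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1][2])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(img.size))
            coeffs = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in det)
                                    for det in coeffs[1:]]
            den = pywt.waverec2(coeffs, wavelet)[:img.shape[0], :img.shape[1]]
            # 2. Locally adaptive binarisation: compare each pixel with its local mean.
            binary = den > (uniform_filter(den, win) - offset)
            # 3. Morphological opening removes small bright speckle spots.
            return binary_opening(binary, structure=np.ones((3, 3)))

        # The gap width can then be measured along the cleaned binary edge map.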

  14. Deep RNNs for video denoising

    NASA Astrophysics Data System (ADS)

    Chen, Xinyuan; Song, Li; Yang, Xiaokang

    2016-09-01

    Video denoising can be described as the problem of mapping a specific number of noisy frames to a clean one. We propose a deep architecture based on the Recurrent Neural Network (RNN) for video denoising. The model learns a patch-based end-to-end mapping between clean and noisy video sequences: it takes corrupted video sequences as input and outputs clean ones. Our deep network, which we refer to as deep Recurrent Neural Networks (deep RNNs or DRNNs), stacks RNN layers where each layer receives the hidden state of the previous layer as input. Experiments show that (i) the recurrent architecture extracts motion information in the temporal domain and benefits video denoising, (ii) the deep architecture has enough capacity to express the mapping from corrupted input videos to clean output videos, and (iii) the model generalizes to learn different mappings from videos corrupted by different types of noise (e.g., Poisson-Gaussian noise). By training on large video databases, we are able to compete with some existing video denoising methods.

  15. Complications of Macular Peeling

    PubMed Central

    Asencio-Duran, Mónica; Manzano-Muñoz, Beatriz; Vallejo-García, José Luis; García-Martínez, Jesús

    2015-01-01

    Macular peeling refers to the surgical technique for the removal of preretinal tissue or the internal limiting membrane (ILM) in the macula for several retinal disorders, ranging from epiretinal membranes (primary or secondary to diabetic retinopathy, retinal detachment…) to full-thickness macular holes, macular edema, foveal retinoschisis, and others. The technique has evolved in the last two decades, and the different instrumentations and adjuncts have progressively advanced turning into a safer, easier, and more useful tool for the vitreoretinal surgeon. Here, we describe the main milestones of macular peeling, drawing attention to its associated complications. PMID:26425351

  16. Analyzing Planck-Like Data with Wavelets

    NASA Astrophysics Data System (ADS)

    Sanz, J. L.; Barreiro, R. B.; Cayón, L.; Martinez-González, E.; Ruiz, G. A.; Diaz, F. J.; Argüeso, F.; Toffolatti, L.

    Basics of the continuous and discrete wavelet transforms with two scales are outlined. We study maps representing anisotropies in the cosmic microwave background radiation (CMB), and the relation to the standard approach, based on the Cl's, is established through the introduction of a wavelet spectrum. We apply this technique to small angular scale CMB map simulations of size 12.8 x 12.8 degrees, filtered with a 4'.5 Gaussian beam. This resolution resembles the experimental one expected for future high-resolution experiments (e.g. the Planck mission). We consider temperature fluctuations derived from standard, open and flat-Lambda CDM models. We also introduce Gaussian noise (uniform and non-uniform) at different S/N levels, and results are given regarding denoising.

  17. Wavelet analysis deformation monitoring data of high-speed railway bridge

    NASA Astrophysics Data System (ADS)

    Tang, ShiHua; Huang, Qing; Zhou, Conglin; Xu, HongWei; Liu, YinTao; Li, FeiDa

    2015-12-01

    Deformation monitoring data of high-speed railway bridges are inevitably affected by noise pollution. A deformation monitoring point of a high-speed railway bridge was measured over a long period with a Sokkia SDL30 electronic level, yielding a large number of deformation monitoring data that contain considerable noise. On the MATLAB software platform, 120 groups of deformation monitoring data were analysed by wavelet denoising, with the sym6 and db6 wavelet basis functions selected to remove the noise. The original signal was decomposed into three wavelet levels containing high-frequency and low-frequency coefficients, the high-frequency coefficients carrying most of the noise. Adaptive soft- and hard-threshold methods were applied to the high-frequency coefficients, which were then combined with the low-frequency coefficients to reconstruct the denoised wavelet signal. Root mean square error (RMSE) and signal-to-noise ratio (SNR) were used as evaluation indices of denoising: a smaller RMSE and a larger SNR indicate a better denoising effect. The experimental analysis leads to the following conclusions: the db6 wavelet basis function with an adaptive soft-threshold method gives the best denoising, with the minimum RMSE and the maximum SNR; moreover, the reconstructed signal is smoother than the original signal after wavelet denoising, so the noise is removed while the useful signal is retained. Compared with the other three methods, this approach not only retains the useful signal in the original data but also removes the noise, so it has strong practical value in actual deformation monitoring.
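    A minimal sketch of the three-level db6 denoising and the two evaluation indices described above (the threshold rule shown is the universal soft threshold, an illustrative choice rather than the adaptive thresholds compared in the entry):

        import numpy as np
        import pywt

        def denoise_series(x, wavelet="db6", level=3):
            x = np.asarray(x, dtype=float)
            coeffs = pywt.wavedec(x, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # MAD noise estimate
            thr = sigma * np.sqrt(2 * np.log(len(x)))
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:len(x)]

        def rmse_snr(reference, denoised):
            # Smaller RMSE and larger SNR (in dB) indicate a better denoising effect.
            err = np.asarray(denoised) - np.asarray(reference)
            rmse = np.sqrt(np.mean(err ** 2))
            snr = 10 * np.log10(np.sum(np.asarray(reference) ** 2) / np.sum(err ** 2))
            return rmse, snr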

  18. A novel de-noising method for B ultrasound images

    NASA Astrophysics Data System (ADS)

    Tian, Da-Yong; Mo, Jia-qing; Yu, Yin-Feng; Lv, Xiao-Yi; Yu, Xiao; Jia, Zhen-Hong

    2015-12-01

    B-mode ultrasound is a kind of ultrasonic imaging that has become an indispensable diagnostic method in clinical medicine. However, the presence of speckle noise in ultrasound images greatly reduces image quality and interferes with the accuracy of diagnosis. Constructing a method that eliminates speckle noise effectively while preserving image details is therefore the goal of current ultrasonic image de-noising research. This paper aims to remove the inherent speckle noise of B ultrasound images. The proposed novel algorithm is based on both wavelet transformation and data fusion of B ultrasound images, and achieves a smaller mean squared error (MSE) and greater signal-to-noise ratio (SNR) than other algorithms. The method can effectively remove speckle noise from B ultrasound images while preserving the details and edge information well, producing better visual results.

  19. Image denoising using local tangent space alignment

    NASA Astrophysics Data System (ADS)

    Feng, JianZhou; Song, Li; Huo, Xiaoming; Yang, XiaoKang; Zhang, Wenjun

    2010-07-01

    We propose a novel image denoising approach, which is based on exploring an underlying (nonlinear) lowdimensional manifold. Using local tangent space alignment (LTSA), we 'learn' such a manifold, which approximates the image content effectively. The denoising is performed by minimizing a newly defined objective function, which is a sum of two terms: (a) the difference between the noisy image and the denoised image, (b) the distance from the image patch to the manifold. We extend the LTSA method from manifold learning to denoising. We introduce the local dimension concept that leads to adaptivity to different kind of image patches, e.g. flat patches having lower dimension. We also plug in a basic denoising stage to estimate the local coordinate more accurately. It is found that the proposed method is competitive: its performance surpasses the K-SVD denoising method.

  20. A nonlinear Bayesian filtering framework for ECG denoising.

    PubMed

    Sameni, Reza; Shamsollahi, Mohammad B; Jutten, Christian; Clifford, Gari D

    2007-12-01

    In this paper, a nonlinear Bayesian filtering framework is proposed for the filtering of single channel noisy electrocardiogram (ECG) recordings. The necessary dynamic models of the ECG are based on a modified nonlinear dynamic model, previously suggested for the generation of a highly realistic synthetic ECG. A modified version of this model is used in several Bayesian filters, including the Extended Kalman Filter, Extended Kalman Smoother, and Unscented Kalman Filter. An automatic parameter selection method is also introduced, to facilitate the adaptation of the model parameters to a vast variety of ECGs. This approach is evaluated on several normal ECGs, by artificially adding white and colored Gaussian noises to visually inspected clean ECG recordings, and studying the SNR and morphology of the filter outputs. The results of the study demonstrate superior results compared with conventional ECG denoising approaches such as bandpass filtering, adaptive filtering, and wavelet denoising, over a wide range of ECG SNRs. The method is also successfully evaluated on real nonstationary muscle artifact. This method may therefore serve as an effective framework for the model-based filtering of noisy ECG recordings.

  1. A novel partial volume effects correction technique integrating deconvolution associated with denoising within an iterative PET image reconstruction

    SciTech Connect

    Merlin, Thibaut; Visvikis, Dimitris; Fernandez, Philippe; Lamare, Frederic

    2015-02-15

    Purpose: Partial volume effect (PVE) plays an important role in both qualitative and quantitative PET image accuracy, especially for small structures. A previously proposed voxelwise PVE correction method applied to reconstructed PET images involves the use of Lucy–Richardson deconvolution incorporating wavelet-based denoising to limit the associated propagation of noise. The aim of this study is to incorporate the deconvolution, coupled with the denoising step, directly inside the iterative reconstruction process to further improve PVE correction. Methods: The list-mode ordered subset expectation maximization (OSEM) algorithm has been modified accordingly with the application of the Lucy–Richardson deconvolution algorithm to the current estimation of the image, at each reconstruction iteration. Acquisitions of the NEMA NU2-2001 IQ phantom were performed on a GE DRX PET/CT system to study the impact of incorporating the deconvolution inside the reconstruction [with and without the point spread function (PSF) model] in comparison to its application postreconstruction and to standard iterative reconstruction incorporating the PSF model. The impact of the denoising step was also evaluated. Images were semiquantitatively assessed by studying the trade-off between the intensity recovery and the noise level in the background estimated as relative standard deviation. Qualitative assessments of the developed methods were additionally performed on clinical cases. Results: Incorporating the deconvolution without denoising within the reconstruction achieved superior intensity recovery in comparison to both standard OSEM reconstruction integrating a PSF model and application of the deconvolution algorithm in a postreconstruction process. The addition of the denoising step made it possible to limit the SNR degradation while preserving the intensity recovery. Conclusions: This study demonstrates the feasibility of incorporating the Lucy–Richardson deconvolution, associated with a wavelet-based denoising step, directly within the iterative reconstruction process.
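
    For orientation, the sketch below shows the plain, post-reconstruction Richardson-Lucy iteration that the paper moves inside the OSEM loop; it omits the wavelet-based denoising step entirely, and the PSF, iteration count and uniform initial estimate are assumptions of this illustration. It needs numpy and scipy.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(image, psf, n_iter=20, eps=1e-12):
        """Iteratively deblur `image`, assuming it was blurred by `psf` (Poisson model)."""
        estimate = np.full_like(image, image.mean(), dtype=float)
        psf_flipped = psf[::-1, ::-1]                       # adjoint of the blur
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = image / np.maximum(blurred, eps)        # measured / predicted
            estimate *= fftconvolve(ratio, psf_flipped, mode="same")
        return estimate
    ```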

  2. Translation invariant directional framelet transform combined with Gabor filters for image denoising.

    PubMed

    Shi, Yan; Yang, Xiaoyuan; Guo, Yuhua

    2014-01-01

    This paper is devoted to the study of a directional lifting transform for wavelet frames. A nonsubsampled lifting structure is developed to maintain the translation invariance as it is an important property in image denoising. Then, the directionality of the lifting-based tight frame is explicitly discussed, followed by a specific translation invariant directional framelet transform (TIDFT). The TIDFT has two framelets ψ1, ψ2 with vanishing moments of order two and one respectively, which are able to detect singularities in a given direction set. It provides an efficient and sparse representation for images containing rich textures along with properties of fast implementation and perfect reconstruction. In addition, an adaptive block-wise orientation estimation method based on Gabor filters is presented instead of the conventional minimization of residuals. Furthermore, the TIDFT is utilized to exploit the capability of image denoising, incorporating the MAP estimator for multivariate exponential distribution. Consequently, the TIDFT is able to eliminate the noise effectively while preserving the textures simultaneously. Experimental results show that the TIDFT outperforms some other frame-based denoising methods, such as contourlet and shearlet, and is competitive to the state-of-the-art denoising approaches.

  3. MRS3D: 3D Spherical Wavelet Transform on the Sphere

    NASA Astrophysics Data System (ADS)

    Lanusse, F.; Rassat, A.; Starck, J.-L.

    2011-12-01

    Future cosmological surveys will provide 3D large scale structure maps with large sky coverage, for which a 3D Spherical Fourier-Bessel (SFB) analysis is natural. Wavelets are particularly well-suited to the analysis and denoising of cosmological data, but a spherical 3D isotropic wavelet transform does not currently exist to analyse spherical 3D data. We present a new fast Discrete Spherical Fourier-Bessel Transform (DSFBT) based on both a discrete Bessel Transform and the HEALPix angular pixelisation scheme. We tested the 3D wavelet transform and, as a toy application, applied a denoising algorithm in wavelet space to the Virgo large box cosmological simulations and found we can successfully remove noise without much loss to the large scale structure. The new spherical 3D isotropic wavelet transform, called MRS3D, is ideally suited to analysing and denoising future 3D spherical cosmological surveys; it uses a novel discrete spherical Fourier-Bessel Transform. MRS3D is based on two packages, IDL and HEALPix, and can be used only if both are installed.

  4. Wavelet and wavelet packet compression of electrocardiograms.

    PubMed

    Hilton, M L

    1997-05-01

    Wavelets and wavelet packets have recently emerged as powerful tools for signal compression. Wavelet and wavelet packet-based compression algorithms based on embedded zerotree wavelet (EZW) coding are developed for electrocardiogram (ECG) signals, and eight different wavelets are evaluated for their ability to compress Holter ECG data. Pilot data from a blind evaluation of compressed ECGs by cardiologists suggest that the clinically useful information present in original ECG signals is preserved by 8:1 compression, and in most cases 16:1 compressed ECGs are clinically useful.

  5. Extensions to total variation denoising

    NASA Astrophysics Data System (ADS)

    Blomgren, Peter; Chan, Tony F.; Mulet, Pep

    1997-10-01

    The total variation denoising method, proposed by Rudin, Osher and Fatemi (1992), is a PDE-based algorithm for edge-preserving noise removal. The images resulting from its application are usually piecewise constant, possibly with a staircase effect at smooth transitions, and may contain significantly fewer fine details than the original non-degraded image. In this paper we present some extensions to this technique that aim to address the above drawbacks by redefining the total variation functional or the noise constraints.
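
    As a quick illustration of the baseline being extended, the snippet below runs off-the-shelf total variation denoising (Chambolle's solver in scikit-image) on a standard test image with synthetic Gaussian noise; the test image, noise variance and regularization weight are arbitrary choices for this sketch, not values from the paper.

    ```python
    import numpy as np
    from skimage import data, util
    from skimage.restoration import denoise_tv_chambolle

    noisy = util.random_noise(data.camera(), mode="gaussian", var=0.01)
    # Larger weight removes more noise but increases the staircasing effect.
    denoised = denoise_tv_chambolle(noisy, weight=0.1)
    ```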

  6. Denoising and Multivariate Analysis of Time-Of-Flight SIMS Images

    SciTech Connect

    Wickes, Bronwyn; Kim, Y.; Castner, David G.

    2003-08-30

    Time-of-flight SIMS (ToF-SIMS) imaging offers a modality for simultaneously visualizing the spatial distribution of different surface species. However, the utility of ToF-SIMS datasets may be limited by their large size, degraded mass resolution and low ion counts per pixel. Through denoising and multivariate image analysis, regions of similar chemistries may be differentiated more readily in ToF-SIMS image data. Three established denoising algorithms (down-binning, boxcar and wavelet filtering) were applied to ToF-SIMS images of different surface geometries and chemistries. The effect of these filters on the performance of principal component analysis (PCA) was evaluated in terms of the capture of important chemical image features in the principal component score images, the quality of the principal component

  7. Denoising infrared maritime imagery using tailored dictionaries via modified K-SVD algorithm.

    PubMed

    Smith, L N; Olson, C C; Judd, K P; Nichols, J M

    2012-06-10

    Recent work has shown that tailored overcomplete dictionaries can provide a better image model than standard basis functions for a variety of image processing tasks. Here we propose a modified K-SVD dictionary learning algorithm designed to maintain the advantages of the original approach but with a focus on improved convergence. We then use the learned model to denoise infrared maritime imagery and compare the performance to the original K-SVD algorithm, several overcomplete "fixed" dictionaries, and a standard wavelet denoising algorithm. Results indicate the superiority of overcomplete representations and show that our tailored approach provides similar peak signal-to-noise ratios as the traditional K-SVD at roughly half the computational cost.
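
    The pipeline being compared here (learn an overcomplete dictionary from patches, sparse-code each patch, reconstruct) can be sketched with scikit-learn's generic dictionary learner. This is not the authors' modified K-SVD, and the patch size, number of atoms and sparsity penalty are illustrative assumptions; in practice the learner would also be fit on a subsample of patches for speed.

    ```python
    import numpy as np
    from sklearn.decomposition import MiniBatchDictionaryLearning
    from sklearn.feature_extraction.image import (extract_patches_2d,
                                                  reconstruct_from_patches_2d)

    def dictionary_denoise(noisy, patch_size=(8, 8), n_atoms=128):
        """Patch-based denoising with a learned overcomplete dictionary."""
        patches = extract_patches_2d(noisy, patch_size).astype(float)
        flat = patches.reshape(len(patches), -1)
        mean = flat.mean(axis=1, keepdims=True)
        flat = flat - mean                                  # learn on zero-mean patches
        dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0)
        codes = dico.fit_transform(flat)                    # sparse code for every patch
        recon = codes @ dico.components_ + mean             # reassemble the patches
        return reconstruct_from_patches_2d(recon.reshape(patches.shape), noisy.shape)
    ```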

  8. Fst-Filter: A flexible spatio-temporal filter for biomedical multichannel data denoising.

    PubMed

    Nuanprasert, Somchai; Adachi, Yoshiaki; Suzuki, Takashi

    2015-08-01

    In this paper, we present a noise reduction method for a multichannel measurement system where the true underlying signal is spatially low-rank and contaminated by spatially correlated noise. Our proposed formulation applies generalized singular value decomposition (GSVD) with a signal recovery approach to extend conventional subspace-based methods to spatio-temporal filtering. Without requiring the noise covariance data in advance, the implemented optimization scheme allows users to choose the denoising function F(·) flexibly, to suit different temporal noise characteristics, from a variety of existing efficient temporal filters. The effectiveness of the proposed method is demonstrated by better accuracy for brain source estimation in simulated magnetoencephalography (MEG) experiments than some traditional methods, e.g., principal component analysis (PCA), robust principal component analysis (RPCA) and multivariate wavelet denoising (MWD).

  9. A study of infrared spectroscopy de-noising based on LMS adaptive filter

    NASA Astrophysics Data System (ADS)

    Mo, Jia-qing; Lv, Xiao-yi; Yu, Xiao

    2015-12-01

    Infrared spectroscopy is widely used, but spectra often contain substantial noise, which seriously affects the spectral characteristics of the sample. De-noising is therefore very important in spectrum analysis and processing. In this study, the least mean square (LMS) adaptive filter is applied to infrared spectroscopy for the first time. When applied to infrared spectra of breast cancer with a signal-to-noise ratio (SNR) below 10 dB, the LMS adaptive filter preserves both the detail and the envelope of the effective signal; the results are compared with those of wavelet transform and ensemble empirical mode decomposition (EEMD) de-noising. Three evaluation measures (SNR, root mean square error (RMSE) and the correlation coefficient (ρ)) confirm the de-noising advantages of the LMS adaptive filter for infrared spectra of breast cancer.
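
    For readers unfamiliar with it, the LMS adaptive filter itself is only a few lines. The sketch below is a generic 1-D noise canceller; the reference input (a signal correlated with the noise), the tap count and the step size mu are assumptions of this illustration rather than details taken from the study. numpy only.

    ```python
    import numpy as np

    def lms_filter(noisy, reference, n_taps=16, mu=0.01):
        """LMS noise canceller: subtract the adaptively estimated noise from `noisy`."""
        w = np.zeros(n_taps)                   # adaptive filter weights
        out = np.zeros(len(noisy))
        for n in range(n_taps, len(noisy)):
            x = reference[n - n_taps:n][::-1]  # most recent reference samples
            y = w @ x                          # current noise estimate
            e = noisy[n] - y                   # error = cleaned sample
            w += 2.0 * mu * e * x              # LMS weight update
            out[n] = e
        return out
    ```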

  10. Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising.

    PubMed

    Zhang, Kai; Zuo, Wangmeng; Chen, Yunjin; Meng, Deyu; Zhang, Lei

    2017-02-01

    Discriminative model learning for image denoising has recently been attracting considerable attention due to its favorable denoising performance. In this paper, we take one step forward by investigating the construction of feed-forward denoising convolutional neural networks (DnCNNs) to embrace the progress in very deep architectures, learning algorithms, and regularization methods for image denoising. Specifically, residual learning and batch normalization are utilized to speed up the training process as well as boost the denoising performance. Different from existing discriminative denoising models, which usually train a specific model for additive white Gaussian noise (AWGN) at a certain noise level, our DnCNN model is able to handle Gaussian denoising with unknown noise level (i.e., blind Gaussian denoising). With the residual learning strategy, DnCNN implicitly removes the latent clean image in the hidden layers. This property motivates us to train a single DnCNN model to tackle several general image denoising tasks such as Gaussian denoising, single image super-resolution and JPEG image deblocking. Our extensive experiments demonstrate that our DnCNN model can not only exhibit high effectiveness in several general image denoising tasks, but also be efficiently implemented by benefiting from GPU computing.
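
    To make the Conv+BN+ReLU layout and the residual (noise-predicting) output concrete, here is a much shallower DnCNN-style module written in PyTorch (assumed available); the depth and channel width are illustrative and far smaller than the networks trained in the paper.

    ```python
    import torch
    import torch.nn as nn

    class TinyDnCNN(nn.Module):
        def __init__(self, depth=6, channels=64):
            super().__init__()
            layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
            for _ in range(depth - 2):
                layers += [nn.Conv2d(channels, channels, 3, padding=1),
                           nn.BatchNorm2d(channels), nn.ReLU(inplace=True)]
            layers += [nn.Conv2d(channels, 1, 3, padding=1)]
            self.body = nn.Sequential(*layers)

        def forward(self, noisy):
            # The network predicts the residual (noise); subtract it to get the clean image.
            return noisy - self.body(noisy)
    ```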

  11. Wavelet-Based Speech Enhancement Using Time-Adapted Noise Estimation

    NASA Astrophysics Data System (ADS)

    Lei, Sheau-Fang; Tung, Ying-Kai

    Spectral subtraction is commonly used for speech enhancement in a single channel system because of the simplicity of its implementation. However, this algorithm introduces perceptually musical noise while suppressing the background noise. We propose a wavelet-based approach in this paper for suppressing the background noise for speech enhancement in a single channel system. The wavelet packet transform, which emulates the human auditory system, is used to decompose the noisy signal into critical bands. Wavelet thresholding is then temporally adjusted with the noise power by time-adapted noise estimation. The proposed algorithm can efficiently suppress the noise while reducing speech distortion. Experimental results, including several objective measurements, show that the proposed wavelet-based algorithm outperforms spectral subtraction and other wavelet-based denoising approaches for speech enhancement for nonstationary noise environments.

  12. ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma

    2016-12-01

    Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal, being non-stationary in nature, makes the analysis and interpretation of the signal very difficult. Hence accurate analysis of the ECG signal with a powerful tool like the discrete wavelet transform (DWT) becomes imperative. In this paper, the ECG signal is denoised to remove the artifacts and analyzed using the wavelet transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB software for the MIT/BIH Arrhythmia database and yields a sensitivity of 99.85 %, positive predictivity of 99.92 % and detection error rate of 0.221 % with the wavelet transform. It is also inferred that the DWT outperforms the principal component analysis technique in detection of the ECG signal.

  13. Image denoising using a combined criterion

    NASA Astrophysics Data System (ADS)

    Semenishchev, Evgeny; Marchuk, Vladimir; Shrafel, Igor; Dubovskov, Vadim; Onoyko, Tatyana; Maslennikov, Stansilav

    2016-05-01

    A new image denoising method is proposed in this paper. We consider an optimization problem with a linear objective function based on two criteria, namely the L2 norm and the first-order square difference. The method is parametric, so the proposed criteria of the objective function can be adapted through the choice of parameters. The denoising algorithm consists of the following steps: 1) multiple denoising estimates are found on local areas of the image; 2) image edges are determined; 3) parameters of the method are fixed and denoised estimates of the local area are found; 4) the local window is moved to the next position (local windows are overlapping) in order to produce the final estimate. A proper choice of the parameters of the introduced method is discussed. A comparative analysis of the new denoising method with existing ones is performed on a set of test images.
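
    In one dimension, an objective combining an L2 data-fit term with a squared first-difference term has a closed-form solution, which the sketch below computes directly (Whittaker-style smoothing); the weight lambda_ is an assumed free parameter, and the paper's method applies this kind of criterion per overlapping local window in 2-D and fuses the estimates. numpy only.

    ```python
    import numpy as np

    def smooth_l2_diff(y, lambda_=10.0):
        """Minimize ||x - y||^2 + lambda * ||D x||^2, with D the first-difference operator."""
        n = len(y)
        D = np.diff(np.eye(n), axis=0)                    # (n-1, n) difference matrix
        # Normal equations: (I + lambda * D^T D) x = y
        return np.linalg.solve(np.eye(n) + lambda_ * D.T @ D, np.asarray(y, float))
    ```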

  14. Green channel guiding denoising on bayer image.

    PubMed

    Tan, Xin; Lai, Shiming; Liu, Yu; Zhang, Maojun

    2014-01-01

    Denoising is an indispensable function for digital cameras. Because noise is spread during demosaicking, denoising ought to work directly on Bayer data. The difficulty of denoising on a Bayer image is the interlaced mosaic pattern of red, green, and blue. The guided filter is a time-efficient explicit filter kernel that can incorporate additional information from a guidance image, but it had not yet been applied to Bayer images. In this work, we observe that the green channel of the Bayer pattern is higher in both sampling rate and signal-to-noise ratio (SNR) than the red and blue ones. Therefore the green channel can be used to guide denoising. This kind of guidance integrates the different color channels together. Experiments on both actual and simulated Bayer images indicate that the green channel acts well as the guidance signal, and the proposed method is competitive with other popular filter-kernel denoising methods.
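
    For reference, the basic grayscale guided filter that underlies this approach can be written with box averages, as in the sketch below; it is not the paper's Bayer-specific pipeline (which uses the green channel to guide the red and blue samples), and the window radius and regularizer eps are assumptions. It expects float images and uses numpy and scipy.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(guide, src, radius=4, eps=1e-3):
        """Filter `src` using the local linear model q = a * guide + b."""
        guide = np.asarray(guide, float)
        src = np.asarray(src, float)
        size = 2 * radius + 1
        mean_g = uniform_filter(guide, size)
        mean_s = uniform_filter(src, size)
        var_g = uniform_filter(guide * guide, size) - mean_g * mean_g
        cov_gs = uniform_filter(guide * src, size) - mean_g * mean_s
        a = cov_gs / (var_g + eps)                 # per-pixel slope of the linear model
        b = mean_s - a * mean_g
        return uniform_filter(a, size) * guide + uniform_filter(b, size)
    ```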

  15. Exact reconstruction with directional wavelets on the sphere

    NASA Astrophysics Data System (ADS)

    Wiaux, Y.; McEwen, J. D.; Vandergheynst, P.; Blanc, O.

    2008-08-01

    A new formalism is derived for the analysis and exact reconstruction of band-limited signals on the sphere with directional wavelets. It represents an evolution of a previously developed wavelet formalism developed by Antoine & Vandergheynst and Wiaux et al. The translations of the wavelets at any point on the sphere and their proper rotations are still defined through the continuous three-dimensional rotations. The dilations of the wavelets are directly defined in harmonic space through a new kernel dilation, which is a modification of an existing harmonic dilation. A family of factorized steerable functions with compact harmonic support which are suitable for this kernel dilation are first identified. A scale-discretized wavelet formalism is then derived, relying on this dilation. The discrete nature of the analysis scales allows the exact reconstruction of band-limited signals. A corresponding exact multi-resolution algorithm is finally described and an implementation is tested. The formalism is of interest notably for the denoising or the deconvolution of signals on the sphere with a sparse expansion in wavelets. In astrophysics, it finds a particular application for the identification of localized directional features in the cosmic microwave background data, such as the imprint of topological defects, in particular, cosmic strings, and for their reconstruction after separation from the other signal components.

  16. [Establishment and Improvement of Portable X-Ray Fluorescence Spectrometer Detection Model Based on Wavelet Transform].

    PubMed

    Li, Fang; Wang, Ji-hua; Lu, An-xiang; Han, Ping

    2015-04-01

    The concentrations of Cr, Cu, Zn, As and Pb in soil were tested with a portable X-ray fluorescence spectrometer. Each sample was tested 3 times; after the wavelet threshold noise filtering method was used to denoise and smooth the spectra, a standard curve for each heavy metal was established from the standard values of the heavy metals in soil and the corresponding counts, taken as the average of the 3 processed spectra. The signal-to-noise ratio (SNR), mean square error (MSE) and information entropy (H) were used to assess the denoising results and to determine the best wavelet basis and wavelet decomposition level for the wavelet threshold noise filtering method. Samples with different concentrations and an H3BO3 blank were chosen for retesting to verify the stability of the instrument. The results show that the best denoising result was obtained with the coif3 wavelet basis at decomposition level 3. The determination coefficient (R2) of the instrument ranges from 0.990 to 0.996, indicating a high degree of linearity between the heavy metal contents in soil and the corresponding X-ray fluorescence characteristic peak intensities within the measurement range (0-1,500 mg·kg(-1)). After retesting and calculation, all detection limits of the instrument are found to be below the national soil standards. The accuracy of the model has been effectively improved, and the instrument also shows good precision, with the practical application of the wavelet transform to the establishment and improvement of the X-ray fluorescence spectrometer detection model. Thus the instrument can be applied to on-site rapid screening of heavy metals in contaminated soil.
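
    The three figures of merit used above are straightforward to compute; the sketch below shows one common convention for each (in particular, the information entropy is taken over the normalized intensity distribution of the spectrum, which is an assumption of this illustration, not a detail stated in the record). numpy only.

    ```python
    import numpy as np

    def snr_db(reference, denoised):
        """Signal-to-noise ratio of `denoised` against a reference spectrum, in dB."""
        return 10.0 * np.log10(np.sum(reference ** 2) /
                               np.sum((reference - denoised) ** 2))

    def mse(reference, denoised):
        return np.mean((reference - denoised) ** 2)

    def entropy(spectrum):
        """Shannon entropy of the normalized (non-negative) intensity distribution."""
        p = np.abs(spectrum) / np.sum(np.abs(spectrum))
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    ```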

  17. Steerable pyramids and tight wavelet frames in L2(R(d)).

    PubMed

    Unser, Michael; Chenouard, Nicolas; Van de Ville, Dimitri

    2011-10-01

    We present a functional framework for the design of tight steerable wavelet frames in any number of dimensions. The 2-D version of the method can be viewed as a generalization of Simoncelli's steerable pyramid that gives access to a larger palette of steerable wavelets via a suitable parametrization. The backbone of our construction is a primal isotropic wavelet frame that provides the multiresolution decomposition of the signal. The steerable wavelets are obtained by applying a one-to-many mapping (Nth-order generalized Riesz transform) to the primal ones. The shaping of the steerable wavelets is controlled by an M×M unitary matrix (where M is the number of wavelet channels) that can be selected arbitrarily; this allows for a much wider range of solutions than the traditional equiangular configuration (steerable pyramid). We give a complete functional description of these generalized wavelet transforms and derive their steering equations. We describe some concrete examples of transforms, including some built around a Mallat-type multiresolution analysis of L(2)(R(d)), and provide a fast Fourier transform-based decomposition algorithm. We also propose a principal-component-based method for signal-adapted wavelet design. Finally, we present some illustrative examples together with a comparison of the denoising performance of various brands of steerable transforms. The results are in favor of an optimized wavelet design (equalized principal component analysis), which consistently performs best.

  18. Analysis of phonocardiogram signals using wavelet transform.

    PubMed

    Meziani, F; Debbal, S M; Atbi, A

    2012-08-01

    Phonocardiograms (PCG) are recordings of the acoustic waves produced by the mechanical action of the heart. They generally consist of two kinds of acoustic vibrations: heart sounds and heart murmurs. Heart murmurs are often the first signs of pathological changes of the heart valves, and are usually found during auscultation in primary health care. Heart auscultation has long been recognized as an important tool for the diagnosis of heart disease, although its accuracy is still insufficient to diagnose some heart diseases, and it does not enable the analyst to obtain both qualitative and quantitative characteristics of the PCG signals. The efficiency of diagnosis can be improved considerably by using modern digital signal processing techniques, which can therefore provide useful and valuable information on these signals. The aim of this study is to analyse PCG signals using the wavelet transform. The analysis is based on an algorithm for the detection of heart sounds (the first and second sounds, S1 and S2) and heart murmurs using the PCG signal as the only source. The segmentation algorithm, which separates the components of the heart signal, is based on denoising by the wavelet transform (DWT). This algorithm makes it possible to isolate individual sounds (S1 or S2) and murmurs. Thus, the analysis of various PCG signals using the wavelet transform can provide a wide range of statistical parameters related to the phonocardiogram signal.

  19. Denoising Medical Images using Calculus of Variations.

    PubMed

    Kohan, Mahdi Nakhaie; Behnam, Hamid

    2011-07-01

    We propose a method for medical image denoising using the calculus of variations and local variance estimation with shaped windows. This method reduces additive noise and preserves small patterns and edges of images. A pyramid structure-texture decomposition of images is used to separate noise and texture components based on local variance measures. The experimental results show that the proposed method gives a visual improvement as well as better SNR, RMSE and PSNR than common medical image denoising methods. Experimental results in denoising a sample Magnetic Resonance image show that SNR, PSNR and RMSE are improved by 19, 9 and 21 percent, respectively.

  20. Discrete directional wavelet bases and frames: analysis and applications

    NASA Astrophysics Data System (ADS)

    Dragotti, Pier Luigi; Velisavljevic, Vladan; Vetterli, Martin; Beferull-Lozano, Baltasar

    2003-11-01

    The application of the wavelet transform in image processing is most frequently based on a separable construction. Lines and columns in an image are treated independently and the basis functions are simply products of the corresponding one-dimensional functions. Such a method keeps the design and computation simple, but it cannot properly capture all the properties of an image. In this paper, a new truly separable discrete multi-directional transform is proposed with a subsampling method based on lattice theory. Alternatively, the subsampling can be omitted, which leads to a multi-directional frame. This transform can be applied in many areas such as denoising, non-linear approximation and compression. The results on non-linear approximation and denoising show interesting gains compared to the standard two-dimensional analysis.

  1. Peeling mechanism of tomato under infrared heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Critical behaviors of peeling tomatoes using infrared heat are thermally induced peel loosening and subsequent cracking. However, the mechanism of peel loosening and cracking due to infrared heating remains unclear. This study aimed at investigating the mechanism of peeling tomatoes under infrared h...

  2. Wavelet transform processing applied to partial discharge evaluation

    NASA Astrophysics Data System (ADS)

    Macedo, E. C. T.; Araújo, D. B.; da Costa, E. G.; Freire, R. C. S.; Lopes, W. T. A.; Torres, I. S. M.; de Souza Neto, J. M. R.; Bhatti, S. A.; Glover, I. A.

    2012-05-01

    Partial discharge (PD) is characterized by high frequency current pulses that occur in high voltage (HV) electrical equipment, originating from gas ionization processes when damaged insulation is subjected to high electric fields [1]. PD monitoring is a useful method of assessing the degree of insulation aging, manufacturing defects or chemical/mechanical damage. Many sources of noise (e.g. radio transmissions, commutator noise from rotating machines, power electronics switching circuits, corona discharge, etc.) can directly affect the PD estimation. Among the many mathematical techniques that can be applied to de-noise PD signals, the wavelet transform is one of the most powerful: it can simultaneously supply information about pulse occurrence, time and pulse spectrum, and it can de-noise PD signals measured in the field. This paper describes the application of the wavelet transform to the suppression of the main types of noise that can affect the observation and analysis of PD signals in high voltage apparatus. In addition, a study is presented that identifies the appropriate mother wavelet for this application based on the cross-correlation factor.
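
    A mother-wavelet selection by cross-correlation, of the kind studied above, can be sketched as follows; the damped-oscillation pulse, the candidate wavelet list and the scoring convention are all assumptions of this illustration, not values from the paper. It uses numpy and PyWavelets.

    ```python
    import numpy as np
    import pywt

    t = np.linspace(0.0, 1e-6, 256)
    pulse = np.exp(-t / 2e-7) * np.sin(2 * np.pi * 5e6 * t)   # toy PD current pulse

    def correlation_score(pulse, name):
        """Peak normalized cross-correlation between the pulse and a mother wavelet."""
        _, psi, _ = pywt.Wavelet(name).wavefun(level=8)
        psi = np.interp(np.linspace(0, 1, len(pulse)),
                        np.linspace(0, 1, len(psi)), psi)      # resample to pulse length
        a = (pulse - pulse.mean()) / pulse.std()
        b = (psi - psi.mean()) / psi.std()
        return np.abs(np.correlate(a, b, mode="full")).max() / len(pulse)

    candidates = ["db2", "db4", "db8", "sym4", "coif3"]
    best = max(candidates, key=lambda w: correlation_score(pulse, w))
    ```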

  3. A Decomposition Framework for Image Denoising Algorithms.

    PubMed

    Ghimpeteanu, Gabriela; Batard, Thomas; Bertalmio, Marcelo; Levine, Stacey

    2016-01-01

    In this paper, we consider an image decomposition model that provides a novel framework for image denoising. The model computes the components of the image to be processed in a moving frame that encodes its local geometry (directions of gradients and level lines). Then, the strategy we develop is to denoise the components of the image in the moving frame in order to preserve its local geometry, which would have been more affected if processing the image directly. Experiments on a whole image database tested with several denoising methods show that this framework can provide better results than denoising the image directly, both in terms of Peak signal-to-noise ratio and Structural similarity index metrics.

  4. Image denoising in bidimensional empirical mode decomposition domain: the role of Student's probability distribution function

    PubMed Central

    2015-01-01

    Hybridisation of the bi-dimensional empirical mode decomposition (BEMD) with denoising techniques has been proposed in the literature as an effective approach for image denoising. In this Letter, the Student's probability density function is introduced in the computation of the mean envelope of the data during the BEMD sifting process to make it robust to values that are far from the mean. The resulting BEMD is denoted tBEMD. In order to show the effectiveness of the tBEMD, several image denoising techniques in tBEMD domain are employed; namely, fourth order partial differential equation (PDE), linear complex diffusion process (LCDP), non-linear complex diffusion process (NLCDP), and the discrete wavelet transform (DWT). Two biomedical images and a standard digital image were considered for experiments. The original images were corrupted with additive Gaussian noise with three different levels. Based on peak-signal-to-noise ratio, the experimental results show that PDE, LCDP, NLCDP, and DWT all perform better in the tBEMD than in the classical BEMD domain. It is also found that tBEMD is faster than classical BEMD when the noise level is low. When it is high, the computational cost in terms of processing time is similar. The effectiveness of the presented approach makes it promising for clinical applications. PMID:27222723

  5. Image denoising in bidimensional empirical mode decomposition domain: the role of Student's probability distribution function.

    PubMed

    Lahmiri, Salim

    2016-03-01

    Hybridisation of the bi-dimensional empirical mode decomposition (BEMD) with denoising techniques has been proposed in the literature as an effective approach for image denoising. In this Letter, the Student's probability density function is introduced in the computation of the mean envelope of the data during the BEMD sifting process to make it robust to values that are far from the mean. The resulting BEMD is denoted tBEMD. In order to show the effectiveness of the tBEMD, several image denoising techniques in tBEMD domain are employed; namely, fourth order partial differential equation (PDE), linear complex diffusion process (LCDP), non-linear complex diffusion process (NLCDP), and the discrete wavelet transform (DWT). Two biomedical images and a standard digital image were considered for experiments. The original images were corrupted with additive Gaussian noise with three different levels. Based on peak-signal-to-noise ratio, the experimental results show that PDE, LCDP, NLCDP, and DWT all perform better in the tBEMD than in the classical BEMD domain. It is also found that tBEMD is faster than classical BEMD when the noise level is low. When it is high, the computational cost in terms of processing time is similar. The effectiveness of the presented approach makes it promising for clinical applications.

  6. Customized maximal-overlap multiwavelet denoising with data-driven group threshold for condition monitoring of rolling mill drivetrain

    NASA Astrophysics Data System (ADS)

    Chen, Jinglong; Wan, Zhiguo; Pan, Jun; Zi, Yanyang; Wang, Yu; Chen, Binqiang; Sun, Hailiang; Yuan, Jing; He, Zhengjia

    2016-02-01

    Timely fault identification of a rolling mill drivetrain is important for guaranteeing product quality and long-term safe operation, so a condition monitoring system for the rolling mill drivetrain is designed and developed. However, because compound-fault and weak-fault feature information is usually submerged in heavy background noise, this task remains challenging. This paper offers a route to fault identification of the rolling mill drivetrain by proposing a customized maximal-overlap multiwavelet denoising method. The effectiveness of a wavelet denoising method relies mainly on appropriate selection of the wavelet basis, the transform strategy and the threshold rule. First, in order to realize exact matching and accurate detection of the fault feature, a customized multiwavelet basis function is constructed via a symmetric lifting scheme, and the vibration signal is then processed by the maximal-overlap multiwavelet transform. Next, based on the spatial dependency of the multiwavelet transform coefficients, a data-driven group-threshold shrinkage strategy over spatially neighboring coefficients is developed for the denoising process, with the optimal group length and threshold chosen via the minimum of Stein's Unbiased Risk Estimate. The effectiveness of the proposed method is first demonstrated through compound fault identification of the reduction gearbox on a rolling mill. It is then applied to weak fault identification of a dedusting fan bearing on a rolling mill, and the results support its feasibility.

  7. The Sea of Wavelets

    NASA Astrophysics Data System (ADS)

    Jones, B. J. T.

    Wavelet analysis has become a major tool in many aspects of data handling, whether it be statistical analysis, noise removal or image reconstruction. Wavelet analysis has worked its way into fields as diverse as economics, medicine, geophysics, music and cosmology.

  8. The peeling skin syndrome.

    PubMed

    Levy, S B; Goldsmith, L A

    1982-11-01

    A unique form of congenital ichthyosis in two unrelated patients is described and characterized histologically by separation of the epidermis between the stratum corneum and the stratum granulosum. The clinical history, genetics, serially performed skin biopsies, and biochemical studies are reviewed. This form of ichthyosis is different from previously described entities. Lifelong peeling of the general body epidermis, pruritus, short stature, easily removed anagen hairs, and the ability to easily mechanically separate the stratum corneum from the rest of the epidermis characterize the syndrome. In two families with this disorder, autosomal recessive inheritance is suggested. A low plasma tryptophan level was present in two patients with this disease. This inherited disorder of the epidermis was first described in 1924, before the genetics and histology of ichthyosis were extensively studied, and is a distinct genetic and clinical entity to be considered in unusual cases of ichthyosis.

  9. Adaptive image denoising by targeted databases.

    PubMed

    Luo, Enming; Chan, Stanley H; Nguyen, Truong Q

    2015-07-01

    We propose a data-dependent denoising procedure to restore noisy images. Different from existing denoising algorithms which search for patches from either the noisy image or a generic database, the new algorithm finds patches from a database that contains relevant patches. We formulate the denoising problem as an optimal filter design problem and make two contributions. First, we determine the basis function of the denoising filter by solving a group sparsity minimization problem. The optimization formulation generalizes existing denoising algorithms and offers systematic analysis of the performance. Improvement methods are proposed to enhance the patch search process. Second, we determine the spectral coefficients of the denoising filter by considering a localized Bayesian prior. The localized prior leverages the similarity of the targeted database, alleviates the intensive Bayesian computation, and links the new method to the classical linear minimum mean squared error estimation. We demonstrate applications of the proposed method in a variety of scenarios, including text images, multiview images, and face images. Experimental results show the superiority of the new algorithm over existing methods.

  10. Nonlocal means image denoising using orthogonal moments.

    PubMed

    Kumar, Ahlad

    2015-09-20

    An image denoising method in the moment domain is proposed. The denoising is based on a modified nonlocal means (NLM) algorithm in which neighborhood similarity is evaluated using Krawtchouk moments. The results of the proposed denoising method have been validated using the peak signal-to-noise ratio (PSNR), the well-known structural similarity (SSIM) index and the blind/referenceless image spatial quality evaluator (BRISQUE). The denoising algorithm has been evaluated for synthetic and real clinical images contaminated by Gaussian, Poisson, and Rician noise. The algorithm performs well compared to Zernike-based denoising, as indicated by the PSNR, SSIM, and BRISQUE scores of the denoised images, with improvements of 3.1 dB, 0.1285, and 4.23, respectively. Further, comparative analysis of the proposed work with existing techniques has also been performed. The results are found to be competitive in terms of PSNR, SSIM, and BRISQUE scores when evaluated for varying levels of noise.
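
    Standard non-local means, which the record above modifies by scoring patch similarity with Krawtchouk moments, is available off the shelf; the sketch below uses scikit-image's raw-patch implementation on a noisy test image purely as a reference point, with the filtering strength, patch size and search radius as assumed values.

    ```python
    import numpy as np
    from skimage import data, util
    from skimage.restoration import denoise_nl_means, estimate_sigma

    noisy = util.random_noise(data.camera() / 255.0, mode="gaussian", var=0.01)
    sigma = estimate_sigma(noisy)                     # rough noise standard deviation
    denoised = denoise_nl_means(noisy, h=0.8 * sigma, sigma=sigma,
                                patch_size=5, patch_distance=6, fast_mode=True)
    ```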

  11. Wavelet-Based Speech Enhancement Using Time-Frequency Adaptation

    NASA Astrophysics Data System (ADS)

    Wang, Kun-Ching

    2003-12-01

    Wavelet denoising is commonly used for speech enhancement because of the simplicity of its implementation. However, conventional methods produce musical residual noise when thresholding the background noise, and unvoiced components of speech are often eliminated. In this paper, a novel wavelet coefficient threshold (WCT) algorithm based on time-frequency adaptation is proposed. In addition, an unvoiced speech enhancement algorithm is integrated into the system to improve the intelligibility of speech. The wavelet coefficient threshold (WCT) of each subband is first temporally adjusted according to the value of the a posteriori signal-to-noise ratio (SNR). To prevent the degradation of unvoiced sounds in noise, the algorithm utilizes a simple speech/noise detector (SND) and further divides the speech signal into unvoiced and voiced sounds. Appropriate wavelet thresholding is then applied according to the voiced/unvoiced (V/U) decision. Based on the masking properties of the human auditory system, a perceptual gain factor is adopted into the wavelet thresholding for suppressing musical residual noise. Simulation results show that the proposed method is capable of reducing noise with little speech degradation and that the overall performance is superior to several competitive methods.

  12. Transient Detection Using Wavelets.

    DTIC Science & Technology

    1995-03-01

    signal and transients are nonstationary. A new technique for the analysis of this type of signal, called the Wavelet Transform, was applied to artificial and real signals. A brief theoretical comparison between the Short Time Fourier Transform and the Wavelet Transform is introduced. A multiresolution analysis approach for implementing the transform was used. Computer code for the Discrete Wavelet Transform was implemented. Different types of wavelets to use as basis functions were evaluated.

  13. Wavelet treatment of the intrachain correlation functions of homopolymers in dilute solutions

    NASA Astrophysics Data System (ADS)

    Fedorov, M. V.; Chuev, G. N.; Kuznetsov, Yu. A.; Timoshenko, E. G.

    2004-11-01

    Discrete wavelets are applied to the parametrization of the intrachain two-point correlation functions of homopolymers in dilute solutions obtained from Monte Carlo simulations. Several orthogonal and biorthogonal basis sets have been investigated for use in the truncated wavelet approximation. The quality of the approximation has been assessed by calculation of the scaling exponents obtained from the des Cloizeaux ansatz for the correlation functions of homopolymers with different connectivities in a good solvent. The resulting exponents are in better agreement with those from recent renormalization group calculations as compared to the data without the wavelet denoising. We also discuss how the wavelet treatment improves the quality of data for correlation functions from simulations of homopolymers at varied solvent conditions and of heteropolymers.

  14. Adaptive Bayesian-based speck-reduction in SAR images using complex wavelet transform

    NASA Astrophysics Data System (ADS)

    Ma, Ning; Yan, Wei; Zhang, Peng

    2005-10-01

    In this paper, an improved adaptive speckle reduction method is presented based on the dual-tree complex wavelet transform (CWT). It combines the additive noise reduction of soft thresholding with the CWT's directional selectivity; its main contribution is adapting the effective threshold to preserve edge detail. A Bayesian estimator is also applied to the decomposed data to estimate the best values for the noise-free complex wavelet coefficients. This estimation is based on alpha-stable and Gaussian distribution hypotheses for the complex wavelet coefficients of the signal and noise, respectively. Experimental results show that the denoising performance is among the state-of-the-art techniques based on the real discrete wavelet transform (DWT).

  15. Optimal rejection of multiplicative noise via adaptive shrinkage of undecimated wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Alparone, Luciano; Anghele, Nicola; Argenti, Fabrizio

    2001-12-01

    In this paper speckle reduction is approached as a Wiener-like filtering performed in the wavelet domain by means of an adaptive shrinkage of the detail coefficients of an undecimated decomposition. The amplitude of each coefficient is divided by the variance ratio of the noisy coefficient to the noise-free one. All the above quantities are analytically calculated from the speckled image, the speckle variance, and the wavelet filters. On the test image Lenna corrupted by synthetic speckle, the proposed method outperforms Kuan's LLMMSE filtering by almost 3 dB SNR. Experiments carried out on true and synthetic speckled images demonstrate that the visual quality of the results is excellent in terms of both background smoothing and preservation of edge sharpness and textures. The absence of decimation in the wavelet decomposition avoids the typical ringing impairments produced by critically-subsampled wavelet-based denoising.

  16. Adaptive Wavelet Transforms

    SciTech Connect

    Szu, H.; Hsu, C.

    1996-12-31

    Human sensor systems (HSS) may be approximately described as an adaptive or self-learning version of the Wavelet Transform (WT), capable of learning from several input-output associative pairs of suitable mother wavelets. Such an Adaptive WT (AWT) is a redundant combination of mother wavelets used either to represent or to classify inputs.

  17. An Improved Brain Tumour Classification System using Wavelet Transform and Neural Network.

    PubMed

    Dhas, DAS; Madheswaran, M

    2015-06-09

    An improved brain tumour classification system using wavelet transform and neural network is developed and presented in this paper. The anisotropic diffusion filter is used for image denoising and the performance of oriented rician noise reducing anisotropic diffusion (ORNRAD) filter is validated. The segmentation of the denoised image is carried out by Fuzzy C-means clustering. The features are extracted using Symlet and Coiflet Wavelet transform and Levenberg Marquardt algorithm based neural network is used to classify the magnetic resonance imaging (MRI) images. This MRI classification technique is tested and analysed with the existing methodologies and its performance is found to be satisfactory with a classification accuracy of 93.02%. The developed system can assist the physicians for classifying the MRI images for better decision-making.

  18. Adaptive 2-D wavelet transform based on the lifting scheme with preserved vanishing moments.

    PubMed

    Vrankic, Miroslav; Sersic, Damir; Sucic, Victor

    2010-08-01

    In this paper, we propose novel adaptive wavelet filter bank structures based on the lifting scheme. The filter banks are nonseparable, based on quincunx sampling, with their properties being pixel-wise adapted according to the local image features. Despite being adaptive, the filter banks retain a desirable number of primal and dual vanishing moments. The adaptation is introduced in the predict stage of the filter bank with an adaptation region chosen independently for each pixel, based on the intersection of confidence intervals (ICI) rule. The image denoising results are presented for both synthetic and real-world images. It is shown that the obtained wavelet decompositions perform well, especially for synthetic images that contain periodic patterns, for which the proposed method outperforms the state of the art in image denoising.

  19. Two-dimensional wavelet mapping techniques for damage detection in structural systems

    NASA Astrophysics Data System (ADS)

    Amizic, Bruno; Amaravadi, Venkat; Rao, Vittal S.; Derriso, Mark M.

    2002-07-01

    Application of wavelet transforms to measured vibration data provides a new tool for damage detection analysis of two-dimensional structural systems. In this paper, a novel two-dimensional wavelet mapping technique for damage detection, based on wavelet transforms and residual mode shapes, is proposed. After the vibration data were collected, wavelet de-noising shrinkage was performed in order to reduce measurement noise. By performing wavelet decomposition of the residuals of the mode shapes and taking only the detail coefficients, wavelet energy maps are constructed for all decomposition levels. The orthogonality of the wavelet transform has been utilized to correlate the energy at each decomposition level with the measured vibrational energy. Once the wavelet maps of interest are determined, they are overlaid on one another to identify the damaged areas of the two-dimensional structural system. The energy segmentation procedure is performed using minimum-homogeneity and uncertainty-based thresholding methods. It has been shown that the proposed method can clearly locate multiple damage locations on two-dimensional structures. This method requires few sampling points, is robust, and is independent of the type of damage or the material damaged. The proposed method is applied to detect multiple damage locations on a two-dimensional plate. The results are very satisfactory.

  20. Discrete shearlet transform on GPU with applications in anomaly detection and denoising

    NASA Astrophysics Data System (ADS)

    Gibert, Xavier; Patel, Vishal M.; Labate, Demetrio; Chellappa, Rama

    2014-12-01

    Shearlets have emerged in recent years as one of the most successful methods for the multiscale analysis of multidimensional signals. Unlike wavelets, shearlets form a pyramid of well-localized functions defined not only over a range of scales and locations, but also over a range of orientations and with highly anisotropic supports. As a result, shearlets are much more effective than traditional wavelets in handling the geometry of multidimensional data, and this was exploited in a wide range of applications from image and signal processing. However, despite their desirable properties, the wider applicability of shearlets is limited by the computational complexity of current software implementations. For example, denoising a single 512 × 512 image using a current implementation of the shearlet-based shrinkage algorithm can take between 10 s and 2 min, depending on the number of CPU cores, and much longer processing times are required for video denoising. On the other hand, due to the parallel nature of the shearlet transform, it is possible to use graphics processing units (GPU) to accelerate its implementation. In this paper, we present an open source stand-alone implementation of the 2D discrete shearlet transform using CUDA C++ as well as GPU-accelerated MATLAB implementations of the 2D and 3D shearlet transforms. We have instrumented the code so that we can analyze the running time of each kernel under different GPU hardware. In addition to denoising, we describe a novel application of shearlets for detecting anomalies in textured images. In this application, computation times can be reduced by a factor of 50 or more, compared to multicore CPU implementations.

  1. Improving the quality of the ECG signal by filtering in wavelet transform domain

    NASA Astrophysics Data System (ADS)

    Dzierżak, Róża; Surtel, Wojciech; Dzida, Grzegorz; Maciejewski, Marcin

    2016-09-01

    The article concerns methods for reducing the noise occurring in ECG signals. The method is based on filtering in the wavelet transform domain. The study was conducted on two types of signal: one recorded while the patient was at rest and one obtained during physical activity. Three types of filtering were applied to each signal. The study was designed to determine the effectiveness of various wavelets for de-noising the signals obtained in both cases. The results confirm the suitability of the method for improving the quality of the electrocardiogram for both types of signals.

  2. Wavelet detection of weak far-magnetic signal based on adaptive ARMA model threshold

    NASA Astrophysics Data System (ADS)

    Zhang, Ning; Lin, Chun-sheng; Fang, Shi

    2009-10-01

    Based on the Mallat algorithm, an adaptive wavelet-threshold de-noising algorithm is applied to the detection of the weak magnetic signal of a distant moving target in a complex magnetic environment. The choice of threshold is the key problem. Based on spectral analysis of the target's magnetic field, a threshold algorithm built on an adaptive ARMA model filter is put forward to improve the wavelet filtering performance. Simulations of the algorithm on measured data show that, compared with the Donoho threshold algorithm, the adaptive ARMA model threshold algorithm significantly improves the detection of weak magnetic signals in a complex magnetic environment.

  3. Signal Enhancement Using Time-Frequency Based Denoising

    DTIC Science & Technology

    2003-03-01

    ...algorithms on longer data segments. [Only report form fields and table-of-contents fragments were extracted for this record; the subject terms include the Discrete Wavelet Transform and soft and hard thresholding, and the listed contents cover the continuous and discrete wavelet transforms.]

  4. Proper orthogonal decomposition and wavelet methods for noise reduction in particle-based transport calculations

    NASA Astrophysics Data System (ADS)

    Nguyen van Ye, Romain; Del-Castillo-Negrete, Diego; Spong, D.; Hirshman, S.; Farge, M.

    2008-11-01

    A limitation of particle-based transport calculations is the noise due to limited statistical sampling. Thus, a key element for the success of these calculations is the development of efficient denoising methods. Here we discuss denoising techniques based on Proper Orthogonal Decomposition (POD) and Wavelet Decomposition (WD). The goal is the reconstruction of smooth (denoised) particle distribution functions from discrete particle data obtained from Monte Carlo simulations. In 2-D, the POD method is based on low-rank truncations of the singular value decomposition of the data. For 3-D we propose the use of a generalized low-rank approximation of matrices technique. The WD denoising is based on the thresholding of empirical wavelet coefficients [Donoho et al., 1996]. The methods are illustrated and tested with Monte Carlo particle simulation data of plasma collisional relaxation including pitch angle and energy scattering. As an application we consider guiding-center transport with collisions in a magnetically confined plasma in toroidal geometry. The proposed noise reduction methods make it possible to achieve high levels of smoothness in the particle distribution function while using significantly fewer particles in the computations.
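
    The 2-D POD step amounts to a truncated SVD of the binned distribution function, as in the sketch below; the rank is left as a free parameter here, whereas the paper selects it from the singular-value spectrum. numpy only.

    ```python
    import numpy as np

    def pod_denoise(f_noisy, rank=5):
        """Low-rank (truncated SVD) reconstruction of a 2-D binned distribution."""
        U, s, Vt = np.linalg.svd(np.asarray(f_noisy, float), full_matrices=False)
        # Keep only the leading `rank` singular triplets; the rest is treated as noise.
        return (U[:, :rank] * s[:rank]) @ Vt[:rank]
    ```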

  5. Predicting apple tree leaf nitrogen content based on hyperspectral applying wavelet and wavelet packet analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yao; Zheng, Lihua; Li, Minzan; Deng, Xiaolei; Sun, Hong

    2012-11-01

    The visible and NIR spectral reflectance of apple leaves was measured with a spectrophotometer in the fruit-bearing, fruit-falling and fruit-maturing periods, and the nitrogen content of each sample was measured in the lab. The correlation between the nitrogen content of the apple tree leaves and their hyperspectral data was analysed. Then the low-frequency signal and the high-frequency noise-reduced signal were extracted using a wavelet packet decomposition algorithm, and in parallel the original spectral reflectance was denoised using wavelet filtering. Principal component spectra were then obtained by PCA (Principal Component Analysis). The models built on the noise-reduced principal component spectra reached higher accuracy than the other three models in the fruit-bearing and fruit-maturing periods: their calibration R2 reached 0.9529 and 0.9501, and their validation R2 reached 0.7285 and 0.7303, respectively. In the fruit-falling period, the model based on the low-frequency principal component spectra reached the highest accuracy, with a calibration R2 of 0.9921 and a validation R2 of 0.6234. The results showed that the wavelet packet algorithm is an effective way to improve the prediction of apple tree nitrogen content from hyperspectral data.

  6. Wavelets in Physics

    NASA Astrophysics Data System (ADS)

    van den Berg, J. C.

    1999-08-01

    A guided tour J. C. van den Berg; 1. Wavelet analysis, a new tool in physics J.-P. Antoine; 2. The 2-D wavelet transform, physical applications J.-P. Antoine; 3. Wavelets and astrophysical applications A. Bijaoui; 4. Turbulence analysis, modelling and computing using wavelets M. Farge, N. K.-R. Kevlahan, V. Perrier and K. Schneider; 5. Wavelets and detection of coherent structures in fluid turbulence L. Hudgins and J. H. Kaspersen; 6. Wavelets, non-linearity and turbulence in fusion plasmas B. Ph. van Milligen; 7. Transfers and fluxes of wind kinetic energy between orthogonal wavelet components during atmospheric blocking A. Fournier; 8. Wavelets in atomic physics and in solid state physics J.-P. Antoine, Ph. Antoine and B. Piraux; 9. The thermodynamics of fractals revisited with wavelets A. Arneodo, E. Bacry and J. F. Muzy; 10. Wavelets in medicine and physiology P. Ch. Ivanov, A. L. Goldberger, S. Havlin, C.-K. Peng, M. G. Rosenblum and H. E. Stanley; 11. Wavelet dimension and time evolution Ch.-A. Guérin and M. Holschneider.

  7. Wavelets in Physics

    NASA Astrophysics Data System (ADS)

    van den Berg, J. C.

    2004-03-01

    A guided tour J. C. van den Berg; 1. Wavelet analysis, a new tool in physics J.-P. Antoine; 2. The 2-D wavelet transform, physical applications J.-P. Antoine; 3. Wavelets and astrophysical applications A. Bijaoui; 4. Turbulence analysis, modelling and computing using wavelets M. Farge, N. K.-R. Kevlahan, V. Perrier and K. Schneider; 5. Wavelets and detection of coherent structures in fluid turbulence L. Hudgins and J. H. Kaspersen; 6. Wavelets, non-linearity and turbulence in fusion plasmas B. Ph. van Milligen; 7. Transfers and fluxes of wind kinetic energy between orthogonal wavelet components during atmospheric blocking A. Fournier; 8. Wavelets in atomic physics and in solid state physics J.-P. Antoine, Ph. Antoine and B. Piraux; 9. The thermodynamics of fractals revisited with wavelets A. Arneodo, E. Bacry and J. F. Muzy; 10. Wavelets in medicine and physiology P. Ch. Ivanov, A. L. Goldberger, S. Havlin, C.-K. Peng, M. G. Rosenblum and H. E. Stanley; 11. Wavelet dimension and time evolution Ch.-A. Guérin and M. Holschneider.

  8. Compression of Ultrasonic NDT Image by Wavelet Based Local Quantization

    NASA Astrophysics Data System (ADS)

    Cheng, W.; Li, L. Q.; Tsukada, K.; Hanasaki, K.

    2004-02-01

    Compressing an ultrasonic image, which is always corrupted by noise, causes over-smoothing or severe distortion. To solve this problem and meet the needs of real-time inspection and tele-inspection, a compression method based on the Discrete Wavelet Transform (DWT) that also suppresses the noise without losing much flaw-relevant information is presented in this work. Exploiting the multi-resolution and interscale correlation properties of the DWT, a simple scheme called DWC classification is first introduced to classify detail wavelet coefficients (DWCs) as dominated by noise, dominated by signal, or affected by both. Better denoising can be realized by selectively thresholding the DWCs. In 'local quantization', different quantization strategies are applied to the DWCs according to their classification and the local image properties, allocating the bit rate more efficiently and thus achieving a higher compression rate. Meanwhile, the decompressed image shows suppressed noise and preserved flaw features.

  9. Geodesic denoising for optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Shahrian Varnousfaderani, Ehsan; Vogl, Wolf-Dieter; Wu, Jing; Gerendas, Bianca S.; Simader, Christian; Langs, Georg; Waldstein, Sebastian M.; Schmidt-Erfurth, Ursula

    2016-03-01

    Optical coherence tomography (OCT) is an optical signal acquisition method capturing micrometer resolution, cross-sectional three-dimensional images. OCT images are used widely in ophthalmology to diagnose and monitor retinal diseases such as age-related macular degeneration (AMD) and Glaucoma. While OCT allows the visualization of retinal structures such as vessels and retinal layers, image quality and contrast is reduced by speckle noise, obfuscating small, low intensity structures and structural boundaries. Existing denoising methods for OCT images may remove clinically significant image features such as texture and boundaries of anomalies. In this paper, we propose a novel patch based denoising method, Geodesic Denoising. The method reduces noise in OCT images while preserving clinically significant, although small, pathological structures, such as fluid-filled cysts in diseased retinas. Our method selects optimal image patch distribution representations based on geodesic patch similarity to noisy samples. Patch distributions are then randomly sampled to build a set of best matching candidates for every noisy sample, and the denoised value is computed based on a geodesic weighted average of the best candidate samples. Our method is evaluated qualitatively on real pathological OCT scans and quantitatively on a proposed set of ground truth, noise free synthetic OCT scans with artificially added noise and pathologies. Experimental results show that performance of our method is comparable with state of the art denoising methods while outperforming them in preserving the critical clinically relevant structures.

  10. Computed tomography perfusion imaging denoising using Gaussian process regression

    NASA Astrophysics Data System (ADS)

    Zhu, Fan; Carpenter, Trevor; Rodriguez Gonzalez, David; Atkinson, Malcolm; Wardlaw, Joanna

    2012-06-01

    Brain perfusion weighted images acquired using dynamic contrast studies have an important clinical role in acute stroke diagnosis and treatment decisions. However, computed tomography (CT) images suffer from low contrast-to-noise ratios (CNR) because the radiation exposure of the patient must be limited, so methods for improving the CNR are valuable. The majority of existing approaches for denoising CT images are optimized for 3D (spatial) information, including spatial decimation (spatially weighted mean filters) and techniques based on wavelet and curvelet transforms. However, perfusion imaging data is 4D, as it also contains temporal information. Our approach uses Gaussian process regression (GPR), which takes advantage of the temporal information, to reduce the noise level. Over the entire image, GPR gains a 99% CNR improvement over the raw images and also improves the quality of haemodynamic maps, allowing a better identification of edges and detailed information. At the level of individual voxels, GPR provides a stable baseline, helps to identify key parameters from tissue time-concentration curves and reduces the oscillations in the curve. GPR is superior to the comparable techniques used in this study.
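
    A minimal sketch of the temporal idea on a single voxel is given below: a noisy time-concentration curve is smoothed with Gaussian process regression using scikit-learn; the kernel, noise level and synthetic curve are illustrative assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        t = np.linspace(0, 60, 40)[:, None]                     # acquisition times (s)
        clean = 30 * np.exp(-((t.ravel() - 20) ** 2) / 50.0)    # idealised bolus curve
        noisy = clean + np.random.normal(0, 3.0, clean.shape)   # low-CNR measurement

        kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=9.0)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, noisy)
        denoised = gpr.predict(t)                               # posterior mean = smoothed curve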

  11. Image-Specific Prior Adaptation for Denoising.

    PubMed

    Lu, Xin; Lin, Zhe; Jin, Hailin; Yang, Jianchao; Wang, James Z

    2015-12-01

    Image priors are essential to many image restoration applications, including denoising, deblurring, and inpainting. Existing methods use either priors from the given image (internal) or priors from a separate collection of images (external). We find through statistical analysis that unifying the internal and external patch priors may yield a better patch prior. We propose a novel prior learning algorithm that combines the strength of both internal and external priors. In particular, we first learn a generic Gaussian mixture model from a collection of training images and then adapt the model to the given image by simultaneously adding additional components and refining the component parameters. We apply this image-specific prior to image denoising. The experimental results show that our approach yields better or competitive denoising results in terms of both the peak signal-to-noise ratio and structural similarity.
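
    A simplified sketch of the adaptation idea follows: a generic patch GMM is fitted on external images and then a few more EM iterations are run on patches of the given image so the prior adapts to it. The paper additionally grows the number of mixture components, which is omitted here; patch sources, sizes and component count are placeholder assumptions.

        import numpy as np
        from sklearn.feature_extraction.image import extract_patches_2d
        from sklearn.mixture import GaussianMixture

        external_imgs = [np.random.rand(64, 64) for _ in range(5)]   # stand-in training images
        noisy_img = np.random.rand(64, 64)                           # stand-in test image

        def patches(img, size=8, n=2000):
            p = extract_patches_2d(img, (size, size), max_patches=n, random_state=0)
            return p.reshape(len(p), -1)

        generic = np.vstack([patches(im) for im in external_imgs])
        gmm = GaussianMixture(n_components=10, covariance_type="full", warm_start=True,
                              max_iter=50, random_state=0)
        gmm.fit(generic)                    # generic (external) prior
        gmm.fit(patches(noisy_img))         # a few more EM steps adapt it to this image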

  12. [Denoising and assessing method of additive noise in the ultraviolet spectrum of SO2 in flue gas].

    PubMed

    Zhou, Tao; Sun, Chang-Ku; Liu, Bin; Zhao, Yu-Mei

    2009-11-01

    The denoising and assessment of the ultraviolet spectrum of SO2 in flue gas was studied based on DOAS. The denoising procedure for the additive noise in the spectrum was divided into two parts: reducing the additive noise and enhancing the useful signal. When obtaining the absorption feature of the measured gas, a multi-resolution preprocessing of the original spectrum by DWT (discrete wavelet transform) was adopted for denoising. The signal energy operators at different scales were used to choose the denoising threshold and separate the useful signal from the noise. On the other hand, because the flue gas spectra show no sudden changes in time series, the useful signal component was enhanced according to its time dependence. The standard absorption cross section was used to build an ideal absorption spectrum from the measured gas temperature and pressure, and this ideal spectrum was used as the desired signal, instead of the original spectrum, in the assessment method to modify the SNR (signal-to-noise ratio). Validation experiments were carried out in two environments, in the laboratory and on site. In the laboratory, SO2 was measured several times with a system using the method described above; the average deviation was less than 1.5% and the repeatability was less than 1%, with the short-range measurements performing better than the large-range ones. At a power plant, where the flue gas concentration varied over a large range, the maximum deviation of this method was 2.31% over 18 groups of comparison data. The experimental results show that the denoising effect on the on-site spectrum was better than on the laboratory spectrum, which means that this method can effectively improve the SNR of spectra seriously polluted by additive noise.
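
    A minimal per-level soft-thresholding sketch for a one-dimensional spectrum is shown below; the level-dependent threshold rule is a common heuristic, not the energy-operator rule described in the abstract, and the spectrum is a random placeholder.

        import numpy as np
        import pywt

        spec = np.random.rand(1024)                       # placeholder UV absorption spectrum
        coeffs = pywt.wavedec(spec, "sym6", level=5)
        den = [coeffs[0]]                                 # keep the approximation untouched
        for d in coeffs[1:]:
            sigma = np.median(np.abs(d)) / 0.6745         # per-level noise estimate
            den.append(pywt.threshold(d, sigma * np.sqrt(2 * np.log(len(spec))), mode="soft"))
        spec_dn = pywt.waverec(den, "sym6")[: len(spec)]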

  13. Speckle filtering of medical ultrasonic images using wavelet and guided filter.

    PubMed

    Zhang, Ju; Lin, Guangkuo; Wu, Lili; Cheng, Yun

    2016-02-01

    Speckle noise is an inherent artifact of medical ultrasound images, which significantly degrades image quality and restricts the accuracy of automatic diagnostic techniques. Speckle reduction is therefore an important step prior to the analysis and processing of ultrasound images. A new de-noising method based on an improved wavelet filter and a guided filter is proposed in this paper. According to the characteristics of medical ultrasound images in the wavelet domain, an improved threshold function based on the universal wavelet threshold function is developed. The wavelet coefficients of the speckle noise and of the noise-free signal are modeled with a Rayleigh distribution and a generalized Gaussian distribution, respectively. Bayesian maximum a posteriori estimation is applied to obtain a new wavelet shrinkage algorithm. The coefficients of the low-frequency sub-band in the wavelet domain are filtered by the guided filter, and the filtered image is then obtained by the inverse wavelet transform. Experiments comparing the method with seven other de-speckling filters are conducted. The results show that the proposed method not only has strong de-speckling ability, but also preserves image details such as lesion edges.

  14. Statistical Modeling of Low SNR Magnetic Resonance Images in Wavelet Domain Using Laplacian Prior and Two-Sided Rayleigh Noise for Visual Quality Improvement

    NASA Astrophysics Data System (ADS)

    Rabbani, H.

    2011-01-01

    In this paper we introduce a new wavelet-based image denoising algorithm using maximum a posteriori (MAP) criterion. For this reason we propose Laplace distribution with local variance for clean image and two-sided Rayleigh model for noise in wavelet domain. The local Laplace probability density function (pdf) is able to simultaneously model the heavy-tailed nature of marginal distribution and intrascale dependency between spatial adjacent coefficients. Using local Laplace prior and two-sided Rayleigh noise, we derive a new shrinkage function for image denoising in the wavelet domain. We propose our new spatially adaptive wavelet-based image denoising algorithm for several low signal-to-noise ratio (SNR) magnetic resonance (MR) images and compare the results with other methods. The simulation results show that this algorithm is able to truly improve the visual quality of noisy MR images with very low computational cost. In case the input MR image is blurred, a blind deconvolution (BD) algorithm is necessary for visual quality improvement. Since BD techniques are usually sensitive to noise, in this paper we also apply a BD algorithm to an appropriate subband in the wavelet domain to eliminate the effect of noise in the BD procedure and to further improve visual quality.

  15. Higher-order graph wavelets and sparsity on circulant graphs

    NASA Astrophysics Data System (ADS)

    Kotzagiannidis, Madeleine S.; Dragotti, Pier Luigi

    2015-08-01

    The notion of a graph wavelet gives rise to more advanced processing of data on graphs due to its ability to operate in a localized manner, across newly arising data-dependency structures, with respect to the graph signal and underlying graph structure, thereby taking into consideration the inherent geometry of the data. In this work, we tackle the problem of creating graph wavelet filterbanks on circulant graphs for a sparse representation of certain classes of graph signals. The underlying graph can hereby be data-driven as well as fixed, for applications including image processing and social network theory, whereby clusters can be modelled as circulant graphs, respectively. We present a set of novel graph wavelet filter-bank constructions, which annihilate higher-order polynomial graph signals (up to a border effect) defined on the vertices of undirected, circulant graphs, and are localised in the vertex domain. We give preliminary results on their performance for non-linear graph signal approximation and denoising. Furthermore, we provide extensions to our previously developed segmentation-inspired graph wavelet framework for non-linear image approximation, by incorporating notions of smoothness and vanishing moments, which further improve performance compared to traditional methods.

  16. Peeling skin syndrome: Current status.

    PubMed

    Garg, Kshitij; Singh, Devesh; Mishra, Devesh

    2010-03-15

    Peeling Skin Syndrome (PSS) is a rare genodermatoses characterized by asymptomatic, localized or generalized, continuous exfoliation of the stratum corneum; it may present at birth or in adulthood. We describe a patient having the type A non-inflammatory variant of PSS showing asymptomatic and continuous skin peeling from the neck, trunk, back, and extremities. Friction appeared to be an aggravating factor, but there was no seasonal variation. Histopathology in this condition reveals hyperkeratosis and splitting of the epidermis between the granular layer and the stratum corneum. No treatment for this disorder has been found to be effective so far.

  17. Enhancement of the automatic onset time picking via wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Gaci, Said

    2013-04-01

    Since arrival-time picking is a critical step in the analysis of geophysical data, many time-picking algorithms have been developed. Nowadays, the "short-time-average through long-time-average trigger" (STA/LTA), in its different forms, is the most commonly used. This study aims at improving this algorithm in the presence of high-amplitude noise. The suggested method consists of denoising the seismic trace using the discrete wavelet transform. The STA/LTA curve obtained from the denoised trace then displays a faster build-up at the position of the wave arrival, and the picking error is reduced. The application of this technique is first demonstrated on synthetic seismic traces with varying noise levels, and then extended to uphole seismic traces recorded in the Algerian Sahara. The results show that the picked first arrivals are more accurate than those yielded by the standard STA/LTA algorithm and that the method can tolerate high noise levels. Keywords: picking, first arrival, seismic wave, wavelet thresholding.
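
    A rough sketch of the idea follows: denoise the trace with discrete-wavelet soft thresholding and then run a classical STA/LTA detector on the result; the window lengths, trigger level and synthetic trace are assumptions for illustration.

        import numpy as np
        import pywt

        def wavelet_denoise(trace, wavelet="db4", level=4):
            c = pywt.wavedec(trace, wavelet, level=level)
            sigma = np.median(np.abs(c[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(trace)))
            c = [c[0]] + [pywt.threshold(d, thr, mode="soft") for d in c[1:]]
            return pywt.waverec(c, wavelet)[: len(trace)]

        def sta_lta(trace, nsta, nlta):
            e = trace ** 2
            sta = np.convolve(e, np.ones(nsta) / nsta, mode="same")
            lta = np.convolve(e, np.ones(nlta) / nlta, mode="same")
            return sta / (lta + 1e-12)

        fs = 500.0
        t = np.arange(0, 4, 1 / fs)
        trace = np.random.normal(0, 1.0, t.size)
        trace[1000:1100] += 5 * np.sin(2 * np.pi * 30 * t[1000:1100])   # synthetic arrival

        ratio = sta_lta(wavelet_denoise(trace), nsta=25, nlta=250)
        pick = np.argmax(ratio > 3.0)          # first sample where the trigger fires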

  18. Discrete wavelet transform core for image processing applications

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas E.; Carbone, Richard

    2005-02-01

    This paper presents a flexible hardware architecture for performing the Discrete Wavelet Transform (DWT) on a digital image. The proposed architecture uses a variation of the lifting scheme technique and provides advantages that include small memory requirements, fixed-point arithmetic implementation, and a small number of arithmetic computations. The DWT core may be used for image processing operations, such as denoising and image compression. For example, the JPEG2000 still image compression standard uses the Cohen-Daubechies-Favreau (CDF) 5/3 and CDF 9/7 DWT for lossless and lossy image compression respectively. Simple wavelet image denoising techniques resulted in improved images up to 27 dB PSNR. The DWT core is modeled using MATLAB and VHDL. The VHDL model is synthesized to a Xilinx FPGA to demonstrate hardware functionality. The CDF 5/3 and CDF 9/7 versions of the DWT are both modeled and used as comparisons. The execution time for performing both DWTs is nearly identical at approximately 14 clock cycles per image pixel for one level of DWT decomposition. The hardware area generated for the CDF 5/3 is around 15,000 gates using only 5% of the Xilinx FPGA hardware area, at 2.185 MHz max clock speed and 24 mW power consumption.
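
    For reference, a compact software sketch of the reversible CDF 5/3 lifting steps (one level, 1-D, even-length signal) is given below; the hardware core maps the same predict/update arithmetic to fixed-point logic. Periodic boundary handling is used here for brevity, rather than the symmetric extension of JPEG2000.

        import numpy as np

        def cdf53_forward(x):
            x = np.asarray(x, dtype=np.int64)
            s, d = x[0::2].copy(), x[1::2].copy()
            d -= (s + np.roll(s, -1)) // 2              # predict: detail = odd - floor((left + right) / 2)
            s += (np.roll(d, 1) + d + 2) // 4           # update:  smooth = even + floor((d[n-1] + d[n] + 2) / 4)
            return s, d

        def cdf53_inverse(s, d):
            s = s - (np.roll(d, 1) + d + 2) // 4        # undo update
            d = d + (s + np.roll(s, -1)) // 2           # undo predict
            x = np.empty(s.size + d.size, dtype=np.int64)
            x[0::2], x[1::2] = s, d
            return x

        x = np.random.randint(0, 256, 16)
        s, d = cdf53_forward(x)
        assert np.array_equal(cdf53_inverse(s, d), x)   # perfect (lossless) reconstruction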

  19. Analysis of de-noising methods to improve the precision of the ILSF BPM electronic readout system

    NASA Astrophysics Data System (ADS)

    Shafiee, M.; Feghhi, S. A. H.; Rahighi, J.

    2016-12-01

    In order to achieve optimum operation and a precise control system at particle accelerators, the beam position must be measured with sub-μm precision. We developed a BPM electronic readout system at the Iranian Light Source Facility, and it has been experimentally tested at the ALBA accelerator facility. The results show a precision of 0.54 μm in beam position measurements. To further improve the precision of this beam position monitoring system to the sub-μm level, we have studied different de-noising methods such as principal component analysis, wavelet transforms, FIR filtering, and direct averaging. The noise reduction of each method was evaluated to assess its capability. The results show that noise reduction based on the Daubechies wavelet transform performs better than the other algorithms, and that the method is suitable for signal noise reduction in beam position monitoring systems.

  20. Biorefinery of waste orange peel.

    PubMed

    Angel Siles López, José; Li, Qiang; Thompson, Ian P

    2010-03-01

    Up to comparatively recently orange peel and the associated residual remnants of membranes resulting from juice extraction represented a significant disposal problem, especially in those regions where orange cultivation is a major industry. However, recent research has demonstrated that orange peel waste represents a potentially valuable resource that can be developed into high value products. These developments are critically reviewed in this article. This includes a summary of the chemical composition of the substrate and an assessment of the range of applications in which the peel is deployed. Utilization as a substrate to produce animal feed, fertilizer, essential oils, pectin, ethanol, methane, industrial enzymes, and single cell protein is discussed. The applications described together with those that will no doubt be developed in the future, represent great opportunities to harness the economical benefit of this agro-industrial waste and to develop even more efficient and sustainable systems. A scheme of integrated utilization of orange peel in a biorefinery approach is discussed together with some prediction of further necessary research.

  1. Statistical Methods for Image Registration and Denoising

    DTIC Science & Technology

    2008-06-19

    [Table-of-contents excerpt: 2.5.4 Nonlocal Means; 2.5.5 Patch-Based Denoising with Optimal Spatial Adaptation; 2.5.6 Other Patch-Based Methods; 2.6 Chapter Summary. A text fragment also mentions the nonlocal means [9] and an optimal patch-based algorithm [31], noting that these algorithms all include some measure of pixel similarity.]

  2. Bayesian demosaicing using Gaussian scale mixture priors with local adaptivity in the dual tree complex wavelet packet transform domain

    NASA Astrophysics Data System (ADS)

    Goossens, Bart; Aelterman, Jan; Luong, Hiep; Pizurica, Aleksandra; Philips, Wilfried

    2013-02-01

    In digital cameras and mobile phones, there is an ongoing trend to increase the image resolution, decrease the sensor size and to use lower exposure times. Because smaller sensors inherently lead to more noise and a worse spatial resolution, digital post-processing techniques are required to resolve many of the artifacts. Color filter arrays (CFAs), which use alternating patterns of color filters, are very popular because of price and power consumption reasons. However, color filter arrays require the use of a post-processing technique such as demosaicing to recover full resolution RGB images. Recently, there has been some interest in techniques that jointly perform the demosaicing and denoising. This has the advantage that the demosaicing and denoising can be performed optimally (e.g. in the MSE sense) for the considered noise model, while avoiding artifacts introduced when using demosaicing and denoising sequentially. In this paper, we will continue the research line of wavelet-based demosaicing techniques. These approaches are computationally simple and very suited for combination with denoising. Therefore, we will derive Bayesian minimum mean squared error (MMSE) joint demosaicing and denoising rules in the complex wavelet packet domain, taking local adaptivity into account. As an image model, we will use Gaussian Scale Mixtures, thereby taking advantage of the directionality of the complex wavelets. Our results show that this technique is well capable of reconstructing fine details in the image, while removing all of the noise, at a relatively low computational cost. In particular, the complete reconstruction (including color correction, white balancing etc) of a 12 megapixel RAW image takes 3.5 sec on a recent mid-range GPU.

  3. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…

  4. Wavelet-based approach for detection and analysis of transient signal in distributed power system

    NASA Astrophysics Data System (ADS)

    Liao, Wei; Han, Pu

    2008-10-01

    By combining the wavelet transform (WT) with neural network theory, a novel approach is put forward to detect transient faults and analyze voltage stability. Signal denoising based on a statistical rule is applied to determine the threshold of each order of the wavelet space. In view of the interrelationship between the wavelet transform and the neural network, the global and local fractal exponents obtained from the WT coefficients are presented as features for extracting signal characteristics. The effectiveness of the new algorithm in extracting the characteristic signal, which can be assessed from the values obtained for the different types of transient signal, is described. This model incorporates the advantages of morphological filtering and multi-scale WT to extract the features of the fault signal while restraining various noises. Besides, it can be implemented in real time using available hardware. The effectiveness of this model was verified with the voltage stability analysis of simulation results.

  5. CW-THz image contrast enhancement using wavelet transform and Retinex

    NASA Astrophysics Data System (ADS)

    Chen, Lin; Zhang, Min; Hu, Qi-fan; Huang, Ying-Xue; Liang, Hua-Wei

    2015-10-01

    To enhance the contrast of continuous-wave terahertz (CW-THz) scanning images and reduce their noise, a method based on the wavelet transform and Retinex theory is proposed. First, the factors affecting the quality of CW-THz images were analysed. Second, a combination of the discrete wavelet transform (DWT) and a designed nonlinear function in the wavelet domain was applied for contrast enhancement. Then, the Retinex algorithm was combined with this approach for further contrast enhancement. To evaluate the effectiveness of the proposed method qualitatively and quantitatively, it was compared with the adaptive histogram equalization method, the homomorphic filtering method and the SSR (Single-Scale Retinex) method. Experimental results demonstrate that the presented algorithm can effectively enhance the contrast of CW-THz images and obtain a better visual effect.

  6. [Application of kalman filtering based on wavelet transform in ICP-AES].

    PubMed

    Qin, Xia; Shen, Lan-sun

    2002-12-01

    Kalman filtering is a recursive algorithm which has been proposed as an attractive alternative for correcting overlapping interferences in ICP-AES. However, the noise in ICP-AES contaminates the signal arising from the analyte and hence limits the accuracy of Kalman filtering. The wavelet transform is a powerful technique for signal denoising due to its multi-resolution characteristics. In this paper, the effect of noise on Kalman filtering is first discussed. Wavelet-transform-based soft-thresholding is then applied as a pre-processing step for Kalman filtering. The simulation results show that Kalman filtering based on the wavelet transform can effectively reduce the noise and increase the accuracy of the analysis.
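
    A toy illustration of the combination is sketched below: the wavelet coefficients of a noisy signal are soft-thresholded first, and a simple scalar (random-walk) Kalman filter is then run on the result; the signal model and noise variances are assumptions.

        import numpy as np
        import pywt

        def wavelet_pre(signal, wavelet="db6", level=4):
            c = pywt.wavedec(signal, wavelet, level=level)
            sigma = np.median(np.abs(c[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(signal)))
            return pywt.waverec([c[0]] + [pywt.threshold(d, thr, mode="soft") for d in c[1:]],
                                wavelet)[: len(signal)]

        def kalman_smooth(y, q=1e-4, r=1e-2):
            x, p, out = y[0], 1.0, np.empty_like(y)
            for k, z in enumerate(y):
                p += q                      # predict (random-walk state model)
                g = p / (p + r)             # Kalman gain
                x += g * (z - x)            # update with measurement z
                p *= (1 - g)
                out[k] = x
            return out

        raw = np.random.normal(0, 0.05, 512) + np.exp(-((np.arange(512) - 256) ** 2) / 200.0)
        estimate = kalman_smooth(wavelet_pre(raw))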

  7. Evaluation of optical properties for real photonic crystal fiber based on total variation in wavelet domain

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Wang, Xin; Lou, Shuqin; Lian, Zhenggang; Zhao, Tongtong

    2016-09-01

    An evaluation method based on the total variation (TV) model in the wavelet domain is proposed for modeling the optical properties of real photonic crystal fibers (PCFs). The TV model in the wavelet domain is set up to suppress the noise of the original image effectively and rebuild the cross-section images of real PCFs with high accuracy. The optical properties of three PCFs are evaluated, including two kinds of PCFs supplied by Crystal Fiber A/S and a homemade side-leakage PCF, by combining the proposed model with the finite element method. Numerical results demonstrate that the proposed method achieves a high noise suppression ratio and effectively reduces the noise in the cross-section images of PCFs, which leads to an accurate evaluation of the optical properties of real PCFs. To the best of our knowledge, this is the first time the cross-section images of PCFs have been denoised with a TV model in the wavelet domain.

  8. Denoising of PET images by context modelling using local neighbourhood correlation

    NASA Astrophysics Data System (ADS)

    Huerga, Carlos; Castro, Pablo; Corredoira, Eva; Coronado, Monica; Delgado, Victor; Guibelalde, Eduardo

    2017-01-01

    Positron emission tomography (PET) images are characterised by low signal-to-noise ratio and blurred edges when compared with other image modalities. It is therefore advisable to use noise reduction methods for qualitative and quantitative analyses. Given the importance of the maximum and mean uptake values, it is necessary to avoid signal loss, which could modify the clinical significance. This paper proposes a method of non-linear image denoising for PET. It is based on spatially adaptive wavelet-shrinkage and uses context modelling, which explicitly considers the correlation between neighbouring pixels. This context modelling is able to maintain the uptake values and preserve the edges in significant regions. The algorithm is proposed as an alternative to the usual filtering that is performed after reconstruction.

  9. Peeling of tomatoes using novel infrared radiation heating technology

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The effectiveness of using infrared (IR) dry-peeling as an alternative process for peeling tomatoes without lye and water was studied. Compared to conventional lye peeling, IR dry-peeling using 30 s to 75 s heating time resulted in lower peeling loss (8.3% - 13.2% vs. 12.9% - 15.8%), thinner thickne...

  10. Portal imaging: Performance improvement in noise reduction by means of wavelet processing.

    PubMed

    González-López, Antonio; Morales-Sánchez, Juan; Larrey-Ruiz, Jorge; Bastida-Jumilla, María-Consuelo; Verdú-Monedero, Rafael

    2016-01-01

    This paper discusses the suitability, in terms of noise reduction, of various methods which can be applied to an image type often used in radiation therapy: the portal image. Among these methods, the analysis focuses on those operating in the wavelet domain. Wavelet-based methods tested on natural images--such as the thresholding of the wavelet coefficients, the minimization of the Stein unbiased risk estimator on a linear expansion of thresholds (SURE-LET), and the Bayes least-squares method using as a prior a Gaussian scale mixture (BLS-GSM method)--are compared with other methods that operate on the image domain--an adaptive Wiener filter and a nonlocal mean filter (NLM). For the assessment of the performance, the peak signal-to-noise ratio (PSNR), the structural similarity index (SSIM), the Pearson correlation coefficient, and the Spearman rank correlation (ρ) coefficient are used. The performance of the wavelet filters and the NLM method are similar, but wavelet filters outperform the Wiener filter in terms of portal image denoising. It is shown how BLS-GSM and NLM filters produce the smoothest image, while keeping soft-tissue and bone contrast. As for the computational cost, filters using a decimated wavelet transform (decimated thresholding and SURE-LET) turn out to be the most efficient, with calculation times around 1 s.

  11. Adaptive kernel-based image denoising employing semi-parametric regularization.

    PubMed

    Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios

    2010-06-01

    The main contribution of this paper is the development of a novel approach, based on the theory of Reproducing Kernel Hilbert Spaces (RKHS), for the problem of noise removal in the spatial domain. The proposed methodology has the advantage that it is able to remove any kind of additive noise (impulse, Gaussian, uniform, etc.) from any digital image, in contrast to the most commonly used denoising techniques, which are noise dependent. The problem is cast as an optimization task in a RKHS, by taking advantage of the celebrated Representer Theorem in its semi-parametric formulation. The semi-parametric formulation, although known in theory, has so far found limited, to our knowledge, application. However, in the image denoising problem, its use is dictated by the nature of the problem itself. The need for edge preservation naturally leads to such a modeling. Examples verify that in the presence of Gaussian noise the proposed methodology performs well compared to wavelet-based techniques and outperforms them significantly in the presence of impulse or mixed noise.

  12. [A non-local means approach for PET image denoising].

    PubMed

    Yin, Yong; Sun, Weifeng; Lu, Jie; Liu, Tonghai

    2010-04-01

    Denoising is an important issue in medical image processing. Based on an analysis of the non-local means algorithm recently reported by Buades et al., we propose adapting it for PET image denoising. Experimental denoising results on real clinical PET images show that the non-local means method is superior to median filtering and Wiener filtering: it can effectively suppress noise in PET images while preserving important structural details for diagnosis.
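
    One readily available implementation of the Buades-style filter is the non-local means routine in scikit-image; a minimal call is shown below, with a synthetic image and placeholder filter parameters.

        import numpy as np
        from skimage.restoration import denoise_nl_means, estimate_sigma

        noisy = np.clip(np.random.rand(128, 128) * 0.2 +
                        np.random.normal(0, 0.05, (128, 128)), 0, 1)

        sigma = np.mean(estimate_sigma(noisy))
        denoised = denoise_nl_means(noisy, h=1.15 * sigma, sigma=sigma,
                                    patch_size=5, patch_distance=6, fast_mode=True)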

  13. A stacked contractive denoising auto-encoder for ECG signal denoising.

    PubMed

    Xiong, Peng; Wang, Hongrui; Liu, Ming; Lin, Feng; Hou, Zengguang; Liu, Xiuling

    2016-12-01

    As a primary diagnostic tool for cardiac diseases, electrocardiogram (ECG) signals are often contaminated by various kinds of noise, such as baseline wander, electrode contact noise and motion artifacts. In this paper, we propose a contractive denoising technique to improve the performance of current denoising auto-encoders (DAEs) for ECG signal denoising. Based on the Frobenius norm of the Jacobian matrix of the learned features with respect to the input, we develop a stacked contractive denoising auto-encoder (CDAE) to build a deep neural network (DNN) for noise reduction, which can significantly improve the expression of ECG signals through multi-level feature extraction. The proposed method is evaluated on ECG signals from the benchmark MIT-BIH Arrhythmia Database, with noise records from the MIT-BIH noise stress test database. The experimental results show that the new CDAE algorithm performs better than conventional ECG denoising methods, with more than 2.40 dB improvement in the signal-to-noise ratio (SNR) and improvements of roughly 0.075 to 0.350 in the root mean square error (RMSE).

  14. Wavelet analysis in neurodynamics

    NASA Astrophysics Data System (ADS)

    Pavlov, Aleksei N.; Hramov, Aleksandr E.; Koronovskii, Aleksei A.; Sitnikova, Evgenija Yu; Makarov, Valeri A.; Ovchinnikov, Alexey A.

    2012-09-01

    Results obtained using continuous and discrete wavelet transforms as applied to problems in neurodynamics are reviewed, with the emphasis on the potential of wavelet analysis for decoding signal information from neural systems and networks. The following areas of application are considered: (1) the microscopic dynamics of single cells and intracellular processes, (2) sensory data processing, (3) the group dynamics of neuronal ensembles, and (4) the macrodynamics of rhythmical brain activity (using multichannel EEG recordings). The detection and classification of various oscillatory patterns of brain electrical activity and the development of continuous wavelet-based brain activity monitoring systems are also discussed as possibilities.

  15. Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty

    NASA Astrophysics Data System (ADS)

    Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang

    2016-12-01

    Centrifugal compressor often suffers various defects such as impeller cracking, resulting in forced outage of the total plant. Damage diagnostics and condition monitoring of such a turbomachinery system has become an increasingly important and powerful tool to prevent potential failure in components and reduce unplanned forced outage and further maintenance costs, while improving reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three advanced state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficient to remove possible imperfections. The ratio of posterior odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework is demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration

  16. Aesthetic problems in chemical peeling.

    PubMed

    Goldman, P M; Freed, M I

    1989-09-01

    Questionnaires were mailed to six physicians experienced in chemical peels regarding common problems encountered in the procedure. They are: Thomas H. Alt, M.D., Minneapolis, MN; Samuel J. Stegman, M.D., San Francisco, CA; James J. Stagnone, M.D., Albuquerque, NM; Robert Kotler, M.D., Los Angeles, CA; William H. Beeson, M.D., Indianapolis, IN; and Paul S. Collins, M.D., San Luis Obispo, CA. Their answers are presented in a panel format.

  17. Image denoising by exploring external and internal correlations.

    PubMed

    Yue, Huanjing; Sun, Xiaoyan; Yang, Jingyu; Wu, Feng

    2015-06-01

    Single image denoising suffers from limited data collection within a noisy image. In this paper, we propose a novel image denoising scheme, which explores both internal and external correlations with the help of web images. For each noisy patch, we build internal and external data cubes by finding similar patches from the noisy and web images, respectively. We then propose reducing noise by a two-stage strategy using different filtering approaches. In the first stage, since the noisy patch may lead to inaccurate patch selection, we propose a graph based optimization method to improve patch matching accuracy in external denoising. The internal denoising is frequency truncation on internal cubes. By combining the internal and external denoising patches, we obtain a preliminary denoising result. In the second stage, we propose reducing noise by filtering of external and internal cubes, respectively, on transform domain. In this stage, the preliminary denoising result not only enhances the patch matching accuracy but also provides reliable estimates of filtering parameters. The final denoising image is obtained by fusing the external and internal filtering results. Experimental results show that our method constantly outperforms state-of-the-art denoising schemes in both subjective and objective quality measurements, e.g., it achieves >2 dB gain compared with BM3D at a wide range of noise levels.

  18. PRINCIPAL COMPONENTS FOR NON-LOCAL MEANS IMAGE DENOISING.

    PubMed

    Tasdizen, Tolga

    2008-01-01

    This paper presents an image denoising algorithm that uses principal component analysis (PCA) in conjunction with the non-local means image denoising. Image neighborhood vectors used in the non-local means algorithm are first projected onto a lower-dimensional subspace using PCA. Consequently, neighborhood similarity weights for denoising are computed using distances in this subspace rather than the full space. This modification to the non-local means algorithm results in improved accuracy and computational performance. We present an analysis of the proposed method's accuracy as a function of the dimensionality of the projection subspace and demonstrate that denoising accuracy peaks at a relatively low number of dimensions.
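
    A rough sketch of the idea follows: image neighbourhoods are projected onto a low-dimensional PCA subspace and non-local-means weights are computed from distances in that subspace (no search window is used here); the patch size, subspace dimension and filtering parameter are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.feature_extraction.image import extract_patches_2d

        img = np.random.rand(64, 64)
        patch = 7
        pads = patch // 2
        patches = extract_patches_2d(img, (patch, patch)).reshape(-1, patch * patch)
        proj = PCA(n_components=6).fit_transform(patches)        # low-dimensional neighbourhoods

        h2 = 2 * (0.1 ** 2)                                       # filtering parameter (assumed)
        rows = img.shape[0] - 2 * pads
        centers = patches[:, patch * patch // 2]                  # central pixel of every patch

        denoised = np.empty(len(proj))
        for i in range(len(proj)):
            d2 = np.sum((proj - proj[i]) ** 2, axis=1)            # distances in the PCA subspace
            w = np.exp(-d2 / h2)
            denoised[i] = np.sum(w * centers) / np.sum(w)
        denoised = denoised.reshape(rows, -1)                     # interior of the image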

  19. Speckle noise reduction in ultrasound images using a discrete wavelet transform-based image fusion technique.

    PubMed

    Choi, Hyun Ho; Lee, Ju Hwan; Kim, Sung Min; Park, Sung Yun

    2015-01-01

    Here, the speckle noise in ultrasonic images is removed using an image fusion-based denoising method. To optimize the denoising performance, each discrete wavelet transform (DWT) and filtering technique was analyzed and compared, and the performances were compared in order to derive the optimal input conditions. To evaluate the speckle noise removal performance, the image fusion algorithm was applied to ultrasound images and comparatively analyzed against the original images processed without the algorithm. As a result, applying the DWT and filtering techniques alone caused information loss and residual noise, and did not give the best noise reduction performance. Conversely, the image fusion method applied with the SRAD-original input conditions preserved the key information in the original image while removing the speckle noise. Based on these characteristics, the SRAD-original input conditions gave the best denoising performance for the ultrasound images. From this study, the proposed denoising technique was confirmed to have high potential for clinical application.

  20. A partial least squares and wavelet-transform hybrid model to analyze carbon content in coal using laser-induced breakdown spectroscopy.

    PubMed

    Yuan, Tingbi; Wang, Zhe; Li, Zheng; Ni, Weidou; Liu, Jianmin

    2014-01-07

    A partial least squares (PLS) and wavelet transform hybrid model is proposed to analyze the carbon content of coal by using laser-induced breakdown spectroscopy (LIBS). The hybrid model is composed of two wavelet analysis steps, environmental denoising and background noise reduction, to pretreat the LIBS spectrum. The processed wavelet coefficients, which contain the discrete line information of the spectra, were taken as inputs to the PLS model for calibration and prediction of the carbon element. A higher signal-to-noise ratio of the carbon line was obtained after environmental denoising, and the best decomposition level was determined after background noise reduction. The hybrid model resulted in a significant improvement over the conventional PLS method under different ambient environments, including air, argon, and helium. The average relative error for carbon decreased from 2.74 to 1.67% under an ambient helium environment, indicating a significantly improved accuracy in the measurement of carbon in coal. The best results obtained under an ambient helium environment could be partly attributed to the smallest interference by noise after wavelet denoising. A similar improvement was observed in ambient air and argon environments, thereby proving the applicability of the hybrid model under different experimental conditions.
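
    An illustrative pipeline in the same spirit is sketched below: each spectrum is wavelet-denoised, its processed coefficients are used as inputs, and carbon content is calibrated with PLS; the data, wavelet choice and number of latent components are placeholder assumptions.

        import numpy as np
        import pywt
        from sklearn.cross_decomposition import PLSRegression

        def denoised_features(spectrum, wavelet="db8", level=4):
            c = pywt.wavedec(spectrum, wavelet, level=level)
            sigma = np.median(np.abs(c[-1])) / 0.6745
            thr = sigma * np.sqrt(2 * np.log(len(spectrum)))
            c = [c[0]] + [pywt.threshold(d, thr, mode="soft") for d in c[1:]]
            return np.concatenate(c)                  # processed coefficients as PLS inputs

        spectra = np.random.rand(40, 2048)            # placeholder LIBS spectra
        carbon = np.random.rand(40)                   # placeholder carbon contents (%)

        X = np.array([denoised_features(s) for s in spectra])
        pls = PLSRegression(n_components=8).fit(X, carbon)
        pred = pls.predict(X).ravel()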

  1. Postprocessing of Compressed Images via Sequential Denoising

    NASA Astrophysics Data System (ADS)

    Dar, Yehuda; Bruckstein, Alfred M.; Elad, Michael; Giryes, Raja

    2016-07-01

    In this work we propose a novel postprocessing technique for compression-artifact reduction. Our approach is based on posing this task as an inverse problem, with a regularization that leverages on existing state-of-the-art image denoising algorithms. We rely on the recently proposed Plug-and-Play Prior framework, suggesting the solution of general inverse problems via Alternating Direction Method of Multipliers (ADMM), leading to a sequence of Gaussian denoising steps. A key feature in our scheme is a linearization of the compression-decompression process, so as to get a formulation that can be optimized. In addition, we supply a thorough analysis of this linear approximation for several basic compression procedures. The proposed method is suitable for diverse compression techniques that rely on transform coding. Specifically, we demonstrate impressive gains in image quality for several leading compression methods - JPEG, JPEG2000, and HEVC.
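
    A generic Plug-and-Play ADMM skeleton is sketched below with a wavelet denoiser from scikit-image plugged in as the prior; for clarity the forward operator is the identity, so the paper's linearised compression-decompression operator is not reproduced.

        import numpy as np
        from skimage.restoration import denoise_wavelet

        def pnp_admm(y, rho=1.0, sigma=0.05, n_iter=20):
            x = y.copy()
            v = y.copy()
            u = np.zeros_like(y)
            for _ in range(n_iter):
                x = (y + rho * (v - u)) / (1.0 + rho)                    # data-fidelity step (H = I)
                v = denoise_wavelet(np.clip(x + u, 0, 1), sigma=sigma)   # plug-in denoiser as the prior
                u = u + x - v                                            # dual update
            return x

        decompressed = np.clip(np.random.rand(64, 64), 0, 1)   # stand-in for a decompressed image
        restored = pnp_admm(decompressed)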

  2. Adaptive Image Denoising by Mixture Adaptation

    NASA Astrophysics Data System (ADS)

    Luo, Enming; Chan, Stanley H.; Nguyen, Truong Q.

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the Expectation-Maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad-hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper: First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. Experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms.

  3. Adaptive Image Denoising by Mixture Adaptation.

    PubMed

    Luo, Enming; Chan, Stanley H; Nguyen, Truong Q

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the expectation-maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper. First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. The experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms.

  4. CT reconstruction via denoising approximate message passing

    NASA Astrophysics Data System (ADS)

    Perelli, Alessandro; Lexa, Michael A.; Can, Ali; Davies, Mike E.

    2016-05-01

    In this paper, we adapt and apply a compressed sensing based reconstruction algorithm to the problem of computed tomography reconstruction for luggage inspection. Specifically, we propose a variant of the denoising generalized approximate message passing (D-GAMP) algorithm and compare its performance to the performance of traditional filtered back projection and to a penalized weighted least squares (PWLS) based reconstruction method. D-GAMP is an iterative algorithm that at each iteration estimates the conditional probability of the image given the measurements and employs a non-linear "denoising" function which implicitly imposes an image prior. Results on real baggage show that D-GAMP is well-suited to limited-view acquisitions.

  5. Nonlocal Markovian models for image denoising

    NASA Astrophysics Data System (ADS)

    Salvadeo, Denis H. P.; Mascarenhas, Nelson D. A.; Levada, Alexandre L. M.

    2016-01-01

    Currently, the state-of-the-art methods for image denoising are patch-based approaches. Redundant information present in nonlocal regions (patches) of the image is considered for better image modeling, resulting in an improved quality of filtering. In this respect, nonlocal Markov random field (MRF) models are proposed by redefining the energy functions of classical MRF models to adopt a nonlocal approach. With the new energy functions, the pairwise pixel interaction is weighted according to the similarities between the patches corresponding to each pair. Also, a maximum pseudolikelihood estimation of the spatial dependency parameter (β) for these models is presented here. For evaluating this proposal, these models are used as an a priori model in a maximum a posteriori estimation to denoise additive white Gaussian noise in images. Finally, results display a notable improvement in both quantitative and qualitative terms in comparison with the local MRFs.

  6. MRI denoising using non-local means.

    PubMed

    Manjón, José V; Carbonell-Caballero, José; Lull, Juan J; García-Martí, Gracián; Martí-Bonmatí, Luís; Robles, Montserrat

    2008-08-01

    Magnetic Resonance (MR) images are affected by random noise which limits the accuracy of any quantitative measurements from the data. In the present work, a recently proposed filter for random noise removal is analyzed and adapted to reduce this noise in MR magnitude images. This parametric filter, named Non-Local Means (NLM), is highly dependent on the setting of its parameters. The aim of this paper is to find the optimal parameter selection for MR magnitude image denoising. For this purpose, experiments have been conducted to find the optimum parameters for different noise levels. Besides, the filter has been adapted to fit the specific characteristics of the noise in MR magnitude images (i.e. Rician noise). From the results on synthetic and real images we can conclude that this filter can be successfully used for automatic MR denoising.

  7. Wavelet transforms with discrete-time continuous-dilation wavelets

    NASA Astrophysics Data System (ADS)

    Zhao, Wei; Rao, Raghuveer M.

    1999-03-01

    Wavelet constructions and transforms have been confined principally to the continuous-time domain. Even the discrete wavelet transform implemented through multirate filter banks is based on continuous-time wavelet functions that provide orthogonal or biorthogonal decompositions. This paper provides a novel wavelet transform construction based on the definition of discrete-time wavelets that can undergo continuous parameter dilations. The result is a transformation that has the advantage of discrete-time or digital implementation while circumventing the problem of inadequate scaling resolution seen with conventional dyadic or M-channel constructions. Examples of constructing such wavelets are presented.

  8. Denoising forced-choice detection data.

    PubMed

    García-Pérez, Miguel A

    2010-02-01

    Observers in a two-alternative forced-choice (2AFC) detection task face the need to produce a response at random (a guess) on trials in which neither presentation appeared to display a stimulus. Observers could alternatively be instructed to use a 'guess' key on those trials, a key that would produce a random guess and would also record the resultant correct or wrong response as emanating from a computer-generated guess. A simulation study shows that 'denoising' 2AFC data with information regarding which responses are a result of guesses yields estimates of detection threshold and spread of the psychometric function that are far more precise than those obtained in the absence of this information, and parallel the precision of estimates obtained with yes-no tasks running for the same number of trials. Simulations also show that partial compliance with the instructions to use the 'guess' key reduces the quality of the estimates, which nevertheless continue to be more precise than those obtained from conventional 2AFC data if the observers are still moderately compliant. An empirical study testing the validity of simulation results showed that denoised 2AFC estimates of spread were clearly superior to conventional 2AFC estimates and similar to yes-no estimates, but variations in threshold across observers and across sessions hid the benefits of denoising for threshold estimation. The empirical study also proved the feasibility of using a 'guess' key in addition to the conventional response keys defined in 2AFC tasks.

  9. Gearbox diagnostics using wavelet-based windowing technique

    NASA Astrophysics Data System (ADS)

    Omar, F. K.; Gaouda, A. M.

    2009-08-01

    In extracting gearbox acoustic signals embedded in excessive noise, an online and automated tool becomes a crucial necessity. One of the recent approaches that has gained some acceptance within the research arena is wavelet multi-resolution analysis (WMRA). However, selecting an accurate mother wavelet, defining dynamic threshold values and identifying the resolution levels to be considered in gearbox fault detection and diagnosis are still challenging tasks. This paper proposes a novel wavelet-based technique for detecting, locating and estimating the severity of defects in gear tooth fracture. The proposed technique enhances the WMRA by decomposing the noisy data into different resolution levels while sliding the data through a Kaiser window. Only the maximum expansion coefficients at each resolution level are used in de-noising, detecting and measuring the severity of the defects. A small set of coefficients is used in the monitoring process without assigning threshold values or performing signal reconstruction. The proposed monitoring technique has been applied to laboratory data corrupted with a high noise level.

  10. Wavelet-based pavement image compression and noise reduction

    NASA Astrophysics Data System (ADS)

    Zhou, Jian; Huang, Peisen S.; Chiang, Fu-Pen

    2005-08-01

    For any automated distress inspection system, typically a huge number of pavement images are collected. Use of an appropriate image compression algorithm can save disk space, reduce the saving time, increase the inspection distance, and increase the processing speed. In this research, a modified EZW (Embedded Zero-tree Wavelet) coding method, which is an improved version of the widely used EZW coding method, is proposed. This method, unlike the two-pass approach used in the original EZW method, uses only one pass to encode both the coordinates and magnitudes of wavelet coefficients. An adaptive arithmetic encoding method is also implemented to encode four symbols assigned by the modified EZW into binary bits. By applying a thresholding technique to terminate the coding process, the modified EZW coding method can compress the image and reduce noise simultaneously. The new method is much simpler and faster. Experimental results also show that the compression ratio was increased one and one-half times compared to the EZW coding method. The compressed and de-noised data can be used to reconstruct wavelet coefficients for off-line pavement image processing such as distress classification and quantification.

  11. The Discrete Wavelet Transform

    DTIC Science & Technology

    1991-06-01

    [Extraction fragment: reference entries for "Split-Band Coding" (Proc. ICASSP, May 1977, pp. 191-195) and M. Vetterli, "A Theory of Multirate Filter Banks" (IEEE Trans. ASSP, 35, March 1987, p. 356); a text fragment noting that both are special cases of a single filter bank structure, the discrete wavelet transform, whose behavior is governed by one's choice of filters; and figure captions "1.1 A wavelet filter bank structure" and "2.1 Diagram illustrating the dilation and ...".]

  12. Wavelets and Multifractal Analysis

    DTIC Science & Technology

    2004-07-01

    [Extraction fragment: a supplementary note referring to ADM001750, the Wavelets and Multifractal Analysis (WAMA) Workshop held on 19-31 July 2004, and table-of-contents entries including 2.5.4 Detrended Fluctuation Analysis [DFA(m)], 2.6 Scale-Independent Measures, 2.6.1 Detrended-Fluctuation-Analysis Power-Law Exponent (αD), and 2.6.2 Wavelet-Transform Power-Law Exponent.]

  13. Powerline interference reduction in ECG signals using empirical wavelet transform and adaptive filtering.

    PubMed

    Singh, Omkar; Sunkaria, Ramesh Kumar

    2015-01-01

    Separating an information-bearing signal from the background noise is a general problem in signal processing. In a clinical environment, during acquisition, the electrocardiogram (ECG) signal is corrupted by various noise sources such as powerline interference (PLI), baseline wander and muscle artifacts. This paper presents novel methods for the reduction of powerline interference in ECG signals using the empirical wavelet transform (EWT) and adaptive filtering. The proposed methods are compared with PLI cancellation methods based on empirical mode decomposition (EMD). A total of six methods for PLI reduction based on EMD and EWT are analysed and their results are presented in this paper. The EWT-based de-noising methods have lower computational complexity and are more efficient than the EMD-based de-noising methods.
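
    As a stand-in for the adaptive-filtering stage (the EWT decomposition is not shown), the following sketch cancels a 50 Hz interference with an LMS filter driven by in-phase and quadrature reference sinusoids; the sampling rate, step size and synthetic ECG are assumptions.

        import numpy as np

        fs, f0, mu = 360.0, 50.0, 0.01
        n = np.arange(2000)
        ecg = np.sin(2 * np.pi * 1.2 * n / fs)                        # crude stand-in for an ECG
        noisy = ecg + 0.4 * np.sin(2 * np.pi * f0 * n / fs + 0.7)     # add 50 Hz interference

        ref = np.stack([np.sin(2 * np.pi * f0 * n / fs),
                        np.cos(2 * np.pi * f0 * n / fs)])             # reference inputs
        w = np.zeros(2)
        clean = np.empty_like(noisy)
        for k in range(len(noisy)):
            est = w @ ref[:, k]              # current estimate of the interference
            e = noisy[k] - est               # error = interference-free output
            w += 2 * mu * e * ref[:, k]      # LMS weight update
            clean[k] = e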

  14. Field programmable gate arrays implementation of Dual Tree Complex Wavelet Transform.

    PubMed

    Canbay, Ferhat; Levent, Vecdi Emre; Serbes, Gorkem; Goren, Sezer; Aydin, Nizamettin

    2015-01-01

    Due to the inherent time-varying characteristics of physiological systems, most biomedical signals (BSs) are expected to have non-stationary character. Therefore, any appropriate analysis method for dealing with BSs should exhibit adjustable time-frequency (TF) resolution. The wavelet transform (WT) provides a TF representation of signals, which has good frequency resolution at low frequencies and good time resolution at high frequencies, resulting in an optimized TF resolution. Discrete wavelet transform (DWT), which is used in various medical signal processing applications such as denoising and feature extraction, is a fast and discretized algorithm for classical WT. However, the DWT has some very important drawbacks such as aliasing, lack of directionality, and shift-variance. To overcome these drawbacks, a new improved discrete transform named as Dual Tree Complex Wavelet Transform (DTCWT) can be used. Nowadays, with the improvements in embedded system technology, portable real-time medical devices are frequently used for rapid diagnosis in patients. In this study, in order to implement DTCWT algorithm in FPGAs, which can be used as real-time feature extraction or denoising operator for biomedical signals, a novel hardware architecture is proposed. In proposed architecture, DTCWT is implemented with only one adder and one multiplier. Additionally, considering the multi-channel outputs of biomedical data acquisition systems, this architecture is capable of running N channels in parallel.

  15. Comparative study of adsorption of Pb(II) on native garlic peel and mercerized garlic peel.

    PubMed

    Liu, Wei; Liu, Yifeng; Tao, Yaqi; Yu, Youjie; Jiang, Hongmei; Lian, Hongzhen

    2014-02-01

    A comparative study using native garlic peel and mercerized garlic peel as adsorbents for the removal of Pb(2+) is presented. Under the optimized pH, contact time, and adsorbent dosage, the adsorption capacity of garlic peel after mercerization was increased 2.1 times, up to 109.05 mg g(-1). The equilibrium sorption data for both garlic peels fitted well with the Langmuir adsorption isotherm, and the adsorbent-adsorbate kinetics followed a pseudo-second-order model. Both garlic peels were characterized by elemental analysis, Fourier transform infrared spectrometry (FT-IR), and scanning electron microscopy, and the results indicated that mercerized garlic peel offers more small pores acting as adsorption sites than native garlic peel, and has a lower degree of polymerization and crystallinity and more accessible functional hydroxyl groups, which results in a higher adsorption capacity than that of native garlic peel. FT-IR and X-ray photoelectron spectroscopy analyses of both garlic peels before and after loading with Pb(2+) further illustrated that lead was adsorbed through chelation between Pb(2+) and O atoms present on the surface of the garlic peels. These results show that garlic peel after mercerization can be a more attractive adsorbent due to its faster sorption uptake and higher capacity.

  16. Riesz wavelets and multiresolution structures

    NASA Astrophysics Data System (ADS)

    Larson, David R.; Tang, Wai-Shing; Weber, Eric

    2001-12-01

    Multiresolution structures are important in applications, but they are also useful for analyzing properties of the associated wavelets. Given a nonorthogonal (multi-)wavelet in a Hilbert space, we construct a core subspace. Subsequently, the dilates of the core subspace define a ladder of nested subspaces. Two questions are of fundamental importance: 1) when is the core subspace shift invariant; and, if it is, 2) when is the core subspace generated by shifts of a single vector, i.e., when does a scaling vector exist. If the wavelet generates a Riesz basis, then the answer to question 1) is yes if and only if the wavelet is a biorthogonal wavelet. Additionally, if the wavelet generates a tight frame of arbitrary frame constant, then the core subspace is shift invariant. Question 1) remains open in the case where the wavelet generates a non-tight frame. We also present some known results on question 2) and provide some preliminary improvements. Our analysis arises from investigating the dimension function and the multiplicity function of a wavelet. These two functions agree if the wavelet is orthogonal. Finally, we discuss how these questions bear on linear perturbations of wavelets. Utilizing the idea of the local commutant of a unitary system developed by Dai and Larson, we show that nearly all linear perturbations of two orthonormal wavelets form a Riesz wavelet. If these wavelets correspond to a von Neumann algebra in the local commutant of a base wavelet, then the interpolated wavelet is biorthogonal. Moreover, we demonstrate that in this case the interpolated wavelets have a scaling vector if the base wavelet has a scaling vector.

  17. True 4D Image Denoising on the GPU.

    PubMed

    Eklund, Anders; Andersson, Mats; Knutsson, Hans

    2011-01-01

    The use of image denoising techniques is an important part of many medical imaging applications. One common application is to improve the image quality of low-dose (noisy) computed tomography (CT) data. While 3D image denoising has previously been applied to several volumes independently, there has not been much work done on true 4D image denoising, where the algorithm considers several volumes at the same time. The problem with 4D image denoising, compared to 2D and 3D denoising, is that the computational complexity increases exponentially. In this paper we describe a novel algorithm for true 4D image denoising, based on local adaptive filtering, and how to implement it on the graphics processing unit (GPU). The algorithm was applied to a 4D CT heart dataset with a resolution of 512 × 512 × 445 × 20. The result is that the GPU can complete the denoising in about 25 minutes if spatial filtering is used and in about 8 minutes if FFT-based filtering is used. The CPU implementation requires several days of processing time for spatial filtering and about 50 minutes for FFT-based filtering. The short processing time increases the clinical value of true 4D image denoising significantly.

  18. A Comparison of PDE-based Non-Linear Anisotropic Diffusion Techniques for Image Denoising

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2003-01-06

    PDE-based, non-linear diffusion techniques are an effective way to denoise images. In a previous study, we investigated the effects of different parameters in the implementation of isotropic, non-linear diffusion. Using synthetic and real images, we showed that for images corrupted with additive Gaussian noise, such methods are quite effective, leading to lower mean-squared-error values in comparison with spatial filters and wavelet-based approaches. In this paper, we extend this work to include anisotropic diffusion, where the diffusivity is a tensor-valued function which can be adapted to local edge orientation. This allows smoothing along edges, but not perpendicular to them. We consider several anisotropic diffusivity functions as well as approaches for discretizing the diffusion operator that minimize mesh orientation effects. We investigate how these tensor-valued diffusivity functions compare in image quality, ease of use, and computational costs relative to simple spatial filters, the more complex bilateral filters, wavelet-based methods, and isotropic non-linear diffusion based techniques.

  19. Comparison of PDE-based non-linear anistropic diffusion techniques for image denoising

    NASA Astrophysics Data System (ADS)

    Weeratunga, Sisira K.; Kamath, Chandrika

    2003-05-01

    PDE-based, non-linear diffusion techniques are an effective way to denoise images. In a previous study, we investigated the effects of different parameters in the implementation of isotropic, non-linear diffusion. Using synthetic and real images, we showed that for images corrupted with additive Gaussian noise, such methods are quite effective, leading to lower mean-squared-error values in comparison with spatial filters and wavelet-based approaches. In this paper, we extend this work to include anisotropic diffusion, where the diffusivity is a tensor-valued function which can be adapted to local edge orientation. This allows smoothing along edges, but not perpendicular to them. We consider several anisotropic diffusivity functions as well as approaches for discretizing the diffusion operator that minimize mesh orientation effects. We investigate how these tensor-valued diffusivity functions compare in image quality, ease of use, and computational costs relative to simple spatial filters, the more complex bilateral filters, wavelet-based methods, and isotropic non-linear diffusion based techniques.
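
    For reference, a minimal sketch of the simpler isotropic (scalar-diffusivity) Perona-Malik scheme that these anisotropic methods extend is given below; the parameter values are illustrative only, and it is not the tensor-valued scheme studied in the papers above.

        import numpy as np

        def perona_malik(img, niter=20, kappa=15.0, dt=0.2):
            # explicit isotropic non-linear diffusion (periodic boundaries for simplicity)
            u = img.astype(float).copy()
            for _ in range(niter):
                # differences to the four nearest neighbours
                dN = np.roll(u, -1, axis=0) - u
                dS = np.roll(u, 1, axis=0) - u
                dE = np.roll(u, -1, axis=1) - u
                dW = np.roll(u, 1, axis=1) - u
                # scalar diffusivity g(s) = exp(-(s/kappa)^2) attenuates diffusion across edges
                cN, cS = np.exp(-(dN / kappa) ** 2), np.exp(-(dS / kappa) ** 2)
                cE, cW = np.exp(-(dE / kappa) ** 2), np.exp(-(dW / kappa) ** 2)
                u += dt * (cN * dN + cS * dS + cE * dE + cW * dW)
            return u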

  20. Prediction of processing tomato peeling outcomes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Peeling outcomes of processing tomatoes were predicted using multivariate analysis of Magnetic Resonance (MR) images. Tomatoes were obtained from a whole-peel production line. Each fruit was imaged using a 7 Tesla MR system, and a multivariate data set was created from 28 different images. After ...

  1. Food peeling: conventional and new approaches

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Peeling is an important unit operation in food processing that prepares fruits and vegetables for subsequent processes through removal of inedible or undesirable rind or skin. This chapter covers an exhaustive discussion on advancement in peeling technologies of fruits and vegetables from different ...

  2. Pomegranate peel and peel extracts: chemistry and food features.

    PubMed

    Akhtar, Saeed; Ismail, Tariq; Fraternale, Daniele; Sestili, Piero

    2015-05-01

    The present review focuses on the nutritional, functional and anti-infective properties of pomegranate (Punica granatum L.) peel (PoP) and peel extract (PoPx) and on their applications as food additives, functional food ingredients or biologically active components in nutraceutical preparations. Due to their well-known ethnomedical relevance and chemical features, the biomolecules available in PoP and PoPx have been proposed, for instance, as substitutes of synthetic food additives, as nutraceuticals and chemopreventive agents. However, because of their astringency and anti-nutritional properties, PoP and PoPx are not yet considered as ingredients of choice in food systems. Indeed, considering the prospects related to both their health promoting activity and chemical features, the nutritional and nutraceutical potential of PoP and PoPx seems to be still underestimated. The present review meticulously covers the wide range of actual and possible applications (food preservatives, stabilizers, supplements, prebiotics and quality enhancers) of PoP and PoPx components in various food products. Given the overall properties of PoP and PoPx, further investigations in toxicological and sensory aspects of PoP and PoPx should be encouraged to fully exploit the health promoting and technical/economic potential of these waste materials as food supplements.

  3. Combining interior and exterior characteristics for remote sensing image denoising

    NASA Astrophysics Data System (ADS)

    Peng, Ni; Sun, Shujin; Wang, Runsheng; Zhong, Ping

    2016-04-01

    Remote sensing image denoising faces many challenges since a remote sensing image usually covers a wide area and thus contains complex content. Using patch-based statistical characteristics is a flexible way to improve denoising performance. Two kinds of statistical characteristics are usually available: interior and exterior characteristics. Different statistical characteristics have their own strengths for restoring specific image contents, so combining them may have the potential to improve denoising results. This work proposes a method that adaptively selects between interior and exterior statistical characteristics for different image contents. The proposed approach is implemented through a new characteristics selection criterion learned over training data. Moreover, with the proposed combination method, this work develops a denoising algorithm for remote sensing images. Experimental results show that our method can make full use of the advantages of interior and exterior characteristics for different image contents and thus improve denoising performance.

  4. Dual-domain denoising in three dimensional magnetic resonance imaging.

    PubMed

    Peng, Jing; Zhou, Jiliu; Wu, Xi

    2016-08-01

    Denoising is a crucial preprocessing procedure for three dimensional magnetic resonance imaging (3D MRI). Existing denoising methods are predominantly implemented in a single domain, ignoring information in other domains. However, denoising methods are becoming increasingly complex, making analysis and implementation challenging. The present study aimed to develop a dual-domain image denoising (DDID) algorithm for 3D MRI that encapsulates information from the spatial and transform domains. In the present study, the DDID method was used to distinguish signal from noise in the spatial and frequency domains, after which robust accurate noise estimation was introduced for iterative filtering, which is simple and beneficial for computation. In addition, the proposed method was compared quantitatively and qualitatively with existing methods for synthetic and in vivo MRI datasets. The results of the present study suggested that the novel DDID algorithm performed well and provided competitive results, as compared with existing MRI denoising filters.

  5. Patch-based near-optimal image denoising.

    PubMed

    Chatterjee, Priyam; Milanfar, Peyman

    2012-04-01

    In this paper, we propose a denoising method motivated by our previous analysis of the performance bounds for image denoising. Insights from that study are used here to derive a high-performance practical denoising algorithm. We propose a patch-based Wiener filter that exploits patch redundancy for image denoising. Our framework uses both geometrically and photometrically similar patches to estimate the different filter parameters. We describe how these parameters can be accurately estimated directly from the input noisy image. Our denoising approach, designed for near-optimal performance (in the mean-squared error sense), has a sound statistical foundation that is analyzed in detail. The performance of our approach is experimentally verified on a variety of images and noise levels. The results presented here demonstrate that our proposed method is on par or exceeding the current state of the art, both visually and quantitatively.

  6. A connection between score matching and denoising autoencoders.

    PubMed

    Vincent, Pascal

    2011-07-01

    Denoising autoencoders have previously been shown to be competitive alternatives to restricted Boltzmann machines for unsupervised pretraining of each layer of a deep architecture. We show that a simple denoising autoencoder training criterion is equivalent to matching the score (with respect to the data) of a specific energy-based model to that of a nonparametric Parzen density estimator of the data. This yields several useful insights. It defines a proper probabilistic model for the denoising autoencoder technique, which makes it in principle possible to sample from it or rank examples by their energy. It suggests a different way to apply score matching that is related to learning to denoise and does not require computing second derivatives. It justifies the use of tied weights between the encoder and decoder and suggests ways to extend the success of denoising autoencoders to a larger family of energy-based models.
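
    As a pointer to the underlying objective, the Gaussian-corruption form of the denoising score matching criterion discussed in this line of work can be written as follows (the notation here is ours, not taken from the abstract):

        J(\theta) = \mathbb{E}_{x \sim p_{\mathrm{data}},\; \tilde{x} \sim \mathcal{N}(x, \sigma^2 I)}
                    \left[ \left\| \psi(\tilde{x};\theta) - \frac{x - \tilde{x}}{\sigma^2} \right\|^2 \right]

    where \psi(\tilde{x};\theta) is the model score. A denoising autoencoder with reconstruction r(\tilde{x}) corresponds to the choice \psi(\tilde{x}) = (r(\tilde{x}) - \tilde{x}) / \sigma^2, so minimizing the usual reconstruction error implicitly performs this form of score matching.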

  7. Atomic-Scale Peeling of Graphene

    NASA Astrophysics Data System (ADS)

    Ishikawa, Makoto; Ichikawa, Masaya; Okamoto, Hideki; Itamura, Noriaki; Sasaki, Naruo; Miura, Kouji

    2012-06-01

    We report the atomic-scale peeling of a single-layer graphene on a graphite substrate, in which stick-slip sliding of the single-layer graphene occurs at the atomic scale while maintaining AB-stacking registry with the graphite substrate. The peeling force curve clearly exhibits a transition from surface-contact to line-contact between the graphene and graphite surfaces. The amplitude of the peeling force depends on the lattice orientation of the surface, which is affected by the sliding force at the interface between the graphene and graphite surfaces. This study of peeling at the atomic scale will clarify the relationship among peeling, friction, adhesion, and superlubricity.

  8. Wavelet-Based Image Enhancement in X-Ray Imaging and Tomography

    NASA Astrophysics Data System (ADS)

    Bronnikov, Andrei V.; Duifhuis, Gerrit

    1998-07-01

    We consider an application of the wavelet transform to image processing in x-ray imaging and three-dimensional (3-D) tomography aimed at industrial inspection. Our experimental setup works in two operational modes: digital radiography and 3-D cone-beam tomographic data acquisition. Although the x-ray images measured have a large dynamic range and good spatial resolution, their noise properties and contrast are often not optimal. To enhance the images, we suggest applying digital image processing using wavelet-based algorithms and consider the wavelet-based multiscale edge representation in the framework of the Mallat and Zhong approach [IEEE Trans. Pattern Anal. Mach. Intell. 14, 710 (1992)]. A contrast-enhancement method based on equalization of the multiscale edges is suggested. Several denoising algorithms based on modifying the modulus and the phase of the multiscale gradients, and several contrast-enhancement techniques applying linear and nonlinear multiscale edge stretching, are described and compared using experimental data. We propose the use of a filter bank of wavelet-based reconstruction filters for the filtered-backprojection reconstruction algorithm. Experimental results show a considerable increase in the performance of the whole x-ray imaging system, for both radiographic and tomographic modes, when the wavelet-based image-processing algorithms are applied.

  9. Wavelet-based SAR images despeckling using joint hidden Markov model

    NASA Astrophysics Data System (ADS)

    Li, Qiaoliang; Wang, Guoyou; Liu, Jianguo; Chen, Shaobo

    2007-11-01

    In the past few years, wavelet-domain hidden Markov models have proven to be useful tools for statistical signal and image processing. The hidden Markov tree (HMT) model captures the key features of the joint probability density of the wavelet coefficients of real-world data. One potential drawback of the HMT framework is that it does not account for the intrascale correlations that exist among neighboring wavelet coefficients. In this paper, we propose to develop a joint hidden Markov model by fusing the wavelet Bayesian denoising technique with an image regularization procedure based on the HMT and a Markov random field (MRF). The Expectation Maximization algorithm is used to estimate hyperparameters and specify the mixture model. The noise-free wavelet coefficients are finally estimated by a shrinkage function based on local weighted averaging of the Bayesian estimator. It is shown that the joint method outperforms the Lee filter and standard HMT techniques in terms of the equivalent number of looks (ENL) and Pratt's figure of merit (FOM), especially when dealing with speckle noise of large variance.

  10. Image denoising using a tight frame.

    PubMed

    Shen, Lixin; Papadakis, Manos; Kakadiaris, Ioannis A; Konstantinidis, Ioannis; Kouri, Donald; Hoffman, David

    2006-05-01

    We present a general mathematical theory for lifting frames that allows us to modify existing filters to construct new ones that form Parseval frames. We apply our theory to design nonseparable Parseval frames from separable (tensor) products of a piecewise linear spline tight frame. These new frame systems incorporate the weighted average operator, the Sobel operator, and the Laplacian operator in directions that are integer multiples of 45 degrees. A new image denoising algorithm is then proposed, tailored to the specific properties of these new frame filters. We demonstrate the performance of our algorithm on a diverse set of images with very encouraging results.

  11. Comparative analysis of the interferogram noise filtration using wavelet transform and spin filtering algorithms

    NASA Astrophysics Data System (ADS)

    Zielinski, B.; Patorski, K.

    2008-12-01

    The aim of this paper is to analyze the accuracy of 2D fringe pattern denoising performed by two chosen methods, one using a quasi-1D two-arm spin filter and the other using 2D Discrete Wavelet Transform (DWT) signal decomposition and thresholding. The ultimate aim of this comparison is to estimate which algorithm is better suited for high-accuracy interferometric measurements. In spite of the fact that both algorithms are designed to minimize possible fringe blur and distortion, the evaluation of errors introduced by each algorithm is essential for proper estimation of their performance.

  12. A novel 3D wavelet based filter for visualizing features in noisy biological data

    SciTech Connect

    Moss, W C; Haase, S; Lyle, J M; Agard, D A; Sedat, J W

    2005-01-05

    We have developed a 3D wavelet-based filter for visualizing structural features in volumetric data. The only variable parameter is a characteristic linear size of the feature of interest. The filtered output contains only those regions that are correlated with the characteristic size, thus denoising the image. We demonstrate the use of the filter by applying it to 3D data from a variety of electron microscopy samples including low contrast vitreous ice cryogenic preparations, as well as 3D optical microscopy specimens.

  13. Wavelets on Planar Tesselations

    SciTech Connect

    Bertram, M.; Duchaineau, M.A.; Hamann, B.; Joy, K.I.

    2000-02-25

    We present a new technique for progressive approximation and compression of polygonal objects in images. Our technique uses local parameterizations defined by meshes of convex polygons in the plane. We generalize a tensor product wavelet transform to polygonal domains to perform multiresolution analysis and compression of image regions. The advantage of our technique over conventional wavelet methods is that the domain is an arbitrary tessellation rather than, for example, a uniform rectilinear grid. We expect that this technique has many applications: image compression, progressive transmission, radiosity, virtual reality, and image morphing.

  14. Evolutionary Fuzzy Block-Matching-Based Camera Raw Image Denoising.

    PubMed

    Yang, Chin-Chang; Guo, Shu-Mei; Tsai, Jason Sheng-Hong

    2016-10-03

    An evolutionary fuzzy block-matching-based image denoising algorithm is proposed to remove noise from a camera raw image. Recently, variance stabilization transforms have been widely used to stabilize the noise variance, so that a Gaussian denoising algorithm can be used to remove the signal-dependent noise in camera sensors. However, in the stabilized domain, existing denoising algorithms may blur too much detail. To provide a better estimate of the noise-free signal, a new block-matching approach is proposed that finds similar blocks using a type-2 fuzzy logic system (FLS). These similar blocks are then averaged with weightings determined by the FLS. Finally, an efficient differential evolution is used to further improve the performance of the proposed denoising algorithm. The experimental results show that the proposed denoising algorithm effectively improves the performance of image denoising. Furthermore, the average performance of the proposed method is better than those of two state-of-the-art image denoising algorithms in both subjective and objective measures.
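
    The variance-stabilization step mentioned above is often realized with the Anscombe transform; a minimal sketch follows, using a plain Gaussian smoother as a stand-in for the Gaussian-domain denoiser (an assumption of this example, not the fuzzy block-matching method), and the simple algebraic inverse rather than an exact unbiased inverse.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def anscombe(x):
            # forward variance-stabilizing transform for Poisson-like sensor noise
            return 2.0 * np.sqrt(x + 3.0 / 8.0)

        def inverse_anscombe(y):
            # simple algebraic inverse (slightly biased but common approximation)
            return (y / 2.0) ** 2 - 3.0 / 8.0

        raw = np.random.poisson(lam=20.0, size=(64, 64)).astype(float)   # toy raw frame
        stabilized = anscombe(raw)
        denoised_stab = gaussian_filter(stabilized, sigma=1.0)           # any Gaussian denoiser fits here
        denoised = inverse_anscombe(denoised_stab)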

  15. Peel test of spinnable carbon nanotube webs

    NASA Astrophysics Data System (ADS)

    Khandoker, Noman; Hawkins, Stephen C.; Ibrahim, Raafat; Huynh, Chi P.

    2014-06-01

    This paper presents the results of peel tests on spinnable carbon nanotube webs. Peel tests were performed to study the effect of orientation angle on the interface energy between nanotubes. In the absence of any binding agent, the interface energy represents the van der Waals energy between the interacting nanotubes. Therefore, the effect of orientation on the van der Waals energy between carbon nanotubes is obtained through the peel test. It is shown that the energy for crossed nanotubes at a 90° angle is lower than the energy for parallel nanotubes at a 0° angle. This experimental observation was supported by theoretical calculations for an idealized configuration.

  16. Denoising of ECG signal during spaceflight using singular value decomposition

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Wang, Li

    2009-12-01

    The singular value decomposition (SVD) method is introduced to denoise the ECG signal during spaceflight. The theoretical basis of the SVD method is briefly given, and the denoising procedure is presented using a segment of a real ECG signal. We improve the algorithm for calculating the singular value ratio (SVR) spectrum and propose a constructive approach for analyzing characteristic patterns. The ECG signal is reproduced very well and the noise is suppressed effectively. The SVD method is shown to be suitable for denoising the ECG signal.
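
    A generic sketch of SVD-based denoising of a 1-D signal via a Hankel (trajectory) matrix is given below; it illustrates only the subspace-truncation idea and does not reproduce the paper's SVR-spectrum procedure, and the window length and rank are illustrative assumptions.

        import numpy as np
        from scipy.linalg import hankel

        def svd_denoise(x, L=64, rank=6):
            # embed the signal in an L x (n-L+1) trajectory matrix
            x = np.asarray(x, dtype=float)
            n = len(x)
            X = hankel(x[:L], x[L - 1:])
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            s[rank:] = 0.0                               # keep the dominant (signal-subspace) components
            Xr = (U * s) @ Vt
            # reconstruct the signal by averaging along anti-diagonals
            rec = np.zeros(n)
            cnt = np.zeros(n)
            for i in range(Xr.shape[0]):
                for j in range(Xr.shape[1]):
                    rec[i + j] += Xr[i, j]
                    cnt[i + j] += 1
            return rec / cnt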

  17. Peeling, sliding, pulling and bending

    NASA Astrophysics Data System (ADS)

    Lister, John; Peng, Gunnar

    2015-11-01

    The peeling of an elastic sheet away from a thin layer of viscous fluid is a simply stated and generic problem that involves complex interactions between the flow and elastic deformation over a range of length scales. Consider an analogue of capillary spreading, in which a blister of injected viscous fluid spreads due to tension in the overlying elastic sheet. Here the tension is coupled to the deformation of the sheet, and thus varies in time and space. A key question is whether or not viscous shear stresses ahead of the blister are sufficient to prevent the sheet sliding inwards and relieving the tension. Our asymptotic analysis reveals a dichotomy between fast and slow spreading, and between two-dimensional and axisymmetric spreading. In combination with bending stresses and gravity, which may dominate parts of the flow but not others, there is a plethora of dynamical regimes.

  18. Peeling, sliding, pulling and bending

    NASA Astrophysics Data System (ADS)

    Lister, John; Peng, Gunnar

    2016-11-01

    The peeling of an elastic sheet away from a thin layer of viscous fluid is a simply stated and generic problem that involves complex interactions between the flow and elastic deformation over a range of length scales. Consider an analogue of capillary spreading, in which a blister of injected viscous fluid spreads due to tension in the overlying elastic sheet. Here the tension is coupled to the deformation of the sheet, and thus varies in time and space. A key question is whether or not viscous shear stresses ahead of the blister are sufficient to prevent the sheet sliding inwards and relieving the tension. Our asymptotic analysis reveals a dichotomy between fast and slow spreading, and between two-dimensional and axisymmetric spreading. In combination with bending stresses and gravity, which may dominate parts of the flow but not others, there is a plethora of dynamical regimes.

  19. Wavelet-based method for image filtering using scale-space continuity

    NASA Astrophysics Data System (ADS)

    Jung, Claudio R.; Scharcanski, Jacob

    2001-04-01

    This paper proposes a novel technique to reduce noise while preserving edge sharpness during image filtering. This method is based on the image multiresolution decomposition by a discrete wavelet transform, given a proper wavelet basis. In the transform spaces, edges are implicitly located and preserved, at the same time that image noise is filtered out. At each resolution level, geometric continuity is used to preserve edges that are not isolated. Finally, we compare consecutive levels to preserve edges having continuity along scales. As a result, the proposed technique produces a filtered version of the original image, where homogeneous regions appear segmented by well-defined edges. Possible applications include image presegmentation and image denoising.

  20. Comparative analysis of interferogram noise filtration using wavelet transform and spin filtering algorithms

    NASA Astrophysics Data System (ADS)

    Zielinski, B.; Patorski, K.

    2010-06-01

    The aim of this paper is to analyze 2D fringe pattern denoising performed by two chosen methods, one based on a quasi-1D two-arm spin filter and the other on 2D discrete wavelet transform (DWT) signal decomposition and thresholding. The ultimate aim of this comparison is to estimate which algorithm is better suited for high-accuracy measurements by phase shifting interferometry (PSI) with the phase step evaluation using the lattice site approach. The spin filtering method proposed by Yu et al. (1994) was designed to minimize possible fringe blur and distortion. The 2D DWT also presents such features due to the lossless nature of the signal wavelet decomposition. To compare both methods, a special 2D histogram introduced by Gutman and Weber (1998) is used to evaluate intensity errors introduced by each of the presented algorithms.

  1. Image analysis using a dual-tree M-band wavelet transform.

    PubMed

    Chaux, Caroline; Duval, Laurent; Pesquet, Jean-Christophe

    2006-08-01

    We propose a two-dimensional generalization to the M-band case of the dual-tree decomposition structure (initially proposed by Kingsbury and further investigated by Selesnick) based on a Hilbert pair of wavelets. We particularly address: 1) the construction of the dual basis and 2) the resulting directional analysis. We also revisit the necessary pre-processing stage in the M-band case. While several reconstructions are possible because of the redundancy of the representation, we propose a new optimal signal reconstruction technique, which minimizes potential estimation errors. The effectiveness of the proposed M-band decomposition is demonstrated via denoising comparisons on several image types (natural, texture, seismics), with various M-band wavelets and thresholding strategies. Significant improvements in terms of both overall noise reduction and direction preservation are observed.

  2. The Peel Inlet-Harvey Estuary Study.

    ERIC Educational Resources Information Center

    Walker, Warren; Black, Ronald

    1979-01-01

    Describes how the department of physics of the Western Australian Institute of Technology (WAIT) has been involved in the Peel Inlet-Harvey Estuary study. An appendix which presents the departmental approach to curriculum matters is also included. (HM)

  3. Signal quality enhancement using higher order wavelets for ultrasonic TOFD signals from austenitic stainless steel welds.

    PubMed

    Praveen, Angam; Vijayarekha, K; Abraham, Saju T; Venkatraman, B

    2013-09-01

    The time of flight diffraction (TOFD) technique is a well-developed ultrasonic non-destructive testing (NDT) method that has been applied successfully for accurate sizing of defects in metallic materials. The technique, developed in the early 1970s as a means for accurate sizing and positioning of cracks in nuclear components, became very popular in the late 1990s and is today widely used in various industries for weld inspection. One of the main advantages of TOFD is that, apart from being a fast technique, it provides a higher probability of detection for linear defects. Since TOFD is based on diffraction of sound waves from the extremities of the defect, rather than reflection from planar faces as in pulse echo and phased array, the resultant signal is quite weak and the signal-to-noise ratio (SNR) is low. In many cases the defect signal is submerged in this noise, making it difficult to detect, position and size the defect. Several signal processing methods, such as digital filtering, split spectrum processing (SSP), the Hilbert transform and correlation techniques, have been developed to suppress unwanted noise and enhance the quality of the defect signal, which can then be used for characterization of defects and the material. Wavelet transform based thresholding techniques have been applied widely for de-noising of ultrasonic signals. In this paper, however, higher order wavelets are used to analyse the de-noising performance for TOFD signals obtained from austenitic stainless steel welds. It is observed that higher order wavelets give greater SNR improvement compared with lower order wavelets.
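
    A minimal sketch of the wavelet-thresholding de-noising step, using PyWavelets with a universal soft threshold, is shown below; the mother wavelets compared (e.g. 'db2' versus 'db8') and the decomposition depth are illustrative assumptions, not the paper's exact settings.

        import numpy as np
        import pywt

        def wavelet_denoise(x, wavelet="db8", level=5):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            # noise level from the finest detail coefficients (robust MAD estimate)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thr = sigma * np.sqrt(2.0 * np.log(len(x)))          # universal threshold
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)

        # compare a low-order and a higher-order wavelet on the same noisy A-scan:
        # denoised_db2 = wavelet_denoise(signal, "db2")
        # denoised_db8 = wavelet_denoise(signal, "db8")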

  4. Chemical peeling of eyelids and periorbital area.

    PubMed

    Morrow, D M

    1992-02-01

    Chemical peeling promotes formation of new epidermis and new dermal collagen, resulting in skin shrinkage, reduction of wrinkling and crepe paper skin, softening of crow's feet, and, when desired, lightened eyelid color. Chemical peeling may be performed as the only eyelid procedure, simultaneously with CO2 laser surgical blepharoplasty, after healing of cold-steel-scalpel or CO2-laser blepharoplasty, and as a repeated procedure to achieve maximal results.

  5. Sparsity based denoising of spectral domain optical coherence tomography images

    PubMed Central

    Fang, Leyuan; Li, Shutao; Nie, Qing; Izatt, Joseph A.; Toth, Cynthia A.; Farsiu, Sina

    2012-01-01

    In this paper, we make contact with the field of compressive sensing and present a development and generalization of tools and results for reconstructing irregularly sampled tomographic data. In particular, we focus on denoising Spectral-Domain Optical Coherence Tomography (SDOCT) volumetric data. We take advantage of customized scanning patterns, in which, a selected number of B-scans are imaged at higher signal-to-noise ratio (SNR). We learn a sparse representation dictionary for each of these high-SNR images, and utilize such dictionaries to denoise the low-SNR B-scans. We name this method multiscale sparsity based tomographic denoising (MSBTD). We show the qualitative and quantitative superiority of the MSBTD algorithm compared to popular denoising algorithms on images from normal and age-related macular degeneration eyes of a multi-center clinical trial. We have made the corresponding data set and software freely available online. PMID:22567586

  6. Improved Rotating Kernel Transformation Based Contourlet Domain Image Denoising Framework.

    PubMed

    Guo, Qing; Dong, Fangmin; Sun, Shuifa; Ren, Xuhong; Feng, Shiyu; Gao, Bruce Zhi

    A contourlet domain image denoising framework based on a novel Improved Rotating Kernel Transformation is proposed, in which the differences between subbands in the contourlet domain are taken into account. In detail: (1) a novel Improved Rotating Kernel Transformation (IRKT) is proposed to calculate the direction statistics of the image; the validity of the IRKT is verified by comparing the extracted edge information with that of a state-of-the-art edge detection algorithm; (2) the direction statistics represent the differences between subbands and are introduced into threshold-function-based contourlet domain denoising approaches in the form of weights, yielding the novel framework. The proposed framework is used to improve the contourlet soft-thresholding (CTSoft) and contourlet bivariate-thresholding (CTB) algorithms. The denoising results on conventional test images and on Optical Coherence Tomography (OCT) medical images show that the proposed methods improve the existing contourlet-based thresholding denoising algorithms, especially for the medical images.

  7. Denoising MR spectroscopic imaging data with low-rank approximations.

    PubMed

    Nguyen, Hien M; Peng, Xi; Do, Minh N; Liang, Zhi-Pei

    2013-01-01

    This paper addresses the denoising problem associated with magnetic resonance spectroscopic imaging (MRSI), where signal-to-noise ratio (SNR) has been a critical problem. A new scheme is proposed, which exploits two low-rank structures that exist in MRSI data, one due to partial separability and the other due to linear predictability. Denoising is performed by arranging the measured data in appropriate matrix forms (i.e., Casorati and Hankel) and applying low-rank approximations by singular value decomposition (SVD). The proposed method has been validated using simulated and experimental data, producing encouraging results. Specifically, the method can effectively denoise MRSI data in a wide range of SNR values while preserving spatial-spectral features. The method could prove useful for denoising MRSI data and other spatial-spectral and spatial-temporal imaging data as well.
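
    The partial-separability idea can be sketched with a plain truncated SVD of the Casorati matrix; the array shape and rank below are placeholders, and the Hankel-based step and the paper's parameter choices are not reproduced.

        import numpy as np

        def casorati_lowrank(data, rank=8):
            # truncated-SVD denoising of an (Nx, Ny, Nt) spatial-spectral array
            nx, ny, nt = data.shape
            C = data.reshape(nx * ny, nt)                # Casorati matrix: voxels x spectral points
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            s[rank:] = 0.0                               # keep only the dominant components
            return ((U * s) @ Vt).reshape(nx, ny, nt)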

  8. Image denoising via sparse and redundant representations over learned dictionaries.

    PubMed

    Elad, Michael; Aharon, Michal

    2006-12-01

    We address the image denoising problem, where zero-mean white and homogeneous Gaussian additive noise is to be removed from a given image. The approach taken is based on sparse and redundant representations over trained dictionaries. Using the K-SVD algorithm, we obtain a dictionary that describes the image content effectively. Two training options are considered: using the corrupted image itself, or training on a corpus of high-quality images. Since the K-SVD is limited in handling small image patches, we extend its deployment to arbitrary image sizes by defining a global image prior that forces sparsity over patches in every location in the image. We show how such Bayesian treatment leads to a simple and effective denoising algorithm. This leads to state-of-the-art denoising performance, equivalent to and sometimes surpassing recently published leading alternative denoising methods.

  9. A new method for mobile phone image denoising

    NASA Astrophysics Data System (ADS)

    Jin, Lianghai; Jin, Min; Li, Xiang; Xu, Xiangyang

    2015-12-01

    Images captured by mobile phone cameras via pipeline processing usually contain various kinds of noises, especially granular noise with different shapes and sizes in both luminance and chrominance channels. In chrominance channels, noise is closely related to image brightness. To improve image quality, this paper presents a new method to denoise such mobile phone images. The proposed scheme converts the noisy RGB image to luminance and chrominance images, which are then denoised by a common filtering framework. The common filtering framework processes a noisy pixel by first excluding the neighborhood pixels that significantly deviate from the (vector) median and then utilizing the other neighborhood pixels to restore the current pixel. In the framework, the strength of chrominance image denoising is controlled by image brightness. The experimental results show that the proposed method obviously outperforms some other representative denoising methods in terms of both objective measure and visual evaluation.

  10. Temperature field for radiative tomato peeling

    NASA Astrophysics Data System (ADS)

    Cuccurullo, G.; Giordano, L.

    2017-01-01

    Nowadays, peeling of tomatoes is performed using steam or lye, which are expensive and polluting techniques; sustainable dry-peeling alternatives are therefore being sought and, among these, radiative heating seems to be a fairly promising method. This paper aims to speed up the prediction of the surface temperatures required to realize dry peeling, and thus a 1D analytical model for the unsteady temperature field in a rotating tomato exposed to a radiative heating source is presented. Since only short times are of interest for the problem at hand, the model involves a semi-infinite slab cooled by convective heat transfer while heated by a pulsating heat source. The model being linear, the solution is derived following the Laplace transform method. A 3D finite element model of the rotating tomato is introduced as well in order to validate the analytical solution, and satisfactory agreement is attained. Therefore, two different ways to predict the onset of peeling conditions are available, which can be of help for the proper design of peeling plants. Particular attention is paid to surface temperature uniformity, which is a critical parameter for achieving easy tomato peeling.

  11. Wavelet evolutionary network for complex-constrained portfolio rebalancing

    NASA Astrophysics Data System (ADS)

    Suganya, N. C.; Vijayalakshmi Pai, G. A.

    2012-07-01

    The portfolio rebalancing problem deals with resetting the proportions of different assets in a portfolio in response to changing market conditions. The constraints included in the portfolio rebalancing problem are basic, cardinality, bounding, class and proportional transaction cost constraints. In this study, a new heuristic algorithm named wavelet evolutionary network (WEN) is proposed for the solution of the complex-constrained portfolio rebalancing problem. Initially, the empirical covariance matrix, one of the key inputs to the problem, is estimated using the wavelet shrinkage denoising technique to obtain better optimal portfolios. Secondly, the complex cardinality constraint is eliminated using k-means cluster analysis. Finally, the WEN strategy with logical procedures is employed to find the initial proportions of investment in the portfolio of assets and also to rebalance them after a certain period. Experimental studies of WEN are undertaken on Bombay Stock Exchange, India (BSE200 index, period: July 2001-July 2006) and Tokyo Stock Exchange, Japan (Nikkei225 index, period: March 2002-March 2007) data sets. The results obtained using WEN are compared with those of its only existing counterpart, the Hopfield evolutionary network (HEN) strategy, and verify that WEN performs better than HEN. In addition, different performance metrics and data envelopment analysis are carried out to prove the robustness and efficiency of WEN over the HEN strategy.

  12. Wavelet-based Poisson rate estimation using the Skellam distribution

    NASA Astrophysics Data System (ADS)

    Hirakawa, Keigo; Baqai, Farhan; Wolfe, Patrick J.

    2009-02-01

    Owing to the stochastic nature of discrete processes such as photon counts in imaging, real-world data measurements often exhibit heteroscedastic behavior. In particular, time series components and other measurements may frequently be assumed to be non-iid Poisson random variables whose rate parameter is proportional to the underlying signal of interest; witness the literature in digital communications, signal processing, astronomy, and magnetic resonance imaging applications. In this work, we show that certain wavelet and filterbank transform coefficients corresponding to vector-valued measurements of this type are distributed as sums and differences of independent Poisson counts, taking the so-called Skellam distribution. While exact estimates rarely admit analytical forms, we present Skellam mean estimators under both frequentist and Bayes models, as well as computationally efficient approximations and shrinkage rules, that may be interpreted as Poisson rate estimation performed in certain wavelet/filterbank transform domains. This indicates a promising potential approach for denoising of Poisson counts in the above-mentioned applications.
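
    The basic observation that a difference of independent Poisson counts (for example, an unnormalised Haar detail coefficient) is Skellam-distributed can be checked numerically; the rates below are arbitrary.

        import numpy as np
        from scipy.stats import skellam

        rng = np.random.default_rng(0)
        lam1, lam2 = 7.0, 3.0
        x1 = rng.poisson(lam1, 100000)
        x2 = rng.poisson(lam2, 100000)

        d = x1 - x2                                      # difference of two independent Poisson counts
        k = np.arange(d.min(), d.max() + 1)
        empirical = np.array([(d == v).mean() for v in k])
        theoretical = skellam.pmf(k, lam1, lam2)
        print(np.max(np.abs(empirical - theoretical)))   # close to zero: matches the Skellam pmf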

  13. Wavelet Signal Processing for Transient Feature Extraction

    DTIC Science & Technology

    1992-03-15

    Research was conducted to evaluate the feasibility of applying Wavelets and Wavelet Transform methods to transient signal feature extraction problems... Wavelet transform techniques were developed to extract low dimensional feature data that allowed a simple classification scheme to easily separate

  14. A new method for fusion, denoising and enhancement of x-ray images retrieved from Talbot-Lau grating interferometry

    NASA Astrophysics Data System (ADS)

    Scholkmann, Felix; Revol, Vincent; Kaufmann, Rolf; Baronowski, Heidrun; Kottler, Christian

    2014-03-01

    This paper introduces a new image denoising, fusion and enhancement framework for the combination and optimal visualization of x-ray attenuation contrast (AC), differential phase contrast (DPC) and dark-field contrast (DFC) images retrieved from x-ray Talbot-Lau grating interferometry. The new image fusion framework comprises three steps: (i) denoising each input image (AC, DPC and DFC) through adaptive Wiener filtering, (ii) performing a two-step image fusion process based on the shift-invariant wavelet transform, i.e. first fusing the AC with the DPC image and then fusing the resulting image with the DFC image, and finally (iii) enhancing the fused image to obtain a final image using adaptive histogram equalization, adaptive sharpening and contrast optimization. Application examples are presented for two biological objects (a human tooth and a cherry) and the proposed method is compared to two recently published AC/DPC/DFC image processing techniques. In conclusion, the new framework for the processing of AC, DPC and DFC allows the most relevant features of all three images to be combined in one image while reducing the noise and adaptively enhancing the relevant image features. The newly developed framework may be used in technical and medical applications.

  15. A new method for fusion, denoising and enhancement of x-ray images retrieved from Talbot-Lau grating interferometry.

    PubMed

    Scholkmann, Felix; Revol, Vincent; Kaufmann, Rolf; Baronowski, Heidrun; Kottler, Christian

    2014-03-21

    This paper introduces a new image denoising, fusion and enhancement framework for the combination and optimal visualization of x-ray attenuation contrast (AC), differential phase contrast (DPC) and dark-field contrast (DFC) images retrieved from x-ray Talbot-Lau grating interferometry. The new image fusion framework comprises three steps: (i) denoising each input image (AC, DPC and DFC) through adaptive Wiener filtering, (ii) performing a two-step image fusion process based on the shift-invariant wavelet transform, i.e. first fusing the AC with the DPC image and then fusing the resulting image with the DFC image, and finally (iii) enhancing the fused image to obtain a final image using adaptive histogram equalization, adaptive sharpening and contrast optimization. Application examples are presented for two biological objects (a human tooth and a cherry) and the proposed method is compared to two recently published AC/DPC/DFC image processing techniques. In conclusion, the new framework for the processing of AC, DPC and DFC allows the most relevant features of all three images to be combined in one image while reducing the noise and adaptively enhancing the relevant image features. The newly developed framework may be used in technical and medical applications.

  16. Gradient histogram estimation and preservation for texture enhanced image denoising.

    PubMed

    Zuo, Wangmeng; Zhang, Lei; Song, Chunwei; Zhang, David; Gao, Huijun

    2014-06-01

    Natural image statistics plays an important role in image denoising, and various natural image priors, including gradient-based, sparse representation-based, and nonlocal self-similarity-based ones, have been widely studied and exploited for noise removal. In spite of the great success of many denoising algorithms, they tend to smooth the fine scale image textures when removing noise, degrading the image visual quality. To address this problem, in this paper, we propose a texture enhanced image denoising method by enforcing the gradient histogram of the denoised image to be close to a reference gradient histogram of the original image. Given the reference gradient histogram, a novel gradient histogram preservation (GHP) algorithm is developed to enhance the texture structures while removing noise. Two region-based variants of GHP are proposed for the denoising of images consisting of regions with different textures. An algorithm is also developed to effectively estimate the reference gradient histogram from the noisy observation of the unknown image. Our experimental results demonstrate that the proposed GHP algorithm can well preserve the texture appearance in the denoised images, making them look more natural.

  17. Denoising two-photon calcium imaging data.

    PubMed

    Malik, Wasim Q; Schummers, James; Sur, Mriganka; Brown, Emery N

    2011-01-01

    Two-photon calcium imaging is now an important tool for in vivo imaging of biological systems. By enabling neuronal population imaging with subcellular resolution, this modality offers an approach for gaining a fundamental understanding of brain anatomy and physiology. Proper analysis of calcium imaging data requires denoising, that is, separating the signal from complex physiological noise. To analyze two-photon brain imaging data, we present a signal plus colored noise model in which the signal is represented as a harmonic regression and the correlated noise is represented as an autoregressive process. We provide an efficient cyclic descent algorithm to compute approximate maximum likelihood parameter estimates by combining a weighted least-squares procedure with the Burg algorithm. We use the Akaike information criterion to guide selection of the harmonic regression and autoregressive model orders. Our flexible yet parsimonious modeling approach reliably separates the stimulus-evoked fluorescence response from background activity and noise, assesses goodness of fit, and estimates confidence intervals and signal-to-noise ratio. This refined separation leads to appreciably enhanced image contrast for individual cells, including clear delineation of subcellular details and network activity. The application of our approach to in vivo imaging data recorded in the ferret primary visual cortex demonstrates that our method yields substantially denoised signal estimates. We also provide a general Volterra series framework for deriving this and other signal plus correlated noise models for imaging. This approach to analyzing two-photon calcium imaging data may be readily adapted to other computational biology problems to which correlated noise models apply.

  18. Biosorption properties of citrus peel derived oligogalacturonides, enzyme-modified pectin and peel hydrolysis residues

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A citrus processing industry priority is obtaining added value from fruit peel. Approximately one-half of each processed fruit is added to the waste stream. Peel residue mainly is composed of water (~80%), the remaining 20% (solid fraction) consists of pectin, soluble sugars, cellulose, proteins, ph...

  19. Seismic coherent and random noise attenuation using the undecimated discrete wavelet transform method with WDGA technique

    NASA Astrophysics Data System (ADS)

    Goudarzi, Alireza; Riahi, Mohammad Ali

    2012-12-01

    One of the most crucial challenges in seismic data processing is reducing the noise in the data, i.e. improving the signal-to-noise ratio. In this study, the 1D undecimated discrete wavelet transform (UDWT) is applied to attenuate random noise and ground roll. Wavelet domain ground roll analysis (WDGA) is applied to find the ground roll energy in the wavelet domain; the WDGA serves as a substitute for thresholding in seismic data processing. To assess the effectiveness of the WDGA method, we apply the 1D double density discrete wavelet transform (DDDWT), using soft thresholding, to the random noise reduction and ground roll attenuation processes. Seismic signals overlap with ground roll in the time and frequency domains. Random noise and ground roll have many undesirable effects on pre-stack seismic data and result in an inaccurate velocity analysis for NMO correction. In this paper, the UDWT with the WDGA technique, the DDDWT with soft thresholding, and the regular Fourier-based method (the f-k transform) are used and compared for seismic denoising.
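
    A minimal sketch of undecimated (stationary) wavelet denoising of a single seismic trace with PyWavelets is shown below; the wavelet, level and universal soft threshold are illustrative choices, and the WDGA step itself is not reproduced.

        import numpy as np
        import pywt

        def udwt_denoise(trace, wavelet="sym8", level=3):
            n = len(trace) - len(trace) % (2 ** level)            # SWT needs a length divisible by 2**level
            x = np.asarray(trace[:n], dtype=float)
            coeffs = pywt.swt(x, wavelet, level=level)            # undecimated (stationary) wavelet transform
            sigma = np.median(np.abs(coeffs[-1][1])) / 0.6745     # noise estimate from the finest details
            thr = sigma * np.sqrt(2.0 * np.log(n))
            coeffs = [(cA, pywt.threshold(cD, thr, mode="soft")) for cA, cD in coeffs]
            return pywt.iswt(coeffs, wavelet)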

  20. Mouse EEG spike detection based on the adapted continuous wavelet transform

    NASA Astrophysics Data System (ADS)

    Tieng, Quang M.; Kharatishvili, Irina; Chen, Min; Reutens, David C.

    2016-04-01

    Objective. Electroencephalography (EEG) is an important tool in the diagnosis of epilepsy. Interictal spikes on EEG are used to monitor the development of epilepsy and the effects of drug therapy. EEG recordings are generally long and the data voluminous. Thus developing a sensitive and reliable automated algorithm for analyzing EEG data is necessary. Approach. A new algorithm for detecting and classifying interictal spikes in mouse EEG recordings is proposed, based on the adapted continuous wavelet transform (CWT). The construction of the adapted mother wavelet is founded on a template obtained from a sample comprising the first few minutes of an EEG data set. Main Result. The algorithm was tested with EEG data from a mouse model of epilepsy and experimental results showed that the algorithm could distinguish EEG spikes from other transient waveforms with a high degree of sensitivity and specificity. Significance. Differing from existing approaches, the proposed approach combines wavelet denoising, to isolate transient signals, with adapted CWT-based template matching, to detect true interictal spikes. Using the adapted wavelet constructed from a predefined template, the adapted CWT is calculated on small EEG segments to fit dynamical changes in the EEG recording.
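
    For orientation, a generic CWT-based spike detector with a standard Morlet wavelet (not the adapted, template-derived mother wavelet described above) could be sketched as follows; the sampling rate, synthetic data and thresholds are placeholders.

        import numpy as np
        import pywt
        from scipy.signal import find_peaks

        fs = 500.0                                        # hypothetical EEG sampling rate (Hz)
        t = np.arange(0, 20, 1.0 / fs)
        eeg = 0.2 * np.random.randn(t.size)
        eeg[::2500] += 3.0                                # inject artificial spike-like transients

        scales = np.arange(2, 20)
        coefs, _ = pywt.cwt(eeg, scales, "morl", sampling_period=1.0 / fs)
        response = np.abs(coefs).max(axis=0)              # strongest wavelet response at each sample
        peaks, _ = find_peaks(response, height=5 * np.median(response), distance=int(0.2 * fs))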

  1. Wavelet Preprocessing of Acoustic Signals

    DTIC Science & Technology

    1991-12-01

    This work applies the wavelet transform to preprocess acoustic broadband signals in a system that discriminates between different classes of acoustic bursts. This is motivated by the similarity between the proportional bandwidth filters provided by the wavelet transform and those found in biological hearing systems. The experiment involves comparing statistical pattern classifier effects of wavelet and FFT preprocessed acoustic signals. The data used were from the DARPA Phase I database, which consists of artificially generated signals with real ocean background.

  2. [A novel method to determine the redshifts of active galaxies based on wavelet transform].

    PubMed

    Tu, Liang-Ping; Luo, A-Li; Jiang, Bin; Wei, Peng; Zhao, Yong-Heng; Liu, Rong

    2012-10-01

    Automatically determining the redshifts of galaxies is very important for astronomical research on large samples, such as studies of the cosmologically significant large-scale structure. Galaxies are generally divided into normal galaxies and active galaxies, and the spectra of active galaxies mostly have more obvious emission lines. In the present paper, the authors present a novel method, based mainly on the wavelet transform, to determine the spectral redshifts of active galaxies rapidly without requiring accurate extraction of line information. The method includes the following steps: first, the spectrum to be processed is denoised; second, the low-frequency spectrum is extracted based on the wavelet transform, and the residual spectrum is obtained by subtracting the low-frequency spectrum from the denoised spectrum; third, the standard deviation of the residual spectrum is calculated and a threshold value T is determined, and the set of wavelengths whose corresponding flux is greater than T is retained; fourth, using the rest wavelengths of all the standard lines, all candidate redshifts are calculated; finally, utilizing a density estimation method based on the Parzen window, the redshift point with maximum density is determined, and the average value of its neighborhood is taken as the final redshift of the spectrum. Experiments on simulated data and real data from SDSS-DR7 show that the method is robust and that its success rate is encouraging. It can be expected to be applied in the LAMOST project.
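
    The candidate-redshift voting step can be sketched as follows; the line list and observed wavelengths are hypothetical (chosen to be consistent with z of about 0.2), and a Gaussian KDE stands in for the Parzen-window estimator.

        import numpy as np
        from scipy.stats import gaussian_kde

        # rest wavelengths (Angstrom) of a few strong emission lines (illustrative subset)
        rest = np.array([3727.0, 4861.3, 4959.0, 5006.8, 6562.8])
        # wavelengths retained after thresholding the residual spectrum (hypothetical values)
        observed = np.array([4472.4, 5833.6, 6008.2, 7875.4])

        # every (observed, rest) pairing yields one candidate redshift
        candidates = (observed[:, None] / rest[None, :] - 1.0).ravel()
        candidates = candidates[candidates > 0]

        kde = gaussian_kde(candidates, bw_method=0.05)    # Parzen-window-style density estimate
        grid = np.linspace(0.0, 1.5, 3001)
        z_est = grid[np.argmax(kde(grid))]                # redshift with the highest candidate density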

  3. A wavelet relational fuzzy C-means algorithm for 2D gel image segmentation.

    PubMed

    Rashwan, Shaheera; Faheem, Mohamed Talaat; Sarhan, Amany; Youssef, Bayumy A B

    2013-01-01

    One of the most famous algorithms that appeared in the area of image segmentation is the Fuzzy C-Means (FCM) algorithm. This algorithm has been used in many applications such as data analysis, pattern recognition, and image segmentation. It has the advantages of producing high quality segmentation compared to the other available algorithms. Many modifications have been made to the algorithm to improve its segmentation quality. The proposed segmentation algorithm in this paper is based on the Fuzzy C-Means algorithm adding the relational fuzzy notion and the wavelet transform to it so as to enhance its performance especially in the area of 2D gel images. Both proposed modifications aim to minimize the oversegmentation error incurred by previous algorithms. The experimental results of comparing both the Fuzzy C-Means (FCM) and the Wavelet Fuzzy C-Means (WFCM) to the proposed algorithm on real 2D gel images acquired from human leukemias, HL-60 cell lines, and fetal alcohol syndrome (FAS) demonstrate the improvement achieved by the proposed algorithm in overcoming the segmentation error. In addition, we investigate the effect of denoising on the three algorithms. This investigation proves that denoising the 2D gel image before segmentation can improve (in most of the cases) the quality of the segmentation.

  4. Wavelets and Approximation

    DTIC Science & Technology

    2007-11-02

    Daubechies-DeVore (Cohen-Daubechies-Gulleryuz-Orchard): this encoder is optimal on all Besov classes compactly embedded into L2. EZW, Said-Pearlman.

  5. Wavelet phase synchronization and chaoticity.

    PubMed

    Postnikov, E B

    2009-11-01

    It has been shown that the so-called "wavelet phase" (or "time-scale") synchronization of chaotic signals is actually synchronization of smoothed functions with reduced chaotic fluctuations. This fact is based on the representation of the wavelet transform with the Morlet wavelet as a solution of the Cauchy problem for a simple diffusion equation with initial condition in a form of harmonic function modulated by a given signal. The topological background of the resulting effect is discussed. It is argued that the wavelet phase synchronization provides information about the synchronization of an averaged motion described by bounding tori instead of the fine-level classical chaotic phase synchronization.

  6. Perceptually Lossless Wavelet Compression

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Yang, Gloria Y.; Solomon, Joshua A.; Villasenor, John

    1996-01-01

    The Discrete Wavelet Transform (DWT) decomposes an image into bands that vary in spatial frequency and orientation. It is widely used for image compression. Measures of the visibility of DWT quantization errors are required to achieve optimal compression. Uniform quantization of a single band of coefficients results in an artifact that is the sum of a lattice of random amplitude basis functions of the corresponding DWT synthesis filter, which we call DWT uniform quantization noise. We measured visual detection thresholds for samples of DWT uniform quantization noise in Y, Cb, and Cr color channels. The spatial frequency of a wavelet is r 2^(-L), where r is display visual resolution in pixels/degree, and L is the wavelet level. Amplitude thresholds increase rapidly with spatial frequency. Thresholds also increase from Y to Cr to Cb, and with orientation from low-pass to horizontal/vertical to diagonal. We propose a mathematical model for DWT noise detection thresholds that is a function of level, orientation, and display visual resolution. This allows calculation of a 'perceptually lossless' quantization matrix for which all errors are in theory below the visual threshold. The model may also be used as the basis for adaptive quantization schemes.

  7. Localization of Wireless Emitters Based on the Time Difference of Arrival (TDOA) and Wavelet Denoising

    DTIC Science & Technology

    1999-05-01

    ...wireless communication units. One major commercial factor behind the recent interest in localization is the Federal Communications Commission (FCC)...and any processing and response delay within the mobile unit. Due to the variations in design and manufacture of the handsets the time estimate is...

  8. Glycolic acid peel therapy – a current review

    PubMed Central

    Sharad, Jaishree

    2013-01-01

    Chemical peels have been time-tested and are here to stay. Alpha-hydroxy peels are highly popular in the dermatologist’s arsenal of procedures. Glycolic acid peel is the most common alpha-hydroxy acid peel, also known as fruit peel. It is simple, inexpensive, and has no downtime. This review talks about various studies of glycolic acid peels for various indications, such as acne, acne scars, melasma, postinflammatory hyperpigmentation, photoaging, and seborrhea. Combination therapies and treatment procedure are also discussed. Careful review of medical history, examination of the skin, and pre-peel priming of skin are important before every peel. Proper patient selection, peel timing, and neutralization on-time will ensure good results, with no side effects. Depth of the glycolic acid peel depends on the concentration of the acid used, the number of coats applied, and the time for which it is applied. Hence, it can be used as a very superficial peel, or even a medium depth peel. It has been found to be very safe with Fitzpatrick skin types I–IV. All in all, it is a peel that is here to stay. PMID:24399880

  9. [Application of wavelet transform on improving detecting precision of the non-invasive blood components measurement based on dynamic spectrum method].

    PubMed

    Li, Gang; Men, Jian-Long; Sun, Zhao-Min; Wang, Hui-Quan; Lin, Ling; Tong, Ying; Zhang, Bao-Ju

    2011-02-01

    Time-varying noise in the spectra collection process affects the prediction accuracy of quantitative calibration in non-invasive blood component measurement based on the dynamic spectrum (DS) method. Using the wavelet transform, we focused on the absorbance waveform of the fingertip transmission spectrum in the pulse frequency band, increased the signal-to-noise ratio of the DS data, and thereby improved the detection precision of the quantitative calibration. After continuously acquiring spectral data from the same subject 10 times, wavelet-transform de-noising increased the average correlation coefficient of the DS data from 0.9796 to 0.9903. A BP neural network was used to establish a calibration model relating the subjects' blood component concentration values to the dynamic spectrum data of 110 volunteers. After wavelet-transform de-noising, the correlation coefficient of the prediction set increased from 0.6774 to 0.8468, and the average relative error decreased from 15.8% to 5.3%. Experimental results showed that introducing the wavelet transform can effectively remove the noise in DS data, improve detection precision, and accelerate the development of non-invasive blood component measurement based on the DS method.
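
    A generic wavelet soft-threshold denoiser for a one-dimensional pulse-band signal, sketched with PyWavelets; the wavelet name, decomposition level, and universal threshold are assumptions, not the exact settings used in this study.

      # Generic 1-D wavelet soft-threshold denoising (sketch).
      import numpy as np
      import pywt

      def wavelet_denoise(signal, wavelet="db4", level=4):
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate from finest detail scale
          thr = sigma * np.sqrt(2 * np.log(len(signal)))     # universal threshold
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet)[: len(signal)]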

  10. The berkeley wavelet transform: a biologically inspired orthogonal wavelet transform.

    PubMed

    Willmore, Ben; Prenger, Ryan J; Wu, Michael C-K; Gallant, Jack L

    2008-06-01

    We describe the Berkeley wavelet transform (BWT), a two-dimensional triadic wavelet transform. The BWT comprises four pairs of mother wavelets at four orientations. Within each pair, one wavelet has odd symmetry, and the other has even symmetry. By translation and scaling of the whole set (plus a single constant term), the wavelets form a complete, orthonormal basis in two dimensions. The BWT shares many characteristics with the receptive fields of neurons in mammalian primary visual cortex (V1). Like these receptive fields, BWT wavelets are localized in space, tuned in spatial frequency and orientation, and form a set that is approximately scale invariant. The wavelets also have spatial frequency and orientation bandwidths that are comparable with biological values. Although the classical Gabor wavelet model is a more accurate description of the receptive fields of individual V1 neurons, the BWT has some interesting advantages. It is a complete, orthonormal basis and is therefore inexpensive to compute, manipulate, and invert. These properties make the BWT useful in situations where computational power or experimental data are limited, such as estimation of the spatiotemporal receptive fields of neurons.

  11. Effect of denoising on supervised lung parenchymal clusters

    NASA Astrophysics Data System (ADS)

    Jayamani, Padmapriya; Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Denoising is a critical preconditioning step for quantitative analysis of medical images. Despite promises for more consistent diagnosis, denoising techniques are seldom explored in clinical settings. While this may be attributed to the esoteric nature of the parameter-sensitive algorithms, lack of quantitative measures of their efficacy to enhance clinical decision making is a primary cause of physician apathy. This paper addresses this issue by exploring the effect of denoising on the integrity of supervised lung parenchymal clusters. Multiple Volumes of Interest (VOIs) were selected across multiple high resolution CT scans to represent samples of different patterns (normal, emphysema, ground glass, honeycombing and reticular). The VOIs were labeled through consensus of four radiologists. The original datasets were filtered by multiple denoising techniques (median filtering, anisotropic diffusion, bilateral filtering and non-local means) and the corresponding filtered VOIs were extracted. A plurality of cluster indices based on multiple histogram-based pair-wise similarity measures were used to assess the quality of supervised clusters in the original and filtered space. The resultant rank orders were analyzed using the Borda criteria to find the denoising-similarity measure combination that has the best cluster quality. Our exhaustive analysis reveals (a) for a number of similarity measures, the cluster quality is inferior in the filtered space; and (b) for measures that benefit from denoising, a simple median filtering outperforms non-local means and bilateral filtering. Our study suggests the need to judiciously choose, if required, a denoising technique that does not deteriorate the integrity of supervised clusters.
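
    One of the baselines mentioned above (median filtering) applied to a CT volume of interest; the kernel size and the synthetic stand-in volume are illustrative assumptions.

      # Median filtering of a 3-D volume of interest (sketch).
      import numpy as np
      from scipy.ndimage import median_filter

      voi = np.random.default_rng(0).normal(size=(32, 32, 32))  # stand-in for an HRCT VOI
      filtered_voi = median_filter(voi, size=3)                 # 3x3x3 median filter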

  12. Data compression by wavelet transforms

    NASA Technical Reports Server (NTRS)

    Shahshahani, M.

    1992-01-01

    A wavelet transform algorithm is applied to image compression. It is observed that the algorithm does not suffer from the blockiness characteristic of the DCT-based algorithms at compression ratios exceeding 25:1, but the edges do not appear as sharp as they do with the latter method. Some suggestions for the improved performance of the wavelet transform method are presented.

  13. A generalized wavelet extrema representation

    SciTech Connect

    Lu, Jian; Lades, M.

    1995-10-01

    The wavelet extrema representation originated by Stephane Mallat is a unique framework for low-level and intermediate-level (feature) processing. In this paper, we present a new form of wavelet extrema representation generalizing Mallat's original work. The generalized wavelet extrema representation is a feature-based multiscale representation. For a particular choice of wavelet, our scheme can be interpreted as representing a signal or image by its edges, and peaks and valleys at multiple scales. Such a representation is shown to be stable -- the original signal or image can be reconstructed with very good quality. It is further shown that a signal or image can be modeled as piecewise monotonic, with all turning points between monotonic segments given by the wavelet extrema. A new projection operator is introduced to enforce piecewise monotonicity of a signal in its reconstruction. This leads to an enhancement over previously developed algorithms in preventing artifacts in the reconstructed signal.

  14. Wavelet-Based Grid Generation

    NASA Technical Reports Server (NTRS)

    Jameson, Leland

    1996-01-01

    Wavelets can provide a basis set in which the basis functions are constructed by dilating and translating a fixed function known as the mother wavelet. The mother wavelet can be seen as a high pass filter in the frequency domain. The process of dilating and expanding this high-pass filter can be seen as altering the frequency range that is 'passed' or detected. The process of translation moves this high-pass filter throughout the domain, thereby providing a mechanism to detect the frequencies or scales of information at every location. This is exactly the type of information that is needed for effective grid generation. This paper provides motivation to use wavelets for grid generation in addition to providing the final product: source code for wavelet-based grid generation.

  15. Adaptive boxcar/wavelet transform

    NASA Astrophysics Data System (ADS)

    Sezer, Osman G.; Altunbasak, Yucel

    2009-01-01

    This paper presents a new adaptive Boxcar/Wavelet transform for image compression. Boxcar/Wavelet decomposition emphasizes the idea of average-interpolation representation, which uses dyadic averages and their interpolation to explain a special case of biorthogonal wavelet transforms (BWT). This perspective on image compression, together with the lifting scheme, offers the ability to train an optimum 2-D filter set for nonlinear prediction (interpolation) that adapts to the context around the low-pass wavelet coefficients to reduce energy in the high-pass bands. Moreover, the filters obtained after training are observed to possess directional information with some textural clues that can provide better prediction performance. This work is a first step towards obtaining this new set of training-based filters in the context of the Boxcar/Wavelet transform. Initial experimental results show better subjective quality performance compared to the popular 9/7-tap and 5/3-tap BWTs, with comparable results in objective quality.

  16. Wavelet preprocessing of acoustic signals

    NASA Astrophysics Data System (ADS)

    Huang, W. Y.; Solorzano, M. R.

    1991-12-01

    This paper describes results using the wavelet transform to preprocess acoustic broadband signals in a system that discriminates between different classes of acoustic bursts. This is motivated by the similarity between the proportional bandwidth filters provided by the wavelet transform and those found in biological hearing systems. The experiment compares the effects of wavelet and FFT preprocessing of acoustic signals on a statistical pattern classifier. The data used was from the DARPA Phase 1 database, which consists of artificially generated signals with real ocean background. The results show that the wavelet transform did provide improved performance when classifying on a frame-by-frame basis. The DARPA Phase 1 database is well matched to proportional bandwidth filtering; i.e., signal classes that contain high frequencies do tend to have shorter duration in this database. It is also noted that the decreasing background levels at high frequencies compensate for the poor match of the wavelet transform for long duration (high frequency) signals.

  17. Economic analysis of ethanol production from citrus peel waste

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Florida citrus juice industry produces about 3.5 million tons of wet peel waste per year. In current industrial practice, waste peels are dried and sold as cattle feed to offset the waste disposal cost. Profitability would be greatly improved if peel could be used to produce higher value produ...

  18. Economic analysis of ethanol production from citrus peel waste

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Florida citrus juice industry produces about 3-4 million tons of wet peel waste per year. In current industrial practices, waste peels are dried and sold as cattle feed to offset the waste disposal cost. Profitability could be greatly improved if this amount of peel can be used to produce high...

  19. Economic analysis of ethanol production from citrus peel waste

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Florida citrus juice industry produces about 3.5 million tons of wet peel waste per year. In current industrial practice, waste peels are dried and sold as cattle feed to offset the waste disposal cost. Profitability would be greatly improved if peels could be used to produce higher value produ...

  20. Development of Infrared Radiation Heating Method for Sustainable Tomato Peeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although lye peeling is the widely industrialized method for producing high quality peeled fruit and vegetable products, the peeling method has resulted in negative impacts by significantly exerting both environmental and economic pressure on the tomato processing industry due to its associated sali...

  1. Wavelet Analysis of Bioacoustic Scattering and Marine Mammal Vocalizations

    DTIC Science & Technology

    2005-09-01

    There are two distinct classes of wavelet transforms: the continuous wavelet transform (CWT) and the discrete wavelet transform (DWT). The discrete wavelet transform is a compact representation of the data and is particularly useful for noise reduction and

  2. Single-image noise level estimation for blind denoising.

    PubMed

    Liu, Xinhao; Tanaka, Masayuki; Okutomi, Masatoshi

    2013-12-01

    Noise level is an important parameter to many image processing applications. For example, the performance of an image denoising algorithm can be much degraded due to poor noise level estimation. Most existing denoising algorithms simply assume that the noise level is known, which largely prevents them from practical use. Moreover, even with the given true noise level, these denoising algorithms still cannot achieve the best performance, especially for scenes with rich texture. In this paper, we propose a patch-based noise level estimation algorithm and suggest that the noise level parameter should be tuned according to the scene complexity. Our approach includes the process of selecting low-rank patches without high frequency components from a single noisy image. The selection is based on the gradients of the patches and their statistics. Then, the noise level is estimated from the selected patches using principal component analysis. Because the true noise level does not always provide the best performance for nonblind denoising algorithms, we further tune the noise level parameter for nonblind denoising. Experiments demonstrate that both the accuracy and stability of our method are superior to the state-of-the-art noise level estimation algorithm for various scenes and noise levels.
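
    A rough sketch in the spirit of the patch-based idea above: keep weakly textured patches (small gradient energy) and read the noise variance off the smallest eigenvalue of the patch covariance. The patch size, stride, and selection fraction are assumptions and the iterative refinement of the authors' method is not reproduced.

      # Patch-based noise standard deviation estimate via PCA of flat patches (sketch).
      import numpy as np

      def estimate_noise_sigma(img, patch=7, step=2, keep_fraction=0.1):
          H, W = img.shape
          patches, grads = [], []
          for i in range(0, H - patch, step):
              for j in range(0, W - patch, step):
                  p = img[i:i + patch, j:j + patch].astype(float)
                  gy, gx = np.gradient(p)
                  patches.append(p.ravel())
                  grads.append((gx ** 2 + gy ** 2).sum())        # texture strength
          patches, grads = np.array(patches), np.array(grads)
          idx = np.argsort(grads)[: max(1, int(keep_fraction * len(grads)))]
          cov = np.cov(patches[idx], rowvar=False)                # covariance of flat patches
          eigvals = np.linalg.eigvalsh(cov)                       # ascending order
          return np.sqrt(max(eigvals[0], 0.0))                    # smallest eigenvalue ~ sigma^2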

  3. GPU-accelerated denoising of 3D magnetic resonance images

    SciTech Connect

    Howison, Mark; Wes Bethel, E.

    2014-05-29

    The raw computational power of GPU accelerators enables fast denoising of 3D MR images using bilateral filtering, anisotropic diffusion, and non-local means. In practice, applying these filtering operations requires setting multiple parameters. This study was designed to provide better guidance to practitioners for choosing the most appropriate parameters by answering two questions: what parameters yield the best denoising results in practice? And what tuning is necessary to achieve optimal performance on a modern GPU? To answer the first question, we use two different metrics, mean squared error (MSE) and mean structural similarity (MSSIM), to compare denoising quality against a reference image. Surprisingly, the best improvement in structural similarity with the bilateral filter is achieved with a small stencil size that lies within the range of real-time execution on an NVIDIA Tesla M2050 GPU. Moreover, inappropriate choices for parameters, especially scaling parameters, can yield very poor denoising performance. To answer the second question, we perform an autotuning study to empirically determine optimal memory tiling on the GPU. The variation in these results suggests that such tuning is an essential step in achieving real-time performance. These results have important implications for the real-time application of denoising to MR images in clinical settings that require fast turn-around times.
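
    A CPU-side sketch of one of the filters benchmarked above (bilateral filtering) on a single MR slice via scikit-image; the sigma values and slice-wise use are assumptions and do not reproduce the GPU implementation or its tuning.

      # Bilateral filtering of a 2-D slice (sketch).
      import numpy as np
      from skimage.restoration import denoise_bilateral

      slice_2d = np.random.default_rng(1).random((128, 128))      # stand-in for an MR slice in [0, 1]
      denoised = denoise_bilateral(slice_2d, sigma_color=0.05, sigma_spatial=2.0)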

  4. Remote sensing image denoising application by generalized morphological component analysis

    NASA Astrophysics Data System (ADS)

    Yu, Chong; Chen, Xiong

    2014-12-01

    In this paper, we introduce a remote sensing image denoising method based on generalized morphological component analysis (GMCA). This algorithm extends the morphological component analysis (MCA) algorithm to the blind source separation framework. The iterative thresholding strategy adopted by the GMCA algorithm first works on the most significant features in the image and then progressively incorporates smaller features to finely tune the parameters of the whole model. A mathematical analysis of the computational complexity of the GMCA algorithm is provided. Several comparison experiments with state-of-the-art denoising algorithms are reported. To assess the algorithms quantitatively, the Peak Signal to Noise Ratio (PSNR) index and the Structural Similarity (SSIM) index are calculated to measure the denoising effect in terms of gray-level fidelity and structure-level fidelity, respectively. Quantitative analysis of the experimental results, which is consistent with the visual quality of the denoised images, shows that the GMCA algorithm is highly effective for remote sensing image denoising; the recovered image is visually almost indistinguishable from the original noiseless image.

  5. An adaptive nonlocal means scheme for medical image denoising

    NASA Astrophysics Data System (ADS)

    Thaipanich, Tanaphol; Kuo, C.-C. Jay

    2010-03-01

    Medical images often consist of low-contrast objects corrupted by random noise arising in the image acquisition process. Thus, image denoising is one of the fundamental tasks required by medical imaging analysis. In this work, we investigate an adaptive denoising scheme based on the nonlocal (NL)-means algorithm for medical imaging applications. In contrast with the traditional NL-means algorithm, the proposed adaptive NL-means (ANL-means) denoising scheme has three unique features. First, it employs the singular value decomposition (SVD) method and the K-means clustering (K-means) technique for robust classification of blocks in noisy images. Second, the local window is adaptively adjusted to match the local property of a block. Finally, a rotated block matching algorithm is adopted for better similarity matching. Experimental results from both additive white Gaussian noise (AWGN) and Rician noise are given to demonstrate the superior performance of the proposed ANL denoising technique over various image denoising benchmarks in terms of both PSNR and perceptual quality.
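
    For contrast with the adaptive ANL-means scheme described above, a baseline (non-adaptive) non-local means run via scikit-image; the parameter values follow common defaults and are assumptions, not values from the paper.

      # Baseline non-local means denoising (sketch).
      import numpy as np
      from skimage.restoration import denoise_nl_means, estimate_sigma

      noisy = np.random.default_rng(2).random((128, 128))          # stand-in noisy image
      sigma = float(np.mean(estimate_sigma(noisy)))                # rough noise estimate
      denoised = denoise_nl_means(noisy, patch_size=7, patch_distance=11,
                                  h=1.15 * sigma, sigma=sigma, fast_mode=True)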

  6. Image sequence denoising via sparse and redundant representations.

    PubMed

    Protter, Matan; Elad, Michael

    2009-01-01

    In this paper, we consider denoising of image sequences that are corrupted by zero-mean additive white Gaussian noise. Relative to single image denoising techniques, denoising of sequences aims to also utilize the temporal dimension. This assists in getting both faster algorithms and better output quality. This paper focuses on utilizing sparse and redundant representations for image sequence denoising, extending previously reported work. In the single image setting, the K-SVD algorithm is used to train a sparsifying dictionary for the corrupted image. This paper generalizes the above algorithm by offering several extensions: i) the atoms used are 3-D; ii) the dictionary is propagated from one frame to the next, reducing the number of required iterations; and iii) averaging is done on patches in both spatial and temporal neighboring locations. These modifications lead to substantial benefits in complexity and denoising performance, compared to simply running the single image algorithm sequentially. The algorithm's performance is experimentally compared to several state-of-the-art algorithms, demonstrating comparable or favorable results.

  7. Adaptively Tuned Iterative Low Dose CT Image Denoising

    PubMed Central

    Hashemi, SayedMasoud; Paul, Narinder S.; Beheshti, Soosan; Cobbold, Richard S. C.

    2015-01-01

    Improving image quality is a critical objective in low dose computed tomography (CT) imaging and is the primary focus of CT image denoising. State-of-the-art CT denoising algorithms are mainly based on iterative minimization of an objective function, in which the performance is controlled by regularization parameters. To achieve the best results, these should be chosen carefully. However, the parameter selection is typically performed in an ad hoc manner, which can cause the algorithms to converge slowly or become trapped in a local minimum. To overcome these issues a noise confidence region evaluation (NCRE) method is used, which evaluates the denoising residuals iteratively and compares their statistics with those produced by additive noise. It then updates the parameters at the end of each iteration to achieve a better match to the noise statistics. By combining NCRE with the fundamentals of block matching and 3D filtering (BM3D) approach, a new iterative CT image denoising method is proposed. It is shown that this new denoising method improves the BM3D performance in terms of both the mean square error and a structural similarity index. Moreover, simulations and patient results show that this method preserves the clinically important details of low dose CT images together with a substantial noise reduction. PMID:26089972

  8. Geometric properties of solutions to the total variation denoising problem

    NASA Astrophysics Data System (ADS)

    Chambolle, Antonin; Duval, Vincent; Peyré, Gabriel; Poon, Clarice

    2017-01-01

    This article studies the denoising performance of total variation (TV) image regularization. More precisely, we study geometrical properties of the solution to the so-called Rudin-Osher-Fatemi total variation denoising method. The first contribution of this paper is a precise mathematical definition of the ‘extended support’ (associated to the noise-free image) of TV denoising. It is intuitively the region which is unstable and will suffer from the staircasing effect. We highlight in several practical cases, such as the indicator of convex sets, that this region can be determined explicitly. Our second and main contribution is a proof that the TV denoising method indeed restores an image which is exactly constant outside a small tube surrounding the extended support. The radius of this tube shrinks toward zero as the noise level vanishes, and we are able to determine, in some cases, an upper bound on the convergence rate. For indicators of so-called ‘calibrable’ sets (such as disks or properly eroded squares), this extended support matches the edges, so that discontinuities produced by TV denoising cluster tightly around the edges. In contrast, for indicators of more general shapes or for complicated images, this extended support can be larger. Besides these main results, our paper also proves several intermediate results about fine properties of TV regularization, in particular for indicators of calibrable and convex sets, which are of independent interest.
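
    A numerical counterpart to the Rudin-Osher-Fatemi model analyzed above, run through scikit-image's Chambolle-type solver; the regularization weight and the synthetic input are illustrative assumptions.

      # Total variation (ROF) denoising via Chambolle's algorithm (sketch).
      import numpy as np
      from skimage.restoration import denoise_tv_chambolle

      noisy = np.random.default_rng(3).random((64, 64))
      tv_denoised = denoise_tv_chambolle(noisy, weight=0.1)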

  9. Adaptively Tuned Iterative Low Dose CT Image Denoising.

    PubMed

    Hashemi, SayedMasoud; Paul, Narinder S; Beheshti, Soosan; Cobbold, Richard S C

    2015-01-01

    Improving image quality is a critical objective in low dose computed tomography (CT) imaging and is the primary focus of CT image denoising. State-of-the-art CT denoising algorithms are mainly based on iterative minimization of an objective function, in which the performance is controlled by regularization parameters. To achieve the best results, these should be chosen carefully. However, the parameter selection is typically performed in an ad hoc manner, which can cause the algorithms to converge slowly or become trapped in a local minimum. To overcome these issues a noise confidence region evaluation (NCRE) method is used, which evaluates the denoising residuals iteratively and compares their statistics with those produced by additive noise. It then updates the parameters at the end of each iteration to achieve a better match to the noise statistics. By combining NCRE with the fundamentals of block matching and 3D filtering (BM3D) approach, a new iterative CT image denoising method is proposed. It is shown that this new denoising method improves the BM3D performance in terms of both the mean square error and a structural similarity index. Moreover, simulations and patient results show that this method preserves the clinically important details of low dose CT images together with a substantial noise reduction.

  10. PDE-based nonlinear diffusion techniques for denoising scientific and industrial images: an empirical study

    NASA Astrophysics Data System (ADS)

    Weeratunga, Sisira K.; Kamath, Chandrika

    2002-05-01

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, we focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. We complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. We explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. We also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. Our empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
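
    A minimal explicit-scheme Perona-Malik diffusion with the exponential diffusivity, one of the non-linear diffusion operators discussed above; kappa, the time step, the iteration count, and the periodic boundary handling are illustrative assumptions.

      # Perona-Malik anisotropic diffusion, explicit scheme (sketch).
      import numpy as np

      def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
          u = img.astype(float).copy()
          for _ in range(n_iter):
              # differences toward the four neighbours (periodic boundaries via roll)
              dn = np.roll(u, -1, axis=0) - u
              ds = np.roll(u, 1, axis=0) - u
              de = np.roll(u, -1, axis=1) - u
              dw = np.roll(u, 1, axis=1) - u
              # diffusivity g(s) = exp(-(s/kappa)^2): small near edges, ~1 in flat regions
              cn = np.exp(-(dn / kappa) ** 2)
              cs = np.exp(-(ds / kappa) ** 2)
              ce = np.exp(-(de / kappa) ** 2)
              cw = np.exp(-(dw / kappa) ** 2)
              u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
          return u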

  11. Primary and secondary metabolism in the sun-exposed peel and the shaded peel of apple fruit.

    PubMed

    Li, Pengmin; Ma, Fengwang; Cheng, Lailiang

    2013-05-01

    The metabolism of carbohydrates, organic acids, amino acids and phenolics was compared between the sun-exposed peel and the shaded peel of apple fruit. Contents of sorbitol and glucose were higher in the sun-exposed peel, whereas those of sucrose and fructose were almost the same in the two peel types. This was related to lower sorbitol dehydrogenase activity and higher activities of sorbitol oxidase, neutral invertase and acid invertase in the sun-exposed peel. The lower starch content in the sun-exposed peel was related to lower sucrose synthase activity early in fruit development. Dark respiratory metabolism in the sun-exposed peel was enhanced by the high peel temperature due to high light exposure. Activities of most enzymes in respiratory metabolism were higher in the sun-exposed peel, but the concentrations of most organic acids were relatively stable, except pyruvate and oxaloacetate. Due to the different availability of carbon skeletons from dark respiration in the two peel types, amino acids with higher C/N ratios are accumulated in the sun-exposed peel whereas those with lower C/N ratios are accumulated in the shaded peel. Contents of anthocyanins and flavonols and activities of phenylalanine ammonia-lyase, UDP-galactose:flavonoid 3-O-glucosyltransferase and several other enzymes were higher in the sun-exposed peel than in the shaded peel, indicating the entire phenylpropanoid pathway is upregulated in the sun-exposed peel. Comprehensive analyses of the metabolites and activities of enzymes involved in primary metabolism and secondary metabolism have allowed us to gain a full picture of the metabolic network in the two peel types under natural light exposure.

  12. Wavelets and spacetime squeeze

    NASA Technical Reports Server (NTRS)

    Han, D.; Kim, Y. S.; Noz, Marilyn E.

    1993-01-01

    It is shown that the wavelet is the natural language for the Lorentz covariant description of localized light waves. A model for covariant superposition is constructed for light waves with different frequencies. It is therefore possible to construct a wave function for light waves carrying a covariant probability interpretation. It is shown that the time-energy uncertainty relation Δt·Δω ≈ 1 for light waves is a Lorentz-invariant relation. The connection between photons and localized light waves is examined critically.

  13. Wavelet Packets in Wideband Multiuser Communications

    DTIC Science & Technology

    2004-11-01

    developed doubly orthogonal CDMA user spreading waveforms based on wavelet packets. We have also developed and evaluated a wavelet packet based ...inter symbol interferences. Compared with the existing DFT based multicarrier CDMA systems, better performance is achieved with the wavelet packet...

  14. An Introduction to Wavelet Theory and Analysis

    SciTech Connect

    Miner, N.E.

    1998-10-01

    This report reviews the history, theory and mathematics of wavelet analysis. Examination of the Fourier Transform and Short-time Fourier Transform methods provides information about the evolution of the wavelet analysis technique. This overview is intended to provide readers with a basic understanding of wavelet analysis, define common wavelet terminology and describe wavelet analysis algorithms. The most common algorithms for performing efficient discrete wavelet transforms for signal analysis and inverse discrete wavelet transforms for signal reconstruction are presented. This report is intended to be approachable by non-mathematicians, although a basic understanding of engineering mathematics is necessary.
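
    A small example of the forward/inverse discrete wavelet transform pair the report describes, using PyWavelets; the test signal, wavelet choice, and level are illustrative.

      # Multilevel DWT analysis and inverse-DWT reconstruction (sketch).
      import numpy as np
      import pywt

      t = np.linspace(0, 1, 256)
      signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
      coeffs = pywt.wavedec(signal, "db2", level=4)              # analysis (DWT)
      reconstructed = pywt.waverec(coeffs, "db2")                # synthesis (inverse DWT)
      print(np.allclose(signal, reconstructed[: signal.size]))   # perfect reconstruction check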

  15. Medical image denoising using one-dimensional singularity function model.

    PubMed

    Luo, Jianhua; Zhu, Yuemin; Hiba, Bassem

    2010-03-01

    A novel denoising approach is proposed that is based on a spectral data substitution mechanism through using a mathematical model of one-dimensional singularity function analysis (1-D SFA). The method consists in dividing the complete spectral domain of the noisy signal into two subsets: the preserved set where the spectral data are kept unchanged, and the substitution set where the original spectral data having lower signal-to-noise ratio (SNR) are replaced by those reconstructed using the 1-D SFA model. The preserved set containing original spectral data is determined according to the SNR of the spectrum. The singular points and singularity degrees in the 1-D SFA model are obtained through calculating finite difference of the noisy signal. The theoretical formulation and experimental results demonstrated that the proposed method allows more efficient denoising while introducing less distortion, and presents significant improvement over conventional denoising methods.

  16. Image denoising via group Sparse representation over learned dictionary

    NASA Astrophysics Data System (ADS)

    Cheng, Pan; Deng, Chengzhi; Wang, Shengqian; Zhang, Chunfeng

    2013-10-01

    Images are one of the most important ways for us to obtain information. In practical applications, however, images are often corrupted by various kinds of noise, so image denoising is a particularly important problem. The K-SVD algorithm improves the denoising effect by learning the dictionary atoms through sparse coding rather than using a fixed, predefined dictionary. To further improve the denoising performance, we propose to extend the K-SVD algorithm via group sparse representation. The key idea of this method is to divide the sparse coefficients into groups, so that the correlation among the elements can be adjusted by controlling the group size. This new approach strengthens the local constraints between adjacent atoms, thereby increasing the correlation between the atoms. Experimental results show that our method achieves better image recovery, effectively preventing blocking artifacts and producing smoother images.

  17. Total Variation Denoising and Support Localization of the Gradient

    NASA Astrophysics Data System (ADS)

    Chambolle, A.; Duval, V.; Peyré, G.; Poon, C.

    2016-10-01

    This paper describes the geometrical properties of the solutions to the total variation denoising method. A folklore statement is that this method is able to restore sharp edges, but at the same time, might introduce some staircasing (i.e. “fake” edges) in flat areas. Quite surprisingly, aside from numerical evidence, almost no theoretical results are available to back up these claims. The first contribution of this paper is a precise mathematical definition of the “extended support” (associated to the noise-free image) of TV denoising. This is intuitively the region which is unstable and will suffer from the staircasing effect. Our main result shows that the TV denoising method indeed restores a piece-wise constant image outside a small tube surrounding the extended support. Furthermore, the radius of this tube shrinks toward zero as the noise level vanishes and in some cases, an upper bound on the convergence rate is given.

  18. Non-local MRI denoising using random sampling.

    PubMed

    Hu, Jinrong; Zhou, Jiliu; Wu, Xi

    2016-09-01

    In this paper, we propose a random sampling non-local means (SNLM) algorithm to eliminate noise in 3D MRI datasets. Non-local means (NLM) algorithms have been implemented efficiently for MRI denoising, but are always limited by high computational complexity. Compared to conventional methods, which raster through the entire search window when computing similarity weights, the proposed SNLM algorithm randomly selects a small subset of voxels, which dramatically decreases the computational burden while producing competitive denoising results. Moreover, a structure tensor, which encapsulates high-order information, was introduced as an optimal sampling pattern for further improvement. Numerical experiments demonstrated that the proposed SNLM method achieves a good balance between denoising quality and computational efficiency. At a relative sampling ratio of ξ=0.05, SNLM can remove noise as effectively as full NLM, while the running time can be reduced to 1/20 of NLM's.

  19. Apple peels as a value-added food ingredient.

    PubMed

    Wolfe, Kelly L; Liu, Rui Hai

    2003-03-12

    There is some evidence that chronic diseases, such as cancer and cardiovascular disease, may occur as a result of oxidative stress. Apple peels have high concentrations of phenolic compounds and may assist in the prevention of chronic diseases. Millions of pounds of waste apple peels are generated in the production of applesauce and canned apples in New York State each year. We proposed that a valuable food ingredient could be made using the peels of these apples if they could be dried and ground to a powder without large losses of phytochemicals. Rome Beauty apple peels were treated with citric acid dips, ascorbic acid dips, and blanches before being oven-dried at 60 degrees C. Only blanching treatments greatly preserved the phenolic compounds, and peels blanched for 10 s had the highest total phenolic content. Rome Beauty apple peels were then blanched for 10 s and dried under various conditions (oven-dried at 40, 60, or 80 degrees C, air-dried, or freeze-dried). The air-dried and freeze-dried apple peels had the highest total phenolic, flavonoid, and anthocyanin contents. On a fresh weight basis, the total phenolic and flavonoid contents of these samples were similar to those of the fresh apple peels. Freeze-dried peels had a lower water activity than air-dried peels on a fresh weight basis. The optimal processing conditions for the ingredient were blanching for 10 s and freeze-drying. The process was scaled up, and the apple peel powder ingredient was characterized. The total phenolic content was 3342 +/- 12 mg gallic acid equivalents/100 g dried peels, the flavonoid content was 2299 +/- 52 mg catechin equivalents/100 g dried peels, and the anthocyanin content was 169.7 +/- 1.6 mg cyanidin 3-glucoside equivalents/100 g dried peels. These phytochemical contents were significantly higher than those of the fresh apple peels if calculated on a fresh weight basis (p < 0.05). The apple peel powder had a total antioxidant activity of 1251 +/- 56 micromol vitamin C

  20. Wavelet Transform Signal Processing Applied to Ultrasonics.

    DTIC Science & Technology

    1995-05-01

    The wavelet transform is applied to the analysis of ultrasonic waves for improved signal detection and analysis of the signals. In instances where...the mother wavelet is well defined, the wavelet transform has relative insensitivity to noise and does not need windowing. Peak detection of...ultrasonic pulses using the wavelet transform is described and results show good detection even when large white noise was added. The use of the wavelet

  1. HARDI DATA DENOISING USING VECTORIAL TOTAL VARIATION AND LOGARITHMIC BARRIER

    PubMed Central

    Kim, Yunho; Thompson, Paul M.; Vese, Luminita A.

    2010-01-01

    In this work, we wish to denoise HARDI (High Angular Resolution Diffusion Imaging) data arising in medical brain imaging. Diffusion imaging is a relatively new and powerful method to measure the three-dimensional profile of water diffusion at each point in the brain. These images can be used to reconstruct fiber directions and pathways in the living brain, providing detailed maps of fiber integrity and connectivity. HARDI data is a powerful new extension of diffusion imaging, which goes beyond the diffusion tensor imaging (DTI) model: mathematically, intensity data is given at every voxel and at any direction on the sphere. Unfortunately, HARDI data is usually highly contaminated with noise, depending on the b-value which is a tuning parameter pre-selected to collect the data. Larger b-values help to collect more accurate information in terms of measuring diffusivity, but more noise is generated by many factors as well. So large b-values are preferred, if we can satisfactorily reduce the noise without losing the data structure. Here we propose two variational methods to denoise HARDI data. The first one directly denoises the collected data S, while the second one denoises the so-called sADC (spherical Apparent Diffusion Coefficient), a field of radial functions derived from the data. These two quantities are related by an equation of the form S = S0·exp(−b · sADC) (in the noise-free case). By applying these two different models, we will be able to determine which quantity will most accurately preserve data structure after denoising. The theoretical analysis of the proposed models is presented, together with experimental results and comparisons for denoising synthetic and real HARDI data.

  2. Sinogram denoising via simultaneous sparse representation in learned dictionaries.

    PubMed

    Karimi, Davood; Ward, Rabab K

    2016-05-07

    Reducing the radiation dose in computed tomography (CT) is highly desirable but it leads to excessive noise in the projection measurements. This can significantly reduce the diagnostic value of the reconstructed images. Removing the noise in the projection measurements is, therefore, essential for reconstructing high-quality images, especially in low-dose CT. In recent years, two new classes of patch-based denoising algorithms proved superior to other methods in various denoising applications. The first class is based on sparse representation of image patches in a learned dictionary. The second class is based on the non-local means method. Here, the image is searched for similar patches and the patches are processed together to find their denoised estimates. In this paper, we propose a novel denoising algorithm for cone-beam CT projections. The proposed method has similarities to both these algorithmic classes but is more effective and much faster. In order to exploit both the correlation between neighboring pixels within a projection and the correlation between pixels in neighboring projections, the proposed algorithm stacks noisy cone-beam projections together to form a 3D image and extracts small overlapping 3D blocks from this 3D image for processing. We propose a fast algorithm for clustering all extracted blocks. The central assumption in the proposed algorithm is that all blocks in a cluster have a joint-sparse representation in a well-designed dictionary. We describe algorithms for learning such a dictionary and for denoising a set of projections using this dictionary. We apply the proposed algorithm on simulated and real data and compare it with three other algorithms. Our results show that the proposed algorithm outperforms some of the best denoising algorithms, while also being much faster.

  3. Sinogram denoising via simultaneous sparse representation in learned dictionaries

    NASA Astrophysics Data System (ADS)

    Karimi, Davood; Ward, Rabab K.

    2016-05-01

    Reducing the radiation dose in computed tomography (CT) is highly desirable but it leads to excessive noise in the projection measurements. This can significantly reduce the diagnostic value of the reconstructed images. Removing the noise in the projection measurements is, therefore, essential for reconstructing high-quality images, especially in low-dose CT. In recent years, two new classes of patch-based denoising algorithms proved superior to other methods in various denoising applications. The first class is based on sparse representation of image patches in a learned dictionary. The second class is based on the non-local means method. Here, the image is searched for similar patches and the patches are processed together to find their denoised estimates. In this paper, we propose a novel denoising algorithm for cone-beam CT projections. The proposed method has similarities to both these algorithmic classes but is more effective and much faster. In order to exploit both the correlation between neighboring pixels within a projection and the correlation between pixels in neighboring projections, the proposed algorithm stacks noisy cone-beam projections together to form a 3D image and extracts small overlapping 3D blocks from this 3D image for processing. We propose a fast algorithm for clustering all extracted blocks. The central assumption in the proposed algorithm is that all blocks in a cluster have a joint-sparse representation in a well-designed dictionary. We describe algorithms for learning such a dictionary and for denoising a set of projections using this dictionary. We apply the proposed algorithm on simulated and real data and compare it with three other algorithms. Our results show that the proposed algorithm outperforms some of the best denoising algorithms, while also being much faster.

  4. Dictionary Pair Learning on Grassmann Manifolds for Image Denoising.

    PubMed

    Zeng, Xianhua; Bian, Wei; Liu, Wei; Shen, Jialie; Tao, Dacheng

    2015-11-01

    Image denoising is a fundamental problem in computer vision and image processing that holds considerable practical importance for real-world applications. The traditional patch-based and sparse coding-driven image denoising methods convert 2D image patches into 1D vectors for further processing. Thus, these methods inevitably break down the inherent 2D geometric structure of natural images. To overcome this limitation pertaining to the previous image denoising methods, we propose a 2D image denoising model, namely, the dictionary pair learning (DPL) model, and we design a corresponding algorithm called the DPL on the Grassmann-manifold (DPLG) algorithm. The DPLG algorithm first learns an initial dictionary pair (i.e., the left and right dictionaries) by employing a subspace partition technique on the Grassmann manifold, wherein the refined dictionary pair is obtained through a sub-dictionary pair merging. The DPLG obtains a sparse representation by encoding each image patch only with the selected sub-dictionary pair. The non-zero elements of the sparse representation are further smoothed by the graph Laplacian operator to remove the noise. Consequently, the DPLG algorithm not only preserves the inherent 2D geometric structure of natural images but also performs manifold smoothing in the 2D sparse coding space. We demonstrate that the DPLG algorithm improves the structural similarity (SSIM) values, reflecting the perceptual visual quality of the denoised images, in experimental evaluations on benchmark images and the Berkeley segmentation data sets. Moreover, the DPLG also produces peak signal-to-noise ratio values that are competitive with popular image denoising algorithms.

  5. Chemical Peels for Melasma in Dark-Skinned Patients

    PubMed Central

    Sarkar, Rashmi; Bansal, Shuchi; Garg, Vijay K

    2012-01-01

    Melasma is a common disorder of hyperpigmentation, which has a severe impact on the quality of life. In spite of tremendous research, the treatment remains frustrating both to the patient and the treating physician. Dark skin types (Fitzpatrick types IV to VI) are especially difficult to treat owing to the increased risk of post-inflammatory hyperpigmentation (PIH). The treatment ranges from a variety of easily applied topical therapies to agents like lasers and chemical peels. Peels are a well-known modality of treatment for melasma, having shown promising results in many clinical trials. However, in darker races, the choice of the peeling agent becomes relatively limited; so, there is the need for priming agents and additional maintenance peels. Although a number of new agents have come up, there is little published evidence supporting their use in day-to-day practice. The traditional glycolic peels prove to be the best both in terms of safety as well as efficacy. Lactic acid peels being relatively inexpensive and having shown equally good results in a few studies, definitely need further experimentation. We also recommend the use of a new peeling agent, the easy phytic solution, which does not require neutralisation unlike the traditional alpha-hydroxy peels. The choice of peeling agent, the peel concentration as well as the frequency and duration of peels are all important to achieve optimum results.

  6. Palmoplantar peeling secondary to sirolimus therapy.

    PubMed

    Liu, L S; McNiff, J M; Colegio, O R

    2014-01-01

    Sirolimus (rapamycin) is an immunosuppressive agent commonly used in transplant recipients. Although sirolimus has less renal toxicity than calcineurin inhibitors, its use has been limited by its side effects. The most common cutaneous pathologies associated with sirolimus are inflammatory acneiform eruptions, lymphedema and aphthous ulcers. We present a novel cutaneous manifestation of sirolimus therapy that limited its use in at least one transplant recipient. Upon commencing sirolimus therapy, four solid organ transplant recipients developed tender, nonpruritic palmoplantar peeling within the first month of therapy. The peeling clinically resembled a mild form of hand-foot syndrome, yet none of the patients had been treated with chemotherapeutics. Desquamation presented on the palms and soles with dry vesicles and minor peeling extending to the dorsal aspects of the hands and feet. Histologically, the lesions were noninflammatory; the epidermis showed subtle separation between keratinocytes, suggesting either spongiosis or a defect in intercellular adhesion. One patient opted to discontinue treatment because of the tenderness associated with the palmoplantar peeling, which resulted in complete resolution within 2 weeks.

  7. Palmoplantar Peeling Secondary to Sirolimus Therapy

    PubMed Central

    Liu, L. S.; McNiff, J. M.; Colegio, O. R.

    2014-01-01

    Sirolimus (rapamycin) is an immunosuppressive agent commonly used in transplant recipients. Although sirolimus has less renal toxicity than calcineurin inhibitors, its use has been limited by its side effects. The most common cutaneous pathologies associated with sirolimus are inflammatory acneiform eruptions, lymphedema and aphthous ulcers. We present a novel cutaneous manifestation of sirolimus therapy that limited its use in at least one transplant recipient. Upon commencing sirolimus therapy, four solid organ transplant recipients developed tender, nonpruritic palmoplantar peeling within the first month of therapy. The peeling clinically resembled a mild form of hand-foot syndrome, yet none of the patients had been treated with chemotherapeutics. Desquamation presented on the palms and soles with dry vesicles and minor peeling extending to the dorsal aspects of the hands and feet. Histologically, the lesions were noninflammatory; the epidermis showed subtle separation between keratinocytes, suggesting either spongiosis or a defect in intercellular adhesion. One patient opted to discontinue treatment because of the tenderness associated with the palmoplantar peeling, which resulted in complete resolution within 2 weeks. PMID:24224736

  8. Extraction of phenolics from pomegranate peels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The effects of different solvents, temperature conditions, solvent-solid ratios and particle sizes on solid-solvent extraction of the total phenolics, proanthocyanidins and flavonoids herein also referred to as antioxidant from pomegranate marc peel (PMP) was studied. Water, methanol, ethanol, aceto...

  9. An Ap"peel"ing Activity

    ERIC Educational Resources Information Center

    Urich, Joshua A.; Sasse, Elizabeth A.

    2011-01-01

    This article describes a hands-on mathematics activity wherein students peel oranges to explore the surface area and volume of a sphere. This activity encourages students to make conjectures and hold mathematical discussions with both their peers and their teacher. Moreover, students develop formulas for the surface area and volume of a sphere…
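
    For reference, the standard formulas the activity builds toward, for a sphere of radius r, are:

      A = 4\pi r^{2}, \qquad V = \tfrac{4}{3}\pi r^{3}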

  10. GPU-Accelerated Denoising in 3D (GD3D)

    SciTech Connect

    2013-10-01

    The raw computational power of GPU accelerators enables fast denoising of 3D MR images using bilateral filtering, anisotropic diffusion, and non-local means. This software addresses two facets of this promising application: what tuning is necessary to achieve optimal performance on a modern GPU? And what parameters yield the best denoising results in practice? To answer the first question, the software performs an autotuning step to empirically determine optimal memory blocking on the GPU. To answer the second, it performs a sweep of algorithm parameters to determine the combination that best reduces the mean squared error relative to a noiseless reference image.

  11. Rapid development of keratoacanthomas after a body peel.

    PubMed

    Cox, SueEllen

    2003-02-01

    Resurfacing techniques have been traditionally limited to the face because of a lack of predictability and standardization for peeling nonfacial skin. There is a need for medical and surgical intervention for treating nonfacial skin that is actinically damaged. Medium-depth chemical peels (Jessner +35% trichloroacetic acid) remove the photodamaged epidermis to stimulate the production of new collagen in the dermis and remove lesions associated with facial actinic damage, including lentigines and actinic keratoses. Widespread actinic damage is common on the arms and chest. A 70% glycolic acid gel plus 40% trichloroacetic acid peel (Cook Body Peel) is a controlled peel that predictably enables peeling of nonfacial skin in a uniform and safe fashion with specific clinical endpoints. An unusual complication of this body peel is reported.

  12. Tailoring wavelets for chaos control.

    PubMed

    Wei, G W; Zhan, Meng; Lai, C-H

    2002-12-31

    Chaos is a class of ubiquitous phenomena and controlling chaos is of great interest and importance. In this Letter, we introduce wavelet controlled dynamics as a new paradigm of dynamical control. We find that by modifying a tiny fraction of the wavelet subspaces of a coupling matrix, we could dramatically enhance the transverse stability of the synchronous manifold of a chaotic system. Wavelet controlled Hopf bifurcation from chaos is observed. Our approach provides a robust strategy for controlling chaos and other dynamical systems in nature.

  13. Peak finding using biorthogonal wavelets

    SciTech Connect

    Tan, C.Y.

    2000-02-01

    The authors show in this paper how they can find the peaks in the input data if the underlying signal is a sum of Lorentzians. In order to project the data into a space of Lorentzian like functions, they show explicitly the construction of scaling functions which look like Lorentzians. From this construction, they can calculate the biorthogonal filter coefficients for both the analysis and synthesis functions. They then compare their biorthogonal wavelets to the FBI (Federal Bureau of Investigations) wavelets when used for peak finding in noisy data. They will show that in this instance, their filters perform much better than the FBI wavelets.

  14. Post-processing noise removal algorithm for magnetic resonance imaging based on edge detection and wavelet analysis.

    PubMed

    Placidi, Giuseppe; Alecci, Marcello; Sotgiu, Antonello

    2003-07-07

    A post-processing noise suppression technique for biomedical MRI images is presented. The described procedure recovers both sharp edges and smooth surfaces from a given noisy MRI image; it does not blur the edges and does not introduce spikes or other artefacts. The fine details of the image are also preserved. The proposed algorithm first extracts the edges from the original image and then performs noise reduction by using a wavelet de-noise method. After the application of the wavelet method, the edges are restored to the filtered image. The result is the original image with less noise, fine detail and sharp edges. Edge extraction is performed by using an algorithm based on Sobel operators. The wavelet de-noise method is based on the calculation of the correlation factor between wavelet coefficients belonging to different scales. The algorithm was tested on several MRI images and, as an example of its application, we report the results obtained from a spin echo (multi echo) MRI image of a human wrist collected with a low field experimental scanner (the signal-to-noise ratio, SNR, of the experimental image was 12). Other filtering operations have been performed after the addition of white noise on both channels of the experimental image, before the magnitude calculation. The results at SNR = 7, SNR = 5 and SNR = 3 are also reported. For SNR values between 5 and 12, the improvement in SNR was substantial and the fine details were preserved, the edges were not blurred and no spikes or other artefacts were evident, demonstrating the good performances of our method. At very low SNR (SNR = 3) our result is worse than that obtained by a simpler filtering procedure.
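
    A simplified sketch of the edge-preserving pipeline described above: extract a Sobel edge map, wavelet-denoise the image, then restore edge pixels from the original. The threshold choice, wavelet, and plain soft thresholding are assumptions; the paper's inter-scale correlation-based coefficient selection is not reproduced here.

      # Edge extraction + wavelet denoising + edge restoration (sketch).
      import numpy as np
      import pywt
      from scipy.ndimage import sobel

      def edge_preserving_denoise(img, wavelet="db4", level=2, edge_quantile=0.9):
          img = np.asarray(img, dtype=float)
          grad_mag = np.hypot(sobel(img, axis=0), sobel(img, axis=1))
          edge_map = grad_mag > np.quantile(grad_mag, edge_quantile)   # Sobel edge mask
          coeffs = pywt.wavedec2(img, wavelet, level=level)
          sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745           # noise from finest diagonal band
          thr = sigma * np.sqrt(2 * np.log(img.size))
          coeffs[1:] = [tuple(pywt.threshold(c, thr, mode="soft") for c in cs)
                        for cs in coeffs[1:]]
          denoised = pywt.waverec2(coeffs, wavelet)[: img.shape[0], : img.shape[1]]
          denoised[edge_map] = img[edge_map]                           # put original edges back
          return denoised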

  15. Post-processing noise removal algorithm for magnetic resonance imaging based on edge detection and wavelet analysis

    NASA Astrophysics Data System (ADS)

    Placidi, Giuseppe; Alecci, Marcello; Sotgiu, Antonello

    2003-07-01

    A post-processing noise suppression technique for biomedical MRI images is presented. The described procedure recovers both sharp edges and smooth surfaces from a given noisy MRI image; it does not blur the edges and does not introduce spikes or other artefacts. The fine details of the image are also preserved. The proposed algorithm first extracts the edges from the original image and then performs noise reduction by using a wavelet de-noise method. After the application of the wavelet method, the edges are restored to the filtered image. The result is the original image with less noise, fine detail and sharp edges. Edge extraction is performed by using an algorithm based on Sobel operators. The wavelet de-noise method is based on the calculation of the correlation factor between wavelet coefficients belonging to different scales. The algorithm was tested on several MRI images and, as an example of its application, we report the results obtained from a spin echo (multi echo) MRI image of a human wrist collected with a low field experimental scanner (the signal-to-noise ratio, SNR, of the experimental image was 12). Other filtering operations have been performed after the addition of white noise on both channels of the experimental image, before the magnitude calculation. The results at SNR = 7, SNR = 5 and SNR = 3 are also reported. For SNR values between 5 and 12, the improvement in SNR was substantial and the fine details were preserved, the edges were not blurred and no spikes or other artefacts were evident, demonstrating the good performances of our method. At very low SNR (SNR = 3) our result is worse than that obtained by a simpler filtering procedure.

  16. Wavelet theory and its applications

    SciTech Connect

    Faber, V.; Bradley, JJ.; Brislawn, C.; Dougherty, R.; Hawrylycz, M.

    1996-07-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). We investigated the theory of wavelet transforms and their relation to Laboratory applications. The investigators have had considerable success in the past applying wavelet techniques to the numerical solution of optimal control problems for distributed- parameter systems, nonlinear signal estimation, and compression of digital imagery and multidimensional data. Wavelet theory involves ideas from the fields of harmonic analysis, numerical linear algebra, digital signal processing, approximation theory, and numerical analysis, and the new computational tools arising from wavelet theory are proving to be ideal for many Laboratory applications. 10 refs.

  17. Wavelet entropy of stochastic processes

    NASA Astrophysics Data System (ADS)

    Zunino, L.; Pérez, D. G.; Garavaglia, M.; Rosso, O. A.

    2007-06-01

    We compare two different definitions for the wavelet entropy associated with stochastic processes. The first is the normalized total wavelet entropy (NTWS) family [S. Blanco, A. Figliola, R.Q. Quiroga, O.A. Rosso, E. Serrano, Time-frequency analysis of electroencephalogram series, III. Wavelet packets and information cost function, Phys. Rev. E 57 (1998) 932-940; O.A. Rosso, S. Blanco, J. Yordanova, V. Kolev, A. Figliola, M. Schürmann, E. Başar, Wavelet entropy: a new tool for analysis of short duration brain electrical signals, J. Neurosci. Method 105 (2001) 65-75]; the second was introduced by Tavares and Lucena [Physica A 357(1) (2005) 71-78]. In order to understand their advantages and disadvantages, exact results obtained for fractional Gaussian noise (-1 < α < 1) and fractional Brownian motion (1 < α < 3) are assessed. We find that the NTWS family performs better as a characterization method for these stochastic processes.
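
    As a concrete illustration of the first definition, a minimal normalized total wavelet entropy (NTWS) computation is sketched below in Python with PyWavelets: relative wavelet energies per decomposition level feed a Shannon entropy that is normalized by its maximum value. The wavelet and number of levels are arbitrary choices, not taken from the cited papers.

        # Normalized total wavelet entropy (NTWS) sketch for a 1-D signal.
        import numpy as np
        import pywt

        def ntws(signal, wavelet="db4", level=6):
            details = pywt.wavedec(signal, wavelet, level=level)[1:]  # detail coefficients only
            energies = np.array([np.sum(d ** 2) for d in details])
            p = energies / energies.sum()                             # relative wavelet energies
            entropy = -np.sum(p * np.log(p + 1e-12))                  # Shannon entropy
            return entropy / np.log(len(details))                     # normalize to [0, 1]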

  18. Wavelet Analysis of Protein Motion

    PubMed Central

    BENSON, NOAH C.

    2014-01-01

    As high-throughput molecular dynamics simulations of proteins become more common and the databases housing the results become larger and more prevalent, more sophisticated methods to quickly and accurately mine large numbers of trajectories for relevant information will have to be developed. One such method, which is only recently gaining popularity in molecular biology, is the continuous wavelet transform, which is especially well-suited for time course data such as molecular dynamics simulations. We describe techniques for the calculation and analysis of wavelet transforms of molecular dynamics trajectories in detail and present examples of how these techniques can be useful in data mining. We demonstrate that wavelets are sensitive to structural rearrangements in proteins and that they can be used to quickly detect physically relevant events. Finally, as an example of the use of this approach, we show how wavelet data mining has led to a novel hypothesis related to the mechanism of the protein γδ resolvase. PMID:25484480

  19. A new fractional wavelet transform

    NASA Astrophysics Data System (ADS)

    Dai, Hongzhe; Zheng, Zhibao; Wang, Wei

    2017-03-01

    The fractional Fourier transform (FRFT) is a potent tool to analyze the time-varying signal. However, it fails in locating the fractional Fourier domain (FRFD)-frequency contents which is required in some applications. A novel fractional wavelet transform (FRWT) is proposed to solve this problem. It displays the time and FRFD-frequency information jointly in the time-FRFD-frequency plane. The definition, basic properties, inverse transform and reproducing kernel of the proposed FRWT are considered. It has been shown that an FRWT with proper order corresponds to the classical wavelet transform (WT). The multiresolution analysis (MRA) associated with the developed FRWT, together with the construction of the orthogonal fractional wavelets are also presented. Three applications are discussed: the analysis of signal with time-varying frequency content, the FRFD spectrum estimation of signals that involving noise, and the construction of fractional Harr wavelet. Simulations verify the validity of the proposed FRWT.

  20. Gearbox Fault Diagnosis Using Adaptive Wavelet Filter

    NASA Astrophysics Data System (ADS)

    LIN, J.; ZUO, M. J.

    2003-11-01

    Vibration signals from a gearbox are usually noisy. As a result, it is difficult to find early symptoms of a potential failure in a gearbox. Wavelet transform is a powerful tool to disclose transient information in signals. An adaptive wavelet filter based on Morlet wavelet is introduced in this paper. The parameters in the Morlet wavelet function are optimised based on the kurtosis maximisation principle. The wavelet used is adaptive because the parameters are not fixed. The adaptive wavelet filter is found to be very effective in detection of symptoms from vibration signals of a gearbox with early fatigue tooth crack. Two types of discrete wavelet transform (DWT), the decimated with DB4 wavelet and the undecimated with harmonic wavelet, are also used to analyse the same signals for comparison. No periodic impulses appear on any scale in either DWT decomposition.
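
    The kurtosis-maximisation idea can be illustrated with the NumPy sketch below: the signal is filtered with Morlet-type wavelets over a small grid of scale and shape parameters, and the output with the largest kurtosis is kept. The wavelet form and the parameter grid here are illustrative assumptions, not the optimisation procedure used in the paper.

        # Adaptive Morlet filtering sketch guided by kurtosis maximisation.
        import numpy as np

        def morlet(t, scale, beta):
            # Real Morlet-type wavelet: a cosine under a Gaussian envelope.
            return np.exp(-beta * (t / scale) ** 2) * np.cos(2.0 * np.pi * t / scale)

        def kurtosis(x):
            x = x - x.mean()
            return np.mean(x ** 4) / (np.mean(x ** 2) ** 2 + 1e-12)

        def adaptive_morlet_filter(signal, scales=range(4, 64, 4), betas=(0.5, 1.0, 2.0, 4.0)):
            t = np.arange(-128, 129)
            best, best_k = signal, -np.inf
            for s in scales:
                for b in betas:
                    w = morlet(t, s, b)
                    filtered = np.convolve(signal, w / np.abs(w).sum(), mode="same")
                    k = kurtosis(filtered)
                    if k > best_k:                      # keep the most impulsive output
                        best, best_k = filtered, k
            return best, best_k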

  1. Improvement of wavelet threshold filtered back-projection image reconstruction algorithm

    NASA Astrophysics Data System (ADS)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-11-01

    Image reconstruction techniques are applied in many fields, including medical imaging such as X-ray computed tomography (X-CT), positron emission tomography (PET) and magnetic resonance imaging (MRI), but the reconstructed results are still unsatisfactory because the original projection data are inevitably corrupted by noise during image reconstruction. Although traditional filters such as the Shepp-Logan (SL) and Ram-Lak (RL) filters remove some of this noise, Gibbs oscillations are generated and the artifacts caused by back-projection are not greatly reduced. Wavelet threshold denoising can overcome the interference of noise in image reconstruction. Since the traditional soft and hard threshold functions have inherent defects, an improved wavelet threshold function combined with the filtered back-projection (FBP) algorithm is proposed in this paper. Four different reconstruction algorithms were compared in simulated experiments. The experimental results demonstrate that the improved algorithm largely eliminates the discontinuity and large distortion of the traditional threshold functions as well as the Gibbs oscillation. Finally, the effectiveness of the improved algorithm was verified by comparing two evaluation criteria, mean square error (MSE) and peak signal-to-noise ratio (PSNR), across the four algorithms, and the optimum dual threshold values of the improved wavelet threshold function were obtained.
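
    For reference, the standard hard and soft threshold functions mentioned above are written out below in NumPy, together with a generic "compromise" function that interpolates between them. The paper's improved dual-threshold function is not reproduced here; the third function is only a hypothetical illustration of how the discontinuity of hard thresholding and the constant bias of soft thresholding can be traded off.

        import numpy as np

        def hard_threshold(w, t):
            # Keep coefficients above the threshold unchanged, zero the rest.
            return np.where(np.abs(w) > t, w, 0.0)

        def soft_threshold(w, t):
            # Shrink surviving coefficients toward zero by t.
            return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

        def compromise_threshold(w, t, alpha=0.5):
            # alpha = 1 reproduces soft thresholding, alpha = 0 reproduces hard thresholding.
            return np.where(np.abs(w) > t, np.sign(w) * (np.abs(w) - alpha * t), 0.0)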

  2. Wavelet transform and Huffman coding based electrocardiogram compression algorithm: Application to telecardiology

    NASA Astrophysics Data System (ADS)

    Chouakri, S. A.; Djaafri, O.; Taleb-Ahmed, A.

    2013-08-01

    We present an algorithm for electrocardiogram (ECG) signal compression aimed at transmission over a telecommunication channel. The proposed algorithm is built on the wavelet transform, which separates low- and high-frequency components; higher-order-statistics thresholding, using a level-adjusted kurtosis value, denoises the ECG signal, and a linear predictive coding filter is then applied to the wavelet coefficients to produce a lower-variance signal. This signal is coded with Huffman encoding, yielding an optimal code length in terms of the average number of bits per sample. At the receiver, assuming an ideal communication channel, the inverse processes are carried out: Huffman decoding, the inverse linear predictive coding filter, and the inverse discrete wavelet transform, leading to the estimated ECG signal. The algorithm is tested on a set of ECG records from the MIT-BIH Arrhythmia Database covering several cardiac anomalies as well as normal ECG signals. The results, evaluated in terms of compression ratio and mean square error, are around 1:8 and 7%, respectively. Beyond the numerical evaluation, visual inspection confirms high-quality ECG restitution in which the different ECG waves are recovered correctly.
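
    A compressed sketch of such a chain (wavelet decomposition, thresholding, uniform quantization, Huffman coding) is given below in Python. The linear predictive coding stage and the kurtosis-based threshold from the paper are omitted, and the wavelet, quantization step and retained-coefficient fraction are illustrative choices only.

        # DWT -> hard thresholding -> uniform quantization -> Huffman coding sketch.
        import heapq
        from collections import Counter

        import numpy as np
        import pywt

        def huffman_code(symbols):
            # Build a {symbol: bitstring} code book with the classic heap construction.
            counts = Counter(symbols)
            if len(counts) == 1:                          # degenerate single-symbol case
                return {next(iter(counts)): "0"}
            heap = [[wt, [sym, ""]] for sym, wt in counts.items()]
            heapq.heapify(heap)
            while len(heap) > 1:
                lo = heapq.heappop(heap)
                hi = heapq.heappop(heap)
                for pair in lo[1:]:
                    pair[1] = "0" + pair[1]
                for pair in hi[1:]:
                    pair[1] = "1" + pair[1]
                heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
            return {sym: code for sym, code in heap[0][1:]}

        def compress_ecg(signal, wavelet="db4", level=5, keep=0.1, step=0.01):
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            flat = np.concatenate(coeffs)
            thr = np.quantile(np.abs(flat), 1.0 - keep)   # keep roughly the largest 10%
            quantized = np.round(pywt.threshold(flat, thr, "hard") / step).astype(int)
            book = huffman_code(quantized.tolist())
            bits = "".join(book[s] for s in quantized.tolist())
            return bits, book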

  3. Bayesian tree-structured image modeling using wavelet-domain hidden Markov models

    NASA Astrophysics Data System (ADS)

    Romberg, Justin K.; Choi, Hyeokho; Baraniuk, Richard G.

    1999-06-01

    Wavelet-domain hidden Markov models have proven to be useful tools for statistical signal and image processing. The hidden Markov tree model captures the key features of the joint density of the wavelet coefficients of real-world data. One potential drawback to the HMT framework is the need for computationally expensive iterative training. In this paper, we propose two reduced-parameter HMT models that capture the general structure of a broad class of real-world images. In the image HMT (iHMT) model we use the fact that for a large class of images the structure of the HMT is self-similar across scale. This allows us to reduce the complexity of the iHMT to just nine easily trained parameters. In the universal HMT (uHMT) we take a Bayesian approach and fix these nine parameters. The uHMT requires no training of any kind. While simple, we show using a series of image estimation/denoising experiments that these two new models retain nearly all of the key structure modeled by the full HMT. Finally, we propose a fast shift-invariant HMT estimation algorithm that outperforms all other wavelet-based estimators in the current literature, both in mean-square error and in visual metrics.

  4. Optical HAAR Wavelet Transforms using Computer Generated Holography

    DTIC Science & Technology

    1992-12-17

    This research introduces an optical implementation of the continuous wavelet transform to filter images. The wavelet transform is modeled as a...continuous wavelet transform was performed and that the results compared favorably to digital simulation. Wavelets, Holography, Optical correlators.

  5. BOOK REVIEW: The Illustrated Wavelet Transform Handbook: Introductory Theory and Applications in Science, Engineering, Medicine and Finance

    NASA Astrophysics Data System (ADS)

    Ng, J.; Kingsbury, N. G.

    2004-02-01

    wavelet. The second half of the chapter groups together miscellaneous points about the discrete wavelet transform, including coefficient manipulation for signal denoising and smoothing, a description of Daubechies’ wavelets, the properties of translation invariance and biorthogonality, the two-dimensional discrete wavelet transforms and wavelet packets. The fourth chapter is dedicated to wavelet transform methods in the author’s own specialty, fluid mechanics. Beginning with a definition of wavelet-based statistical measures for turbulence, the text proceeds to describe wavelet thresholding in the analysis of fluid flows. The remainder of the chapter describes wavelet analysis of engineering flows, in particular jets, wakes, turbulence and coherent structures, and geophysical flows, including atmospheric and oceanic processes. The fifth chapter describes the application of wavelet methods in various branches of engineering, including machining, materials, dynamics and information engineering. Unlike previous chapters, this (and subsequent) chapters are styled more as literature reviews that describe the findings of other authors. The areas addressed in this chapter include: the monitoring of machining processes, the monitoring of rotating machinery, dynamical systems, chaotic systems, non-destructive testing, surface characterization and data compression. The sixth chapter continues in this vein with the attention now turned to wavelets in the analysis of medical signals. Most of the chapter is devoted to the analysis of one-dimensional signals (electrocardiogram, neural waveforms, acoustic signals etc.), although there is a small section on the analysis of two-dimensional medical images. The seventh and final chapter of the book focuses on the application of wavelets in three seemingly unrelated application areas: fractals, finance and geophysics. The treatment on wavelet methods in fractals focuses on stochastic fractals with a short section on multifractals. The

  6. Heart Disease Detection Using Wavelets

    NASA Astrophysics Data System (ADS)

    González S., A.; Acosta P., J. L.; Sandoval M., M.

    2004-09-01

    We develop a wavelet-based method to obtain standardized gray-scale charts of both healthy hearts and hearts suffering from left ventricular hypertrophy. The hypothesis that early heart malfunction can be detected must be tested by comparing the wavelet analysis of the corresponding ECG with these limiting cases. Several important parameters, such as age, sex and electrolytic changes, shall be taken into account.

  7. Image denoising via adaptive eigenvectors of graph Laplacian

    NASA Astrophysics Data System (ADS)

    Chen, Ying; Tang, Yibin; Xu, Ning; Zhou, Lin; Zhao, Li

    2016-07-01

    An image denoising method via adaptive eigenvectors of graph Laplacian (EGL) is proposed. Unlike the trivial parameter setting of the used eigenvectors in the traditional EGL method, in our method, the eigenvectors are adaptively selected in the whole denoising procedure. In detail, a rough image is first built with the eigenvectors from the noisy image, where the eigenvectors are selected by using the deviation estimation of the clean image. Subsequently, a guided image is effectively restored with a weighted average of the noisy and rough images. In this operation, the average coefficient is adaptively obtained to set the deviation of the guided image to approximately that of the clean image. Finally, the denoised image is achieved by a group-sparse model with the pattern from the guided image, where the eigenvectors are chosen in the error control of the noise deviation. Moreover, a modified group orthogonal matching pursuit algorithm is developed to efficiently solve the above group sparse model. The experiments show that our method not only improves the practicality of the EGL methods with the dependence reduction of the parameter setting, but also can outperform some well-developed denoising methods, especially for noise with large deviations.

  8. Image denoising using the higher order singular value decomposition.

    PubMed

    Rajwade, Ajit; Rangarajan, Anand; Banerjee, Arunava

    2013-04-01

    In this paper, we propose a very simple and elegant patch-based, machine learning technique for image denoising using the higher order singular value decomposition (HOSVD). The technique simply groups together similar patches from a noisy image (with similarity defined by a statistically motivated criterion) into a 3D stack, computes the HOSVD coefficients of this stack, manipulates these coefficients by hard thresholding, and inverts the HOSVD transform to produce the final filtered image. Our technique chooses all required parameters in a principled way, relating them to the noise model. We also discuss our motivation for adopting the HOSVD as an appropriate transform for image denoising. We experimentally demonstrate the excellent performance of the technique on grayscale as well as color images. On color images, our method produces state-of-the-art results, outperforming other color image denoising algorithms at moderately high noise levels. A criterion for optimal patch-size selection and noise variance estimation from the residual images (after denoising) is also presented.
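
    The central HOSVD hard-thresholding step can be sketched with NumPy as below, for a stack of similar patches that has already been grouped. The universal threshold used here is a common heuristic and an assumption, not necessarily the rule the authors derive from their noise model.

        # HOSVD denoising of a (p, p, K) stack of similar patches via mode
        # unfoldings, core hard thresholding, and reconstruction.
        import numpy as np

        def unfold(T, mode):
            return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

        def mode_dot(T, U, mode):
            # Multiply tensor T by matrix U along the given mode.
            Tm = np.moveaxis(T, mode, 0)
            out = U @ Tm.reshape(Tm.shape[0], -1)
            return np.moveaxis(out.reshape((U.shape[0],) + Tm.shape[1:]), 0, mode)

        def hosvd_denoise(stack, sigma):
            factors = [np.linalg.svd(unfold(stack, m), full_matrices=False)[0]
                       for m in range(stack.ndim)]
            core = stack
            for m, U in enumerate(factors):
                core = mode_dot(core, U.T, m)
            thr = sigma * np.sqrt(2.0 * np.log(stack.size))   # universal-threshold heuristic
            core = np.where(np.abs(core) > thr, core, 0.0)    # hard thresholding of the core
            rec = core
            for m, U in enumerate(factors):
                rec = mode_dot(rec, U, m)
            return rec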

  9. Pixon Based Image Denoising Scheme by Preserving Exact Edge Locations

    NASA Astrophysics Data System (ADS)

    Srikrishna, Atluri; Reddy, B. Eswara; Pompapathi, Manasani

    2016-09-01

    Denoising an image is an essential step in many image processing applications. In any image denoising algorithm, a major concern is to keep interesting structures of the image, such as abrupt changes in intensity values (edges). In this paper an efficient algorithm for image denoising is proposed that recovers an integrated, continuous image from the noisy image using diffusion equations in the pixon domain. The process consists of two main steps: first, the pixons of the noisy image are obtained by a K-means clustering process; second, diffusion equations are applied to the pixonal model of the image to obtain new intensity values for the restored image. The process has been applied to a variety of standard images and its objective fidelity has been compared with existing algorithms. The experimental results show that the proposed algorithm performs better, preserving edge details, in terms of both Figure of Merit and an improved Peak Signal-to-Noise Ratio.

  10. Blind Image Denoising via Dependent Dirichlet Process Tree.

    PubMed

    Zhu, Fengyuan; Chen, Guangyong; Hao, Jianye; Heng, Pheng-Ann

    2016-08-31

    Most existing image denoising approaches assume the noise to be homogeneous white Gaussian with known intensity. However, in real noisy images, the noise model is usually unknown beforehand and can be much more complex. This paper addresses this problem and proposes a novel blind image denoising algorithm to recover the clean image from a noisy one under an unknown noise model. To model the empirical noise of an image, our method introduces a mixture of Gaussian distributions, which is flexible enough to approximate different continuous distributions. The problem of blind image denoising is reformulated as a learning problem. The procedure first builds a two-layer structural model for noisy patches and treats the clean ones as latent variables. To control the complexity of the noisy patch model, this work proposes a novel Bayesian nonparametric prior called the "Dependent Dirichlet Process Tree". This study then derives a variational inference algorithm to estimate model parameters and recover clean patches. We apply our method to synthetic and real noisy images with different noise models. Compared with previous approaches, ours achieves better performance. The experimental results indicate the efficiency of the proposed algorithm in coping with practical image denoising tasks.

  11. Local Sparse Structure Denoising for Low-Light-Level Image.

    PubMed

    Han, Jing; Yue, Jiang; Zhang, Yi; Bai, Lianfa

    2015-12-01

    Sparse and redundant representations perform well in image denoising. However, sparsity-based methods fail to denoise low-light-level (LLL) images because of heavy and complex noise. They consider sparsity on image patches independently and tend to lose the texture structures. To suppress noise and maintain textures simultaneously, it is necessary to embed noise-invariant features into the sparse decomposition process. We therefore used a local structure preserving sparse coding (LSPSc) formulation to explore the local sparse structures (both the sparsity and the local structure) in the image. It was found that, with the introduction of a spatial local structure constraint into the general sparse coding algorithm, LSPSc could improve the robustness of sparse representation for patches under severe noise. We further used a kernel LSPSc (K-LSPSc) formulation, which extends LSPSc into the kernel space to weaken the influence of the linear structure constraint on nonlinear data. Based on the robust LSPSc and K-LSPSc algorithms, we constructed a local sparse structure denoising (LSSD) model for LLL images, which was demonstrated to perform well in denoising natural LLL images, indicating that both the LSPSc- and K-LSPSc-based LSSD models stably suppress noise while preserving texture details.

  12. Image denoising with dominant sets by a coalitional game approach.

    PubMed

    Hsiao, Pei-Chi; Chang, Long-Wen

    2013-02-01

    Dominant sets are a new graph partition method for pairwise data clustering proposed by Pavan and Pelillo. We address the problem of dominant sets with a coalitional game model, in which each data point is treated as a player and similar data points are encouraged to group together for cooperation. We propose betrayal and hermit rules to describe the cooperative behaviors among the players. After applying the betrayal and hermit rules, an optimal and stable graph partition emerges, and all the players in the partition will not change their groups. For computational feasibility, we design an approximate algorithm for finding a dominant set of mutually similar players and then apply the algorithm to an application such as image denoising. In image denoising, every pixel is treated as a player who seeks similar partners according to its patch appearance in its local neighborhood. By averaging the noisy effects with the similar pixels in the dominant sets, we improve nonlocal means image denoising to restore the intrinsic structure of the original images and achieve competitive denoising results with the state-of-the-art methods in visual and quantitative qualities.

  13. Dictionary-based image denoising for dual energy computed tomography

    NASA Astrophysics Data System (ADS)

    Mechlem, Korbinian; Allner, Sebastian; Mei, Kai; Pfeiffer, Franz; Noël, Peter B.

    2016-03-01

    Compared to conventional computed tomography (CT), dual energy CT allows for improved material decomposition by conducting measurements at two distinct energy spectra. Since radiation exposure is a major concern in clinical CT, there is a need for tools to reduce the noise level in images while preserving diagnostic information. One way to achieve this goal is the application of image-based denoising algorithms after an analytical reconstruction has been performed. We have developed a modified dictionary denoising algorithm for dual energy CT aimed at exploiting the high spatial correlation between images obtained from different energy spectra. Both the low- and high-energy images are partitioned into small patches, which are subsequently normalized. Combined patches with improved signal-to-noise ratio are formed by a weighted addition of corresponding normalized patches from both images. Assuming that corresponding low- and high-energy image patches are related by a linear transformation, the signal in both patches is added coherently while noise is neglected. Conventional dictionary denoising is then performed on the combined patches. Compared to conventional dictionary denoising and bilateral filtering, our algorithm achieved superior performance in terms of qualitative and quantitative image quality measures. We demonstrate, in simulation studies, that this approach can produce 2D histograms of the high- and low-energy reconstructions which are characterized by significantly improved material features and separation. Moreover, in comparison to other approaches that attempt denoising without simultaneously using both energy signals, superior similarity to the ground truth can be found with our proposed algorithm.

  14. Denoising Magnetic Resonance Images Using Collaborative Non-Local Means.

    PubMed

    Chen, Geng; Zhang, Pei; Wu, Yafeng; Shen, Dinggang; Yap, Pew-Thian

    2016-02-12

    Noise artifacts in magnetic resonance (MR) images increase the complexity of image processing workflows and decrease the reliability of inferences drawn from the images. It is thus often desirable to remove such artifacts beforehand for more robust and effective quantitative analysis. It is important to preserve the integrity of relevant image information while removing noise in MR images. A variety of approaches have been developed for this purpose, and the non-local means (NLM) filter has been shown to be able to achieve state-of-the-art denoising performance. For effective denoising, NLM relies heavily on the existence of repeating structural patterns, which however might not always be present within a single image. This is especially true when one considers the fact that the human brain is complex and contains a lot of unique structures. In this paper we propose to leverage the repeating structures from multiple images to collaboratively denoise an image. The underlying assumption is that it is more likely to find repeating structures from multiple scans than from a single scan. Specifically, to denoise a target image, multiple images, which may be acquired from different subjects, are spatially aligned to the target image, and an NLM-like block matching is performed on these aligned images with the target image as the reference. This will significantly increase the number of matching structures and thus boost the denoising performance. Experiments on both synthetic and real data show that the proposed approach, collaborative non-local means (CNLM), outperforms the classic NLM and yields results with markedly improved structural details.

  15. Apparatus Tests Peeling Of Bonded Rubbery Material

    NASA Technical Reports Server (NTRS)

    Crook, Russell A.; Graham, Robert

    1996-01-01

    Instrumented hydraulic constrained blister-peel apparatus obtains data on degree of bonding between specimen of rubbery material and rigid plate. Growth of blister tracked by video camera, digital clock, pressure transducer, and piston-displacement sensor. Cylinder pressure controlled by hydraulic actuator system. Linear variable-differential transformer (LVDT) and float provide second, independent measure of change in blister volume used as more precise volume feedback in low-growth-rate test.

  16. Chemical peels in active acne and acne scars.

    PubMed

    Kontochristopoulos, Georgios; Platsidaki, Eftychia

    Chemical peeling is a widely used procedure in the management of acne and acne scars. It causes controlled destruction of a part of or the entire epidermis, with or without the dermis, leading to exfoliation and removal of superficial lesions, followed by regeneration of new epidermal and dermal tissues. The most frequently used peeling agents are salicylic acid, glycolic acid, pyruvic acid, lactic acid, mandelic acid, Jessner solution, trichloroacetic acid, and phenol. The appropriate peel is chosen based on the patient's skin type, acne activity, and type of acne scars. Combination peels minimize side effects. In acne scars, chemical peels may be combined with other procedures to achieve better clinical results. A series of chemical peels can lead to significant improvement over a short period, leading to patient satisfaction and maintenance of clinical results. © 2016 Elsevier Inc. All rights reserved.

  17. [How I treat. . . evanescent youth. Dating back using chemical peels].

    PubMed

    Xhauflaire-Uhoda, E; Marcq, V; Piérard-Franchimont, C; Piérard, G E

    2005-10-01

    Chemical peels induce the destruction and exfoliation of the epidermis using selected caustic substances. The specific properties of these agents result in a limited and controlled destruction of the epidermal layers and of the superficial dermis. The choice of the appropriate method, the preparatory phase and the post-peeling care are very important for reaching optimal results. Some types of peeling are routinely used; we briefly report those designed for superficial and medium-depth peels. The topical application of glycolic acid at high concentration induces a superficial peel, resulting in the fading of small wrinkles. Trichloroacetic acid used at an appropriate concentration acts more deeply and is thus a good indication for more severe signs of ageing. These peels have been used for decades by dermatologists to improve the visible signs of skin ageing and to treat some alterations of the cutaneous relief.

  18. Extraction of bromelain from pineapple peels.

    PubMed

    Ketnawa, S; Chaiwut, P; Rawdkuen, S

    2011-08-01

    A large amount of pineapple peel (a by-product) is left over after processing, and it is a potential source for bromelain extraction. Distilled water (DI), DI containing cysteine and ethylenediaminetetraacetic acid (EDTA) (DI-CE), sodium phosphate buffer pH 7.0 (PB) and PB containing cysteine and EDTA (PB-CE) were used as extractants for bromelain from the pineapple peels. The highest bromelain activity was obtained with PB-CE (867 and 1032 units for the Nang Lae and Phu Lae cultivars, respectively). The PB maintained the pH of the extract (pH 5.1-5.7) better than the other extractants. Under sodium dodecyl sulfate polyacrylamide gel electrophoresis, the extract showed protein bands in the range 24-28 kDa. The protein band with a molecular weight of ∼28 kDa produced a clear zone on the blue background in casein-substrate gel electrophoresis. The effects of the bromelain extract on the protein patterns of beef, chicken and squid muscles were also determined. The trichloroacetic acid-soluble peptide content of all the treated muscles increased as the amount of bromelain extract increased. A decrease in myosin heavy chains and actin was observed in all muscle types when the bromelain extract was used. The best extractant for bromelain from pineapple peels was PB-CE. Moreover, the bromelain extract could be used as a muscle food tenderizing agent in the food industry.

  19. The peel test in experimental adhesive fracture mechanics

    NASA Technical Reports Server (NTRS)

    Anderson, G. P.; Devries, K. L.; Williams, M. L.

    1974-01-01

    Several testing methods have been proposed for obtaining critical energy release rate or adhesive fracture energy in bond systems. These tests include blister, cone, lap shear, and peel tests. Peel tests have been used for many years to compare relative strengths of different adhesives, different surface preparation techniques, etc. The present work demonstrates the potential use of the peel test for obtaining adhesive fracture energy values.

  20. Comparative study of 15% TCA peel versus 35% glycolic acid peel for the treatment of melasma

    PubMed Central

    Puri, Neerja

    2012-01-01

    Background: Chemical peels are the mainstay of a cosmetic practitioner's armamentarium because they can be used to treat some skin disorders and can provide aesthetic benefit. Objectives: To compare 15% TCA peel and 35% glycolic acid peel for the treatment of melasma. Material and Methods: We selected 30 participants with melasma, aged between 20 and 50 years, from the dermatology outpatient department and treated equal numbers with 15% TCA and 35% glycolic acid. Results: The subjective response as graded by the patients showed a good or very good response in 70% of participants in the glycolic acid group and 64% in the TCA group. Conclusions: There was no statistically significant difference in efficacy between the two groups for the treatment of melasma. PMID:23130283

  1. Efficient architecture for adaptive directional lifting-based wavelet transform

    NASA Astrophysics Data System (ADS)

    Yin, Zan; Zhang, Li; Shi, Guangming

    2010-07-01

    The adaptive directional lifting-based wavelet transform (ADL) outperforms conventional lifting in both image compression and denoising. However, no architecture has been proposed for implementing it in hardware because of its high computational complexity and large internal memory requirements. In this paper, we propose a four-stage pipelined architecture for two-dimensional (2D) ADL with fast computation and high data throughput. The proposed architecture comprises column direction estimation, column lifting, row direction estimation and row lifting, which are performed in parallel in a pipelined mode. Since the column-processed data are transposed, the row processor can reuse the column processor, which reduces design complexity. In the lifting step, predict and update are also performed in parallel. For an 8×8 image sub-block, the proposed architecture completes the ADL forward transform within 78 clock cycles. The architecture is implemented on a Xilinx Virtex5 device, on which the frequency reaches 367 MHz. The processing time is 212.5 ns, which meets the requirements of real-time systems.

  2. Wavelet Transforms using VTK-m

    SciTech Connect

    Li, Shaomeng; Sewell, Christopher Meyer

    2016-09-27

    These are a set of slides that deal with the topics of wavelet transforms using VTK-m. First, wavelets are discussed and detailed, then VTK-m is discussed and detailed, then wavelets and VTK-m are looked at from a performance comparison, then from an accuracy comparison, and finally lessons learned, conclusion, and what is next. Lessons learned are the following: Launching worklets is expensive; Natural logic of performing 2D wavelet transform: Repeat the same 1D wavelet transform on every row, repeat the same 1D wavelet transform on every column, invoke the 1D wavelet worklet every time: num_rows x num_columns; VTK-m approach of performing 2D wavelet transform: Create a worklet for 2D that handles both rows and columns, invoke this new worklet only one time; Fast calculation, but cannot reuse 1D implementations.

  3. Edge Detection Using a Complex Wavelet

    DTIC Science & Technology

    1993-12-01

    A complex wavelet of the form Psi(x, y) = C(x + jy) exp(-p(x^2 + y^2)) is used in the continuous wavelet transform to obtain edges from a digital image...and x and y are position variables. The square root of the sum of the squares of the real and imaginary parts of the wavelet transform is used to...radar images and the resulting images are shown. Continuous wavelet transform, Digital image.

  4. Doppler ultrasound wall removal based on the spatial correlation of wavelet coefficients.

    PubMed

    Jin, Dawei; Wang, Yuanyuan

    2007-11-01

    In medical Doppler ultrasound systems, a high-pass filter is commonly used to reject echoes from the vessel wall. However, this leads to the loss of information from low-velocity blood flow. Here a spatially selective noise filtration algorithm, combined with threshold denoising based on wavelet coefficients, is applied to estimate the wall clutter. The blood flow signal is then extracted by subtracting the wall clutter from the mixed signal. Experiments on computer-simulated signals with various clutter-to-blood power ratios indicate that this method achieves a lower mean relative spectral error than high-pass filtering and two other previously published separation methods, based on recursive principal component analysis and on irregular sampling with iterative reconstruction, respectively. The method also performs well when applied to in vivo carotid signals. All results suggest that this approach can be implemented as a clutter rejection filter in Doppler ultrasound instruments.
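
    A rough sketch of the "spatial correlation of wavelet coefficients" idea is given below in Python with PyWavelets: products of stationary-wavelet detail coefficients at adjacent scales are rescaled and compared with the coefficients themselves to flag samples dominated by strong, scale-persistent components (clutter and edges). The normalisation and masking rule are simplified stand-ins for the published criterion, and pywt.swt requires the signal length to be a multiple of 2**level.

        # Adjacent-scale correlation masks from a stationary wavelet transform.
        import numpy as np
        import pywt

        def interscale_correlation_masks(signal, wavelet="db4", level=4):
            coeffs = pywt.swt(signal, wavelet, level=level)       # list of (cA, cD) pairs
            details = [d for _, d in coeffs]
            masks = []
            for j in range(len(details) - 1):
                corr = details[j] * details[j + 1]                # product across adjacent scales
                # Rescale the correlation to the energy of the current scale.
                corr = corr * np.sqrt(np.sum(details[j] ** 2) / (np.sum(corr ** 2) + 1e-12))
                masks.append(np.abs(corr) > np.abs(details[j]))   # True where "signal-like"
            masks.append(masks[-1].copy())                        # reuse the last mask for the coarsest level
            return masks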

  5. Classification of acoustic emission signals using wavelets and Random Forests : Application to localized corrosion

    NASA Astrophysics Data System (ADS)

    Morizet, N.; Godin, N.; Tang, J.; Maillet, E.; Fregonese, M.; Normand, B.

    2016-03-01

    This paper proposes a novel approach to classifying acoustic emission (AE) signals from corrosion experiments, even when they are embedded in a noisy environment. To validate the new methodology, synthetic data are first used in an in-depth analysis comparing Random Forests (RF) with the k-Nearest Neighbor (k-NN) algorithm. Moreover, a new evaluation tool called the alter-class matrix (ACM) is introduced to simulate different degrees of uncertainty on labeled data for supervised classification. Tests on real cases involving noise and crevice corrosion are then conducted by preprocessing the waveforms, including wavelet denoising, and extracting a rich set of features as input to the RF algorithm. To this end, a software package called RF-CAM has been developed. Results show that this approach is very efficient on ground-truth data and is also very promising on real data, especially in terms of reliability, performance and speed, which are important criteria for the chemical industry.

  6. Despeckling SRTM And Other Topographic Data With A Denoising Algorithm

    NASA Astrophysics Data System (ADS)

    Stevenson, J. A.; Sun, X.; Mitchell, N. C.

    2012-12-01

    Noise in topographic data obscures features and increases error in geomorphic products calculated from DEMs. DEMs produced by radar remote sensing, such as SRTM, are frequently used for geomorphological studies, but they often contain speckle noise that may significantly lower the quality of geomorphometric analyses. We introduce here an algorithm that denoises three-dimensional objects while preserving sharp features. It is free to download and simple to use. In this study the algorithm is applied to topographic data (synthetic landscapes, SRTM, TOPSAR) and the results are compared against those obtained with a mean filter, using LiDAR data as ground truth for the natural datasets. The level of denoising is controlled by two parameters: the threshold (T), which controls the sharpness of the features to be preserved, and the number of iterations (n), which controls how much the data are changed. The optimum settings depend on the nature of the topography and of the noise to be removed, but are typically in the range T = 0.87-0.99 and n = 1-10. If the threshold is too high, noise is preserved. A lower threshold setting is used where noise is spatially uncorrelated (e.g. TOPSAR), whereas in some other datasets (e.g. SRTM), where filtering of the data during processing has introduced spatial correlation to the noise, higher thresholds can be used. Compared to data filtered to an equivalent level with a mean filter, data smoothed by the denoising algorithm of Sun et al. [Sun, X., Rosin, P.L., Martin, R.R., Langbein, F.C., 2007. Fast and effective feature-preserving mesh denoising. IEEE Transactions on Visualisation and Computer Graphics 13, 925-938] are closer to the original data and to the ground truth. Changes to the data are smaller and less correlated with topographic features. Furthermore, the feature-preserving nature of the algorithm allows significant smoothing to be applied to flat areas of topography while limiting the alterations made in mountainous regions, with clear benefits

  7. Wavelet-Based Multiresolution Analyses of Signals

    DTIC Science & Technology

    1992-06-01

    classification. Some signals, notably those of a transient nature, are inherently difficult to analyze with these traditional tools. The Discrete Wavelet Transform has...scales. This thesis investigates dyadic discrete wavelet decompositions of signals. A new multiphase wavelet transform is proposed and investigated. The

  8. New trends in despeckling: undecimated-wavelet shrinkage and fuzzy matching-pursuits estimation

    NASA Astrophysics Data System (ADS)

    Aiazzi, Bruno; Alparone, Luciano; Argenti, Fabrizio; Baronti, Stefano

    2002-02-01

    wavelet decomposition avoids the typical ringing impairments produced by critically-sampled wavelet-based denoising.

  9. Automatic parameter prediction for image denoising algorithms using perceptual quality features

    NASA Astrophysics Data System (ADS)

    Mittal, Anish; Moorthy, Anush K.; Bovik, Alan C.

    2012-03-01

    A natural scene statistics (NSS) based blind image denoising approach is proposed, where denoising is performed without knowledge of the noise variance present in the image. We show how such a parameter estimate can be used to perform blind denoising by combining blind parameter estimation with a state-of-the-art denoising algorithm [1]. Our experiments show that for all noise variances simulated on varied image content, our approach is almost always statistically superior to the reference BM3D implementation in terms of perceived visual quality at the 95% confidence level.

  10. Denoising in Contrast-Enhanced X-ray Images

    NASA Astrophysics Data System (ADS)

    Jeon, Gwanggil

    2016-12-01

    In this paper, we propose a denoising and contrast-enhancement method for medical images. The main purpose of medical image improvement is to transform low-contrast data into higher contrast and to reduce high noise levels. To meet this goal, we propose a noise-level estimation method, whereby the noise level is estimated by computing the standard deviation and variance in a local block. The obtained noise level is then used as an input parameter for the block-matching and 3D filtering (BM3D) algorithm, which performs the denoising. The noise-level estimation step is important because the BM3D algorithm does not perform well without correct noise-level information. Simulation results confirm that the proposed method outperforms other benchmarks in both objective and visual performance.
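
    A minimal version of the block-based noise estimate described above might look like the following NumPy sketch: the standard deviation is computed in small local blocks and a low percentile over the blocks is taken as the global noise level, on the assumption that the flattest blocks are dominated by noise. The block size and percentile are illustrative, and the resulting sigma would then be handed to an existing BM3D implementation.

        import numpy as np

        def estimate_noise_sigma(image, block=8, percentile=10):
            h, w = image.shape
            stds = []
            for i in range(0, h - block + 1, block):
                for j in range(0, w - block + 1, block):
                    stds.append(np.std(image[i:i + block, j:j + block]))
            return np.percentile(stds, percentile)   # sigma estimate for the denoiser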

  11. Diffusion Weighted Image Denoising Using Overcomplete Local PCA

    PubMed Central

    Manjón, José V.; Coupé, Pierrick; Concha, Luis; Buades, Antonio; Collins, D. Louis; Robles, Montserrat

    2013-01-01

    Diffusion Weighted Images (DWI) normally show a low Signal to Noise Ratio (SNR) due to the presence of noise from the measurement process that complicates and biases the estimation of quantitative diffusion parameters. In this paper, a new denoising methodology is proposed that takes into consideration the multicomponent nature of multi-directional DWI datasets such as those employed in diffusion imaging. This new filter reduces random noise in multicomponent DWI by locally shrinking less significant Principal Components using an overcomplete approach. The proposed method is compared with state-of-the-art methods using synthetic and real clinical MR images, showing improved performance in terms of denoising quality and estimation of diffusion parameters. PMID:24019889

  12. A simple filter circuit for denoising biomechanical impact signals.

    PubMed

    Subramaniam, Suba R; Georgakis, Apostolos

    2009-01-01

    We present a simple scheme for denoising non-stationary biomechanical signals with the aim of accurately estimating their second derivative (acceleration). The method is based on filtering in fractional Fourier domains using well-known low-pass filters in a way that amounts to a time-varying cut-off threshold. The resulting algorithm is linear and its design is facilitated by the relationship between the fractional Fourier transform and joint time-frequency representations. The implemented filter circuit employs only three low-order filters while its efficiency is further supported by the low computational complexity of the fractional Fourier transform. The results demonstrate that the proposed method can denoise the signals effectively and is more robust against noise as compared to conventional low-pass filters.

  13. Fast non local means denoising for 3D MR images.

    PubMed

    Coupé, Pierrick; Yger, Pierre; Barillot, Christian

    2006-01-01

    One critical issue in the context of image restoration is the problem of noise removal while keeping the integrity of relevant image information. Denoising is a crucial step to increase image conspicuity and to improve the performance of all the processing needed for quantitative imaging analysis. The method proposed in this paper is based on an optimized version of the Non Local (NL) Means algorithm. This approach uses the natural redundancy of information in the image to remove the noise. Tests were carried out on synthetic datasets and on real 3T MR images. The results show that the NL-means approach outperforms other classical denoising methods, such as the Anisotropic Diffusion Filter and Total Variation.
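
    For orientation, a naive (unoptimised) patch-based NL-means for a single 2-D slice is sketched below in NumPy. The optimised voxel pre-selection and blockwise 3-D implementation from the paper are not reproduced, and the patch size, search window and smoothing parameter h are illustrative values.

        # Naive non-local means: every pixel is a weighted average of pixels in a
        # search window, with weights given by patch similarity.
        import numpy as np

        def nl_means(image, patch=3, search=7, h=0.1):
            pad, half = patch // 2, search // 2
            padded = np.pad(image.astype(float), pad + half, mode="reflect")
            out = np.zeros_like(image, dtype=float)
            for i in range(image.shape[0]):
                for j in range(image.shape[1]):
                    ci, cj = i + pad + half, j + pad + half
                    ref = padded[ci - pad:ci + pad + 1, cj - pad:cj + pad + 1]
                    weights, values = [], []
                    for di in range(-half, half + 1):
                        for dj in range(-half, half + 1):
                            ni, nj = ci + di, cj + dj
                            cand = padded[ni - pad:ni + pad + 1, nj - pad:nj + pad + 1]
                            d2 = np.mean((ref - cand) ** 2)        # patch distance
                            weights.append(np.exp(-d2 / (h * h)))
                            values.append(padded[ni, nj])
                    w = np.asarray(weights)
                    out[i, j] = np.dot(w, values) / w.sum()
            return out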

  14. Recent advances in wavelet technology

    NASA Technical Reports Server (NTRS)

    Wells, R. O., Jr.

    1994-01-01

    Wavelet research has been developing rapidly over the past five years, and in particular in the academic world there has been significant activity at numerous universities. In the industrial world, there have been developments at Aware, Inc., Lockheed, Martin-Marietta, TRW, Kodak, Exxon, and many others. The government agencies supporting wavelet research and development include ARPA, ONR, AFOSR, NASA, and many others. The literature of the past five years includes a book indexing the past decade's citations on this subject, with over 1,000 references and abstracts.

  15. Image Segmentation Using Affine Wavelets

    DTIC Science & Technology

    1991-12-12

    [List-of-figures fragments from the scanned report: Typical Wavelet Function and its Fourier Transform; Orientation of Wavelet Decomposition Filters in the Fourier Domain; Dataflow Diagram of the Wavelet Decomposition Program.] ..."global spatial relationships, as does a Fourier transform" [11]. The main thrust of Daugman's article [11] was to show the utility of a neural network

  16. [Quantitative evaluation of soil hyperspectra denoising with different filters].

    PubMed

    Huang, Ming-Xiang; Wang, Ke; Shi, Zhou; Gong, Jian-Hua; Li, Hong-Yi; Chen, Jie-Liang

    2009-03-01

    The noise distribution of soil hyperspectra measured by an ASD FieldSpec Pro FR was characterized, and the quantitative evaluation of spectral denoising with six filters was then compared. Interpretation of the soil hyperspectra together with the continuum-removed, first-order differential and high-frequency curves shows that the UV/VNIR region (350-1050 nm) exhibits hardly any noise, except for about 40 nm at the beginning near 350 nm. The SWIR region (1000-2500 nm), however, shows a different noise distribution; in particular, the latter half of SWIR 2 (1800-2500 nm) is noisier, and the overlap regions between the three spectrometers carry more noise than the neighboring spectrum. Six filters were chosen for spectral denoising. A smoothing index (SI), a horizontal feature reservation index (HFRI) and a vertical feature reservation index (VFRI) were designed to evaluate the denoising performance of these filters. Comparison of the indexes shows that the WD and MA filters are the optimal choices, balancing the trade-off between smoothing ability and feature-reservation ability. Furthermore, the first-order differential data of the 66 soil spectra denoised by the six filters were used as inputs to the same PLSR model to predict sand content. The different prediction accuracies produced by the different filters show that the filter's smoothing ability, rather than its feature-reservation ability, is the principal factor influencing accuracy. The study can benefit spectral preprocessing and analysis and also provides a scientific foundation for related spectroscopy applications.

  17. Robust L1 PCA and application in image denoising

    NASA Astrophysics Data System (ADS)

    Gao, Junbin; Kwan, Paul W. H.; Guo, Yi

    2007-11-01

    The so-called robust L1 PCA was introduced in our recent work [1] based on the L1 noise assumption. Due to the heavy tail characteristics of the L1 distribution, the proposed model has been proved much more robust against data outliers. In this paper, we further demonstrate how the learned robust L1 PCA model can be used to denoise image data.

  18. Optimally stabilized PET image denoising using trilateral filtering.

    PubMed

    Mansoor, Awais; Bagci, Ulas; Mollura, Daniel J

    2014-01-01

    Low-resolution and signal-dependent noise distribution in positron emission tomography (PET) images makes denoising process an inevitable step prior to qualitative and quantitative image analysis tasks. Conventional PET denoising methods either over-smooth small-sized structures due to resolution limitation or make incorrect assumptions about the noise characteristics. Therefore, clinically important quantitative information may be corrupted. To address these challenges, we introduced a novel approach to remove signal-dependent noise in the PET images where the noise distribution was considered as Poisson-Gaussian mixed. Meanwhile, the generalized Anscombe's transformation (GAT) was used to stabilize varying nature of the PET noise. Other than noise stabilization, it is also desirable for the noise removal filter to preserve the boundaries of the structures while smoothing the noisy regions. Indeed, it is important to avoid significant loss of quantitative information such as standard uptake value (SUV)-based metrics as well as metabolic lesion volume. To satisfy all these properties, we extended bilateral filtering method into trilateral filtering through multiscaling and optimal Gaussianization process. The proposed method was tested on more than 50 PET-CT images from various patients having different cancers and achieved the superior performance compared to the widely used denoising techniques in the literature.

  19. 2D Orthogonal Locality Preserving Projection for Image Denoising.

    PubMed

    Shikkenawis, Gitam; Mitra, Suman K

    2016-01-01

    Sparse representations using transform-domain techniques are widely used for better interpretation of the raw data. Orthogonal locality preserving projection (OLPP) is a linear technique that tries to preserve local structure of data in the transform domain as well. Vectorized nature of OLPP requires high-dimensional data to be converted to vector format, hence may lose spatial neighborhood information of raw data. On the other hand, processing 2D data directly, not only preserves spatial information, but also improves the computational efficiency considerably. The 2D OLPP is expected to learn the transformation from 2D data itself. This paper derives mathematical foundation for 2D OLPP. The proposed technique is used for image denoising task. Recent state-of-the-art approaches for image denoising work on two major hypotheses, i.e., non-local self-similarity and sparse linear approximations of the data. Locality preserving nature of the proposed approach automatically takes care of self-similarity present in the image while inferring sparse basis. A global basis is adequate for the entire image. The proposed approach outperforms several state-of-the-art image denoising approaches for gray-scale, color, and texture images.

  20. Streak image denoising and segmentation using adaptive Gaussian guided filter.

    PubMed

    Jiang, Zhuocheng; Guo, Baoping

    2014-09-10

    In streak tube imaging lidar (STIL), streak images are obtained using a CCD camera. However, noise in the captured streak images can greatly affect the quality of reconstructed 3D contrast and range images. The greatest challenge for streak image denoising is reducing the noise while preserving details. In this paper, we propose an adaptive Gaussian guided filter (AGGF) for noise removal and detail enhancement of streak images. The proposed algorithm is based on a guided filter (GF) and part of an adaptive bilateral filter (ABF). In the AGGF, the details are enhanced by optimizing the offset parameter. AGGF-denoised streak images are significantly sharper than those denoised by the GF. Moreover, the AGGF is a fast linear time algorithm achieved by recursively implementing a Gaussian filter kernel. Experimentally, AGGF demonstrates its capacity to preserve edges and thin structures and outperforms the existing bilateral filter and domain transform filter in terms of both visual quality and peak signal-to-noise ratio performance.
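
    As background, a plain (non-adaptive) guided filter in the sense of He et al. can be written in a few lines of NumPy/SciPy, as below; the adaptive offset optimisation and the recursive Gaussian kernel that distinguish the AGGF are not included, so this only shows the baseline the paper builds on.

        # Plain guided filter: local linear model of the output in terms of the guide.
        import numpy as np
        from scipy import ndimage

        def guided_filter(guide, src, radius=4, eps=1e-3):
            size = 2 * radius + 1
            mean = lambda x: ndimage.uniform_filter(x, size)
            mean_i, mean_p = mean(guide), mean(src)
            cov_ip = mean(guide * src) - mean_i * mean_p
            var_i = mean(guide * guide) - mean_i * mean_i
            a = cov_ip / (var_i + eps)          # local linear slope
            b = mean_p - a * mean_i             # local linear offset
            return mean(a) * guide + mean(b)    # averaged coefficients applied to the guide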

  1. Microseismic event denoising via adaptive directional vector median filters

    NASA Astrophysics Data System (ADS)

    Zheng, Jing; Lu, Ji-Ren; Jiang, Tian-Qi; Liang, Zhe

    2017-01-01

    We present a novel denoising scheme based on Radon transform-driven adaptive directional vector median filters (AD-VMF) to suppress noise in microseismic downhole datasets. AD-VMF involves three major steps for microseismic downhole data processing: (i) applying the Radon transform to the microseismic data to obtain the wave parameters, (ii) performing the S-transform to determine the filter parameters, and (iii) applying those parameters in a vector median filter (VMF) to denoise the data. Steps (i) and (ii) realize automatic direction detection. The proposed algorithm is tested with synthetic and field datasets recorded with a vertical array of receivers. The P-wave and S-wave direct arrivals are properly denoised in records with poor signal-to-noise ratio (SNR). In the simulation case, we also evaluate the performance using the mean square error (MSE) as a function of SNR. The results show that the distortion introduced by the proposed method is very low, even when the SNR is below 0 dB.

  2. Two-direction nonlocal model for image denoising.

    PubMed

    Zhang, Xuande; Feng, Xiangchu; Wang, Weiwei

    2013-01-01

    Similarities inherent in natural images have been widely exploited for image denoising and other applications. In fact, if a cluster of similar image patches is rearranged into a matrix, similarities exist both between columns and rows. Using the similarities, we present a two-directional nonlocal (TDNL) variational model for image denoising. The solution of our model consists of three components: one component is a scaled version of the original observed image and the other two components are obtained by utilizing the similarities. Specifically, by using the similarity between columns, we get a nonlocal-means-like estimation of the patch with consideration to all similar patches, while the weights are not the pairwise similarities but a set of clusterwise coefficients. Moreover, by using the similarity between rows, we also get nonlocal-autoregression-like estimations for the center pixels of the similar patches. The TDNL model leads to an alternative minimization algorithm. Experiments indicate that the model can perform on par with or better than the state-of-the-art denoising methods.

  3. Stacked Convolutional Denoising Auto-Encoders for Feature Representation.

    PubMed

    Du, Bo; Xiong, Wei; Wu, Jia; Zhang, Lefei; Zhang, Liangpei; Tao, Dacheng

    2016-03-16

    Deep networks have achieved excellent performance in learning representations from visual data. However, supervised deep models such as convolutional neural networks require large quantities of labeled data, which are very expensive to obtain. To solve this problem, this paper proposes an unsupervised deep network, called the stacked convolutional denoising auto-encoders, which can map images to hierarchical representations without any label information. The network, optimized by layer-wise training, is constructed by stacking layers of denoising auto-encoders in a convolutional way. In each layer, high-dimensional feature maps are generated by convolving features of the lower layer with kernels learned by a denoising auto-encoder. The auto-encoder is trained on patches extracted from feature maps in the lower layer to learn robust feature detectors. To better train the large network, a layer-wise whitening technique is introduced into the model. Before each convolutional layer, a whitening layer is embedded to sphere the input data. Through layers of mapping, raw images are transformed into high-level feature representations that boost the performance of the subsequent support vector machine classifier. The proposed algorithm is evaluated in extensive experiments and demonstrates classification performance superior to state-of-the-art unsupervised networks.
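
    A single convolutional denoising auto-encoder layer of the kind stacked in this work might be sketched in PyTorch as follows. The layer-wise training schedule, whitening layers and final SVM classifier are omitted, and the layer sizes, noise level and optimiser settings are arbitrary.

        # One convolutional denoising auto-encoder: corrupt the input, reconstruct the clean target.
        import torch
        import torch.nn as nn

        class ConvDAE(nn.Module):
            def __init__(self, channels=1, features=32):
                super().__init__()
                self.encoder = nn.Sequential(nn.Conv2d(channels, features, 5, padding=2), nn.ReLU())
                self.decoder = nn.Conv2d(features, channels, 5, padding=2)

            def forward(self, x):
                return self.decoder(self.encoder(x))

        model = ConvDAE()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

        def train_step(clean_batch, noise_std=0.1):
            noisy = clean_batch + noise_std * torch.randn_like(clean_batch)
            loss = nn.functional.mse_loss(model(noisy), clean_batch)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            return loss.item()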

  4. Developments in ethanol production from citrus peel waste

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Each year, the Florida citrus juice industry produces about 3.5~5.0 million tons of wet peel waste, which are currently dried and sold as cattle feed, often at a loss, to dispose of the waste residual. Profitability would be greatly improved if the peel waste could be used to produce higher value pr...

  5. Microwave extraction of citrus peel to release pectin

    Technology Transfer Automated Retrieval System (TEKTRAN)

    After removal of soluble sugars and other compounds by washing, citrus peel is largely composed of pectin, cellulose and hemicellulose. In order to utilize the greatest amount of citrus peel product, it would appear reasonable that one or all three of these polysaccharides be converted to a useful m...

  6. Feasibility of Jujube peeling using novel infrared radiation heating technology

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Infrared (IR) radiation heating has promising potential as a sustainable and effective method to eliminate the use of water and chemicals in the jujube-peeling process and enhance the quality of peeled products. The objective of this study was to investigate the feasibility of using IR he...

  7. Wavelets based on Hermite cubic splines

    NASA Astrophysics Data System (ADS)

    Cvejnová, Daniela; Černá, Dana; Finěk, Václav

    2016-06-01

    In 2000, W. Dahmen et al. designed biorthogonal multi-wavelets adapted to the interval [0,1] on the basis of Hermite cubic splines. In recent years, several simpler constructions of wavelet bases based on Hermite cubic splines were proposed. We focus here on wavelet bases with respect to which both the mass and stiffness matrices are sparse in the sense that the number of nonzero elements in any column is bounded by a constant. Then, a matrix-vector multiplication in adaptive wavelet methods can be performed exactly with linear complexity for any second-order differential equation with constant coefficients. In this contribution, we shortly review these constructions and propose a new wavelet which leads to improved Riesz constants. The wavelets have four vanishing moments.

  8. Antibacterial activity of Citrus reticulata peel extracts.

    PubMed

    Jayaprakasha, G K; Negi, P S; Sikder, S; Rao, L J; Sakariah, K K

    2000-01-01

    Citrus peels were successively extracted with hexane, chloroform and acetone using a Soxhlet extractor. The hexane and chloroform extracts were fractionated into alcohol-soluble and alcohol-insoluble fractions. These fractions were tested against different gram-positive and gram-negative bacteria. The EtOH-soluble fraction was found to be the most effective. Fractionation of the EtOH-soluble fraction on a silica gel column yielded three polymethoxylated flavones, namely desmethylnobiletin, nobiletin and tangeretin. Their structures were confirmed by UV, 1H and 13C NMR, and mass spectral studies. The findings indicate the potential of these natural compounds as biopreservatives in food applications.

  9. Manual for Program PSTRESS: Peel stress computation

    NASA Technical Reports Server (NTRS)

    Barkey, Derek A.; Madan, Ram C.

    1987-01-01

    Described is the use of the interactive FORTRAN computer program PSTRESS, which computes a closed form solution for two bonded plates subjected to applied moments, vertical shears, and in-plane forces. The program calculates in-plane stresses in the plates, deflections of the plates, and peel and shear stresses in the adhesive. The document briefly outlines the analytical method used by PSTRESS, describes the input and output of the program, and presents a sample analysis. The results of the latter are shown to be within a few percent of results obtained using a NASTRAN finite element analysis. An appendix containing a listing of PSTRESS is included.

  10. A Wavelet Perspective on the Allan Variance.

    PubMed

    Percival, Donald B

    2016-04-01

    The origins of the Allan variance trace back 50 years to two seminal papers, one by Allan (1966) and the other by Barnes (1966). Since then, the Allan variance has played a leading role in the characterization of high-performance time and frequency standards. Wavelets first arose in the early 1980s in the geophysical literature, and the discrete wavelet transform (DWT) became prominent in the late 1980s in the signal processing literature. Flandrin (1992) briefly documented a connection between the Allan variance and a wavelet transform based upon the Haar wavelet. Percival and Guttorp (1994) noted that one popular estimator of the Allan variance, the maximal overlap estimator, can be interpreted in terms of a version of the DWT now widely referred to as the maximal overlap DWT (MODWT). In particular, when the MODWT is based on the Haar wavelet, the variance of the resulting wavelet coefficients, the wavelet variance, is identical to one-half of the Allan variance. The theory behind the wavelet variance can thus deepen our understanding of the Allan variance. In this paper, we review basic wavelet variance theory with an emphasis on the Haar-based wavelet variance and its connection to the Allan variance. We then note that estimation theory for the wavelet variance offers a means of constructing asymptotically correct confidence intervals (CIs) for the Allan variance without reverting to the common practice of specifying a power-law noise type a priori. We also review recent work on specialized estimators of the wavelet variance that are of interest when some observations are missing (gappy data) or in the presence of contamination (rogue observations or outliers). It is a simple matter to adapt these estimators to become estimators of the Allan variance. Finally, we note that wavelet variances based upon wavelets other than the Haar offer interesting generalizations of the Allan variance.
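
    A minimal numpy sketch of the factor-of-one-half relationship described above, assuming fractional-frequency data at unit sampling intervals; the function names and the white-noise test signal are illustrative, not taken from the paper.

```python
import numpy as np

def allan_variance(y, m):
    """Maximal-overlap Allan variance of fractional-frequency data y at averaging factor m."""
    csum = np.concatenate(([0.0], np.cumsum(y)))
    block = (csum[m:] - csum[:-m]) / m      # block[k] = mean(y[k:k+m])
    d = block[m:] - block[:-m]              # differences of adjacent block averages
    return 0.5 * np.mean(d ** 2)

def haar_wavelet_variance(y, m):
    """Haar MODWT-style wavelet variance at scale m (m a power of two for a true MODWT level)."""
    csum = np.concatenate(([0.0], np.cumsum(y)))
    block = (csum[m:] - csum[:-m]) / m
    w = 0.5 * (block[:-m] - block[m:])      # Haar coefficients: half the block-average difference
    return np.mean(w ** 2)

rng = np.random.default_rng(0)
y = rng.standard_normal(100_000)            # white frequency noise
for m in (1, 2, 4, 8):
    av, wv = allan_variance(y, m), haar_wavelet_variance(y, m)
    print(m, av, wv, wv / av)               # the ratio should be close to 0.5
```

    For white frequency noise, the printed ratio stays near 0.5 at every averaging factor, matching the wavelet-variance/Allan-variance relationship noted in the abstract.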

  11. Multiplexed fibre Fizeau interferometer and fibre Bragg grating sensor system for simultaneous measurement of quasi-static strain and temperature using discrete wavelet transform

    NASA Astrophysics Data System (ADS)

    Wong, Allan C. L.; Childs, Paul A.; Peng, Gang-Ding

    2006-02-01

    We present a multiplexed fibre Fizeau interferometer (FFI) and fibre Bragg grating (FBG) sensor system for simultaneous measurement of quasi-static strain and temperature. A combined spatial-frequency and wavelength-division multiplexing scheme is employed to multiplex the FFI and FBG sensors. A demodulation technique based on the discrete wavelet transform with signal processing enhancements is used to determine the measurand-induced physical changes of the sensors. The noise associated with the sensor signal is reduced by the block-level-thresholding wavelet denoising method, which is applied via the demodulation technique. This sensor system yields a high accuracy and resolution, and low crosstalk. It is well suited for long-term quasi-static measurements, especially for the structural health monitoring of large-scale structures.

  12. Visibility of wavelet quantization noise

    NASA Technical Reports Server (NTRS)

    Watson, A. B.; Yang, G. Y.; Solomon, J. A.; Villasenor, J.

    1997-01-01

    The discrete wavelet transform (DWT) decomposes an image into bands that vary in spatial frequency and orientation. It is widely used for image compression. Measures of the visibility of DWT quantization errors are required to achieve optimal compression. Uniform quantization of a single band of coefficients results in an artifact that we call DWT uniform quantization noise; it is the sum of a lattice of random amplitude basis functions of the corresponding DWT synthesis filter. We measured visual detection thresholds for samples of DWT uniform quantization noise in Y, Cb, and Cr color channels. The spatial frequency of a wavelet is r·2^(-lambda), where r is the display visual resolution in pixels/degree and lambda is the wavelet level. Thresholds increase rapidly with wavelet spatial frequency. Thresholds also increase from Y to Cr to Cb, and with orientation from lowpass to horizontal/vertical to diagonal. We construct a mathematical model for DWT noise detection thresholds that is a function of level, orientation, and display visual resolution. This allows calculation of a "perceptually lossless" quantization matrix for which all errors are in theory below the visual threshold. The model may also be used as the basis for adaptive quantization schemes.

  13. Image encoding with triangulation wavelets

    NASA Astrophysics Data System (ADS)

    Hebert, D. J.; Kim, HyungJun

    1995-09-01

    We demonstrate some wavelet-based image processing applications of a class of simplicial grids arising in finite element computations and computer graphics. The cells of a triangular grid form the set of leaves of a binary tree and the nodes of a directed graph consisting of a single cycle. The leaf cycle of a uniform grid forms a pattern for pixel image scanning and for coherent computation of coefficients of splines and wavelets. A simple form of image encoding is accomplished with a 1D quadrature mirror filter whose coefficients represent an expansion of the image in terms of 2D Haar wavelets with triangular support. A combination of the leaf cycle and an inherent quadtree structure allows efficient neighbor finding, grid refinement, tree pruning and storage. Pruning of the simplex tree yields a partially compressed image which requires no decoding, but rather may be rendered as a shaded triangulation. This structure and its generalization to n-dimensions form a convenient setting for wavelet analysis and computations based on simplicial grids.

  14. Wavelet Approximation in Data Assimilation

    NASA Technical Reports Server (NTRS)

    Tangborn, Andrew; Atlas, Robert (Technical Monitor)

    2002-01-01

    Estimation of the state of the atmosphere with the Kalman filter remains a distant goal because of the high computational cost of evolving the error covariance for both linear and nonlinear systems. Wavelet approximation is presented here as a possible solution that efficiently compresses both global and local covariance information. We demonstrate the compression characteristics on the error correlation field from a global two-dimensional chemical constituent assimilation, and implement an adaptive wavelet approximation scheme on the assimilation of the one-dimensional Burgers' equation. In the former problem, we show that 99% of the error correlation can be represented by just 3% of the wavelet coefficients, with good representation of localized features. In the Burgers' equation assimilation, the discrete linearized equations (tangent linear model) and analysis covariance are projected onto a wavelet basis and truncated to just 6% of the coefficients. A nearly optimal forecast is achieved and we show that errors due to truncation of the dynamics are no greater than the errors due to covariance truncation.
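
    A small sketch of the kind of coefficient truncation described above, using PyWavelets on a synthetic correlation-like field; the wavelet family, retention fraction, and toy field are assumptions for illustration, not the assimilation system's actual configuration.

```python
import numpy as np
import pywt

def wavelet_compress(field, wavelet='db4', keep=0.03):
    """Keep only the largest `keep` fraction of 2-D wavelet coefficients and reconstruct."""
    coeffs = pywt.wavedec2(field, wavelet)
    arr, slices = pywt.coeffs_to_array(coeffs)
    k = max(1, int(keep * arr.size))
    thresh = np.sort(np.abs(arr).ravel())[-k]           # magnitude of the k-th largest coefficient
    arr_small = np.where(np.abs(arr) >= thresh, arr, 0.0)
    coeffs_small = pywt.array_to_coeffs(arr_small, slices, output_format='wavedec2')
    return pywt.waverec2(coeffs_small, wavelet)

# toy "error correlation" field: smooth background plus a localized bump
x = np.linspace(-1, 1, 128)
X, Y = np.meshgrid(x, x)
corr = np.exp(-(X**2 + Y**2) / 0.5) + 0.5 * np.exp(-((X - 0.4)**2 + (Y + 0.3)**2) / 0.01)
approx = wavelet_compress(corr, keep=0.03)
print("relative L2 error:", np.linalg.norm(approx - corr) / np.linalg.norm(corr))
```

    Retaining only the largest-magnitude coefficients preserves both the smooth background and the localized feature, which is the behaviour the abstract highlights.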

  15. Denoised and texture enhanced MVCT to improve soft tissue conspicuity

    SciTech Connect

    Sheng, Ke Qi, Sharon X.; Gou, Shuiping; Wu, Jiaolong

    2014-10-15

    Purpose: MVCT images have been used in TomoTherapy treatment to align patients based on bony anatomies, but their usefulness for soft tissue registration, delineation, and adaptive radiation therapy is limited due to insignificant photoelectric interaction components and the presence of noise resulting from the low detector quantum efficiency of megavoltage x-rays. Algebraic reconstruction with sparsity regularizers as well as local denoising methods has not significantly improved soft tissue conspicuity. The authors aim to utilize a nonlocal means denoising method and texture enhancement to recover the soft tissue information in MVCT (DeTECT). Methods: A block matching 3D (BM3D) algorithm was adapted to reduce the noise while keeping the texture information of the MVCT images. Following image denoising, a saliency map was created to further enhance the visual conspicuity of low contrast structures. In this study, BM3D and saliency maps were applied to MVCT images of a CT imaging quality phantom, a head and neck patient, and four prostate patients. Following these steps, the contrast-to-noise ratios (CNRs) were quantified. Results: By applying BM3D denoising and a saliency map, postprocessed MVCT images show remarkable improvements in image contrast without compromising resolution. For the head and neck patient, the difficult-to-see lymph nodes and vein in the carotid space in the original MVCT image became conspicuous in DeTECT. For the prostate patients, the ambiguous boundary between the bladder and the prostate in the original MVCT was clarified. The CNRs of phantom low contrast inserts were improved from 1.48 and 3.8 to 13.67 and 16.17, respectively. The CNRs of two regions-of-interest were improved from 1.5 and 3.17 to 3.14 and 15.76, respectively, for the head and neck patient. DeTECT also increased the CNR of the prostate from 0.13 to 1.46 for the four prostate patients. The results are substantially better than a local denoising method using anisotropic diffusion

  16. Despeckling SRTM and other topographic data with a denoising algorithm

    NASA Astrophysics Data System (ADS)

    Stevenson, John A.; Sun, Xianfang; Mitchell, Neil C.

    2010-01-01

    Noise in topographic data obscures features and increases error in geomorphic products calculated from DEMs. DEMs produced by radar remote sensing, such as SRTM, are frequently used for geomorphological studies, but they often contain speckle noise that may significantly lower the quality of geomorphometric analyses. We introduce here an algorithm that denoises three-dimensional objects while preserving sharp features. It is free to download and simple to use. In this study the algorithm is applied to topographic data (synthetic landscapes, SRTM, TOPSAR) and the results are compared against those from a mean filter, using LiDAR data as ground truth for the natural datasets. The level of denoising is controlled by two parameters: the threshold (T) that controls the sharpness of the features to be preserved, and the number of iterations (n) that controls how much the data are changed. The optimum settings depend on the nature of the topography and of the noise to be removed, but are typically in the range T = 0.87-0.99 and n = 1-10. If the threshold is too high, noise is preserved. A lower threshold setting is used where noise is spatially uncorrelated (e.g. TOPSAR), whereas in some other datasets (e.g. SRTM), where filtering of the data during processing has introduced spatial correlation to the noise, higher thresholds can be used. Compared to those filtered to an equivalent level with a mean filter, data smoothed by the denoising algorithm of Sun et al. [Sun, X., Rosin, P.L., Martin, R.R., Langbein, F.C., 2007. Fast and effective feature-preserving mesh denoising. IEEE Transactions on Visualisation and Computer Graphics 13, 925-938.] are closer to the original data and to the ground truth. Changes to the data are smaller and less correlated to topographic features. Furthermore, the feature-preserving nature of the algorithm allows significant smoothing to be applied to flat areas of topography while limiting the alterations made in mountainous regions, with clear

  17. Joint application of continuous and discrete wavelet transform on gravity data to identify shallow and deep sources

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Primiceri, R.; Quarta, T.; Villani, A. V.

    2004-01-01

    The discrete wavelet transform (dwt), thanks to the good localization properties of wavelet bases, has been used as a powerful tool in filtering and denoising problems. The continuous wavelet transform (cwt) exploits the upward continuation properties of the field horizontal derivative and allows the location of potential field singularities in a simple geometrical manner. Within the cwt space-scale framework, the lines formed by joining, at different scales, the modulus maxima of the wavelet coefficients (multiscale edge detection method) intersect each other at the position of the point source or along the edges of the causative body. When the multiscale edge detection method is applied to experimental data, however, the procedure may fail, since the observed anomalies are the superposition of effects of sources having different density contrast, geometrical size and depths. We show that wavelet transform modulus maxima lines attributed to deep sources do not converge toward the true depths, but yield completely erroneous solutions. On the other hand, use of nth-order derivatives of the potential field allows the enhancement of the shallowest source effects, preventing us from obtaining information on the deeper ones. In this paper we therefore try to overcome this problem by a joint application of cwt and dwt. A localized dwt filter coupled to a compactness criterion allows the separation of the effects due to the deeper sources from those of the shallower ones. Hence, the multiscale edge detection method, applied separately to the original and the filtered signals, enabled the estimation of the depth of the shallower and deeper sources, respectively. This analysis, performed on the gravity anomalies of Sardinia (Italy), has given estimations of the depths to both the Campidano graben and the Moho discontinuity, in good agreement with previous interpretations of gravity and seismic data.

  18. Ethanol production from potato peel waste (PPW).

    PubMed

    Arapoglou, D; Varzakas, Th; Vlyssides, A; Israilides, C

    2010-10-01

    Considerable concern is caused by the problem of potato peel waste (PPW) for potato industries in Europe. An integrated, environmentally friendly solution is yet to be found and is currently under investigation. Potato peel is a zero-value waste produced by potato processing plants. However, bio-ethanol produced from potato wastes has a large potential market. If Federal Government regulations are adopted in light of the Kyoto agreement, the mandatory blending of bio-ethanol with traditional gasoline in amounts up to 10% will result in a demand for large quantities of bio-ethanol. PPW contains sufficient quantities of starch, cellulose, hemicellulose and fermentable sugars to warrant use as an ethanol feedstock. In the present study, a number of batches of PPW were hydrolyzed with various enzymes and/or acid, and fermented by Saccharomyces cerevisiae var. bayanus to determine fermentability and ethanol production. Enzymatic hydrolysis with a combination of three enzymes released 18.5 g L(-1) of reducing sugar and produced 7.6 g L(-1) of ethanol after fermentation. The results demonstrate that PPW, a by-product of the potato industry, has a high potential for ethanol production.

  19. A novel structured dictionary for fast processing of 3D medical images, with application to computed tomography restoration and denoising

    NASA Astrophysics Data System (ADS)

    Karimi, Davood; Ward, Rabab K.

    2016-03-01

    Sparse representation of signals in learned overcomplete dictionaries has proven to be a powerful tool with applications in denoising, restoration, compression, reconstruction, and more. Recent research has shown that learned overcomplete dictionaries can lead to better results than analytical dictionaries such as wavelets in almost all image processing applications. However, a major disadvantage of these dictionaries is that their learning and usage are very computationally intensive. In particular, finding the sparse representation of a signal in these dictionaries requires solving an optimization problem that leads to very long computational times, especially in 3D image processing. Moreover, the sparse representation found by greedy algorithms is usually sub-optimal. In this paper, we propose a novel two-level dictionary structure that improves the performance and the speed of standard greedy sparse coding methods. The first (i.e., the top) level in our dictionary is a fixed orthonormal basis, whereas the second level includes the atoms that are learned from the training data. We explain how such a dictionary can be learned from the training data and how the sparse representation of a new signal in this dictionary can be computed. As an application, we use the proposed dictionary structure for removing the noise and artifacts in 3D computed tomography (CT) images. Our experiments with real CT images show that the proposed method achieves results that are comparable with standard dictionary-based methods while substantially reducing the computational time.

  20. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters, such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
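
    A minimal sketch of Perona-Malik-style non-linear diffusion with an explicit time step, in numpy; the exponential diffusivity, step size, and iteration count are illustrative choices rather than the parameter study performed in the report.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=20.0, dt=0.2):
    """Explicit-scheme Perona-Malik diffusion on a 2-D array (periodic borders via np.roll)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four nearest neighbours
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # exponential diffusivity g(s) = exp(-(s/kappa)^2): small across strong edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)   # dt <= 0.25 keeps the scheme stable
    return u
```

    Because the diffusivity shrinks where local gradients are large, smooth regions are averaged strongly while edges are largely left in place, which is the behaviour the comparison in the paper evaluates.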

  1. A novel method for image denoising of fluorescence molecular imaging based on fuzzy C-Means clustering

    NASA Astrophysics Data System (ADS)

    An, Yu; Liu, Jie; Ye, Jinzuo; Mao, Yamin; Yang, Xin; Jiang, Shixin; Chi, Chongwei; Tian, Jie

    2015-03-01

    As an important molecular imaging modality, fluorescence molecular imaging (FMI) has the advantages of high sensitivity, low cost and ease of use. By labeling the regions of interest with a fluorophore, FMI can noninvasively obtain the distribution of the fluorophore in vivo. However, because the fluorescence spectrum lies in the visible light range, there is substantial autofluorescence from the surface of biological tissues, which is a major disturbing factor in FMI. Meanwhile, the high dark current of the charge-coupled device (CCD) camera and other influencing factors can also produce a lot of background noise. In this paper, a novel method for image denoising of FMI based on fuzzy C-means clustering (FCM) is proposed, because the fluorescent signal is the major component of the fluorescence images, and the intensity of autofluorescence and other background signals is lower than that of the fluorescence signal. First, the fluorescence image is smoothed by sliding-neighborhood operations to initially eliminate the noise. Then, the wavelet transform (WLT) is performed on the fluorescence images to obtain the major component of the fluorescent signals. After that, the FCM method is adopted to separate the major component and the background of the fluorescence images. Finally, the proposed method was validated using the original data obtained from an in vivo implanted-fluorophore experiment, and the results show that our proposed method can effectively obtain the fluorescence signal while eliminating the background noise, which increases the quality of the fluorescence images.
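
    A compact sketch of fuzzy C-means clustering on pixel intensities, as one way to separate a bright fluorescence cluster from background; the fuzzifier, cluster count, and membership threshold are generic choices and not the paper's exact pipeline (the sliding-neighborhood smoothing and wavelet step are omitted).

```python
import numpy as np

def fcm_1d(x, n_clusters=2, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Fuzzy C-means on a 1-D array of pixel intensities.
    Returns cluster centres and the membership matrix (n_pixels x n_clusters)."""
    rng = np.random.default_rng(seed)
    u = rng.random((x.size, n_clusters))
    u /= u.sum(axis=1, keepdims=True)            # memberships sum to 1 per pixel
    for _ in range(n_iter):
        um = u ** m
        centres = um.T @ x / um.sum(axis=0)      # weighted cluster centres
        d = np.abs(x[:, None] - centres[None, :]) + 1e-12
        u_new = 1.0 / d ** (2.0 / (m - 1.0))     # standard FCM membership update
        u_new /= u_new.sum(axis=1, keepdims=True)
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centres, u

# usage (illustrative): keep pixels whose membership in the brightest cluster is high
# centres, u = fcm_1d(image.ravel(), n_clusters=2)
# mask = (u[:, centres.argmax()] > 0.5).reshape(image.shape)
```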

  2. A Fast Algorithm for Denoising Magnitude Diffusion-Weighted Images with Rank and Edge Constraints

    PubMed Central

    Lam, Fan; Liu, Ding; Song, Zhuang; Schuff, Norbert; Liang, Zhi-Pei

    2015-01-01

    Purpose: To accelerate denoising of magnitude diffusion-weighted images subject to joint rank and edge constraints. Methods: We extend a previously proposed majorize-minimize (MM) method for statistical estimation that involves noncentral χ distributions and joint rank and edge constraints. A new algorithm is derived which decomposes the constrained noncentral χ denoising problem into a series of constrained Gaussian denoising problems, each of which is then solved using an efficient alternating minimization scheme. Results: The performance of the proposed algorithm has been evaluated using both simulated and experimental data. Results from simulations based on ex vivo data show that the new algorithm achieves about a factor-of-10 speed-up over the original quasi-Newton-based algorithm. This improvement in computational efficiency enabled denoising of large data sets containing many diffusion-encoding directions. The denoising performance of the new, efficient algorithm is found to be comparable to or even better than that of the original slow algorithm. For an in vivo high-resolution Q-ball acquisition, a comparison of fiber tracking results around the hippocampus region before and after denoising is also shown to demonstrate the denoising effects of the new algorithm. Conclusion: The optimization problem associated with denoising noncentral χ distributed diffusion-weighted images subject to joint rank and edge constraints can be solved efficiently using an MM-based algorithm. PMID:25733066

  3. Optical wavelet transform for fingerprint identification

    NASA Astrophysics Data System (ADS)

    MacDonald, Robert P.; Rogers, Steven K.; Burns, Thomas J.; Fielding, Kenneth H.; Warhola, Gregory T.; Ruck, Dennis W.

    1994-03-01

    The Federal Bureau of Investigation (FBI) has recently sanctioned a wavelet fingerprint image compression algorithm developed for reducing storage requirements of digitized fingerprints. This research implements an optical wavelet transform of a fingerprint image, as the first step in an optical fingerprint identification process. Wavelet filters are created from computer-generated holograms of biorthogonal wavelets, the same wavelets implemented in the FBI algorithm. Using a detour phase holographic technique, a complex binary filter mask is created with both symmetry and linear phase. The wavelet transform is implemented with continuous shift using an optical correlation between binarized fingerprints written on a Magneto-Optic Spatial Light Modulator and the biorthogonal wavelet filters. A telescopic lens combination scales the transformed fingerprint onto the filters, providing a means of adjusting the biorthogonal wavelet filter dilation continuously. The wavelet transformed fingerprint is then applied to an optical fingerprint identification process. Comparison between normal fingerprints and wavelet transformed fingerprints shows improvement in the optical identification process, in terms of rotational invariance.

  4. A New Method for Nonlocal Means Image Denoising Using Multiple Images.

    PubMed

    Wang, Xingzheng; Wang, Haoqian; Yang, Jiangfeng; Zhang, Yongbing

    2016-01-01

    The basic principle of nonlocal means is to denoise a pixel using the weighted average of the neighbourhood pixels, while the weight is decided by the similarity of these pixels. The key issue of the nonlocal means method is how to select similar patches and design their weights. There are two main contributions of this paper: the first contribution is that we use two images to denoise the pixel. These two noisy images have the same noise deviation. Instead of using only one image, we calculate the weights from the two noisy images. After the first denoising process, we get a pre-denoised image and a residual image. The second contribution is combining the nonlocal property between the residual image and the pre-denoised image. The improved nonlocal means method pays more attention to similarity than the original one, which turns out to be very effective in eliminating Gaussian noise. Experimental results with simulated data are provided.
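
    A naive single-image nonlocal-means baseline in numpy, illustrating the patch-similarity weighting that the paper builds on; the two-image weighting and the residual-image step described above are not reproduced, and the patch size, search window, and filtering parameter h are illustrative.

```python
import numpy as np

def nlm_denoise(img, patch=5, search=11, h=0.1):
    """Naive (slow) single-image nonlocal means, for illustration only."""
    p, s = patch // 2, search // 2
    pad = p + s
    f = np.pad(img.astype(float), pad, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad, j + pad
            ref = f[ci - p:ci + p + 1, cj - p:cj + p + 1]   # reference patch
            wsum, acc = 0.0, 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = f[ni - p:ni + p + 1, nj - p:nj + p + 1]
                    # weight decays with the mean squared patch difference
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    wsum += w
                    acc += w * f[ni, nj]
            out[i, j] = acc / wsum
    return out
```

    The paper's variant would compute these weights from a second noisy copy of the scene and then repeat the averaging on the residual image; the nested loops above are only meant to make the weighting explicit, not to be efficient.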

  5. Blind source separation based x-ray image denoising from an image sequence.

    PubMed

    Yu, Chun-Yu; Li, Yan; Fei, Bin; Li, Wei-Liang

    2015-09-01

    Blind source separation (BSS) based x-ray image denoising from an image sequence is proposed. Without a priori knowledge, the useful image signal can be separated from an x-ray image sequence, since the original images are assumed to be different combinations of a stable image signal and random image noise. BSS algorithms such as fixed-point independent component analysis and second-order statistics singular value decomposition are used and compared with multi-frame averaging, which is a common algorithm for improving an image's signal-to-noise ratio (SNR). Denoising performance is evaluated in terms of SNR, standard deviation, entropy, and runtime. The analysis indicates that BSS is applicable to image denoising; the denoised image's quality improves when more frames are included in an x-ray image sequence, but at a higher computational cost; there is therefore a trade-off between denoising performance and runtime, which determines how many frames should be included in an image sequence.
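
    A hedged sketch of the fixed-point ICA idea using scikit-learn's FastICA on a stack of noisy frames; the component-selection heuristic (correlation with the multi-frame average) and the two-component setting are assumptions for illustration, and the paper's second-order SVD variant is not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

def bss_denoise(frames, n_components=2):
    """Treat each frame of a noisy image sequence as one observed mixture and
    separate a stable image component from random noise with FastICA."""
    n, h, w = frames.shape
    X = frames.reshape(n, h * w).T              # observations: pixels x frames
    ica = FastICA(n_components=n_components, random_state=0)
    S = ica.fit_transform(X)                    # estimated sources: pixels x components
    # pick the component most correlated with the plain multi-frame average
    avg = frames.mean(axis=0).ravel()
    corrs = [abs(np.corrcoef(S[:, k], avg)[0, 1]) for k in range(n_components)]
    best = int(np.argmax(corrs))
    # note: ICA output has arbitrary sign and scale, so rescale before display
    return S[:, best].reshape(h, w)
```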

  6. From heuristic optimization to dictionary learning: a review and comprehensive comparison of image denoising algorithms.

    PubMed

    Shao, Ling; Yan, Ruomei; Li, Xuelong; Liu, Yan

    2014-07-01

    Image denoising is a well-explored topic in the field of image processing. In the past several decades, the progress made in image denoising has benefited from the improved modeling of natural images. In this paper, we introduce a new taxonomy based on image representations for a better understanding of state-of-the-art image denoising techniques. Within each category, several representative algorithms are selected for evaluation and comparison. The experimental results are discussed and analyzed to determine the overall advantages and disadvantages of each category. In general, the nonlocal methods within each category produce better denoising results than local ones. In addition, methods based on overcomplete representations using learned dictionaries perform better than others. The comprehensive study in this paper should serve as a good reference and stimulate new research ideas in image denoising.

  7. Hyperspectral image denoising using the robust low-rank tensor recovery.

    PubMed

    Li, Chang; Ma, Yong; Huang, Jun; Mei, Xiaoguang; Ma, Jiayi

    2015-09-01

    Denoising is an important preprocessing step to further analyze the hyperspectral image (HSI), and many denoising methods have been used for the denoising of the HSI data cube. However, the traditional denoising methods are sensitive to outliers and non-Gaussian noise. In this paper, by utilizing the underlying low-rank tensor property of the clean HSI data and the sparsity property of the outliers and non-Gaussian noise, we propose a new model based on the robust low-rank tensor recovery, which can preserve the global structure of HSI and simultaneously remove the outliers and different types of noise: Gaussian noise, impulse noise, dead lines, and so on. The proposed model can be solved by the inexact augmented Lagrangian method, and experiments on simulated and real hyperspectral images demonstrate that the proposed method is efficient for HSI denoising.
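
    As a rough illustration of the low-rank idea exploited above, the following numpy sketch soft-thresholds the singular values of the spectrally unfolded HSI cube, which is the kind of nuclear-norm proximal step that appears inside augmented-Lagrangian solvers; it is a simplified matrix surrogate, not the paper's robust low-rank tensor model, and the threshold tau is an assumed parameter.

```python
import numpy as np

def svt_lowrank(cube, tau):
    """Soft-threshold the singular values of the pixels-by-bands unfolding of an HSI cube."""
    H, W, B = cube.shape
    X = cube.reshape(H * W, B)                      # unfold: one row per pixel spectrum
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)                    # shrink small singular values to zero
    return ((U * s) @ Vt).reshape(H, W, B)

# usage (illustrative): denoised = svt_lowrank(noisy_cube, tau=0.1 * noisy_cube.std())
```

    The full method alternates such a low-rank update with a sparsity step that absorbs outliers, impulse noise, and dead lines, which a plain singular-value shrinkage cannot handle on its own.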

  8. A New Method for Nonlocal Means Image Denoising Using Multiple Images

    PubMed Central

    Wang, Xingzheng; Wang, Haoqian; Yang, Jiangfeng; Zhang, Yongbing

    2016-01-01

    The basic principle of nonlocal means is to denoise a pixel using the weighted average of the neighbourhood pixels, while the weight is decided by the similarity of these pixels. The key issue of the nonlocal means method is how to select similar patches and design their weights. There are two main contributions of this paper: the first contribution is that we use two images to denoise the pixel. These two noisy images have the same noise deviation. Instead of using only one image, we calculate the weights from the two noisy images. After the first denoising process, we get a pre-denoised image and a residual image. The second contribution is combining the nonlocal property between the residual image and the pre-denoised image. The improved nonlocal means method pays more attention to similarity than the original one, which turns out to be very effective in eliminating Gaussian noise. Experimental results with simulated data are provided. PMID:27459293

  9. Early detection for short-circuit fault in low-voltage systems based on fractal exponent wavelet analysis

    NASA Astrophysics Data System (ADS)

    Kang, Shanlin; Wang, Bingjun; Kang, Yuzhe

    2006-11-01

    By combining the wavelet transform (WT) with fractal theory, a novel approach is put forward to detect early short-circuit faults. Signal denoising based on a statistical rule is applied to determine the threshold of each order of the wavelet space, and an effective method is proposed to determine the decomposition adaptively, increasing the signal-to-noise ratio (SNR). In view of the interrelationship between the wavelet transform and fractal theory, the global and local fractal exponents obtained from WT coefficients are presented as features for extracting fault signals. The effectiveness of the new algorithm for extracting the characteristic signal is described; the type of short-circuit fault can be recognized from the values of its fractal dimensions. In accordance with the threshold value of each type of short-circuit fault in each frequency band, the correlation between the type of short-circuit and the fractal dimensions can be established to perform the extraction. This model incorporates the advantages of the morphological filter and multi-scale WT to extract fault features while suppressing various noises. Moreover, it can be implemented in real time using available hardware. The effectiveness of this model was verified with simulation results.

  10. De-striping hyperspectral imagery using wavelet transform and adaptive frequency domain filtering

    NASA Astrophysics Data System (ADS)

    Pande-Chhetri, Roshan; Abd-Elrahman, Amr

    2011-09-01

    Hyperspectral images are built line-by-line, similar to images acquired by pushbroom sensors. They can experience striping artifacts due to variations in detector response to the incident imagery. In this research, a method for hyperspectral image de-striping based on wavelet analysis and adaptive Fourier zero-frequency amplitude normalization has been developed. The algorithm was tested against three other de-striping algorithms. Hyperspectral image bands of different scenes with significant striping and random noise, as well as an image with simulated noise, were used in the testing. The results were assessed visually and quantitatively using frequency-domain Signal-to-Noise Ratio (SNR), Root Mean Square Error (RMSE) and/or Peak Signal-to-Noise Ratio (PSNR). The results demonstrated the superiority of our proposed algorithm in de-striping hyperspectral images without introducing unwanted artifacts, yet preserving image details. In the noise-induced image results, the proposed method reduced the RMSE and improved the PSNR by 3.5 dB, which is better than the other tested methods. A combined method, integrating the proposed algorithm with a generic wavelet-based de-noising algorithm, showed significant random noise suppression in addition to stripe reduction, with a PSNR improvement of 4.3 dB. These findings make the algorithm a candidate for practical implementation on remote sensing images, including high-resolution hyperspectral images contaminated with stripe and random noise.

  11. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    NASA Astrophysics Data System (ADS)

    Nguyen van Yen, R.; Del-Castillo-Negrete, D.; Schneider, K.; Farge, M.; Chen, G.

    2009-11-01

    A limitation of particle methods is the inherent noise caused by limited statistical sampling with a finite number of particles. Thus, a key issue for the success of these methods is the development of noise reduction techniques for the reconstruction of the particle distribution function from discrete particle data. Here we propose and study a method based on wavelets, previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. Furthermore, the moments of the particle distribution function can be preserved with good accuracy, and there is no constraint on the dimensionality of the system. It is shown that the computational cost of the denoising stage is of the same order as one time step of an FSP simulation. The wavelet method is compared with the recently introduced proper orthogonal decomposition approach of Ref. [D. del-Castillo-Negrete, et al., Phys. Plasmas 15, 092308 (2008)].
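
    A loose, histogram-based sketch of wavelet density estimation with PyWavelets: bin the particle positions, wavelet-transform the histogram, soft-threshold the detail coefficients, and reconstruct. The wavelet family, decomposition level, and universal threshold are generic choices rather than the WBDE estimator studied in the paper.

```python
import numpy as np
import pywt

def wavelet_density_estimate(samples, nbins=1024, wavelet='sym8', level=6):
    """Bin particle data, denoise the histogram in a wavelet basis, and return a smoothed density."""
    hist, edges = np.histogram(samples, bins=nbins, density=True)
    coeffs = pywt.wavedec(hist, wavelet, level=level)
    # noise scale estimated from the finest detail coefficients (median absolute deviation)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(hist.size))        # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft') for c in coeffs[1:]]
    dens = pywt.waverec(coeffs, wavelet)[:nbins]
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres, np.clip(dens, 0.0, None)               # densities must stay non-negative

# usage (illustrative): x, f = wavelet_density_estimate(particle_positions)
```

    The appeal for particle data, as the abstract notes, is that thresholded wavelet coefficients adapt to localized sharp features that a single global smoothing scale would blur.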

  12. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    NASA Astrophysics Data System (ADS)

    van yen, Romain Nguyen; del-Castillo-Negrete, Diego; Schneider, Kai; Farge, Marie; Chen, Guangye

    2010-04-01

    For given computational resources, the accuracy of plasma simulations using particles is mainly limited by the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet analysis is proposed and tested to reduce this noise. The method, known as wavelet-based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Moreover, the computational cost of the denoising stage is of the same order as one time step of an FSP simulation. The method is compared with a recently proposed proper orthogonal decomposition based method, and it is tested with three particle data sets involving different levels of collisionality and interaction with external and self-consistent fields.

  13. Wavelet-based density estimation for noise reduction in plasma simulations using particles

    SciTech Connect

    Nguyen van yen, Romain; Del-Castillo-Negrete, Diego B; Schneider, Kai; Farge, Marie; Chen, Guangye

    2010-01-01

    For given computational resources, one of the main limitations in the accuracy of plasma simulations using particles comes from the noise due to limited statistical sampling in the reconstruction of the particle distribution function. A method based on wavelet multiresolution analysis is proposed and tested to reduce this noise. The method, known as wavelet based density estimation (WBDE), was previously introduced in the statistical literature to estimate probability densities given a finite number of independent measurements. Its novel application to plasma simulations can be viewed as a natural extension of the finite size particles (FSP) approach, with the advantage of estimating more accurately distribution functions that have localized sharp features. The proposed method preserves the moments of the particle distribution function to a good level of accuracy, has no constraints on the dimensionality of the system, does not require an a priori selection of a global smoothing scale, and is able to adapt locally to the smoothness of the density based on the given discrete particle data. Most importantly, the computational cost of the denoising stage is of the same order as one timestep of an FSP simulation. The method is compared with a recently proposed proper orthogonal decomposition based method, and it is tested with particle data corresponding to strongly collisional, weakly collisional, and collisionless plasma simulations.

  14. Ocean Wave Separation Using CEEMD-Wavelet in GPS Wave Measurement

    PubMed Central

    Wang, Junjie; He, Xiufeng; Ferreira, Vagner G.

    2015-01-01

    Monitoring ocean waves plays a crucial role in, for example, coastal environmental and protection studies. Traditional methods for measuring ocean waves are based on ultrasonic sensors and accelerometers. However, the Global Positioning System (GPS) has been introduced recently and has the advantage of being smaller, less expensive, and not requiring calibration in comparison with the traditional methods. Therefore, for accurately measuring ocean waves using GPS, further research on the separation of the wave signals from the vertical displacements of the GPS-mounted carrier is still necessary. In order to contribute to this topic, we present a novel method that combines complementary ensemble empirical mode decomposition (CEEMD) with a wavelet threshold denoising model (i.e., CEEMD-Wavelet). This method seeks to extract wave signals with less residual noise and without losing useful information. Compared with the wave parameters derived from the moving average method, high-pass filter and wave gauge, the results show that the accuracy of the wave parameters for the proposed method was improved, with errors of about 2 cm and 0.2 s for mean wave height and mean period, respectively, verifying the validity of the proposed method. PMID:26262620

  15. The use of glycolic acid as a peeling agent.

    PubMed

    Murad, H; Shamban, A T; Premo, P S

    1995-04-01

    Glycolic acid is a member of the AHA family, which occurs naturally in foods and has been used for centuries as a cutaneous rejuvenation treatment. Recently it has proved to be a versatile peeling agent and it is now widely used to treat many defects of the epidermis and papillary dermis in a variety of strengths, ranging from 20% to 70%, depending on the condition being treated. People of almost any skin type and color are candidates, and almost any area of the body can be peeled. Several weeks prior to a peel the skin may be prepared with topical tretinoin or glycolic acid, and immediately prior to the peel the skin may be degreased with a variety of agents. Following the peel the skin is carefully observed for any complications such as hyperpigmentation and infection. Results are maintained with serial peels and at-home use of tretinoin or glycolic acid, as well as sun avoidance. The glycolic acid can be applied simultaneously with TCA and is another technique for a medium-depth peel. Comparison of 35% TCA-treated skin with 70% glycolic acid-treated skin examined histologically at different times reveals similar changes in papillary dermis connective tissue proteins, epidermal necrosis seen only with TCA, and reversion at 2 years postpeel to pretreatment appearance.

  16. Ripening influences banana and plantain peels composition and energy content.

    PubMed

    Emaga, Thomas Happi; Bindelle, Jérôme; Agneesens, Richard; Buldgen, André; Wathelet, Bernard; Paquot, Michel

    2011-01-01

    Musa sp. peels are widely used by smallholders as complementary feeds for cattle in the tropics. A study of the influence of the variety and the maturation stage of the fruit on the fermentability and metabolisable energy (ME) content of the peels was performed using banana (Yangambi Km5) and plantain (Big Ebanga) peels at three stages of maturation in an in vitro model of the rumen. Peel samples were analysed for starch, free sugars and fibre composition. Samples were incubated in the presence of rumen fluid. Kinetics of gas production were modelled, ME content was calculated using a prediction equation, and short-chain fatty acid production and molar ratios were measured after 72 h of fermentation. Final gas production was higher in plantain (269-339 ml g(-1)) than in banana (237-328 ml g(-1)), and plantain exhibited higher ME contents (8.9-9.7 MJ/kg of dry matter, DM) than banana (7.7-8.8 MJ/kg of DM). The butyrate molar ratio decreased with the maturity of the peels. The main influence of the variety and the stage of maturation on all fermentation parameters, as well as on the ME contents of the peels, was correlated with changes in the carbohydrate fraction of the peels, including starch and fibre.

  17. Wavelet analysis of internal gravity waves

    NASA Astrophysics Data System (ADS)

    Hawkins, J.; Warn-Varnas, A.; Chin-Bing, S.; King, D.; Smolarkiewicsz, P.

    2005-05-01

    A series of model studies of internal gravity waves (igw) has been conducted for several regions of interest. Dispersion relations from the results have been computed using wavelet analysis as described by Meyers (1993). The wavelet transform is repeatedly applied over time and the components are evaluated with respect to their amplitude and peak position (Torrence and Compo, 1998). In this sense we have been able to compute dispersion relations from model results and from measured data. Qualitative agreement has been obtained in some cases. The results from wavelet analysis must be carefully interpreted because the igw models are fully nonlinear and wavelet analysis is fundamentally a linear technique. Nevertheless, a great deal of information describing igw propagation can be obtained from the wavelet transform. We address the domains over which wavelet analysis techniques can be applied and discuss the limits of their applicability.

  18. Continual skin peeling syndrome. An electron microscopic study.

    PubMed

    Silverman, A K; Ellis, C N; Beals, T F; Woo, T Y

    1986-01-01

    We encountered a patient with continual skin peeling syndrome, a rare disorder in which generalized, noninflammatory exfoliation of the stratum corneum occurs. Although scaling occurred spontaneously in our patient, he was also able to manually peel sheets of skin without bleeding or pain. Histologically, there was separation of corneocytes above the granular cell layer. Ultrastructural examination revealed an unusual type of intracellular cleavage, in which the plasma membrane of the "peeling" cell remained firmly adherent to the underlying cell while the upper part of the cell exfoliated. Unique intercellular electron-dense globular deposits were localized to the stratum corneum.

  19. Removing Signal Intensity Inhomogeneity From Surface Coil MRI Using Discrete Wavelet Transform and Wavelet Packet

    DTIC Science & Technology

    2001-10-25

    We evaluate a combined discrete wavelet transform (DWT) and wavelet packet algorithm to improve the homogeneity of magnetic resonance imaging when a...image and uses this information to normalize the image intensity variations. Estimation of the coil sensitivity profile based on the wavelet transform of

  20. A generalized time-frequency subtraction method for robust speech enhancement based on wavelet filter banks modeling of human auditory system.

    PubMed

    Shao, Yu; Chang, Chip-Hong

    2007-08-01

    We present a new speech enhancement scheme for a single-microphone system to meet the demand for quality noise reduction algorithms capable of operating at a very low signal-to-noise ratio. A psychoacoustic model is incorporated into the generalized perceptual wavelet denoising method to reduce the residual noise and improve the intelligibility of speech. The proposed method is a generalized time-frequency subtraction algorithm, which advantageously exploits the wavelet multirate signal representation to preserve the critical transient information. Simultaneous masking and temporal masking of the human auditory system are modeled by the perceptual wavelet packet transform via the frequency and temporal localization of speech components. The wavelet coefficients are used to calculate the Bark spreading energy and temporal spreading energy, from which a time-frequency masking threshold is deduced to adaptively adjust the subtraction parameters of the proposed method. An unvoiced speech enhancement algorithm is also integrated into the system to improve the intelligibility of speech. Through rigorous objective and subjective evaluations, it is shown that the proposed speech enhancement system is capable of reducing noise with little speech degradation in adverse noise environments and the overall performance is superior to several competitive methods.
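
    As a rough companion to the description above, here is a plain wavelet-packet soft-thresholding sketch with PyWavelets; it omits the psychoacoustic masking model, the Bark and temporal spreading energies, and the unvoiced-speech handling that the paper's system adds, and the wavelet, depth, and universal threshold are assumed defaults.

```python
import numpy as np
import pywt

def wp_soft_denoise(x, wavelet='db8', level=5):
    """Plain wavelet-packet soft-threshold denoising of a 1-D signal (no masking model)."""
    wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode='symmetric', maxlevel=level)
    nodes = wp.get_level(level, order='freq')          # terminal subbands, low to high frequency
    # noise scale estimated from the highest-frequency subband (median absolute deviation)
    sigma = np.median(np.abs(nodes[-1].data)) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(x)))       # universal threshold
    for node in nodes:
        node.data = pywt.threshold(node.data, thresh, mode='soft')
    return wp.reconstruct(update=True)[:len(x)]

# usage (illustrative): enhanced = wp_soft_denoise(noisy_speech_frame)
```

    In the proposed system the single global threshold used here would be replaced by subband- and time-dependent subtraction parameters derived from the masking thresholds, which is what preserves transients and keeps the residual noise below audibility.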