Sample records for image filtering techniques

  1. SU-F-I-73: Surface Dose from KV Diagnostic Beams From An On-Board Imager On a Linac Machine Using Different Imaging Techniques and Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, I; Hossain, S; Syzek, E

    Purpose: To quantitatively investigate the surface dose deposited in patients imaged with a kV on-board imager mounted on a radiotherapy machine using different clinical imaging techniques and filters. Methods: A high-sensitivity photon diode mounted on top of a phantom setup is used to measure the surface dose on the central axis and at an off-axis point. The dose is measured for different imaging techniques that include AP-Pelvis, AP-Head, AP-Abdomen, AP-Thorax, and Extremity. The dose measurements from these imaging techniques are combined with various filtering techniques that include no-filter (open-field), half-fan bowtie (HF), full-fan bowtie (FF), and Cu-plate filters. The relative surface dose for the different imaging and filtering techniques is evaluated quantitatively by the ratio of the dose relative to the Cu-plate filter. Results: The lowest surface dose is deposited with the Cu-plate filter. The highest surface dose results from open fields without a filter and is nearly a factor of 8–30 larger than the corresponding imaging technique with the Cu-plate filter. The AP-Abdomen technique delivers the largest surface dose, nearly 2.7 times larger than the AP-Head technique. The smallest surface dose is obtained from the Extremity imaging technique. Imaging with bowtie filters decreases the surface dose by nearly 33% in comparison with the open field. The surface doses deposited with the HF- and FF-bowtie filters are within a few percent of each other. Image quality of the radiographic images obtained from the different filtering techniques is similar because the Cu-plate eliminates low-energy photons. The HF- and FF-bowtie filters generate intensity gradients in the radiographs, which affect image quality in the different imaging techniques. Conclusion: Surface dose from kV imaging decreases significantly with the Cu-plate and bowtie filters compared to imaging without filters using open-field beams.
The use of the Cu-plate filter does not affect image quality and may be used as the default in the different imaging techniques.

  2. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement using hybridized smoothing filters, and we introduce a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.
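
    As a minimal, illustrative sketch of the smoothing-plus-edge-enhancement idea (not the authors' swarm-optimized hybrid filter; the box filter, test image, and parameters below are assumptions), an unsharp mask adds back the residual that a smoothing filter removes:

```python
import numpy as np

def box_blur(img, k=3):
    # simple k x k mean filter implemented with shifted sums (edge padding)
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0, k=3):
    # edge enhancement: add back the high-frequency residual left by smoothing
    return img + amount * (img - box_blur(img, k))

# a vertical step edge: enhancement increases the contrast across the edge
img = np.zeros((8, 8))
img[:, 4:] = 1.0
enhanced = unsharp_mask(img, amount=0.8)
```

    A swarm search in the spirit of the paper would score candidate sequences of such simple filters and keep the best-performing chain.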

  3. Multiscale morphological filtering for analysis of noisy and complex images

    NASA Astrophysics Data System (ADS)

    Kher, A.; Mitra, S.

    Images acquired with passive sensing techniques suffer from illumination variations and poor local contrasts that create major difficulties in interpretation and identification tasks. On the other hand, images acquired with active sensing techniques based on monochromatic illumination are degraded with speckle noise. Mathematical morphology offers elegant techniques to handle a wide range of image degradation problems. Unlike linear filters, morphological filters do not blur the edges and hence maintain higher image resolution. Their rich mathematical framework facilitates the design and analysis of these filters as well as their hardware implementation. Morphological filters are easier to implement and are more cost effective and efficient than several conventional linear filters. Morphological filters to remove speckle noise while maintaining high resolution and preserving thin image regions that are particularly vulnerable to speckle noise were developed and applied to SAR imagery. These filters used combinations of linear (one-dimensional) structuring elements in different (typically four) orientations. Although this approach preserves more details than simple morphological filters using two-dimensional structuring elements, the limited orientations of the one-dimensional elements only approximate the fine details of the region boundaries. A more robust filter designed recently overcomes the limitation of the fixed orientations. This filter uses a combination of concave and convex structuring elements. Morphological operators are also useful in extracting features from visible and infrared imagery. A multiresolution image pyramid obtained with successive filtering and a subsampling process aids in the removal of the illumination variations and enhances local contrasts. A morphology-based interpolation scheme was also introduced to reduce intensity discontinuities created in any morphological filtering task.
The generality of morphological filtering techniques in extracting information from a wide variety of images obtained with active and passive sensing techniques is discussed. Such techniques are particularly useful in obtaining more information from fusion of complex images by different sensors such as SAR, visible, and infrared.
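
    The directional filtering described above can be sketched with plain NumPy: grayscale openings with one-dimensional structuring elements in four orientations, combined by a pixelwise maximum, remove isolated speckle while preserving thin linear structures. This is a simplified sketch of the general approach, not the authors' SAR filter:

```python
import numpy as np

def erode1d(img, length, direction):
    # grayscale erosion with a linear structuring element along one orientation
    dy, dx = direction
    half = length // 2
    p = np.pad(img, half, mode="edge")
    H, W = img.shape
    out = np.full(img.shape, np.inf)
    for i in range(-half, half + 1):
        out = np.minimum(out, p[half + i * dy: half + i * dy + H,
                                half + i * dx: half + i * dx + W])
    return out

def dilate1d(img, length, direction):
    # grayscale dilation is erosion of the negated image
    return -erode1d(-img, length, direction)

def directional_opening(img, length=5):
    # union (pixelwise max) of openings with 1-D elements in four orientations:
    # removes small bright speckle but keeps thin lines in any of the directions
    dirs = [(0, 1), (1, 0), (1, 1), (1, -1)]
    openings = [dilate1d(erode1d(img, length, d), length, d) for d in dirs]
    return np.maximum.reduce(openings)

# a thin horizontal line survives; an isolated bright pixel (speckle) is removed
img = np.zeros((9, 9))
img[4, :] = 1.0      # thin linear structure
img[1, 1] = 1.0      # isolated speckle
out = directional_opening(img, length=5)
```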

  4. Multiscale Morphological Filtering for Analysis of Noisy and Complex Images

    NASA Technical Reports Server (NTRS)

    Kher, A.; Mitra, S.

    1993-01-01

    Images acquired with passive sensing techniques suffer from illumination variations and poor local contrasts that create major difficulties in interpretation and identification tasks. On the other hand, images acquired with active sensing techniques based on monochromatic illumination are degraded with speckle noise. Mathematical morphology offers elegant techniques to handle a wide range of image degradation problems. Unlike linear filters, morphological filters do not blur the edges and hence maintain higher image resolution. Their rich mathematical framework facilitates the design and analysis of these filters as well as their hardware implementation. Morphological filters are easier to implement and are more cost effective and efficient than several conventional linear filters. Morphological filters to remove speckle noise while maintaining high resolution and preserving thin image regions that are particularly vulnerable to speckle noise were developed and applied to SAR imagery. These filters used combinations of linear (one-dimensional) structuring elements in different (typically four) orientations. Although this approach preserves more details than simple morphological filters using two-dimensional structuring elements, the limited orientations of the one-dimensional elements only approximate the fine details of the region boundaries. A more robust filter designed recently overcomes the limitation of the fixed orientations. This filter uses a combination of concave and convex structuring elements. Morphological operators are also useful in extracting features from visible and infrared imagery. A multiresolution image pyramid obtained with successive filtering and a subsampling process aids in the removal of the illumination variations and enhances local contrasts. A morphology-based interpolation scheme was also introduced to reduce intensity discontinuities created in any morphological filtering task.
The generality of morphological filtering techniques in extracting information from a wide variety of images obtained with active and passive sensing techniques is discussed. Such techniques are particularly useful in obtaining more information from fusion of complex images by different sensors such as SAR, visible, and infrared.

  5. Edge Preserved Speckle Noise Reduction Using Integrated Fuzzy Filters

    PubMed Central

    Dewal, M. L.; Rohit, Manoj Kumar

    2014-01-01

    Echocardiographic images inherently contain speckle noise, which makes visual reading and analysis quite difficult. The multiplicative speckle noise masks finer details necessary for diagnosis of abnormalities. A novel speckle reduction technique based on the integration of geometric, Wiener, and fuzzy filters is proposed and analyzed in this paper. The denoising applications of fuzzy filters are studied and analyzed along with 26 denoising techniques. It is observed that the geometric filter retains noise and, to address this issue, a Wiener filter is embedded into the geometric filter during the iteration process. The performance of the geometric-Wiener filter is further enhanced using fuzzy filters, and the proposed despeckling techniques are called integrated fuzzy filters. Fuzzy filters based on moving average and median value are employed in the integrated fuzzy filters. The performance of the integrated fuzzy filters is tested on echocardiographic images and synthetic images in terms of image quality metrics. It is observed that the performance parameters are highest for the integrated fuzzy filters in comparison to the fuzzy and geometric-fuzzy filters. The clinical validation reveals that the output images obtained using geometric-Wiener, integrated fuzzy, nonlocal means, and detail-preserving anisotropic diffusion filters are acceptable. The necessary finer details are retained in the denoised echocardiographic images. PMID:27437499
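
    A minimal sketch of a fuzzy moving-average filter in the spirit described above (the Gaussian-shaped membership function and all parameters are assumptions, not the paper's exact fuzzy filter definitions): neighbors whose intensity is close to the local window mean receive high weight, so speckle outliers are discounted while flat regions are smoothed:

```python
import numpy as np

def fuzzy_moving_average(img, k=3, spread=0.2):
    # moving average with fuzzy membership weights: each neighbor is weighted
    # by its closeness to the local window mean, so outliers (speckle) count less
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    H, W = img.shape
    shifts = [(dy, dx) for dy in range(k) for dx in range(k)]
    mean = np.zeros((H, W))
    for dy, dx in shifts:
        mean += p[dy:dy + H, dx:dx + W]
    mean /= k * k
    num = np.zeros((H, W))
    den = np.zeros((H, W))
    for dy, dx in shifts:
        v = p[dy:dy + H, dx:dx + W]
        w = np.exp(-((v - mean) / spread) ** 2)  # membership: closeness to mean
        num += w * v
        den += w
    return num / den

flat = fuzzy_moving_average(np.ones((5, 5)))       # constant image is unchanged
speckled = np.ones((5, 5))
speckled[2, 2] = 2.0                               # a single speckle outlier
out = fuzzy_moving_average(speckled)               # outlier is pulled toward 1
```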

  6. A Comparative Study of Different Deblurring Methods Using Filters

    NASA Astrophysics Data System (ADS)

    Srimani, P. K.; Kavitha, S.

    2011-12-01

    This paper undertakes the study of restoring Gaussian-blurred images by using four deblurring techniques, viz., the Wiener filter, Regularized filter, Lucy-Richardson deconvolution algorithm, and blind deconvolution algorithm, given information about the Point Spread Function (PSF) of the corrupted blurred image. These are applied to a scanned image of a seven-month fetus in the womb and compared with one another, so as to choose the best technique for restoring or deblurring the image. This paper also undertakes the study of restoring the blurred image using the Regularized Filter (RF) with no information about the PSF, by applying the same four techniques after estimating a guess of the PSF. The number of iterations and the weight threshold needed to choose the best guesses for restoring or deblurring the image with these techniques are determined.
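
    The Wiener approach with a known PSF can be sketched as frequency-domain deconvolution (a generic textbook formulation under an assumed noise-to-signal ratio, not the paper's exact experiment or data):

```python
import numpy as np

def wiener_deblur(blurred, psf, nsr=1e-3):
    # frequency-domain Wiener deconvolution: F = G * conj(H) / (|H|^2 + NSR),
    # where NSR is the assumed noise-to-signal power ratio
    H = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(F))

# blur a bright square with a centered Gaussian PSF (circular convolution)
x = np.arange(32) - 16
g = np.exp(-x ** 2 / (2.0 * 2.0 ** 2))
psf = np.outer(g, g)
psf /= psf.sum()
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deblur(blurred, psf, nsr=1e-3)
```

    With a small NSR and noiseless data the restoration is close to exact; a larger NSR trades sharpness for noise suppression.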

  7. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, J; Szczykutowicz, T; Bayouth, J

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans.
For both techniques, inaccuracies in CT numbers for bone materials necessitate consideration for radiation therapy treatment planning.
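
    The material-cancellation idea behind VNC imaging can be illustrated with a weighted subtraction on toy CT numbers (the HU values and weighting below are illustrative assumptions, not calibrated split-filter data):

```python
import numpy as np

def virtual_non_contrast(low, high, w):
    # weighted subtraction: w is chosen so iodine cancels between the low- and
    # high-energy images; dividing by (1 - w) leaves materials with equal CT
    # numbers at both energies unchanged
    return (high - w * low) / (1.0 - w)

# toy CT numbers (HU): iodine enhances far more at low energy than soft tissue
iodine_low, iodine_high = 300.0, 150.0
soft_low, soft_high = 50.0, 50.0
w = iodine_high / iodine_low                 # weight that nulls this iodine pair
vnc_iodine = virtual_non_contrast(iodine_low, iodine_high, w)   # -> 0.0
vnc_soft = virtual_non_contrast(soft_low, soft_high, w)         # -> 50.0
```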

  8. Ballistic Imaging and Scattering Measurements for Diesel Spray Combustion: Optical Development and Phenomenological Studies

    DTIC Science & Technology

    2016-04-01

    The technique is demonstrated using a cell filled with polystyrene spheres in a water suspension. The impact of spatial filtering, temporal filtering, and scattering path length on image resolution is reported.

  9. SU-E-I-37: Low-Dose Real-Time Region-Of-Interest X-Ray Fluoroscopic Imaging with a GPU-Accelerated Spatially Different Bilateral Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, H; Lee, J; Pua, R

    2014-06-01

    Purpose: The purpose of our study is to reduce imaging radiation dose while maintaining image quality in the region of interest (ROI) in X-ray fluoroscopy. A low-dose real-time ROI fluoroscopic imaging technique, which includes graphics-processing-unit- (GPU-) accelerated image processing for brightness compensation and noise filtering, was developed in this study. Methods: In our ROI fluoroscopic imaging, a copper filter is placed in front of the X-ray tube. The filter contains a round aperture to reduce radiation dose outside of the aperture. To equalize the brightness difference between the inner and outer ROI regions, brightness compensation was performed by use of a simple weighting method that applies selectively to the inner ROI, the outer ROI, and the boundary zone. A bilateral filter was applied to the images to reduce the relatively high noise in the outer ROI images. To speed up the calculation of our technique for real-time application, GPU acceleration was applied to the image processing algorithm. We performed a dosimetric measurement using an ion-chamber dosimeter to evaluate the amount of radiation dose reduction. The reduction of calculation time compared to a CPU-only computation was also measured, and an assessment of image quality in terms of image noise and spatial resolution was conducted. Results: More than 80% of the dose was reduced by use of the ROI filter. The reduction rate depended on the thickness of the filter and the size of the ROI aperture. The image noise outside the ROI was remarkably reduced by the bilateral filtering technique. The computation time for processing each frame image was reduced from 3.43 seconds with a single CPU to 9.85 milliseconds with GPU acceleration. Conclusion: The proposed technique for X-ray fluoroscopy can substantially reduce imaging radiation dose to the patient while maintaining image quality, particularly in the ROI, in real time.
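
    A brute-force bilateral filter, the core denoising step described above (the GPU acceleration and spatially different ROI weighting are omitted; the parameters are assumptions), smooths noise while leaving sharp edges intact:

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=1.5, sigma_r=0.2):
    # bilateral filter: weights combine spatial closeness (sigma_s) with
    # intensity similarity to the center pixel (sigma_r), so smoothing does
    # not cross strong edges
    H, W = img.shape
    p = np.pad(img, radius, mode="edge")
    num = np.zeros((H, W))
    den = np.zeros((H, W))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            v = p[radius + dy: radius + dy + H, radius + dx: radius + dx + W]
            ws = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
            wr = np.exp(-((v - img) ** 2) / (2.0 * sigma_r ** 2))
            num += ws * wr * v
            den += ws * wr
    return num / den

edge = np.zeros((6, 10))
edge[:, 5:] = 1.0
smoothed = bilateral(edge)          # the step edge is preserved

noisy = np.zeros((5, 5))
noisy[2, 2] = 0.1                   # a small-amplitude perturbation
flattened = bilateral(noisy)        # ...is smoothed away
```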

  10. Symmetric Phase Only Filtering for Improved DPIV Data Processing

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    2006-01-01

    The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single-exposure subregions, where the location of the cross-correlation peak is representative of the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the cross-correlation operation. Phase-only filtering is a well-known variation of Matched Spatial Filtering, which when used to process DPIV image data yields correlation peaks that are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation plane features, phase-only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase-only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction is not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, which is a variation on the traditional phase-only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of the traditionally accepted phase-only filtering techniques and is easily implemented in standard DPIV FFT-based correlation processing with no significant computational performance penalty. An "Automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal-to-noise results than traditional PIV processing.
The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
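
    One common formulation of the SPOF normalizes the cross-spectrum by the square root of each spectrum's magnitude, splitting the phase-only normalization symmetrically between the two subregions; a sketch under that assumption, with random "particle" images standing in for real PIV subregions:

```python
import numpy as np

def spof_correlate(a, b, eps=1e-12):
    # FFT cross-correlation with a Symmetric Phase Only Filter: the cross-
    # spectrum is divided by sqrt(|A|) * sqrt(|B|), sharing the phase-only
    # normalization equally between the two subregions
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    cross = A * np.conj(B)
    spof = cross / np.maximum(np.sqrt(np.abs(A)) * np.sqrt(np.abs(B)), eps)
    return np.real(np.fft.ifft2(spof))

rng = np.random.default_rng(1)
a = rng.random((32, 32))
b = np.roll(a, (2, 3), axis=(0, 1))   # second exposure: pattern shifted by (2, 3)
corr = spof_correlate(b, a)
peak = np.unravel_index(np.argmax(corr), corr.shape)   # most probable displacement
```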

  11. A motion-compensated image filter for low-dose fluoroscopy in a real-time tumor-tracking radiotherapy system

    PubMed Central

    Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth; Suzuki, Ryusuke; Matsuura, Taeko; Toramatsu, Chie; Takao, Seishin; Nihongi, Hideaki; Shimizu, Shinichi; Umegaki, Kikuo; Shirato, Hiroki

    2015-01-01

    In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracing a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter can reduce motion artifacts resulting from the recursive process based on the determination of the region of interest for the next frame according to the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be traced in low-dose imaging by applying the proposed algorithm with acceptable registration error and high pattern recognition score in all trajectories, although some trajectories were not able to be tracked with the conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. PMID:25129556
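
    The motion-compensated recursion can be sketched as a temporal IIR filter whose running average is shifted by the tracked marker displacement before blending, so the moving marker is not smeared by the recursion (a simplified sketch; the template-matching step, blending weight, and frame contents are assumptions):

```python
import numpy as np

def recursive_filter(frames, positions, alpha=0.3):
    # recursive (IIR) temporal noise filter with motion compensation: shift the
    # running average by the marker displacement, then blend in the new frame
    acc = frames[0].astype(float)
    prev = positions[0]
    for frame, pos in zip(frames[1:], positions[1:]):
        shift = (pos[0] - prev[0], pos[1] - prev[1])
        acc = np.roll(acc, shift, axis=(0, 1))     # re-center the running average
        acc = alpha * frame + (1.0 - alpha) * acc  # recursive temporal blend
        prev = pos
    return acc

# a noise-free marker moving diagonally one pixel per frame
positions = [(2 + i, 3 + i) for i in range(5)]
frames = []
for pos in positions:
    f = np.zeros((16, 16))
    f[pos] = 1.0
    frames.append(f)
filtered = recursive_filter(frames, positions)
```

    Without the compensating shift, the recursion would smear the marker along its trajectory; with it, the marker stays concentrated at its current position.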

  12. A comparative study on preprocessing techniques in diabetic retinopathy retinal images: illumination correction and contrast enhancement.

    PubMed

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented a few illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The CLAHE technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, including the dividing method using the median filter to estimate background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential in enhancing vasculature segmentation.
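
    The dividing method described above can be sketched in a few lines: estimate the slowly varying background illumination with a large median filter, then divide it out (the window size, rescaling, and test image are assumptions for illustration):

```python
import numpy as np

def median_filter(img, k):
    # sliding-window median via stacked shifted views (reflect padding)
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    H, W = img.shape
    stack = [p[dy:dy + H, dx:dx + W] for dy in range(k) for dx in range(k)]
    return np.median(np.stack(stack), axis=0)

def divide_correction(img, k=15):
    # dividing method: divide out the median-filtered background estimate,
    # rescaled so the overall brightness is preserved
    background = median_filter(img, k)
    return img / np.maximum(background, 1e-6) * background.mean()

# a flat scene under a linear illumination gradient
img = 0.5 + np.tile(np.arange(32), (32, 1)) / 64.0
corrected = divide_correction(img, k=15)
interior = corrected[8:24, 8:24]   # region where the filter window fits fully
```

    The correction flattens the gradient, which is exactly the drop in the coefficient of variation the study measures.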

  13. Intensity transform and Wiener filter in measurement of blood flow in arteriography

    NASA Astrophysics Data System (ADS)

    Nunes, Polyana F.; Franco, Marcelo L. N.; Filho, João. B. D.; Patrocínio, Ana C.

    2015-03-01

    Using the arteriography examination, it is possible to check for anomalies in blood vessels and diseases such as stroke, stenosis, and bleeding, and especially to aid the diagnosis of encephalic death in comatose individuals. Encephalic death can be diagnosed only when there is complete interruption of all brain functions, and hence of the blood stream. During the examination, there may be interference affecting the sensors, such as environmental factors, poor maintenance of equipment, and patient movement, which can directly affect the noise produced in angiography images. Digital image processing techniques are therefore needed to minimize this noise and improve the pixel count. This paper proposes using a median filter, and enhancement techniques based on an intensity transformation using the sigmoid function together with the Wiener filter, to obtain less noisy images. Two filtering techniques were applied to remove noise from the images: one with the median filter and the other with the Wiener filter along with the sigmoid function. For 14 quantified tests, including 7 encephalic death and 7 other cases, the technique that achieved the most satisfactory number of quantified pixels, while also presenting the least noise, was the Wiener filter with the sigmoid function, used in this case with a 0.03 cutoff.
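
    The sigmoid intensity transform with a 0.03 cutoff can be sketched as follows (the gain value is an assumption, and the Wiener filtering stage is omitted): the logistic curve is centered at the cutoff, so contrast is boosted around that intensity level:

```python
import numpy as np

def sigmoid_stretch(img, cutoff=0.03, gain=10.0):
    # sigmoid intensity transform: a logistic curve centered at `cutoff`;
    # intensities map monotonically into (0, 1) with maximum slope at the cutoff
    return 1.0 / (1.0 + np.exp(gain * (cutoff - img)))

levels = sigmoid_stretch(np.array([0.0, 0.03, 1.0]))   # cutoff maps to 0.5
```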

  14. Directional bilateral filters for smoothing fluorescence microscopy images

    NASA Astrophysics Data System (ADS)

    Venkatesh, Manasij; Mohan, Kavya; Seelamantula, Chandra Sekhar

    2015-08-01

    Images obtained through fluorescence microscopy at low numerical aperture (NA) are noisy and have poor resolution. Images of specimens such as F-actin filaments obtained using confocal or widefield fluorescence microscopes contain directional information, and it is important that an image smoothing or filtering technique preserve the directionality. F-actin filaments are widely studied in pathology because abnormalities in actin dynamics play a key role in the diagnosis of cancer, cardiac diseases, vascular diseases, myofibrillar myopathies, neurological disorders, etc. We develop the directional bilateral filter as a means of filtering out the noise in the image without significantly altering the directionality of the F-actin filaments. The bilateral filter is anisotropic to start with, but we add an additional degree of anisotropy by employing an oriented domain kernel for smoothing. The orientation is locally adapted using a structure tensor, and the parameters of the bilateral filter are optimized within the framework of statistical risk minimization. We show that the directional bilateral filter has better denoising performance than the traditional Gaussian bilateral filter and other denoising techniques such as SURE-LET, non-local means, and guided image filtering at various noise levels in terms of peak signal-to-noise ratio (PSNR). We also show quantitative improvements in low NA images of F-actin filaments.

  15. Teaching learning based optimization-functional link artificial neural network filter for mixed noise reduction from magnetic resonance image.

    PubMed

    Kumar, M; Mishra, S K

    2017-01-01

    Clinical magnetic resonance imaging (MRI) images may get corrupted due to the presence of a mixture of different types of noise, such as Rician, Gaussian, impulse, etc. Most of the available filtering algorithms are noise specific, linear, and non-adaptive. There is a need to develop a nonlinear adaptive filter that adapts itself according to the requirement and can be effectively applied for suppression of mixed noise from different MRI images. In view of this, a novel nonlinear neural-network-based adaptive filter, i.e. the functional link artificial neural network (FLANN), whose weights are trained by a recently developed derivative-free meta-heuristic technique, i.e. teaching learning based optimization (TLBO), is proposed and implemented. The performance of the proposed filter is compared with five other adaptive filters and analyzed by considering quantitative metrics and evaluating a nonparametric statistical test. The convergence curve and computational time are also included for investigating the efficiency of the proposed as well as competitive filters. In the simulation outcomes, the proposed filter outperforms the other adaptive filters. The proposed filter can be hybridized with other evolutionary techniques and utilized for removing different noises and artifacts from other medical images more competently.

  16. Optimizing dual-energy x-ray parameters for the ExacTrac clinical stereoscopic imaging system to enhance soft-tissue imaging.

    PubMed

    Bowman, Wesley A; Robar, James L; Sattarivand, Mike

    2017-03-01

    Stereoscopic x-ray image guided radiotherapy for lung tumors is often hindered by bone overlap and limited soft-tissue contrast. This study aims to evaluate the feasibility of dual-energy imaging techniques and to optimize parameters of the ExacTrac stereoscopic imaging system to enhance soft-tissue imaging for application to lung stereotactic body radiation therapy. Simulated spectra and a physical lung phantom were used to optimize filter material, thickness, tube potentials, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify material in the atomic number range (3-83) based on a metric defined to separate spectra of high and low energies. Both energies used the same filter due to time constraints of imaging in the presence of respiratory motion. The lung phantom contained bone, soft tissue, and tumor mimicking materials, and it was imaged with a filter thickness in the range of (0-0.7) mm and a kVp range of (60-80) for low energy and (120, 140) for high energy. Optimal dual-energy weighting factors were obtained when the bone to soft-tissue contrast-to-noise ratio (CNR) was minimized. Optimal filter thickness and tube potential were achieved by maximizing tumor-to-background CNR. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom with a spherical tumor mimicking material inserted in its lung were acquired and evaluated for bone subtraction and tumor contrast. Imaging dose was measured using the dual-energy technique with and without beam filtration and matched to that of a clinical conventional single-energy technique. Tin was the material of choice for beam filtering, providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-weighted image in the lung phantom was obtained using 0.2 mm tin and a (140, 60) kVp pair.
    Dual-energy images of the Rando phantom with the tin filter had noticeable improvement in bone elimination, tumor contrast, and noise content when compared to dual-energy imaging with no filtration. The surface dose was 0.52 mGy per stereoscopic view for the clinical single-energy technique and for the dual-energy technique both with and without the tin filter. Dual-energy soft-tissue imaging is feasible without additional imaging dose using the ExacTrac stereoscopic imaging system with optimized acquisition parameters and no beam filtration. The addition of a single tin filter for both the high and low energies noticeably improves dual-energy imaging with optimized parameters. Clinical implementation of a dual-energy technique on ExacTrac stereoscopic imaging could improve lung tumor visibility. © 2017 American Association of Physicists in Medicine.

  17. Noise reduction techniques for Bayer-matrix images

    NASA Astrophysics Data System (ADS)

    Kalevo, Ossi; Rantanen, Henry

    2002-04-01

    In this paper, some arrangements for applying Noise Reduction (NR) techniques to images captured by a single-sensor digital camera are studied. Usually, the NR filter processes full three-color-component image data. This requires that the raw Bayer-matrix image data, available from the image sensor, is first interpolated by using a Color Filter Array Interpolation (CFAI) method. Another choice is that the raw Bayer-matrix image data is processed directly. The advantages and disadvantages of both processing orders, before (pre-) CFAI and after (post-) CFAI, are studied with linear, multi-stage median, multistage median hybrid, and median-rational filters. The comparison is based on the quality of the output image, the processing power requirements, and the amount of memory needed. A solution that improves the preservation of details when NR filtering is applied before CFAI is also proposed.

  18. Destriping of Landsat MSS images by filtering techniques

    USGS Publications Warehouse

    Pan, Jeng-Jong; Chang, Chein-I

    1992-01-01

    The removal of striping noise encountered in the Landsat Multispectral Scanner (MSS) images can generally be done by using frequency filtering techniques. Frequency domain filtering has, however, several problems, such as storage limitation of the data required for fast Fourier transforms, ringing artifacts appearing at high-intensity discontinuities, and edge effects between adjacent filtered data sets. One way of circumventing the above difficulties is to design a spatial filter to convolve with the images. Because it is known that the striping always appears at frequencies of 1/6, 1/3, and 1/2 cycles per line, it is possible to design a simple one-dimensional spatial filter to take advantage of this a priori knowledge to cope with the above problems. The desired filter is of the finite impulse response type, which can be designed by linear programming and Remez's exchange algorithm coupled with an adaptive technique. In addition, a four-step spatial filtering technique with an appropriate adaptive approach is also presented, which may be particularly useful for geometrically rectified MSS images.
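
    Knowing that the striping sits at 1/6, 1/3, and 1/2 cycles per line, the destriping idea can be sketched as a frequency notch along the line axis (a deliberate simplification of the paper's FIR filter designed via linear programming and Remez's exchange algorithm; the test image is an assumption):

```python
import numpy as np

def destripe(img, notch=(1/6, 1/3, 1/2)):
    # zero the known striping frequencies (in cycles per line) along the line axis
    F = np.fft.rfft(img, axis=0)
    freqs = np.fft.rfftfreq(img.shape[0])     # cycles per line
    for f0 in notch:
        F[np.isclose(freqs, f0), :] = 0.0
    return np.fft.irfft(F, n=img.shape[0], axis=0)

# a flat scene plus six-line-period striping (exactly the 1/6 cycles/line noise)
lines = np.arange(12)
stripes = 0.5 * np.cos(2 * np.pi * lines / 6.0)[:, None]
img = 1.0 + stripes * np.ones((1, 8))
clean = destripe(img)
```

    A spatial-domain FIR notch filter, as in the paper, achieves the same suppression by convolution and avoids the FFT storage and edge-effect problems noted above.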

  19. A Comparative Study on Preprocessing Techniques in Diabetic Retinopathy Retinal Images: Illumination Correction and Contrast Enhancement

    PubMed Central

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficient of variation. The dividing method, which uses a median filter to estimate background illumination, showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same accuracy, and showed a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation. PMID:25709940
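    The "dividing" correction can be sketched as: estimate the slowly varying background with a large median filter, then divide it out of the channel. This is a NumPy illustration only; the kernel size and the brightness rescaling are assumptions, not values from the paper.

```python
import numpy as np

def median_filter(img, k):
    """k x k median filter with edge replication (naive NumPy version)."""
    r = k // 2
    p = np.pad(img, r, mode="edge")
    win = [p[i:i + img.shape[0], j:j + img.shape[1]]
           for i in range(k) for j in range(k)]
    return np.median(np.stack(win), axis=0)

def correct_illumination(channel, k=15):
    """'Dividing' illumination correction: divide by a median-filter
    estimate of the background, then restore the original mean level."""
    ch = channel.astype(float)
    background = median_filter(ch, k) + 1e-6   # avoid division by zero
    corrected = ch / background
    return corrected * ch.mean()
```

    On a pure illumination gradient the corrected channel is nearly flat, which is exactly the drop in coefficient of variation the study measures.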

  20. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    Summary We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and the accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range over that achievable with filter-based imaging. PMID:22356127

  1. Adaptive texture filtering for defect inspection in ultrasound images

    NASA Astrophysics Data System (ADS)

    Zmola, Carl; Segal, Andrew C.; Lovewell, Brian; Nash, Charles

    1993-05-01

    The use of ultrasonic imaging to analyze defects and characterize materials is critical in the development of non-destructive testing and non-destructive evaluation (NDT/NDE) tools for manufacturing. To develop better quality control and reliability in the manufacturing environment, advanced image processing techniques are useful. For example, through the use of texture filtering on ultrasound images, we have been able to filter characteristic textures from highly textured C-scan images of materials. The materials have highly regular characteristic textures which are of the same resolution and dynamic range as other important features within the image. By applying texture filters and adaptively modifying their filter response, we have examined a family of filters for removing these textures.

  2. Improving the quality of reconstructed X-ray CT images of polymer gel dosimeters: zero-scan coupled with adaptive mean filtering.

    PubMed

    Kakakhel, M B; Jirasek, A; Johnston, H; Kairn, T; Trapp, J V

    2017-03-01

    This study evaluated the feasibility of combining the 'zero-scan' (ZS) X-ray computed tomography (CT) based polymer gel dosimeter (PGD) readout with adaptive mean (AM) filtering to improve the signal-to-noise ratio (SNR), and compared these results with available average-scan (AS) X-ray CT readout techniques. NIPAM PGDs were manufactured, irradiated with 6 MV photons, CT imaged and processed in Matlab. An AM filter with 3 × 3 and 5 × 5 pixel kernels, run for two iterations, was used in two scenarios: (a) the CT images were subjected to AM filtering (pre-processing) and then employed to generate AS and ZS gel images, and (b) the AS and ZS images were first reconstructed from the CT images and AM filtering was then carried out (post-processing). SNR was computed in a 30 × 30 pixel ROI for the different pre- and post-processing cases. Results showed that the ZS technique combined with AM filtering improved the SNR. Using the previously recommended 25 images for reconstruction, the ZS pre-processed protocol can give an increase of 44% and 80% in SNR for 3 × 3 and 5 × 5 kernel sizes, respectively. However, post-processing with both techniques and filter sizes introduced blur and a reduction in spatial resolution. Based on this work, the ZS method may be recommended in combination with pre-processed AM filtering, using an appropriate kernel size, to produce a large increase in the SNR of the reconstructed PGD images.
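    The two building blocks can be sketched as follows: zero-scan extrapolates a pixel-wise straight-line fit over sequential scans back to "scan zero", and an adaptive mean filter smooths fully only where the local variance looks like pure noise. Both are illustrative NumPy sketches under assumed forms (least-squares line fit; Lee-style variance weighting), not the authors' exact implementation.

```python
import numpy as np

def zero_scan(scans):
    """Extrapolate a stack of sequential CT scans (n, H, W) pixel-wise
    to 'scan zero' with a least-squares straight-line fit."""
    n = scans.shape[0]
    x = np.arange(1, n + 1, dtype=float)
    flat = scans.reshape(n, -1).astype(float)
    slope, intercept = np.polyfit(x, flat, 1)   # fit every pixel at once
    return intercept.reshape(scans.shape[1:])

def adaptive_mean(img, k=3, noise_var=None):
    """Adaptive mean filter: full smoothing in flat regions, little
    smoothing where local variance exceeds the noise level."""
    img = img.astype(float)
    r = k // 2
    p = np.pad(img, r, mode="edge")
    win = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                    for i in range(k) for j in range(k)])
    local_mean = win.mean(0)
    local_var = win.var(0)
    if noise_var is None:
        noise_var = np.median(local_var)        # crude global noise estimate
    w = np.clip(noise_var / (local_var + 1e-12), 0.0, 1.0)
    return img - w * (img - local_mean)
```

    The pre-processing protocol of scenario (a) would apply `adaptive_mean` to each CT scan before handing the stack to `zero_scan`.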

  3. Signal-to-noise ratio enhancement on SEM images using a cubic spline interpolation with Savitzky-Golay filters and weighted least squares error.

    PubMed

    Kiani, M A; Sim, K S; Nia, M E; Tso, C P

    2015-05-01

    A new technique based on cubic spline interpolation with Savitzky-Golay smoothing and a weighted least-squares error filter is presented for scanning electron microscope (SEM) images. A diversity of sample images was captured, and the performance was found to be better than that of the moving-average and standard median filters with respect to noise elimination. The technique can be implemented efficiently on real-time SEM images, with all the data required for processing obtained from a single image. Noise in images, and particularly in SEM images, is undesirable; the combined technique is applied to single-image signal-to-noise ratio estimation and noise reduction for an SEM imaging system. This autocorrelation-based technique requires image details to be correlated over a few pixels, whereas the noise is assumed to be uncorrelated from pixel to pixel. The noise component is derived from the difference between the image autocorrelation at zero offset and an estimate of the corresponding noise-free autocorrelation. In test cases involving different images, the efficiency of the developed noise reduction filter proved significantly better than that of the other methods. Noise can be reduced efficiently from real-time SEM images with an appropriate choice of scan rate, without introducing corruption or increasing scanning time. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
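    The autocorrelation idea can be sketched numerically: because uncorrelated noise inflates only the zero-offset autocorrelation, the noise-free value at lag 0 can be extrapolated from nearby lags and subtracted out. This sketch uses a simple quadratic extrapolation in place of the paper's spline/Savitzky-Golay machinery, so it illustrates the principle only.

```python
import numpy as np

def estimate_snr(img, max_lag=4):
    """Single-image SNR estimate: extrapolate the signal autocorrelation
    to lag 0 from lags 1..max_lag; pixel-uncorrelated noise inflates
    only the measured value at lag 0."""
    x = img.astype(float) - img.mean()
    rows = x.reshape(-1, x.shape[-1])
    lags = np.arange(max_lag + 1)
    r = np.array([(rows[:, :rows.shape[1] - l] * rows[:, l:]).mean()
                  for l in lags])
    # extrapolate the noise-free autocorrelation to lag 0 (quadratic fit)
    coeff = np.polyfit(lags[1:], r[1:], 2)
    r0_signal = np.polyval(coeff, 0.0)
    noise_var = max(r[0] - r0_signal, 1e-12)
    return r0_signal / noise_var
```

    The estimate requires that genuine image detail stays correlated over the first few lags, which is the stated assumption of the method.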

  4. Speckle noise reduction of 1-look SAR imagery

    NASA Technical Reports Server (NTRS)

    Nathan, Krishna S.; Curlander, John C.

    1987-01-01

    Speckle noise is inherent to synthetic aperture radar (SAR) imagery. Since the degradation of the image due to this noise results in uncertainties in the interpretation of the scene and in a loss of apparent resolution, it is desirable to filter the image to reduce this noise. In this paper, an adaptive algorithm based on the calculation of the local statistics around a pixel is applied to 1-look SAR imagery. The filter adapts to the nonstationarity of the image statistics since the size of the blocks is very small compared to that of the image. The performance of the filter is measured in terms of the equivalent number of looks (ENL) of the filtered image and the resulting resolution degradation. The results are compared to those obtained from different techniques applied to similar data. The local adaptive filter (LAF) significantly increases the ENL of the final image. The associated loss of resolution is also lower than that for other commonly used speckle reduction techniques.
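    A local-statistics speckle filter of the kind described is classically written as a weighted blend of the pixel and its window mean, with the weight driven by how far the local variance exceeds the multiplicative speckle level. This NumPy sketch follows the standard Lee-filter form; window size and details are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def lee_filter(img, k=7, looks=1):
    """Local-statistics (Lee-type) speckle filter for L-look intensity
    SAR imagery; speckle is multiplicative with variance 1/looks."""
    img = img.astype(float)
    r = k // 2
    p = np.pad(img, r, mode="reflect")
    win = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                    for i in range(k) for j in range(k)])
    mean = win.mean(0)
    var = win.var(0)
    cu2 = 1.0 / looks                 # squared speckle coefficient of variation
    w = np.clip(1.0 - cu2 * mean ** 2 / (var + 1e-12), 0.0, 1.0)
    return mean + w * (img - mean)    # w -> 0 in homogeneous areas
```

    The equivalent number of looks (ENL = mean²/variance) measured over a homogeneous area, the same figure of merit used in the record, rises sharply after filtering.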

  5. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    NASA Astrophysics Data System (ADS)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroid. The diagnosis of these diseases usually involves radiology and laboratory tests, while the method that we propose uses thermographic images and image analysis techniques and is intended for use as a prescreening tool. Images in each category of pathologies are first filtered by Gabor filters, and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters that can be characterized by two parameters: the wavelength λ and the orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produce different tests, and the sensitivity, specificity and success rate for each test were computed. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
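    A 2-wavelength × 5-orientation Gabor bank with simple response statistics can be sketched as below. The wavelengths (4 and 8 pixels), kernel size and the mean/std feature choice are assumptions for illustration; the paper does not specify them here.

```python
import numpy as np

def gabor_kernel(wavelength, theta, sigma=None, size=21):
    """Real (even) Gabor kernel with given wavelength and orientation."""
    sigma = sigma or 0.5 * wavelength
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    return g * np.cos(2 * np.pi * xr / wavelength)

def gabor_features(img):
    """Mean/std of the response of a 2-wavelength x 5-orientation bank."""
    img = img.astype(float)
    feats = []
    for lam in (4, 8):                           # assumed wavelengths
        for theta in np.arange(5) * np.pi / 5:   # five orientations
            k = gabor_kernel(lam, theta)
            # circular convolution via the FFT, magnitude response
            resp = np.abs(np.fft.ifft2(np.fft.fft2(img) *
                                       np.fft.fft2(k, img.shape)).real)
            feats += [resp.mean(), resp.std()]
    return np.array(feats)                       # 10 filters -> 20 features
```

    An orientation-tuned bank responds far more strongly to a texture aligned with its kernel than to one rotated away, which is what makes these features discriminative.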

  6. Effects of the use of multi-layer filter on radiation exposure and the quality of upper airway radiographs compared to the traditional copper filter.

    PubMed

    Klandima, Somphan; Kruatrachue, Anchalee; Wongtapradit, Lawan; Nithipanya, Narong; Ratanaprakarn, Warangkana

    2014-06-01

    The problem of image quality in a large number of upper-airway-obstructed patients is the superimposition of the airway over the bone of the spine on the AP view. This problem was resolved by increasing kVp (high-kVp technique) and adding extra radiographic filters (copper filter) to reduce the sharpness of the bone and increase the clarity of the airway. However, this raises concerns that patients might be receiving an unnecessarily high dose of radiation, as well as questions about the effectiveness of the invented filter compared with the traditional filter. The aims were to evaluate the radiation dose that patients receive with the multi-layer filter compared with no filter, and to evaluate the image quality of the upper airways obtained with the radiographic filter (multi-layer filter) versus the traditional filter (copper filter). The attenuation curve of both filter materials was first identified. Both filters were then tested with an Alderson Rando phantom to determine the appropriate exposure. Using the method described, a new type of filter, called the multi-layer filter, was developed for imaging patients. A randomized controlled trial was then performed to compare the effectiveness of the newly developed multi-layer filter with the copper filter. The research was conducted in patients with upper airway obstruction treated at Queen Sirikit National Institute of Child Health from October 2006 to September 2007. A total of 132 patients were divided into two groups. The experimental group used the high-kVp technique with the multi-layer filter, while the control group used the copper filter. A comparison of film interpretation between the multi-layer filter and the copper filter was made by a number of radiologists who were blinded to both the technique and the type of filter used. Patients received less radiation from the high-kVp technique with the copper filter or the multi-layer filter than from the conventional technique, where no filter is used.
Patients received approximately 65.5% less radiation dose using the high-kVp technique with the multi-layer filter compared with the conventional technique, and 25.9% less than with the traditional copper filter. Forty-five percent of the radiologists who participated in this study reported that the high-kVp technique with the multi-layer filter was better for diagnosing stenosis, or narrowing, of the upper airways; 33% reported that both techniques were equal, while 22% reported that the traditional copper filter allowed better detail of the airway obstruction. These findings showed that the multi-layer filter was comparable to the copper filter in terms of film interpretation. Using the multi-layer filter resulted in patients receiving a lower dose of radiation, with similar film interpretation, compared with the traditional copper filter.

  7. MR image reconstruction via guided filter.

    PubMed

    Huang, Heyan; Yang, Hang; Wang, Kang

    2018-04-01

    Magnetic resonance imaging (MRI) reconstruction from the smallest possible set of Fourier samples has been a difficult problem in the medical imaging field. In this paper, we present a new guided-filter-based approach for an efficient MRI recovery algorithm. The guided filter is an edge-preserving smoothing operator and behaves better near edges than the bilateral filter. Our reconstruction method consists of two steps. First, we propose two cost functions that can be computed efficiently, yielding two different images. Second, the guided filter is applied to these two images for efficient edge-preserving filtering: one image is used as the guidance image, and the other as the filtering input. By introducing the guided filter, our reconstruction algorithm recovers more detail. We compare our algorithm with some competitive MRI reconstruction techniques in terms of PSNR and visual quality, and simulation results demonstrate the performance of the new method.
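    The guided filter itself has a compact closed form: fit a local linear model of the output on the guidance image in every window, then average the overlapping models. This is a standard NumPy sketch of that operator (He et al. style), shown to make the second step concrete; it is not the paper's full reconstruction pipeline.

```python
import numpy as np

def box(img, r):
    """Mean over a (2r+1)^2 window via 2-D cumulative sums."""
    k = 2 * r + 1
    p = np.pad(img, r, mode="edge")
    c = np.cumsum(np.cumsum(p, 0), 1)
    c = np.pad(c, ((1, 0), (1, 0)))
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(guide, src, r=4, eps=1e-3):
    """Edge-preserving smoothing of `src`, steered by `guide`:
    per-window linear model src ~ a*guide + b, then averaged."""
    I, q = guide.astype(float), src.astype(float)
    mI, mq = box(I, r), box(q, r)
    varI = box(I * I, r) - mI * mI
    covIq = box(I * q, r) - mI * mq
    a = covIq / (varI + eps)          # eps controls edge preservation
    b = mq - a * mI
    return box(a, r) * I + box(b, r)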

  8. Filtering and left ventricle segmentation of the fetal heart in ultrasound images

    NASA Astrophysics Data System (ADS)

    Vargas-Quintero, Lorena; Escalante-Ramírez, Boris

    2013-11-01

    In this paper, we propose filtering methods and a segmentation algorithm for the analysis of the fetal heart in ultrasound images. Since speckle noise makes the analysis of ultrasound images difficult, filtering becomes a useful task in these types of applications. The filtering techniques considered in this work assume that the speckle noise is a random variable with a Rayleigh distribution. We use two multiresolution methods: one based on wavelet decomposition and the other based on the Hermite transform. The filtering process is used as a way to strengthen the performance of the segmentation tasks. The wavelet-based approach employs a Bayesian estimator at the subband level for pixel classification; the Hermite method computes a mask to find those pixels that are corrupted by speckle. Finally, we selected a method based on a deformable model, or "snake", to evaluate the influence of the filtering techniques on the segmentation of the left ventricle in fetal echocardiographic images.

  9. Image sharpening for mixed spatial and spectral resolution satellite systems

    NASA Technical Reports Server (NTRS)

    Hallada, W. A.; Cox, S.

    1983-01-01

    Two methods of image sharpening (reconstruction) are compared. The first, a spatial filtering technique, extrapolates edge information from a high spatial resolution panchromatic band at 10 meters and adds it to the low spatial resolution narrow spectral bands. The second method, a color normalizing technique, is based on the ability to separate image hue and brightness components in spectral data. Using both techniques, multispectral images are sharpened from 30, 50, 70, and 90 meter resolutions. Error rates are calculated for the two methods and all sharpened resolutions. The results indicate that the color normalizing method is superior to the spatial filtering technique.
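    The two sharpening families compared can be sketched compactly: the spatial-filtering route adds the high-pass detail of the panchromatic band to each (co-registered, upsampled) spectral band, while the colour-normalising (Brovey-style) route rescales each band by the ratio of pan brightness to total multispectral brightness. Both are illustrative NumPy sketches, not the paper's exact implementations.

```python
import numpy as np

def box_blur(img, k=5):
    """k x k mean filter with edge replication."""
    r = k // 2
    p = np.pad(img.astype(float), r, mode="edge")
    win = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                    for i in range(k) for j in range(k)])
    return win.mean(0)

def hpf_sharpen(ms_band, pan, k=5):
    """Spatial-filtering sharpening: add the pan band's high-pass
    detail to a low-resolution spectral band."""
    detail = pan.astype(float) - box_blur(pan, k)
    return ms_band.astype(float) + detail

def brovey_sharpen(ms_stack, pan):
    """Colour-normalising sharpening: scale each band by the ratio of
    pan brightness to total multispectral brightness."""
    ms = ms_stack.astype(float)
    total = ms.sum(0) + 1e-9
    return ms * (pan / total)
```

    The colour-normalising route preserves band ratios (hue) by construction while forcing overall brightness to follow the pan band, which is the separation of hue and brightness the record describes.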

  10. Automated selection of the most epithelium-rich areas in gynecologic tumor sections.

    PubMed

    Schipper, N W; Baak, J P; Smeulders, A W

    1991-12-01

    The paper describes an image analysis technique for automated selection of the epithelium-rich areas in standard paraffin tissue sections of ovarian and endometrial premalignancies and malignancies. Two staining procedures were evaluated, Feulgen (pararosanilin) and CAM 5.2, demonstrating the presence of cytokeratin 8 and 18; both were counterstained with naphthol yellow. The technique is based on the corresponding image processing method of automated estimation of the percentage of epithelium in interactively selected microscope fields. With the technique, one image is recorded with a filter to demonstrate where epithelium and stroma lie. This filter is chosen according to the type of staining: it is yellow (lambda = 552 nm) for Feulgen and blue (lambda = 470 nm) for anticytokeratin CAM 5.2. When stroma cannot be distinguished from lumina with the green filter or from epithelium with the blue filter, a second image is recorded from the same microscope field, with a blue filter (lambda = 420 nm) for Feulgen and a yellow filter (lambda = 576 nm) for anticytokeratin CAM 5.2. Discrimination between epithelium and stroma is based on the image contrast range and the packing of nuclei in the yellow image and on the automated classification of the gray value histogram peaks in the blue image. For Feulgen stain the method was evaluated on 30 ovarian tumors of the common epithelial types (8 borderline tumors and 22 carcinomas with various degrees of differentiation) and 30 endometrial carcinomas of different grades.(ABSTRACT TRUNCATED AT 250 WORDS)

  11. SU-E-J-261: The Importance of Appropriate Image Preprocessing to Augment the Information of Radiomics Image Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L; Fried, D; Fave, X

    Purpose: To investigate how different image preprocessing techniques, their parameters, and different boundary handling techniques can augment the information of features and improve the features’ differentiating capability. Methods: Twenty-seven NSCLC patients with a solid tumor volume and no visually obvious necrotic regions in the simulation CT images were identified. Fourteen of these patients had a necrotic region visible in their pre-treatment PET images (necrosis group), and thirteen had no visible necrotic region in the pre-treatment PET images (non-necrosis group). We investigated how image preprocessing can impact the ability of radiomics image features extracted from the CT to differentiate between the two groups. It is expected that the histogram in the necrosis group is more negatively skewed and that the uniformity in the necrosis group is lower. Therefore, we analyzed two first-order features, skewness and uniformity, on the image inside the GTV in the intensity range [−20 HU, 180 HU] under combinations of several image preprocessing techniques: (1) applying an isotropic Gaussian or anisotropic diffusion smoothing filter over a range of parameters (Gaussian smoothing: size=11, sigma=0:0.1:2.3; anisotropic smoothing: iteration=4, kappa=0:10:110); (2) applying a boundary-adapted Laplacian filter; and (3) applying an adaptive upper threshold for the intensity range. A two-tailed t-test was used to evaluate the differentiating capability of the CT features with respect to pre-treatment PET necrosis. Results: Without any preprocessing, no differences in either skewness or uniformity were observed between the two groups. After applying appropriate Gaussian filters (sigma>=1.3) or anisotropic filters (kappa>=60) with the adaptive upper threshold, skewness was significantly more negative in the necrosis group (p<0.05). By applying boundary-adapted Laplacian filtering after appropriate Gaussian filters (0.5<=sigma<=1.1) or anisotropic filters (20<=kappa<=50), the uniformity was significantly lower in the necrosis group (p<0.05). Conclusion: Appropriate selection of image preprocessing techniques allows radiomics features to extract more useful information and thereby improve prediction models based on these features.
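    The Gaussian-smoothing step and the two first-order features can be sketched as follows. The histogram bin count and kernel truncation radius are assumptions for illustration; the HU range matches the abstract.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius=None):
    radius = radius or int(3 * sigma)        # truncate at ~3 sigma (assumed)
    x = np.arange(-radius, radius + 1)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return g / g.sum()

def smooth(img, sigma):
    """Separable isotropic Gaussian smoothing with edge replication."""
    g = gaussian_kernel1d(sigma)
    h = len(g) // 2
    conv = lambda v: np.convolve(np.pad(v, h, mode="edge"), g, "valid")
    img = np.apply_along_axis(conv, 1, img.astype(float))
    return np.apply_along_axis(conv, 0, img)

def first_order_features(img, mask, lo=-20, hi=180, nbins=32):
    """Skewness and histogram uniformity inside the GTV mask,
    restricted to the [lo, hi] HU range."""
    vals = img[mask]
    vals = vals[(vals >= lo) & (vals <= hi)]
    mu, sd = vals.mean(), vals.std()
    skewness = np.mean(((vals - mu) / sd) ** 3)
    p, _ = np.histogram(vals, bins=nbins, range=(lo, hi))
    p = p / p.sum()
    uniformity = np.sum(p ** 2)              # sum of squared bin probabilities
    return skewness, uniformity
```

    A low-intensity (necrotic-like) tail drives the skewness negative and lowers the uniformity, which is exactly the group difference the study looks for after smoothing.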

  12. Multidimensional deconvolution of optical microscope and ultrasound imaging using adaptive least-mean-square (LMS) inverse filtering

    NASA Astrophysics Data System (ADS)

    Sapia, Mark Angelo

    2000-11-01

    Three-dimensional microscope images typically suffer from reduced resolution due to the effects of convolution, optical aberrations and out-of-focus blurring. Two-dimensional ultrasound images are also degraded by convolutional blurring and various sources of noise; speckle noise is a major problem in ultrasound images. In microscopy and ultrasound, various methods of digital filtering have been used to improve image quality. Several methods of deconvolution filtering have been used to improve resolution by reversing the convolutional effects, many of which are based on regularization techniques and non-linear constraints. The technique discussed here is a unique linear filter for deconvolving 3D fluorescence microscopy or 2D ultrasound images. The approach is to solve for the filter entirely in the spatial domain, using an adaptive algorithm that converges to an optimum solution for de-blurring and resolution improvement. There are two key advantages of using an adaptive solution: (1) it efficiently solves for the filter coefficients by taking into account all sources of noise and degraded resolution at the same time, and (2) it achieves near-perfect convergence to the ideal linear deconvolution filter. This linear adaptive technique has other advantages, such as avoiding the artifacts of frequency-domain transformations and concurrently adapting to suppress noise. Ultimately, this approach yields better signal-to-noise characteristics with virtually no edge-ringing. Many researchers have not adopted linear techniques because of poor convergence, noise instability and negative-valued data in the results. The methods presented here overcome many of these well-documented disadvantages and provide results that clearly outperform other linear methods and may also outperform regularization and constrained algorithms. In particular, the adaptive solution is most responsible for overcoming the poor performance associated with linear techniques. This linear adaptive approach to deconvolution is demonstrated by restoring blurred phantoms for both microscopy and ultrasound, and by restoring 3D microscope images of biological cells and 2D ultrasound images of human subjects (courtesy of General Electric and Diasonics, Inc.).
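    The core LMS idea can be shown in one dimension: adapt a FIR inverse filter entirely in the spatial domain by nudging its coefficients down the error gradient, using a known (blurred, sharp) training pair such as a phantom. This is a minimal sketch of the LMS update, not the dissertation's multidimensional implementation; the tap count, step size and training signal are assumptions.

```python
import numpy as np

def lms_inverse_filter(blurred, desired, taps=11, mu=0.01, epochs=5):
    """Adapt a centered FIR deconvolution filter with the LMS rule:
    w <- w + mu * error * input_window."""
    w = np.zeros(taps)
    w[taps // 2] = 1.0                       # start from the identity filter
    x = np.pad(blurred.astype(float), taps // 2, mode="edge")
    for _ in range(epochs):
        for n in range(len(desired)):
            u = x[n:n + taps][::-1]          # input window centered on n
            e = desired[n] - w @ u           # restoration error
            w += mu * e * u                  # LMS coefficient update
    return w
```

    After training, convolving the blurred data with `w` approximates the least-squares inverse of the blur, which is the spatial-domain convergence property the abstract emphasises.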

  13. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern

    PubMed Central

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-01-01

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most of the RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, especially higher than those of conventional CFAs in low-light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method. PMID:28657602

  14. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    PubMed

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most of the RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, especially higher than those of conventional CFAs in low-light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.

  15. Material characterization and defect inspection in ultrasound images

    NASA Astrophysics Data System (ADS)

    Zmola, Carl; Segal, Andrew C.; Lovewell, Brian; Mahdavieh, Jacob; Ross, Joseph; Nash, Charles

    1992-08-01

    The use of ultrasonic imaging to analyze defects and characterize materials is critical in the development of non-destructive testing and non-destructive evaluation (NDT/NDE) tools for manufacturing. To develop better quality control and reliability in the manufacturing environment, advanced image processing techniques are useful. For example, through the use of texture filtering on ultrasound images, we have been able to filter characteristic textures from highly textured C-scan images of materials. The materials have highly regular characteristic textures which are of the same resolution and dynamic range as other important features within the image. By applying texture filters and adaptively modifying their filter response, we have examined a family of filters for removing these textures.

  16. Generalization of the Lyot filter and its application to snapshot spectral imaging.

    PubMed

    Gorman, Alistair; Fletcher-Holmes, David William; Harvey, Andrew Robert

    2010-03-15

    A snapshot multi-spectral imaging technique is described which employs multiple cascaded birefringent interferometers to simultaneously spectrally filter and demultiplex multiple spectral images onto a single detector array. Spectral images are recorded directly without the need for inversion and without rejection of light and so the technique offers the potential for high signal-to-noise ratio. An example of an eight-band multi-spectral movie sequence is presented; we believe this is the first such demonstration of a technique able to record multi-spectral movie sequences without the need for computer reconstruction.
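    The cascaded birefringent stages generalise the classical Lyot filter, whose transmission between parallel polarisers is a product of cos² terms, one per stage, with the passband set by the stage optical path differences. This small sketch evaluates that classical product; the quartz birefringence value and the 1:2:4 thickness ratio are assumptions for illustration, not the paper's design.

```python
import numpy as np

def lyot_transmission(wavelengths_nm, thicknesses_um, delta_n=0.0092):
    """Transmission of a classical Lyot stack: each birefringent stage
    between parallel polarisers contributes cos^2(pi * OPD / lambda).
    delta_n ~ quartz birefringence (assumed)."""
    lam = np.asarray(wavelengths_nm, dtype=float)
    t = np.ones_like(lam)
    for d_um in thicknesses_um:
        opd_nm = delta_n * d_um * 1e3     # optical path difference in nm
        t *= np.cos(np.pi * opd_nm / lam) ** 2
    return t
```

    Doubling the thickness from stage to stage halves each passband while the thinnest stage suppresses the unwanted orders, which is the behaviour the snapshot scheme exploits and demultiplexes rather than rejects.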

  17. A robust technique based on VLM and Frangi filter for retinal vessel extraction and denoising.

    PubMed

    Khan, Khan Bahadar; Khaliq, Amir A; Jalil, Abdul; Shahid, Muhammad

    2018-01-01

    The exploration of retinal vessel structure is enormously important on account of numerous diseases, including stroke, Diabetic Retinopathy (DR) and coronary heart disease, which can damage the retinal vessel structure. The retinal vascular network is very hard to extract due to its spreading and diminishing geometry and the contrast variation in an image. The proposed technique consists of separate parallel processes for denoising and for extraction of blood vessels in retinal images. In the preprocessing stage, adaptive histogram equalization enhances the dissimilarity between the vessels and the background, and morphological top-hat filters are employed to eliminate the macula, optic disc, etc. To remove local noise, a difference image is computed from the top-hat-filtered image and the high-boost-filtered image. The Frangi filter is applied at multiple scales to enhance vessels of diverse widths. Segmentation is performed by applying improved Otsu thresholding to the high-boost-filtered image and to Frangi's enhanced image, separately. In the postprocessing steps, a Vessel Location Map (VLM) is extracted using raster-to-vector transformation, and postprocessing is employed in a novel way to reject misclassified vessel pixels. The final segmented image is obtained by a pixel-by-pixel AND operation between the VLM and the Frangi output image. The method has been rigorously analyzed on the STARE, DRIVE and HRF datasets.
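    Two of the pipeline's building blocks are easy to make concrete: Otsu's threshold (maximise between-class variance of the histogram) and the final pixel-wise AND of the two branch masks. This is a generic NumPy sketch of both steps, not the paper's "improved" Otsu variant.

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's threshold: pick the bin edge that maximises the
    between-class variance of the grey-level histogram."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist).astype(float)        # class-0 population
    w1 = w0[-1] - w0                          # class-1 population
    s0 = np.cumsum(hist * mids)
    mu0 = s0 / np.maximum(w0, 1e-12)
    mu1 = (s0[-1] - s0) / np.maximum(w1, 1e-12)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return mids[np.argmax(between)]

def combine_masks(vlm_mask, frangi_mask):
    """Final segmentation: keep only pixels flagged by both branches."""
    return vlm_mask & frangi_mask
```

    Requiring agreement between the VLM branch and the Frangi branch is what rejects pixels misclassified by either branch alone.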

  18. Spectroscopic imaging using acousto-optic tunable filters

    NASA Astrophysics Data System (ADS)

    Bouhifd, Mounir; Whelan, Maurice

    2007-07-01

    We report on novel hyper-spectral imaging filter-modules based on acousto-optic tuneable filters (AOTF). The AOTF functions as a full-field tuneable bandpass filter which offers fast continuous or random access tuning with high filtering efficiency. Due to the diffractive nature of the device, the unfiltered zero-order and the filtered first-order images are geometrically separated. The modules developed exploit this feature to simultaneously route both the transmitted white-light image and the filtered fluorescence image to two separate cameras. Incorporation of prisms in the optical paths and careful design of the relay optics in the filter module have overcome a number of aberrations inherent to imaging through AOTFs, leading to excellent spatial resolution. A number of practical uses of this technique, both for in vivo auto-fluorescence endoscopy and in vitro fluorescence microscopy were demonstrated. We describe the operational principle and design of recently improved prototype instruments for fluorescence-based diagnostics and demonstrate their performance by presenting challenging hyper-spectral fluorescence imaging applications.

  19. Edge enhancement and image equalization by unsharp masking using self-adaptive photochromic filters.

    PubMed

    Ferrari, José A; Flores, Jorge L; Perciante, César D; Frins, Erna

    2009-07-01

    A new method for real-time edge enhancement and image equalization using photochromic filters is presented. The reversible self-adaptive capacity of photochromic materials is used for creating an unsharp mask of the original image. This unsharp mask produces a kind of self filtering of the original image. Unlike the usual Fourier (coherent) image processing, the technique we propose can also be used with incoherent illumination. Validation experiments with Bacteriorhodopsin and photochromic glass are presented.
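    A digital analogue of the optical unsharp-masking scheme can be sketched in a few lines; here the photochromic filter's role (a blurred copy of the image) is played by a Gaussian blur, with `sigma` and `amount` as illustrative parameters.

```python
import numpy as np
from scipy import ndimage

def unsharp_edges(img, sigma=3.0, amount=1.0):
    mask = ndimage.gaussian_filter(img, sigma)  # the "unsharp mask": blurred copy
    return img + amount * (img - mask)          # boost deviations from the mask

# a step edge: unsharp masking produces the classic over/undershoot at the edge
step = np.zeros((32, 32))
step[:, 16:] = 1.0
enhanced = unsharp_edges(step)
```

Flat regions far from the edge are left unchanged, while the edge itself is accentuated by overshoot on the bright side and undershoot on the dark side.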

  20. Multiscale image fusion using the undecimated wavelet transform with spectral factorization and nonorthogonal filter banks.

    PubMed

    Ellmauthaler, Andreas; Pagliari, Carla L; da Silva, Eduardo A B

    2013-03-01

    Multiscale transforms are among the most popular techniques in the field of pixel-level image fusion. However, the fusion performance of these methods often deteriorates for images derived from different sensor modalities. In this paper, we demonstrate that for such images, results can be improved using a novel undecimated wavelet transform (UWT)-based fusion scheme, which splits the image decomposition process into two successive filtering operations using spectral factorization of the analysis filters. The actual fusion takes place after convolution with the first filter pair. Its significantly smaller support size leads to the minimization of the unwanted spreading of coefficient values around overlapping image singularities. This usually complicates the feature selection process and may lead to the introduction of reconstruction errors in the fused image. Moreover, we will show that the nonsubsampled nature of the UWT allows the design of nonorthogonal filter banks, which are more robust to artifacts introduced during fusion, additionally improving the obtained results. The combination of these techniques leads to a fusion framework, which provides clear advantages over traditional multiscale fusion approaches, independent of the underlying fusion rule, and reduces unwanted side effects such as ringing artifacts in the fused reconstruction.
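    The undecimated decomposition and a simple fusion rule can be sketched as follows. This is a generic à trous (undecimated) wavelet fusion with a max-absolute-coefficient rule, not the paper's spectrally factorized, nonorthogonal filter bank; the B3-spline kernel and level count are illustrative.

```python
import numpy as np
from scipy import ndimage

def atrous_decompose(img, levels=3):
    """Undecimated (a trous) wavelet: detail planes + residual approximation."""
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    approx, details = img.astype(float), []
    for j in range(levels):
        k = np.zeros((len(kernel) - 1) * 2 ** j + 1)
        k[:: 2 ** j] = kernel                    # insert "holes" (zeros) per level
        smooth = ndimage.convolve1d(ndimage.convolve1d(approx, k, axis=0), k, axis=1)
        details.append(approx - smooth)
        approx = smooth
    return details, approx

def fuse(img_a, img_b, levels=3):
    da, ra = atrous_decompose(img_a, levels)
    db, rb = atrous_decompose(img_b, levels)
    # max-absolute-coefficient rule on details, averaged residual
    fused = [np.where(np.abs(a) >= np.abs(b), a, b) for a, b in zip(da, db)]
    return sum(fused) + 0.5 * (ra + rb)

# two images each carrying a feature the other lacks
img_a = np.zeros((64, 64)); img_a[8:18, 8:18] = 1.0
img_b = np.zeros((64, 64)); img_b[46:56, 46:56] = 1.0
d, r = atrous_decompose(img_a)
fused = fuse(img_a, img_b)
```

Because the transform is undecimated, reconstruction is simply the sum of detail planes plus the residual, so the single-image decomposition is exactly invertible.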

  1. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    PubMed

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.
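    The Kalman recursion underlying such reconstructions can be illustrated on a scalar time series. This is a generic random-walk Kalman filter, not the paper's dynamic MRI model; the process and measurement variances `q` and `r` are illustrative.

```python
import numpy as np

def kalman_1d(measurements, q=1e-3, r=0.05):
    """Scalar random-walk Kalman filter: x_k = x_{k-1} + w_k,  z_k = x_k + v_k."""
    x, p = float(measurements[0]), 1.0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: state variance grows by q
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measurement residual
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
truth = np.ones(200)
noisy = truth + 0.3 * rng.standard_normal(200)
est = kalman_1d(noisy, r=0.3 ** 2)
```

Each new sample costs only a few arithmetic operations, which is why such recursions suit real-time reconstruction.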

  2. Kalman Filter Techniques for Accelerated Cartesian Dynamic Cardiac Imaging

    PubMed Central

    Feng, Xue; Salerno, Michael; Kramer, Christopher M.; Meyer, Craig H.

    2012-01-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories, because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and SNR. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. PMID:22926804

  3. Filtering of high noise breast thermal images using fast non-local means.

    PubMed

    Suganthi, S S; Ramakrishnan, S

    2014-01-01

    Analysis of breast thermograms remains challenging, primarily because of limitations such as low contrast, low signal-to-noise ratio, and the absence of clear edges. Preprocessing is therefore always required before any quantitative analysis. In this work, a noise removal framework using the fast non-local means algorithm, method noise, and a median filter was used to denoise breast thermograms. The images were first subjected to the Anscombe transformation to convert the noise distribution from Poisson to Gaussian. The pre-denoised image was obtained by applying fast non-local means filtering to the transformed image. The method noise, which is the difference between the original and pre-denoised images, contains the noise component merged with a few structures and fine details of the image. The image details present in the method noise were extracted by smoothing the noise with the median filter. The retrieved details were added to the pre-denoised image to obtain the final denoised image. The performance of this technique was compared with that of the Wiener and SUSAN filters. The results show that all the filters considered are able to remove the noise component. The proposed denoising framework performs well in preserving detail while removing noise, and its method noise contains negligible image detail. The Wiener filter produced a denoised image with no noise but smoothed edges, and its method noise retained a few structures and image details. The SUSAN filter produced a blurred denoised image with residual noise, and its method noise contained extensive structure and image detail. Hence, the proposed denoising framework preserves edge information and generates clear images that could enhance the diagnostic relevance of breast thermograms.
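    The method-noise framework can be sketched as follows. As an assumption for brevity, a Gaussian filter stands in for the fast non-local means pre-denoiser and the Anscombe transform is omitted; any denoiser callable can be passed in its place.

```python
import numpy as np
from scipy import ndimage

def method_noise_denoise(noisy, denoiser, median_size=3):
    pre = denoiser(noisy)                       # pre-denoised image
    method_noise = noisy - pre                  # lost detail + residual noise
    # median-smooth the method noise to recover detail while rejecting noise
    details = ndimage.median_filter(method_noise, size=median_size)
    return pre + details                        # add recovered detail back

rng = np.random.default_rng(2)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
out = method_noise_denoise(noisy, lambda im: ndimage.gaussian_filter(im, 2.0))
```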

  4. Multispectral and geomorphic studies of processed Voyager 2 images of Europa

    NASA Technical Reports Server (NTRS)

    Meier, T. A.

    1984-01-01

    High resolution images of Europa taken by the Voyager 2 spacecraft were used to study a portion of Europa's dark lineations and the major white line feature Agenor Linea. Initial image processing of images 1195J2-001 (violet filter), 1198J2-001 (blue filter), 1201J2-001 (orange filter), and 1204J2-001 (ultraviolet filter) was performed at the U.S.G.S. Branch of Astrogeology in Flagstaff, Arizona. Processing was completed through the stages of image registration and color ratio image construction. Pixel printouts were used in a new technique of linear feature profiling to compensate for image misregistration through the mapping of features on the printouts. In all, 193 dark lineation segments were mapped and profiled. The more accurate multispectral data derived by this method was plotted using a new application of the ternary diagram, with orange, blue, and violet relative spectral reflectances serving as end members. Statistical techniques were then applied to the ternary diagram plots. The image products generated at LPI were used mainly to cross-check and verify the results of the ternary diagram analysis.

  5. Comparison of edge detection techniques for M7 subtype Leukemic cell in terms of noise filters and threshold value

    NASA Astrophysics Data System (ADS)

    Salam, Afifah Salmi Abdul; Isa, Mohd. Nazrin Md.; Ahmad, Muhammad Imran; Che Ismail, Rizalafande

    2017-11-01

    This paper focuses on identifying suitable threshold values for two commonly used edge detection techniques, Sobel and Canny. The aim is to determine which values give accurate results in identifying a particular leukemic cell. Evaluating the suitability of edge detectors is also essential, as feature extraction of the cell depends greatly on image segmentation (edge detection). First, an image of the M7 subtype of Acute Myelocytic Leukemia (AML) is chosen because its diagnosis has been found lacking. Next, to enhance image quality, noise filters are applied; comparing images with no filter, a median filter, and an average filter provides useful information. Threshold values of 0, 0.25, and 0.5 are evaluated. The investigation found that, without any filter, Canny with a threshold value of 0.5 yields the best result.
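    The filter-then-detect comparison can be sketched with a Sobel magnitude detector. This is illustrative only: thresholds are applied to the normalized gradient magnitude, and a median prefilter is compared against no filter on a synthetic noisy step edge rather than a real blood cell image.

```python
import numpy as np
from scipy import ndimage

def sobel_edges(img, threshold):
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    mag = np.hypot(gx, gy)
    mag = mag / max(mag.max(), 1e-12)   # normalize so thresholds lie in [0, 1]
    return mag > threshold

rng = np.random.default_rng(3)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                       # synthetic step edge
img += 0.2 * rng.standard_normal(img.shape)

edges_raw = sobel_edges(img, 0.4)                                 # no prefilter
edges_med = sobel_edges(ndimage.median_filter(img, size=3), 0.4)  # median prefilter
```

The median prefilter suppresses spurious responses from noise while leaving the step edge intact, so the filtered image yields a cleaner edge map.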

  6. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum.

    PubMed

    Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M

    2015-06-21

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has a better accuracy (0.13 μs versus 0.18 μs standard deviations), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. A higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
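    Matched filtering of a coded chirp excitation with two overlapping paths can be sketched as follows; the sampling rate, delays, amplitudes, and noise level are all assumed for illustration.

```python
import numpy as np
from scipy.signal import chirp

fs = 1e6                                    # assumed 1 MHz sampling rate
t = np.arange(1000) / fs
tx = chirp(t, f0=50e3, t1=t[-1], f1=200e3)  # coded excitation (linear chirp)

# received signal: direct path (delay 100 samples) plus reflected path (delay 260)
rng = np.random.default_rng(6)
rx = np.zeros(1300)
rx[100:1100] += tx
rx[260:1260] += 0.6 * tx
rx += 0.05 * rng.standard_normal(rx.size)

# matched filter = cross-correlation of the output with the input chirp
mf = np.correlate(rx, tx, mode="valid")

# peak picking: strongest lag, then strongest lag outside a guard window
lag1 = int(np.argmax(np.abs(mf)))
mf2 = np.abs(mf).copy()
mf2[max(0, lag1 - 20):lag1 + 20] = 0
lag2 = int(np.argmax(mf2))
transit_times = sorted((lag1, lag2))        # in samples; divide by fs for seconds
```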

  7. Apodized RFI filtering of synthetic aperture radar images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doerry, Armin Walter

    2014-02-01

    Fine resolution Synthetic Aperture Radar (SAR) systems necessarily require wide bandwidths that often overlap spectrum utilized by other wireless services. These other emitters pose a source of Radio Frequency Interference (RFI) to the SAR echo signals that degrades SAR image quality. Filtering, or excising, the offending spectral contaminants will mitigate the interference, but at a cost of often degrading the SAR image in other ways, notably by raising offensive sidelobe levels. This report proposes borrowing an idea from nonlinear sidelobe apodization techniques to suppress interference without the attendant increase in sidelobe levels. The simple post-processing technique is termed Apodized RFI Filtering (ARF).

  8. Imaging through scattering media by Fourier filtering and single-pixel detection

    NASA Astrophysics Data System (ADS)

    Jauregui-Sánchez, Y.; Clemente, P.; Lancis, J.; Tajahuerce, E.

    2018-02-01

    We present a novel imaging system that combines the principles of Fourier spatial filtering and single-pixel imaging in order to recover images of an object hidden behind a turbid medium by transillumination. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of images in both cases. Furthermore, we show that the combination of single-pixel imaging and Fourier spatial filtering techniques is particularly well adapted to provide images of objects transmitted through scattering media.

  9. An automatic optimum kernel-size selection technique for edge enhancement

    USGS Publications Warehouse

    Chavez, Pat S.; Bauer, Brian P.

    1982-01-01

    Edge enhancement is a technique that can be considered, to a first order, a correction for the modulation transfer function of an imaging system. Digital imaging systems sample a continuous function at discrete intervals so that high-frequency information cannot be recorded at the same precision as lower frequency data. Because of this, fine detail or edge information in digital images is lost. Spatial filtering techniques can be used to enhance the fine detail information that does exist in the digital image, but the filter size is dependent on the type of area being processed. A technique has been developed by the authors that uses the horizontal first difference to automatically select the optimum kernel-size that should be used to enhance the edges that are contained in the image. 
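    A toy version of activity-driven kernel-size selection might look like this; using the horizontal first difference comes from the abstract, but the activity thresholds and candidate sizes are hypothetical.

```python
import numpy as np

def auto_kernel_size(img, sizes=(3, 5, 7, 9)):
    # activity measure: mean absolute horizontal first difference
    activity = np.mean(np.abs(np.diff(img, axis=1)))
    # hypothetical thresholds for images scaled to [0, 1]:
    # busier images get smaller enhancement kernels
    if activity > 0.10:
        return sizes[0]
    if activity > 0.05:
        return sizes[1]
    if activity > 0.02:
        return sizes[2]
    return sizes[3]

busy = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)  # checkerboard
smooth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))         # gentle ramp
```

The selected size could then drive, e.g., an unsharp-masking kernel for the actual edge enhancement step.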

  10. Acousto-optical tunable filter for combined wideband, spectral, and optical coherence microscopy.

    PubMed

    Machikhin, Alexander S; Pozhar, Vitold E; Viskovatykh, Alexander V; Burmak, Ludmila I

    2015-09-01

    A multimodal technique for inspection of microscopic objects by means of wideband optical microscopy, spectral microscopy, and optical coherence microscopy is described, implemented, and tested. The key feature is the spectral selection of light in the output arm of an interferometer with use of the specialized imaging acousto-optical tunable filter. In this filter, two interfering optical beams are diffracted via the same ultrasound wave without destruction of interference image structure. The basic requirements for the acousto-optical tunable filter are defined, and mathematical formulas for calculation of its parameters are derived. Theoretical estimation of the achievable accuracy of the 3D image reconstruction is presented and experimental proofs are given. It is demonstrated that spectral imaging can also be accompanied by measurement of the quantitative reflectance spectra. Examples of inspection of optically transparent and nontransparent samples demonstrate the applicability of the technique.

  11. Silicon oxide nanoparticles doped PQ-PMMA for volume holographic imaging filters.

    PubMed

    Luo, Yuan; Russo, Juan M; Kostuk, Raymond K; Barbastathis, George

    2010-04-15

    Holographic imaging filters are required to have high Bragg selectivity, namely, narrow angular and spectral bandwidth, to obtain spatial-spectral information within a three-dimensional object. In this Letter, we present the design of holographic imaging filters formed using silicon oxide nanoparticles (nano-SiO(2)) in phenanthrenquinone-poly(methyl methacrylate) (PQ-PMMA) polymer recording material. This combination offers greater Bragg selectivity and increases the diffraction efficiency of holographic filters. The holographic filters with optimized ratio of nano-SiO(2) in PQ-PMMA can significantly improve the performance of Bragg selectivity and diffraction efficiency by 53% and 16%, respectively. We present experimental results and data analysis demonstrating this technique in use for holographic spatial-spectral imaging filters.

  12. Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering

    NASA Astrophysics Data System (ADS)

    Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.

    2015-01-01

    Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures, and reduces the overall image quality. This paper shows an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement on signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
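    The two stages can be sketched as temporal averaging of aligned frames followed by Perona-Malik diffusion, a common anisotropic diffusion filter; the paper's exact variant and parameters are not reproduced here, and the values below are illustrative.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.5, lam=0.15):
    """Perona-Malik anisotropic diffusion with the exponential conductance."""
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(u, 1, axis=0) - u        # differences to the 4 neighbours
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# synthetic speckled frames (multiplicative noise), assumed already aligned
rng = np.random.default_rng(4)
clean = np.zeros((64, 64)); clean[:, 32:] = 1.0
frame1 = clean * (1 + 0.2 * rng.standard_normal(clean.shape))
frame2 = clean * (1 + 0.2 * rng.standard_normal(clean.shape))
compound = 0.5 * (frame1 + frame2)            # temporal compounding
out = perona_malik(compound)                  # anisotropic diffusion filtering
```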

  13. Estimated spectrum adaptive postfilter and the iterative prepost filtering algorithms

    NASA Technical Reports Server (NTRS)

    Linares, Irving (Inventor)

    2004-01-01

    The invention presents the Estimated Spectrum Adaptive Postfilter (ESAP) and the Iterative Prepost Filter (IPF) algorithms. These algorithms model a number of image-adaptive post-filtering and pre-post filtering methods. They are designed to minimize the Discrete Cosine Transform (DCT) blocking distortion caused when images are highly compressed with the Joint Photographic Experts Group (JPEG) standard. The ESAP and IPF techniques of the present invention minimize the mean square error (MSE) to improve the objective and subjective quality of low-bit-rate JPEG gray-scale images while simultaneously enhancing perceptual visual quality with respect to baseline JPEG images.

  14. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    NASA Astrophysics Data System (ADS)

    Chockalingam, Letchumanan

    2005-01-01

    LANDSAT data for the Gunung Ledang region of Malaysia are used to map certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. Because these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To examine these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, providing comparisons between the results derived from spectrally based and structurally based filtering techniques.

  15. Adaptive box filters for removal of random noise from digital images

    USGS Publications Warehouse

    Eliason, E.M.; McEwen, A.S.

    1990-01-01

    We have developed adaptive box-filtering algorithms to (1) remove random bit errors (pixel values with no relation to the image scene) and (2) smooth noisy data (pixels related to the image scene but with an additive or multiplicative component of noise). For both procedures, we use the standard deviation (??) of those pixels within a local box surrounding each pixel, hence they are adaptive filters. This technique effectively reduces speckle in radar images without eliminating fine details. -from Authors
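    The first procedure (bit-error removal) can be sketched with a k-sigma rule over a local box; the box size, the value of k, and replacement by the local mean are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np
from scipy import ndimage

def adaptive_box_filter(img, size=5, k=2.0):
    """Flag pixels deviating more than k local standard deviations from the
    local box mean and replace them by that mean (random bit-error removal)."""
    mean = ndimage.uniform_filter(img, size)
    sq_mean = ndimage.uniform_filter(img ** 2, size)
    std = np.sqrt(np.maximum(sq_mean - mean ** 2, 0.0))
    outliers = np.abs(img - mean) > k * std
    return np.where(outliers, mean, img)

rng = np.random.default_rng(5)
img = 0.5 + 0.01 * rng.standard_normal((64, 64))
img[10, 10] = 5.0                 # simulated random bit error (bright spike)
out = adaptive_box_filter(img)
```

Because the threshold scales with the local standard deviation, genuinely textured regions are largely left alone while isolated spikes are suppressed.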

  16. Enhancement of IVR images by combining an ICA shrinkage filter with a multi-scale filter

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Wei; Matsuo, Kiyotaka; Han, Xianhua; Shimizu, Atsumoto; Shibata, Koichi; Mishina, Yukio; Mukuta, Yoshihiro

    2007-11-01

    Interventional Radiology (IVR) is an important technique for visualizing and diagnosing vascular disease. In real medical applications, a weak x-ray radiation source is used for imaging in order to reduce the radiation dose, resulting in low-contrast, noisy images. It is important to develop a method that smooths out the noise while enhancing the vascular structure. In this paper, we propose combining an ICA shrinkage filter with a multiscale filter for the enhancement of IVR images. The ICA shrinkage filter is used for noise reduction, and the multiscale filter is used for enhancement of the vascular structure. Experimental results show that the quality of the image can be dramatically improved without any edge blurring by the proposed method. Simultaneous noise reduction and vessel enhancement have been achieved.

  17. Improving Image Matching by Reducing Surface Reflections Using Polarising Filter Techniques

    NASA Astrophysics Data System (ADS)

    Conen, N.; Hastedt, H.; Kahmen, O.; Luhmann, T.

    2018-05-01

    In dense stereo matching applications, surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising direction of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed, and the special reflection properties of metallic surfaces are presented.

  18. Speckle noise reduction in ultrasound images using a discrete wavelet transform-based image fusion technique.

    PubMed

    Choi, Hyun Ho; Lee, Ju Hwan; Kim, Sung Min; Park, Sung Yun

    2015-01-01

    Here, the speckle noise in ultrasonic images is removed using an image fusion-based denoising method. To optimize the denoising performance, each discrete wavelet transform (DWT) and filtering technique was analyzed and compared. In addition, the performances were compared in order to derive the optimal input conditions. To evaluate the speckle noise removal performance, an image fusion algorithm was applied to the ultrasound images, and comparatively analyzed with the original image without the algorithm. As a result, applying DWT and filtering techniques caused information loss and noise characteristics, and did not represent the most significant noise reduction performance. Conversely, an image fusion method applying SRAD-original conditions preserved the key information in the original image, and the speckle noise was removed. Based on such characteristics, the input conditions of SRAD-original had the best denoising performance with the ultrasound images. From this study, the best denoising technique proposed based on the results was confirmed to have a high potential for clinical application.

  19. Imaging Multi-Order Fabry-Perot Spectrometer (IMOFPS) for spaceborne measurements of CO

    NASA Astrophysics Data System (ADS)

    Johnson, Brian R.; Kampe, Thomas U.; Cook, William B.; Miecznik, Grzegorz; Novelli, Paul C.; Snell, Hilary E.; Turner-Valle, Jennifer A.

    2003-11-01

    An instrument concept for an Imaging Multi-Order Fabry-Perot Spectrometer (IMOFPS) has been developed for measuring tropospheric carbon monoxide (CO) from space. The concept is based upon a correlation technique similar in nature to multi-order Fabry-Perot (FP) interferometer or gas filter radiometer techniques, which simultaneously measure atmospheric emission from several infrared vibration-rotation lines of CO. Correlation techniques provide a multiplex advantage for increased throughput, high spectral resolution and selectivity necessary for profiling tropospheric CO. Use of unconventional multilayer interference filter designs leads to improvement in CO spectral line correlation compared with the traditional FP multi-order technique, approaching the theoretical performance of gas filter correlation radiometry. In this implementation, however, the gas cell is replaced with a simple, robust solid interference filter. In addition to measuring CO, the correlation filter technique can be applied to measurements of other important gases such as carbon dioxide, nitrous oxide and methane. Imaging the scene onto a 2-D detector array enables a limited range of spectral sampling owing to the field-angle dependence of the filter transmission function. An innovative anamorphic optical system provides a relatively large instrument field-of-view for imaging along the orthogonal direction across the detector array. An important advantage of the IMOFPS concept is that it is a small, low mass and high spectral resolution spectrometer having no moving parts. A small, correlation spectrometer like IMOFPS would be well suited for global observations of CO2, CO, and CH4 from low Earth or regional observations from Geostationary orbit. A prototype instrument is in development for flight demonstration on an airborne platform with potential applications to atmospheric chemistry, wild fire and biomass burning, and chemical dispersion monitoring.

  20. Using quantum filters to process images of diffuse axonal injury

    NASA Astrophysics Data System (ADS)

    Pineda Osorio, Mateo

    2014-06-01

    Some images corresponding to diffuse axonal injury (DAI) are processed using several quantum filters, such as Hermite, Weibull, and Morse. Diffuse axonal injury is a particular, common, and severe case of traumatic brain injury (TBI). DAI involves global damage to brain tissue on the microscopic scale and causes serious neurologic abnormalities. New imaging techniques provide excellent images showing cellular damage related to DAI. These images can be processed with quantum filters, which achieve high resolution of dendritic and axonal structures in both normal and pathological states. Using the Laplacian operators from the new quantum filters, excellent edge detectors for neurofiber resolution are obtained. Quantum processing of DAI images is performed using computer algebra, specifically Maple. The construction of quantum filter plugins is proposed as a future research line; such plugins could be incorporated into the ImageJ software package, making their use simpler for medical personnel.

  1. Optimal focal-plane restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1989-01-01

    Image restoration can be implemented efficiently by calculating the convolution of the digital image and a small kernel during image acquisition. Processing the image in the focal-plane in this way requires less computation than traditional Fourier-transform-based techniques such as the Wiener filter and constrained least-squares filter. Here, the values of the convolution kernel that yield the restoration with minimum expected mean-square error are determined using a frequency analysis of the end-to-end imaging system. This development accounts for constraints on the size and shape of the spatial kernel and all the components of the imaging system. Simulation results indicate the technique is effective and efficient.

  2. Integrated circuit layer image segmentation

    NASA Astrophysics Data System (ADS)

    Masalskis, Giedrius; Petrauskas, Romas

    2010-09-01

    In this paper we present IC layer image segmentation techniques specifically created for precise metal layer feature extraction. During our research we used many samples of real-life de-processed IC metal layer images obtained using an optical light microscope. We have created sequences of image processing filters that provide segmentation results of sufficient precision for our application. The filter sequences were fine-tuned to provide the best possible results depending on the properties of the IC manufacturing process and imaging technology. The proposed IC image segmentation filter sequences were experimentally tested and compared with conventional direct segmentation algorithms.

  3. Time Domain Filtering of Resolved Images of Sgr A{sup ∗}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shiokawa, Hotaka; Doeleman, Sheperd S.; Gammie, Charles F.

    The goal of the Event Horizon Telescope (EHT) is to provide spatially resolved images of Sgr A*, the source associated with the Galactic Center black hole. Because Sgr A* varies on timescales that are short compared to an EHT observing campaign, it is interesting to ask whether variability contains information about the structure and dynamics of the accretion flow. In this paper, we introduce “time-domain filtering,” a technique to filter time fluctuating images with specific temporal frequency ranges and to demonstrate the power and usage of the technique by applying it to mock millimeter wavelength images of Sgr A*. The mock image data is generated from the General Relativistic Magnetohydrodynamic (GRMHD) simulation and the general relativistic ray-tracing method. We show that the variability on each line of sight is tightly correlated with a typical radius of emission. This is because disk emissivity fluctuates on a timescale of the order of the local orbital period. Time-domain filtered images therefore reflect the model dependent emission radius distribution, which is not accessible in time-averaged images. We show that, in principle, filtered data have the power to distinguish between models with different black-hole spins, different disk viewing angles, and different disk orientations in the sky.

  4. Time Domain Filtering of Resolved Images of Sgr A∗

    NASA Astrophysics Data System (ADS)

    Shiokawa, Hotaka; Gammie, Charles F.; Doeleman, Sheperd S.

    2017-09-01

    The goal of the Event Horizon Telescope (EHT) is to provide spatially resolved images of Sgr A*, the source associated with the Galactic Center black hole. Because Sgr A* varies on timescales that are short compared to an EHT observing campaign, it is interesting to ask whether variability contains information about the structure and dynamics of the accretion flow. In this paper, we introduce “time-domain filtering,” a technique to filter time fluctuating images with specific temporal frequency ranges and to demonstrate the power and usage of the technique by applying it to mock millimeter wavelength images of Sgr A*. The mock image data is generated from the General Relativistic Magnetohydrodynamic (GRMHD) simulation and the general relativistic ray-tracing method. We show that the variability on each line of sight is tightly correlated with a typical radius of emission. This is because disk emissivity fluctuates on a timescale of the order of the local orbital period. Time-domain filtered images therefore reflect the model dependent emission radius distribution, which is not accessible in time-averaged images. We show that, in principle, filtered data have the power to distinguish between models with different black-hole spins, different disk viewing angles, and different disk orientations in the sky.
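    Per-pixel temporal filtering of an image sequence can be sketched with an FFT along the time axis; the hard band-pass mask and the synthetic two-pixel movie below are illustrative, not the paper's GRMHD data.

```python
import numpy as np

def time_domain_filter(movie, f_lo, f_hi, fs=1.0):
    """Band-pass each pixel's time series. movie has shape (time, ny, nx)."""
    n = movie.shape[0]
    spec = np.fft.rfft(movie, axis=0)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    keep = (freqs >= f_lo) & (freqs <= f_hi)
    spec[~keep] = 0                        # zero everything outside the band
    return np.fft.irfft(spec, n=n, axis=0)

# synthetic movie: one pixel varies slowly, another quickly
t = np.arange(256)
movie = np.zeros((256, 4, 4))
movie[:, 0, 0] = np.sin(2 * np.pi * 0.0625 * t)   # slow variability
movie[:, 1, 1] = np.sin(2 * np.pi * 0.1875 * t)   # fast variability
fast = time_domain_filter(movie, 0.15, 0.25)      # isolate the fast component
```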

  5. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
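    The homomorphic idea, take the logarithm so the multiplicative grid pattern becomes additive, notch out the grid frequency, and exponentiate back, can be sketched as follows. The synthetic "anatomy", grid frequency, and Gaussian notch bandwidth are illustrative assumptions; the paper's band-stop/low-pass designs and rotated-grid optimization are not reproduced:

```python
import numpy as np

def homomorphic_grid_removal(img, grid_freq, bandwidth=0.01):
    """Suppress a multiplicative grid pattern: log -> frequency notch -> exp.

    img: 2D array of positive values; grid_freq: (fy, fx) in cycles/pixel.
    """
    log_img = np.log(img)                      # multiplicative -> additive
    F = np.fft.fft2(log_img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    # Gaussian notch at +/- the grid frequency
    for sy, sx in [(1, 1), (-1, -1)]:
        d2 = (fy - sy * grid_freq[0]) ** 2 + (fx - sx * grid_freq[1]) ** 2
        F *= 1.0 - np.exp(-d2 / (2 * bandwidth ** 2))
    return np.exp(np.fft.ifft2(F).real)

# Smooth "anatomy" multiplied by a vertical grid at 0.25 cycles/pixel
y, x = np.mgrid[0:128, 0:128]
anatomy = 1.0 + 0.5 * np.sin(2 * np.pi * y / 128.0)
grid = 1.0 + 0.2 * np.cos(2 * np.pi * 0.25 * x)
clean = homomorphic_grid_removal(anatomy * grid, grid_freq=(0.0, 0.25))
```

    Because the notch operates on the log image, the grid's multiplicative modulation is removed without the wide-band filtering that a purely additive model would require.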

  6. Two-dimensional real-time imaging system for subtraction angiography using an iodine filter

    NASA Astrophysics Data System (ADS)

    Umetani, Keiji; Ueda, Ken; Takeda, Tohoru; Anno, Izumi; Itai, Yuji; Akisada, Masayoshi; Nakajima, Teiichi

    1992-01-01

    A new type of subtraction imaging system was developed using an iodine filter and a single-energy broad-bandwidth monochromatized x ray. The x-ray images of coronary arteries made after intravenous injection of a contrast agent are enhanced by an energy-subtraction technique. Filter chopping of the x-ray beam switches energies rapidly, so that a nearly simultaneous pair of filtered and nonfiltered images can be made. By using a high-speed video camera, a pair of 512 × 512 pixel images can be obtained within 9 ms. Three hundred eighty-four images (raw data) are stored in a 144-Mbyte frame memory. After phantom studies, in vivo subtracted images of coronary arteries in dogs were obtained at a rate of 15 images/s.

  7. Iodine filter imaging system for subtraction angiography using synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Umetani, K.; Ueda, K.; Takeda, T.; Itai, Y.; Akisada, M.; Nakajima, T.

    1993-11-01

    A new type of real-time imaging system was developed for transvenous coronary angiography. A combination of an iodine filter and a single energy broad-bandwidth X-ray produces two-energy images for the iodine K-edge subtraction technique. X-ray images are sequentially converted to visible images by an X-ray image intensifier. By synchronizing the timing of the movement of the iodine filter into and out of the X-ray beam, two output images of the image intensifier are focused side by side on the photoconductive layer of a camera tube by an oscillating mirror. Both images are read out by electron beam scanning of a 1050-scanning-line video camera within a camera frame time of 66.7 ms. One hundred ninety two pairs of iodine-filtered and non-iodine-filtered images are stored in the frame memory at a rate of 15 pairs/s. In vivo subtracted images of coronary arteries in dogs were obtained in the form of motion pictures.

  8. Computational multispectral video imaging [Invited].

    PubMed

    Wang, Peng; Menon, Rajesh

    2018-01-01

    Multispectral imagers reveal information imperceptible to humans and conventional cameras. Here, we demonstrate a compact single-shot multispectral video-imaging camera by placing a micro-structured diffractive filter in close proximity to the image sensor. The diffractive filter converts spectral information to a spatial code on the sensor pixels. Following a calibration step, this code can be inverted via regularization-based linear algebra to compute the multispectral image. We experimentally demonstrated spectral resolution of 9.6 nm within the visible band (430-718 nm). We further show that the spatial resolution is enhanced by over 30% compared with the case without the diffractive filter. We also demonstrate Vis-IR imaging with the same sensor. Because no absorptive color filters are utilized, sensitivity is preserved as well. Finally, the diffractive filters can be easily manufactured using optical lithography and replication techniques.

  9. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    NASA Astrophysics Data System (ADS)

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and on the performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water injected with the Tc-99m radionuclide, was used for spectra, uniformity, and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cut-off frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The material filter decreased the count rate in both the Compton region and the photopeak region of the Tc-99m energy spectrum. Spatial resolution was improved; however, the uniformity of the tomographic image was equivocal, and system volume sensitivity was reduced by the material filter. Since the material filter improved the system's spatial resolution, the technique may be used for phantom studies to improve the image quality.

  10. Superharmonic imaging with chirp coded excitation: filtering spectrally overlapped harmonics.

    PubMed

    Harput, Sevan; McLaughlan, James; Cowell, David M J; Freear, Steven

    2014-11-01

    Superharmonic imaging improves the spatial resolution by using the higher order harmonics generated in tissue. The superharmonic component is formed by combining the third, fourth, and fifth harmonics, which have low energy content and therefore poor SNR. This study uses coded excitation to increase the excitation energy. The SNR improvement is achieved on the receiver side by performing pulse compression with harmonic matched filters. The use of coded signals also introduces new filtering capabilities that are not possible with pulsed excitation. This is especially important when using wideband signals. For narrowband signals, the spectral boundaries of the harmonics are clearly separated and thus easy to filter; however, the available imaging bandwidth is underused. Wideband excitation is preferable for harmonic imaging applications to preserve axial resolution, but it generates spectrally overlapping harmonics that are not possible to filter in the time and frequency domains. After pulse compression, this overlap increases the range side lobes, which appear as imaging artifacts and reduce the B-mode image quality. In this study, the isolation of higher order harmonics was achieved in another domain by using the fan chirp transform (FChT). To show the effect of excitation bandwidth in superharmonic imaging, measurements were performed using linear frequency modulated chirp excitation with bandwidths varying from 10% to 50%. Superharmonic imaging was performed on a wire phantom using a wideband chirp excitation. Results were presented with and without applying the FChT filtering technique by comparing the spatial resolution and side lobe levels. Wideband excitation signals achieved better resolution as expected; however, range side lobes as high as -23 dB were observed for the superharmonic component of chirp excitation with 50% fractional bandwidth.
The proposed filtering technique achieved >50 dB range side lobe suppression and improved the image quality without affecting the axial resolution.
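    The basic pulse-compression step, correlating the received line with the transmitted linear FM chirp (a matched filter), can be sketched as below. The sampling rate, chirp parameters, echo delay, and noise level are illustrative assumptions; the harmonic matched filters and FChT processing of the study are not reproduced:

```python
import numpy as np
from scipy.signal import chirp

fs = 10e6                                  # assumed 10 MHz sampling rate
t = np.arange(0, 10e-6, 1 / fs)            # 10 us transmit pulse (100 samples)
tx = chirp(t, f0=2.0e6, t1=t[-1], f1=3.0e6) * np.hanning(t.size)  # 2-3 MHz LFM

# A weak echo buried in noise, starting at sample 300
rng = np.random.default_rng(0)
rx = np.zeros(1000)
rx[300:300 + tx.size] += 0.1 * tx
rx += 0.01 * rng.standard_normal(rx.size)

# Pulse compression: matched filtering = correlation with the transmit chirp
compressed = np.correlate(rx, tx, mode='same')
peak = int(np.argmax(np.abs(compressed)))  # echo center near sample 350
```

    The compressed peak concentrates the chirp's energy into a short pulse, recovering SNR; the range side lobes around that peak are exactly the artifact the FChT-based filtering of the paper is designed to suppress.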

  11. Digital processing of radiographic images

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.

    1973-01-01

    Some techniques, together with their software documentation, are presented for the digital enhancement of radiographs. Both image handling and image processing operations are considered. The image handling operations dealt with are: (1) conversion of data format from packed to unpacked and vice versa; (2) automatic extraction of image data arrays; (3) transposition and 90 deg rotations of large data arrays; (4) translation of data arrays for registration; and (5) reduction of the dimensions of data arrays by integral factors. Both frequency domain and spatial domain approaches are presented for the design and implementation of the image processing operations. It is shown that spatial domain recursive implementation of filters is much faster than nonrecursive implementations using fast Fourier transforms (FFT) for the cases of interest in this work. The recursive implementation of a class of matched filters for enhancing the image signal-to-noise ratio is described. Test patterns are used to illustrate the filtering operations. The application of the techniques to radiographic images of metallic structures is demonstrated through several examples.
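    The speed argument for recursive filters comes from their constant cost per output sample. A minimal sketch of a first-order recursive low-pass on a synthetic scan-line step, with an assumed smoothing coefficient:

```python
import numpy as np
from scipy.signal import lfilter

# First-order recursive low-pass: y[n] = a*x[n] + (1 - a)*y[n-1].
# One multiply-add per sample regardless of the effective kernel length,
# which is why recursive implementations can beat FFT-based (nonrecursive)
# filtering for the long smoothing kernels used in radiograph enhancement.
a = 0.2
x = np.r_[np.zeros(50), np.ones(100)]          # a step edge in a scan line
y = lfilter([a], [1.0, -(1.0 - a)], x)         # recursive implementation
```

    The recursion realizes an exponentially decaying impulse response of unbounded length at O(N) cost, whereas an equivalent nonrecursive filter would need a long explicit kernel or an FFT.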

  12. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. The toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filters, line filters, and area filters. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capability to perform high-pass, band-pass, low-pass, and wedge filtering. These filters are applied to the analysis of satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
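    A "line filter" of the kind described, a Gaussian notch along one frequency axis, can be sketched as follows; the stripe pattern and notch width are illustrative assumptions, not the toolbox's actual interface:

```python
import numpy as np

def gaussian_notch_line(img, sigma=2.0):
    """Suppress horizontal stripe noise with a Gaussian notch line filter.

    Attenuates energy on the fx ~ 0 line (where periodic horizontal stripes
    concentrate) while keeping the fy = 0 row, so purely horizontal image
    structure, including DC, survives.
    """
    F = np.fft.fftshift(np.fft.fft2(img))
    H, W = img.shape
    fy = np.arange(H) - H // 2
    fx = np.arange(W) - W // 2
    FX, FY = np.meshgrid(fx, fy)
    notch = 1.0 - np.exp(-(FX ** 2) / (2 * sigma ** 2))  # kill the fx ~ 0 line
    notch[FY == 0] = 1.0                                 # keep the fy = 0 row
    return np.fft.ifft2(np.fft.ifftshift(F * notch)).real

# A horizontal ramp corrupted by horizontal stripes (8 cycles down the image)
y, x = np.mgrid[0:64, 0:64]
img = x / 64.0 + 0.5 * np.cos(2 * np.pi * 8 * y / 64)
out = gaussian_notch_line(img)
```

    The stripes live on the fx = 0 column of the spectrum and are removed, while the ramp, whose energy sits on the preserved fy = 0 row, passes through unchanged; the toolbox's point and area filters differ only in the shape of the frequency-domain mask.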

  13. A study of the x-ray image quality improvement in the examination of the respiratory system based on the new image processing technique

    NASA Astrophysics Data System (ADS)

    Nagai, Yuichi; Kitagawa, Mayumi; Torii, Jun; Iwase, Takumi; Aso, Tomohiko; Ihara, Kanyu; Fujikawa, Mari; Takeuchi, Yumiko; Suzuki, Katsumi; Ishiguro, Takashi; Hara, Akio

    2014-03-01

    Recently, the double-contrast technique in gastrointestinal examinations and the transbronchial lung biopsy in examinations of the respiratory system [1-3] have made remarkable progress. In the transbronchial lung biopsy especially, better x-ray fluoroscopic image quality is required because the examination is performed under the guidance of x-ray fluoroscopic images. Meanwhile, various image processing methods [4] for x-ray fluoroscopic images have been developed as x-ray systems with flat panel detectors [5-7] have come into wide use. Recursive filtering is an effective method for reducing random noise in x-ray fluoroscopic images. However, because recursive filtering reduces noise by adding the last few images, its effectiveness is limited when a moving object is present in the x-ray fluoroscopic images: after recursive filtering, a moving object produces a residual signal, and this residual signal disturbs the smooth conduct of the examination. To improve this situation, a new noise reduction method has been developed. Adaptive Noise Reduction (ANR) is a new technique that reduces only the noise, regardless of moving objects in the x-ray fluoroscopic images. ANR is therefore a very suitable noise reduction method for the transbronchial lung biopsy performed under x-ray fluoroscopic guidance, because the residual signal caused by moving objects is never produced after ANR. In this paper, we explain the advantage of ANR by comparing the performance of ANR images with that of conventional recursive filtering images.
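    The limitation described for recursive filtering can be reproduced numerically: a recursive frame average reduces noise in static regions but lags behind a moving edge, leaving a residual signal. The frame sequence, noise level, and recursion weight below are illustrative assumptions (the ANR algorithm itself is not specified in the abstract and is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
T, W = 64, 128
frames = np.zeros((T, W))
for k in range(T):
    frames[k, :k] = 1.0                     # an edge advancing 1 pixel/frame
noisy = frames + 0.2 * rng.standard_normal(frames.shape)

a = 0.25                                    # recursion weight (frame averaging)
filt = np.empty_like(noisy)
filt[0] = noisy[0]
for k in range(1, T):
    # recursive filtering = weighted sum of the current frame and the
    # previously filtered frame, i.e. an average over the last few frames
    filt[k] = (1 - a) * filt[k - 1] + a * noisy[k]
```

    In the long-static columns the noise standard deviation drops well below that of the raw frames, but the pixel that turned on in the latest frame reads far below its true value: this lag is exactly the residual signal that disturbs fluoroscopically guided procedures.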

  14. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  15. Digital image processing for photo-reconnaissance applications

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1972-01-01

    Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering, and the action of a high frequency filter in the real and Fourier domain are discussed along with color and brightness.

  16. Compressive spectral testbed imaging system based on thin-film color-patterned filter arrays.

    PubMed

    Rueda, Hoover; Arguello, Henry; Arce, Gonzalo R

    2016-11-20

    Compressive spectral imaging systems can reliably capture multispectral data using far fewer measurements than traditional scanning techniques. In this paper, a thin-film patterned filter array-based compressive spectral imager is demonstrated, including its optical design and implementation. The use of a patterned filter array entails a single-step three-dimensional spatial-spectral coding on the input data cube, which provides higher flexibility on the selection of voxels being multiplexed on the sensor. The patterned filter array is designed and fabricated with micrometer pitch size thin films, referred to as pixelated filters, with three different wavelengths. The performance of the system is evaluated in terms of references measured by a commercially available spectrometer and the visual quality of the reconstructed images. Different distributions of the pixelated filters, including random and optimized structures, are explored.

  17. Reconfigurable Gabor Filter For Fingerprint Recognition Using FPGA Verilog

    NASA Astrophysics Data System (ADS)

    Rosshidi, H. T.; Hadi, A. R.

    2009-06-01

    This paper presents an implementation of a Gabor filter for fingerprint recognition using Verilog HDL. The work demonstrates the application of the Gabor filter technique to enhance the fingerprint image. The incoming signal, in the form of image pixels, is filtered (convolved) by the Gabor filter to define the ridge and valley regions of the fingerprint. This is done with a real-time convolver based on a Field Programmable Gate Array (FPGA) that performs the convolution operation. The main characteristics of the proposed approach are the use of memory to store the incoming image pixels and the Gabor filter coefficients before the convolution takes place. The result is the signal convolved with the Gabor coefficients.
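    In software, the Gabor convolution at the heart of this design can be sketched as follows (a numpy model of the datapath, not Verilog); the kernel size, frequency, and orientation are illustrative assumptions:

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(ksize, theta, freq, sigma):
    """Real Gabor kernel: a cosine carrier at `freq` (cycles/pixel), oriented
    at angle `theta`, under an isotropic Gaussian envelope of width `sigma`."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # coordinate along the carrier
    env = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return env * np.cos(2 * np.pi * freq * xr)

# Ridge-like test pattern: vertical stripes at 0.1 cycles/pixel
y, x = np.mgrid[0:64, 0:64]
ridges = np.cos(2 * np.pi * 0.1 * x)
g0 = gabor_kernel(15, theta=0.0, freq=0.1, sigma=3.0)         # tuned to ridges
g90 = gabor_kernel(15, theta=np.pi / 2, freq=0.1, sigma=3.0)  # orthogonal
r0 = convolve2d(ridges, g0, mode='same')
r90 = convolve2d(ridges, g90, mode='same')
```

    The filter tuned to the local ridge orientation and frequency responds strongly while the orthogonal one barely responds, which is what lets a Gabor stage separate ridge and valley regions before minutiae extraction.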

  18. Hybrid registration of PET/CT in thoracic region with pre-filtering PET sinogram

    NASA Astrophysics Data System (ADS)

    Mokri, S. S.; Saripan, M. I.; Marhaban, M. H.; Nordin, A. J.; Hashim, S.

    2015-11-01

    The integration of physiological (PET) and anatomical (CT) images in cancer delineation requires an accurate spatial registration technique. Although the hybrid PET/CT scanner is used to co-register these images, significant misregistrations exist due to patient and respiratory/cardiac motions. This paper proposes a hybrid feature-intensity based registration technique for the hybrid PET/CT scanner. First, the simulated PET sinogram was filtered with a 3D hybrid mean-median filter before reconstructing the image. Features were then derived from the structures (lung, heart, and tumor) segmented from both images. The registration was performed based on a modified multi-modality demon registration with a multiresolution scheme. Apart from visually observable improvements, the proposed registration technique increased the normalized mutual information (NMI) index between the PET/CT images after registration. All nine tested datasets show greater improvements in the mutual information (MI) index than the free form deformation (FFD) registration technique, with the highest MI increase being 25%.

  19. Lunar surface chemistry: A new imaging technique

    USGS Publications Warehouse

    Andre, C.G.; Bielefeld, M.J.; Eliason, E.; Soderblom, L.A.; Adler, I.; Philpotts, J.A.

    1977-01-01

    Detailed chemical maps of the lunar surface have been constructed by applying a new weighted-filter imaging technique to Apollo 15 and Apollo 16 x-ray fluorescence data. The data quality improvement is amply demonstrated by (i) modes in the frequency distribution, representing highland and mare soil suites, which are not evident before data filtering and (ii) numerous examples of chemical variations which are correlated with small-scale (about 15 kilometer) lunar topographic features.

  20. Lunar surface chemistry - A new imaging technique

    NASA Technical Reports Server (NTRS)

    Andre, C. G.; Adler, I.; Bielefeld, M. J.; Eliason, E.; Soderblom, L. A.; Philpotts, J. A.

    1977-01-01

    Detailed chemical maps of the lunar surface have been constructed by applying a new weighted-filter imaging technique to Apollo 15 and Apollo 16 X-ray fluorescence data. The data quality improvement is amply demonstrated by (1) modes in the frequency distribution, representing highland and mare soil suites, which are not evident before data filtering, and (2) numerous examples of chemical variations which are correlated with small-scale (about 15 kilometer) lunar topographic features.

  1. Classification of Hyperspectral Data Based on Guided Filtering and Random Forest

    NASA Astrophysics Data System (ADS)

    Ma, H.; Feng, W.; Cao, X.; Wang, L.

    2017-09-01

    Hyperspectral images usually consist of more than one hundred spectral bands, which have the potential to provide rich spatial and spectral information. However, the application of hyperspectral data is still challenging due to "the curse of dimensionality". In this context, many techniques that aim to make full use of both the spatial and spectral information are investigated. In order to preserve the geometrical information while using fewer spectral bands, we propose a novel method which combines principal component analysis (PCA), guided image filtering, and the random forest classifier (RF). In detail, PCA is first employed to reduce the dimension of the spectral bands. Second, the guided image filtering technique is introduced to smooth land objects while preserving their edges. Finally, the features are fed into the RF classifier. To illustrate the effectiveness of the method, we carry out experiments on the popular Indian Pines data set, which was collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. By comparing the proposed method with methods using only PCA or only the guided image filter, we find that the proposed method performs better.
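    The PCA, then guided filtering, then random forest pipeline can be sketched on a toy data cube as below. The 16 × 16 × 20 synthetic cube, the gray-guide simplification of the guided filter, and all parameter values are illustrative assumptions standing in for the Indian Pines experiment:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier

def box(img, r):
    """Mean filter with window radius r (simple padded loop, not optimized)."""
    H, W = img.shape
    out = np.empty_like(img, dtype=float)
    pad = np.pad(img.astype(float), r, mode='edge')
    for i in range(H):
        for j in range(W):
            out[i, j] = pad[i:i + 2 * r + 1, j:j + 2 * r + 1].mean()
    return out

def guided_filter(I, p, r=2, eps=1e-2):
    """Gray-scale guided filter: edge-preserving smoothing of p, steered by
    the guide image I (local linear model a*I + b per window)."""
    mI, mp = box(I, r), box(p, r)
    cov = box(I * p, r) - mI * mp
    var = box(I * I, r) - mI * mI
    a = cov / (var + eps)
    b = mp - a * mI
    return box(a, r) * I + box(b, r)

# Toy hyperspectral cube: two classes split left/right, 20 noisy bands
rng = np.random.default_rng(0)
H, W, B = 16, 16, 20
labels = (np.arange(W)[None, :] >= W // 2).astype(int) * np.ones((H, 1), int)
cube = rng.normal(labels[..., None] * 0.5, 0.3, size=(H, W, B))

# Step 1: PCA reduces 20 bands to 3 principal components
pcs = PCA(n_components=3).fit_transform(cube.reshape(-1, B)).reshape(H, W, 3)
# Step 2: guided filtering smooths each component, guided by the first PC
feat = np.stack([guided_filter(pcs[..., 0], pcs[..., k]) for k in range(3)],
                axis=-1)
# Step 3: random forest classification on the spatial-spectral features
X, yl = feat.reshape(-1, 3), labels.ravel()
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, yl)
acc = rf.score(X, yl)
```

    The guided filter smooths within homogeneous regions while following the edges present in the guide image, which is why it improves over PCA features alone on blocky land-cover scenes.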

  2. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by a typical CCD/CMOS sensor. The method is based on the analysis of a single image and consists of two steps: (1) out-of-focus blur estimation via Bayer pattern analysis; (2) image restoration. Blur estimation is based on a block-wise edge detection technique carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm yields sharp images, reducing ringing and crisping artifacts over a wider region of the frequency spectrum. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.

  3. Gabor filter based fingerprint image enhancement

    NASA Astrophysics Data System (ADS)

    Wang, Jin-Xiang

    2013-03-01

    Fingerprint recognition technology has become the most reliable biometric technology due to its uniqueness and invariance, and it has become the most convenient and most reliable technique for personal authentication. The development of Automated Fingerprint Identification Systems is an urgent need for modern information security. Meanwhile, the fingerprint preprocessing algorithm has played an important part in Automatic Fingerprint Identification Systems. This article introduces the general steps in fingerprint recognition technology, namely image input, preprocessing, feature recognition, and fingerprint image enhancement. As the key to fingerprint identification technology, fingerprint image enhancement affects the accuracy of the system. The article focuses on the characteristics of the fingerprint image, the Gabor filter algorithm for fingerprint image enhancement, the theoretical basis of Gabor filters, and a demonstration of the filter. The enhancement algorithm is demonstrated on the Windows XP platform with MATLAB 6.5 as the development tool. The result shows that the Gabor filter is effective in fingerprint image enhancement.

  4. Optimum constrained image restoration filters

    NASA Technical Reports Server (NTRS)

    Riemer, T. E.; Mcgillem, C. D.

    1974-01-01

    The filter was developed in Hilbert space by minimizing the radius of gyration of the overall or composite system point-spread function subject to constraints on the radius of gyration of the restoration filter point-spread function, the total noise power in the restored image, and the shape of the composite system frequency spectrum. An iterative technique is introduced which alters the shape of the optimum composite system point-spread function, producing a suboptimal restoration filter which suppresses undesirable secondary oscillations. Finally this technique is applied to multispectral scanner data obtained from the Earth Resources Technology Satellite to provide resolution enhancement. An experimental approach to the problems involving estimation of the effective scanner aperture and matching the ERTS data to available restoration functions is presented.

  5. Initial experience using the rigid forceps technique to remove wall-embedded IVC filters.

    PubMed

    Avery, Allan; Stephens, Maximilian; Redmond, Kendal; Harper, John

    2015-06-01

    Severely tilted and embedded inferior vena cava (IVC) filters remain the most challenging IVC filters to remove. Heavy endothelialisation over the filter hook can prevent engagement with standard snare and cone recovery techniques. The rigid forceps technique offers a way to dissect the endothelial cap and reliably retrieve severely tilted and embedded filters. By developing this technique, failed IVC retrieval rates can be significantly reduced and the optimum safety profile offered by temporary filters can be achieved. We present our initial experience with the rigid forceps technique described by Stavropoulos et al. for removing wall-embedded IVC filters. We retrospectively reviewed the medical imaging and patient records of all patients who underwent a rigid forceps filter removal over a 22-month period across two tertiary referral institutions. The rigid forceps technique had a success rate of 85% (11/13) for IVC filter removals. All filters in the series showed evidence of filter tilt and embedding of the filter hook into the IVC wall. Average filter tilt from the Z-axis was 19 degrees (range 8-56). Filters observed in the case study were either Bard G2X (n = 6) or Cook Celect (n = 7). Average filter dwell time was 421 days (range 47-1053). There were no major complications observed. The rigid forceps technique can be readily emulated and is a safe and effective technique to remove severely tilted and embedded IVC filters. The development of this technique across both institutions has increased the successful filter removal rate, with perceived benefits to the safety profile of our IVC filter programme. © 2015 The Royal Australian and New Zealand College of Radiologists.

  6. Nonlocal means-based speckle filtering for ultrasound images

    PubMed Central

    Coupé, Pierrick; Hellier, Pierre; Kervrann, Charles; Barillot, Christian

    2009-01-01

    In image processing, restoration is expected to improve the qualitative inspection of the image and the performance of quantitative image analysis techniques. In this paper, an adaptation of the Non Local (NL-) means filter is proposed for speckle reduction in ultrasound (US) images. Although the filter was originally developed for additive white Gaussian noise, we propose to use a Bayesian framework to derive a NL-means filter adapted to a relevant ultrasound noise model. Quantitative results on synthetic data show the performance of the proposed method compared to well-established and state-of-the-art methods. Results on real images demonstrate that the proposed method is able to accurately preserve edges and structural details of the image. PMID:19482578
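    The classical (Gaussian-noise) NL-means weighting that the paper adapts can be sketched directly: each pixel becomes a weighted average of pixels whose surrounding patches look similar. Patch size, search window, and the smoothing parameter h are illustrative assumptions; the Bayesian speckle-adapted weights of the paper are not reproduced:

```python
import numpy as np

def nl_means(img, patch=3, search=7, h=0.1):
    """Minimal NL-means: weights decay with the mean squared difference
    between the reference patch and each candidate patch in a search window."""
    r, s = patch // 2, search // 2
    p = np.pad(img, r, mode='reflect')
    H, W = img.shape
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            ref = p[i:i + patch, j:j + patch]
            wsum, acc = 0.0, 0.0
            for di in range(max(0, i - s), min(H, i + s + 1)):
                for dj in range(max(0, j - s), min(W, j + s + 1)):
                    cand = p[di:di + patch, dj:dj + patch]
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    wsum += w
                    acc += w * img[di, dj]
            out[i, j] = acc / wsum
    return out

# Noisy step image: averaging happens only among same-side (similar) patches
rng = np.random.default_rng(3)
truth = np.zeros((16, 16))
truth[:, 8:] = 1.0
noisy = truth + 0.05 * rng.standard_normal(truth.shape)
out = nl_means(noisy)
```

    Because patches straddling the edge match poorly, their weights are negligible, so the edge survives the averaging, the property the abstract highlights on real ultrasound images.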

  7. Gabor filter for the segmentation of skin lesions from ultrasonographic images

    NASA Astrophysics Data System (ADS)

    Petrella, Lorena I.; Gómez, W.; Alvarenga, André V.; Pereira, Wagner C. A.

    2012-05-01

    The present work applies a Gabor filter bank for texture analysis of skin lesion images obtained by ultrasound biomicroscopy. The regions affected by the lesions were differentiated from the surrounding tissue in all the analyzed cases; however, the accuracy of the traced borders showed some limitations in part of the images. Future steps are being contemplated to enhance the technique's performance.

  8. Rapid spontaneous Raman light sheet microscopy using cw-lasers and tunable filters

    PubMed Central

    Rocha-Mendoza, Israel; Licea-Rodriguez, Jacob; Marro, Mónica; Olarte, Omar E.; Plata-Sanchez, Marcos; Loza-Alvarez, Pablo

    2015-01-01

    We perform rapid spontaneous Raman 2D imaging in light-sheet microscopy using continuous wave lasers and interferometric tunable filters. By angularly tuning the filter, the cut-on/off edge transitions are scanned along the excited Stokes wavelengths. This yields cumulative intensity profiles of the scanned vibrational bands, which are recorded as image stacks, resembling a spectral version of the knife-edge technique for measuring intensity profiles. A subsequent differentiation of the stack retrieves the Raman spectrum at each pixel of the image, which inherits the 3D resolution of the host light-sheet system. We demonstrate this technique using solvent solutions and composites of polystyrene beads and lipid droplets immersed in agar, and by imaging the C–H (2800-3100 cm−1) region in a C. elegans worm. Image acquisition is 4 orders of magnitude faster than in confocal point-scanning Raman systems, opening the possibility of performing fast spontaneous Raman 3D imaging on biological samples. PMID:26417514
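    The cumulative-then-differentiate recovery can be illustrated with a one-pixel numerical model, assuming an ideal filter edge and an arbitrary synthetic C-H band:

```python
import numpy as np

# True spectrum at one pixel (arbitrary units, assumed for illustration)
wavenumbers = np.arange(2800, 3101, 10)                  # C-H region, cm^-1
spectrum = np.exp(-((wavenumbers - 2930) / 40.0) ** 2)   # one synthetic band

# Tuning the filter edge across the band records cumulative intensities:
# frame k integrates everything below the k-th cut-off (ideal edge assumed)
stack = np.cumsum(spectrum)

# Differentiating the stack retrieves the spectrum at each pixel
recovered = np.diff(stack, prepend=0.0)
```

    Each stack frame is a running integral of the spectrum up to the current filter edge, so an adjacent-frame difference recovers the band shape, the spectral analogue of differentiating a knife-edge beam-profile scan.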

  9. Acquisition and visualization techniques for narrow spectral color imaging.

    PubMed

    Neumann, László; García, Rafael; Basa, János; Hegedüs, Ramón

    2013-06-01

    This paper introduces a new approach in narrow-band imaging (NBI). Existing NBI techniques generate images by selecting discrete bands over the full visible spectrum or an even wider spectral range. In contrast, here we perform the sampling with filters covering a tight spectral window. This image acquisition method, named narrow spectral imaging, can be particularly useful when optical information is only available within a narrow spectral window, such as in the case of deep-water transmittance, which constitutes the principal motivation of this work. In this study we demonstrate the potential of the proposed photographic technique on non-underwater scenes recorded under controlled conditions. To this end, three multilayer narrow bandpass filters were employed, which transmit at the bluish wavelengths of 440, 456, and 470 nm, respectively. Since the differences among images captured in such a narrow spectral window can be extremely small, both image acquisition and visualization require a novel approach. First, high-bit-depth images were acquired with the multilayer narrow-band filters either placed in front of the illumination or mounted on the camera lens. Second, a color-mapping method is proposed with which the input data can be transformed onto the entire display color gamut with a continuous and perceptually nearly uniform mapping, while ensuring optimally high information content for human perception.

  10. Speckle noise reduction in SAR images ship detection

    NASA Astrophysics Data System (ADS)

    Yuan, Ji; Wu, Bin; Yuan, Yuan; Huang, Qingqing; Chen, Jingbo; Ren, Lin

    2012-09-01

    At present, there are two types of methods to detect ships in SAR images. One is direct detection, which detects the ships themselves. The other is indirect detection, which first detects ship wakes and then seeks ships around the wakes. Both types are affected by speckle noise. In order to improve the accuracy of ship detection and obtain accurate ship and wake parameters from SAR images, such as ship length, ship width, ship area, wake angle, and ship outline, it is necessary to remove speckle noise from SAR images before the data are used for ship detection. The choice of speckle noise reduction filter depends on the requirements of the particular application. Some common filters are widely used for speckle noise reduction, such as the mean filter, the median filter, the Lee filter, the enhanced Lee filter, the Kuan filter, the Frost filter, the enhanced Frost filter, and the gamma filter, but these filters exhibit disadvantages in SAR image ship detection because of the variety of ship types. Therefore, the wavelet transform and multi-resolution analysis were used to localize an SAR ocean image into different frequency components or useful subbands, and to effectively reduce the speckle in the subbands according to the local statistics within the bands. Finally, an analysis of the statistical results is presented, which demonstrates the advantages and disadvantages of using wavelet shrinkage techniques over standard speckle filters.
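    As a concrete example of the local-statistics speckle filters listed, here is a minimal Lee filter sketch on a synthetic multiplicative-speckle scene; the scene, number of looks, and window size are illustrative assumptions (the wavelet-shrinkage method of the paper is not reproduced):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=5, noise_var=0.25):
    """Classic Lee speckle filter: adaptive local-statistics smoothing.

    noise_var is the relative variance of the multiplicative speckle.
    Flat areas get the local mean; high-variance areas (edges, ships)
    stay close to the original, preserving detection targets.
    """
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = sq_mean - mean ** 2
    # Weight: ~0 in flat regions (pure smoothing), -> 1 where signal dominates
    w = np.clip((var - noise_var * mean ** 2) / np.maximum(var, 1e-12),
                0.0, 1.0)
    return mean + w * (img - mean)

# Multiplicative speckle (4-look gamma, unit mean) on a bright "ship"
rng = np.random.default_rng(2)
scene = np.ones((64, 64))
scene[28:36, 28:36] = 8.0
speckled = scene * rng.gamma(shape=4.0, scale=0.25, size=scene.shape)
out = lee_filter(speckled, size=5, noise_var=0.25)
```

    The sea clutter is smoothed toward its local mean while the bright target's contrast is retained, which is the behavior a detector needs before thresholding; the wavelet approach of the paper pursues the same goal per subband instead of per spatial window.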

  11. Half-Fan-Based Intensity-Weighted Region-of-Interest Imaging for Low-Dose Cone-Beam CT in Image-Guided Radiation Therapy.

    PubMed

    Yoo, Boyeol; Son, Kihong; Pua, Rizza; Kim, Jinsung; Solodov, Alexander; Cho, Seungryong

    2016-10-01

    With the increased use of computed tomography (CT) in clinics, dose reduction is the most important feature people seek when considering new CT techniques or applications. We developed an intensity-weighted region-of-interest (IWROI) imaging method in an exact half-fan geometry to reduce the imaging radiation dose to patients in cone-beam CT (CBCT) for image-guided radiation therapy (IGRT). While dose reduction is highly desirable, preserving the high-quality images of the ROI is also important for target localization in IGRT. An intensity-weighting (IW) filter made of copper was mounted in place of a bowtie filter on the X-ray tube unit of an on-board imager (OBI) system such that the filter can substantially reduce radiation exposure to the outer ROI. In addition to mounting the IW filter, the lead-blade collimation of the OBI was adjusted to produce an exact half-fan scanning geometry for a further reduction of the radiation dose. The chord-based rebinned backprojection-filtration (BPF) algorithm in circular CBCT was implemented for image reconstruction, and a humanoid pelvis phantom was used for the IWROI imaging experiment. The IWROI image of the phantom was successfully reconstructed after beam-quality correction, and it was registered to the reference image within an acceptable level of tolerance. Dosimetric measurements revealed that the dose is reduced by approximately 61% in the inner ROI and by 73% in the outer ROI compared to the conventional bowtie filter-based half-fan scan. The IWROI method substantially reduces the imaging radiation dose and provides reconstructed images with an acceptable level of quality for patient setup and target localization. The proposed half-fan-based IWROI imaging technique can add a valuable option to CBCT in IGRT applications.

  12. Bilateral filtering using the full noise covariance matrix applied to x-ray phase-contrast computed tomography.

    PubMed

    Allner, S; Koehler, T; Fehringer, A; Birnbacher, L; Willner, M; Pfeiffer, F; Noël, P B

    2016-05-21

    The purpose of this work is to develop an image-based de-noising algorithm that exploits complementary information and noise statistics from multi-modal images, as they emerge in x-ray tomography techniques, for instance grating-based phase-contrast CT and spectral CT. Among the noise reduction methods, image-based de-noising is one popular approach and the so-called bilateral filter is a well known algorithm for edge-preserving filtering. We developed a generalization of the bilateral filter for the case where the imaging system provides two or more perfectly aligned images. The proposed generalization is statistically motivated and takes the full second order noise statistics of these images into account. In particular, it includes a noise correlation between the images and spatial noise correlation within the same image. The novel generalized three-dimensional bilateral filter is applied to the attenuation and phase images created with filtered backprojection reconstructions from grating-based phase-contrast tomography. In comparison to established bilateral filters, we obtain improved noise reduction and at the same time a better preservation of edges in the images on the examples of a simulated soft-tissue phantom, a human cerebellum and a human artery sample. The applied full noise covariance is determined via cross-correlation of the image noise. The filter results yield an improved feature recovery based on enhanced noise suppression and edge preservation as shown here on the example of attenuation and phase images captured with grating-based phase-contrast computed tomography. This is supported by quantitative image analysis. Without being bound to phase-contrast imaging, this generalized filter is applicable to any kind of noise-afflicted image data with or without noise correlation. Therefore, it can be utilized in various imaging applications and fields.
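
For reference, a sketch of the conventional scalar bilateral filter that this paper generalizes (no cross-image or spatial noise covariance here); `sigma_s`, `sigma_r`, and `radius` are illustrative parameters, not values from the paper.

```python
import numpy as np

def bilateral(img, sigma_s=2.0, sigma_r=0.1, radius=3):
    """Edge-preserving bilateral filter: each output pixel is a weighted
    average whose weights combine spatial closeness and intensity
    similarity, so averaging does not cross strong edges."""
    H, W = img.shape
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    g_s = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))  # spatial kernel
    pad = np.pad(img, radius, mode='edge')
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
            # range kernel: penalize intensity differences to the center
            w = g_s * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

A flat region is averaged normally, while a step edge survives because pixels on the other side of the step receive negligible range weight.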

  13. Full-color high-definition CGH reconstructing hybrid scenes of physical and virtual objects

    NASA Astrophysics Data System (ADS)

    Tsuchiyama, Yasuhiro; Matsushima, Kyoji; Nakahara, Sumio; Yamaguchi, Masahiro; Sakamoto, Yuji

    2017-03-01

    High-definition CGHs can reconstruct high-quality 3D images comparable to those of conventional optical holography. However, it was difficult to exhibit full-color images reconstructed by these high-definition CGHs, because three CGHs for the RGB colors and a bulky image combiner were needed to produce a full-color image. Recently, we reported a novel technique for full-color reconstruction using RGB color filters similar to those used in liquid-crystal panels. This technique allows us to produce full-color high-definition CGHs composed of a single plate and to place them on exhibition. In this paper, using this technique, we demonstrate full-color CGHs that reconstruct hybrid scenes comprising real physical objects and CG-modeled virtual objects. Here, the wave field of the physical object is obtained from dense multi-viewpoint images by employing the ray-sampling (RS) plane technique. In addition to the technique for full-color capturing and reconstruction of real object fields, the principle and simulation technique for full-color CGHs using RGB color filters are presented.

  14. Measurement of gamma' precipitates in a nickel-based superalloy using energy-filtered transmission electron microscopy coupled with automated segmenting techniques.

    PubMed

    Tiley, J S; Viswanathan, G B; Shiveley, A; Tschopp, M; Srinivasan, R; Banerjee, R; Fraser, H L

    2010-08-01

    Precipitates of the ordered L1(2) gamma' phase (dispersed in the face-centered cubic, or FCC, gamma matrix) were imaged in Rene 88 DT, a commercial multicomponent Ni-based superalloy, using energy-filtered transmission electron microscopy (EFTEM). Imaging was performed using the Cr, Co, Ni, Ti and Al elemental L-absorption edges in the energy loss spectrum. Manual and automated segmentation procedures were utilized for identification of precipitate boundaries and measurement of precipitate sizes. The automated region-growing technique for precipitate identification in images was found to measure precipitate diameters accurately. In addition, the region-growing technique provided a repeatable method for optimizing segmentation techniques for varying EFTEM conditions. (c) 2010 Elsevier Ltd. All rights reserved.
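
The region-growing segmentation mentioned above can be sketched as a simple flood fill from a seed pixel (a generic illustration, not the authors' exact implementation); the intensity tolerance `tol` is an assumed parameter.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=0.1):
    """Region growing: starting from a seed pixel, repeatedly add
    4-connected neighbors whose intensity is within `tol` of the
    seed value, yielding a boolean mask of the grown region."""
    H, W = img.shape
    mask = np.zeros((H, W), dtype=bool)
    seed_val = img[seed]
    q = deque([seed])
    mask[seed] = True
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < H and 0 <= nx < W and not mask[ny, nx] \
                    and abs(img[ny, nx] - seed_val) <= tol:
                mask[ny, nx] = True
                q.append((ny, nx))
    return mask
```

Seeding inside a precipitate grows the mask to the precipitate boundary, from which an equivalent diameter can then be measured.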

  15. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.
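
A hedged illustration of the two subjective-enhancement operations named above, contrast stretching and high-pass filtering, in plain NumPy; the percentile limits and the 3x3 kernel are illustrative choices, not the mission's actual parameters.

```python
import numpy as np

def contrast_stretch(img, lo_pct=2, hi_pct=98):
    """Linear contrast stretch: map the [lo, hi] percentile range of
    the image onto [0, 1], clipping the tails."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

def highpass(img):
    """Simple 3x3 Laplacian-style high-pass filter that responds to
    local detail and is zero on uniform regions."""
    pad = np.pad(img, 1, mode='edge')
    return (4 * img - pad[:-2, 1:-1] - pad[2:, 1:-1]
            - pad[1:-1, :-2] - pad[1:-1, 2:])
```

An "optimal stretch" algorithm of the kind mentioned in the abstract would choose `lo_pct`/`hi_pct` from the image histogram rather than fixing them.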

  16. Near-infrared spectral image analysis of pork marbling based on Gabor filter and wide line detector techniques.

    PubMed

    Huang, Hui; Liu, Li; Ngadi, Michael O; Gariépy, Claude; Prasher, Shiv O

    2014-01-01

    Marbling is an important quality attribute of pork. Detection of pork marbling usually involves subjective scoring, which is inefficient and costly for the processor. In this study, the ability to predict pork marbling using near-infrared (NIR) hyperspectral imaging (900-1700 nm) together with appropriate image processing techniques was studied. Near-infrared images were collected from pork after marbling evaluation according to the current standard chart of the National Pork Producers Council. Image analysis techniques (Gabor filter, wide line detector, and spectral averaging) were applied to extract texture, line, and spectral features, respectively, from the NIR images of pork. Samples were grouped into calibration and validation sets. Wavelength selection was performed on the calibration set by a stepwise regression procedure. Prediction models of pork marbling scores were built using multiple linear regression based on derivatives of the mean spectra and line features at key wavelengths. The results showed that the derivatives of both texture and spectral features produced good results, with validation correlation coefficients of 0.90 and 0.86, respectively, using wavelengths of 961, 1186, and 1220 nm. The results reveal the great potential of the Gabor filter for analyzing NIR images of pork for effective and efficient objective evaluation of pork marbling.

  17. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    NASA Astrophysics Data System (ADS)

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image content used in mobile phone services, digital libraries, and catalog services is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for a desired image. Even though new images are profitable to the service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, in this paper we propose a feature-based collaborative filtering (FBCF) method that reflects the user's most recent preferences by representing the purchase sequence in the visual feature space. The proposed approach represents the images that have been purchased in the past as feature clusters in the multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides higher-quality recommendations and better performance than do typical collaborative filtering and content-based filtering techniques.

  18. Simulation of pattern and defect detection in periodic amplitude and phase structures using photorefractive four-wave mixing

    NASA Astrophysics Data System (ADS)

    Nehmetallah, Georges; Banerjee, Partha; Khoury, Jed

    2015-03-01

    The nonlinearity inherent in four-wave mixing in photorefractive (PR) materials is used for adaptive filtering. Examples include script enhancement on a periodic pattern, scratch and defect-cluster enhancement, and periodic-pattern dislocation enhancement through intensity-filtering image manipulation. Organic PR materials have a large space-bandwidth product, which makes them useful in adaptive filtering techniques in quality control systems. For instance, in the case of edge enhancement, phase conjugation via four-wave mixing suppresses the low spatial frequencies of the Fourier spectrum of an aperiodic image and consequently leads to image edge enhancement. In this work, we model, numerically verify, and simulate the performance of a four-wave mixing setup used for edge, defect, and pattern detection in periodic amplitude and phase structures. The results show that this technique clearly detects even the slightest defects, with no additional enhancement. This technique should facilitate improvements in applications such as image display sharpness utilizing edge enhancement, production-line defect inspection of fabrics, textiles, and e-beam lithography masks, surface inspection, and materials characterization.

  19. Noise reduction with complex bilateral filter.

    PubMed

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.

  20. Perfect blind restoration of images blurred by multiple filters: theory and efficient algorithms.

    PubMed

    Harikumar, G; Bresler, Y

    1999-01-01

    We address the problem of restoring an image from its noisy convolutions with two or more unknown finite impulse response (FIR) filters. We develop theoretical results about the existence and uniqueness of solutions, and show that under some generically true assumptions, both the filters and the image can be determined exactly in the absence of noise, and stably estimated in its presence. We present efficient algorithms to estimate the blur functions and their sizes. These algorithms are of two types, subspace-based and likelihood-based, and are extensions of techniques proposed for the solution of the multichannel blind deconvolution problem in one dimension. We present memory and computation-efficient techniques to handle the very large matrices arising in the two-dimensional (2-D) case. Once the blur functions are determined, they are used in a multichannel deconvolution step to reconstruct the unknown image. The theoretical and practical implications of edge effects, and "weakly exciting" images are examined. Finally, the algorithms are demonstrated on synthetic and real data.

  1. Defogging of road images using gain coefficient-based trilateral filter

    NASA Astrophysics Data System (ADS)

    Singh, Dilbag; Kumar, Vijay

    2018-01-01

    Poor weather conditions are responsible for a large share of road accidents year after year. Conditions such as fog degrade the visibility of objects, so it becomes difficult for drivers to identify vehicles in a foggy environment. Dark channel prior (DCP)-based defogging techniques have been found to be an efficient way to remove fog from road images. However, the DCP produces poor results when image objects are inherently similar to the airlight and no shadow is cast on them. To eliminate this problem, a modified DCP-based restoration model is developed to remove fog from road images. The transmission map is also refined by developing a gain coefficient-based trilateral filter. The proposed technique thus has the ability to remove fog from road images in an effective manner. It is compared with seven well-known defogging techniques on two benchmark foggy image datasets and five real foggy images. The experimental results demonstrate that the proposed approach is able to remove different types of fog from roadside images and significantly improve image visibility. They also reveal that the restored images have few or no artifacts.
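
The dark channel prior underlying DCP defogging can be sketched as follows. This is the standard textbook formulation (per-pixel channel minimum followed by a local minimum filter, then a transmission estimate), not the modified restoration model or the trilateral refinement of this paper; `patch`, `omega`, and the airlight value are illustrative.

```python
import numpy as np

def dark_channel(img, patch=7):
    """Dark channel: per-pixel minimum over the color channels,
    followed by a local minimum filter over a patch x patch window."""
    mins = img.min(axis=2)                  # min over RGB
    r = patch // 2
    pad = np.pad(mins, r, mode='edge')
    H, W = mins.shape
    dark = np.empty_like(mins)
    for i in range(H):
        for j in range(W):
            dark[i, j] = pad[i:i + patch, j:j + patch].min()
    return dark

def transmission(img, airlight, omega=0.95, patch=7):
    """DCP transmission estimate: t = 1 - omega * dark(I / A).
    Fog-free regions give t near 1; dense haze gives t near 1 - omega."""
    return 1.0 - omega * dark_channel(img / airlight, patch)
```

The estimated transmission map is what the paper's gain coefficient-based trilateral filter would then refine before restoring the scene radiance.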

  2. Research on the Improved Image Dodging Algorithm Based on Mask Technique

    NASA Astrophysics Data System (ADS)

    Yao, F.; Hu, H.; Wan, Y.

    2012-08-01

    The remote sensing image dodging algorithm based on the Mask technique is a good method for removing uneven lightness within a single image. However, the algorithm has some unresolved problems, such as how to set an appropriate filter size. In order to solve these problems, an improved algorithm is proposed. In the improved algorithm, the original image is divided into blocks, and the image blocks with different levels of detail are smoothed using low-pass filters with different cut-off frequencies to obtain the background image; in the image after subtraction, regions with different lightness are processed using different linear transformation models. The improved algorithm achieves a better dodging result than the original one and makes the contrast of the whole image more consistent.
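
A minimal sketch of the basic Mask-dodging principle: estimate the low-frequency background with a Gaussian low-pass filter, subtract it, and restore the mean level. This illustrates only the baseline idea, not the block-wise, multi-cutoff improvement proposed in the paper; `sigma` is an assumed parameter.

```python
import numpy as np

def gaussian_kernel(sigma):
    """1D normalized Gaussian kernel truncated at 3 sigma."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def lowpass(img, sigma):
    """Separable Gaussian blur used to estimate the background (the Mask)."""
    k = gaussian_kernel(sigma)
    r = len(k) // 2
    pad = np.pad(img, r, mode='edge')
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, 'valid'), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, 'valid'), 1, tmp)

def dodge(img, sigma=8.0):
    """Mask dodging: subtract the low-pass background estimate, then
    add back the global mean so overall brightness is preserved."""
    return (img - lowpass(img, sigma)) + img.mean()
```

An evenly lit image passes through unchanged; a slow illumination gradient ends up in the background estimate and is subtracted away.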

  3. Speckle reduction of OCT images using an adaptive cluster-based filtering

    NASA Astrophysics Data System (ADS)

    Adabi, Saba; Rashedi, Elaheh; Conforto, Silvia; Mehregan, Darius; Xu, Qiuyun; Nasiriavanaki, Mohammadreza

    2017-02-01

    Optical coherence tomography (OCT) has become a favored device in the dermatology discipline due to its moderate resolution and penetration depth. OCT images, however, contain a grainy pattern, called speckle, due to the broadband source used in the OCT configuration. So far, a variety of filtering techniques have been introduced to reduce speckle in OCT images. Most of these methods are generic and can be applied to OCT images of different tissues. In this paper, we present a method for speckle reduction of OCT skin images. Considering the architectural structure of skin layers, a skin image can benefit from being segmented into differentiable clusters and filtered separately within each cluster, using a clustering method together with filtering methods such as the Wiener filter. The proposed algorithm was tested on an optical solid phantom with predetermined optical properties, as well as on healthy skin images. The results show that the cluster-based filtering method can reduce the speckle and increase the signal-to-noise ratio and contrast while preserving the edges in the image.

  4. Bas-relief map using texture analysis with application to live enhancement of ultrasound images.

    PubMed

    Du, Huarui; Ma, Rui; Wang, Xiaoying; Zhang, Jue; Fang, Jing

    2015-05-01

    For ultrasound imaging, speckle is one of the most important factors in the degradation of contrast resolution because it masks meaningful texture and has the potential to interfere with diagnosis. It is expected that researchers would explore appropriate ways to reduce the speckle noise, to find the edges of structures and enhance weak borders between different organs in ultrasound imaging. Inspired by the principle of differential interference contrast microscopy, a "bas-relief map" is proposed that depicts the texture structure of ultrasound images. Based on a bas-relief map, an adaptive bas-relief filter was developed for ultrafast despeckling. Subsequently, an edge map was introduced to enhance the edges of images in real time. The holistic bas-relief map approach has been used experimentally with synthetic phantoms and digital ultrasound B-scan images of liver, kidney and gallbladder. Based on the visual inspection and the performance metrics of the despeckled images, it was found that the bas-relief map approach is capable of effectively reducing the speckle while significantly enhancing contrast and tissue boundaries for ultrasonic images, and its speckle reduction ability is comparable to that of Kuan, Lee and Frost filters. Meanwhile, the proposed technique could preserve more intra-region details compared with the popular speckle reducing anisotropic diffusion technique and more effectively enhance edges. In addition, the adaptive bas-relief filter was much less time consuming than the Kuan, Lee and Frost filter and speckle reducing anisotropic diffusion techniques. The bas-relief map strategy is effective for speckle reduction and live enhancement of ultrasound images, and can provide a valuable tool for clinical diagnosis. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  5. Nanophotonic Image Sensors.

    PubMed

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements in nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next-generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Automated railroad reconstruction from remote sensing image based on texture filter

    NASA Astrophysics Data System (ADS)

    Xiao, Jie; Lu, Kaixia

    2018-03-01

    Remote sensing techniques have improved remarkably in recent years, and very accurate results and high-resolution images can now be acquired, which makes it possible to use such data to reconstruct railroads. In this paper, an automated railroad reconstruction method from remote sensing images based on the Gabor filter is proposed. The method consists of three steps. First, edge-oriented railroad characteristics (such as line features) in a remote sensing image are detected using the Gabor filter. Second, two response images with filtering orientations perpendicular to each other are fused to suppress noise and obtain a long, smooth stripe region of railroad. Third, a set of smooth regions is extracted by computing a global threshold for the resulting image using Otsu's method and then converting it to a binary image with that threshold. This workflow was tested on a set of remote sensing images and was found to deliver very accurate results in a quick and highly automated manner.
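
The first and third steps (oriented Gabor filtering and Otsu thresholding) can be sketched as follows; the kernel parameters and histogram bin count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gabor_kernel(sigma=3.0, theta=0.0, lam=8.0, radius=9):
    """Real Gabor kernel: an oriented cosine grating under a Gaussian
    window; convolving with it responds strongly to line features
    perpendicular to the grating direction."""
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    xr = xs * np.cos(theta) + ys * np.sin(theta)
    yr = -xs * np.sin(theta) + ys * np.cos(theta)
    return (np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam))

def otsu_threshold(img, bins=64):
    """Otsu's method: pick the threshold that maximizes the
    between-class variance of the intensity histogram."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0
        m1 = (p[k:] * centers[k:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[k]
    return best_t
```

In the workflow above, the fused Gabor response image would be passed to `otsu_threshold`, and `response > t` yields the binary railroad mask.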

  7. A novel method for segmentation of Infrared Scanning Laser Ophthalmoscope (IR-SLO) images of retina.

    PubMed

    Ajaz, Aqsa; Aliahmad, Behzad; Kumar, Dinesh K

    2017-07-01

    Retinal vessel segmentation forms an essential element of automatic retinal disease screening systems. The development of a multimodal imaging system with IR-SLO and OCT could help in studying the early stages of retinal disease. The ability of IR-SLO to examine alterations in the structure of the retina, together with its direct correlation with OCT, can be useful for the assessment of various diseases. This paper presents an automatic method for segmentation of IR-SLO fundus images based on a combination of morphological filters and image enhancement techniques. As a first step, the retinal vessels are enhanced using morphological filters, followed by background exclusion using Contrast Limited Adaptive Histogram Equalization (CLAHE) and bilateral filtering. The final segmentation is obtained by using the Isodata technique. Our approach was tested on a set of 26 IR-SLO images, and the results were compared to two sets of gold-standard images. The performance of the proposed method was evaluated in terms of sensitivity, specificity, and accuracy. The system has an average accuracy of 0.90 for both sets.

  8. Thin-film tunable filters for hyperspectral fluorescence microscopy

    PubMed Central

    Favreau, Peter; Hernandez, Clarissa; Lindsey, Ashley Stringfellow; Alvarez, Diego F.; Rich, Thomas; Prabhat, Prashant

    2013-01-01

    Hyperspectral imaging is a powerful tool that acquires data from many spectral bands, forming a contiguous spectrum. Hyperspectral imaging was originally developed for remote sensing applications; however, hyperspectral techniques have since been applied to biological fluorescence imaging applications, such as fluorescence microscopy and small animal fluorescence imaging. The spectral filtering method largely determines the sensitivity and specificity of any hyperspectral imaging system. There are several types of spectral filtering hardware available for microscopy systems, most commonly acousto-optic tunable filters (AOTFs) and liquid crystal tunable filters (LCTFs). These filtering technologies have advantages and disadvantages. Here, we present a novel tunable filter for hyperspectral imaging: the thin-film tunable filter (TFTF). The TFTF presents several advantages over AOTFs and LCTFs, most notably a high percentage transmission and a high out-of-band optical density (OD). We present a comparison of a TFTF-based hyperspectral microscopy system and a commercially available AOTF-based system. We have characterized the light transmission, wavelength calibration, and OD of both systems, and have then evaluated the capability of each system for discriminating between green fluorescent protein and highly autofluorescent lung tissue. Our results suggest that TFTFs are an alternative approach for hyperspectral filtering that offers improved transmission and out-of-band blocking. These characteristics make TFTFs well suited for other biomedical imaging devices, such as ophthalmoscopes or endoscopes. PMID:24077519

  9. Principal Component Analysis in the Spectral Analysis of the Dynamic Laser Speckle Patterns

    NASA Astrophysics Data System (ADS)

    Ribeiro, K. M.; Braga, R. A., Jr.; Horgan, G. W.; Ferreira, D. D.; Safadi, T.

    2014-02-01

    Dynamic laser speckle is a phenomenon observed in the optical patterns formed by illuminating a changing surface with coherent light; the dynamic change of the speckle patterns caused by biological material is known as biospeckle. Usually, these patterns of optical interference evolving in time are analyzed by graphical or numerical methods; analysis in the frequency domain has also been an option, but it involves large computational requirements, which demands new approaches to filtering the images in time. Principal component analysis (PCA) works with the statistical decorrelation of data and can be used for data filtering. In this context, the present work evaluated the PCA technique for filtering biospeckle image data in time, aiming to reduce computation time and improve the robustness of the filtering. Sixty-four biospeckle images observed in time on a maize seed were used. The images were arranged in a data matrix and statistically decorrelated by the PCA technique, and the reconstructed signals were analyzed using the routine graphical and numerical biospeckle methods. The results showed the potential of the PCA tool for filtering dynamic laser speckle data, with the definition of principal-component markers related to the biological phenomena and with the advantage of fast computational processing.
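
The PCA filtering of a time stack of speckle images can be sketched with an SVD over the time dimension. This is a generic illustration of the decorrelate-truncate-reconstruct idea; the number of retained components is an assumption, not the authors' choice.

```python
import numpy as np

def pca_filter(stack, n_keep):
    """Keep only the first n_keep principal components of a (T, H, W)
    image stack. Each frame is a row of the data matrix; truncating
    the low-variance components discards decorrelated noise."""
    T, H, W = stack.shape
    X = stack.reshape(T, -1)
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data matrix (rows = time samples)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    S = S.copy()
    S[n_keep:] = 0.0            # truncate the spectrum
    return ((U * S) @ Vt + mean).reshape(T, H, W)
```

The reconstructed stack can then be fed to the usual graphical/numerical biospeckle analyses in place of the raw data.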

  10. Sci-Thur AM: YIS – 07: Optimizing dual-energy x-ray parameters using a single filter for both high and low-energy images to enhance soft-tissue imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, Wesley; Sattarivand, Mike

    Objective: To optimize dual-energy parameters of the ExacTrac stereoscopic x-ray imaging system for lung SBRT patients. Methods: Simulated spectra and a lung phantom were used to optimize filter material, thickness, kVps, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify materials in the atomic number (Z) range [3–83] based on a metric defined to separate the high- and low-energy spectra. Both energies used the same filter due to the time constraints of image acquisition in lung SBRT imaging. A lung phantom containing bone, soft tissue, and a tumor-mimicking material was imaged with filter thicknesses in the range [0–1] mm and kVp in the range [60–140]. A cost function based on the contrast-to-noise ratio of bone, soft tissue, and tumor, as well as image noise content, was defined to optimize filter thickness and kVp. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom were acquired and evaluated for bone subtraction. Imaging dose was measured with the dual-energy technique using tin filtering. Results: Tin was the material of choice, providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-only image in the lung phantom was obtained using 0.3 mm tin and a [140, 80] kVp pair. Dual-energy images of the Rando phantom showed noticeable bone elimination when compared to no filtration. Dose was lower with tin filtering compared to no filtration. Conclusions: Dual-energy soft-tissue imaging is feasible using the ExacTrac stereoscopic imaging system, utilizing a single tin filter for both high and low energies and optimized acquisition parameters.
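
The bone-cancelling weighted subtraction at the heart of dual-energy imaging can be sketched as follows. This is the textbook weighted log-subtraction with a toy Beer-Lambert simulation and made-up attenuation coefficients, not the authors' specific weighting-factor optimization.

```python
import numpy as np

def dual_energy_subtract(i_high, i_low, w):
    """Weighted log subtraction: S = ln(I_high) - w * ln(I_low).
    Choosing w = mu_bone(high) / mu_bone(low) cancels the bone term,
    leaving a soft-tissue-only image."""
    return np.log(i_high) - w * np.log(i_low)

# Toy simulation with assumed (illustrative) attenuation coefficients:
mu_bh, mu_bl = 0.3, 0.6      # bone at high / low kVp
mu_sh, mu_sl = 0.2, 0.25     # soft tissue at high / low kVp
tb = np.array([[0.0, 1.0], [2.0, 3.0]])   # bone thickness varies
ts = np.ones((2, 2))                      # soft tissue is uniform
i_high = np.exp(-(mu_bh * tb + mu_sh * ts))
i_low = np.exp(-(mu_bl * tb + mu_sl * ts))
soft = dual_energy_subtract(i_high, i_low, mu_bh / mu_bl)
# `soft` is flat: the varying bone thickness has been cancelled out.
```

In practice the weight is not known analytically and is tuned, which is exactly the weighting-factor optimization the abstract describes.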

  11. MicroCT parameters for multimaterial elements assessment

    NASA Astrophysics Data System (ADS)

    de Araújo, Olga M. O.; Silva Bastos, Jaqueline; Machado, Alessandra S.; dos Santos, Thaís M. P.; Ferreira, Cintia G.; Rosifini Alves Claro, Ana Paula; Lopes, Ricardo T.

    2018-03-01

    Microtomography is a non-destructive testing technique for quantitative and qualitative analysis. The investigation of multimaterial elements with large density differences can result in artifacts that degrade image quality, depending on the choice of additional filter. The aim of this study is to select the parameters most appropriate for the analysis of bone tissue with a metallic implant. The results show MCNPX-code simulations of the energy distribution without an additional filter and with aluminum, copper, and brass filters, together with the respective reconstructed images, demonstrating the importance of these parameter choices in the computed microtomography image acquisition process.

  12. Real-time 3D adaptive filtering for portable imaging systems

    NASA Astrophysics Data System (ADS)

    Bockenbach, Olivier; Ali, Murtaza; Wainwright, Ian; Nadeski, Mark

    2015-03-01

    Portable imaging devices have proven valuable for emergency medical services, both in the field and in hospital environments, and are becoming more prevalent in clinical settings where the use of larger imaging machines is impractical. 3D adaptive filtering is one of the most advanced techniques for noise reduction and feature enhancement, but it is computationally very demanding and hence often unable to run with sufficient performance on a portable platform. In recent years, advanced multicore DSPs have been introduced that attain high processing performance while maintaining low levels of power dissipation. These processors enable the implementation of complex algorithms such as 3D adaptive filtering, improving the image quality of portable medical imaging devices. In this study, the performance of a 3D adaptive filtering algorithm on a digital signal processor (DSP) is investigated. The performance is assessed by filtering a volume of 512x256x128 voxels at a rate of 10 MVoxels/sec.

  13. Novel and Advanced Techniques for Complex IVC Filter Retrieval.

    PubMed

    Daye, Dania; Walker, T Gregory

    2017-04-01

    Inferior vena cava (IVC) filter placement is indicated for the treatment of venous thromboembolism (VTE) in patients with a contraindication to or a failure of anticoagulation. With the advent of retrievable IVC filters and their ease of placement, an increasing number of such filters are being inserted for prophylaxis in patients at high risk for VTE. Available data show that only a small number of these filters are retrieved within the recommended period, if at all, prompting the FDA to issue a statement on the need for their timely removal. With prolonged dwell times, advanced techniques may be needed for filter retrieval in up to 60% of the cases. In this article, we review standard and advanced IVC filter retrieval techniques including single-access, dual-access, and dissection techniques. Complicated filter retrievals carry a non-negligible risk for complications such as filter fragmentation and resultant embolization of filter components, venous pseudoaneurysms or stenoses, and breach of the integrity of the caval wall. Careful pre-retrieval assessment of IVC filter position, any significant degree of filter tilting or of hook, and/or strut epithelialization and caval wall penetration by filter components should be considered using dedicated cross-sectional imaging for procedural planning. In complex cases, the risk for retrieval complications should be carefully weighed against the risks of leaving the filter permanently indwelling. The decision to remove an embedded IVC filter using advanced techniques should be individualized to each patient and made with caution, based on the patient's age and existing comorbidities.

  14. Rule-based fuzzy vector median filters for 3D phase contrast MRI segmentation

    NASA Astrophysics Data System (ADS)

    Sundareswaran, Kartik S.; Frakes, David H.; Yoganathan, Ajit P.

    2008-02-01

    Recent technological advances have contributed to the advent of phase contrast magnetic resonance imaging (PCMRI) as standard practice in clinical environments. In particular, decreased scan times have made using the modality more feasible. PCMRI is now a common tool for flow quantification, and for more complex vector field analyses that target the early detection of problematic flow conditions. Segmentation is one component of this type of application that can impact the accuracy of the final product dramatically. Vascular segmentation, in general, is a long-standing problem that has received significant attention. Segmentation in the context of PCMRI data, however, has been explored less and can benefit from object-based image processing techniques that incorporate fluid-specific information. Here we present a fuzzy rule-based adaptive vector median filtering (FAVMF) algorithm that, in combination with active contour modeling, facilitates high-quality PCMRI segmentation while mitigating the effects of noise. The FAVMF technique was tested on 111 synthetically generated PCMRI slices and on 15 patients with congenital heart disease. The results were compared to other multi-dimensional filters, namely the adaptive vector median filter, the adaptive vector directional filter, and the scalar low-pass filter commonly used in PCMRI applications. FAVMF significantly outperformed the standard filtering methods (p < 0.0001). Two conclusions can be drawn from these results: a) filtering should be performed after vessel segmentation of PCMRI data; b) vector-based filtering methods should be used instead of scalar techniques.
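    The fuzzy rule base of FAVMF is not given in the abstract, but the vector median at its core is standard: among the velocity vectors in a window, pick the one minimizing the summed distance to all the others. A minimal numpy sketch:

```python
import numpy as np

def vector_median(vectors):
    """Return the window vector minimizing the sum of Euclidean
    distances to all other vectors in the window (the vector median)."""
    v = np.asarray(vectors, dtype=float)
    d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=-1)  # pairwise distances
    return v[np.argmin(d.sum(axis=1))]

# a 3x3 window of 2D velocity vectors with one impulsive outlier
window = [(1.0, 0.1), (1.1, 0.0), (0.9, 0.2),
          (1.0, 0.0), (9.0, -9.0), (1.0, 0.1),   # (9, -9) is the outlier
          (1.1, 0.1), (0.9, 0.0), (1.0, 0.2)]
vm = vector_median(window)
```

    Unlike filtering each velocity component separately, the output is always one of the observed vectors, which is why the abstract recommends vector-based over scalar filtering for PCMRI.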

  15. Comparative Study of Speckle Filtering Methods in PolSAR Radar Images

    NASA Astrophysics Data System (ADS)

    Boutarfa, S.; Bouchemakh, L.; Smara, Y.

    2015-04-01

    Images acquired by polarimetric SAR (PolSAR) radar systems are characterized by the presence of a noise called speckle. This noise has a multiplicative nature and corrupts both the amplitude and phase images, which complicates data interpretation, degrades segmentation performance, and reduces the detectability of targets. Hence the need to preprocess the images with adapted filtering methods before analysis. In this paper, we present a comparative study of implemented methods for reducing speckle in PolSAR images. These filters are: the refined Lee filter, based on minimum mean square error (MMSE) estimation; the improved Sigma filter with detection of strong scatterers, based on the calculation of the coherency matrix to detect the different scatterers in order to preserve the polarization signature and maintain structures that are necessary for image interpretation; filtering by the stationary wavelet transform (SWT), using multi-scale edge detection and the SSC (sum of squared coefficients) technique for improving the wavelet coefficients; and the Turbo filter, a combination of two complementary filters, the refined Lee filter and the SWT, in which each filter can reinforce the results of the other. The originality of our work lies in the application of these methods to several types of images (amplitude, intensity, and complex, from satellite or airborne radar) and in the optimization of wavelet filtering by adding a parameter to the threshold calculation. 
    This parameter controls the filtering effect and achieves a good compromise between smoothing homogeneous areas and preserving linear structures. The methods are applied to fully polarimetric RADARSAT-2 images (HH, HV, VH, VV) acquired over Algiers, Algeria, in C-band, and to three polarimetric E-SAR images (HH, HV, VV) acquired over the Oberpfaffenhofen area near Munich, Germany, in P-band. To evaluate the performance of each filter, we used the following criteria: smoothing of homogeneous areas, preservation of edges, and preservation of polarimetric information. Experimental results are included to illustrate the different implemented methods.
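    The refined Lee filter adds directional windows and scatterer detection; its MMSE core, the basic Lee filter, is simple enough to sketch. The sketch below assumes single-look intensity speckle (coefficient of variation 1/looks) and is illustrative, not the refined variant from the paper:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7, looks=1):
    """Basic (not refined) Lee MMSE speckle filter for intensity images.
    `looks` sets the assumed speckle strength: C_u^2 = 1/looks."""
    img = np.asarray(img, dtype=float)
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img * img, size)
    var = np.maximum(mean_sq - mean * mean, 0.0)
    cu2 = 1.0 / looks                      # speckle coefficient of variation^2
    cz2 = var / np.maximum(mean * mean, 1e-12)
    k = np.clip(1.0 - cu2 / np.maximum(cz2, 1e-12), 0.0, 1.0)
    return mean + k * (img - mean)         # MMSE blend of local mean and pixel

rng = np.random.default_rng(0)
clean = np.full((64, 64), 2.0)
speckled = clean * rng.exponential(1.0, clean.shape)   # 1-look multiplicative speckle
filtered = lee_filter(speckled, size=7, looks=1)
mse_speckled = float(np.mean((speckled - clean) ** 2))
mse_filtered = float(np.mean((filtered - clean) ** 2))
```

    On homogeneous areas the local coefficient of variation approaches the speckle's, so k → 0 and the filter returns the local mean; near strong edges k → 1 and the pixel is kept, which is the smoothing-versus-structure compromise the abstract evaluates.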

  16. Design of order statistics filters using feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Maslennikova, Yu. S.; Bochkarev, V. V.

    2016-08-01

    In recent years, significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics, the best known being the widely used median filter. A generalized form of these filters can be constructed from Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach to the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used to select the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
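    The generalized order-statistics (L-)filter the abstract refers to outputs a weighted sum of the *sorted* samples in each window; the median is the special case of a one-hot weight at the centre. A minimal 1D sketch (the neural-network weight learning itself is not reproduced here):

```python
import numpy as np

def l_filter(signal, weights):
    """Order-statistics (L-)filter: a weighted sum of the sorted samples
    in each sliding window. A one-hot centre weight gives the median filter."""
    w = np.asarray(weights, dtype=float)
    n = len(w)
    padded = np.pad(np.asarray(signal, dtype=float), n // 2, mode='edge')
    windows = np.lib.stride_tricks.sliding_window_view(padded, n)
    return np.sort(windows, axis=1) @ w

x = np.array([1.0, 1.0, 9.0, 1.0, 1.0, 1.0, -7.0, 1.0, 1.0])  # impulsive noise
median_w = np.array([0.0, 0.0, 1.0, 0.0, 0.0])   # 5-tap median as an L-filter
y = l_filter(x, median_w)
```

    Learning the weight vector (e.g. with a neural network, as the paper proposes) lets the filter interpolate between mean-like and median-like behaviour, which is useful for asymmetric noise distributions.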

  17. The EM Method in a Probabilistic Wavelet-Based MRI Denoising

    PubMed Central

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed, and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method that exploits these statistical models. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need for an estimator of the noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images. PMID:26089959

  18. The EM Method in a Probabilistic Wavelet-Based MRI Denoising.

    PubMed

    Martin-Fernandez, Marcos; Villullas, Sergio

    2015-01-01

    Human body heat emission and other external causes can interfere with magnetic resonance image acquisition and produce noise. In this kind of image, the noise, when no signal is present, is Rayleigh distributed, and its wavelet coefficients can be approximately modeled by a Gaussian distribution. Noiseless magnetic resonance images can be modeled by a Laplacian distribution in the wavelet domain. This paper proposes a new magnetic resonance image denoising method that exploits these statistical models. The method performs shrinkage of wavelet coefficients based on the conditional probability of being noise or detail. The parameters involved in this filtering approach are calculated by means of the expectation maximization (EM) method, which avoids the need for an estimator of the noise variance. The efficiency of the proposed filter is studied and compared with other important filtering techniques, such as Nowak's, Donoho-Johnstone's, Awate-Whitaker's, and nonlocal means filters, in different 2D and 3D images.
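    The paper's EM-fitted probabilistic shrinkage is not reproducible from the abstract alone; as a hedged illustration of the underlying idea (shrink small wavelet detail coefficients, keep large ones), here is plain soft-thresholding on a single-level Haar transform, written in pure numpy:

```python
import numpy as np

def haar_1level(x):
    """Single-level orthonormal 1D Haar transform: approximation + detail."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def inv_haar_1level(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

rng = np.random.default_rng(1)
clean = np.repeat(np.array([0.0, 4.0, 0.0]), 64)     # piecewise-constant "image row"
noisy = clean + rng.normal(0.0, 0.5, clean.size)

a, d = haar_1level(noisy)
thr = 0.5 * np.sqrt(2.0 * np.log(d.size))            # universal threshold, sigma = 0.5
d_shrunk = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)   # soft shrinkage
denoised = inv_haar_1level(a, d_shrunk)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_den = float(np.mean((denoised - clean) ** 2))
```

    The EM method in the paper replaces this fixed threshold with per-coefficient posterior probabilities of "noise" (Gaussian) versus "detail" (Laplacian), estimated without a separate noise-variance estimator.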

  19. Impact imaging of aircraft composite structure based on a model-independent spatial-wavenumber filter.

    PubMed

    Qiu, Lei; Liu, Bin; Yuan, Shenfang; Su, Zhongqing

    2016-01-01

    The spatial-wavenumber filtering technique is an effective approach to distinguishing the propagation direction and wave mode of Lamb waves in the spatial-wavenumber domain, and it has therefore been increasingly studied for damage evaluation in recent years. For on-line impact monitoring in practical applications, however, the main problem is how to realize spatial-wavenumber filtering of the impact signal when high-spatial-resolution wavenumber measurements are unavailable or an accurate wavenumber curve cannot be modeled. In this paper, a new model-independent spatial-wavenumber filter based impact imaging method is proposed. In this method, a 2D cross-shaped array constructed from two linear piezoelectric (PZT) sensor arrays is used to acquire the impact signal on-line. The continuous complex Shannon wavelet transform is adopted to extract frequency narrowband signals from the frequency wideband impact response signals of the PZT sensors. A model-independent spatial-wavenumber filter is designed based on the spatial-wavenumber filtering technique. Based on the designed filter, a wavenumber searching and best-match mechanism is proposed to implement the spatial-wavenumber filtering of the frequency narrowband signals without modeling, which yields a wavenumber-time image of the impact relative to a linear PZT sensor array. By using the two wavenumber-time images of the 2D cross-shaped array, the impact direction can be estimated without blind angle. The impact distance relative to the 2D cross-shaped array can be calculated from the difference in time-of-flight between the frequency narrowband signals of two different central frequencies and the corresponding group velocities. Validations performed on a carbon fiber composite laminate plate and an aircraft composite oil tank show good impact localization accuracy for the model-independent spatial-wavenumber filter based impact imaging method. Copyright © 2015 Elsevier B.V. All rights reserved.
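    The essence of spatial-wavenumber filtering on a linear array is that a 2D FFT over (sensor position, time) separates waves by the sign of their wavenumber, i.e. by propagation direction. A hedged toy sketch (idealized dispersion-free waves on exact FFT bins, not the paper's wavelet-based, model-independent design):

```python
import numpy as np

nx, nt = 64, 256
k = 2 * np.pi * 4 / nx        # wavenumber on an exact FFT bin (rad/sensor)
w = 2 * np.pi * 16 / nt       # angular frequency on an exact FFT bin (rad/sample)
x = np.arange(nx)[:, None]    # sensor index along the linear array
t = np.arange(nt)[None, :]    # time sample

forward = np.cos(w * t - k * x)          # wave travelling towards +x
backward = 0.7 * np.cos(w * t + k * x)   # wave travelling towards -x
data = forward + backward                # what the array records

F = np.fft.fft2(data)                    # axis 0 -> wavenumber, axis 1 -> frequency
ksign = np.sign(np.fft.fftfreq(nx))[:, None]
fsign = np.sign(np.fft.fftfreq(nt))[None, :]
# components of cos(wt - kx) land where wavenumber and frequency bins have
# opposite signs; keeping only those isolates the +x-travelling wave
mask = (ksign * fsign) < 0
filtered = np.fft.ifft2(F * mask).real
```

    Real impact signals are wideband and dispersive, which is why the paper first extracts narrowband components with a complex Shannon wavelet and then searches for the best-matching wavenumber instead of assuming it.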

  20. A filtering approach to edge preserving MAP estimation of images.

    PubMed

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions, each modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining and refining this segmentation is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means of dealing with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint, as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.
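    The Wiener filter that anchors this continuum is easy to state in the frequency domain: multiply the degraded spectrum by conj(H)/(|H|² + NSR), where H is the blur response and NSR a noise-to-signal ratio (NSR → 0 recovers inverse filtering). A self-contained sketch with a synthetic 3×3 box blur:

```python
import numpy as np

def wiener_deconvolve(y, h, nsr):
    """Frequency-domain Wiener restoration of a blurred, noisy image y.
    h is the blur kernel on the same grid, centred at index [0, 0];
    nsr is a scalar noise-to-signal power ratio."""
    H = np.fft.fft2(h)
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)      # Wiener restoration filter
    return np.fft.ifft2(G * np.fft.fft2(y)).real

rng = np.random.default_rng(2)
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0                          # sharp square test object

h = np.zeros_like(img)                           # 3x3 box blur, wrapped to [0, 0]
for dx in (-1, 0, 1):
    for dy in (-1, 0, 1):
        h[dx, dy] = 1.0 / 9.0

blurred = np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(h)).real  # circular blur
noisy = blurred + rng.normal(0.0, 0.005, img.shape)
restored = wiener_deconvolve(noisy, h, nsr=1e-3)

mse_degraded = float(np.mean((noisy - img) ** 2))
mse_restored = float(np.mean((restored - img) ** 2))
```

    The paper's contribution is to apply a filter of this kind per segmented region, with region-dependent priors, rather than one global NSR for the whole image.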

  1. Full-color large-scaled computer-generated holograms using RGB color filters.

    PubMed

    Tsuchiyama, Yasuhiro; Matsushima, Kyoji

    2017-02-06

    A technique using RGB color filters is proposed for creating high-quality full-color computer-generated holograms (CGHs). The fringe of these CGHs is composed of more than a billion pixels. The CGHs reconstruct full-parallax three-dimensional color images with a deep sensation of depth caused by natural motion parallax. The simulation technique as well as the principle and challenges of high-quality full-color reconstruction are presented to address the design of filter properties suitable for large-scaled CGHs. Optical reconstructions of actual fabricated full-color CGHs are demonstrated in order to verify the proposed techniques.

  2. THz near-field spectral encoding imaging using a rainbow metasurface.

    PubMed

    Lee, Kanghee; Choi, Hyun Joo; Son, Jaehyeon; Park, Hyun-Sung; Ahn, Jaewook; Min, Bumki

    2015-09-24

    We demonstrate a fast image acquisition technique in the terahertz range via spectral encoding using a metasurface. The metasurface is composed of spatially varying units of mesh filters that exhibit bandpass features. Each mesh filter is arranged such that the centre frequencies of the mesh filters are proportional to their position within the metasurface, similar to a rainbow. For imaging, the object is placed in front of the rainbow metasurface, and the image is reconstructed by measuring the transmitted broadband THz pulses through both the metasurface and the object. The 1D image information regarding the object is linearly mapped into the spectrum of the transmitted wave of the rainbow metasurface. Thus, 2D images can be successfully reconstructed using simple 1D data acquisition processes.

  3. On detection of median filtering in digital images

    NASA Astrophysics Data System (ADS)

    Kirchner, Matthias; Fridrich, Jessica

    2010-01-01

    In digital image forensics, it is generally accepted that intentional manipulations of the image content are most critical, and hence numerous forensic methods focus on the detection of such 'malicious' post-processing. However, it is also beneficial to know as much as possible about the general processing history of an image, including content-preserving operations, since they can affect the reliability of forensic methods in various ways. In this paper, we present a simple yet effective technique to detect median filtering in digital images, a widely used denoising and smoothing operator. As a great variety of forensic methods relies on some form of linearity assumption, the detection of non-linear median filtering is of particular interest. The effectiveness of our method is backed with experimental evidence on a large image database.
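    One well-known trace of median filtering is "streaking": overlapping windows share most of their samples, so neighbouring output pixels are often exactly equal, inflating the zero bin of first-order difference histograms. A toy illustration of that statistic (the paper's actual detector is more elaborate):

```python
import numpy as np
from scipy.ndimage import median_filter

def zero_diff_ratio(img):
    """Fraction of zero-valued horizontal first-order differences --
    a simple streaking statistic that rises after median filtering."""
    d = np.diff(img.astype(int), axis=1)
    return float(np.mean(d == 0))

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (128, 128)).astype(np.uint8)  # noise-like "image"
med = median_filter(img, size=3)

r_orig = zero_diff_ratio(img)
r_med = zero_diff_ratio(med)
```

    For i.i.d. pixel values the zero-difference ratio is near 1/256, while after a 3×3 median (where adjacent windows share six of nine samples) it rises sharply, giving a usable detection feature.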

  4. Homomorphic filtering textural analysis technique to reduce multiplicative noise in the 11Oba nano-doped liquid crystalline compounds

    NASA Astrophysics Data System (ADS)

    Madhav, B. T. P.; Pardhasaradhi, P.; Manepalli, R. K. N. R.; Pisipati, V. G. K. M.

    2015-07-01

    The compound undecyloxy benzoic acid (11Oba) exhibits nematic and smectic-C phases, while undecyloxy benzoic acid doped with ZnO exhibits the same nematic and smectic-C phases with a reduced clearing temperature, as expected. The doping is done with 0.5% and 1% ZnO, and the clearing temperatures are reduced by approximately 4° and 6°, respectively (differential scanning calorimetry data). When collecting images from a polarizing microscope fitted with a hot stage and camera, illumination and reflectance combine multiplicatively, reducing the image quality and making it difficult to identify the exact phase of the compound. A homomorphic filtering technique is used in this manuscript, through which the multiplicative noise components of the image are separated linearly in the frequency domain. This technique provides a frequency-domain procedure to improve the appearance of an image by gray-level range compression and contrast enhancement.
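    Homomorphic filtering works because the logarithm turns the multiplicative illumination–reflectance product into a sum, which can then be split by frequency: attenuate the slowly varying illumination, amplify the high-frequency reflectance detail, and exponentiate back. A hedged sketch (Gaussian low-pass as the frequency split; gains and sigma are illustrative, not the paper's values):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def homomorphic_filter(img, sigma=15.0, low_gain=0.5, high_gain=1.5):
    """log -> split into low (illumination) and high (reflectance) bands ->
    rescale each band -> exp back to intensities."""
    log_img = np.log1p(np.asarray(img, dtype=float))
    low = gaussian_filter(log_img, sigma)    # slowly varying illumination estimate
    high = log_img - low                     # reflectance detail
    return np.expm1(low_gain * low + high_gain * high)

# reflectance pattern under a strong left-to-right illumination gradient
cols = np.linspace(1.0, 4.0, 128)
illum = np.tile(cols, (128, 1))
xx, yy = np.meshgrid(np.arange(128), np.arange(128))
refl = 0.5 + 0.5 * (((xx // 8) + (yy // 8)) % 2)   # checkerboard texture
img = refl * illum

out = homomorphic_filter(img)
ratio_before = float(img[:, 96:].mean() / img[:, :32].mean())
ratio_after = float(out[:, 96:].mean() / out[:, :32].mean())
```

    The right-to-left brightness ratio is compressed after filtering while the texture contrast is boosted, which is exactly the gray-level range compression plus contrast enhancement the abstract describes.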

  5. Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.

    PubMed

    Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar

    2017-11-21

    Nuclei detection in histology images is an essential part of computer-aided diagnosis of cancers and tumors. It is a challenging task due to the diverse and complicated structures of cells. In this work, we present an automated technique for detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications, but their strength has not been explored in the medical imaging domain until now. Our experimental results show that the proposed scheme gives state-of-the-art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features, as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient, and do not require specialized or expensive computing hardware. A cloud-based webserver of the proposed method and its Python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist .

  6. Efficiency analysis for 3D filtering of multichannel images

    NASA Astrophysics Data System (ADS)

    Kozhemiakin, Ruslan A.; Rubel, Oleksii; Abramov, Sergey K.; Lukin, Vladimir V.; Vozel, Benoit; Chehdi, Kacem

    2016-10-01

    Modern remote sensing systems typically acquire multichannel images (dual- or multi-polarization, multi- and hyperspectral) in which noise, usually with different characteristics, is present in all components. If the noise is intensive, it is desirable to remove (suppress) it before applying methods of image classification, interpretation, and information extraction. This can be done using one of two approaches: component-wise or vectorial (3D) filtering. The second approach has shown higher efficiency when there is essential correlation between multichannel image components, as often happens for multichannel remote sensing data of different origins. Within the class of 3D filtering techniques, there are many possibilities and variations. In this paper, we consider filtering based on the discrete cosine transform (DCT) and pay attention to two aspects of processing. First, we study in detail what changes in DCT coefficient statistics take place for 3D denoising compared to component-wise processing. Second, we analyze how the selection of component images united into a 3D data array influences the efficiency of filtering, and whether the observed tendencies can be exploited in the processing of images with a rather large number of channels.
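    The benefit of vectorial processing comes from inter-channel correlation concentrating signal energy in few 3D transform coefficients. A hedged sketch of DCT-domain hard thresholding on a small data cube (a whole-cube transform for brevity; practical DCT denoisers typically work block-wise with overlapping blocks):

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct3_denoise(vol, thr):
    """Hard-threshold the 3D DCT spectrum of a multichannel data cube:
    coefficients below thr are treated as noise and zeroed."""
    c = dctn(vol, norm='ortho')
    c[np.abs(c) < thr] = 0.0
    return idctn(c, norm='ortho')

rng = np.random.default_rng(4)
clean = np.zeros((32, 32, 8))
clean[8:24, 8:24, :] = 1.0            # feature correlated across all 8 channels
sigma = 0.2
noisy = clean + rng.normal(0.0, sigma, clean.shape)

den = dct3_denoise(noisy, thr=3.0 * sigma)   # ~3-sigma threshold

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_den = float(np.mean((den - clean) ** 2))
```

    Because the feature repeats across channels, its energy collapses onto low-order coefficients along the channel axis, so the same threshold removes far more noise than a channel-by-channel 2D DCT would.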

  7. High-Speed Imaging Optical Pyrometry for Study of Boron Nitride Nanotube Generation

    NASA Technical Reports Server (NTRS)

    Inman, Jennifer A.; Danehy, Paul M.; Jones, Stephen B.; Lee, Joseph W.

    2014-01-01

    A high-speed imaging optical pyrometry system is designed for making in-situ measurements of boron temperature during the boron nitride nanotube synthesis process. Spectrometer measurements show molten boron emission to be essentially graybody in nature, lacking spectral emission fine structure over the visible range of the electromagnetic spectrum. Camera calibration experiments are performed and compared with theoretical calculations to quantitatively establish the relationship between observed signal intensity and temperature. The one-color pyrometry technique described herein involves measuring temperature based upon the absolute signal intensity observed through a narrowband spectral filter, while the two-color technique uses the ratio of the signals through two spectrally separated filters. The present study calibrated both the one- and two-color techniques at temperatures between 1,173 K and 1,591 K using a pco.dimax HD CMOS-based camera along with three such filters having transmission peaks near 550 nm, 632.8 nm, and 800 nm.
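    For a graybody emitter, the two-color technique inverts the ratio of radiances through two narrowband filters for temperature; under the Wien approximation the emissivity and calibration constants cancel. A sketch using the record's 550 nm and 800 nm filter wavelengths (the exact calibration constants of the camera system are not reproduced here):

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * H * C**2 / lam**5) / (np.exp(H * C / (lam * KB * T)) - 1.0)

def two_color_temperature(ratio, lam1, lam2):
    """Invert the Wien-approximation radiance ratio I(lam1)/I(lam2)
    for temperature; valid when h*c/(lam*kB*T) >> 1."""
    c2 = H * C / KB
    return c2 * (1.0 / lam2 - 1.0 / lam1) / np.log(ratio * (lam1 / lam2) ** 5)

lam1, lam2 = 550e-9, 800e-9
T_true = 1400.0                                  # K, inside the calibrated range
ratio = planck(lam1, T_true) / planck(lam2, T_true)
T_est = float(two_color_temperature(ratio, lam1, lam2))
```

    At these wavelengths and temperatures hc/(λkT) is above 12, so the Wien approximation error is negligible; in practice the filter bandwidths and camera spectral response are folded into the calibration described in the record.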

  8. Markerless positional verification using template matching and triangulation of kV images acquired during irradiation for lung tumors treated in breath-hold

    NASA Astrophysics Data System (ADS)

    Hazelaar, Colien; Dahele, Max; Mostafavi, Hassan; van der Weide, Lineke; Slotman, Ben; Verbakel, Wilko

    2018-06-01

    Lung tumors treated in breath-hold are subject to inter- and intra-breath-hold variations, which makes tumor position monitoring during each breath-hold important. A markerless technique is desirable, but limited tumor visibility on kV images makes this challenging. We evaluated whether template matching + triangulation of kV projection images acquired during breath-hold stereotactic treatments could determine 3D tumor position. Band-pass filtering and/or digital tomosynthesis (DTS) were used as image pre-filtering/enhancement techniques. On-board kV images continuously acquired during volumetric modulated arc irradiation of (i) a 3D-printed anthropomorphic thorax phantom with three lung tumors (n = 6 stationary datasets, n = 2 gradually moving) and (ii) four patients (13 datasets) were analyzed. 2D reference templates (filtered DRRs) were created from planning CT data. Normalized cross-correlation was used for 2D matching between templates and pre-filtered/enhanced kV images. For 3D verification, each registration was triangulated with multiple previous registrations. Generally applicable image processing/algorithm settings for lung tumors in breath-hold were identified. For the stationary phantom, the interquartile range of the 3D position vector was on average 0.25 mm for 12° DTS + band-pass filtering (average detected positions in 2D = 99.7%, 3D = 96.1%, and 3D excluding the first 12° due to triangulation angle = 99.9%), compared to 0.81 mm for band-pass filtering only (55.8/52.9/55.0%). For the moving phantom, RMS errors in the lateral/longitudinal/vertical directions after 12° DTS + band-pass filtering were 1.5/0.4/1.1 mm and 2.2/0.3/3.2 mm. For the clinical data, the 2D position was determined for at least 93% of each dataset, and the 3D position (excluding the first 12°) for at least 82% of each dataset, using 12° DTS + band-pass filtering. 
    Template matching + triangulation using DTS + band-pass filtered images could accurately determine the position of stationary lung tumors. However, triangulation was less accurate/reliable for targets with continuous, gradual displacement in the lateral and vertical directions. This technique is therefore currently most suited to detect/monitor offsets occurring between initial setup and the start of treatment, inter-breath-hold variations, and tumors with predominantly longitudinal motion.
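    The normalized cross-correlation step used for 2D matching can be sketched compactly: slide the template over the image and score each window by the correlation of mean-subtracted intensities. A brute-force toy version (real pipelines use FFT acceleration and the DTS/band-pass pre-filtering described above):

```python
import numpy as np

def ncc_match(image, template):
    """Normalized cross-correlation of a template against every same-size
    window of the image; returns (row, col) of the best match and its score."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.linalg.norm(t)
    best, best_pos = -2.0, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            wc = image[i:i + th, j:j + tw]
            wc = wc - wc.mean()
            denom = np.linalg.norm(wc) * tnorm
            if denom == 0:
                continue
            score = float(np.sum(wc * t) / denom)
            if score > best:
                best, best_pos = score, (i, j)
    return best_pos, best

rng = np.random.default_rng(7)
image = rng.normal(0.0, 1.0, (48, 48))
template = image[20:28, 12:20].copy()    # patch standing in for a tumor template
pos, score = ncc_match(image, template)
```

    Each 2D match fixes only a ray through the tumor; triangulating matches from multiple gantry angles, as in the paper, recovers the 3D position.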

  9. Nanophotonic Image Sensors

    PubMed Central

    Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R. S.

    2016-01-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. PMID:27239941

  10. SU-E-I-57: Evaluation and Optimization of Effective-Dose Using Different Beam-Hardening Filters in Clinical Pediatric Shunt CT Protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, K; Aldoohan, S; Collier, J

    Purpose: To study image optimization and radiation dose reduction in a pediatric shunt CT scanning protocol through the use of different beam-hardening filters. Methods: A 64-slice CT scanner at OU Children's Hospital was used to evaluate CT image contrast-to-noise ratio (CNR) and measure effective doses based on the concept of the CT dose index (CTDIvol) using the pediatric head shunt scanning protocol. The routine axial pediatric head shunt scanning protocol, optimized for the intrinsic x-ray tube filter, was used to evaluate CNR by acquiring images of the ACR-approved CT phantom; a radiation dose CT phantom was used to measure CTDIvol. These results were set as reference points to study the effects of adding different filtering materials (Tungsten, Tantalum, Titanium, Nickel, and Copper filters) to the existing filter on image quality and radiation dose. To ensure optimal image quality, the scanner's routine air calibration was run for each added filter. The image CNR was evaluated for different kVps and a wide range of mAs values using the above-mentioned beam-hardening filters. These scanning protocols were run under axial as well as helical techniques. The CTDIvol and the effective dose were measured and calculated for all scanning protocols and added filtration, including the intrinsic x-ray tube filter. Results: The beam-hardening filters shape the energy spectrum, which reduces the dose by 27%, with no noticeable changes in image low-contrast detectability. Conclusion: Effective dose is strongly dependent on the CTDIvol, which in turn is strongly dependent on the beam-hardening filters. A substantial reduction in effective dose is realized using beam-hardening filters as compared to the intrinsic filter. This phantom study showed that a significant radiation dose reduction can be achieved in pediatric shunt CT scanning protocols without compromising the diagnostic value of image quality.

  11. Fractional order integration and fuzzy logic based filter for denoising of echocardiographic image.

    PubMed

    Saadia, Ayesha; Rashdi, Adnan

    2016-12-01

    Ultrasound is widely used for imaging due to its cost effectiveness and safety. However, ultrasound images are inherently corrupted with speckle noise, which severely degrades their quality and creates difficulty for physicians in diagnosis. To get the maximum benefit from ultrasound imaging, image denoising is an essential requirement. To perform image denoising, a two-stage methodology using a fuzzy weighted mean and a fractional integration filter has been proposed in this research work. In stage 1, image pixels are processed by applying a 3 × 3 window around each pixel; fuzzy logic is used to assign weights to the pixels in each window, and the central pixel of the window is replaced with the weighted mean of all neighboring pixels in the same window. Noise suppression is achieved by weighting the pixels while preserving edges and other important features of the image. In stage 2, the resulting image is further improved by a fractional order integration filter. The effectiveness of the proposed methodology has been analyzed for standard test images artificially corrupted with speckle noise and for real ultrasound B-mode images. Results of the proposed technique have been compared with different state-of-the-art techniques, including Lsmv, Wiener, geometric filter, bilateral, non-local means, wavelet, Perona et al., total variation (TV), Global Adaptive Fractional Integral Algorithm (GAFIA), and Improved Fractional Order Differential (IFD) model. The comparison has been done on a quantitative and qualitative basis. For quantitative analysis, metrics such as Peak Signal to Noise Ratio (PSNR), Speckle Suppression Index (SSI), Structural Similarity (SSIM), Edge Preservation Index (β), and Correlation Coefficient (ρ) have been used. Simulations have been done using Matlab. 
    Simulation results on artificially corrupted standard test images and two real echocardiographic images reveal that the proposed method outperforms existing image denoising techniques reported in the literature. The proposed method for denoising echocardiographic images is effective in noise suppression/removal; it not only removes noise from an image but also preserves edges and other important structures. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
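    The stage-1 idea of a fuzzy weighted mean over a 3 × 3 window can be illustrated with a simple membership function (a Gaussian of intensity difference from the centre pixel); this is a hedged stand-in, not the paper's actual rule base, and the fractional-integration stage is omitted:

```python
import numpy as np

def fuzzy_weighted_mean(img, h=0.2):
    """Replace each pixel by a weighted mean of its 3x3 neighbourhood,
    with weights from a fuzzy 'similar to centre' membership
    (Gaussian of intensity difference -- illustrative choice)."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, 1, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            w = np.exp(-((win - img[i, j]) ** 2) / (2.0 * h * h))
            out[i, j] = np.sum(w * win) / np.sum(w)
    return out

rng = np.random.default_rng(8)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0                               # sharp vertical edge
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
den = fuzzy_weighted_mean(noisy, h=0.2)

mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_den = float(np.mean((den - clean) ** 2))
```

    Pixels across the edge differ by ~1 intensity unit, so their membership is essentially zero and the edge is untouched, while same-side pixels are averaged, which is the edge-preserving noise suppression the abstract claims.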

  12. Filtered Rayleigh Scattering Measurements in a Buoyant Flow Field

    DTIC Science & Technology

    2008-03-01

    ENY/08-M22 Abstract: Filtered Rayleigh Scattering (FRS) is a non-intrusive, laser-based flow characterization technique that consists of a narrow-linewidth laser, a molecular absorption filter, and a high-resolution camera behind the filter to record images. Gases of different species have different molecular scattering cross-sections that become apparent as they pass through the interrogating laser light source, and this difference is

  13. Assessing FRET using Spectral Techniques

    PubMed Central

    Leavesley, Silas J.; Britain, Andrea L.; Cichon, Lauren K.; Nikolaev, Viacheslav O.; Rich, Thomas C.

    2015-01-01

    Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein–protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP–Epac–YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis. 
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. PMID:23929684
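    Linear spectral unmixing, the best performer above, is an ordinary least-squares fit of the measured spectrum to known endmember spectra. A self-contained sketch with two synthetic Gaussian emission shapes standing in for donor and acceptor (illustrative shapes, not real CFP/YFP spectra):

```python
import numpy as np

# two synthetic endmember emission spectra on a common wavelength grid
wl = np.linspace(450, 600, 151)
donor = np.exp(-((wl - 480.0) ** 2) / (2 * 15.0 ** 2))
acceptor = np.exp(-((wl - 530.0) ** 2) / (2 * 15.0 ** 2))
E = np.column_stack([donor, acceptor])       # endmember matrix, shape (151, 2)

# a "measured" spectrum: 30% donor + 70% acceptor plus a little noise
rng = np.random.default_rng(5)
measured = 0.3 * donor + 0.7 * acceptor + rng.normal(0.0, 0.005, wl.size)

# linear unmixing: least-squares solve for the abundance of each endmember
coeffs, *_ = np.linalg.lstsq(E, measured, rcond=None)
acceptor_fraction = float(coeffs[1] / (coeffs[0] + coeffs[1]))  # FRET-like ratio
```

    Because the fit uses the whole spectrum rather than two band-pass windows, spectral cross-talk is modeled rather than ignored, which is consistent with the lower coefficient of variation the study reports for unmixing.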

  14. Assessing FRET using spectral techniques.

    PubMed

    Leavesley, Silas J; Britain, Andrea L; Cichon, Lauren K; Nikolaev, Viacheslav O; Rich, Thomas C

    2013-10-01

    Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein-protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP-Epac-YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis. 
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. © 2013 International Society for Advancement of Cytometry.
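Linear spectral unmixing, the best-performing method above, can be sketched as a least-squares fit of measured spectra against known endmember spectra. The following is a minimal illustration with invented two-fluorophore data, not the CFP/YFP spectra from the study:

```python
# Hedged sketch: linear spectral unmixing of a two-fluorophore (donor/acceptor)
# emission spectrum via ordinary least squares. The reference spectra and the
# measured mixture below are illustrative, not data from the study.
import numpy as np

def unmix(measured, references):
    """Estimate per-fluorophore abundances by ordinary least squares.

    measured:   (n_wavelengths,) mixed emission spectrum
    references: (n_wavelengths, n_fluorophores) pure endmember spectra
    Returns the abundance vector (n_fluorophores,).
    """
    coeffs, *_ = np.linalg.lstsq(references, measured, rcond=None)
    return coeffs

# Toy endmember spectra sampled at 5 wavelengths (CFP- and YFP-like shapes).
donor    = np.array([1.0, 0.8, 0.4, 0.1, 0.0])
acceptor = np.array([0.0, 0.2, 0.6, 1.0, 0.7])
A = np.column_stack([donor, acceptor])

# Synthesize a mixture: 30% donor + 70% acceptor.
mixture = 0.3 * donor + 0.7 * acceptor
abundances = unmix(mixture, A)
```

In practice the abundance ratio, rather than the raw coefficients, would feed the FRET efficiency estimate.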

  15. A Bio Medical Waste Identification and Classification Algorithm Using MLTrP and RVM.

    PubMed

    Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi

    2016-10-01

    We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using a median filtering technique that efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, a Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton, and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate, and accuracy with the help of MATLAB. When compared with existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications, such as hospital and healthcare management systems, for proper BMW disposal.
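The median-filtering preprocessing step can be sketched in a few lines; this is a plain 3x3 median filter with clamped borders, for illustration only (real pipelines would typically use a library routine such as scipy.ndimage.median_filter):

```python
# Minimal sketch of 3x3 median filtering, the noise-reduction step described
# above. Border pixels are handled by clamping indices to the image edge.
def median_filter_3x3(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = []
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    window.append(img[yy][xx])
            window.sort()
            out[y][x] = window[4]  # median of the 9 window values
    return out

# A single bright impulse ("salt" noise) is removed entirely.
noisy = [[0, 0, 0],
         [0, 255, 0],
         [0, 0, 0]]
clean = median_filter_3x3(noisy)
```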

  16. Optimization of a dual-energy contrast-enhanced technique for a photon-counting digital breast tomosynthesis system: I. A theoretical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carton, Ann-Katherine; Ullberg, Christer; Lindman, Karin

    2010-11-15

    Purpose: Dual-energy (DE) iodine contrast-enhanced x-ray imaging of the breast has been shown to identify cancers that would otherwise be mammographically occult. In this article, theoretical modeling was performed to obtain optimally enhanced iodine images for a photon-counting digital breast tomosynthesis (DBT) system using a DE acquisition technique. Methods: In the system examined, the breast is scanned with a multislit prepatient collimator aligned with a multidetector camera. Each detector collects a projection image at a unique angle during the scan. Low-energy (LE) and high-energy (HE) projection images are acquired simultaneously in a single scan by covering alternate collimator slits with Sn and Cu filters, respectively. Sn filters ranging from 0.08 to 0.22 mm thickness and Cu filters from 0.11 to 0.27 mm thickness were investigated. A tube voltage of 49 kV was selected. Tomographic images, hereafter referred to as DBT images, were reconstructed using a shift-and-add algorithm. Iodine-enhanced DBT images were acquired by performing a weighted logarithmic subtraction of the HE and LE DBT images. The DE technique was evaluated for 20-80 mm thick breasts. Weighting factors, w_t, that optimally cancel breast tissue were computed. Signal-difference-to-noise ratios (SDNRs) between iodine-enhanced and nonenhanced breast tissue, normalized to the square root of the mean glandular dose (MGD), were computed as a function of the fraction of the MGD allocated to the HE images. Peak SDNR/√MGD values and optimal dose allocations were identified. SDNR/√MGD and dose allocations were computed for several practically feasible system configurations (i.e., determined by the number of collimator slits covered by Sn and Cu). A practical system configuration and Sn-Cu filter pair that account for the trade-off between SDNR, tube output, and MGD were selected. 
Results: w_t depends on the Sn-Cu filter combination used, as well as on the breast thickness; to optimally cancel 0% to 50% glandular breast tissue, w_t values were found to range from 0.46 to 0.72 for all breast thicknesses and Sn-Cu filter pairs studied. The optimal w_t values needed to cancel all possible breast tissue glandularities vary by less than 1% for 20 mm thick breasts and 18% for 80 mm breasts. The system configuration in which one collimator slit covered by Sn is alternated with two collimator slits covered by Cu delivers SDNR/√MGD nearest to the peak value. A reasonable compromise is a 0.16 mm Sn-0.23 mm Cu filter pair, resulting in SDNR values between 1.64 and 0.61 and MGD between 0.70 and 0.53 mGy for 20-80 mm thick breasts at the maximum tube current. Conclusions: A DE acquisition technique for a photon-counting DBT imaging system has been developed and optimized.
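The weighted logarithmic subtraction that forms the iodine-enhanced image can be written as DE = ln(I_HE) - w_t * ln(I_LE). A minimal numerical sketch, with made-up transmission values (only the weighting-factor range comes from the abstract):

```python
# Illustrative sketch of dual-energy weighted logarithmic subtraction.
# Pixel transmissions are invented; w_t is chosen inside the 0.46-0.72
# range reported above.
import numpy as np

def dual_energy_subtract(high, low, w_t):
    """Cancel breast-tissue contrast by weighted log subtraction."""
    return np.log(high) - w_t * np.log(low)

w_t = 0.6
he = np.array([0.80, 0.50])   # high-energy transmissions: tissue, iodine pixel
le = np.array([0.70, 0.20])   # low-energy transmissions (iodine absorbs more)
de_image = dual_energy_subtract(he, le, w_t)
```

With a well-chosen w_t, the tissue pixel lands near zero while the iodine pixel retains contrast.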

  17. Microscopy with spatial filtering for sorting particles and monitoring subcellular morphology

    NASA Astrophysics Data System (ADS)

    Zheng, Jing-Yi; Qian, Zhen; Pasternack, Robert M.; Boustany, Nada N.

    2009-02-01

    Optical scatter imaging (OSI) was developed to non-invasively track real-time changes in particle morphology with submicron sensitivity in situ, without exogenous labeling, cell fixing, or organelle isolation. For spherical particles, the intensity ratio of wide-to-narrow angle scatter (OSIR, Optical Scatter Image Ratio) was shown to decrease monotonically with diameter and to agree with Mie theory. In living cells, we recently reported that this technique is able to detect mitochondrial morphological alterations, which were mediated by the Bcl-xL transmembrane domain and could not be observed in fluorescence or differential interference contrast images. Here we further extend this morphology assessment by adopting a digital micromirror device (DMD) for Fourier filtering. When placed in the Fourier plane, the DMD can be used to select scattering intensities at any desired combination of scattering angles. We designed an optical filter bank of Gabor-like filters with various scales and rotations; Gabor filters have been widely used for localizing spatial and frequency information in digital images and for texture analysis. Using a model system consisting of mixtures of polystyrene spheres and bacteria, we show how this system can be used to sort particles on a microscope slide based on their size, orientation, and aspect ratio. We are currently applying this technique to characterize the morphology of subcellular organelles to help understand fundamental biological processes.
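A Gabor-like filter bank of the kind described (several scales and rotations) can be generated digitally as follows; the kernel sizes and parameters here are illustrative, not the patterns loaded onto the DMD:

```python
# Minimal sketch of a Gabor-like filter bank: an oriented sinusoid under an
# isotropic Gaussian envelope, at two scales and three orientations.
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real-valued Gabor kernel of shape (size, size)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * xr / wavelength)  # oriented sinusoid
    return envelope * carrier

bank = [gabor_kernel(15, wl, th, sigma=4.0)
        for wl in (4.0, 8.0)                    # two scales
        for th in (0.0, np.pi / 4, np.pi / 2)]  # three orientations
```

Filtering an image with each kernel (or, as here, selecting the corresponding Fourier-plane region) yields per-scale, per-orientation responses from which aspect ratio and orientation can be inferred.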

  18. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
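The two-color thermographic-phosphor measurement reduces to mapping a ratio of wavelength-filtered intensity images to temperature through a monotonic calibration curve. A hedged sketch with invented calibration points (not the coatings or wavelengths used in the facility):

```python
# Illustrative two-color ratio thermometry: the per-pixel ratio of two
# wavelength-filtered images is converted to temperature by interpolating
# a calibration curve. All numbers below are assumptions for the example.
import numpy as np

cal_ratio = np.array([0.2, 0.5, 1.0, 1.8, 3.0])       # I_a / I_b
cal_temp = np.array([300.0, 350.0, 400.0, 450.0, 500.0])  # kelvin

def temperature_map(img_a, img_b):
    ratio = img_a / img_b
    return np.interp(ratio, cal_ratio, cal_temp)

a = np.array([[0.5, 1.0], [2.0, 3.0]])
b = np.array([[1.0, 1.0], [2.0, 1.0]])
temps = temperature_map(a, b)
```

Using a ratio of two bands cancels first-order variations in illumination and coating thickness, which is the reason for the two-color approach.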

  19. Quantitative iodine-123 IMP imaging of brain perfusion in schizophrenia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, M.B.; Lake, R.R.; Graham, L.S.

    1989-10-01

    Decreased perfusion in the frontal lobes of patients with chronic schizophrenia has been reported by multiple observers using a variety of techniques. Other observers have been unable to confirm this finding using similar techniques. In this study, quantitative single photon emission computed tomography brain imaging was performed using (p,5n)-produced ¹²³I-IMP in five normal subjects and ten chronically medicated patients with schizophrenia. The acquisition data were preprocessed with an image-dependent Metz filter and reconstructed using a ramp-filtered back projection technique. The uptake in each of 50 regions of interest in each subject was normalized to the uptake in the cerebellum. There were no significant differences between the corresponding ratios of normal subjects and patients with schizophrenia, even at the p = 0.15 level. Hypofrontality was not observed.

  20. Analysis of Video-Based Microscopic Particle Trajectories Using Kalman Filtering

    PubMed Central

    Wu, Pei-Hsun; Agarwal, Ashutosh; Hess, Henry; Khargonekar, Pramod P.; Tseng, Yiider

    2010-01-01

    The fidelity of the trajectories obtained from video-based particle tracking determines the success of a variety of biophysical techniques, including in situ single cell particle tracking and in vitro motility assays. However, the image acquisition process is complicated by system noise, which causes positioning error in the trajectories derived from image analysis. Here, we explore the possibility of reducing the positioning error by the application of a Kalman filter, a powerful algorithm to estimate the state of a linear dynamic system from noisy measurements. We show that the optimal Kalman filter parameters can be determined in an appropriate experimental setting, and that the Kalman filter can markedly reduce the positioning error while retaining the intrinsic fluctuations of the dynamic process. We believe the Kalman filter can potentially serve as a powerful tool to infer a trajectory of ultra-high fidelity from noisy images, revealing the details of dynamic cellular processes. PMID:20550894
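The core of the approach above is the standard scalar Kalman recursion (predict, gain, update). A minimal sketch on one noisy coordinate, with arbitrary illustration values for the process and measurement variances rather than the experimentally determined parameters:

```python
# Hedged sketch of a scalar (constant-position) Kalman filter applied to a
# noisy particle coordinate. q and r are illustrative variances.
import random

def kalman_smooth(measurements, q=1e-4, r=0.25):
    """q: process-noise variance, r: measurement-noise variance."""
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                  # predict: covariance grows by q
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the innovation z - x
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

random.seed(0)
truth = 5.0
noisy = [truth + random.gauss(0.0, 0.5) for _ in range(200)]
smoothed = kalman_smooth(noisy)
```

The ratio q/r controls the trade-off the paper discusses: too small and genuine particle motion is smoothed away, too large and the positioning noise passes through.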

  1. Separation of man-made and natural patterns in high-altitude imagery of agricultural areas

    NASA Technical Reports Server (NTRS)

    Samulon, A. S.

    1975-01-01

    A nonstationary linear digital filter is designed and implemented which extracts the natural features from high-altitude imagery of agricultural areas. Essentially, from an original image a new image is created which displays information related to soil properties, drainage patterns, crop disease, and other natural phenomena, and contains no information about crop type or row spacing. A model is developed to express the recorded brightness in a narrow-band image in terms of man-made and natural contributions and which describes statistically the spatial properties of each. The form of the minimum mean-square error linear filter for estimation of the natural component of the scene is derived and a suboptimal filter is implemented. Nonstationarity of the two-dimensional random processes contained in the model requires a unique technique for deriving the optimum filter. Finally, the filter depends on knowledge of field boundaries. An algorithm for boundary location is proposed, discussed, and implemented.
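The minimum mean-square error estimation idea above is, in spirit, a Wiener filter: pass spectral components where the natural signal dominates and suppress those where the periodic man-made pattern dominates. A 1D frequency-domain sketch under strong simplifying assumptions (component power spectra taken as known; not the paper's nonstationary two-dimensional formulation):

```python
# Hedged 1D sketch of MMSE (Wiener) separation of a smooth "natural"
# component from a periodic "man-made" row pattern. All signals and
# spectra are illustrative models, not data from the paper.
import numpy as np

n = 256
x = np.arange(n)
natural = np.convolve(np.random.default_rng(1).normal(size=n),
                      np.ones(9) / 9.0, mode="same")   # smooth random field
manmade = 0.8 * np.sin(2.0 * np.pi * x / 8.0)          # periodic row pattern
scene = natural + manmade

# Wiener filter H = S_nat / (S_nat + S_man), built here from the (assumed
# known) component spectra; a small constant avoids division by zero.
F_nat = np.abs(np.fft.fft(natural)) ** 2
F_man = np.abs(np.fft.fft(manmade)) ** 2
H = F_nat / (F_nat + F_man + 1e-12)
estimate = np.real(np.fft.ifft(np.fft.fft(scene) * H))
```

The filter notches out the row-spacing frequency while leaving the broadband natural component nearly untouched, which is the qualitative behavior the paper's optimal filter is designed to achieve.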

  2. An excitation wavelength-scanning spectral imaging system for preclinical imaging

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Jiang, Yanan; Patsekin, Valery; Rajwa, Bartek; Robinson, J. Paul

    2008-02-01

    Small-animal fluorescence imaging is a rapidly growing field, driven by applications in cancer detection and pharmaceutical therapies. However, the practical use of this imaging technology is limited by image-quality issues related to autofluorescence background from animal tissues, as well as attenuation of the fluorescence signal due to scatter and absorption. To combat these problems, spectral imaging and analysis techniques are being employed to separate the fluorescence signal from background autofluorescence. To date, these technologies have focused on detecting the fluorescence emission spectrum at a fixed excitation wavelength. We present an alternative to this technique, an imaging spectrometer that detects the fluorescence excitation spectrum at a fixed emission wavelength. The advantages of this approach include increased available information for discrimination of fluorescent dyes, a decreased optical radiation dose to the animal, and the ability to scan a continuous wavelength range instead of sampling discrete wavelengths. This excitation-scanning imager utilizes an acousto-optic tunable filter (AOTF), with supporting optics, to scan the excitation spectrum. Advanced image acquisition and analysis software has also been developed for classification and unmixing of the spectral image sets. Filtering has been implemented in a single-pass configuration with a bandwidth (full width at half maximum) of 16 nm at a central diffracted wavelength of 550 nm. We have characterized AOTF filtering over a wide range of incident light angles, much wider than has been previously reported in the literature, and we show how changes in incident light angle can be used to attenuate AOTF side lobes and alter bandwidth. A new parameter, the in-band to out-of-band ratio, was defined to assess the quality of the filtered excitation light. Additional parameters were measured to allow objective characterization of the AOTF and the imager as a whole. 
This is necessary for comparing the excitation-scanning imager to other spectral and fluorescence imaging technologies. The effectiveness of the hyperspectral imager was tested by imaging and analysis of mice with injected fluorescent dyes. Finally, a discussion of the optimization of spectral fluorescence imagers is given, relating the effects of filter quality on fluorescence images collected and the analysis outcome.

  3. Dose reduction in fluoroscopic interventions using a combination of a region of interest (ROI) x-ray attenuator and spatially different, temporally variable temporal filtering

    NASA Astrophysics Data System (ADS)

    Swetadri Vasan, S. N.; Pope, Liza; Ionita, Ciprian N.; Titus, A. H.; Bednarek, D. R.; Rudin, S.

    2013-03-01

    A novel dose reduction technique for fluoroscopic interventions, involving a combination of a material x-ray region of interest (ROI) attenuator and a spatially different, temporally variable ROI temporal recursive filter, was used to guide the catheter to the ROI in three live animal studies, two involving rabbits and one involving a sheep. In the two rabbit studies presented, a catheter was guided to the entrance of the carotid artery. With the added ROI attenuator, the image under the high-attenuation region is very noisy. By using temporal filtering with a filter weight of 0.6 on previous frames, the noise is reduced. In the sheep study, the catheter was guided to the descending aorta of the animal. The sheep offered a relatively higher attenuation to the incident x-rays, and thus a higher temporal filter weight of 0.8 on previous frames was used during the procedure to reduce the noise to levels acceptable to the interventionalist. The image sequences from these studies show that a significant dose reduction of 5-6 times can be achieved with acceptable image quality outside the ROI by using the above-mentioned technique. Even though the temporal filter weighting outside the ROI is higher, the consequent lag does not prevent perception of catheter movement.
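The spatially varying recursive filter described above is a first-order IIR blend of the previous output and the current frame, with a per-pixel weight: heavy smoothing under the attenuator, little or none inside the ROI. A minimal sketch with invented frame contents:

```python
# Hedged sketch of a temporal recursive filter with a spatially varying
# weight map: out = alpha * previous + (1 - alpha) * current.
# Frame sizes, the ROI location, and weights are illustrative.
import numpy as np

def recursive_filter(prev, current, alpha):
    """alpha may be a scalar or a per-pixel weight map."""
    return alpha * prev + (1.0 - alpha) * current

# 4x4 frames; the central 2x2 ROI gets no temporal lag.
alpha = np.full((4, 4), 0.6)   # smoothing weight under the attenuator
alpha[1:3, 1:3] = 0.0          # ROI: pass the current frame through
prev = np.zeros((4, 4))
current = np.ones((4, 4))
out = recursive_filter(prev, current, alpha)
```

Inside the ROI the output tracks the current frame exactly (no lag on the catheter tip), while outside it the noise variance is reduced at the cost of some temporal lag.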

  4. New spectral imaging techniques for blood oximetry in the retina

    NASA Astrophysics Data System (ADS)

    Alabboud, Ied; Muyo, Gonzalo; Gorman, Alistair; Mordant, David; McNaught, Andrew; Petres, Clement; Petillot, Yvan R.; Harvey, Andrew R.

    2007-07-01

    Hyperspectral imaging of the retina presents a unique opportunity for direct and quantitative mapping of retinal biochemistry, particularly of the vasculature, where blood oximetry is enabled by the strong variation of absorption spectra with oxygenation. This is pertinent both to research and to clinical investigation and diagnosis of retinal diseases such as diabetes, glaucoma, and age-related macular degeneration. The optimal exploitation of hyperspectral imaging, however, presents a set of challenging problems, including: the poorly characterised and controlled optical environment of the structures within the retina to be imaged; the erratic motion of the eyeball; and the compounding effects of the optical sensitivity of the retina and the low numerical aperture of the eye. We have developed two spectral imaging techniques to address these issues. We describe first a system in which a liquid crystal tuneable filter is integrated into the illumination system of a conventional fundus camera to enable time-sequential, random-access recording of narrow-band spectral images. Image processing techniques are described to eradicate the artefacts that may be introduced by time-sequential imaging. In addition, we describe a unique snapshot spectral imaging technique dubbed IRIS that employs polarising interferometry and Wollaston prism beam splitters to simultaneously replicate and spectrally filter images of the retina into multiple spectral bands on a single detector array. Results of early clinical trials acquired with these two techniques, together with a physical model that enables oximetry mapping, are reported.

  5. Preprocessing of 2-Dimensional Gel Electrophoresis Images Applied to Proteomic Analysis: A Review.

    PubMed

    Goez, Manuel Mauricio; Torres-Madroñero, Maria Constanza; Röthlisberger, Sarah; Delgado-Trejos, Edilson

    2018-02-01

    Various methods and specialized software programs are available for processing two-dimensional gel electrophoresis (2-DGE) images. However, due to the anomalies present in these images, a reliable, automated, and highly reproducible system for 2-DGE image analysis has still not been achieved. The most common anomalies found in 2-DGE images include vertical and horizontal streaking, fuzzy spots, and background noise, which greatly complicate computational analysis. In this paper, we review the preprocessing techniques applied to 2-DGE images for noise reduction, intensity normalization, and background correction. We also present a quantitative comparison of non-linear filtering techniques applied to synthetic gel images, analyzing the performance of the filters under specific conditions. Synthetic proteins were modeled with a two-dimensional Gaussian distribution with adjustable parameters for changing the size, intensity, and degradation. Three types of noise were added to the images: Gaussian, Rayleigh, and exponential, with signal-to-noise ratios (SNRs) ranging from 8 to 20 decibels (dB). We compared the performance of wavelet, contourlet, total variation (TV), and wavelet-total variation (WTTV) techniques using the parameters SNR and spot efficiency. In terms of spot efficiency, contourlet and TV were more sensitive to noise than wavelet and WTTV. Wavelet worked best for images with SNR ranging from 10 to 20 dB, whereas WTTV performed better at high noise levels. Wavelet also presented the best performance at any level of Gaussian noise and at low levels (20-14 dB) of Rayleigh and exponential noise in terms of SNR. Finally, the performance of the non-linear filtering techniques was evaluated using a real 2-DGE image with previously identified proteins marked. Wavelet achieved the best detection rate for the real image. Copyright © 2018 Beijing Institute of Genomics, Chinese Academy of Sciences and Genetics Society of China. Production and hosting by Elsevier B.V. All rights reserved.

  6. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John M.; Herren, Kenneth A.

    2008-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  7. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Herren, Kenneth

    2007-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  8. Real-Time flare detection using guided filter

    NASA Astrophysics Data System (ADS)

    Lin, Jiaben; Deng, Yuanyong; Yuan, Fei; Guo, Juan

    2017-04-01

    A procedure is introduced for the automatic detection of solar flares in full-disk solar images from Huairou Solar Observing Station (HSOS), National Astronomical Observatories of China. In image preprocessing, a median filter is applied to remove noise. We then adopt the guided filter, here introduced into astronomical image detection for the first time, to enhance the edges of flares and to restrain the solar limb darkening. Flares are then detected by a modified Otsu algorithm and a further threshold-processing technique. Compared with other automatic detection procedures, the new procedure has advantages such as real-time performance and reliability, with no need for image division or local thresholds. It also greatly reduces the amount of computation, benefiting from the efficient guided filter algorithm. The procedure has been tested on a one-month sequence (December 2013) of HSOS full-disk solar images, and the number of flares detected agrees well with manual detection.
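The Otsu step chooses the global threshold that maximizes the between-class variance of the image histogram. A minimal sketch on toy bimodal data standing in for a filtered full-disk image (this is plain Otsu, not the modified variant the procedure uses):

```python
# Hedged sketch of Otsu thresholding: scan all thresholds and keep the one
# maximizing between-class variance of the (here 8-bit) histogram.
def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg, w_bg, best_t, best_var = 0.0, 0, 0, -1.0
    for t in range(levels):
        w_bg += hist[t]              # background: bins 0..t
        if w_bg == 0:
            continue
        w_fg = total - w_bg          # foreground: bins t+1..levels-1
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        m_bg = sum_bg / w_bg
        m_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (m_bg - m_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy data: dark disk background near 30, bright "flare" near 200.
pixels = [30] * 90 + [35] * 10 + [200] * 15 + [5 * 42] * 5
t = otsu_threshold(pixels)
```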

  9. Integration of adaptive guided filtering, deep feature learning, and edge-detection techniques for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Wan, Xiaoqing; Zhao, Chunhui; Gao, Bing

    2017-11-01

    The integration of an edge-preserving filtering technique in the classification of a hyperspectral image (HSI) has been proven effective in enhancing classification performance. This paper proposes an ensemble strategy for HSI classification using an edge-preserving filter along with a deep learning model and edge detection. First, an adaptive guided filter is applied to the original HSI to reduce the noise in degraded images and to extract powerful spectral-spatial features. Second, the extracted features are fed as input to a stacked sparse autoencoder to adaptively exploit more invariant and deep feature representations; then, a random forest classifier is applied to fine-tune the entire pretrained network and determine the classification output. Third, a Prewitt compass operator is further applied to the HSI to extract the edges of the first principal component after dimension reduction. Moreover, a region-growing rule is applied to the resulting edge logical image to determine the local region for each unlabeled pixel. Finally, the categories of the corresponding neighborhood samples are determined in the original classification map; then, a majority-voting mechanism is used to generate the final output. Extensive experiments showed that the proposed method achieves competitive performance compared with several traditional approaches.

  10. LROC assessment of non-linear filtering methods in Ga-67 SPECT imaging

    NASA Astrophysics Data System (ADS)

    De Clercq, Stijn; Staelens, Steven; De Beenhouwer, Jan; D'Asseler, Yves; Lemahieu, Ignace

    2006-03-01

    In emission tomography, iterative reconstruction is usually followed by a linear smoothing filter to make the images more appropriate for visual inspection and diagnosis by a physician. This results in a global blurring of the images, smoothing across edges and possibly discarding valuable image information for detection tasks. The purpose of this study is to investigate what advantages a non-linear, edge-preserving postfilter could have for lesion detection in Ga-67 SPECT imaging. Image quality can be defined based on the task that has to be performed on the image. This study used LROC observer studies based on a dataset created by CPU-intensive GATE Monte Carlo simulations of a voxelized digital phantom. The filters considered in this study were a linear Gaussian filter, a bilateral filter, the Perona-Malik anisotropic diffusion filter, and the Catté filtering scheme. The 3D MCAT software phantom was used to simulate the distribution of Ga-67 citrate in the abdomen. Tumor-present cases had a 1-cm diameter tumor randomly placed near the edges of the anatomical boundaries of the kidneys, bone, liver, and spleen. Our data set was generated from a single noisy background simulation using the bootstrap method, to significantly reduce the simulation time and to allow for a larger observer data set. Lesions were simulated separately and added to the background afterwards. These were then reconstructed with an iterative approach, using a sufficiently large number of MLEM iterations to establish convergence. The output of a numerical observer was used in a simplex optimization method to estimate an optimal set of parameters for each postfilter. No significant improvement was found from using edge-preserving filtering techniques over standard linear Gaussian filtering.
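Of the postfilters compared, Perona-Malik anisotropic diffusion is easy to sketch: an explicit diffusion scheme whose conductance drops near strong gradients, so flat regions smooth while edges survive. Parameters here are illustrative, not the observer-optimized ones from the study:

```python
# Hedged sketch of Perona-Malik anisotropic diffusion with the exponential
# edge-stopping function g = exp(-(|grad|/kappa)^2). Borders are replicated.
import numpy as np

def perona_malik(img, n_iter=10, kappa=30.0, dt=0.2):
    u = img.astype(float)
    for _ in range(n_iter):
        # differences to the four neighbors (replicated border => zero flux)
        n = np.roll(u, -1, axis=0); n[-1] = u[-1]
        s = np.roll(u, 1, axis=0);  s[0] = u[0]
        e = np.roll(u, -1, axis=1); e[:, -1] = u[:, -1]
        w = np.roll(u, 1, axis=1);  w[:, 0] = u[:, 0]
        for d in (n - u, s - u, e - u, w - u):
            g = np.exp(-(d / kappa) ** 2)  # small conductance across edges
            u += dt * g * d
    return u

# Step-edge test image: left half 0, right half 100.
img = np.zeros((8, 8))
img[:, 4:] = 100.0
out = perona_malik(img, n_iter=20)
```

On this step edge the gradient (100) far exceeds kappa (30), so the conductance is tiny and the edge is essentially preserved, unlike with a Gaussian filter.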

  11. Fast global image smoothing based on weighted least squares.

    PubMed

    Min, Dongbo; Choi, Sunghwan; Lu, Jiangbo; Ham, Bumsub; Sohn, Kwanghoon; Do, Minh N

    2014-12-01

    This paper presents an efficient technique for performing spatially inhomogeneous edge-preserving image smoothing, called the fast global smoother. Focusing on sparse Laplacian matrices consisting of a data term and a prior term (typically defined using four or eight neighbors for a 2D image), our approach efficiently solves such global objective functions. In particular, we approximate the solution of the memory- and computation-intensive large linear system, defined over a d-dimensional spatial domain, by solving a sequence of 1D subsystems. Our separable implementation enables applying a linear-time tridiagonal matrix algorithm to solve d three-point Laplacian matrices iteratively. Our approach combines the best of two paradigms, i.e., efficient edge-preserving filters and optimization-based smoothing. Our method has a runtime comparable to the fast edge-preserving filters, but its global optimization formulation overcomes many limitations of the local filtering approaches. Our method also achieves results of quality comparable to the state-of-the-art optimization-based techniques, but runs ∼10-30 times faster. Besides, considering the flexibility in defining an objective function, we further propose generalized fast algorithms that perform Lγ norm smoothing (0 < γ < 2) and support an aggregated (robust) data term for handling imprecise data constraints. We demonstrate the effectiveness and efficiency of our techniques in a range of image processing and computer graphics applications.
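The 1D building block described above is a three-point linear system (I + λL)u = f solved in linear time with the Thomas (tridiagonal) algorithm. A minimal sketch with an unweighted 1D Laplacian; the guidance-image weights on the off-diagonals, central to the actual method, are omitted for brevity:

```python
# Hedged sketch of the 1D subsystem of a weighted-least-squares smoother:
# solve (I + lam*L) u = f along a row with the Thomas algorithm.
def smooth_1d(f, lam=5.0):
    n = len(f)
    # tridiagonal coefficients of I + lam * L (Neumann boundaries)
    a = [-lam] * n               # sub-diagonal (a[0] unused)
    c = [-lam] * n               # super-diagonal (c[-1] unused)
    b = [1.0 + 2.0 * lam] * n
    b[0] = b[-1] = 1.0 + lam     # boundary rows have one neighbor only
    # forward elimination
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = f[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (f[i] - a[i] * dp[i - 1]) / m
    # back substitution
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u

signal = [0, 0, 10, 0, 0, 0, 10, 0, 0]
smoothed = smooth_1d(signal, lam=2.0)
```

Applying this solver alternately along rows and columns gives the separable d-pass scheme; making the off-diagonal λ terms depend on guidance-image gradients is what makes the smoothing edge-preserving.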

  12. White-Light Optical Information Processing and Holography.

    DTIC Science & Technology

    1982-05-03

    ...artifact noise. However, the deblurring spatial filter that we used was designed for a narrow spectral band centered at 5154 Å green light. To compensate for the scaling... Keywords: White-Light Processing, White-Light Holography, Image Processing, Optical Signal Processing, Image Subtraction, Image Deblurring. Abstract (continued): ...optical processing technique, we had shown that the incoherent-source technique provides better image quality and very low coherent artifact noise

  13. A linear shift-invariant image preprocessing technique for multispectral scanner systems

    NASA Technical Reports Server (NTRS)

    Mcgillem, C. D.; Riemer, T. E.

    1973-01-01

    A linear shift-invariant image preprocessing technique is examined which requires no specific knowledge of any parameter of the original image and which is sufficiently general to allow the effective radius of the composite imaging system to be arbitrarily shaped and reduced, subject primarily to the noise power constraint. In addition, the size of the point-spread function of the preprocessing filter can be arbitrarily controlled, thus minimizing truncation errors.

  14. Multispectral interference filter arrays with compensation of angular dependence or extended spectral range.

    PubMed

    Frey, Laurent; Masarotto, Lilian; Armand, Marilyn; Charles, Marie-Lyne; Lartigue, Olivier

    2015-05-04

    Thin film Fabry-Perot filter arrays with high selectivity can be realized with a single patterning step, generating a spatial modulation of the effective refractive index in the optical cavity. In this paper, we investigate the ability of this technology to address two applications in the field of image sensors. First, the spectral tuning may be used to compensate the blue-shift of the filters in oblique incidence, provided the filter array is located in an image plane of an optical system with higher field of view than aperture angle. The technique is analyzed for various types of filters and experimental evidence is shown with copper-dielectric infrared filters. Then, we propose a design of a multispectral filter array with an extended spectral range spanning the visible and near-infrared range, using a single set of materials and realizable on a single substrate.
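The blue-shift that the spectral tuning compensates follows the standard Fabry-Perot angular dependence, λ(θ) = λ0·sqrt(1 - (sin θ / n_eff)²). A small numerical illustration (λ0 and the effective cavity index are assumptions for the example, not the paper's filter designs):

```python
# Illustrative calculation of the Fabry-Perot peak blue-shift with incidence
# angle, the effect the spatially tuned filter array is meant to compensate.
import math

def peak_wavelength(lambda0, n_eff, theta_deg):
    """Transmission peak of a thin-film cavity at incidence angle theta."""
    s = math.sin(math.radians(theta_deg)) / n_eff
    return lambda0 * math.sqrt(1.0 - s * s)

lam0, n_eff = 850.0, 1.7      # nm, effective cavity index (assumed values)
shift = lam0 - peak_wavelength(lam0, n_eff, 20.0)   # blue-shift at 20 degrees
```

A shift of this order (tens of nanometers at steep field angles) explains why filters at the edge of the field must be designed with a correspondingly red-shifted normal-incidence peak.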

  15. A Multi-Scale Algorithm for Graffito Advertisement Detection from Images of Real Estate

    NASA Astrophysics Data System (ADS)

    Yang, Jun; Zhu, Shi-Jiao

    There is a significant need to automatically detect and extract graffito advertisements embedded in real-estate (housing) images. However, it is difficult to separate the advertisement region well, since housing images generally have complex backgrounds. In this paper, a detection algorithm that uses multi-scale Gabor filters to identify graffito regions is proposed. First, multi-scale Gabor filters with different orientations are applied to the housing images; the approach then uses the resulting frequency data to find likely graffito regions from the relationships among the different channels, exploiting the complementary responses of the different filters to solve the detection problem with low computational effort. Lastly, the method is tested on several real-estate images with embedded graffito advertisements to verify its robustness and efficiency. The experiments demonstrate that graffito regions can be detected quite well.

  16. Seismic imaging of the Waltham Canyon fault, California: comparison of ray‐theoretical and Fresnel volume prestack depth migration

    USGS Publications Warehouse

    Bauer, Klaus; Ryberg, Trond; Fuis, Gary S.; Lüth, Stefan

    2013-01-01

    Near‐vertical faults can be imaged using reflected refractions identified in controlled‐source seismic data. Often these phases are observed on only a few neighboring shot or receiver gathers, resulting in a low‐fold data set. Imaging can be carried out with Kirchhoff prestack depth migration, in which migration noise is suppressed by constructive stacking of large amounts of multifold data. Fresnel volume migration can be used for low‐fold data without severe migration noise, as the smearing along isochrones is limited to the first Fresnel zone around the reflection point. We developed a modified Fresnel volume migration technique to enhance imaging of steep faults and to suppress noise and undesired coherent phases. The modifications include target‐oriented filters to separate reflected refractions from steep‐dipping faults and reflections with hyperbolic moveout. Undesired phases such as multiple reflections, mode conversions, direct P and S waves, and surface waves are suppressed by these filters. As an alternative approach, we developed a new prestack line‐drawing migration method, which can be considered a proxy for an infinite-frequency approximation of the Fresnel volume migration. The line‐drawing migration does not consider waveform information but requires significantly less computational time. Target‐oriented filters were extended by dip filters in the line‐drawing migration method. The migration methods were tested with synthetic data and applied to real data from the Waltham Canyon fault, California. The two techniques are best applied in combination, to design filters and to generate complementary images of steep faults.

  17. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    PubMed

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; however, the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; and 2) the result remains a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, and the proposed scheme therefore has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
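A minimal sketch of the encoder idea: pre-filter with a local random binary (+/-1) convolution kernel, then keep one polyphase component. The 3x3 kernel and down-sampling factor of 2 are illustrative assumptions, not values fixed by the abstract.

```python
import numpy as np

def random_binary_measurements(image, block=2, seed=0):
    """Pre-filter with a local random +/-1 kernel, then keep every `block`-th
    pixel (one polyphase component), preserving the spatial layout."""
    rng = np.random.default_rng(seed)
    k = rng.choice([-1.0, 1.0], size=(3, 3)) / 3.0  # local random binary kernel
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    # valid-region correlation with the 3x3 kernel (borders left at zero)
    for dy in range(3):
        for dx in range(3):
            out[1:h-1, 1:w-1] += k[dy, dx] * image[dy:h-2+dy, dx:w-2+dx]
    return out[::block, ::block]  # polyphase down-sampling
```

The output is still a conventional (smaller) image of linear measurements, so any standard codec can compress it, which is the property the scheme exploits.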

  18. Structured illumination for wide-field Raman imaging of cell membranes

    NASA Astrophysics Data System (ADS)

    Chen, Houkai; Wang, Siqi; Zhang, Yuquan; Yang, Yong; Fang, Hui; Zhu, Siwei; Yuan, Xiaocong

    2017-11-01

    Conventional wide-field Raman imaging techniques offer fast imaging speeds compared with scanning schemes, although the diffraction limit restricts their lateral resolution. To extend the lateral resolution of filter-based wide-field Raman microscopy, a standing-wave illumination technique is used, and an improvement in lateral resolution by a factor of more than two is achieved. Specifically, functionalized surface-enhanced Raman scattering nanoparticles are employed to label cell membranes and strengthen the desired scattering signals. This wide-field Raman imaging technique affords significant opportunities for biological applications.

  19. Detection of urban expansion in an urban-rural landscape with multitemporal QuickBird images

    PubMed Central

    Lu, Dengsheng; Hetrick, Scott; Moran, Emilio; Li, Guiying

    2011-01-01

    Accurately detecting urban expansion with remote sensing techniques is a challenge due to the complexity of urban landscapes. This paper explored methods for detecting urban expansion with multitemporal QuickBird images in Lucas do Rio Verde, Mato Grosso, Brazil. Different techniques, including image differencing, principal component analysis (PCA), and comparison of classified impervious surface images with the matched filtering method, were used to detect urbanization. An impervious surface image classified with the hybrid method was used to modify the urbanization detection results. As a comparison, the original multispectral image and segmentation-based mean-spectral images were used during the detection of urbanization. This research indicates that the comparison of classified impervious surface images with the matched filtering method provides the best change detection performance, followed by the image differencing method based on segmentation-based mean-spectral images. PCA was not a good method for urban change detection in this study. Shadows and high spectral variation within impervious surfaces represent major challenges to the detection of urban expansion when high spatial resolution images are used. PMID:21799706

  20. Inhomogeneity compensation for MR brain image segmentation using a multi-stage FCM-based approach.

    PubMed

    Szilágyi, László; Szilágyi, Sándor M; Dávid, László; Benyó, Zoltán

    2008-01-01

    Intensity inhomogeneity or intensity non-uniformity (INU) is an undesired phenomenon that represents the main obstacle for MR image segmentation and registration methods. Various techniques have been proposed to eliminate or compensate for the INU, most of which are embedded into clustering algorithms. This paper proposes a multiple-stage fuzzy c-means (FCM) based algorithm for the estimation and compensation of slowly varying additive or multiplicative noise, supported by a pre-filtering technique for Gaussian and impulse noise elimination. The slowly varying behavior of the bias or gain field is assured by a smoothing filter that performs context-dependent averaging based on a morphological criterion. Experiments using 2-D synthetic phantoms and real MR images show that the proposed method provides accurate segmentation. The produced segmentation and fuzzy membership values can serve as excellent support for 3-D registration and segmentation techniques.
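The clustering engine underneath such methods is standard fuzzy c-means. A basic 1-D intensity version is sketched below, without the bias-field estimation and pre-filtering stages that are the paper's contribution; the fuzzifier m=2 and deterministic initialization are assumptions.

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=50):
    """Basic fuzzy c-means on a 1-D intensity vector.
    Returns cluster centers and the c-by-n fuzzy membership matrix."""
    centers = np.linspace(x.min(), x.max(), c)        # deterministic init
    u = np.full((c, x.size), 1.0 / c)
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12  # c x n distances
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                                 # fuzzy memberships
        centers = (u**m @ x) / (u**m).sum(axis=1)          # weighted means
    return centers, u
```

In the paper's multi-stage scheme, updates like these alternate with a morphologically constrained smoothing of the estimated bias/gain field.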

  1. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected, and classification parameters are optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as the shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
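Pixel duplication itself is a one-line operation; a NumPy sketch (enlargement factor of 2 assumed for illustration):

```python
import numpy as np

def pixel_duplicate(image, factor=2):
    """Enlarge an image by duplicating each pixel factor x factor times
    (no interpolation: every output pixel keeps an original intensity)."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)
```

Unlike interpolation, no new intensity values are invented; but a fixed-size smoothing filter applied to the enlarged image now averages over fewer distinct original pixels, which is exactly the interaction the error analysis examines.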

  2. Despeckle filtering software toolbox for ultrasound imaging of the common carotid artery.

    PubMed

    Loizou, Christos P; Theofanous, Charoula; Pantziaris, Marios; Kasparis, Takis

    2014-04-01

    Ultrasound imaging of the common carotid artery (CCA) is a non-invasive tool used in medicine to assess the severity of atherosclerosis and monitor its progression through time. It is also used in border detection and texture characterization of the atherosclerotic carotid plaque in the CCA, and in the identification and measurement of the intima-media thickness (IMT) and the lumen diameter, all of which are very important in the assessment of cardiovascular disease (CVD). Visual perception, however, is hindered by speckle, a multiplicative noise that degrades the quality of ultrasound B-mode imaging. Noise reduction is therefore essential for improving the visual observation quality, or as a pre-processing step for further automated analysis such as image segmentation of the IMT and the atherosclerotic carotid plaque in ultrasound images. In order to facilitate this preprocessing step, we have developed in MATLAB(®) a unified toolbox that integrates image despeckle filtering (IDF), texture analysis and image quality evaluation techniques to automate the pre-processing and complement the disease evaluation in ultrasound CCA images. The proposed software is based on a graphical user interface (GUI) and incorporates image normalization, 10 different despeckle filtering techniques (DsFlsmv, DsFwiener, DsFlsminsc, DsFkuwahara, DsFgf, DsFmedian, DsFhmedian, DsFad, DsFnldif, DsFsrad), image intensity normalization, 65 texture features, 15 quantitative image quality metrics and objective image quality evaluation. The software is publicly available in an executable form, which can be downloaded from http://www.cs.ucy.ac.cy/medinfo/. It was validated on 100 ultrasound images of the CCA by comparing its results with quantitative visual analysis performed by a medical expert. It was observed that the despeckle filters DsFlsmv and DsFhmedian improved image quality perception (based on the expert's assessment and the image texture and quality metrics). It is anticipated that the system could help the physician in the assessment of cardiovascular image analysis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
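As a minimal illustration of what one of the toolbox's filter families does, the sketch below applies a plain median filter (the idea behind DsFmedian) to a synthetic image corrupted by multiplicative speckle; the phantom, Gamma noise model, and 5x5 window are illustrative assumptions, not the toolbox's settings.

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(42)
clean = np.ones((64, 64))
clean[:, 32:] = 2.0                                   # two-region phantom
# Multiplicative speckle: unit-mean Gamma noise scales each pixel
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
despeckled = median_filter(speckled, size=5)          # DsFmedian-style filter

mse = lambda a, b: float(np.mean((a - b) ** 2))
```

The median filter suppresses the heavy-tailed speckle while keeping the boundary between the two regions, which is why despeckling is a useful pre-processing step before IMT and plaque segmentation.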

  3. Exploring an optimal wavelet-based filter for cryo-ET imaging.

    PubMed

    Huang, Xinrui; Li, Sha; Gao, Song

    2018-02-07

    Cryo-electron tomography (cryo-ET) is one of the most advanced technologies for the in situ visualization of molecular machines by producing three-dimensional (3D) biological structures. However, cryo-ET imaging has two serious disadvantages, low dose and low image contrast, which obscure high-resolution information with noise and degrade image quality, causing errors in biological interpretation. The purpose of this research is to explore an optimal wavelet denoising technique to reduce noise in cryo-ET images. We perform tests using simulation data and design a filter using the optimal selected wavelet parameters (three-level decomposition, level-1 zeroed out, subband-dependent thresholds, soft thresholding and a spline-based discrete dyadic wavelet transform (DDWT)), which we call a modified wavelet shrinkage filter; this filter is suitable for noisy cryo-ET data. When testing on real cryo-ET experimental data, higher-quality images and more accurate measures of a biological structure can be obtained with the modified wavelet shrinkage filter compared with conventional processing. Because the proposed method provides an inherent advantage when dealing with cryo-ET images, it can extend the current state of the art in assisting all aspects of cryo-ET studies: visualization, reconstruction, structural analysis, and interpretation.
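The general shape of a wavelet shrinkage filter, transform, soft-threshold the detail subbands, invert, can be sketched in 1-D. This sketch uses an orthonormal Haar transform and a single fixed threshold purely for illustration; the paper's filter uses a spline-based DDWT with subband-dependent thresholds and zeroes out level 1.

```python
import numpy as np

def haar_dwt(x):
    """One level of the 1-D Haar transform: (approximation, detail)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(x, t):
    """Soft thresholding: shrink toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def shrinkage_denoise(x, levels=3, thresh=0.5):
    """Multi-level wavelet shrinkage: threshold details, keep the approximation."""
    approx, details = x, []
    for _ in range(levels):
        approx, d = haar_dwt(approx)
        details.append(soft(d, thresh))
    for d in reversed(details):
        approx = haar_idwt(approx, d)
    return approx
```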

  4. Bayesian demosaicing using Gaussian scale mixture priors with local adaptivity in the dual tree complex wavelet packet transform domain

    NASA Astrophysics Data System (ADS)

    Goossens, Bart; Aelterman, Jan; Luong, Hiep; Pizurica, Aleksandra; Philips, Wilfried

    2013-02-01

    In digital cameras and mobile phones, there is an ongoing trend to increase the image resolution, decrease the sensor size, and use lower exposure times. Because smaller sensors inherently lead to more noise and worse spatial resolution, digital post-processing techniques are required to resolve many of the artifacts. Color filter arrays (CFAs), which use alternating patterns of color filters, are very popular for price and power consumption reasons. However, color filter arrays require the use of a post-processing technique such as demosaicing to recover full-resolution RGB images. Recently, there has been some interest in techniques that jointly perform demosaicing and denoising. This has the advantage that the demosaicing and denoising can be performed optimally (e.g., in the MSE sense) for the considered noise model, while avoiding the artifacts introduced when demosaicing and denoising are applied sequentially. In this paper, we continue the research line of wavelet-based demosaicing techniques. These approaches are computationally simple and well suited for combination with denoising. We therefore derive Bayesian minimum mean squared error (MMSE) joint demosaicing and denoising rules in the complex wavelet packet domain, taking local adaptivity into account. As an image model, we use Gaussian scale mixtures, thereby taking advantage of the directionality of the complex wavelets. Our results show that this technique is well capable of reconstructing fine details in the image, while removing all of the noise, at a relatively low computational cost. In particular, the complete reconstruction (including color correction, white balancing, etc.) of a 12-megapixel RAW image takes 3.5 seconds on a recent mid-range GPU.

  5. Power spectral density of Markov texture fields

    NASA Technical Reports Server (NTRS)

    Shanmugan, K. S.; Holtzman, J. C.

    1984-01-01

    Texture is an important image characteristic. A variety of spatial-domain techniques have been proposed for extracting and utilizing textural features for segmenting and classifying images. For the most part, these spatial-domain techniques are ad hoc in nature. A Markov random field model for image texture is discussed. A frequency-domain description of image texture is derived in terms of the power spectral density. This model is used for designing optimum frequency-domain filters for enhancing, restoring, and segmenting images based on their textural properties.
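The idea of a closed-form power spectral density for a Markov texture can be illustrated in 1-D with an AR(1) (first-order Markov) process x[n] = a·x[n-1] + w[n], whose PSD is sigma²/(1 - 2a·cos(w) + a²). The parameters below are arbitrary; the 2-D field model of the paper generalizes this.

```python
import numpy as np

# Simulate a 1-D first-order Markov (AR(1)) texture
a, n = 0.8, 4096
rng = np.random.default_rng(1)
w = rng.normal(size=n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = a * x[i - 1] + w[i]

# Empirical PSD (averaged periodogram over 256-sample segments)
seg_len = 256
segs = x[: (n // seg_len) * seg_len].reshape(-1, seg_len)
psd = np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0) / seg_len

# Closed-form PSD of the AR(1) model (unit noise variance)
f = np.fft.rfftfreq(seg_len)
theory = 1.0 / (1.0 - 2.0 * a * np.cos(2.0 * np.pi * f) + a**2)
```

For a > 0 the spectrum is strongly low-pass, which is exactly the structure a frequency-domain texture filter is designed against.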

  6. Constrained optimization of image restoration filters

    NASA Technical Reports Server (NTRS)

    Riemer, T. E.; Mcgillem, C. D.

    1973-01-01

    A linear shift-invariant preprocessing technique is described which requires no specific knowledge of the image parameters and which is sufficiently general to allow the effective radius of the composite imaging system to be minimized while constraining other system parameters to remain within specified limits.

  7. Optical filters for wavelength selection in fluorescence instrumentation.

    PubMed

    Erdogan, Turan

    2011-04-01

    Fluorescence imaging and analysis techniques have become ubiquitous in life science research, and they are poised to play an equally vital role in in vitro diagnostics (IVD) in the future. Optical filters are crucial for nearly all fluorescence microscopes and instruments, not only to provide the obvious function of spectral control, but also to ensure the highest possible detection sensitivity and imaging resolution. Filters make it possible for the sample to "see" light within only the absorption band, and the detector to "see" light within only the emission band. Without filters, the detector would not be able to distinguish the desired fluorescence from scattered excitation light and autofluorescence from the sample, substrate, and other optics in the system. Today the vast majority of fluorescence instruments, including the widely popular fluorescence microscope, use thin-film interference filters to control the spectra of the excitation and emission light. Hence, this unit emphasizes thin-film filters. After briefly introducing different types of thin-film filters and how they are made, the unit describes in detail different optical filter configurations in fluorescence instruments, including both single-color and multicolor imaging systems. Several key properties of thin-film filters, which can significantly affect optical system performance, are then described. In the final section, tunable optical filters are also addressed in a relative comparison.

  8. Image Display and Manipulation System (IDAMS) program documentation, Appendixes A-D. [including routines, convolution filtering, image expansion, and fast Fourier transformation

    NASA Technical Reports Server (NTRS)

    Cecil, R. W.; White, R. A.; Szczur, M. R.

    1972-01-01

    The IDAMS Processor is a package of task routines and support software that performs convolution filtering, image expansion, fast Fourier transformation, and other operations on a digital image tape. A unique task control card for that program, together with any necessary parameter cards, selects each processing technique to be applied to the input image. A variable number of tasks can be selected for execution by including the proper task and parameter cards in the input deck. An executive maintains control of the run; it initiates execution of each task in turn and handles any necessary error processing.

  9. Validation of Fujinon intelligent chromoendoscopy with high definition endoscopes in colonoscopy.

    PubMed

    Parra-Blanco, Adolfo; Jiménez, Alejandro; Rembacken, Björn; González, Nicolás; Nicolás-Pérez, David; Gimeno-García, Antonio Z; Carrillo-Palau, Marta; Matsuda, Takahisa; Quintero, Enrique

    2009-11-14

    To validate high definition endoscopes with Fujinon intelligent chromoendoscopy (FICE) in colonoscopy. The image quality of normal white light endoscopy (WLE), that of the 10 available FICE filters and that of a gold standard (0.2% indigo carmine dye) were compared. FICE-filter 4 [red, green, and blue (RGB) wavelengths of 520, 500, and 405 nm, respectively] provided the best images for evaluating the vascular pattern compared to white light. The mucosal surface was best assessed using filter 4. However, the views obtained were not rated significantly better than those observed with white light. The "gold standard", indigo carmine (IC) dye, was found to be superior to both white light and filter 4. Filter 6 (RGB wavelengths of 580, 520, and 460 nm, respectively) allowed for exploration of the IC-stained mucosa. When assessing mucosal polyps, both FICE with magnification, and magnification following dye spraying were superior to the same techniques without magnification and to white light imaging. In the presence of suboptimal bowel preparation, observation with the FICE mode was possible, and endoscopists considered it to be superior to observation with white light. FICE-filter 4 with magnification improves the image quality of the colonic vascular patterns obtained with WLE.

  10. A comparative study of new and current methods for dental micro-CT image denoising

    PubMed Central

    Lashgari, Mojtaba; Qin, Jie; Swain, Michael

    2016-01-01

    Objectives: The aim of the current study was to evaluate the application of two advanced noise-reduction algorithms for dental micro-CT images and to implement a comparative analysis of the performance of new and current denoising algorithms. Methods: Denoising was performed using Gaussian and median filters as the current filtering approaches and the block-matching and three-dimensional (BM3D) method and total variation method as the proposed new filtering techniques. The performance of the denoising methods was evaluated quantitatively using contrast-to-noise ratio (CNR), edge preserving index (EPI) and blurring indexes, as well as qualitatively using the double-stimulus continuous quality scale procedure. Results: The BM3D method had the best performance with regard to preservation of fine textural features (CNR_Edge), non-blurring of the whole image (blurring index), the clinical visual score in images with very fine features and the overall visual score for all types of images. On the other hand, the total variation method provided the best results with regard to smoothing of images in texture-free areas (CNR_Tex-free) and in preserving the edges and borders of image features (EPI). Conclusions: The BM3D method is the most reliable technique for denoising dental micro-CT images with very fine textural details, such as shallow enamel lesions, in which the preservation of the texture and fine features is of the greatest importance. On the other hand, the total variation method is the technique of choice for denoising images without very fine textural details in which the clinician or researcher is interested mainly in anatomical features and structural measurements. PMID:26764583

  11. Impact of atmospheric correction and image filtering on hyperspectral classification of tree species using support vector machine

    NASA Astrophysics Data System (ADS)

    Shahriari Nia, Morteza; Wang, Daisy Zhe; Bohlman, Stephanie Ann; Gader, Paul; Graves, Sarah J.; Petrovic, Milenko

    2015-01-01

    Hyperspectral images can be used to identify savannah tree species at the landscape scale, which is a key step in measuring biomass and carbon, and in tracking changes in species distributions, including invasive species, in these ecosystems. Before automated species mapping can be performed, image processing and atmospheric correction are often performed, which can potentially affect the performance of classification algorithms. We determine how three processing and correction techniques (atmospheric correction, Gaussian filters, and shade/green vegetation filters) affect the prediction accuracy of classification of tree species at the pixel level from airborne visible/infrared imaging spectrometer imagery of longleaf pine savanna in Central Florida, United States. Species classification using fast line-of-sight atmospheric analysis of spectral hypercubes (FLAASH) atmospheric correction outperformed ATCOR in the majority of cases. Green vegetation (normalized difference vegetation index) and shade (near-infrared) filters did not increase classification accuracy when applied to large and continuous patches of specific species. Finally, applying a Gaussian filter reduces interband noise and increases species classification accuracy. Using the optimal preprocessing steps, our classification accuracy of six species classes is about 75%.

  12. Development of a hybrid image processing algorithm for automatic evaluation of intramuscular fat content in beef M. longissimus dorsi.

    PubMed

    Du, Cheng-Jin; Sun, Da-Wen; Jackman, Patrick; Allen, Paul

    2008-12-01

    An automatic method for estimating the content of intramuscular fat (IMF) in beef M. longissimus dorsi (LD) was developed using a sequence of image processing algorithms. To extract IMF particles within the LD muscle from the structural features of intermuscular fat surrounding the muscle, a three-step image processing algorithm was developed: a bilateral filter for noise removal, kernel fuzzy c-means clustering (KFCM) for segmentation, and vector confidence connected and flood fill for IMF extraction. Bilateral filtering was first applied to reduce the noise and enhance the contrast of the beef image. KFCM was then used to segment the filtered beef image into lean, fat, and background. The IMF was finally extracted from the original beef image using the techniques of vector confidence connected and flood filling. The performance of the algorithm was verified by correlation analysis between the IMF characteristics and the percentage of chemically extractable IMF content (P<0.05). Five IMF features are very significantly correlated with the fat content (P<0.001), including the count densities of middle (CDMiddle) and large (CDLarge) fat particles, the area densities of middle and large fat particles, and the total fat area per unit LD area. The highest coefficient is 0.852 for CDLarge.
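The first step, bilateral filtering, smooths noise while keeping fat/lean boundaries because its weights combine spatial proximity and intensity similarity. A brute-force single-channel sketch (window radius and the two sigmas are illustrative assumptions):

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.2):
    """Edge-preserving smoothing: each output pixel is a weighted mean of its
    neighborhood, weighted by spatial distance AND intensity difference."""
    h, w = img.shape
    pad = np.pad(img, radius, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(x**2 + y**2) / (2 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            intensity = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * intensity
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out
```

Pixels across a strong intensity edge receive near-zero weight, so the fat/lean boundary survives while within-region noise is averaged away.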

  13. Analysis of the potential for non-invasive imaging of oxygenation at heart depth, using ultrasound optical tomography (UOT) or photo-acoustic tomography (PAT).

    PubMed

    Walther, Andreas; Rippe, Lars; Wang, Lihong V; Andersson-Engels, Stefan; Kröll, Stefan

    2017-10-01

    Despite the important medical implications, it is currently an open task to find optical non-invasive techniques that can image deep organs in humans. Addressing this, photo-acoustic tomography (PAT) has received a great deal of attention in the past decade, owing to favorable properties like high contrast and high spatial resolution. However, even with optimal components PAT cannot penetrate beyond a few centimeters, which still presents an important limitation of the technique. Here, we calculate the absorption contrast levels for PAT and for ultrasound optical tomography (UOT) and compare them to their relevant noise sources as a function of imaging depth. The results indicate that a new development in optical filters, based on rare-earth-ion crystals, can push the UOT technique significantly ahead of PAT. Such filters allow the contrast-to-noise ratio for UOT to be up to three orders of magnitude better than for PAT at depths of a few cm into the tissue. It also translates into a significant increase of the image depth of UOT compared to PAT, enabling deep organs to be imaged in humans in real time. Furthermore, such spectral holeburning filters are not sensitive to speckle decorrelation from the tissue and can operate at nearly any angle of incident light, allowing good light collection. We theoretically demonstrate the improved performance in the medically important case of non-invasive optical imaging of the oxygenation level of the frontal part of the human myocardial tissue. Our results indicate that further studies on UOT are of interest and that the technique may have large impact on future directions of biomedical optics.

  14. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    NASA Astrophysics Data System (ADS)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping. The coastline serves as the basic point of reference and is used on nautical charts for navigation purposes. Its delineation has become more crucial in the wake of the many recent earthquakes and tsunamis, which have completely changed and forced the redrawing of some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study, a semi-automated technique and procedures are presented for shoreline delineation from a RADARSAT-1 image. A scene of RADARSAT-1 satellite imagery was processed using an enhanced filtering technique to identify and extract the shoreline coast of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capabilities. First, speckles were removed from the image using a Lee sigma filter, which reduces random noise, enhances the image, and helps discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image averaging filter in reducing random noise over the sea surface, especially near the shoreline. It enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques showed the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides as well as shoreline erosion in a tropical country like Malaysia.
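SAR speckle filters of the Lee family adapt a local-statistics weight so that homogeneous sea areas are averaged strongly while land-water boundaries are left sharp. The sketch below implements the classical local-statistics Lee filter, a simplification of the Lee sigma filter used in the study (the sigma variant adds a two-sigma intensity test); the window size and assumed multiplicative noise variance are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=5, noise_var=0.25):
    """Simplified Lee speckle filter: blend the local mean and the raw pixel
    according to how much local variance exceeds the expected speckle variance."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img**2, size)
    var = np.maximum(sq_mean - mean**2, 0.0)
    # Portion of local variance attributable to the scene, not to speckle
    signal_var = np.maximum(var - (mean**2) * noise_var, 0.0)
    k = signal_var / np.maximum(var, 1e-12)   # k -> 0 in homogeneous areas
    return mean + k * (img - mean)
```

Over open water (homogeneous, speckle-dominated) k is near zero and the filter outputs the local mean; at the shoreline the scene variance dominates and the original edge is retained.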

  15. 3D Display Using Conjugated Multiband Bandpass Filters

    NASA Technical Reports Server (NTRS)

    Bae, Youngsam; White, Victor E.; Shcheglov, Kirill

    2012-01-01

    Stereoscopic display techniques are based on the principle of displaying two views, with slightly different perspectives, in such a way that the left view is seen only by the left eye and the right view only by the right eye. However, one of the major challenges in such optical devices is crosstalk between the two channels. Crosstalk occurs when the optical device does not completely block the wrong-side image, so the left eye sees a little of the right image and the right eye sees a little of the left image. This results in eyestrain and headaches. A pair of interference filters worn as an optical device can solve the problem. The device consists of a pair of multiband bandpass filters that are conjugated. The term "conjugated" describes passband regions of one filter that do not overlap with those of the other but are interdigitated with them. Along with the glasses, a 3D display produces colors composed of primary colors (the basis for producing colors) having spectral bands the same as the passbands of the filters. More specifically, the primary colors producing one viewpoint are made up of the passbands of one filter, and those of the other viewpoint are made up of the passbands of the conjugated filter. Thus, the primary colors of one view are seen only by the eye with the matching multiband filter. The inherent characteristics of the interference filters allow little or no transmission of the wrong side of the stereoscopic images.

  16. Multispectral imaging for medical diagnosis

    NASA Technical Reports Server (NTRS)

    Anselmo, V. J.

    1977-01-01

    Photography technique determines amount of morbidity present in tissue. Imaging apparatus incorporates numerical filtering. Overall system operates in near-real time. Information gained from this system enables physician to understand extent of injury and leads to accelerated treatment.

  17. SU-F-J-189: A Method to Improve the Spatial Resolution of Prompt Gamma Based Compton Imaging for Proton Range Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, E; Chen, H; Polf, J

    Purpose: To test two new techniques, the distance-of-closest-approach (DCA) and Compton line (CL) filters, developed as a means of improving the spatial resolution of Compton camera (CC) imaging. Methods: Gammas emitted from {sup 22}Na, {sup 137}Cs, and {sup 60}Co point sources were measured with a prototype 3-stage CC. The energy deposited and position of each interaction in each stage were recorded and used to calculate a “cone-of-origin” for each gamma that scattered twice in the CC. A DCA filter was developed which finds the shortest distance from the gamma’s cone-of-origin surface to the location of the gamma source. The DCA filter was applied to the data to determine the initial energy of the gamma and to remove “bad” interactions that only contribute noise to the image. Additionally, a CL filter, which removes gamma events that do not follow the theoretical predictions of the Compton scatter equation, was used to further remove “bad” interactions from the measured data. Images were then reconstructed with raw, unfiltered data, DCA-filtered data, and DCA+CL-filtered data, and the achievable image resolution of each dataset was compared. Results: Spatial resolutions of ∼2 mm, and better than 2 mm, were achievable with the DCA and DCA+CL filtered data, respectively, compared to >5 mm for the raw, unfiltered data. Conclusion: In many special cases in medical imaging where information about the source position may be known, such as proton radiotherapy range verification, the application of the DCA and CL filters can result in considerable improvements in the achievable spatial resolution of Compton imaging.
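The DCA computation reduces to point-to-cone geometry: given the cone apex (first scatter position), axis, and Compton half-angle θ, the distance from a known source position to the cone surface is r·|sin(α − θ)|, where r is the apex-to-source distance and α the angle between the apex-to-source vector and the axis. A sketch, valid for points in front of the apex; event filtering then keeps only cones whose DCA to the assumed source falls below a cut:

```python
import numpy as np

def cone_dca(apex, axis, half_angle, point):
    """Distance of closest approach from a point to a cone-of-origin surface
    (forward nappe of an infinite cone)."""
    v = np.asarray(point, float) - np.asarray(apex, float)
    u = np.asarray(axis, float) / np.linalg.norm(axis)
    r = np.linalg.norm(v)
    alpha = np.arccos(np.clip(np.dot(v, u) / r, -1.0, 1.0))
    return r * abs(np.sin(alpha - half_angle))
```

A point lying on the cone gives a DCA of zero; a point on the axis at unit distance from the apex of a 45-degree cone is sqrt(1/2) away from the surface.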

  18. Morphological filtering and multiresolution fusion for mammographic microcalcification detection

    NASA Astrophysics Data System (ADS)

    Chen, Lulin; Chen, Chang W.; Parker, Kevin J.

    1997-04-01

    Mammographic images are often of relatively low contrast and poor sharpness, with a non-stationary background or clutter, and are usually corrupted by noise. In this paper, we propose a new method for microcalcification detection using gray-scale morphological filtering followed by multiresolution fusion, and present a unified general filtering form, called the local operating transformation, for whitening filtering and adaptive thresholding. The gray-scale morphological filters are used to remove all large areas that are considered non-stationary background or clutter variations, i.e., to prewhiten the images. The multiresolution fusion decision is based on matched filter theory. In addition to the normal matched filter, the Laplacian matched filter, which is directly related through the wavelet transform to multiresolution analysis, is exploited for microcalcification feature detection. At the multiresolution fusion stage, region growing techniques are used at each resolution level. The parent-child relations between resolution levels are adopted to make the final detection decision. An FROC curve is computed from tests on the Nijmegen database.
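    Gray-scale morphological prewhitening of this kind is commonly realized as a white top-hat: subtracting a gray-scale opening removes the slowly varying background and keeps small bright residues such as candidate microcalcifications. A minimal NumPy sketch with a flat square structuring element; the window size `k` is illustrative, not the paper's choice:

```python
import numpy as np

def gray_erode(img, k):
    """Gray-scale erosion with a flat k x k structuring element (k odd)."""
    r = k // 2
    p = np.pad(img, r, mode="edge")
    out = np.full(img.shape, np.inf)
    for dy in range(k):
        for dx in range(k):
            out = np.minimum(out, p[dy:dy + img.shape[0], dx:dx + img.shape[1]])
    return out

def gray_dilate(img, k):
    """Gray-scale dilation with a flat k x k structuring element."""
    r = k // 2
    p = np.pad(img, r, mode="edge")
    out = np.full(img.shape, -np.inf)
    for dy in range(k):
        for dx in range(k):
            out = np.maximum(out, p[dy:dy + img.shape[0], dx:dx + img.shape[1]])
    return out

def white_top_hat(img, k=15):
    """Opening (erosion then dilation) removes bright structures smaller than
    the structuring element; the residual keeps exactly those structures."""
    opening = gray_dilate(gray_erode(img, k), k)
    return img - opening
```

A single bright impulse on a flat background survives the top-hat; the background is removed entirely.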

  19. Ultrasound Image Despeckling Using Stochastic Distance-Based BM3D.

    PubMed

    Santos, Cid A N; Martins, Diego L N; Mascarenhas, Nelson D A

    2017-06-01

    Ultrasound image despeckling is an important research field, since it can improve the interpretability of one of the main categories of medical imaging. Many techniques have been tried over the years for ultrasound despeckling, and more recently, a great deal of attention has been focused on patch-based methods, such as non-local means and block-matching and 3D collaborative filtering (BM3D). A common idea in these recent methods is the measure of distance between patches, originally proposed as the Euclidean distance for filtering additive white Gaussian noise. In this paper, we derive new stochastic distances for the Fisher-Tippett distribution, based on well-known statistical divergences, and use them as patch distance measures in a modified version of the BM3D algorithm for despeckling log-compressed ultrasound images. State-of-the-art results in filtering simulated, synthetic, and real ultrasound images confirm the potential of the proposed approach.

  20. A novel Kalman filter based video image processing scheme for two-photon fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Sun, Wenqing; Huang, Xia; Li, Chunqiang; Xiao, Chuan; Qian, Wei

    2016-03-01

    Two-photon fluorescence microscopy (TPFM) is a well-suited optical imaging technique for monitoring the interaction between fast-moving viruses and their hosts. However, due to strong, unavoidable background noise from the culture, videos obtained by this technique are too noisy to reveal this fast infection process without video image processing. In this study, we developed a novel scheme to eliminate background noise, recover background bacteria images, and improve video quality. In our scheme, we modified and implemented the following methods for both host and virus videos: a correlation method, a round identification method, tree-structured nonlinear filters, Kalman filters, and a cell tracking method. After these procedures, most of the noise was eliminated and host images were recovered, with their moving directions and speeds highlighted in the videos. From the analysis of the processed videos, 93% of bacteria and 98% of viruses were correctly detected in each frame on average.
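    As a rough illustration of the Kalman filtering step, a per-pixel scalar Kalman filter with a random-walk intensity model can temporally denoise a frame sequence. This is a generic sketch, not the authors' scheme (their pipeline combines several other filters); the noise parameters `q` and `r` are illustrative:

```python
import numpy as np

def kalman_denoise(frames, q=1e-3, r=0.05):
    """Per-pixel scalar Kalman filter over a video sequence.
    State model: intensity follows a random walk (process noise q);
    each frame is a noisy measurement (measurement noise r).
    frames: sequence of 2-D arrays. Returns the filtered frames."""
    x = np.array(frames[0], dtype=float)   # state estimate
    p = np.ones_like(x)                    # estimate variance
    out = [x.copy()]
    for z in frames[1:]:
        p = p + q                          # predict: variance grows
        k = p / (p + r)                    # Kalman gain
        x = x + k * (np.asarray(z, float) - x)  # update toward measurement
        p = (1.0 - k) * p
        out.append(x.copy())
    return out
```

After a step change in intensity, the estimate converges toward the new value without ever overshooting it.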

  1. A fusion algorithm for infrared and visible based on guided filtering and phase congruency in NSST domain

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwen; Feng, Yan; Chen, Hang; Jiao, Licheng

    2017-10-01

    A novel and effective image fusion method is proposed for creating a highly informative and smooth surface of fused image through merging visible and infrared images. Firstly, a two-scale non-subsampled shearlet transform (NSST) is employed to decompose the visible and infrared images into detail layers and one base layer. Then, phase congruency is adopted to extract the saliency maps from the detail layers and a guided filtering is proposed to compute the filtering output of base layer and saliency maps. Next, a novel weighted average technique is used to make full use of scene consistency for fusion and obtaining coefficients map. Finally the fusion image was acquired by taking inverse NSST of the fused coefficients map. Experiments show that the proposed approach can achieve better performance than other methods in terms of subjective visual effect and objective assessment.

  2. The influence of software filtering in digital mammography image quality

    NASA Astrophysics Data System (ADS)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus, non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may degrade another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
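    Sharpening filters of this kind are typically variants of unsharp masking: the image minus a blurred copy is scaled and added back, which raises edge response (and the measured MTF at high frequencies) while amplifying noise by the same mechanism. A generic sketch, not Photoshop's proprietary implementation:

```python
import numpy as np

def box_blur(img, k=3):
    """Mean filter with a k x k box (k odd), edge-replicated borders."""
    r = k // 2
    p = np.pad(img, r, mode="edge")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0, k=3):
    """Sharpen by adding back the high-pass residual (image - blur)."""
    img = np.asarray(img, float)
    return img + amount * (img - box_blur(img, k))
```

The characteristic over/undershoot at a step edge is easy to observe: values exceed 1 on the bright side and dip below 0 on the dark side.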

  3. Adaptive Wiener filter super-resolution of color filter array images.

    PubMed

    Karch, Barry K; Hardie, Russell C

    2013-08-12

    Digital color cameras using a single detector array with a Bayer color filter array (CFA) require interpolation or demosaicing to estimate missing color information and provide full-color images. However, demosaicing does not specifically address fundamental undersampling and aliasing inherent in typical camera designs. Fast non-uniform interpolation based super-resolution (SR) is an attractive approach to reduce or eliminate aliasing and its relatively low computational load is amenable to real-time applications. The adaptive Wiener filter (AWF) SR algorithm was initially developed for grayscale imaging and has not previously been applied to color SR demosaicing. Here, we develop a novel fast SR method for CFA cameras that is based on the AWF SR algorithm and uses global channel-to-channel statistical models. We apply this new method as a stand-alone algorithm and also as an initialization image for a variational SR algorithm. This paper presents the theoretical development of the color AWF SR approach and applies it in performance comparisons to other SR techniques for both simulated and real data.

  4. Automatic detection of solar features in HSOS full-disk solar images using guided filter

    NASA Astrophysics Data System (ADS)

    Yuan, Fei; Lin, Jiaben; Guo, Jingjing; Wang, Gang; Tong, Liyue; Zhang, Xinwei; Wang, Bingxiang

    2018-02-01

    A procedure is introduced for the automatic detection of solar features in full-disk solar images from Huairou Solar Observing Station (HSOS), National Astronomical Observatories of China. In image preprocessing, a median filter is applied to remove noise. A guided filter, introduced here for the first time into astronomical target detection, is adopted to enhance the edges of solar features and suppress solar limb darkening. Specific features are then detected by the Otsu algorithm and a further thresholding technique. Compared with other automatic detection procedures, ours has advantages such as real-time operation and reliability, with no need for local thresholds. It also greatly reduces the amount of computation, a benefit of the efficient guided-filter algorithm. The procedure has been tested on a one-month sequence (December 2013) of HSOS full-disk solar images, and the results show that the number of features detected is consistent with manual detection.
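    For reference, the guided filter (in He et al.'s standard self-guided formulation, which this kind of pipeline builds on) can be sketched in a few lines of NumPy; the radius `r` and regularization `eps` are illustrative:

```python
import numpy as np

def box(img, r):
    """Mean over a (2r+1) x (2r+1) window, edge-replicated borders."""
    k = 2 * r + 1
    p = np.pad(img, r, mode="edge")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def guided_filter(I, p, r=4, eps=1e-3):
    """Edge-preserving guided filter: within each window the output is
    modeled as q = a*I + b, with a, b from local statistics of the
    guidance image I and input p."""
    mean_I, mean_p = box(I, r), box(p, r)
    var_I = box(I * I, r) - mean_I * mean_I
    cov_Ip = box(I * p, r) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box(a, r) * I + box(b, r)
```

Flat regions pass through unchanged, and a self-guided step edge is preserved far better than with a plain box blur.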

  5. Gaussian Process Kalman Filter for Focal Plane Wavefront Correction and Exoplanet Signal Extraction

    NASA Astrophysics Data System (ADS)

    Sun, He; Kasdin, N. Jeremy

    2018-01-01

    Currently, the ultimate limitation of space-based coronagraphy is the ability to subtract the residual PSF after wavefront correction to reveal the planet. Called reference difference imaging (RDI), the technique consists of conducting wavefront control to collect the reference point spread function (PSF) by observing a bright star, and then extracting target planet signals by subtracting a weighted sum of reference PSFs. Unfortunately, this technique is inherently inefficient because it spends a significant fraction of the observing time on the reference star rather than the target star with the planet. Recent progress in model based wavefront estimation suggests an alternative approach. A Kalman filter can be used to estimate the stellar PSF for correction by the wavefront control system while simultaneously estimating the planet signal. Without observing the reference star, the (extended) Kalman filter directly utilizes the wavefront correction data and combines the time series observations and model predictions to estimate the stellar PSF and planet signals. Because wavefront correction is used during the entire observation with no slewing, the system has inherently better stability. In this poster we show our results aimed at further improving our Kalman filter estimation accuracy by including not only temporal correlations but also spatial correlations among neighboring pixels in the images. This technique is known as a Gaussian process Kalman filter (GPKF). We also demonstrate the advantages of using a Kalman filter rather than RDI by simulating a real space exoplanet detection mission.

  6. MR signal intensity: staying on the bright side in MR image interpretation

    PubMed Central

    Bloem, Johan L; Reijnierse, Monique; Huizinga, Tom W J

    2018-01-01

    In 2003, the Nobel Prize for Medicine was awarded for contributions to the invention of MRI, reflecting the incredible value of MRI for medicine. Since 2003, enormous technical advancements have been made in acquiring MR images. However, MRI has a complicated, accident-prone dark side: images are not calibrated, and their appearance depends on all kinds of subjective choices in the settings of the machine, acquisition technique parameters, reconstruction techniques, data transmission, filtering and postprocessing techniques. The bright side is that understanding MR techniques increases opportunities to unravel characteristics of tissue. In this viewpoint, we summarise the different subjective choices that can be made to generate MR images and stress the importance of communication between radiologists and rheumatologists to correctly interpret images.

  7. Bowtie filters for dedicated breast CT: Theory and computational implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kontson, Kimberly, E-mail: Kimberly.Kontson@fda.hhs.gov; Jennings, Robert J.

    Purpose: To design bowtie filters with improved properties for dedicated breast CT to improve image quality and reduce dose to the patient. Methods: The authors present three different bowtie filters designed for a cylindrical 14-cm diameter phantom with a uniform composition of 40/60 breast tissue, which vary in their design objectives and performance improvements. Bowtie design #1 is based on single material spectral matching and produces nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis material decomposition to produce the same spectral shape and intensity at the detector, using two different materials. Bowtie design #3 eliminates the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. All three designs are obtained using analytical computational methods and linear attenuation coefficients. Thus, the designs do not take into account the effects of scatter. The authors considered this to be a reasonable approach to the filter design problem since the use of Monte Carlo methods would have been computationally intensive. The filter profiles for a cone-angle of 0° were used for the entire length of each filter because the differences between those profiles and the correct cone-beam profiles for the cone angles in our system are very small, and the constant profiles allowed construction of the filters with the facilities available to us. For evaluation of the filters, we used Monte Carlo simulation techniques and the full cone-beam geometry. Images were generated with and without each bowtie filter to analyze the effect on dose distribution, noise uniformity, and contrast-to-noise ratio (CNR) homogeneity. Line profiles through the reconstructed images generated from the simulated projection images were also used as validation for the filter designs. 
Results: Examples of the three designs are presented. Initial verification of performance of the designs was done using analytical computations of HVL, intensity, and effective attenuation coefficient behind the phantom as a function of fan-angle with a cone-angle of 0°. The performance of the designs depends only weakly on incident spectrum and tissue composition. For all designs, the dynamic range requirement on the detector was reduced compared to the no-bowtie-filter case. Further verification of the filter designs was achieved through analysis of reconstructed images from simulations. Simulation data also showed that the use of our bowtie filters can reduce peripheral dose to the breast by 61% and provide uniform noise and CNR distributions. The bowtie filter design concepts validated in this work were then used to create a computational realization of a 3D anthropomorphic bowtie filter capable of achieving a constant effective attenuation coefficient behind the entire field-of-view of an anthropomorphic breast phantom. Conclusions: Three different bowtie filter designs that vary in performance improvements were described and evaluated using computational and simulation techniques. Results indicate that the designs are robust against variations in breast diameter, breast composition, and tube voltage, and that the use of these filters can reduce patient dose and improve image quality compared to the no-bowtie-filter case.
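    The third design principle (equal effective attenuation along every ray) reduces to simple geometry for a cylindrical phantom: choose the filter thickness so that filter-plus-phantom attenuation is constant across fan angle. A sketch under stated assumptions; the attenuation coefficients, source distance, and the 7 cm radius for the 14-cm phantom are illustrative, and scatter is ignored as in the authors' analytic design:

```python
import numpy as np

def chord_length(theta, R=7.0, D=65.0):
    """Path length (cm) through a cylindrical phantom of radius R for a
    fan-beam ray at angle theta; D is the source-to-isocenter distance."""
    h = D * np.sin(theta)  # ray's perpendicular distance from the cylinder axis
    return 2.0 * np.sqrt(np.clip(R * R - h * h, 0.0, None))

def bowtie_thickness(theta, mu_tissue=0.22, mu_filter=0.57, R=7.0, D=65.0):
    """Filter thickness t(theta) making mu_f*t + mu_p*L(theta) equal for
    every ray (the constant-effective-attenuation design). The mu values
    are illustrative linear attenuation coefficients in 1/cm."""
    L0 = chord_length(0.0, R, D)
    return mu_tissue * (L0 - chord_length(theta, R, D)) / mu_filter
```

By construction the thickness is zero for the central ray and grows toward the edge of the fan, where the phantom path shortens.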

  8. GPU Accelerated Vector Median Filter

    NASA Technical Reports Server (NTRS)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n^2 vectors has to be compared in distance with the other n^2 - 1 vectors. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which to the best of our knowledge has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimization of the GPU algorithm.
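    A reference CPU implementation makes the cost structure explicit: every window vector is compared with every other, which is exactly the per-pixel O(n^4) work the GPU version parallelizes. A NumPy sketch, not the paper's CUDA code:

```python
import numpy as np

def vector_median_filter(img, k=3):
    """Reference CPU vector median filter for an H x W x 3 color image.
    For each k x k window, the output is the window vector that minimizes
    the sum of Euclidean distances to all other vectors in the window."""
    H, W, _ = img.shape
    r = k // 2
    p = np.pad(img, ((r, r), (r, r), (0, 0)), mode="edge").astype(float)
    out = np.empty((H, W, 3))
    for y in range(H):
        for x in range(W):
            win = p[y:y + k, x:x + k].reshape(-1, 3)        # k*k color vectors
            d = np.linalg.norm(win[:, None] - win[None, :], axis=2)
            out[y, x] = win[d.sum(axis=1).argmin()]          # vector median
    return out
```

Unlike channel-wise median filtering, the output at every pixel is one of the input color vectors, so no false colors are introduced; an isolated impulse is replaced by a neighboring color.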

  9. Adaptive Filtering to Enhance Noise Immunity of Impedance and Admittance Spectroscopy: Comparison with Fourier Transformation

    NASA Astrophysics Data System (ADS)

    Stupin, Daniil D.; Koniakhin, Sergei V.; Verlov, Nikolay A.; Dubina, Michael V.

    2017-05-01

    The time-domain technique for impedance spectroscopy consists of computing the Fourier images of the excitation voltage and current response by fast or discrete Fourier transformation and calculating their ratio. Here we propose an alternative method for processing the excitation voltage and current response to derive a system impedance spectrum, based on a fast and flexible adaptive filtering method. We show the equivalence between the problem of adaptive filter learning and deriving the system impedance spectrum. Specifically, we express the impedance via the adaptive filter weight coefficients. The noise-canceling property of adaptive filtering is also justified. Using the RLC circuit as a model system, we experimentally show that adaptive filtering yields correct admittance spectra and element ratings in high-noise conditions where the Fourier-transform technique fails. By providing additional sensitivity for impedance spectroscopy, adaptive filtering can be applied to otherwise impossible-to-interpret time-domain impedance data. The advantages of adaptive filtering are demonstrated with practical living-cell impedance measurements.
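    The core idea, expressing the spectrum through adaptive filter weights, can be sketched with a plain LMS filter that learns an FIR map from voltage to current; the DFT of the converged weights then estimates the admittance spectrum. All parameters are illustrative and this is not the authors' exact algorithm:

```python
import numpy as np

def lms_identify(x, d, n_taps=32, mu=0.005):
    """LMS adaptive filter: learn FIR weights w so that (w * x)[n] ~ d[n].
    With x the excitation voltage and d the measured current, the DFT of
    the converged weights estimates the admittance spectrum:
    Y = np.fft.rfft(w)."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # most recent sample first
        e = d[n] - w @ u                   # prediction error
        w += 2 * mu * e * u                # stochastic-gradient update
    return w
```

Identifying a known two-tap system from white-noise excitation recovers the tap values, illustrating the noise-averaging behavior the abstract describes.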

  10. Detection of retinal nerve fiber layer defects in retinal fundus images using Gabor filtering

    NASA Astrophysics Data System (ADS)

    Hayashi, Yoshinori; Nakagawa, Toshiaki; Hatanaka, Yuji; Aoyama, Akira; Kakogawa, Masakatsu; Hara, Takeshi; Fujita, Hiroshi; Yamamoto, Tetsuya

    2007-03-01

    Retinal nerve fiber layer defect (NFLD) is one of the most important findings for the diagnosis of glaucoma reported by ophthalmologists. However, such changes can be overlooked, especially in mass screenings, because ophthalmologists have limited time to search for a number of different changes in the diagnosis of various diseases such as diabetes, hypertension and glaucoma. Therefore, the use of a computer-aided detection (CAD) system can improve diagnostic results. In this work, a technique for the detection of NFLDs in retinal fundus images is proposed. In the preprocessing step, blood vessels are "erased" from the original retinal fundus image by morphological filtering. The preprocessed image is then transformed into a rectangular array. NFLD regions appear as vertical dark bands in the transformed image. Gabor filtering is then applied to enhance the vertical dark bands. False positives (FPs) are reduced by a rule-based method that uses the location and width of each candidate region. The detected regions are back-transformed into the original configuration. In this preliminary study, 71% of NFLD regions were detected, with an average of 3.2 FPs per image. In conclusion, we have developed a technique for the detection of NFLDs in retinal fundus images, and promising results have been obtained in this initial study.
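    Enhancing vertical dark bands is the classic use case for a real, even-symmetric Gabor kernel oriented along the band direction. A sketch with illustrative parameters (`sigma`, `wavelength`, and `size` are not the paper's values):

```python
import numpy as np

def gabor_kernel(sigma=4.0, wavelength=8.0, theta=0.0, size=21):
    """Real (even) Gabor kernel: a Gaussian-windowed cosine grating.
    theta = 0 keeps the grating vertical, matching vertical dark bands."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()  # zero-mean: flat regions give no response

def filter2d(img, kern):
    """Direct 2-D correlation with edge-replicated borders."""
    k = kern.shape[0]
    r = k // 2
    p = np.pad(img, r, mode="edge")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += kern[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out
```

A dark vertical stripe on a bright background produces a strong (negative) response at the stripe and none in flat regions, so thresholding the response magnitude isolates band candidates.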

  11. Speckle imaging techniques of the turbulence degraded images

    NASA Astrophysics Data System (ADS)

    Liu, Jin; Huang, Zongfu; Mao, Hongjun; Liang, Yonghui

    2018-03-01

    We propose a speckle imaging algorithm in which an improved form of the spectral ratio is used to obtain the Fried parameter, and a filter is applied to reduce high-frequency noise effects. Our algorithm improves the quality of the reconstructed images; its performance is illustrated by computer simulations.

  12. Coronagraph Focal-Plane Phase Masks Based on Photonic Crystal Technology: Recent Progress and Observational Strategy

    NASA Technical Reports Server (NTRS)

    Murakami, Naoshi; Nishikawa, Jun; Sakamoto, Moritsugu; Ise, Akitoshi; Oka, Kazuhiko; Baba, Naoshi; Murakami, Hiroshi; Tamura, Motohide; Traub, Wesley A.; Mawet, Dimitri

    2012-01-01

    Photonic crystal, an artificial periodic nanostructure of refractive indices, is one of the attractive technologies for coronagraph focal-plane masks aiming at direct imaging and characterization of terrestrial extrasolar planets. We manufactured the eight-octant phase mask (8OPM) and the vector vortex mask (VVM) very precisely using the photonic crystal technology. Fully achromatic phase-mask coronagraphs can be realized by applying appropriate polarization filters to the masks. We carried out laboratory experiments of the polarization-filtered 8OPM coronagraph using the High-Contrast Imaging Testbed (HCIT), a state-of-the-art coronagraph simulator at the Jet Propulsion Laboratory (JPL). We report experimental results of 10⁻⁸-level contrast across several wavelengths over a 10% bandwidth around 800 nm. In addition, we present future prospects and an observational strategy for the photonic-crystal mask coronagraphs combined with differential imaging techniques to reach higher contrast. We proposed to apply a polarization-differential imaging (PDI) technique to the VVM coronagraph, in which we built a two-channel coronagraph using polarizing beam splitters to avoid a loss of intensity due to the polarization filters. We also proposed to apply an angular-differential imaging (ADI) technique to the 8OPM coronagraph. The 8OPM/ADI mode avoids an intensity loss due to a phase transition of the mask and provides a full field of view around central stars. We present results of preliminary laboratory demonstrations of the PDI and ADI observational modes with the phase-mask coronagraphs.

  13. Underwater 3D Modeling: Image Enhancement and Point Cloud Filtering

    NASA Astrophysics Data System (ADS)

    Sarakinou, I.; Papadimitriou, K.; Georgoula, O.; Patias, P.

    2016-06-01

    This paper examines the results of image enhancement and point cloud filtering on the visual and geometric quality of 3D models for the representation of underwater features. Specifically, it evaluates the combined effects of manually editing the images' radiometry (captured at shallow depths) and selecting parameters for point cloud definition and mesh building (processed in 3D modeling software). Such datasets are usually collected by divers, handled by scientists and used for geovisualization purposes. In the presented study, 3D models have been created from three sets of images (seafloor, part of a wreck and a small boat's wreck) captured at three different depths (3.5 m, 10 m and 14 m, respectively). Four models have been created from the first dataset (seafloor) in order to evaluate the results of applying image enhancement techniques and point cloud filtering. The main process for this preliminary study included a) the definition of parameters for point cloud filtering and the creation of a reference model, b) the radiometric editing of images, followed by the creation of three improved models, and c) the assessment of results by comparing the visual and geometric quality of the improved models against the reference one. Finally, the selected technique is tested on two other datasets in order to examine its appropriateness for different depths (at 10 m and 14 m) and different objects (part of a wreck and a small boat's wreck) in the context of ongoing research in the Laboratory of Photogrammetry and Remote Sensing.

  14. Design of coupled MACE filters for optical pattern recognition using practical spatial light modulators

    NASA Technical Reports Server (NTRS)

    Rajan, P. K.; Khan, Ajmal

    1993-01-01

    Spatial light modulators (SLMs) are being used in correlation-based optical pattern recognition systems to implement the Fourier domain filters. Currently available SLMs have certain limitations with respect to the realizability of these filters. Therefore, it is necessary to incorporate the SLM constraints in the design of the filters. The design of an SLM-constrained minimum average correlation energy (SLM-MACE) filter using the simulated annealing-based optimization technique was investigated. The SLM-MACE filter was synthesized for three different types of constraints. The performance of the filter was evaluated in terms of its recognition (discrimination) capabilities using computer simulations. The correlation plane characteristics of the SLM-MACE filter were found to be reasonably good. The SLM-MACE filter yielded far better results than the analytical MACE filter implemented on practical SLMs using the constrained magnitude technique. Further, the filter performance was evaluated in the presence of noise in the input test images. This work demonstrated the need to include the SLM constraints in the filter design. Finally, a method is suggested to reduce the computation time required for the synthesis of the SLM-MACE filter.

  15. Fingerprint image enhancement by differential hysteresis processing.

    PubMed

    Blotta, Eduardo; Moler, Emilce

    2004-05-10

    A new method to enhance defective fingerprint images through digital image processing tools is presented in this work. When fingerprints have been taken without care, blurred and in some cases mostly illegible, as in the case presented here, their classification and comparison become nearly impossible. A combination of spatial-domain filters, including a technique called differential hysteresis processing (DHP), is applied to improve these kinds of images. This set of filtering methods proved satisfactory in a wide range of cases by uncovering hidden details that helped to identify persons. Dactyloscopy experts from Policia Federal Argentina and the EAAF have validated these results.

  16. An unsupervised technique for optimal feature selection in attribute profiles for spectral-spatial classification of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Bhardwaj, Kaushal; Patra, Swarnajyoti

    2018-04-01

    Inclusion of spatial information along with spectral features plays a significant role in the classification of remote sensing images. Attribute profiles have already proved their ability to represent spatial information. In order to incorporate proper spatial information, multiple attributes are required, and for each attribute large profiles need to be constructed by varying the filter parameter values within a wide range. Thus, the constructed profiles that represent the spectral-spatial information of a hyperspectral image have huge dimension, which leads to the Hughes phenomenon and increases the computational burden. To mitigate these problems, this work presents an unsupervised feature selection technique that selects, from the constructed high-dimensional multi-attribute profile, a subset of filtered images that is sufficiently informative to discriminate well among classes. To this end, the proposed technique exploits genetic algorithms (GAs). The fitness function of the GA is defined in an unsupervised way with the help of mutual information. The effectiveness of the proposed technique is assessed using a one-against-all support vector machine classifier. Experiments conducted on three hyperspectral data sets show the robustness of the proposed method in terms of computation time and classification accuracy.
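    A toy version of the idea, a GA over binary feature masks with an unsupervised, mutual-information-based fitness, can be sketched as follows. The fitness used here (reward per-feature entropy, penalize pairwise redundancy) is an illustrative stand-in, not the paper's exact formulation:

```python
import numpy as np

def entropy(x, bins=16):
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def mutual_info(x, y, bins=16):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return (pxy[nz] * np.log2(pxy[nz] / np.outer(px, py)[nz])).sum()

def ga_select(features, n_keep=2, pop=20, gens=20, seed=0):
    """Tiny GA: each chromosome is a binary mask over the filtered images;
    fitness rewards informative (high-entropy) features and penalizes
    redundant (high mutual information) pairs."""
    rng = np.random.default_rng(seed)
    n = len(features)
    def fitness(mask):
        idx = np.flatnonzero(mask)
        if len(idx) != n_keep:
            return -1e9  # invalid chromosome
        h = sum(entropy(features[i]) for i in idx)
        red = sum(mutual_info(features[i], features[j])
                  for a, i in enumerate(idx) for j in idx[a + 1:])
        return h - red
    popu = [rng.permutation(np.array([1] * n_keep + [0] * (n - n_keep)))
            for _ in range(pop)]
    for _ in range(gens):
        popu.sort(key=fitness, reverse=True)
        popu = popu[:pop // 2]                               # elitist selection
        while len(popu) < pop:
            a = popu[rng.integers(pop // 2)]
            b = popu[rng.integers(pop // 2)]
            child = np.where(rng.random(n) < 0.5, a, b)      # uniform crossover
            child[rng.integers(n)] ^= 1                      # bit-flip mutation
            popu.append(child)
    return np.flatnonzero(max(popu, key=fitness))
```

Given two nearly identical features and two independent ones, the selected pair avoids the redundant duo, which is the behavior an unsupervised MI criterion is meant to enforce.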

  17. Morphology filter bank for extracting nodular and linear patterns in medical images.

    PubMed

    Hashimoto, Ryutaro; Uchiyama, Yoshikazu; Uchimura, Keiichi; Koutaki, Gou; Inoue, Tomoki

    2017-04-01

    Using image processing to extract nodular or linear shadows is a key technique of computer-aided diagnosis schemes. This study proposes a new method for extracting nodular and linear patterns of various sizes in medical images. We have developed a morphology filter bank that creates multiresolution representations of an image. The analysis bank of this filter bank produces nodular and linear patterns at each resolution level; the synthesis bank can then be used to perfectly reconstruct the original image from these decomposed patterns. Our proposed method shows better performance in a quantitative evaluation using a synthesized image than a conventional method based on the Hessian matrix, which is often used to enhance nodular and linear patterns. In addition, experiments show that our method can be applied to the following: (1) microcalcifications of various sizes in mammograms can be extracted, (2) blood vessels of various sizes in retinal fundus images can be extracted, and (3) thoracic CT images can be reconstructed while removing normal vessels. Our proposed method is useful for extracting nodular and linear shadows or removing normal structures in medical images.

  18. A comparison of line enhancement techniques: applications to guide-wire detection and respiratory motion tracking

    NASA Astrophysics Data System (ADS)

    Bismuth, Vincent; Vancamberg, Laurence; Gorges, Sébastien

    2009-02-01

    During interventional radiology procedures, guide-wires are usually inserted into the patient's vascular tree for diagnostic or therapeutic purposes. These procedures are monitored with an X-ray interventional system providing images of the interventional devices navigating through the patient's body. The automatic detection of such tools by image processing means has gained maturity over the past years and enables applications ranging from image enhancement to multimodal image fusion. Sophisticated detection methods are emerging, which rely on a variety of device enhancement techniques. In this article we reviewed and classified these techniques into three families. We chose a state-of-the-art approach in each of them and built a rigorous framework to compare their detection capability and their computational complexity. Through simulations and the intensive use of ROC curves, we demonstrated that the Hessian-based methods are the most robust to strong curvature of the devices and that the family of rotated-filter techniques is the best suited for detecting low-CNR, low-curvature devices. The steerable filter approach demonstrated less interesting detection capabilities and appears to be the most expensive to compute. Finally, we demonstrated the value of automatic guide-wire detection on a clinical topic: the compensation of respiratory motion in multimodal image fusion.
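    The Hessian family that the comparison favors for curved devices responds to the eigenvalues of the per-pixel second-derivative matrix: a dark line on a bright background yields one strongly positive eigenvalue. A simplified single-scale sketch, not any of the reviewed methods verbatim:

```python
import numpy as np

def gaussian_smooth(img, sigma):
    """Separable Gaussian smoothing (truncated at 3 sigma)."""
    r = int(3 * sigma)
    t = np.arange(-r, r + 1)
    g = np.exp(-t**2 / (2 * sigma**2))
    g /= g.sum()
    tmp = np.apply_along_axis(lambda m: np.convolve(m, g, mode="same"), 0, img)
    return np.apply_along_axis(lambda m: np.convolve(m, g, mode="same"), 1, tmp)

def line_response(img, sigma=2.0):
    """Largest positive Hessian eigenvalue of the smoothed image:
    high where a dark, roughly 1-D structure (e.g. a guide-wire) lies."""
    s = gaussian_smooth(np.asarray(img, float), sigma)
    gy, gx = np.gradient(s)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    tr = gxx + gyy                       # Hessian trace
    det = gxx * gyy - gxy * gxy          # Hessian determinant
    disc = np.sqrt(np.maximum(tr * tr / 4.0 - det, 0.0))
    l1 = tr / 2.0 + disc                 # larger eigenvalue
    return np.maximum(l1, 0.0)
```

The eigenvalue test is rotation-invariant, which is why Hessian methods tolerate strong device curvature better than fixed-orientation filter banks.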

  19. Application of the EM algorithm to radiographic images.

    PubMed

    Brailean, J C; Little, D; Giger, M L; Chen, C T; Sullivan, B J

    1992-01-01

    The expectation maximization (EM) algorithm has received considerable attention in the area of positron emission tomography (PET) as a restoration and reconstruction technique. In this paper, the restoration capabilities of the EM algorithm applied to radiographic images are investigated; this application does not involve reconstruction. The performance of the EM algorithm is quantitatively evaluated using a "perceived" signal-to-noise ratio (SNR) as the image quality metric. This perceived SNR is based on statistical decision theory and includes both the observer's visual response function and a noise component internal to the eye-brain system. For a variety of processing parameters, the relative SNR (the ratio of the processed SNR to the original SNR) is calculated and used as a metric to compare quantitatively the effects of the EM algorithm with two other image enhancement techniques: global contrast enhancement (windowing) and unsharp mask filtering. The results suggest that the EM algorithm's performance is superior to unsharp mask filtering and global contrast enhancement for radiographic images that contain objects smaller than 4 mm.
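    For image restoration under a Poisson noise model, the EM iteration takes the familiar Richardson-Lucy multiplicative form. A minimal sketch; the PSF is assumed symmetric so that H^T equals H, and the PSF and iteration count are illustrative:

```python
import numpy as np

def convolve2d_same(img, psf):
    """Direct 'same'-size 2-D convolution with edge-replicated borders."""
    kh, kw = psf.shape
    p = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.zeros(img.shape)
    for dy in range(kh):
        for dx in range(kw):
            out += psf[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def richardson_lucy(blurred, psf, iters=30):
    """EM (Richardson-Lucy) restoration for Poisson statistics:
    multiplicative updates x <- x * H^T(y / (H x))."""
    x = np.full(blurred.shape, blurred.mean())
    for _ in range(iters):
        est = convolve2d_same(x, psf)
        x = x * convolve2d_same(blurred / np.maximum(est, 1e-12), psf)
    return x
```

On a noiseless point source blurred by a box PSF, the iterations progressively re-concentrate the energy, so the restored peak rises well above the blurred one.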

  20. Directional analysis and filtering for dust storm detection in NOAA-AVHRR imagery

    NASA Astrophysics Data System (ADS)

    Janugani, S.; Jayaram, V.; Cabrera, S. D.; Rosiles, J. G.; Gill, T. E.; Rivera Rivera, N.

    2009-05-01

    In this paper, we propose spatio-spectral processing techniques for detecting dust storms and automatically finding their transport direction in 5-band NOAA-AVHRR imagery. Previous methods that use simple band-math analysis have produced promising results but fail to produce consistent results on images with a low signal-to-noise ratio (SNR). Moreover, in seeking to automate dust storm detection, the presence of clouds in the vicinity of the dust storm makes it challenging to distinguish these two types of image texture. This paper not only addresses the detection of the dust storm in the imagery; it also attempts to find the transport direction and the location of the sources of the dust storm. We propose a spatio-spectral processing approach with two components, visualization and automation, both based on digital image processing techniques including directional analysis and filtering. The visualization technique is intended to enhance the image in order to locate the dust sources; the automation technique is proposed to detect the transport direction of the dust storm. These techniques can be used in a system to provide timely warnings of dust storms or hazard assessments for transportation, aviation, environmental safety, and public health.

  1. SART-Type Half-Threshold Filtering Approach for CT Reconstruction

    PubMed Central

    YU, HENGYONG; WANG, GE

    2014-01-01

    The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1∕2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define sparsity. However, the DGT is noninvertible and cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the state-of-the-art soft-threshold and hard-threshold filtering counterparts. PMID:25530928

  2. SART-Type Half-Threshold Filtering Approach for CT Reconstruction.

    PubMed

    Yu, Hengyong; Wang, Ge

    2014-01-01

    The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1∕2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define sparsity. However, the DGT is noninvertible and cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the state-of-the-art soft-threshold and hard-threshold filtering counterparts.
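
    The analytic half-thresholding operation due to Xu et al., cited above, has a closed form; a minimal numpy sketch of the operator alone, which a SART-type loop would apply to the sparsifying-transform coefficients after each algebraic update (the formula below is quoted from the ℓ1∕2-regularization literature as an assumption, not from this paper's text):

```python
import numpy as np

def half_threshold(x, lam):
    """Closed-form half-threshold operator for l_{1/2} regularization (Xu et al.)."""
    x = np.asarray(x, dtype=float)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    # phase angle; abs(x) is guarded so the expression is defined everywhere
    phi = np.arccos(np.clip(
        (lam / 8.0) * (np.maximum(np.abs(x), 1e-12) / 3.0) ** (-1.5), -1.0, 1.0))
    shrunk = (2.0 / 3.0) * x * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return np.where(np.abs(x) > thresh, shrunk, 0.0)
```

    Like soft thresholding it zeroes small coefficients, but large coefficients are shrunk less, which is the source of the stronger sparsity promotion.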

  3. Segmentation of dermatoscopic images by frequency domain filtering and k-means clustering algorithms.

    PubMed

    Rajab, Maher I

    2011-11-01

    Since the introduction of epiluminescence microscopy (ELM), image analysis tools have been extended to the field of dermatology, in an attempt to algorithmically reproduce clinical evaluation. Accurate image segmentation of skin lesions is one of the key steps for useful, early and non-invasive diagnosis of cutaneous melanomas. This paper proposes two image segmentation algorithms based on frequency domain processing and k-means clustering/fuzzy k-means clustering. The two methods are capable of segmenting and extracting the true border that reveals the global structure irregularity (indentations and protrusions), which may suggest excessive cell growth or regression of a melanoma. As a pre-processing step, Fourier low-pass filtering is applied to reduce the surrounding noise in a skin lesion image. A quantitative comparison of the techniques is enabled by the use of synthetic skin lesion images that model lesions covered with hair to which Gaussian noise is added. The proposed techniques are also compared with an established optimal-based thresholding skin-segmentation method. It is demonstrated that for lesions with a range of different border irregularity properties, the k-means clustering and fuzzy k-means clustering segmentation methods provide the best performance over a range of signal-to-noise ratios. The proposed segmentation techniques are also demonstrated to have similar performance when tested on real skin lesions representing high-resolution ELM images. This study suggests that the segmentation results obtained using a combination of low-pass frequency filtering and k-means or fuzzy k-means clustering are superior to the results that would be obtained by using k-means or fuzzy k-means clustering segmentation methods alone. © 2011 John Wiley & Sons A/S.
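
    The two stages described above can be sketched in a few lines: a circular low-pass mask in the Fourier domain to suppress hair and noise, followed by plain Lloyd k-means on pixel intensities (a toy illustration of the hard-clustering variant only; the fuzzy version and the border extraction are omitted, and the cutoff is an assumed parameter):

```python
import numpy as np

def fourier_lowpass(img, cutoff_frac=0.1):
    """Zero out spatial frequencies beyond a circular cutoff."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    F[r > cutoff_frac * min(h, w)] = 0.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

def kmeans_segment(img, k=2, n_iter=20):
    """Lloyd's k-means on intensities; returns a label image (0..k-1)."""
    vals = img.ravel()
    centers = np.linspace(vals.min(), vals.max(), k)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(vals[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = vals[labels == j].mean()
    return labels.reshape(img.shape)
```

    On a lesion-versus-skin image, k=2 recovers a foreground/background partition whose boundary approximates the lesion border.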

  4. Optimal exposure techniques for iodinated contrast enhanced breast CT

    NASA Astrophysics Data System (ADS)

    Glick, Stephen J.; Makeev, Andrey

    2016-03-01

    Screening for breast cancer using mammography has been very successful in the effort to reduce breast cancer mortality, and its use has largely driven the 30% reduction in breast cancer mortality observed since 1990 [1]. However, diagnostic mammography remains an area of breast imaging in great need of improvement. One imaging modality proposed for improving the accuracy of diagnostic workup is iodinated contrast-enhanced breast CT [2]. In this study, a mathematical framework is used to evaluate optimal exposure techniques for contrast-enhanced breast CT. The ideal observer signal-to-noise ratio (i.e., d') figure of merit is used to provide a task-performance-based assessment of optimal acquisition parameters under the assumptions of a linear, shift-invariant imaging system. A parallel-cascade model was used to estimate signal and noise propagation through the detector, and a realistic lesion model with iodine uptake was embedded in a structured breast background. Ideal observer performance was investigated across kVp settings, filter materials, and filter thicknesses. The results indicate that many kVp spectrum/filter combinations can improve performance over currently used x-ray spectra.
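
    For a known signal in stationary Gaussian noise, the ideal-observer figure of merit reduces to summing signal power over the noise power spectrum. A hedged sketch of just this figure of merit (unnormalized-DFT convention is an assumption here; the paper's parallel-cascade detector model and structured-background statistics are not reproduced):

```python
import numpy as np

def ideal_observer_dprime(expected_signal, nps):
    """d'^2 = sum over frequencies of |S(u,v)|^2 / NPS(u,v)."""
    S = np.fft.fft2(expected_signal)
    return float(np.sqrt(np.sum(np.abs(S) ** 2 / nps)))
```

    For white noise of pixel variance sigma^2, the matching unnormalized-DFT NPS is sigma^2 times the pixel count, and d' reduces to the familiar signal norm over sigma.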

  5. Feasibility study consisting of a review of contour generation methods from stereograms

    NASA Technical Reports Server (NTRS)

    Kim, C. J.; Wyant, J. C.

    1980-01-01

    A review of techniques for obtaining contour information from stereo pairs is given. Photogrammetric principles, including a description of stereoscopic vision, are presented. The use of conventional contour generation methods, such as the photogrammetric plotting technique, the electronic correlator, and the digital correlator, is described. Coherent optical techniques for contour generation are discussed and compared to the electronic correlator. The optical techniques are divided into two categories: (1) image plane operation and (2) frequency plane operation. The image plane correlators are further divided into three categories: (1) image-to-image correlators, (2) interferometric correlators, and (3) positive-negative transparencies. The frequency plane correlators are divided into two categories: (1) correlation of Fourier transforms and (2) filtering techniques.

  6. Comparisons of linear and nonlinear pyramid schemes for signal and image processing

    NASA Astrophysics Data System (ADS)

    Morales, Aldo W.; Ko, Sung-Jea

    1997-04-01

    Linear filter banks are used extensively in image and video applications, and new research results on wavelet applications for compression and de-noising appear constantly in the technical literature. Non-linear filter banks, on the other hand, are also used regularly in image pyramid algorithms, and they have some inherent advantages over linear filters when non-Gaussian processes are present in images. However, a consistent way of comparing performance criteria between these two schemes has not yet been fully developed. In this paper a recently discovered tool, sample selection probabilities, is used to compare the behavior of linear and non-linear filters. The conversion from the weights of order statistics (OS) filters to the coefficients of an impulse response is obtained through these probabilities. However, the reverse problem, the conversion from the coefficients of an impulse response to the weights of OS filters, is not yet fully understood. One of the reasons for this difficulty is the highly non-linear nature of the partitions and generating function used. In the present paper the problem is posed as an integer linear programming optimization subject to constraints obtained directly from the coefficients of the impulse response. Although the technique presented is not completely refined, it certainly appears promising. Some results are shown.
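
    The contrast drawn above between linear filter banks and OS filters is easiest to see on impulsive (non-Gaussian) data: a length-3 mean filter spreads an outlier, while the median, the simplest OS filter, removes it. A minimal sketch (illustrative only; the paper's sample-selection-probability machinery is not reproduced):

```python
import numpy as np

def _windows(x, n=3):
    return np.lib.stride_tricks.sliding_window_view(x, n)

def mean_filter(x, n=3):           # linear FIR filter with equal weights
    return _windows(x, n).mean(axis=1)

def median_filter(x, n=3):         # an order-statistics (OS) filter
    return np.median(_windows(x, n), axis=1)
```

    On Gaussian data the two behave similarly; the difference appears precisely when heavy-tailed noise is present.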

  7. Effects of pupil filter patterns in line-scan focal modulation microscopy

    NASA Astrophysics Data System (ADS)

    Shen, Shuhao; Pant, Shilpa; Chen, Rui; Chen, Nanguang

    2018-03-01

    Line-scan focal modulation microscopy (LSFMM) is an emerging imaging technique that affords high imaging speed and good optical sectioning at the same time. We present a systematic investigation into the optimal design of the pupil filter for LSFMM, aiming at the best performance in terms of spatial resolution, optical sectioning, and modulation depth. Scalar diffraction theory was used to compute light propagation and distribution in the system and to generate theoretical predictions of system performance, which were then compared with experimental results.

  8. Wavelet Filter Banks for Super-Resolution SAR Imaging

    NASA Technical Reports Server (NTRS)

    Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess

    2011-01-01

    This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super-resolution Synthetic Aperture Radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields, such as deformation, ecosystem structure, the dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric nature of these methods, their resolution limitations, and their dependence on observation time, the use of spectral estimation and wavelet-based signal pre- and post-processing techniques to process SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.

  9. Single line-of-sight dual energy backlighter for mix width experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, K. L., E-mail: baker7@llnl.gov; Glendinning, S. G.; Martinez, D.

    2014-11-15

    We present a diagnostic technique used to spatially multiplex two x-ray radiographs of an object onto a detector along a single line-of-sight. This technique uses a thin, <2 μm, cosputtered backlighter target to simultaneously produce both Ni and Zn He-α emission. A Ni picket fence filter, with 500 μm wide bars and troughs, is then placed in front of the detector to pass only the Ni He-α emission in the bar region and both energies in the trough region, thereby spatially multiplexing the two radiographs on a single image. Initial experimental results testing the backlighter spectrum are presented along with simulated images showing the calculated radiographic images through the nickel picket fence filter, which are used to measure the mix width in an accelerated nickel foam.

  10. Using focused plenoptic cameras for rich image capture.

    PubMed

    Georgiev, T; Lumsdaine, A; Chunev, G

    2011-01-01

    This approach uses a focused plenoptic camera to capture the plenoptic function's rich "non-3D" structure. It employs two techniques: the first simultaneously captures multiple exposures (or other aspects) based on a microlens array with an interleaved set of different filters; the second places multiple filters at the main lens aperture.

  11. Detection of micro gap weld joint by using magneto-optical imaging and Kalman filtering compensated with RBF neural network

    NASA Astrophysics Data System (ADS)

    Gao, Xiangdong; Chen, Yuquan; You, Deyong; Xiao, Zhenlin; Chen, Xiaohui

    2017-02-01

    An approach for seam tracking of a micro gap weld whose width is less than 0.1 mm, based on the magneto-optical (MO) imaging technique during butt-joint laser welding of steel plates, is investigated. Kalman filtering (KF) with a radial basis function (RBF) neural network was applied to MO-sensor weld detection to track the weld center position. Because the process noise of the laser welding system and the measurement noise of the MO sensor are colored, and because the system model contains extreme nonlinearities that cannot be captured by a linear state-space model, the estimation accuracy of a traditional KF for seam tracking is degraded. Moreover, the noise statistics cannot be obtained accurately during actual welding. Thus, an RBF neural network was combined with the KF to compensate for the weld tracking errors. The neural network restrains filter divergence and improves the robustness of the system. Compared with the traditional KF algorithm, the RBF-compensated KF not only improved the weld tracking accuracy more effectively but also reduced noise disturbance. Experimental results showed that the magneto-optical imaging technique can detect a micro gap weld accurately, which provides a novel approach for micro gap seam tracking.
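
    The baseline tracker that the RBF network compensates is a standard Kalman filter; a minimal constant-velocity sketch for a one-dimensional weld-center position (the matrices and noise levels are illustrative assumptions, and the RBF compensation itself is omitted):

```python
import numpy as np

def kalman_track(measurements, q=1e-4, r=0.04):
    """Constant-velocity Kalman filter for a 1-D weld-center position."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])               # only position is measured
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    out = []
    for z in measurements:
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)  # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0, 0])
    return np.array(out)
```

    Colored noise violates the white-noise assumptions baked into Q and R above, which is exactly the gap the paper's RBF compensation targets.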

  12. A Comprehensive Motion Estimation Technique for the Improvement of EIS Methods Based on the SURF Algorithm and Kalman Filter.

    PubMed

    Cheng, Xuemin; Hao, Qun; Xie, Mengdi

    2016-04-07

    Video stabilization is an important technology for removing undesired motion in videos. This paper presents a comprehensive motion estimation method for electronic image stabilization (EIS) techniques, integrating the speeded-up robust features (SURF) algorithm, modified random sample consensus (RANSAC), and the Kalman filter, while taking camera scaling and conventional camera translation and rotation into full consideration. Using SURF in sub-pixel space, feature points were located and then matched. The falsely matched points were removed by modified RANSAC. Global motion was estimated by using the feature points and modified cascading parameters, which reduced the accumulated errors in a series of frames and improved the peak signal-to-noise ratio (PSNR) by 8.2 dB. A specific Kalman filter model was established by considering the movement and scaling of scenes. Finally, video stabilization was achieved with the filtered motion parameters using modified adjacent-frame compensation. The experimental results proved that the target images were stabilized even when the vibration amplitudes of the video became increasingly large.
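
    The RANSAC stage that rejects false SURF matches can be sketched for the simplest motion model, a pure translation (illustrative parameters and a hypothetical helper name; the paper's modified RANSAC also handles rotation and scaling):

```python
import numpy as np

def ransac_translation(src, dst, n_iter=100, tol=2.0, rng=None):
    """Estimate a 2-D translation from matched points, rejecting outliers."""
    if rng is None:
        rng = np.random.default_rng(0)
    best_t, best_inliers = np.zeros(2), 0
    for _ in range(n_iter):
        i = rng.integers(len(src))            # minimal sample: one match
        t = dst[i] - src[i]
        inliers = np.sum(np.linalg.norm(dst - (src + t), axis=1) < tol)
        if inliers > best_inliers:
            best_t, best_inliers = t, inliers
    mask = np.linalg.norm(dst - (src + best_t), axis=1) < tol
    return (dst[mask] - src[mask]).mean(axis=0)   # refit on the inlier set
```

    The final refit on the consensus set is what makes the estimate insensitive to the mismatched feature pairs.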

  13. Real-time blind deconvolution of retinal images in adaptive optics scanning laser ophthalmoscopy

    NASA Astrophysics Data System (ADS)

    Li, Hao; Lu, Jing; Shi, Guohua; Zhang, Yudong

    2011-06-01

    With the use of adaptive optics (AO), ocular aberrations can be compensated to obtain high-resolution images of the living human retina. However, the wavefront correction is not perfect, due to wavefront measurement error and hardware restrictions, so it is necessary to use a deconvolution algorithm to recover the retinal images. In this paper, a blind deconvolution technique called the incremental Wiener filter is used to restore adaptive optics confocal scanning laser ophthalmoscope (AOSLO) images. The point-spread function (PSF) measured by the wavefront sensor is only used as an initial value for our algorithm. We also implement the incremental Wiener filter on a graphics processing unit (GPU) in real time: when the image size is 512 × 480 pixels, six iterations of our algorithm take only about 10 ms. Retinal blood vessels as well as cells in retinal images are restored by our algorithm, and the PSFs are also revised. Retinal images with and without adaptive optics are both restored. The results show that the incremental Wiener filter reduces the noise and improves the image quality.
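
    The classical frequency-domain Wiener filter underlying the incremental variant takes only a few lines; a minimal baseline sketch (a circular blur model and a scalar noise-to-signal ratio are simplifying assumptions, and the paper's PSF refinement step is omitted):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Wiener restoration: multiply by H* / (|H|^2 + NSR) in the frequency domain."""
    H = np.fft.fft2(psf, s=blurred.shape)   # optical transfer function
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))
```

    The NSR term regularizes frequencies where the transfer function is weak; a blind/incremental scheme additionally updates the PSF between such restorations.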

  14. Demonstration of a single-wavelength spectral-imaging-based Thai jasmine rice identification

    NASA Astrophysics Data System (ADS)

    Suwansukho, Kajpanya; Sumriddetchkajorn, Sarun; Buranasiri, Prathan

    2011-07-01

    A single-wavelength spectral-imaging-based Thai jasmine rice breed identification is demonstrated. Our nondestructive identification approach relies on a combination of fluorescence imaging and simple image processing techniques: we apply simple image thresholding, blob filtering, and image subtraction to either a 545 nm or a 575 nm image in order to distinguish our desired Thai jasmine rice breed from others. Other key advantages include no waste product and a fast identification time. In our demonstration, UVC light is used as the excitation light, a liquid crystal tunable optical filter is used as the wavelength selector, and a digital camera with 640 × 480 active pixels is used to capture the desired spectral image. Eight Thai rice breeds of similar size and shape are tested. Our experimental proof of concept shows that by suitably applying image thresholding, blob filtering, and image subtraction to the selected fluorescence image, the Thai jasmine rice breed can be identified with measured false acceptance rates of <22.9% and <25.7% for spectral images at 545 and 575 nm, respectively. The measured identification time of 25 ms shows high potential for real-time applications.
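
    The thresholding and blob-filtering stages can be sketched with scipy.ndimage (the threshold and minimum-area values are illustrative assumptions; the subtraction step would then compare the surviving masks of the two spectral images):

```python
import numpy as np
from scipy import ndimage

def threshold_and_blob_filter(fluor_img, thresh, min_area):
    """Binarize a fluorescence image, then keep only connected blobs >= min_area."""
    mask = fluor_img > thresh
    labels, n = ndimage.label(mask)                       # connected components
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    good = np.flatnonzero(areas >= min_area) + 1          # surviving label ids
    return np.isin(labels, good)
```

    Blob filtering of this kind removes isolated noisy pixels while keeping grain-sized regions intact.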

  15. A Photometric Technique for Determining Fluid Concentration using Consumer-Grade Hardware

    NASA Technical Reports Server (NTRS)

    Leslie, F.; Ramachandran, N.

    1999-01-01

    In support of a separate study to produce an exponential concentration gradient in a magnetic fluid, a noninvasive technique for determining species concentration from off-the-shelf hardware has been developed. The approach uses a backlighted fluid test cell photographed with a commercial digital camcorder. Because the light extinction coefficient is wavelength dependent, tests were conducted to determine the best filter color to use, with some guidance also provided by an absorption spectrophotometer. With the appropriate filter in place, the attenuation of the light passing through the test cell was captured by the camcorder. The digital image was analyzed for intensity using software from Scion Image Corp. downloaded from the Internet. The analysis provides a two-dimensional array of concentration with an average error of 0.0095 ml/ml. This technique is superior to invasive techniques, which require extraction of a sample and thereby disturb the concentration distribution in the test cell. Refinements of this technique using a true monochromatic laser light source are also discussed.
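
    The wavelength-dependent extinction exploited here is the Beer-Lambert law, which converts the measured attenuation directly into concentration; a minimal sketch with generic symbols (the absorptivity and path length are calibration inputs, not values from the paper):

```python
import numpy as np

def concentration_map(I, I0, epsilon, path_len):
    """Beer-Lambert: A = -log10(I/I0) = epsilon * c * L, so c = A / (epsilon * L)."""
    A = -np.log10(np.clip(I / I0, 1e-12, None))   # absorbance per pixel
    return A / (epsilon * path_len)
```

    Applied pixel-by-pixel to the filtered camcorder image against a reference frame, this yields the two-dimensional concentration array described above.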

  16. A blue optical filter for narrow-band imaging in endoscopic capsules

    NASA Astrophysics Data System (ADS)

    Silva, M. F.; Ghaderi, M.; Goncalves, L. M.; de Graaf, G.; Wolffenbuttel, R. F.; Correia, J. H.

    2014-05-01

    This paper presents the design, simulation, fabrication, and characterization of a thin-film Fabry-Perot resonator composed of titanium dioxide (TiO2) and silicon dioxide (SiO2) thin films. The optical filter is developed to be integrated with a light-emitting diode (LED) to enable narrow-band imaging (NBI) in endoscopy. NBI is a high-resolution imaging technique that uses spectrally centered blue light (415 nm) and green light (540 nm) to illuminate the target tissue. The light at 415 nm enhances the imaging of superficial veins due to their hemoglobin absorption, while the light at 540 nm penetrates deeper into the mucosa, thus enhancing imaging of the sub-epithelial vessels. Typically, endoscopes and endoscopic capsules use white light for acquiring images of the gastrointestinal (GI) tract; implementing the NBI technique in endoscopic capsules therefore enhances their capabilities for clinical applications. A commercially available blue LED with a maximum peak intensity at 404 nm and a full width at half maximum (FWHM) of 20 nm is integrated with a narrow-band blue filter as the NBI light source. The thin-film simulations show a maximum spectral transmittance of 36%, centered at 415 nm with a FWHM of 13 nm, for the combined blue-LED and Fabry-Perot resonator system. A custom deposition scheme was developed for the fabrication of the blue optical filter by RF sputtering. Reactive RF sputtering at 200 W, with the argon and oxygen gas flows controlled at a 5:1 ratio, gives the optimum optical conditions for the TiO2 thin films; for the SiO2 thin films, non-reactive RF sputtering at 150 W with an argon gas flow of 15 sccm gives the best optical performance. The TiO2 and SiO2 thin films were fully characterized by an ellipsometer in the wavelength range from 250 nm to 1600 nm. Finally, the optical performance of the blue optical filter is measured and presented.
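
    Multilayer responses like this TiO2/SiO2 Fabry-Perot design are commonly simulated with the characteristic-matrix (transfer-matrix) method; a minimal normal-incidence sketch with generic, lossless indices (an illustration of the method, not the fabricated recipe):

```python
import numpy as np

def stack_transmittance(layers, wavelength, n_in=1.0, n_sub=1.5):
    """Normal-incidence transmittance of a lossless thin-film stack.
    layers: list of (refractive_index, physical_thickness), ambient -> substrate."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength          # phase thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    t = 2.0 * n_in / (n_in * M[0, 0] + n_in * n_sub * M[0, 1]
                      + M[1, 0] + n_sub * M[1, 1])
    return float((n_sub / n_in) * abs(t) ** 2)
```

    Sweeping the wavelength over an alternating high/low-index stack reproduces the narrow pass band that such a filter design targets.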

  17. High-speed railway signal trackside equipment patrol inspection system

    NASA Astrophysics Data System (ADS)

    Wu, Nan

    2018-03-01

    The high-speed railway signal trackside equipment patrol inspection system comprehensively applies TDI (time delay integration), a high-speed and highly responsive CMOS architecture, low-illumination photosensitive techniques, image data compression, machine vision, and other techniques. Installed on a high-speed railway inspection train, it collects, manages and analyzes images of the signal trackside equipment's appearance while the train is running. The system automatically filters the signal trackside equipment images out of a large number of background images, and identifies equipment changes by comparison with the original image data. Combining ledger data and train location information, the system accurately locates the trackside equipment, effectively guiding maintenance.

  18. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (the 100 keV, 50 A prototype negative ion source of the ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  19. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging.

    PubMed

    Meyer, Mathias; Haubenreisser, Holger; Raupach, Rainer; Schmidt, Bernhard; Lietzmann, Florian; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Schad, Lothar R; Schoenberg, Stefan O; Henzler, Thomas

    2015-01-01

    To prospectively evaluate the radiation dose and image quality of a third-generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first-, second-, or third-generation DSCT in an ultra-high-resolution (UHR) temporal bone imaging mode. On the third-generation DSCT system, the tighter focal spot of 0.2 mm(2) removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using a standard filtered back projection or iterative reconstruction (IR) technique for the previous generations of DSCT and a novel IR algorithm for the third-generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third-generation DSCT when compared to the first- or second-generation DSCT systems (all p < 0.05). The total effective dose was 63%/39% lower for the third-generation examination compared to the first- and second-generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third-generation IR algorithm allows for significantly higher image quality while lowering the effective dose compared to the first two generations of DSCTs. • Omitting the z-axis filter allows a reduction in radiation dose of 50% • A smaller focal spot of 0.2 mm(2) significantly improves spatial resolution • Ultra-high-resolution temporal bone CT helps to gain diagnostic information on the middle/inner ear.

  20. An Application of Rotation- and Translation-Invariant Overcomplete Wavelets to the registration of Remotely Sensed Imagery

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Zavorine, Ilya

    1999-01-01

    A wavelet-based image registration approach has previously been proposed by the authors. In that work, wavelet coefficient maxima obtained from an orthogonal wavelet decomposition using Daubechies filters were utilized to register images in a multi-resolution fashion. Tested on several remote sensing datasets, this method gave very encouraging results. Despite the lack of translation invariance of these filters, we showed that when using cross-correlation as a feature matching technique, features larger than twice the size of the filters are correctly registered by using the low-frequency subbands of the Daubechies wavelet decomposition. Nevertheless, the high-frequency subbands are still sensitive to translation effects. In this work, we consider a rotation- and translation-invariant representation developed by E. Simoncelli and integrate it into our image registration scheme. The two types of filters, Daubechies and Simoncelli, are then compared from a registration point of view, utilizing synthetic data as well as data from the Landsat Thematic Mapper (TM) and the NOAA Advanced Very High Resolution Radiometer (AVHRR).
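
    The multi-resolution matching idea can be illustrated with the simplest orthogonal wavelet: correlate the Haar low-pass (LL) subbands of the two images and scale the recovered offset back up. A toy translation-only sketch (the Daubechies/Simoncelli comparison and the full registration pipeline are not reproduced):

```python
import numpy as np

def haar_ll(img):
    """One-level Haar low-pass (LL) subband: 2x2 block averages."""
    h, w = img.shape
    img = img[: h - h % 2, : w - w % 2]
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def coarse_shift(ref, moved):
    """Estimate an even (dy, dx) translation via FFT cross-correlation of LL subbands."""
    a, b = haar_ll(ref), haar_ll(moved)
    corr = np.real(np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy -= corr.shape[0] * (dy > corr.shape[0] // 2)   # unwrap circular shifts
    dx -= corr.shape[1] * (dx > corr.shape[1] // 2)
    return int(2 * dy), int(2 * dx)                   # back to full resolution
```

    Working in the subband quarters the correlation cost; the coarse estimate is then refined at finer levels, which is the essence of the multi-resolution scheme.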

  2. Optical correlation based pose estimation using bipolar phase grayscale amplitude spatial light modulators

    NASA Astrophysics Data System (ADS)

    Outerbridge, Gregory John, II

    Pose estimation techniques have been developed on both optical and digital correlator platforms to aid in the autonomous rendezvous and docking of spacecraft. This research has focused on the optical architecture, which utilizes high-speed bipolar-phase grayscale-amplitude spatial light modulators as the image and correlation filter devices. The optical approach has the primary advantage of optical parallel processing: an extremely fast and efficient way of performing complex correlation calculations. However, the constraints imposed on optically implementable filters make optical-correlator-based pose estimation technically incompatible with the popular weighted composite filter designs successfully used on the digital platform. This research instead employs a much simpler "bank of filters" approach to optical pose estimation that exploits the inherent efficiency of optical correlation devices. A novel logarithmically mapped, optically implementable matched filter combined with a pose search algorithm resulted in sub-degree standard deviations in angular pose estimation error. These filters were extremely simple to generate, requiring no complicated training sets, and gave excellent performance even in the presence of significant background noise. Common edge detection and scaling of the input image was the only image pre-processing necessary for accurate pose detection at all alignment distances of interest.

  3. Active thermography and post-processing image enhancement for recovering of abraded and paint-covered alphanumeric identification marks

    NASA Astrophysics Data System (ADS)

    Montanini, R.; Quattrocchi, A.; Piccolo, S. A.

    2016-09-01

    Alphanumeric marking is a common technique employed in industrial applications for the identification of products. However, the realised mark can deteriorate, either through extensive use or through deliberate deletion (e.g. removal of identification numbers from weapons or vehicles). Many destructive and non-destructive techniques have been tried for recovering the lost data, but they all present several restrictions. In this paper, active infrared thermography has been exploited for the first time to assess its effectiveness in restoring paint-covered and abraded labels made by different manufacturing processes (laser, dot peen, impact, cold press and scribe). Optical excitation of the target surface has been achieved using pulse (PT), lock-in (LT) and step heating (SHT) thermography. The raw infrared images were analysed with a dedicated image processing package originally developed in Matlab™, exploiting several methods, including thermographic signal reconstruction (TSR), guided filtering (GF), block guided filtering (BGF) and logarithmic transformation (LN). Proper processing of the raw infrared images resulted in superior contrast and enhanced readability. In particular, for deeply abraded marks, good outcomes were obtained by applying the logarithmic transformation to raw PT images and block guided filtering to raw phase LT images. With PT and LT it was relatively easy to recover labels covered by paint, with the latter providing better thermal contrast for all the examined targets. Step-heating thermography, by contrast, never led to adequate label identification.

  4. Evaluation of the resolving potency of a novel reconstruction filter on periodontal ligament space with dental cone-beam CT: a quantitative phantom study

    NASA Astrophysics Data System (ADS)

    Houno, Yuuki; Hishikawa, Toshimitsu; Gotoh, Ken-ichi; Naitoh, Munetaka; Ariji, Eiichiro; Kodera, Yoshie

    2014-03-01

    Diagnosis of the alveolar bone condition is important for the treatment planning of periodontal disease. In particular, the determination of the periodontal ligament space is the most important indicator because it represents the periodontal tissue support for tooth retention. However, owing to the image blur of the current cone-beam CT (CBCT) imaging technique, the periodontal ligament space is difficult to visualize. In this study, we developed an original periodontal ligament phantom (PLP) and evaluated the image quality of the simulated periodontal ligament space using a novel reconstruction filter for CBCT that emphasizes high-frequency components. The PLP was composed of two resin blocks of different materials, a bone-equivalent block and a dentine-equivalent block. They were assembled to form a continuously varying space from 0.0 to 1.0 mm that mimics the periodontal ligament space. The PLP was placed in water and imaged using an Alphard-3030 dental cone-beam CT (Asahi Roentgen Industry Co., Ltd.). We then reconstructed the projection data with the novel reconstruction filter. The axial images were compared with conventionally reconstructed images. In the novel filter reconstruction images, a space width of 0.4 mm was reliably detected by pixel-value analysis, compared with 0.6 mm in the conventional images. With our method, the resolving potency of cone-beam CT images was improved.

  5. Prospective implementation of an algorithm for bedside intravascular ultrasound-guided filter placement in critically ill patients.

    PubMed

    Killingsworth, Christopher D; Taylor, Steven M; Patterson, Mark A; Weinberg, Jordan A; McGwin, Gerald; Melton, Sherry M; Reiff, Donald A; Kerby, Jeffrey D; Rue, Loring W; Jordan, William D; Passman, Marc A

    2010-05-01

    Although contrast venography is the standard imaging method for inferior vena cava (IVC) filter insertion, intravascular ultrasound (IVUS) imaging is a safe and effective option that allows for bedside filter placement and is especially advantageous for immobilized critically ill patients by limiting resource use, risk of transportation, and cost. This study reviewed the effectiveness of a prospectively implemented algorithm for IVUS-guided IVC filter placement in this high-risk population. Current evidence-based guidelines were used to create a clinical decision algorithm for IVUS-guided IVC filter placement in critically ill patients. After a defined lead-in phase to allow dissemination of techniques, the algorithm was prospectively implemented on January 1, 2008. Data were collected for 1 year using accepted reporting standards and a quality assurance review was performed based on intent-to-treat at 6, 12, and 18 months. As defined in the prospectively implemented algorithm, 109 patients met criteria for IVUS-directed bedside IVC filter placement. Technical feasibility was 98.1%. Only 2 patients had inadequate IVUS visualization for bedside filter placement and required subsequent placement in the endovascular suite. Technical success, defined as proper deployment in an infrarenal position, was achieved in 104 of the remaining 107 patients (97.2%). The filter was permanent in 21 (19.6%) and retrievable in 86 (80.3%). The single-puncture technique was used in 101 (94.4%), with additional dual access required in 6 (5.6%). Periprocedural complications were rare but included malpositioning requiring retrieval and repositioning in three patients, filter tilt ≥15 degrees in two, and arteriovenous fistula in one. The 30-day mortality rate for the bedside group was 5.5%, with no filter-related deaths. 
Successful placement of IVC filters using IVUS-guided imaging at the bedside in critically ill patients can be established through an evidence-based prospectively implemented algorithm, thereby limiting the need for transport in this high-risk population. Copyright (c) 2010 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  6. Adaptive HIFU noise cancellation for simultaneous therapy and imaging using an integrated HIFU/imaging transducer

    PubMed Central

    Jeong, Jong Seob; Cannata, Jonathan Matthew; Shung, K Kirk

    2010-01-01

    It was previously demonstrated that it is feasible to simultaneously perform ultrasound therapy and imaging of a coagulated lesion during treatment with an integrated transducer that is capable of high intensity focused ultrasound (HIFU) and B-mode ultrasound imaging. It was found that coded excitation and fixed notch filtering upon reception could significantly reduce interference caused by the therapeutic transducer. During HIFU sonication, the imaging signal generated with coded excitation and fixed notch filtering had a range side-lobe level of less than −40 dB, while traditional short-pulse excitation and fixed notch filtering produced a range side-lobe level of −20 dB. The shortcoming is, however, that relatively complicated electronics may be needed to utilize coded excitation in an array imaging system. It is for this reason that in this paper an adaptive noise canceling technique is proposed to improve image quality by minimizing not only the therapeutic interference, but also the remnant side-lobe ‘ripples’ when using the traditional short-pulse excitation. The performance of this technique was verified through simulation and experiments using a prototype integrated HIFU/imaging transducer. Although it is known that the remnant ripples are related to the notch attenuation value of the fixed notch filter, in reality, it is difficult to find the optimal notch attenuation value due to changes in the target or the medium resulting from motion or different acoustic properties, even during a single sonication pulse. In contrast, the proposed adaptive noise canceling technique is capable of optimally minimizing both the therapeutic interference and residual ripples without such constraints. The prototype integrated HIFU/imaging transducer is composed of three rectangular elements. The 6 MHz center element is used for imaging and the outer two identical 4 MHz elements work together to transmit the HIFU beam. 
Two HIFU elements of 14.4 mm × 20.0 mm dimensions could increase the temperature of the soft biological tissue from 55 °C to 71 °C within 60 s. Two types of experiments for simultaneous therapy and imaging were conducted to acquire a single scan-line and B-mode image with an aluminum plate and a slice of porcine muscle, respectively. The B-mode image was obtained using the single element imaging system during HIFU beam transmission. The experimental results proved that the combination of the traditional short-pulse excitation and the adaptive noise canceling method could significantly reduce therapeutic interference and remnant ripples and thus may be a better way to implement real-time simultaneous therapy and imaging. PMID:20224162
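The adaptive noise canceling idea can be illustrated with a standard LMS canceller: a primary channel (the imaging element) carries the echo plus HIFU interference, a reference channel carries the HIFU drive signal, and an adaptive FIR filter learns the path from reference to interference so that subtracting its output leaves the echo. The following is a simplified 1-D numpy sketch with assumed frequencies and amplitudes, not the authors' implementation:

```python
import numpy as np

# Simulated receive line: a 6 MHz imaging echo buried under 4 MHz HIFU leakage.
fs = 100e6
t = np.arange(4000) / fs
echo = np.sin(2*np.pi*6e6*t) * np.exp(-((t - 15e-6)/4e-6)**2)
interference = 0.8*np.sin(2*np.pi*4e6*t + 0.7)   # phase-shifted HIFU leakage
primary = echo + interference                    # what the imaging element sees
reference = np.sin(2*np.pi*4e6*t)                # pick-off of the HIFU drive signal

def lms_cancel(primary, reference, taps=16, mu=0.01):
    """LMS adaptive noise canceller: an FIR filter adapts the reference until it
    matches the interference in the primary channel; the error is the cleaned echo."""
    w = np.zeros(taps)
    out = np.zeros_like(primary)
    for i in range(taps, len(primary)):
        x = reference[i-taps:i][::-1]
        e = primary[i] - w @ x      # error = primary minus estimated interference
        w += 2*mu*e*x               # stochastic-gradient weight update
        out[i] = e
    return out

clean = lms_cancel(primary, reference)
tail = slice(3000, 4000)            # after the echo has passed and the filter converged
print(np.sqrt(np.mean(primary[tail]**2)), np.sqrt(np.mean(clean[tail]**2)))
```

Unlike a fixed notch, the weights keep tracking the interference path, which is why the adaptive canceller needs no hand-tuned notch attenuation value.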

  7. Adaptive HIFU noise cancellation for simultaneous therapy and imaging using an integrated HIFU/imaging transducer.

    PubMed

    Jeong, Jong Seob; Cannata, Jonathan Matthew; Shung, K Kirk

    2010-04-07

    It was previously demonstrated that it is feasible to simultaneously perform ultrasound therapy and imaging of a coagulated lesion during treatment with an integrated transducer that is capable of high intensity focused ultrasound (HIFU) and B-mode ultrasound imaging. It was found that coded excitation and fixed notch filtering upon reception could significantly reduce interference caused by the therapeutic transducer. During HIFU sonication, the imaging signal generated with coded excitation and fixed notch filtering had a range side-lobe level of less than -40 dB, while traditional short-pulse excitation and fixed notch filtering produced a range side-lobe level of -20 dB. The shortcoming is, however, that relatively complicated electronics may be needed to utilize coded excitation in an array imaging system. It is for this reason that in this paper an adaptive noise canceling technique is proposed to improve image quality by minimizing not only the therapeutic interference, but also the remnant side-lobe 'ripples' when using the traditional short-pulse excitation. The performance of this technique was verified through simulation and experiments using a prototype integrated HIFU/imaging transducer. Although it is known that the remnant ripples are related to the notch attenuation value of the fixed notch filter, in reality, it is difficult to find the optimal notch attenuation value due to changes in the target or the medium resulting from motion or different acoustic properties, even during a single sonication pulse. In contrast, the proposed adaptive noise canceling technique is capable of optimally minimizing both the therapeutic interference and residual ripples without such constraints. The prototype integrated HIFU/imaging transducer is composed of three rectangular elements. The 6 MHz center element is used for imaging and the outer two identical 4 MHz elements work together to transmit the HIFU beam. 
Two HIFU elements of 14.4 mm × 20.0 mm dimensions could increase the temperature of the soft biological tissue from 55 °C to 71 °C within 60 s. Two types of experiments for simultaneous therapy and imaging were conducted to acquire a single scan-line and B-mode image with an aluminum plate and a slice of porcine muscle, respectively. The B-mode image was obtained using the single element imaging system during HIFU beam transmission. The experimental results proved that the combination of the traditional short-pulse excitation and the adaptive noise canceling method could significantly reduce therapeutic interference and remnant ripples and thus may be a better way to implement real-time simultaneous therapy and imaging.

  8. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As with other ultrasound images, echocardiographic images contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and proper diagnosis. Various adaptive and anisotropic filters are included in the statistical analysis. Statistical parameters such as Signal-to-Noise Ratio (SNR), Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are calculated for performance measurement. One further consideration is that blurring may occur during speckle noise removal, so the filter should ideally enhance edges while removing noise.
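The three quality metrics named above have standard definitions; a small numpy sketch (grey-level images, assumed 8-bit peak of 255) shows how they are computed for a speckled test image:

```python
import numpy as np

def rmse(ref, img):
    """Root Mean Square Error between a reference and a filtered image."""
    return np.sqrt(np.mean((ref.astype(float) - img.astype(float))**2))

def psnr(ref, img, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB, from the RMSE and the grey-level peak."""
    return 20*np.log10(peak / rmse(ref, img))

def snr(ref, img):
    """Signal-to-Noise Ratio in dB of the degraded/filtered image vs the reference."""
    noise = ref.astype(float) - img.astype(float)
    return 10*np.log10(np.sum(ref.astype(float)**2) / np.sum(noise**2))

# A flat reference degraded by multiplicative speckle (the ultrasound noise model):
rng = np.random.default_rng(0)
ref = np.full((64, 64), 128.0)
speckled = ref * (1 + 0.2*rng.standard_normal(ref.shape))
print(rmse(ref, speckled), psnr(ref, speckled), snr(ref, speckled))
```

A denoising filter is then scored by how much it raises SNR/PSNR (and lowers RMSE) of its output against the same reference.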

  9. Studies of EGRET sources with a novel image restoration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, Hiroyasu; Cohen-Tanugi, Johann; Kamae, Tuneyoshi

    2007-07-12

    We have developed an image restoration technique based on the Richardson-Lucy algorithm optimized for GLAST-LAT image analysis. Our algorithm is original since it utilizes the PSF (point spread function) that is calculated for each event. This is critical for EGRET and GLAST-LAT image analysis since the PSF depends on the energy and angle of incident gamma-rays and varies by more than one order of magnitude. EGRET and GLAST-LAT image analysis also faces Poisson noise due to low photon statistics. Our technique incorporates wavelet filtering to minimize noise effects. We present studies of EGRET sources using this novel image restoration technique for possible identification of extended gamma-ray sources.
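The Richardson-Lucy algorithm the authors build on can be sketched in a few lines: it alternates forward projection through the PSF with a multiplicative correction by the back-projected data/model ratio, which suits Poisson-noise data because the estimate stays non-negative. A minimal 1-D numpy sketch with a single fixed Gaussian PSF (the paper's per-event PSF and wavelet filtering are omitted):

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=200):
    """Richardson-Lucy deconvolution: multiplicative updates that preserve
    positivity, the maximum-likelihood iteration for Poisson-noise images."""
    psf = psf / psf.sum()
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(iterations):
        predicted = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(predicted, 1e-12)
        estimate = estimate * np.convolve(ratio, psf[::-1], mode="same")
    return estimate

# Two point sources blurred by a broad Gaussian PSF:
truth = np.zeros(64); truth[25] = 100.0; truth[35] = 60.0
x = np.arange(-10, 11)
psf = np.exp(-x**2 / (2*4.0**2))
blurred = np.convolve(truth, psf/psf.sum(), mode="same")

restored = richardson_lucy(blurred, psf)
print(restored.argmax(), restored[25], blurred[25])
```

With noisy data the iteration count must be limited (or the data pre-filtered, as the authors do with wavelets), since late iterations amplify noise.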

  10. A Comparison of the Multiscale Retinex With Other Image Enhancement Techniques

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-Ur; Woodell, Glenn A.; Jobson, Daniel J.

    1997-01-01

    The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.
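The retinex family compared above computes, at each pixel, the log ratio of the image to a Gaussian-blurred "surround", averaged over several scales. Below is a minimal single-channel numpy sketch of that core operation (without the colour restoration step of the full MSRCR); the scales and test image are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(sigma):
    x = np.arange(-int(3*sigma), int(3*sigma) + 1)
    k = np.exp(-x**2 / (2*sigma**2))
    return k / k.sum()

def surround(img, sigma):
    """Edge-normalized separable Gaussian blur (the retinex 'surround' function)."""
    k = gaussian_kernel(sigma)
    conv = lambda a, ax: np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), ax, a)
    return conv(conv(img, 0), 1) / conv(conv(np.ones_like(img), 0), 1)

def multiscale_retinex(img, sigmas=(1.5, 3.0, 6.0)):
    """MSR: average of log(image) - log(surround) over several surround scales,
    combining dynamic range compression with local contrast enhancement."""
    img = img.astype(float) + 1.0
    return np.mean([np.log(img) - np.log(surround(img, s)) for s in sigmas], axis=0)

# A bright patch on a uniform background: MSR responds to local contrast and is
# nearly invariant to a global change of illumination level.
img = np.full((64, 64), 80.0)
img[24:40, 24:40] = 160.0
out = multiscale_retinex(img)
out_bright = multiscale_retinex(5*img)
print(out[32, 32] - out[5, 5])          # positive local contrast at the patch
print(np.abs(out - out_bright).max())   # near zero: illumination invariance
```

This log-ratio structure is what gives the MSRCR its simultaneous dynamic range compression and colour constancy in the comparison above.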

  11. SU-G-TeP2-11: Initial Evaluation of a Novel Split-Filter Dual-Energy CT for Use in Radiation Oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, J; Huang, J; Szczykutowicz, T

    2016-06-15

    Purpose: To perform an initial evaluation of a novel split-filter dual-energy CT (DECT) system with the goal of understanding the clinical utility and limitations of the system for radiation therapy. Methods: Several phantoms were imaged using the split-filter DECT technique on the Siemens Edge CT scanner using a range of clinically-relevant doses. The optimum-contrast reconstruction, the mixed reconstruction, and the monoenergetic reconstructions (ranging from 40 keV to 190 keV) were evaluated. Each image was analyzed for CT number accuracy, uniformity, noise, low-contrast visibility (LCV), spatial resolution and geometric distortion. For comparison purposes, all parameters were evaluated on 120 kVp single-energy CT (SECT) scans used for treatment planning, as well as a sequential-scan DECT technique for corresponding doses. Results: For all DECT reconstructions no observable geometric distortion was found. Both the optimum-contrast and mixed images demonstrated slight improvements in LCV and noise when compared to the SECT, and slight reductions in CT number accuracy and spatial resolution. The CT numbers trended as expected for the monoenergetic reconstructions, with CT number accuracy within 50 HU for materials of density <2 g/cm3. Spatial resolution increased with energy, and for monoenergetic reconstructions >70 keV the spatial resolution exceeded that of the SECT. The noise in the monoenergetic reconstructions increased with decreasing energy. Thus, the image uniformity, signal-to-noise ratio and LCV were diminished at lower energies (<70 keV). Applying iterative reconstruction techniques to the low-energy images reduced noise and improved LCV. The signal-to-noise ratio was stable for energies >100 keV. Conclusion: The initial commissioning of the novel split-filter DECT technology demonstrated favorable results for clinical implementation. The mixed reconstruction showed potential as a replacement for the treatment planning SECT. The image parameters for the monoenergetic reconstructions varied appropriately with energy. This work provides an initial understanding of the limitations and potential applications for monoenergetic imaging.

  12. Unenhanced third-generation dual-source chest CT using a tin filter for spectral shaping at 100kVp.

    PubMed

    Haubenreisser, Holger; Meyer, Mathias; Sudarski, Sonja; Allmendinger, Thomas; Schoenberg, Stefan O; Henzler, Thomas

    2015-08-01

    To prospectively investigate image quality and radiation dose of 100kVp spectral shaping chest CT using a dedicated tin filter on a 3rd generation dual-source CT (DSCT) in comparison to standard 100kVp chest CT. Sixty patients referred for a non-contrast chest CT on a 3rd generation DSCT were prospectively included and examined at 100kVp with a dedicated tin filter. These patients were retrospectively matched with patients who were examined on a 2nd generation DSCT at 100kVp without a tin filter. Objective and subjective image quality was assessed in various anatomic regions and radiation dose was compared. Radiation dose was decreased by 90% using the tin filter (3.0 vs. 0.32 mSv). Soft tissue attenuation and image noise was not statistically different for both examination techniques (p>0.05), however image noise was found to be significantly higher in the trachea when using the additional tin filter (p=0.002). SNR was found to be statistically similar in pulmonary tissue, significantly lower when measured in air and significantly higher in the aorta for the scans on the 3rd generation DSCT. Subjective image quality with regard to overall quality and image noise and sharpness was not statistically significantly different (p>0.05). 100kVp spectral shaping chest CT by means of a tube-based tin-filter on a 3rd generation DSCT allows 90% dose reduction when compared to 100kVp chest CT on a 2nd generation DSCT without spectral shaping. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Autonomous space target recognition and tracking approach using star sensors based on a Kalman filter.

    PubMed

    Ye, Tao; Zhou, Fuqiang

    2015-04-10

    When imaged by detectors, space targets (including satellites and debris) and background stars have similar point-spread functions, and both objects appear to change as detectors track targets. Therefore, traditional tracking methods cannot separate targets from stars and cannot directly recognize targets in 2D images. Consequently, we propose an autonomous space target recognition and tracking approach using a star sensor technique and a Kalman filter (KF). A two-step method for subpixel-scale detection of star objects (including stars and targets) is developed, and the combination of the star sensor technique and a KF is used to track targets. The experimental results show that the proposed method is adequate for autonomously recognizing and tracking space targets.
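The tracking half of this approach can be sketched with a textbook constant-velocity Kalman filter that fuses noisy per-frame centroid measurements. All dimensions and noise levels below are assumed for illustration, not taken from the paper:

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-4, r=1.0):
    """Constant-velocity Kalman filter: state [x, y, vx, vy], observing [x, y]."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt             # state transition
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0     # measurement model
    Q, R = q*np.eye(4), r*np.eye(2)                   # process / measurement noise
    x = np.array([*measurements[0], 0.0, 0.0])
    P = 10.0*np.eye(4)
    track = []
    for z in measurements:
        x = F @ x                                     # predict across one frame
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x = x + K @ (np.asarray(z) - H @ x)           # update with the new centroid
        P = (np.eye(4) - K @ H) @ P
        track.append(x.copy())
    return np.array(track)

# Target drifting at (0.8, -0.5) px/frame; centroids corrupted by 1-px noise.
rng = np.random.default_rng(3)
truth = np.stack([10 + 0.8*np.arange(60), 50 - 0.5*np.arange(60)], axis=1)
meas = truth + rng.standard_normal(truth.shape)
track = kalman_track(meas)
print(track[-1, 2:])   # estimated velocity approaches (0.8, -0.5)
```

The velocity state is what separates a moving target from the star background: stars keep a common apparent motion, while the target's filtered velocity differs.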

  14. Nonlinear spatio-temporal filtering of dynamic PET data using a four-dimensional Gaussian filter and expectation-maximization deconvolution

    NASA Astrophysics Data System (ADS)

    Floberg, J. M.; Holden, J. E.

    2013-02-01

    We introduce a method for denoising dynamic PET data, spatio-temporal expectation-maximization (STEM) filtering, that combines four-dimensional Gaussian filtering with EM deconvolution. The initial Gaussian filter suppresses noise at a broad range of spatial and temporal frequencies and EM deconvolution quickly restores the frequencies most important to the signal. We aim to demonstrate that STEM filtering can improve variance in both individual time frames and in parametric images without introducing significant bias. We evaluate STEM filtering with a dynamic phantom study, and with simulated and human dynamic PET studies of a tracer with reversible binding behaviour, [C-11]raclopride, and a tracer with irreversible binding behaviour, [F-18]FDOPA. STEM filtering is compared to a number of established three- and four-dimensional denoising methods. STEM filtering provides substantial improvements in variance in both individual time frames and in parametric images generated with a number of kinetic analysis techniques while introducing little bias. STEM filtering does bias early frames, but this does not affect quantitative parameter estimates. STEM filtering is shown to be superior to the other simple denoising methods studied. STEM filtering is a simple and effective denoising method that could be valuable for a wide range of dynamic PET applications.

  15. Vena Cava Filter Retrieval with Aorto-Iliac Arterial Strut Penetration.

    PubMed

    Holly, Brian P; Gaba, Ron C; Lessne, Mark L; Lewandowski, Robert J; Ryu, Robert K; Desai, Kush R; Sing, Ronald F

    2018-05-03

    To evaluate the safety and technical success of inferior vena cava (IVC) filter retrieval in the setting of aorto-iliac arterial strut penetration. IVC filter registries from six large United States IVC filter retrieval practices were retrospectively reviewed to identify patients who underwent IVC filter retrieval in the setting of filter strut penetration into the adjacent aorta or iliac artery. Patient demographics, implant duration, indication for placement, IVC filter type, retrieval technique and technical success, adverse events, and post procedural clinical outcomes were identified. Arterial penetration was determined based on pre-procedure CT imaging in all cases. The IVC filter retrieval technique used was at the discretion of the operating physician. Seventeen patients from six US centers who underwent retrieval of an IVC filter with at least one strut penetrating either the aorta or iliac artery were identified. Retrieval technical success rate was 100% (17/17), without any major adverse events. Post-retrieval follow-up ranging from 10 days to 2 years (mean 4.6 months) was available in 12/17 (71%) patients; no delayed adverse events were encountered. Findings from this series suggest that chronically indwelling IVC filters with aorto-iliac arterial strut penetration may be safely retrieved.

  16. High performance 3D adaptive filtering for DSP based portable medical imaging systems

    NASA Astrophysics Data System (ADS)

    Bockenbach, Olivier; Ali, Murtaza; Wainwright, Ian; Nadeski, Mark

    2015-03-01

    Portable medical imaging devices have proven valuable for emergency medical services both in the field and hospital environments and are becoming more prevalent in clinical settings where the use of larger imaging machines is impractical. Despite their constraints on power, size and cost, portable imaging devices must still deliver high quality images. 3D adaptive filtering is one of the most advanced techniques aimed at noise reduction and feature enhancement, but is computationally very demanding and hence often cannot be run with sufficient performance on a portable platform. In recent years, advanced multicore digital signal processors (DSP) have been developed that attain high processing performance while maintaining low levels of power dissipation. These processors enable the implementation of complex algorithms on a portable platform. In this study, the performance of a 3D adaptive filtering algorithm on a DSP is investigated. The performance is assessed by filtering a volume of size 512x256x128 voxels sampled at a rate of 10 MVoxels/sec with an Ultrasound 3D probe. Relative performance and power are compared between a reference PC (Quad Core CPU) and a TMS320C6678 DSP from Texas Instruments.

  17. Ghost imaging of phase objects with classical incoherent light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirai, Tomohiro; Setaelae, Tero; Friberg, Ari T.

    2011-10-15

    We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.
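The correlation at the heart of classical ghost imaging can be sketched numerically: the image is recovered from the covariance between a spatially resolved reference arm and a bucket (single-pixel) detector behind the object. This is a minimal 1-D amplitude-object sketch; the paper's phase-contrast variant additionally inserts spatial Fourier filtering so that pure phase objects become visible:

```python
import numpy as np

def ghost_image(patterns, bucket):
    """Classical ghost imaging: the object appears in the covariance between the
    spatially resolved reference patterns and the bucket-detector signal."""
    b = bucket - bucket.mean()
    p = patterns - patterns.mean(axis=0)
    return (p * b[:, None]).mean(axis=0)

# 1-D double-slit transmission object, probed by random classical speckle.
rng = np.random.default_rng(7)
obj = np.zeros(32); obj[10:14] = 1.0; obj[22:26] = 1.0
patterns = rng.random((20000, 32))     # incoherent (pseudo-thermal) realizations
bucket = patterns @ obj                # total light collected behind the object
g = ghost_image(patterns, bucket)
g = g / g.max()
print(np.where(g > 0.5)[0])            # recovers the slit positions 10-13, 22-25
```

Neither detector alone sees the object; only the fluctuation correlation between the two arms does, which is the defining feature of ghost imaging.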

  18. Pseudo color ghost coding imaging with pseudo thermal light

    NASA Astrophysics Data System (ADS)

    Duan, De-yang; Xia, Yun-jie

    2018-04-01

    We present a new pseudo color imaging scheme, named pseudo color ghost coding imaging, based on ghost imaging but with a multiwavelength source modulated by a spatial light modulator. In contrast to conventional pseudo color imaging, where the absence of nondegenerate-wavelength spatial correlations yields only extra monochromatic images, the degenerate-wavelength and nondegenerate-wavelength spatial correlations between the idler beam and signal beam can be obtained simultaneously. This scheme yields a more colorful image of higher quality than conventional pseudo color coding techniques. More importantly, a significant advantage of the scheme over conventional pseudo color coding imaging techniques is that images with different colors can be obtained without changing the light source and spatial filter.

  19. Design, fabrication and testing of hierarchical micro-optical structures and systems

    NASA Astrophysics Data System (ADS)

    Cannistra, Aaron Thomas

    Micro-optical systems are becoming essential components in imaging, sensing, communications, computing, and other applications. Optically based designs are replacing electronic, chemical and mechanical systems for a variety of reasons, including low power consumption, reduced maintenance, and faster operation. However, as the number and variety of applications increases, micro-optical system designs are becoming smaller, more integrated, and more complicated. Micro and nano-optical systems found in nature, such as the imaging systems found in many insects and crustaceans, can have highly integrated optical structures that vary in size by orders of magnitude. These systems incorporate components such as compound lenses, anti-reflective lens surface structuring, spectral filters, and polarization selective elements. For animals, these hybrid optical systems capable of many optical functions in a compact package have been repeatedly selected during the evolutionary process. Understanding the advantages of these designs gives motivation for synthetic optical systems with comparable functionality. However, alternative fabrication methods that deviate from conventional processes are needed to create such systems. Further complicating the issue, the resulting device geometry may not be readily compatible with existing measurement techniques. This dissertation explores several nontraditional fabrication techniques for optical components with hierarchical geometries and measurement techniques to evaluate performance of such components. A micro-transfer molding process is found to produce high-fidelity micro-optical structures and is used to fabricate a spectral filter on a curved surface. By using a custom measurement setup we demonstrate that the spectral filter retains functionality despite the nontraditional geometry. A compound lens is fabricated using similar fabrication techniques and the imaging performance is analyzed. 
A spray coating technique for photoresist application to curved surfaces combined with interference lithography is also investigated. Using this technique, we generate polarizers on curved surfaces and measure their performance. This work furthers an understanding of how combining multiple optical components affects the performance of each component, the final integrated devices, and leads towards realization of biomimetically inspired imaging systems.

  20. Digital cleaning and "dirt" layer visualization of an oil painting.

    PubMed

    Palomero, Cherry May T; Soriano, Maricor N

    2011-10-10

    We demonstrate a new digital cleaning technique which uses a neural network that is trained to learn the transformation from dirty to clean segments of a painting image. The inputs and outputs of the network are pixels belonging to dirty and clean segments found in Fernando Amorsolo's Malacañang by the River. After digital cleaning we visualize the painting's discoloration by assuming it to be a transmission filter superimposed on the clean painting. Using an RGB color-to-spectrum transformation to obtain the point-per-point spectra of the clean and dirty painting images, we calculate this "dirt" filter and render it for the whole image.

  1. Metal Artifact Suppression in Dental Cone Beam Computed Tomography Images Using Image Processing Techniques.

    PubMed

    Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh

    2018-01-01

    Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of the acquired image and in some cases make it unsuitable for use. Streaking artifacts and cavities around teeth are the main causes of degradation. In this article, we have proposed a new artifact reduction algorithm which has three parallel components. The first component extracts teeth based on the modeling of the image histogram with a Gaussian mixture model. The streaking artifact reduction component reduces artifacts by converting the image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Finally, the results of these three components are combined in a fusion step to create a visually good image which is more compatible with the human visual system. Results show that the proposed algorithm reduces artifacts of dental CBCT images and produces clean images.
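The cavity-filling component relies on morphological filtering; a morphological closing (dilation followed by erosion) fills dark cavities smaller than the structuring element. A minimal binary numpy sketch of that operation (using periodic shifts for simplicity, not the authors' implementation):

```python
import numpy as np

def dilate(img, k=1):
    """Binary dilation by a (2k+1)x(2k+1) square element, via periodic shifts."""
    out = np.zeros_like(img)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out |= np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

def erode(img, k=1):
    """Binary erosion: the dual of dilation (AND over the neighbourhood)."""
    out = np.ones_like(img)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out &= np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

def close_cavities(img, k=1):
    """Morphological closing (dilate, then erode) fills dark cavities smaller
    than the structuring element without shrinking the object."""
    return erode(dilate(img, k), k)

tooth = np.ones((16, 16), dtype=bool)
tooth[7:9, 7:9] = False              # a small artifact cavity inside the "tooth"
filled = close_cavities(tooth)
print(tooth.sum(), filled.sum())     # 252 -> 256: the cavity is filled
```

The streaking component applies the same kind of filtering, but in polar coordinates, where radial streaks become easy-to-remove horizontal structures.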

  2. Optical multiple-image authentication based on cascaded phase filtering structure

    NASA Astrophysics Data System (ADS)

    Wang, Q.; Alfalou, A.; Brosseau, C.

    2016-10-01

    In this study, we report on recent developments in optical image authentication algorithms. Compared with conventional optical encryption, optical image authentication achieves greater security because such methods do not need to fully recover the plaintext during decryption. Several recently proposed authentication systems are briefly introduced. We also propose a novel multiple-image authentication system, where multiple original images are encoded into a photon-limited encoded image by using a triple-plane based phase retrieval algorithm and the photon counting imaging (PCI) technique. One can only recover a noise-like image using correct keys. To verify the authenticity of multiple images, a nonlinear fractional correlation is employed to recognize the original information hidden in the decrypted results. The proposal can be implemented optically using a cascaded phase filtering configuration. Computer simulation results are presented to evaluate the performance of this proposal and its effectiveness.

  3. An Optimal Partial Differential Equations-based Stopping Criterion for Medical Image Denoising.

    PubMed

    Khanian, Maryam; Feizi, Awat; Davari, Ali

    2014-01-01

    Improving the quality of medical images before and after surgery is necessary for beginning and speeding up the recovery process. Partial differential equations-based models have become a powerful and well-known tool in different areas of image processing such as denoising, multiscale image analysis, edge detection and other fields of image processing and computer vision. In this paper, an algorithm for medical image denoising using an anisotropic diffusion filter with a convenient stopping criterion is presented. In this regard, the current paper introduces two strategies: utilizing the efficient explicit method, presenting an effective software technique to stably solve the anisotropic diffusion filter, which is mathematically unstable when discretized naively; and proposing an automatic stopping criterion that takes into account only the input image, in contrast to other stopping criteria, while preserving denoised image quality, simplicity and speed. Various medical images are examined to confirm the claim.
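The anisotropic diffusion filter discussed here is typically the Perona-Malik scheme: an explicit update that diffuses strongly in flat regions and weakly across strong gradients, so edges survive the smoothing. Below is a minimal numpy sketch with an assumed conductance function and a fixed iteration count standing in for the paper's automatic stopping criterion:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=20.0, lam=0.2):
    """Explicit Perona-Malik scheme. lam <= 0.25 keeps the 2-D explicit update
    stable; kappa sets the gradient magnitude regarded as an edge."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # one-sided differences to the four neighbours (periodic boundaries)
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        # Perona-Malik conductance g = exp(-(|grad|/kappa)^2): ~1 in flat
        # regions, ~0 across strong edges
        cn, cs = np.exp(-(dn/kappa)**2), np.exp(-(ds/kappa)**2)
        ce, cw = np.exp(-(de/kappa)**2), np.exp(-(dw/kappa)**2)
        u += lam * (cn*dn + cs*ds + ce*de + cw*dw)
    return u

rng = np.random.default_rng(2)
step = np.zeros((32, 32)); step[:, 16:] = 100.0           # a strong edge
noisy = step + 5.0*rng.standard_normal(step.shape)
smoothed = anisotropic_diffusion(noisy)
print(noisy[:, :14].std(), smoothed[:, :14].std())        # noise is reduced
print(smoothed[:, 18:].mean() - smoothed[:, :14].mean())  # edge height kept
```

Choosing when to stop this loop is exactly the problem the paper's automatic stopping criterion addresses: too few iterations leave noise, too many erode genuine structure.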

  4. Metal Artifact Suppression in Dental Cone Beam Computed Tomography Images Using Image Processing Techniques

    PubMed Central

    Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh

    2018-01-01

    Background: Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of the acquired image and in some cases make it unsuitable for use. Streaking artifacts and cavities around teeth are the main causes of degradation. Methods: In this article, we have proposed a new artifact reduction algorithm which has three parallel components. The first component extracts teeth based on the modeling of the image histogram with a Gaussian mixture model. The streaking artifact reduction component reduces artifacts by converting the image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Results: Finally, the results of these three components are combined in a fusion step to create a visually good image which is more compatible with the human visual system. Conclusions: Results show that the proposed algorithm reduces artifacts of dental CBCT images and produces clean images. PMID:29535920

  5. DWI filtering using joint information for DTI and HARDI.

    PubMed

    Tristán-Vega, Antonio; Aja-Fernández, Santiago

    2010-04-01

    The filtering of the Diffusion Weighted Images (DWI) prior to the estimation of the diffusion tensor or other fiber Orientation Distribution Functions (ODF) has been proved to be of paramount importance in the recent literature. More precisely, it has been evidenced that the estimation of the diffusion tensor without a previous filtering stage induces errors which cannot be recovered by further regularization of the tensor field. A number of approaches have been proposed to overcome this problem, most of them based on the restoration of each DWI gradient image separately. In this paper we propose a methodology that takes advantage of the joint information in the DWI volumes, i.e., the sum of the information given by all DWI channels plus the correlations between them. This way, all the gradient images are filtered together, exploiting the first- and second-order information they share. We adapt this methodology to two filters, namely the Linear Minimum Mean Squared Error (LMMSE) and the Unbiased Non-Local Means (UNLM). These new filters are tested over a wide variety of synthetic and real data, showing the convenience of the new approach, especially for High Angular Resolution Diffusion Imaging (HARDI). Among the techniques presented, the joint LMMSE proves to be a very attractive approach, since it shows an accuracy similar to UNLM (or even better in some situations) with a much lighter computational load. Copyright 2009 Elsevier B.V. All rights reserved.

  6. Image reconstruction of x-ray tomography by using image J platform

    NASA Astrophysics Data System (ADS)

    Zain, R. M.; Razali, A. M.; Salleh, K. A. M.; Yahya, R.

    2017-01-01

    A tomogram is the technical term for a CT image. It is also called a slice because it corresponds to what the object being scanned would look like if it were sliced open along a plane. A CT slice corresponds to a certain thickness of the object being scanned. So, while a typical digital image is composed of pixels, a CT slice image is composed of voxels (volume elements). In the case of x-ray tomography, as in x-ray radiography, the quantity being imaged is the distribution of the attenuation coefficient μ(x) within the object of interest. The difference lies only in the technique used to produce the tomogram. An x-ray radiograph can be produced directly after exposure to x-rays, while a tomographic image is produced by combining radiographic images from every angle of projection. A number of image reconstruction methods that convert x-ray attenuation data into a tomographic image have been produced by researchers. In this work, the Ramp filter in "filtered back projection" has been applied. The linear data acquired at each angular orientation are convolved with a specially designed filter and then back projected across a pixel field at the same angle. This paper describes the steps of using the ImageJ software to produce an image reconstruction of x-ray tomography.
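The ramp-filter-then-back-project step described above can be sketched directly in NumPy; this is the textbook algorithm in minimal form, not a substitute for ImageJ's reconstruction plugins:

```python
import numpy as np

def ramp_filter_fbp(sinogram, angles_deg):
    """Minimal filtered back projection: ramp-filter each projection
    in the Fourier domain, then smear it back across the image grid
    at the corresponding angle.

    sinogram: (n_angles, n_detectors) array; angles in degrees.
    """
    n_ang, n_det = sinogram.shape
    # Ramp filter |f|, applied per projection via the FFT.
    freqs = np.fft.fftfreq(n_det)
    filtered = np.real(
        np.fft.ifft(np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))
    # Back projection onto an n_det x n_det grid centred on the axis.
    recon = np.zeros((n_det, n_det))
    mid = (n_det - 1) / 2.0
    y, x = np.mgrid[0:n_det, 0:n_det] - mid
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate of each pixel at this view angle.
        t = x * np.cos(theta) + y * np.sin(theta) + mid
        recon += np.interp(t.ravel(), np.arange(n_det), proj).reshape(n_det, n_det)
    return recon * np.pi / (2 * n_ang)
```

With a parallel-beam sinogram of a point object, the reconstruction peaks at the object's location.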

  7. Injection-controlled laser resonator

    DOEpatents

    Chang, J.J.

    1995-07-18

    A new injection-controlled laser resonator incorporates self-filtering and self-imaging characteristics with an efficient injection scheme. A low-divergence laser signal is injected into the resonator, which enables the injection signal to be converted to the desired resonator modes before the main laser pulse starts. This injection technique and resonator design enable the laser cavity to improve the quality of the injection signal through self-filtering before the main laser pulse starts. The self-imaging property of the present resonator reduces the cavity induced diffraction effects and, in turn, improves the laser beam quality. 5 figs.

  8. Injection-controlled laser resonator

    DOEpatents

    Chang, Jim J.

    1995-07-18

    A new injection-controlled laser resonator incorporates self-filtering and self-imaging characteristics with an efficient injection scheme. A low-divergence laser signal is injected into the resonator, which enables the injection signal to be converted to the desired resonator modes before the main laser pulse starts. This injection technique and resonator design enable the laser cavity to improve the quality of the injection signal through self-filtering before the main laser pulse starts. The self-imaging property of the present resonator reduces the cavity induced diffraction effects and, in turn, improves the laser beam quality.

  9. "Relative CIR": an image enhancement and visualization technique

    USGS Publications Warehouse

    Fleming, Michael D.

    1993-01-01

    Many techniques exist to spectrally and spatially enhance digital multispectral scanner data. One technique enhances an image while keeping the colors as they would appear in a color-infrared (CIR) image. This "relative CIR" technique generates an image that is both spectrally and spatially enhanced, while displaying a maximum range of colors. The technique enables an interpreter to visualize either spectral or land cover classes by their relative CIR characteristics. A relative CIR image is generated by developing spectral statistics for each class in the classification and then, using a nonparametric approach for spectral enhancement, ranking the means of the classes for each band. A 3 by 3 pixel smoothing filter is applied to the classification for spatial enhancement, and the classes are mapped to the representative rank for each band. Practical applications of the technique include displaying an image classification product as a CIR image that was not derived directly from a spectral image, visualizing how a land cover classification would look as a CIR image, and displaying a spectral classification or intermediate product that will be used to label spectral classes.

  10. Trigonometric Transforms for Image Reconstruction

    DTIC Science & Technology

    1998-06-01

    applying trigonometric transforms to image reconstruction problems. Many existing linear image reconstruction techniques rely on knowledge of... ancestors. The research performed for this dissertation represents the first time the symmetric convolution-multiplication property of trigonometric... Fourier domain. The traditional representation of these filters will be similar to new trigonometric transform versions derived in later chapters

  11. Neutron tomography of particulate filters: A non-destructive investigation tool for applied and industrial research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toops, Todd J.; Bilheux, Hassina Z.; Voisin, Sophie

    2013-08-19

    This research describes the development and implementation of high-fidelity neutron imaging and the associated analysis of the images. This advanced capability allows the non-destructive, non-invasive imaging of particulate filters (PFs) and of how the deposition of particulate and catalytic washcoat occurs within the filter. The majority of the efforts described here were performed at the High Flux Isotope Reactor (HFIR) CG-1D neutron imaging beamline at Oak Ridge National Laboratory; the current spatial resolution is approximately 50 μm. The sample holder is equipped with a high-precision rotation stage that allows 3D imaging (i.e., computed tomography) of the sample when combined with computerized reconstruction tools. What enables neutron-based imaging is the ability of some elements to absorb or scatter neutrons while other elements allow neutrons to pass through them with negligible interaction. Of particular interest in this study is the scattering of neutrons by hydrogen-containing molecules, such as hydrocarbons (HCs) and/or water, which are adsorbed on the surface of soot, ash and catalytic washcoat. Even so, the interactions with this adsorbed water/HC are low, and computational techniques were required to enhance the contrast, primarily a modified simultaneous iterative reconstruction technique (SIRT). Lastly, this effort describes the following systems: particulate randomly distributed in a PF, ash deposition in PFs, a catalyzed washcoat layer in a PF, and three particulate loadings in a SiC PF.

  12. Hyperspectral Imaging Using Flexible Endoscopy for Laryngeal Cancer Detection

    PubMed Central

    Regeling, Bianca; Thies, Boris; Gerstner, Andreas O. H.; Westermann, Stephan; Müller, Nina A.; Bendix, Jörg; Laffers, Wiebke

    2016-01-01

    Hyperspectral imaging (HSI) is increasingly gaining acceptance in the medical field. Up until now, HSI has been used in conjunction with rigid endoscopy to detect cancer in vivo. The logical next step is to pair HSI with flexible endoscopy, since it improves access to hard-to-reach areas. While the flexible endoscope’s fiber optic cables provide the advantage of flexibility, they also introduce an interfering honeycomb-like pattern onto images. Due to the substantial impact this pattern has on locating cancerous tissue, it must be removed before the HS data can be further processed. In doing so, the loss of information must be minimized to avoid suppressing small-area variations in pixel values. We have developed a system that uses flexible endoscopy to record HS cubes of the larynx and designed a special filtering technique to remove the honeycomb-like pattern with minimal loss of information. We have confirmed its feasibility by comparing it to conventional filtering techniques using an objective metric and by applying unsupervised and supervised classifications to raw and pre-processed HS cubes. Compared to conventional techniques, our method successfully removes the honeycomb-like pattern and considerably improves classification performance, while preserving image details. PMID:27529255

  13. Hyperspectral Imaging Using Flexible Endoscopy for Laryngeal Cancer Detection.

    PubMed

    Regeling, Bianca; Thies, Boris; Gerstner, Andreas O H; Westermann, Stephan; Müller, Nina A; Bendix, Jörg; Laffers, Wiebke

    2016-08-13

    Hyperspectral imaging (HSI) is increasingly gaining acceptance in the medical field. Up until now, HSI has been used in conjunction with rigid endoscopy to detect cancer in vivo. The logical next step is to pair HSI with flexible endoscopy, since it improves access to hard-to-reach areas. While the flexible endoscope's fiber optic cables provide the advantage of flexibility, they also introduce an interfering honeycomb-like pattern onto images. Due to the substantial impact this pattern has on locating cancerous tissue, it must be removed before the HS data can be further processed. In doing so, the loss of information must be minimized to avoid suppressing small-area variations in pixel values. We have developed a system that uses flexible endoscopy to record HS cubes of the larynx and designed a special filtering technique to remove the honeycomb-like pattern with minimal loss of information. We have confirmed its feasibility by comparing it to conventional filtering techniques using an objective metric and by applying unsupervised and supervised classifications to raw and pre-processed HS cubes. Compared to conventional techniques, our method successfully removes the honeycomb-like pattern and considerably improves classification performance, while preserving image details.

  14. Light field image denoising using a linear 4D frequency-hyperfan all-in-focus filter

    NASA Astrophysics Data System (ADS)

    Dansereau, Donald G.; Bongiorno, Daniel L.; Pizarro, Oscar; Williams, Stefan B.

    2013-02-01

    Imaging in low light is problematic as sensor noise can dominate imagery, and increasing illumination or aperture size is not always effective or practical. Computational photography offers a promising solution in the form of the light field camera, which by capturing redundant information offers an opportunity for elegant noise rejection. We show that the light field of a Lambertian scene has a 4D hyperfan-shaped frequency-domain region of support at the intersection of a dual-fan and a hypercone. By designing and implementing a filter with an appropriately shaped passband we accomplish denoising with a single all-in-focus linear filter. Drawing examples from the Stanford Light Field Archive and images captured using a commercially available lenselet-based plenoptic camera, we demonstrate that the hyperfan outperforms competing methods including synthetic focus, fan-shaped antialiasing filters, and a range of modern nonlinear image and video denoising techniques. We show the hyperfan preserves depth of field, making it a single-step all-in-focus denoising filter suitable for general-purpose light field rendering. We include results for different noise types and levels, over a variety of metrics, and in real-world scenarios. Finally, we show that the hyperfan's performance scales with aperture count.

  15. A combination of spatial and recursive temporal filtering for noise reduction when using region of interest (ROI) fluoroscopy for patient dose reduction in image guided vascular interventions with significant anatomical motion

    NASA Astrophysics Data System (ADS)

    Setlur Nagesh, S. V.; Khobragade, P.; Ionita, C.; Bednarek, D. R.; Rudin, S.

    2015-03-01

    Because x-ray based image-guided vascular interventions are minimally invasive, they are currently the most preferred method of treating disorders such as stroke, arterial stenosis, and aneurysms; however, the x-ray exposure to the patient during long image-guided interventional procedures could cause harmful effects such as cancer in the long run and even tissue damage in the short term. ROI fluoroscopy reduces patient dose by differentially attenuating the incident x-rays outside the region of interest. To reduce the noise in the dose-reduced regions, recursive temporal filtering was previously demonstrated successfully for neurovascular interventions. However, in cardiac interventions, anatomical motion is significant and excessive recursive filtering could cause blur. In this work the effects of three noise-reduction schemes, namely recursive temporal filtering, spatial mean filtering, and a combination of spatial and recursive temporal filtering, were investigated in a simulated ROI dose-reduced cardiac intervention. First, a model to simulate the aortic arch and its movement was built. A coronary stent was used to simulate a bioprosthetic valve used in TAVR procedures and was deployed under dose-reduced ROI fluoroscopy during the simulated heart motion. The images were then retrospectively processed for noise reduction in the periphery using recursive temporal filtering, spatial filtering, and a combination of both. Quantitative metrics for all three noise-reduction schemes were calculated and are presented as results. From these it can be concluded that, with significant anatomical motion, the combined spatial and recursive temporal filtering scheme is best suited for reducing the excess quantum noise in the periphery. This new noise-reduction technique, in combination with ROI fluoroscopy, has the potential for substantial patient-dose savings in cardiac interventions.
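The combined scheme the study favors can be sketched as a spatial mean filter followed by first-order recursive temporal filtering; `alpha` and the kernel size here are assumed illustrative values, not the study's parameters:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_temporal_filter(frames, alpha=0.7, size=3):
    """Spatial mean filter plus first-order recursive temporal filter:

        out_t = alpha * spatial(frame_t) + (1 - alpha) * out_{t-1}

    A larger alpha weights the current frame more, limiting the motion
    blur that heavy temporal recursion would cause.
    """
    out = None
    results = []
    for frame in frames:
        s = uniform_filter(frame.astype(float), size=size)  # spatial mean
        out = s if out is None else alpha * s + (1 - alpha) * out  # recursion
        results.append(out)
    return results
```

On a stack of frames with stationary content, the output noise is visibly lower than either filter used alone at the same `alpha`.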

  16. Adaptive non-local means on local principle neighborhood for noise/artifacts reduction in low-dose CT images.

    PubMed

    Zhang, Yuanke; Lu, Hongbing; Rong, Junyan; Meng, Jing; Shang, Junliang; Ren, Pinghong; Zhang, Junying

    2017-09-01

    The low-dose CT (LDCT) technique can reduce the x-ray radiation exposure to patients at the cost of degraded images with severe noise and artifacts. Non-local means (NLM) filtering has shown its potential in improving LDCT image quality. However, most current NLM-based approaches employ a weighted average operation directly on all neighbor pixels with a fixed filtering parameter throughout the NLM filtering process, ignoring the non-stationary nature of the noise in LDCT images. In this paper, an adaptive NLM filtering scheme on local principal neighborhoods (PC-NLM) is proposed for structure-preserving noise/artifact reduction in LDCT images. Instead of using neighboring patches directly, in the PC-NLM scheme, principal component analysis (PCA) is first applied to local neighboring patches of the target patch to decompose the local patches into uncorrelated principal components (PCs); NLM filtering is then used to regularize each PC of the target patch, and finally the regularized components are transformed back to the image domain to obtain the target patch. In particular, in the NLM scheme the filtering parameter is estimated adaptively from the local noise level of the neighborhood as well as the signal-to-noise ratio (SNR) of the corresponding PC, which guarantees a "weaker" NLM filtering on PCs with higher SNR and a "stronger" filtering on PCs with lower SNR. The PC-NLM procedure is performed iteratively several times for better removal of the noise and artifacts, and an adaptive iteration strategy is developed to reduce the computational load by determining whether a patch should be processed in the next round of PC-NLM filtering. The effectiveness of the presented PC-NLM algorithm is validated by experimental phantom studies and clinical studies. The results show that it can achieve promising gains over some state-of-the-art methods in terms of artifact suppression and structure preservation.
With the use of PCA on local neighborhoods to extract principal structural components, as well as adaptive NLM filtering on PCs of the target patch using filtering parameter estimated based on the local noise level and corresponding SNR, the proposed PC-NLM method shows its efficacy in preserving fine anatomical structures and suppressing noise/artifacts in LDCT images. © 2017 American Association of Physicists in Medicine.
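The core NLM operation that PC-NLM builds on, a weighted average driven by patch similarity, can be sketched in plain form; this baseline uses one fixed filtering parameter `h`, not the paper's adaptive, PCA-domain parameters:

```python
import numpy as np

def nlm_filter(img, patch=3, search=7, h=10.0):
    """Plain non-local means: each pixel becomes a weighted average of
    pixels in its search window, with weights

        w = exp(-||P_i - P_j||^2 / h^2)

    where P_i, P_j are the surrounding patches. O(N * search^2 * patch^2),
    so suitable only for small demonstration images.
    """
    pr, sr = patch // 2, search // 2
    pad = np.pad(img.astype(float), pr + sr, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            num = den = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    cand = pad[ci + di - pr:ci + di + pr + 1,
                               cj + dj - pr:cj + dj + pr + 1]
                    w = np.exp(-np.sum((ref - cand) ** 2) / h ** 2)
                    num += w * pad[ci + di, cj + dj]
                    den += w
            out[i, j] = num / den
    return out
```

PC-NLM replaces the fixed `h` with a value estimated per principal component from the local noise level and SNR.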

  17. Nonlinear optical THz generation and sensing applications

    NASA Astrophysics Data System (ADS)

    Kawase, Kodo

    2012-03-01

    We have proposed a wide range of real-life applications using novel terahertz imaging techniques. High-resolution terahertz tomography was demonstrated with ultrashort terahertz pulses using an optical fiber and a nonlinear organic crystal. We also report on the thickness measurement of very thin films using a high-sensitivity metal mesh filter. Furthermore, we have succeeded in a non-destructive inspection that can monitor the soot distribution in a ceramic filter using millimeter-to-terahertz wave computed tomography. These techniques are directly applicable to non-destructive testing in industry.

  18. Destriping AIS data using Fourier filtering techniques

    NASA Technical Reports Server (NTRS)

    Hlavka, C.

    1986-01-01

    Airborne Imaging Spectrometer (AIS) data collected in 1984 and 1985 showed pronounced striping in the vertical and horizontal directions. This striping reduced the signal-to-noise ratio so that features of the spectra of forest canopies were obscured or altered by noise. The noise was removed by applying a notch filter to the Fourier transform of the imagery in each waveband.
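Periodic striping that is constant along one image axis concentrates its spectral energy on the frequency axes of the 2-D FFT, so a notch over those axes suppresses it. A simplified sketch of the idea (not the AIS processing chain, which notched specific stripe frequencies per waveband):

```python
import numpy as np

def destripe_notch(img, notch_halfwidth=1):
    """Remove horizontal/vertical striping by zeroing narrow bands
    along the two frequency axes of the centred 2-D spectrum, while
    keeping a small block around DC so the mean and the lowest
    frequencies survive.
    """
    F = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    r, c = np.array(F.shape) // 2
    w = notch_halfwidth
    mask = np.ones(F.shape)
    mask[r - w:r + w + 1, :] = 0.0   # notch one frequency axis
    mask[:, c - w:c + w + 1] = 0.0   # notch the other
    mask[r - w:r + w + 1, c - w:c + w + 1] = 1.0  # keep DC neighbourhood
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```

On an image of pure column striping, the filter returns an essentially flat image with the original mean.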

  19. Control of the Low-energy X-rays by Using MCNP5 and Numerical Analysis for a New Concept Intra-oral X-ray Imaging System

    NASA Astrophysics Data System (ADS)

    Huh, Jangyong; Ji, Yunseo; Lee, Rena

    2018-05-01

    An X-ray control algorithm to modulate the X-ray intensity distribution over the FOV (field of view) has been developed by using numerical analysis and MCNP5, a particle transport simulation code based on the Monte Carlo method. X-rays, which are widely used in medical diagnostic imaging, should be controlled in order to maximize the performance of the X-ray imaging system. However, X-rays cannot be transported the way a liquid or a gas is conveyed through a physical conduit such as a pipe. In the present study, an X-ray control algorithm and technique to uniformize the X-ray intensity projected on the image sensor were developed using a flattening filter and a collimator, in order to alleviate the anisotropy of the X-ray distribution due to intrinsic features of the X-ray generator. The proposed method, which combines MCNP5 modeling and numerical analysis, aimed to optimize a flattening filter and a collimator for a uniform distribution of X-rays. Their size and shape were estimated from the method. The simulation and the experimental results both showed that the method yielded an intensity distribution over an X-ray field of 6×4 cm2 at an SID (source to image-receptor distance) of 5 cm with a uniformity of more than 90% when the flattening filter and the collimator were mounted on the system. The proposed algorithm and technique are not confined to flattening filter development but can also be applied to other X-ray related research and development efforts.

  20. Technology for Elevated Temperature Tests of Structural Panels

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.

    1999-01-01

    A technique for full-field measurement of surface temperature and in-plane strain using a single grid imaging technique was demonstrated on a sample subjected to thermally-induced strain. The technique is based on digital imaging of a sample marked by an alternating line array of La2O2S:Eu(+3) thermographic phosphor and chromium, illuminated by a UV lamp. Digital images of this array in unstrained and strained states were processed using a modified spin filter. Normal strain distribution was determined by combining unstrained and strained grid images using a single grid digital moire technique. Temperature distribution was determined by ratioing images of phosphor intensity at two wavelengths. Combined strain and temperature measurements demonstrated on the thermally heated sample were Δε = ±250 με and ΔT = ±5 K, respectively, with a spatial resolution of 0.8 mm.

  1. The composite classification problem in optical information processing

    NASA Technical Reports Server (NTRS)

    Hall, Eric B.

    1995-01-01

    Optical pattern recognition allows objects to be recognized from their images and permits their positional parameters to be estimated accurately in real time. The guiding principle behind optical pattern recognition is that a lens focusing a beam of coherent light modulated with an image produces the two-dimensional Fourier transform of that image. When the resulting output is further transformed by the matched filter corresponding to the original image, one obtains the autocorrelation function of the original image, which has a peak at the origin. Such a device is called an optical correlator and may be used to recognize and locate the image for which it is designed. (From a practical perspective, an approximation to the matched filter must be used, since the spatial light modulator (SLM) on which the filter is implemented usually does not allow one to independently control both the magnitude and phase of the filter.) Generally, one is not concerned with recognizing just a single image but with recognizing a variety of rotated and scaled views of a particular image. In order to recognize these different views using an optical correlator, one may select a subset of these views (whose elements are called training images) and then use a composite filter that is designed to produce a correlation peak for each training image. Presumably, these peaks should be sharp and easily distinguishable from the surrounding correlation plane values. In this report we consider two areas of research regarding composite optical correlators. First, we consider the question of how best to choose the training images that are used to design the composite filter. With regard to quantity, the number of training images should be large enough to adequately represent all possible views of the targeted object yet small enough to ensure that the resolution of the filter is not exhausted.
    As for the images themselves, they should be distinct enough to avoid numerical difficulties yet similar enough to avoid gaps in which certain views of the target will be unrecognized. One method that we introduce to study this problem is called probing and involves the creation of artificial imagery. The second problem we consider involves the classification of the composite filter's correlation plane data. In particular, we would like to determine not only whether we are viewing a training image, but also, in that case, which training image is being viewed. This second problem is investigated using traditional M-ary hypothesis testing techniques.
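The optical correlator's operation has a direct digital analogue: multiply the scene spectrum by the conjugate of the template spectrum and invert, which yields the cross-correlation surface whose peak locates the template. A minimal sketch (zero-mean normalization is an added convenience, not part of the optical setup):

```python
import numpy as np

def matched_filter_correlate(scene, template):
    """FFT-based cross-correlation of a scene with a template, the
    digital analogue of a (single-image) matched-filter correlator.
    The peak of the returned surface marks the template's offset.
    """
    s = scene.astype(float) - scene.mean()
    # Embed the zero-mean template at the origin of a scene-sized array.
    t = np.zeros_like(s)
    t[:template.shape[0], :template.shape[1]] = template - template.mean()
    # corr[k] = sum_n s[n] * t[n - k]  (circular correlation).
    return np.real(np.fft.ifft2(np.fft.fft2(s) * np.conj(np.fft.fft2(t))))
```

A composite filter, as discussed in the report, would replace the single template spectrum with one synthesized to peak for every training image.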

  2. Signal-to-noise ratio estimation on SEM images using cubic spline interpolation with Savitzky-Golay smoothing.

    PubMed

    Sim, K S; Kiani, M A; Nia, M E; Tso, C P

    2014-01-01

    A new technique based on cubic spline interpolation with Savitzky-Golay noise-reduction filtering is designed to estimate the signal-to-noise ratio of scanning electron microscopy (SEM) images. This approach is found to give better results when compared with two existing techniques: nearest-neighbour and first-order interpolation. When applied to evaluate the quality of SEM images, noise can be eliminated efficiently with an optimal choice of scan rate from real-time SEM images, without generating corruption or increasing scanning time. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
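A generic smoothing-based SNR estimate in the spirit of the cited technique can be sketched as follows: Savitzky-Golay smoothing stands in for the noise-free signal and the residual for the noise. This is an illustrative estimator, not the authors' interpolation-based formulation:

```python
import numpy as np
from scipy.signal import savgol_filter

def estimate_snr_db(profile, window=11, polyorder=3):
    """Estimate the SNR of a 1-D scan line in dB.

    The Savitzky-Golay fit approximates the underlying signal; the
    residual approximates the noise. SNR = var(signal) / var(noise).
    """
    smooth = savgol_filter(profile, window_length=window, polyorder=polyorder)
    noise = profile - smooth
    return 10.0 * np.log10(np.var(smooth) / np.var(noise))
```

The estimate falls, as expected, when more noise is added to the same underlying signal.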

  3. Information-Adaptive Image Encoding and Restoration

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Rahman, Zia-ur

    1998-01-01

    The multiscale retinex with color restoration (MSRCR) has shown itself to be a very versatile automatic image enhancement algorithm that simultaneously provides dynamic range compression, color constancy, and color rendition. A number of algorithms exist that provide one or more of these features, but not all. In this paper we compare the performance of the MSRCR with techniques that are widely used for image enhancement. Specifically, we compare the MSRCR with color adjustment methods such as gamma correction and gain/offset application, histogram modification techniques such as histogram equalization and manual histogram adjustment, and other more powerful techniques such as homomorphic filtering and 'burning and dodging'. The comparison is carried out by testing the suite of image enhancement methods on a set of diverse images. We find that though some of these techniques work well for some of these images, only the MSRCR performs universally well on the test set.

  4. Color standardization and optimization in whole slide imaging.

    PubMed

    Yagi, Yukako

    2011-03-30

    Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is the variance in the protocols and practices in the histology lab, the color displayed can also be affected by variation in capture parameters (for example, illumination and filters), image processing, and display factors in the digital systems themselves. We have been developing techniques for color validation and optimization along two paths. The first is based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (resembling a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach is based on our previous multispectral imaging research. As a first step, the two-slide method (above) was used to identify inaccurate display of color and its cause, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality and to establish image quality standardization. This paper discusses one of the most important aspects of image quality: color.

  5. Application of wavelet techniques for cancer diagnosis using ultrasound images: A Review.

    PubMed

    Sudarshan, Vidya K; Mookiah, Muthu Rama Krishnan; Acharya, U Rajendra; Chandran, Vinod; Molinari, Filippo; Fujita, Hamido; Ng, Kwan Hoong

    2016-02-01

    Ultrasound is an important and low-cost imaging modality used to study the internal organs of the human body and blood flow through blood vessels. It uses high-frequency sound waves to acquire images of internal organs. It is used to screen normal, benign and malignant tissues of various organs. Healthy and malignant tissues generate different echoes under ultrasound. Hence, it provides useful information about potential tumor tissues that can be analyzed for diagnostic purposes before therapeutic procedures. Ultrasound images are affected by speckle noise due to the air gap between the transducer probe and the body. The challenge is to design and develop robust image preprocessing, segmentation and feature extraction algorithms to locate the tumor region and to extract subtle information from the isolated tumor region for diagnosis. This information can be revealed using a scale-space technique such as the Discrete Wavelet Transform (DWT). It decomposes an image into images at different scales using low-pass and high-pass filters. These filters help to identify detail or sudden changes in intensity in the image. These changes are reflected in the wavelet coefficients. Various texture, statistical and image-based features can be extracted from these coefficients. The extracted features are subjected to statistical analysis to identify the significant features that discriminate normal and malignant ultrasound images using supervised classifiers. This paper presents a review of wavelet techniques used for preprocessing, segmentation and feature extraction of breast, thyroid, ovarian and prostate cancer using ultrasound images. Copyright © 2015 Elsevier Ltd. All rights reserved.
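The low-pass/high-pass filter-bank decomposition described above is easiest to see with the Haar wavelet, the simplest DWT; a one-level 2-D sketch (assumes even image dimensions, and uses plain averages/differences rather than the orthonormal scaling):

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar discrete wavelet transform.

    Returns the approximation (LL) and detail (LH, HL, HH) sub-bands
    at half resolution: low-pass = pairwise average, high-pass =
    pairwise difference, applied along rows and then columns.
    """
    a = img.astype(float)
    # Rows: low-pass and high-pass halves.
    lo = (a[:, ::2] + a[:, 1::2]) / 2.0
    hi = (a[:, ::2] - a[:, 1::2]) / 2.0
    # Columns: repeat on each half.
    ll = (lo[::2] + lo[1::2]) / 2.0   # approximation
    lh = (lo[::2] - lo[1::2]) / 2.0   # horizontal detail
    hl = (hi[::2] + hi[1::2]) / 2.0   # vertical detail
    hh = (hi[::2] - hi[1::2]) / 2.0   # diagonal detail
    return ll, lh, hl, hh
```

Texture and statistical features for the review's classifiers are computed from sub-band coefficients like these, typically over several decomposition levels.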

  6. Optimal Matched Filter in the Low-number Count Poisson Noise Regime and Implications for X-Ray Source Detection

    NASA Astrophysics Data System (ADS)

    Ofek, Eran O.; Zackay, Barak

    2018-04-01

    Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.

  7. The use of an image registration technique in the urban growth monitoring

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Foresti, C.; Deoliveira, M. D. L. N.; Niero, M.; Parreira, E. M. D. M. F.

    1984-01-01

    The use of an image registration program in studies of urban growth is described. This program permits quick identification of growing areas by overlapping the same scene from different periods, with the use of adequate filters. The city of Brasilia, Brazil, is selected as the test area. The dynamics of Brasilia's urban growth are analyzed by overlapping scenes dated June 1973, 1978 and 1983. The results demonstrate the utility of the image registration technique for monitoring dynamic urban growth.

  8. Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme.

    PubMed

    Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun

    2015-01-01

    Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation.
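    The morphological top-hat/bottom-hat step can be sketched with scipy. The structuring-element size and the combination `img + tophat - bottomhat` are common textbook choices standing in for the paper's tuned hybrid scheme; this is not the authors' implementation.

```python
import numpy as np
from scipy import ndimage as ndi

def morph_enhance(img, size=15):
    """Morphological contrast enhancement: out = img + tophat - bottomhat.
    The top-hat (img minus its opening) isolates small bright details;
    the bottom-hat (closing minus img) isolates small dark details."""
    img = np.asarray(img, dtype=float)
    tophat = img - ndi.grey_opening(img, size=(size, size))
    bottomhat = ndi.grey_closing(img, size=(size, size)) - img
    return img + tophat - bottomhat

# A dim, isolated detail on a flat background gains contrast.
spot = np.zeros((32, 32)); spot[16, 16] = 1.0
enhanced = morph_enhance(spot)
```

    In the hybrid scheme above this step follows CLAHE and wavelet-domain noise suppression, so it operates on an already equalized, denoised radiograph.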

  9. Multi-scale Morphological Image Enhancement of Chest Radiographs by a Hybrid Scheme

    PubMed Central

    Alavijeh, Fatemeh Shahsavari; Mahdavi-Nasab, Homayoun

    2015-01-01

    Chest radiography is a common diagnostic imaging test, which contains an enormous amount of information about a patient. However, its interpretation is highly challenging. The accuracy of the diagnostic process is greatly influenced by image processing algorithms; hence enhancement of the images is indispensable in order to improve visibility of the details. This paper aims at improving radiograph parameters such as contrast, sharpness, noise level, and brightness to enhance chest radiographs, making use of a triangulation method. Here, contrast limited adaptive histogram equalization technique and noise suppression are simultaneously performed in wavelet domain in a new scheme, followed by morphological top-hat and bottom-hat filtering. A unique implementation of morphological filters allows for adjustment of the image brightness and significant enhancement of the contrast. The proposed method is tested on chest radiographs from Japanese Society of Radiological Technology database. The results are compared with conventional enhancement techniques such as histogram equalization, contrast limited adaptive histogram equalization, Retinex, and some recently proposed methods to show its strengths. The experimental results reveal that the proposed method can remarkably improve the image contrast while keeping the sensitive chest tissue information so that radiologists might have a more precise interpretation. PMID:25709942

  10. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    DOE PAGES

    Angland, P.; Haberberger, D.; Ivancic, S. T.; ...

    2017-10-30

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and statistical uncertainty calculation are based on a minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
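    The annealing loop itself is generic and can be sketched in a few lines. Here a made-up two-parameter exponential profile stands in for the paper's eight-parameter density profile, and the cooling schedule and step sizes are illustrative assumptions.

```python
import math, random

random.seed(1)

def chi2(params, xs, data, sigma=1.0):
    """Chi-squared between an assumed profile a*exp(-x/s) and the data."""
    a, s = params
    return sum(((a * math.exp(-x / s) - d) / sigma) ** 2
               for x, d in zip(xs, data))

def anneal(xs, data, start, steps=20000, t0=5.0):
    """Simulated annealing: randomly perturb the parameters and accept
    worse chi^2 with probability exp(-dchi2 / T) as T cools to zero."""
    cur = list(start)
    cur_c = chi2(cur, xs, data)
    best, best_c = list(cur), cur_c
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling
        cand = [p + random.gauss(0.0, 0.05 * abs(p) + 0.01) for p in cur]
        c = chi2(cand, xs, data)
        if c < cur_c or random.random() < math.exp(-(c - cur_c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = list(cand), c
    return best, best_c

# Synthetic "data" from a known profile (a=10, s=2), noise-free for brevity.
xs = [0.5 * i for i in range(20)]
data = [10.0 * math.exp(-x / 2.0) for x in xs]
(fit_a, fit_s), c = anneal(xs, data, start=[5.0, 1.0])
```

    The paper's uncertainty estimate comes from the shape of the χ² surface around the converged minimum; the loop above only finds the minimum itself.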

  11. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angland, P.; Haberberger, D.; Ivancic, S. T.

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and statistical uncertainty calculation are based on a minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.

  12. Experimental evaluation of dual multiple aperture devices for fluence field modulated x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Mathews, A. J.; Gang, G.; Levinson, R.; Zbijewski, W.; Kawamoto, S.; Siewerdsen, J. H.; Stayman, J. W.

    2017-03-01

    Acquisition of CT images with comparable diagnostic power can potentially be achieved with lower radiation exposure than the current standard of care through the adoption of hardware-based fluence-field modulation (e.g. dynamic bowtie filters). While modern CT scanners employ elements such as static bowtie filters and tube-current modulation, such solutions are limited in the fluence patterns that they can achieve, and thus in their ability to adapt to broad classes of patient morphology. Fluence-field modulation also enables new applications such as region-of-interest imaging, task-specific imaging, reduction of measurement noise, and improvement of image quality. The work presented in this paper leverages a novel fluence modulation strategy that uses "Multiple Aperture Devices" (MADs) which are, in essence, binary filters, blocking or passing x-rays on a fine scale. Utilizing two MAD devices in series provides the capability of generating a large number of fluence patterns via small relative motions between the MAD filters. We present the first experimental evaluation of fluence-field modulation using a dual-MAD system, and demonstrate the efficacy of this technique with a characterization of achievable fluence patterns and an investigation of experimental projection data.

  13. Sensitive test for sea mine identification based on polarization-aided image processing.

    PubMed

    Leonard, I; Alfalou, A; Brosseau, C

    2013-12-02

    Techniques are widely sought to detect and identify sea mines. This issue is characterized by complicated mine shapes and underwater light propagation dependencies. In a preliminary study we used a preprocessing step for denoising underwater images before applying the algorithm for mine detection. Once a mine is detected, the protocol for identifying it is activated. Among many correlation filters, we have focused our attention on the asymmetric segmented phase-only filter for quantifying the recognition rate because it allows us to significantly increase the number of reference images in the fabrication of this filter. Yet these results were not entirely satisfactory in terms of recognition rate, and the obtained images proved to be of low quality. In this report, we propose a way to improve upon this preliminary study by using a single-wavelength polarimetric camera in order to denoise the images. This permits us to enhance images and improve depth visibility. We present illustrative results using in situ polarization imaging of a target through a milk-water mixture and demonstrate that our challenging objective of increasing the detection rate and decreasing the false alarm rate has been achieved.

  14. Infrared and visible image fusion with spectral graph wavelet transform.

    PubMed

    Yan, Xiang; Qin, Hanlin; Li, Jia; Zhou, Huixin; Zong, Jing-guo

    2015-09-01

    Infrared and visible image fusion is a popular topic in image analysis because it can integrate complementary information and obtain a reliable and accurate description of scenes. Multiscale transform theory as a signal representation method is widely used in image fusion. In this paper, a novel infrared and visible image fusion method is proposed based on spectral graph wavelet transform (SGWT) and the bilateral filter. The main novelty of this study is that SGWT is used for image fusion. On the one hand, source images are decomposed by SGWT in its transform domain. The proposed approach not only effectively preserves the details of different source images, but also excellently represents the irregular areas of the source images. On the other hand, a novel weighted average method based on the bilateral filter is proposed to fuse low- and high-frequency subbands by taking advantage of the spatial consistency of natural images. Experimental results demonstrate that the proposed method outperforms seven recently proposed image fusion methods in terms of both visual effect and objective evaluation metrics.
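    The edge-preserving ingredient of the fusion rule, a bilateral filter, can be sketched directly in numpy (brute force, for clarity). This shows only the component filter, not the SGWT decomposition or the paper's actual weighting scheme; the kernel widths below are arbitrary.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.2):
    """Brute-force bilateral filter: each output pixel is an average
    weighted by Gaussians in both space and intensity, so sharp edges
    are preserved while flat regions are smoothed."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            wgt = spatial * rng_w
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out

# A clean step edge survives the filter almost unchanged.
step = np.zeros((8, 8)); step[:, 4:] = 1.0
out = bilateral_filter(step)
```

    In the fusion method above, this spatial-consistency property is what lets weight maps follow object boundaries instead of blurring across them.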

  15. A physical parameter method for the design of broad-band X-ray imaging systems to do coronal plasma diagnostics

    NASA Technical Reports Server (NTRS)

    Kahler, S.; Krieger, A. S.

    1978-01-01

    The technique commonly used for the analysis of data from broad-band X-ray imaging systems for plasma diagnostics is the filter ratio method. This requires the use of two or more broad-band filters to derive temperatures and line-of-sight emission integrals or emission measure distributions as a function of temperature. Here an alternative analytical approach is proposed in which the temperature response of the imaging system is matched to the physical parameter being investigated. The temperature response of a system designed to measure the total radiated power along the line of sight of any coronal structure is calculated. Other examples are discussed.

  16. Reducing charging effects in scanning electron microscope images by Rayleigh contrast stretching method (RCS).

    PubMed

    Wan Ismail, W Z; Sim, K S; Tso, C P; Ting, H Y

    2011-01-01

    To reduce undesirable charging effects in scanning electron microscope images, Rayleigh contrast stretching is developed and employed. First, re-scaling is performed on the input image histograms with Rayleigh algorithm. Then, contrast stretching or contrast adjustment is implemented to improve the images while reducing the contrast charging artifacts. This technique has been compared to some existing histogram equalization (HE) extension techniques: recursive sub-image HE, contrast stretching dynamic HE, multipeak HE and recursive mean separate HE. Other post processing methods, such as wavelet approach, spatial filtering, and exponential contrast stretching, are compared as well. Overall, the proposed method produces better image compensation in reducing charging artifacts. Copyright © 2011 Wiley Periodicals, Inc.
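    One plausible reading of the two-stage recipe (Rayleigh re-scaling of the histogram, then contrast stretching) can be sketched in numpy. The exact algorithm of the paper is not given in this excerpt, so the rank-based CDF estimate and the Rayleigh parameter below are assumptions for illustration.

```python
import numpy as np

def rayleigh_stretch(img, sigma=0.4):
    """Re-scale the histogram toward a Rayleigh distribution: map each
    pixel's empirical CDF value u through the inverse Rayleigh CDF
    sigma*sqrt(-2*ln(1-u)), then linearly stretch to [0, 1]."""
    img = np.asarray(img, dtype=float)
    flat = img.ravel()
    # empirical CDF via ranks, scaled into (0, 1) to keep the log finite
    ranks = flat.argsort().argsort()
    u = (ranks + 1) / (flat.size + 1)
    mapped = (sigma * np.sqrt(-2.0 * np.log(1.0 - u))).reshape(img.shape)
    lo, hi = mapped.min(), mapped.max()
    return (mapped - lo) / (hi - lo)
```

    The mapping is monotone in pixel rank, so image structure is preserved while the gray-level distribution is reshaped; bright charging artifacts are pulled toward the compressed upper tail.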

  17. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  18. Noise-gating to Clean Astrophysical Image Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeForest, C. E.

    I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.

  19. Noise-gating to Clean Astrophysical Image Data

    NASA Astrophysics Data System (ADS)

    DeForest, C. E.

    2017-04-01

    I present a family of algorithms to reduce noise in astrophysical images and image sequences, preserving more information from the original data than is retained by conventional techniques. The family uses locally adaptive filters (“noise gates”) in the Fourier domain to separate coherent image structure from background noise based on the statistics of local neighborhoods in the image. Processing of solar data limited by simple shot noise or by additive noise reveals image structure not easily visible in the originals, preserves photometry of observable features, and reduces shot noise by a factor of 10 or more with little to no apparent loss of resolution. This reveals faint features that were either not directly discernible or not sufficiently strongly detected for quantitative analysis. The method works best on image sequences containing related subjects, for example movies of solar evolution, but is also applicable to single images provided that there are enough pixels. The adaptive filter uses the statistical properties of noise and of local neighborhoods in the data to discriminate between coherent features and incoherent noise without reference to the specific shape or evolution of those features. The technique can potentially be modified in a straightforward way to exploit additional a priori knowledge about the functional form of the noise.
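    A single-patch caricature of the noise gate can be written in a few numpy lines. The published method works in many small, apodized local neighborhoods and estimates the noise model from the data itself; here one global patch, a known noise level, and a hard amplitude threshold are assumed purely for illustration.

```python
import numpy as np

def noise_gate(img, noise_sigma, gate=3.0):
    """Crude 'noise gate': zero Fourier coefficients whose amplitude is
    below gate * (expected noise amplitude), keeping only coefficients
    dominated by coherent structure."""
    f = np.fft.fft2(img)
    # white noise of std sigma spreads into DFT amplitudes ~ sigma*sqrt(N)
    thresh = gate * noise_sigma * np.sqrt(img.size)
    f[np.abs(f) < thresh] = 0.0
    return np.fft.ifft2(f).real

rng = np.random.default_rng(2)
x = np.arange(64) * (2 * np.pi / 64)
clean = np.outer(np.sin(4 * x), np.cos(4 * x))     # coherent structure
noisy = clean + rng.normal(0, 0.3, clean.shape)
denoised = noise_gate(noisy, noise_sigma=0.3)
```

    Because the coherent pattern concentrates its energy in a few Fourier coefficients while the noise spreads over all of them, gating removes most of the noise power at little cost to the structure.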

  20. Aural analysis of image texture via cepstral filtering and sonification

    NASA Astrophysics Data System (ADS)

    Rangayyan, Rangaraj M.; Martins, Antonio C. G.; Ruschioni, Ruggero A.

    1996-03-01

    Texture plays an important role in image analysis and understanding, with many applications in medical imaging and computer vision. However, analysis of texture by image processing is a rather difficult issue, with most techniques being oriented towards statistical analysis which may not have readily comprehensible perceptual correlates. We propose new methods for auditory display (AD) and sonification of (quasi-) periodic texture (where a basic texture element or `texton' is repeated over the image field) and random texture (which could be modeled as filtered or `spot' noise). Although the AD designed is not intended to be speech-like or musical, we draw analogies between the two types of texture mentioned above and voiced/unvoiced speech, and design a sonification algorithm which incorporates physical and perceptual concepts of texture and speech. More specifically, we present a method for AD of texture where the projections of the image at various angles (Radon transforms or integrals) are mapped to audible signals and played in sequence. In the case of random texture, the spectral envelopes of the projections are related to the filter spot characteristics, and convey the essential information for texture discrimination. In the case of periodic texture, the AD provides timbre and pitch related to the texton and periodicity. In another procedure for sonification of periodic texture, we propose to first deconvolve the image using cepstral analysis to extract information about the texton and horizontal and vertical periodicities. The projections of individual textons at various angles are used to create a voiced-speech-like signal with each projection mapped to a basic wavelet, the horizontal period to pitch, and the vertical period to rhythm on a longer time scale. The sound pattern then consists of a serial, melody-like sonification of the patterns for each projection. We believe that our approaches provide the much-desired `natural' connection between the image data and the sounds generated. We have evaluated the sonification techniques with a number of synthetic textures. The sound patterns created have demonstrated the potential of the methods in distinguishing between different types of texture. We are investigating the application of these techniques to auditory analysis of texture in medical images such as magnetic resonance images.

  1. A median filter approach for correcting errors in a vector field

    NASA Technical Reports Server (NTRS)

    Schultz, H.

    1985-01-01

    Techniques are presented for detecting and correcting errors in a vector field. These methods employ median filters which are frequently used in image processing to enhance edges and remove noise. A detailed example is given for wind field maps produced by a spaceborne scatterometer. The error detection and replacement algorithm was tested with simulation data from the NASA Scatterometer (NSCAT) project.
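    The detect-and-replace idea can be sketched with scipy's median filter: flag a vector as erroneous when it deviates too far from the local median, and replace only the flagged vectors. The 3×3 window and the threshold value are illustrative assumptions, not the NSCAT algorithm's settings.

```python
import numpy as np
from scipy.ndimage import median_filter

def correct_vector_field(u, v, thresh=2.0):
    """Median-filter outlier repair for a 2D vector field: a vector is
    flagged when its distance from the local (3x3) component-wise median
    exceeds `thresh`; flagged vectors are replaced by that median."""
    u_med = median_filter(u, size=3)
    v_med = median_filter(v, size=3)
    bad = np.hypot(u - u_med, v - v_med) > thresh
    u2, v2 = u.copy(), v.copy()
    u2[bad], v2[bad] = u_med[bad], v_med[bad]
    return u2, v2, bad

# Smooth synthetic wind field with one spurious vector inserted.
u = np.full((8, 8), 5.0); v = np.full((8, 8), 1.0)
u[4, 4], v[4, 4] = -20.0, 15.0
u2, v2, bad = correct_vector_field(u, v)
```

    Because the median is insensitive to a single outlier in its window, the good neighbors are left untouched while the bad vector is detected and repaired.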

  2. Dynamic intensity-weighted region of interest imaging for conebeam CT

    PubMed Central

    Pearson, Erik; Pan, Xiaochuan; Pelizzari, Charles

    2017-01-01

    BACKGROUND Patient dose from image guidance in radiotherapy is small compared to the treatment dose. However, the imaging beam is untargeted and deposits dose equally in tumor and healthy tissues. It is desirable to minimize imaging dose while maintaining efficacy. OBJECTIVE Image guidance typically does not require full image quality throughout the patient. Dynamic filtration of the kV beam allows local control of CT image noise for high quality around the target volume and lower quality elsewhere, with substantial dose sparing and reduced scatter fluence on the detector. METHODS The dynamic Intensity-Weighted Region of Interest (dIWROI) technique spatially varies beam intensity during acquisition with copper filter collimation. Fluence is reduced by 95% under the filters with the aperture conformed dynamically to the ROI during cone-beam CT scanning. Preprocessing to account for physical effects of the collimator before reconstruction is described. RESULTS Reconstructions show image quality comparable to a standard scan in the ROI, with higher noise and streak artifacts in the outer region but still adequate quality for patient localization. Monte Carlo modeling shows dose reduction by 10–15% in the ROI due to reduced scatter, and up to 75% outside. CONCLUSIONS The presented technique offers a method to reduce imaging dose by accepting increased image noise outside the ROI, while maintaining full image quality inside the ROI. PMID:27257875

  3. Detecting Weak Spectral Lines in Interferometric Data through Matched Filtering

    NASA Astrophysics Data System (ADS)

    Loomis, Ryan A.; Öberg, Karin I.; Andrews, Sean M.; Walsh, Catherine; Czekala, Ian; Huang, Jane; Rosenfeld, Katherine A.

    2018-04-01

    Modern radio interferometers enable observations of spectral lines with unprecedented spatial resolution and sensitivity. In spite of these technical advances, many lines of interest are still at best weakly detected and therefore necessitate detection and analysis techniques specialized for the low signal-to-noise ratio (S/N) regime. Matched filters can leverage knowledge of the source structure and kinematics to increase sensitivity of spectral line observations. Application of the filter in the native Fourier domain improves S/N while simultaneously avoiding the computational cost and ambiguities associated with imaging, making matched filtering a fast and robust method for weak spectral line detection. We demonstrate how an approximate matched filter can be constructed from a previously observed line or from a model of the source, and we show how this filter can be used to robustly infer a detection significance for weak spectral lines. When applied to ALMA Cycle 2 observations of CH3OH in the protoplanetary disk around TW Hya, the technique yields a ≈53% S/N boost over aperture-based spectral extraction methods, and we show that an even higher boost will be achieved for observations at higher spatial resolution. A Python-based open-source implementation of this technique is available under the MIT license at http://github.com/AstroChem/VISIBLE.
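    The core idea, correlating the data with a normalized template in the Fourier domain and reading the peak response in noise-sigma units, can be shown with a 1D toy example. This is not the VISIBLE implementation (which filters visibilities with a source/kinematic model); the spectrum, template, and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def matched_filter_snr(data, template):
    """Fourier-domain matched filter: circularly correlate the data with
    a unit-norm template and return the peak response. For unit-variance
    white noise, each output sample is ~N(0, 1) under the noise-only
    hypothesis, so the peak is directly a detection significance."""
    t = template / np.linalg.norm(template)
    xcorr = np.fft.ifft(np.fft.fft(data) * np.conj(np.fft.fft(t))).real
    return float(xcorr.max())

n = 512
template = np.exp(-0.5 * ((np.arange(n) - 100) / 20.0) ** 2)
signal = 2.0 * template                  # weak, broad spectral line
data = signal + rng.normal(0, 1, n)      # unit-variance white noise

snr_raw = float(data.max())              # peak of the unfiltered spectrum
snr_filt = matched_filter_snr(data, template)
```

    The filter gains roughly the square root of the number of samples the template spans, which is why a line invisible channel-by-channel can still be detected confidently, and why the boost grows at higher spatial resolution.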

  4. Volumetric imaging of supersonic boundary layers using filtered Rayleigh scattering background suppression

    NASA Technical Reports Server (NTRS)

    Forkey, Joseph N.; Lempert, Walter R.; Bogdonoff, Seymour M.; Miles, Richard B.; Russell, G.

    1995-01-01

    We demonstrate the use of Filtered Rayleigh Scattering and a 3D reconstruction technique to interrogate the highly three dimensional flow field inside a supersonic inlet model. A 3 inch by 3 inch by 2.5 inch volume is reconstructed, yielding 3D visualizations of the crossing shock waves and of the boundary layer. In this paper we discuss the details of the techniques used, and present the reconstructed 3D images.

  5. Wear Detection of Drill Bit by Image-based Technique

    NASA Astrophysics Data System (ADS)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method for tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary datasets. Then, an edge detection method was applied to characterize the edge of the drill bit. Using a cross-correlation method, the edges of the original and worn drill bits were correlated with each other. Cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
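    The cross-correlation comparison of edge profiles can be sketched in numpy. The V-shaped "drill lip" profiles below are invented stand-ins for the extracted edges; only the normalized cross-correlation step reflects the method described.

```python
import numpy as np

def edge_correlation(profile_a, profile_b):
    """Peak normalized cross-correlation between two 1D edge profiles;
    values near 1 mean the second edge still matches the first."""
    a = (profile_a - profile_a.mean()) / (profile_a.std() * len(profile_a))
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.correlate(a, b, mode="full").max())

x = np.linspace(0.0, 1.0, 200)
new_edge = np.abs(x - 0.5)                            # ideal V-shaped lip
worn_edge = np.abs(x - 0.5) + 0.03 * np.sin(40 * x)   # chipped/worn lip

same = edge_correlation(new_edge, new_edge)
worn = edge_correlation(new_edge, worn_edge)
```

    The drop of the correlation peak below 1 is the small but detectable difference the abstract refers to; thresholding it gives a simple wear indicator.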

  6. Laser-induced acoustic imaging of underground objects

    NASA Astrophysics Data System (ADS)

    Li, Wen; DiMarzio, Charles A.; McKnight, Stephen W.; Sauermann, Gerhard O.; Miller, Eric L.

    1999-02-01

    This paper introduces a new demining technique based on the photo-acoustic interaction, together with results from photo-acoustic experiments. We have buried different types of targets (metal, rubber and plastic) in different media (sand, soil and water) and imaged them by measuring reflection of acoustic waves generated by irradiation with a CO2 laser. Research has been focused on the signal acquisition and signal processing. A deconvolution method using Wiener filters is utilized in data processing. Using a uniform spatial distribution of laser pulses at the ground's surface, we obtained 3D images of buried objects. The images give us a clear representation of the shapes of the underground objects. The quality of the images depends on the mismatch of acoustic impedance of the buried objects, the bandwidth and center frequency of the acoustic sensors and the selection of filter functions.
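    Wiener deconvolution itself is standard and can be sketched in 1D. The impulsive "reflector" and three-tap smearing kernel below are invented for illustration; the paper's sensor responses and filter functions are not reproduced.

```python
import numpy as np

def wiener_deconvolve(y, h, nsr=0.01):
    """Wiener deconvolution in the frequency domain:
    X_hat(f) = Y(f) * conj(H(f)) / (|H(f)|^2 + NSR),
    where NSR is the assumed noise-to-signal power ratio (regularizer)."""
    n = len(y)
    H = np.fft.fft(h, n)
    X = np.fft.fft(y) * np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.ifft(X).real

# A sharp acoustic reflector blurred by a broad sensor response.
n = 128
truth = np.zeros(n); truth[40] = 1.0
h = np.array([0.25, 0.5, 0.25])           # smearing kernel
observed = np.fft.ifft(np.fft.fft(truth) * np.fft.fft(h, n)).real
recovered = wiener_deconvolve(observed, h)
```

    The `conj(H)` term also undoes the phase shift introduced by the kernel, so the recovered reflector lands back at its true position rather than the blurred peak's.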

  7. Detection and Evaluation of Skin Disorders by One of Photogrammetric Image Analysis Methods

    NASA Astrophysics Data System (ADS)

    Güçin, M.; Patias, P.; Altan, M. O.

    2012-08-01

    Abnormalities on skin may vary from simple acne to painful wounds which affect a person's quality of life. Detection of these kinds of disorders in early stages, followed by the evaluation of abnormalities, is of high importance. At this stage, photogrammetry offers a non-contact solution to this concern by providing geometrically highly accurate data. Photogrammetry, which was first used for topographic purposes, has by virtue of terrestrial photogrammetry become a useful technique in non-topographic applications as well (Wolf et al., 2000). Moreover, as the use of photogrammetry has expanded in parallel with developments in technology, analogue photographs have been replaced by digital images, and digital image processing techniques allow the modification of digital images using filters, registration processes, etc. Besides, photogrammetry (using the same coordinate system through registration of images) can serve as a tool for the comparison of temporal imaging data. The aim of this study is to examine several digital image processing techniques, in particular digital filters, which might be useful for determining skin disorders. In our study we examine software that is affordable to purchase and user friendly, requiring neither expertise nor pre-training. Since this is preliminary work for subsequent and deeper studies, Adobe Photoshop 7.0 is used as the present software. In addition, Adobe Photoshop released a DesAcc plug-in with the CS3 version that provides full compatibility with DICOM (Digital Imaging and Communications in Medicine) and PACS (Picture Archiving and Communications System), which enables doctors to store all medical data together with relevant images and share them if necessary.

  8. Masked-backlighter technique used to simultaneously image x-ray absorption and x-ray emission from an inertial confinement fusion plasma.

    PubMed

    Marshall, F J; Radha, P B

    2014-11-01

    A method to simultaneously image both the absorption and the self-emission of an imploding inertial confinement fusion plasma has been demonstrated on the OMEGA Laser System. The technique involves the use of a high-Z backlighter, half of which is covered with a low-Z material, and a high-speed x-ray framing camera aligned to capture images backlit by this masked backlighter. Two strips of the four-strip framing camera record images backlit by the high-Z portion of the backlighter, while the other two strips record images aligned with the low-Z portion of the backlighter. The emission from the low-Z material is effectively eliminated by a high-Z filter positioned in front of the framing camera, limiting the detected backlighter emission to that of the principal emission line of the high-Z material. As a result, half of the images are of self-emission from the plasma and the other half are of self-emission plus the backlighter. The advantage of this technique is that the self-emission simultaneous with backlighter absorption is independently measured from a nearby direction. The absorption occurs only in the high-Z backlit frames and is either spatially separated from the emission or the self-emission is suppressed by filtering, or by using a backlighter much brighter than the self-emission, or by subtraction. The masked-backlighter technique has been used on the OMEGA Laser System to simultaneously measure the emission profiles and the absorption profiles of polar-driven implosions.

  9. A multi-modal stereo microscope based on a spatial light modulator.

    PubMed

    Lee, M P; Gibson, G M; Bowman, R; Bernet, S; Ritsch-Marte, M; Phillips, D B; Padgett, M J

    2013-07-15

    Spatial Light Modulators (SLMs) can emulate the classic microscopy techniques, including differential interference (DIC) contrast and (spiral) phase contrast. Their programmability entails the benefit of flexibility or the option to multiplex images, for single-shot quantitative imaging or for simultaneous multi-plane imaging (depth-of-field multiplexing). We report the development of a microscope sharing many of the previously demonstrated capabilities, within a holographic implementation of a stereo microscope. Furthermore, we use the SLM to combine stereo microscopy with a refocusing filter and with a darkfield filter. The instrument is built around a custom inverted microscope and equipped with an SLM which gives various imaging modes laterally displaced on the same camera chip. In addition, there is a wide angle camera for visualisation of a larger region of the sample.

  10. Intelligent identification of remnant ridge edges in region west of Yongxing Island, South China Sea

    NASA Astrophysics Data System (ADS)

    Wang, Weiwei; Guo, Jing; Cai, Guanqiang; Wang, Dawei

    2018-02-01

    Edge detection enables identification of geomorphologic unit boundaries and thus assists with geomorphological mapping. In this paper, an intelligent edge identification method is proposed and image processing techniques are applied to multi-beam bathymetry data. To accomplish this, a color image is generated from the bathymetry, and a weighted method is used to convert the color image to a gray image. As the quality of the image has a significant influence on edge detection, different filter methods are applied to the gray image for de-noising. The peak signal-to-noise ratio and mean square error are calculated to evaluate which filter method is most appropriate for depth image filtering, and the edge is subsequently detected using an image binarization method. Traditional image binarization methods cannot manage the complicated uneven seafloor, and therefore a binarization method is proposed that is based on the difference between image pixel values; the appropriate threshold for image binarization is estimated according to the probability distribution of pixel value differences between two adjacent pixels in the horizontal and vertical directions, respectively. Finally, an eight-neighborhood frame is adopted to thin the binary image, connect the intermittent edge, and implement contour extraction. Experimental results show that the method described here can recognize the main boundaries of geomorphologic units. In addition, the proposed automatic edge identification method avoids use of subjective judgment, and reduces time and labor costs.
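    The thresholding idea, keying the binarization threshold to the empirical distribution of adjacent-pixel differences, can be sketched in numpy. The percentile rule and the synthetic seafloor below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(4)

def edge_binarize(depth, pct=95):
    """Binarize a bathymetric image by thresholding adjacent-pixel
    differences: the threshold is the pct-th percentile of all absolute
    horizontal and vertical pixel-value differences, so only unusually
    large local depth changes are marked as edge candidates."""
    depth = np.asarray(depth, dtype=float)
    dh = np.abs(np.diff(depth, axis=1))   # horizontal neighbours
    dv = np.abs(np.diff(depth, axis=0))   # vertical neighbours
    thresh = np.percentile(np.concatenate([dh.ravel(), dv.ravel()]), pct)
    edges = np.zeros(depth.shape, dtype=bool)
    edges[:, :-1] |= dh > thresh
    edges[:-1, :] |= dv > thresh
    return edges

# Flat seafloor with measurement noise and one sharp 30 m ridge edge.
depth = np.full((20, 20), -50.0) + rng.normal(0.0, 0.05, (20, 20))
depth[:, 10:] -= 30.0
edges = edge_binarize(depth)
```

    Because the threshold adapts to the data's own difference statistics, a gently uneven seafloor does not flood the edge map the way a fixed global threshold would; the paper then thins and links the resulting binary map.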

  11. Hyperspectral small animal fluorescence imaging: spectral selection imaging

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas; Jiang, Yanan; Patsekin, Valery; Hall, Heidi; Vizard, Douglas; Robinson, J. Paul

    2008-02-01

    Molecular imaging is a rapidly growing area of research, fueled by needs in pharmaceutical drug development for high-throughput screening methods, by pre-clinical and clinical screening for visualizing tumor growth and drug targeting, and by a growing number of applications in the molecular biology fields. Small animal fluorescence imaging employs fluorescent probes to target molecular events in vivo, with a large number of molecular targeting probes readily available. The ease with which new targeting compounds can be developed, the short acquisition times, and the low cost (compared to microCT, MRI, or PET) make fluorescence imaging attractive. However, small animal fluorescence imaging suffers from high optical scattering, absorption, and autofluorescence. Many of these problems can be overcome through multispectral imaging techniques, which collect images at different fluorescence emission wavelengths, followed by analysis, classification, and spectral deconvolution methods to isolate signals from fluorescence emission. We present an alternative to the current method, using hyperspectral excitation scanning (spectral selection imaging), a technique that allows excitation at any wavelength in the visible and near-infrared range. In many cases, excitation imaging may be more effective at identifying specific fluorescence signals because of the higher complexity of the fluorophore excitation spectrum. Because the excitation is filtered and not the emission, the resolution limit and image shift imposed by acousto-optic tunable filters have no effect on imager performance. We discuss the design of the imager, its optimization for small animal fluorescence imaging, and the application of spectral analysis and classification methods for identifying specific fluorescence signals.

  12. Peering through the flames: imaging techniques for reacting aluminum powders

    DOE PAGES

    Zepper, Ethan T.; Pantoya, Michelle L.; Bhattacharya, Sukalyan; ...

    2017-03-17

    Combusting metals burn at high temperatures and emit high-intensity radiation in the visible spectrum, which can over-saturate regular imaging sensors and obscure the field of view. Filtering the luminescence can result in limited information and hinder thorough combustion characterization. A method for “seeing through the flames” of a highly luminescent aluminum powder reaction is presented using copper vapor laser (CVL) illumination synchronized with a high-speed camera. A statistical comparison of combusting aluminum particle agglomerates between filtered halogen and CVL illumination shows the effectiveness of this diagnostic approach. When ignited by an electrically induced plasma, aluminum particles are entrained as solid agglomerates that rotate about their centers of mass and are surrounded by emitted, burning gases. The average agglomerate diameter appears to be 160 micrometers when viewed with standard illumination and a high-speed camera, but a significantly smaller diameter of 50 micrometers is recorded when imaged with CVL illumination. Our results indicate that alternative imaging techniques are required to resolve the complexities of metal particle combustion.

  13. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in quantifying coronary calcium.

    PubMed

    Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi

    2016-01-01

    Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters introduced by ASIR as compared to filtered back projection (FBP) may influence the quantification of coronary calcium. To investigate the influence of ASIR on calcium quantification in comparison to FBP, CT images of 352 patients were reconstructed from the same raw data using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and ASIR 100%. Image noise, plaque density, Agatston scores, and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP (each P < 0.001). Use of ASIR reduced Agatston scores by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, ASIR may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
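
    For context, the Agatston score that ASIR is found to depress is computed per slice by thresholding at 130 HU, labelling connected calcified lesions, and weighting each lesion's area by its peak attenuation. The sketch below uses the standard Agatston density bands; the simple labelling helper and the 1 mm² minimum-area cutoff are illustrative assumptions, not details taken from this study.

```python
import numpy as np

def _label4(mask):
    """Minimal 4-connected component labelling (stand-in for scipy.ndimage.label)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        while stack:
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if not mask[r, c] or labels[r, c]:
                continue
            labels[r, c] = current
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, current

def agatston_weight(max_hu):
    """Density cofactor from the lesion's peak attenuation (standard bands)."""
    return 4 if max_hu >= 400 else 3 if max_hu >= 300 else \
           2 if max_hu >= 200 else 1 if max_hu >= 130 else 0

def agatston_slice_score(hu_slice, pixel_area_mm2, min_area_mm2=1.0):
    """Score one axial slice: area of each lesion times its density cofactor."""
    labels, n = _label4(hu_slice >= 130)
    score = 0.0
    for i in range(1, n + 1):
        lesion = labels == i
        area = lesion.sum() * pixel_area_mm2
        if area >= min_area_mm2:            # ignore sub-millimetre specks
            score += area * agatston_weight(hu_slice[lesion].max())
    return score
```

    Because the cofactor steps down whenever a lesion's peak HU crosses a band boundary, the blurring and peak-HU reduction reported for ASIR translate directly into lower scores.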

  14. Photographic film image enhancement

    NASA Technical Reports Server (NTRS)

    Horner, J. L.

    1975-01-01

    A series of experiments was undertaken to assess the feasibility of defogging color film by the technique of optical spatial filtering. A coherent optical processor was built using red, blue, and green laser light input and specially designed Fourier transformation lenses. An array of spatial filters was fabricated on black-and-white emulsion slides using the coherent optical processor. The technique was first applied to laboratory white-light-fogged film, and the results were successful. However, when the same technique was applied to some original Apollo X radiation-fogged color negatives, the results showed no similar restoration. Examples of each experiment are presented, and possible reasons for the lack of restoration in the Apollo films are discussed.

  15. Elimination of coherent noise in a coherent light imaging system

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.; Hermann, R. L.; Paull, H. B.; Shulman, A. R.

    1970-01-01

    Optical imaging systems using coherent light introduce objectionable noise into the output image plane. Dust and bubbles on and in lenses cause most of the noise in the output image. This noise usually appears as bull's-eye diffraction patterns in the image. By rotating the lens about the optical axis these diffraction patterns can be essentially eliminated. The technique does not destroy the spatial coherence of the light and permits spatial filtering of the input plane.

  16. Extraction of latent images from printed media

    NASA Astrophysics Data System (ADS)

    Sergeyev, Vladislav; Fedoseev, Victor

    2015-12-01

    In this paper we propose an automatic technology for the extraction of latent images from printed media such as documents, banknotes, and financial securities. The technology includes image processing by an adaptively constructed Gabor filter bank to obtain feature images, followed by stages of feature selection, grouping, and multicomponent segmentation. The main advantage of the proposed technique is its versatility: it can extract latent images produced by different texture variations. Experimental results comparing the performance of the method against another known system for latent image extraction are given.
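
    The first stage, an orientation-selective Gabor filter bank producing one feature image per orientation, can be sketched as follows. This is a generic bank with fixed orientations; the paper's adaptive construction of the kernel parameters is not reproduced, so the parameter values here are illustrative.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, wavelength, gamma=0.5):
    """Real Gabor kernel: a cosine carrier at angle `theta` with period
    `wavelength`, windowed by an elliptical Gaussian envelope."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def convolve_same(image, kernel):
    """'Same'-size correlation via zero padding and a sliding-window view."""
    kh, kw = kernel.shape
    padded = np.pad(image.astype(float), ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    windows = np.lib.stride_tricks.sliding_window_view(padded, kernel.shape)
    return np.einsum('ijkl,kl->ij', windows, kernel)

def gabor_features(image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
                   ksize=15, sigma=3.0, wavelength=6.0):
    """One feature image per orientation: magnitude of the filter response."""
    return [np.abs(convolve_same(image, gabor_kernel(ksize, sigma, t, wavelength)))
            for t in thetas]
```

    A texture whose dominant orientation and period match one kernel responds strongly in that feature image and weakly in the others, which is what makes the subsequent feature selection and segmentation possible.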

  17. Gabor fusion master slave optical coherence tomography

    PubMed Central

    Cernat, Ramona; Bradu, Adrian; Israelsen, Niels Møller; Bang, Ole; Rivet, Sylvain; Keane, Pearse A.; Garway-Heath, David; Rajendram, Ranjan; Podoleanu, Adrian

    2017-01-01

    This paper describes the application of the Gabor filtering protocol to a Master/Slave (MS) swept source optical coherence tomography (SS-OCT) system at 1300 nm. The MS-OCT system delivers information from selected depths, a property that allows operation similar to that of a time domain OCT system, where dynamic focusing is possible. The Gabor filtering processing following collection of multiple data from different focus positions differs from that utilized by a conventional swept source OCT system using a Fast Fourier transform (FFT) to produce an A-scan. Instead of selecting the bright parts of A-scans for each focus position, to be placed in a final B-scan image (or in a final volume), and discarding the rest, the MS principle can be employed to deliver signal only from the depths within each focus range. The MS procedure is illustrated by creating volumes of data of constant transversal resolution from a cucumber and from an insect, by repeating data acquisition for 4 different focus positions. In addition, advantage is taken of the tolerance to dispersion of the MS principle, which allows automatic compensation for dispersion created by layers above the object of interest. By combining the two techniques, Gabor filtering and Master/Slave, a powerful imaging instrument is demonstrated. The Master/Slave technique allows simultaneous display of three categories of images in one frame: multiple-depth en-face OCT images, two cross-sectional OCT images, and a confocal-like image obtained by averaging the en-face ones. We also demonstrate the superiority of MS-OCT over its FFT-based counterpart when used with a Gabor filtering OCT instrument in terms of the speed of assembling the fused volume. For our case, we show that when more than 4 focus positions are required to produce the final volume, MS is faster than the conventional FFT-based procedure. PMID:28270987

  18. Trends in Correlation-Based Pattern Recognition and Tracking in Forward-Looking Infrared Imagery

    PubMed Central

    Alam, Mohammad S.; Bhuiyan, Sharif M. A.

    2014-01-01

    In this paper, we review the recent trends and advancements on correlation-based pattern recognition and tracking in forward-looking infrared (FLIR) imagery. In particular, we discuss matched filter-based correlation techniques for target detection and tracking which are widely used for various real time applications. We analyze and present test results involving recently reported matched filters such as the maximum average correlation height (MACH) filter and its variants, and distance classifier correlation filter (DCCF) and its variants. Test results are presented for both single/multiple target detection and tracking using various real-life FLIR image sequences. PMID:25061840
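
    The matched-filter idea underlying MACH-type detection, correlating the scene with a target template and thresholding the correlation peak, can be sketched with a plain zero-mean normalized cross-correlation. This is the textbook baseline that the reviewed filters build on, not the MACH or DCCF filter synthesis itself.

```python
import numpy as np

def normalized_correlation(scene, template):
    """Zero-mean normalized cross-correlation surface over all valid
    template placements; values near 1 indicate a strong match."""
    t = template.astype(float) - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    windows = np.lib.stride_tricks.sliding_window_view(
        scene.astype(float), template.shape)
    w = windows - windows.mean(axis=(2, 3), keepdims=True)
    wnorm = np.sqrt((w ** 2).sum(axis=(2, 3)))
    corr = np.einsum('ijkl,kl->ij', w, t)
    return corr / np.maximum(wnorm * tnorm, 1e-12)

def detect_target(scene, template):
    """Return the top-left corner of the best match and its correlation value."""
    c = normalized_correlation(scene, template)
    peak = np.unravel_index(np.argmax(c), c.shape)
    return peak, float(c[peak])
```

    MACH and its variants replace the single template with a filter synthesized from many training views so that the correlation peak stays sharp under target distortion; the peak-detection step is the same.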

  19. Co-registration of ultrasound and frequency-domain photoacoustic radar images and image improvement for tumor detection

    NASA Astrophysics Data System (ADS)

    Dovlo, Edem; Lashkari, Bahman; Choi, Sung soo Sean; Mandelis, Andreas

    2015-03-01

    This paper demonstrates the co-registration of ultrasound (US) and frequency domain photoacoustic radar (FD-PAR) images with significant image improvement from applying image normalization, filtering and amplification techniques. Achieving PA imaging functionality on a commercial Ultrasound instrument could accelerate clinical acceptance and use. Experimental results presented demonstrate live animal testing and show enhancements in signal-to-noise ratio (SNR), contrast and spatial resolution. The co-registered image produced from the US and phase PA images, provides more information than both images independently.

  20. A 3D ultrasound scanner: real time filtering and rendering algorithms.

    PubMed

    Cifarelli, D; Ruggiero, C; Brusacà, M; Mazzarella, M

    1997-01-01

    The work described here has been carried out within a collaborative project between DIST and ESAOTE BIOMEDICA aiming to set up a new ultrasonic scanner performing 3D reconstruction. A system is being set up to process and display 3D ultrasonic data in a fast, economical and user-friendly way to help the physician during diagnosis. A comparison is presented among several algorithms for digital filtering, data segmentation and rendering for real-time, PC-based, three-dimensional reconstruction from B-mode ultrasonic biomedical images. Several algorithms for digital filtering have been compared with respect to processing time and final image quality. Three-dimensional data segmentation and rendering techniques have been evaluated with special reference to user-friendly features for foreseeable applications and to reconstruction speed.

  1. Rapid calibrated high-resolution hyperspectral imaging using tunable laser source

    NASA Astrophysics Data System (ADS)

    Nguyen, Lam K.; Margalith, Eli

    2009-05-01

    We present a novel hyperspectral imaging technique based on tunable laser technology. By replacing the broadband source and tunable filters of a typical NIR imaging instrument, several advantages are realized, including: high spectral resolution, highly variable field-of-views, fast scan-rates, high signal-to-noise ratio, and the ability to use optical fiber for efficient and flexible sample illumination. With this technique, high-resolution, calibrated hyperspectral images over the NIR range can be acquired in seconds. The performance of system features will be demonstrated on two example applications: detecting melamine contamination in wheat gluten and separating bovine protein from wheat protein in cattle feed.

  2. Quantitative phase-filtered wavelength-modulated differential photoacoustic radar tumor hypoxia imaging toward early cancer detection.

    PubMed

    Dovlo, Edem; Lashkari, Bahman; Soo Sean Choi, Sung; Mandelis, Andreas; Shi, Wei; Liu, Fei-Fei

    2017-09-01

    Overcoming the limitations of conventional linear spectroscopy used in multispectral photoacoustic imaging, wherein a linear relationship is assumed between the absorbed optical energy and the absorption spectra of the chromophore at a specific location, is crucial for obtaining accurate spatially resolved quantitative functional information by exploiting known chromophore-specific spectral characteristics. This study introduces a non-invasive phase-filtered differential photoacoustic technique, wavelength-modulated differential photoacoustic radar (WM-DPAR) imaging, that addresses this issue by eliminating the effect of the unknown wavelength-dependent fluence. It employs two laser wavelengths modulated out of phase to significantly suppress background absorption while amplifying the difference between the two photoacoustic signals. This facilitates pre-malignant tumor identification and hypoxia monitoring, as minute changes in total hemoglobin concentration and hemoglobin oxygenation are detectable. The system can be tuned for specific applications such as cancer screening and SO2 quantification by regulating the amplitude ratio and phase shift of the signal. The WM-DPAR imaging of a head and neck carcinoma tumor grown in the thigh of a nude rat demonstrates the functional PA imaging of small animals in vivo. The PA appearance of the tumor in relation to tumor vascularity is investigated by immunohistochemistry. Phase-filtered WM-DPAR imaging is also illustrated, maximizing the fidelity of quantitative SO2 imaging of tissues; oxygenation levels are quantified within a tumor grown in the thigh of a nude rat using the two-wavelength phase-filtered differential PAR method. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Label-free optical lymphangiography: development of an automatic segmentation method applied to optical coherence tomography to visualize lymphatic vessels using Hessian filters

    PubMed Central

    Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei

    2013-01-01

    Abstract. Lymphatic vessels are a part of the circulatory system that collects plasma and other substances that have leaked from the capillaries into interstitial fluid (lymph) and transports lymph back to the circulatory system. Since lymph is transparent, lymphatic vessels appear as dark hollow vessel-like regions in optical coherence tomography (OCT) cross-sectional images. We propose an automatic method to segment the lymphatic vessel lumen from OCT structural cross sections using eigenvalues of Hessian filters. Compared to the existing method based on an intensity threshold, Hessian filters are more selective of vessel shape and less sensitive to intensity variations and noise. Using this segmentation technique along with optical micro-angiography allows label-free, noninvasive, simultaneous visualization of blood and lymphatic vessels in vivo. Lymphatic vessels play an important role in cancer, immune system response, inflammatory disease, wound healing and tissue regeneration. The development of imaging techniques and visualization tools for lymphatic vessels is valuable in understanding the mechanisms and studying therapeutic methods in related disease and tissue response. PMID:23922124
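
    The core of such a method, eigenvalues of the smoothed image's Hessian distinguishing tubular (vessel-like) regions from blobs and flat background, can be sketched as a single-scale toy filter responding to dark tubes. The study's actual filter, scales, and thresholds may differ; this only illustrates the eigenvalue reasoning.

```python
import numpy as np

def _convolve_same(image, kernel):
    """'Same'-size correlation with edge-replicated padding."""
    kh, kw = kernel.shape
    padded = np.pad(image.astype(float),
                    ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode='edge')
    windows = np.lib.stride_tricks.sliding_window_view(padded, kernel.shape)
    return np.einsum('ijkl,kl->ij', windows, kernel)

def gaussian_smooth(image, sigma):
    r = max(1, int(3 * sigma))
    g = np.exp(-np.arange(-r, r + 1) ** 2 / (2 * sigma ** 2))
    g /= g.sum()
    return _convolve_same(image, np.outer(g, g))

def dark_vesselness(image, sigma=2.0):
    """Hessian-eigenvalue response to dark tubular structures: large positive
    curvature across the vessel (l2), near-zero curvature along it (l1)."""
    s = gaussian_smooth(image, sigma)
    gy, gx = np.gradient(s)
    gyy, _ = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    # analytic eigenvalues of the 2x2 symmetric Hessian [[gxx, gxy], [gxy, gyy]]
    tr, det = gxx + gyy, gxx * gyy - gxy ** 2
    disc = np.sqrt(np.maximum((tr / 2) ** 2 - det, 0.0))
    l1, l2 = tr / 2 - disc, tr / 2 + disc          # l1 <= l2
    return np.where(l2 > 0, l2 - np.abs(l1), 0.0)  # tube: l2 >> |l1|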

  4. Implementation issues of the nearfield equivalent source imaging microphone array

    NASA Astrophysics Data System (ADS)

    Bai, Mingsian R.; Lin, Jia-Hong; Tseng, Chih-Wen

    2011-01-01

    This paper revisits a nearfield microphone array technique termed nearfield equivalent source imaging (NESI) proposed previously. In particular, various issues concerning the implementation of the NESI algorithm are examined. The NESI can be implemented in both the time domain and the frequency domain. Acoustical variables including sound pressure, particle velocity, active intensity and sound power are calculated by using multichannel inverse filters. Issues concerning sensor deployment are also investigated for the nearfield array. The uniform array outperformed a random array previously optimized for far-field imaging, which contradicts the conventional wisdom in far-field arrays. For applications in which only a patch array with scarce sensors is available, a virtual microphone approach is employed to ameliorate edge effects using extrapolation and to improve imaging resolution using interpolation. To enhance the processing efficiency of the time-domain NESI, an eigensystem realization algorithm (ERA) is developed. Several filtering methods are compared in terms of computational complexity. Significant saving on computations can be achieved using ERA and the frequency-domain NESI, as compared to the traditional method. The NESI technique was also experimentally validated using practical sources, including a 125 cc scooter and a wooden box model with a loudspeaker fitted inside, and proved effective in identifying the broadband and non-stationary noise these sources produced.

  5. Utilization of MatPIV program to different geotechnical models

    NASA Astrophysics Data System (ADS)

    Aklik, P.; Idinger, G.

    2009-04-01

    The Particle Image Velocimetry (PIV) technique is being used to measure soil displacements. PIV has been used for many years in fluid mechanics, but for physical modeling in geotechnical engineering this technique is still relatively new. PIV has seen worldwide growth in soil mechanics over the last decade owing to developments in digital cameras and laser technologies. The use of PIV is feasible provided the surface contains sufficient texture. A Cambridge group has shown that natural sand contains enough texture for applying PIV. In a texture-based approach, the only requirement is for any patch, big or small, to be sufficiently unique so that statistical tracking of this patch is possible. In this paper, some soil mechanics models were investigated, such as retaining walls, slope failures, and foundations. The photographs were taken with a high-resolution digital camera, the displacements of soils were evaluated with free software named MatPIV, and displacement graphics between the two images were obtained. The Nikon D60 digital camera has a 10.2 MP sensor and special properties that make it suitable for PIV applications: an Airflow Control System and image-sensor cleaning for protection against dust, Active D-Lighting for highlighted or shadowed areas while shooting, and an advanced three-point AF system for fast, efficient and precise autofocus. Its fast and continuous shooting mode enables up to 100 JPEG images at three frames per second. Norm sand (DIN 1164) was used for all the models in a rectangular glass box. For every experiment, MatPIV was used to calculate the velocities from the two images.
    The MatPIV program was used in two ways, an easy way and a difficult way. In the easy way, the two images were analyzed with 64×64 pixel interrogation windows with 50% or 75% overlap, the calculation was performed with a single iteration through the images, and the result consisted of four matrices measured in pixels and pixels/second. At the end of the iteration, the results were visualized. In the difficult way of applying MatPIV, a grid of points was inserted into the research model and the first image was taken with the Nikon D60 digital camera. Afterwards, the size of a pixel in the image and the orientation of the coordinate system were calculated. If there are no particles in the investigated region on which to perform PIV calculations, the best approach is to mask out this empty region. The crucial step in PIV is the particle image analysis, which determines the displacements between two successive images. The first image was divided into a grid of test patches. Each test patch consisted of a sample of the image matrix of size L × L pixels. To find the displacement of the test patch between images 1 and 2, a search patch was extracted from the second image. The cross-correlation of test patch and search patch was evaluated. The resulting normalized correlation plane indicated the "degree of match" between the test and search patches. The highest peak in the normalized correlation plane indicated the displacement vector of the test patch. The procedure described above for evaluating a single displacement vector was repeated for the entire grid of test patches, producing the displacement field between the image pair. After the calculations had been performed, there were many wild vectors, due to low image quality in some parts of the images, to be removed with the help of different filters. There are four different filters in MatPIV: a signal-to-noise ratio filter, a peak height filter, a global filter, and a local filter.
    The filters were used step by step to decide which filter gave the best result for the related images. As a last step, the two approaches were compared for each geotechnical model.
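
    The cross-correlation step described above, extracting a test patch from image 1, searching image 2, and taking the highest peak of the normalized correlation plane as the displacement vector, can be sketched as follows. This is an integer-pixel search only; MatPIV additionally refines peaks to sub-pixel accuracy and applies the four outlier filters mentioned.

```python
import numpy as np

def track_patch(img1, img2, top, left, size, search=8):
    """Displacement of a test patch from img1 to img2: the (drow, dcol)
    shift whose zero-mean normalized correlation is highest."""
    p = img1[top:top + size, left:left + size].astype(float)
    p = p - p.mean()
    pnorm = np.sqrt((p ** 2).sum())
    best_score, best_shift = -np.inf, (0, 0)
    h, w = img2.shape
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = top + dr, left + dc
            if r < 0 or c < 0 or r + size > h or c + size > w:
                continue  # search patch would leave the image
            q = img2[r:r + size, c:c + size].astype(float)
            q = q - q.mean()
            denom = pnorm * np.sqrt((q ** 2).sum())
            score = (p * q).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift, best_score
```

    Repeating this over the whole grid of test patches yields the displacement field; a low best score is exactly the condition the signal-to-noise ratio and peak height filters use to reject wild vectors.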

  6. Economical Emission-Line Mapping: ISM Properties of Nearby Protogalaxy Analogs

    NASA Astrophysics Data System (ADS)

    Monkiewicz, Jacqueline A.

    2017-01-01

    Optical emission line imaging can produce a wealth of information about the conditions of the interstellar medium, but a full set of custom emission-line filters for a professional-grade telescope camera can cost many thousands of dollars. A cheaper alternative is to use commercially-produced 2-inch narrow-band astrophotography filters. In order to use these standardized filters with professional-grade telescope cameras, custom filter mounts must be manufactured for each individual filter wheel. These custom filter adaptors are produced by 3-D printing rather than standard machining, which further lowers the total cost. I demonstrate the feasibility of this technique with H-alpha, H-beta, and [OIII] emission line mapping of the low metallicity star-forming galaxies IC10 and NGC 1569, taken with my astrophotography filter set on three different 2-meter class telescopes in Southern Arizona.

  7. Techniques to improve the accuracy of noise power spectrum measurements in digital x-ray imaging based on background trends removal.

    PubMed

    Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin

    2011-03-01

    Noise characterization through estimation of the noise power spectrum (NPS) is a central component of the evaluation of digital x-ray systems. Extensive work has been conducted to achieve accurate and precise measurement of the NPS. One approach to improving the accuracy of the NPS measurement is to reduce the statistical variance of the NPS results by involving more data samples. However, this method is based on the assumption that the noise in a radiographic image arises from stochastic processes. In practical data, artifacts are always superimposed on the stochastic noise as low-frequency background trends and prevent accurate NPS estimation. The purpose of this study was to investigate an appropriate background detrending technique to improve the accuracy of NPS estimation for digital x-ray systems. To find the optimal background detrending technique for NPS estimation, four methods of artifact removal were quantitatively studied and compared: (1) subtraction of a low-pass-filtered version of the image, (2) subtraction of a 2-D first-order fit to the image, (3) subtraction of a 2-D second-order polynomial fit to the image, and (4) subtraction of two uniform exposure images. In addition, background trend removal was applied either within the original region of interest or separately within its partitioned sub-blocks for all four methods. The performance of the background detrending techniques was compared according to the statistical variance of the NPS results and the suppression of the low-frequency systematic rise. Among the four methods, subtraction of a 2-D second-order polynomial fit to the image was most effective in suppressing the low-frequency systematic rise and reducing the variance of the NPS estimate for the authors' digital x-ray system. Subtraction of a low-pass-filtered version of the image led to increased NPS variance above the low-frequency components because of the side-lobe effects of the frequency response of the boxcar filtering function.
    Subtracting two uniform exposure images gave the worst result for the smoothness of the NPS curve, although it was effective in suppressing the low-frequency systematic rise. Subtraction of a 2-D first-order fit to the image was also effective for background detrending, but it was inferior to subtraction of a 2-D second-order polynomial fit for the authors' digital x-ray system. As a result of this study, the authors verified that it is necessary and feasible to obtain a better NPS estimate by appropriate background trend removal. Subtraction of a 2-D second-order polynomial fit to the image was the most appropriate technique for background detrending, without consideration of processing time.
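
    The preferred detrending method, subtracting a least-squares 2-D second-order polynomial fit before Fourier transforming the ROI, can be sketched as follows. The NPS normalization uses the generic pixel-pitch convention; the authors' ROI partitioning and averaging over many noise realizations are omitted, and the pixel pitch is an illustrative parameter.

```python
import numpy as np

def poly2_background(image):
    """Least-squares 2-D second-order polynomial surface fitted to an image."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel() / w, yy.ravel() / h      # normalize for conditioning
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    coef, *_ = np.linalg.lstsq(A, image.ravel().astype(float), rcond=None)
    return (A @ coef).reshape(h, w)

def nps_2d(roi, pixel_pitch_mm=0.1):
    """2-D noise power spectrum of a uniform-exposure ROI after subtracting
    the fitted second-order background trend."""
    d = roi - poly2_background(roi)
    h, w = roi.shape
    dft = np.fft.fftshift(np.fft.fft2(d))
    return (np.abs(dft) ** 2) * (pixel_pitch_mm ** 2) / (h * w)
```

    Because any quadratic background lies exactly in the fitted model space, a pure low-frequency trend is removed completely and only the stochastic noise contributes to the spectrum.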

  8. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  9. ERS-2 SAR and IRS-1C LISS III data fusion: A PCA approach to improve remote sensing based geological interpretation

    NASA Astrophysics Data System (ADS)

    Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.

    Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area have been enhanced using a Fast Fourier Transform (FFT) based filtering approach and also using a Frost filtering technique. Both enhanced SAR images have then been separately fused with a histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. Later, the Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis of the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas could be identified that need attention for further detailed ground-based geological studies.

  10. Label-Free, Flow-Imaging Methods for Determination of Cell Concentration and Viability.

    PubMed

    Sediq, A S; Klem, R; Nejadnik, M R; Meij, P; Jiskoot, Wim

    2018-05-30

    To investigate the potential of two flow imaging microscopy (FIM) techniques (Micro-Flow Imaging (MFI) and FlowCAM) to determine total cell concentration and cell viability, B-lineage acute lymphoblastic leukemia (B-ALL) cells of 2 different donors were exposed to ambient conditions. Samples were taken on different days and measured with MFI, FlowCAM, hemocytometry and automated cell counting. Dead and live cells from a fresh B-ALL cell suspension were fractionated by flow cytometry in order to derive software filters based on morphological parameters of the separate cell populations with MFI and FlowCAM. The filter sets were used to assess cell viability in the measured samples. All techniques gave fairly similar cell concentration values over the whole incubation period. MFI proved superior with respect to precision, whereas FlowCAM provided particle images with a higher resolution. Moreover, both FIM methods were able to provide results for cell viability similar to those of the conventional methods (hemocytometry and automated cell counting). FIM-based methods may be advantageous over conventional cell methods for determining total cell concentration and cell viability, as FIM measures much larger sample volumes, does not require labeling, is less laborious, and provides images of individual cells.

  11. Dual-energy approach to contrast-enhanced mammography using the balanced filter method: spectral optimization and preliminary phantom measurement.

    PubMed

    Saito, Masatoshi

    2007-11-01

    Dual-energy contrast agent-enhanced mammography is a technique for demonstrating breast cancers obscured by the cluttered background resulting from the contrast between soft tissues in the breast. The technique has usually been implemented by exploiting two exposures at different x-ray tube voltages. In this article, another dual-energy approach using the balanced filter method without switching the tube voltages is described. For the spectral optimization of dual-energy mammography using the balanced filters, we applied a theoretical framework reported by Lemacks et al. [Med. Phys. 29, 1739-1751 (2002)] to calculate the signal-to-noise ratio (SNR) in an iodinated contrast agent subtraction image. This permits the selection of beam parameters such as tube voltage and balanced filter material, and the optimization of the latter's thickness with respect to some critical quantity, in this case the mean glandular dose. For an imaging system with a 0.1 mm thick CsI:Tl scintillator, we predict that the optimal tube voltage would be 45 kVp for a tungsten anode using zirconium, iodine, and neodymium balanced filters. A mean glandular dose of 1.0 mGy is required to obtain an SNR of 5 in order to detect 1.0 mg/cm2 of iodine in the resulting clutter-free image of a 5 cm thick breast composed of 50% adipose and 50% glandular tissue. In addition to the spectral optimization, we carried out phantom measurements to demonstrate the present dual-energy approach for obtaining a clutter-free image, which preferentially shows iodine, of a breast phantom comprising three major components: acrylic spheres, olive oil, and an iodinated contrast agent. The detection of iodine details against the cluttered background originating from the contrast between acrylic spheres and olive oil is analogous to the task of distinguishing contrast agents in a mixture of glandular and adipose tissues.

  12. Arbitrary shape region-of-interest fluoroscopy system

    NASA Astrophysics Data System (ADS)

    Xu, Tong; Le, Huy; Molloi, Sabee Y.

    2002-05-01

    Region-of-interest (ROI) fluoroscopy has previously been investigated as a method to reduce x-ray exposure to the patient and the operator. This ROI fluoroscopy technique allows the operator to arbitrarily determine the shape, size, and location of the ROI. A device was used to generate patient-specific x-ray beam filters. The device comprises 18 step-motors that control a 16 × 16 matrix of pistons to form the filter from a deformable attenuating material. Patient exposure reduction was measured to be 84 percent for a 65 kVp beam. Operator exposure reduction was measured to be 69 percent. Due to the reduced x-ray scatter, image contrast was improved by 23 percent inside the ROI. The reduced gray level in the periphery was corrected using an experimentally determined compensation ratio. A running-average interpolation technique was used to eliminate artifacts from the ROI edge. As expected, the final corrected images show increased noise in the periphery. However, the anatomical structures in the periphery could still be visualized. This arbitrary-shape region-of-interest fluoroscopic technique was shown to be effective in its ability to reduce patient and operator exposure without significant reduction in image quality. The ability to define an arbitrarily shaped ROI should make the technique more clinically feasible.

  13. An investigation and conceptual design of a holographic starfield and landmark tracker

    NASA Technical Reports Server (NTRS)

    Welch, J. D.

    1973-01-01

    The analysis, experiments, and design effort of this study support the feasibility of the basic holographic tracker concept. Image intensifiers and photoplastic recording materials were examined, along with a Polaroid rapid-process silver halide material. A two-reference-beam coherent optical matched-filter technique was used for multiplexing spatial frequency filters for starfields. A 1 watt HeNe laser and an electro-optical readout were also considered.

  14. REMOTE LAND MINE(FIELD) DETECTION. An Overview of Techniques (DETECTIE VAN LANDMIJNEN EN MIJNENVELDEN OP AFSTAND. Een Overzicht van de technieken),

    DTIC Science & Technology

    1994-09-01

    Title: DETECTIE VAN LANDMIJNEN EN MIJNENVELDEN OP AFSTAND, een overzicht van de technieken (Remote detection of landmines and minefields, an overview of the techniques); author(s): Drs. J.S. Groot, Ir. Y.H.L. Janssen; date: September... functions based on set theory. The fundamental theory was developed in the 1960s and was applicable to binary images (black-and-white images... held at TNO-FEL. Various subjects related to fusion techniques: Dempster-Shafer theory, Bayesian inference, Kalman filtering, fuzzy logic. [A15], [B4

  15. Snapshot Imaging Spectrometry in the Visible and Long Wave Infrared

    NASA Astrophysics Data System (ADS)

    Maione, Bryan David

    Imaging spectrometry is an optical technique in which the spectral content of an object is measured at each location in space. The main advantage of this modality is that it enables characterization beyond what is possible with a conventional camera, since spectral information is generally related to the chemical composition of the object. Because of this, imaging spectrometers are often capable of detecting targets that are either morphologically inconsistent or even under-resolved. A specific class of imaging spectrometer, known as a snapshot system, seeks to measure all spatial and spectral information simultaneously, thereby eliminating artifacts associated with scanning designs and enabling the measurement of temporally dynamic scenes. Snapshot designs are the focus of this dissertation. Three designs for snapshot imaging spectrometers are developed, each providing novel contributions to the field of imaging spectrometry. In chapter 2, the first spatially heterodyned snapshot imaging spectrometer is modeled and experimentally validated. Spatial heterodyning is a technique commonly implemented in non-imaging Fourier transform spectrometry; for Fourier transform imaging spectrometers, spatial heterodyning improves the spectral resolution trade space. Additionally, in this chapter a unique neural-network-based spectral calibration is developed and shown to improve on Fourier and linear-operator-based techniques. Leveraging the spatial heterodyning developed in chapter 2, chapter 3 presents a high-spectral-resolution snapshot Fourier transform imaging spectrometer based on a Savart plate interferometer, which is developed and experimentally validated. The sensor presented in this chapter is the highest-spectral-resolution sensor in its class; high spectral resolution enables the sensor to discriminate narrowly spaced spectral lines. The capabilities of neural networks in imaging spectrometry are further explored in this chapter: neural networks are used to perform single-target detection on raw instrument data, thereby eliminating the need for an explicit spectral calibration step. As an extension of the results in chapter 2, neural networks are once again demonstrated to improve on linear-operator-based detection. In chapter 4, a non-interferometric design is developed for the long wave infrared (wavelengths spanning 8-12 microns). The imaging spectrometer developed in this chapter is a multi-aperture filtered microbolometer. Since the detector is uncooled, the presented design is ultra-compact and low power. Additionally, cost-effective polymer absorption filters are used in lieu of interference filters, and since each measurement of the system is spectrally multiplexed, an SNR advantage is realized. A theoretical model for the filtered design is developed, and the performance of the sensor for detecting liquid contaminants is investigated. As in past chapters, neural networks are used and achieve false detection rates of less than 1%. Lastly, the dissertation concludes with a discussion of future work and the potential impact of these devices.

  16. Landsat multispectral sharpening using a sensor system model and panchromatic image

    USGS Publications Warehouse

    Lemeshewsky, G.P.; ,

    2003-01-01

    The Thematic Mapper (TM) sensor aboard Landsats 4 and 5 and the Enhanced Thematic Mapper Plus (ETM+) aboard Landsat 7 collect imagery at 30-m sample distance in six spectral bands. New with ETM+ is a 15-m panchromatic (P) band. With image sharpening techniques, this higher-resolution P data, or alternatively the 10-m (or 5-m) P data of the SPOT satellite, can increase the spatial resolution of the multispectral (MS) data. Sharpening requires that the lower-resolution MS image be coregistered and resampled to the P data before high spatial frequency information is transferred to the MS data. For visual interpretation and machine classification tasks, it is important that the sharpened data preserve the spectral characteristics of the original low-resolution data. A technique was developed for sharpening (in this case, 3:1 spatial resolution enhancement) visible spectral band data, based on a model of the sensor system point spread function (PSF), in order to maintain spectral fidelity. It combines high-pass (HP) filter sharpening methods with iterative image restoration to reduce degradations caused by sensor-system-induced blurring and resampling. There is also a spectral fidelity requirement: the sharpened MS data, when filtered by the modeled degradations, should reproduce the low-resolution source MS data. Quantitative evaluation of sharpening performance was made using simulated low-resolution data generated from digital color-IR aerial photography. In comparison to the HP-filter-based sharpening method, results for the technique in this paper with simulated data show improved spectral fidelity. Preliminary results with TM 30-m visible band data sharpened with simulated 10-m panchromatic data are promising but require further study.
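
    The baseline HP-filter sharpening that the paper's method is compared against can be sketched as follows; the box high-pass kernel and the gain are assumptions, and the paper's PSF-based restoration is not reproduced here:

```python
import numpy as np

def box_mean(a, k):
    """(2k+1) x (2k+1) running average with edge padding."""
    p = np.pad(a.astype(float), k, mode='edge')
    out = np.zeros(a.shape, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += p[k + dy:k + dy + a.shape[0], k + dx:k + dx + a.shape[1]]
    return out / (2 * k + 1) ** 2

def hpf_sharpen(ms_band, pan, gain=1.0):
    """HP-filter pan-sharpening (3:1): upsample one MS band and inject the
    high-pass detail of the co-registered panchromatic band."""
    ms_up = np.kron(ms_band.astype(float), np.ones((3, 3)))  # 30 m -> 10 m grid
    high = pan.astype(float) - box_mean(pan, 1)              # pan high-pass residual
    return ms_up + gain * high
```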

  17. Photographic Film Image Enhancement

    DOT National Transportation Integrated Search

    1975-01-01

    A series of experiments were undertaken to assess the feasibility of defogging color film by the techniques of Optical Spatial Filtering. A coherent optical processor was built using red, blue, and green laser light input and specially designed Fouri...

  18. Autofluorescence imaging of basal cell carcinoma by smartphone RGB camera

    NASA Astrophysics Data System (ADS)

    Lihachev, Alexey; Derjabo, Alexander; Ferulova, Inesa; Lange, Marta; Lihacova, Ilze; Spigulis, Janis

    2015-12-01

    The feasibility of smartphones for in vivo skin autofluorescence imaging has been investigated. Filtered autofluorescence images from the same tissue area were periodically captured by a smartphone RGB camera, with subsequent detection of the fluorescence intensity decrease at each image pixel and imaging of the planar distribution of those values. The proposed methodology was tested clinically on 13 basal cell carcinomas and 1 atypical nevus. Several clinical cases and potential future applications of the smartphone-based technique are discussed.

  19. Autofluorescence imaging of basal cell carcinoma by smartphone RGB camera.

    PubMed

    Lihachev, Alexey; Derjabo, Alexander; Ferulova, Inesa; Lange, Marta; Lihacova, Ilze; Spigulis, Janis

    2015-01-01

    The feasibility of smartphones for in vivo skin autofluorescence imaging has been investigated. Filtered autofluorescence images from the same tissue area were periodically captured by a smartphone RGB camera, with subsequent detection of the fluorescence intensity decrease at each image pixel and imaging of the planar distribution of those values. The proposed methodology was tested clinically on 13 basal cell carcinomas and 1 atypical nevus. Several clinical cases and potential future applications of the smartphone-based technique are discussed.

  20. Architectures and algorithms for digital image processing; Proceedings of the Meeting, Cannes, France, December 5, 6, 1985

    NASA Technical Reports Server (NTRS)

    Duff, Michael J. B. (Editor); Siegel, Howard J. (Editor); Corbett, Francis J. (Editor)

    1986-01-01

    The conference presents papers on the architectures, algorithms, and applications of image processing. Particular attention is given to a very large scale integration system for image reconstruction from projections, a prebuffer algorithm for instant display of volume data, and an adaptive image sequence filtering scheme based on motion detection. Papers are also presented on a simple, direct practical method of sensing local motion and analyzing local optical flow, image matching techniques, and an automated biological dosimetry system.

  1. Wavelength Scanning with a Tilting Interference Filter for Glow-Discharge Elemental Imaging.

    PubMed

    Storey, Andrew P; Ray, Steven J; Hoffmann, Volker; Voronov, Maxim; Engelhard, Carsten; Buscher, Wolfgang; Hieftje, Gary M

    2017-06-01

    Glow discharges have long been used for depth profiling and bulk analysis of solid samples. In addition, over the past decade, several methods of obtaining lateral surface elemental distributions have been introduced, each with its own strengths and weaknesses. Challenges for each of these techniques are acceptable optical throughput and added instrumental complexity. Here, these problems are addressed with a tilting-filter instrument. A pulsed glow discharge is coupled to an optical system comprising an adjustable-angle tilting filter, collimating and imaging lenses, and a gated, intensified charge-coupled device (CCD) camera, which together provide surface elemental mapping of solid samples. The tilting-filter spectrometer is instrumentally simpler, produces less image distortion, and achieves higher optical throughput than a monochromator-based instrument, but has a much more limited tunable spectral range and poorer spectral resolution. As a result, the tilting-filter spectrometer is limited to single-element or two-element determinations, and only when the target spectral lines fall within an appropriate spectral range and can be spectrally discerned. Spectral interferences that result from heterogeneous impurities can be flagged and overcome by observing the spatially resolved signal response across the available tunable spectral range. The instrument has been characterized and evaluated for the spatially resolved analysis of glow-discharge emission from selected but representative samples.

  2. Using deconvolution to improve the metrological performance of the grid method

    NASA Astrophysics Data System (ADS)

    Grédiac, Michel; Sur, Frédéric; Badulescu, Claudiu; Mathias, Jean-Denis

    2013-06-01

    The use of various deconvolution techniques to enhance strain maps obtained with the grid method is addressed in this study. Since phase derivative maps obtained with the grid method can be approximated by their actual counterparts convolved with the envelope of the kernel used to extract phases and phase derivatives, non-blind restoration techniques can be used to perform deconvolution. Six deconvolution techniques are presented and employed to restore a synthetic phase derivative map, namely direct deconvolution, regularized deconvolution, the Richardson-Lucy algorithm and Wiener filtering, the last two each with two variants concerning their practical implementation. The results show that the noise that corrupts the grid images must be thoroughly taken into account to limit its effect on the deconvolved strain maps. The difficulty here is that the noise on the grid image yields a spatially correlated noise on the strain maps. In particular, numerical experiments on synthetic data show that direct and regularized deconvolution are unstable when noisy data are processed. The same remark holds when Wiener filtering is employed without taking noise autocorrelation into account. On the other hand, the Richardson-Lucy algorithm and Wiener filtering with noise autocorrelation provide deconvolved maps in which the impact of noise remains controlled within a certain limit; the latter technique outperforms the Richardson-Lucy algorithm. Finally, two short examples of actual strain field restoration are shown, dealing with asphalt and shape memory alloy specimens. The benefits and limitations of deconvolution are presented and discussed in these two cases. The main conclusion is that strain maps are correctly deconvolved when the signal-to-noise ratio is high, and that the actual noise in real strain maps must be characterized more specifically than in the current study to address higher noise levels with Wiener filtering.
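
    Of the six techniques, the Richardson-Lucy algorithm is easy to sketch. The version below assumes periodic boundaries and a centred PSF, and is a generic implementation rather than the authors' variants:

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Richardson-Lucy deconvolution via FFT convolution. `psf` has the
    same shape as the image, with its peak at the centre; iterative
    multiplicative updates keep the estimate non-negative."""
    otf = np.fft.rfft2(np.fft.ifftshift(psf / psf.sum()))
    est = np.full(blurred.shape, blurred.mean())
    for _ in range(n_iter):
        reblur = np.fft.irfft2(np.fft.rfft2(est) * otf, s=blurred.shape)
        ratio = blurred / np.maximum(reblur, 1e-12)
        est = est * np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(otf), s=blurred.shape)
    return est
```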

  3. Automated segmentation of the lamina cribrosa using Frangi's filter: a novel approach for rapid identification of tissue volume fraction and beam orientation in a trabeculated structure in the eye

    PubMed Central

    Campbell, Ian C.; Coudrillier, Baptiste; Mensah, Johanne; Abel, Richard L.; Ethier, C. Ross

    2015-01-01

    The lamina cribrosa (LC) is a tissue in the posterior eye with a complex trabecular microstructure. This tissue is of great research interest, as it is likely the initial site of retinal ganglion cell axonal damage in glaucoma. Unfortunately, the LC is difficult to access experimentally, and thus imaging techniques in tandem with image processing have emerged as powerful tools to study the microstructure and biomechanics of this tissue. Here, we present a staining approach to enhance the contrast of the microstructure in micro-computed tomography (micro-CT) imaging as well as a comparison between tissues imaged with micro-CT and second harmonic generation (SHG) microscopy. We then apply a modified version of Frangi's vesselness filter to automatically segment the connective tissue beams of the LC and determine the orientation of each beam. This approach successfully segmented the beams of a porcine optic nerve head from micro-CT in three dimensions and SHG microscopy in two dimensions. As an application of this filter, we present finite-element modelling of the posterior eye that suggests that connective tissue volume fraction is the major driving factor of LC biomechanics. We conclude that segmentation with Frangi's filter is a powerful tool for future image-driven studies of LC biomechanics. PMID:25589572
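
    A single-scale 2-D sketch of a Frangi-style vesselness computation is shown below; the paper applies a modified multi-scale (and 3-D) version, so this is illustrative only:

```python
import numpy as np

def gaussian_smooth(img, sigma):
    """Separable Gaussian smoothing with a truncated 1-D kernel."""
    r = int(3 * sigma)
    xs = np.arange(-r, r + 1)
    g = np.exp(-xs ** 2 / (2.0 * sigma ** 2)); g /= g.sum()
    out = np.apply_along_axis(np.convolve, 0, img.astype(float), g, mode='same')
    return np.apply_along_axis(np.convolve, 1, out, g, mode='same')

def vesselness(img, sigma=2.0, beta=0.5):
    """Frangi-style 2-D vesselness at one scale: Hessian eigenvalue
    analysis keeping bright tubular (beam-like) structures."""
    s = gaussian_smooth(img, sigma)
    gy, gx = np.gradient(s)
    hyy, hyx = np.gradient(gy)
    _, hxx = np.gradient(gx)
    disc = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hyx ** 2)
    mu = (hxx + hyy) / 2.0
    la, lb = mu + disc, mu - disc
    swap = np.abs(la) > np.abs(lb)      # order so |l1| <= |l2|
    l1 = np.where(swap, lb, la)
    l2 = np.where(swap, la, lb)
    rb2 = (l1 / np.where(l2 == 0, 1e-12, l2)) ** 2   # blobness ratio
    s2 = l1 ** 2 + l2 ** 2                           # structure strength
    c2 = 0.5 * s2.max() + 1e-12
    v = np.exp(-rb2 / (2 * beta ** 2)) * (1.0 - np.exp(-s2 / (2 * c2)))
    v[l2 > 0] = 0.0                     # keep bright-on-dark ridges only
    return v
```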

  4. Global Temperature Measurement of Supercooled Water under Icing Conditions using Two-Color Luminescent Images and Multi-Band Filter

    NASA Astrophysics Data System (ADS)

    Tanaka, Mio; Morita, Katsuaki; Kimura, Shigeo; Sakaue, Hirotaka

    2012-11-01

    Icing occurs when a supercooled-water droplet collides with a surface, and can be seen in any cold region. Particular attention is paid to aircraft icing. To understand the icing process on an aircraft, it is necessary to obtain the temperature of the supercooled water. A conventional sensor, such as a thermocouple, is not suitable, because it becomes a collision surface that accumulates ice. We introduce a dual-luminescent imaging technique to capture the global temperature distribution of supercooled water under icing conditions. It consists of two-color luminescent probes and a multi-band filter. One of the probes is sensitive to temperature and the other is independent of it. The latter is used to cancel the temperature-independent luminescence of the temperature-dependent image caused by uneven illumination and the camera location. The multi-band filter selects only the luminescent peaks of the probes, to enhance the temperature sensitivity of the imaging system. By applying the system, time-resolved temperature information of a supercooled-water droplet is captured.
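
    The ratiometric cancellation at the heart of the method can be sketched as follows, with an assumed linear calibration curve (the real probes' response must be calibrated experimentally):

```python
import numpy as np

def temperature_map(i_temp, i_ref, calibration):
    """Dividing the temperature-dependent image by the temperature-independent
    one cancels uneven illumination; a calibration curve then maps the
    ratio to temperature."""
    ratio = i_temp / np.maximum(i_ref, 1e-12)
    return calibration(ratio)

# Assumed probe response: luminescence falls 1 % per degree C.
calib = lambda r: (1.0 - r) / 0.01
```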

  5. A hybrid algorithm for speckle noise reduction of ultrasound images.

    PubMed

    Singh, Karamjeet; Ranade, Sukhjeet Kaur; Singh, Chandan

    2017-09-01

    Medical images are contaminated by multiplicative speckle noise, which significantly reduces the contrast of ultrasound images and has a negative effect on various image interpretation tasks. In this paper, we propose a hybrid denoising approach which combines local and nonlocal information in an efficient manner. The proposed hybrid algorithm consists of three stages: in the first stage, local statistics in the form of a guided filter are used for an initial reduction of the speckle noise. Then, an improved speckle reducing bilateral filter (SRBF) is developed to further reduce the speckle noise in the medical images. Finally, to reconstruct the diffused edges we use an efficient post-processing technique which jointly exploits the advantages of both the bilateral and the nonlocal means (NLM) filter for efficient attenuation of speckle noise. The performance of the proposed hybrid algorithm is evaluated on synthetic, simulated and real ultrasound images. The experiments conducted on various test images demonstrate that our hybrid approach outperforms traditional speckle reduction approaches, including the recently proposed NLM and optimized Bayesian-based NLM. The results of various quantitative and qualitative measures, and visual inspection of denoised synthetic and real ultrasound images, demonstrate that the proposed hybrid algorithm has strong denoising capability and preserves fine image details, such as the edge of a lesion, better than previously developed methods for speckle noise reduction. The denoising and edge-preserving capability of the hybrid algorithm exceeds that of existing traditional and recently proposed speckle reduction (SR) filters. The proposed algorithm lays a foundation for further hybrid algorithms for denoising of ultrasound images. Copyright © 2017 Elsevier B.V. All rights reserved.
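
    The first stage, local-statistics smoothing, can be sketched with a generic self-guided guided filter; this is an illustrative stand-in, not the authors' exact guided/SRBF/NLM pipeline:

```python
import numpy as np

def box_mean(a, k):
    """(2k+1) x (2k+1) running average with edge padding."""
    p = np.pad(a.astype(float), k, mode='edge')
    out = np.zeros(a.shape, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += p[k + dy:k + dy + a.shape[0], k + dx:k + dx + a.shape[1]]
    return out / (2 * k + 1) ** 2

def guided_filter(img, k=2, eps=0.01):
    """Self-guided guided filter: smooths low-variance (speckled) regions
    while leaving high-variance (edge) regions mostly untouched."""
    I = img.astype(float)
    m = box_mean(I, k)
    v = box_mean(I * I, k) - m * m   # local variance
    a = v / (v + eps)                # edge-aware blending weight
    b = m * (1.0 - a)
    return box_mean(a, k) * I + box_mean(b, k)
```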

  6. A prospective case study of high boost, high frequency emphasis and two-way diffusion filters on MR images of glioblastoma multiforme.

    PubMed

    Anoop, B N; Joseph, Justin; Williams, J; Jayaraman, J Sivaraman; Sebastian, Ansa Maria; Sihota, Praveer

    2018-06-01

    Glioblastoma multiforme (GBM) appears undifferentiated and non-enhancing on magnetic resonance (MR) imagery. As MRI does not offer adequate image quality to allow visual discrimination of the boundary between the GBM focus and perifocal vasogenic edema, surgical and radiotherapy planning become difficult. The presence of noise in MR images influences the computation of radiation dosage and precludes edge-based segmentation schemes in automated software for radiation treatment planning. The performance of techniques for simultaneous denoising and sharpening, such as high-boost filters, high-frequency emphasis filters and two-way anisotropic diffusion, is sensitive to the selection of their operational parameters. Improper selection may cause overshoot and saturation artefacts, or may leave noisy grey-level transitions unsuppressed. This paper is a prospective case study of the performance of high-boost filters, high-frequency emphasis filters and two-way anisotropic diffusion on MR images of GBM, assessing their ability to suppress noise in homogeneous regions and to selectively sharpen the true morphological edges. An objective method for determining the optimum values of the operational parameters of these techniques is also demonstrated. Saturation Evaluation Index (SEI), Perceptual Sharpness Index (PSI), Edge Model based Blur Metric (EMBM), Sharpness of Ridges (SOR), Structural Similarity Index Metric (SSIM), Peak Signal to Noise Ratio (PSNR) and Noise Suppression Ratio (NSR) are the objective functions used. They account for overshoot and saturation artefacts, sharpness of the image, width of salient edges (haloes), susceptibility of edge quality to noise, feature preservation and degree of noise suppression. Two-way diffusion is found to be superior to the others in all these respects. The SEI, PSI, EMBM, SOR, SSIM, PSNR and NSR exhibited by two-way diffusion are 0.0016 ± 0.0012, 0.2049 ± 0.0187, 0.0905 ± 0.0408, 2.64 × 10^12 ± 1.6 × 10^12, 0.9955 ± 0.0024, 38.214 ± 5.2145 and 0.3547 ± 0.0069, respectively.
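
    The high-boost filter whose gain the study tunes objectively can be sketched in a few lines; pushing the gain too high produces exactly the overshoot and saturation artefacts that the SEI metric penalises (the box low-pass is an assumed implementation):

```python
import numpy as np

def box_mean(a, k):
    """(2k+1) x (2k+1) running average with edge padding."""
    p = np.pad(a.astype(float), k, mode='edge')
    out = np.zeros(a.shape, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += p[k + dy:k + dy + a.shape[0], k + dx:k + dx + a.shape[1]]
    return out / (2 * k + 1) ** 2

def high_boost(img, gain=1.5, r=1):
    """High-boost filtering: add back `gain` times the high-frequency
    residual (image minus its local mean)."""
    low = box_mean(img, r)
    return img.astype(float) + gain * (img - low)
```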

  7. Image reconstruction

    NASA Astrophysics Data System (ADS)

    Vasilenko, Georgii Ivanovich; Taratorin, Aleksandr Markovich

    Linear, nonlinear, and iterative image-reconstruction (IR) algorithms are reviewed. Theoretical results are presented concerning controllable linear filters, the solution of ill-posed functional minimization problems, and the regularization of iterative IR algorithms. Attention is also given to the problem of superresolution and analytical spectrum continuation, the solution of the phase problem, and the reconstruction of images distorted by turbulence. IR in optical and optical-digital systems is discussed with emphasis on holographic techniques.

  8. Filtering, Coding, and Compression with Malvar Wavelets

    DTIC Science & Technology

    1993-12-01

    speech coding techniques being investigated by the military (38). Imagery: Space imagery often requires adaptive restoration to deblur out-of-focus... and blurred image, find an estimate of the ideal image using a priori information about the blur, noise, and the ideal image" (12). The research for... recording can be described as the original signal convolved with impulses, which appear as echoes in the seismic event. The term deconvolution indicates

  9. Automatic classification of endoscopic images for premalignant conditions of the esophagus

    NASA Astrophysics Data System (ADS)

    Boschetto, Davide; Gambaretto, Gloria; Grisan, Enrico

    2016-03-01

    Barrett's esophagus (BE) is a precancerous complication of gastroesophageal reflux disease in which normal stratified squamous epithelium lining the esophagus is replaced by intestinal metaplastic columnar epithelium. Repeated endoscopies and multiple biopsies are often necessary to establish the presence of intestinal metaplasia. Narrow Band Imaging (NBI) is an imaging technique commonly used with endoscopies that enhances the contrast of the vascular pattern on the mucosa. We present a computer-based method for the automatic normal/metaplastic classification of endoscopic NBI images. Superpixel segmentation is used to identify and cluster pixels belonging to uniform regions. From each uniform clustered region of pixels, eight features maximizing differences between normal and metaplastic epithelium are extracted for the classification step. For each superpixel, the mean intensities of the three color channels are first selected as features. Three further features are the mean intensities of each superpixel after separately applying three different morphological filters to the red-channel image (top-hat filtering, entropy filtering and range filtering). The last two features require the computation of the Grey-Level Co-Occurrence Matrix (GLCM), and reflect the contrast and the homogeneity of each superpixel. The classification step is performed using an ensemble of 50 classification trees, with a 10-fold cross-validation scheme, training the classifier at each step on a random 70% of the images and testing on the remaining 30% of the dataset. Sensitivity and specificity are 79.2% and 87.3% respectively, with an overall accuracy of 83.9%.
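
    The two GLCM features can be computed as follows, using the textbook contrast and homogeneity definitions; the quantisation depth and the (0, 1) offset are assumptions, as the abstract does not specify them:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Grey-level co-occurrence matrix over horizontal neighbour pairs,
    returning the contrast and homogeneity statistics."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)  # quantise [0, 1)
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()              # neighbour pairs
    P = np.zeros((levels, levels))
    np.add.at(P, (a, b), 1.0)
    P /= P.sum()                                            # joint probabilities
    i, j = np.indices(P.shape)
    contrast = float((P * (i - j) ** 2).sum())
    homogeneity = float((P / (1.0 + (i - j) ** 2)).sum())
    return contrast, homogeneity
```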

  10. Contrast-to-noise ratio in magnification mammography: a Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Koutalonis, M.; Delis, H.; Spyrou, G.; Costaridou, L.; Tzanakos, G.; Panayiotakis, G.

    2007-06-01

    Magnification views are a common way to perform a secondary examination when suspicious abnormalities are found in a screening mammogram. The visibility of microcalcifications and breast lesions is restricted by the compromise between image quality and absorbed dose. In this study, image quality characteristics in magnification mammography were evaluated based on Monte Carlo techniques. A breast phantom was utilized, simulating a homogeneous mixture of adipose and glandular tissue in various percentages of glandularity, containing inhomogeneities of various sizes and compositions. The effect of the magnification degree, breast glandularity, tube voltage and anode/filter material combination on image quality characteristics was investigated in terms of a contrast-to-noise ratio (CNR). A performance index PIν was introduced in order to study the overall performance of various anode/filter combinations under different exposure parameters. Results demonstrate that the CNR improves with the degree of magnification and degrades as the breast glandularity increases. A magnification degree of 1.3 offers the best overall performance for most of the anode/filter combinations utilized. Under magnification conditions, dose plays a lesser role relative to image quality, as magnification views are secondary diagnostic examinations rather than screening procedures oriented to non-symptomatic women. For a lower image-quality weighting, some anode/filter combinations other than Mo/0.030mmMo can be utilized, as they offer a similar performance index. However, if the desired weighting for image quality is high, the Mo/0.030mmMo combination has the best overall performance.
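
    The figure of merit is straightforward; one common definition of CNR (the abstract does not give the exact estimator, so this is an assumption) is:

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: absolute mean difference between a detail
    region and the background, divided by the background noise."""
    return abs(signal_roi.mean() - background_roi.mean()) / background_roi.std()
```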

  11. Impulsive noise suppression in color images based on the geodesic digital paths

    NASA Astrophysics Data System (ADS)

    Smolka, Bogdan; Cyganek, Boguslaw

    2015-02-01

    In this paper a novel filtering design based on the concept of exploring the pixel neighborhood by digital paths is presented. The paths start from the boundary of a filtering window and reach its center. The cost of transitions between adjacent pixels is defined in a hybrid spatial-color space. Then, an optimal path of minimum total cost, leading from the pixels of the window's boundary to its center, is determined. The cost of an optimal path serves as a degree of similarity of the central pixel to the samples from the local processing window. If a pixel is an outlier, then all the paths starting from the window's boundary will have high costs, and the minimum one will also be high. The filter output is calculated as a weighted mean of the central pixel and an estimate constructed using the information on the minimum cost assigned to each image pixel. So, first the costs of the optimal paths are used to build a smoothed image, and in the second step the minimum cost of the central pixel is utilized to construct the weights of a soft-switching scheme. The experiments performed on a set of standard color images revealed that the efficiency of the proposed algorithm is superior to state-of-the-art filtering techniques in terms of objective restoration quality measures, especially for high noise contamination ratios. The proposed filter, due to its low computational complexity, can be applied to real-time image denoising and also to the enhancement of video streams.
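
    The core quantity, the minimum cost of a digital path from the window boundary to its centre, can be computed with Dijkstra's algorithm. The purely chromatic transition cost below is a simplification of the paper's hybrid spatial-color cost:

```python
import heapq
import numpy as np

def min_path_cost(window):
    """Minimum total cost of a 4-connected path from any boundary pixel of
    a (colour) window to its centre; the transition cost is the Euclidean
    colour distance between adjacent pixels. An outlier centre forces
    every path to pay a large final step, so its minimum cost is high."""
    h, w = window.shape[:2]
    cost = np.full((h, w), np.inf)
    pq = []
    for y in range(h):
        for x in range(w):
            if y in (0, h - 1) or x in (0, w - 1):   # paths start on the boundary
                cost[y, x] = 0.0
                heapq.heappush(pq, (0.0, y, x))
    while pq:
        c, y, x = heapq.heappop(pq)
        if c > cost[y, x]:
            continue                                  # stale queue entry
        for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
            if 0 <= ny < h and 0 <= nx < w:
                step = float(np.linalg.norm(window[ny, nx].astype(float)
                                            - window[y, x].astype(float)))
                if c + step < cost[ny, nx]:
                    cost[ny, nx] = c + step
                    heapq.heappush(pq, (c + step, ny, nx))
    return float(cost[h // 2, w // 2])
```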

  12. Thermography-based blood flow imaging in human skin of the hands and feet: a spectral filtering approach.

    PubMed

    Sagaidachnyi, A A; Fomin, A V; Usanov, D A; Skripal, A V

    2017-02-01

    The determination of the relationship between skin blood flow and skin temperature dynamics is the main problem in thermography-based blood flow imaging. Oscillations in skin blood flow are the source of thermal waves propagating from micro-vessels toward the skin's surface, as assumed in this study. This hypothesis allows us to use equations for the attenuation and dispersion of thermal waves to convert the temperature signal into the blood flow signal, and vice versa. We developed a spectral filtering approach (SFA), which is a new technique for thermography-based blood flow imaging. In contrast to other processing techniques, the SFA implies calculations in the spectral domain rather than in the time domain, and therefore eliminates the need to solve differential equations. The developed technique was verified within 0.005-0.1 Hz, including the endothelial, neurogenic and myogenic frequency bands of blood flow oscillations. The algorithm for the inverse conversion of the blood flow signal into the skin temperature signal is also addressed. Examples of blood flow imaging of the hands during cuff occlusion and of the feet during heating of the back are illustrated. The processing of infrared (IR) thermograms using the SFA allowed us to restore the blood flow signals and achieve correlations of about 0.8 with the waveform of a photoplethysmographic signal. Prospective applications of the thermography-based blood flow imaging technique include non-contact monitoring of the blood supply during engraftment of skin flaps and burn healing, as well as the use of contact temperature sensors to monitor low-frequency oscillations of peripheral blood flow.
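
    The spectral-domain correction can be sketched as below: divide the temperature spectrum by an assumed thermal-wave transfer function within the band of interest. The vessel depth and skin diffusivity values are illustrative assumptions, not the authors' calibration:

```python
import numpy as np

def restore_blood_flow(temp_signal, fs, depth, alpha, f_max=0.1):
    """Undo the attenuation and phase lag of a thermal wave travelling from
    depth to the surface, H(f) = exp(-(1+1j)*depth*sqrt(pi*f/alpha)), by
    dividing the temperature spectrum by H inside the band of interest."""
    F = np.fft.rfft(temp_signal)
    f = np.fft.rfftfreq(len(temp_signal), d=1.0 / fs)
    H = np.exp(-(1 + 1j) * depth * np.sqrt(np.pi * f / alpha))
    band = (f > 0) & (f <= f_max)
    F[band] /= H[band]
    F[(~band) & (f > 0)] = 0.0      # suppress out-of-band noise blow-up
    return np.fft.irfft(F, len(temp_signal))
```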

  13. Restoration and reconstruction from overlapping images

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Kaiser, Daniel J.; Hanson, Andrew L.; Li, Jing

    1997-01-01

    This paper describes a technique for restoring and reconstructing a scene from overlapping images. In situations where there are multiple, overlapping images of the same scene, it may be desirable to create a single image that most closely approximates the scene, based on all of the data in the available images. For example, successive swaths acquired by NASA's planned Moderate Resolution Imaging Spectroradiometer (MODIS) will overlap, particularly at wide scan angles, creating a severe visual artifact in the output image. Resampling the overlapping swaths to produce a more accurate image on a uniform grid requires restoration and reconstruction. The one-pass restoration and reconstruction technique developed in this paper yields mean-square-optimal resampling, based on a comprehensive end-to-end system model that accounts for image overlap, subject to user-defined and data-availability constraints on the spatial support of the filter.

  14. Effect of anapanasati meditation technique through electrophotonic imaging parameters: A pilot study.

    PubMed

    Deo, Guru; Itagi R, Kumar; Thaiyar M, Srinivasan; Kuldeep, Kushwah K

    2015-01-01

    Mindfulness along with breathing is a well-established meditation technique; breathing is an exquisite tool for exploring subtle awareness of mind and life itself. This study aimed at measuring changes in the different parameters of electrophotonic imaging (EPI) in anapanasati meditators. To carry out this study, 51 subjects comprising 32 males and 19 females aged 18 years and above (mean age 45.64 ± 14.43), attending Karnataka Dhyana Mahachakra-1 at Pyramid Valley International, Bengaluru, India, were recruited voluntarily with informed consent. The design was a single-group pre-post study, with data collected by the EPI device before and after 5 days of intensive meditation. Results show significant changes in the EPI parameter integral area with filter (physiological) on both the right and left sides, which reflects the availability of a high functional energy reserve in meditators. The researchers observed similar trends without filter (psycho-physiological), indicating high reserves of energy at the psycho-physiological level as well. Activation coefficient, another EPI parameter, was reduced, showing a more relaxed state than before, possibly due to parasympathetic dominance. Integral entropy decreased for the psycho-physiological parameter (left side, without filter), which indicates less disorder after meditation, but these changes were not significant. The study showed a reversed change in integral entropy on the right side without filter; however, the values on both sides with filter increased, which indicates disorder. The study suggests that EPI can be used to record the functional physiological and psycho-physiological status of meditators at a subtle level.

  15. Application of optical correlation techniques to particle imaging velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Edwards, Robert V.

    1988-01-01

    Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended two-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple-exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSF) are used in a pattern recognition system. MSFs are usually used to identify assembly-line parts; in this application, they are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. This global analysis technique is contrasted with the more common particle-tracking algorithms and Young's fringe analysis technique.
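
The optical matched-filter correlation has a direct digital analogue, sketched below on synthetic data (this is the generic FFT cross-correlation estimator used in digital PIV, not the paper's optical MSF hardware): the location of the cross-correlation peak between two exposures gives the particle-image displacement.

```python
import numpy as np

def displacement(frame_a, frame_b):
    """Peak of the circular cross-correlation between two exposures."""
    Fa, Fb = np.fft.fft2(frame_a), np.fft.fft2(frame_b)
    corr = np.real(np.fft.ifft2(np.conj(Fa) * Fb))   # correlation theorem
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n, m = corr.shape
    if dy > n // 2:   # map wrap-around indices to signed shifts
        dy -= n
    if dx > m // 2:
        dx -= m
    return int(dy), int(dx)
```

Dividing the photograph into interrogation windows and applying this estimator per window yields the velocity vector field.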

  16. Monitoring of sludge dewatering equipment by image classification

    NASA Astrophysics Data System (ADS)

    Maquine de Souza, Sandro; Grandvalet, Yves; Denoeux, Thierry

    2004-11-01

    Belt filter presses represent an economical means to dewater the residual sludge generated in wastewater treatment plants. In order to assure maximal water removal, the raw sludge is mixed with a chemical conditioner prior to being fed into the belt filter press. When the conditioner is properly dosed, the sludge acquires a coarse texture, with space between flocs. This information was exploited for the development of a software sensor, where digital images are the input signal, and the output is a numeric value proportional to the dewatered sludge dry content. Three families of features were used to characterize the textures. Gabor filtering, wavelet decomposition and co-occurrence matrix computation were the techniques used. A database of images, ordered by their corresponding dry contents, was used to calibrate the model that calculates the sensor output. The images were separated into groups that correspond to single experimental sessions. With the calibrated model, all images were correctly ranked within an experiment session. The results were very similar regardless of the family of features used. The output can be fed to a control system, or, in the case of fixed experiment conditions, it can be used to directly estimate the dewatered sludge dry content.
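
Of the three feature families named above, the co-occurrence matrix is the simplest to sketch. The minimal version below (with illustrative Haralick-style contrast and energy statistics; Gabor and wavelet features would be computed per image in the same way and fed to the same calibration model) assumes integer gray levels and a single pixel offset.

```python
import numpy as np

def glcm(img, levels, offset=(0, 1)):
    """Joint probability of gray-level pairs at a fixed pixel offset."""
    dy, dx = offset
    M = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            M[img[y, x], img[y + dy, x + dx]] += 1
    return M / M.sum()

def contrast(P):
    """Haralick contrast: large when neighbouring gray levels differ a lot."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

def energy(P):
    """Haralick energy (angular second moment): large for uniform textures."""
    return float((P ** 2).sum())
```

A coarse, well-flocculated sludge texture would show up as higher contrast at small offsets than a smooth, poorly conditioned one.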

  17. A Dual-Line Detection Rayleigh Scattering Diagnostic Technique for the Combustion of Hydrocarbon Fuels and Filtered UV Rayleigh Scattering for Gas Velocity Measurements

    NASA Technical Reports Server (NTRS)

    Otugen, M. Volkan

    1997-01-01

    Non-intrusive techniques for the dynamic measurement of gas flow properties such as density, temperature and velocity, are needed in the research leading to the development of new generation high-speed aircraft. Accurate velocity, temperature and density data obtained in ground testing and in-flight measurements can help understand the flow physics leading to transition and turbulence in supersonic, high-altitude flight. Such non-intrusive measurement techniques can also be used to study combustion processes of hydrocarbon fuels in aircraft engines. Reliable, time and space resolved temperature measurements in various combustor configurations can lead to a better understanding of high temperature chemical reaction dynamics, thus leading to improved modeling and better prediction of such flows. In view of this, a research program was initiated at Polytechnic University's Aerodynamics Laboratory with support from NASA Lewis Research Center through grants NAG3-1301 and NAG3-1690. The overall objective of this program has been to develop laser-based, non-contact, space- and time-resolved temperature and velocity measurement techniques. In the initial phase of the program an Nd:YAG laser-based dual-line Rayleigh scattering technique was developed and tested for the accurate measurement of gas temperature in the presence of background laser glare. Effort was next directed towards the development of a filtered, spectrally-resolved Rayleigh/Mie scattering technique with the objective of developing an interferometric method for time-frozen velocity measurements in high-speed flows utilizing the UV line of an Nd:YAG laser and an appropriate molecular absorption filter. This effort included both a search for an appropriate filter material for the 266 nm laser line and the development and testing of several image processing techniques for the fast processing of Fabry-Perot images for velocity and temperature information.
Finally, work was also carried out on the development of a new laser-based technique for the time-resolved measurement of vorticity and strain rates in turbulent flows.

  18. Explicit Filtering Based Low-Dose Differential Phase Reconstruction Algorithm with the Grating Interferometry.

    PubMed

    Jiang, Xiaolei; Zhang, Li; Zhang, Ran; Yin, Hongxia; Wang, Zhenchang

    2015-01-01

    X-ray grating interferometry offers a novel framework for the study of weakly absorbing samples. Three kinds of information, that is, the attenuation, differential phase contrast (DPC), and dark-field images, can be obtained from a single scan, providing additional and complementary information to the conventional attenuation image. Phase shifts of X-rays are measured by the DPC method; hence, DPC-CT reconstructs refraction indexes rather than attenuation coefficients. In this work, we propose an explicit filtering based low-dose differential phase reconstruction algorithm, which enables reconstruction from reduced scanning without artifacts. The algorithm adopts a differential algebraic reconstruction technique (DART) with the explicit filtering based sparse regularization rather than the commonly used total variation (TV) method. Both the numerical simulation and the biological sample experiment demonstrate the feasibility of the proposed algorithm.

  19. Explicit Filtering Based Low-Dose Differential Phase Reconstruction Algorithm with the Grating Interferometry

    PubMed Central

    Zhang, Li; Zhang, Ran; Yin, Hongxia; Wang, Zhenchang

    2015-01-01

    X-ray grating interferometry offers a novel framework for the study of weakly absorbing samples. Three kinds of information, that is, the attenuation, differential phase contrast (DPC), and dark-field images, can be obtained from a single scan, providing additional and complementary information to the conventional attenuation image. Phase shifts of X-rays are measured by the DPC method; hence, DPC-CT reconstructs refraction indexes rather than attenuation coefficients. In this work, we propose an explicit filtering based low-dose differential phase reconstruction algorithm, which enables reconstruction from reduced scanning without artifacts. The algorithm adopts a differential algebraic reconstruction technique (DART) with the explicit filtering based sparse regularization rather than the commonly used total variation (TV) method. Both the numerical simulation and the biological sample experiment demonstrate the feasibility of the proposed algorithm. PMID:26089971

  20. Satellite retrievals of Karenia brevis harmful algal blooms in the West Florida shelf using neural networks and impacts of temporal variabilities

    NASA Astrophysics Data System (ADS)

    El-Habashi, Ahmed; Duran, Claudia M.; Lovko, Vincent; Tomlinson, Michelle C.; Stumpf, Richard P.; Ahmed, Sam

    2017-07-01

    We apply a neural network (NN) technique to detect/track Karenia brevis harmful algal blooms (KB HABs) plaguing West Florida shelf (WFS) coasts from Visible-Infrared Imaging Radiometer Suite (VIIRS) satellite observations. Previously, KB HABs detection relied primarily on the Moderate Resolution Imaging Spectroradiometer Aqua (MODIS-A) satellite, depending on its remote sensing reflectance signal at the 678-nm chlorophyll fluorescence band (Rrs678) needed for normalized fluorescence height and related red band difference retrieval algorithms. VIIRS, MODIS-A's successor, does not have a 678-nm channel. Instead, our NN uses Rrs at 486-, 551-, and 671-nm VIIRS channels to retrieve phytoplankton absorption at 443 nm (a). The retrieved a images are next filtered by applying limits, defined by (i) low Rrs551-nm backscatter and (ii) a minimum a value associated with KB HABs. The filtered residual images are then converted to show chlorophyll-a concentrations [Chla] and KB cell counts. VIIRS retrievals using our NN and five other retrieval algorithms were compared and evaluated against numerous in situ measurements made over the four-year 2012 to 2016 period, for which VIIRS data are available. These comparisons confirm the viability and higher retrieval accuracies of the NN technique, when combined with the filtering constraints, for effective detection of KB HABs. Analysis of these results as well as sequential satellite observations and recent field measurements underline the impact of short-term temporal variabilities on retrieval accuracies.

  1. The simulation of magnetic resonance elastography through atherosclerosis.

    PubMed

    Thomas-Seale, L E J; Hollis, L; Klatt, D; Sack, I; Roberts, N; Pankaj, P; Hoskins, P R

    2016-06-14

    The clinical diagnosis of atherosclerosis via the measurement of stenosis size is widely acknowledged as an imperfect criterion. The vulnerability of an atherosclerotic plaque to rupture is associated with its mechanical properties. The potential to image these mechanical properties using magnetic resonance elastography (MRE) was investigated through synthetic datasets. An image of the steady state wave propagation, equivalent to the first harmonic, can be extracted directly from finite element analysis. Inversion of this displacement data yields a map of the shear modulus, known as an elastogram. The variation of plaque composition, stenosis size, Gaussian noise, filter thresholds and excitation frequency were explored. A decreasing mean shear modulus with an increasing lipid composition was identified through all stenosis sizes. However the inversion algorithm showed sensitivity to parameter variation, leading to artefacts which disrupted both the elastograms and quantitative trends. As noise was increased up to a realistic level, the contrast was maintained between the fully fibrous and lipid plaques but lost between the interim compositions. Although incorporating a Butterworth filter improved the performance of the algorithm, restrictive filter thresholds resulted in a reduction of the sensitivity of the algorithm to composition and noise variation. Increasing the excitation frequency improved the technique's ability to image the magnitude of the shear modulus and identify a contrast between compositions. In conclusion, whilst the technique has the potential to image the shear modulus of atherosclerotic plaques, future research will require the integration of a heterogeneous inversion algorithm.
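
The inversion step, from displacement field to elastogram, can be sketched with the standard algebraic Helmholtz inversion, shown here in 1-D under the usual simplifying assumptions (local homogeneity, no attenuation, periodic boundaries; this is the generic textbook form, not the paper's specific algorithm): for a time-harmonic displacement u at angular frequency ω, ρω²u + μ∇²u = 0, so μ = -ρω²u/∇²u pointwise.

```python
import numpy as np

def shear_modulus(u, dx, rho, omega):
    """Algebraic Helmholtz inversion, 1-D, periodic finite-difference Laplacian."""
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2
    return -rho * omega ** 2 * u / lap
```

The ratio is ill-conditioned near displacement nodes, which is one reason practical MRE pipelines add smoothing and, as the abstract notes, why heterogeneous media need a more sophisticated inversion.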

  2. Detecting prostate cancer and prostatic calcifications using advanced magnetic resonance imaging

    PubMed Central

    Dou, Shewei; Bai, Yan; Shandil, Ankit; Ding, Degang; Shi, Dapeng; Haacke, E Mark; Wang, Meiyun

    2017-01-01

    Prostate cancer and prostatic calcifications have a high incidence in elderly men. We aimed to investigate the diagnostic capabilities of susceptibility-weighted imaging in detecting prostate cancer and prostatic calcifications. A total of 156 men, 34 with prostate cancer and 122 with benign prostates, were enrolled in this study. Computed tomography, conventional magnetic resonance imaging, diffusion-weighted imaging, and susceptibility-weighted imaging were performed on all the patients. One hundred and twelve prostatic calcifications were detected in 87 patients. The sensitivities and specificities of the conventional magnetic resonance imaging, apparent diffusion coefficient, and susceptibility-filtered phase images in detecting prostate cancer and prostatic calcifications were calculated. McNemar's Chi-square test was used to compare the differences in sensitivities and specificities between the techniques. The results showed that the sensitivity and specificity of susceptibility-filtered phase images in detecting prostate cancer were greater than that of conventional magnetic resonance imaging and apparent diffusion coefficient (P < 0.05). In addition, the sensitivity and specificity of susceptibility-filtered phase images in detecting prostatic calcifications were comparable to that of computed tomography and greater than that of conventional magnetic resonance imaging and apparent diffusion coefficient (P < 0.05). Given the high incidence of susceptibility-weighted imaging (SWI) abnormality in prostate cancer, we conclude that susceptibility-weighted imaging is more sensitive and specific than conventional magnetic resonance imaging, diffusion-weighted imaging, and computed tomography in detecting prostate cancer. Furthermore, susceptibility-weighted imaging can identify prostatic calcifications similar to computed tomography, and it is much better than conventional magnetic resonance imaging and diffusion-weighted imaging. PMID:27004542

  3. Detecting prostate cancer and prostatic calcifications using advanced magnetic resonance imaging.

    PubMed

    Dou, Shewei; Bai, Yan; Shandil, Ankit; Ding, Degang; Shi, Dapeng; Haacke, E Mark; Wang, Meiyun

    2017-01-01

    Prostate cancer and prostatic calcifications have a high incidence in elderly men. We aimed to investigate the diagnostic capabilities of susceptibility-weighted imaging in detecting prostate cancer and prostatic calcifications. A total of 156 men, 34 with prostate cancer and 122 with benign prostates, were enrolled in this study. Computed tomography, conventional magnetic resonance imaging, diffusion-weighted imaging, and susceptibility-weighted imaging were performed on all the patients. One hundred and twelve prostatic calcifications were detected in 87 patients. The sensitivities and specificities of the conventional magnetic resonance imaging, apparent diffusion coefficient, and susceptibility-filtered phase images in detecting prostate cancer and prostatic calcifications were calculated. McNemar's Chi-square test was used to compare the differences in sensitivities and specificities between the techniques. The results showed that the sensitivity and specificity of susceptibility-filtered phase images in detecting prostate cancer were greater than that of conventional magnetic resonance imaging and apparent diffusion coefficient (P < 0.05). In addition, the sensitivity and specificity of susceptibility-filtered phase images in detecting prostatic calcifications were comparable to that of computed tomography and greater than that of conventional magnetic resonance imaging and apparent diffusion coefficient (P < 0.05). Given the high incidence of susceptibility-weighted imaging (SWI) abnormality in prostate cancer, we conclude that susceptibility-weighted imaging is more sensitive and specific than conventional magnetic resonance imaging, diffusion-weighted imaging, and computed tomography in detecting prostate cancer. Furthermore, susceptibility-weighted imaging can identify prostatic calcifications similar to computed tomography, and it is much better than conventional magnetic resonance imaging and diffusion-weighted imaging.

  4. Concerted spatial-frequency and polarization-phase filtering of laser images of polycrystalline networks of blood plasma smears

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu A.

    2012-11-01

    The complex technique of concerted polarization-phase and spatial-frequency filtering of blood plasma laser images is suggested. The possibility of obtaining the coordinate distributions of phases of linearly and circularly birefringent protein networks of blood plasma separately is presented. The statistical (moments of the first to fourth orders) and scale self-similar (logarithmic dependences of power spectra) structure of phase maps of different types of birefringence of blood plasma of two groups, healthy people (donors) and patients suffering from rectal cancer, is investigated. The diagnostically sensitive parameters of a pathological change of the birefringence of blood plasma polycrystalline networks are determined. The effectiveness of this technique for detecting change in birefringence in the smears of other biological fluids in diagnosing the appearance of cholelithiasis (bile), operative differentiation of the acute and gangrenous appendicitis (exudate), and differentiation of inflammatory diseases of joints (synovial fluid) is shown.

  5. Morphology of Nano and Micro Fiber Structures in Ultrafine Particles Filtration

    NASA Astrophysics Data System (ADS)

    Kimmer, Dusan; Vincent, Ivo; Fenyk, Jan; Petras, David; Zatloukal, Martin; Sambaer, Wannes; Zdimal, Vladimir

    2011-07-01

    Selected procedures permitting to prepare homogeneous nanofibre structures of the desired morphology by employing a suitable combination of variables during the electrospinning process are presented. A comparison (at the same pressure drop) was made of filtration capabilities of planar polyurethane nanostructures formed exclusively by nanofibres, space polycarbonate nanostructures having bead spacers, structures formed by a combination of polymethyl methacrylate micro- and nanofibres and polypropylene meltblown microstructures, through which ultrafine particles of ammonium sulphate 20-400 nm in size were filtered. The structures studied were described using a new digital image analysis technique based on black and white images obtained by scanning electron microscopy. More voluminous structures modified with distance microspheres, which have a greater thickness and mass per unit area and hence the better mechanical properties that are much in demand in nanostructures, enable preparation of filters having approximately the same free volume fraction as flat nanofibre filters but an increased effective fibre surface area, a changed pore size morphology and, consequently, a higher filter quality.

  6. Implementation theory of distortion-invariant pattern recognition for optical and digital signal processing systems

    NASA Astrophysics Data System (ADS)

    Lhamon, Michael Earl

    A pattern recognition system which uses complex correlation filter banks requires proportionally more computational effort than single real-valued filters. This increases the computational burden but also introduces a higher level of parallelism that common computing platforms fail to exploit. As a result, we consider algorithm mapping to both optical and digital processors. For digital implementation, we develop computationally efficient pattern recognition algorithms, referred to as vector inner product operators, that require less computational effort than traditional fast Fourier methods. These algorithms do not need correlation, and they map readily onto parallel digital architectures, which imply new architectures for optical processors. These filters exploit circulant-symmetric matrix structures of the training set data representing a variety of distortions. By using the same mathematical basis as with the vector inner product operations, we are able to extend the capabilities of more traditional correlation filtering to what we refer to as "Super Images". These "Super Images" are used to morphologically transform a complicated input scene into a predetermined dot pattern. The orientation of the dot pattern is related to the rotational distortion of the object of interest. The optical implementation of "Super Images" yields the feature reduction necessary for using other techniques, such as artificial neural networks. We propose a parallel digital signal processor architecture based on specific pattern recognition algorithms but general enough to be applicable to other similar problems. Such an architecture is classified as a data flow architecture. Instead of mapping an algorithm to an architecture, we propose mapping the DSP architecture to a class of pattern recognition algorithms. Today's optical processing systems have difficulties implementing full complex filter structures.
Typically, optical systems (like the 4f correlators) are limited to phase-only implementation with lower detection performance than full complex electronic systems. Our study includes pseudo-random pixel encoding techniques for approximating full complex filtering. Optical filter-bank implementation is possible, with the advantage of time-averaging the entire filter bank at real-time rates. Time-averaged optical filtering is computationally comparable to billions of digital operations per second. For this reason, we believe future trends in high-speed pattern recognition will involve hybrid architectures of both optical and DSP elements.

  7. Denoising Algorithm for CFA Image Sensors Considering Inter-Channel Correlation.

    PubMed

    Lee, Min Seok; Park, Sang Wook; Kang, Moon Gi

    2017-05-28

    In this paper, a spatio-spectral-temporal filter considering an inter-channel correlation is proposed for the denoising of a color filter array (CFA) sequence acquired by CCD/CMOS image sensors. Owing to the alternating under-sampled grid of the CFA pattern, the inter-channel correlation must be considered in the direct denoising process. The proposed filter is applied in the spatial, spectral, and temporal domains, considering the spatio-spectral-temporal correlation. First, nonlocal means (NLM) spatial filtering with patch-based difference (PBD) refinement is performed by considering both the intra-channel correlation and the inter-channel correlation to overcome the spatial resolution degradation occurring with the alternating under-sampled pattern. Second, a motion-compensated temporal filter that employs inter-channel correlated motion estimation and compensation is proposed to remove the noise in the temporal domain. Then, a motion-adaptive detection value controls the ratio between the spatial filter and the temporal filter. The denoised CFA sequence can thus be obtained without motion artifacts. Experimental results for both simulated and real CFA sequences are presented with visual and numerical comparisons to several state-of-the-art denoising methods combined with a demosaicing method. They confirm that the proposed framework outperforms the other techniques in terms of objective criteria and subjective visual perception in CFA sequences.
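
The spatial stage builds on nonlocal means. A grayscale sketch of the core weighting scheme is shown below; the paper's actual filter operates on the Bayer CFA grid with inter-channel PBD refinement and a temporal branch, all of which are omitted here. Each pixel is replaced by a weighted average of pixels whose surrounding patches look similar, with weights decaying exponentially in the patch distance.

```python
import numpy as np

def nlm(img, patch=1, search=5, h=0.1):
    """Nonlocal means: average pixels whose surrounding patches look alike."""
    pad = patch + search
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            cy, cx = y + pad, x + pad
            ref = padded[cy - patch:cy + patch + 1, cx - patch:cx + patch + 1]
            wsum = acc = 0.0
            for sy in range(-search, search + 1):
                for sx in range(-search, search + 1):
                    ny, nx = cy + sy, cx + sx
                    cand = padded[ny - patch:ny + patch + 1,
                                  nx - patch:nx + patch + 1]
                    w = np.exp(-np.sum((ref - cand) ** 2) / h ** 2)  # patch similarity
                    wsum += w
                    acc += w * padded[ny, nx]
            out[y, x] = acc / wsum
    return out
```

On a CFA grid, the key complication the paper addresses is that a pixel's nearest same-color neighbours are two samples away, which is where the inter-channel correlation becomes essential.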

  8. Image processing and recognition for biological images

    PubMed Central

    Uchida, Seiichi

    2013-01-01

    This paper reviews image processing and pattern recognition techniques, which will be useful to analyze bioimages. Although this paper does not provide their technical details, it will be possible to grasp their main tasks and typical tools to handle the tasks. Image processing is a large research area to improve the visibility of an input image and acquire some valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique to classify an input image into one of the predefined classes and also has a large research area. This paper overviews its two main modules, that is, feature extraction module and classification module. Throughout the paper, it will be emphasized that bioimage is a very difficult target for even state-of-the-art image processing and pattern recognition techniques due to noises, deformations, etc. This paper is expected to be one tutorial guide to bridge biology and image processing researchers for their further collaboration to tackle such a difficult target. PMID:23560739

  9. Flame filtering and perimeter localization of wildfires using aerial thermal imagery

    NASA Astrophysics Data System (ADS)

    Valero, Mario M.; Verstockt, Steven; Rios, Oriol; Pastor, Elsa; Vandecasteele, Florian; Planas, Eulàlia

    2017-05-01

    Airborne thermal infrared (TIR) imaging systems are being increasingly used for wildfire tactical monitoring since they show important advantages over spaceborne platforms and visible sensors while becoming much more affordable and much lighter than multispectral cameras. However, the analysis of aerial TIR images entails a number of difficulties which have thus far prevented monitoring tasks from being totally automated. One of these issues that needs to be addressed is the appearance of flame projections during the geo-correction of off-nadir images. Filtering these flames is essential in order to accurately estimate the geographical location of the fuel burning interface. Therefore, we present a methodology which allows the automatic localisation of the active fire contour free of flame projections. The actively burning area is detected in TIR georeferenced images through a combination of intensity thresholding techniques, morphological processing and active contours. Subsequently, flame projections are filtered out by the temporal frequency analysis of the appropriate contour descriptors. The proposed algorithm was tested on footages acquired during three large-scale field experimental burns. Results suggest this methodology may be suitable for automating the acquisition of quantitative data about the fire evolution. As future work, a revision of the low-pass filter implemented for the temporal analysis (currently a median filter) was recommended. The availability of up-to-date information about the fire state would improve situational awareness during an emergency response and may be used to calibrate data-driven simulators capable of emitting accurate short-term forecasts of the subsequent fire evolution.
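
The first two stages named above (intensity thresholding and morphological processing) can be sketched in a few lines of NumPy; the active-contour refinement and the temporal-frequency flame filtering are omitted, and the 3x3 structuring element and circular border handling are simplifying assumptions, not the paper's exact choices.

```python
import numpy as np

def _shifted(mask):
    """All 3x3 neighbourhood shifts of a mask (circular borders, for brevity)."""
    return np.stack([np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)])

def erode(mask):
    return np.logical_and.reduce(_shifted(mask))

def dilate(mask):
    return np.logical_or.reduce(_shifted(mask))

def active_fire_mask(tir, threshold):
    """Threshold a TIR frame, then open (erode + dilate) to drop hot speckle."""
    return dilate(erode(tir > threshold))
```

The morphological opening keeps the extended burning region while removing isolated hot pixels that would otherwise distort the extracted fire perimeter.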

  10. Inverse halftoning via robust nonlinear filtering

    NASA Astrophysics Data System (ADS)

    Shen, Mei-Yin; Kuo, C.-C. Jay

    1999-10-01

    A new blind inverse halftoning algorithm based on a nonlinear filtering technique of low computational complexity and low memory requirement is proposed in this research. It is called blind since knowledge of the halftone kernel is not required. The proposed scheme performs nonlinear filtering in conjunction with edge enhancement to improve the quality of an inverse halftoned image. Distinct features of the proposed approach include: efficient smoothing of halftone patterns in large homogeneous areas, additional edge enhancement capability to recover the edge quality, and an excellent PSNR performance with only local integer operations and a small memory buffer.
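
As a generic stand-in for the (unspecified) nonlinear filter, the sketch below pairs a median filter, which suppresses halftone dot patterns in homogeneous areas, with unsharp masking for the edge-enhancement step; both choices are assumptions for illustration, not the authors' exact operators.

```python
import numpy as np

def median3(img):
    """3x3 median filter with edge padding."""
    padded = np.pad(img, 1, mode="edge")
    windows = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)]
    return np.median(np.stack(windows), axis=0)

def inverse_halftone(binary, amount=0.5):
    """Smooth halftone dots, then restore edge contrast via unsharp masking."""
    smooth = median3(median3(binary.astype(float)))
    return smooth + amount * (smooth - median3(smooth))
```

Like the scheme in the abstract, this uses only small local windows, so it needs just a few buffered image rows rather than the full frame.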

  11. Optimized Beam Sculpting with Generalized Fringe-rate Filters

    NASA Astrophysics Data System (ADS)

    Parsons, Aaron R.; Liu, Adrian; Ali, Zaki S.; Cheng, Carina

    2016-03-01

    We generalize the technique of fringe-rate filtering, whereby visibilities measured by a radio interferometer are re-weighted according to their temporal variation. As the Earth rotates, radio sources traverse through an interferometer’s fringe pattern at rates that depend on their position on the sky. Capitalizing on this geometric interpretation of fringe rates, we employ time-domain convolution kernels to enact fringe-rate filters that sculpt the effective primary beam of antennas in an interferometer. As we show, beam sculpting through fringe-rate filtering can be used to optimize measurements for a variety of applications, including mapmaking, minimizing polarization leakage, suppressing instrumental systematics, and enhancing the sensitivity of power-spectrum measurements. We show that fringe-rate filtering arises naturally in minimum variance treatments of many of these problems, enabling optimal visibility-based approaches to analyses of interferometric data that avoid systematics potentially introduced by traditional approaches such as imaging. Our techniques have recently been demonstrated in Ali et al., where new upper limits were placed on the 21 cm power spectrum from reionization, showcasing the ability of fringe-rate filtering to successfully boost sensitivity and reduce the impact of systematics in deep observations.
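
In its simplest form (an assumed idealization of the generalized filters in the paper), fringe-rate filtering is a band selection in the temporal Fourier domain of each visibility time-series, which is equivalent to a time-domain convolution: sources whose sky positions produce fringe rates outside the retained band are suppressed.

```python
import numpy as np

def fringe_rate_filter(vis, dt, f_lo, f_hi):
    """Zero every fringe-rate bin outside [f_lo, f_hi] (rates in Hz)."""
    rates = np.fft.fftfreq(vis.size, d=dt)  # fringe-rate axis for sampling dt
    V = np.fft.fft(vis)
    V[(rates < f_lo) | (rates > f_hi)] = 0.0
    return np.fft.ifft(V)
```

Choosing the band amounts to sculpting the effective beam: only sky regions whose fringe rates fall inside the window contribute to the filtered visibility.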

  12. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array.

    PubMed

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J; Urbas, Augustine

    2016-10-10

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed "algorithmic spectrometry". We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imagery using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the spectrum of blackbody radiation and a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme.
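
The "weighted superposition" reconstruction can be sketched as linear spectral unmixing (an assumed linear model with made-up filter responses, not the authors' calibrated weights): each plasmonic filter k has a spectral response R[k, :], the array measures m = R @ s for target spectrum s, and s is recovered by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n_filters, n_bands = 8, 5
R = rng.random((n_filters, n_bands))           # hypothetical filter responses
s_true = np.array([0.2, 1.0, 0.5, 0.0, 0.8])   # hypothetical target spectrum
m = R @ s_true                                 # noiseless array measurements

# reconstruct the spectrum as the least-squares weighted superposition
s_est, *_ = np.linalg.lstsq(R, m, rcond=None)
```

With more filters than bands the system is overdetermined, which is what lets broad, overlapping plasmonic passbands still pin down intensities at individual wavelengths.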

  13. Color filter array design based on a human visual model

    NASA Astrophysics Data System (ADS)

    Parmar, Manu; Reeves, Stanley J.

    2004-05-01

    To reduce cost and complexity associated with registering multiple color sensors, most consumer digital color cameras employ a single sensor. A mosaic of color filters is overlaid on a sensor array such that only one color channel is sampled per pixel location. The missing color values must be reconstructed from available data before the image is displayed. The quality of the reconstructed image depends fundamentally on the array pattern and the reconstruction technique. We present a design method for color filter array patterns that use red, green, and blue color channels in an RGB array. A model of the human visual response for luminance and opponent chrominance channels is used to characterize the perceptual error between a fully sampled and a reconstructed sparsely-sampled image. Demosaicking is accomplished using Wiener reconstruction. To ensure that the error criterion reflects perceptual effects, reconstruction is done in a perceptually uniform color space. A sequential backward selection algorithm is used to optimize the error criterion to obtain the sampling arrangement. Two different types of array patterns are designed: non-periodic and periodic arrays. The resulting array patterns outperform commonly used color filter arrays in terms of the error criterion.

  14. Damage localization in aluminum plate with compact rectangular phased piezoelectric transducer array

    NASA Astrophysics Data System (ADS)

    Liu, Zenghua; Sun, Kunming; Song, Guorong; He, Cunfu; Wu, Bin

    2016-03-01

    In this work, a detection method for damage in plate-like structures using a compact rectangular phased piezoelectric transducer array of 16 elements was presented. This compact array can not only detect and locate a single defect (a through hole) in a plate, but also identify multiple defects (through holes and a surface defect simulated by an iron pillar glued to the plate). The experiments showed that the compact rectangular phased transducer array could cover the full extent of the plate and detect multiple defects simultaneously. The proposed processing algorithm contains two parts: signal filtering and damage imaging. For filtering, the continuous wavelet transform was used to remove noise: it provides a map of wavelet coefficients from which the narrow-frequency-band signal of interest can easily be extracted. For imaging, two images were formed from amplitude and phase information in order to locate defects accurately and improve image quality: an amplitude image obtained with the Total Focusing Method (TFM) and a phase image obtained with the Sign Coherence Factor (SCF). Furthermore, an image compounding technique for the compact rectangular phased piezoelectric transducer array was proposed: combining the TFM image with the SCF image yields a compounded image with greatly improved resolution and contrast.
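
    The amplitude (TFM) part of the imaging step can be illustrated with a minimal delay-and-sum over synthetic full-matrix-capture signals. The array geometry, wave speed, and pulse shape below are made-up assumptions for illustration, not the paper's experimental parameters:

```python
import numpy as np

c = 3000.0                              # assumed guided-wave speed, m/s
fs = 5e6                                # sampling rate, Hz
elems = np.linspace(-0.02, 0.02, 8)     # 8-element linear array, x (m)
scat = (0.005, 0.03)                    # true scatterer position (x, z), m
t = np.arange(2048) / fs

def tof(xe, p):
    """One-way time of flight from the element at xe to point p."""
    return np.hypot(p[0] - xe, p[1]) / c

# Synthesize full-matrix-capture data: a Gaussian pulse envelope at
# each transmit-receive round-trip time of flight
fmc = np.zeros((len(elems), len(elems), len(t)))
for i, xi in enumerate(elems):
    for j, xj in enumerate(elems):
        d = tof(xi, scat) + tof(xj, scat)
        fmc[i, j] = np.exp(-((t - d) * 2e6) ** 2)

# TFM: for every pixel, sum amplitudes at the pixel's round-trip delays
xs = np.linspace(-0.02, 0.02, 41)
zs = np.linspace(0.01, 0.05, 41)
img = np.zeros((len(zs), len(xs)))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        d1 = np.hypot(x - elems, z) / c   # one-way delays to all elements
        for i in range(len(elems)):
            for j in range(len(elems)):
                k = int((d1[i] + d1[j]) * fs)
                if k < len(t):
                    img[iz, ix] += fmc[i, j, k]

peak_iz, peak_ix = np.unravel_index(np.argmax(img), img.shape)
```

    The image maximum should fall on (or within a pixel of) the true scatterer position; the SCF weighting in the paper would additionally suppress incoherent sidelobes before compounding.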

  15. A supervised 'lesion-enhancement' filter by use of a massive-training artificial neural network (MTANN) in computer-aided diagnosis (CAD).

    PubMed

    Suzuki, Kenji

    2009-09-21

    Computer-aided diagnosis (CAD) has been an active area of study in medical image analysis. A filter for the enhancement of lesions plays an important role in improving the sensitivity and specificity in CAD schemes. The filter enhances objects similar to a model employed in the filter; e.g. a blob-enhancement filter based on the Hessian matrix enhances sphere-like objects. Actual lesions, however, often differ from a simple model; e.g. a lung nodule is generally modeled as a solid sphere, but there are nodules of various shapes and with internal inhomogeneities such as a nodule with spiculations and ground-glass opacity. Thus, conventional filters often fail to enhance actual lesions. Our purpose in this study was to develop a supervised filter for the enhancement of actual lesions (as opposed to a lesion model) by use of a massive-training artificial neural network (MTANN) in a CAD scheme for detection of lung nodules in CT. The MTANN filter was trained with actual nodules in CT images to enhance actual patterns of nodules. By use of the MTANN filter, the sensitivity and specificity of our CAD scheme were improved substantially. With a database of 69 lung cancers, nodule candidate detection by the MTANN filter achieved a 97% sensitivity with 6.7 false positives (FPs) per section, whereas nodule candidate detection by a difference-image technique achieved a 96% sensitivity with 19.3 FPs per section. Classification-MTANNs were applied for further reduction of the FPs. The classification-MTANNs removed 60% of the FPs with a loss of one true positive; thus, the scheme achieved a 96% sensitivity with 2.7 FPs per section. Overall, with our CAD scheme based on the MTANN filter and classification-MTANNs, an 84% sensitivity with 0.5 FPs per section was achieved.
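
    The supervised-filter idea can be sketched in miniature: learn, from an input image and its ideal "enhanced" output, a filter whose response is high on lesion-like pixels. A linear least-squares fit over 3x3 patches stands in here for the massive-training ANN, and the blob image and noise level are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def patches3x3(img):
    """Row-major list of flattened 3x3 neighborhoods (edge-padded)."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    return np.array([p[y:y + 3, x:x + 3].ravel()
                     for y in range(h) for x in range(w)])

# Training pair: a noisy image containing a "lesion" blob, and the
# ideal enhanced output (1 inside the lesion, 0 elsewhere)
target = np.zeros((16, 16))
target[6:10, 6:10] = 1.0
train = target + 0.1 * rng.standard_normal(target.shape)

# Supervised training: fit patch weights to the ideal output
weights, *_ = np.linalg.lstsq(patches3x3(train), target.ravel(),
                              rcond=None)

# Apply the learned filter to a new noisy image of the same scene
test_img = target + 0.1 * rng.standard_normal(target.shape)
enhanced = (patches3x3(test_img) @ weights).reshape(16, 16)
```

    The learned filter responds strongly inside the lesion and weakly elsewhere; the MTANN replaces this linear map with a nonlinear network trained on many real nodule subregions.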

  16. Model-based iterative reconstruction in low-dose CT colonography-feasibility study in 65 patients for symptomatic investigation.

    PubMed

    Vardhanabhuti, Varut; James, Julia; Nensey, Rehaan; Hyde, Christopher; Roobottom, Carl

    2015-05-01

    To compare image quality on computed tomographic colonography (CTC) acquired at standard dose (STD) and low dose (LD) using filtered-back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) techniques. A total of 65 symptomatic patients were prospectively enrolled for the study and underwent STD and LD CTC with filtered-back projection, adaptive statistical iterative reconstruction, and MBIR to allow direct per-patient comparison. Objective image noise, subjective image analyses, and polyp detection were assessed. Objective image noise analysis demonstrates significant noise reduction using MBIR technique (P < .05) despite being acquired at lower doses. Subjective image analyses were superior for LD MBIR in all parameters except visibility of extracolonic lesions (two-dimensional) and visibility of colonic wall (three-dimensional) where there were no significant differences. There was no significant difference in polyp detection rates (P > .05). Doses: LD (dose-length product, 257.7), STD (dose-length product, 483.6). LD MBIR CTC objectively shows improved image noise using parameters in our study. Subjectively, image quality is maintained. Polyp detection shows no significant difference but because of small numbers needs further validation. Average dose reduction of 47% can be achieved. This study confirms feasibility of using MBIR in this context of CTC in symptomatic population. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  17. A Synthesis of Star Calibration Techniques for Ground-Based Narrowband Electron-Multiplying Charge-Coupled Device Imagers Used in Auroral Photometry

    NASA Technical Reports Server (NTRS)

    Grubbs, Guy II; Michell, Robert; Samara, Marilia; Hampton, Don; Jahn, Jorg-Micha

    2016-01-01

    A technique is presented for the periodic and systematic calibration of ground-based optical imagers. It is important to have a common system of units (Rayleighs or photon flux) for cross comparison as well as self-comparison over time. With the advancement in technology, the sensitivity of these imagers has improved so that stars can be used for more precise calibration. Background subtraction, flat fielding, star mapping, and other common techniques are combined in deriving a calibration technique appropriate for a variety of ground-based imager installations. Spectral (4278, 5577, and 8446 Å) ground-based imager data with multiple fields of view (19, 47, and 180 deg) are processed and calibrated using the techniques developed. The calibration techniques applied result in intensity measurements in agreement between different imagers using identical spectral filtering, and the intensity at each wavelength observed is within the expected range of auroral measurements. The application of these star calibration techniques, which convert raw imager counts into units of photon flux, makes it possible to do quantitative photometry. The computed photon fluxes, in units of Rayleighs, can be used for the absolute photometry between instruments or as input parameters for auroral electron transport models.
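
    The background-subtraction and flat-fielding steps named above combine, in their simplest form, as (raw - dark) / flat; a star of known brightness then fixes the counts-to-flux scale. All numbers below are synthetic, not from any real imager:

```python
import numpy as np

rng = np.random.default_rng(2)
flat = 0.8 + 0.4 * rng.random((16, 16))   # pixel-to-pixel gain map
dark = 100.0                              # constant background counts
scene = np.full((16, 16), 250.0)          # true sky intensity (arb. units)
scene[8, 8] += 500.0                      # "star" of known brightness

raw = scene * flat + dark                 # what the detector records
corrected = (raw - dark) / flat           # background-free, flat-fielded

# Counts-to-flux factor from the known star: 500 units above background
gain = 500.0 / (corrected[8, 8] - np.median(corrected))
```

    With noiseless synthetic data the correction is exact and the derived gain is 1; in practice the dark frame and flat field are themselves measured, and several catalogued stars are averaged to estimate the gain.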

  18. SU-F-J-26: Performance of 2.5MV Portal Imaging in Comparison with KV X-Ray and 6MV and Flattening-Filter-Free 6MV Portal Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, J; Yang, Y; Faught, A

    Purpose: To assess image quality and imaging dose of 2.5MV electronic portal imaging in comparison to kV imaging and 6MV and Flattening-Filter-Free 6MV (6MVFFF) portal imaging using a DMI imager. Methods: Quantitative assessment of image quality was performed with Leeds and Las Vegas test phantoms in conjunction with qualitative evaluation of clinical patient images for kV imaging and 2.5MV, 6MV and 6MVFFF portal imaging. High- and low-contrast resolutions were evaluated and imaging doses were measured for these x-ray beams. Phantom tests were performed both in air and in solid water. Clinical patient portal images were also reviewed and qualitatively assessed for these three MV imaging energies. Results: Among the 28 objects in the Las Vegas phantom, 16, 17 and 26 were resolved using the Low Dose technique and 18, 22 and 26 using the High Quality technique with 6MV, 6MVFFF and 2.5MV, respectively. The number of Leeds low-contrast objects resolved by 6MV, 6MVFFF and 2.5MV was 6, 15 and 18 with the Low Dose technique and 14, 17 and 18 with the High Quality technique, respectively. When the test phantoms were embedded in 20cm-thick solid water, the results were noticeably affected, but the performance of 2.5MV was still substantially better than 6MV and 6MVFFF. The imaging dose with 2.5MV measured at 10 cm depth was about half of that with 6MV or 6MVFFF. Clinical patient portal images were reviewed and qualitatively assessed for different sites including brain, head-and-neck, chest and pelvis; 2.5MV imaging provided more details and substantially higher contrast. Conclusion: While portal imaging with 6MVFFF provides noticeably better image quality than 6MV, the performance of 2.5MV portal imaging is substantially better than both 6MV and 6MVFFF in terms of high- and low-contrast resolution as well as lower imaging dose. 2.5MV imaging provides near-kV image quality.

  19. Efficient Scalable Median Filtering Using Histogram-Based Operations.

    PubMed

    Green, Oded

    2018-05-01

    Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm for rearranging the values within a filtering window and taking the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow, which makes such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is not sorting-based. The new algorithm uses efficient histogram-based operations, which reduce the computational requirements of the algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation has near-perfect linear scaling on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches are preferable as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
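
    A single-threaded sketch of the histogram (non-sorting) idea for 8-bit images, in the spirit of Huang's classic running-histogram median filter; this is an illustrative simplification, not the paper's parallel algorithm:

```python
import numpy as np

def median_from_hist(hist, half):
    """Smallest value whose cumulative count exceeds half the window."""
    c = 0
    for v in range(256):
        c += hist[v]
        if c > half:
            return v

def median_filter_hist(img, r):
    """Sliding-window median of an 8-bit image via a running histogram:
    slide the window right by removing the departing column and adding
    the arriving one, instead of re-sorting the whole window."""
    h, w = img.shape
    win = 2 * r + 1
    half = (win * win) // 2
    pad = np.pad(img, r, mode='edge')
    out = np.empty_like(img)
    for y in range(h):
        hist = np.zeros(256, dtype=int)
        for v in pad[y:y + win, 0:win].ravel():   # first window in row
            hist[v] += 1
        out[y, 0] = median_from_hist(hist, half)
        for x in range(1, w):
            for v in pad[y:y + win, x - 1]:        # departing column
                hist[v] -= 1
            for v in pad[y:y + win, x + win - 1]:  # arriving column
                hist[v] += 1
            out[y, x] = median_from_hist(hist, half)
    return out

rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, (12, 12), dtype=np.uint8)
smoothed = median_filter_hist(noisy, 1)
```

    Per pixel, only 2(2r+1) histogram updates are needed regardless of window area, which is why the histogram approach wins over sorting as the window grows.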

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutcliffe, G. D.; Milanese, L. M.; Orozco, D.

    CR-39 detectors are used routinely in inertial confinement fusion (ICF) experiments as a part of nuclear diagnostics. CR-39 is filtered to stop fast ablator ions which have been accelerated from an ICF implosion due to electric fields caused by laser-plasma interactions. In some experiments, the filtering is insufficient to block these ions and the fusion-product signal tracks are lost in the large background of accelerated ion tracks. A technique for recovering signal in these scenarios has been developed, tested, and implemented successfully. The technique involves removing material from the surface of the CR-39 to a depth beyond the endpoint of the ablator ion tracks. It preserves signal magnitude (yield) as well as structure in radiograph images, and is effective when the signal-particle range is at least 10 μm deeper than the necessary bulk material removal.

  1. The Value of Rotational Venography Versus Anterior–Posterior Venography in 100 Consecutive IVC Filter Retrievals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefer, Ryan M., E-mail: rkiefer11@gmail.com; Pandey, Nirnimesh; Trerotola, Scott O.

    Purpose: Accurately detecting inferior vena cava (IVC) filter complications is important for safe and successful retrieval, as tip-embedded filters require removal with non-standard techniques. Venography prior to IVC filter retrieval has traditionally used a single anterior–posterior (AP) projection. This study compares the utility of rotational venography to AP venography prior to IVC filter removal. Materials and Methods: The rotational venograms from 100 consecutive IVC filter retrievals over a 35-month period were evaluated retrospectively. The AP view of the rotational venogram was examined separately from the full series by a radiologist blinded to alternative imaging and operative findings. The venograms were evaluated for tip embedding, filter fracture, filter thrombus, and IVC thrombus, and statistical analysis was performed. Results: Using operative findings and peri-procedural imaging as the reference standard, tip embedding occurred in 59 of the 100 filters (59%). AP venography correctly identified 31 tip-embedded filters (53% sensitivity) with two false positives (95% specificity), for an accuracy of 70%. Rotational venography correctly identified 58 tip-embedded filters (98% sensitivity) with one false positive (98% specificity), for an accuracy of 98%. A significant difference was found between the sensitivities of the two diagnostic approaches (P < .01). Other findings of thrombus and filter fracture were not significantly different between the two groups. Conclusion: Rotational venograms allow more accurate detection of tip-embedded IVC filters compared to AP views alone. As this determines the retrieval approach, rotational venograms are helpful if obtained prior to IVC filter retrieval.

  2. Crack Imaging and Quantification in Aluminum Plates with Guided Wave Wavenumber Analysis Methods

    NASA Technical Reports Server (NTRS)

    Yu, Lingyu; Tian, Zhenhua; Leckey, Cara A. C.

    2015-01-01

    Guided wavefield analysis methods for detection and quantification of crack damage in an aluminum plate are presented in this paper. New wavenumber components created by abrupt wave changes at the structural discontinuity are identified in the frequency-wavenumber spectra. It is shown that the new wavenumbers can be used to detect and characterize the crack dimensions. Two imaging-based approaches, filter-reconstructed imaging and spatial wavenumber imaging, are used to demonstrate how the cracks can be evaluated with wavenumber analysis. Filter-reconstructed imaging is shown to be a rapid method to map the plate and any existing damage, but with less precision in estimating crack dimensions, while spatial wavenumber imaging provides an intensity image of spatial wavenumber values with enhanced resolution of the crack dimensions. These techniques are applied to simulated wavefield data, and the simulation-based studies show that the spatial wavenumber imaging method is able to distinguish cracks of different severities. Laboratory experimental validation is performed for a single-crack case to confirm the methods' capabilities for imaging cracks in plates.
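
    The frequency-wavenumber idea behind these methods can be sketched with a 2D FFT of a synthetic space-time wavefield, in which an incident wave and a weaker reflection from a discontinuity separate by wavenumber sign. The wave parameters and amplitudes below are assumptions for illustration:

```python
import numpy as np

fs, dx = 1e6, 1e-3                      # sampling rate (Hz), scan pitch (m)
t = np.arange(512) / fs
xpos = np.arange(64) * dx
k0 = 700.0                              # assumed wavenumber, rad/m
w0 = 2 * np.pi * 100e3                  # 100 kHz excitation

T, X = np.meshgrid(t, xpos)             # shape (64, 512): space x time
field = np.cos(w0 * T - k0 * X)         # incident, forward-travelling wave
field += 0.3 * np.cos(w0 * T + k0 * X)  # weaker "crack" reflection

# 2D FFT: axis 0 -> wavenumber, axis 1 -> temporal frequency
FK = np.fft.fftshift(np.abs(np.fft.fft2(field)))
kaxis = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(len(xpos), dx))
faxis = np.fft.fftshift(np.fft.fftfreq(len(t), 1 / fs))

ki, fi = np.unravel_index(np.argmax(FK), FK.shape)
peak_k, peak_f = abs(kaxis[ki]), abs(faxis[fi])
```

    The dominant spectral peak sits at the excitation wavenumber and frequency; new components appearing away from the nominal dispersion branch are the signature the paper uses to detect and size the crack.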

  3. Anti-aliasing Wiener filtering for wave-front reconstruction in the spatial-frequency domain for high-order astronomical adaptive-optics systems.

    PubMed

    Correia, Carlos M; Teixeira, Joel

    2014-12-01

    Computationally efficient wave-front reconstruction techniques for astronomical adaptive-optics (AO) systems have seen great development in the past decade. Algorithms developed in the spatial-frequency (Fourier) domain have gathered much attention, especially for high-contrast imaging systems. In this paper we present the Wiener filter (resulting in the maximization of the Strehl ratio) and further develop formulae for the anti-aliasing (AA) Wiener filter that optimally takes into account high-order wave-front terms folded in-band during the sensing (i.e., discrete sampling) process. We employ a continuous spatial-frequency representation for the forward measurement operators and derive the Wiener filter when aliasing is explicitly taken into account. We further investigate the reconstructed wave-front, measurement-noise, and aliasing propagation coefficients as a function of the system order, and compare them to classical estimates using least-squares filters. Regarding high-contrast systems, we provide achievable performance results as a function of an ensemble of forward models for the Shack-Hartmann wave-front sensor (using sparse and nonsparse representations) and compute point-spread-function raw intensities. We find that for a 32×32 single-conjugate AO system the aliasing propagation coefficient is roughly 60% of that of the least-squares filters, whereas the noise propagation is around 80%. Contrast improvements by factors of up to 2 are achievable across the field in the H band. For current and next-generation high-contrast imagers, despite better aliasing mitigation, AA Wiener filtering cannot be used as a standalone method and must therefore be used in combination with optical spatial filters deployed before image formation actually takes place.
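
    As a generic illustration of Wiener filtering in the spatial-frequency domain (a textbook 1D deconvolution, not the paper's AO-specific anti-aliasing formulation; the blur kernel, noise level, and oracle signal PSD are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
x = np.zeros(n)
x[60:120] = 1.0                                  # "true" signal
h = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 4.0)
h /= h.sum()                                     # normalized Gaussian blur
H = np.fft.fft(np.fft.ifftshift(h))              # kernel -> transfer function

sigma = 0.01
y = np.real(np.fft.ifft(np.fft.fft(x) * H))      # blurred measurement...
y += sigma * rng.standard_normal(n)              # ...plus white noise

S = np.abs(np.fft.fft(x)) ** 2 / n               # signal PSD (oracle here)
N = sigma ** 2                                   # white-noise PSD
W = np.conj(H) * S / (np.abs(H) ** 2 * S + N)    # Wiener filter
x_hat = np.real(np.fft.ifft(W * np.fft.fft(y)))

mse_wiener = np.mean((x_hat - x) ** 2)
mse_raw = np.mean((y - x) ** 2)
```

    The filter boosts frequencies where the signal dominates and rolls off where noise (or, in the paper's case, aliased high-order terms) dominates; the AA variant modifies S and N to account for the folded wave-front content.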

  4. On-Line Multi-Damage Scanning Spatial-Wavenumber Filter Based Imaging Method for Aircraft Composite Structure.

    PubMed

    Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Bao, Qiao

    2017-05-11

    Structural health monitoring (SHM) of aircraft composite structure is helpful to increase reliability and reduce maintenance costs. Due to the great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF) based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT) sensor array, and a corresponding wavenumber-time imaging mechanism by using the multi-damage scattering signal. Secondly, through combining the on-line multi-damage SSWF and a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array to an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated multi-damage and real multi-damage introduced by repeated impacts are performed on a composite plate structure. The maximum localization error is less than 2 cm, which shows good performance of the multi-damage imaging method. Compared with the existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires no more than the multi-damage scattering signal and can be performed without depending on any wavenumber modeling or measuring. Besides, this method locates multiple damages by imaging instead of the geometric method, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures.

  5. On-Line Multi-Damage Scanning Spatial-Wavenumber Filter Based Imaging Method for Aircraft Composite Structure

    PubMed Central

    Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Bao, Qiao

    2017-01-01

    Structural health monitoring (SHM) of aircraft composite structure is helpful to increase reliability and reduce maintenance costs. Due to the great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF) based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT) sensor array, and a corresponding wavenumber-time imaging mechanism by using the multi-damage scattering signal. Secondly, through combining the on-line multi-damage SSWF and a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array to an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated multi-damage and real multi-damage introduced by repeated impacts are performed on a composite plate structure. The maximum localization error is less than 2 cm, which shows good performance of the multi-damage imaging method. Compared with the existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires no more than the multi-damage scattering signal and can be performed without depending on any wavenumber modeling or measuring. Besides, this method locates multiple damages by imaging instead of the geometric method, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures. PMID:28772879

  6. Neutrophil-endothelial cell interactions on endothelial monolayers grown on micropore filters.

    PubMed Central

    Taylor, R F; Price, T H; Schwartz, S M; Dale, D C

    1981-01-01

    We have developed a technique for growing endothelial monolayers on micropore filters. These monolayers demonstrate confluence by phase and electron microscopy and provide a functional barrier to passage of radiolabeled albumin. Neutrophils readily penetrate the monolayer in response to chemotaxin, whereas there is little movement in the absence of chemotaxin. This system offers unique advantages over available chemotaxis assays and may have wider applications in the study of endothelial function. PMID:7007441

  7. Minerals and aligned collagen fibrils in tilapia fish scales: structural analysis using dark-field and energy-filtered transmission electron microscopy and electron tomography.

    PubMed

    Okuda, Mitsuhiro; Ogawa, Nobuhiro; Takeguchi, Masaki; Hashimoto, Ayako; Tagaya, Motohiro; Chen, Song; Hanagata, Nobutaka; Ikoma, Toshiyuki

    2011-10-01

    The mineralized structure of aligned collagen fibrils in a tilapia fish scale was investigated using transmission electron microscopy (TEM) techniques after a thin sample was prepared using aqueous techniques. Electron diffraction and electron energy loss spectroscopy data indicated that a mineralized internal layer consisting of aligned collagen fibrils contains hydroxyapatite crystals. Bright-field imaging, dark-field imaging, and energy-filtered TEM showed that the hydroxyapatite was mainly distributed in the hole zones of the aligned collagen fibrils structure, while needle-like materials composed of calcium compounds including hydroxyapatite existed in the mineralized internal layer. Dark-field imaging and three-dimensional observation using electron tomography revealed that hydroxyapatite and needle-like materials were mainly found in the matrix between the collagen fibrils. It was observed that hydroxyapatite and needle-like materials were preferentially distributed on the surface of the hole zones in the aligned collagen fibrils structure and in the matrix between the collagen fibrils in the mineralized internal layer of the scale.

  8. Principle component analysis and linear discriminant analysis of multi-spectral autofluorescence imaging data for differentiating basal cell carcinoma and healthy skin

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.

    2016-09-01

    In the present paper, the ability to differentiate basal cell carcinoma (BCC) and healthy skin by combining multi-spectral autofluorescence imaging, principal component analysis (PCA), and linear discriminant analysis (LDA) has been demonstrated. For this purpose, an experimental setup comprising excitation and detection branches has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with central wavelengths of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the multi-spectral fluorescence imaging data. The results of this pilot study highlight the advantages of the proposed imaging technique for skin cancer diagnosis.
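
    The PCA-plus-LDA chain can be sketched on synthetic "multi-spectral" feature vectors (four bandpass intensities per pixel, echoing the four detection filters above). The class means, spreads, and sample counts are invented for illustration, not measured BCC data:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic 4-band intensity vectors for the two classes
healthy = rng.normal([1.0, 0.8, 0.6, 0.4], 0.05, (200, 4))
bcc = rng.normal([0.9, 0.9, 0.5, 0.5], 0.05, (200, 4))
Xf = np.vstack([healthy, bcc])
y = np.r_[np.zeros(200), np.ones(200)]

# PCA: project onto the top-2 principal components
Xc = Xf - Xf.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Fisher LDA on the PCA scores
m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)   # within-class scatter
wvec = np.linalg.solve(Sw, m1 - m0)              # discriminant direction
thresh = wvec @ (m0 + m1) / 2.0
pred = (Z @ wvec > thresh).astype(float)
accuracy = (pred == y).mean()
```

    PCA compresses the correlated band intensities; LDA then finds the single direction that best separates the two classes in the compressed space.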

  9. Pipeline for effective denoising of digital mammography and digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Borges, Lucas R.; Bakic, Predrag R.; Foi, Alessandro; Maidment, Andrew D. A.; Vieira, Marcelo A. C.

    2017-03-01

    Denoising can be used as a tool to enhance image quality while keeping radiation doses low in X-ray medical imaging. The effectiveness of denoising techniques relies on the validity of the underlying noise model. In full-field digital mammography (FFDM) and digital breast tomosynthesis (DBT), calibration steps such as the detector offset and flat-fielding can violate assumptions made by most denoising techniques. Furthermore, the quantum noise found in X-ray images is signal-dependent and can only be treated by specific filters. In this work we propose a pipeline for FFDM and DBT image denoising that accounts for the calibration steps and simplifies the modeling of the noise statistics through variance-stabilizing transformations (VST). The performance of a state-of-the-art denoising method was tested with and without the proposed pipeline. To evaluate the method, objective metrics such as the normalized root mean square error (N-RMSE), noise power spectrum, modulation transfer function (MTF) and frequency signal-to-noise ratio (SNR) were analyzed. Preliminary tests show that the pipeline improves denoising: when it is not used, bright pixels of the denoised image are under-filtered and dark pixels are over-smoothed because of the assumption of a signal-independent Gaussian model. The pipeline improved denoising by up to 20% in terms of spatial N-RMSE and up to 15% in terms of frequency SNR. Moreover, the pipeline does not increase signal smoothing significantly, as shown by the MTF. Thus, the proposed pipeline can be used with state-of-the-art denoising techniques to improve the quality of DBT and FFDM images.
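
    The variance-stabilizing idea can be sketched with the classical Anscombe transform: stabilize the signal-dependent Poisson (quantum) noise to roughly unit variance, denoise as if Gaussian, then invert. The moving-average smoother below is a stand-in for a state-of-the-art Gaussian-domain denoiser, and the offset/flat-field steps are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
clean = np.full(4096, 50.0)                  # flat signal, 50 expected counts
noisy = rng.poisson(clean).astype(float)     # Poisson: variance = mean

stab = 2.0 * np.sqrt(noisy + 3.0 / 8.0)      # Anscombe: variance ~ 1
kernel = np.ones(9) / 9.0
stab_dn = np.convolve(stab, kernel, mode='same')  # Gaussian-domain denoising
den = (stab_dn / 2.0) ** 2 - 3.0 / 8.0       # simple algebraic inverse

mse_dn = np.mean((den - clean) ** 2)
mse_raw = np.mean((noisy - clean) ** 2)
```

    Because the transformed noise has (approximately) constant variance, any Gaussian denoiser can be reused unchanged; a bias-corrected exact unbiased inverse, rather than the algebraic one above, is preferred at low counts.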

  10. Computer-Aided Diagnostic (CAD) Scheme by Use of Contralateral Subtraction Technique

    NASA Astrophysics Data System (ADS)

    Nagashima, Hiroyuki; Harakawa, Tetsumi

    We developed a computer-aided diagnostic (CAD) scheme for detection of subtle image findings of acute cerebral infarction in brain computed tomography (CT) by using a contralateral subtraction technique. In our computerized scheme, the lateral inclination of the image was first corrected automatically by rotating and shifting. The contralateral subtraction image was then derived by subtracting the left-right reversed image from the original image. Initial candidates for acute cerebral infarctions were identified using multiple-thresholding and image filtering techniques. As the first step in removing false-positive candidates, fourteen image features were extracted for each of the initial candidates, and halfway candidates were detected by applying a rule-based test with these features. At the second step, five image features were extracted using the overlap of halfway candidates in the slice of interest and the upper/lower slice images. Finally, acute cerebral infarction candidates were detected by applying a rule-based test with these five image features. The sensitivity of detection for 74 training cases was 97.4% with 3.7 false positives per image. The performance of the CAD scheme for 44 test cases was comparable to that for the training cases. Our CAD scheme using the contralateral subtraction technique can reveal suspected image findings of acute cerebral infarctions in CT images.
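
    The core subtraction step can be sketched directly (the tilt-correction stage is omitted and the "anatomy" is a toy array): mirror the slice left-right and subtract, so bilaterally symmetric structures cancel and an asymmetric finding survives thresholding:

```python
import numpy as np

slice_ = np.zeros((8, 8))
slice_[2:6, 1:3] = 1.0           # symmetric "anatomy" on the left...
slice_[2:6, 5:7] = 1.0           # ...and its mirror on the right
slice_[4, 6] += 0.5              # asymmetric "lesion"

diff = slice_ - slice_[:, ::-1]  # contralateral subtraction
# Symmetric structures cancel; only the lesion (and its negative
# mirror image) remain, so simple thresholding isolates it
candidates = np.argwhere(diff > 0.25)
```

    On real CT the alignment step above this subtraction is essential, since even a small lateral tilt leaves residual anatomy in the difference image.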

  11. Autographic theme extraction

    USGS Publications Warehouse

    Edson, D.; Colvocoresses, Alden P.

    1973-01-01

    Remote-sensor images, including aerial and space photographs, are generally recorded on film, where the differences in density create the image of the scene. With panchromatic and multiband systems the density differences are recorded in shades of gray. On color or color infrared film, with the emulsion containing dyes sensitive to different wavelengths, a color image is created by a combination of color densities. The colors, however, can be separated by filtering or other techniques, and the color image reduced to monochromatic images in which each of the separated bands is recorded as a function of the gray scale.

  12. IPL Processing of the Viking Orbiter Images of Mars

    NASA Technical Reports Server (NTRS)

    Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.

    1977-01-01

    The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.

  13. Identification of microplastics by FTIR and Raman microscopy: a novel silicon filter substrate opens the important spectral range below 1300 cm(-1) for FTIR transmission measurements.

    PubMed

    Käppler, Andrea; Windrich, Frank; Löder, Martin G J; Malanin, Mikhail; Fischer, Dieter; Labrenz, Matthias; Eichhorn, Klaus-Jochen; Voit, Brigitte

    2015-09-01

    The presence of microplastics in aquatic ecosystems is a topical problem and leads to the need for appropriate and reliable analytical methods to distinctly identify and quantify these particles in environmental samples. As an example, transmission Fourier transform infrared (FTIR) imaging can be used to analyze samples directly on filters without any visual presorting, once the environmental sample has been extracted, purified, and filtered. However, this analytical approach is strongly restricted by the limited IR transparency of conventional filter materials. Within this study, we describe a novel silicon (Si) filter substrate produced by photolithographic microstructuring, which guarantees sufficient transparency for the broad mid-infrared region of 4000-600 cm(-1). This filter type features holes with a diameter of 10 μm and exhibits adequate mechanical stability. Furthermore, it will be shown that our Si filter substrate allows a distinct identification of the most common microplastics, polyethylene (PE) and polypropylene (PP), in the characteristic fingerprint region (1400-600 cm(-1)). Moreover, using the Si filter substrate, a differentiation of microparticles of polyesters having quite similar chemical structure, like polyethylene terephthalate (PET) and polybutylene terephthalate (PBT), is now possible, which facilitates visualization of their distribution within a microplastic sample by FTIR imaging. Finally, this Si filter can also be used as a substrate for Raman microscopy, a second, complementary spectroscopic technique, to identify microplastic samples.

  14. Technical Note: Image filtering to make computer-aided detection robust to image reconstruction kernel choice in lung cancer CT screening

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohkubo, Masaki, E-mail: mook@clg.niigata-u.ac.jp

    Purpose: In lung cancer computed tomography (CT) screening, the performance of a computer-aided detection (CAD) system depends on the selection of the image reconstruction kernel. To reduce this dependence on reconstruction kernels, the authors propose a novel application of an image filtering method previously proposed by their group. Methods: The proposed filtering process uses the ratio of modulation transfer functions (MTFs) of two reconstruction kernels as a filtering function in the spatial-frequency domain. This method is referred to as MTF-ratio filtering. Test image data were obtained from CT screening scans of 67 subjects who each had one nodule. Images were reconstructed using two kernels: f_STD (for standard lung imaging) and f_SHARP (for sharp edge-enhancement lung imaging). The MTF-ratio filtering was implemented using the MTFs measured for those kernels and was applied to the reconstructed f_SHARP images to obtain images that were similar to the f_STD images. A mean filter and a median filter were applied (separately) for comparison. All reconstructed and filtered images were processed using their prototype CAD system. Results: The MTF-ratio filtered images showed excellent agreement with the f_STD images. The standard deviation of the difference between these images was very small, ∼6.0 Hounsfield units (HU). However, the mean and median filtered images showed larger differences of ∼48.1 and ∼57.9 HU from the f_STD images, respectively. The free-response receiver operating characteristic (FROC) curve for the f_SHARP images indicated poorer performance compared with the FROC curve for the f_STD images. The FROC curve for the MTF-ratio filtered images was equivalent to the curve for the f_STD images. However, this similarity was not achieved by using the mean filter or median filter.
Conclusions: The accuracy of MTF-ratio image filtering was verified, and the method was demonstrated to be effective for reducing the kernel dependence of CAD performance.
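
The core of the MTF-ratio idea can be sketched in a few lines of NumPy (an illustrative reconstruction, not the authors' code; the Gaussian MTF shapes, the `eps` regularization, and the noise image are assumptions for the toy example):

```python
import numpy as np

def mtf_ratio_filter(img_sharp, mtf_sharp, mtf_std, eps=1e-6):
    """Push a sharp-kernel image toward the appearance of a standard-kernel
    image by multiplying its spectrum by the ratio MTF_std / MTF_sharp."""
    F = np.fft.fft2(img_sharp)
    ratio = mtf_std / np.maximum(mtf_sharp, eps)  # the filtering function
    return np.real(np.fft.ifft2(F * ratio))

# Toy radially symmetric MTFs sampled on the 2-D FFT frequency grid
n = 64
fy = np.fft.fftfreq(n)[:, None]
fx = np.fft.fftfreq(n)[None, :]
f = np.hypot(fx, fy)
mtf_std = np.exp(-(f / 0.15) ** 2)    # smoother (standard) kernel
mtf_sharp = np.exp(-(f / 0.40) ** 2)  # sharper (edge-enhancing) kernel

rng = np.random.default_rng(0)
img = rng.standard_normal((n, n))
out = mtf_ratio_filter(img, mtf_sharp, mtf_std)
# Here the ratio is <= 1 everywhere, so high frequencies are attenuated:
print(out.std() < img.std())
```

Because the ratio of the two assumed MTFs is at most one, the filter acts as a smoothing operator on the sharp-kernel image, which is exactly the direction of the f_SHARP-to-f_STD conversion described above.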

  15. Task-based modeling and optimization of a cone-beam CT scanner for musculoskeletal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, P.; Zbijewski, W.; Gang, G. J.

    2011-10-15

    Purpose: This work applies a cascaded systems model for cone-beam CT imaging performance to the design and optimization of a system for musculoskeletal extremity imaging. The model provides a quantitative guide to the selection of system geometry, source and detector components, acquisition techniques, and reconstruction parameters. Methods: The model is based on cascaded systems analysis of the 3D noise-power spectrum (NPS) and noise-equivalent quanta (NEQ) combined with factors of system geometry (magnification, focal spot size, and scatter-to-primary ratio) and anatomical background clutter. The model was extended to task-based analysis of detectability index (d') for tasks ranging in contrast and frequency content, and d' was computed as a function of system magnification, detector pixel size, focal spot size, kVp, dose, electronic noise, voxel size, and reconstruction filter to examine trade-offs and optima among such factors in multivariate analysis. The model was tested quantitatively versus the measured NPS and qualitatively in cadaver images as a function of kVp, dose, pixel size, and reconstruction filter under conditions corresponding to the proposed scanner. Results: The analysis quantified trade-offs among factors of spatial resolution, noise, and dose. System magnification (M) was a critical design parameter with strong effects on spatial resolution, dose, and x-ray scatter, and a fairly robust optimum was identified at M ~ 1.3 for the imaging tasks considered. The results suggested kVp selection in the range of ~65-90 kVp, the lower end (65 kVp) maximizing subject contrast and the upper end (90 kVp) maximizing NEQ. The analysis quantified fairly intuitive results, e.g., ~0.1-0.2 mm pixel size (and a sharp reconstruction filter) optimal for high-frequency tasks (bone detail) compared to ~0.4 mm pixel size (and a smooth reconstruction filter) for low-frequency (soft-tissue) tasks.
This result suggests a specific protocol for 1 x 1 (full-resolution) projection data acquisition followed by full-resolution reconstruction with a sharp filter for high-frequency tasks, along with 2 x 2 binning reconstruction with a smooth filter for low-frequency tasks. The analysis guided selection of specific source and detector components implemented on the proposed scanner. The analysis also quantified the potential benefits and points of diminishing return in focal spot size, reduced electronic noise, finer detector pixels, and low-dose limits of detectability. Theoretical results agreed quantitatively with the measured NPS and qualitatively with evaluation of cadaver images by a musculoskeletal radiologist. Conclusions: A fairly comprehensive model for 3D imaging performance in cone-beam CT combines factors of quantum noise, system geometry, anatomical background, and imaging task. The analysis provided a valuable, quantitative guide to design, optimization, and technique selection for a musculoskeletal extremities imaging system under development.
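
The task-based figure of merit used here can be illustrated with a one-dimensional prewhitening-observer sketch, d'^2 = integral of |W_task(f)|^2 NEQ(f) df (the Gaussian MTF, the NPS shape, and the task functions below are assumptions for illustration, not the article's measured curves):

```python
import numpy as np

f = np.linspace(0.01, 2.0, 500)        # spatial frequency axis (arb. units)
df = f[1] - f[0]
mtf = np.exp(-(f / 0.8) ** 2)          # assumed system MTF
nps = 1.0 / (1.0 + f)                  # assumed normalized NPS shape
neq = mtf ** 2 / nps                   # noise-equivalent quanta (up to a scale)

def dprime(center, width):
    """Prewhitening-observer detectability for a Gaussian task function."""
    w = np.exp(-((f - center) / width) ** 2)
    return np.sqrt(np.sum(w ** 2 * neq) * df)

# On this toy system, a low-frequency (soft-tissue-like) task is more
# detectable than a high-frequency (bone-detail-like) task of equal magnitude:
print(dprime(0.2, 0.1) > dprime(1.5, 0.1))
```

Sweeping system parameters (pixel size, kVp, dose) changes `mtf` and `nps`, and hence `dprime`, which is the sense in which the model above guides multivariate optimization.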

  16. Development of multiple source data processing for structural analysis at a regional scale. [digital remote sensing in geology

    NASA Technical Reports Server (NTRS)

    Carrere, Veronique

    1990-01-01

    Various image processing techniques developed for enhancement and extraction of linear features, of interest to the structural geologist, from digital remote sensing, geologic, and gravity data, are presented. These techniques include: (1) automatic detection of linear features and construction of rose diagrams from Landsat MSS data; (2) enhancement of principal structural directions using selective filters on Landsat MSS, Spacelab panchromatic, and HCMM NIR data; (3) directional filtering of Spacelab panchromatic data using Fast Fourier Transform; (4) detection of linear/elongated zones of high thermal gradient from thermal infrared data; and (5) extraction of strong gravimetric gradients from digitized Bouguer anomaly maps. Processing results can be compared to each other through the use of a geocoded database to evaluate the structural importance of each lineament according to its depth: superficial structures in the sedimentary cover, or deeper ones affecting the basement. These image processing techniques were successfully applied to achieve a better understanding of the transition between Provence and the Pyrenees structural blocks, in southeastern France, for an improved structural interpretation of the Mediterranean region.
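
Technique (3), directional filtering in the Fourier domain, can be sketched as an orientation mask applied to the 2-D FFT (a minimal illustration; the wedge half-width and the synthetic stripe pattern are assumptions):

```python
import numpy as np

def directional_filter(img, theta, half_width_deg=15.0):
    """Pass only spatial frequencies whose orientation lies within
    +/- half_width_deg of theta (in degrees); keep the DC term."""
    n, m = img.shape
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(m)[None, :]
    ang = np.degrees(np.arctan2(fy, fx))                 # -180..180
    diff = np.abs((ang - theta + 90.0) % 180.0 - 90.0)   # orientation distance
    mask = (diff <= half_width_deg) | (np.hypot(fx, fy) == 0)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * mask))

# Vertical stripes concentrate their energy along the horizontal frequency
# axis, so a 0-degree mask preserves them and a 90-degree mask removes them:
x = np.arange(64)
stripes = np.cos(2 * np.pi * 8 * x / 64)[None, :] * np.ones((64, 1))
kept = directional_filter(stripes, 0.0)
removed = directional_filter(stripes, 90.0)
print(np.allclose(kept, stripes, atol=1e-8), np.abs(removed).max() < 1e-8)
```

Selecting different `theta` values picks out lineaments along the corresponding principal structural directions, the role the selective filters play in the abstract above.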

  17. Kinetic filtering of [18F]Fluorothymidine in positron emission tomography studies

    NASA Astrophysics Data System (ADS)

    Gray, Katherine R.; Contractor, Kaiyumars B.; Kenny, Laura M.; Al-Nahhas, Adil; Shousha, Sami; Stebbing, Justin; Wasan, Harpreet S.; Coombes, R. Charles; Aboagye, Eric O.; Turkheimer, Federico E.; Rosso, Lula

    2010-02-01

    [18F]Fluorothymidine (FLT) is a cell proliferation marker that undergoes predominantly hepatic metabolism and therefore shows a high level of accumulation in the liver, as well as in rapidly proliferating tumours. Furthermore, the tracer's uptake is substantial in other organs including the heart. We present a nonlinear kinetic filtering technique which enhances the visualization of tumours imaged with FLT positron emission tomography (FLT-PET). A classification algorithm to isolate cancerous tissue from healthy organs was developed and validated using 29 scans from patients with locally advanced or metastatic breast cancer. A large (80%) reduction in signal from the liver and heart was observed following application of the kinetic filter, whilst the majority of signal from both primary tumours and metastases was retained. A scan acquisition time of 60 min was shown to be sufficient to obtain the necessary kinetic data. The algorithm extends the utility of FLT-PET imaging in oncology research.

  18. An efficient algorithm for measurement of retinal vessel diameter from fundus images based on directional filtering

    NASA Astrophysics Data System (ADS)

    Wang, Xuchu; Niu, Yanmin

    2011-02-01

    Automatic measurement of vessels from fundus images is a crucial step for assessing vessel anomalies in the ophthalmological community, where changes in retinal vessel diameter are believed to be indicative of the risk level of diabetic retinopathy. In this paper, a new retinal vessel diameter measurement method combining vessel orientation estimation and filter response is proposed. Its interesting characteristics include: (1) unlike methods that only fit the vessel profiles, the proposed method extracts more stable and accurate vessel diameters by casting the problem as a maximal-response problem for a variation of the Gabor filter; (2) the proposed method can directly and efficiently estimate the vessel's orientation, which is usually captured by time-consuming multi-orientation fitting techniques in many existing methods. Experimental results show that the proposed method retains computational simplicity while achieving stable and accurate estimation results.
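
The orientation-by-maximal-response idea can be sketched as follows (an illustrative sketch, not the paper's specific Gabor variant; the kernel parameters, the angle grid, and the synthetic vessel patch are all assumptions):

```python
import numpy as np

def gabor_kernel(theta, freq=0.15, sigma=3.0, size=15):
    """Real (cosine) Gabor kernel oscillating along direction theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return g - g.mean()    # zero-mean, so flat regions give no response

def estimate_orientation(patch, n_angles=18):
    """Return the angle (degrees) whose Gabor filter responds most strongly."""
    best = max(range(n_angles),
               key=lambda k: abs(float(np.sum(
                   patch * gabor_kernel(np.pi * k / n_angles)))))
    return 180.0 * best / n_angles

# Synthetic vertical "vessel": a dark ridge running along y at x = 0.
# Its intensity varies along x, i.e. along the theta = 0 direction:
y, x = np.mgrid[-7:8, -7:8]
vessel = -np.exp(-x**2 / 4.0)
print(estimate_orientation(vessel))
```

Because the vessel profile varies only along x, the theta = 0 filter dominates and the estimate is 0 degrees; a diameter can then be read off from the profile perpendicular to the estimated orientation.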

  19. A novel pulse compression algorithm for frequency modulated active thermography using band-pass filter

    NASA Astrophysics Data System (ADS)

    Chatterjee, Krishnendu; Roy, Deboshree; Tuli, Suneet

    2017-05-01

    This paper proposes a novel pulse compression algorithm, in the context of frequency modulated thermal wave imaging. The compression filter is derived from a predefined reference pixel in a recorded video, which contains direct measurement of the excitation signal alongside the thermal image of a test piece. The filter causes all the phases of the constituent frequencies to be adjusted to nearly zero value, so that on reconstruction a pulse is obtained. Further, due to band-limited nature of the excitation, signal-to-noise ratio is improved by suppressing out-of-band noise. The result is similar to that of a pulsed thermography experiment, although the peak power is drastically reduced. The algorithm is successfully demonstrated on mild steel and carbon fibre reference samples. Objective comparisons of the proposed pulse compression algorithm with the existing techniques are presented.
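
The phase-zeroing pulse compression described above can be sketched with a linear-chirp reference and an FFT-domain filter (illustrative parameters; the chirp band, sampling rate, and band-pass threshold are assumptions):

```python
import numpy as np

fs, T = 1000.0, 2.0
t = np.arange(0, T, 1 / fs)
f0, f1 = 10.0, 50.0
# Linear-chirp "reference pixel" signal standing in for the excitation record
ref = np.cos(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * T)))

R = np.fft.rfft(ref)
H = np.conj(R) / np.maximum(np.abs(R), 1e-12)  # adjust all phases to ~zero
band = np.abs(R) > 0.05 * np.abs(R).max()      # crude band-pass support
H *= band                                      # suppress out-of-band noise

compressed = np.fft.irfft(H * R, n=len(ref))
# With all in-band phases zeroed, the energy collapses into a pulse at t = 0:
print(np.argmax(np.abs(compressed)) == 0)
```

Applying the same `H` to every pixel's thermal time series would compress each response into a pulse-thermography-like signal, which is the effect the abstract describes at drastically reduced peak power.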

  20. 3D-FFT for Signature Detection in LWIR Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Medvick, Patricia A.; Lind, Michael A.; Mackey, Patrick S.

    Improvements in detection and exploitation analysis are possible by applying whitened matched filtering within the Fourier domain to hyperspectral data cubes. We describe an implementation of a Three-Dimensional Fast Fourier Transform Whitened Matched Filter (3DFFTMF) approach and, using several example sets of Long Wave Infra Red (LWIR) data cubes, compare the results with those from standard Whitened Matched Filter (WMF) techniques. Since the variability in shape of gaseous plumes precludes the use of spatial conformation in the matched filtering, the 3DFFTMF results were similar to those of two other WMF methods. Including a spatial low-pass filter within the Fourier space can improve signal-to-noise ratios and therefore improve the detection limit by mitigating high-frequency clutter. The improvement only occurs if the low-pass filter diameter is smaller than the plume diameter.
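
The per-pixel whitened matched filter that the 3DFFTMF variant builds on can be sketched as follows (toy data, without the 3-D FFT machinery; the covariance model, the target signature, and the planted-target amplitude are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
bands, npix = 20, 500
# Background spectra with a correlated (non-white) toy covariance
A = rng.standard_normal((bands, bands)) * 0.3
cov = A @ A.T + np.eye(bands)
bg = rng.multivariate_normal(np.zeros(bands), cov, size=npix)
target = rng.standard_normal(bands)   # target spectral signature s
x = bg.copy()
x[0] += 4.0 * target                  # plant the target signature in pixel 0

# Whitened matched filter: w proportional to Cov^-1 s; score_i = w . x_i
cinv = np.linalg.inv(np.cov(x.T))
w = cinv @ target
scores = x @ w
print(int(np.argmax(scores)) == 0)
```

Whitening by the inverse background covariance is what suppresses structured clutter before matching; the spatial low-pass step discussed above would be applied on top of this, in the Fourier domain.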

  1. It all unraveled from there: case report of a central venous catheter guidewire unraveling.

    PubMed

    Zerkle, Samuel; Emdadi, Vanessa; Mancinelli, Marc

    2014-12-01

    Inferior vena cava (IVC) filters can present challenges to emergency physicians in the process of central venous catheter (CVC) placement. A 68-year-old woman presented to the emergency department with severe shortness of breath and was intubated. A central line was placed after the intubation to facilitate peripheral access. A CVC guidewire unraveled during placement after getting caught on an IVC filter. Why should an emergency physician be aware of this? Emergency physicians should be aware of the complications that IVC filters can cause in the placement of CVCs. Imaging and identification of IVC filters beforehand will allow for proper planning of how to manage the case in which a filter catches on the guidewire. Simple anecdotal techniques, such as advancing the guidewire and spinning the guidewire between the fingers, can facilitate the removal of the guidewire from the IVC filter. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. The use of adaptive statistical iterative reconstruction (ASiR) technique in evaluation of patients with cervical spine trauma: impact on radiation dose reduction and image quality

    PubMed Central

    Sheikh, Adnan

    2016-01-01

    Objective: The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. Methods: We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. Results: We found that the ASiR technique was able to reduce the volume CT dose index, dose–length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. Conclusion: The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. Advances in knowledge: The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions. PMID:26882825

  3. The use of adaptive statistical iterative reconstruction (ASiR) technique in evaluation of patients with cervical spine trauma: impact on radiation dose reduction and image quality.

    PubMed

    Patro, Satya N; Chakraborty, Santanu; Sheikh, Adnan

    2016-01-01

    The aim of this study was to evaluate the impact of adaptive statistical iterative reconstruction (ASiR) technique on the image quality and radiation dose reduction. The comparison was made with the traditional filtered back projection (FBP) technique. We retrospectively reviewed 78 patients, who underwent cervical spine CT for blunt cervical trauma between 1 June 2010 and 30 November 2010. 48 patients were imaged using traditional FBP technique and the remaining 30 patients were imaged using the ASiR technique. The patient demographics, radiation dose, objective image signal and noise were recorded; while subjective noise, sharpness, diagnostic acceptability and artefacts were graded by two radiologists blinded to the techniques. We found that the ASiR technique was able to reduce the volume CT dose index, dose-length product and effective dose by 36%, 36.5% and 36.5%, respectively, compared with the FBP technique. There was no significant difference in the image noise (p = 0.39), signal (p = 0.82) and signal-to-noise ratio (p = 0.56) between the groups. The subjective image quality was minimally better in the ASiR group but not statistically significant. There was excellent interobserver agreement on the subjective image quality and diagnostic acceptability for both groups. The use of ASiR technique allowed approximately 36% radiation dose reduction in the evaluation of cervical spine without degrading the image quality. The present study highlights that the ASiR technique is extremely helpful in reducing the patient radiation exposure while maintaining the image quality. It is highly recommended to utilize this novel technique in CT imaging of different body regions.

  4. Effect of Edge-Preserving Adaptive Image Filter on Low-Contrast Detectability in CT Systems: Application of ROC Analysis

    PubMed Central

    Okumura, Miwa; Ota, Takamasa; Kainuma, Kazuhisa; Sayre, James W.; McNitt-Gray, Michael; Katada, Kazuhiro

    2008-01-01

    Objective. For the multislice CT (MSCT) systems with a larger number of detector rows, it is essential to employ dose-reduction techniques. As reported in previous studies, edge-preserving adaptive image filters, which selectively eliminate only the noise elements that are increased when the radiation dose is reduced without affecting the sharpness of images, have been developed. In the present study, we employed receiver operating characteristic (ROC) analysis to assess the effects of the quantum denoising system (QDS), which is an edge-preserving adaptive filter that we have developed, on low-contrast resolution, and to evaluate to what degree the radiation dose can be reduced while maintaining acceptable low-contrast resolution. Materials and Methods. The low-contrast phantoms (Catphan 412) were scanned at various tube current settings, and ROC analysis was then performed for the groups of images obtained with/without the use of QDS at each tube current to determine whether or not a target could be identified. The tube current settings for which the area under the ROC curve (Az value) was approximately 0.7 were determined for both groups of images with/without the use of QDS. Then, the radiation dose reduction ratio when QDS was used was calculated by converting the determined tube current to the radiation dose. Results. The use of the QDS edge-preserving adaptive image filter allowed the radiation dose to be reduced by up to 38%. Conclusion. The QDS was found to be useful for reducing the radiation dose without affecting the low-contrast resolution in MSCT studies. PMID:19043565
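
The Az endpoint used above can be computed nonparametrically as the probability that a randomly chosen positive case outscores a negative one (a generic empirical estimate, not the study's fitting procedure; the score lists are made-up examples):

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """Empirical ROC area (Az): P(positive score > negative score),
    with ties counted as one half."""
    sp = np.asarray(scores_pos, dtype=float)[:, None]
    sn = np.asarray(scores_neg, dtype=float)[None, :]
    return float(np.mean((sp > sn) + 0.5 * (sp == sn)))

print(auc([0.9, 0.8, 0.7], [0.6, 0.5]))  # perfectly separated -> 1.0
print(auc([0.4, 0.6], [0.4, 0.6]))       # indistinguishable  -> 0.5
```

Fixing Az at a common operating level (about 0.7 in the study) across filtered and unfiltered images is what allows the tube-current settings, and hence the dose reduction ratio, to be compared fairly.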

  5. Preliminary design of the spatial filters used in the multipass amplification system of TIL

    NASA Astrophysics Data System (ADS)

    Zhu, Qihua; Zhang, Xiao Min; Jing, Feng

    1998-12-01

    The spatial filters are used in the Technique Integration Line, which has a multi-pass amplifier, not only to suppress parasitic high-spatial-frequency modes but also to provide places for inserting a light isolator and injecting the seed beam, and to relay the image while the beam passes through the amplifiers several times. To fulfill these functions, the parameters of the spatial filters are optimized through calculations and analyses that take into account avoidance of the plasma blow-off effect and of component damage from ghost-beam foci. The 'ghost beams' are calculated by ray tracing. Software was developed to evaluate the tolerances of the spatial filters and their components, and to align the whole system in computer simulation.

  6. Image processing techniques for noise removal, enhancement and segmentation of cartilage OCT images

    NASA Astrophysics Data System (ADS)

    Rogowska, Jadwiga; Brezinski, Mark E.

    2002-02-01

    Osteoarthritis, whose hallmark is the progressive loss of joint cartilage, is a major cause of morbidity worldwide. Recently, optical coherence tomography (OCT) has demonstrated considerable promise for the assessment of articular cartilage. Among the most important parameters to be assessed is cartilage width. However, detection of the bone cartilage interface is critical for the assessment of cartilage width. At present, the quantitative evaluations of cartilage thickness are being done using manual tracing of cartilage-bone borders. Since data is being obtained near video rate with OCT, automated identification of the bone-cartilage interface is critical. In order to automate the process of boundary detection on OCT images, there is a need for developing new image processing techniques. In this paper we describe the image processing techniques for speckle removal, image enhancement and segmentation of cartilage OCT images. In particular, this paper focuses on rabbit cartilage since this is an important animal model for testing both chondroprotective agents and cartilage repair techniques. In this study, a variety of techniques were examined. Ultimately, by combining an adaptive filtering technique with edge detection (vertical gradient, Sobel edge detection), cartilage edges can be detected. The procedure requires several steps and can be automated. Once the cartilage edges are outlined, the cartilage thickness can be measured.
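
The vertical-gradient (Sobel) step of the boundary detection can be sketched on a toy B-scan (plain NumPy; the 3x3 kernel is the standard Sobel operator, while the two-layer phantom and its intensities are assumptions):

```python
import numpy as np

def sobel_vertical_gradient(img):
    """3x3 Sobel operator for the vertical (row-direction) gradient."""
    k = np.array([[-1, -2, -1],
                  [ 0,  0,  0],
                  [ 1,  2,  1]], dtype=float)
    n, m = img.shape
    out = np.zeros((n, m))
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            out[i, j] = np.sum(img[i-1:i+2, j-1:j+2] * k)
    return out

# Toy B-scan: a bright "cartilage" band above a darker "bone" region,
# with the interface between rows 9 and 10
img = np.full((20, 20), 0.2)
img[:10, :] = 1.0
g = np.abs(sobel_vertical_gradient(img))
rows = np.argmax(g, axis=0)[1:-1]   # skip the untouched border columns
print(set(int(r) for r in rows))    # the strongest response hugs the interface
```

Tracing the row of maximal vertical gradient per column yields the bone-cartilage interface, from which cartilage thickness can be measured as in the abstract.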

  7. Targeted Silver Nanoparticles for Dual-Energy Breast X-Ray Imaging

    DTIC Science & Technology

    2013-03-01

    Ag performs better than I when imaging at the optimal conditions for I; for example, using a rhodium filter. The low-energy technique described uses a 27 kVp beam with rhodium filtration, at a dose distribution of 50:50.

  8. [Improvement of magnetic resonance phase unwrapping method based on Goldstein Branch-cut algorithm].

    PubMed

    Guo, Lin; Kang, Lili; Wang, Dandan

    2013-02-01

    The phase information in magnetic resonance (MR) phase images can be used in many MR imaging techniques, but phase wrapping of the images often results in inaccurate phase information, so phase unwrapping is essential for these techniques. In this paper we analyze the causes of errors in phase unwrapping with the commonly used Goldstein branch-cut algorithm and propose an improved algorithm. During the unwrapping process, masking, filtering, a dipole-remover preprocessor, and Prim's minimum-spanning-tree algorithm were introduced to optimize the residues essential to the Goldstein branch-cut algorithm. Experimental results showed that residues and branch cuts were efficiently reduced, a continuous unwrapped phase surface was obtained, and the quality of MR phase images was clearly improved with the proposed method.
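
The residue-identification step at the heart of the Goldstein algorithm can be sketched as wrapped phase gradients summed around each elementary 2x2 loop (illustrative NumPy, not the authors' implementation; the spiral-vortex test field is an assumption):

```python
import numpy as np

def wrap(p):
    """Wrap phase values into (-pi, pi]."""
    return (p + np.pi) % (2 * np.pi) - np.pi

def residues(phase):
    """Goldstein residue map: wrapped phase differences summed clockwise
    around every 2x2 pixel loop; +/-1 marks a residue, 0 is consistent."""
    d1 = wrap(phase[:-1, 1:] - phase[:-1, :-1])   # top edge, left to right
    d2 = wrap(phase[1:, 1:] - phase[:-1, 1:])     # right edge, downward
    d3 = wrap(phase[1:, :-1] - phase[1:, 1:])     # bottom edge, right to left
    d4 = wrap(phase[:-1, :-1] - phase[1:, :-1])   # left edge, upward
    return np.rint((d1 + d2 + d3 + d4) / (2 * np.pi)).astype(int)

# A single phase vortex centered between pixels yields exactly one residue:
n = 32
y, x = np.mgrid[0:n, 0:n]
vortex = np.arctan2(y - n / 2 + 0.5, x - n / 2 + 0.5)
r = residues(vortex)
print(int(np.abs(r).sum()))
```

Branch cuts are then placed to join residues of opposite sign, and the subsequent integration path never crosses a cut; the preprocessing steps above aim to reduce how many residues survive to that stage.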

  9. Effect of anapanasati meditation technique through electrophotonic imaging parameters: A pilot study

    PubMed Central

    Deo, Guru; Itagi R, Kumar; Thaiyar M, Srinivasan; Kuldeep, Kushwah K

    2015-01-01

    Background: Mindfulness along with breathing is a well-established meditation technique. Breathing is an exquisite tool for exploring subtle awareness of mind and life itself. Aim: This study aimed at measuring changes in different parameters of electrophotonic imaging (EPI) in anapanasati meditators. Materials and Methods: To carry out this study, 51 subjects (32 males and 19 females) aged 18 years and above (mean age 45.64 ± 14.43), attending Karnataka Dhyana Mahachakra-1 at Pyramid Valley International, Bengaluru, India, were recruited voluntarily with informed consent. The design was a single-group pre-post study, with data collected by an EPI device before and after 5 days of intensive meditation. Results: Results show significant changes in the EPI parameter integral area with filter (physiological) on both the right and left sides, which reflects the availability of a high functional energy reserve in meditators. The researchers observed similar trends without filter (psycho-physiological), indicating high reserves of energy at the psycho-physiological level as well. Activation coefficient, another EPI parameter, was reduced, indicating a more relaxed state than before, possibly due to parasympathetic dominance. Integral entropy decreased for the left-side without-filter (psycho-physiological) parameter, which indicates less disorder after meditation, but these changes were not significant. The study showed a reversed change in integral entropy on the right side without filter; however, the values on both sides with filter increased, which indicates disorder. Conclusion: The study suggests that EPI can be used to record the functional physiological and psycho-physiological status of meditators at a subtle level. PMID:26170590

  10. Medical image denoising via optimal implementation of non-local means on hybrid parallel architecture.

    PubMed

    Nguyen, Tuan-Anh; Nakib, Amir; Nguyen, Huy-Nam

    2016-06-01

    The non-local means denoising filter has been established as the gold standard for the image denoising problem in general, and particularly in medical imaging, due to its efficiency. However, its computation time has limited its use in real-world applications, especially in medical imaging. In this paper, a distributed version on a parallel hybrid architecture is proposed to solve the computation-time problem, along with a new method to compute the filter coefficients, focusing on the implementation and on enhancing the filter parameters by taking the neighborhood of the current voxel into account more accurately. In terms of implementation, our key contribution consists in reducing the number of shared-memory accesses. The proposed method was tested on the BrainWeb database for different levels of noise. Performance and sensitivity were quantified in terms of speedup, peak signal-to-noise ratio, execution time, and the number of floating-point operations. The obtained results demonstrate the efficiency of the proposed method. Moreover, the implementation is compared to that of other techniques recently published in the literature. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
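
The underlying non-local means computation can be sketched in plain NumPy (an illustrative serial 2-D version, not the paper's parallel 3-D implementation; the patch size, search window, smoothing parameter `h`, and test image are assumptions):

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.4):
    """Each pixel becomes a weighted average of pixels in a search window,
    weighted by the similarity of their surrounding patches."""
    half_p, half_s = patch // 2, search // 2
    padded = np.pad(img, half_p + half_s, mode='reflect')
    out = np.zeros_like(img)
    n, m = img.shape
    for i in range(n):
        for j in range(m):
            ci, cj = i + half_p + half_s, j + half_p + half_s
            ref = padded[ci-half_p:ci+half_p+1, cj-half_p:cj+half_p+1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    cand = padded[ci+di-half_p:ci+di+half_p+1,
                                  cj+dj-half_p:cj+dj+half_p+1]
                    d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / h**2))
                    values.append(padded[ci+di, cj+dj])
            out[i, j] = np.average(values, weights=weights)
    return out

rng = np.random.default_rng(0)
clean = np.zeros((24, 24)); clean[:, 12:] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
den = nlm_denoise(noisy)
print(np.mean((den - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

The independence of every `(i, j)` pixel computation is what makes the algorithm a natural fit for the hybrid CPU/GPU parallelization discussed above.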

  11. Denoising and segmentation of retinal layers in optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Dash, Puspita; Sigappi, A. N.

    2018-04-01

    Optical Coherence Tomography (OCT) is an imaging technique used to localize the intra-retinal boundaries for the diagnosis of macular diseases. Due to speckle noise and low image contrast, accurate segmentation of individual retinal layers is difficult. For this reason, a method for retinal layer segmentation from OCT images is presented. This paper proposes a pre-processing filtering approach for denoising together with a graph-based technique for segmenting retinal layers in OCT images. These techniques are applied to the segmentation of retinal layers in both normal subjects and patients with Diabetic Macular Edema (DME). An algorithm based on gradient information and shortest-path search is applied to optimize the edge selection. In this paper the four main layers of the retina are segmented, namely the internal limiting membrane (ILM), retinal pigment epithelium (RPE), inner nuclear layer (INL), and outer nuclear layer (ONL). The proposed method is applied to a database of OCT images from ten normal subjects and twenty patients affected by DME, and the results are found to be promising.
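
The gradient-based shortest-path idea can be sketched as dynamic programming across a cost image (a simplified illustration of graph-based layer search, not the paper's algorithm; the one-row-per-column move constraint and the toy B-scan are assumptions):

```python
import numpy as np

def min_cost_boundary(cost):
    """Minimum-cost left-to-right path through a cost image, moving at most
    one row per column (dynamic programming + backtracking)."""
    n, m = cost.shape
    acc = cost.copy()
    for j in range(1, m):
        prev = acc[:, j - 1]
        best = np.minimum(prev, np.minimum(np.roll(prev, 1), np.roll(prev, -1)))
        best[0] = min(prev[0], prev[1])        # fix wrap-around at the edges
        best[-1] = min(prev[-1], prev[-2])
        acc[:, j] += best
    path = [int(np.argmin(acc[:, -1]))]        # backtrack from the last column
    for j in range(m - 1, 0, -1):
        i = path[-1]
        lo, hi = max(0, i - 1), min(n - 1, i + 1)
        path.append(lo + int(np.argmin(acc[lo:hi + 1, j - 1])))
    return path[::-1]

# Toy B-scan with a bright layer boundary at row 5; low cost where the
# vertical gradient is strong:
img = np.zeros((12, 30)); img[5, :] = 1.0
grad = np.abs(np.diff(img, axis=0, prepend=0))
path = min_cost_boundary(1.0 - grad)
print(all(r == 5 for r in path))
```

Running such a search repeatedly, while excluding previously found boundaries, is a common way to extract several layers (ILM, RPE, INL, ONL) from one B-scan.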

  12. Clutter Mitigation in Echocardiography Using Sparse Signal Separation

    PubMed Central

    Yavneh, Irad

    2015-01-01

    In ultrasound imaging, clutter artifacts degrade images and may cause inaccurate diagnosis. In this paper, we apply a method called Morphological Component Analysis (MCA) for sparse signal separation, with the objective of reducing such clutter artifacts. The MCA approach assumes that the two signals in the additive mix each have a sparse representation under some dictionary of atoms (a matrix), and separation is achieved by finding these sparse representations. In our work, an adaptive approach is used for learning the dictionary from the echo data. MCA is compared to Singular Value Filtering (SVF), a Principal Component Analysis- (PCA-) based filtering technique, and to a high-pass Finite Impulse Response (FIR) filter. Each filter is applied to a simulated hypoechoic lesion sequence, as well as to experimental cardiac ultrasound data. MCA is demonstrated in both cases to outperform the FIR filter and to obtain results comparable to the SVF method in terms of contrast-to-noise ratio (CNR). Furthermore, MCA shows a lower impact on tissue sections while removing the clutter artifacts. In our experiments on heart data, MCA achieves clutter mitigation with an average CNR improvement of 1.33 dB. PMID:26199622
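
The SVF baseline mentioned above can be sketched with a plain SVD on a frames-by-pixels (Casorati) matrix (toy data; the rank-1 slowly varying clutter model and the amplitudes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
frames, pixels = 30, 200
t = np.arange(frames)
# Clutter: high-energy, spatially coherent, nearly stationary in time (rank 1)
clutter = np.outer(1.0 + 0.01 * t, rng.standard_normal(pixels)) * 5.0
# Tissue signal: weaker and incoherent from frame to frame
tissue = 0.5 * rng.standard_normal((frames, pixels))
data = clutter + tissue

# Singular Value Filtering: zero the dominant singular component(s),
# which capture the high-energy, slowly varying clutter
U, s, Vt = np.linalg.svd(data, full_matrices=False)
s_f = s.copy(); s_f[:1] = 0.0
filtered = (U * s_f) @ Vt

err_before = np.linalg.norm(data - tissue)
err_after = np.linalg.norm(filtered - tissue)
print(err_after < err_before)
```

MCA replaces this fixed low-rank/high-rank split with sparse representations under learned dictionaries, which is why it can suppress clutter while disturbing the tissue component less.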

  13. Optical monitor for observing turbulent flow

    DOEpatents

    Albrecht, Georg F.; Moore, Thomas R.

    1992-01-01

    The present invention provides an apparatus and method for non-invasively monitoring turbulent fluid flows including anisotropic flows. The present invention uses an optical technique to filter out the rays travelling in a straight line, while transmitting rays with turbulence-induced fluctuations in time. The output is two dimensional, and can provide data regarding the spectral intensity distribution, or a view of the turbulence in real time. The optical monitor of the present invention comprises a laser that produces a coherent output beam that is directed through a fluid flow, which phase-modulates the beam. The beam is applied to a temporal filter that filters out the rays in the beam that are straight, while substantially transmitting the fluctuating, turbulence-induced rays. The temporal filter includes a lens and a photorefractive crystal such as BaTiO3 that is positioned in the converging section of the beam near the focal plane. An imaging system is used to observe the filtered beam. The imaging system may take a photograph, or it may include a real time camera that is connected to a computer. The present invention may be used for many purposes including research and design in aeronautics, hydrodynamics, and combustion.

  14. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    A new fuzzy-set-based technique developed for decision making is discussed: a method to automatically generate fuzzy decision rules for image analysis. The paper proposes generating such rule-based approaches to problems like autonomous navigation and image understanding automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.

  15. Processing of single channel air and water gun data for imaging an impact structure at the Chesapeake Bay

    USGS Publications Warehouse

    Lee, Myung W.

    1999-01-01

    Processing of 20 seismic profiles acquired in the Chesapeake Bay area aided in analysis of the details of an impact structure and allowed more accurate mapping of the depression caused by a bolide impact. Particular emphasis was placed on enhancement of seismic reflections from the basement. Application of wavelet deconvolution after a second zero-crossing predictive deconvolution improved the resolution of shallow reflections, and application of a match filter enhanced the basement reflections. The use of deconvolution and match filtering with a two-dimensional signal enhancement technique (F-X filtering) significantly improved the interpretability of seismic sections.

  16. Texture classification of normal tissues in computed tomography using Gabor filters

    NASA Astrophysics Data System (ADS)

    Dettori, Lucia; Bashir, Alia; Hasemann, Julie

    2007-03-01

    The research presented in this article is aimed at developing an automated imaging system for classification of normal tissues in medical images obtained from Computed Tomography (CT) scans. Texture features based on a bank of Gabor filters are used to classify the following tissues of interest: liver, spleen, kidney, aorta, trabecular bone, lung, muscle, IP fat, and SQ fat. The approach consists of three steps: convolution of the regions of interest with a bank of 32 Gabor filters (4 frequencies and 8 orientations), extraction of two Gabor texture features per filter (mean and standard deviation), and creation of a Classification and Regression Tree-based classifier that automatically identifies the various tissues. The data set used consists of approximately 1000 DICOM images from normal chest and abdominal CT scans of five patients. The regions of interest were labeled by expert radiologists. Optimal trees were generated using two techniques: 10-fold cross-validation and splitting of the data set into a training and a testing set. In both cases, perfect classification rules were obtained provided enough images were available for training (~65%). All performance measures (sensitivity, specificity, precision, and accuracy) for all regions of interest were at 100%. This significantly improves previous results that used Wavelet, Ridgelet, and Curvelet texture features, yielding accuracy values in the 85%-98% range. The Gabor filters' ability to isolate features at different frequencies and orientations allows for a multi-resolution analysis of texture, essential when dealing with, at times, very subtle differences in the texture of tissues in CT scans.
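
    The feature extraction step (32 filters, 2 features each) can be sketched as follows; kernel size and sigma are illustrative assumptions, and the FFT-based convolution is circular, which does not affect the mean and standard deviation since both are shift-invariant.

```python
import numpy as np

def gabor_kernel(freq, theta, sigma=3.0, size=15):
    """Real part of a Gabor kernel at a given spatial frequency and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_features(image, freqs, thetas):
    """Mean and standard deviation of each filter response (2 features per filter)."""
    feats = []
    for f in freqs:
        for th in thetas:
            k = gabor_kernel(f, th)
            # circular convolution via the FFT, for brevity
            resp = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(k, image.shape)))
            feats.extend([resp.mean(), resp.std()])
    return np.array(feats)

rng = np.random.default_rng(1)
roi = rng.random((64, 64))
fv = gabor_features(roi, freqs=[0.1, 0.2, 0.3, 0.4], thetas=np.arange(8) * np.pi / 8)
print(fv.shape)  # (64,): 32 filters x 2 features
```

    The resulting 64-dimensional feature vectors would then be fed to the tree-based classifier.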

  17. Looking for the Signal: A guide to iterative noise and artefact removal in X-ray tomographic reconstructions of porous geomaterials

    NASA Astrophysics Data System (ADS)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-07-01

    X-ray micro- and nanotomography has evolved into a quantitative analysis tool rather than a mere qualitative visualization technique for the study of porous natural materials. Tomographic reconstructions are subject to noise that has to be handled by image filters prior to quantitative analysis. Typically, denoising filters are designed to handle random noise, such as Gaussian or Poisson noise. In tomographic reconstructions, however, the noise has been projected from Radon space to Euclidean space, i.e. post-reconstruction noise cannot be expected to be random but to be correlated. Reconstruction artefacts, such as streak or ring artefacts, aggravate the filtering process, so algorithms performing well with random noise are not guaranteed to provide satisfactory results for X-ray tomography reconstructions. With sufficient image resolution, the crystalline origin of most geomaterials results in tomography images of objects that are untextured. We developed a denoising framework for these kinds of samples that combines a noise level estimate with iterative nonlocal means denoising. This allows splitting the denoising task into several weak denoising subtasks, where the later filtering steps provide a controlled level of texture removal. We provide a hands-on explanation of this iterative denoising approach, and we evaluated the validity and quality of the image enhancement filter in a benchmarking experiment with noise footprints, extracted from real tomography reconstructions, spanning varying levels of correlation and residual artefacts. We found that our denoising solutions were superior to other denoising algorithms over a broad range of contrast-to-noise ratios on artificial piecewise constant signals.
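
    The iterative weak-denoising idea can be sketched as below. The paper's weak denoiser is nonlocal means; here a simple 4-neighbour average stands in for it to keep the sketch short, and the high-pass noise estimator, step count and stopping target are illustrative assumptions.

```python
import numpy as np

def estimate_noise(img):
    """Robust noise-level estimate from the median absolute deviation of a
    simple high-pass residual (a stand-in for more refined estimators)."""
    hp = img - 0.25 * (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                       + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return np.median(np.abs(hp - np.median(hp))) / 0.6745

def weak_denoise(img, strength):
    """One weak smoothing pass: blend toward a local mean in proportion to
    'strength' (a 4-neighbour average stands in for nonlocal means)."""
    local_mean = 0.25 * (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                         + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return (1 - strength) * img + strength * local_mean

def iterative_denoise(img, n_steps=5, target=0.01):
    """Split the denoising task into several weak subtasks, re-estimating
    the residual noise level before each pass."""
    out = img.copy()
    for _ in range(n_steps):
        sigma = estimate_noise(out)
        if sigma <= target:
            break
        out = weak_denoise(out, strength=min(0.5, sigma))
    return out

rng = np.random.default_rng(2)
clean = np.ones((64, 64)); clean[:, 32:] = 2.0   # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
den = iterative_denoise(noisy)
print(estimate_noise(den) < estimate_noise(noisy))  # True: residual noise reduced
```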

  18. Enhancement of PET Images

    NASA Astrophysics Data System (ADS)

    Davis, Paul B.; Abidi, Mongi A.

    1989-05-01

    PET is the only imaging modality that provides doctors with early analytic and quantitative biochemical assessment and precise localization of pathology. In PET images, boundary information as well as local pixel intensity are both crucial for manual and/or automated feature tracing, extraction, and identification. Unfortunately, the present PET technology does not provide the necessary image quality from which such precise analytic and quantitative measurements can be made. PET images suffer from significantly high levels of radial noise present in the form of streaks caused by the inexactness of the models used in image reconstruction. In this paper, our objective is to model PET noise and remove it without altering dominant features in the image. The ultimate goal here is to enhance these dominant features to allow for automatic computer interpretation and classification of PET images by developing techniques that take into consideration PET signal characteristics, data collection, and data reconstruction. We have modeled the noise steaks in PET images in both rectangular and polar representations and have shown both analytically and through computer simulation that it exhibits consistent mapping patterns. A class of filters was designed and applied successfully. Visual inspection of the filtered images show clear enhancement over the original images.

  19. A novel method to recover DD fusion proton CR-39 data corrupted by fast ablator ions at OMEGA and the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Sutcliffe, G. D.; Milanese, L. M.; Orozco, D.; Lahmann, B.; Gatu Johnson, M.; Séguin, F. H.; Sio, H.; Frenje, J. A.; Li, C. K.; Petrasso, R. D.; Park, H.-S.; Rygg, J. R.; Casey, D. T.; Bionta, R.; Turnbull, D. P.; Huntington, C. M.; Ross, J. S.; Zylstra, A. B.; Rosenberg, M. J.; Glebov, V. Yu.

    2016-11-01

    CR-39 detectors are used routinely in inertial confinement fusion (ICF) experiments as a part of nuclear diagnostics. CR-39 is filtered to stop fast ablator ions which have been accelerated from an ICF implosion due to electric fields caused by laser-plasma interactions. In some experiments, the filtering is insufficient to block these ions and the fusion-product signal tracks are lost in the large background of accelerated ion tracks. A technique for recovering signal in these scenarios has been developed, tested, and implemented successfully. The technique involves removing material from the surface of the CR-39 to a depth beyond the endpoint of the ablator ion tracks. The technique preserves signal magnitude (yield) as well as structure in radiograph images. The technique is effective when signal particle range is at least 10 μm deeper than the necessary bulk material removal.

  20. A novel method to recover DD fusion proton CR-39 data corrupted by fast ablator ions at OMEGA and the National Ignition Facility.

    PubMed

    Sutcliffe, G D; Milanese, L M; Orozco, D; Lahmann, B; Gatu Johnson, M; Séguin, F H; Sio, H; Frenje, J A; Li, C K; Petrasso, R D; Park, H-S; Rygg, J R; Casey, D T; Bionta, R; Turnbull, D P; Huntington, C M; Ross, J S; Zylstra, A B; Rosenberg, M J; Glebov, V Yu

    2016-11-01

    CR-39 detectors are used routinely in inertial confinement fusion (ICF) experiments as a part of nuclear diagnostics. CR-39 is filtered to stop fast ablator ions which have been accelerated from an ICF implosion due to electric fields caused by laser-plasma interactions. In some experiments, the filtering is insufficient to block these ions and the fusion-product signal tracks are lost in the large background of accelerated ion tracks. A technique for recovering signal in these scenarios has been developed, tested, and implemented successfully. The technique involves removing material from the surface of the CR-39 to a depth beyond the endpoint of the ablator ion tracks. The technique preserves signal magnitude (yield) as well as structure in radiograph images. The technique is effective when signal particle range is at least 10 μm deeper than the necessary bulk material removal.

  1. SU-C-207B-02: Maximal Noise Reduction Filter with Anatomical Structures Preservation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maitree, R; Guzman, G; Chundury, A

    Purpose: All medical images contain noise, which can result in an undesirable appearance and can reduce the visibility of anatomical details. A variety of techniques are used to reduce noise, such as increasing the image acquisition time and applying post-processing noise reduction algorithms. However, these techniques either increase imaging time and cost or reduce tissue contrast and effective spatial resolution, which carry useful diagnostic information. The three main focuses of this study are: 1) to develop a novel approach that can adaptively and maximally reduce noise while preserving valuable details of anatomical structures, 2) to evaluate the effectiveness of available noise reduction algorithms in comparison to the proposed algorithm, and 3) to demonstrate that the proposed noise reduction approach can be used clinically. Methods: To achieve maximal noise reduction without destroying anatomical details, the proposed approach automatically estimated the local image noise strength levels and detected the anatomical structures, i.e. tissue boundaries. This information was used to adaptively adjust the strength of the noise reduction filter. The proposed algorithm was tested on 34 repeated swine head datasets and 54 patients' MRI and CT images. The performance was quantitatively evaluated by image quality metrics and manually validated for clinical usage by two radiation oncologists and one radiologist. Results: Measurements on repeated swine head images demonstrated that the proposed algorithm efficiently removed noise while preserving structures and tissue boundaries. In comparisons, the proposed algorithm obtained competitive noise reduction performance and outperformed other filters in preserving anatomical structures. Assessments from the manual validation indicate that the proposed noise reduction algorithm is adequate for some clinical usages.
    Conclusion: According to both clinical evaluation (human expert ranking) and qualitative assessment, the proposed approach has superior noise reduction and anatomical structure preservation capabilities over existing noise removal methods. Senior author Dr. Deshan Yang received research funding from ViewRay and Varian.

  2. Infrared image background modeling based on improved Susan filtering

    NASA Astrophysics Data System (ADS)

    Yuehua, Xia

    2018-02-01

    When the SUSAN filter is used to model the background of an infrared image, its Gaussian kernel lacks directional selectivity; after filtering, the edge information of the image is poorly preserved, leaving many edge singular points in the difference image and increasing the difficulty of target detection. To solve these problems, anisotropy is introduced in this paper, and an anisotropic Gaussian filter is used instead of the isotropic Gaussian in the SUSAN filter operator. First, an anisotropic gradient operator computes the horizontal and vertical gradients at each pixel to determine the long-axis direction of the filter. Second, the smoothness of the pixel's local neighbourhood is used to set the filter's long- and short-axis variances. The threshold of the SUSAN filter is then determined from the first-order norm of the difference between the local gray levels and their mean. Finally, the constructed SUSAN filter is convolved with the image to obtain the background image, and the difference between the background image and the original image is computed. The background modeling performance on infrared images is evaluated by Mean Squared Error (MSE), Structural Similarity (SSIM) and local Signal-to-Noise Ratio Gain (GSNR). Compared with the traditional filtering algorithm, the improved SUSAN filter achieves better background modeling: it effectively preserves edge information in the image, and dim small targets are effectively enhanced in the difference image, greatly reducing the false alarm rate.
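
    The two ingredients that distinguish the improved operator, an oriented Gaussian kernel with separate long- and short-axis variances and a gradient-based orientation estimate, can be sketched as follows; kernel size and sigmas are illustrative assumptions, not the paper's values.

```python
import numpy as np

def aniso_gauss_kernel(sigma_long, sigma_short, theta, size=15):
    """Anisotropic Gaussian kernel with its long axis along angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    u = x * np.cos(theta) + y * np.sin(theta)    # along the long axis
    v = -x * np.sin(theta) + y * np.cos(theta)   # across it
    k = np.exp(-(u ** 2 / (2 * sigma_long ** 2) + v ** 2 / (2 * sigma_short ** 2)))
    return k / k.sum()

def local_orientation(img, iy, ix):
    """Long-axis direction from central differences: the filter is aligned
    perpendicular to the gradient so edges are not smeared."""
    gy = img[iy + 1, ix] - img[iy - 1, ix]
    gx = img[iy, ix + 1] - img[iy, ix - 1]
    return np.arctan2(gy, gx) + np.pi / 2

img = np.zeros((9, 9)); img[:, 5:] = 1.0         # vertical edge
th = local_orientation(img, 4, 4)                # long axis along the edge
k = aniso_gauss_kernel(4.0, 1.0, theta=0.0)
print(k.sum())              # normalized to 1
print(k[7, 11] > k[3, 7])   # True: decays slower along the long (x) axis
```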

  3. Retinal vessel enhancement based on the Gaussian function and image fusion

    NASA Astrophysics Data System (ADS)

    Moraru, Luminita; Obreja, Cristian Dragoş

    2017-01-01

    The Gaussian function is essential in the construction of the Frangi and COSFIRE (combination of shifted filter responses) filters. The connection of broken vessels and an accurate extraction of the vascular structure are the main goals of this study. Thus, the outputs of the Frangi and COSFIRE edge detection algorithms are fused using the Dempster-Shafer algorithm with the aim of improving detection and enhancing the retinal vascular structure. For objective results, the average diameters of the retinal vessels provided by the Frangi, COSFIRE and Dempster-Shafer fusion algorithms are measured. These experimental values are compared to the ground truth values provided by manually segmented retinal images. We prove the superiority of the fusion algorithm in terms of image quality by using the figure-of-merit objective metric, which correlates the effects of all post-processing techniques.
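
    The Frangi half of the pipeline can be sketched at a single scale from the 2x2 Hessian eigenvalues (the full filter takes a maximum over Gaussian scales, and the Dempster-Shafer fusion of two detector outputs is not reproduced here; beta and the adaptive c follow Frangi's usual suggestions).

```python
import numpy as np

def gaussian_blur(img, sigma):
    """FFT-domain Gaussian smoothing (periodic boundaries, for brevity)."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    h = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * h))

def vesselness(img, beta=0.5, c=None):
    """Single-scale Frangi-style vesselness for bright tubular structures."""
    gy, gx = np.gradient(img)
    gxy, gxx = np.gradient(gx)
    gyy, _ = np.gradient(gy)
    tmp = np.sqrt(((gxx - gyy) / 2) ** 2 + gxy ** 2)
    mid = (gxx + gyy) / 2
    lam1, lam2 = mid + tmp, mid - tmp
    swap = np.abs(lam1) > np.abs(lam2)           # order so |lam1| <= |lam2|
    lam1, lam2 = np.where(swap, lam2, lam1), np.where(swap, lam1, lam2)
    s = np.sqrt(lam1 ** 2 + lam2 ** 2)           # second-order structureness
    if c is None:
        c = 0.5 * s.max() + 1e-12                # Frangi's suggested default
    rb = np.abs(lam1) / (np.abs(lam2) + 1e-12)   # blob vs. line ratio
    v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
    return np.where(lam2 < 0, v, 0.0)            # bright vessels only

vessel = np.zeros((64, 64)); vessel[30:34, :] = 1.0   # synthetic bright vessel
v = vesselness(gaussian_blur(vessel, 2.0))
print(v[32].mean() > v[10].mean())  # True: response concentrates on the vessel
```

    In the paper, the responses of two such detectors (Frangi and COSFIRE) are fused with Dempster-Shafer evidence combination; a pixelwise maximum would be a crude stand-in for that step.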

  4. Comment on Vaknine, R. and Lorenz, W.J. Lateral filtering of medical ultrasonic B-scans before image generation.

    PubMed

    Dickinson, R J

    1985-04-01

    In a recent paper, Vaknine and Lorenz discuss the merits of lateral deconvolution of demodulated B-scans. While this technique will decrease the lateral blurring of single discrete targets, such as the diaphragm in their figure 3, it is inappropriate to apply the method to the echoes arising from inhomogeneous structures such as soft tissue. In this latter case, the echoes from individual scatterers within the resolution cell of the transducer interfere to give random fluctuations in received echo amplitude, termed speckle. Although this process can be modeled as a linear convolution similar to that of conventional image formation theory, the process of demodulation is a nonlinear process which loses the all-important phase information and prevents the subsequent restoration of the image by Wiener filtering, itself a linear process.

  5. Image processing and recognition for biological images.

    PubMed

    Uchida, Seiichi

    2013-05-01

    This paper reviews image processing and pattern recognition techniques, which will be useful for analyzing bioimages. Although this paper does not provide their technical details, it will be possible to grasp their main tasks and the typical tools used to handle them. Image processing is a large research area aimed at improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique of classifying an input image into one of several predefined classes and also has a large research area. This paper overviews its two main modules, that is, the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target even for state-of-the-art image processing and pattern recognition techniques, due to noise, deformations, etc. This paper is expected to serve as a tutorial guide bridging biology and image processing researchers for further collaboration to tackle such a difficult target. © 2013 The Author Development, Growth & Differentiation © 2013 Japanese Society of Developmental Biologists.
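
    Of the tasks listed, binarization is the simplest to make concrete; a common approach (one reasonable choice among those the review surveys) is Otsu's method, which picks the threshold maximizing between-class variance.

```python
import numpy as np

def otsu_threshold(img, n_bins=256):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist, edges = np.histogram(img, bins=n_bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                       # class-0 probability per candidate
    m = np.cumsum(p * centers)              # cumulative intensity mean
    mt = m[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_b = (mt * w0 - m) ** 2 / (w0 * (1 - w0))
    var_b[~np.isfinite(var_b)] = 0.0
    return centers[np.argmax(var_b)]

# Synthetic bimodal intensities: background near 0.2, foreground near 0.8.
rng = np.random.default_rng(6)
img = np.concatenate([rng.normal(0.2, 0.05, 5000), rng.normal(0.8, 0.05, 5000)])
t = otsu_threshold(img)
print(0.3 < t < 0.7)  # True: threshold falls between the two modes
```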

  6. Optimized suppression of coherent noise from seismic data using the Karhunen-Loève transform

    NASA Astrophysics Data System (ADS)

    Montagne, Raúl; Vasconcelos, Giovani L.

    2006-07-01

    Signals obtained in land seismic surveys are usually contaminated with coherent noise, among which the ground roll (Rayleigh surface waves) is of major concern for it can severely degrade the quality of the information obtained from the seismic record. This paper presents an optimized filter based on the Karhunen-Loève transform for processing seismic images contaminated with ground roll. In this method, the contaminated region of the seismic record, to be processed by the filter, is selected in such a way as to correspond to the maximum of a properly defined coherence index. The main advantages of the method are that the ground roll is suppressed with negligible distortion of the remnant reflection signals and that the filtering procedure can be automated. The image processing technique described in this study should also be relevant for other applications where coherent structures embedded in a complex spatiotemporal pattern need to be identified in a more refined way. In particular, it is argued that the method is appropriate for processing optical coherence tomography images whose quality is often degraded by coherent noise (speckle).
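
    The core Karhunen-Loève filtering step can be sketched with an SVD: trace-aligned coherent noise concentrates in the leading singular components, which are then subtracted. The synthetic data below are illustrative; the paper additionally selects the window via its coherence index.

```python
import numpy as np

def klt_suppress(window, n_remove=1):
    """Karhunen-Loeve (SVD) filtering: drop the most coherent components
    of a data window, where aligned coherent noise such as ground roll
    concentrates in the leading singular vectors."""
    u, s, vt = np.linalg.svd(window, full_matrices=False)
    s[:n_remove] = 0.0
    return u @ np.diag(s) @ vt

# Coherent "ground roll": the same waveform on every trace, plus weaker
# incoherent energy standing in for reflections.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 256)
roll = np.sin(2 * np.pi * 8 * t)
traces = np.vstack([roll + 0.2 * rng.standard_normal(256) for _ in range(24)])
filtered = klt_suppress(traces, n_remove=1)
print(np.linalg.norm(filtered) < 0.5 * np.linalg.norm(traces))  # True
```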

  7. Antibacterial performance of nano polypropylene filter media containing nano-TiO2 and clay particles

    NASA Astrophysics Data System (ADS)

    Shafiee, Sara; Zarrebini, Mohammad; Naghashzargar, Elham; Semnani, Dariush

    2015-10-01

    Disinfection and elimination of pathogenic microorganisms from liquids can be achieved by a filtration process using antibacterial filter media. The advent of nanotechnology has facilitated the introduction of membranes consisting of nano-fibers in filtration operations. Melt electro-spun fibers, due to their extremely small diameters, are used in the production of this particular filtration medium. In this work, antibacterial polypropylene filter media containing clay particles and nano-TiO2 were made using melt electro-spinning technology. The antibacterial performance of the polypropylene nano-filters was evaluated using E. coli bacteria. Additionally, the filtration efficiency of the samples was determined in terms of fiber diameter, filter porosity, and fiber distribution using an image processing technique. Air permeability and dust aerosol tests were conducted to establish the suitability of the samples as a filter medium. It was concluded that, as far as antibacterial properties are concerned, nano-fiber filter media containing clay particles are preferable to similar media containing TiO2 nanoparticles.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saha, K; Barbarits, J; Humenik, R

    Purpose: Chang’s mathematical formulation is a common method of attenuation correction applied to reconstructed Jaszczak phantom images. Though Chang’s attenuation correction method has been used for 360° angle acquisition, its applicability to 180° angle acquisition remains a question, with one vendor’s camera software producing artifacts. The objective of this work is to ensure that Chang’s attenuation correction technique can be applied to reconstructed Jaszczak phantom images acquired in both 360° and 180° modes. Methods: The Jaszczak phantom filled with 20 mCi of diluted Tc-99m was placed on the patient table of Siemens e.cam™ (n = 2) and Siemens Symbia™ (n = 1) dual head gamma cameras, centered in both the lateral and axial directions. A total of 3 scans were done at 180° and 2 scans at 360° orbit acquisition modes. Thirty-two million counts were acquired for both modes. Reconstruction of the projection data was performed using filtered back projection smoothed with a pre-reconstruction Butterworth filter (order: 6, cutoff: 0.55). Reconstructed transaxial slices were attenuation corrected by Chang’s attenuation correction technique as implemented in the camera software. Corrections were also done using a modified technique where photon path lengths for all possible attenuation paths through a pixel in the image space were added to estimate the corresponding attenuation factor. The inverse of the attenuation factor was used to correct the attenuated pixel counts. Results: Comparable uniformity and noise were observed for 360° acquired phantom images attenuation corrected by the vendor technique (28.3% and 7.9%) and the proposed technique (26.8% and 8.4%). The difference in uniformity for 180° acquisition between the proposed technique (22.6% and 6.8%) and the vendor technique (57.6% and 30.1%) was more substantial.
Conclusion: Assessment of attenuation correction performance by phantom uniformity analysis illustrated improved uniformity with the proposed algorithm compared to the camera software.
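
    For a uniform circular attenuator, the path-length averaging described in the Methods reduces to the first-order Chang factor: the inverse of the mean attenuation over the acquisition angles. A minimal sketch, with an illustrative attenuation coefficient and phantom radius (not the paper's values):

```python
import numpy as np

def chang_factor(x, y, radius, mu, angles):
    """First-order Chang correction factor at point (x, y) inside a uniform
    circular attenuator: inverse of the mean attenuation over the angles."""
    cx, sy = np.cos(angles), np.sin(angles)
    pd = x * cx + y * sy
    # chord length from the point to the boundary along each view direction
    path = -pd + np.sqrt(pd ** 2 + radius ** 2 - (x ** 2 + y ** 2))
    return 1.0 / np.mean(np.exp(-mu * path))

mu = 0.15          # cm^-1, illustrative (roughly water at Tc-99m energies)
radius = 11.0      # cm, illustrative Jaszczak-like phantom radius
angles_360 = np.linspace(0, 2 * np.pi, 120, endpoint=False)
center = chang_factor(0.0, 0.0, radius, mu, angles_360)
edge = chang_factor(9.0, 0.0, radius, mu, angles_360)
print(center > edge > 1.0)  # True: deeper points need more correction
```

    A 180° acquisition would average over a half orbit only (e.g. `angles_360[:60]`), which is exactly where the vendor implementation produced artifacts.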

  9. An Improved Filtering Method for Quantum Color Image in Frequency Domain

    NASA Astrophysics Data System (ADS)

    Li, Panchi; Xiao, Hong

    2018-01-01

    In this paper we investigate the use of the quantum Fourier transform (QFT) in the field of image processing. We consider QFT-based color image filtering operations and their applications in image smoothing, sharpening, and selective filtering using quantum frequency domain filters. The underlying principle used for constructing the proposed quantum filters is to use the quantum Oracle to implement the filter function. Compared with the existing methods, our method is not only suitable for color images but also allows flexible design of notch filters. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on color images. The major advantage of quantum frequency filtering lies in the exploitation of the efficient implementation of the quantum Fourier transform.
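
    The quantum circuits themselves are beyond a short sketch, but the operation they implement has a direct classical counterpart: transform to the frequency domain, multiply by a mask such as a notch filter, and transform back. A minimal classical sketch (the sinusoidal interference pattern is an illustrative example):

```python
import numpy as np

def freq_filter(img, mask):
    """Classical frequency-domain filtering: FFT, mask, inverse FFT."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

def notch_mask(shape, centers, r):
    """Notch filter: zero out small disks around the given frequency bins."""
    yy, xx = np.mgrid[:shape[0], :shape[1]]
    mask = np.ones(shape)
    for cy, cx in centers:
        mask[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 0.0
    return mask

# Image consisting of a single sinusoidal interference pattern.
n = 64
y, x = np.mgrid[:n, :n]
img = np.sin(2 * np.pi * 8 * x / n)      # periodic noise at +/-8 cycles
mask = notch_mask((n, n), [(n // 2, n // 2 + 8), (n // 2, n // 2 - 8)], r=2)
out = freq_filter(img, mask)
print(np.abs(out).max() < 1e-6)          # True: interference removed
```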

  10. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array

    PubMed Central

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J.; Urbas, Augustine

    2016-01-01

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed “algorithmic spectrometry”. We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imagery using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the spectrum of blackbody radiation and a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme. PMID:27721506
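
    The "weighted superposition of plasmonic output images" can be sketched in the spectral dimension as a least-squares problem: find weights over the filter outputs whose combined spectral response mimics a narrowband filter. The transmission curves and band counts below are synthetic stand-ins, not measured plasmonic responses.

```python
import numpy as np

rng = np.random.default_rng(5)
n_bands, n_filters = 8, 12              # more filters than bands -> exact weights
T = rng.random((n_filters, n_bands))    # hypothetical filter transmission curves

# Weights whose superposition of filter outputs mimics a narrowband response
# centred on band 3: solve T^T w = e_3 in the least-squares sense.
target = np.zeros(n_bands); target[3] = 1.0
w, *_ = np.linalg.lstsq(T.T, target, rcond=None)

s = rng.random(n_bands)                 # unknown scene spectrum
m = T @ s                               # what the filter array measures
estimate = w @ m                        # weighted superposition of outputs
print(abs(estimate - s[3]) < 1e-8)      # True: band-3 intensity recovered
```

    With fewer filters than bands, as in a real array, the same weights give a least-squares approximation rather than an exact reconstruction.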

  11. Statistical analysis of spectral data: a methodology for designing an intelligent monitoring system for the diabetic foot

    NASA Astrophysics Data System (ADS)

    Liu, Chanjuan; van Netten, Jaap J.; Klein, Marvin E.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2013-12-01

    Early detection of (pre-)signs of ulceration on a diabetic foot is valuable for clinical practice. Hyperspectral imaging is a promising technique for detection and classification of such (pre-)signs. However, the number of spectral bands should be limited to avoid overfitting, which is critical for pixel classification with hyperspectral image data. The goal was to design a detector/classifier based on spectral imaging (SI) with a small number of optical bandpass filters. The performance and stability of the design were also investigated. The selection of the bandpass filters boils down to a feature selection problem. A dataset was built, containing reflectance spectra of 227 skin spots from 64 patients, measured with a spectrometer. Each skin spot was annotated manually by clinicians as "healthy" or a specific (pre-)sign of ulceration. Statistical analysis on the data set showed that the number of required filters is between 3 and 7, depending on additional constraints on the filter set. The stability analysis revealed that shot noise was the most critical factor affecting the classification performance. It indicated that this impact could be avoided in future SI systems with a camera sensor whose saturation level is higher than 10^6, or by post-image processing.

  12. High dynamic range imaging by pupil single-mode filtering and remapping

    NASA Astrophysics Data System (ADS)

    Perrin, G.; Lacour, S.; Woillez, J.; Thiébaut, É.

    2006-12-01

    Because of atmospheric turbulence, obtaining high angular resolution images with a high dynamic range is difficult even in the near-infrared domain of wavelengths. We propose a novel technique to overcome this issue. The fundamental idea is to apply techniques developed for long baseline interferometry to the case of a single-aperture telescope. The pupil of the telescope is broken down into coherent subapertures, each feeding a single-mode fibre. A remapping of the exit pupil allows all subapertures to interfere non-redundantly. A diffraction-limited image with very high dynamic range is reconstructed from the fringe pattern analysis with aperture synthesis techniques, free of speckle noise. The performance of the technique is demonstrated with simulations in the visible range with an 8-m telescope. Raw dynamic ranges of 1:10^6 can be obtained in only a few tens of seconds of integration time for bright objects.

  13. SU-G-IeP4-15: Ultrasound Imaging of Absorbable Inferior Vena Cava Filters for Proper Placement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitcham, T; Bouchard, R; Melancon, A

    Purpose: Inferior vena cava filters (IVCFs) are used in patients with a high risk of pulmonary embolism in situations when the use of blood thinning drugs would be inappropriate. These filters are implanted under x-ray guidance; however, this delivers a dose of ionizing radiation to both patient and physician. B-mode ultrasound (US) imaging allows for localization of certain implanted devices without radiation dose concerns. The goal of this study was to investigate the feasibility of imaging the placement of absorbable IVCFs using US imaging to alleviate the dose concern inherent to fluoroscopy. Methods: A phantom was constructed to mimic a human IVC using tissue-mimicking material with 0.5 dB/cm/MHz acoustic attenuation, while agar inclusions were used to model the acoustic mismatch at the venous interface. Absorbable IVCFs were imaged at 15 cm depth using B-mode US at 2, 3, 5, and 7 MHz transmit frequencies. Then, to determine temporal stability, the IVCF was left in the phantom for 10 weeks; during this time, the IVCF was imaged using the same techniques as above, while the integrity of the filter was analyzed by inspecting for fiber discontinuities. Results: Visualization of the inferior vena cava filter was possible at 5, 7.5, and 15 cm depth at US central frequencies of 2, 3, 5, and 7 MHz. Imaging the IVCF at 5 MHz yielded the clearest images while maintaining acceptable spatial resolution for identifying the IVCFs, while lower frequencies provided noticeably worse image quality. No obvious degradation was observed over the course of the 10 weeks in a static phantom environment. Conclusion: Biodegradable IVCF localization was possible up to 15 cm in depth using conventional B-mode US in a tissue-mimicking phantom. This leads to the potential for using B-mode US to guide the placement of the IVCF upon deployment by the interventional radiologist. Mitch Eggers is an owner of Adient Medical Technologies.
There are no other conflicts of interest to disclose.

  14. Compressed sensing for rapid late gadolinium enhanced imaging of the left atrium: A preliminary study.

    PubMed

    Kamesh Iyer, Srikant; Tasdizen, Tolga; Burgon, Nathan; Kholmovski, Eugene; Marrouche, Nassir; Adluru, Ganesh; DiBella, Edward

    2016-09-01

    Current late gadolinium enhancement (LGE) imaging of left atrial (LA) scar or fibrosis is relatively slow and requires 5-15 min to acquire an undersampled (R=1.7) 3D navigated dataset. The GeneRalized Autocalibrating Partially Parallel Acquisitions (GRAPPA) based parallel imaging method is the current clinical standard for accelerating 3D LGE imaging of the LA and permits an acceleration factor of ~R=1.7. Two compressed sensing (CS) methods have been developed to achieve higher acceleration factors: a patch based collaborative filtering technique tested with acceleration factor R~3, and a technique that uses a 3D radial stack-of-stars acquisition pattern (R~1.8) with a 3D total variation constraint. The long reconstruction time of these CS methods makes them unwieldy to use, especially the patch based collaborative filtering technique. In addition, the effect of CS techniques on the quantification of the percentage of scar/fibrosis is not known. We sought to develop a practical compressed sensing method for imaging the LA at high acceleration factors. In order to develop a clinically viable method with short reconstruction time, a Split Bregman (SB) reconstruction method with a 3D total variation (TV) constraint was developed and implemented. The method was tested on 8 atrial fibrillation patients (4 pre-ablation and 4 post-ablation datasets). Blur metric, normalized mean squared error and peak signal to noise ratio were used as metrics to analyze the quality of the reconstructed images. Quantification of the extent of LGE was performed on the undersampled images and compared with the fully sampled images. Quantification of scar from post-ablation datasets and quantification of fibrosis from pre-ablation datasets showed that acceleration factors up to R~3.5 gave good 3D LGE images of the LA wall, using the 3D TV constraint and constrained SB methods. This corresponds to reducing the scan time by half compared to currently used GRAPPA methods.
Reconstruction of 3D LGE images using the SB method was over 20 times faster than standard gradient descent methods. Copyright © 2016 Elsevier Inc. All rights reserved.
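
    The Split Bregman solver class used here can be sketched for plain 2D TV denoising (the paper solves a 3D-TV-constrained MRI reconstruction; parameters are illustrative and boundaries are treated as periodic so the u-subproblem solves in closed form via the FFT):

```python
import numpy as np

def tv_denoise_sb(f, mu=10.0, lam=5.0, n_iter=30):
    """Split Bregman anisotropic TV denoising:
    min_u |Dx u|_1 + |Dy u|_1 + (mu/2)||u - f||^2, with splitting d = Du."""
    ny, nx = f.shape
    dx = dy = bx = by = np.zeros_like(f)
    u = f.copy()
    # Fourier-domain denominator of the u-subproblem (periodic Laplacian)
    wy = 2 - 2 * np.cos(2 * np.pi * np.arange(ny) / ny)[:, None]
    wx = 2 - 2 * np.cos(2 * np.pi * np.arange(nx) / nx)[None, :]
    denom = mu + lam * (wx + wy)
    shrink = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    Dx = lambda v: np.roll(v, -1, 1) - v          # forward differences
    Dy = lambda v: np.roll(v, -1, 0) - v
    DxT = lambda v: np.roll(v, 1, 1) - v          # their adjoints
    DyT = lambda v: np.roll(v, 1, 0) - v
    for _ in range(n_iter):
        rhs = mu * f + lam * (DxT(dx - bx) + DyT(dy - by))
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
        dx = shrink(Dx(u) + bx, 1.0 / lam)        # soft-threshold d-subproblem
        dy = shrink(Dy(u) + by, 1.0 / lam)
        bx = bx + Dx(u) - dx                      # Bregman updates
        by = by + Dy(u) - dy
    return u

rng = np.random.default_rng(4)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
den = tv_denoise_sb(noisy)
print(np.linalg.norm(den - clean) < np.linalg.norm(noisy - clean))  # True
```

    Each subproblem is cheap (one FFT pair and two soft-thresholds per iteration), which is why the SB solver is so much faster than gradient descent on the same objective.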

  15. Integrated Raman spectroscopy and trimodal wide-field imaging techniques for real-time in vivo tissue Raman measurements at endoscopy.

    PubMed

    Huang, Zhiwei; Teh, Seng Khoon; Zheng, Wei; Mo, Jianhua; Lin, Kan; Shao, Xiaozhuo; Ho, Khek Yu; Teh, Ming; Yeoh, Khay Guan

    2009-03-15

    We report an integrated Raman spectroscopy and trimodal (white-light reflectance, autofluorescence, and narrow-band) imaging technique for real-time in vivo tissue Raman measurements at endoscopy. A special 1.8 mm endoscopic Raman probe with filtering modules is developed, permitting effective elimination of the interference of fluorescence background and silica Raman in the fibers while maximizing tissue Raman collection. We demonstrate that high-quality in vivo Raman spectra of the upper gastrointestinal tract can be acquired within 1 s or subseconds under the guidance of wide-field endoscopic imaging modalities, greatly facilitating the adoption of Raman spectroscopy into clinical research and practice during routine endoscopic inspections.

  16. Spatially variant apodization for squinted synthetic aperture radar images.

    PubMed

    Castillo-Rubio, Carlos F; Llorente-Romano, Sergio; Burgos-García, Mateo

    2007-08-01

    Spatially variant apodization (SVA) is a nonlinear sidelobe reduction technique that improves the sidelobe level while preserving resolution. The method implements a two-dimensional finite impulse response filter with adaptive taps that depend on the image content. Previously published analyses of SVA at the Nyquist rate or at higher rates have focused on stripmap synthetic aperture radar (SAR). This paper shows that traditional SVA techniques fail when the sensor operates with a squint angle. The reasons for this behaviour are analyzed, and a new implementation that largely improves the results is presented. The algorithm is applied to simulated SAR images to demonstrate the good image quality achieved along with efficient computation.
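    For reference, the classic Nyquist-rate form of 1-D SVA (the baseline that breaks down under squint) selects, for each sample, the cosine-on-pedestal weight in [0, 0.5] that minimizes the output magnitude. A minimal sketch of that baseline, with our own variable names:

```python
import numpy as np

def sva_1d(x):
    """Spatially variant apodization at the Nyquist rate (1-D).
    For each sample, choose w in [0, 0.5] minimizing |x[n] + w*(x[n-1] + x[n+1])|."""
    y = x.astype(float).copy()
    for n in range(1, len(x) - 1):
        s = x[n - 1] + x[n + 1]
        if s == 0:
            continue
        w = np.clip(-x[n] / s, 0.0, 0.5)  # unconstrained minimizer, clipped
        y[n] = x[n] + w * s
    return y
```

    For complex SAR imagery, this is applied independently to the real and imaginary parts.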

  17. Hyperspectral Infrared Imaging of Flames Using a Spectrally Scanning Fabry-Perot Filter

    NASA Technical Reports Server (NTRS)

    Rawlins, W. T.; Lawrence, W. G.; Marinelli, W. J.; Allen, M. G.; Piltch, N. (Technical Monitor)

    2001-01-01

    The temperatures and compositions of gases in and around flames can be diagnosed using infrared emission spectroscopy to observe molecular band shapes and intensities. We have combined this approach with a low-order scanning Fabry-Perot filter and an infrared camera to obtain spectrally scanned infrared emission images of a laboratory flame and exhaust plume from 3.7 to 5.0 micrometers, at a spectral resolution of 0.043 micrometers and a spatial resolution of 1 mm. The scanning filter, or AIRIS (Adaptive Infrared Imaging Spectroradiometer), is a Fabry-Perot etalon operating in low order (mirror spacing = wavelength) such that the central spot, containing a monochromatic image of the scene, is viewed by the detector array. The detection system is a 128 x 128 liquid-nitrogen-cooled InSb focal plane array. The field of view is controlled by a 50 mm focal length multielement lens and an f/4.8 aperture, resulting in an image 6.4 x 6.4 cm in extent at the flame and a depth of field of approximately 4 cm. Hyperspectral images above a laboratory CH4/air flame show primarily the strong emission from CO2 at 4.3 micrometers, and weaker emissions from CO and H2O. We discuss techniques to analyze the spectra, and plans to use this instrument in microgravity flame spread experiments.

  18. The study of infrared target recognition at sea background based on visual attention computational model

    NASA Astrophysics Data System (ADS)

    Wang, Deng-wei; Zhang, Tian-xu; Shi, Wen-jun; Wei, Long-sheng; Wang, Xiao-ping; Ao, Guo-qing

    2009-07-01

    Infrared images of sea backgrounds are notorious for their low signal-to-noise ratio; consequently, target recognition in such images with traditional methods is very difficult. In this paper, we present a novel target recognition method based on the integration of a visual attention computational model and a conventional approach (selective filtering and segmentation). The two distinct image processing techniques are combined in a manner that utilizes the strengths of both. The visual attention algorithm automatically searches for salient regions, represents them by a set of winner points, and displays them as circles centered at those points. This provides a priori knowledge for the filtering and segmentation process. Around each winner point we construct a rectangular region to facilitate filtering and segmentation, and a labeling operation is then applied selectively as required. Using the labeled information, we obtain the position of the region of interest from the final segmentation result, mark its centroid on the corresponding original image, and thus localize the target. The processing time depends not on the size of the image but on the number of salient regions, so the time consumed is greatly reduced. The method is applied to the recognition of several kinds of real infrared images, and the experimental results confirm the effectiveness of the proposed algorithm.

  19. Fast estimate of Hartley entropy in image sharpening

    NASA Astrophysics Data System (ADS)

    Krbcová, Zuzana; Kukal, Jaromír.; Svihlik, Jan; Fliegel, Karel

    2016-09-01

    Two classes of linear IIR filters, the Laplacian of Gaussian (LoG) and the Difference of Gaussians (DoG), are frequently used as high-pass filters for contextual vision and edge detection. They are also used for image sharpening when linearly combined with the original image. The resulting sharpening filters are radially symmetric in the spatial and frequency domains. Our approach is based on a radial approximation of the unknown optimal filter, which is designed as a weighted sum of Gaussian filters with various radii. The novel filter is designed for MRI image enhancement, where the image intensity represents anatomical structure plus additive noise. We adopt the gradient norm of the Hartley entropy of the whole image intensity as the measure to be maximized for the best sharpening. The entropy estimation procedure is as fast as the FFT used in the filter, yet the estimate is a continuous function of the enhanced image intensities. A physically motivated heuristic is used to design the optimum sharpening filter by tuning its parameters. Our approach is compared with the Wiener filter on MRI images.
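    The DoG-based sharpening that this work generalizes amounts to adding a band-pass residual back to the image. A minimal numpy sketch (the Gaussian radii and weight are illustrative, not the paper's tuned values):

```python
import numpy as np

def gauss_kernel1d(sigma):
    radius = int(3 * sigma)
    t = np.arange(-radius, radius + 1)
    k = np.exp(-t ** 2 / (2 * sigma ** 2))
    return k / k.sum()

def gauss_blur(img, sigma):
    """Separable Gaussian blur (zero padding at the borders)."""
    k = gauss_kernel1d(sigma)
    img = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, img)

def dog_sharpen(img, s1=1.0, s2=2.0, weight=1.5):
    """Sharpen by adding the Difference-of-Gaussians band back to the image."""
    dog = gauss_blur(img, s1) - gauss_blur(img, s2)
    return img + weight * dog
```

    In the paper's setting the weights of several such Gaussian terms are tuned to maximize the entropy-based sharpness measure rather than fixed by hand.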

  20. Hierarchical image coding with diamond-shaped sub-bands

    NASA Technical Reports Server (NTRS)

    Li, Xiaohui; Wang, Jie; Bauer, Peter; Sauer, Ken

    1992-01-01

    We present a sub-band image coding/decoding system using a diamond-shaped pyramid frequency decomposition to more closely match visual sensitivities than conventional rectangular bands. Filter banks are composed of simple, low order IIR components. The coder is especially designed to function in a multiple resolution reconstruction setting, in situations such as variable capacity channels or receivers, where images must be reconstructed without the entire pyramid of sub-bands. We use a nonlinear interpolation technique for lost sub-bands to compensate for loss of aliasing cancellation.

  1. Small maritime target detection through false color fusion

    NASA Astrophysics Data System (ADS)

    Toet, Alexander; Wu, Tirui

    2008-04-01

    We present an algorithm that produces a fused false color representation of a combined multiband IR and visual imaging system for maritime applications. Multispectral IR imaging techniques are increasingly deployed in maritime operations, to detect floating mines or to find small dinghies and swimmers during search and rescue operations. However, maritime backgrounds usually contain a large amount of clutter that severely hampers the detection of small targets. Our new algorithm exploits the correlation between the target signatures in two different IR frequency bands (3-5 and 8-12 μm) to construct a fused IR image with a reduced amount of clutter. The fused IR image is then combined with a visual image in a false color RGB representation for display to a human operator. The algorithm works as follows. First, both individual IR bands are filtered with a morphological opening top-hat transform to extract small details. Second, a common image is extracted from the two filtered IR bands, and assigned to the red channel of an RGB image. Regions of interest that appear in both IR bands remain in this common image, while most uncorrelated noise details are filtered out. Third, the visual band is assigned to the green channel and, after multiplication with a constant (typically 1.6), also to the blue channel. Fourth, the brightness and colors of this intermediate false color image are renormalized by adjusting its first order statistics to those of a representative reference scene. The result of these four steps is a fused color image with naturalistic colors (bluish sky and grayish water), in which small targets are clearly visible.
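    The four steps can be sketched as below. This is an illustrative reconstruction from the description above: the "common image" is taken as the pixel-wise minimum of the two top-hat filtered bands (one plausible choice the text does not pin down), and the final statistics-matching step is omitted:

```python
import numpy as np

def _window_reduce(img, size, func):
    """Apply func (np.min or np.max) over a size x size window at every pixel."""
    p = size // 2
    pad = np.pad(img, p, mode='edge')
    views = [pad[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(size) for j in range(size)]
    return func(np.stack(views), axis=0)

def tophat(img, size=5):
    """White top-hat: image minus its morphological opening (erosion, then dilation)."""
    opened = _window_reduce(_window_reduce(img, size, np.min), size, np.max)
    return img - opened

def fuse_false_color(ir_mwir, ir_lwir, visual, blue_gain=1.6):
    """Steps 1-3: top-hat both IR bands, keep their common detail as R,
    visual as G, and gain-scaled visual as B."""
    common = np.minimum(tophat(ir_mwir), tophat(ir_lwir))  # correlated details survive
    return np.stack([common, visual, np.clip(blue_gain * visual, 0, 1)], axis=-1)
```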

  2. Application of off-line image processing for optimization in chest computed radiography using a low cost system.

    PubMed

    Muhogora, Wilbroad E; Msaki, Peter; Padovani, Renato

    2015-03-08

    The objective of this study was to improve the visibility of anatomical details by applying off-line post-processing in chest computed radiography (CR). Four spatial-domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and its image processing tools. The developed techniques were applied to sample images, and their visual appearance was confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 clinical chest images, which were randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and range of the average scores of the three radiologists were characterized for each developed technique and imaging system. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity-value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultation with the radiologists.
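    The significance comparison reported above can be reproduced with a plain U-statistic computation; the score lists in the usage example are hypothetical, not the study's data:

```python
import numpy as np

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic for sample x versus y (ties count 1/2).
    Computes only U; the p-value would come from its null distribution."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    greater = (x[:, None] > y[None, :]).sum()
    ties = (x[:, None] == y[None, :]).sum()
    return greater + 0.5 * ties

# hypothetical visibility scores: externally processed vs. default processing
u = mann_whitney_u([4, 5, 4, 3, 5], [2, 3, 2, 4, 3])
```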

  3. Application of off‐line image processing for optimization in chest computed radiography using a low cost system

    PubMed Central

    Msaki, Peter; Padovani, Renato

    2015-01-01

    The objective of this study was to improve the visibility of anatomical details by applying off-line post-processing in chest computed radiography (CR). Four spatial-domain-based external image processing techniques were developed using MATLAB software version 7.0.0.19920 (R14) and its image processing tools. The developed techniques were applied to sample images, and their visual appearance was confirmed by two consultant radiologists to be clinically adequate. The techniques were then applied to 200 clinical chest images, which were randomized with another 100 images previously processed online. These 300 images were presented to three experienced radiologists for image quality assessment using standard quality criteria. The mean and range of the average scores of the three radiologists were characterized for each developed technique and imaging system. The Mann-Whitney U-test was used to test the difference in detail visibility between the images processed using each of the developed techniques and the corresponding images processed using default algorithms. The results show that the visibility of anatomical features improved significantly (0.005 ≤ p ≤ 0.02) with combinations of intensity-value adjustment and/or spatial linear filtering techniques for images acquired using 60 ≤ kVp ≤ 70. However, there was no improvement for images acquired using 102 ≤ kVp ≤ 107 (0.127 ≤ p ≤ 0.48). In conclusion, the use of external image processing for optimization can be effective in chest CR, but should be implemented in consultation with the radiologists. PACS number: 87.59.−e, 87.59.−B, 87.59.−bd PMID:26103165

  4. Polarization-Insensitive Tunable Optical Filters based on Liquid Crystal Polarization Gratings

    NASA Astrophysics Data System (ADS)

    Nicolescu, Elena

    Tunable optical filters are widely used for a variety of applications including spectroscopy, optical communication networks, remote sensing, and biomedical imaging and diagnostics. All of these application areas can greatly benefit from improvements in the key characteristics of the tunable optical filters embedded in them. Some of these key parameters include peak transmittance, bandwidth, tuning range, and transition width. In recent years research efforts have also focused on miniaturizing tunable optical filters into physically small packages for compact portable spectroscopy and hyperspectral imaging applications such as real-time medical diagnostics and defense applications. However, it is important that miniaturization not have a detrimental effect on filter performance. The overarching theme of this dissertation is to explore novel configurations of Polarization Gratings (PGs) as simple, low-cost, polarization-insensitive alternatives to conventional optical filtering technologies for applications including hyperspectral imaging and telecommunications. We approach this goal from several directions with a combination of theory and experimental demonstration leading to, in our opinion, a significant contribution to the field. We present three classes of tunable optical filters, the first of which is an angle-filtering scheme where the stop-band wavelengths are redirected off axis and the passband is transmitted on-axis. This is achieved using a stacked configuration of polarization gratings of various thicknesses. To improve this class of filter, we also introduce a novel optical element, the Bilayer Polarization Grating, exhibiting unique optical properties and demonstrating complex anchoring conditions with high quality. The second class of optical filter is analogous to a Lyot filter, utilizing stacks of static or tunable waveplates sandwiched with polarizing elements. 
However, we introduce a new configuration using PGs and static waveplates to replace the polarizers in the system, thereby greatly increasing the filter throughput. We then turn our attention to a Fourier filtering technique. This is a fundamentally different filtering approach involving a single PG where the filtering functionality involves selecting a spectral band with a movable aperture or slit and a diffractive element (PG in our case). Finally, we study the integration of a PG in a multi-channel wavelength blocker system focusing on the practical and fundamental limitations of using a PG as a variable optical attenuator/wavelength blocker in a commercial optical telecommunications network.

  5. Shack-Hartmann wavefront sensing based on binary-aberration-mode filtering.

    PubMed

    Wang, Shuai; Yang, Ping; Xu, Bing; Dong, Lizhi; Ao, Mingwu

    2015-02-23

    Spot centroid detection has been required by Shack-Hartmann wavefront sensing since the technique was first proposed. In a standard Shack-Hartmann wavefront sensor, a camera is placed behind a lenslet array to record the image of the spots. We propose a new Shack-Hartmann wavefront sensing technique that does not use spot centroid detection. Based on the principle of binary-aberration-mode filtering, only one light-detecting unit per subaperture is used to measure the local wavefront slopes. It is thus possible to adopt single detectors in a Shack-Hartmann wavefront sensor, and the method gains noise benefits from using single detectors behind each subaperture when sensing rapidly varying wavefronts in weak light. Moreover, because imaging is not discretized into pixels, this method is a potential solution for high measurement precision with fewer detecting units. Our simulations demonstrate the validity of the theoretical model, and the results also indicate an advantage in measurement accuracy.

  6. Measurement of Ambient Air Motion of D. I. Gasoline Spray by LIF-PIV

    NASA Astrophysics Data System (ADS)

    Yamakawa, Masahisa; Isshiki, Seiji; Yoshizaki, Takuo; Nishida, Keiya

    Ambient air velocity distributions in and around a D. I. gasoline spray were measured using a combination of LIF and PIV techniques. A rhodamine and water solution was injected into ambient air to disperse the fine fluorescent liquid particles used as tracers. A fuel spray was injected into the fluorescent tracer cloud and was illuminated by an Nd:YAG laser light sheet (532 nm). The scattered light from the spray droplets and tracers was cut off by a high-pass filter (>560 nm). As the fluorescence (>600 nm) was transmitted through the high-pass filter, the tracer images were captured using a CCD camera and the ambient air velocity distribution could be obtained by PIV based on the images. This technique was applied to a D. I. gasoline spray. The ambient air flowed up around the spray and entered into the tail of the spray. Furthermore, the relative velocity between the spray and ambient air was investigated.

  7. Extraction of Black Hole Shadows Using Ridge Filtering and the Circle Hough Transform

    NASA Astrophysics Data System (ADS)

    Hennessey, Ryan; Akiyama, Kazunori; Fish, Vincent

    2018-01-01

    Supermassive black holes are widely considered to reside at the centers of most large galaxies. One of the foremost tasks in modern astronomy is to image the centers of nearby galaxies, such as those of Messier 87 (M87) and of our own Milky Way (Sagittarius A*), to gain the first glimpses of black holes and their surrounding structures. Using data obtained from the Event Horizon Telescope (EHT), a global collection of millimeter-wavelength telescopes designed to perform very long baseline interferometry, new imaging techniques will likely be able to yield images of these structures at fine enough resolutions to compare with the predictions of general relativity and give us more insight into the formation of black holes, their surrounding jets and accretion disks, and galaxies themselves. Techniques to extract features from these images are already being developed. In this work, we present a new method for measuring the size of the black hole shadow, a feature that encodes information about the black hole mass and spin, using ridge filtering and the circle Hough transform. Previous methods have succeeded in extracting the black hole shadow with an accuracy of about 10-20%, but with this new technique we are able to measure the shadow size with even finer accuracy. Our work indicates that the EHT will be able to significantly reduce the uncertainty in the estimate of the mass of the supermassive black hole in M87.
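    A basic circle Hough transform votes every edge point into an accumulator over candidate centers and radii; the peak identifies the circle. A minimal sketch (angular sampling and radius grid are illustrative):

```python
import numpy as np

def circle_hough(edge_points, shape, radii):
    """Vote in (cy, cx, r) space for each edge point; return the best circle.
    edge_points: (N, 2) array of (row, col) edge coordinates."""
    H, W = shape
    acc = np.zeros((H, W, len(radii)), dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    for r_idx, r in enumerate(radii):
        for y, x in edge_points:
            cy = np.round(y - r * np.sin(thetas)).astype(int)
            cx = np.round(x - r * np.cos(thetas)).astype(int)
            ok = (cy >= 0) & (cy < H) & (cx >= 0) & (cx < W)
            np.add.at(acc, (cy[ok], cx[ok], r_idx), 1)
    cy, cx, r_idx = np.unravel_index(acc.argmax(), acc.shape)
    return cy, cx, radii[r_idx]
```

    In the shadow-extraction pipeline described above, the edge points would come from the ridge-filtered image rather than from a simple edge detector.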

  8. Noise Power Spectrum in PROPELLER MR Imaging.

    PubMed

    Ichinoseki, Yuki; Nagasaka, Tatsuo; Miyamoto, Kota; Tamura, Hajime; Mori, Issei; Machida, Yoshio

    2015-01-01

    The noise power spectrum (NPS), an index for noise evaluation, represents the frequency characteristics of image noise. We measured the NPS in PROPELLER (Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction) magnetic resonance (MR) imaging, a nonuniform data sampling technique, as an initial study for practical MR image evaluation using the NPS. The 2-dimensional (2D) NPS reflected the k-space sampling density and showed agreement with the shape of the k-space trajectory as expected theoretically. Additionally, the 2D NPS allowed visualization of a part of the image reconstruction process, such as filtering and motion correction.
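    A common estimator for the 2D NPS, consistent with the description above, averages the squared DFT magnitudes of zero-mean noise ROIs and scales by the pixel area over the number of pixels per ROI; a sketch under that assumption:

```python
import numpy as np

def noise_power_spectrum(noise_rois, pixel_size=1.0):
    """2D NPS estimate: ensemble average of |DFT|^2 over noise-only ROIs,
    scaled by (pixel area) / (number of pixels per ROI)."""
    rois = np.asarray(noise_rois, float)
    rois = rois - rois.mean(axis=(1, 2), keepdims=True)  # remove per-ROI DC
    ny, nx = rois.shape[1:]
    ps = np.abs(np.fft.fft2(rois)) ** 2
    return ps.mean(axis=0) * pixel_size ** 2 / (nx * ny)
```

    For white noise the estimate is flat, with a level set by the noise variance; structured sampling such as PROPELLER's rotated blades reshapes it into the trajectory-dependent patterns the study reports.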

  9. Non-linear Post Processing Image Enhancement

    NASA Technical Reports Server (NTRS)

    Hunt, Shawn; Lopez, Alex; Torres, Angel

    1997-01-01

    A non-linear filter for image post-processing based on the feedforward neural network topology is presented. This study was undertaken to investigate the usefulness of "smart" filters in image post-processing. The filter has been shown to be useful in recovering high frequencies, such as those lost during the JPEG compression-decompression process. The filtered images have a higher signal-to-noise ratio and a higher perceived image quality. Simulation studies comparing the proposed filter with the optimum mean-square non-linear filter, examples of the high-frequency recovery, and the statistical properties of the filter are given.

  10. Reduction of noise and image artifacts in computed tomography by nonlinear filtration of projection images

    NASA Astrophysics Data System (ADS)

    Demirkaya, Omer

    2001-07-01

    This study investigates the efficacy of filtering two-dimensional (2D) projection images of computed tomography (CT) with nonlinear diffusion filtration to remove statistical noise prior to reconstruction. The projection images of the Shepp-Logan head phantom were degraded by Gaussian noise. The variance of the Gaussian distribution was adaptively changed depending on the intensity at a given pixel in the projection image. The corrupted projection images were then filtered using the nonlinear anisotropic diffusion filter. The filtered projections, as well as the original noisy projections, were reconstructed using filtered backprojection (FBP) with a Ram-Lak filter and/or a Hanning window. The ensemble variance was computed for each pixel on a slice. The nonlinear filtering of projection images improved the SNR substantially, on the order of fourfold, in these synthetic images. A comparison of intensity profiles across a cross-sectional slice indicated that the filtering did not result in any significant loss of image resolution.
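    The nonlinear anisotropic diffusion filter referred to here is commonly the Perona-Malik scheme, which smooths within regions while suppressing diffusion across strong edges. A minimal sketch (parameter values illustrative, not the paper's):

```python
import numpy as np

def anisotropic_diffusion(img, niter=20, kappa=0.1, gamma=0.2):
    """Perona-Malik diffusion: flux = g(|grad|) * grad, with g ~ 1 in flat
    areas and g ~ 0 across edges, so edges survive while noise is smoothed."""
    def g(d):
        return np.exp(-(d / kappa) ** 2)
    u = img.astype(float).copy()
    for _ in range(niter):
        dn = np.roll(u, 1, 0) - u;  dn[0, :] = 0   # north neighbor difference
        ds = np.roll(u, -1, 0) - u; ds[-1, :] = 0  # south
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0  # east
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0   # west
        u = u + gamma * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```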

  11. Improving signal to noise in labeled biological specimens using energy-filtered TEM of sections with a drift correction strategy and a direct detection device.

    PubMed

    Ramachandra, Ranjan; Bouwer, James C; Mackey, Mason R; Bushong, Eric; Peltier, Steven T; Xuong, Nguyen-Huu; Ellisman, Mark H

    2014-06-01

    Energy-filtered transmission electron microscopy techniques are regularly used to build elemental maps of spatially distributed nanoparticles in materials and biological specimens. When working with thick biological sections, electron energy loss spectroscopy techniques involving core-loss electrons often require exposures exceeding several minutes to provide sufficient signal to noise. Image quality with these long exposures is often compromised by specimen drift, which results in blurring and reduced resolution. To mitigate drift artifacts, a series of short-exposure images can be acquired, aligned, and merged to form a single image. For samples where the target elements have extremely low signal yields, the use of charge-coupled device (CCD)-based detectors for this purpose can be problematic. At short acquisition times, the images produced by CCDs can be noisy and may contain fixed-pattern artifacts that impede subsequent correlative alignment. Here we report on the use of direct electron detection devices (DDDs) to increase the signal to noise compared with CCDs. A 3× improvement in signal is reported with a DDD versus a comparably formatted CCD, with equivalent dose on each detector. With the fast rolling-readout design of the DDD, the duty cycle provides a major benefit, as there is no dead time between successive frames.
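    The acquire-align-merge strategy for short exposures can be sketched with FFT-based cross-correlation registration; this is a generic drift-correction approach, not necessarily the authors' exact procedure:

```python
import numpy as np

def align_and_sum(frames):
    """Register each frame to the first via the peak of the circular
    cross-correlation (computed with FFTs), then sum the shifted frames."""
    ref = frames[0].astype(float)
    F_ref = np.fft.fft2(ref)
    total = ref.copy()
    h, w = ref.shape
    for f in frames[1:]:
        xc = np.fft.ifft2(F_ref * np.conj(np.fft.fft2(f))).real
        dy, dx = np.unravel_index(xc.argmax(), xc.shape)
        dy = dy - h if dy > h // 2 else dy  # wrap to signed shifts
        dx = dx - w if dx > w // 2 else dx
        total += np.roll(f, (dy, dx), axis=(0, 1))
    return total
```

    Integer-pixel shifts are assumed here; subpixel registration would interpolate around the correlation peak.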

  12. Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.

    PubMed

    Brauers, Johannes; Aach, Til

    2011-02-01

    High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired and the passbands are finally combined to a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.
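    The lens portion of such a calibration is commonly expressed with radial (Brown-Conrady) distortion terms on top of the pinhole model. A sketch of that generic radial model and its inversion (the paper's full model additionally handles the filter-induced displacement of the image plane, omitted here):

```python
import numpy as np

def apply_radial_distortion(xy, k1, k2, center=(0.0, 0.0)):
    """Radial term of an extended pinhole model, in normalized coordinates:
    x_d = c + (x - c) * (1 + k1*r^2 + k2*r^4)."""
    p = np.asarray(xy, float) - center
    r2 = np.sum(p ** 2, axis=-1, keepdims=True)
    return center + p * (1 + k1 * r2 + k2 * r2 ** 2)

def undistort(xy_d, k1, k2, center=(0.0, 0.0), iters=10):
    """Invert the radial model by fixed-point iteration; converges quickly
    for the mild distortions typical of calibrated lenses."""
    xy_d = np.asarray(xy_d, float)
    p = xy_d.copy()
    for _ in range(iters):
        p = p - (apply_radial_distortion(p, k1, k2, center) - xy_d)
    return p
```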

  13. Focusing attention on objects of interest using multiple matched filters.

    PubMed

    Stough, T M; Brodley, C E

    2001-01-01

    In order to be of use to scientists, large image databases need to be analyzed to create a catalog of the objects of interest. One approach is to apply a multiple-tiered search algorithm that uses reduction techniques of increasing computational complexity to select the desired objects from the database. The first tier of this type of algorithm, often called a focus of attention (FOA) algorithm, selects candidate regions from the image data and passes them to the next tier of the algorithm. In this paper we present a new approach to FOA that employs multiple matched filters (MMF), one for each object prototype, to detect the regions of interest. The MMFs are formed using k-means clustering on a set of image patches identified by domain experts as positive examples of objects of interest. An innovation of the approach is to radically reduce the dimensionality of the feature space used by the k-means algorithm by taking block averages of (i.e., "spoiling") the sample image patches. The process of spoiling is analyzed and its applicability to other domains is discussed. The outputs of the MMFs are combined by projecting the detections back into an empty image and then thresholding. This research was motivated by the need to detect small volcanoes in the Magellan probe data from Venus. An empirical evaluation of the approach illustrates that a combination of the MMF plus the average filter results in a higher likelihood of 100% detection of the objects of interest at a lower false positive rate than a single matched filter alone.
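    The spoiling step and the k-means construction of filter prototypes can be sketched as follows; this is a generic reconstruction from the description (block size, k, and the random training patches are illustrative):

```python
import numpy as np

def spoil(patch, b=2):
    """Reduce dimensionality by block-averaging ('spoiling') a patch."""
    h, w = patch.shape
    return patch[:h - h % b, :w - w % b].reshape(h // b, b, w // b, b).mean(axis=(1, 3))

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; each final center, reshaped, is one matched filter."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return C

# positive example patches -> spoiled feature vectors -> k filter prototypes
patches = [np.random.default_rng(i).random((8, 8)) for i in range(40)]
X = np.stack([spoil(p).ravel() for p in patches])
filters = kmeans(X, k=3)
```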

  14. Artificial Structural Color Pixels: A Review

    PubMed Central

    Zhao, Yuqian; Zhao, Yong; Hu, Sheng; Lv, Jiangtao; Ying, Yu; Gervinskas, Gediminas; Si, Guangyuan

    2017-01-01

    Inspired by natural photonic structures (the Morpho butterfly, for instance), researchers have demonstrated various artificial color display devices using different designs. Photonic-crystal/plasmonic color filters have drawn increasing attention most recently. In this review article, we show the developing trend of artificial structural color pixels from photonic crystals to plasmonic nanostructures. Such devices normally utilize the distinctive optical features of photonic/plasmon resonance, resulting in high compatibility with current display and imaging technologies. Moreover, dynamic color filtering devices are highly desirable because tunable optical components are critical for developing new optical platforms that can be integrated or combined with other existing imaging and display techniques. Thus, extensive promising potential applications have been enabled, including richer functionality in integrated optics and nanophotonics. PMID:28805736

  15. Angular filter refractometry analysis using simulated annealing.

    PubMed

    Angland, P; Haberberger, D; Ivancic, S T; Froula, D H

    2017-10-01

    Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and the statistical uncertainty calculation are based on minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.

  16. Novel, in-situ Raman and fluorescence measurement techniques: Imaging using optical waveguides

    NASA Astrophysics Data System (ADS)

    Carter, Jerry Chance

    The following dissertation describes the development of methods for performing standoff and in-situ Raman and fluorescence spectroscopy for chemical imaging and non-imaging analytical applications. The use of Raman spectroscopy for the in-situ identification of crack cocaine and cocaine HCl using a fiber-optic Raman probe and a portable Raman spectrograph has been demonstrated. We show that the Raman spectra of both forms of cocaine are easily distinguishable from common cutting agents and impurities such as benzocaine and lidocaine. We have also demonstrated the use of Raman spectroscopy for in-situ identification of drugs separated by thin-layer chromatography. We have investigated the use of small, transportable Raman systems for standoff Raman spectroscopy (e.g. <20 m). For this work, acousto-optical tunable filters (AOTF) and liquid crystal tunable filters (LCTF) are being used both with, and in place of, dispersive spectrographs and fixed filtering devices. In addition, we improved the flexibility of the system by the use of a modified holographic fiber-optic probe for light and image collection. A comparison of tunable filter technologies for standoff Raman imaging is discussed, along with the merits of image transfer devices using small-diameter image guides. A standoff Raman imaging system has been developed that utilizes a unique polymer collection mirror. The techniques used to produce these mirrors make it easy to design low f/# polymer mirrors. The performance of a low f/# polymer mirror system for standoff Raman chemical imaging has been demonstrated and evaluated. We have also demonstrated remote Raman hyperspectral imaging using a dimension-reduction, 2-dimensional (2-D) to 1-dimensional (1-D), fiber-optic array. In these studies, a modified holographic fiber-optic probe was combined with the dimension-reduction fiber array for remote Raman imaging.
The utility of this setup for standoff Raman imaging is demonstrated by monitoring the polymerization of dibromostyrene. To further demonstrate the utility of in-situ spectral imaging, we have shown that small-diameter (350 μm) image guides can be used for in-situ measurements of analyte transport in thin membranes. This has been applied to the measurement of H2O diffusion in Nafion™ membranes using the luminescent compound [Ru(phen)2dppz]2+, which is a H2O indicator.

  17. Switching non-local vector median filter

    NASA Astrophysics Data System (ADS)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2016-04-01

    This paper describes a novel image filtering method that removes random-valued impulse noise superimposed on a natural color image. In impulse noise removal, it is essential to employ a switching-type filtering method, as in the well-known switching median filter, to preserve the detail of the original image with good quality. In color image filtering, it is generally preferable to treat the red (R), green (G), and blue (B) components of each pixel as elements of a vectorized signal, as in the well-known vector median filter, rather than as component-wise signals, to prevent color shifts after filtering. With these fundamentals in mind, we propose a switching-type vector median filter with non-local processing that consists mainly of a noise detector and a noise removal filter. Concretely, we propose a noise detector that proactively detects noise-corrupted pixels by focusing on the isolation tendencies of pixels of interest, not in the input image itself but in the difference images between RGB components. Furthermore, as the noise removal filter, we propose the non-local vector median filter, an extension to color image processing of the non-local median filter we previously proposed for grayscale images. The proposed method achieves a superior balance between detail preservation and impulse noise removal through proactive noise detection and non-local switching vector median filtering, respectively. The effectiveness and validity of the proposed method are verified in a series of experiments using natural color images.
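    The switching vector median at the core of such a filter can be sketched as follows; the paper's non-local noise detector is omitted here, and the noise mask is assumed to be given:

```python
import numpy as np

def vector_median(vectors):
    """Vector median: the member vector minimizing the summed Euclidean
    distance to all others (avoids the color shifts of channel-wise medians)."""
    V = np.asarray(vectors, float)
    dists = np.linalg.norm(V[:, None] - V[None], axis=-1).sum(axis=1)
    return V[np.argmin(dists)]

def switching_vmf(img, noise_mask, size=3):
    """Switching filter: replace only detected pixels; others pass untouched."""
    out = img.astype(float).copy()
    p = size // 2
    pad = np.pad(img.astype(float), ((p, p), (p, p), (0, 0)), mode='edge')
    for y, x in zip(*np.nonzero(noise_mask)):
        win = pad[y:y + size, x:x + size].reshape(-1, img.shape[2])
        out[y, x] = vector_median(win)
    return out
```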

  18. Digitizing zone maps, using modified LARSYS program. [computer graphics and computer techniques for mapping]

    NASA Technical Reports Server (NTRS)

    Giddings, L.; Boston, S.

    1976-01-01

    A method for digitizing zone maps is presented, starting with colored images and producing a final one-channel digitized tape. This method automates the work previously done interactively on the Image-100 and Data Analysis System computers of the Johnson Space Center (JSC) Earth Observations Division (EOD). A color-coded map was digitized through color filters on a scanner to form a digital tape in LARSYS-2 or JSC Universal format. The taped image was classified by the EOD LARSYS program on the basis of training fields included in the image. Numerical values were assigned to all pixels in a given class, and the resulting coded zone map was written on a LARSYS or Universal tape. A unique spatial filter option permitted zones to be made homogeneous and edges of zones to be abrupt transitions from one zone to the next. A zoom option allowed the output image to have arbitrary dimensions in terms of number of lines and number of samples on a line. Printouts of the computer program are given and the images that were digitized are shown.

  19. Optical Scatter Imaging with a digital micromirror device.

    PubMed

    Zheng, Jing-Yi; Pasternack, Robert M; Boustany, Nada N

    2009-10-26

    We have developed Optical Scatter Imaging (OSI) as a method that combines light scattering spectroscopy with microscopic imaging to probe local particle size in situ. Using a variable diameter iris as a Fourier spatial filter, the technique consisted of collecting images that encoded the intensity ratio of wide-to-narrow angle scatter at each pixel in the full field of view. In this paper, we replace the variable diameter Fourier filter with a digital micromirror device (DMD) to extend our assessment of morphology to the characterization of particle shape and orientation. We describe our setup in detail and demonstrate how to eliminate aberrations associated with the placement of the DMD in a conjugate Fourier plane of our microscopic imaging system. Using bacteria and polystyrene spheres, we show how this system can be used to assess particle aspect ratio even when imaged at low resolution. We also show the feasibility of detecting alterations in organelle aspect ratio in situ within living cells. This improved OSI system could be further developed to automate morphological quantification and sorting of non-spherical particles in situ.

  20. Portable multispectral imaging system for oral cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Hsieh, Yao-Fang; Ou-Yang, Mang; Lee, Cheng-Chung

    2013-09-01

    This study presents a portable multispectral imaging system that can acquire images at specific spectral bands in vivo for oral cancer diagnosis. According to the research literature, the autofluorescence of cells and tissue has been widely applied to diagnose oral cancer: the spectral distribution differs between lesions of epithelial cells and normal cells after fluorescence excitation. We have developed hyperspectral and multispectral techniques for oral cancer diagnosis over three generations; this research is the third generation, and the excitation and emission spectra used for diagnosis were obtained in the first-generation research. The portable detection system is a modification of an existing handheld microscope. A UV LED illuminates the surface of the oral cavity and excites the cells to produce fluorescence. The image passes through the central channel, the selected filter removes unwanted spectral bands, and the focusing lens focuses the light onto the image sensor. We can therefore acquire an image at a specific wavelength via the fluorescence reaction. The specificity and sensitivity of the system are 85% and 90%, respectively.

  1. Micromachined Tunable Fabry-Perot Filters for Infrared Astronomy

    NASA Technical Reports Server (NTRS)

    Barclay, Richard; Bier, Alexander; Chen, Tina; DiCamillo, Barbara; Deming, Drake; Greenhouse, Matthew; Henry, Ross; Hewagama, Tilak; Jacobson, Mindy; Loughlin, James; et al.

    2002-01-01

    Micromachined Fabry-Perot tunable filters with a large clear aperture (12.5 to 40 mm) are being developed as an optical component for wide-field imaging 1:1 spectroscopy. This program applies silicon micromachining fabrication techniques to miniaturize Fabry-Perot filters for astronomical science instruments. The filter assembly consists of a stationary etalon plate mated to a plate in which the etalon is free to move along the optical axis on silicon springs attached to a stiff silicon support ring. The moving etalon is actuated electrostatically by electrode pairs on the fixed and moving etalons. To reduce mass, both etalons are fabricated by applying optical coatings to a thin freestanding silicon nitride film held flat in drumhead tension rather than to a thick optical substrate. The design, electro-mechanical modeling, fabrication, and initial results will be discussed. The potential application of the miniature Fabry-Perot filters will be briefly discussed with emphasis on the detection of extra-solar planets.

  2. Filter Media Tests Under Simulated Martian Atmospheric Conditions

    NASA Technical Reports Server (NTRS)

    Agui, Juan H.

    2016-01-01

    Human exploration of Mars will require the optimal utilization of planetary resources. One of its abundant resources is the Martian atmosphere that can be harvested through filtration and chemical processes that purify and separate it into its gaseous and elemental constituents. Effective filtration needs to be part of the suite of resource utilization technologies. A unique testing platform is being used which provides the relevant operational and instrumental capabilities to test articles under the proper simulated Martian conditions. A series of tests were conducted to assess the performance of filter media. Light sheet imaging of the particle flow provided a means of detecting and quantifying particle concentrations to determine capturing efficiencies. The media's efficiency was also evaluated by gravimetric means through a by-layer filter media configuration. These tests will help to establish techniques and methods for measuring capturing efficiency and arrestance of conventional fibrous filter media. This paper will describe initial test results on different filter media.

  3. A robust approach to optimal matched filter design in ultrasonic non-destructive evaluation (NDE)

    NASA Astrophysics Data System (ADS)

    Li, Minghui; Hayward, Gordon

    2017-02-01

    The matched filter was demonstrated to be a powerful yet efficient technique to enhance defect detection and imaging in ultrasonic non-destructive evaluation (NDE) of coarse grain materials, provided that the filter was properly designed and optimized. In the literature, in order to accurately approximate the defect echoes, the design utilized the real excitation signals, which made it time consuming and less straightforward to implement in practice. In this paper, we present a more robust and flexible approach to optimal matched filter design using simulated excitation signals. The control parameters are chosen and optimized based on the real scenario of the array transducer, the transmitter-receiver system response, and the test sample; as a result, the filter response is optimized and depends on the material characteristics. Experiments on industrial samples are conducted, and the results confirm the great benefits of the method.
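
As a concrete illustration of the matched-filter principle the paper builds on, the sketch below correlates a noisy A-scan with a simulated excitation template (a Gaussian-windowed tone burst). The sampling rate, centre frequency, delay, and noise level are assumptions for the demo, not the authors' optimized parameters.

```python
import numpy as np

fs = 50e6                        # sampling rate (assumed)
t = np.arange(0, 4e-6, 1 / fs)   # 200 samples
f0 = 5e6                         # transducer centre frequency (assumed)

# Simulated excitation used as the matched-filter template:
# a Gaussian-windowed tone burst centred at 2 microseconds.
tpl = np.exp(-((t - 2e-6) ** 2) / (2 * (0.3e-6) ** 2)) * np.sin(2 * np.pi * f0 * t)

# Received A-scan: the echo buried in grain-like noise at a known delay.
rng = np.random.default_rng(0)
delay = 60                       # samples
echo = np.zeros_like(t)
echo[delay:] = tpl[:len(t) - delay]
rx = echo + 0.5 * rng.standard_normal(len(t))

# Matched filtering = cross-correlation of the received signal with the template.
mf = np.correlate(rx, tpl, mode='full')
lag = np.argmax(np.abs(mf)) - (len(tpl) - 1)   # estimated echo delay in samples
```

The correlation peak concentrates the echo energy, which is why the estimated lag survives noise that hides the echo in the raw trace.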

  4. Fabrication of artificially stacked ultrathin ZnS/MgF2 multilayer dielectric optical filters.

    PubMed

    Kedawat, Garima; Srivastava, Subodh; Jain, Vipin Kumar; Kumar, Pawan; Kataria, Vanjula; Agrawal, Yogyata; Gupta, Bipin Kumar; Vijay, Yogesh K

    2013-06-12

    We report a design and fabrication strategy for creating artificially stacked multilayered optical filters using a thermal evaporation technique. We selectively chose a zinc sulphide (ZnS) lattice for the high-refractive-index (n = 2.35) layers and a magnesium fluoride (MgF2) lattice for the low-refractive-index (n = 1.38) layers. Furthermore, the microstructures of the ZnS/MgF2 multilayer films are investigated through TEM and HRTEM imaging. The fabricated filters consist of 7 and 13 alternating high- and low-refractive-index layers, which exhibit reflectances of 89.60% and 99%, respectively. The optical microcavity achieved an average transmittance of 85.13% within the visible range. The obtained results suggest that these filters could be an exceptional choice for next-generation antireflection coatings, high-reflection mirrors, and polarized interference filters.
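
The trend of reflectance with layer count can be sanity-checked with the standard characteristic-matrix (transfer-matrix) method for quarter-wave layers. The sketch below uses the quoted indices and an assumed glass substrate (n = 1.52); being lossless and idealized, it overestimates the measured values but reproduces the strong increase from 7 to 13 layers.

```python
import numpy as np

def quarter_wave_stack_R(n_layers, nH=2.35, nL=1.38, n0=1.0, ns=1.52):
    """Normal-incidence reflectance of an HLHL... quarter-wave stack at the
    design wavelength, via the characteristic (transfer) matrix method."""
    M = np.eye(2, dtype=complex)
    indices = [nH if i % 2 == 0 else nL for i in range(n_layers)]
    for n in indices:
        delta = np.pi / 2                     # quarter-wave optical thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    # Effective admittance of stack + substrate, then Fresnel reflectance.
    B, C = M @ np.array([1, ns])
    r = (n0 * B - C) / (n0 * B + C)
    return float(abs(r) ** 2)

R7 = quarter_wave_stack_R(7)      # 7-layer stack
R13 = quarter_wave_stack_R(13)    # 13-layer stack, much closer to unity
```

Each added high/low pair multiplies the effective admittance by (nH/nL)^2, which is why a modest number of ZnS/MgF2 pairs already approaches total reflection.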

  5. Metallic artifact mitigation and organ-constrained tissue assignment for Monte Carlo calculations of permanent implant lung brachytherapy.

    PubMed

    Sutherland, J G H; Miksys, N; Furutani, K M; Thomson, R M

    2014-01-01

    To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for (125)I, (103)Pd, and (131)Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Metallic artifact mitigation techniques vary in ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts but residual artifacts near sources remain requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. 
Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra with the largest differences for (103)Pd seeds and smallest but still considerable differences for (131)Cs seeds. Despite producing differences in CT images, dose metrics calculated using the STR, fan beam + STR, and 3D median filter techniques produce similar dose metrics. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
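
A minimal sketch of the simple-threshold-replacement idea (not the authors' implementation): within a fixed radius of each seed, CT values above a threshold are treated as artifact and replaced with the median of the unaffected neighbourhood voxels. The radius and HU threshold below are illustrative assumptions.

```python
import numpy as np

def simple_threshold_replacement(ct, seed_zyx, radius=4, hu_thresh=800):
    """STR-style correction sketch: near each seed, voxels above hu_thresh
    are replaced with the median of the surrounding unaffected voxels."""
    out = ct.astype(float).copy()
    z, y, x = np.indices(ct.shape)
    for sz, sy, sx in seed_zyx:
        near = (z - sz) ** 2 + (y - sy) ** 2 + (x - sx) ** 2 <= radius ** 2
        artifact = near & (out > hu_thresh)          # bright metal streak voxels
        clean = near & ~artifact                     # estimated true tissue
        if artifact.any() and clean.any():
            out[artifact] = np.median(out[clean])    # estimated true value
    return out

# Toy volume: lung-like background with a bright streak through one "seed".
vol = np.full((16, 16, 16), -700.0)     # approximate lung HU
vol[8, 8, 6:11] = 3000.0                # metal artifact streak
corr = simple_threshold_replacement(vol, [(8, 8, 8)])
```

In a real phantom pipeline this correction would precede tissue assignment, so that streak voxels are no longer misassigned to bone.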

  6. Supervised retinal vessel segmentation from color fundus images based on matched filtering and AdaBoost classifier.

    PubMed

    Memari, Nogol; Ramli, Abd Rahman; Bin Saripan, M Iqbal; Mashohor, Syamsiah; Moghbel, Mehrdad

    2017-01-01

    The structure and appearance of the blood vessel network in retinal fundus images is an essential part of diagnosing various problems associated with the eyes, such as diabetes and hypertension. In this paper, an automatic retinal vessel segmentation method utilizing matched filter techniques coupled with an AdaBoost classifier is proposed. The fundus image is enhanced using morphological operations, the contrast is increased using the contrast limited adaptive histogram equalization (CLAHE) method, and the inhomogeneity is corrected using the Retinex approach. Then, the blood vessels are enhanced using a combination of B-COSFIRE and Frangi matched filters. From this preprocessed image, different statistical features are computed on a pixel-wise basis and used in an AdaBoost classifier to extract the blood vessel network inside the image. Finally, the segmented images are postprocessed to remove the misclassified pixels and regions. The proposed method was validated using the publicly accessible Digital Retinal Images for Vessel Extraction (DRIVE), Structured Analysis of the Retina (STARE) and Child Heart and Health Study in England (CHASE_DB1) datasets commonly used for determining the accuracy of retinal vessel segmentation methods. The accuracy of the proposed segmentation method was comparable to other state-of-the-art methods while being very close to the manual segmentation provided by the second human observer, with average accuracies of 0.972, 0.951 and 0.948 on the DRIVE, STARE and CHASE_DB1 datasets, respectively.
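
The matched-filtering stage can be illustrated with the classic Gaussian vessel kernel (a Chaudhuri-style sketch, not the B-COSFIRE/Frangi combination used in the paper): a zero-mean Gaussian cross-profile is rotated over a bank of orientations and the maximum response is kept. Sizes and the test image are illustrative.

```python
import numpy as np

def mf_kernel(sigma, half, angle):
    """Gaussian matched filter for a dark line: zero-mean so flat regions give
    zero response; `angle` rotates the kernel to the vessel direction."""
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    u = xx * np.cos(angle) + yy * np.sin(angle)    # across-vessel coordinate
    v = -xx * np.sin(angle) + yy * np.cos(angle)   # along-vessel coordinate
    k = -np.exp(-u ** 2 / (2 * sigma ** 2))
    k[np.abs(v) > half] = 0.0
    k[k != 0] -= k[k != 0].mean()                  # enforce zero mean
    return k

def convolve_same(img, k):
    kh, kw = k.shape
    p = np.pad(img, ((kh // 2,) * 2, (kw // 2,) * 2), mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def vessel_response(img, sigma=1.5, half=4, n_angles=8):
    # Maximum response over a bank of rotated kernels.
    return np.max([convolve_same(img, mf_kernel(sigma, half, a))
                   for a in np.linspace(0, np.pi, n_angles, endpoint=False)], axis=0)

img = np.ones((32, 32))
img[:, 16] = 0.0            # dark vertical "vessel" on a bright background
resp = vessel_response(img)
```

In the paper's pipeline, responses like these become pixel-wise features for the AdaBoost classifier rather than being thresholded directly.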

  7. Training-based descreening.

    PubMed

    Siddiqui, Hasib; Bouman, Charles A

    2007-03-01

    Conventional halftoning methods employed in electrophotographic printers tend to produce Moiré artifacts when used for printing images scanned from printed material, such as books and magazines. We present a novel approach for descreening color scanned documents aimed at providing an efficient solution to the Moiré problem in practical imaging devices, including copiers and multifunction printers. The algorithm works by combining two nonlinear image-processing techniques, resolution synthesis-based denoising (RSD), and modified smallest univalue segment assimilating nucleus (SUSAN) filtering. The RSD predictor is based on a stochastic image model whose parameters are optimized beforehand in a separate training procedure. Using the optimized parameters, RSD classifies the local window around the current pixel in the scanned image and applies filters optimized for the selected classes. The output of the RSD predictor is treated as a first-order estimate to the descreened image. The modified SUSAN filter uses the output of RSD for performing an edge-preserving smoothing on the raw scanned data and produces the final output of the descreening algorithm. Our method does not require any knowledge of the screening method, such as the screen frequency or dither matrix coefficients, that produced the printed original. The proposed scheme not only suppresses the Moiré artifacts, but, in addition, can be trained with intrinsic sharpening for deblurring scanned documents. Finally, once optimized for a periodic clustered-dot halftoning method, the same algorithm can be used to inverse halftone scanned images containing stochastic error diffusion halftone noise.

  8. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    NASA Astrophysics Data System (ADS)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic imaging devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle imaging systems, and computer peripherals for document capture. A one-chip imaging system, in which the image sensor has a full digital interface, can bring image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of mosaic pixels or wide stripes, can make the image more real and colorful; we can say that the color filter makes life more colorful. What is a color filter? A color filter blocks all of the incident light except the color with the specific wavelength and transmittance of the filter itself. The color filter process coats and patterns green, red, and blue (or cyan, magenta, and yellow) mosaic resists onto the matched pixels in the image sensing array. From the signal caught at each pixel, the image of the scene can be reconstructed. The wide use of digital electronic cameras and multimedia applications today makes color filter technology increasingly important. Although it is challenging, developing the color filter process is very worthwhile: we provide the best service in terms of shorter cycle time, excellent color quality, and high, stable yield. The key issues of the advanced color process that have to be solved and implemented are planarization and micro-lens technology. Further key points of color filter process technology that must be considered are also described in this paper.

  9. Radar image processing for rock-type discrimination

    NASA Technical Reports Server (NTRS)

    Blom, R. G.; Daily, M.

    1982-01-01

    Image processing and enhancement techniques for improving the geologic utility of digital satellite radar images are reviewed, including preprocessing techniques such as mean and variance correction on a line-by-line basis in range or azimuth to provide uniformly illuminated swaths, median-value filtering of four-look imagery to eliminate speckle, and geometric rectification using a priori elevation data. Examples are presented of the application of preprocessing methods to Seasat and Landsat data, and Seasat SAR imagery was coregistered with Landsat imagery to form composite scenes. A polynomial was developed to distort the radar picture to fit the Landsat image of a 90 x 90 km grid, using Landsat color ratios with Seasat intensities. Subsequent linear discriminant analysis was employed to discriminate rock types in known areas. Seasat additions to the Landsat data improved rock identification by 7%.

  10. Synthetic aperture tomographic phase microscopy for 3D imaging of live cells in translational motion

    PubMed Central

    Lue, Niyom; Choi, Wonshik; Popescu, Gabriel; Badizadegan, Kamran; Dasari, Ramachandra R.; Feld, Michael S.

    2009-01-01

    We present a technique for 3D imaging of live cells in translational motion without need of axial scanning of the objective lens. A set of transmitted electric field images of cells at successive points of transverse translation is taken with a focused beam illumination. Based on Huygens’ principle, angular plane waves are synthesized from E-field images of a focused beam. For a set of synthesized angular plane waves, we apply a filtered back-projection algorithm and obtain 3D maps of refractive index of live cells. This technique, which we refer to as synthetic aperture tomographic phase microscopy, can potentially be combined with flow cytometry or microfluidic devices, and will enable high throughput acquisition of quantitative refractive index data from large numbers of cells. PMID:18825263
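
The filtered back-projection step can be sketched in 2D with analytic parallel projections of a uniform disk (a scalar toy problem, not the E-field synthesis of the paper): ramp-filter each projection in the frequency domain, then smear the filtered projections back across the image grid. Grid size and angle count are illustrative.

```python
import numpy as np

N = 64                      # image grid
n_ang = 90                  # projection angles over 180 degrees
s = np.arange(N) - N / 2 + 0.5
r = 20.0                    # disk radius

# Analytic parallel projection of a centered unit-density disk:
# p(s) = 2*sqrt(r^2 - s^2); it is the same at every angle.
proj = 2 * np.sqrt(np.clip(r ** 2 - s ** 2, 0, None))

# Ram-Lak (ramp) filter applied in the frequency domain.
freqs = np.fft.fftfreq(N)
proj_f = np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)))

# Backprojection: accumulate each filtered projection across the image.
x, y = np.meshgrid(s, s)
recon = np.zeros((N, N))
for theta in np.linspace(0, np.pi, n_ang, endpoint=False):
    t = x * np.cos(theta) + y * np.sin(theta)          # detector coordinate
    idx = np.clip(np.round(t + N / 2 - 0.5).astype(int), 0, N - 1)
    recon += proj_f[idx]
recon *= np.pi / n_ang      # angular integration weight
```

The ramp filter undoes the 1/|ω| blur of plain backprojection, so the disk interior comes back near its true density while the exterior stays near zero.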

  11. Preparing Colorful Astronomical Images III: Cosmetic Cleaning

    NASA Astrophysics Data System (ADS)

    Frattare, L. M.; Levay, Z. G.

    2003-12-01

    We present cosmetic cleaning techniques for use with mainstream graphics software (Adobe Photoshop) to produce presentation-quality images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope when producing photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to discuss the treatment of various detector-attributed artifacts such as cosmic rays, chip seams, gaps, optical ghosts, diffraction spikes and the like. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to final presentation images. Other pixel-to-pixel applications such as filter smoothing and global noise reduction will be discussed.

  12. Visually enhanced CCTV digital surveillance utilizing Intranet and Internet.

    PubMed

    Ozaki, Nobuyuki

    2002-07-01

    This paper describes a solution for integrated plant supervision utilizing closed circuit television (CCTV) digital surveillance. Three basic requirements are first addressed as the platform of the system, with discussion on the suitable video compression. The system configuration is described in blocks. The system provides surveillance functionality: real-time monitoring, and process analysis functionality: a troubleshooting tool. This paper describes the formulation of practical performance design for determining various encoder parameters. It also introduces image processing techniques for enhancing the original CCTV digital image to lessen the burden on operators. Some screenshots are listed for the surveillance functionality. For the process analysis, an image searching filter supported by image processing techniques is explained with screenshots. Multimedia surveillance, which is the merger with process data surveillance, or the SCADA system, is also explained.

  13. Distant Cluster Hunting. II; A Comparison of X-Ray and Optical Cluster Detection Techniques and Catalogs from the ROSAT Optical X-Ray Survey

    NASA Technical Reports Server (NTRS)

    Donahue, Megan; Scharf, Caleb A.; Mack, Jennifer; Lee, Y. Paul; Postman, Marc; Rosait, Piero; Dickinson, Mark; Voit, G. Mark; Stocke, John T.

    2002-01-01

    We present and analyze the optical and X-ray catalogs of moderate-redshift cluster candidates from the ROSAT Optical X-Ray Survey, or ROXS. The survey covers the sky area contained in the fields of view of 23 deep archival ROSAT PSPC pointings, 4.8 square degrees. The cross-correlated cluster catalogs were constructed by comparing two independent catalogs extracted from the optical and X-ray bandpasses, using a matched-filter technique for the optical data and a wavelet technique for the X-ray data. We cross-identified cluster candidates in each catalog. As reported in Paper I, the matched-filter technique found optical counterparts for at least 60% (26 out of 43) of the X-ray cluster candidates; the estimated redshifts from the matched-filter algorithm agree with at least 7 of 11 spectroscopic confirmations (Δz ≤ 0.10). The matched-filter technique, with an imaging sensitivity of mI ~ 23, identified approximately 3 times the number of candidates (155 candidates, 142 with a detection confidence >3σ) found in the X-ray survey of nearly the same area. There are 57 X-ray candidates, 43 of which are unobscured by scattered light or bright stars in the optical images. Twenty-six of these have fairly secure optical counterparts. We find that the matched-filter algorithm, when applied to images with galaxy flux sensitivities of mI ~ 23, is fairly well matched to discovering z ≤ 1 clusters detected by wavelets in ROSAT PSPC exposures of 8000-60,000 s. The difference in the spurious fractions between the optical and X-ray catalogs (30% and 10%, respectively) cannot account for the difference in source number. In Paper I, we compared the optical and X-ray cluster luminosity functions and found that they are consistent if the relationship between X-ray and optical luminosities is steep. Here, in Paper II, we present the cluster catalogs and a numerical simulation of the ROXS.
We also present color-magnitude plots for several of the cluster candidates, and examine the prominence of the red sequence in each. We find that the X-ray clusters in our survey do not all have a prominent red sequence. We conclude that while the red sequence may be a distinct feature in the color-magnitude plots for virialized massive clusters, it may be less distinct in lower mass clusters of galaxies at even moderate redshifts. Multiple, complementary methods of selecting and defining clusters may be essential, particularly at high redshift where all methods start to run into completeness limits, incomplete understanding of physical evolution, and projection effects.

  14. Ultra-wide-band 3D microwave imaging scanner for the detection of concealed weapons

    NASA Astrophysics Data System (ADS)

    Rezgui, Nacer-Ddine; Andrews, David A.; Bowring, Nicholas J.

    2015-10-01

    The threat of concealed weapons, explosives and contraband in footwear, bags and suitcases has led to the development of new devices which can be deployed for security screening. To address known deficiencies of metal detectors and x-rays, a UWB 3D microwave imaging scanner using stepped-frequency FMCW in the K and Q bands, with a planar scanning geometry based on an x-y stage, has been developed to screen suspicious luggage and footwear. To obtain microwave images of the concealed weapons, the targets are placed above the platform, and the single transceiver horn antenna attached to the x-y stage is moved mechanically in a raster scan to create a 2D synthetic aperture array. The S11 reflection signal of the transmitted frequency sweep from the target is acquired by a VNA in synchronism with each position step. To enhance the raw data, filter out clutter and noise, and obtain the 2D and 3D microwave images of the concealed weapons or explosives, data processing techniques are applied to the acquired signals. These techniques include background subtraction, Inverse Fast Fourier Transform (IFFT), thresholding, filtering by gating and windowing, and deconvolving with the transfer function of the system using a reference target. To focus the 3D reconstructed microwave image of the target in range and across the x-y aperture without using focusing elements, 3D Synthetic Aperture Radar (SAR) techniques are applied to the post-processed data. The K and Q bands, between 15 and 40 GHz, show good transmission through clothing and dielectric materials found in luggage and footwear. A description of the system, algorithms and some results with replica guns, and a comparison of microwave images obtained by IFFT, 2D and 3D SAR techniques, are presented.
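
The core per-position processing chain (background subtraction, windowing, IFFT to the range domain) can be sketched for a single stepped-frequency S11 trace; the sweep parameters, target ranges, and amplitudes below are illustrative assumptions, not the system's values.

```python
import numpy as np

c = 3e8
f = np.linspace(15e9, 40e9, 401)     # stepped-frequency sweep across K and Q bands
df = f[1] - f[0]

R_target = 0.30                      # target range in metres (assumed)
R_clutter = 0.80                     # static clutter range (assumed)

# Simulated S11: each reflector contributes a phase ramp exp(-j*2*pi*f*2R/c)
# from the monostatic round trip.
s11 = (0.5 * np.exp(-2j * np.pi * f * 2 * R_target / c)
       + 1.0 * np.exp(-2j * np.pi * f * 2 * R_clutter / c))
background = 1.0 * np.exp(-2j * np.pi * f * 2 * R_clutter / c)  # empty-scene scan

# Background subtraction removes static clutter; the window (gating by
# apodization) suppresses range sidelobes; the IFFT yields the range profile.
sig = (s11 - background) * np.hanning(len(f))
profile = np.abs(np.fft.ifft(sig))

dR = c / (2 * len(f) * df)           # range-bin spacing
R_est = np.argmax(profile) * dR      # estimated target range
```

Repeating this for every x-y stage position gives the data cube that the 2D/3D SAR focusing then operates on.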

  15. Complex noise suppression using a sparse representation and 3D filtering of images

    NASA Astrophysics Data System (ADS)

    Kravchenko, V. F.; Ponomaryov, V. I.; Pustovoit, V. I.; Palacios-Enriquez, A.

    2017-08-01

    A novel method for the filtering of images corrupted by complex noise, composed of randomly distributed impulses and additive Gaussian noise, has been substantiated for the first time. The method consists of three main stages: the detection and filtering of pixels corrupted by impulsive noise; the subsequent image processing to suppress the additive noise, based on 3D filtering and a sparse representation of signals in a wavelet basis; and the concluding image processing procedure to clean the final image of errors that emerged at the previous stages. A physical interpretation of the filtering method under complex noise conditions is given. A filtering block diagram has been developed in accordance with the novel approach. Simulations of the novel image filtering method have shown an advantage of the proposed filtering scheme in terms of generally recognized criteria, such as the structural similarity index measure and the peak signal-to-noise ratio, and when visually comparing the filtered images.
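
Of the two evaluation criteria mentioned, PSNR is simple enough to state inline; a minimal implementation for 8-bit images (SSIM requires local statistics and is omitted here):

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference and a test image."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)

ref = np.full((16, 16), 128.0)
noisy = ref + 10.0      # uniform error of 10 grey levels
# MSE = 100, so PSNR = 10*log10(255^2 / 100) ≈ 28.13 dB
```

A filtering scheme "wins" on this criterion when its output's MSE against the clean reference is smaller, i.e. its PSNR is higher.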

  16. Full-color large-scaled computer-generated holograms for physical and non-physical objects

    NASA Astrophysics Data System (ADS)

    Matsushima, Kyoji; Tsuchiyama, Yasuhiro; Sonobe, Noriaki; Masuji, Shoya; Yamaguchi, Masahiro; Sakamoto, Yuji

    2017-05-01

    Several full-color high-definition CGHs are created for reconstructing 3D scenes that include real physical objects. The fields of the physical objects are generated or captured by employing three techniques: a 3D scanner, synthetic aperture digital holography, and multi-viewpoint images. Full-color reconstruction of high-definition CGHs is realized by RGB color filters. The optical reconstructions are presented to verify these techniques.

  17. Orbital angular momentum light in microscopy

    PubMed Central

    2017-01-01

    Light with a helical phase has had an impact on optical imaging, pushing the limits of resolution or sensitivity. Here, special emphasis will be given to classical light microscopy of phase samples and to Fourier filtering techniques with a helical phase profile, such as the spiral phase contrast technique in its many variants and areas of application. This article is part of the themed issue ‘Optical orbital angular momentum’. PMID:28069768
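
The spiral phase contrast technique mentioned above multiplies the image spectrum by a helical phase exp(iθ) in the Fourier plane, which darkens flat regions and lights up edges of amplitude or phase objects. A minimal numerical sketch (the test object and the zero-mean normalization are our assumptions):

```python
import numpy as np

def spiral_phase_filter(img):
    """Fourier filtering with a helical (vortex) phase exp(i*theta):
    returns the magnitude of the filtered field, an isotropic edge map."""
    F = np.fft.fftshift(np.fft.fft2(img))
    ky, kx = np.indices(img.shape)
    ky = ky - img.shape[0] // 2
    kx = kx - img.shape[1] // 2
    theta = np.arctan2(ky, kx)          # azimuthal angle in the Fourier plane
    out = np.fft.ifft2(np.fft.ifftshift(F * np.exp(1j * theta)))
    return np.abs(out)

img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0                 # flat square "object"
img = img - img.mean()                  # zero mean, so the undefined DC phase is moot
edge_map = spiral_phase_filter(img)
```

The filter has unit magnitude everywhere in the Fourier plane, so it redistributes phase rather than energy, which is why it suits the low-light microscopy applications discussed in the review.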

  18. Shortwave infrared hyperspectral Imaging for cotton foreign matter classification

    USDA-ARS?s Scientific Manuscript database

    Various types of cotton foreign matter seriously reduce the commercial value of cotton lint and further degrade the quality of textile products for consumers. This research was aimed to investigate the potential of a non-contact technique, i.e., liquid crystal tunable filter (LCTF) hyperspectral ima...

  19. Automatic system for radar echoes filtering based on textural features and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Hedir, Mehdia; Haddad, Boualem

    2017-10-01

    Among the most popular Artificial Intelligence (AI) techniques, the Artificial Neural Network (ANN) and the Support Vector Machine (SVM) have been retained to process Ground Echoes (GE) in meteorological radar images taken from Setif (Algeria) and Bordeaux (France), sites with different climates and topologies. To achieve this task, the AI techniques were combined with textural approaches: the Gray Level Co-occurrence Matrix (GLCM) and the Completed Local Binary Pattern (CLBP), both widely used in image analysis. The obtained results show the efficiency of texture in preserving precipitation echoes at both sites, with an accuracy of 98% at Bordeaux and 95% at Setif regardless of the AI technique used. 98% of GE are suppressed with SVM, a rate that outperforms the ANN. The CLBP approach associated with SVM eliminates 98% of GE and preserves precipitation echoes better at the Bordeaux site than at Setif, while it exhibits lower accuracy with ANN. The SVM classifier is well adapted to the proposed application, since the average filtering rate is 95-98% with texture and 92-93% with CLBP. These approaches also remove Anomalous Propagations (APs), with a best accuracy of 97.15% using texture and SVM. In fact, textural features associated with AI techniques are an efficient tool for incoherent radars to suppress spurious echoes.
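
Of the two textural approaches, the GLCM is easy to sketch from scratch: count co-occurring grey-level pairs at a fixed offset, normalize to a probability table, and derive features such as contrast. The offset, level count, and toy textures below are illustrative, not the paper's configuration.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for a single (dx, dy) offset
    (positive offsets only in this sketch), normalized to probabilities."""
    h, w = img.shape
    src = img[:h - dy, :w - dx]         # reference pixels
    dst = img[dy:, dx:]                 # their offset neighbours
    g = np.zeros((levels, levels))
    np.add.at(g, (src.ravel(), dst.ravel()), 1)
    return g / g.sum()

def glcm_contrast(g):
    # Contrast feature: sum over (i - j)^2 * P(i, j).
    i, j = np.indices(g.shape)
    return float(((i - j) ** 2 * g).sum())

flat = np.zeros((8, 8), dtype=int)                   # homogeneous texture
checker = (np.indices((8, 8)).sum(axis=0) % 2) * 7   # alternating levels 0 and 7
c_flat = glcm_contrast(glcm(flat))
c_checker = glcm_contrast(glcm(checker))             # every pair differs by 7
```

Feature vectors built from several such GLCM statistics (and offsets) are what the SVM or ANN classifier then separates into ground echo versus precipitation.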

  20. Improving the visualization of 3D ultrasound data with 3D filtering

    NASA Astrophysics Data System (ADS)

    Shamdasani, Vijay; Bae, Unmin; Managuli, Ravi; Kim, Yongmin

    2005-04-01

    3D ultrasound imaging is quickly gaining widespread clinical acceptance as a visualization tool that allows clinicians to obtain unique views not available with traditional 2D ultrasound imaging and an accurate understanding of patient anatomy. The ability to acquire, manipulate and interact with the 3D data in real time is an important feature of 3D ultrasound imaging. Volume rendering is often used to transform the 3D volume into 2D images for visualization. Unlike computed tomography (CT) and magnetic resonance imaging (MRI), volume rendering of 3D ultrasound data creates noisy images in which surfaces cannot be readily discerned due to speckles and low signal-to-noise ratio. The degrading effect of speckles is especially severe when gradient shading is performed to add depth cues to the image. Several researchers have reported that smoothing the pre-rendered volume with a 3D convolution kernel, such as 5x5x5, can significantly improve the image quality, but at the cost of decreased resolution. In this paper, we have analyzed the reasons for the improvement in image quality with 3D filtering and determined that the improvement is due to two effects. The filtering reduces speckles in the volume data, which leads to (1) more accurate gradient computation and better shading and (2) decreased noise during compositing. We have found that applying a moderate-size smoothing kernel (e.g., 7x7x7) to the volume data before gradient computation combined with some smoothing of the volume data (e.g., with a 3x3x3 lowpass filter) before compositing yielded images with good depth perception and no appreciable loss in resolution. Providing the clinician with the flexibility to control both of these effects (i.e., shading and compositing) independently could improve the visualization of the 3D ultrasound data. 
Introducing this flexibility into the ultrasound machine requires 3D filtering to be performed twice on the volume data, once before gradient computation and again before compositing. 3D filtering of an ultrasound volume containing millions of voxels requires a large amount of computation, and doing it twice decreases the number of frames that can be visualized per second. To address this, we have developed several techniques to make computation efficient. For example, we have used the moving average method to filter a 128x128x128 volume with a 3x3x3 boxcar kernel in 17 ms on a single MAP processor running at 400 MHz. The same methods reduced the computing time on a Pentium 4 running at 3 GHz from 110 ms to 62 ms. We believe that our proposed method can improve 3D ultrasound visualization without sacrificing resolution and incurring an excessive computing time.
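The moving-average trick the authors mention can be sketched in NumPy (a simplified illustration, not their MAP-processor implementation): a k×k×k boxcar kernel is separable, so one running-sum pass per axis smooths the volume at a cost independent of k.

```python
import numpy as np

def boxcar_filter(vol, k=3):
    """Separable k x k x k boxcar smoothing via running sums.

    One cumulative-sum pass per axis gives O(1) work per voxel per axis,
    the same idea as the moving-average method described in the paper.
    """
    out = vol.astype(float)
    pad = k // 2
    for axis in range(out.ndim):
        pw = [(0, 0)] * out.ndim
        pw[axis] = (pad, pad)
        padded = np.pad(out, pw, mode="edge")
        csum = np.cumsum(padded, axis=axis)
        # Prepend a zero slab so every windowed sum is a simple difference.
        zero = np.zeros_like(csum.take([0], axis=axis))
        csum = np.concatenate([zero, csum], axis=axis)
        n = out.shape[axis]
        hi = csum.take(np.arange(k, k + n), axis=axis)
        lo = csum.take(np.arange(0, n), axis=axis)
        out = (hi - lo) / k
    return out

volume = np.zeros((8, 8, 8))
volume[4, 4, 4] = 27.0            # single bright voxel
smoothed = boxcar_filter(volume)  # its 3x3x3 neighbourhood becomes 1.0
```

Applying this twice with different kernel sizes (once before gradient computation, once before compositing) reproduces the two-stage scheme described above.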

  1. Digital micromirror devices in Raman trace detection of explosives

    NASA Astrophysics Data System (ADS)

    Glimtoft, Martin; Svanqvist, Mattias; Ågren, Matilda; Nordberg, Markus; Östmark, Henric

    2016-05-01

    Imaging Raman spectroscopy based on tunable filters is an established technique for detecting single explosives particles at stand-off distances. However, large light losses are inherent in the design due to sequential imaging at different wavelengths, leading to effective transmission often well below 1%. The use of digital micromirror devices (DMD) and compressive sensing (CS) in imaging Raman explosives trace detection can improve light throughput and add significant flexibility compared to existing systems. DMDs are based on mature microelectronics technology; they are compact, scalable, and can be customized for specific tasks, including new functions not available with current technologies. This paper focuses on how a DMD can be used when applying CS-based imaging Raman spectroscopy to stand-off explosives trace detection, evaluating the performance in terms of light throughput, image reconstruction ability and potential detection limits. This type of setup also makes it possible to combine imaging Raman with non-spatially resolved fluorescence suppression techniques, such as Kerr gating. The system used consists of a second-harmonic Nd:YAG laser for sample excitation, collection optics, a DMD, a CMOS camera and a spectrometer with an ICCD camera for signal gating and detection. Initial results for compressive sensing imaging Raman show a stable reconstruction procedure even at low signal levels and in the presence of interfering background signal. The approach is also shown to give increased effective light transmission without sacrificing molecular specificity or area coverage compared to filter-based imaging Raman, while adding flexibility so that the setup can be customized for new functionality.
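The measurement model behind DMD-based compressive imaging can be sketched as follows. This toy uses Gaussian random masks and as many measurements as pixels so plain least squares suffices; a real DMD applies binary mirror patterns, and CS proper uses fewer measurements plus a sparsity prior.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random(16)            # flattened 4x4 Raman image (toy data)

# Each DMD pattern mixes scene pixels into a single detector reading.
masks = rng.normal(size=(16, 16))  # one row per mirror pattern
measurements = masks @ scene

# With a full set of measurements, least squares recovers the scene;
# compressive sensing would use fewer rows and a sparsity-promoting solver.
recovered, *_ = np.linalg.lstsq(masks, measurements, rcond=None)
```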

  2. Adaptive noise Wiener filter for scanning electron microscope imaging system.

    PubMed

    Sim, K S; Teh, V; Nia, M E

    2016-01-01

    Noise on scanning electron microscope (SEM) images is studied. Gaussian noise is the most common type of noise in SEM images. We developed a new noise reduction filter based on the Wiener filter. We compared the performance of this new filter, named the adaptive noise Wiener (ANW) filter, with four common existing filters: the average filter, median filter, Gaussian smoothing filter and the Wiener filter. Based on the experimental results, the proposed filter outperforms the other noise removal filters across different noise variances. © Wiley Periodicals, Inc.
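The Wiener baseline that such adaptive variants build on can be sketched as the classical locally adaptive (wiener2-style) filter; this is a generic illustration, not the authors' ANW design, and the `noise_var` default is the usual mean-of-local-variances heuristic.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_wiener(img, size=3, noise_var=None):
    # Locally adaptive Wiener filter: flat regions (local variance near the
    # noise level) are pulled toward the local mean, while high-variance
    # regions (edges, detail) are left largely intact.
    img = img.astype(float)
    local_mean = uniform_filter(img, size)
    local_sqr_mean = uniform_filter(img * img, size)
    local_var = local_sqr_mean - local_mean ** 2
    if noise_var is None:
        noise_var = local_var.mean()   # common global noise estimate
    gain = np.maximum(local_var - noise_var, 0.0) / (
        np.maximum(local_var, noise_var) + 1e-12)
    return local_mean + gain * (img - local_mean)

rng = np.random.default_rng(0)
noisy = 5.0 + rng.normal(0.0, 1.0, (32, 32))   # flat field + Gaussian noise
denoised = adaptive_wiener(noisy)
```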

  3. Filtering of the Radon transform to enhance linear signal features via wavelet pyramid decomposition

    NASA Astrophysics Data System (ADS)

    Meckley, John R.

    1995-09-01

    The information content in many signal processing applications can be reduced to a set of linear features in a 2D signal transform. Examples include the narrowband lines in a spectrogram, ship wakes in a synthetic aperture radar image, and blood vessels in a medical computer-aided tomography scan. The line integrals that generate the values of the projections of the Radon transform can be characterized as a bank of matched filters for linear features. This localization of energy in the Radon transform for linear features can be exploited to enhance these features and to reduce noise by filtering the Radon transform with a filter explicitly designed to pass only linear features, and then reconstructing a new 2D signal by inverting the new filtered Radon transform (i.e., via filtered backprojection). Previously used methods for filtering the Radon transform include Fourier based filtering (a 2D elliptical Gaussian linear filter) and a nonlinear filter ((Radon xfrm)**y with y >= 2.0). Both of these techniques suffer from the mismatch of the filter response to the true functional form of the Radon transform of a line. The Radon transform of a line is not a point but is a function of the Radon variables (rho, theta) and the total line energy. This mismatch leads to artifacts in the reconstructed image and a reduction in achievable processing gain. The Radon transform for a line is computed as a function of angle and offset (rho, theta) and the line length. The 2D wavelet coefficients are then compared for the Haar wavelets and the Daubechies wavelets. These filter responses are used as frequency filters for the Radon transform. The filtering is performed on the wavelet pyramid decomposition of the Radon transform by detecting the most likely positions of lines in the transform and then by convolving the local area with the appropriate response and zeroing the pyramid coefficients outside of the response area. 
The response area is defined to contain 95% of the total wavelet coefficient energy. The detection algorithm provides an estimate of the line offset, orientation, and length that is then used to index the appropriate filter shape. Additional wavelet pyramid decomposition is performed in areas of high energy to refine the line position estimate. After filtering, the new Radon transform is generated by inverting the wavelet pyramid. The Radon transform is then inverted by filtered backprojection to produce the final 2D signal estimate with the enhanced linear features. The wavelet-based method is compared to both the Fourier and the nonlinear filtering with examples of sparse and dense shapes in imaging, acoustics and medical tomography with test images of noisy concentric lines, a real spectrogram of a blow fish (a very nonstationary spectrum), and the Shepp Logan Computer Tomography phantom image. Both qualitative and derived quantitative measures demonstrate the improvement of wavelet-based filtering. Additional research is suggested based on these results. Open questions include what level(s) to use for detection and filtering because multiple-level representations exist. The lower levels are smoother at reduced spatial resolution, while the higher levels provide better response to edges. Several examples are discussed based on analytical and phenomenological arguments.
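A toy example of why Radon projections act as matched filters for lines, using axis-aligned projections only (the wavelet-pyramid filtering itself is not reproduced here):

```python
import numpy as np

# A line concentrates its energy into a single Radon bin at the matched
# angle, which is what makes filtering in the Radon domain effective.
img = np.zeros((64, 64))
img[32, :] = 1.0          # horizontal line, total energy 64
img += 0.01               # small uniform background

proj_rows = img.sum(axis=1)   # projections at theta = 90 degrees
proj_cols = img.sum(axis=0)   # projections at theta = 0 degrees

# The matched orientation produces one dominant peak at the line's offset...
peak = proj_rows.max()        # 64 * 1.01 = 64.64 at rho = 32
# ...while the mismatched orientation spreads energy over all bins.
flat = proj_cols.max()        # 1.01 + 63 * 0.01 = 1.64 in every bin
```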

  4. Automatic measurement of images on astrometric plates

    NASA Astrophysics Data System (ADS)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).

  5. A feasibility study of automatic lung nodule detection in chest digital tomosynthesis with machine learning based on support vector machine

    NASA Astrophysics Data System (ADS)

    Lee, Donghoon; Kim, Ye-seul; Choi, Sunghoon; Lee, Haenghwa; Jo, Byungdu; Choi, Seungyeon; Shin, Jungwook; Kim, Hee-Joung

    2017-03-01

    Chest digital tomosynthesis (CDT) is a recently developed imaging modality with several advantages for diagnosing lung disease. For example, CDT provides depth information at a relatively low radiation dose compared to computed tomography (CT). However, a major problem with CDT is the image artifacts associated with data incompleteness resulting from the limited-angle data acquisition of CDT geometry. For this reason, its sensitivity for lung disease has been unclear compared to CT. In this study, to improve the sensitivity of lung disease detection in CDT, we developed a computer-aided diagnosis (CAD) system based on machine learning. To design the CAD system, we used 100 cropped images of lung nodules and 100 cropped images of normal lesions acquired with a lung man phantom and a prototype CDT system. We used machine learning techniques based on a support vector machine (SVM) and Gabor filters. The Gabor filter was used for extracting characteristics of lung nodules, and we compared its feature extraction performance at various scale and orientation parameters, using 3, 4 and 5 scales and 4, 6 and 8 orientations. After feature extraction, an SVM was used for classifying the lesion features. The linear, polynomial and Gaussian kernels of the SVM were compared to determine the best SVM conditions for CDT reconstruction images. The results showed that the CAD system with machine learning is capable of automatic lung lesion detection, and detection performance was best when a Gabor filter with 5 scales and 8 orientations and an SVM with the Gaussian kernel were used. In conclusion, our CAD system improved the sensitivity of lung lesion detection in CDT, and we determined the Gabor filter and SVM conditions that achieve the highest detection performance.
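A real-valued Gabor kernel of the kind swept over scales and orientations can be generated directly; the parameter names and values below are illustrative, not the paper's.

```python
import numpy as np

def gabor_kernel(size=15, wavelength=6.0, theta=0.0, sigma=3.0, gamma=0.5):
    # Real-valued Gabor filter: a Gaussian envelope times a cosine carrier.
    # wavelength/sigma set the scale, theta the orientation, matching the
    # scale/orientation sweep described in the abstract.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

rng = np.random.default_rng(0)
patch = rng.random((15, 15))          # stand-in for a cropped nodule image
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 8, endpoint=False)]
features = [float(np.sum(k * patch)) for k in bank]  # one response per orientation
```

The feature vector (responses over all scales and orientations) is what would then be fed to the SVM classifier.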

  6. On dealing with multiple correlation peaks in PIV

    NASA Astrophysics Data System (ADS)

    Masullo, A.; Theunissen, R.

    2018-05-01

    A novel algorithm to analyse PIV images in the presence of strong in-plane displacement gradients and reduce sub-grid filtering is proposed in this paper. Interrogation windows subjected to strong in-plane displacement gradients often produce correlation maps presenting multiple peaks. Standard multi-grid procedures discard such ambiguous correlation windows using a signal to noise (SNR) filter. The proposed algorithm improves the standard multi-grid algorithm allowing the detection of splintered peaks in a correlation map through an automatic threshold, producing multiple displacement vectors for each correlation area. Vector locations are chosen by translating images according to the peak displacements and by selecting the areas with the strongest match. The method is assessed on synthetic images of a boundary layer of varying intensity and a sinusoidal displacement field of changing wavelength. An experimental case of a flow exhibiting strong velocity gradients is also provided to show the improvements brought by this technique.
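The multi-peak idea can be sketched as plain local-maximum detection above a threshold; here a fixed fraction of the tallest peak stands in for the paper's automatic threshold.

```python
import numpy as np

def correlation_peaks(corr, rel_thresh=0.5):
    # Instead of keeping only the global correlation maximum, keep every
    # strict local maximum above the cutoff and return one candidate
    # displacement (relative to the window centre) per peak.
    peaks = []
    h, w = corr.shape
    cutoff = rel_thresh * corr.max()
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = corr[i - 1:i + 2, j - 1:j + 2]
            if (corr[i, j] >= cutoff and corr[i, j] == patch.max()
                    and np.count_nonzero(patch == corr[i, j]) == 1):
                peaks.append((i - h // 2, j - w // 2))
    return peaks

corr = np.zeros((16, 16))
corr[5, 5] = 1.0           # primary peak
corr[10, 12] = 0.8         # splintered secondary peak
candidates = correlation_peaks(corr)   # two displacement candidates
```

Each candidate would then be validated by translating the image by that displacement and keeping the areas with the strongest match, as the abstract describes.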

  7. Impact induced damage assessment by means of Lamb wave image processing

    NASA Astrophysics Data System (ADS)

    Kudela, Pawel; Radzienski, Maciej; Ostachowicz, Wieslaw

    2018-03-01

    The aim of this research is an analysis of full wavefield Lamb wave interaction with impact-induced damage at various impact energies in order to find out the limitation of the wavenumber adaptive image filtering method. In other words, the relation between impact energy and damage detectability will be shown. A numerical model based on the time domain spectral element method is used for modeling of Lamb wave propagation and interaction with barely visible impact damage in a carbon-epoxy laminate. Numerical studies are followed by experimental research on the same material with an impact damage induced by various energy and also a Teflon insert simulating delamination. Wavenumber adaptive image filtering and signal processing are used for damage visualization and assessment for both numerical and experimental full wavefield data. It is shown that it is possible to visualize and assess the impact damage location, size and to some extent severity by using the proposed technique.
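Wavenumber filtering of a full wavefield snapshot amounts to masking the 2D spatial-frequency spectrum. A fixed-cutoff sketch follows (the adaptive method varies the passband locally; the cutoff here is illustrative):

```python
import numpy as np

def wavenumber_filter(frame, k_cut):
    # Mask out spatial frequencies above k_cut (cycles/sample) in the 2D
    # spectrum of one wavefield snapshot, then transform back.
    F = np.fft.fft2(frame)
    ky = np.fft.fftfreq(frame.shape[0])[:, None]
    kx = np.fft.fftfreq(frame.shape[1])[None, :]
    keep = np.sqrt(kx ** 2 + ky ** 2) <= k_cut
    return np.real(np.fft.ifft2(F * keep))

Y, X = np.mgrid[0:64, 0:64]
low = np.cos(2 * np.pi * 2 * Y / 64)    # long-wavelength incident mode (kept)
high = np.cos(2 * np.pi * 28 * X / 64)  # short-wavelength scatter (removed)
filtered = wavenumber_filter(low + high, k_cut=0.1)
```

Damage shows up where locally estimated wavenumbers exceed those of the pristine plate, which is why separating the bands makes delaminations visible.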

  8. Optics of high-performance electron microscopes*

    PubMed Central

    Rose, H H

    2008-01-01

    During recent years, the theory of charged particle optics, together with advances in fabrication tolerances and experimental techniques, has led to very significant advances in high-performance electron microscopes. Here, we describe which theoretical tools, inventions and designs have driven this development. We cover the basic theory of higher-order electron optics and of image formation in electron microscopes. This leads to a description of different methods to correct aberrations by multipole fields and to a discussion of the most advanced designs that take advantage of these techniques. The theory of electron mirrors is developed, and it is shown how this can be used to correct aberrations and to design energy filters. Finally, different types of energy filters are described. PMID:27877933

  9. Quantum image median filtering in the spatial domain

    NASA Astrophysics Data System (ADS)

    Li, Panchi; Liu, Xiande; Xiao, Hong

    2018-03-01

    Spatial filtering is one principal tool used in image processing for a broad spectrum of applications. Median filtering has become a prominent representative of spatial filtering because of its excellent performance in noise reduction. Although filtering of quantum images in the frequency domain has been described in the literature, and there is a one-to-one correspondence between linear spatial filters and filters in the frequency domain, median filtering is a nonlinear process that cannot be achieved in the frequency domain. We therefore investigated the spatial filtering of quantum images, focusing on the design of the quantum median filter and its applications in image de-noising. To this end, we first presented the quantum circuits for three basic modules (i.e., Cycle Shift, Comparator, and Swap), and then designed two composite modules (i.e., Sort and Median Calculation). We next constructed a complete quantum circuit that implements the median filtering task and present the results of several simulation experiments on grayscale images with different noise patterns. Although the experimental results show that the proposed scheme has almost the same noise suppression capacity as its classical counterpart, the complexity analysis shows that it reduces the computational complexity of the classical median filter from an exponential function of the image size n to a second-order polynomial function of n, so that the classical method can be sped up.
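The classical counterpart referred to here is the ordinary spatial median filter, e.g. over a 3×3 window:

```python
import numpy as np

def median_filter3(img):
    # Classical 3x3 spatial median filter: each pixel is replaced by the
    # median of its 3x3 neighbourhood (edge-padded at the borders).
    padded = np.pad(img, 1, mode="edge")
    windows = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)]
    return np.median(np.stack(windows), axis=0)

noisy = np.full((8, 8), 7.0)
noisy[3, 5] = 255.0                  # isolated impulse ("salt" pixel)
clean = median_filter3(noisy)        # impulse replaced by the local median
```

The quantum scheme implements exactly this neighbourhood-sort-and-select operation with reversible circuits, which is where the claimed speedup arises.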

  10. Fuzzy Logic-Based Filter for Removing Additive and Impulsive Noise from Color Images

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhong; Li, Hongyang; Jiang, Huageng

    2017-12-01

    This paper presents an efficient filter method based on fuzzy logic for adaptively removing additive and impulsive noise from color images. The proposed filter comprises two parts: noise detection and noise removal filtering. In the detection part, the fuzzy peer group concept is applied to determine what type of noise has been added to each pixel of the corrupted image. In the filtering part, impulse noise is removed by the vector median filter in the CIELAB color space, and an optimal fuzzy filter is introduced to reduce the Gaussian noise; together they remove mixed Gaussian-impulse noise from color images. Experimental results on several color images prove the efficacy of the proposed fuzzy filter.
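The detect-then-filter structure can be sketched in grayscale; the fuzzy peer-group detector, the CIELAB vector median and the fuzzy Gaussian stage are simplified away here, and the threshold is our own stand-in.

```python
import numpy as np

def switching_filter(img, impulse_thresh=50.0):
    # Two-stage sketch: a detector flags pixels that deviate strongly from
    # their local median (likely impulses), and only those are replaced by
    # the median; other pixels pass through (the Gaussian-noise stage of
    # the paper is omitted from this sketch).
    padded = np.pad(img, 1, mode="edge")
    windows = np.stack([padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                        for dy in range(3) for dx in range(3)])
    med = np.median(windows, axis=0)
    is_impulse = np.abs(img - med) > impulse_thresh   # detection stage
    return np.where(is_impulse, med, img)             # removal stage

img = np.full((8, 8), 10.0)
img[2, 2] = 200.0                    # impulse-corrupted pixel
restored = switching_filter(img)
```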

  11. Novel and general approach to linear filter design for contrast-to-noise ratio enhancement of magnetic resonance images with multiple interfering features in the scene

    NASA Astrophysics Data System (ADS)

    Soltanian-Zadeh, Hamid; Windham, Joe P.

    1992-04-01

    Maximizing the minimum absolute contrast-to-noise ratio (CNR) between a desired feature and multiple interfering processes, by linear combination of images in a magnetic resonance imaging (MRI) scene sequence, is attractive for MRI analysis and interpretation. A general formulation of the problem is presented, along with a novel solution utilizing the simple and numerically stable method of Gram-Schmidt orthogonalization. We derive explicit solutions first for the case of two interfering features, then for three interfering features, and, finally, using a typical example, for an arbitrary number of interfering features. For the case of two interfering features, we also provide simplified analytical expressions for the signal-to-noise ratios (SNRs) and CNRs of the filtered images. The technique is demonstrated through its applications to simulated and acquired MRI scene sequences of a human brain with a cerebral infarction. For these applications, a 50 to 100% improvement in the smallest absolute CNR is obtained.
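The Gram-Schmidt construction can be sketched directly: orthogonalize the desired feature's signature (its intensity across the image sequence) against the interferers' signatures, giving combination weights that null the interfering features. The toy signatures below are invented for illustration.

```python
import numpy as np

def cnr_filter_weights(desired, interferers):
    # Build an orthonormal basis for the interferers' signatures, then
    # project them out of the desired signature. Combining the sequence
    # images with the resulting weights cancels the interfering features
    # while retaining a component of the desired one.
    basis = []
    for v in interferers:
        w = v.astype(float)
        for b in basis:
            w = w - (w @ b) * b
        norm = np.linalg.norm(w)
        if norm > 1e-12:
            basis.append(w / norm)
    w = desired.astype(float)
    for b in basis:
        w = w - (w @ b) * b
    return w

desired = np.array([1.0, 2.0, 0.5])               # signature over 3 MR images
interf = [np.array([1.0, 0.0, 0.0]),
          np.array([0.0, 1.0, 1.0])]
weights = cnr_filter_weights(desired, interf)     # nulls both interferers
```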

  12. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collette, R.; King, J.; Keiser, Jr., D.

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person to person or sample to sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.
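The Sauvola threshold mentioned above has a compact closed form, T = m·(1 + k·(s/R − 1)), with local mean m and local standard deviation s. A sketch with conventional parameter values follows (the window size, k and R here are ours, not the study's):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sauvola_threshold(img, size=15, k=0.2, R=128.0):
    # Sauvola's adaptive threshold: the local mean m dominates in stable
    # contrast regions, while the local standard deviation s adapts the
    # threshold where contrast varies.
    img = img.astype(float)
    m = uniform_filter(img, size)
    s = np.sqrt(np.maximum(uniform_filter(img * img, size) - m ** 2, 0.0))
    return m * (1.0 + k * (s / R - 1.0))

img = np.full((32, 32), 200.0)
img[10:20, 10:20] = 10.0              # dark "void" on a bright matrix
voids = img < sauvola_threshold(img)  # True inside the void region
```

Counting connected components of `voids` and measuring their areas would then yield the void count, mean size, and porosity reported by the routine.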

  13. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE PAGES

    Collette, R.; King, J.; Keiser, Jr., D.; ...

    2016-06-08

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person to person or sample to sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.

  14. Intraoperative laser speckle contrast imaging with retrospective motion correction for quantitative assessment of cerebral blood flow

    PubMed Central

    Richards, Lisa M.; Towle, Erica L.; Fox, Douglas J.; Dunn, Andrew K.

    2014-01-01

    Although multiple intraoperative cerebral blood flow (CBF) monitoring techniques are currently available, a quantitative method that allows for continuous monitoring and that can be easily integrated into the surgical workflow is still needed. Laser speckle contrast imaging (LSCI) is an optical imaging technique with a high spatiotemporal resolution that has been recently demonstrated as feasible and effective for intraoperative monitoring of CBF during neurosurgical procedures. This study demonstrates the impact of retrospective motion correction on the quantitative analysis of intraoperatively acquired LSCI images. LSCI images were acquired through a surgical microscope during brain tumor resection procedures from 10 patients under baseline conditions and after a cortical stimulation in three of those patients. The patient’s electrocardiogram (ECG) was recorded during acquisition for postprocess correction of pulsatile artifacts. Automatic image registration was retrospectively performed to correct for tissue motion artifacts, and the performance of rigid and nonrigid transformations was compared. In baseline cases, the original images had 25%±27% noise across 16 regions of interest (ROIs). ECG filtering moderately reduced the noise to 20%±21%, while image registration resulted in a further noise reduction of 15%±4%. Combined ECG filtering and image registration significantly reduced the noise to 6.2%±2.6% (p<0.05). Using the combined motion correction, accuracy and sensitivity to small changes in CBF were improved in cortical stimulation cases. There was also excellent agreement between rigid and nonrigid registration methods (15/16 ROIs with <3% difference). Results from this study demonstrate the importance of motion correction for improved visualization of CBF changes in clinical LSCI images. PMID:26157974
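The quantity LSCI maps is the local speckle contrast K = σ/μ, which drops where moving scatterers (flowing blood) blur the speckle pattern during the exposure. A minimal sketch (the window size is illustrative):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, size=7):
    # Core LSCI computation: contrast K = sigma/mean over a sliding window.
    # High K = static tissue (fully developed speckle); low K = flow.
    raw = raw.astype(float)
    mean = uniform_filter(raw, size)
    sqr_mean = uniform_filter(raw * raw, size)
    var = np.maximum(sqr_mean - mean ** 2, 0.0)
    return np.sqrt(var) / np.maximum(mean, 1e-12)

flow_free = np.full((16, 16), 100.0)
K_static = speckle_contrast(flow_free)           # ~0: no fluctuation at all

rng = np.random.default_rng(0)
speckle = 1.0 + 0.5 * (rng.random((32, 32)) - 0.5)  # toy speckle pattern
K = speckle_contrast(speckle)                        # clearly nonzero
```

The motion-correction steps in the study (ECG filtering, image registration) matter precisely because K is a local statistic: tissue motion inflates σ and mimics flow changes.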

  15. Specialized Color Targets for Spectral Reflectance Reconstruction of Magnified Images

    NASA Astrophysics Data System (ADS)

    Kruschwitz, Jennifer D. T.

    Digital images are used almost exclusively instead of film to capture visual information across many scientific fields. The colorimetric color representation within these digital images can be relayed from the digital counts produced by the camera with the use of a known color target. In image capture of magnified images, there is currently no reliable color target that can be used at multiple magnifications and give the user a solid understanding of the color ground truth within those images. The first part of this dissertation included the design, fabrication, and testing of a color target produced with optical interference coated microlenses for use in an off-axis illumination, compound microscope. An ideal target was designed to increase the color gamut for colorimetric imaging and provide the necessary "Block Dye" spectral reflectance profiles across the visible spectrum to reduce the number of color patches necessary for multiple filter imaging systems that rely on statistical models for spectral reflectance reconstruction. There are other scientific disciplines that can benefit from a specialized color target to determine the color ground truth in their magnified images and perform spectral estimation. Not every discipline has the luxury of having a multi-filter imaging system. The second part of this dissertation developed two unique ways of using an interference coated color mirror target: one that relies on multiple light-source angles, and one that leverages a dynamic color change with time. The source multi-angle technique would be used for the microelectronic discipline where the reconstructed spectral reflectance would be used to determine a dielectric film thickness on a silicon substrate, and the time varying technique would be used for a biomedical example to determine the thickness of human tear film.

  16. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamezawa, H; Fujimoto General Hospital, Miyakonojo, Miyazaki; Arimura, H

    Purpose: To investigate the possibility of exposure dose reduction of the cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference-dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw values for LD-CBCT images processed by the noise suppression filters were measured at the same residual error as obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly the adaptive partial median filter, makes it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.

  17. New Studies of Polar Spicules

    NASA Astrophysics Data System (ADS)

    Zirin, H.; Cameron, R.

    1998-05-01

    We have studied several hundred images of solar spicules obtained on June 18 and 19 and July 15 of 1997. The observations were made at BBSO with the 65 cm telescope feeding a Zeiss 1/4 Å filter and a 1536x1024 Kodak CCD. Overexposed observations were made above the limb as well as normal exposures on the limb. The filter was tuned to Hα −0.65 Å and a 30 s interval was used. We were limited to a single wavelength because new software was being installed in a new control computer. The images obtained were processed by high-pass digital filtering of the original FITS images and re-registered by an FFT technique. The image scale is 0.17 arcsec per pixel. The disk was observed on June 18, 1997 to detect the sources of macrospicules, and the limb was observed by overexposure on June 19 to determine the height trajectory of the faintest Hα features. We found that: many more spicules go up than come down; there are numerous double and multiple spicules; the macrospicules come from normal network elements and start with an "Eiffel tower" shape; there is evidence of magnetic changes underlying these features; both long macrospicules and complex eruptions are important at the pole. There is some evidence for rotation in spicules.

  18. Design and Ground Calibration of the Helioseismic and Magnetic Imager (HMI) Instrument on the Solar Dynamics Observatory (SDO)

    NASA Technical Reports Server (NTRS)

    Schou, J.; Scherrer, P. H.; Bush, R. I.; Wachter, R.; Couvidat, S.; Rabello-Soares, M. C.; Bogart, R. S.; Hoeksema, J. T.; Liu, Y.; Duvall, T. L., Jr.; hide

    2012-01-01

    The Helioseismic and Magnetic Imager (HMI) investigation will study the solar interior using helioseismic techniques as well as the magnetic field near the solar surface. The HMI instrument is part of the Solar Dynamics Observatory (SDO) that was launched on 11 February 2010. The instrument is designed to measure the Doppler shift, intensity, and vector magnetic field at the solar photosphere using the 6173 Å Fe I absorption line. The instrument consists of a front-window filter, a telescope, a set of wave plates for polarimetry, an image-stabilization system, a blocking filter, a five-stage Lyot filter with one tunable element, two wide-field tunable Michelson interferometers, a pair of 4096² pixel cameras with independent shutters, and associated electronics. Each camera takes a full-disk image roughly every 3.75 seconds, giving an overall cadence of 45 seconds for the Doppler, intensity, and line-of-sight magnetic-field measurements and a slower cadence for the full vector magnetic field. This article describes the design of the HMI instrument and provides an overview of the pre-launch calibration efforts. Overviews of the investigation, details of the calibrations, data handling, and the science analysis are provided in accompanying articles.

  19. Reading the lines in the face: The contribution of angularity and roundness to perceptions of facial anger and joy.

    PubMed

    Franklin, Robert G; Adams, Reginald B; Steiner, Troy G; Zebrowitz, Leslie A

    2018-05-14

    Through 3 studies, we investigated whether angularity and roundness present in faces contribute to the perception of angry and joyful expressions, respectively. First, in Study 1 we found that angry expressions naturally contain more inward-pointing lines, whereas joyful expressions contain more outward-pointing lines. Then, using image-processing techniques in Studies 2 and 3, we filtered images to contain only inward-pointing or outward-pointing lines as a way to approximate angularity and roundness. We found that filtering images to be more angular increased how threatening and angry a neutral face was rated, increased how intense angry expressions were rated, and enhanced the recognition of anger. Conversely, filtering images to be rounder increased how warm and joyful a neutral face was rated, increased the intensity of joyful expressions, and enhanced recognition of joy. Together these findings show that angularity and roundness play a direct role in the recognition of angry and joyful expressions. Given evidence that angularity and roundness may play a biological role in indicating threat and safety in the environment, this suggests that angularity and roundness represent primitive facial cues used to signal threat-anger and warmth-joy pairings. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  20. Adaptive marginal median filter for colour images.

    PubMed

    Morillas, Samuel; Gregori, Valentín; Sapena, Almanzor

    2011-01-01

    This paper describes a new filter for impulse noise reduction in colour images which is aimed at improving the noise reduction capability of the classical vector median filter. The filter is inspired by the application of a vector marginal median filtering process over a selected group of pixels in each filtering window. This selection, which is based on the vector median, along with the application of the marginal median operation constitutes an adaptive process that leads to a more robust filter design. Also, the proposed method is able to process colour images without introducing colour artifacts. Experimental results show that the images filtered with the proposed method contain less noisy pixels than those obtained through the vector median filter.
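The two medians being combined can be stated compactly; this is a sketch of the standard definitions, not the paper's adaptive selection rule.

```python
import numpy as np

def vector_median(window):
    # Vector median: the pixel minimizing the summed distance to all other
    # pixels in the window. It always returns a colour actually present in
    # the window, so it introduces no colour artifacts.
    window = np.asarray(window, dtype=float)   # shape (n_pixels, 3)
    dists = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
    return window[dists.sum(axis=1).argmin()]

def marginal_median(window):
    # Marginal median: channel-wise medians. It may create a colour not in
    # the window, which is why the paper applies it only to a subgroup of
    # pixels pre-selected via the vector median.
    return np.median(np.asarray(window, dtype=float), axis=0)

w1 = [[10, 10, 10]] * 5 + [[255, 0, 0]]     # one impulse among five pixels
w2 = [[255, 0, 0], [0, 255, 0], [0, 0, 255]]
```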

  1. Supervoxels for graph cuts-based deformable image registration using guided image filtering

    NASA Astrophysics Data System (ADS)

    Szmul, Adam; Papież, Bartłomiej W.; Hallack, Andre; Grau, Vicente; Schnabel, Julia A.

    2017-11-01

    We propose combining a supervoxel-based image representation with the concept of graph cuts as an efficient optimization technique for three-dimensional (3-D) deformable image registration. Due to the pixels/voxels-wise graph construction, the use of graph cuts in this context has been mainly limited to two-dimensional (2-D) applications. However, our work overcomes some of the previous limitations by posing the problem on a graph created by adjacent supervoxels, where the number of nodes in the graph is reduced from the number of voxels to the number of supervoxels. We demonstrate how a supervoxel image representation combined with graph cuts-based optimization can be applied to 3-D data. We further show that the application of a relaxed graph representation of the image, followed by guided image filtering over the estimated deformation field, allows us to model "sliding motion." Applying this method to lung image registration results in highly accurate image registration and anatomically plausible estimations of the deformations. Evaluation of our method on a publicly available computed tomography lung image dataset leads to the observation that our approach compares very favorably with state of the art methods in continuous and discrete image registration, achieving target registration error of 1.16 mm on average per landmark.

  2. Supervoxels for Graph Cuts-Based Deformable Image Registration Using Guided Image Filtering.

    PubMed

    Szmul, Adam; Papież, Bartłomiej W; Hallack, Andre; Grau, Vicente; Schnabel, Julia A

    2017-10-04

    In this work we propose to combine a supervoxel-based image representation with the concept of graph cuts as an efficient optimization technique for 3D deformable image registration. Due to the pixel/voxel-wise graph construction, the use of graph cuts in this context has been mainly limited to 2D applications. However, our work overcomes some of the previous limitations by posing the problem on a graph created by adjacent supervoxels, where the number of nodes in the graph is reduced from the number of voxels to the number of supervoxels. We demonstrate how a supervoxel image representation, combined with graph cuts-based optimization, can be applied to 3D data. We further show that the application of a relaxed graph representation of the image, followed by guided image filtering over the estimated deformation field, allows us to model 'sliding motion'. Applying this method to lung image registration results in highly accurate image registration and anatomically plausible estimations of the deformations. Evaluation of our method on a publicly available computed tomography lung image dataset (www.dir-lab.com) leads to the observation that our new approach compares very favorably with state-of-the-art continuous and discrete image registration methods, achieving a target registration error of 1.16 mm on average per landmark.

  3. Supervoxels for Graph Cuts-Based Deformable Image Registration Using Guided Image Filtering

    PubMed Central

    Szmul, Adam; Papież, Bartłomiej W.; Hallack, Andre; Grau, Vicente; Schnabel, Julia A.

    2017-01-01

    In this work we propose to combine a supervoxel-based image representation with the concept of graph cuts as an efficient optimization technique for 3D deformable image registration. Due to the pixel/voxel-wise graph construction, the use of graph cuts in this context has been mainly limited to 2D applications. However, our work overcomes some of the previous limitations by posing the problem on a graph created by adjacent supervoxels, where the number of nodes in the graph is reduced from the number of voxels to the number of supervoxels. We demonstrate how a supervoxel image representation, combined with graph cuts-based optimization, can be applied to 3D data. We further show that the application of a relaxed graph representation of the image, followed by guided image filtering over the estimated deformation field, allows us to model ‘sliding motion’. Applying this method to lung image registration results in highly accurate image registration and anatomically plausible estimations of the deformations. Evaluation of our method on a publicly available computed tomography lung image dataset (www.dir-lab.com) leads to the observation that our new approach compares very favorably with state-of-the-art continuous and discrete image registration methods, achieving a target registration error of 1.16 mm on average per landmark. PMID:29225433
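
    The guided image filtering step that the three records above share can be sketched with the standard single-channel guided filter in its box-filter formulation; the function and parameter names below are illustrative, not the authors' code. In the paper's setting, `src` would be one component of the estimated deformation field and `guide` the (relaxed) image, so that smoothing stops at organ boundaries and sliding motion is preserved:

```python
import numpy as np

def box_mean(a, r):
    """Mean over a (2r+1) x (2r+1) window with edge padding."""
    k = 2 * r + 1
    ap = np.pad(a, r, mode='edge')
    win = np.lib.stride_tricks.sliding_window_view(ap, (k, k))
    return win.mean(axis=(-2, -1))

def guided_filter(guide, src, r=2, eps=1e-2):
    """Edge-preserving smoothing of `src`, steered by `guide`.

    Standard single-channel guided filter: fit a local linear model
    q = a * guide + b in each window, then average the coefficients.
    """
    mean_I = box_mean(guide, r)
    mean_p = box_mean(src, r)
    cov_Ip = box_mean(guide * src, r) - mean_I * mean_p
    var_I = box_mean(guide * guide, r) - mean_I ** 2
    a = cov_Ip / (var_I + eps)           # per-window slope
    b = mean_p - a * mean_I              # per-window offset
    return box_mean(a, r) * guide + box_mean(b, r)
```

    A constant input passes through unchanged, while discontinuities in the guide (here, organ boundaries) are preserved in the filtered field.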

  4. SIR-B ocean-wave enhancement with fast Fourier transform techniques

    NASA Technical Reports Server (NTRS)

    Tilley, David G.

    1987-01-01

    Shuttle Imaging Radar (SIR-B) imagery is Fourier filtered to remove the estimated system-transfer function, reduce speckle noise, and produce ocean scenes with a gray scale that is proportional to wave height. The SIR-B system response to speckled scenes of uniform surfaces yields an estimate of the stationary wavenumber response of the imaging radar, modeled by the 15 even terms of an eighth-order two-dimensional polynomial. Speckle can also be used to estimate the dynamic wavenumber response of the system due to surface motion during the aperture synthesis period, modeled with a single adaptive parameter describing an exponential correlation along track. A Fourier filter can then be devised to correct for the wavenumber response of the remote sensor and scene correlation, with subsequent subtraction of an estimate of the speckle noise component. A linearized velocity bunching model, combined with a surface tilt and hydrodynamic model, is incorporated in the Fourier filter to derive estimates of wave height from the radar intensities corresponding to individual picture elements.
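
    The record's system-response correction can be illustrated with a generic Wiener-type Fourier filter (shown in 1-D for brevity; the SIR-B work models the transfer function with a 2-D polynomial and an adaptive along-track term, which is not reproduced here):

```python
import numpy as np

def wiener_deconvolve(signal, h, noise_power=1e-2):
    """Invert a known system transfer function in the Fourier domain.

    Generic Wiener-type correction: divide by the transfer function
    where it is strong, attenuate where noise would dominate.
    """
    n = len(signal)
    H = np.fft.fft(h, n)
    S = np.fft.fft(signal)
    G = np.conj(H) / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft(S * G))
```

    Raising `noise_power` trades sharpness for noise suppression, the same trade-off the SIR-B filter makes before subtracting the speckle estimate.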

  5. Single-energy pediatric chest computed tomography with spectral filtration at 100 kVp: effects on radiation parameters and image quality.

    PubMed

    Bodelle, Boris; Fischbach, Constanze; Booz, Christian; Yel, Ibrahim; Frellesen, Claudia; Kaup, Moritz; Beeres, Martin; Vogl, Thomas J; Scholtz, Jan-Erik

    2017-06-01

    Most of the applied radiation dose at CT is in the lower photon energy range, which is of limited diagnostic importance. To investigate image quality and effects on radiation parameters of 100-kVp spectral filtration single-energy chest CT using a tin-filter at third-generation dual-source CT in comparison to standard 100-kVp chest CT. Thirty-three children referred for a non-contrast chest CT performed on a third-generation dual-source CT scanner were examined at 100 kVp with a dedicated tin filter with a tube current-time product resulting in standard protocol dose. We compared resulting images with images from children examined using standard single-source chest CT at 100 kVp. We assessed objective and subjective image quality and compared radiation dose parameters. Radiation dose was comparable for children 5 years old and younger, and it was moderately decreased for older children when using spectral filtration (P=0.006). Effective tube current increased significantly (P=0.0001) with spectral filtration, up to a factor of 10. Signal-to-noise ratio and image noise were similar for both examination techniques (P≥0.06). Subjective image quality showed no significant differences (P≥0.2). Using 100-kVp spectral filtration chest CT in children by means of a tube-based tin-filter on a third-generation dual-source CT scanner increases effective tube current up to a factor of 10 to provide similar image quality at equivalent dose compared to standard single-source CT without spectral filtration.

  6. Laser-induced fluorescence imaging of bacteria

    NASA Astrophysics Data System (ADS)

    Hilton, Peter J.

    1998-12-01

    This paper outlines a method for optically detecting bacteria on various backgrounds, such as meat, by imaging their laser induced auto-fluorescence response. This method can potentially operate in real-time, which is many times faster than current bacterial detection methods, which require culturing of bacterial samples. This paper describes the imaging technique employed whereby a laser spot is scanned across an object while capturing, filtering, and digitizing the returned light. Preliminary results of the bacterial auto-fluorescence are reported and plans for future research are discussed. The results to date are encouraging with six of the eight bacterial strains investigated exhibiting auto-fluorescence when excited at 488 nm. Discrimination of these bacterial strains against red meat is shown and techniques for reducing background fluorescence discussed.

  7. The magic of image processing

    NASA Astrophysics Data System (ADS)

    Sulentic, Jack W.; Lorre, Jean J.

    1984-05-01

    Digital technology has been used to improve enhancement techniques in astronomical image processing. Continuous tone variations in photographs are assigned density number (DN) values which are arranged in an array. DN locations are processed by computer and turned into pixels which form a reconstruction of the original scene on a television monitor. Digitized data can be manipulated to enhance contrast and filter out gross patterns of light and dark which obscure small scale features. Separate black and white frames exposed at different wavelengths can be digitized and processed individually, then recombined to produce a final image in color. Several examples of the use of the technique are provided, including photographs of spiral galaxy M33; four galaxies in Coma Berenices (NGC 4169, 4173, 4174, and 4175); and Stephan's Quintet.

  8. Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects.

    PubMed

    Matsushima, Kyoji; Sonobe, Noriaki

    2018-01-01

    Digitized holography techniques are used to reconstruct three-dimensional (3D) images of physical objects using large-scale computer-generated holograms (CGHs). The object field is captured at three wavelengths over a wide area at high densities. Synthetic aperture techniques using single sensors are used for image capture in phase-shifting digital holography. The captured object field is incorporated into a virtual 3D scene that includes nonphysical objects, e.g., polygon-meshed CG models. The synthetic object field is optically reconstructed as a large-scale full-color CGH using red-green-blue color filters. The CGH has a wide full-parallax viewing zone and reconstructs a deep 3D scene with natural motion parallax.

  9. Photon counting x-ray imaging with K-edge filtered x-rays: A simulation study.

    PubMed

    Atak, Haluk; Shikhaliev, Polad M

    2016-03-01

    In photon counting (PC) x-ray imaging and computed tomography (CT), the broad x-ray spectrum can be split into two parts using an x-ray filter with appropriate K-edge energy, which can improve material decomposition. A recent experimental study demonstrated substantial improvement in material decomposition with PC CT when K-edge filtered x-rays were used. The purpose of the current work was to conduct further investigations of the K-edge filtration method using comprehensive simulation studies. The study addressed the following aspects: (1) optimization of the K-edge filter for a particular imaging configuration, (2) effects of the K-edge filter parameters on material decomposition, (3) trade-off between the energy bin separation, tube load, and beam quality with a K-edge filter, (4) image quality of general (unsubtracted) images when a K-edge filter is used to improve dual energy (DE) subtracted images, and (5) improvements with K-edge filtered x-rays when the PC detector has limited energy resolution. PC x-ray images of soft tissue phantoms with 15 and 30 cm thicknesses, including iodine, CaCO3, and soft tissue contrast materials, were simulated. The signal to noise ratio (SNR) of the contrast elements was determined in general and material-decomposed images using K-edge filters with different atomic numbers and thicknesses. The effect of the filter atomic number and filter thickness on energy separation factor and SNR was determined. The boundary conditions for the tube load and half-value layer were determined when the K-edge filters are used. The material-decomposed images were also simulated using a PC detector with limited energy resolution, and improvements with K-edge filtered x-rays were quantified. K-edge filters with atomic numbers from 56 to 71 and K-edge energies of 37.4-63.4 keV, respectively, can be used for tube voltages from 60 to 150 kVp.
For a particular tube voltage of 120 kVp, Gd and Ho were the optimal filter materials to achieve the highest SNR. For a particular K-edge filter of Gd and tube voltage of 120 kVp, a filter thickness of 0.6 mm provided maximum SNR for the considered imaging applications. While K-edge filtration improved the SNR of CaCO3 and iodine by 41% and 36%, respectively, in DE subtracted images, it did not degrade SNR in general images. For x-ray imaging with a nonideal PC detector, the positive effect of the K-edge filter increased as the FWHM energy resolution was degraded, with maximum improvement at 60% FWHM. This study has shown that K-edge filtered x-rays can provide substantial improvements in material selective PC x-ray and CT imaging for nearly all imaging applications using 60-150 kVp tube voltages. Potential limitations such as tube load, beam hardening, and availability of filter material were shown not to be critical.
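
    How a K-edge filter carves a broad spectrum into two quasi-distinct energy bins can be illustrated with a toy attenuation model. The E^-3 falloff and the jump factor of 6 below are assumptions chosen for illustration, not tabulated attenuation data:

```python
import numpy as np

def kedge_transmission(energies_keV, k_edge=50.2, thickness=1.0):
    """Toy filter transmission with a K-edge jump (illustrative only).

    Real designs use tabulated attenuation coefficients; here mu falls
    off smoothly like E^-3 below the edge and jumps up by an assumed
    factor above it, which is enough to show the spectrum splitting.
    """
    E = np.asarray(energies_keV, dtype=float)
    mu = (E / 40.0) ** -3.0                   # photoelectric-like falloff
    mu = np.where(E >= k_edge, 6.0 * mu, mu)  # K-edge jump (assumed factor)
    return np.exp(-mu * thickness)
```

    Transmission drops sharply just above the edge and recovers at high energies, leaving a low-energy and a high-energy lobe on either side of a notch at the K-edge.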

  10. Geological mapping potential of computer-enhanced images from the Shuttle Imaging Radar - Lisbon Valley Anticline, Utah

    NASA Technical Reports Server (NTRS)

    Curlis, J. D.; Frost, V. S.; Dellwig, L. F.

    1986-01-01

    Computer-enhancement techniques applied to the SIR-A data from the Lisbon Valley area in the northern portion of the Paradox basin increased the value of the imagery in the development of geologically useful maps. The enhancement techniques include filtering to remove image speckle from the SIR-A data and combining these data with Landsat multispectral scanner data. A method well-suited for the combination of the data sets utilized a three-dimensional domain defined by intensity-hue-saturation (IHS) coordinates. Such a system allows the Landsat data to modulate image intensity, while the SIR-A data control image hue and saturation. Whereas the addition of Landsat data to the SIR-A image by means of a pixel-by-pixel ratio accentuated textural variations within the image, the addition of color to the combined images enabled isolation of areas in which gray-tone contrast was minimal. This isolation resulted in a more precise definition of stratigraphic units.
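
    The IHS combination described above can be sketched per pixel with Python's standard colorsys module, treating HSV as a stand-in for the intensity-hue-saturation coordinates the authors use (a simplification; names are illustrative):

```python
import colorsys

def ihs_fuse(radar_rgb, landsat_intensity):
    """Fuse one pixel: hue and saturation from the colour-coded radar,
    intensity from Landsat, per the IHS scheme the record describes."""
    h, s, _ = colorsys.rgb_to_hsv(*radar_rgb)
    return colorsys.hsv_to_rgb(h, s, landsat_intensity)
```

    Mapping Landsat to the intensity channel while the radar controls hue and saturation is what lets colour mark radar units even where the gray-tone contrast is minimal.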

  11. Quantitative coronary angiography using image recovery techniques for background estimation in unsubtracted images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Jerry T.; Kamyar, Farzad; Molloi, Sabee

    2007-10-15

    Densitometry measurements have been performed previously using subtracted images. However, digital subtraction angiography (DSA) in coronary angiography is highly susceptible to misregistration artifacts due to the temporal separation of background and target images. Misregistration artifacts due to respiration and patient motion occur frequently, and organ motion is unavoidable. Quantitative densitometric techniques would be more clinically feasible if they could be implemented using unsubtracted images. The goal of this study is to evaluate image recovery techniques for densitometry measurements using unsubtracted images. A humanoid phantom and eight swine (25-35 kg) were used to evaluate the accuracy and precision of the following image recovery techniques: local averaging (LA), morphological filtering (MF), linear interpolation (LI), and curvature-driven diffusion image inpainting (CDD). Images of iodinated vessel phantoms placed over the heart of the humanoid phantom or swine were acquired. In addition, coronary angiograms were obtained after power injections of a nonionic iodinated contrast solution in an in vivo swine study. Background signals were estimated and removed with LA, MF, LI, and CDD. Iodine masses in the vessel phantoms were quantified and compared to known amounts. Moreover, the total iodine in left anterior descending arteries was measured and compared with DSA measurements. In the humanoid phantom study, the average root mean square errors associated with quantifying iodine mass using LA and MF were approximately 6% and 9%, respectively. The corresponding average root mean square errors associated with quantifying iodine mass using LI and CDD were both approximately 3%. In the in vivo swine study, the root mean square errors associated with quantifying iodine in the vessel phantoms with LA and MF were approximately 5% and 12%, respectively. The corresponding average root mean square errors using LI and CDD were both 3%.
The standard deviations in the differences between measured iodine mass in left anterior descending arteries using DSA and LA, MF, LI, or CDD were calculated. The standard deviations in the DSA-LA and DSA-MF differences (both approximately 21 mg) were approximately a factor of 3 greater than those of the DSA-LI and DSA-CDD differences (both approximately 7 mg). Local averaging and morphological filtering were considered inadequate for use in quantitative densitometry. Linear interpolation and curvature-driven diffusion image inpainting were found to be effective techniques for use with densitometry in quantifying iodine mass in vitro and in vivo. They can be used with unsubtracted images to estimate background anatomical signals and obtain accurate densitometry results. The high level of accuracy and precision in quantification associated with using LI and CDD suggests the potential of these techniques in applications where background mask images are difficult to obtain, such as lumen volume and blood flow quantification using coronary arteriography.
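
    The linear interpolation (LI) background estimate can be sketched in one dimension: interpolate the anatomical background across the pixels covered by the vessel, then subtract it to isolate the iodine signal (illustrative profile, not the study's data):

```python
import numpy as np

def background_by_interpolation(profile, vessel_mask):
    """Estimate the anatomical background under a vessel by linearly
    interpolating the profile across the masked (vessel) pixels."""
    x = np.arange(len(profile))
    return np.interp(x, x[~vessel_mask], profile[~vessel_mask])
```

    Subtracting the interpolated background from the raw profile leaves the vessel's iodine contribution, with no temporally separated mask image and hence no misregistration.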

  12. An Eulerian time filtering technique to study large-scale transient flow phenomena

    NASA Astrophysics Data System (ADS)

    Vanierschot, Maarten; Persoons, Tim; van den Bulck, Eric

    2009-10-01

    Unsteady fluctuating velocity fields can contain large-scale periodic motions with frequencies well separated from those of turbulence. Examples are the wake behind a cylinder or the precessing vortex core in a swirling jet. These turbulent flow fields contain large-scale, low-frequency oscillations, which are obscured by turbulence, making them difficult to identify. In this paper, we present an Eulerian time filtering (ETF) technique to extract the large-scale motions from unsteady, statistically non-stationary velocity fields or from flow fields with multiple phenomena that have sufficiently separated spectral content. The ETF method is based on non-causal time filtering of the velocity records in each point of the flow field. It is shown that the ETF technique gives good results, similar to the ones obtained by the phase-averaging method. In this paper, not only the influence of the temporal filter is checked, but also parameters such as the cut-off frequency and the sampling frequency of the data are investigated. The technique is validated on a selected set of time-resolved stereoscopic particle image velocimetry measurements, such as the initial region of an annular jet and the transition between flow patterns in an annular jet. The major advantage of the ETF method in the extraction of large scales is that it is computationally less expensive and requires less measurement time compared to other extraction methods. The technique is therefore suitable in the startup phase of an experiment or in a measurement campaign where several experiments are needed, such as parametric studies.
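
    The core ETF idea, non-causal (zero-phase) time filtering of each point's velocity record, can be sketched with a simple centred moving average standing in for the paper's low-pass filter:

```python
import numpy as np

def etf_lowpass(u, window=31):
    """Non-causal Eulerian time filter for one point's velocity record.

    A zero-phase centred moving average: a crude stand-in for the
    low-pass filter the ETF method applies; the cut-off is set by the
    window length relative to the sampling rate.
    """
    kernel = np.ones(window) / window
    return np.convolve(u, kernel, mode='same')
```

    With a cut-off between the large-scale frequency and the turbulence spectrum, the slow oscillation survives nearly untouched while the fast content averages out.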

  13. Learnable despeckling framework for optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Adabi, Saba; Rashedi, Elaheh; Clayton, Anne; Mohebbi-Kalkhoran, Hamed; Chen, Xue-wen; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2018-01-01

    Optical coherence tomography (OCT) is a prevalent, interferometric, high-resolution imaging method with broad biomedical applications. Nonetheless, OCT images suffer from an artifact called speckle, which degrades the image quality. Digital filters offer an opportunity for image improvement in clinical OCT devices, where hardware modification to enhance images is expensive. To reduce speckle, a wide variety of digital filters have been proposed; selecting the most appropriate filter for an OCT image/image set is a challenging decision, especially in dermatology applications of OCT, where a wide variety of tissues is imaged. To tackle this challenge, we propose an expandable learnable despeckling framework, which we call LDF. LDF decides which speckle reduction algorithm is most effective on a given image by learning a figure of merit (FOM) as a single quantitative image assessment measure. LDF is learnable, meaning that when implemented on an OCT machine, it is retrained on each given image/image set and its performance improves. Also, LDF is expandable, meaning that any despeckling algorithm can easily be added to it. The architecture of LDF includes two main parts: (i) an autoencoder neural network and (ii) a filter classifier. The autoencoder learns the FOM based on several quality assessment measures obtained from the OCT image, including signal-to-noise ratio, contrast-to-noise ratio, equivalent number of looks, edge preservation index, and mean structural similarity index. Subsequently, the filter classifier identifies the most efficient filter from the following categories: (a) sliding window filters, including median, mean, and symmetric nearest neighborhood; (b) adaptive statistical-based filters, including Wiener, homomorphic Lee, and Kuwahara; and (c) edge-preserving patch- or pixel-correlation-based filters, including nonlocal mean, total variation, and block-matching three-dimensional (BM3D) filtering.

  14. Classification images for localization performance in ramp-spectrum noise.

    PubMed

    Abbey, Craig K; Samuelson, Frank W; Zeng, Rongping; Boone, John M; Eckstein, Miguel P; Myers, Kyle

    2018-05-01

    This study investigates forced localization of targets in simulated images with statistical properties similar to trans-axial sections of x-ray computed tomography (CT) volumes. A total of 24 imaging conditions are considered, comprising two target sizes, three levels of background variability, and four levels of frequency apodization. The goal of the study is to better understand how human observers perform forced-localization tasks in images with CT-like statistical properties. The transfer properties of CT systems are modeled by a shift-invariant transfer function in addition to apodization filters that modulate high spatial frequencies. The images contain noise that is the combination of a ramp-spectrum component, simulating the effect of acquisition noise in CT, and a power-law component, simulating the effect of normal anatomy in the background, which are modulated by the apodization filter as well. Observer performance is characterized using two psychophysical techniques: efficiency analysis and classification image analysis. Observer efficiency quantifies how much diagnostic information is being used by observers to perform a task, and classification images show how that information is being accessed in the form of a perceptual filter. Psychophysical studies from five subjects form the basis of the results. Observer efficiency ranges from 29% to 77% across the different conditions. The lowest efficiency is observed in conditions with uniform backgrounds, where significant effects of apodization are found. The classification images, estimated using smoothing windows, suggest that human observers use center-surround filters to perform the task, and these are subjected to a number of subsequent analyses. When implemented as a scanning linear filter, the classification images appear to capture most of the observer variability in efficiency (r² = 0.86).
The frequency spectra of the classification images show that frequency weights generally appear bandpass in nature, with peak frequency and bandwidth that vary with statistical properties of the images. In these experiments, the classification images appear to capture important features of human-observer performance. Frequency apodization only appears to have a significant effect on performance in the absence of anatomical variability, where the observers appear to underweight low spatial frequencies that have relatively little noise. Frequency weights derived from the classification images generally have a bandpass structure, with adaptation to different conditions seen in the peak frequency and bandwidth. The classification image spectra show relatively modest changes in response to different levels of apodization, with some evidence that observers are attempting to rebalance the apodized spectrum presented to them. © 2018 American Association of Physicists in Medicine.
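
    The basic two-class classification-image estimator implied above, the difference of mean noise fields conditioned on the observer's response, can be sketched with a simulated linear observer (the template, trial count, and internal-noise level below are arbitrary illustration choices, not the study's protocol):

```python
import numpy as np

def classification_image(noise_fields, responses):
    """Estimate the observer's perceptual filter as the difference of
    mean noise fields conditioned on the binary response."""
    noise_fields = np.asarray(noise_fields, dtype=float)
    responses = np.asarray(responses)
    return (noise_fields[responses].mean(axis=0)
            - noise_fields[~responses].mean(axis=0))

# Toy linear observer: responds "present" when its template response
# to the noise, plus internal noise, exceeds zero (seeded simulation).
rng = np.random.default_rng(0)
template = np.array([0., 1., 2., 1., 0., -1., -2., -1.])
noise = rng.normal(size=(4000, 8))
resp = noise @ template + 0.5 * rng.normal(size=4000) > 0
ci = classification_image(noise, resp)
```

    For a Gaussian-noise linear observer, this estimate is proportional to the internal template, which is why the recovered classification image mirrors the perceptual filter.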

  15. Efficiency analysis of color image filtering

    NASA Astrophysics Data System (ADS)

    Fevralev, Dmitriy V.; Ponomarenko, Nikolay N.; Lukin, Vladimir V.; Abramov, Sergey K.; Egiazarian, Karen O.; Astola, Jaakko T.

    2011-12-01

    This article addresses the conditions under which filtering can visibly improve image quality. The key points are the following. First, we analyze filtering efficiency for 25 test images from the color image database TID2008. This database allows assessing filter efficiency for images corrupted by different noise types at several levels of noise variance. Second, the limit of filtering efficiency is determined for independent and identically distributed (i.i.d.) additive noise and compared to the output mean square error of state-of-the-art filters. Third, component-wise and vector denoising are studied, where the latter approach is demonstrated to be more efficient. Fourth, using modern visual quality metrics, we determine for which levels of i.i.d. and spatially correlated noise the noise in original images, or the residual noise and distortions due to filtering in output images, is practically invisible. We also demonstrate that it is possible to roughly estimate whether or not the visual quality can clearly be improved by filtering.
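
    The kind of efficiency check the article performs, comparing output error against the input noise level, can be sketched with a deliberately crude 3x3 mean filter on a synthetic image (the study itself uses state-of-the-art filters and visual quality metrics rather than plain MSE; this is a seeded toy example):

```python
import numpy as np

def mse(a, b):
    """Mean square error between two images."""
    return float(np.mean((a - b) ** 2))

# Synthetic smooth image plus i.i.d. additive noise (seeded).
rng = np.random.default_rng(1)
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
clean = np.sin(3 * x) * np.cos(2 * y)
noisy = clean + rng.normal(0.0, 0.2, clean.shape)

# 3x3 mean filter via a sliding window with edge padding.
pad = np.pad(noisy, 1, mode='edge')
win = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
denoised = win.mean(axis=(-2, -1))
```

    On a smooth image even this crude filter cuts the MSE well below the input noise variance; the article's point is that whether such a reduction is *visible* requires perceptual metrics, not MSE alone.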

  16. A nowcasting technique based on application of the particle filter blending algorithm

    NASA Astrophysics Data System (ADS)

    Chen, Yuanzhao; Lan, Hongping; Chen, Xunlai; Zhang, Wenhai

    2017-10-01

    To improve the accuracy of nowcasting, a new extrapolation technique called particle filter blending was configured in this study and applied to experimental nowcasting. Radar echo extrapolation was performed using the radar mosaic at an altitude of 2.5 km obtained from the radar images of 12 S-band radars in Guangdong Province, China. First, a bilateral filter was applied for quality control of the radar data; an optical flow method based on the Lucas-Kanade algorithm and the Harris corner detection algorithm was used to track radar echoes and retrieve the echo motion vectors; then, the motion vectors were blended with the particle filter blending algorithm to estimate the optimal motion vector of the true echo motions; finally, semi-Lagrangian extrapolation was used for radar echo extrapolation based on the obtained motion vector field. A comparative study of the extrapolated forecasts of four precipitation events in 2016 in Guangdong was conducted. The results indicate that the particle filter blending algorithm could realistically reproduce the spatial pattern, echo intensity, and echo location at 30- and 60-min forecast lead times. The forecasts agreed well with observations, and the results were of operational significance. Quantitative evaluation of the forecasts indicates that the particle filter blending algorithm performed better than the cross-correlation method and the optical flow method. The particle filter blending method thus proves superior to the traditional forecasting methods and can be used to enhance nowcasting ability in operational weather forecasts.
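
    The final semi-Lagrangian extrapolation step can be sketched as a backward-trajectory lookup on the echo field (nearest-neighbour sampling for brevity; operational schemes interpolate and iterate the trajectory, and the motion vectors here are per-pixel displacements in grid units):

```python
import numpy as np

def semi_lagrangian_step(field, vx, vy):
    """Advect a 2-D echo field one step along a motion-vector field.

    Each grid point is traced backwards along the local motion vector
    and the upstream value is sampled (clipped at the domain edge).
    """
    h, w = field.shape
    jj, ii = np.meshgrid(np.arange(w), np.arange(h))
    src_i = np.clip(np.rint(ii - vy).astype(int), 0, h - 1)
    src_j = np.clip(np.rint(jj - vx).astype(int), 0, w - 1)
    return field[src_i, src_j]
```

    Applying the step repeatedly extrapolates the echo forward in time along the blended motion field.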

  17. Reduction of radiation exposure while maintaining high-quality fluoroscopic images during interventional cardiology using novel x-ray tube technology with extra beam filtering.

    PubMed

    den Boer, A; de Feyter, P J; Hummel, W A; Keane, D; Roelandt, J R

    1994-06-01

    Radiographic technology plays an integral role in interventional cardiology. The number of interventions continues to increase, and the associated radiation exposure to patients and personnel is of major concern. This study was undertaken to determine whether a newly developed x-ray tube deploying grid-switched pulsed fluoroscopy and extra beam filtering can achieve a reduction in radiation exposure while maintaining fluoroscopic images of high quality. Three fluoroscopic techniques were compared: continuous fluoroscopy, pulsed fluoroscopy, and a newly developed high-output pulsed fluoroscopy with extra filtering. To ascertain differences in the quality of images and to determine differences in patient entrance and investigator radiation exposure, the radiated volume curve was measured to determine the required high voltage levels (kVpeak) for different object sizes for each fluoroscopic mode. The fluoroscopic data of 124 patient procedures were combined. The data were analyzed for radiographic projections, image intensifier field size, and x-ray tube kilovoltage levels (kVpeak). On the basis of this analysis, a reference procedure was constructed. The reference procedure was tested on a phantom or dummy patient by all three fluoroscopic modes. The phantom was so designed that the kilovoltage requirements for each projection were comparable to those needed for the average patient. Radiation exposure of the operator and patient was measured during each mode. The patient entrance dose was measured in air, and the operator dose was measured by 18 dosimeters on a dummy operator. Pulsed compared with continuous fluoroscopy could be performed with improved image quality at lower kilovoltages. The patient entrance dose was reduced by 21% and the operator dose by 54%. 
High-output pulsed fluoroscopy with extra beam filtering compared with continuous fluoroscopy improved the image quality, lowered the kilovoltage requirements, and reduced the patient entrance dose by 55% and the operator dose by 69%. High-output pulsed fluoroscopy with a grid-switched tube and extra filtering improves the image quality and significantly reduces both the operator dose and patient dose.

  18. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
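
    The slant-range correction mentioned above follows from simple geometry: the recorded slant range is the hypotenuse from the towfish to the seabed target, so (assuming a flat seafloor) the true horizontal ground range is given by Pythagoras:

```python
import math

def slant_to_ground_range(slant_range, tow_height):
    """Correct side-scan sonar slant-range distortion for one sample.

    Flat-seafloor assumption: ground range is the horizontal leg of
    the right triangle formed by the towfish height and slant range.
    Samples nearer than the towfish height map to zero (nadir gap).
    """
    return math.sqrt(max(slant_range ** 2 - tow_height ** 2, 0.0))
```

    Resampling each across-track line through this mapping removes the characteristic near-nadir compression of raw side-scan records.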

  19. Image-adaptive and robust digital wavelet-domain watermarking for images

    NASA Astrophysics Data System (ADS)

    Zhao, Yi; Zhang, Liping

    2018-03-01

    We propose a new frequency-domain, wavelet-based watermarking technique. The key idea of our scheme is twofold: a multi-tier representation of the image and odd-even quantization for embedding and extracting the watermark. Because several complementary watermarks must be hidden, the designed watermark image is image-adaptive. The meaningful, complementary watermark images were embedded into the original (host) image by odd-even quantization of coefficients selected from the detail wavelet coefficients of the original image whose magnitudes exceed their corresponding Just Noticeable Difference (JND) thresholds. Tests show good robustness against well-known attacks such as noise addition, image compression, median filtering, and clipping, as well as geometric transforms. Further research may improve the performance by refining the JND thresholds.
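    The odd-even quantization step can be illustrated in isolation. Each selected coefficient is snapped to the nearest multiple of a step size whose quantization index has the parity of the watermark bit; extraction recovers the bit from that parity. The step size `DELTA` and the standalone functions below are illustrative assumptions (the paper applies this to detail wavelet coefficients gated by JND thresholds, which is omitted here):

```python
import numpy as np

DELTA = 8.0  # quantization step: larger is more robust but more visible

def embed_bit(coeff, bit):
    """Odd-even quantization: move coeff to the nearest multiple of DELTA
    whose quantization index parity equals the watermark bit."""
    q = int(np.round(coeff / DELTA))
    if q % 2 != bit:
        # step to the adjacent index with the correct parity
        q += 1 if coeff / DELTA >= q else -1
    return q * DELTA

def extract_bit(coeff):
    """Recover the bit from the parity of the quantization index."""
    return int(np.round(coeff / DELTA)) % 2

bits = [1, 0, 1, 1]
coeffs = [23.7, -14.2, 41.0, 9.3]       # stand-ins for wavelet detail coefficients
marked = [embed_bit(c, b) for c, b in zip(coeffs, bits)]
recovered = [extract_bit(c) for c in marked]
```

    Extraction survives any perturbation smaller than DELTA/2, which is the mechanism behind the reported robustness to mild noise and compression.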

  20. Design of efficient circularly symmetric two-dimensional variable digital FIR filters.

    PubMed

    Bindima, Thayyil; Elias, Elizabeth

    2016-05-01

    Circularly symmetric two-dimensional (2D) finite impulse response (FIR) filters find extensive use in image and medical applications, especially for isotropic filtering. Moreover, the design and implementation of 2D digital filters whose fractional delay and magnitude responses can be varied without redesigning the filter has become a crucial topic of interest due to its significance in low-cost applications. Recently, design using fixed-word-length coefficients has gained importance because multipliers can be replaced by shifters and adders, which reduces the hardware complexity. Among the various approaches to 2D design, transforming a one-dimensional (1D) filter to 2D is reported to be an efficient technique. In this paper, 1D variable digital filters (VDFs) with tunable cut-off frequencies are designed using a Farrow-structure-based interpolation approach, and the sub-filter coefficients in the Farrow structure are made multiplier-less using canonic signed digit (CSD) representation. The resulting performance degradation in the filters is overcome by using artificial bee colony (ABC) optimization. Finally, the optimized 1D VDFs are mapped to 2D using the generalized McClellan transformation, resulting in low-complexity, circularly symmetric 2D VDFs with real-time tunability.
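    The 1D-to-2D mapping via the McClellan transformation can be sketched as follows. A linear-phase 1D FIR filter's zero-phase response is a Chebyshev-polynomial expansion in cos(w); substituting the transformation kernel F(w1, w2) for cos(w) yields a nearly circularly symmetric 2D response. The 21-tap windowed-sinc prototype, cutoff, and grid size below are illustrative assumptions; the paper's Farrow structure, CSD quantization, and ABC optimization are omitted:

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval

# Hand-rolled 21-tap linear-phase lowpass prototype (cutoff 0.4*pi, Hamming window)
n = np.arange(-10, 11)
h = 0.4 * np.sinc(0.4 * n) * np.hamming(21)

# Zero-phase response H(w) = a0 + sum_n a_n cos(n*w), with cos(n*w) = T_n(cos w)
c = len(h) // 2
a = np.concatenate(([h[c]], 2.0 * h[c + 1:]))   # Chebyshev coefficients

# Classic McClellan kernel for near-circular symmetry: F replaces cos(w)
w1, w2 = np.meshgrid(np.linspace(0, np.pi, 128), np.linspace(0, np.pi, 128))
F = -0.5 + 0.5 * np.cos(w1) + 0.5 * np.cos(w2) + 0.5 * np.cos(w1) * np.cos(w2)

# 2D zero-phase frequency response: H2D(w1, w2) = sum_n a_n T_n(F(w1, w2))
H2D = chebval(F, a)
```

    Because F stays in [-1, 1] and equals cos(w) along each frequency axis, the 2D filter inherits the 1D prototype's passband and stopband behavior along every radial direction, which is why the transformation preserves tunability designed into the 1D VDF.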

Top