Sample records for optimization digital image

  1. Image compression system and method having optimized quantization tables

    NASA Technical Reports Server (NTRS)

    Ratnakar, Viresh (Inventor); Livny, Miron (Inventor)

    1998-01-01

    A digital image compression preprocessor for use in a discrete cosine transform-based digital image compression device is provided. The preprocessor includes a gathering mechanism for determining discrete cosine transform statistics from input digital image data. A computing mechanism is operatively coupled to the gathering mechanism to calculate an image distortion array and a rate of image compression array based upon the discrete cosine transform statistics for each possible quantization value. A dynamic programming mechanism is operatively coupled to the computing mechanism to optimize the rate of image compression array against the image distortion array such that a rate-distortion-optimal quantization table is derived. In addition, a discrete cosine transform-based digital image compression device and a discrete cosine transform-based digital image compression and decompression system are provided. Also provided are methods for generating a rate-distortion-optimal quantization table, for performing discrete cosine transform-based digital image compression, and for operating a discrete cosine transform-based digital image compression and decompression system.
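    The rate-distortion trade-off described above can be illustrated with a much simpler sketch than the patented dynamic-programming approach: for each DCT frequency, pick the quantization step that minimizes distortion plus a Lagrange-weighted rate estimate. The per-frequency coefficient statistics, candidate steps, and trade-off parameter lambda below are hypothetical, not values from the patent.

    ```python
    # Minimal sketch of rate-distortion-guided quantization table selection.
    # This is NOT the patented dynamic-programming method; it is a simplified
    # per-frequency Lagrangian illustration with made-up DCT statistics.
    import numpy as np

    rng = np.random.default_rng(0)

    def entropy_bits(symbols):
        """Empirical entropy (bits/symbol) of an integer array."""
        _, counts = np.unique(symbols, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def choose_step(coeffs, candidate_steps, lam):
        """Pick the quantization step minimizing distortion + lam * rate."""
        best_q, best_cost = None, np.inf
        for q in candidate_steps:
            sym = np.round(coeffs / q)        # quantize
            rec = sym * q                     # dequantize
            cost = np.mean((coeffs - rec) ** 2) + lam * entropy_bits(sym.astype(int))
            if cost < best_cost:
                best_q, best_cost = q, cost
        return best_q

    # Hypothetical per-frequency DCT statistics: 64 frequencies, lower
    # frequencies have larger variance, as in natural images.
    variances = 100.0 / (1.0 + np.arange(64))
    table = np.array([
        choose_step(rng.normal(0.0, np.sqrt(v), 5000), range(1, 65), lam=5.0)
        for v in variances
    ]).reshape(8, 8)
    print(table)
    ```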

  2. Local sharpening and subspace wavefront correction with predictive dynamic digital holography

    NASA Astrophysics Data System (ADS)

    Sulaiman, Sennan; Gibson, Steve

    2017-09-01

    Digital holography holds several advantages over conventional imaging and wavefront sensing, chief among these being significantly fewer and simpler optical components and the retrieval of the complex field. Consequently, many imaging and sensing applications, including microscopy and optical tweezing, have turned to using digital holography. A significant obstacle for digital holography in real-time applications, such as wavefront sensing for high energy laser systems and high speed imaging for target tracking, is the fact that digital holography is computationally intensive; it requires iterative virtual wavefront propagation and hill-climbing to optimize some sharpness criterion. It has been shown recently that minimum-variance wavefront prediction can be integrated with digital holography and image sharpening to significantly reduce the large number of costly sharpening iterations required to achieve near-optimal wavefront correction. This paper demonstrates further gains in computational efficiency with localized sharpening in conjunction with predictive dynamic digital holography for real-time applications. The method optimizes the sharpness of local regions in a detector plane by parallel independent wavefront correction on reduced-dimension subspaces of the complex field in a spectral plane.

  3. How to optimize radiological images captured from digital cameras, using the Adobe Photoshop 6.0 program.

    PubMed

    Chalazonitis, A N; Koumarianos, D; Tzovara, J; Chronopoulos, P

    2003-06-01

    Over the past decade, the technology that permits images to be digitized, together with the reduction in the cost of digital equipment, has allowed quick digital transfer of any conventional radiological film. Images can then be transferred to a personal computer, and several software programs are available that can manipulate their digital appearance. In this article, the fundamentals of digital imaging are discussed, as well as the wide variety of optional adjustments that the Adobe Photoshop 6.0 (Adobe Systems, San Jose, CA) program offers for presenting radiological images with satisfactory digital imaging quality.

  4. Optimized digital speckle patterns for digital image correlation by consideration of both accuracy and efficiency.

    PubMed

    Chen, Zhenning; Shao, Xinxing; Xu, Xiangyang; He, Xiaoyuan

    2018-02-01

    The performance of digital image correlation (DIC), which has been widely used for noncontact deformation measurements in both scientific and engineering fields, is greatly affected by the quality of the speckle patterns. This study was concerned with the optimization of the digital speckle pattern (DSP) for DIC, considering both accuracy and efficiency. The root-mean-square error of the inverse compositional Gauss-Newton algorithm and the average number of iterations were used as quality metrics. Moreover, the influence of subset size and image noise level, which are basic parameters in the quality assessment formulations, was also considered. The simulated binary speckle patterns were first compared with Gaussian speckle patterns and captured DSPs. Both single-radius and multi-radius DSPs were optimized. Experimental tests and analyses were conducted to obtain the optimized and recommended DSP. The vector diagram of the optimized speckle pattern was also uploaded as a reference.

  5. [Development of a digital chest phantom for studies on energy subtraction techniques].

    PubMed

    Hayashi, Norio; Taniguchi, Anna; Noto, Kimiya; Shimosegawa, Masayuki; Ogura, Toshihiro; Doi, Kunio

    2014-03-01

    Digital chest phantoms continue to play a significant role in optimizing imaging parameters for chest X-ray examinations. The purpose of this study was to develop a digital chest phantom for studies on energy subtraction techniques under ideal conditions without image noise. Computed tomography (CT) images from the LIDC (Lung Image Database Consortium) were employed to develop a digital chest phantom. The method consisted of the following four steps: 1) segmentation of the lung and bone regions on CT images; 2) creation of simulated nodules; 3) transformation to attenuation coefficient maps from the segmented images; and 4) projection from attenuation coefficient maps. To evaluate the usefulness of digital chest phantoms, we determined the contrast of the simulated nodules in projection images of the digital chest phantom using high and low X-ray energies, soft tissue images obtained by energy subtraction, and "gold standard" images of the soft tissues. Using our method, the lung and bone regions were segmented on the original CT images. The contrast of simulated nodules in soft tissue images obtained by energy subtraction closely matched that obtained using the gold standard images. We thus conclude that it is possible to carry out simulation studies based on energy subtraction techniques using the created digital chest phantoms. Our method is potentially useful for performing simulation studies for optimizing the imaging parameters in chest X-ray examinations.

  6. Phase noise optimization in temporal phase-shifting digital holography with partial coherence light sources and its application in quantitative cell imaging.

    PubMed

    Remmersmann, Christian; Stürwald, Stephan; Kemper, Björn; Langehanenberg, Patrik; von Bally, Gert

    2009-03-10

    In temporal phase-shifting-based digital holographic microscopy, high-resolution phase contrast imaging requires optimized conditions for hologram recording and phase retrieval. To optimize the phase resolution, for the example of a variable three-step algorithm, a theoretical analysis on statistical errors, digitalization errors, uncorrelated errors, and errors due to a misaligned temporal phase shift is carried out. In a second step the theoretically predicted results are compared to the measured phase noise obtained from comparative experimental investigations with several coherent and partially coherent light sources. Finally, the applicability for noise reduction is demonstrated by quantitative phase contrast imaging of pancreas tumor cells.
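    For orientation, a textbook variable three-step phase-shifting reconstruction (not necessarily the exact estimator analysed in the paper) recovers the wrapped phase from interferograms recorded at shifts of -alpha, 0 and +alpha. The shift angle and the synthetic object below are assumptions.

    ```python
    # Sketch: variable three-step phase-shifting phase retrieval.
    # The shift angle alpha and the synthetic phase object are assumptions,
    # not the experimental conditions of the cited study.
    import numpy as np

    def three_step_phase(i1, i2, i3, alpha):
        """Wrapped phase from intensities recorded at shifts -alpha, 0, +alpha."""
        num = np.tan(alpha / 2.0) * (i1 - i3)
        den = 2.0 * i2 - i1 - i3
        return np.arctan2(num, den)

    # Synthetic test: a smooth phase object with unit bias and 0.8 modulation.
    y, x = np.mgrid[0:256, 0:256] / 256.0
    phi = 2.0 * np.pi * (x ** 2 + 0.5 * y)            # "true" phase
    alpha = 2.0 * np.pi / 3.0                          # 120 degree shift
    holos = [1.0 + 0.8 * np.cos(phi + s) for s in (-alpha, 0.0, alpha)]
    phi_rec = three_step_phase(*holos, alpha=alpha)

    # The wrapped reconstruction should match the wrapped true phase.
    err = np.angle(np.exp(1j * (phi_rec - phi)))
    print("max wrapped phase error (rad):", np.abs(err).max())
    ```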

  7. Clinical performance of a prototype flat-panel digital detector for general radiography

    NASA Astrophysics Data System (ADS)

    Huda, Walter; Scalzetti, Ernest M.; Roskopf, Marsha L.; Geiger, Robert

    2001-08-01

    Digital radiographs obtained using a prototype Digital Radiography System (Stingray) were compared with those obtained using conventional screen-film. Forty adult volunteers each had two identical radiographs taken at the same level of radiation exposure, one using screen-film and the other the digital detector. Each digital image was processed by hand to ensure that the printed quality was optimal. Ten radiologists compared the diagnostic image quality of the digital images with the corresponding film radiographs using a seven point ranking scheme.

  8. Dual-energy contrast-enhanced digital mammography (DE-CEDM): optimization on digital subtraction with practical x-ray low/high-energy spectra

    NASA Astrophysics Data System (ADS)

    Chen, Biao; Jing, Zhenxue; Smith, Andrew P.; Parikh, Samir; Parisky, Yuri

    2006-03-01

    Dual-energy contrast enhanced digital mammography (DE-CEDM), which is based upon the digital subtraction of low/high-energy image pairs acquired before/after the administration of contrast agents, may provide physicians with physiologic and morphologic information on breast lesions and help characterize their probability of malignancy. This paper proposes to use only one pair of post-contrast low/high-energy images to obtain digitally subtracted dual-energy contrast-enhanced images with an optimal weighting factor deduced from simulated characteristics of the imaging chain. Based upon our previous CEDM framework, quantitative characteristics of the materials and imaging components in the x-ray imaging chain, including x-ray tube (tungsten) spectrum, filters, breast tissues/lesions, contrast agents (non-ionized iodine solution), and selenium detector, were systematically modeled. Using the base-material (polyethylene-PMMA) decomposition method based on entrance low/high-energy x-ray spectra and breast thickness, the optimal weighting factor was calculated to cancel the contrast between fatty and glandular tissues while enhancing the contrast of iodized lesions. By contrast, previous work determined the optimal weighting factor through either a calibration step or through acquisition of a pre-contrast low/high-energy image pair. Computer simulations were conducted to determine weighting factors, lesions' contrast signal values, and dose levels as functions of x-ray techniques and breast thicknesses. Phantom and clinical feasibility studies were performed on a modified Selenia full field digital mammography system to verify the proposed method and computer-simulated results. The resultant conclusions from the computer simulations and phantom/clinical feasibility studies will be used in the upcoming clinical study.
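    The weighted log subtraction at the heart of this scheme is compact enough to sketch. Here the weighting factor is found from two tissue-only calibration values rather than from the spectral model of the paper, and all intensities are synthetic placeholders.

    ```python
    # Sketch of dual-energy weighted log subtraction for CEDM. The attenuation
    # values and the calibration-based choice of the weighting factor are
    # illustrative assumptions, not the spectral model used in the cited work.
    import numpy as np

    def de_subtract(low, high, w):
        """Weighted log subtraction; a suitable w cancels tissue contrast."""
        return np.log(high) - w * np.log(low)

    def weight_from_calibration(low_a, low_b, high_a, high_b):
        """Choose w so two background tissues (a, b) give equal DE signal."""
        return (np.log(high_a) - np.log(high_b)) / (np.log(low_a) - np.log(low_b))

    # Synthetic transmitted intensities (arbitrary units) for fatty tissue,
    # glandular tissue and an iodine-enhanced lesion at low/high energy.
    low  = {"fat": 0.60, "gland": 0.45, "lesion": 0.30}
    high = {"fat": 0.80, "gland": 0.70, "lesion": 0.62}

    w = weight_from_calibration(low["fat"], low["gland"], high["fat"], high["gland"])
    for t in ("fat", "gland", "lesion"):
        print(t, round(de_subtract(low[t], high[t], w), 4))
    # Fat and gland map to the same value; the iodinated lesion stands out.
    ```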

  9. Optimized algorithm for the spatial nonuniformity correction of an imaging system based on a charge-coupled device color camera.

    PubMed

    de Lasarte, Marta; Pujol, Jaume; Arjona, Montserrat; Vilaseca, Meritxell

    2007-01-10

    We present an optimized linear algorithm for the spatial nonuniformity correction of a CCD color camera's imaging system and the experimental methodology developed for its implementation. We assess the influence of the algorithm's variables on the quality of the correction, that is, the dark image, the base correction image, and the reference level, and the range of application of the correction using a uniform radiance field provided by an integrator cube. The best spatial nonuniformity correction is achieved by having a nonzero dark image, by using an image with a mean digital level placed in the linear response range of the camera as the base correction image and taking the mean digital level of the image as the reference digital level. The response of the CCD color camera's imaging system to the uniform radiance field shows a high level of spatial uniformity after the optimized algorithm has been applied, which also allows us to achieve a high-quality spatial nonuniformity correction of captured images under different exposure conditions.
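    The general shape of such a linear correction (dark image, base correction image, reference level) follows the standard flat-fielding relation sketched below; whether the published algorithm differs in detail is not stated here, and the frames are simulated.

    ```python
    # Sketch of a linear spatial nonuniformity (flat-field) correction using a
    # dark image, a base correction image and a reference digital level.
    # The frames are simulated; this is not the authors' exact implementation.
    import numpy as np

    def nonuniformity_correct(raw, dark, base, reference=None):
        """corrected = (raw - dark) / (base - dark) * reference."""
        gain = base.astype(float) - dark
        if reference is None:
            reference = gain.mean()               # mean digital level as reference
        out = (raw.astype(float) - dark) / np.where(gain == 0, 1, gain)
        return out * reference

    rng = np.random.default_rng(1)
    shape = (512, 512)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    vignetting = 1.0 - 0.3 * (((xx - 256) ** 2 + (yy - 256) ** 2) / 256 ** 2)

    dark = 20.0 + rng.normal(0, 0.5, shape)                       # nonzero dark image
    base = dark + 150.0 * vignetting                              # uniform-field frame
    raw  = dark + 150.0 * vignetting * (1 + rng.normal(0, 0.01, shape))
    corr = nonuniformity_correct(raw, dark, base)
    print("relative nonuniformity before:", round(raw.std() / raw.mean(), 3))
    print("relative nonuniformity after: ", round(corr.std() / corr.mean(), 3))
    ```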

  10. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.

  11. High resolution near on-axis digital holography using constrained optimization approach with faster convergence

    NASA Astrophysics Data System (ADS)

    Pandiyan, Vimal Prabhu; Khare, Kedar; John, Renu

    2017-09-01

    A constrained optimization approach with faster convergence is proposed to recover the complex object field in near on-axis digital holography (DH). We subtract the DC from the hologram after recording the object beam and reference beam intensities separately. The DC-subtracted hologram is used to recover the complex object information using a constrained optimization approach with faster convergence. The recovered complex object field is back propagated to the image plane using the Fresnel back-propagation method. This approach provides high-resolution images compared with the conventional Fourier filtering approach and is 25% faster than the previously reported constrained optimization approach, owing to the subtraction of two DC terms in the cost function. We report this approach in DH and digital holographic microscopy using the U.S. Air Force resolution target as the object to retrieve the high-resolution image without DC and twin-image interference. We also demonstrate the high potential of this technique on a transparent microelectrode patterned on indium tin oxide-coated glass, by reconstructing a high-resolution quantitative phase microscope image. We also demonstrate this technique by imaging yeast cells.
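    Two of the standard ingredients named in the abstract, subtraction of the separately recorded DC terms and Fresnel back-propagation to the image plane, can be sketched as follows. The wavelength, pixel pitch and recording distance are placeholder values, and the constrained-optimization recovery step itself is omitted.

    ```python
    # Sketch: DC-term subtraction and Fresnel back-propagation of a hologram.
    # Wavelength, pixel pitch and distance are placeholders; the paper's
    # constrained-optimization recovery of the object field is not shown.
    import numpy as np

    def subtract_dc(hologram, object_intensity, reference_intensity):
        """Remove the two DC terms |O|^2 and |R|^2 recorded separately."""
        return hologram - object_intensity - reference_intensity

    def fresnel_propagate(field, wavelength, pitch, z):
        """Fresnel transfer-function propagation over distance z (may be negative)."""
        ny, nx = field.shape
        fx = np.fft.fftfreq(nx, d=pitch)
        fy = np.fft.fftfreq(ny, d=pitch)
        FX, FY = np.meshgrid(fx, fy)
        h = np.exp(1j * 2 * np.pi * z / wavelength) * \
            np.exp(-1j * np.pi * wavelength * z * (FX ** 2 + FY ** 2))
        return np.fft.ifft2(np.fft.fft2(field) * h)

    wavelength = 633e-9       # m (placeholder)
    pitch = 3.45e-6           # m, detector pixel size (placeholder)
    z = 0.05                  # m, recording distance (placeholder)

    rng = np.random.default_rng(2)
    holo = rng.random((512, 512))                     # stand-in hologram
    clean = subtract_dc(holo, 0.3 * np.ones_like(holo), 0.4 * np.ones_like(holo))
    image_plane = fresnel_propagate(clean.astype(complex), wavelength, pitch, -z)
    print(image_plane.shape, image_plane.dtype)
    ```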

  12. Digital radiography: optimization of image quality and dose using multi-frequency software.

    PubMed

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. The aim was to examine whether the use of MFP software reduces the radiation dose without compromising quality in DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using visual grading analysis. In addition, 3,500 images of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. The software impact on image quality was found to be significant for dose (mAs), dynamic range dark region, and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  13. MO-G-18A-01: Radiation Dose Reducing Strategies in CT, Fluoroscopy and Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahesh, M; Gingold, E; Jones, A

    2014-06-15

    Advances in medical x-ray imaging have provided significant benefits to patient care. According to NCRP 160, more than 400 million x-ray procedures are performed annually in the United States alone, contributing to nearly half of all radiation exposure to the US population. Similar growth trends in medical x-ray imaging are observed worldwide. The apparent increase in the number of medical x-ray imaging procedures and new protocols, and the associated radiation dose and risk, has drawn considerable attention. This has led to a number of technological innovations such as tube current modulation, iterative reconstruction algorithms, dose alerts, dose displays, flat-panel digital detectors, highly efficient digital detectors, storage phosphor radiography, variable filters, etc., that are enabling users to acquire medical x-ray images at a much lower radiation dose. Along with these, there are a number of radiation dose optimization strategies that users can adopt to effectively lower radiation dose in medical x-ray procedures. The main objective of this SAM course is to provide information on the various radiation dose optimization strategies in CT, fluoroscopy, and radiography and on how to implement them. Learning Objectives: To update the impact of technological advances on dose optimization in medical imaging. To identify radiation optimization strategies in computed tomography. To describe strategies for configuring fluoroscopic equipment that yield optimal images at reasonable radiation dose. To assess ways to configure digital radiography systems and recommend ways to improve image quality at optimal dose.

  14. Optimization of digital image processing to determine quantum dots' height and density from atomic force microscopy.

    PubMed

    Ruiz, J E; Paciornik, S; Pinto, L D; Ptak, F; Pires, M P; Souza, P L

    2018-01-01

    An optimized method of digital image processing to interpret quantum dots' height measurements obtained by atomic force microscopy is presented. The method was developed by combining well-known digital image processing techniques and particle recognition algorithms. The properties of quantum dot structures strongly depend on dots' height, among other features. Determination of their height is sensitive to small variations in their digital image processing parameters, which can generate misleading results. Comparing the results obtained with two image processing techniques - a conventional method and the new method proposed herein - with the data obtained by determining the height of quantum dots one by one within a fixed area, showed that the optimized method leads to more accurate results. Moreover, the log-normal distribution, which is often used to represent natural processes, shows a better fit to the quantum dots' height histogram obtained with the proposed method. Finally, the quantum dots' height obtained were used to calculate the predicted photoluminescence peak energies which were compared with the experimental data. Again, a better match was observed when using the proposed method to evaluate the quantum dots' height. Copyright © 2017 Elsevier B.V. All rights reserved.
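    A minimal stand-in for such a pipeline (crude background leveling, thresholding, particle labeling, per-dot peak height and areal density) might look like the following. It relies on scipy.ndimage and a synthetic height map; the paper's optimized processing chain is more elaborate.

    ```python
    # Sketch: per-dot heights and areal density from an AFM height map.
    # Synthetic data and simple thresholding; only a rough stand-in for the
    # optimized method described in the paper.
    import numpy as np
    from scipy import ndimage

    def dot_heights(height_map, pixel_size_nm, threshold_nm=1.0):
        """Return per-dot peak heights (nm) and density (dots per um^2)."""
        background = ndimage.median_filter(height_map, size=31)   # crude leveling
        leveled = height_map - background
        labels, n = ndimage.label(leveled > threshold_nm)
        heights = ndimage.maximum(leveled, labels, index=np.arange(1, n + 1))
        area_um2 = height_map.size * (pixel_size_nm * 1e-3) ** 2
        return np.asarray(heights), n / area_um2

    # Synthetic 1 um x 1 um scan with Gaussian-shaped dots (assumed values).
    rng = np.random.default_rng(3)
    size, px_nm = 256, 1000.0 / 256
    img = rng.normal(0.0, 0.2, (size, size))
    yy, xx = np.mgrid[0:size, 0:size]
    for _ in range(40):
        cy, cx = rng.integers(10, size - 10, 2)
        h = rng.uniform(3.0, 8.0)                      # dot heights of 3-8 nm
        img += h * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 3.0 ** 2))

    heights, density = dot_heights(img, px_nm)
    print(f"{len(heights)} dots, mean height {heights.mean():.2f} nm, "
          f"density {density:.1f} dots/um^2")
    ```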

  15. Assessment of visual communication by information theory

    NASA Astrophysics Data System (ADS)

    Huck, Friedrich O.; Fales, Carl L.

    1994-01-01

    This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.

  16. Quality assessment of digital X-ray chest images using an anthropomorphic chest phantom

    NASA Astrophysics Data System (ADS)

    Vodovatov, A. V.; Kamishanskaya, I. G.; Drozdov, A. A.; Bernhardsson, C.

    2017-02-01

    The current study is focused on determining the optimal tube voltage for conventional digital X-ray chest screening examinations, using a visual grading analysis method. Chest images of an anthropomorphic phantom were acquired in the posterior-anterior projection on four digital X-ray units with different detector types. The X-ray images obtained with the anthropomorphic phantom were accepted by the radiologists as corresponding to normal human anatomy, hence allowing the use of phantoms in image quality trials without limitations.

  17. Topology-Preserving Rigid Transformation of 2D Digital Images.

    PubMed

    Ngo, Phuc; Passat, Nicolas; Kenmochi, Yukiko; Talbot, Hugues

    2014-02-01

    We provide conditions under which 2D digital images preserve their topological properties under rigid transformations. We consider the two most common digital topology models, namely dual adjacency and well-composedness. This paper leads to the proposal of optimal preprocessing strategies that ensure the topological invariance of images under arbitrary rigid transformations. These results and methods are proved to be valid for various kinds of images (binary, gray-level, label), thus providing generic and efficient tools, which can be used in particular in the context of image registration and warping.
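    Of the two topology models mentioned, well-composedness has a particularly simple local characterization for 2D binary images: an image is well-composed exactly when it contains no 2x2 block whose two foreground pixels (and two background pixels) are only diagonally adjacent. A small check under that standard criterion (not code from the paper) is shown below.

    ```python
    # Sketch: test 2D well-composedness of a binary image by scanning for the
    # forbidden 2x2 "checkerboard" configurations.
    import numpy as np

    def is_well_composed(binary):
        """True iff no 2x2 block is a diagonal (checkerboard) configuration."""
        b = np.asarray(binary, dtype=bool)
        tl, tr = b[:-1, :-1], b[:-1, 1:]
        bl, br = b[1:, :-1], b[1:, 1:]
        checker = (tl == br) & (tr == bl) & (tl != tr)
        return not checker.any()

    print(is_well_composed(np.array([[1, 1], [1, 0]])))   # True
    print(is_well_composed(np.array([[1, 0], [0, 1]])))   # False: critical config
    ```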

  18. The impact of the condenser on cytogenetic image quality in digital microscope system.

    PubMed

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

    Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Both theoretical analysis and experimental validations, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%-70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and less restriction for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice.

  19. Use of the Hotelling observer to optimize image reconstruction in digital breast tomosynthesis

    PubMed Central

    Sánchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2015-01-01

    We propose an implementation of the Hotelling observer that can be applied to the optimization of linear image reconstruction algorithms in digital breast tomosynthesis. The method is based on considering information within a specific region of interest, and it is applied to the optimization of algorithms for detectability of microcalcifications. Several linear algorithms are considered: simple back-projection, filtered back-projection, back-projection filtration, and Λ-tomography. The optimized algorithms are then evaluated through the reconstruction of phantom data. The method appears robust across algorithms and parameters and leads to the generation of algorithm implementations which subjectively appear optimized for the task of interest. PMID:26702408

  20. Optimization of digitization procedures in cultural heritage preservation

    NASA Astrophysics Data System (ADS)

    Martínez, Bea; Mitjà, Carles; Escofet, Jaume

    2013-11-01

    The digitization of both volumetric and flat objects is nowadays the preferred method for preserving cultural heritage items. High quality digital files obtained from photographic plates, films and prints, paintings, drawings, gravures, fabrics and sculptures allow not only for wider diffusion and online transmission, but also for the preservation of the original items from future handling. Early digitization procedures used scanners for flat opaque or translucent objects and cameras only for volumetric or flat highly texturized materials. The technical obsolescence of high-end scanners and the improvement achieved by professional cameras have resulted in the wide use of cameras with digital backs to digitize any kind of cultural heritage item. Since the lens, the digital back, the software controlling the camera and the digital image processing provide a wide range of possibilities, it is necessary to standardize the methods used in the reproduction work so as to preserve the original item properties as faithfully as possible. This work presents an overview of methods used for camera system characterization, as well as the best procedures to identify and counteract the effect of residual lens aberrations, sensor aliasing, image illumination, color management and image optimization by means of parametric image processing. As a corollary, the work shows some examples of a reproduction workflow applied to the digitization of valuable art pieces and glass plate photographic black and white negatives.

  1. On the assessment of visual communication by information theory

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1993-01-01

    This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.

  2. Digital compression algorithms for HDTV transmission

    NASA Technical Reports Server (NTRS)

    Adkins, Kenneth C.; Shalkhauser, Mary JO; Bibyk, Steven B.

    1990-01-01

    Digital compression of video images is a possible avenue for high definition television (HDTV) transmission. Compression needs to be optimized while picture quality remains high. Two techniques for compressing the digital images are explained and comparisons are drawn between the human vision system and artificial compression techniques. Suggestions for improving compression algorithms through the use of neural and analog circuitry are given.

  3. Design Method of Digital Optimal Control Scheme and Multiple Paralleled Bridge Type Current Amplifier for Generating Gradient Magnetic Fields in MRI Systems

    NASA Astrophysics Data System (ADS)

    Watanabe, Shuji; Takano, Hiroshi; Fukuda, Hiroya; Hiraki, Eiji; Nakaoka, Mutsuo

    This paper deals with a digital control scheme for a multiple paralleled high-frequency switching current amplifier with four-quadrant choppers for generating gradient magnetic fields in MRI (Magnetic Resonance Imaging) systems. In order to track highly precise current patterns in the gradient coils (GC), the proposed current amplifier cancels the switching current ripples in the GC against each other and uses optimally designed switching gate pulse patterns that are not influenced by the large filter current ripple amplitude. The optimal control implementation and linear control theory for GC current amplifiers have a natural affinity to each other and yield excellent characteristics. The digital control system can be realized easily through digital implementation on DSPs or microprocessors. Multiple microprocessors operating in parallel realize a two-fold or higher paralleled GC current pattern tracking amplifier with an optimal control design, and excellent results are given for improving the image quality of MRI systems.

  4. Smartphone adapters for digital photomicrography.

    PubMed

    Roy, Somak; Pantanowitz, Liron; Amin, Milon; Seethala, Raja R; Ishtiaque, Ahmed; Yousem, Samuel A; Parwani, Anil V; Cucoranu, Ioan; Hartman, Douglas J

    2014-01-01

    Photomicrographs in Anatomic Pathology provide a means of quickly sharing information from a glass slide for consultation, education, documentation and publication. While static image acquisition historically involved the use of a permanently mounted camera unit on a microscope, such cameras may be expensive, need to be connected to a computer, and often require proprietary software to acquire and process images. Another novel approach for capturing digital microscopic images is to use smartphones coupled with the eyepiece of a microscope. Recently, several smartphone adapters have emerged that allow users to attach mobile phones to the microscope. The aim of this study was to test the utility of these various smartphone adapters. We surveyed the market for adapters to attach smartphones to the ocular lens of a conventional light microscope. Three adapters (Magnifi, Skylight and Snapzoom) were tested. We assessed the designs of these adapters and their effectiveness at acquiring static microscopic digital images. All adapters facilitated the acquisition of digital microscopic images with a smartphone. The optimal adapter was dependent on the type of phone used. The Magnifi adapters for iPhone were incompatible when using a protective case. The Snapzoom adapter was easiest to use with iPhones and other smartphones even with protective cases. Smartphone adapters are inexpensive and easy to use for acquiring digital microscopic images. However, they require some adjustment by the user in order to optimize focus and obtain good quality images. Smartphone microscope adapters provide an economically feasible method of acquiring and sharing digital pathology photomicrographs.

  5. The Impact of the Condenser on Cytogenetic Image Quality in Digital Microscope System

    PubMed Central

    Ren, Liqiang; Li, Zheng; Li, Yuhua; Zheng, Bin; Li, Shibo; Chen, Xiaodong; Liu, Hong

    2013-01-01

    Background: Optimizing operational parameters of the digital microscope system is an important technique to acquire high quality cytogenetic images and facilitate the process of karyotyping so that the efficiency and accuracy of diagnosis can be improved. Objective: This study investigated the impact of the condenser on cytogenetic image quality and system working performance using a prototype digital microscope image scanning system. Methods: Both theoretical analysis and experimental validations, through objectively evaluating a resolution test chart and subjectively observing large numbers of specimens, were conducted. Results: The results show that optimal image quality and a large depth of field (DOF) are simultaneously obtained when the numerical aperture of the condenser is set to 60%–70% of that of the corresponding objective. Under this condition, more analyzable chromosomes and diagnostic information are obtained. As a result, the system shows higher working stability and less restriction for the implementation of algorithms such as autofocusing, especially when the system is designed to achieve high throughput continuous image scanning. Conclusions: Although the above quantitative results were obtained using a specific prototype system under the experimental conditions reported in this paper, the presented evaluation methodologies can provide valuable guidelines for optimizing operational parameters in cytogenetic imaging using high throughput continuous scanning microscopes in clinical practice. PMID:23676284

  6. Optimization of exposure factors for X-ray radiography non-destructive testing of pearl oyster

    NASA Astrophysics Data System (ADS)

    Susilo; Yulianti, I.; Addawiyah, A.; Setiawan, R.

    2018-03-01

    One of the processes in pearl oyster cultivation is detecting the pearl nucleus to determine whether it is still attached inside the shell or has been expelled. The common tool used to detect the pearl nucleus is an X-ray machine. However, an X-ray machine has the drawback that the energy used is higher than that used by digital radiography, and the high energy makes the resulting image difficult to analyse. One of the advantages of digital radiography is that the energy used can be adjusted so that the resulting image can be analysed easily. To obtain a high quality pearl image using digital radiography, the exposure factors should be optimized. In this work, optimization was done by varying the voltage, current, and exposure time. The radiography images were then analysed using the contrast-to-noise ratio (CNR). From the analysis, the optimum exposure factors were determined to be a voltage of 60 kV, a current of 16 mA, and an exposure time of 0.125 s, which result in a CNR of 5.71.
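    The figure of merit used here, the contrast-to-noise ratio, is straightforward to compute from a signal region and a background region of the radiograph; the regions of interest below are purely illustrative.

    ```python
    # Sketch: contrast-to-noise ratio (CNR) of a nucleus ROI against background,
    # the metric used above to rank exposure settings. ROIs are illustrative.
    import numpy as np

    def cnr(image, signal_mask, background_mask):
        """|mean(signal) - mean(background)| / std(background)."""
        background = image[background_mask]
        return abs(image[signal_mask].mean() - background.mean()) / background.std()

    # Synthetic radiograph: a bright disc (pearl nucleus) on a noisy background.
    rng = np.random.default_rng(4)
    img = rng.normal(100.0, 5.0, (200, 200))
    yy, xx = np.mgrid[0:200, 0:200]
    nucleus = (yy - 100) ** 2 + (xx - 100) ** 2 < 20 ** 2
    img[nucleus] += 30.0
    background = (yy - 100) ** 2 + (xx - 100) ** 2 > 60 ** 2
    print("CNR:", round(cnr(img, nucleus, background), 2))
    ```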

  7. Estimation of melanin content in iris of human eye: prognosis for glaucoma diagnostics

    NASA Astrophysics Data System (ADS)

    Bashkatov, Alexey N.; Koblova, Ekaterina V.; Genina, Elina A.; Kamenskikh, Tatyana G.; Dolotov, Leonid E.; Sinichkin, Yury P.; Tuchin, Valery V.

    2007-02-01

    Based on experimental data obtained in vivo from digital analysis of color images of human irises, the mean melanin content in human eye irises has been estimated. For registration of the color images, a digital camera (Olympus C-5060) was used. The images were obtained from irises of healthy volunteers as well as from irises of patients with open-angle glaucoma. A computer program was developed for digital analysis of the images. The results are useful for the development of novel methods, and the optimization of already existing methods, of non-invasive glaucoma diagnostics.

  8. Smartphone adapters for digital photomicrography

    PubMed Central

    Roy, Somak; Pantanowitz, Liron; Amin, Milon; Seethala, Raja R.; Ishtiaque, Ahmed; Yousem, Samuel A.; Parwani, Anil V.; Cucoranu, Ioan; Hartman, Douglas J.

    2014-01-01

    Background: Photomicrographs in Anatomic Pathology provide a means of quickly sharing information from a glass slide for consultation, education, documentation and publication. While static image acquisition historically involved the use of a permanently mounted camera unit on a microscope, such cameras may be expensive, need to be connected to a computer, and often require proprietary software to acquire and process images. Another novel approach for capturing digital microscopic images is to use smartphones coupled with the eyepiece of a microscope. Recently, several smartphone adapters have emerged that allow users to attach mobile phones to the microscope. The aim of this study was to test the utility of these various smartphone adapters. Materials and Methods: We surveyed the market for adapters to attach smartphones to the ocular lens of a conventional light microscope. Three adapters (Magnifi, Skylight and Snapzoom) were tested. We assessed the designs of these adapters and their effectiveness at acquiring static microscopic digital images. Results: All adapters facilitated the acquisition of digital microscopic images with a smartphone. The optimal adapter was dependent on the type of phone used. The Magnifi adapters for iPhone were incompatible when using a protective case. The Snapzoom adapter was easiest to use with iPhones and other smartphones even with protective cases. Conclusions: Smartphone adapters are inexpensive and easy to use for acquiring digital microscopic images. However, they require some adjustment by the user in order to optimize focus and obtain good quality images. Smartphone microscope adapters provide an economically feasible method of acquiring and sharing digital pathology photomicrographs. PMID:25191623

  9. Comprehensive optimization process of paranasal sinus radiography.

    PubMed

    Saarakkala, S; Nironen, K; Hermunen, H; Aarnio, J; Heikkinen, J O

    2009-04-01

    The optimization of radiological examinations is important in order to reduce unnecessary patient radiation exposure. To perform a comprehensive optimization process for paranasal sinus radiography at Mikkeli Central Hospital, Finland. Patients with suspicion of acute sinusitis were imaged with a Kodak computed radiography (CR) system (n=20) and with a Philips digital radiography (DR) system (n=30) using focus-detector distances (FDDs) of 110 cm, 150 cm, or 200 cm. Patients' radiation exposure was determined in terms of entrance surface dose and dose-area product. Furthermore, an anatomical phantom was used for the estimation of point doses inside the head. Clinical image quality was evaluated by an experienced radiologist, and physical image quality was evaluated from the digital radiography phantom. Patient doses were significantly lower and image quality better with the DR system compared to the CR system. The differences in patient dose and physical image quality were small with varying FDD. Clinical image quality of the DR system was lowest with FDD of 200 cm. Further, imaging with FDD of 150 cm was technically easier for the technologist to perform than with FDD of 110 cm. After optimization, it was recommended that the DR system with FDD of 150 cm should always be used at Mikkeli Central Hospital. We recommend this kind of comprehensive approach in all optimization processes of radiological examinations.

  10. The effects of gray scale image processing on digital mammography interpretation performance.

    PubMed

    Cole, Elodia B; Pisano, Etta D; Zeng, Donglin; Muller, Keith; Aylward, Stephen R; Park, Sungwook; Kuzmiak, Cherie; Koomen, Marcia; Pavic, Dag; Walsh, Ruth; Baker, Jay; Gimenez, Edgardo I; Freimanis, Rita

    2005-05-01

    To determine the effects of three image-processing algorithms on the diagnostic accuracy of digital mammography in comparison with conventional screen-film mammography. A total of 201 cases consisting of nonprocessed soft-copy versions of the digital mammograms acquired from GE, Fischer, and Trex digital mammography systems (1997-1999) and conventional screen-film mammograms of the same patients were interpreted by nine radiologists. The raw digital data were processed with each of three different image-processing algorithms, creating three presentations (the manufacturer's default, applied and laser printed to film by each of the manufacturers; MUSICA; and PLAHE) that were presented in soft-copy display. There were three radiologists per presentation. The area under the receiver operating characteristic curve for GE digital mass cases was worse than screen-film for all digital presentations. The area under the receiver operating characteristic curve for Trex digital mass cases was better, but only with images processed with the manufacturer's default algorithm. Sensitivity for GE digital mass cases was worse than screen-film for all digital presentations. Specificity for Fischer digital calcification cases was worse than screen-film for images processed with the default and PLAHE algorithms. Specificity for Trex digital calcification cases was worse than screen-film for images processed with MUSICA. Specific image-processing algorithms may be necessary for optimal presentation for interpretation based on machine and lesion type.

  11. Cost-effectiveness of angiographic imaging in isolated perimesencephalic subarachnoid hemorrhage.

    PubMed

    Kalra, Vivek B; Wu, Xiao; Forman, Howard P; Malhotra, Ajay

    2014-12-01

    The purpose of this study is to perform a comprehensive cost-effectiveness analysis of all possible permutations of computed tomographic angiography (CTA) and digital subtraction angiography imaging strategies for both initial diagnosis and follow-up imaging in patients with perimesencephalic subarachnoid hemorrhage on noncontrast CT. Each possible imaging strategy was evaluated in a decision tree created with TreeAge Pro Suite 2014, with parameters derived from a meta-analysis of 40 studies and literature values. Base case and sensitivity analyses were performed to assess the cost-effectiveness of each strategy. A Monte Carlo simulation was conducted with distributional variables to evaluate the robustness of the optimal strategy. The base case scenario showed performing initial CTA with no follow-up angiographic studies in patients with perimesencephalic subarachnoid hemorrhage to be the most cost-effective strategy ($5422/quality adjusted life year). Using a willingness-to-pay threshold of $50 000/quality adjusted life year, the most cost-effective strategy based on net monetary benefit is CTA with no follow-up when the sensitivity of initial CTA is >97.9%, and CTA with CTA follow-up otherwise. The Monte Carlo simulation reported CTA with no follow-up to be the optimal strategy at willingness-to-pay of $50 000 in 99.99% of the iterations. Digital subtraction angiography, whether at initial diagnosis or as part of follow-up imaging, is never the optimal strategy in our model. CTA without follow-up imaging is the optimal strategy for evaluation of patients with perimesencephalic subarachnoid hemorrhage when modern CT scanners and a strict definition of perimesencephalic subarachnoid hemorrhage are used. Digital subtraction angiography and follow-up imaging are not optimal as they carry complications and associated costs. © 2014 American Heart Association, Inc.

  12. Digital processing of the Mariner 10 images of Venus and Mercury

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Lynn, D. J.; Mosher, J. A.; Elliot, D. A.

    1977-01-01

    An extensive effort was devoted to the digital processing of the Mariner 10 images of Venus and Mercury at the Image Processing Laboratory of the Jet Propulsion Laboratory. This effort was designed to optimize the display of the considerable quantity of information contained in the images. Several image restoration, enhancement, and transformation procedures were applied; examples of these techniques are included. A particular task was the construction of large mosaics which characterize the surface of Mercury and the atmospheric structure of Venus.

  13. [Digital breast tomosynthesis : technical principles, current clinical relevance and future perspectives].

    PubMed

    Hellerhoff, K

    2010-11-01

    In recent years digital full field mammography has increasingly replaced conventional film mammography. High quality imaging is guaranteed by high quantum efficiency and very good contrast resolution with optimized dosing, even for women with dense glandular tissue. However, digital mammography remains a projection procedure in which overlapping tissue limits the detectability of subtle alterations. Tomosynthesis is a procedure developed from digital mammography for slice examination of the breast, which eliminates the effects of overlapping tissue and allows 3D imaging of the breast. A curved movement of the X-ray tube during scanning allows the acquisition of many 2D images from different angles. Subsequently, reconstruction algorithms employing a shift-and-add method improve the recognition of details at a defined level and at the same time eliminate smear artefacts due to overlapping structures. The total dose corresponds to that of conventional mammography imaging. The technical procedure, including the number of levels, suitable anode/filter combinations, the angular range of the images and the selection of reconstruction algorithms, is presently undergoing optimization. Previous studies on the clinical value of tomosynthesis have examined screening parameters, such as recall rate and detection rate, as well as information on tumor extent for histologically proven breast tumors. More advanced techniques, such as contrast medium-enhanced tomosynthesis, are presently under development, and dual-energy imaging is of particular importance.
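    The shift-and-add idea mentioned above has a compact form: each projection is shifted laterally in proportion to its tube angle and the height of the reconstruction plane, then the projections are averaged, so structures in that plane add coherently while structures at other heights are smeared out. The toy geometry and the linear shift-per-degree model below are simplifying assumptions.

    ```python
    # Toy sketch of shift-and-add tomosynthesis reconstruction. The linear
    # shift-per-degree model and the synthetic projections are simplifications.
    import numpy as np

    def shift_and_add(projections, angles_deg, plane_shift_px_per_deg):
        """Average projections after shifting each one for the chosen plane."""
        out = np.zeros_like(projections[0], dtype=float)
        for proj, ang in zip(projections, angles_deg):
            shift = int(round(ang * plane_shift_px_per_deg))
            out += np.roll(proj, shift, axis=1)       # lateral shift only
        return out / len(projections)

    # Synthetic data: an in-focus structure moves 2 px/deg across the views,
    # an out-of-plane structure moves 5 px/deg.
    angles = np.arange(-7, 8)                          # 15 views over +/- 7 deg
    projections = []
    for a in angles:
        p = np.zeros((64, 128))
        p[32, 64 - 2 * a] = 1.0                        # in-focus structure
        p[16, 64 - 5 * a] = 1.0                        # out-of-plane structure
        projections.append(p)

    recon = shift_and_add(projections, angles, plane_shift_px_per_deg=2)
    print("in-focus peak:", recon[32].max(),
          "out-of-plane peak:", round(recon[16].max(), 3))
    ```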

  14. Learning optimal features for visual pattern recognition

    NASA Astrophysics Data System (ADS)

    Labusch, Kai; Siewert, Udo; Martinetz, Thomas; Barth, Erhardt

    2007-02-01

    The optimal coding hypothesis proposes that the human visual system has adapted to the statistical properties of the environment by the use of relatively simple optimality criteria. We here (i) discuss how the properties of different models of image coding, i.e. sparseness, decorrelation, and statistical independence, are related to each other, (ii) propose to evaluate the different models by verifiable performance measures, and (iii) analyse the classification performance on images of handwritten digits (MNIST database). We first employ the SPARSENET algorithm (Olshausen, 1998) to derive a local filter basis (on 13 × 13 pixel windows). We then filter the images in the database (28 × 28 pixel images of digits) and reduce the dimensionality of the resulting feature space by selecting the locally maximal filter responses. We then train a support vector machine on a training set to classify the digits and report results obtained on a separate test set. Currently, the best state-of-the-art result on the MNIST database has an error rate of 0.4%. This result, however, has been obtained by using explicit knowledge that is specific to the data (an elastic distortion model for digits). We here obtain an error rate of 0.55%, which is second best but does not use explicit data-specific knowledge. In particular, it outperforms by far all methods that do not use data-specific knowledge.

  15. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters but also in the home environment, film archives full of movies in high-definition and above are in the scope of UHD content providers. Movies captured with the traditional film technology represent a virtually unlimited source of UHD content. The goal to maintain complete image information is also related to the choice of scanning resolution and spatial resolution for further distribution. It might seem that scanning the film material in the highest possible resolution using state-of-the-art film scanners and also its distribution in this resolution is the right choice. The information content of the digitized images is however limited, and various degradations moreover lead to its further reduction. Digital distribution of the content in the highest image resolution might be therefore unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in captured scene image and factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow determining recommendations for optimal distribution of digitized video content intended for various display devices with lower resolutions. Obtained results are illustrated on spatial downsampling use case scenario, and performance evaluation of the proposed techniques is presented.

  16. Detection of suspicious pain regions on a digital infrared thermal image using the multimodal function optimization.

    PubMed

    Lee, Junghoon; Lee, Joosung; Song, Sangha; Lee, Hyunsook; Lee, Kyoungjoung; Yoon, Youngro

    2008-01-01

    Automatic detection of suspicious pain regions is very useful in the medical digital infrared thermal imaging research area. To detect those regions, we use the SOFES (Survival Of the Fitness kind of the Evolution Strategy) algorithm, which is one of the multimodal function optimization methods. We apply this algorithm to well-known conditions such as the foot in glycosuria, degenerative arthritis, and varicose veins. The SOFES algorithm is able to detect hot spots or warm lines such as veins, and according to a hundred trials, it converges very quickly.

  17. System for objective assessment of image differences in digital cinema

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Krasula, Lukáš; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2014-09-01

    There is high demand for quick digitization and subsequent image restoration of archived film records. Digitization is very urgent in many cases because various invaluable pieces of cultural heritage are stored on aging media. Only selected records can be reconstructed perfectly using painstaking manual or semi-automatic procedures. This paper aims to answer the question what are the quality requirements on the restoration process in order to obtain acceptably close visual perception of the digitally restored film in comparison to the original analog film copy. This knowledge is very important to preserve the original artistic intention of the movie producers. Subjective experiment with artificially distorted images has been conducted in order to answer the question what is the visual impact of common image distortions in digital cinema. Typical color and contrast distortions were introduced and test images were presented to viewers using digital projector. Based on the outcome of this subjective evaluation a system for objective assessment of image distortions has been developed and its performance tested. The system utilizes calibrated digital single-lens reflex camera and subsequent analysis of suitable features of images captured from the projection screen. The evaluation of captured image data has been optimized in order to obtain predicted differences between the reference and distorted images while achieving high correlation with the results of subjective assessment. The system can be used to objectively determine the difference between analog film and digital cinema images on the projection screen.

  18. Optimized Quasi-Interpolators for Image Reconstruction.

    PubMed

    Sacht, Leonardo; Nehab, Diego

    2015-12-01

    We propose new quasi-interpolators for the continuous reconstruction of sampled images, combining a narrowly supported piecewise-polynomial kernel and an efficient digital filter. In other words, our quasi-interpolators fit within the generalized sampling framework and are straightforward to use. We go against standard practice and optimize for approximation quality over the entire Nyquist range, rather than focusing exclusively on the asymptotic behavior as the sample spacing goes to zero. In contrast to previous work, we jointly optimize with respect to all degrees of freedom available in both the kernel and the digital filter. We consider linear, quadratic, and cubic schemes, offering different tradeoffs between quality and computational cost. Experiments with compounded rotations and translations over a range of input images confirm that, due to the additional degrees of freedom and the more realistic objective function, our new quasi-interpolators perform better than the state of the art, at a similar computational cost.

  19. Threshold matrix for digital halftoning by genetic algorithm optimization

    NASA Astrophysics Data System (ADS)

    Alander, Jarmo T.; Mantere, Timo J.; Pyylampi, Tero

    1998-10-01

    Digital halftoning is used in both low and high resolution high quality printing technologies. Our method is designed to be used mainly for low resolution ink jet marking machines to produce both gray tone and color images. The main problem with digital halftoning is pink noise caused by the human eye's visual transfer function. To compensate for this, the random dot patterns used are optimized to contain more blue than pink noise. Several such dot pattern generator threshold matrices have been created automatically by using genetic algorithm optimization, a non-deterministic global optimization method imitating natural evolution and genetics. A hybrid of a genetic algorithm with a search method based on local backtracking was developed, together with several fitness functions evaluating dot patterns for rectangular grids. By modifying the fitness function, a family of dot generators results, each with its particular statistical features. Several versions of genetic algorithms, backtracking and fitness functions were tested to find a reasonable combination. The generated threshold matrices have been tested by simulating a set of test images using the Khoros image processing system. Even though the work was focused on developing low resolution marking technology, the resulting family of dot generators can also be applied in other halftoning application areas, including high resolution printing technology.
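    Once a threshold matrix has been optimized (by a genetic algorithm or any other method), applying it is simply a tiled comparison of the gray image against the matrix. In the sketch below a recursively built Bayer matrix stands in for an optimized blue-noise matrix.

    ```python
    # Sketch: halftoning a gray image with a threshold matrix. A Bayer matrix
    # is used here as a stand-in for a GA-optimized blue-noise matrix.
    import numpy as np

    def bayer_matrix(n):
        """Return a 2^n x 2^n ordered-dither threshold matrix with values in (0, 1)."""
        m = np.array([[0, 2], [3, 1]], dtype=float)
        for _ in range(n - 1):
            m = np.block([[4 * m + 0, 4 * m + 2],
                          [4 * m + 3, 4 * m + 1]])
        return (m + 0.5) / m.size

    def halftone(gray, threshold_matrix):
        """Binary halftone: a dot fires where the gray level exceeds the threshold."""
        th, tw = threshold_matrix.shape
        h, w = gray.shape
        tiled = np.tile(threshold_matrix, (h // th + 1, w // tw + 1))[:h, :w]
        return (gray > tiled).astype(np.uint8)

    # Halftone a horizontal gray ramp with an 8x8 matrix.
    ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
    dots = halftone(ramp, bayer_matrix(3))
    print("mean dot coverage:", round(dots.mean(), 3))   # close to 0.5 for a full ramp
    ```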

  20. Development of optimized techniques and requirements for computer enhancement of structural weld radiographs. Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Adams, J. R.; Hawley, S. W.; Peterson, G. R.; Salinger, S. S.; Workman, R. A.

    1971-01-01

    A hardware and software specification covering requirements for the computer enhancement of structural weld radiographs was considered. Three scanning systems were used to digitize more than 15 weld radiographs. The performance of these systems was evaluated by determining modulation transfer functions and noise characteristics. Enhancement techniques were developed and applied to the digitized radiographs. The scanning parameters of spot size and spacing and film density were studied to optimize the information content of the digital representation of the image.

  1. Normal and abnormal tissue identification system and method for medical images such as digital mammograms

    NASA Technical Reports Server (NTRS)

    Heine, John J. (Inventor); Clarke, Laurence P. (Inventor); Deans, Stanley R. (Inventor); Stauduhar, Richard Paul (Inventor); Cullers, David Kent (Inventor)

    2001-01-01

    A system and method for analyzing a medical image to determine whether an abnormality is present, for example, in digital mammograms, includes the application of a wavelet expansion to a raw image to obtain subspace images of varying resolution. At least one subspace image is selected that has a resolution commensurate with a desired predetermined detection resolution range. A functional form of a probability distribution function is determined for each selected subspace image, and an optimal statistical normal image region test is determined for each selected subspace image. A threshold level for the probability distribution function is established from the optimal statistical normal image region test for each selected subspace image. A region size comprising at least one sector is defined, and an output image is created that includes a combination of all regions for each selected subspace image. Each region has a first value when the region intensity level is above the threshold and a second value when the region intensity level is below the threshold. This permits the localization of a potential abnormality within the image.

  2. Contrast-enhanced digital mammography (CEDM): imaging modeling, computer simulations, and phantom study

    NASA Astrophysics Data System (ADS)

    Chen, Biao; Jing, Zhenxue; Smith, Andrew

    2005-04-01

    Contrast enhanced digital mammography (CEDM), which is based upon the analysis of a series of x-ray projection images acquired before/after the administration of contrast agents, may provide physicians with critical physiologic and morphologic information on breast lesions to determine the malignancy of lesions. This paper proposes to combine the kinetic analysis (KA) of the contrast agent uptake/washout process and dual-energy (DE) contrast enhancement to formulate a hybrid contrast enhanced breast-imaging framework. The quantitative characteristics of materials and imaging components in the x-ray imaging chain, including x-ray tube (tungsten) spectrum, filter, breast tissues/lesions, contrast agents (non-ionized iodine solution), and selenium detector, were systematically modeled. The contrast-to-noise ratio (CNR) of iodinated lesions and the mean absorbed glandular dose were estimated mathematically. The x-ray technique optimization was conducted through a series of computer simulations to find the optimal tube voltage, filter thickness, and exposure levels for various breast thicknesses, breast densities, and detectable contrast agent concentration levels in terms of detection efficiency (CNR²/dose). A phantom study was performed on a modified Selenia full field digital mammography system to verify the simulated results. The dose level was comparable to the dose in diagnostic mode (less than 4 mGy for an average 4.2 cm compressed breast). The results from the computer simulations and phantom study are being used to optimize an ongoing clinical study.

  3. Theoretical and Monte Carlo optimization of a stacked three-layer flat-panel x-ray imager for applications in multi-spectral diagnostic medical imaging

    NASA Astrophysics Data System (ADS)

    Lopez Maurino, Sebastian; Badano, Aldo; Cunningham, Ian A.; Karim, Karim S.

    2016-03-01

    We propose a new design of a stacked three-layer flat-panel x-ray detector for dual-energy (DE) imaging. Each layer consists of its own scintillator of individual thickness and an underlying thin-film-transistor-based flat-panel. Three images are obtained simultaneously in the detector during the same x-ray exposure, thereby eliminating any motion artifacts. The detector operation is two-fold: a conventional radiography image can be obtained by combining all three layers' images, while a DE subtraction image can be obtained from the front and back layers' images, where the middle layer acts as a mid-filter that helps achieve spectral separation. We proceed to optimize the detector parameters for two sample imaging tasks that could particularly benefit from this new detector by obtaining the best possible signal to noise ratio per root entrance exposure using well-established theoretical models adapted to fit our new design. These results are compared to a conventional DE temporal subtraction detector and a single-shot DE subtraction detector with a copper mid-filter, both of which underwent the same theoretical optimization. The findings are then validated using advanced Monte Carlo simulations for all optimized detector setups. Given the performance expected from initial results and the recent decrease in price for digital x-ray detectors, the simplicity of the three-layer stacked imager approach appears promising to usher in a new generation of multi-spectral digital x-ray diagnostics.

  4. TU-FG-209-11: Validation of a Channelized Hotelling Observer to Optimize Chest Radiography Image Processing for Nodule Detection: A Human Observer Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, A; Little, K; Chung, J

    Purpose: To validate the use of a Channelized Hotelling Observer (CHO) model for guiding image processing parameter selection and enable improved nodule detection in digital chest radiography. Methods: In a previous study, an anthropomorphic chest phantom was imaged with and without PMMA simulated nodules using a GE Discovery XR656 digital radiography system. The impact of image processing parameters was then explored using a CHO with 10 Laguerre-Gauss channels. In this work, we validate the CHO’s trend in nodule detectability as a function of two processing parameters by conducting a signal-known-exactly, multi-reader-multi-case (MRMC) ROC observer study. Five naive readers scored confidence of nodule visualization in 384 images with 50% nodule prevalence. The image backgrounds were regions-of-interest extracted from 6 normal patient scans, and the digitally inserted simulated nodules were obtained from phantom data in previous work. Each patient image was processed with both a near-optimal and a worst-case parameter combination, as determined by the CHO for nodule detection. The same 192 ROIs were used for each image processing method, with 32 randomly selected lung ROIs per patient image. Finally, the MRMC data was analyzed using the freely available iMRMC software of Gallas et al. Results: The image processing parameters which were optimized for the CHO led to a statistically significant improvement (p=0.049) in human observer AUC from 0.78 to 0.86, relative to the image processing implementation which produced the lowest CHO performance. Conclusion: Differences in user-selectable image processing methods on a commercially available digital radiography system were shown to have a marked impact on performance of human observers in the task of lung nodule detection. Further, the effect of processing on humans was similar to the effect on CHO performance. Future work will expand this study to include a wider range of detection/classification tasks and more observers, including experienced chest radiologists.
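
    For readers unfamiliar with the model observer used here, the following is a minimal sketch of a Laguerre-Gauss CHO; the channel width, channel count, and synthetic ROIs are assumptions for illustration, and only the standard CHO formalism (channelization, pooled covariance, Hotelling template) is shown.

    ```python
    # Sketch of a Laguerre-Gauss channelized Hotelling observer (CHO).
    import numpy as np
    from scipy.special import eval_laguerre

    def lg_channels(size, n_channels=10, a=15.0):
        """Return a (size*size, n_channels) matrix of Laguerre-Gauss channels."""
        y, x = np.mgrid[:size, :size] - (size - 1) / 2.0
        r2 = x ** 2 + y ** 2
        chans = []
        for p in range(n_channels):
            g = (np.sqrt(2.0) / a) * np.exp(-np.pi * r2 / a ** 2) \
                * eval_laguerre(p, 2.0 * np.pi * r2 / a ** 2)
            chans.append(g.ravel())
        return np.stack(chans, axis=1)

    def cho_detectability(signal_rois, background_rois, n_channels=10, a=15.0):
        """ROIs have shape (n_images, size, size); returns channelized d'."""
        size = signal_rois.shape[-1]
        U = lg_channels(size, n_channels, a)                 # channel matrix
        vs = signal_rois.reshape(len(signal_rois), -1) @ U   # channel outputs
        vb = background_rois.reshape(len(background_rois), -1) @ U
        dv = vs.mean(axis=0) - vb.mean(axis=0)
        S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vb, rowvar=False))
        w = np.linalg.solve(S, dv)                           # Hotelling template
        return float(np.sqrt(dv @ w))                        # detectability index

    # Example with synthetic ROIs: a faint Gaussian nodule on random backgrounds.
    rng = np.random.default_rng(0)
    size, n = 64, 200
    y, x = np.mgrid[:size, :size] - size / 2
    nodule = 0.6 * np.exp(-(x ** 2 + y ** 2) / (2 * 6.0 ** 2))
    bg = rng.normal(size=(n, size, size))
    sig = rng.normal(size=(n, size, size)) + nodule
    print("CHO d':", cho_detectability(sig, bg))
    ```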

  5. WE-G-204-08: Optimized Digital Radiographic Technique for Lost Surgical Devices/Needle Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorman, A; Seabrook, G; Brakken, A

    Purpose: Small surgical devices and needles are used in many surgical procedures. Conventionally, an x-ray film is taken to identify missing devices/needles if the post-procedure count is incorrect. There is no data to indicate the smallest surgical devices/needles that can be identified with digital radiography (DR), nor its optimized acquisition technique. Methods: In this study, the DR equipment used is a Canon RadPro mobile unit with a CXDI-70c wireless DR plate, and the same DR plate on a fixed Siemens Multix unit. Small surgical devices and needles tested include Rubber Shod, Bulldog, Fogarty Hydrogrip, and needles with sizes 3-0 C-T1 through 8-0 BV175-6. They are imaged with PMMA block phantoms with thicknesses of 2–8 inches, and with an abdomen phantom. Various DR techniques are used. Images are reviewed on the portable x-ray acquisition display, a clinical workstation, and a diagnostic workstation. Results: All small surgical devices and needles are visible in portable DR images with 2–8 inches of PMMA. However, when they are imaged with the abdomen phantom plus 2 inches of PMMA, needles smaller than 9.3 mm in length cannot be visualized at the optimized technique of 81 kV and 16 mAs. There is no significant difference in visualization with various techniques, or between the mobile and fixed radiography units. However, there is a noticeable difference in visualizing the smallest needle on a diagnostic reading workstation compared to the acquisition display on a portable x-ray unit. Conclusion: DR images should be reviewed on a diagnostic reading workstation. Using optimized DR techniques, the smallest needle that can be identified in all phantom studies is 9.3 mm. Sample DR images of various small surgical devices/needles available on a diagnostic workstation for comparison may improve their identification. Further in vivo study is needed to confirm the optimized digital radiography technique for identification of lost small surgical devices and needles.

  6. MO-DE-207-04: Imaging educational program on solutions to common pediatric imaging challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnamurthy, R.

    This imaging educational program will focus on solutions to common pediatric imaging challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. The educational program will begin with a detailed discussion of the optimal configuration of fluoroscopes for general pediatric procedures. Following this introduction will be a focused discussion on the utility of Dual Energy CT for imaging children. The third lecture will address the substantial challenge of obtaining consistent image post-processing in pediatric digital radiography. The fourth and final lecture will address best practices in pediatric MRI, including a discussion of ancillary methods to reduce sedation and anesthesia rates. Learning Objectives: To learn techniques for optimizing radiation dose and image quality in pediatric fluoroscopy. To become familiar with the unique challenges and applications of Dual Energy CT in pediatric imaging. To learn solutions for consistent post-processing quality in pediatric digital radiography. To understand the key components of an effective MRI safety and quality program for the pediatric practice.

  7. Going fully digital: Perspective of a Dutch academic pathology lab

    PubMed Central

    Stathonikos, Nikolas; Veta, Mitko; Huisman, André; van Diest, Paul J.

    2013-01-01

    In recent years, whole slide imaging has become more affordable and widely accepted in pathology labs. Digital slides are increasingly being used for digital archiving of routinely produced clinical slides, remote consultation and tumor boards, and quantitative image analysis for research purposes and in education. However, the implementation of a fully digital pathology department requires an in-depth look into the suitability of digital slides for routine clinical use (the image quality of the produced digital slides and the factors that affect it) and the required infrastructure to support such use (the storage requirements and integration with lab management and hospital information systems). Optimization of the digital pathology workflow requires communication between several systems, which can be facilitated by the use of open standards for digital slide storage and scanner management. Consideration of these aspects, along with appropriate validation of the use of digital slides for routine pathology, can pave the way for pathology departments to go “fully digital.” In this paper, we summarize our experiences so far in the process of implementing a fully digital workflow at our Pathology Department and the steps that are needed to complete this process. PMID:23858390

  8. Applications and challenges of digital pathology and whole slide imaging.

    PubMed

    Higgins, C

    2015-07-01

    Virtual microscopy is a method for digitizing images of tissue on glass slides and using a computer to view, navigate, change magnification, focus and mark areas of interest. Virtual microscope systems (also called digital pathology or whole slide imaging systems) offer several advantages for biological scientists who use slides as part of their general, pharmaceutical, biotechnology or clinical research. The systems usually are based on one of two methodologies: area scanning or line scanning. Virtual microscope systems enable automatic sample detection, virtual-Z acquisition and creation of focal maps. Virtual slides are layered with multiple resolutions at each location, including the highest resolution needed to allow more detailed review of specific regions of interest. Scans may be acquired at 2, 10, 20, 40, 60 and 100 × or a combination of magnifications to highlight important detail. Digital microscopy starts when a slide collection is put into an automated or manual scanning system. The original slides are archived, then a server allows users to review multilayer digital images of the captured slides either by a closed network or by the internet. One challenge for adopting the technology is the lack of a universally accepted file format for virtual slides. Additional challenges include maintaining focus in an uneven sample, detecting specimens accurately, maximizing color fidelity with optimal brightness and contrast, optimizing resolution and keeping the images artifact-free. There are several manufacturers in the field and each has not only its own approach to these issues, but also its own image analysis software, which provides many options for users to enhance the speed, quality and accuracy of their process through virtual microscopy. Virtual microscope systems are widely used and are trusted to provide high quality solutions for teleconsultation, education, quality control, archiving, veterinary medicine, research and other fields.

  9. [Research and realization of signal processing algorithms based on FPGA in digital ophthalmic ultrasonography imaging].

    PubMed

    Fang, Simin; Zhou, Sheng; Wang, Xiaochun; Ye, Qingsheng; Tian, Ling; Ji, Jianjun; Wang, Yanqun

    2015-01-01

    The aim was to design and improve FPGA-based signal processing algorithms for ophthalmic ultrasonography. Three signal processing modules were implemented in the Verilog HDL hardware language using Quartus II: a fully parallel distributed dynamic filter, digital quadrature demodulation, and logarithmic compression. Compared with the original system, the hardware cost is reduced, the image is clearer and contains more information about the deep eyeball, and the detection depth increases from 5 cm to 6 cm. The new algorithms meet the design requirements and optimize the system, and they can effectively improve the image quality of existing equipment.
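
    The sketch below is a floating-point Python model of the three processing stages named in the abstract (band filtering, digital quadrature demodulation, logarithmic compression); the sampling rate, center frequency, filter orders, and display dynamic range are assumed values, and the real design is fixed-point Verilog on an FPGA.

    ```python
    # Floating-point model of the ultrasound receive chain described above.
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs, f0 = 100e6, 10e6                 # sampling rate, transducer center frequency
    t = np.arange(0, 60e-6, 1 / fs)      # one RF line (about 60 us of depth)
    rf = np.exp(-((t - 20e-6) / 2e-6) ** 2) * np.cos(2 * np.pi * f0 * t)  # toy echo

    # 1) Band-pass "dynamic" filter around the carrier.
    b, a = butter(4, [f0 * 0.5 / (fs / 2), f0 * 1.5 / (fs / 2)], btype="band")
    rf_f = filtfilt(b, a, rf)

    # 2) Quadrature demodulation: mix with cos/sin and low-pass the products.
    i_mix = rf_f * np.cos(2 * np.pi * f0 * t)
    q_mix = -rf_f * np.sin(2 * np.pi * f0 * t)
    bl, al = butter(4, (f0 / 2) / (fs / 2), btype="low")
    i_bb, q_bb = filtfilt(bl, al, i_mix), filtfilt(bl, al, q_mix)
    envelope = np.sqrt(i_bb ** 2 + q_bb ** 2)

    # 3) Logarithmic compression to a fixed display dynamic range (60 dB assumed).
    dyn_range_db = 60.0
    env_db = 20 * np.log10(envelope / envelope.max() + 1e-12)
    display = np.clip((env_db + dyn_range_db) / dyn_range_db, 0, 1)
    ```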

  10. Color standardization and optimization in whole slide imaging.

    PubMed

    Yagi, Yukako

    2011-03-30

    Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is variance in the protocols and practices of the histology lab, the displayed color can also be affected by variation in capture parameters (for example, illumination and filters), image processing, and display factors in the digital systems themselves. We have been developing techniques for color validation and optimization along two paths. The first was based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (resembling a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach was based on our previous multispectral imaging research. As a first step, the two-slide method (above) was used to identify inaccurate display of color and its cause, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality and establish image quality standardization. This paper discusses one of the most important aspects of image quality: color.

  11. Composite ultrasound imaging apparatus and method

    DOEpatents

    Morimoto, Alan K.; Bow, Jr., Wallace J.; Strong, David Scott; Dickey, Fred M.

    1998-01-01

    An imaging apparatus and method for use in presenting composite two dimensional and three dimensional images from individual ultrasonic frames. A cross-sectional reconstruction is applied by using digital ultrasound frames, transducer orientation and a known center. Motion compensation, rank value filtering, noise suppression and tissue classification are utilized to optimize the composite image.

  12. Composite ultrasound imaging apparatus and method

    DOEpatents

    Morimoto, A.K.; Bow, W.J. Jr.; Strong, D.S.; Dickey, F.M.

    1998-09-15

    An imaging apparatus and method for use in presenting composite two dimensional and three dimensional images from individual ultrasonic frames. A cross-sectional reconstruction is applied by using digital ultrasound frames, transducer orientation and a known center. Motion compensation, rank value filtering, noise suppression and tissue classification are utilized to optimize the composite image. 37 figs.

  13. Lifting scheme-based method for joint coding 3D stereo digital cinema with luminace correction and optimized prediction

    NASA Astrophysics Data System (ADS)

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing natural, real scenes as we see them in the real world every day is becoming more and more popular. Stereoscopic and multi-view techniques are used to this end. However, because more information is displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the existing correlation between the two images. This is done by designing an efficient transform that reduces the redundancy in the stereo image pair. This approach was inspired by the Lifting Scheme (LS). The novelty in our work is that the prediction step has been replaced by a hybrid step consisting of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for lossless and for lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.
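
    A simplified stand-in for the hybrid prediction step is sketched below: block-based disparity compensation plus a per-block luminance gain, leaving a residual that would then be coded; the block size, search range, and plain SAD criterion are assumptions, and the paper's optimized prediction and update steps are not reproduced.

    ```python
    # Simplified disparity-compensated prediction with luminance correction.
    import numpy as np

    def predict_right_from_left(left, right, block=16, max_disp=32):
        h, w = right.shape
        pred = np.zeros_like(right, dtype=float)
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                target = right[y:y + block, x:x + block].astype(float)
                best, best_sad = None, np.inf
                for d in range(0, min(max_disp, x) + 1):   # horizontal shift only
                    cand = left[y:y + block, x - d:x - d + block].astype(float)
                    sad = np.abs(cand - target).sum()
                    if sad < best_sad:
                        best_sad, best = sad, cand
                gain = target.mean() / (best.mean() + 1e-9)  # luminance correction
                pred[y:y + block, x:x + block] = gain * best
        return pred

    # The residual (right minus prediction) is what would be transform-coded.
    left = np.random.randint(0, 255, (64, 96)).astype(float)
    right = np.roll(left, -4, axis=1) * 1.05                # shifted, brighter view
    residual = right - predict_right_from_left(left, right)
    ```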

  14. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images

    PubMed Central

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K.; Schad, Lothar R.; Zöllner, Frank Gerrit

    2015-01-01

    Background Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. Methods and Results In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin—3,3’-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. Validation To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Context Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics. PMID:26717571

  15. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images.

    PubMed

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K; Schad, Lothar R; Zöllner, Frank Gerrit

    2015-01-01

    Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics.
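
    As a rough illustration of digital re-staining after color deconvolution, the sketch below separates hematoxylin and DAB with scikit-image and maps them onto two perceptually distinct color axes; the colormaps and percentile stretch are arbitrary choices, and the PCA-based optimization criterion of the paper is omitted.

    ```python
    # Sketch of re-mapping an H-DAB image onto a new bivariate color map after
    # color deconvolution. The matplotlib colormaps chosen for the two stain
    # axes are an arbitrary illustration, not the optimized maps of the paper.
    import numpy as np
    from skimage import data, img_as_float
    from skimage.color import rgb2hed
    import matplotlib.cm as cm

    rgb = img_as_float(data.immunohistochemistry())   # sample H-DAB image
    hed = rgb2hed(rgb)                                # color deconvolution
    h, dab = hed[..., 0], hed[..., 2]

    def stretch(x):
        lo, hi = np.percentile(x, (1, 99))
        return np.clip((x - lo) / (hi - lo + 1e-12), 0, 1)

    h_n, dab_n = stretch(h), stretch(dab)

    # Digitally "re-stain": encode hematoxylin and DAB on two distinct color
    # axes and blend them into one display image.
    restained = 0.5 * cm.viridis(h_n)[..., :3] + 0.5 * cm.magma(dab_n)[..., :3]
    ```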

  16. Finding-specific display presets for computed radiography soft-copy reading.

    PubMed

    Andriole, K P; Gould, R G; Webb, W R

    1999-05-01

    Much work has been done to optimize the display of cross-sectional modality imaging examinations for soft-copy reading (i.e., window/level tissue presets, and format presentations such as tile and stack modes, four-on-one, nine-on-one, etc). Less attention has been paid to the display of digital forms of the conventional projection x-ray. The purpose of this study is to assess the utility of providing presets for computed radiography (CR) soft-copy display, based not on the window/level settings, but on processing applied to the image optimized for visualization of specific findings, pathologies, etc (i.e., pneumothorax, tumor, tube location). It is felt that digital display of CR images based on finding-specific processing presets has the potential to: speed reading of digital projection x-ray examinations on soft copy; improve diagnostic efficacy; standardize display across examination type, clinical scenario, important key findings, and significant negatives; facilitate image comparison; and improve confidence in and acceptance of soft-copy reading. Clinical chest images are acquired using an Agfa-Gevaert (Mortsel, Belgium) ADC 70 CR scanner and Fuji (Stamford, CT) 9000 and AC2 CR scanners. Those demonstrating pertinent findings are transferred over the clinical picture archiving and communications system (PACS) network to a research image processing station (Agfa PS5000), where the optimal image-processing settings per finding, pathologic category, etc, are developed in conjunction with a thoracic radiologist, by manipulating the multiscale image contrast amplification (Agfa MUSICA) algorithm parameters. Soft-copy display of images processed with finding-specific settings are compared with the standard default image presentation for 50 cases of each category. Comparison is scored using a 5-point scale with the positive scale denoting the standard presentation is preferred over the finding-specific processing, the negative scale denoting the finding-specific processing is preferred over the standard presentation, and zero denoting no difference. Processing settings have been developed for several findings including pneumothorax and lung nodules, and clinical cases are currently being collected in preparation for formal clinical trials. Preliminary results indicate a preference for the optimized-processing presentation of images over the standard default, particularly by inexperienced radiology residents and referring clinicians.

  17. The use of low cost compact cameras with focus stacking functionality in entomological digitization projects

    PubMed Central

    Mertens, Jan E.J.; Roie, Martijn Van; Merckx, Jonas; Dekoninck, Wouter

    2017-01-01

    Digitization of specimen collections has become a key priority of many natural history museums. The camera systems built for this purpose are expensive, providing a barrier in institutes with limited funding and therefore hampering progress. An assessment is made of whether a low-cost compact camera with image stacking functionality can help expedite the digitization process in large museums or provide smaller institutes and amateur entomologists with the means to digitize their collections. Images from a professional setup were compared with those from the Olympus Stylus TG-4 Tough, a low-cost compact camera with internal focus stacking functions. Parameters considered include image quality, digitization speed, price, and ease-of-use. The compact camera’s image quality, although inferior to the professional setup, is exceptional considering its fourfold lower price point. Producing the image slices in the compact camera is a matter of seconds, and when optimal image quality is less of a priority, the internal stacking function omits the need for dedicated stacking software altogether, further decreasing the cost and speeding up the process. In general, it is found that, aware of its limitations, this compact camera is capable of digitizing entomological collections with sufficient quality. As technology advances, more institutes and amateur entomologists will be able to easily and affordably catalogue their specimens. PMID:29134038

  18. Technique for improving the quality of images from digital cameras using ink-jet printers and smoothed RGB transfer curves

    NASA Astrophysics Data System (ADS)

    Sampat, Nitin; Grim, John F.; O'Hara, James E.

    1998-04-01

    The digital camera market is growing at an explosive rate. At the same time, the quality of photographs printed on ink-jet printers continues to improve. Most consumer cameras are designed with the monitor, not the printer, as the target output device. When users print images from a camera, they need to optimize the camera and printer combination in order to maximize image quality. We describe the details of one such method for improving image quality using an AGFA digital camera and ink-jet printer combination. Using Adobe PhotoShop, we generated optimum red, green, and blue transfer curves that match the scene content to the printer's output capabilities. Application of these curves to the original digital image resulted in a print with more shadow detail, no loss of highlight detail, a smoother tone scale, and more saturated colors. The image also appeared visually more pleasing than those captured and printed without any 'correction'. While we report the results for one camera-printer combination, we tested this technique on numerous digital camera and printer combinations and in each case produced a better-looking image. We also discuss the problems we encountered in implementing this technique.

  19. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.

  20. Examples of subjective image quality enhancement in multimedia

    NASA Astrophysics Data System (ADS)

    Klíma, Miloš; Pazderák, Jiří; Fliegel, Karel

    2007-09-01

    Subjective image quality is an important issue in all multimedia imaging systems, with a significant impact on QoS (Quality of Service). For a long time, the image fidelity criterion was widely applied in technical systems, especially in television and image source compression, but optimizing subjective perceptual quality and optimizing fidelity (such as minimizing MSE) are very different goals. The paper presents experimental testing of different digital techniques for subjective image quality enhancement - color saturation, edge enhancement, denoising operators, and noise addition - well known from both digital photography and video. The evaluation has been done for extensive operator parameterization, and the results are summarized and discussed. It has been demonstrated that there are relevant types of image corrections that improve, to some extent, the subjective perception of the image. The above-mentioned techniques have been tested on five test images with significantly different characteristics (fine details, large saturated color areas, high color contrast, easy-to-remember colors, etc.). The experimental results show the way to optimized use of image-enhancing operators. Finally, the concept of impressiveness as a possible new measure of subjective quality improvement is presented and discussed.

  1. An image based vibration sensor for soft tissue modal analysis in a Digital Image Elasto Tomography (DIET) system.

    PubMed

    Feng, Sheng; Lotz, Thomas; Chase, J Geoffrey; Hann, Christopher E

    2010-01-01

    Digital Image Elasto Tomography (DIET) is a non-invasive elastographic breast cancer screening technology, based on image-based measurement of surface vibrations induced on a breast by mechanical actuation. Knowledge of frequency response characteristics of a breast prior to imaging is critical to maximize the imaging signal and diagnostic capability of the system. A feasibility analysis for a non-invasive image based modal analysis system is presented that is able to robustly and rapidly identify resonant frequencies in soft tissue. Three images per oscillation cycle are enough to capture the behavior at a given frequency. Thus, a sweep over critical frequency ranges can be performed prior to imaging to determine critical imaging settings of the DIET system to optimize its tumor detection performance.

  2. High resolution quantitative phase imaging of live cells with constrained optimization approach

    NASA Astrophysics Data System (ADS)

    Pandiyan, Vimal Prabhu; Khare, Kedar; John, Renu

    2016-03-01

    Quantitative phase imaging (QPI) aims at studying weakly scattering and absorbing biological specimens with subwavelength accuracy without any external staining mechanisms. Use of a reference beam at an angle is a necessary condition for recording high-resolution holograms in most of the interferometric methods used for quantitative phase imaging. The spatial separation of the dc and twin images is decided by the reference beam angle, and the Fourier-filtered reconstructed image will have very poor resolution if the hologram is recorded below a minimum reference angle. However, it is always inconvenient to have a large reference beam angle while performing high-resolution microscopy of live cells and biological specimens with nanometric features. In this paper, we treat the reconstruction of digital holographic microscopy images as a constrained optimization problem with a smoothness constraint, in order to recover the complex object field in the hologram plane even with overlapping dc and twin image terms. We solve this optimization problem iteratively by a gradient descent approach, and the smoothness constraint is implemented by spatial averaging with an appropriate window size. This approach gives excellent high-resolution image recovery compared to Fourier filtering while keeping a very small reference angle. We demonstrate this approach on digital holographic microscopy of live cells by recovering the quantitative phase of live cells from a hologram recorded with a nearly zero reference angle.
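
    A minimal sketch of the reconstruction idea follows, assuming a unit on-axis reference, a |r + o|² forward model, a fixed step size, and uniform-filter smoothing as the smoothness constraint (all illustrative choices, not the authors' exact formulation).

    ```python
    # In-line hologram reconstruction as smoothness-constrained least squares
    # solved by gradient descent (illustrative sketch only).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def reconstruct(hologram, n_iter=200, step=0.1, smooth=3):
        r = 1.0                                   # on-axis unit reference (assumed)
        o = np.zeros_like(hologram, dtype=complex)
        for _ in range(n_iter):
            model = np.abs(r + o) ** 2
            grad = 2.0 * (model - hologram) * (r + o)   # d/do* of the squared error
            o = o - step * grad
            # Smoothness constraint: spatial averaging of real and imaginary parts.
            o = uniform_filter(o.real, smooth) + 1j * uniform_filter(o.imag, smooth)
        return o

    # Synthetic test: a weak phase object imaged with a zero-angle reference.
    y, x = np.mgrid[:128, :128]
    phase = 0.5 * np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 10.0 ** 2))
    obj = 0.2 * np.exp(1j * phase) - 0.2          # weak scattered field
    holo = np.abs(1.0 + obj) ** 2
    recovered_phase = np.angle(1.0 + reconstruct(holo))
    ```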

  3. Methods and reproducibility of grading optimized digital color fundus photographs in the Age-Related Eye Disease Study 2 (AREDS2 Report Number 2).

    PubMed

    Danis, Ronald P; Domalpally, Amitha; Chew, Emily Y; Clemons, Traci E; Armstrong, Jane; SanGiovanni, John Paul; Ferris, Frederick L

    2013-07-08

    To establish continuity with the grading procedures and outcomes from the historical data of the Age-Related Eye Disease Study (AREDS), color photographic imaging and evaluation procedures for the assessment of age-related macular degeneration (AMD) were modified for digital imaging in the AREDS2. The reproducibility of the grading of index AMD lesion components and for the AREDS severity scale was tested at the AREDS2 reading center. Digital color stereoscopic fundus photographs from 4203 AREDS2 subjects collected at baseline and annual follow-up visits were optimized for tonal balance and graded according to a standard protocol slightly modified from AREDS. The reproducibility of digital grading of AREDS2 images was assessed by reproducibility exercises, temporal drift (regrading a subset of baseline annually, n = 88), and contemporaneous masked regrading (ongoing, monthly regrade on 5% of submissions, n = 1335 eyes). In AREDS2, 91% and 96% of images received replicate grades within two steps of the baseline value on the AREDS severity scale for temporal drift and contemporaneous assessment, respectively (weighted Kappa of 0.73 and 0.76). Historical data for temporal drift in replicate gradings on the AREDS film-based images were 88% within two steps (weighted Kappa = 0.88). There was no difference in AREDS2-AREDS concordance for temporal drift (exact P = 0.57). Digital color grading has nearly the same reproducibility as historical film grading. There is substantial agreement for testing the predictive utility of the AREDS severity scale in AREDS2 as a clinical trial outcome. (ClinicalTrials.gov number, NCT00345176.)

  4. Image Segmentation Method Using Fuzzy C Mean Clustering Based on Multi-Objective Optimization

    NASA Astrophysics Data System (ADS)

    Chen, Jinlin; Yang, Chunzhi; Xu, Guangkui; Ning, Li

    2018-04-01

    Image segmentation is not only one of the hottest topics in digital image processing, but also an important part of computer vision applications. Among image segmentation algorithms, fuzzy C-means clustering is an effective and concise segmentation algorithm. However, its drawback is that it is sensitive to image noise. To solve this problem, this paper designs a novel fuzzy C-means clustering algorithm based on multi-objective optimization. We add a parameter λ to the fuzzy distance measurement formula to improve the multi-objective optimization; the parameter λ adjusts the weight of the pixel local information. In the algorithm, the local correlation of neighboring pixels is added to the improved multi-objective mathematical model to optimize the cluster centers. Experiments on two different cases show that the novel fuzzy C-means approach achieves efficient performance and computational time while segmenting images corrupted by different types of noise.
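
    The sketch below shows one common way a weight λ on local-neighborhood information enters a fuzzy C-means update (in the spirit of the well-known FCM_S variant); it is not the paper's exact multi-objective formulation.

    ```python
    # Spatially regularized fuzzy C-means on pixel intensities (FCM_S-style).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def spatial_fcm(image, n_clusters=3, m=2.0, lam=1.0, n_iter=50):
        x = image.astype(float).ravel()
        x_bar = uniform_filter(image.astype(float), 3).ravel()   # neighborhood mean
        centers = np.linspace(x.min(), x.max(), n_clusters)
        for _ in range(n_iter):
            # Distance term plus lambda-weighted local (neighborhood) term.
            d2 = (x[None, :] - centers[:, None]) ** 2 \
               + lam * (x_bar[None, :] - centers[:, None]) ** 2
            d2 = np.maximum(d2, 1e-12)
            u = d2 ** (-1.0 / (m - 1))
            u /= u.sum(axis=0, keepdims=True)                    # memberships
            um = u ** m
            centers = (um @ (x + lam * x_bar)) / ((1 + lam) * um.sum(axis=1))
        labels = u.argmax(axis=0).reshape(image.shape)
        return labels, centers

    # Segment a noisy two-region image.
    img = np.zeros((96, 96)); img[:, 48:] = 1.0
    img += np.random.normal(scale=0.4, size=img.shape)
    labels, centers = spatial_fcm(img, n_clusters=2, lam=2.0)
    ```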

  5. Simultaneous digital super-resolution and nonuniformity correction for infrared imaging systems.

    PubMed

    Meza, Pablo; Machuca, Guillermo; Torres, Sergio; Martin, Cesar San; Vera, Esteban

    2015-07-20

    In this article, we present a novel algorithm to achieve simultaneous digital super-resolution and nonuniformity correction from a sequence of infrared images. We propose to use spatial regularization terms that exploit nonlocal means and the absence of spatial correlation between the scene and the nonuniformity noise sources. We derive an iterative optimization algorithm based on a gradient descent minimization strategy. Results from infrared image sequences corrupted with simulated and real fixed-pattern noise show a competitive performance compared with state-of-the-art methods. A qualitative analysis on the experimental results obtained with images from a variety of infrared cameras indicates that the proposed method provides super-resolution images with significantly less fixed-pattern noise.

  6. Statistical model for speckle pattern optimization.

    PubMed

    Su, Yong; Zhang, Qingchuan; Gao, Zeren

    2017-11-27

    Image registration is the key technique of optical metrologies such as digital image correlation (DIC), particle image velocimetry (PIV), and speckle metrology. Its performance depends critically on the quality of the image pattern, and thus pattern optimization attracts extensive attention. In this article, a statistical model is built to optimize speckle patterns that are composed of randomly positioned speckles. It is found that the process of speckle pattern generation is essentially a filtered Poisson process. The dependence of measurement errors (including systematic errors, random errors, and overall errors) upon speckle pattern generation parameters is characterized analytically. By minimizing the errors, formulas for the optimal speckle radius are presented. Although the primary motivation is from the field of DIC, we believe that scholars in other optical measurement communities, such as PIV and speckle metrology, will benefit from these discussions.
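
    To illustrate the filtered Poisson view of speckle generation, the sketch below draws a Poisson-distributed number of speckles at uniform random positions and sums Gaussian spots; the density and radius values are arbitrary examples, not the optima derived in the paper.

    ```python
    # Speckle pattern generation as a filtered Poisson process (illustrative).
    import numpy as np

    def speckle_pattern(size=256, density=0.01, radius=3.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        n = rng.poisson(density * size * size)        # number of speckles
        xs, ys = rng.uniform(0, size, n), rng.uniform(0, size, n)
        yy, xx = np.mgrid[:size, :size]
        img = np.zeros((size, size))
        for x0, y0 in zip(xs, ys):                    # sum Gaussian "impulses"
            img += np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * radius ** 2))
        return 1.0 - np.clip(img, 0, 1)               # dark speckles on white

    pattern = speckle_pattern()
    ```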

  7. [Optimization of radiological scoliosis assessment].

    PubMed

    Enríquez, Goya; Piqueras, Joaquim; Catalá, Ana; Oliva, Glòria; Ruiz, Agustí; Ribas, Montserrat; Duran, Carmina; Rodrigo, Carlos; Rodríguez, Eugenia; Garriga, Victoria; Maristany, Teresa; García-Fontecha, César; Baños, Joan; Muchart, Jordi; Alava, Fernando

    2014-07-01

    Most cases of scoliosis are idiopathic (80%) and occur more frequently in adolescent girls. Plain radiography is the imaging method of choice for both the initial study and follow-up, but it has the disadvantage of using ionizing radiation. The breasts are exposed to x-rays during these repeated examinations. The authors present a range of recommendations to optimize radiographic examination technique, for both conventional and digital x-ray settings, in order to prevent unnecessary patient radiation exposure and reduce the risk of breast cancer in patients with scoliosis. With analogue systems, leaded breast protectors should always be used, and with any radiographic equipment, analog or digital, the examination should be performed in the postero-anterior projection with optimized low-dose techniques. The ALARA (as low as reasonably achievable) rule should always be followed to achieve diagnostic quality images with the lowest feasible dose. Copyright © 2014. Published by Elsevier Espana.

  8. Investigation of Optimal Digital Image Correlation Patterns for Deformation Measurement

    NASA Technical Reports Server (NTRS)

    Bomarito, G. F.; Ruggles, T. J.; Hochhalter, J. D.; Cannon, A. H.

    2016-01-01

    Digital image correlation (DIC) relies on the surface texture of a specimen to measure deformation. When the specimen itself has little or no texture, a pattern is applied to the surface, which deforms with the specimen and acts as an artificial surface texture. Because the applied pattern has an effect on the accuracy of DIC, an ideal pattern is sought for which the error introduced into DIC measurements is minimal. In this work, a study is performed in which several DIC pattern quality metrics from the literature are correlated to DIC measurement error. The resulting correlations give insight into the optimality of DIC patterns in general. Optimizations are then performed to produce patterns that are well suited for DIC. These patterns are tested to show their relative benefits. Chief among these benefits is a reduction in error of approximately 30% with respect to a randomly generated pattern.

  9. Imaging standards for smart cards

    NASA Astrophysics Data System (ADS)

    Ellson, Richard N.; Ray, Lawrence A.

    1996-02-01

    "Smart cards" are plastic cards the size of credit cards which contain integrated circuits for the storage of digital information. The applications of these cards for image storage has been growing as card data capacities have moved from tens of bytes to thousands of bytes. This has prompted the recommendation of standards by the X3B10 committee of ANSI for inclusion in ISO standards for card image storage of a variety of image data types including digitized signatures and color portrait images. This paper will review imaging requirements of the smart card industry, challenges of image storage for small memory devices, card image communications, and the present status of standards. The paper will conclude with recommendations for the evolution of smart card image standards towards image formats customized to the image content and more optimized for smart card memory constraints.

  10. Imaging standards for smart cards

    NASA Astrophysics Data System (ADS)

    Ellson, Richard N.; Ray, Lawrence A.

    1996-01-01

    'Smart cards' are plastic cards the size of credit cards which contain integrated circuits for the storage of digital information. The applications of these cards for image storage have been growing as card data capacities have moved from tens of bytes to thousands of bytes. This has prompted the recommendation of standards by the X3B10 committee of ANSI for inclusion in ISO standards for card image storage of a variety of image data types including digitized signatures and color portrait images. This paper reviews imaging requirements of the smart card industry, challenges of image storage for small memory devices, card image communications, and the present status of standards. The paper concludes with recommendations for the evolution of smart card image standards towards image formats customized to the image content and more optimized for smart card memory constraints.

  11. General equations for optimal selection of diagnostic image acquisition parameters in clinical X-ray imaging.

    PubMed

    Zheng, Xiaoming

    2017-12-01

    The purpose of this work was to examine the effects of relationship functions between diagnostic image quality and radiation dose on the governing equations for image acquisition parameter variations in X-ray imaging. Various equations were derived for the optimal selection of peak kilovoltage (kVp) and exposure parameter (milliAmpere second, mAs) in computed tomography (CT), computed radiography (CR), and direct digital radiography. Logistic, logarithmic, and linear functions were employed to establish the relationship between radiation dose and diagnostic image quality. The radiation dose to the patient, as a function of image acquisition parameters (kVp, mAs) and patient size (d), was used in radiation dose and image quality optimization. Both logistic and logarithmic functions resulted in the same governing equation for optimal selection of image acquisition parameters using a dose efficiency index. For image quality as a linear function of radiation dose, the same governing equation was derived from the linear relationship. The general equations should be used in guiding clinical X-ray imaging through optimal selection of image acquisition parameters. The radiation dose to the patient could be reduced from current levels in medical X-ray imaging.
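
    As a numerical illustration of selecting an acquisition parameter by maximizing a dose efficiency index, the sketch below pairs a generic logistic quality-dose relation with a generic power-law dose model; both functional forms and all constants are stand-ins, not the paper's fitted equations.

    ```python
    # Selecting kVp by maximizing a dose efficiency index Q/D (toy example).
    import numpy as np

    def dose(kvp, mAs, d_cm, k=5e-5, n=3.0, mu=0.2):
        return k * kvp ** n * mAs * np.exp(-mu * d_cm)      # dose to the patient

    def quality(D, D50=1.0, slope=4.0):
        return 1.0 / (1.0 + np.exp(-slope * (D - D50)))     # logistic Q(D)

    kvps = np.linspace(60, 140, 200)
    mAs, patient_thickness = 10.0, 25.0
    D = dose(kvps, mAs, patient_thickness)
    efficiency = quality(D) / D                             # dose efficiency index
    print("kVp maximizing Q/D:", kvps[int(np.argmax(efficiency))])
    ```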

  12. Computer image processing - The Viking experience. [digital enhancement techniques

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.
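
    For concreteness, minimal versions of the two subjective-enhancement operations mentioned (a percentile-based contrast stretch and a high-pass detail boost) are sketched below; the percentile limits, kernel size, and gain are arbitrary choices.

    ```python
    # Simple contrast stretching and high-pass enhancement (illustrative only).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def contrast_stretch(img, lo_pct=2, hi_pct=98):
        lo, hi = np.percentile(img, (lo_pct, hi_pct))
        return np.clip((img - lo) / (hi - lo + 1e-12), 0, 1)

    def high_pass_boost(img, size=9, gain=1.5):
        low = uniform_filter(img.astype(float), size)
        return img + gain * (img - low)          # boost detail above local mean

    frame = np.random.rand(128, 128) * 0.3 + 0.35   # low-contrast synthetic frame
    enhanced = contrast_stretch(high_pass_boost(frame))
    ```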

  13. Does the choice of display system influence perception and visibility of clinically relevant features in digital pathology images?

    NASA Astrophysics Data System (ADS)

    Kimpe, Tom; Rostang, Johan; Avanaki, Ali; Espig, Kathryn; Xthona, Albert; Cocuranu, Ioan; Parwani, Anil V.; Pantanowitz, Liron

    2014-03-01

    Digital pathology systems typically consist of a slide scanner, processing software, visualization software, and finally a workstation with display for visualization of the digital slide images. This paper studies whether digital pathology images can look different when presenting them on different display systems, and whether these visual differences can result in different perceived contrast of clinically relevant features. By analyzing a set of four digital pathology images of different subspecialties on three different display systems, it was concluded that pathology images look different when visualized on different display systems. The importance of these visual differences is elucidated when they are located in areas of the digital slide that contain clinically relevant features. Based on a calculation of dE2000 differences between background and clinically relevant features, it was clear that perceived contrast of clinically relevant features is influenced by the choice of display system. Furthermore, it seems that the specific calibration target chosen for the display system has an important effect on the perceived contrast of clinically relevant features. Preliminary results suggest that calibrating to DICOM GSDF calibration performed slightly worse than sRGB, while a new experimental calibration target CSDF performed better than both DICOM GSDF and sRGB. This result is promising as it suggests that further research work could lead to better definition of an optimized calibration target for digital pathology images resulting in a positive effect on clinical performance.

  14. Digital Correction of Motion Artifacts in Microscopy Image Sequences Collected from Living Animals Using Rigid and Non-Rigid Registration

    PubMed Central

    Lorenz, Kevin S.; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.

    2013-01-01

    Digital image analysis is a fundamental component of quantitative microscopy. However, intravital microscopy presents many challenges for digital image analysis. In general, microscopy volumes are inherently anisotropic, suffer from decreasing contrast with tissue depth, lack object edge detail, and characteristically have low signal levels. Intravital microscopy introduces the additional problem of motion artifacts, resulting from respiratory motion and heartbeat from specimens imaged in vivo. This paper describes an image registration technique for use with sequences of intravital microscopy images collected in time-series or in 3D volumes. Our registration method involves both rigid and non-rigid components. The rigid registration component corrects global image translations, while the non-rigid component manipulates a uniform grid of control points defined by B-splines. Each control point is optimized by minimizing a cost function consisting of two parts: a term to define image similarity, and a term to ensure deformation grid smoothness. Experimental results indicate that this approach is promising based on the analysis of several image volumes collected from the kidney, lung, and salivary gland of living rodents. PMID:22092443

  15. [Basic concept in computer assisted surgery].

    PubMed

    Merloz, Philippe; Wu, Hao

    2006-03-01

    To investigate the application of medical digital imaging systems and computer technologies in orthopedics. The main computer-assisted surgery systems comprise the following four subcategories. (1) A collection and recording process for digital data on each patient, including preoperative images (CT scans, MRI, standard X-rays), intraoperative visualization (fluoroscopy, ultrasound), and the intraoperative position and orientation of surgical instruments or bone sections (using 3D localizers). Data merging is based on the matching of preoperative imaging (CT scans, MRI, standard X-rays) and intraoperative visualization (anatomical landmarks or bone surfaces digitized intraoperatively via a 3D localizer; intraoperative ultrasound images processed for delineation of bone contours). (2) In cases where only intraoperative images are used for computer-assisted surgical navigation, the calibration of the intraoperative imaging system replaces the merged data system, which is then no longer necessary. (3) A system that provides aid in decision-making, so that the surgical approach is planned on the basis of multimodal information: the interactive positioning of surgical instruments or bone sections transmitted via pre- or intraoperative images, and the display of elements to guide surgical navigation (direction, axis, orientation, length and diameter of a surgical instrument, impingement, etc.). (4) A system that monitors the surgical procedure, thereby ensuring that the optimal strategy defined at the preoperative stage is taken into account. It is possible that computer-assisted orthopedic surgery systems will enable surgeons to better assess the accuracy and reliability of the various operative techniques, an indispensable stage in the optimization of surgery.

  16. Astrometric Observations of Phobos and Deimos During the 1971 Opposition of Mars

    DTIC Science & Technology

    2014-10-06

    measured with the digitizer of the Royal Observatory of Belgium and reduced through an optimal process that includes image, instrumental, and spherical... measurements of planets and satellites that will be used to compute new orbital ephemerides. Since we had demonstrated that a precise digitization and...

  17. Printing line/space patterns on nonplanar substrates using a digital micromirror device-based point-array scanning technique

    NASA Astrophysics Data System (ADS)

    Kuo, Hung-Fei; Kao, Guan-Hsuan; Zhu, Liang-Xiu; Hung, Kuo-Shu; Lin, Yu-Hsin

    2018-02-01

    This study used a digital micromirror device (DMD) to produce point-array patterns and employed a self-developed optical system to define line-and-space patterns on nonplanar substrates. First, field tracing was employed to analyze the aerial images of the lithographic system, which comprised an optical system and the DMD. Multiobjective particle swarm optimization was then applied to determine the spot overlapping rate used. The objective functions were set to minimize linewidth and maximize image log slope, through which the dose of the exposure agent could be effectively controlled and the quality of the nonplanar lithography could be enhanced. Laser beams with 405-nm wavelength were employed as the light source. Silicon substrates coated with photoresist were placed on a nonplanar translation stage. The DMD was used to produce lithographic patterns, during which the parameters were analyzed and optimized. The optimal delay time-sequence combinations were used to scan images of the patterns. Finally, an exposure linewidth of less than 10 μm was successfully achieved using the nonplanar lithographic process.

  18. Spatial resolution requirements for soft-copy reporting in digital radiography

    NASA Astrophysics Data System (ADS)

    Davies, Andrew G.; Cowen, Arnold R.; Fowler, Richard C.; Bury, Robert F.; Parkin, Geoff J. S.; Lintott, David J.; Martinez, Delia; Safudim, Asif

    1996-04-01

    The issue of the spatial resolution required in order to present diagnostic quality digital images, especially for softcopy reporting, has received much attention over recent years. The aim of this study was to compare diagnostic performance when reporting from hardcopy and optimized softcopy image presentations. One hundred fifteen radiographs of the hand acquired on a photostimulable phosphor computed radiography (CR) system were chosen as the image material. The study group was taken from patients who demonstrated subtle erosions of the bone in the digits. The control group consisted of radiologically normal hands. The images were presented in three modes: the CR system's hardcopy output, and softcopy presentations at full and half spatial resolutions. Four consultant radiologists participated as observers. Results were analyzed using the receiver operating characteristic (ROC) technique, and showed a statistically significant improvement in observer performance for both softcopy formats when compared to the hardcopy presentation. However, no significant difference in observer performance was found between the two softcopy presentations. We therefore conclude that, with appropriate attention to the processing and presentation of digital image data, softcopy reporting can, for most examinations, provide superior diagnostic performance, even for images viewed at modest (1K²) resolutions.

  19. Physical characterization and optimal magnification of a portal imaging system

    NASA Astrophysics Data System (ADS)

    Bissonnette, Jean-Pierre; Jaffray, David A.; Fenster, Aaron; Munro, Peter

    1992-06-01

    One problem in radiation therapy is ensuring accurate positioning of the patient so that the prescribed dose is delivered to the diseased regions while healthy tissues are spared. Positioning is usually assessed by exposing film to the high-energy treatment beam. Unfortunately, these films exhibit poor image quality (primarily due to low subject contrast), and the development delays make film impractical for routine checks of patient positioning. Therefore, we have been developing a digital video-based imaging system to replace film. The system consists of a copper plate/fluorescent screen detector, a 45° mirror, and a TV camera equipped with a large-aperture lens. We have determined the signal and noise transfer properties of the imaging system by measuring its MTF(f) and NPS(f), and used these values to estimate the optimal magnification for the imaging system. We have found that the optimal magnification is 2.3 - 2.5 when optimizing signal transfer (spatial resolution) alone; however, the optimal magnification is only 1.5 - 2.0 if SNR transfer is considered.

  20. Imaging and computational considerations for image computed permeability: Operating envelope of Digital Rock Physics

    NASA Astrophysics Data System (ADS)

    Saxena, Nishank; Hows, Amie; Hofmann, Ronny; Alpak, Faruk O.; Freeman, Justin; Hunter, Sander; Appel, Matthias

    2018-06-01

    This study defines the optimal operating envelope of the Digital Rock technology from the perspective of imaging and numerical simulations of transport properties. Imaging larger volumes of rocks for Digital Rock Physics (DRP) analysis improves the chances of achieving a Representative Elementary Volume (REV) at which flow-based simulations (1) do not vary with change in rock volume, and (2) is insensitive to the choice of boundary conditions. However, this often comes at the expense of image resolution. This trade-off exists due to the finiteness of current state-of-the-art imaging detectors. Imaging and analyzing digital rocks that sample the REV and still sufficiently resolve pore throats is critical to ensure simulation quality and robustness of rock property trends for further analysis. We find that at least 10 voxels are needed to sufficiently resolve pore throats for single phase fluid flow simulations. If this condition is not met, additional analyses and corrections may allow for meaningful comparisons between simulation results and laboratory measurements of permeability, but some cases may fall outside the current technical feasibility of DRP. On the other hand, we find that the ratio of field of view and effective grain size provides a reliable measure of the REV for siliciclastic rocks. If this ratio is greater than 5, the coefficient of variation for single-phase permeability simulations drops below 15%. These imaging considerations are crucial when comparing digitally computed rock flow properties with those measured in the laboratory. We find that the current imaging methods are sufficient to achieve both REV (with respect to numerical boundary conditions) and required image resolution to perform digital core analysis for coarse to fine-grained sandstones.
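
    The two operating-envelope criteria quoted above (at least 10 voxels across a pore throat, and a field-of-view to grain-size ratio above 5) can be checked with a few lines; the example numbers below are hypothetical.

    ```python
    # Quick check of the stated Digital Rock Physics operating-envelope criteria.
    def drp_envelope(voxel_size_um, pore_throat_um, fov_um, grain_size_um):
        throat_voxels = pore_throat_um / voxel_size_um
        fov_ratio = fov_um / grain_size_um
        return {
            "pore throat resolved (>= 10 voxels)": throat_voxels >= 10,
            "REV reached (FOV/grain > 5)": fov_ratio > 5,
            "throat_voxels": throat_voxels,
            "fov_ratio": fov_ratio,
        }

    print(drp_envelope(voxel_size_um=2.0, pore_throat_um=25.0,
                       fov_um=2000.0, grain_size_um=250.0))
    ```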

  1. A novel method for repeatedly generating speckle patterns used in digital image correlation

    NASA Astrophysics Data System (ADS)

    Zhang, Juan; Sweedy, Ahmed; Gitzhofer, François; Baroud, Gamal

    2018-01-01

    Speckle patterns play a key role in Digital Image Correlation (DIC) measurement, and generating an optimal speckle pattern has been the goal for decades now. The usual method of generating a speckle pattern is by manually spraying the paint on the specimen. However, this makes it difficult to reproduce the optimal pattern for maintaining identical testing conditions and achieving consistent DIC results. This study proposed and evaluated a novel method using an atomization system to repeatedly generate speckle patterns. To verify the repeatability of the speckle patterns generated by this system, simulation and experimental studies were systematically performed. The results from both studies showed that the speckle patterns and, accordingly, the DIC measurements become highly accurate and repeatable using the proposed atomization system.

  2. Homogeneous Canine Chest Phantom Construction: A Tool for Image Quality Optimization.

    PubMed

    Pavan, Ana Luiza Menegatti; Rosa, Maria Eugênia Dela; Giacomini, Guilherme; Bacchim Neto, Fernando Antonio; Yamashita, Seizo; Vulcano, Luiz Carlos; Duarte, Sergio Barbosa; Miranda, José Ricardo de Arruda; de Pina, Diana Rodrigues

    2016-01-01

    Digital radiographic imaging is increasingly used in veterinary practice. The use of radiation demands responsibility to maintain high image quality. Low doses are necessary because workers are requested to restrain the animal. Optimizing digital systems is necessary to avoid unnecessary exposure and the resulting phenomenon known as dose creep. Homogeneous phantoms are widely used to optimize image quality and dose. We developed an automatic computational methodology to classify and quantify tissues (i.e., lung tissue, adipose tissue, muscle tissue, and bone) in canine chest computed tomography exams. The thickness of each tissue was converted to simulator materials (i.e., Lucite, aluminum, and air). Dogs were separated into groups of 20 animals each according to weight. Mean weights were 6.5 ± 2.0 kg, 15.0 ± 5.0 kg, 32.0 ± 5.5 kg, and 50.0 ± 12.0 kg for the small, medium, large, and giant groups, respectively. One-way analysis of variance revealed significant differences in all simulator material thicknesses (p < 0.05) quantified between groups. As a result, four phantoms were constructed for dorsoventral and lateral views. In conclusion, the present methodology allows the development of phantoms of the canine chest and possibly other body regions and/or animals. The proposed phantom is a practical tool that may be employed in future work to optimize veterinary X-ray procedures.

  3. Homogeneous Canine Chest Phantom Construction: A Tool for Image Quality Optimization

    PubMed Central

    2016-01-01

    Digital radiographic imaging is increasingly used in veterinary practice. The use of radiation demands responsibility to maintain high image quality. Low doses are necessary because workers are requested to restrain the animal. Optimizing digital systems is necessary to avoid unnecessary exposure and the resulting phenomenon known as dose creep. Homogeneous phantoms are widely used to optimize image quality and dose. We developed an automatic computational methodology to classify and quantify tissues (i.e., lung tissue, adipose tissue, muscle tissue, and bone) in canine chest computed tomography exams. The thickness of each tissue was converted to simulator materials (i.e., Lucite, aluminum, and air). Dogs were separated into groups of 20 animals each according to weight. Mean weights were 6.5 ± 2.0 kg, 15.0 ± 5.0 kg, 32.0 ± 5.5 kg, and 50.0 ± 12.0 kg for the small, medium, large, and giant groups, respectively. One-way analysis of variance revealed significant differences in all simulator material thicknesses (p < 0.05) quantified between groups. As a result, four phantoms were constructed for dorsoventral and lateral views. In conclusion, the present methodology allows the development of phantoms of the canine chest and possibly other body regions and/or animals. The proposed phantom is a practical tool that may be employed in future work to optimize veterinary X-ray procedures. PMID:27101001

  4. Three-dimensional Imaging and Scanning: Current and Future Applications for Pathology

    PubMed Central

    Farahani, Navid; Braun, Alex; Jutt, Dylan; Huffman, Todd; Reder, Nick; Liu, Zheng; Yagi, Yukako; Pantanowitz, Liron

    2017-01-01

    Imaging is vital for the assessment of physiologic and phenotypic details. In the past, biomedical imaging was heavily reliant on analog, low-throughput methods, which would produce two-dimensional images. However, newer, digital, and high-throughput three-dimensional (3D) imaging methods, which rely on computer vision and computer graphics, are transforming the way biomedical professionals practice. 3D imaging has been useful in diagnostic, prognostic, and therapeutic decision-making for the medical and biomedical professions. Herein, we summarize current imaging methods that enable optimal 3D histopathologic reconstruction: Scanning, 3D scanning, and whole slide imaging. Briefly mentioned are emerging platforms, which combine robotics, sectioning, and imaging in their pursuit to digitize and automate the entire microscopy workflow. Finally, both current and emerging 3D imaging methods are discussed in relation to current and future applications within the context of pathology. PMID:28966836

  5. Digitized hand-wrist radiographs: comparison of subjective and software-derived image quality at various compression ratios.

    PubMed

    McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R

    2007-05-01

    The objectives of this study were to assess the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative ratings of image quality and to compare these ratings with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R² > 0.92) between the qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.

  6. A GPU Simulation Tool for Training and Optimisation in 2D Digital X-Ray Imaging.

    PubMed

    Gallio, Elena; Rampado, Osvaldo; Gianaria, Elena; Bianchi, Silvio Diego; Ropolo, Roberto

    2015-01-01

    Conventional radiology is performed by means of digital detectors, with various types of technology and different performance in terms of efficiency and image quality. Following the arrival of a new digital detector in a radiology department, all the staff involved should adapt the procedure parameters to the properties of the detector, in order to achieve an optimal result in terms of correct diagnostic information and minimum radiation risk for the patient. The aim of this study was to develop and validate software capable of simulating a digital X-ray imaging system using graphics processing unit computing. All radiological image components were implemented in this application: an X-ray tube with primary beam, a virtual patient, noise, scatter radiation, a grid, and a digital detector. Three different digital detectors (two digital radiography systems and a computed radiography system) were implemented. In order to validate the software, we carried out a quantitative comparison of simulated images of geometrical and anthropomorphic phantoms with acquired images. In terms of average pixel values, the maximum differences were below 15%, while the noise values were in agreement within a maximum difference of 20%. The relative trends of contrast-to-noise ratio versus beam energy and intensity were well simulated. Total calculation times were below 3 seconds for clinical images with actual pixel dimensions of less than 0.2 mm. The application proved to be efficient and realistic. Short calculation times and the accuracy of the results obtained make this software a useful tool for training operators and for dose optimisation studies.

  7. Photography in Dermatologic Surgery: Selection of an Appropriate Camera Type for a Particular Clinical Application.

    PubMed

    Chen, Brian R; Poon, Emily; Alam, Murad

    2017-08-01

    Photographs are an essential tool for the documentation and sharing of findings in dermatologic surgery, and various camera types are available. To evaluate the currently available camera types in view of the special functional needs of procedural dermatologists. Mobile phone, point and shoot, digital single-lens reflex (DSLR), digital medium format, and 3-dimensional cameras were compared in terms of their usefulness for dermatologic surgeons. For each camera type, the image quality, as well as the other practical benefits and limitations, were evaluated with reference to a set of ideal camera characteristics. Based on these assessments, recommendations were made regarding the specific clinical circumstances in which each camera type would likely be most useful. Mobile photography may be adequate when ease of use, availability, and accessibility are prioritized. Point and shoot cameras and DSLR cameras provide sufficient resolution for a range of clinical circumstances, while providing the added benefit of portability. Digital medium format cameras offer the highest image quality, with accurate color rendition and greater color depth. Three-dimensional imaging may be optimal for the definition of skin contour. The selection of an optimal camera depends on the context in which it will be used.

  8. Color correction pipeline optimization for digital cameras

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Bruna, Arcangelo R.; Naccari, Filippo; Schettini, Raimondo

    2013-04-01

    The processing pipeline of a digital camera converts the RAW image acquired by the sensor to a representation of the original scene that should be as faithful as possible. There are mainly two modules responsible for the color-rendering accuracy of a digital camera: the former is the illuminant estimation and correction module, and the latter is the color matrix transformation aimed to adapt the color response of the sensor to a standard color space. These two modules together form what may be called the color correction pipeline. We design and test new color correction pipelines that exploit different illuminant estimation and correction algorithms that are tuned and automatically selected on the basis of the image content. Since the illuminant estimation is an ill-posed problem, illuminant correction is not error-free. An adaptive color matrix transformation module is optimized, taking into account the behavior of the first module in order to alleviate the amplification of color errors. The proposed pipelines are tested on a publicly available dataset of RAW images. Experimental results show that exploiting the cross-talks between the modules of the pipeline can lead to a higher color-rendition accuracy.
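
    For orientation, a minimal sketch of the two pipeline modules discussed above, using a simple gray-world illuminant estimate followed by a fixed 3x3 color matrix; both the estimator and the matrix values are illustrative stand-ins, not the content-adaptive algorithms proposed in the paper.

      import numpy as np

      def gray_world_gains(raw_rgb):
          """Gray-world illuminant estimate (a stand-in for content-adaptive estimators)."""
          means = raw_rgb.reshape(-1, 3).mean(axis=0)
          return means.mean() / means                     # per-channel gains

      def color_correct(raw_rgb, matrix):
          """Apply white balance, then a 3x3 color matrix toward a standard color space."""
          balanced = raw_rgb * gray_world_gains(raw_rgb)
          corrected = balanced @ matrix.T                 # per-pixel linear transform
          return np.clip(corrected, 0.0, 1.0)

      # Illustrative matrix only; a real pipeline would optimize it jointly with the
      # residual illuminant-correction error, as the abstract describes.
      ccm = np.array([[ 1.6, -0.4, -0.2],
                      [-0.3,  1.5, -0.2],
                      [-0.1, -0.5,  1.6]])
      raw = np.random.rand(4, 4, 3)
      print(color_correct(raw, ccm).shape)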

  9. A Preliminary Comparison of Three Dimensional Particle Tracking and Sizing using Plenoptic Imaging and Digital In-line Holography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham

    2015-12-01

    Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as to develop data processing methodologies optimized for the plenoptic measurement.

  10. A Preliminary Comparison of Three Dimensional Particle Tracking and Sizing using Plenoptic Imaging and Digital In-line Holography [PowerPoint]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guildenbecher, Daniel Robert; Munz, Elise Dahnke; Farias, Paul Abraham

    2015-12-01

    Digital in-line holography and plenoptic photography are two techniques for single-shot, volumetric measurement of 3D particle fields. Here we present a preliminary comparison of the two methods by applying plenoptic imaging to experimental configurations that have been previously investigated with digital in-line holography. These experiments include the tracking of secondary droplets from the impact of a water drop on a thin film of water and tracking of pellets from a shotgun. Both plenoptic imaging and digital in-line holography successfully quantify the 3D nature of these particle fields. This includes measurement of the 3D particle position, individual particle sizes, and three-component velocity vectors. For the initial processing methods presented here, both techniques give out-of-plane positional accuracy of approximately 1-2 particle diameters. For a fixed image sensor, digital holography achieves higher effective in-plane spatial resolutions. However, collimated and coherent illumination makes holography susceptible to image distortion through index of refraction gradients, as demonstrated in the shotgun experiments. On the other hand, plenoptic imaging allows for a simpler experimental configuration. Furthermore, due to the use of diffuse, white-light illumination, plenoptic imaging is less susceptible to image distortion in the shotgun experiments. Additional work is needed to better quantify sources of uncertainty, particularly in the plenoptic experiments, as well as to develop data processing methodologies optimized for the plenoptic measurement.

  11. A Closed-Loop Proportional-Integral (PI) Control Software for Fully Mechanically Controlled Automated Electron Microscopic Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    REN, GANG; LIU, JINXIN; LI, HONGCHANG

    A closed-loop proportional-integral (PI) control software package is provided for fully mechanically controlled automated electron microscopic tomography. The software is developed based on Gatan DigitalMicrograph and is compatible with the Zeiss LIBRA 120 transmission electron microscope, but it can be extended to other TEM instruments with modification. The software consists of a graphical user interface, a digital PI controller, an image analyzing unit, and other drive units (i.e., an image acquisition unit and a goniometer drive unit). During a tomography data collection process, the image analyzing unit analyzes both the accumulated shift and the defocus value of the latest acquired image and provides the results to the digital PI controller. The digital PI controller compares the results with the preset values and determines the optimum adjustments of the goniometer. The goniometer drive unit adjusts the spatial position of the specimen according to the instructions given by the digital PI controller for the next tilt angle and image acquisition. The goniometer drive unit achieves high-precision positioning by using a backlash elimination method. The major benefits of the software are: 1) the goniometer drive unit keeps pre-aligned/optimized beam conditions unchanged and achieves position tracking solely through mechanical control; 2) the image analyzing unit relies only on historical data and therefore does not require additional images/exposures; 3) the PI controller enables the system to dynamically track the imaging target with extremely low system error.
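
    A minimal sketch of a discrete PI update of the kind the abstract describes, driving a measured image shift back toward zero between tilt steps; the DiscretePI class, gains, and toy plant response are illustrative assumptions, not the released DigitalMicrograph implementation.

      class DiscretePI:
          """Minimal discrete PI controller; gains and units are placeholders."""
          def __init__(self, kp, ki, dt):
              self.kp, self.ki, self.dt = kp, ki, dt
              self.integral = 0.0

          def update(self, setpoint, measurement):
              error = setpoint - measurement
              self.integral += error * self.dt
              return self.kp * error + self.ki * self.integral

      # e.g. drive the measured image shift (from the image-analysis unit) back to zero
      pi = DiscretePI(kp=0.6, ki=0.1, dt=1.0)
      shift_nm = 120.0
      for _ in range(5):
          correction = pi.update(0.0, shift_nm)   # goniometer move for the next tilt
          shift_nm += correction * 0.8            # toy plant response
          print(round(shift_nm, 1))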

  12. Integrated optical 3D digital imaging based on DSP scheme

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.

    2008-03-01

    We present a scheme for integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently, without PC support. The scheme is built on a parallel hardware structure that uses a DSP and a field-programmable gate array (FPGA) to realize 3-D imaging, and it adopts phase measurement profilometry. To realize pipeline processing of fringe projection, image acquisition, and fringe pattern analysis, we present a multi-threaded application program developed under the DSP/BIOS RTOS (real-time operating system). The RTOS provides a preemptive kernel and a powerful configuration tool, with which we are able to achieve real-time scheduling and synchronization. To accelerate automatic fringe analysis and phase unwrapping, we make use of software optimization techniques. The proposed scheme reaches a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and fast 3-D imaging. Experimental results are also presented to show the validity of the proposed scheme.
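
    The abstract adopts phase measurement profilometry; as one common variant of the fringe-analysis step, a standard four-step phase-shifting sketch is shown below. The four_step_phase() helper and the synthetic fringes are illustrative, not the authors' DSP code.

      import numpy as np

      def four_step_phase(i0, i1, i2, i3):
          """Standard 4-step phase-shifting formula (shifts of 0, pi/2, pi, 3pi/2);
          returns the wrapped phase in (-pi, pi]."""
          return np.arctan2(i3 - i1, i0 - i2)

      # toy fringes of a tilted plane
      x = np.linspace(0, 4 * np.pi, 256)
      true_phase = np.tile(x, (256, 1))
      frames = [np.cos(true_phase + k * np.pi / 2) for k in range(4)]
      wrapped = four_step_phase(*frames)
      unwrapped = np.unwrap(wrapped, axis=1)       # 1-D unwrap along the fringe direction
      print(np.allclose(unwrapped, true_phase))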

  13. Computer-based desktop system for surgical videotape editing.

    PubMed

    Vincent-Hamelin, E; Sarmiento, J M; de la Puente, J M; Vicente, M

    1997-05-01

    The educational role of surgical video presentations should be optimized by linking surgical images to a graphic evaluation of indications, techniques, and results. We describe a PC-based video production system for personal editing of surgical tapes according to the objectives of each presentation. The hardware requirement is a personal computer (100 MHz processor, 1-GB hard disk, 16 MB RAM) with a PC-to-TV/video transfer card plugged into a slot. Computer-generated numerical data, texts, and graphics are transformed into analog signals displayed on TV/video. A Genlock interface (a special interface card) synchronizes digital and analog signals to overlay surgical images with electronic illustrations. The presentation is stored as digital information or recorded on a tape. The proliferation of multimedia tools is leading us to adapt presentations to the objectives of lectures and to integrate conceptual analyses with dynamic image-based information. We describe a system that handles both digital and analog signals, with the production being recorded on a tape. Movies may be managed in a digital environment, with either an "on-line" or "off-line" approach. System requirements are high, but handling a single device optimizes editing without incurring such complexity that management becomes impractical for surgeons. Our experience suggests that computerized editing allows surgical scientific and didactic messages to be linked on a single communication medium, either a videotape or a CD-ROM.

  14. Information theoretic analysis of linear shift-invariant edge-detection operators

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2012-06-01

    Generally, the design of digital image processing algorithms and of image gathering devices remains separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end, information-theory-based system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible value. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
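
    As a worked form of the figure of merit implied above, the sketch below evaluates the Shannon rate of a linear imaging channel with additive Gaussian noise; the analytic scene PSD, system MTF, and noise level are illustrative assumptions in the spirit of the Huck et al. analysis, not their exact formulation.

      import numpy as np

      def information_rate(scene_psd, mtf, noise_psd, f):
          """Shannon rate for a linear, additive-Gaussian imaging channel:
          H = 1/2 * integral log2(1 + S(f)|T(f)|^2 / N(f)) df."""
          snr = scene_psd * mtf ** 2 / noise_psd
          return 0.5 * np.trapz(np.log2(1.0 + snr), f)

      f = np.linspace(0.01, 0.5, 200)                  # cycles/sample up to Nyquist
      scene = 1.0 / (1.0 + (f / 0.1) ** 2)             # illustrative scene PSD
      system_mtf = np.sinc(f)                          # sampling-aperture stand-in
      noise = np.full_like(f, 1e-3)
      print(information_rate(scene, system_mtf, noise, f))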

  15. Accuracy of DSM based on digital aerial image matching. (Polish Title: Dokładność NMPT tworzonego metodą automatycznego dopasowania cyfrowych zdjęć lotniczych)

    NASA Astrophysics Data System (ADS)

    Kubalska, J. L.; Preuss, R.

    2013-12-01

    Digital Surface Models (DSMs) are increasingly used in GIS databases as standalone products. They are also necessary for creating other products such as 3D city models, true-orthophotos, and object-oriented classifications. This article presents results of DSM generation for the classification of vegetation in urban areas. The source data allowed DSMs to be produced using both an image matching method and ALS data. The creation of the DSM from digital images, obtained with an Ultra Cam-D digital Vexcel camera, was carried out in Match-T by INPHO. This program optimizes the configuration of the image matching process, which ensures high accuracy and minimizes gap areas. The accuracy of this process was analyzed by comparing the DSM generated in Match-T with the DSM generated from ALS data. Because of the intended use of the generated DSM, it was decided to create the model as a GRID structure with a cell size of 1 m. With this parameter, a differential model from both DSMs was also built, which allowed the relative accuracy of the compared models to be determined. The analysis indicates that DSM generation with the multi-image matching method is competitive with creating the same surface model from ALS data. Thus, when digital images with high overlap are available, the additional acquisition of ALS data seems to be unnecessary.

  16. Digital to analog conversion and visual evaluation of Thematic Mapper data

    USGS Publications Warehouse

    McCord, James R.; Binnie, Douglas R.; Seevers, Paul M.

    1985-01-01

    As a part of the National Aeronautics and Space Administration Landsat D Image Data Quality Analysis Program, the Earth Resources Observation Systems Data Center (EDC) developed procedures to optimize the visual information content of Thematic Mapper data and evaluate the resulting photographic products by visual interpretation. A digital-to-analog transfer function was developed which would properly place the digital values on the most useable portion of a film response curve. Individual black-and-white transparencies generated using the resulting look-up tables were utilized in the production of color-composite images with varying band combinations. Four experienced photointerpreters ranked 2-cm-diameter (0.75 inch) chips of selected image features of each band combination for ease of interpretability. A nonparametric rank-order test determined the significance of interpreter preference for the band combinations.
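
    A minimal sketch of the kind of digital-to-analog look-up table described above, stretching a useful DN range onto the quasi-linear portion of a film response curve; the build_lut() helper and the density endpoints are placeholders, not the EDC transfer function.

      import numpy as np

      def build_lut(dn_min, dn_max, density_min=0.3, density_max=2.2, bits=8):
          """Illustrative look-up table: map the useful DN range linearly onto the
          quasi-linear portion of a film response curve (placeholder endpoints)."""
          dn = np.arange(2 ** bits)
          t = np.clip((dn - dn_min) / float(dn_max - dn_min), 0.0, 1.0)
          return density_min + t * (density_max - density_min)

      lut = build_lut(dn_min=12, dn_max=180)
      print(lut[[0, 12, 96, 180, 255]])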

  18. Digital mammography: physical principles and future applications.

    PubMed

    Gambaccini, Mauro; Baldelli, Paola

    2003-01-01

    Mammography is currently considered the best tool for the detection of breast cancer, a pathology whose incidence is constantly increasing. To produce the radiological picture, a screen-film combination is conventionally used. One of the inherent limitations of the screen-film combination is the fact that the detection, display, and storage processes are one and the same, making it impossible to optimize each stage separately. These limitations can be overcome with digital systems. In this work we evaluate the main characteristics of the digital detectors available on the market and compare the performance of digital and conventional systems. Because images can be processed, digital mammography offers many potential advantages, among them the dual-energy technique, which combines two digital images obtained at two different energies to enhance the inherent contrast of pathologies by removing the uniform background. This technique was previously tested using a synchrotron monochromatic beam and a digital detector, and then the Senographe 2000D full-field digital system manufactured by GE Medical Systems. In this work we present preliminary results and the future applications of this technique.

  19. A comprehensive study on urban true orthorectification

    USGS Publications Warehouse

    Zhou, G.; Chen, W.; Kelmelis, J.A.; Zhang, Dongxiao

    2005-01-01

    To provide some advanced technical bases (algorithms and procedures) and experience needed for national large-scale digital orthophoto generation and revision of the Standards for National Large-Scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP), this paper presents a comprehensive study on theories, algorithms, and methods of large-scale urban orthoimage generation. The procedures of orthorectification for digital terrain model (DTM)-based and digital building model (DBM)-based orthoimage generation and their mergence for true orthoimage generation are discussed in detail. A method of compensating for building occlusions using photogrammetric geometry is developed. The data structure needed to model urban buildings for accurately generating urban orthoimages is presented. Shadow detection and removal, the optimization of seamline for automatic mosaic, and the radiometric balance of neighbor images are discussed. Street visibility analysis, including the relationship between flight height, building height, street width, and relative location of the street to the imaging center, is analyzed for complete true orthoimage generation. The experimental results demonstrated that our method can effectively and correctly orthorectify the displacements caused by terrain and buildings in urban large-scale aerial images. © 2005 IEEE.

  20. Optimal Binarization of Gray-Scaled Digital Images via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A. (Inventor); Klinko, Steven J. (Inventor)

    2007-01-01

    A technique for finding an optimal threshold for binarization of a gray scale image employs fuzzy reasoning. A triangular membership function is employed which is dependent on the degree to which the pixels in the image belong to either the foreground class or the background class. Use of a simplified linear fuzzy entropy factor function facilitates short execution times and use of membership values between 0.0 and 1.0 for improved accuracy. To improve accuracy further, the membership function employs lower and upper bound gray level limits that can vary from image to image and are selected to be equal to the minimum and the maximum gray levels, respectively, that are present in the image to be converted. To identify the optimal binarization threshold, an iterative process is employed in which different possible thresholds are tested and the one providing the minimum fuzzy entropy measure is selected.
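
    A minimal sketch of the general idea, assuming a symmetric triangular membership and a linear fuzzy entropy factor; the fuzzy_threshold() helper is illustrative and simplified, not the patented implementation.

      import numpy as np

      def fuzzy_threshold(gray):
          """Scan candidate thresholds and keep the one minimizing a simple fuzzy
          entropy measure built from a triangular membership function."""
          g = gray.astype(float).ravel()
          gmin, gmax = g.min(), g.max()            # image-specific gray-level bounds
          best_t, best_entropy = None, np.inf
          for t in np.linspace(gmin + 1, gmax - 1, 64):
              # membership of each pixel in its own class: 1.0 at the class extreme,
              # 0.5 at the threshold
              mu = np.where(g >= t,
                            0.5 + 0.5 * (g - t) / max(gmax - t, 1e-9),
                            0.5 + 0.5 * (t - g) / max(t - gmin, 1e-9))
              entropy = np.mean(1.0 - np.abs(2.0 * mu - 1.0))   # linear entropy factor
              if entropy < best_entropy:
                  best_t, best_entropy = t, entropy
          return best_t

      rng = np.random.default_rng(0)
      img = np.concatenate([rng.normal(60, 8, 5000),
                            rng.normal(180, 8, 5000)]).reshape(100, 100)
      t = fuzzy_threshold(img)
      print(round(t, 1), (img >= t).mean())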

  1. Digitizing an Analog Radiography Teaching File Under Time Constraint: Trade-Offs in Efficiency and Image Quality.

    PubMed

    Loehfelm, Thomas W; Prater, Adam B; Debebe, Tequam; Sekhar, Aarti K

    2017-02-01

    We digitized the radiography teaching file at Black Lion Hospital (Addis Ababa, Ethiopia) during a recent trip, using a standard digital camera and a fluorescent light box. Our goal was to photograph every radiograph in the existing library while optimizing the final image size to the maximum resolution of a high quality tablet computer, preserving the contrast resolution of the radiographs, and minimizing total library file size. A secondary important goal was to minimize the cost and time required to take and process the images. Three workers were able to efficiently remove the radiographs from their storage folders, hang them on the light box, operate the camera, catalog the image, and repack the radiographs back to the storage folder. Zoom, focal length, and film speed were fixed, while aperture and shutter speed were manually adjusted for each image, allowing for efficiency and flexibility in image acquisition. Keeping zoom and focal length fixed, which kept the view box at the same relative position in all of the images acquired during a single photography session, allowed unused space to be batch-cropped, saving considerable time in post-processing, at the expense of final image resolution. We present an analysis of the trade-offs in workflow efficiency and final image quality, and demonstrate that a few people with minimal equipment can efficiently digitize a teaching file library.

  2. Ortho Image and DTM Generation with Intelligent Methods

    NASA Astrophysics Data System (ADS)

    Bagheri, H.; Sadeghian, S.

    2013-10-01

    Nowadays, artificial intelligence algorithms are being considered in GIS and remote sensing. Genetic algorithms and artificial neural networks are two intelligent methods used for optimizing image processing programs such as edge extraction; these algorithms are very useful for solving complex problems. In this paper, the ability and application of genetic algorithms and artificial neural networks in geospatial production processes, such as geometric modelling of satellite images for orthophoto generation and height interpolation in raster Digital Terrain Model production, are discussed. First, the geometric potential of Ikonos-2 and Worldview-2 with rational functions and 2D & 3D polynomials was tested. Comprehensive experiments were also carried out to evaluate the viability of the genetic algorithm for optimization of rational functions and 2D & 3D polynomials. Considering the quality of the Ground Control Points, the accuracy (RMSE) with the genetic algorithm and the 3D polynomial method for the Ikonos-2 Geo image was 0.508 pixels, and the accuracy (RMSE) with the GA and the rational function method for the Worldview-2 image was 0.930 pixels. As a further artificial intelligence optimization method, neural networks were used. With the use of a perceptron network on the Worldview-2 image, a result of 0.84 pixels was obtained with 4 neurons in the middle layer. The conclusion is that with artificial intelligence algorithms it is possible to optimize the existing models and obtain better results than with the usual ones. Finally, the artificial intelligence methods, genetic algorithms as well as neural networks, were examined on sample data for optimizing interpolation and for generating Digital Terrain Models. The results were then compared with existing conventional methods, and it appeared that these methods have a high capacity for height interpolation and that using these networks for interpolating and optimizing inverse-distance-based weighting methods leads to highly accurate estimation of heights.

  3. Diagnosis and Diagnostic Imaging of Anal Canal Cancer.

    PubMed

    Ciombor, Kristen K; Ernst, Randy D; Brown, Gina

    2017-01-01

    Anal canal cancer is an uncommon malignancy but one that is often curable with optimal therapy. Owing to its unique location, histology, risk factors, and usual presentation, a careful diagnostic approach is warranted. This approach includes an excellent history and physical examination, including digital rectal examination, laboratory data, and comprehensive imaging. Anal cancer staging and formulation of a treatment plan depends on accurate imaging data. Modern radiographic techniques have improved staging quality and accuracy, and a thorough knowledge of anal anatomy is paramount to the optimal multidisciplinary treatment of this disease. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Digital mammography--DQE versus optimized image quality in clinical environment: an on site study

    NASA Astrophysics Data System (ADS)

    Oberhofer, Nadia; Fracchetti, Alessandro; Springeth, Margareth; Moroder, Ehrenfried

    2010-04-01

    The intrinsic quality of the detection systems of 7 different digital mammography units (5 direct radiography, DR; 2 computed radiography, CR), expressed by the DQE, has been compared with their image quality/dose performance in clinical use. DQE measurements followed IEC 62220-1-2, using a tungsten test object for MTF determination. For image quality assessment two different methods were applied: 1) measurement of the contrast-to-noise ratio (CNR) according to the European guidelines and 2) contrast-detail (CD) evaluation. The latter was carried out with the phantom CDMAM ver. 3.4 and the commercial software CDMAM Analyser ver. 1.1 (both Artinis) for automated image analysis. The overall image quality index IQFinv proposed by the software was validated. Correspondence between the two methods was demonstrated by establishing a linear correlation between CNR and IQFinv. All systems were optimized with respect to image quality and average glandular dose (AGD) within the constraints of automatic exposure control (AEC). For each piece of equipment, a good image quality level was defined by means of the CD analysis, and the corresponding CNR value was taken as the target value. The goal was to achieve constant image quality, that is, the target CNR value, at minimum dose for different PMMA phantom thicknesses. All DR systems exhibited higher DQE and significantly better image quality compared to the CR systems. In general, switching, where available, to a target/filter combination with an X-ray spectrum of higher mean energy permitted dose savings at equal image quality. However, several systems did not allow the AEC to be modified in order to apply the optimal radiographic technique in clinical use. The best image quality/dose ratio was achieved by a unit with an a-Se detector and a W anode only recently available on the market.
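
    For reference, a minimal sketch of the CNR figure used above in its usual mammography QC form; ROI placement and any guideline-specific corrections are omitted, and the pixel values are synthetic.

      import numpy as np

      def cnr(roi_signal, roi_background):
          """Contrast-to-noise ratio in the common QC form:
          (mean_signal - mean_background) / std_background."""
          return (roi_signal.mean() - roi_background.mean()) / roi_background.std(ddof=1)

      rng = np.random.default_rng(0)
      signal = rng.normal(520, 12, (50, 50))      # e.g. ROI over an Al contrast object
      background = rng.normal(480, 12, (50, 50))  # adjacent PMMA background ROI
      print(round(cnr(signal, background), 2))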

  5. Quadratic trigonometric B-spline for image interpolation using GA

    PubMed Central

    Abbas, Samreen; Irshad, Misbah

    2017-01-01

    In this article, a new quadratic trigonometric B-spline with control parameters is constructed to address the problems related to two-dimensional digital image interpolation. The newly constructed spline is then used to design an image interpolation scheme together with one of the soft computing techniques, the Genetic Algorithm (GA). The GA is employed to optimize the control parameters in the description of the newly constructed spline. The Feature SIMilarity (FSIM), Structure SIMilarity (SSIM), and Multi-Scale Structure SIMilarity (MS-SSIM) indices, along with the traditional Peak Signal-to-Noise Ratio (PSNR), are employed as image quality metrics to analyze and compare the outcomes of the approach offered in this work with three of the present digital image interpolation schemes. The results show that the proposed scheme is a better choice for dealing with the problems associated with image interpolation. PMID:28640906

  6. Quadratic trigonometric B-spline for image interpolation using GA.

    PubMed

    Hussain, Malik Zawwar; Abbas, Samreen; Irshad, Misbah

    2017-01-01

    In this article, a new quadratic trigonometric B-spline with control parameters is constructed to address the problems related to two-dimensional digital image interpolation. The newly constructed spline is then used to design an image interpolation scheme together with one of the soft computing techniques, the Genetic Algorithm (GA). The GA is employed to optimize the control parameters in the description of the newly constructed spline. The Feature SIMilarity (FSIM), Structure SIMilarity (SSIM), and Multi-Scale Structure SIMilarity (MS-SSIM) indices, along with the traditional Peak Signal-to-Noise Ratio (PSNR), are employed as image quality metrics to analyze and compare the outcomes of the approach offered in this work with three of the present digital image interpolation schemes. The results show that the proposed scheme is a better choice for dealing with the problems associated with image interpolation.
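
    As an illustration of the evaluation side only, the sketch below scores a stand-in interpolation (bicubic resize) with two of the metrics named above, PSNR and SSIM, using scikit-image; the trigonometric B-spline plus GA scheme itself is not reproduced here.

      import numpy as np
      from skimage.data import camera
      from skimage.transform import resize
      from skimage.metrics import peak_signal_noise_ratio, structural_similarity

      # Downsample a reference image, interpolate it back, and score the result.
      ref = camera().astype(float) / 255.0
      low = resize(ref, (256, 256), order=3, anti_aliasing=True)   # downsample
      up = resize(low, ref.shape, order=3)                         # interpolate back

      print("PSNR:", round(peak_signal_noise_ratio(ref, up, data_range=1.0), 2))
      print("SSIM:", round(structural_similarity(ref, up, data_range=1.0), 3))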

  7. SU-F-P-48: The Quantitative Evaluation and Comparison of Image Distortion and Loss of X-Ray Images Between Anti-Scattered Grid and Moire Compensation Processing in Digital Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, W; Jung, J; Kang, Y

    Purpose: To quantitatively analyze the influence that image processing for Moire elimination has in digital radiography, by comparing images acquired with an optimized anti-scattered grid only and images acquired with software processing paired with a misaligned low-frequency grid. Methods: A special phantom, which does not create scattered radiation, was used to acquire non-grid reference images without any grid. One set of images was acquired with an optimized grid aligned to the pixels of the detector, and another set was acquired with a misaligned low-frequency grid paired with a Moire elimination processing algorithm. The X-ray technique used was based on consideration of the Bucky factor derived from the non-grid reference images. For evaluation, we compared the pixel intensities of the images acquired with grids to those of the reference images. Results: Compared to images acquired with the optimized grid, images acquired with the Moire elimination processing algorithm showed 10 to 50% lower mean contrast values in the ROI. Severe image distortion was found when the object's thickness spanned 7 or fewer pixels. In this case, the contrast value measured from images acquired with the Moire elimination processing algorithm was under 30% of that taken from the reference image. Conclusion: This study shows the potential risk of Moire compensation images in diagnosis. Images acquired with a misaligned low-frequency grid contain Moire noise, and the Moire compensation processing algorithm used to remove this noise actually caused image distortion. As a result, fractures and/or calcifications which are present in only a few pixels may not be diagnosed properly. In future work, we plan to evaluate images acquired without a grid but relying entirely on image processing, and the potential risks this entails.

  8. Ultramap: the all in One Photogrammetric Solution

    NASA Astrophysics Data System (ADS)

    Wiechert, A.; Gruber, M.; Karner, K.

    2012-07-01

    This paper describes in detail the dense matcher developed over several years by Vexcel Imaging in Graz for Microsoft's Bing Maps project. This dense matcher was exclusively developed for and used by Microsoft for the production of the 3D city models of Virtual Earth. It will now be made available to the public with the UltraMap software release in mid-2012, which represents a revolutionary step in digital photogrammetry. The dense matcher generates digital surface models (DSM) and digital terrain models (DTM) automatically out of a set of overlapping UltraCam images. The models have an outstanding point density of several hundred points per square meter and sub-pixel accuracy, and they are generated automatically. The dense matcher consists of two steps. The first step rectifies overlapping image areas to speed up the dense image matching process. This rectification step ensures very efficient processing and detects occluded areas by applying a back-matching step. In this dense image matching process, a cost function consisting of a matching score as well as a smoothness term is minimized. In the second step, the resulting range image patches are fused into a DSM by optimizing a global cost function. The whole process is optimized for multi-core CPUs and optionally uses GPUs if available. UltraMap 3.0 also features an additional step, presented in this paper: a completely automated true-ortho and ortho workflow. For this, the UltraCam images are combined with the DSM or DTM in an automated rectification step, which results in high-quality true-ortho or ortho images from a highly automated workflow. The paper presents the new workflow and first results.
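
    A toy sketch of the "matching score plus smoothness term" structure of the cost function mentioned above, using SSD costs, box-filter aggregation, and winner-take-all; the disparity_wta() helper is illustrative only, and the UltraMap matcher itself is proprietary and far more elaborate.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def disparity_wta(left, right, max_disp, smooth_win=5):
          """Toy dense matcher: SSD matching cost, box-filter aggregation as a crude
          smoothness term, then winner-take-all over the disparity axis."""
          costs = []
          for d in range(max_disp + 1):
              ssd = (left - np.roll(right, d, axis=1)) ** 2          # matching score
              costs.append(uniform_filter(ssd, size=smooth_win))     # smoothness term
          return np.argmin(np.stack(costs), axis=0)

      left = np.random.rand(64, 64)
      right = np.roll(left, -3, axis=1)            # ground-truth disparity of 3 px
      print(int(np.median(disparity_wta(left, right, max_disp=8))))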

  9. Image Halftoning Using Optimized Dot Diffusion

    DTIC Science & Technology

    1998-01-01

    The dot diffusion method for digital halftoning has the advantage of parallelism, unlike the error diffusion ... digital halftoning: ordered dither [1], error diffusion [2], neural-net based methods [8], and more recently direct binary search (DBS) [7]. Ordered ... from periodic patterns. On the other hand, error diffused halftones do not suffer from periodicity and offer a blue noise characteristic [3] which is
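
    The record above is truncated; for context on the error-diffusion baseline it cites, a minimal sketch of classical Floyd-Steinberg error diffusion follows. The optimized dot-diffusion method of the report itself (parallel, with optimized class matrices) is not reproduced here.

      import numpy as np

      def floyd_steinberg(gray):
          """Classical Floyd-Steinberg error diffusion on an image in [0, 1]."""
          img = gray.astype(float).copy()
          h, w = img.shape
          out = np.zeros_like(img)
          for y in range(h):
              for x in range(w):
                  old = img[y, x]
                  new = 1.0 if old >= 0.5 else 0.0
                  out[y, x] = new
                  err = old - new                     # push the quantization error forward
                  if x + 1 < w:                 img[y, x + 1]     += err * 7 / 16
                  if y + 1 < h and x > 0:       img[y + 1, x - 1] += err * 3 / 16
                  if y + 1 < h:                 img[y + 1, x]     += err * 5 / 16
                  if y + 1 < h and x + 1 < w:   img[y + 1, x + 1] += err * 1 / 16
          return out

      ramp = np.tile(np.linspace(0, 1, 128), (32, 1))
      print(floyd_steinberg(ramp).mean())             # ~0.5 for a linear ramp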

  10. Optimizing Radiometric Fidelity to Enhance Aerial Image Change Detection Utilizing Digital Single Lens Reflex (DSLR) Cameras

    NASA Astrophysics Data System (ADS)

    Kerr, Andrew D.

    Determining optimal imaging settings and best practices for the capture of aerial imagery using consumer-grade digital single lens reflex (DSLR) cameras should enable remote sensing scientists to generate consistent, high-quality, and low-cost image data sets. Radiometric optimization, image fidelity, and image capture consistency and repeatability were evaluated in the context of detailed image-based change detection. The impetus for this research is, in part, a dearth of relevant, contemporary literature on the utilization of consumer-grade DSLR cameras for remote sensing and the best practices associated with their use. The main radiometric control settings on a DSLR camera, EV (exposure value), WB (white balance), light metering, ISO, and aperture (f-stop), are variables that were altered and controlled over the course of several image capture missions. These variables were compared for their effects on dynamic range, intra-frame brightness variation, visual acuity, temporal consistency, and the detectability of simulated cracks placed in the images. This testing was conducted from a terrestrial rather than an airborne collection platform, because of the large number of images per collection and the desire to minimize inter-image misregistration. The results point to a range of slightly underexposed exposure values as preferable for change detection and noise minimization. The makeup of the scene, the sensor, and the aerial platform influence the selection of the aperture and shutter speed, which, along with other variables, allow the apparent image motion (AIM) blur in the resulting images to be estimated. The importance of image edges in the intended application will in part dictate the lowest usable f-stop and allow the user to select a more optimal shutter speed and ISO. The single most important capture variable is exposure bias (EV), with a full dynamic range, a wide distribution of DN values, and high visual contrast and acuity occurring around -0.7 to -0.3 EV exposure bias. The ideal value for sensor gain was found to be ISO 100, with ISO 200 less desirable. This study offers researchers a better understanding of the effects of camera capture settings on RSI pairs and their influence on image-based change detection.
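
    As a back-of-envelope companion to the apparent image motion (AIM) discussion above, the sketch below estimates motion blur in pixels from platform speed, exposure time, and a simple nadir-view ground sample distance; the motion_blur_pixels() helper and the example parameters are illustrative assumptions, not the author's exact AIM formulation.

      def motion_blur_pixels(ground_speed_ms, exposure_s, altitude_m,
                             focal_length_mm, pixel_pitch_um):
          """Ground distance travelled during the exposure divided by the ground
          sample distance (simplified nadir-view model)."""
          gsd_m = (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3)
          return ground_speed_ms * exposure_s / gsd_m

      # e.g. 40 m/s platform, 1/1000 s shutter, 300 m AGL, 35 mm lens, 4.9 um pixels
      print(round(motion_blur_pixels(40.0, 1 / 1000, 300.0, 35.0, 4.9), 2), "px")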

  11. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keller, Brad M.; Nathan, Diane L.; Wang Yan

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., 'FOR PROCESSING') and vendor postprocessed (i.e., 'FOR PRESENTATION'), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.

  12. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation

    PubMed Central

    Keller, Brad M.; Nathan, Diane L.; Wang, Yan; Zheng, Yuanjie; Gee, James C.; Conant, Emily F.; Kontos, Despina

    2012-01-01

    Purpose: The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., “FOR PROCESSING”) and vendor postprocessed (i.e., “FOR PRESENTATION”), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. Methods: This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Results: Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). Conclusions: The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. 
These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies. PMID:22894417

  13. Estimation of breast percent density in raw and processed full field digital mammography images via adaptive fuzzy c-means clustering and support vector machine segmentation.

    PubMed

    Keller, Brad M; Nathan, Diane L; Wang, Yan; Zheng, Yuanjie; Gee, James C; Conant, Emily F; Kontos, Despina

    2012-08-01

    The amount of fibroglandular tissue content in the breast as estimated mammographically, commonly referred to as breast percent density (PD%), is one of the most significant risk factors for developing breast cancer. Approaches to quantify breast density commonly focus on either semiautomated methods or visual assessment, both of which are highly subjective. Furthermore, most studies published to date investigating computer-aided assessment of breast PD% have been performed using digitized screen-film mammograms, while digital mammography is increasingly replacing screen-film mammography in breast cancer screening protocols. Digital mammography imaging generates two types of images for analysis, raw (i.e., "FOR PROCESSING") and vendor postprocessed (i.e., "FOR PRESENTATION"), of which postprocessed images are commonly used in clinical practice. Development of an algorithm which effectively estimates breast PD% in both raw and postprocessed digital mammography images would be beneficial in terms of direct clinical application and retrospective analysis. This work proposes a new algorithm for fully automated quantification of breast PD% based on adaptive multiclass fuzzy c-means (FCM) clustering and support vector machine (SVM) classification, optimized for the imaging characteristics of both raw and processed digital mammography images as well as for individual patient and image characteristics. Our algorithm first delineates the breast region within the mammogram via an automated thresholding scheme to identify background air followed by a straight line Hough transform to extract the pectoral muscle region. The algorithm then applies adaptive FCM clustering based on an optimal number of clusters derived from image properties of the specific mammogram to subdivide the breast into regions of similar gray-level intensity. Finally, a SVM classifier is trained to identify which clusters within the breast tissue are likely fibroglandular, which are then aggregated into a final dense tissue segmentation that is used to compute breast PD%. Our method is validated on a group of 81 women for whom bilateral, mediolateral oblique, raw and processed screening digital mammograms were available, and agreement is assessed with both continuous and categorical density estimates made by a trained breast-imaging radiologist. Strong association between algorithm-estimated and radiologist-provided breast PD% was detected for both raw (r = 0.82, p < 0.001) and processed (r = 0.85, p < 0.001) digital mammograms on a per-breast basis. Stronger agreement was found when overall breast density was assessed on a per-woman basis for both raw (r = 0.85, p < 0.001) and processed (0.89, p < 0.001) mammograms. Strong agreement between categorical density estimates was also seen (weighted Cohen's κ ≥ 0.79). Repeated measures analysis of variance demonstrated no statistically significant differences between the PD% estimates (p > 0.1) due to either presentation of the image (raw vs processed) or method of PD% assessment (radiologist vs algorithm). The proposed fully automated algorithm was successful in estimating breast percent density from both raw and processed digital mammographic images. Accurate assessment of a woman's breast density is critical in order for the estimate to be incorporated into risk assessment models. These results show promise for the clinical application of the algorithm in quantifying breast density in a repeatable manner, both at time of imaging as well as in retrospective studies.
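
    A minimal sketch of the fuzzy c-means stage described above, applied to synthetic one-dimensional gray levels; the fuzzy_cmeans() helper is a textbook implementation, and the adaptive cluster-number selection and SVM labelling of fibroglandular clusters are not reproduced here.

      import numpy as np

      def fuzzy_cmeans(x, n_clusters, m=2.0, n_iter=100, seed=0):
          """Textbook fuzzy c-means on 1-D samples; returns cluster centers and
          the membership matrix (rows sum to 1)."""
          rng = np.random.default_rng(seed)
          x = x.reshape(-1, 1).astype(float)
          u = rng.random((len(x), n_clusters))
          u /= u.sum(axis=1, keepdims=True)
          for _ in range(n_iter):
              um = u ** m
              centers = (um.T @ x) / um.sum(axis=0)[:, None]
              d = np.abs(x - centers.T) + 1e-12              # distances to centers
              u = d ** (-2.0 / (m - 1))
              u /= u.sum(axis=1, keepdims=True)
          return centers.ravel(), u

      rng = np.random.default_rng(1)
      pixels = np.concatenate([rng.normal(0.25, 0.05, 4000),   # fatty-like gray levels
                               rng.normal(0.70, 0.05, 1000)])  # dense-like gray levels
      centers, memberships = fuzzy_cmeans(pixels, n_clusters=3)
      labels = memberships.argmax(axis=1)
      pd_percent = 100.0 * (labels == centers.argmax()).mean() # densest cluster as 'dense'
      print(np.round(np.sort(centers), 2), round(pd_percent, 1))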

  14. A Degree Distribution Optimization Algorithm for Image Transmission

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Yang, Junjie

    2016-09-01

    Luby Transform (LT) codes are the first practical implementation of digital fountain codes. The coding behavior of an LT code is mainly decided by its degree distribution, which determines the relationship between source data and codewords. Two degree distributions were suggested by Luby. They work well in typical situations but are not optimal when the number of encoding symbols is finite. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT codes. First, a selection scheme for sparse degrees of LT codes is introduced; the probability distribution is then optimized over the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder; the proposed algorithm is therefore designed for the image transmission setting. Moreover, an optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising: compared with an LT code using the robust soliton distribution, the proposed algorithm noticeably improves the final quality of the recovered images at the same overhead.
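
    For context, the robust soliton degree distribution used as the baseline above can be written down directly from Luby's definition; the sketch below is a generic illustration with assumed parameters c and delta, not the paper's optimized distribution.

        import numpy as np

        def robust_soliton(k, c=0.05, delta=0.5):
            """Robust soliton degree distribution for an LT code with k source symbols."""
            d = np.arange(1, k + 1, dtype=float)
            rho = np.zeros(k)                        # ideal soliton component
            rho[0] = 1.0 / k
            rho[1:] = 1.0 / (d[1:] * (d[1:] - 1.0))
            R = c * np.log(k / delta) * np.sqrt(k)   # expected ripple size
            tau = np.zeros(k)                        # spike component
            spike = int(round(k / R))
            tau[: spike - 1] = R / (d[: spike - 1] * k)
            tau[spike - 1] = R * np.log(R / delta) / k
            mu = rho + tau
            return mu / mu.sum()                     # normalize to a probability distribution

        # sampling a codeword degree for the encoder (illustrative)
        # probs = robust_soliton(k=1000)
        # degree = np.random.default_rng().choice(np.arange(1, 1001), p=probs)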

  15. Single-shot full resolution region-of-interest (ROI) reconstruction in image plane digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Singh, Mandeep; Khare, Kedar

    2018-05-01

    We describe a numerical processing technique that allows single-shot region-of-interest (ROI) reconstruction in image plane digital holographic microscopy with full pixel resolution. The ROI reconstruction is modelled as an optimization problem whose cost function consists of an L2-norm squared data-fitting term and a modified Huber penalty term that are minimized alternately in an adaptive fashion. The technique can provide full pixel resolution complex-valued images of the selected ROI, which is not possible to achieve with the commonly used Fourier transform method, and can facilitate holographic reconstruction of individual cells of interest from large field-of-view digital holographic microscopy data. The complementary phase information, in addition to the usual absorption information already available from bright-field microscopy, can make the methodology attractive to the biomedical user community.
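
    The cost function described above (a quadratic data-fitting term plus a Huber-type penalty) can be minimized with plain gradient descent; the sketch below is a generic illustration assuming a simple masked-sampling forward model and finite-difference smoothness, not the adaptive alternating scheme of the paper.

        import numpy as np

        def huber_grad(x, delta):
            """Gradient of the Huber function, applied elementwise."""
            return np.where(np.abs(x) <= delta, x, delta * np.sign(x))

        def reconstruct_roi(data, mask, lam=0.1, delta=0.05, step=0.3, n_iter=200):
            """Minimize ||mask*(x - data)||^2 + lam * Huber(finite differences of x)."""
            x = data.astype(float).copy()
            for _ in range(n_iter):
                g = 2.0 * mask * (x - data)                    # data-fitting gradient
                dx = np.diff(x, axis=1, append=x[:, -1:])      # horizontal differences
                dy = np.diff(x, axis=0, append=x[-1:, :])      # vertical differences
                gx = huber_grad(dx, delta)
                gy = huber_grad(dy, delta)
                # adjoint of the forward differences (negative backward difference)
                g += lam * (-np.diff(gx, axis=1, prepend=0.0) - np.diff(gy, axis=0, prepend=0.0))
                x -= step * g
            return x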

  16. [Computer-aided method and rapid prototyping for the personalized fabrication of a silicone bandage digital prosthesis].

    PubMed

    Ventura Ferreira, Nuno; Leal, Nuno; Correia Sá, Inês; Reis, Ana; Marques, Marisa

    2014-01-01

    The fabrication of digital prostheses has acquired growing importance, not only because it allows the patient to overcome psychosocial trauma but also because it promotes grip functionality. An application of three-dimensional computer-aided design technologies to the production of passive prostheses is presented through the clinical case of a fifth-finger amputee following bilateral hand replantation. Three-dimensional computerized tomography was used to collect anthropometric images of the hands. Computer-aided design techniques were used to develop the digital file-based prosthesis from the reconstruction images by inverting and superimposing the contralateral finger images. The rapid prototyping manufacturing method was used to produce a silicone bandage prosthesis prototype. This approach replaces the traditional manual method with a virtual method that is the basis for the optimization of a fast, accurate and innovative process.

  17. Image management research

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1988-01-01

    Two types of research issues are involved in image management systems with space station applications: image processing research and image perception research. The image processing issues are the traditional ones of digitizing, coding, compressing, storing, analyzing, and displaying, but with a new emphasis on the constraints imposed by the human perceiver. Two image coding algorithms have been developed that may increase the efficiency of image management systems (IMS). Image perception research involves a study of the theoretical and practical aspects of visual perception of electronically displayed images. Issues include how rapidly a user can search through a library of images, how to make this search more efficient, and how to present images in terms of resolution and split screens. Other issues include optimal interface to an IMS and how to code images in a way that is optimal for the human perceiver. A test-bed within which such issues can be addressed has been designed.

  18. Digital authentication with copy-detection patterns

    NASA Astrophysics Data System (ADS)

    Picard, Justin

    2004-06-01

    Technologies for making high-quality copies of documents are becoming more available, cheaper, and more efficient. As a result, the counterfeiting business engenders huge losses, ranging from 5% to 8% of worldwide sales of brand products, and endangers the reputation and value of the brands themselves. Moreover, the growth of the Internet drives the business of counterfeited documents (fake IDs, university diplomas, checks, and so on), which can be bought easily and anonymously from hundreds of companies on the Web. The incredible progress of digital imaging equipment has called into question the very possibility of verifying the authenticity of documents: how can we discern genuine documents from seemingly "perfect" copies? This paper proposes a solution based on creating digital images with specific properties, called copy-detection patterns (CDPs), that are printed on arbitrary documents, packages, etc. CDPs make optimal use of an "information loss principle": every time an image is printed or scanned, some information is lost about the original digital image. That principle applies even to the highest quality scanning, digital imaging, printing or photocopying equipment available today, and will likely remain true tomorrow. By measuring the amount of information contained in a scanned CDP, the CDP detector can make a decision on the authenticity of the document.
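
    A toy illustration of the information-loss principle (not the commercial CDP detector): a random pattern is degraded by a simulated print-scan channel (blur plus noise), and its correlation with the stored original is used as an authenticity score. The blur width, noise level and pattern size are arbitrary assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(1)

        def make_cdp(shape=(64, 64)):
            """Maximum-entropy binary pattern stored by the legitimate issuer."""
            return rng.integers(0, 2, size=shape).astype(float)

        def print_scan(pattern, blur=1.0, noise=0.2):
            """Crude print-scan model: low-pass blur plus additive noise."""
            return gaussian_filter(pattern, sigma=blur) + noise * rng.standard_normal(pattern.shape)

        def score(original, scanned):
            """Normalized correlation between the stored original and a scanned candidate."""
            a = original - original.mean()
            b = scanned - scanned.mean()
            return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

        cdp = make_cdp()
        genuine = print_scan(cdp)                    # printed once, scanned once
        counterfeit = print_scan(print_scan(cdp))    # a copy of a copy loses more information
        print(score(cdp, genuine), score(cdp, counterfeit))   # the genuine score is higher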

  19. Use of the variable gain settings on SPOT

    USGS Publications Warehouse

    Chavez, P.S.

    1989-01-01

    Often the brightness or digital number (DN) range of satellite image data is less than optimal and uses only a portion of the available values (0 to 255) because the range of reflectance values is small. Most imaging systems have been designed with only two gain settings, normal and high. The SPOT High Resolution Visible (HRV) imaging system has the capability to collect image data using one of eight different gain settings. With the proper procedure this allows the brightness or reflectance resolution, which is directly related to the range of DN values recorded, to be optimized for any given site as compared to using a single set of gain settings everywhere. -from Author

  20. Observer performance assessment of JPEG-compressed high-resolution chest images

    NASA Astrophysics Data System (ADS)

    Good, Walter F.; Maitz, Glenn S.; King, Jill L.; Gennari, Rose C.; Gur, David

    1999-05-01

    The JPEG compression algorithm was tested on a set of 529 chest radiographs that had been digitized at a spatial resolution of 100 micrometer and contrast sensitivity of 12 bits. Images were compressed using five fixed 'psychovisual' quantization tables which produced average compression ratios in the range 15:1 to 61:1, and were then printed onto film. Six experienced radiologists read all cases from the laser printed film, in each of the five compressed modes as well as in the non-compressed mode. For comparison purposes, observers also read the same cases with reduced pixel resolutions of 200 micrometer and 400 micrometer. The specific task involved detecting masses, pneumothoraces, interstitial disease, alveolar infiltrates and rib fractures. Over the range of compression ratios tested, for images digitized at 100 micrometer, we were unable to demonstrate any statistically significant decrease (p greater than 0.05) in observer performance as measured by ROC techniques. However, the observers' subjective assessments of image quality did decrease significantly as image resolution was reduced and suggested a decreasing, but nonsignificant, trend as the compression ratio was increased. The seeming discrepancy between our failure to detect a reduction in observer performance, and other published studies, is likely due to: (1) the higher resolution at which we digitized our images; (2) the higher signal-to-noise ratio of our digitized films versus typical CR images; and (3) our particular choice of an optimized quantization scheme.

  1. Optimizing morphology through blood cell image analysis.

    PubMed

    Merino, A; Puigví, L; Boldú, L; Alférez, S; Rodellar, J

    2018-05-01

    Morphological review of the peripheral blood smear is still a crucial diagnostic aid, as it provides relevant information related to the diagnosis and is important for the selection of additional techniques. Nevertheless, the distinctive cytological characteristics of blood cells are subjective and influenced by the reviewer's interpretation, and because of that, translating subjective morphological examination into objective parameters is a challenge. The use of digital microscopy systems has become widespread in clinical laboratories. As automatic analyzers have some limitations for abnormal or neoplastic cell detection, it is of interest to identify quantitative features, through digital image analysis, for the morphological characteristics of different cells. Three main classes of features are used: geometric, color, and texture. Geometric parameters (nucleus/cytoplasmic ratio, cellular area, nucleus perimeter, cytoplasmic profile, RBC proximity, and others) are familiar to pathologists, as they are related to the visual cell patterns. Different color spaces can be used to investigate the rich amount of information that color may offer to describe abnormal lymphoid or blast cells. Texture is related to spatial patterns of color or intensities, which can be visually detected and quantitatively represented using statistical tools. This study reviews current and new quantitative features which can contribute to optimizing morphology through blood cell digital image processing techniques. © 2018 John Wiley & Sons Ltd.
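
    As a simple example of the geometric feature class mentioned above, the sketch below computes cell area, nucleus area and the nucleus/cytoplasm ratio from hypothetical binary masks; the segmentation producing those masks, and the pixel size, are assumptions.

        import numpy as np

        def geometric_features(cell_mask, nucleus_mask, pixel_size_um=0.25):
            """Basic geometric descriptors from binary masks (illustrative only)."""
            px_area = pixel_size_um ** 2
            cell_area = cell_mask.sum() * px_area
            nucleus_area = nucleus_mask.sum() * px_area
            cytoplasm_area = cell_area - nucleus_area
            return {
                "cell_area_um2": float(cell_area),
                "nucleus_area_um2": float(nucleus_area),
                "nc_ratio": float(nucleus_area / max(cytoplasm_area, 1e-9)),
            }

        # hypothetical usage with masks produced by an upstream segmentation step
        # features = geometric_features(cell_mask, nucleus_mask)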

  2. Modeling digital breast tomosynthesis imaging systems for optimization studies

    NASA Astrophysics Data System (ADS)

    Lau, Beverly Amy

    Digital breast tomosynthesis (DBT) is a new imaging modality for breast imaging. In tomosynthesis, multiple images of the compressed breast are acquired at different angles, and the projection view images are reconstructed to yield images of slices through the breast. One of the main problems to be addressed in the development of DBT is the optimal parameter settings to obtain images ideal for detection of cancer. Since it would be unethical to irradiate women multiple times to explore potentially optimum geometries for tomosynthesis, it is ideal to use a computer simulation to generate projection images. Existing tomosynthesis models have modeled scatter and detector without accounting for oblique angles of incidence that tomosynthesis introduces. Moreover, these models frequently use geometry-specific physical factors measured from real systems, which severely limits the robustness of their algorithms for optimization. The goal of this dissertation was to design the framework for a computer simulation of tomosynthesis that would produce images that are sensitive to changes in acquisition parameters, so an optimization study would be feasible. A computer physics simulation of the tomosynthesis system was developed. The x-ray source was modeled as a polychromatic spectrum based on published spectral data, and inverse-square law was applied. Scatter was applied using a convolution method with angle-dependent scatter point spread functions (sPSFs), followed by scaling using an angle-dependent scatter-to-primary ratio (SPR). Monte Carlo simulations were used to generate sPSFs for a 5-cm breast with a 1-cm air gap. Detector effects were included through geometric propagation of the image onto layers of the detector, which were blurred using depth-dependent detector point-spread functions (PRFs). Depth-dependent PRFs were calculated every 5-microns through a 200-micron thick CsI detector using Monte Carlo simulations. Electronic noise was added as Gaussian noise as a last step of the model. The sPSFs and detector PRFs were verified to match published data, and noise power spectrum (NPS) from simulated flat field images were shown to match empirically measured data from a digital mammography unit. A novel anthropomorphic software breast phantom was developed for 3D imaging simulation. Projection view images of the phantom were shown to have similar structure as real breasts in the spatial frequency domain, using the power-law exponent beta to quantify tissue complexity. The physics simulation and computer breast phantom were used together, following methods from a published study with real tomosynthesis images of real breasts. The simulation model and 3D numerical breast phantoms were able to reproduce the trends in the experimental data. This result demonstrates the ability of the tomosynthesis physics model to generate images sensitive to changes in acquisition parameters.

  3. Validation of a new UNIX-based quantitative coronary angiographic system for the measurement of coronary artery lesions.

    PubMed

    Bell, M R; Britson, P J; Chu, A; Holmes, D R; Bresnahan, J F; Schwartz, R S

    1997-01-01

    We describe a method of validation of computerized quantitative coronary arteriography and report the results of a new UNIX-based quantitative coronary arteriography software program developed for rapid on-line (digital) and off-line (digital or cinefilm) analysis. The UNIX operating system is widely available in computer systems using very fast processors and has excellent graphics capabilities. The system is potentially compatible with any cardiac digital x-ray system for on-line analysis and has been designed to incorporate an integrated database, have on-line and immediate recall capabilities, and provide digital access to all data. The accuracy (mean signed differences of the observed minus the true dimensions) and precision (pooled standard deviations of the measurements) of the program were determined using x-ray vessel phantoms. Intra- and interobserver variabilities were assessed from in vivo studies during routine clinical coronary arteriography. Precision from the x-ray phantom studies (6-in. field of view) was 0.066 mm for digital images and 0.060 mm for digitized cine images. Accuracy was 0.076 mm (overestimation) for digital images compared to 0.008 mm for digitized cine images. Diagnostic coronary catheters were also used for calibration; accuracy varied according to the size of the catheter and whether or not it was filled with iodinated contrast. Intra- and interobserver variabilities were excellent and indicated that coronary lesion measurements were relatively user-independent. Thus, this easy-to-use and very fast UNIX-based program appears to be robust, with optimal accuracy and precision for clinical and research applications.

  4. Digital liver biopsy: Bio-imaging of fatty liver for translational and clinical research.

    PubMed

    Mancini, Marcello; Summers, Paul; Faita, Francesco; Brunetto, Maurizia R; Callea, Francesco; De Nicola, Andrea; Di Lascio, Nicole; Farinati, Fabio; Gastaldelli, Amalia; Gridelli, Bruno; Mirabelli, Peppino; Neri, Emanuele; Salvadori, Piero A; Rebelos, Eleni; Tiribelli, Claudio; Valenti, Luca; Salvatore, Marco; Bonino, Ferruccio

    2018-02-27

    The rapidly growing field of functional, molecular and structural bio-imaging is providing an extraordinary new opportunity to overcome the limits of invasive liver biopsy and introduce a "digital biopsy" for in vivo study of liver pathophysiology. To foster the application of bio-imaging in clinical and translational research, there is a need to standardize the methods of both acquisition and the storage of the bio-images of the liver. It can be hoped that the combination of digital, liquid and histologic liver biopsies will provide an innovative synergistic tri-dimensional approach to identifying new aetiologies, diagnostic and prognostic biomarkers and therapeutic targets for the optimization of personalized therapy of liver diseases and liver cancer. A group of experts of different disciplines (Special Interest Group for Personalized Hepatology of the Italian Association for the Study of the Liver, Institute for Biostructures and Bio-imaging of the National Research Council and Bio-banking and Biomolecular Resources Research Infrastructure) discussed criteria, methods and guidelines for facilitating the requisite application of data collection. This manuscript provides a multi-Author review of the issue with special focus on fatty liver.

  5. Optimization and Comparison of Different Digital Mammographic Tomosynthesis Reconstruction Methods

    DTIC Science & Technology

    2007-04-01

    physical measurements of impulse response analysis, modulation transfer function (MTF) and noise power spectrum (NPS) (Months 5-12). 1.2.1. Simulate ... added: projection images with a simulated impulse and the 1/r² shading difference. Other system blur and noise issues were not addressed in this paper ... noise power spectrum (NPS), noise-equivalent quanta (NEQ), impulse response, back projection (BP). 1. INTRODUCTION: Digital breast tomosynthesis is a new ...

  6. Using applet-servlet communication for optimizing window, level and crop for DICOM to JPEG conversion.

    PubMed

    Kamauu, Aaron W C; DuVall, Scott L; Wiggins, Richard H; Avrin, David E

    2008-09-01

    In the creation of interesting radiological cases in a digital teaching file, it is necessary to adjust the window and level settings of an image to effectively display the educational focus. The web-based applet described in this paper presents an effective solution for real-time window and level adjustments without leaving the picture archiving and communications system workstation. Optimized images are created, as user-defined parameters are passed between the applet and a servlet on the Health Insurance Portability and Accountability Act-compliant teaching file server.
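
    The window/level operation performed by such an applet amounts to a simple linear mapping followed by clipping; this sketch assumes the raw pixel array and the user-supplied window center and width are already available (the DICOM parsing and applet-servlet plumbing are omitted).

        import numpy as np

        def window_level_to_8bit(pixels, center, width):
            """Map raw pixel values to 8-bit display values using window center/width."""
            lo = center - width / 2.0
            hi = center + width / 2.0
            scaled = (pixels.astype(float) - lo) / (hi - lo)      # 0..1 inside the window
            return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

        def crop(img8, top, left, height, width):
            """Crop the user-selected region before JPEG encoding."""
            return img8[top:top + height, left:left + width]

        # hypothetical usage with parameters passed from the applet to the servlet
        # jpeg_ready = crop(window_level_to_8bit(raw, center=40, width=400), 100, 150, 512, 512)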

  7. Information hiding techniques for infrared images: exploring the state-of-the art and challenges

    NASA Astrophysics Data System (ADS)

    Pomponiu, Victor; Cavagnino, Davide; Botta, Marco; Nejati, Hossein

    2015-10-01

    The proliferation of infrared technology and imaging systems enables a different perspective for tackling many computer vision problems in defense and security applications. Infrared images are widely used by law enforcement, Homeland Security and military organizations to achieve a significant advantage or situational awareness, and thus it is vital to protect these data against malicious attacks. Concurrently, sophisticated malware is being developed that is able to disrupt the security and integrity of these digital media. For instance, illegal distribution and manipulation are possible malicious attacks on the digital objects. In this paper we explore the use of a new layer of defense for the integrity of infrared images through the aid of information hiding techniques such as watermarking. In this context, we analyze the efficiency of several optimal decoding schemes for a watermark inserted into the Singular Value Decomposition (SVD) domain of the IR images using an additive spread spectrum (SS) embedding framework. In order to use the singular values (SVs) of the IR images with SS embedding, we adopt several restrictions that ensure that the values of the SVs maintain their statistics. For both the optimal maximum likelihood decoder and sub-optimal decoders, we assume that the PDF of the SVs can be modeled by the Weibull distribution. Furthermore, we investigate the challenges involved in protecting and assuring the integrity of IR images, such as data complexity and the error probability behavior, i.e., the probability of detection and the probability of false detection, for the applied optimal decoders. Taking into account the efficiency and the auxiliary information necessary for decoding the watermark, we discuss the suitable decoder for various operating situations. Experiments are carried out on a large dataset of IR images to show the imperceptibility and efficiency of the proposed scheme against various attack scenarios.

  8. Evaluation of automatic exposure control system chamber for the dose optimization when examining pelvic in digital radiography.

    PubMed

    Kim, Sung-Chul; Lee, Hae-Kag; Lee, Yang-Sub; Cho, Jae-Hwan

    2015-01-01

    We investigated how to optimize image quality and reduce patient exposure dose through the appropriate combination of activated automatic exposure control (AEC) chambers when examining the pelvic anteroposterior projection, using a standard human-body phantom. Seven combinations of AEC chambers were set, and the effective dose was obtained by measuring five times for each chamber combination. Five radiologists with more than five years of experience evaluated the images through the picture archiving and communication system in a double-blind test, classifying six anatomical sites on a 3-point scale (improper, proper, perfect). When only the central chamber was activated, the effective dose was the highest (0.287 mSv); it was lowest when only the top-left chamber was used (0.165 mSv). In the subjective evaluation of the pelvic images by the five panel members, there was no statistically significant difference among the seven chamber combinations, and all had good image quality. When imaging the pelvic anteroposterior projection with digital radiography, the exposure dose to patients can therefore be reduced by using the top-right chamber or the two top chambers.

  9. Spectrally optimal illuminations for diabetic retinopathy detection in retinal imaging

    NASA Astrophysics Data System (ADS)

    Bartczak, Piotr; Fält, Pauli; Penttinen, Niko; Ylitepsa, Pasi; Laaksonen, Lauri; Lensu, Lasse; Hauta-Kasari, Markku; Uusitalo, Hannu

    2017-04-01

    Retinal photography is a standard method for recording retinal diseases for subsequent analysis and diagnosis. However, the currently used white light or red-free retinal imaging does not necessarily provide the best possible visibility of different types of retinal lesions, which is important when developing diagnostic tools for handheld devices such as smartphones. Using specifically designed illumination, the visibility and contrast of retinal lesions could be improved. In this study, spectrally optimal illuminations for diabetic retinopathy lesion visualization are implemented using a spectrally tunable light source based on a digital micromirror device. The applicability of this method was tested in vivo by taking retinal monochrome images from the eyes of five diabetic volunteers and two non-diabetic control subjects. For comparison with existing methods, we evaluated the contrast of retinal images taken with our method and with red-free illumination. The preliminary results show that the use of optimal illuminations improved the contrast of diabetic lesions in retinal images by 30-70% compared to traditional red-free illumination imaging.

  10. An evaluation of the impact of digital imaging on radiographic practice and patient doses

    NASA Astrophysics Data System (ADS)

    Horrocks, J.; Violaki, K.

    2015-09-01

    Direct digital imaging technology was implemented in all areas of general and mobile radiology at Barts and the Royal London Hospitals in 2012. Evidence from recent radiation incident investigations indicates that optimum exposure factors are not consistently selected, with the greater dynamic range of the digital detectors allowing sub-optimal practice. To investigate further, patient dose data were extracted from the Radiology Information System for adult chest X-ray examinations in 2014, covering over 50,000 studies in the Trust. Chest X-ray examinations were selected as they are low-dose but frequent examinations. The patient dose data were evaluated taking into account X-ray system type and detector performance measurements, and individual case studies were used to highlight where practice can be improved.

  11. Tolerance of brightness and contrast adjustments on chronic apical abscess and apical granuloma interpretation

    NASA Astrophysics Data System (ADS)

    Purnamasari, L.; Iskandar, H. H. B.; Makes, B. N.

    2017-08-01

    In digitized radiography techniques, image enhancement can improve subjective image quality by optimizing brightness and contrast for diagnostic needs. The aim was to determine the range of image enhancement values (brightness and contrast) that can be tolerated in the interpretation of chronic apical abscess and apical granuloma. Thirty periapical radiographs diagnosed as chronic apical abscess and 30 diagnosed as apical granuloma were adjusted by changing brightness and contrast values. The range of brightness and contrast adjustment that can be tolerated in the radiographic interpretation of chronic apical abscess and apical granuloma spans from -10 to +10. Brightness and contrast adjustments on digital radiographs do not affect the radiographic interpretation of chronic apical abscess and apical granuloma when performed within this range.

  12. An optimized color transformation for the analysis of digital images of hematoxylin & eosin stained slides.

    PubMed

    Zarella, Mark D; Breen, David E; Plagov, Andrei; Garcia, Fernando U

    2015-01-01

    Hematoxylin and eosin (H&E) staining is ubiquitous in pathology practice and research. As digital pathology has evolved, the reliance on quantitative methods that make use of H&E images has similarly expanded. For example, cell counting and nuclear morphometry rely on the accurate demarcation of nuclei from other structures and from each other. One of the major obstacles to quantitative analysis of H&E images is the high degree of variability observed between different samples and different laboratories. In an effort to characterize this variability, as well as to provide a substrate that can potentially mitigate this factor in quantitative image analysis, we developed a technique to project H&E images into an optimized space more appropriate for many image analysis procedures. We used a decision tree-based support vector machine learning algorithm to classify 44 H&E stained whole-slide images of resected breast tumors according to the histological structures that are present. This procedure takes an H&E image as input and produces a classification map of the image that predicts the likelihood of a pixel belonging to any one of a set of user-defined structures (e.g., cytoplasm, stroma). By reducing these maps into their constituent pixels in color space, an optimal reference vector is obtained for each structure, which identifies the color attributes that maximally distinguish one structure from other elements in the image. We show that tissue structures can be identified using this semi-automated technique. By comparing structure centroids across different images, we obtained a quantitative depiction of H&E variability for each structure. This measurement can potentially be utilized in the laboratory to help calibrate daily staining or identify troublesome slides. Moreover, by aligning reference vectors derived from this technique, images can be transformed in a way that standardizes their color properties and makes them more amenable to image processing.

  13. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    PubMed Central

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na; Li, Shu-xia

    2014-01-01

    To meet the forecasting targets of key technology indicators in the flotation process, a BP neural network soft-sensor model based on feature extraction from flotation froth images and optimized by the shuffled cuckoo search algorithm is proposed. Based on digital image processing techniques, the color features in HSI color space, the visual features based on the gray-level co-occurrence matrix, and the shape characteristics based on the geometric theory of flotation froth images are extracted, respectively, as the input variables of the proposed soft-sensor model. The isometric mapping method is then used to reduce the input dimension, the network size, and the learning time of the BP neural network. Finally, a shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy. PMID:25133210
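
    The following is a minimal, pure-NumPy sketch of the gray-level co-occurrence texture features mentioned above, using a single horizontal offset and an assumed number of gray levels; the HSI color and shape features of the paper are not reproduced.

        import numpy as np

        def glcm(gray, levels=16, dx=1, dy=0):
            """Normalized gray-level co-occurrence matrix for one pixel offset."""
            q = np.clip((gray.astype(float) / 256.0 * levels).astype(int), 0, levels - 1)
            a = q[: q.shape[0] - dy, : q.shape[1] - dx]
            b = q[dy:, dx:]
            m = np.zeros((levels, levels))
            np.add.at(m, (a.ravel(), b.ravel()), 1.0)
            return m / m.sum()

        def glcm_features(p):
            """Contrast, energy and homogeneity from a normalized co-occurrence matrix."""
            i, j = np.indices(p.shape)
            return {
                "contrast": float(((i - j) ** 2 * p).sum()),
                "energy": float((p ** 2).sum()),
                "homogeneity": float((p / (1.0 + np.abs(i - j))).sum()),
            }

        # hypothetical usage on a grayscale froth image with values in 0..255
        # feats = glcm_features(glcm(froth_gray))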

  14. How many times can we use a phosphor plate? A preliminary study.

    PubMed

    Ergün, S; Güneri, P; Ilgüy, D; Ilgüy, M; Boyacioglu, H

    2009-01-01

    Digital radiography has become a useful tool in daily dental practice due to advances in imaging technologies. Charge-coupled devices (CCDs) and photostimulable phosphor plates (PSPs) are currently in use for dental imaging; however, the longevity of PSPs in dental practice is not yet established. The aim of this study was to determine the service life of PSPs in a clinical setting. Five unused PSPs were exposed with a conventional X-ray device and converted into digital images with Digora Optime (Soredex, Milwaukee, WI). These were recorded as the baseline images. Subsequent digital images of the plates were obtained after 20, 40, 60, 80, 100, 120, 140, 160, 180 and 200 exposures. All radiographic images were subtracted from the first digital image (baseline) and the mean grey values (MGVs) of the subtracted images were established using software. The data were grouped into 3 classes according to the number of exposures (20-80; 100-140; 160-200) and were analysed using analysis of variance and χ² tests. The MGVs of the subtracted images varied between 126.25 and 127.59, and the difference was not significant among the groups (P = 0.11). However, the MGVs of the plates at each exposure setting were significantly different from those of the baseline image (P < 0.05). The findings of this study revealed that, even though a slight deterioration occurred after the first exposure, each plate can be used up to 200 times. Further studies are required to reach a more concrete conclusion.

  15. Optimizing Robinson Operator with Ant Colony Optimization As a Digital Image Edge Detection Method

    NASA Astrophysics Data System (ADS)

    Yanti Nasution, Tarida; Zarlis, Muhammad; K. M Nasution, Mahyuddin

    2017-12-01

    Edge detection serves to identify the boundaries of an object against a background with which it overlaps. One of the classic methods for edge detection is the Robinson operator. The Robinson operator produces thin, faint, grey edge lines. To overcome these deficiencies, an improved edge detection method is proposed that uses a graph-based approach with the Ant Colony Optimization algorithm. The improvements performed are thickening the edges and reconnecting edges that are cut off. This research aims to optimize the Robinson operator with Ant Colony Optimization, compare the outputs, and infer the extent to which Ant Colony Optimization can improve the result of unoptimized Robinson edge detection and its accuracy. The parameters used to measure edge detection performance are the morphology of the resulting edge lines, MSE and PSNR. The results show that the combined Robinson and Ant Colony Optimization method produces images with thicker and more assertive edges. Ant Colony Optimization can thus be used to optimize the Robinson operator, improving the Robinson detection result by 16.77% on average compared with the classic Robinson result.
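
    For reference, the classic Robinson operator that is being optimized applies eight 3x3 compass masks and keeps the maximum absolute response per pixel; the sketch below shows only that baseline step (the Ant Colony Optimization stage is not reproduced).

        import numpy as np
        from scipy.ndimage import convolve

        def robinson_masks():
            """Eight 3x3 Robinson compass masks built by rotating the border weights."""
            border = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
            weights = np.array([-1, 0, 1, 2, 1, 0, -1, -2], dtype=float)
            masks = []
            for shift in range(8):
                m = np.zeros((3, 3))
                for (r, c), w in zip(border, np.roll(weights, shift)):
                    m[r, c] = w
                masks.append(m)
            return masks

        def robinson_edges(gray):
            """Classic Robinson edge magnitude: maximum response over the eight masks."""
            responses = [convolve(gray.astype(float), m, mode="nearest") for m in robinson_masks()]
            return np.max(np.abs(responses), axis=0)

        # hypothetical usage; the ACO stage would then thicken and reconnect these edges
        # edges = robinson_edges(gray_image)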

  16. Evaluation of random errors in Williams’ series coefficients obtained with digital image correlation

    NASA Astrophysics Data System (ADS)

    Lychak, Oleh V.; Holyns'kiy, Ivan S.

    2016-03-01

    The use of the Williams' series parameters for fracture analysis requires valid information about their error values. The aim of this investigation is the development of a method for estimating the standard deviation of random errors of the Williams' series parameters obtained from measured components of the stress field. Criteria for choosing the optimal number of terms in the truncated Williams' series, so that its parameters are derived with minimal errors, are also proposed. The method was used to evaluate the Williams' parameters obtained from data measured by the digital image correlation technique in a three-point bending test.

  17. Optimizing digital elevation models (DEMs) accuracy for planning and design of mobile communication networks

    NASA Astrophysics Data System (ADS)

    Hassan, Mahmoud A.

    2004-02-01

    Digital elevation models (DEMs) are important tools in the planning, design and maintenance of mobile communication networks. This research paper proposes a method for generating high-accuracy DEMs based on SPOT satellite 1A stereo pair images, ground control points (GCPs) and the Erdas OrthoBASE Pro image processing software. A DEM with 0.2911 m mean error was achieved for the hilly and heavily populated city of Amman. The generated DEM was used to design a mobile communication network, resulting in a minimum number of radio base transceiver stations, a maximum number of covered regions and less than 2% dead zones.

  18. A Cadaveric Analysis of the Optimal Radiographic Angle for Evaluating Trochlear Depth.

    PubMed

    Weinberg, Douglas Stanley; Gilmore, Allison; Guraya, Sahejmeet S; Wang, David M; Liu, Raymond W

    2017-02-01

    Disorders of the patellofemoral joint are common. Diagnosis and management often involve the use of tangential imaging of the patella and trochlear groove, with the sunrise projection being the most common. However, imaging protocols vary between institutions, and limited data exist to determine which radiographic projections provide optimal visualization of the trochlear groove at its deepest point. Plain radiographs of 48 cadaveric femora were taken at various beam-femur angles and the maximum trochlear depth was measured; a tilt-board apparatus was used to elevate the femur in 5-degree increments between 40 and 75 degrees. A corollary experiment was undertaken to investigate beam-femur angles osteologically: digital representations of each bone were created with a MicroScribe digitizer, and trochlear depth was measured on all specimens at beam-femur angles from 0 to 75 degrees. The results of the radiographic and digitizer experiments showed that the maximum trochlear groove depth occurred at a beam-femur angle of 50 degrees. These results suggest that the optimal beam-femur angle for visualizing maximum trochlear depth is 50 degrees. This is significantly lower than the beam-femur angle of 90 degrees typically used in the sunrise projection. Clinicians evaluating trochlear depth on sunrise projections may be underestimating maximal depth and evaluating a nonarticulating portion of the femur.

  19. Detection of proximal caries using digital radiographic systems with different resolutions.

    PubMed

    Nikneshan, Sima; Abbas, Fatemeh Mashhadi; Sabbagh, Sedigheh

    2015-01-01

    Dental radiography is an important tool for detection of caries, and digital radiography is the latest advancement in this regard. Spatial resolution is a characteristic of digital receptors used to describe image quality. This study aimed to compare the diagnostic accuracy of two digital radiographic systems with three different resolutions for the detection of noncavitated proximal caries. The study design was a diagnostic accuracy study. Seventy premolar teeth were mounted in 14 gypsum blocks. Digora Optime and RVG Access were used for obtaining digital radiographs. Six observers evaluated the proximal surfaces in the radiographs at each resolution in order to determine the depth of caries based on a 4-point scale. The teeth were then histologically sectioned, and the results of the histologic analysis were considered the gold standard. Data were entered using SPSS version 18 software and the Kruskal-Wallis test was used for data analysis. P < 0.05 was considered statistically significant. No significant difference was found between the different resolutions for detection of proximal caries (P > 0.05). The RVG Access system had the highest specificity (87.7%) and Digora Optime at high resolution had the lowest specificity (84.2%). Furthermore, Digora Optime had higher sensitivity for detection of caries extending beyond the outer half of enamel. The judgment of oral radiologists on the depth of caries had higher reliability than that of restorative dentistry specialists. The three resolutions of Digora Optime and RVG Access had similar accuracy in the detection of noncavitated proximal caries.

  20. Task-driven dictionary learning.

    PubMed

    Mairal, Julien; Bach, Francis; Ponce, Jean

    2012-04-01

    Modeling data with linear combinations of a few elements from a learned dictionary has been the focus of much recent research in machine learning, neuroscience, and signal processing. For signals such as natural images that admit such sparse representations, it is now well established that these models are well suited to restoration tasks. In this context, learning the dictionary amounts to solving a large-scale matrix factorization problem, which can be done efficiently with classical optimization tools. The same approach has also been used for learning features from data for other purposes, e.g., image classification, but tuning the dictionary in a supervised way for these tasks has proven to be more difficult. In this paper, we present a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and present an efficient algorithm for solving the corresponding optimization problem. Experiments on handwritten digit classification, digital art identification, nonlinear inverse image problems, and compressed sensing demonstrate that our approach is effective in large-scale settings, and is well suited to supervised and semi-supervised classification, as well as regression tasks for data that admit sparse representations.

  1. 2D and 3D registration methods for dual-energy contrast-enhanced digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Lau, Kristen C.; Roth, Susan; Maidment, Andrew D. A.

    2014-03-01

    Contrast-enhanced digital breast tomosynthesis (CE-DBT) uses an iodinated contrast agent to image the three-dimensional breast vasculature. The University of Pennsylvania is conducting a CE-DBT clinical study in patients with known breast cancers. The breast is compressed continuously and imaged at four time points (one pre-contrast; three post-contrast). A hybrid subtraction scheme is proposed. First, dual-energy (DE) images are obtained by a weighted logarithmic subtraction of the high-energy and low-energy image pairs. Then, post-contrast DE images are subtracted from the pre-contrast DE image. This hybrid temporal subtraction of DE images is performed to analyze iodine uptake, but suffers from motion artifacts. Employing image registration helps to correct for motion, enhancing the evaluation of vascular kinetics. Registration using ANTS (Advanced Normalization Tools) is performed in an iterative manner: mutual information optimization first corrects large-scale motions, and normalized cross-correlation optimization then iteratively corrects fine-scale misalignment. Two methods have been evaluated: a 2D method using a slice-by-slice approach, and a 3D method using a volumetric approach to account for out-of-plane breast motion. Our results demonstrate that the iterative registration improves qualitatively with each iteration (five iterations in total). Motion artifacts near the edge of the breast are corrected effectively and structures within the breast (e.g. blood vessels, a surgical clip) are better visualized. Statistical and clinical evaluations of registration accuracy in the CE-DBT images are ongoing.
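
    The first registration stage described above maximizes mutual information; the sketch below only shows how mutual information between two images can be computed from their joint histogram (ANTS itself is not reproduced, and the bin count is an arbitrary assumption).

        import numpy as np

        def mutual_information(a, b, bins=64):
            """Mutual information between two images from their joint intensity histogram."""
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        # hypothetical usage inside a registration loop: choose the transform
        # parameters that maximize mutual_information(fixed, warp(moving, params))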

  2. An automatic fuzzy-based multi-temporal brain digital subtraction angiography image fusion algorithm using curvelet transform and content selection strategy.

    PubMed

    Momeni, Saba; Pourghassem, Hossein

    2014-08-01

    Image fusion has recently taken on a prominent role in medical image processing and is useful in the diagnosis and treatment of many diseases. Digital subtraction angiography is one of the most widely used imaging techniques for diagnosing brain vascular diseases and for brain radiosurgery. This paper proposes an automatic fuzzy-based multi-temporal fusion algorithm for 2-D digital subtraction angiography images. In this algorithm, for blood vessel map extraction, the valuable frames of the brain angiography video are automatically determined to form the digital subtraction angiography images, based on a novel definition of vessel dispersion generated by the injected contrast material. The proposed fusion scheme contains different fusion methods for high- and low-frequency content, based on the coefficient characteristics of the wrapping-based second-generation curvelet transform and a novel content selection strategy. The content selection strategy is defined based on the sample correlation of the curvelet transform coefficients. In the proposed fuzzy-based fusion scheme, the selection of curvelet coefficients is optimized by applying weighted averaging and maximum selection rules to the high-frequency coefficients. For low-frequency coefficients, the maximum selection rule based on a local energy criterion is applied for better visual perception. The proposed fusion algorithm is evaluated on a brain angiography image dataset consisting of one hundred 2-D internal carotid rotational angiography videos. The obtained results demonstrate the effectiveness and efficiency of the proposed fusion algorithm in comparison with common and basic fusion algorithms.
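
    To illustrate only the fusion rules (maximum selection driven by energy for the low-frequency band, weighted averaging combined with maximum selection for the high-frequency bands), the sketch below uses a single-level wavelet decomposition from PyWavelets as a stand-in for the wrapping curvelet transform; the fuzzy weighting and content selection strategy of the paper are not reproduced.

        import numpy as np
        import pywt

        def fuse_pair(img_a, img_b, w=0.5):
            """Fuse two registered DSA frames in a transform domain (illustrative)."""
            cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a.astype(float), "db2")
            cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b.astype(float), "db2")
            # low-frequency band: maximum selection based on (pointwise) energy
            cA = np.where(cA_a ** 2 >= cA_b ** 2, cA_a, cA_b)
            # high-frequency bands: blend of weighted averaging and maximum selection
            def fuse_high(a, b):
                return w * (a + b) / 2.0 + (1.0 - w) * np.where(np.abs(a) >= np.abs(b), a, b)
            fused = (cA, (fuse_high(cH_a, cH_b), fuse_high(cV_a, cV_b), fuse_high(cD_a, cD_b)))
            return pywt.idwt2(fused, "db2")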

  3. Spatio-spectral color filter array design for optimal image recovery.

    PubMed

    Hirakawa, Keigo; Wolfe, Patrick J

    2008-10-01

    In digital imaging applications, data are typically obtained via a spatial subsampling procedure implemented as a color filter array-a physical construction whereby only a single color value is measured at each pixel location. Owing to the growing ubiquity of color imaging and display devices, much recent work has focused on the implications of such arrays for subsequent digital processing, including in particular the canonical demosaicking task of reconstructing a full color image from spatially subsampled and incomplete color data acquired under a particular choice of array pattern. In contrast to the majority of the demosaicking literature, we consider here the problem of color filter array design and its implications for spatial reconstruction quality. We pose this problem formally as one of simultaneously maximizing the spectral radii of luminance and chrominance channels subject to perfect reconstruction, and-after proving sub-optimality of a wide class of existing array patterns-provide a constructive method for its solution that yields robust, new panchromatic designs implementable as subtractive colors. Empirical evaluations on multiple color image test sets support our theoretical results, and indicate the potential of these patterns to increase spatial resolution for fixed sensor size, and to contribute to improved reconstruction fidelity as well as significantly reduced hardware complexity.

  4. Low-power low-noise mixed-mode VLSI ASIC for infinite dynamic range imaging applications

    NASA Astrophysics Data System (ADS)

    Turchetta, Renato; Hu, Y.; Zinzius, Y.; Colledani, C.; Loge, A.

    1998-11-01

    Solid state solutions for imaging are mainly represented by CCDs and, more recently, by CMOS imagers. Both devices are based on the integration of the total charge generated by the impinging radiation, with no processing of the single-photon information. The dynamic range of these devices is intrinsically limited by the finite value of the noise. Here we present the design of an architecture which allows efficient, in-pixel noise reduction to a practically zero level, thus allowing infinite dynamic range imaging. A detailed calculation of the dynamic range is worked out, showing that noise is efficiently suppressed. This architecture is based on the concept of single-photon counting. In each pixel, we integrate both the front-end, low-noise, low-power analog part and the digital part. The former consists of a charge preamplifier, an active filter for optimal noise bandwidth reduction, a buffer and a threshold comparator; the latter is simply a counter, which can be programmed to act as a normal shift register for the readout of the counters' contents. Two different ASICs based on this concept have been designed for different applications. The first one has been optimized for silicon edge-on microstrip detectors, used in a digital mammography R&D project. It is a 32-channel circuit with a 16-bit binary static counter and has been optimized for a relatively large detector capacitance of 5 pF. Noise has been measured to be equal to 100 + 7*Cd (pF) electrons rms with the digital part operating, showing no degradation of the noise performance with respect to the design values. The power consumption is 3.8 mW/channel for a peaking time of about 1 µs. The second circuit is a prototype for pixel imaging. The total active area is about (250 µm)². The main differences of the electronic architecture with respect to the first prototype are: i) a different optimization of the analog front-end part for low-capacitance detectors, ii) in-pixel 4-bit comparator-offset compensation, and iii) a 15-bit pseudo-random counter. The power consumption is 255 µW/channel for a peaking time of 300 ns and an equivalent noise charge of 185 + 97*Cd electrons rms. Simulation and experimental results as well as imaging results are presented.

  5. Crystal surface analysis using matrix textural features classified by a probabilistic neural network

    NASA Astrophysics Data System (ADS)

    Sawyer, Curry R.; Quach, Viet; Nason, Donald; van den Berg, Lodewijk

    1991-12-01

    A system is under development in which surface quality of a growing bulk mercuric iodide crystal is monitored by video camera at regular intervals for early detection of growth irregularities. Mercuric iodide single crystals are employed in radiation detectors. A microcomputer system is used for image capture and processing. The digitized image is divided into multiple overlapping sub-images and features are extracted from each sub-image based on statistical measures of the gray tone distribution, according to the method of Haralick. Twenty parameters are derived from each sub-image and presented to a probabilistic neural network (PNN) for classification. This number of parameters was found to be optimal for the system. The PNN is a hierarchical, feed-forward network that can be rapidly reconfigured as additional training data become available. Training data is gathered by reviewing digital images of many crystals during their growth cycle and compiling two sets of images, those with and without irregularities.

  6. Development of a stationary digital breast tomosynthesis system for clinical applications

    NASA Astrophysics Data System (ADS)

    Tucker, Andrew Wallace

    Digital breast tomosynthesis (DBT) has been shown to be a very beneficial tool in the fight against breast cancer. However, current DBT systems have poor spatial resolution compared to full-field digital mammography (FFDM), the current gold standard for screening mammography. The poor spatial resolution of DBT systems is a result of the single X-ray source design: in DBT systems a single X-ray source is rotated over an angular span in order to acquire the images needed for 3D reconstruction, and the rotation of the X-ray source degrades the spatial resolution of the images. DBT systems approved for screening mammography in the United States are required to also take a full-field digital mammogram with every DBT acquisition in order to compensate for the poor spatial resolution; this double exposure essentially doubles the radiation dose to patients. Over the past few years our research group has developed a carbon nanotube (CNT) based X-ray source technology. The unique nature of CNT X-ray sources allows for multiple X-ray focal spots in a single X-ray source. Using this technology we have recently developed a stationary DBT (s-DBT) system which is capable of producing a full tomosynthesis image dataset with zero motion of the X-ray source. This system has been shown to have increased spatial resolution over other DBT systems in a laboratory setting. The goal of this thesis work was to optimize the s-DBT system, demonstrate its advantages over other systems, and finally implement it in the clinic for a clinical trial. The s-DBT system was optimized using different image quality measurements. The optimized system was then used in a breast specimen imaging trial comparing s-DBT to magnified 2D mammography and a conventional single-source DBT system. Readers preferred s-DBT to magnified 2D mammography for specimen margin delineation and mass detection, although these results were not statistically significant. Using physical measures of spatial resolution, the s-DBT system was shown to have improved image quality over conventional single-source DBT systems in breast tissue. A separate study showed that s-DBT could be a feasible alternative to FFDM for screening patients with breast implants. Finally, a second s-DBT system was constructed and installed in the Department of Mammography at UNC hospitals. The first patient was imaged on the system in December 2013.

  7. Making cytological diagnoses on digital images using the iPath network.

    PubMed

    Dalquen, Peter; Savic Prince, Spasenija; Spieler, Peter; Kunze, Dietmar; Neumann, Heinrich; Eppenberger-Castori, Serenella; Adams, Heiner; Glatz, Katharina; Bubendorf, Lukas

    2014-01-01

    The iPath telemedicine platform Basel is mainly used for histological and cytological consultations, but also serves as a valuable learning tool. The aims were to study the level of accuracy in making diagnoses based on still images achieved by experienced cytopathologists, to identify limiting factors, and to provide a cytological image series as a learning set. Images from 167 consecutive cytological specimens of different origin were uploaded to the iPath platform and evaluated by four cytopathologists. Only wet-fixed and well-stained specimens were used. The consultants made specific diagnoses and categorized each as benign, suspicious or malignant. For all consultants, specificity and sensitivity regarding the categorized diagnoses were 83-92% and 85-93%, respectively; the overall accuracy was 88-90%. The interobserver agreement was substantial (κ = 0.791). The lowest rate of concordance was achieved in urine and bladder washings and in the identification of benign lesions. Using a digital image set for diagnostic purposes implies that even under optimal conditions the accuracy rate will not exceed 80-90%, mainly because of the lack of supportive immunocytochemical or molecular tests. This limitation does not disqualify digital images for teleconsulting or as a learning aid. The series of images used for the study is open to the public at http://pathorama.wordpress.com/extragenital-cytology-2013/. © 2014 S. Karger AG, Basel.

  8. An application of digital image processing techniques to the characterization of liquid petroleum gas (LPG) spray

    NASA Astrophysics Data System (ADS)

    Qi, Y. L.; Xu, B. Y.; Cai, S. L.

    2006-12-01

    To control fuel injection, optimize combustion and reduce emissions for LPG (liquefied petroleum gas) engines, it is necessary and important to understand the characteristics of LPG sprays. The present work investigates the geometry of LPG sprays, including spray tip penetration, spray angle, projected spray area and spray volume, by using schlieren photography and digital image processing techniques. Two types of single nozzle injectors were studied, with the same nozzle diameter, but one with and one without a double-hole flow-split head. A code developed to analyse the results directly from the digitized images is shown to be more accurate and efficient than manual measurement and analysis. Test results show that a higher injection pressure produces a longer spray tip penetration, a larger projected spray area and spray volume, but a smaller spray cone angle. The injector with the double-hole split-head nozzle produces better atomization and shorter tip penetration at medium and late injection times, but longer tip penetration in the early stage.

  9. Radiometry simulation within the end-to-end simulation tool SENSOR

    NASA Astrophysics Data System (ADS)

    Wiest, Lorenz; Boerner, Anko

    2001-02-01

    An end-to-end simulation is a valuable tool for sensor system design, development, optimization, testing, and calibration. This contribution describes the radiometry module of the end-to-end simulation tool SENSOR. It features MODTRAN 4.0-based look-up tables in conjunction with a cache-based multilinear interpolation algorithm to speed up radiometry calculations. It employs a linear reflectance parameterization to reduce look-up table size, considers effects due to the topology of a digital elevation model (surface slope, sky view factor) and uses a reflectance class feature map to assign Lambertian and BRDF reflectance properties to the digital elevation model. The overall consistency of the radiometry part is demonstrated by good agreement between ATCOR 4-retrieved reflectance spectra of a simulated digital image cube and the original reflectance spectra used to simulate this image data cube.

  10. Digital liver biopsy: Bio-imaging of fatty liver for translational and clinical research

    PubMed Central

    Mancini, Marcello; Summers, Paul; Faita, Francesco; Brunetto, Maurizia R; Callea, Francesco; De Nicola, Andrea; Di Lascio, Nicole; Farinati, Fabio; Gastaldelli, Amalia; Gridelli, Bruno; Mirabelli, Peppino; Neri, Emanuele; Salvadori, Piero A; Rebelos, Eleni; Tiribelli, Claudio; Valenti, Luca; Salvatore, Marco; Bonino, Ferruccio

    2018-01-01

    The rapidly growing field of functional, molecular and structural bio-imaging is providing an extraordinary new opportunity to overcome the limits of invasive liver biopsy and introduce a “digital biopsy” for in vivo study of liver pathophysiology. To foster the application of bio-imaging in clinical and translational research, there is a need to standardize the methods of both acquisition and the storage of the bio-images of the liver. It can be hoped that the combination of digital, liquid and histologic liver biopsies will provide an innovative synergistic tri-dimensional approach to identifying new aetiologies, diagnostic and prognostic biomarkers and therapeutic targets for the optimization of personalized therapy of liver diseases and liver cancer. A group of experts of different disciplines (Special Interest Group for Personalized Hepatology of the Italian Association for the Study of the Liver, Institute for Biostructures and Bio-imaging of the National Research Council and Bio-banking and Biomolecular Resources Research Infrastructure) discussed criteria, methods and guidelines for facilitating the requisite application of data collection. This manuscript provides a multi-Author review of the issue with special focus on fatty liver. PMID:29527259

  11. Low-cost space-varying FIR filter architecture for computational imaging systems

    NASA Astrophysics Data System (ADS)

    Feng, Guotong; Shoaib, Mohammed; Schwartz, Edward L.; Dirk Robinson, M.

    2010-01-01

    Recent research demonstrates the advantage of designing electro-optical imaging systems by jointly optimizing the optical and digital subsystems. The optical systems designed using this joint approach intentionally introduce large and often space-varying optical aberrations that produce blurry optical images. Digital sharpening restores reduced contrast due to these intentional optical aberrations. Computational imaging systems designed in this fashion have several advantages including extended depth-of-field, lower system costs, and improved low-light performance. Currently, most consumer imaging systems lack the necessary computational resources to compensate for these optical systems with large aberrations in the digital processor. Hence, the exploitation of the advantages of the jointly designed computational imaging system requires low-complexity algorithms enabling space-varying sharpening. In this paper, we describe a low-cost algorithmic framework and associated hardware enabling the space-varying finite impulse response (FIR) sharpening required to restore largely aberrated optical images. Our framework leverages the space-varying properties of optical images formed using rotationally-symmetric optical lens elements. First, we describe an approach to leverage the rotational symmetry of the point spread function (PSF) about the optical axis allowing computational savings. Second, we employ a specially designed bank of sharpening filters tuned to the specific radial variation common to optical aberrations. We evaluate the computational efficiency and image quality achieved by using this low-cost space-varying FIR filter architecture.
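
    A minimal sketch of the radial filter-bank idea, assuming a small set of placeholder unsharp-masking kernels indexed by distance from the image centre; the paper's filters are tuned to the measured PSF of each radial zone, which is not reproduced here:

        import numpy as np
        from scipy.ndimage import convolve

        def radial_filter_bank_sharpen(image, kernels):
            """Sharpen with a bank of FIR kernels selected by radius from the image centre.

            `kernels` is ordered from on-axis to edge-of-field; a real design would tune
            each kernel to the PSF of its radial zone."""
            h, w = image.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(yy - h / 2, xx - w / 2)
            zone = np.minimum((r / r.max() * len(kernels)).astype(int), len(kernels) - 1)
            out = np.zeros_like(image, dtype=float)
            for k, kern in enumerate(kernels):
                filtered = convolve(image.astype(float), kern, mode="nearest")
                out[zone == k] = filtered[zone == k]
            return out

        # Placeholder kernels: progressively stronger unsharp masking toward the field edge.
        def unsharp(strength):
            k = -strength * np.ones((3, 3)) / 8.0
            k[1, 1] = 1.0 + strength
            return k

        kernels = [unsharp(s) for s in (0.2, 0.6, 1.2, 2.0)]
        img = np.random.default_rng(1).random((128, 128))
        print(radial_filter_bank_sharpen(img, kernels).shape)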

  12. Review of image processing fundamentals

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1985-01-01

    Image processing through convolution, transform coding, spatial frequency alterations, sampling, and interpolation are considered. It is postulated that convolution in one domain (real or frequency) is equivalent to multiplication in the other (frequency or real), and that the relative amplitudes of the Fourier components must be retained to reproduce any waveshape. It is suggested that all digital systems may be considered equivalent, with a frequency content approximately at the Nyquist limit, and with a Gaussian frequency response. An optimized cubic version of the interpolation continuum image is derived as a set of cubic splines. Pixel replication has been employed to enlarge the visible area of digital samples; however, suitable elimination of the extraneous high frequencies involved in the visible edges, by defocusing, is necessary to allow the underlying object represented by the data values to be seen.
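
    The stated convolution/multiplication duality is easy to check numerically; a small demonstration using circular convolution (so that the identity is exact) might look like this:

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.random(64)
        h = rng.random(64)

        # Circular convolution computed directly ...
        direct = np.array([np.sum(x * np.roll(h[::-1], n + 1)) for n in range(64)])
        # ... and via the frequency domain: multiply the spectra, then invert.
        via_fft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

        print(np.allclose(direct, via_fft))   # True: convolution <-> multiplication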

  13. Impulsive noise suppression in color images based on the geodesic digital paths

    NASA Astrophysics Data System (ADS)

    Smolka, Bogdan; Cyganek, Boguslaw

    2015-02-01

    In the paper a novel filtering design based on the concept of exploration of the pixel neighborhood by digital paths is presented. The paths start from the boundary of a filtering window and reach its center. The cost of transitions between adjacent pixels is defined in the hybrid spatial-color space. Then, an optimal path of minimum total cost, leading from pixels of the window's boundary to its center is determined. The cost of an optimal path serves as a degree of similarity of the central pixel to the samples from the local processing window. If a pixel is an outlier, then all the paths starting from the window's boundary will have high costs and the minimum one will also be high. The filter output is calculated as a weighted mean of the central pixel and an estimate constructed using the information on the minimum cost assigned to each image pixel. So, first the costs of optimal paths are used to build a smoothed image and in the second step the minimum cost of the central pixel is utilized for construction of the weights of a soft-switching scheme. The experiments performed on a set of standard color images, revealed that the efficiency of the proposed algorithm is superior to the state-of-the-art filtering techniques in terms of the objective restoration quality measures, especially for high noise contamination ratios. The proposed filter, due to its low computational complexity, can be applied for real time image denoising and also for the enhancement of video streams.
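
    A sketch of the minimum-cost-path computation, assuming a Dijkstra-style search over 4-neighbour transitions and a simple additive mix of colour distance and a constant spatial step as the hybrid cost (the paper's exact cost definition and soft-switching weights are not reproduced):

        import heapq
        import numpy as np

        def min_boundary_to_center_cost(window, alpha=1.0):
            """Dijkstra search from all boundary pixels to the centre of a (2r+1)^2 RGB window.

            The transition cost between 4-neighbours mixes colour distance and a constant
            spatial step weighted by `alpha` (an assumed form of the hybrid cost)."""
            n = window.shape[0]
            c = n // 2
            dist = np.full((n, n), np.inf)
            heap = []
            for y in range(n):
                for x in range(n):
                    if y in (0, n - 1) or x in (0, n - 1):   # start from the window boundary
                        dist[y, x] = 0.0
                        heapq.heappush(heap, (0.0, y, x))
            while heap:
                d, y, x = heapq.heappop(heap)
                if d > dist[y, x]:
                    continue
                if (y, x) == (c, c):
                    return d                                 # minimum cost to the centre
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < n and 0 <= nx < n:
                        step = np.linalg.norm(window[ny, nx] - window[y, x]) + alpha
                        if d + step < dist[ny, nx]:
                            dist[ny, nx] = d + step
                            heapq.heappush(heap, (d + step, ny, nx))
            return dist[c, c]

        win = np.random.default_rng(2).random((5, 5, 3))
        win[2, 2] = 10.0      # an outlier centre pixel yields a high minimum path cost
        print(min_boundary_to_center_cost(win))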

  14. Optimization of image processing algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
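
    The benchmarked operation, DFT-based image correlation for template matching, can be sketched in a few lines of NumPy; this is only an algorithmic illustration, not the OMAP/DSP implementation:

        import numpy as np

        def correlate_fft(image, template):
            """Cross-correlate a zero-mean template against an image via the DFT."""
            h, w = image.shape
            t = template - template.mean()
            pad = np.zeros_like(image, dtype=float)
            pad[: t.shape[0], : t.shape[1]] = t
            spec = np.fft.rfft2(image - image.mean()) * np.conj(np.fft.rfft2(pad))
            return np.fft.irfft2(spec, s=(h, w))

        rng = np.random.default_rng(3)
        img = rng.random((256, 256))
        tmpl = img[100:132, 50:82].copy()          # 32x32 patch taken from a known location
        peak = np.unravel_index(np.argmax(correlate_fft(img, tmpl)), img.shape)
        print(peak)                                # expected (100, 50)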

  15. Efficient high-performance ultrasound beamforming using oversampling

    NASA Astrophysics Data System (ADS)

    Freeman, Steven R.; Quick, Marshall K.; Morin, Marc A.; Anderson, R. C.; Desilets, Charles S.; Linnenbrink, Thomas E.; O'Donnell, Matthew

    1998-05-01

    High-performance and efficient beamforming circuitry is very important in large channel count clinical ultrasound systems. Current state-of-the-art digital systems using multi-bit analog to digital converters (A/Ds) have matured to provide exquisite image quality with moderate levels of integration. A simplified oversampling beamforming architecture has been proposed that may allow integration of delta-sigma A/Ds onto the same chip as digital delay and processing circuitry to form a monolithic ultrasound beamformer. Such a beamformer may enable low-power handheld scanners as well as high-end systems with very large channel count arrays. This paper presents an oversampling beamformer architecture that generates high-quality images using very simple digitization, delay, and summing circuits. Additional performance may be obtained with this oversampled system for narrow bandwidth excitations by mixing the RF signal down in frequency to a range where the electronic signal-to-noise ratio of the delta-sigma A/D is optimized. An oversampled transmit beamformer uses the same delay circuits as receive and eliminates the need for separate transmit function generators.
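
    The core delay-and-sum operation of such a beamformer can be illustrated with integer sample delays at an oversampled rate; the delta-sigma modulators, reconstruction filters and mixing stage of the proposed architecture are not modelled in this toy sketch:

        import numpy as np

        def delay_and_sum(channels, delays_samples):
            """Sum per-channel streams after integer sample delays (coarse delays are cheap
            at a high oversampling rate, which is the appeal of oversampled beamforming)."""
            n = channels.shape[1]
            out = np.zeros(n)
            for ch, d in zip(channels, delays_samples):
                out[d:] += ch[: n - d]          # delay by d samples, then accumulate
            return out

        # Toy example: 8 channels carrying the same pulse with channel-dependent offsets.
        rng = np.random.default_rng(4)
        pulse = np.sin(2 * np.pi * np.arange(64) / 16) * np.hanning(64)
        offsets = np.array([0, 3, 6, 9, 12, 15, 18, 21])
        channels = np.zeros((8, 512))
        for i, off in enumerate(offsets):
            channels[i, 100 + off:164 + off] = pulse
        channels += 0.1 * rng.standard_normal(channels.shape)   # per-channel noise

        # Delays chosen to re-align the pulse; after summation it adds coherently.
        focused = delay_and_sum(channels, delays_samples=offsets.max() - offsets)
        print(int(np.argmax(np.abs(focused))))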

  16. Optimization of Breast Tomosynthesis Imaging Systems for Computer-Aided Detection

    DTIC Science & Technology

    2011-05-01

    R. Saunders, E. Samei, C. Badea, H. Yuan, K. Ghaghada, Y. Qi, L. Hedlund, and S. Mukundan, “Optimization of dual energy contrast enhanced breast…”. This is the final report for this body of research. Screen-film mammography and digital mammography have been used for over 30 years in the early detection of cancer. The combination of screening and adjuvant therapies have led to…

  17. Using multiscale texture and density features for near-term breast cancer risk analysis

    PubMed Central

    Sun, Wenqing; Tseng, Tzu-Liang (Bill); Qian, Wei; Zhang, Jianying; Saltzstein, Edward C.; Zheng, Bin; Lure, Fleming; Yu, Hui; Zhou, Shi

    2015-01-01

    Purpose: To help improve efficacy of screening mammography by eventually establishing a new optimal personalized screening paradigm, the authors investigated the potential of using the quantitative multiscale texture and density feature analysis of digital mammograms to predict near-term breast cancer risk. Methods: The authors’ dataset includes digital mammograms acquired from 340 women. Among them, 141 were positive and 199 were negative/benign cases. The negative digital mammograms acquired from the “prior” screening examinations were used in the study. Based on the intensity value distributions, five subregions at different scales were extracted from each mammogram. Five groups of features, including density and texture features, were developed and calculated on every one of the subregions. Sequential forward floating selection was used to search for the effective combinations. Using the selected features, a support vector machine (SVM) was optimized using a tenfold validation method to predict the risk of each woman having image-detectable cancer in the next sequential mammography screening. The area under the receiver operating characteristic curve (AUC) was used as the performance assessment index. Results: From a total number of 765 features computed from multiscale subregions, an optimal feature set of 12 features was selected. Applying this feature set, a SVM classifier yielded performance of AUC = 0.729 ± 0.021. The positive predictive value was 0.657 (92 of 140) and the negative predictive value was 0.755 (151 of 200). Conclusions: The study results demonstrated a moderately high positive association between risk prediction scores generated by the quantitative multiscale mammographic image feature analysis and the actual risk of a woman having an image-detectable breast cancer in the next subsequent examinations. PMID:26127038

  18. Next-generation digital camera integration and software development issues

    NASA Astrophysics Data System (ADS)

    Venkataraman, Shyam; Peters, Ken; Hecht, Richard

    1998-04-01

    This paper investigates the complexities associated with the development of next generation digital cameras due to requirements in connectivity and interoperability. Each successive generation of digital camera improves drastically in cost, performance, resolution, image quality and interoperability features. This is being accomplished by advancements in a number of areas: research, silicon, standards, etc. As the capabilities of these cameras increase, so do the requirements for both hardware and software. Today, there are two single chip camera solutions in the market including the Motorola MPC 823 and LSI DCAM-101. Real time constraints for a digital camera may be defined by the maximum time allowable between capture of images. Constraints in the design of an embedded digital camera include processor architecture, memory, processing speed and the real-time operating systems. This paper will present the LSI DCAM-101, a single-chip digital camera solution. It will present an overview of the architecture and the challenges in hardware and software for supporting streaming video in such a complex device. Issues presented include the development of the data flow software architecture, testing and integration on this complex silicon device. The strategy for optimizing performance on the architecture will also be presented.

  19. Increasing clinical relevance in oral radiology: Benefits and challenges when implementing digital assessment.

    PubMed

    de Lange, T; Møystad, A; Torgersen, G R

    2018-02-13

    The aims of the study were to investigate benefits and challenges in implementing a digital examination and to study the clinical relevance of the digital examination in relation to clinical training and practice. The study was based on semi-structured focus-group interviews from two distinct student populations (2016 and 2017) in a bachelor programme in dental hygiene. In addition, conversational data from a plenary discussion with the whole second student population (2017) were collected and analysed. The data were analysed on the basis of content analysis. A benefit experienced in the digital examination was the ease of typing and editing answers on the computer. This suggests an increased effectiveness of computer-based compared to analogue examinations. An additional advantage was the experienced relevance of the examination related to the clinic. This finding refers not only to the digital presentation of images, but also to the entire setting in the clinic and dental practice. The limitations reported by the students were non-optimal viewing conditions for presenting radiographic images and difficulties in obtaining an overview of the assignments compared to paper-based examinations, due to the linear digital examination format. The last finding, on the lack of overview, revealed an influence on student performance which should be taken seriously in designing digital examinations. In conclusion, the digital layout increases efficiency and clinical relevance of examinations to a certain extent. Obstacles were found in limitations related to image presentation and lack of overview of the examination. The latter challenge raises questions related to developing suitable assessment software. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Toward a perceptual video-quality metric

    NASA Astrophysics Data System (ADS)

    Watson, Andrew B.

    1998-07-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating the visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics, and the economic need to reduce bit-rate to the lowest level that yields acceptable quality. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. Here I describe a new video quality metric that is an extension of these still image metrics into the time domain. Like the still image metrics, it is based on the Discrete Cosine Transform. An effort has been made to minimize the amount of memory and computation required by the metric, in order that it might be applied in the widest range of applications. To calibrate the basic sensitivity of this metric to spatial and temporal signals we have made measurements of visual thresholds for temporally varying samples of DCT quantization noise.
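
    A still-image flavour of such a DCT-based metric can be sketched as blockwise DCT differences weighted by a frequency-dependent sensitivity and pooled into a single score; the weights below are illustrative placeholders, not the calibrated visual sensitivities described in the paper:

        import numpy as np
        from scipy.fft import dctn

        def dct_block_metric(ref, test, block=8):
            """Pool perceptually weighted blockwise DCT differences into one distortion score."""
            h, w = ref.shape
            h -= h % block
            w -= w % block
            # Illustrative weighting: low-frequency errors count more than high-frequency ones.
            u, v = np.meshgrid(np.arange(block), np.arange(block), indexing="ij")
            weights = 1.0 / (1.0 + 0.5 * (u + v))
            err = 0.0
            for y in range(0, h, block):
                for x in range(0, w, block):
                    d = dctn(ref[y:y+block, x:x+block], norm="ortho") \
                        - dctn(test[y:y+block, x:x+block], norm="ortho")
                    err += np.sum((weights * d) ** 2)
            return np.sqrt(err / (h * w))

        rng = np.random.default_rng(5)
        ref = rng.random((128, 128))
        print(dct_block_metric(ref, ref + 0.02 * rng.standard_normal(ref.shape)))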

  1. SU-F-I-14: 3D Breast Digital Phantom for XACT Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, S; Laaroussi, R; Chen, J

    Purpose: The X-ray induced acoustic computed tomography (XACT) is a new imaging modality which combines X-ray contrast and high ultrasonic resolution in a single modality. Using XACT in breast imaging, a 3D breast volume can be imaged by only one pulsed X-ray radiation, which could dramatically reduce the imaging dose for patients undergoing breast cancer screening and diagnosis. A 3D digital phantom that contains both X-ray properties and acoustic properties of different tissue types is needed for developing and optimizing the XACT system. The purpose of this study is to offer a realistic breast digital phantom as a valuable tool for improving breast XACT imaging techniques and potentially leading to better diagnostic outcomes. Methods: A series of breast CT images along the coronal plane from a patient who has breast calcifications are used as the source images. A HU value based segmentation algorithm is employed to identify breast tissues in five categories, namely the skin tissue, fat tissue, glandular tissue, chest bone and calcifications. For each pixel, the dose related parameters, such as material components and density, and acoustic related parameters, such as frequency-dependent acoustic attenuation coefficient and bandwidth, are assigned based on tissue types. Meanwhile, other parameters which are used in sound propagation, including the sound speed, thermal expansion coefficient, and heat capacity, are also assigned to each tissue. Results: A series of 2D tissue-type images is acquired first and the 3D digital breast phantom is obtained by using commercial 3D reconstruction software. When given specific settings, including dose depositions and ultrasound center frequency, the X-ray induced initial pressure rise can be calculated accordingly. Conclusion: The proposed 3D breast digital phantom represents a realistic breast anatomic structure and provides a valuable tool for developing and evaluating the system performance for XACT.
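
    A sketch of the HU-based labelling and parameter-assignment step, with illustrative HU intervals and tissue properties (the study's actual thresholds and values are not given in the record):

        import numpy as np

        # Illustrative HU intervals and per-tissue properties (density in g/cm^3,
        # sound speed in m/s); the study's actual values and classes may differ.
        TISSUES = [
            ("fat",           (-200,  -20), {"density": 0.95, "sound_speed": 1450.0}),
            ("glandular",     ( -20,  100), {"density": 1.02, "sound_speed": 1540.0}),
            ("skin",          ( 100,  300), {"density": 1.09, "sound_speed": 1600.0}),
            ("bone",          ( 300, 2000), {"density": 1.90, "sound_speed": 3000.0}),
            ("calcification", (2000, 4000), {"density": 2.20, "sound_speed": 3500.0}),
        ]

        def segment_and_assign(hu_volume):
            """Label each voxel by HU interval and build per-voxel property maps."""
            labels = np.full(hu_volume.shape, -1, dtype=int)
            density = np.zeros(hu_volume.shape)
            speed = np.zeros(hu_volume.shape)
            for idx, (_, (lo, hi), props) in enumerate(TISSUES):
                sel = (hu_volume >= lo) & (hu_volume < hi)
                labels[sel] = idx
                density[sel] = props["density"]
                speed[sel] = props["sound_speed"]
            return labels, density, speed

        hu = np.random.default_rng(6).integers(-200, 2500, size=(16, 64, 64))
        labels, density, speed = segment_and_assign(hu)
        print(np.bincount(labels[labels >= 0].ravel()))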

  2. The x-ray light valve: a potentially low-cost, digital radiographic imaging system--a liquid crystal cell design for chest radiography.

    PubMed

    Szeto, Timothy C; Webster, Christie Ann; Koprinarov, Ivaylo; Rowlands, J A

    2008-03-01

    Digital x-ray radiographic systems are desirable as they offer high quality images which can be processed, transferred, and stored without secondary steps. However, current clinical systems are extraordinarily expensive in comparison to film-based systems. Thus, there is a need for an economical digital imaging system for general radiology. The x-ray light valve (XLV) is a novel digital x-ray detector concept with the potential for high image quality and low cost. The XLV is comprised of a photoconductive detector layer and liquid crystal (LC) cell physically coupled in a sandwich structure. Upon exposure to x rays, charge is collected at the surface of the photoconductor, causing a change in the reflective properties of the LC cell. The visible image so formed can subsequently be digitized with an optical scanner. By choosing the properties of the LC cell in combination with the appropriate photoconductor thickness and bias potentials, the XLV can be optimized for various diagnostic imaging tasks. Specifically for chest radiography, we identified three potentially practical reflective cell designs by selecting from those commonly used in LC display technology. The relationship between reflectance and x-ray exposure (i.e., the characteristic curve) was determined for all three cells using a theoretical model. The results indicate that the reflective electrically controlled birefringence (r-ECB) cell is the preferred choice for chest radiography, provided that the characteristic curve can be shifted towards lower exposures. The feasibility of the shift of the characteristic curve is shown experimentally. The experimental results thus demonstrate that an XLV based on the r-ECB cell design exhibits a characteristic curve suitable for chest radiography.

  3. Resolution improvement in positron emission tomography using anatomical Magnetic Resonance Imaging.

    PubMed

    Chu, Yong; Su, Min-Ying; Mandelkern, Mark; Nalcioglu, Orhan

    2006-08-01

    An ideal imaging system should provide information with high sensitivity and high spatial and temporal resolution. Unfortunately, it is not possible to satisfy all of these desired features in a single modality. In this paper, we discuss methods to improve the spatial resolution in positron emission tomography (PET) using a priori information from Magnetic Resonance Imaging (MRI). Our approach uses an image restoration algorithm based on the maximization of mutual information (MMI), which has found significant success for optimizing multimodal image registration. The MMI criterion is used to estimate the parameters in the Sharpness-Constrained Wiener filter. The generated filter is then applied to restore PET images of a realistic digital brain phantom. The resulting restored images show improved resolution and better signal-to-noise ratio compared to the interpolated PET images. We conclude that a Sharpness-Constrained Wiener filter having parameters optimized from an MMI criterion may be useful for restoring spatial resolution in PET based on a priori information from correlated MRI.

  4. Computer-aided classification of breast masses using contrast-enhanced digital mammograms

    NASA Astrophysics Data System (ADS)

    Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin

    2018-02-01

    By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a new promising imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study is to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme of CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was initially computed. Last, four multilayer perceptron-based machine learning classifiers integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method were built to classify mass regions depicted on LE and DES images, respectively. Initially, when the CAD scheme was applied to the original segmentation of DES and LE images, the areas under the ROC curves were 0.7585+/-0.0526 and 0.7534+/-0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme significantly increased to 0.8477+/-0.0376 (p<0.01). Since DES images eliminate the overlapping effect of dense breast tissue on lesions, segmentation accuracy was significantly improved compared to regular mammograms. The study demonstrated that computer-aided classification of breast masses using CEDM images yielded higher performance.

  5. A Dynamic Image Quality Evaluation of Videofluoroscopy Images: Considerations for Telepractice Applications.

    PubMed

    Burns, Clare L; Keir, Benjamin; Ward, Elizabeth C; Hill, Anne J; Farrell, Anna; Phillips, Nick; Porter, Linda

    2015-08-01

    High-quality fluoroscopy images are required for accurate interpretation of videofluoroscopic swallow studies (VFSS) by speech pathologists and radiologists. Consequently, integral to developing any system to conduct VFSS remotely via telepractice is ensuring that the quality of the VFSS images transferred via the telepractice system is optimized. This study evaluates the extent of change observed in image quality when videofluoroscopic images are transmitted from a digital fluoroscopy system to (a) current clinical equipment (KayPentax Digital Swallowing Workstation) and (b) four different telepractice system configurations. The telepractice system configurations consisted of either a local C20 or C60 Cisco TelePresence System (codec unit) connected to the digital fluoroscopy system and linked to a second remote C20 or C60 Cisco TelePresence System via a network running at speeds of either 2, 4 or 6 megabits per second (Mbit/s). Image quality was tested using the NEMA XR 21 Phantom, and results demonstrated some loss in spatial resolution, low contrast detectability and temporal resolution for all transferred images when compared to the fluoroscopy source. When using higher capacity codec units and/or the highest bandwidths to support data transmission, image quality transmitted through the telepractice system was found to be comparable if not better than the current clinical system. This study confirms that telepractice systems can be designed to support fluoroscopy image transfer and highlights important considerations when developing telepractice systems for VFSS analysis to ensure high-quality radiological image reproduction.

  6. Digital image processing based identification of nodes and internodes of chopped biomass stems

    USDA-ARS?s Scientific Manuscript database

    Chemical composition of biomass feedstock is an important parameter for optimizing the yield and economics of various bioconversion pathways. Although understandably, the chemical composition of biomass varies among species, varieties, and plant components, there is distinct variation even among ste...

  7. Quantifying Particle Numbers and Mass Flux in Drifting Snow

    NASA Astrophysics Data System (ADS)

    Crivelli, Philip; Paterna, Enrico; Horender, Stefan; Lehning, Michael

    2016-12-01

    We compare two of the most common methods of quantifying mass flux, particle numbers and particle-size distribution for drifting snow events, the snow-particle counter (SPC), a laser-diode-based particle detector, and particle tracking velocimetry based on digital shadowgraphic imaging. The two methods were correlated for mass flux and particle number flux. For the SPC measurements, the device was calibrated by the manufacturer beforehand. The shadowgraphic imaging method measures particle size and velocity directly from consecutive images, and before each new test the image pixel length is newly calibrated. A calibration study with artificially scattered sand particles and glass beads provides suitable settings for the shadowgraphic imaging as well as obtaining a first correlation of the two methods in a controlled environment. In addition, using snow collected in trays during snowfall, several experiments were performed to observe drifting snow events in a cold wind tunnel. The results demonstrate a high correlation between the mass flux obtained for the calibration studies (r ≥ 0.93) and good correlation for the drifting snow experiments (r ≥ 0.81). The impact of measurement settings is discussed in order to reliably quantify particle numbers and mass flux in drifting snow. The study was designed and performed to optimize the settings of the digital shadowgraphic imaging system for both the acquisition and the processing of particles in a drifting snow event. Our results suggest that these optimal settings can be transferred to different imaging set-ups to investigate sediment transport processes.

  8. A home-built digital optical MRI console using high-speed serial links.

    PubMed

    Tang, Weinan; Wang, Weimin; Liu, Wentao; Ma, Yajun; Tang, Xin; Xiao, Liang; Gao, Jia-Hong

    2015-08-01

    To develop a high performance, cost-effective digital optical console for scalable multichannel MRI. The console system was implemented with flexibility and efficiency based on a modular architecture with distributed pulse sequencers. High-speed serial links were optimally utilized to interconnect the system, providing fast digital communication with a multi-gigabit data rate. The conventional analog radio frequency (RF) chain was replaced with a digital RF manipulation. The acquisition electronics were designed in close proximity to RF coils and preamplifiers, using a digital optical link to transmit the MR signal. A prototype of the console was constructed with a broad frequency range from direct current to 100 MHz. A temporal resolution of 1 μs was achieved for both the RF and gradient operations. The MR signal was digitized in the scanner room with an overall dynamic range between 16 and 24 bits and was transmitted to a master controller over a duplex optic fiber with a high data rate of 3.125 gigabits per second. High-quality phantom and human images were obtained using the prototype on both 0.36T and 1.5T clinical MRI scanners. A homemade digital optical MRI console with high-speed serial interconnection has been developed to better serve imaging research and clinical applications. © 2014 Wiley Periodicals, Inc.

  9. Optimized lighting method of applying shaped-function signal for increasing the dynamic range of LED-multispectral imaging system

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Hu, Yajia; Li, Gang; Lin, Ling

    2018-02-01

    This paper proposes an optimized lighting method of applying a shaped-function signal for increasing the dynamic range of a light emitting diode (LED)-multispectral imaging system. The optimized lighting method is based on the linear response zone of the analog-to-digital conversion (ADC) and the spectral response of the camera. The auxiliary light at a higher camera-sensitivity area is introduced to increase the A/D quantization levels that are within the linear response zone of the ADC and improve the signal-to-noise ratio. The active light is modulated by the shaped-function signal to improve the gray-scale resolution of the image, and the auxiliary light is modulated by a constant intensity signal, which makes it easy to acquire the images under the active light irradiation. The least squares method is employed to precisely extract the desired images. One wavelength in multispectral imaging based on LED illumination was taken as an example. It has been proven by experiments that the gray-scale resolution and the accuracy of information of the images acquired by the proposed method were both significantly improved. The optimized method opens up avenues for the hyperspectral imaging of biological tissue.
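
    The least-squares extraction step can be sketched by fitting each pixel's intensity across frames to a known shaped-function drive plus a constant term; the shaped signal, frame count and noise level below are assumptions for illustration only:

        import numpy as np

        def extract_active_image(frames, shaped_signal):
            """Per-pixel least-squares fit of frame intensity to [shaped signal, constant].

            Returns the coefficient image of the shaped (active-light) component; the
            constant term absorbs the steady auxiliary light and any fixed offset."""
            t, h, w = frames.shape
            A = np.column_stack([shaped_signal, np.ones(t)])        # (t, 2) design matrix
            coeffs, *_ = np.linalg.lstsq(A, frames.reshape(t, -1), rcond=None)
            return coeffs[0].reshape(h, w)                          # active-light image

        # Toy example: a ramp-modulated active image plus a constant auxiliary level and noise.
        rng = np.random.default_rng(13)
        active = rng.random((64, 64))
        shaped = np.linspace(0.2, 1.0, 16)                          # assumed shaped-function drive
        frames = shaped[:, None, None] * active + 0.5 + 0.01 * rng.standard_normal((16, 64, 64))
        print(float(np.abs(extract_active_image(frames, shaped) - active).max()))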

  10. Optimized lighting method of applying shaped-function signal for increasing the dynamic range of LED-multispectral imaging system.

    PubMed

    Yang, Xue; Hu, Yajia; Li, Gang; Lin, Ling

    2018-02-01

    This paper proposes an optimized lighting method of applying a shaped-function signal for increasing the dynamic range of a light emitting diode (LED)-multispectral imaging system. The optimized lighting method is based on the linear response zone of the analog-to-digital conversion (ADC) and the spectral response of the camera. The auxiliary light at a higher camera-sensitivity area is introduced to increase the A/D quantization levels that are within the linear response zone of the ADC and improve the signal-to-noise ratio. The active light is modulated by the shaped-function signal to improve the gray-scale resolution of the image, and the auxiliary light is modulated by a constant intensity signal, which makes it easy to acquire the images under the active light irradiation. The least squares method is employed to precisely extract the desired images. One wavelength in multispectral imaging based on LED illumination was taken as an example. It has been proven by experiments that the gray-scale resolution and the accuracy of information of the images acquired by the proposed method were both significantly improved. The optimized method opens up avenues for the hyperspectral imaging of biological tissue.

  11. An optimized digital watermarking algorithm in wavelet domain based on differential evolution for color image.

    PubMed

    Cui, Xinchun; Niu, Yuying; Zheng, Xiangwei; Han, Yingshuai

    2018-01-01

    In this paper, a new color watermarking algorithm based on differential evolution is proposed. A color host image is first converted from RGB space to YIQ space, which is more suitable for the human visual system. Then, apply three-level discrete wavelet transformation to luminance component Y and generate four different frequency sub-bands. After that, perform singular value decomposition on these sub-bands. In the watermark embedding process, apply discrete wavelet transformation to a watermark image after the scrambling encryption processing. Our new algorithm uses differential evolution algorithm with adaptive optimization to choose the right scaling factors. Experimental results show that the proposed algorithm has a better performance in terms of invisibility and robustness.
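
    A sketch of the embedding step only, assuming PyWavelets is available: a three-level DWT of the luminance channel, SVD of the approximation band, and singular values perturbed by a scaled watermark. The differential-evolution search for the scaling factor, the RGB-to-YIQ conversion and the scrambling encryption are omitted, and the fixed alpha below stands in for the optimized factor:

        import numpy as np
        import pywt

        def embed_in_subband(y_channel, watermark, alpha=0.05, wavelet="haar"):
            """Embed a watermark into the singular values of the level-3 approximation band.

            `alpha` stands in for the scaling factor that the paper tunes with
            differential evolution; here it is simply fixed."""
            coeffs = pywt.wavedec2(y_channel.astype(float), wavelet, level=3)
            cA3 = coeffs[0]
            U, S, Vt = np.linalg.svd(cA3, full_matrices=False)
            wm = np.resize(watermark.astype(float).ravel(), S.shape)  # match length crudely
            coeffs[0] = (U * (S + alpha * wm)) @ Vt                   # re-synthesise the band
            return pywt.waverec2(coeffs, wavelet)

        rng = np.random.default_rng(7)
        host_y = rng.random((256, 256)) * 255
        mark = rng.integers(0, 2, size=64)
        marked = embed_in_subband(host_y, mark)
        print(float(np.max(np.abs(marked[:256, :256] - host_y))))     # small, controlled change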

  12. Cancer risk estimation in Digital Breast Tomosynthesis using GEANT4 Monte Carlo simulations and voxel phantoms.

    PubMed

    Ferreira, P; Baptista, M; Di Maria, S; Vaz, P

    2016-05-01

    The aim of this work was to estimate the risk of radiation induced cancer following the Portuguese breast screening recommendations for Digital Mammography (DM) when applied to Digital Breast Tomosynthesis (DBT) and to evaluate how the risk to induce cancer could influence the energy used in breast diagnostic exams. The organ doses were calculated by Monte Carlo simulations using a female voxel phantom and considering the acquisition of 25 projection images. Single organ cancer incidence risks were calculated in order to assess the total effective radiation induced cancer risk. The screening strategy techniques considered were: DBT in Cranio-Caudal (CC) view and two-view DM (CC and Mediolateral Oblique (MLO)). The risk of cancer incidence following the Portuguese screening guidelines (screening every two years in the age range of 50-80 years) was calculated by assuming a single CC DBT acquisition view as a standalone screening strategy and compared with two-view DM. The difference in the total effective risk between DBT and DM is quite low. Nevertheless, in DBT an increase of risk for the lung is observed with respect to DM. The lung is also the organ that is mainly affected when non-optimal beam energy (in terms of image quality and absorbed dose) is used instead of an optimal one. The use of non-optimal energies could increase the risk of lung cancer incidence by a factor of about 2. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  13. The effect of image processing on the detection of cancers in digital mammography.

    PubMed

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  14. Image gathering and restoration - Information and visual quality

    NASA Technical Reports Server (NTRS)

    Mccormick, Judith A.; Alter-Gartenberg, Rachel; Huck, Friedrich O.

    1989-01-01

    A method is investigated for optimizing the end-to-end performance of image gathering and restoration for visual quality. To achieve this objective, one must inevitably confront the problems that the visual quality of restored images depends on perceptual rather than mathematical considerations and that these considerations vary with the target, the application, and the observer. The method adopted in this paper is to optimize image gathering informationally and to restore images interactively to obtain the visually preferred trade-off among fidelity, resolution, sharpness, and clarity. The results demonstrate that this method leads to significant improvements over the visual quality obtained by the traditional digital processing methods. These traditional methods allow a significant loss of visual quality to occur because they treat the design of the image-gathering system and the formulation of the image-restoration algorithm as two separate tasks and fail to account for the transformations between the continuous and the discrete representations in image gathering and reconstruction.

  15. Validation of no-reference image quality index for the assessment of digital mammographic images

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Barufaldi, Bruno; Borges, Lucas R.; Gabarda, Salvador; Bakic, Predrag R.; Maidment, Andrew D. A.; Schiabel, Homero; Vieira, Marcelo A. C.

    2016-03-01

    To ensure optimal clinical performance of digital mammography, it is necessary to obtain images with high spatial resolution and low noise, keeping radiation exposure as low as possible. These requirements directly affect the interpretation of radiologists. The quality of a digital image should be assessed using objective measurements. In general, these methods measure the similarity between a degraded image and an ideal image without degradation (ground-truth), used as a reference. These methods are called Full-Reference Image Quality Assessment (FR-IQA). However, for digital mammography, an image without degradation is not available in clinical practice; thus, an objective method to assess the quality of mammograms must be performed without a reference. The purpose of this study is to present a Normalized Anisotropic Quality Index (NAQI), based on the Rényi entropy in the pseudo-Wigner domain, to assess mammography images in terms of spatial resolution and noise without any reference. The method was validated using synthetic images acquired through an anthropomorphic breast software phantom, and clinical exposures of anthropomorphic breast physical phantoms and patients' mammograms. The results reported by this no-reference index follow the same behavior as other well-established full-reference metrics, e.g., the peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM). Reductions of 50% of the radiation dose in phantom images were translated as a decrease of 4 dB in the PSNR, 25% in the SSIM and 33% in the NAQI, evidencing that the proposed metric is sensitive to the noise resulting from dose reduction. The clinical results showed that images reduced to 53% and 30% of the standard radiation dose reported reductions of 15% and 25% in the NAQI, respectively. Thus, this index may be used in clinical practice as an image quality indicator to improve the quality assurance programs in mammography; hence, the proposed method reduces inter-observer subjectivity in the reporting of image quality assessment.

  16. An integral design strategy combining optical system and image processing to obtain high resolution images

    NASA Astrophysics Data System (ADS)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high resolution images, and the performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in a failure of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented to combine the merit function during optical design together with the constraint conditions of image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals which are indispensable for image processing, while the ultimate goal is to obtain high resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then a Wiener filter algorithm is adopted to process the image simulation and the mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution which is missed by the traditional design methods. Especially when designing complex optical systems, this integral design strategy has obvious advantages in simplifying structure and reducing cost, as well as in obtaining high resolution images simultaneously, which has a promising perspective of industrial application.
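
    The digital half of such a design can be sketched with a frequency-domain Wiener restoration evaluated by MSE; the blur kernel, noise level and noise-to-signal ratio below are assumptions, and the ZEMAX-side optical co-optimization is of course not represented:

        import numpy as np

        def wiener_restore(blurred, psf, nsr=1e-2):
            """Frequency-domain Wiener deconvolution with a scalar noise-to-signal ratio."""
            H = np.fft.fft2(psf, s=blurred.shape)
            G = np.fft.fft2(blurred)
            W = np.conj(H) / (np.abs(H) ** 2 + nsr)          # Wiener filter transfer function
            return np.real(np.fft.ifft2(W * G))

        def mse(a, b):
            return float(np.mean((a - b) ** 2))

        # Simulate: blur a test image with a small Gaussian PSF, add noise, restore.
        rng = np.random.default_rng(8)
        img = np.kron(rng.random((16, 16)), np.ones((8, 8)))          # piecewise-constant target
        yy, xx = np.mgrid[-3:4, -3:4]
        psf = np.exp(-(xx ** 2 + yy ** 2) / 4.0)
        psf /= psf.sum()
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
        blurred += 0.01 * rng.standard_normal(img.shape)
        restored = wiener_restore(blurred, psf, nsr=1e-2)
        print(mse(blurred, img), mse(restored, img))                  # restoration lowers the MSE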

  17. X-ray imaging using amorphous selenium: photoinduced discharge (PID) readout for digital general radiography.

    PubMed

    Rowlands, J A; Hunter, D M

    1995-12-01

    Digital radiographic systems based on photoconductive layers with the latent charge image read out by photoinduced discharge (PID) are investigated theoretically. Previously, a number of different systems have been proposed using sandwiched photoconductor and insulator layers and readout using a scanning laser beam. These systems are shown to have the general property of being very closely coupled (i.e., optimization of one imaging characteristic usually impacts negatively on others). The presence of a condensed state insulator between the photoconductor surface and the readout electrode does, however, confer a great advantage over systems using air gaps with their relatively low breakdown field. The greater breakdown field of condensed state dielectrics permits the modification of the electric field during the period between image formation and image readout. The trade-off between readout speed and noise makes this system suitable for instant general radiography and even rapid sequence radiography; however, the system is unsuitable for the low exposure rates used in fluoroscopy.

  18. Optimization of digital breast tomosynthesis (DBT) acquisition parameters for human observers: effect of reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Zeng, Rongping; Badano, Aldo; Myers, Kyle J.

    2017-04-01

    We showed in our earlier work that the choice of reconstruction methods does not affect the optimization of DBT acquisition parameters (angular span and number of views) using simulated breast phantom images in detecting lesions with a channelized Hotelling observer (CHO). In this work we investigate whether the model-observer based conclusion is valid when using humans to interpret images. We used previously generated DBT breast phantom images and recruited human readers to find the optimal geometry settings associated with two reconstruction algorithms, filtered back projection (FBP) and simultaneous algebraic reconstruction technique (SART). The human reader results show that image quality trends as a function of the acquisition parameters are consistent between FBP and SART reconstructions. The consistent trends confirm that the optimization of DBT system geometry is insensitive to the choice of reconstruction algorithm. The results also show that humans perform better in SART reconstructed images than in FBP reconstructed images. In addition, we applied CHOs with three commonly used channel models, Laguerre-Gauss (LG) channels, square (SQR) channels and sparse difference-of-Gaussian (sDOG) channels. We found that LG channels predict human performance trends better than SQR and sDOG channel models for the task of detecting lesions in tomosynthesis backgrounds. Overall, this work confirms that the choice of reconstruction algorithm is not critical for optimizing DBT system acquisition parameters.
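
    For reference, a 2D channelized Hotelling observer with Laguerre-Gauss channels can be sketched as follows (the study uses a 3D observer on reconstructed volumes; the channel width, channel count and toy signal here are arbitrary choices):

        import numpy as np
        from scipy.special import eval_laguerre

        def lg_channels(dim, n_channels=5, a=10.0):
            """Build 2-D Laguerre-Gauss channel templates on a dim x dim grid (width `a` assumed)."""
            yy, xx = np.mgrid[0:dim, 0:dim] - (dim - 1) / 2.0
            r2 = xx ** 2 + yy ** 2
            chans = [np.sqrt(2.0) / a * np.exp(-np.pi * r2 / a ** 2)
                     * eval_laguerre(j, 2.0 * np.pi * r2 / a ** 2) for j in range(n_channels)]
            return np.stack([c.ravel() for c in chans], axis=1)     # (dim*dim, n_channels)

        def cho_snr(signal_present, signal_absent, channels):
            """Channelized Hotelling observer detectability SNR from two sets of images."""
            v1 = signal_present.reshape(len(signal_present), -1) @ channels
            v0 = signal_absent.reshape(len(signal_absent), -1) @ channels
            s = v1.mean(axis=0) - v0.mean(axis=0)
            k = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
            w = np.linalg.solve(k, s)                               # Hotelling template in channel space
            return float(np.sqrt(s @ w))

        # Toy study: a faint Gaussian signal in white noise.
        rng = np.random.default_rng(9)
        dim, n = 32, 400
        yy, xx = np.mgrid[0:dim, 0:dim] - (dim - 1) / 2.0
        signal = 0.4 * np.exp(-(xx ** 2 + yy ** 2) / (2 * 3.0 ** 2))
        noise = rng.standard_normal((2 * n, dim, dim))
        channels = lg_channels(dim)
        print(cho_snr(noise[:n] + signal, noise[n:], channels))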

  19. Automated reconstruction of standing posture panoramas from multi-sector long limb x-ray images

    NASA Astrophysics Data System (ADS)

    Miller, Linzey; Trier, Caroline; Ben-Zikri, Yehuda K.; Linte, Cristian A.

    2016-03-01

    Due to the digital X-ray imaging system's limited field of view, several individual sector images are required to capture the posture of an individual in standing position. These images are then "stitched together" to reconstruct the standing posture. We have created an image processing application that automates the stitching, therefore minimizing user input, optimizing workflow, and reducing human error. The application begins with pre-processing the input images by removing artifacts, filtering out isolated noisy regions, and amplifying a seamless bone edge. The resulting binary images are then registered together using a rigid-body intensity based registration algorithm. The identified registration transformations are then used to map the original sector images into the panorama image. Our method focuses primarily on the use of the anatomical content of the images to generate the panoramas as opposed to using external markers employed to aid with the alignment process. Currently, results show robust edge detection prior to registration and we have tested our approach by comparing the resulting automatically-stitched panoramas to the manually stitched panoramas in terms of registration parameters, target registration error of homologous markers, and the homogeneity of the digitally subtracted automatically- and manually-stitched images using 26 patient datasets.
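
    One simple intensity-based alignment step, phase correlation for recovering the translation between two overlapping sectors, can be sketched as below; the application described uses a full rigid-body, intensity-based registration, so this only illustrates the underlying idea:

        import numpy as np

        def phase_correlation_shift(a, b):
            """Estimate the integer (row, col) shift d such that b is approximately a rolled by d."""
            F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
            corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Peaks past the midpoint correspond to negative shifts.
            return tuple(int(p) - s if p > s // 2 else int(p) for p, s in zip(peak, corr.shape))

        rng = np.random.default_rng(10)
        base = rng.random((256, 256))
        shifted = np.roll(base, shift=(24, -13), axis=(0, 1))     # a second, offset "sector"
        print(phase_correlation_shift(base, shifted))              # expected (24, -13)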

  20. An Automatic Procedure for Combining Digital Images and Laser Scanner Data

    NASA Astrophysics Data System (ADS)

    Moussa, W.; Abdel-Wahab, M.; Fritsch, D.

    2012-07-01

    Besides improving both the geometry and the visual quality of the model, the integration of close-range photogrammetry and terrestrial laser scanning techniques aims at filling gaps in laser scanner point clouds to avoid modeling errors, reconstructing more details in higher resolution and recovering simple structures with less geometric detail. Thus, within this paper a flexible approach for the automatic combination of digital images and laser scanner data is presented. Our approach comprises two methods for data fusion. The first method starts with a marker-free registration of digital images based on a point-based environment model (PEM) of a scene, which stores the 3D laser scanner point clouds associated with intensity and RGB values. The PEM allows the extraction of accurate control information for the direct computation of absolute camera orientations with redundant information by means of accurate space resection methods. In order to use the computed relations between the digital images and the laser scanner data, an extended Helmert (seven-parameter) transformation is introduced and its parameters are estimated. Prior to that, in the second method, the local relative orientation parameters of the camera images are calculated by means of an optimized Structure and Motion (SaM) reconstruction method. Then, using the determined transformation parameters results in absolutely oriented images in relation to the laser scanner data. With the resulting absolute orientations we have employed robust dense image reconstruction algorithms to create oriented dense image point clouds, which are automatically combined with the laser scanner data to form a complete detailed representation of a scene. Examples of different data sets are shown and experimental results demonstrate the effectiveness of the presented procedures.
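
    Applying an extended Helmert (seven-parameter) transformation to 3D points is straightforward; a sketch with an assumed Euler-angle rotation parameterization (estimation of the parameters themselves is omitted) might look like this:

        import numpy as np

        def helmert7(points, translation, angles_rad, scale):
            """Apply a 7-parameter Helmert (similarity) transform: X' = t + s * R(rx, ry, rz) X."""
            rx, ry, rz = angles_rad
            Rx = np.array([[1, 0, 0], [0, np.cos(rx), -np.sin(rx)], [0, np.sin(rx), np.cos(rx)]])
            Ry = np.array([[np.cos(ry), 0, np.sin(ry)], [0, 1, 0], [-np.sin(ry), 0, np.cos(ry)]])
            Rz = np.array([[np.cos(rz), -np.sin(rz), 0], [np.sin(rz), np.cos(rz), 0], [0, 0, 1]])
            R = Rz @ Ry @ Rx
            return np.asarray(translation) + scale * (points @ R.T)

        # A photogrammetric point cloud (rows are XYZ) mapped into the scanner frame.
        pts = np.random.default_rng(11).random((100, 3))
        mapped = helmert7(pts, translation=[12.0, -3.5, 0.8],
                          angles_rad=[0.01, -0.02, 0.5], scale=1.002)
        print(mapped.shape)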

  1. Dual-energy in mammography: feasibility study

    NASA Astrophysics Data System (ADS)

    Jafroudi, Hamid; Lo, Shih-Chung B.; Li, Huai; Steller Artz, Dorothy E.; Freedman, Matthew T.; Mun, Seong K.

    1996-04-01

    The purpose of this work is to examine the feasibility of dual-energy techniques to enhance the detection of microcalcifications in digital mammography. The digital mammography system used in this study consists of two different mammography systems; one is a conventional mammography system with a molybdenum target and Mo filtration and the other is the clinical version of a low dose x-ray system with a tungsten target and aluminum filtration. The low dose system is optimized for screen-film mammography with a highly efficient scatter rejection device built by Fischer Imaging Systems for evaluation at NIH. The system was designed by the University of Southern California based on multiparameter optimization techniques. Prototypes of this system have been constructed and evaluated at the Center for Devices and Radiological Health. The digital radiography system is based on the Fuji 9000 computed radiography (CR) system which uses a storage phosphor imaging plate as the receptor. High resolution plates (HR-V) are used in this study. Dual-energy is one technique to reduce the structured noise associated with the complexity of the background of normal anatomy surrounding a lesion. This can be done by taking advantage of the x-ray attenuation characteristics of two different structures, such as soft tissue and bone in chest radiography. We have applied this technique to the detection of microcalcifications in mammography. The overall system performance based on this technique is evaluated. Results presented are based on the evaluation of phantom images.

  2. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
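
    The homomorphic idea can be sketched as log transform, Fourier-domain band-stop around the grid frequency, and exponentiation back; the rotated-grid frequency search and the filter designs of the paper are not reproduced, and the grid frequency below is chosen to fall on a DFT bin for clarity:

        import numpy as np

        def homomorphic_notch(image, grid_freq_cyc_per_px, bandwidth=0.01):
            """Suppress a multiplicative 1-D grid pattern by log -> Fourier band-stop -> exp."""
            logged = np.log(np.clip(image, 1e-6, None))
            spec = np.fft.fft2(logged)
            fy = np.fft.fftfreq(image.shape[0])[:, None]          # vertical spatial frequency
            fx = np.fft.fftfreq(image.shape[1])[None, :]
            # Band-stop around +/- the grid frequency along the vertical axis.
            stop = (np.abs(np.abs(fy) - grid_freq_cyc_per_px) < bandwidth) & (np.abs(fx) < 0.05)
            spec[stop] = 0.0
            return np.exp(np.real(np.fft.ifft2(spec)))

        # Toy example: a smooth "exposure" multiplied by a vertical grid pattern.
        yy, xx = np.mgrid[0:256, 0:256]
        exposure = 100.0 + 30.0 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / 5000.0)
        grid = 1.0 + 0.2 * np.cos(2 * np.pi * 0.25 * yy)          # grid at 0.25 cycles/pixel
        cleaned = homomorphic_notch(exposure * grid, grid_freq_cyc_per_px=0.25)
        print(float(np.max(np.abs(cleaned - exposure) / exposure)))  # small residual error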

  3. Digital breast tomosynthesis for detecting multifocal and multicentric breast cancer: influence of acquisition geometry on model observer performance in breast phantom images

    NASA Astrophysics Data System (ADS)

    Wen, Gezheng; Park, Subok; Markey, Mia K.

    2017-03-01

    Multifocal and multicentric breast cancer (MFMC), i.e., the presence of two or more tumor foci within the same breast, has an immense clinical impact on treatment planning and survival outcomes. Detecting multiple breast tumors is challenging as MFMC breast cancer is relatively uncommon, and human observers do not know the number or locations of tumors a priori. Digital breast tomosynthesis (DBT), in which an x-ray beam sweeps over a limited angular range across the breast, has the potential to improve the detection of multiple tumors [1, 2]. However, prior efforts to optimize DBT image quality only considered unifocal breast cancers (e.g., [3-9]), so the recommended geometries may not necessarily yield images that are informative for the task of detecting MFMC. Hence, the goal of this study is to employ a 3D multi-lesion (ml) channelized-Hotelling observer (CHO) to identify optimal DBT acquisition geometries for MFMC. Digital breast phantoms and simulated DBT scanners of different geometries (e.g., wide or narrow arc scans, different numbers of projections in each scan) were used to generate image data for the simulation study. Multiple 3D synthetic lesions were inserted into different breast regions to simulate MF cases and MC cases. 3D partial least squares (PLS) channels and 3D Laguerre-Gauss (LG) channels were estimated to capture discriminant information and correlations among signals in locally varying anatomical backgrounds, enabling the model observer to make both image-level and location-specific detection decisions. The 3D ml-CHO with PLS channels outperformed that with LG channels in this study. The simulated MF cases and MC cases were not equally difficult for the ml-CHO to detect across the different simulated DBT geometries considered in this analysis. Also, the results suggest that the optimal design of DBT may vary as the task of clinical interest changes, e.g., a geometry that is better for finding at least one lesion may be worse for counting the number of lesions.

  4. WE-D-BRD-01: Innovation in Radiation Therapy Delivery: Advanced Digital Linac Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, L; Wong, J; Li, R

    2014-06-15

    The last few years have witnessed significant advances in linac technology and therapeutic dose delivery methods. Digital linacs equipped with high dose rate FFF beams have been clinically implemented in a number of hospitals. Gated VMAT is becoming increasingly popular in treating tumors affected by respiratory motion. This session is devoted to updating the audience on these technical advances and to presenting our experience in clinically implementing the new linacs and dose delivery methods. Topics to be covered include technical features of the new generation of linacs from different vendors, dosimetric characteristics and the clinical need for FFF-beam based IMRT and VMAT, respiration-gated VMAT, the concept and implementation of station parameter optimized radiation therapy (SPORT), beam level imaging and onboard image guidance tools. Emphasis will be on providing a fundamental understanding of the new treatment delivery and image guidance strategies, control systems, and the associated dosimetric characteristics. Commissioning and acceptance experience on these new treatment delivery technologies will be reported. Clinical experience and challenges encountered during the process of implementation of the new treatment techniques and future applications of the systems will also be highlighted. Learning Objectives: Present background knowledge of emerging digital linacs and summarize their key geometric and dosimetric features. Present SPORT as an emerging radiation therapy modality specifically designed to take advantage of digital linacs. Discuss issues related to the acceptance and commissioning of the digital linacs and FFF beams. Describe the clinical utility of the new generation of digital linacs and their future applications.

  5. Optimization of wavefront coding imaging system using heuristic algorithms

    NASA Astrophysics Data System (ADS)

    González-Amador, E.; Padilla-Vivanco, A.; Toxqui-Quitl, C.; Zermeño-Loreto, O.

    2017-08-01

    Wavefront Coding (WFC) systems make use of an aspheric Phase-Mask (PM) and digital image processing to extend the Depth of Field (EDoF) of computational imaging systems. Over the years, several kinds of PM have been designed to produce a nearly defocus-invariant point spread function (PSF). In this paper, the phase deviation parameter is optimized by means of genetic algorithms (GAs). The merit function is the mean square error (MSE) between the diffraction-limited Modulation Transfer Function (MTF) and the MTF of the wavefront-coded system at different amounts of misfocus. WFC systems were simulated using cubic, trefoil, and four Zernike-polynomial phase-masks. Numerical results show approximate defocus invariance in all cases. Nevertheless, the best results are obtained with the trefoil phase-mask, because the decoded image is almost free of artifacts.
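
    As a rough sketch of the kind of merit function described above, the code below computes the MSE between the diffraction-limited MTF and the MTF of a cubic-phase-mask pupil evaluated at several defocus values; the grid size, phase strength alpha, and defocus coefficients are arbitrary assumptions, and a genetic algorithm (or any other optimizer) would search over alpha rather than the toy call shown here.

      # Hedged sketch: MSE between the diffraction-limited MTF and the MTF of a
      # wavefront-coded (cubic phase mask) pupil at several defocus values.
      import numpy as np

      def mtf(pupil):
          psf = np.abs(np.fft.fft2(pupil))**2
          otf = np.fft.fft2(psf)
          return np.abs(otf) / np.abs(otf).flat[0]        # normalize so MTF(0) = 1

      def merit(alpha, w20_list, N=128):
          x = np.linspace(-1, 1, N)
          X, Y = np.meshgrid(x, x)
          aperture = (X**2 + Y**2 <= 1).astype(float)
          mtf_ideal = mtf(aperture)                       # diffraction-limited reference
          mse = 0.0
          for w20 in w20_list:                            # defocus coefficients (in waves)
              phase = alpha * (X**3 + Y**3) + w20 * (X**2 + Y**2)
              pupil = aperture * np.exp(1j * 2 * np.pi * phase)
              mse += np.mean((mtf(pupil) - mtf_ideal)**2)
          return mse / len(w20_list)

      print(merit(alpha=20.0, w20_list=[0.0, 1.0, 2.0]))  # smaller is better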

  6. Performance of a computer-aided digital dermoscopic image analyzer for melanoma detection in 1,076 pigmented skin lesion biopsies.

    PubMed

    Del Rosario, Francis; Farahi, Jessica M; Drendel, Jesse; Buntinx-Krieg, Talayesa; Caravaglio, Joseph; Domozych, Renee; Chapman, Stephanie; Braunberger, Taylor; Dellavalle, Robert P; Norris, David A; Fathi, Ramin; Alkousakis, Theodore

    2018-05-01

    Digital dermoscopic image analysis of pigmented skin lesions (PSLs) has become increasingly popular, despite its unclear clinical utility. Unbiased, high-powered studies investigating the efficacy of commercially available systems are limited. The aim was to investigate the diagnostic performance of the FotoFinder Mole-Analyzer in assessing PSLs for cutaneous melanoma. In this 15-year retrospective study, the histopathologies of 1,076 biopsied PSLs among a total of 2,500 imaged PSLs were collected. The biopsied PSLs were categorized as benign or malignant (cutaneous melanoma) based on histopathology. Analyzer scores (0-1.00) for these PSLs were obtained and grouped according to histopathology. At an optimized cutoff score of 0.50, a sensitivity of 56% and a specificity of 74% were achieved. The area under the receiver operating characteristic curve was 0.698, indicating poor accuracy as a diagnostic tool. This study had a retrospective design and involved only a single institution. Our study reveals a low sensitivity of the scoring function of this digital dermoscopic image analyzer for detecting cutaneous melanomas. Physicians must apply keen clinical judgment when using such devices in the screening of suspicious PSLs. Copyright © 2017 American Academy of Dermatology, Inc. All rights reserved.
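
    For readers unfamiliar with these figures of merit, the short sketch below shows how sensitivity and specificity at a fixed cutoff and the area under the ROC curve are computed from analyzer scores and biopsy-proven labels; the arrays are placeholders, not the study's data.

      # Illustrative computation of sensitivity, specificity, and AUC.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      scores = np.array([0.12, 0.55, 0.40, 0.81, 0.33, 0.67])   # analyzer output, 0-1.00
      labels = np.array([0,    1,    0,    1,    0,    1])      # 1 = melanoma on histopathology

      cutoff = 0.50
      pred = scores >= cutoff
      sensitivity = (pred & (labels == 1)).sum() / (labels == 1).sum()
      specificity = (~pred & (labels == 0)).sum() / (labels == 0).sum()
      auc = roc_auc_score(labels, scores)
      print(f"sens={sensitivity:.2f}  spec={specificity:.2f}  AUC={auc:.3f}")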

  7. Preparing images for publication: part 1.

    PubMed

    Devigus, Alessandro; Paul, Stefan

    2006-04-01

    Images play a vital role in the publication and presentation of clinical and scientific work. Within clinical photography, color reproduction has always been a contentious issue. With the development of new technologies, the variables affecting color reproduction have changed, and photographers have moved away from film-based to digital photographic imaging systems. To develop an understanding of color, knowledge about the basic principles of light and vision is important. An object's color is determined by which wavelengths of light it reflects. Colors of light and colors of pigment behave differently. Due to technical limitations, monitors and printers are unable to reproduce the full range of colors the human eye can see, a range often described by the LAB color space. In order to optimize the output of digital clinical images, color management solutions need to be integrated into the photographic workflow; however, their use is still limited in the medical field. As described in part 2 of this article, calibrating your computer monitor and using an 18% gray background card are easy ways to enable more consistent color reproduction for publication. In addition, some basic information about the various camera settings is given to facilitate the use of this new digital equipment in daily practice.

  8. Information theoretic analysis of edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theory-based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge map approaches the maximum possible, a goal that can be achieved only by jointly optimizing all processes. Edge detection methods are usually compared by subjective judgment; there has been no common tool for evaluating the performance of different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides such a tool, allowing the different edge detection operators to be compared in a common environment.
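
    As a small, hedged illustration of an information-theoretic comparison, the sketch below estimates the mutual information between a reference edge map and a detected edge map from their joint histogram; it is a stand-in for the full end-to-end channel analysis described above, and the random test maps are placeholders.

      # Mutual information (in bits) between two edge maps via joint histogram.
      import numpy as np

      def mutual_information(a, b, bins=2):
          joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

      rng = np.random.default_rng(0)
      reference = rng.integers(0, 2, (64, 64))                 # "true" binary edge map
      detected = reference ^ (rng.random((64, 64)) < 0.1)      # detector output, 10% errors
      print(mutual_information(reference, detected))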

  9. Information theoretic methods for image processing algorithm optimization

    NASA Astrophysics Data System (ADS)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable with manual calibration; an automated approach is therefore a must. We will discuss an information-theory-based metric for evaluating an algorithm's adaptive characteristics ("adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  10. Optimal focal-plane restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1989-01-01

    Image restoration can be implemented efficiently by calculating the convolution of the digital image and a small kernel during image acquisition. Processing the image in the focal-plane in this way requires less computation than traditional Fourier-transform-based techniques such as the Wiener filter and constrained least-squares filter. Here, the values of the convolution kernel that yield the restoration with minimum expected mean-square error are determined using a frequency analysis of the end-to-end imaging system. This development accounts for constraints on the size and shape of the spatial kernel and all the components of the imaging system. Simulation results indicate the technique is effective and efficient.
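
    A much-simplified illustration of restoration by convolution with a small spatial kernel is given below; the 3x3 sharpening kernel is a generic choice of ours, not the minimum-MSE kernel derived in the paper.

      # Small-kernel restoration applied as a single convolution pass.
      import numpy as np
      from scipy.ndimage import convolve

      kernel = np.array([[ 0.0, -0.5,  0.0],
                         [-0.5,  3.0, -0.5],
                         [ 0.0, -0.5,  0.0]])               # entries sum to 1, preserving mean brightness

      blurred = np.random.default_rng(1).random((128, 128))  # stand-in for an acquired image
      restored = convolve(blurred, kernel, mode="nearest")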

  11. Digital radiography in general dental practice: a field study.

    PubMed

    Hellén-Halme, K; Nilsson, M; Petersson, A

    2007-07-01

    The aim of this field study was to survey the performance of digital radiography and how it is used by dentists in general dental practice. Nineteen general dental practitioners were visited at their clinics. Ambient light (illuminance) was measured in the rooms where the monitors were placed. Different technical display parameters were noted. Test images and two phantoms--one low-contrast phantom and one line-pair resolution phantom--were used to evaluate the digital system. How the dentists used the enhancement program was investigated by noting which functions were used. Average illuminance in the operating room was 668 lux (range 190-1250 lux). On radiographs of the low-contrast phantom taken at the clinic, the ability to observe the holes decreased as illuminance increased. On average, the "light percentage" initially set on the monitor had to be decreased by 17% and contrast by 10% to optimize the display of the test images. The general dental practitioners used the enhancement programs most often to alter brightness and contrast to obtain the subjectively best image. Large differences between the clinics were noted. Knowledge of how to handle digital equipment in general dental practice should be improved. A calibrated monitor of good quality should be given priority, as should proper ambient light conditions. There is a need to develop standardized quality controls for digital dental radiography.

  12. SU-F-P-06: Moving From Computed Radiography to Digital Radiography: A Collaborative Approach to Improve Image Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandoval, D; Mlady, G; Selwyn, R

    Purpose: To bring together radiologists, technologists, and physicists to utilize post-processing techniques in digital radiography (DR) in order to optimize image acquisition and improve image quality. Methods: Sub-optimal images acquired on a new General Electric (GE) DR system were flagged for follow-up by radiologists and reviewed by technologists and medical physicists. Various exam types from adult musculoskeletal (n=35), adult chest (n=4), and pediatric (n=7) were chosen for review. A total of 673 images were reviewed. These images were processed using five customized algorithms provided by GE. An image score sheet was created allowing the radiologist to assign a numeric score to each of the processed images, allowing for objective comparison to the original images. Each image was scored based on seven properties: 1) overall image look, 2) soft tissue contrast, 3) high contrast, 4) latitude, 5) tissue equalization, 6) edge enhancement, 7) visualization of structures. Additional space allowed for comments not captured in the scoring categories. Radiologists scored the images from 1 to 10, with 1 being non-diagnostic quality and 10 being superior diagnostic quality. Scores for each custom algorithm for each image set were summed. The algorithm with the highest score for each image set was then set as the default processing. Results: Images placed into the PACS “QC folder” for image processing reasons decreased. Feedback from radiologists was, overall, that image quality for these studies had improved. All default processing for these image types was changed to the new algorithm. Conclusion: This work is an example of the collaboration between radiologists, technologists, and physicists at the University of New Mexico to add value to the radiology department. The significant amount of work required to prepare the processing algorithms and to reprocess and score the images was eagerly taken on by all team members in order to produce better quality images and improve patient care.

  13. Evaluation of color grading impact in restoration process of archive films

    NASA Astrophysics Data System (ADS)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Janout, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2016-09-01

    Color grading of archive films is a very particular task in the process of their restoration. The ultimate goal of color grading here is to achieve the same look of the movie as intended at the time of its first presentation. The role of the expert restorer, the expert group, and the digital colorist in this complicated process is to find the optimal settings of the digital color grading system so that the resulting image look is as close as possible to the estimate of the original reference release print adjusted by the expert group of cinematographers. A methodology for subjective assessment of perceived differences between the outcomes of color grading is introduced, and results of a subjective study are presented. Techniques for objective assessment of perceived differences are discussed, and their performance is evaluated using ground truth obtained from the subjective experiment. In particular, a solution based on a calibrated digital single-lens reflex camera and subsequent analysis of image features captured from the projection screen is described. The system, based on our previous work, is further developed so that it can be used for the analysis of projected images. It allows color differences in these images to be assessed and their impact on the perceived difference in image look to be predicted.

  14. Diagnostic accuracy of phosphor plate systems and conventional radiography in the detection of simulated internal root resorption.

    PubMed

    Vasconcelos, Karla de Faria; Rovaris, Karla; Nascimento, Eduarda Helena Leandro; Oliveira, Matheus Lima; Távora, Débora de Melo; Bóscolo, Frab Norberto

    2017-11-01

    The aim was to evaluate the performance of conventional radiography and photostimulable phosphor (PSP) plates in the detection of simulated internal root resorption (IRR) lesions in early stages. Twenty single-rooted teeth were X-rayed before and after having a simulated early IRR lesion. Three imaging systems were used: Kodak InSight dental film and two PSP digital systems, Digora Optime and VistaScan. The digital images were displayed on a 20.1″ LCD monitor using the native software of each system, and the conventional radiographs were evaluated on a masked light box. Two radiologists were asked to indicate the presence or absence of IRR and, after two weeks, all images were re-evaluated. Cohen's kappa coefficient was calculated to assess intra- and interobserver agreement. The three imaging systems were compared using the Kruskal-Wallis test. For interexaminer agreement, overall kappa values were 0.70, 0.65 and 0.70 for conventional film, Digora Optime and VistaScan, respectively. Both conventional and digital radiography presented low sensitivity, specificity, accuracy, and positive and negative predictive values, with no significant difference between imaging systems (p = .0725). The performance of conventional film and PSP plates was similar in the detection of simulated early-stage IRR lesions, with low accuracy overall.
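
    As a brief aside on the agreement statistic used above, Cohen's kappa can be computed directly from paired observer ratings, for example with scikit-learn; the ratings below are invented for illustration.

      # Interobserver agreement via Cohen's kappa on made-up ratings.
      from sklearn.metrics import cohen_kappa_score

      observer1 = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = IRR present, 0 = absent
      observer2 = [1, 0, 1, 0, 0, 0, 1, 1]
      print(cohen_kappa_score(observer1, observer2))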

  15. THE SLOAN DIGITAL SKY SURVEY STRIPE 82 IMAGING DATA: DEPTH-OPTIMIZED CO-ADDS OVER 300 deg² IN FIVE FILTERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Linhua; Fan, Xiaohui; McGreer, Ian D.

    We present and release co-added images of the Sloan Digital Sky Survey (SDSS) Stripe 82. Stripe 82 covers an area of ∼300 deg² on the celestial equator, and has been repeatedly scanned 70-90 times in the ugriz bands by the SDSS imaging survey. By making use of all available data in the SDSS archive, our co-added images are optimized for depth. Input single-epoch frames were properly processed and weighted based on seeing, sky transparency, and background noise before co-addition. The resultant products are co-added science images and their associated weight images that record relative weights at individual pixels. The depths of the co-adds, measured as the 5σ detection limits of the aperture (3.''2 diameter) magnitudes for point sources, are roughly 23.9, 25.1, 24.6, 24.1, and 22.8 AB magnitudes in the five bands, respectively. They are 1.9-2.2 mag deeper than the best SDSS single-epoch data. The co-added images have good image quality, with an average point-spread function FWHM of ∼1'' in the r, i, and z bands. We also release object catalogs that were made with SExtractor. These co-added products have many potential uses for studies of galaxies, quasars, and Galactic structure. We further present and release near-IR J-band images that cover ∼90 deg² of Stripe 82. These images were obtained using the NEWFIRM camera on the NOAO 4 m Mayall telescope, and have a depth of about 20.0-20.5 Vega magnitudes (also 5σ detection limits for point sources).

  16. The Sloan Digital Sky Survey Stripe 82 Imaging Data: Depth-Optimized Co-adds Over 300 deg$^2$ in Five Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Linhua; Fan, Xiaohui; Bian, Fuyan

    We present and release co-added images of the Sloan Digital Sky Survey (SDSS) Stripe 82. Stripe 82 covers an area of ~300 deg² on the celestial equator, and has been repeatedly scanned 70-90 times in the ugriz bands by the SDSS imaging survey. By making use of all available data in the SDSS archive, our co-added images are optimized for depth. Input single-epoch frames were properly processed and weighted based on seeing, sky transparency, and background noise before co-addition. The resultant products are co-added science images and their associated weight images that record relative weights at individual pixels. The depths of the co-adds, measured as the 5σ detection limits of the aperture (3.''2 diameter) magnitudes for point sources, are roughly 23.9, 25.1, 24.6, 24.1, and 22.8 AB magnitudes in the five bands, respectively. They are 1.9-2.2 mag deeper than the best SDSS single-epoch data. The co-added images have good image quality, with an average point-spread function FWHM of ~1'' in the r, i, and z bands. We also release object catalogs that were made with SExtractor. These co-added products have many potential uses for studies of galaxies, quasars, and Galactic structure. We further present and release near-IR J-band images that cover ~90 deg² of Stripe 82. These images were obtained using the NEWFIRM camera on the NOAO 4 m Mayall telescope, and have a depth of about 20.0-20.5 Vega magnitudes (also 5σ detection limits for point sources).

  17. Hammersmith cardiology workshop series. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maseri, A.

    1985-01-01

    This book contains over 30 selections. Some of the titles are: Digital Subtraction Angiography: The Optimal Radiologic Technique for Cardiac Diagnosis; NMR Imaging of the Heart; Radioisotopes in the Evaluation of Right and Left Ventricular Function; Role of Membrane Abnormalities in the Pathogenesis of Heart Disease; and Influence of Arrhythmias on Cardiac Function.

  18. Feature selection methods for object-based classification of sub-decimeter resolution digital aerial imagery

    USDA-ARS?s Scientific Manuscript database

    Due to the availability of numerous spectral, spatial, and contextual features, the determination of optimal features and class separabilities can be a time consuming process in object-based image analysis (OBIA). While several feature selection methods have been developed to assist OBIA, a robust c...

  19. Experimental generation of Laguerre-Gaussian beam using digital micromirror device.

    PubMed

    Ren, Yu-Xuan; Li, Ming; Huang, Kun; Wu, Jian-Guang; Gao, Hong-Fang; Wang, Zi-Qiang; Li, Yin-Mei

    2010-04-01

    A digital micromirror device (DMD) modulates laser intensity through computer control of the device. We experimentally investigate the modulation performance of a DMD and optimize the modulation procedure through image correction. Furthermore, Laguerre-Gaussian (LG) beams with different topological charges are generated by projecting a series of forklike gratings onto the DMD. We measure the field distribution with and without correction, the energy of LG beams with different topological charges, and the polarization property in sequence. Experimental results demonstrate that it is possible to generate LG beams with a DMD that allows the use of a high-intensity laser with proper correction to the input images, and that the polarization state of the LG beam differs from that of the input beam.

  20. Simultaneous multimodal ophthalmic imaging using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography

    PubMed Central

    Malone, Joseph D.; El-Haddad, Mohamed T.; Bozic, Ivan; Tye, Logan A.; Majeau, Lucas; Godbout, Nicolas; Rollins, Andrew M.; Boudoux, Caroline; Joos, Karen M.; Patel, Shriji N.; Tao, Yuankai K.

    2016-01-01

    Scanning laser ophthalmoscopy (SLO) benefits diagnostic imaging and therapeutic guidance by allowing for high-speed en face imaging of retinal structures. When combined with optical coherence tomography (OCT), SLO enables real-time aiming and retinal tracking and provides complementary information for post-acquisition volumetric co-registration, bulk motion compensation, and averaging. However, multimodality SLO-OCT systems generally require dedicated light sources, scanners, relay optics, detectors, and additional digitization and synchronization electronics, which increase system complexity. Here, we present a multimodal ophthalmic imaging system using swept-source spectrally encoded scanning laser ophthalmoscopy and optical coherence tomography (SS-SESLO-OCT) for in vivo human retinal imaging. SESLO reduces the complexity of en face imaging systems by multiplexing spatial positions as a function of wavelength. SESLO image quality benefited from single-mode illumination and multimode collection through a prototype double-clad fiber coupler, which optimized scattered light throughput and reduced speckle contrast while maintaining lateral resolution. Using a shared 1060 nm swept-source, shared scanner and imaging optics, and a shared dual-channel high-speed digitizer, we acquired inherently co-registered en face retinal images and OCT cross-sections simultaneously at 200 frames-per-second. PMID:28101411

  1. Image correlation method for DNA sequence alignment.

    PubMed

    Curilem Saldías, Millaray; Villarroel Sassarini, Felipe; Muñoz Poblete, Carlos; Vargas Vásquez, Asticio; Maureira Butler, Iván

    2012-01-01

    The complexity of searches and the volume of genomic data make sequence alignment one of bioinformatics' most active research areas. New alignment approaches have incorporated digital signal processing techniques. Among these, correlation methods are highly sensitive. This paper proposes a novel sequence alignment method based on 2-dimensional images, where each nucleic acid base is represented as a fixed gray intensity pixel. Query and known database sequences are coded into their pixel representations, and sequence alignment is handled as a problem of recognizing an object in a scene. Query and database become object and scene, respectively. An image correlation process is carried out in order to search for the best match between them. Given that this procedure can be implemented in an optical correlator, the correlation could eventually be accomplished at light speed. This paper shows an initial research stage where results were "digitally" obtained by simulating an optical correlation of DNA sequences represented as images. A total of 303 queries (variable lengths from 50 to 4500 base pairs) and 100 scenes represented by 100 x 100 images each (in total, a one-million-base-pair database) were considered for the image correlation analysis. The results showed that correlations reached very high sensitivity (99.01%) and specificity (98.99%) and outperformed BLAST when mutation numbers increased. However, digital correlation processes were a hundred times slower than BLAST. We are currently starting an initiative to evaluate the correlation speed of a real experimental optical correlator. By doing this, we expect to fully exploit the light-speed properties of optical correlation. As the optical correlator works jointly with the computer, digital algorithms should also be optimized. The results presented in this paper are encouraging and support the study of image correlation methods for sequence alignment.
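
    The core idea, i.e., encoding bases as fixed gray levels and locating a query inside a larger scene by cross-correlation, can be sketched digitally in a few lines; the base-to-intensity mapping, image sizes, and sequences below are illustrative assumptions, not the encoding or data used in the paper.

      # Toy digital simulation of DNA-as-image correlation matching.
      import numpy as np
      from scipy.signal import fftconvolve

      LEVELS = {"A": 0.25, "C": 0.50, "G": 0.75, "T": 1.00}   # assumed gray-level encoding

      def encode(seq, shape):
          img = np.zeros(int(np.prod(shape)))
          img[:len(seq)] = [LEVELS[b] for b in seq]
          return img.reshape(shape)

      scene = encode("ACGT" * 2500, (100, 100))               # database rendered as a 100x100 image
      query = encode("ACGTACGTACGT", (3, 4))                  # short query as a small image

      corr = fftconvolve(scene - scene.mean(), (query - query.mean())[::-1, ::-1], mode="same")
      peak = np.unravel_index(np.argmax(corr), corr.shape)    # most likely alignment position
      print(peak)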

  2. Electrophoresis gel image processing and analysis using the KODAK 1D software.

    PubMed

    Pizzonia, J

    2001-06-01

    The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.

  3. Model-based Estimation for Pose, Velocity of Projectile from Stereo Linear Array Image

    NASA Astrophysics Data System (ADS)

    Zhao, Zhuxin; Wen, Gongjian; Zhang, Xing; Li, Deren

    2012-01-01

    The pose (position and attitude) and velocity of in-flight projectiles have a major influence on performance and accuracy. A cost-effective method for measuring gun-boosted projectiles is proposed. The method adopts only one linear array image collected by a stereo vision system that combines a digital line-scan camera and a mirror near the muzzle. From the projectile's stereo image, the motion parameters (pose and velocity) are acquired using a model-based optimization algorithm. The algorithm achieves optimal estimation of the parameters by matching the stereo projection of the projectile with that of a same-size 3D model. The speed and the AOA (angle of attack) can then be determined. Experiments were performed to test the proposed method.

  4. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151.

    PubMed

    Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John

    2015-11-01

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  5. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org; Geiser, William; Heintz, Philip

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  6. Tomographic digital subtraction angiography for lung perfusion estimation in rodents.

    PubMed

    Badea, Cristian T; Hedlund, Laurence W; De Lin, Ming; Mackel, Julie S Boslego; Samei, Ehsan; Johnson, G Allan

    2007-05-01

    In vivo measurements of perfusion present a challenge to existing small animal imaging techniques such as magnetic resonance microscopy, micro computed tomography, micro positron emission tomography, and microSPECT, due to combined requirements for high spatial and temporal resolution. We demonstrate the use of tomographic digital subtraction angiography (TDSA) for estimation of perfusion in small animals. TDSA augments conventional digital subtraction angiography (DSA) by providing three-dimensional spatial information using tomosynthesis algorithms. TDSA is based on the novel paradigm that the same time density curves can be reproduced in a number of consecutive injections of microL volumes of contrast at a series of different angles of rotation. The capabilities of TDSA are established in studies on lung perfusion in rats. Using an imaging system developed in-house, we acquired data for four-dimensional (4D) imaging with temporal resolution of 140 ms, in-plane spatial resolution of 100 microm, and slice thickness on the order of millimeters. Based on a structured experimental approach, we optimized TDSA imaging providing a good trade-off between slice thickness, the number of injections, contrast to noise, and immunity to artifacts. Both DSA and TDSA images were used to create parametric maps of perfusion. TDSA imaging has potential application in a number of areas where functional perfusion measurements in 4D can provide valuable insight into animal models of disease and response to therapeutics.

  7. An optimal algorithm for reconstructing images from binary measurements

    NASA Astrophysics Data System (ADS)

    Yang, Feng; Lu, Yue M.; Sbaiz, Luciano; Vetterli, Martin

    2010-01-01

    We have studied a camera with a very large number of binary pixels, referred to as the gigavision camera [1] or the gigapixel digital film camera [2, 3]. Potential advantages of this new camera design include improved dynamic range, thanks to its logarithmic sensor response curve, and reduced exposure time in low light conditions, due to its highly sensitive photon detection mechanism. We use a maximum likelihood estimator (MLE) to reconstruct a high quality conventional image from the binary sensor measurements of the gigavision camera. We prove that when the threshold T is "1", the negative log-likelihood function is convex. Therefore, the optimal solution can be found using convex optimization. Based on filter bank techniques, fast algorithms are given for computing the gradient of the negative log-likelihood function and the product of its Hessian matrix with a vector. We show that with a minor change, our algorithm also works for estimating conventional images from multiple binary images. Numerical experiments with synthetic 1-D signals and images verify the effectiveness and quality of the proposed algorithm. Experimental results also show that estimation performance can be improved by increasing the oversampling factor or the number of binary images.
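
    For intuition, the closed-form single-pixel case is easy to state: with threshold T = 1, each binary sub-pixel reads 1 whenever it detects at least one photon, so P(read 0) = exp(-lambda) and the MLE of the light intensity from K sub-pixels is -ln(1 - k/K), where k is the number of ones. The sketch below illustrates this toy case only; it is not the paper's filter-bank/convex-optimization reconstruction, and all parameter values are invented.

      # Toy MLE of light intensity from binary (threshold T = 1) measurements.
      import numpy as np

      rng = np.random.default_rng(0)
      lam_true = 2.0                              # mean photons per binary sub-pixel
      K = 256                                     # binary sub-pixels per output pixel
      binary = rng.poisson(lam_true, K) >= 1      # sub-pixel reads 1 if >= 1 photon

      k = binary.sum()
      lam_mle = -np.log(1.0 - k / K) if k < K else np.inf
      print(f"true={lam_true:.2f}  MLE={lam_mle:.2f}")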

  8. DCTune Perceptual Optimization of Compressed Dental X-Rays

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    In current dental practice, x-rays of completed dental work are often sent to the insurer for verification. It is faster and cheaper to instead transmit digital scans of the x-rays. Further economies result if the images are sent in compressed form. DCTune is a technology for optimizing DCT (discrete cosine transform) quantization matrices to yield maximum perceptual quality for a given bit-rate, or minimum bit-rate for a given perceptual quality. In addition, the technology provides a means of setting the perceptual quality of compressed imagery in a systematic way. The purpose of this research was, with respect to dental x-rays, 1) to verify the advantage of DCTune over standard JPEG (Joint Photographic Experts Group), 2) to verify the quality control feature of DCTune, and 3) to discover regularities in the optimized matrices of a set of images. We optimized matrices for a total of 20 images at two resolutions (150 and 300 dpi) and four bit-rates (0.25, 0.5, 0.75, 1.0 bits/pixel), and examined structural regularities in the resulting matrices. We also conducted psychophysical studies (1) to discover the DCTune quality level at which the images became 'visually lossless,' and (2) to rate the relative quality of DCTune and standard JPEG images at various bitrates. Results include: (1) At both resolutions, DCTune quality is a linear function of bit-rate. (2) DCTune quantization matrices for all images at all bitrates and resolutions are modeled well by an inverse Gaussian, with parameters of amplitude and width. (3) As bit-rate is varied, optimal values of both amplitude and width covary in an approximately linear fashion. (4) Both amplitude and width vary in a systematic and orderly fashion with either bit-rate or DCTune quality; simple mathematical functions serve to describe these relationships. (5) In going from 150 to 300 dpi, amplitude parameters are substantially lower and widths larger at corresponding bit-rates or qualities. (6) Visually lossless compression occurs at a DCTune quality value of about 1. (7) At 0.25 bits/pixel, comparative ratings give DCTune a substantial advantage over standard JPEG. As visually lossless bit-rates are approached, this advantage of necessity diminishes. We have concluded that DCTune optimized quantization matrices provide better visual quality than standard JPEG. Meaningful quality levels may be specified by means of the DCTune metric. Optimized matrices are very similar across the class of dental x-rays, suggesting the possibility of a 'class-optimal' matrix. DCTune technology appears to provide some value in the context of compressed dental x-rays.
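
    Finding (2) can be visualized with a short sketch that builds an 8x8 quantization matrix whose entries grow with DCT-frequency radius as the reciprocal of a Gaussian; "inverse Gaussian" is interpreted loosely here, and the amplitude and width values are illustrative guesses, not DCTune's fitted parameters.

      # Hedged sketch of an "inverse-Gaussian"-shaped 8x8 quantization matrix.
      import numpy as np

      def quant_matrix(amplitude=10.0, width=4.0):
          u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
          r = np.hypot(u, v)                                # radial DCT-frequency index
          q = amplitude * np.exp((r / width) ** 2)          # reciprocal-Gaussian growth
          return np.clip(np.round(q), 1, 255).astype(int)   # keep entries in the legal JPEG range

      print(quant_matrix())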

  9. An efficient multiple exposure image fusion in JPEG domain

    NASA Astrophysics Data System (ADS)

    Hebbalaguppe, Ramya; Kakarala, Ramakrishna

    2012-01-01

    In this paper, we describe a method to fuse multiple images taken with varying exposure times in the JPEG domain. The proposed algorithm finds its application in HDR image acquisition and image stabilization for hand-held devices like mobile phones, music players with cameras, digital cameras, etc. Image acquisition in low light typically results in blurry and noisy images for hand-held cameras. Altering camera settings like ISO sensitivity, exposure time, and aperture for low light image capture results in noise amplification, motion blur, and reduction of depth-of-field, respectively. The purpose of fusing multiple exposures is to combine the sharp details of the shorter exposure images with the high signal-to-noise ratio (SNR) of the longer exposure images. The algorithm requires only a single pass over all images, making it efficient. It comprises sigmoidal boosting of the shorter-exposure images, image fusion, artifact removal, and saturation detection. The algorithm needs to keep no more than a single JPEG macroblock in memory, making it feasible to implement as part of a digital camera's hardware image processing engine. The artifact removal step reuses JPEG's built-in frequency analysis and hence benefits from the considerable optimization and design experience available for JPEG.
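
    A pixel-domain simplification of the fusion step is sketched below (the paper itself operates block by block in the JPEG/DCT domain): the short exposure is sigmoidally boosted and then blended with the long exposure using a weight that favors the long exposure except where it saturates. The gain and saturation threshold are illustrative values, not the authors' settings.

      # Simplified sigmoidal-boost-and-blend fusion of two exposures.
      import numpy as np

      def fuse(short_exp, long_exp, gain=6.0, sat=0.95):
          boosted = 1.0 / (1.0 + np.exp(-gain * (short_exp - 0.5)))   # sigmoidal boost of short exposure
          w_long = np.clip((sat - long_exp) / sat, 0.0, 1.0)          # weight -> 0 where long exposure saturates
          return w_long * long_exp + (1.0 - w_long) * boosted

      rng = np.random.default_rng(2)
      short_img = rng.random((64, 64)) * 0.4          # dark but sharp short exposure
      long_img = np.clip(short_img * 3.0, 0, 1)       # brighter but partly saturated long exposure
      fused = fuse(short_img, long_img)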

  10. Development of a Digital Microarray with Interferometric Reflectance Imaging

    NASA Astrophysics Data System (ADS)

    Sevenler, Derin

    This dissertation describes a new type of molecular assay for nucleic acids and proteins. We call this technique a digital microarray since it is conceptually similar to conventional fluorescence microarrays, yet it performs enumerative ('digital') counting of the number captured molecules. Digital microarrays are approximately 10,000-fold more sensitive than fluorescence microarrays, yet maintain all of the strengths of the platform including low cost and high multiplexing (i.e., many different tests on the same sample simultaneously). Digital microarrays use gold nanorods to label the captured target molecules. Each gold nanorod on the array is individually detected based on its light scattering, with an interferometric microscopy technique called SP-IRIS. Our optimized high-throughput version of SP-IRIS is able to scan a typical array of 500 spots in less than 10 minutes. Digital DNA microarrays may have utility in applications where sequencing is prohibitively expensive or slow. As an example, we describe a digital microarray assay for gene expression markers of bacterial drug resistance.

  11. Computer object segmentation by nonlinear image enhancement, multidimensional clustering, and geometrically constrained contour optimization

    NASA Astrophysics Data System (ADS)

    Bruynooghe, Michel M.

    1998-04-01

    In this paper, we present a robust method for automatic object detection and delineation in noisy complex images. The proposed procedure is a three-stage process that integrates image segmentation by multidimensional pixel clustering and geometrically constrained optimization of deformable contours. The first step is to enhance the original image by nonlinear unsharp masking. The second step is to segment the enhanced image by multidimensional pixel clustering, using our reducible-neighborhoods clustering algorithm, which has an attractive theoretical maximal complexity. Then, candidate objects are extracted and initially delineated by an optimized region merging algorithm based on ascendant hierarchical clustering with contiguity constraints and on the maximization of average contour gradients. The third step is to optimize the delineation of the previously extracted and initially delineated objects. Deformable object contours are modeled by cubic splines. An affine invariant is used to control the undesired formation of cusps and loops. Nonlinear constrained optimization is used to maximize the external energy, which avoids the difficult and nonreproducible choice of regularization parameters required by classical snake models. The proposed method has been applied successfully to the detection of fine and subtle microcalcifications in X-ray mammographic images, to defect detection by moire image analysis, and to the analysis of microrugosities of thin metallic films. A later implementation of the proposed method on a digital signal processor paired with a vector coprocessor would allow the design of a real-time object detection and delineation system for applications in medical imaging and in industrial computer vision.

  12. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis.

    PubMed

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis remain open. In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often become trapped in a local maximum.

  13. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    PubMed Central

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis remain open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often become trapped in a local maximum. PMID:23766941

  14. Optimization of a dual-energy contrast-enhanced technique for a photon-counting digital breast tomosynthesis system: I. A theoretical model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carton, Ann-Katherine; Ullberg, Christer; Lindman, Karin

    2010-11-15

    Purpose: Dual-energy (DE) iodine contrast-enhanced x-ray imaging of the breast has been shown to identify cancers that would otherwise be mammographically occult. In this article, theoretical modeling was performed to obtain optimally enhanced iodine images for a photon-counting digital breast tomosynthesis (DBT) system using a DE acquisition technique. Methods: In the system examined, the breast is scanned with a multislit prepatient collimator aligned with a multidetector camera. Each detector collects a projection image at a unique angle during the scan. Low-energy (LE) and high-energy (HE) projection images are acquired simultaneously in a single scan by covering alternate collimator slits with Sn and Cu filters, respectively. Sn filters ranging from 0.08 to 0.22 mm thickness and Cu filters from 0.11 to 0.27 mm thickness were investigated. A tube voltage of 49 kV was selected. Tomographic images, hereafter referred to as DBT images, were reconstructed using a shift-and-add algorithm. Iodine-enhanced DBT images were acquired by performing a weighted logarithmic subtraction of the HE and LE DBT images. The DE technique was evaluated for 20-80 mm thick breasts. Weighting factors, w_t, that optimally cancel breast tissue were computed. Signal-difference-to-noise ratios (SDNRs) between iodine-enhanced and nonenhanced breast tissue normalized to the square root of the mean glandular dose (MGD) were computed as a function of the fraction of the MGD allocated to the HE images. Peak SDNR/√MGD and optimal dose allocations were identified. SDNR/√MGD and dose allocations were computed for several practical feasible system configurations (i.e., determined by the number of collimator slits covered by Sn and Cu). A practical system configuration and Sn-Cu filter pair that accounts for the trade-off between SDNR, tube output, and MGD were selected. Results: w_t depends on the Sn-Cu filter combination used, as well as on the breast thickness; to optimally cancel 0% with 50% glandular breast tissue, w_t values were found to range from 0.46 to 0.72 for all breast thicknesses and Sn-Cu filter pairs studied. The optimal w_t values needed to cancel all possible breast tissue glandularities vary by less than 1% for 20 mm thick breasts and 18% for 80 mm breasts. The system configuration where one collimator slit covered by Sn is alternated with two collimator slits covered by Cu delivers SDNR/√MGD nearest to the peak value. A reasonable compromise is a 0.16 mm Sn-0.23 mm Cu filter pair, resulting in SDNR values between 1.64 and 0.61 and MGD between 0.70 and 0.53 mGy for 20-80 mm thick breasts at the maximum tube current. Conclusions: A DE acquisition technique for a photon-counting DBT imaging system has been developed and optimized.
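
    The weighted logarithmic subtraction at the core of the technique is compact enough to sketch: the iodine-enhanced image is ln(HE) - w_t*ln(LE), with w_t chosen to cancel the breast-tissue signal. In the sketch below, w_t is simply a placeholder value inside the 0.46-0.72 range reported above, and the random arrays stand in for reconstructed LE and HE DBT slices.

      # Dual-energy weighted logarithmic subtraction (illustrative only).
      import numpy as np

      def dual_energy_subtraction(low_energy, high_energy, w_t=0.6):
          eps = 1e-6                                        # avoid log(0)
          return np.log(high_energy + eps) - w_t * np.log(low_energy + eps)

      rng = np.random.default_rng(3)
      le = rng.uniform(0.2, 1.0, (256, 256))                # stand-in LE DBT slice (normalized)
      he = rng.uniform(0.2, 1.0, (256, 256))                # stand-in HE DBT slice (normalized)
      iodine_image = dual_energy_subtraction(le, he)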

  15. Image acquisition optimization of a limited-angle intrafraction verification (LIVE) system for lung radiotherapy.

    PubMed

    Zhang, Yawei; Deng, Xinchen; Yin, Fang-Fang; Ren, Lei

    2018-01-01

    Limited-angle intrafraction verification (LIVE) has been previously developed for four-dimensional (4D) intrafraction target verification either during arc delivery or between three-dimensional (3D)/IMRT beams. Preliminary studies showed that LIVE can accurately estimate the target volume using kV/MV projections acquired over orthogonal view 30° scan angles. Currently, the LIVE imaging acquisition requires slow gantry rotation and is not clinically optimized. The goal of this study is to optimize the image acquisition parameters of LIVE for different patient respiratory periods and gantry rotation speeds for the effective clinical implementation of the system. Limited-angle intrafraction verification imaging acquisition was optimized using a digital anthropomorphic phantom (XCAT) with simulated respiratory periods varying from 3 s to 6 s and gantry rotation speeds varying from 1°/s to 6°/s. LIVE scanning time was optimized by minimizing the number of respiratory cycles needed for the four-dimensional scan, and imaging dose was optimized by minimizing the number of kV and MV projections needed for four-dimensional estimation. The estimation accuracy was evaluated by calculating both the center-of-mass-shift (COMS) and three-dimensional volume-percentage-difference (VPD) between the tumor in estimated images and the ground truth images. The robustness of LIVE was evaluated with varied respiratory patterns, tumor sizes, and tumor locations in XCAT simulation. A dynamic thoracic phantom (CIRS) was used to further validate the optimized imaging schemes from XCAT study with changes of respiratory patterns, tumor sizes, and imaging scanning directions. Respiratory periods, gantry rotation speeds, number of respiratory cycles scanned and number of kV/MV projections acquired were all positively correlated with the estimation accuracy of LIVE. Faster gantry rotation speed or longer respiratory period allowed less respiratory cycles to be scanned and less kV/MV projections to be acquired to estimate the target volume accurately. Regarding the scanning time minimization, for patient respiratory periods of 3-4 s, gantry rotation speeds of 1°/s, 2°/s, 3-6°/s required scanning of five, four, and three respiratory cycles, respectively. For patient respiratory periods of 5-6 s, the corresponding respiratory cycles required in the scan changed to four, three, and two cycles, respectively. Regarding the imaging dose minimization, for patient respiratory periods of 3-4 s, gantry rotation speeds of 1°/s, 2-4°/s, 5-6°/s required acquiring of 7, 5, 4 kV and MV projections, respectively. For patient respiratory periods of 5-6 s, 5 kV and 5 MV projections are sufficient for all gantry rotation speeds. The optimized LIVE system was robust against breathing pattern, tumor size and tumor location changes. In the CIRS study, the optimized LIVE system achieved the average center-of-mass-shift (COMS)/volume-percentage-difference (VPD) of 0.3 ± 0.1 mm/7.7 ± 2.0% for the scanning time priority case, 0.2 ± 0.1 mm/6.1 ± 1.2% for the imaging dose priority case, respectively, among all gantry rotation speeds tested. LIVE was robust against different scanning directions investigated. The LIVE system has been preliminarily optimized for different patient respiratory periods and treatment gantry rotation speeds using digital and physical phantoms. 
The optimized imaging parameters, including number of respiratory cycles scanned and kV/MV projection numbers acquired, provide guidelines for optimizing the scanning time and imaging dose of the LIVE system for its future evaluations and clinical implementations through patient studies. © 2017 American Association of Physicists in Medicine.

  16. Image restoration techniques as applied to Landsat MSS and TM data

    USGS Publications Warehouse

    Meyer, David

    1987-01-01

    Two factors are primarily responsible for the loss of image sharpness in processing digital Landsat images. The first factor is inherent in the data because the sensor's optics and electronics, along with other sensor elements, blur and smear the data. Digital image restoration can be used to reduce this degradation. The second factor, which further degrades the image by blurring or aliasing, is the resampling performed during geometric correction. An image restoration procedure, when used in place of typical resampling techniques, reduces sensor degradation without introducing the artifacts associated with resampling. The EROS Data Center (EDC) has implemented the restoration procedure for Landsat multispectral scanner (MSS) and thematic mapper (TM) data. This capability, developed at the University of Arizona by Dr. Robert Schowengerdt and Lynette Wood, combines restoration and resampling in a single step to produce geometrically corrected MSS and TM imagery. As with resampling, restoration demands a tradeoff be made between aliasing, which occurs when attempting to extract maximum sharpness from an image, and blurring, which reduces the aliasing problem but sacrifices image sharpness. The restoration procedure used at EDC minimizes these artifacts by being adaptive, tailoring the tradeoff to be optimal for individual images.

  17. Statistical wiring of thalamic receptive fields optimizes spatial sampling of the retinal image

    PubMed Central

    Wang, Xin; Sommer, Friedrich T.; Hirsch, Judith A.

    2014-01-01

    Summary: It is widely assumed that mosaics of retinal ganglion cells establish the optimal representation of visual space. However, relay cells in the visual thalamus often receive convergent input from several retinal afferents and, in cat, outnumber ganglion cells. To explore how the thalamus transforms the retinal image, we built a model of the retinothalamic circuit using experimental data and simple wiring rules. The model shows how the thalamus might form a resampled map of visual space with the potential to facilitate detection of stimulus position in the presence of sensor noise. Bayesian decoding conducted with the model provides support for this scenario. Despite its benefits, however, resampling introduces image blur, thus impairing edge perception. Whole-cell recordings obtained in vivo suggest that this problem is mitigated by arrangements of excitation and inhibition within the receptive field that effectively boost contrast borders, much like strategies used in digital image processing. PMID:24559681

  18. How to COAAD Images. I. Optimal Source Detection and Photometry of Point Sources Using Ensembles of Images

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.

    2017-02-01

    Stacks of digital astronomical images are combined in order to increase image depth. The variable seeing conditions, sky background, and transparency of ground-based observations make the coaddition process nontrivial. We present image coaddition methods that maximize the signal-to-noise ratio (S/N) and are optimized for source detection and flux measurement. We show that for these purposes, the best way to combine images is to apply a matched filter to each image using its own point-spread function (PSF) and only then to sum the images with the appropriate weights. Methods that either match the filter after coaddition or perform PSF homogenization prior to coaddition will result in loss of sensitivity. We argue that our method provides an increase of between a few percent and 25% in the survey speed of deep ground-based imaging surveys compared with weighted coaddition techniques. We demonstrate this claim using simulated data as well as data from the Palomar Transient Factory data release 2. We present a variant of this coaddition method, which is optimal for PSF or aperture photometry. We also provide an analytic formula for calculating the S/N for PSF photometry on single or multiple observations. In the next paper in this series, we present a method for image coaddition in the limit of background-dominated noise, which is optimal for any statistical test or measurement on the constant-in-time image (e.g., source detection, shape or flux measurement, or star-galaxy separation), making the original data redundant. We provide an implementation of these algorithms in MATLAB.
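
    A minimal sketch of the recipe, cross-correlating each single-epoch image with its own PSF and then summing with weights, is given below; taking the weights as the reciprocal of the per-image background variance is a common convention and an assumption on our part, not necessarily the exact weighting derived in the paper.

      # Matched-filter-then-weighted-sum coaddition (simplified).
      import numpy as np
      from scipy.signal import fftconvolve

      def coadd(images, psfs, variances):
          out = np.zeros_like(images[0], dtype=float)
          for img, psf, var in zip(images, psfs, variances):
              matched = fftconvolve(img, psf[::-1, ::-1], mode="same")   # match filter with the image's own PSF
              out += matched / var                                       # inverse-variance weight
          return out

      rng = np.random.default_rng(4)
      h = np.hanning(7)
      psf = np.outer(h, h); psf /= psf.sum()                             # toy PSF stamp
      imgs = [rng.normal(0, 1, (128, 128)) for _ in range(3)]            # background-subtracted frames
      stack = coadd(imgs, [psf] * 3, [1.0, 1.2, 0.8])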

  19. WE-AB-BRA-11: Improved Imaging of Permanent Prostate Brachytherapy Seed Implants by Combining an Endorectal X-Ray Sensor with a CT Scanner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, J; Matthews, K; Jia, G

    Purpose: To test the feasibility of using a digital endorectal x-ray sensor for improved image resolution of permanent brachytherapy seed implants compared to conventional CT. Methods: Two phantoms simulating the male pelvic region were used to test the capabilities of a digital endorectal x-ray sensor for imaging permanent brachytherapy seed implants. Phantom 1 was constructed from acrylic plastic with cavities milled in the locations of the prostate and the rectum. The prostate cavity was filled with a Styrofoam plug implanted with 10 training seeds. Phantom 2 was constructed from tissue-equivalent gelatins and contained a prostate phantom implanted with 18 strands of training seeds. For both phantoms, an intraoral digital dental x-ray sensor was placed in the rectum within 2 cm of the seed implants. Scout scans were taken of the phantoms over a limited arc angle using a CT scanner (80 kV, 120–200 mA). The dental sensor was removed from the phantoms, and normal helical CT and scout (0 degree) scans using typical parameters for pelvic CT (120 kV, auto-mA) were collected. A shift-and-add tomosynthesis algorithm was developed to localize the seed plane normal to the detector face. Results: The endorectal sensor produced images with improved resolution compared to CT scans. Seed clusters and individual seed geometry were more discernible using the endorectal sensor. Seed 3D locations, including seeds that were not located in every projection image, were discernible using the shift-and-add algorithm. Conclusion: This work shows that digital endorectal x-ray sensors are a feasible method for improving imaging of permanent brachytherapy seed implants. Future work will consist of optimizing the tomosynthesis technique to produce higher resolution, lower dose images of 1) permanent brachytherapy seed implants for post-implant dosimetry and 2) fine anatomic details for imaging and managing prostatic disease compared to CT images. Funding: LSU Faculty Start-up Funding. Disclosure: XDR Radiography has loaned our research group the digital x-ray detector used in this work. CoI: None.
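
    The reconstruction step named above lends itself to a compact illustration: in shift-and-add tomosynthesis each projection is shifted by an amount that depends on the view angle and the depth of the plane being reconstructed, and the shifted projections are averaged; planes where the seeds come into focus localize them normal to the detector face. The geometry and scaling below are simplified placeholders, not the authors' implementation.

      # Toy shift-and-add tomosynthesis reconstruction of one depth plane.
      import numpy as np
      from scipy.ndimage import shift

      def shift_and_add(projections, angles_deg, depth, pixel_pitch=1.0):
          recon = np.zeros_like(projections[0], dtype=float)
          for proj, ang in zip(projections, angles_deg):
              dx = depth * np.tan(np.radians(ang)) / pixel_pitch   # in-plane shift for this view
              recon += shift(proj, (0.0, dx), order=1, mode="nearest")
          return recon / len(projections)

      # Repeating this for a range of depths and picking the plane where the
      # seeds appear sharpest localizes them normal to the detector face.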

  20. Automation in photogrammetry: Recent developments and applications (1972-1976)

    USGS Publications Warehouse

    Thompson, M.M.; Mikhail, E.M.

    1976-01-01

    An overview of recent developments in the automation of photogrammetry in various countries is presented. Conclusions regarding automated photogrammetry reached at the 1972 Congress in Ottawa are reviewed first as a background for examining the developments of 1972-1976. Applications are described for each country reporting significant developments. Among fifteen conclusions listed are statements concerning: the widespread practice of equipping existing stereoplotters with simple digitizers; the growing tendency to use minicomputers on-line with stereoplotters; the optimization of production of digital terrain models by progressive sampling in stereomodels; the potential of digitization of a photogrammetric model by density correlation on epipolar lines; the capabilities and economic aspects of advanced systems which permit simultaneous production of orthophotos, contours, and digital terrain models; the economy of off-line orthophoto systems; applications of digital image processing; automation by optical techniques; applications of sensors other than photographic imagery, and the role of photogrammetric phases in a completely automated cartographic system. © 1976.

  1. A Novel Approach to Detect Therapeutic Resistance in Breast Cancer

    DTIC Science & Technology

    2008-09-01

    PRINCIPAL INVESTIGATOR: Kamila Czene, Ph.D. CONTRACTING ORGANIZATION: Karolinska Institutet, Stockholm, Sweden. The digital image analysis algorithms and software that have been developed at Karolinska Institutet consist of an optimized combination of ...

  2. Thickness optimization of auricular silicone scaffold based on finite element analysis.

    PubMed

    Jiang, Tao; Shang, Jianzhong; Tang, Li; Wang, Zhuo

    2016-01-01

    The optimal thickness of a transplantable auricular silicone scaffold was investigated. The original image data were acquired from CT scans, and reverse modeling technology was used to build a digital 3D model of an auricle. The transplant process was simulated in ANSYS Workbench by finite element analysis (FEA), solid scaffolds were manufactured based on the FEA results, and the transplantable artificial auricle was finally obtained with an optimized thickness as well as sufficient strength and hardness. This paper provides a reference for clinical transplant surgery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Noise removal in extended depth of field microscope images through nonlinear signal processing.

    PubMed

    Zahreddine, Ramzi N; Cormack, Robert H; Cogswell, Carol J

    2013-04-01

    Extended depth of field (EDF) microscopy, achieved through computational optics, allows for real-time 3D imaging of live cell dynamics. EDF is achieved through a combination of point spread function engineering and digital image processing. A linear Wiener filter has been conventionally used to deconvolve the image, but it suffers from high frequency noise amplification and processing artifacts. A nonlinear processing scheme is proposed which extends the depth of field while minimizing background noise. The nonlinear filter is generated via a training algorithm and an iterative optimizer. Biological microscope images processed with the nonlinear filter show a significant improvement in image quality and signal-to-noise ratio over the conventional linear filter.
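
    For reference, a minimal sketch of the conventional linear Wiener deconvolution that the proposed nonlinear scheme is compared against. The PSF is assumed to be image-sized and centred, and the noise-to-signal ratio nsr is an illustrative constant (larger values trade resolution for less of the high-frequency noise amplification mentioned above).

      # Frequency-domain Wiener deconvolution of an EDF image.
      import numpy as np

      def wiener_deconvolve(blurred, psf, nsr=1e-2):
          """blurred, psf: 2-D arrays of the same shape; psf centred on the array."""
          H = np.fft.fft2(np.fft.ifftshift(psf))          # optical transfer function
          G = np.fft.fft2(blurred)
          W = np.conj(H) / (np.abs(H) ** 2 + nsr)         # Wiener filter transfer function
          return np.real(np.fft.ifft2(W * G))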

  4. Evaluation of clinical image processing algorithms used in digital mammography.

    PubMed

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW v2 and Siemens OPVIEW v1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the same six pairs of modalities were significantly different, but the JAFROC confidence intervals were about 32% smaller than the ROC confidence intervals. This study shows that image processing has a significant impact on the detection of microcalcifications in digital mammograms. Objective measurements, such as described here, should be used by the manufacturers to select the optimal image processing algorithm.

  5. High speed wavefront sensorless aberration correction in digital micromirror based confocal microscopy.

    PubMed

    Pozzi, P; Wilding, D; Soloviev, O; Verstraete, H; Bliek, L; Vdovin, G; Verhaegen, M

    2017-01-23

    The quality of fluorescence microscopy images is often impaired by the presence of sample-induced optical aberrations. Adaptive optical elements such as deformable mirrors or spatial light modulators can be used to correct aberrations. However, previously reported techniques either require special sample preparation or time-consuming optimization procedures for the correction of static aberrations. This paper reports a technique for optical sectioning fluorescence microscopy capable of correcting dynamic aberrations in any fluorescent sample during the acquisition. This is achieved by implementing adaptive optics in a non-conventional confocal microscopy setup, with multiple programmable confocal apertures, in which out-of-focus light can be separately detected and used to optimize the correction performance with a sampling frequency an order of magnitude faster than the imaging rate of the system. The paper reports results comparing the correction performance to traditional image optimization algorithms, and demonstrates how the system can compensate for dynamic changes in the aberrations, such as those introduced during a focal stack acquisition through a thick sample.

  6. Digital Camera Control for Faster Inspection

    NASA Technical Reports Server (NTRS)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low-resolution images to quickly spot problem areas and can then cause a rapid transition to high-resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  7. Estimation of saturated pixel values in digital color imaging

    PubMed Central

    Zhang, Xuemei; Brainard, David H.

    2007-01-01

    Pixel saturation, where the incident light at a pixel causes one of the color channels of the camera sensor to respond at its maximum value, can produce undesirable artifacts in digital color images. We present a Bayesian algorithm that estimates what the saturated channel's value would have been in the absence of saturation. The algorithm uses the non-saturated responses from the other color channels, together with a multivariate Normal prior that captures the correlation in response across color channels. The appropriate parameters for the prior may be estimated directly from the image data, since most image pixels are not saturated. Given the prior, the responses of the non-saturated channels, and the fact that the true response of the saturated channel is known to be greater than the saturation level, the algorithm returns the optimal expected mean square estimate for the true response. Extensions of the algorithm to the case where more than one channel is saturated are also discussed. Both simulations and examples with real images are presented to show that the algorithm is effective. PMID:15603065
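
    A minimal sketch of the estimate for a single saturated channel under the stated model: condition the multivariate Normal prior on the two unsaturated responses, then take the mean of the resulting Normal truncated below at the saturation level. The prior mean and covariance are assumed to have already been estimated from the non-saturated pixels; the function name and argument layout are illustrative.

      import numpy as np
      from scipy.stats import norm

      def estimate_saturated(x_obs, sat_idx, mu, Sigma, sat_level):
          """x_obs: length-3 RGB response (saturated entry clipped); mu, Sigma: prior mean/covariance."""
          o = [i for i in range(3) if i != sat_idx]          # unsaturated channel indices
          S_oo = Sigma[np.ix_(o, o)]
          S_so = Sigma[np.ix_([sat_idx], o)]
          # Conditional mean / variance of the saturated channel given the other channels
          w = S_so @ np.linalg.inv(S_oo)
          m = mu[sat_idx] + (w @ (x_obs[o] - mu[o]))[0]
          v = Sigma[sat_idx, sat_idx] - (w @ S_so.T)[0, 0]
          s = np.sqrt(v)
          # Mean of a Normal(m, s^2) truncated below at the saturation level
          a = (sat_level - m) / s
          return m + s * norm.pdf(a) / norm.sf(a)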

  8. Edge detection - Image-plane versus digital processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
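
    A minimal sketch of the familiar Laplacian-of-Gaussian operator applied as a digital convolution, followed by a simple zero-crossing test to locate edges; the value of sigma is only an illustrative choice, not one taken from the study above.

      import numpy as np
      from scipy.ndimage import gaussian_laplace

      def log_edges(image, sigma=2.0):
          log = gaussian_laplace(image.astype(float), sigma)
          # A pixel is marked as an edge when the LoG response changes sign
          # against its neighbour below or to the right (zero crossing).
          edges = np.zeros_like(log, dtype=bool)
          edges[:-1, :] |= np.signbit(log[:-1, :]) != np.signbit(log[1:, :])
          edges[:, :-1] |= np.signbit(log[:, :-1]) != np.signbit(log[:, 1:])
          return edges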

  9. Statistical mechanics of image processing by digital halftoning

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Norimatsu, Wataru; Saika, Yohei; Okada, Masato

    2009-03-01

    We consider the problem of digital halftoning (DH). DH is an image-processing technique that represents each grayscale level in an image in terms of black and white dots, and it is achieved by making use of a threshold dither mask: each pixel is set to black if its grayscale value is greater than or equal to the mask value, and to white otherwise. To determine the mask for a given grayscale image, we assume that the human eye might recognize the BW dots as the corresponding grayscale through linear filters. Then, the Hamiltonian is constructed as a distance between the original and recognized images, which is written in terms of the mask. Finding the ground state of the Hamiltonian via deterministic annealing, we obtain the optimal mask and the BW dots simultaneously. From the spectrum analysis, we find that the BW dots are desirable from the viewpoint of the modulation properties of human vision. We also show that the lower bound of the mean square error for the inverse process of the DH is minimized on the Nishimori line, which is well known in the research field of spin glasses.
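
    A minimal sketch of thresholding against a dither mask as described above. The fixed 4×4 Bayer matrix is only a standard example mask; the paper instead obtains the mask by annealing the distance Hamiltonian.

      import numpy as np

      bayer4 = (1.0 / 16.0) * np.array([[ 0,  8,  2, 10],
                                        [12,  4, 14,  6],
                                        [ 3, 11,  1,  9],
                                        [15,  7, 13,  5]])

      def halftone(gray, mask=bayer4):
          """gray: 2-D array scaled to [0, 1]; returns a boolean black/white image."""
          h, w = gray.shape
          tiled = np.tile(mask, (h // mask.shape[0] + 1, w // mask.shape[1] + 1))[:h, :w]
          return gray >= tiled    # True = black dot, following the abstract's convention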

  10. Concurrent validation and reliability of digital image analysis of granulation tissue color for clinical pressure ulcers.

    PubMed

    Iizaka, Shinji; Sugama, Junko; Nakagami, Gojiro; Kaitani, Toshiko; Naito, Ayumi; Koyanagi, Hiroe; Matsuo, Junko; Kadono, Takafumi; Konya, Chizuko; Sanada, Hiromi

    2011-01-01

    Granulation tissue color is one indicator for pressure ulcer (PU) assessment. However, it entails a subjective evaluation only, and quantitative methods have not been established. We developed color indicators from digital image analysis and investigated their concurrent validity and reliability for clinical PUs. A cross-sectional study was conducted on 47 patients with 55 full-thickness PUs. After color calibration, a wound photograph was converted into three images representing red color: erythema index (EI), modified erythema index with additional color calibration (granulation red index [GRI]), and a third index, which represents the artificially created red-green axis of the L*a*b* color space. The mean intensity of the granulation tissue region and the percentage of pixels exceeding the optimal cutoff intensity (% intensity) were calculated. Mean GRI (ρ=0.39, p=0.007) and the third index (ρ=0.55, p<0.001), as well as their % intensity indicators, showed positive correlations with a* measured by a tristimulus colorimeter, but the erythema index did not. They were correlated with hydroxyproline concentration in wound fluid, healthy granulation tissue area, and blood hemoglobin level. Intra- and interrater reliability of the indicator calculation using both GRI and the third index had an intraclass correlation coefficient >0.9. GRI and the third index from digital image analysis can quantitatively evaluate the granulation tissue color of clinical PUs. © 2011 by the Wound Healing Society.

  11. Breast Cancer Screening, Mammography, and Other Modalities.

    PubMed

    Fiorica, James V

    2016-12-01

    This article is an overview of the modalities available for breast cancer screening. The modalities discussed include digital mammography, digital breast tomosynthesis, breast ultrasonography, magnetic resonance imaging, and clinical breast examination. There is a review of pertinent randomized controlled trials, studies and meta-analyses which contributed to the evolution of screening guidelines. Ultimately, 5 major medical organizations formulated the current screening guidelines in the United States. The lack of consensus in these guidelines represents an ongoing controversy about the optimal timing and method for breast cancer screening in women. For mammography screening, the Breast Imaging Reporting and Data System lexicon is explained which corresponds with recommended clinical management. The presentation and discussion of the data in this article are designed to help the clinician individualize breast cancer screening for each patient.

  12. [The application of X-ray imaging in forensic medicine].

    PubMed

    Kučerová, Stěpánka; Safr, Miroslav; Ublová, Michaela; Urbanová, Petra; Hejna, Petr

    2014-07-01

    X-ray is the most common, basic, and essential imaging method used in forensic medicine. It serves to display and localize foreign objects in the body and helps to detect various traumatic and pathological changes. X-ray imaging is valuable in the anthropological assessment of an individual. X-ray allows non-invasive evaluation of important findings before the autopsy and thus selection of the optimal strategy for dissection. Basic indications for postmortem X-ray imaging in forensic medicine include gunshot and explosive fatalities (identification and localization of projectiles or other components of ammunition, visualization of secondary missiles), sharp force injuries (air embolism, identification of the weapon), and motor vehicle related deaths. The method is also helpful for complex injury evaluation in abused victims or in persons where abuse is suspected. Finally, X-ray imaging still remains the gold standard method for identification of unknown deceased persons. Over time, modern imaging methods, especially computed tomography and magnetic resonance imaging, have been increasingly applied in forensic medicine. Their application extends the possibilities of visualization from bony structures toward more detailed imaging of soft tissues and internal organs. The application of modern imaging methods in postmortem body investigation is known as digital or virtual autopsy. At present, digital postmortem imaging is considered a bloodless alternative to the conventional autopsy.

  13. The design of wavefront coded imaging system

    NASA Astrophysics Data System (ADS)

    Lan, Shun; Cen, Zhaofeng; Li, Xiaotong

    2016-10-01

    Wavefront Coding is a new method to extend the depth of field that combines optical design and signal processing. Using the optical design software ZEMAX, we designed a practical wavefront-coded imaging system based on a conventional Cooke triplet. Unlike a conventional optical system, the wavefront of this new system is modulated by a specially designed phase mask, which makes the point spread function (PSF) of the optical system insensitive to defocus. Therefore, a series of similarly blurred images is obtained at the image plane. In addition, the optical transfer function (OTF) of the wavefront-coded imaging system is nearly independent of focus: it is almost constant with misfocus and has no regions of zeros. All object information can be completely recovered through digital filtering at different defocus positions. The focus invariance of the MTF is selected as the merit function in this design, and the coefficients of the phase mask are set as the variables to be optimized. Compared with a conventional optical system, the wavefront-coded imaging system obtains better-quality images over different object distances. Some deficiencies appear in the restored images due to the influence of the digital filtering algorithm, and these are also analyzed in this paper. The depth of field of the designed wavefront-coded imaging system is about 28 times larger than that of the initial optical system, while keeping higher optical power and resolution at the image plane.
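
    The sketch below illustrates the general wavefront-coding idea with a generic cubic phase mask: adding the pupil phase alpha*(x³ + y³) makes the computed PSF vary far less with an added defocus term than the unmodified pupil would. The cubic mask form and the values of alpha and defocus are illustrative assumptions, not the specially designed mask of the system above.

      import numpy as np

      def coded_psf(n=256, alpha=40.0, defocus=5.0):
          """PSF of a circular pupil with a cubic phase mask and a defocus term (both in waves)."""
          x = np.linspace(-1, 1, n)
          X, Y = np.meshgrid(x, x)
          pupil = (X**2 + Y**2) <= 1.0                        # circular aperture
          phase = alpha * (X**3 + Y**3) + defocus * (X**2 + Y**2)
          field = pupil * np.exp(1j * 2 * np.pi * phase)
          psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
          return psf / psf.sum()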

  14. Design of efficient circularly symmetric two-dimensional variable digital FIR filters.

    PubMed

    Bindima, Thayyil; Elias, Elizabeth

    2016-05-01

    Circularly symmetric two-dimensional (2D) finite impulse response (FIR) filters find extensive use in image and medical applications, especially for isotropic filtering. Moreover, the design and implementation of 2D digital filters with variable fractional delay and variable magnitude responses without redesigning the filter has become a crucial topic of interest due to its significance in low-cost applications. Recently, the design using fixed word-length coefficients has gained importance due to the replacement of multipliers by shifters and adders, which reduces the hardware complexity. Among the various approaches to 2D design, transforming a one-dimensional (1D) filter to 2D is reported to be an efficient technique. In this paper, 1D variable digital filters (VDFs) with tunable cut-off frequencies are designed using a Farrow structure based interpolation approach, and the sub-filter coefficients in the Farrow structure are made multiplier-less using canonic signed digit (CSD) representation. The resulting performance degradation in the filters is overcome by using artificial bee colony (ABC) optimization. Finally, the optimized 1D VDFs are mapped to 2D using the generalized McClellan transformation, resulting in low-complexity, circularly symmetric 2D VDFs with real-time tunability.
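
    As a concrete illustration of the Farrow structure mentioned above, the sketch below builds a cubic-Lagrange fractional-delay interpolator: the sub-filters are fixed FIR branches, and only a polynomial in the delay parameter d is re-evaluated when the delay changes. The Lagrange design, tap positions, and function names are illustrative assumptions; the paper's sub-filters are instead CSD-quantized and tuned with ABC optimization.

      import numpy as np

      taps = np.array([-1, 0, 1, 2])             # sample offsets around the interpolation point

      def lagrange_farrow_matrix(taps):
          """Row j, column k: coefficient of d**j in the Lagrange basis polynomial of tap k."""
          n = len(taps)
          A = np.zeros((n, n))
          for col, k in enumerate(taps):
              others = taps[taps != k]
              # l_k(d) = prod_{m != k} (d - m) / (k - m), coefficients highest power first
              poly = np.poly(others) / np.prod(k - others)
              A[:, col] = poly[::-1]             # store as ascending powers of d
          return A

      def farrow_interpolate(x, n, d, A=lagrange_farrow_matrix(taps)):
          """Estimate x(n + d) for 0 <= d < 1 using the fixed sub-filters in A."""
          window = x[n + taps]                   # the four samples around the target position
          c = A @ window                         # fixed sub-filter outputs c_j
          return np.polyval(c[::-1], d)          # evaluate the polynomial in d (Farrow structure)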

  15. Design of efficient circularly symmetric two-dimensional variable digital FIR filters

    PubMed Central

    Bindima, Thayyil; Elias, Elizabeth

    2016-01-01

    Circularly symmetric two-dimensional (2D) finite impulse response (FIR) filters find extensive use in image and medical applications, especially for isotropic filtering. Moreover, the design and implementation of 2D digital filters with variable fractional delay and variable magnitude responses without redesigning the filter has become a crucial topic of interest due to its significance in low-cost applications. Recently, the design using fixed word-length coefficients has gained importance due to the replacement of multipliers by shifters and adders, which reduces the hardware complexity. Among the various approaches to 2D design, transforming a one-dimensional (1D) filter to 2D is reported to be an efficient technique. In this paper, 1D variable digital filters (VDFs) with tunable cut-off frequencies are designed using a Farrow structure based interpolation approach, and the sub-filter coefficients in the Farrow structure are made multiplier-less using canonic signed digit (CSD) representation. The resulting performance degradation in the filters is overcome by using artificial bee colony (ABC) optimization. Finally, the optimized 1D VDFs are mapped to 2D using the generalized McClellan transformation, resulting in low-complexity, circularly symmetric 2D VDFs with real-time tunability. PMID:27222739

  16. Personalized development of human organs using 3D printing technology.

    PubMed

    Radenkovic, Dina; Solouk, Atefeh; Seifalian, Alexander

    2016-02-01

    3D printing is a technique of fabricating physical models from a 3D volumetric digital image. The image is sliced and printed in thin layers using a specific material, and successive layering of the material produces a 3D model. It has already been used for printing surgical models for preoperative planning and in constructing personalized prostheses for patients. The ultimate goal is to achieve the development of functional human organs and tissues, to overcome limitations of organ transplantation created by the lack of organ donors and life-long immunosuppression. We hypothesized a precision medicine approach to human organ fabrication using 3D printing technology, in which the digital volumetric data would be collected by imaging of a patient, i.e., CT or MRI images, followed by mathematical modeling to create a digital 3D image. Then a suitable biocompatible material, with an optimal resolution for cell seeding and maintenance of cell viability during the printing process, would be printed with a compatible printer type and finally implanted into the patient. Life-saving operations with 3D printed implants have already been performed in patients. However, several issues need to be addressed before translational application of 3D printing into clinical medicine. These are vascularization, innervation, the financial cost of 3D printing, and the safety of the biomaterials used for the construct. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Optimization of tungsten x-ray spectra for digital mammography: a comparison of model to experiment

    NASA Astrophysics Data System (ADS)

    Andre, Michael P.; Spivey, Brett A.

    1997-05-01

    Tungsten (W) target x-ray tubes are being studied for use in digital mammography to improve x-ray flux, reduce noise and increase tube heat capacity. A parametric model was developed for digital mammography to evaluate optimization of x-ray spectra for a particular sensor. The model computes spectra and mean glandular doses (MGD) for combinations of W target, beam filters, kVp, breast type and thickness. Two figures of merit were defined: (signal/noise)²/MGD and spectral quantum efficiency; these were computed as a means to approach optimization of object contrast. The model is derived from a combination of classic equations, XCOM from NBS, and published data. X-ray spectra were calculated and measured for filters of Al, Sn, Rh, Mo and Ag on a Eureka tube. (Signal/noise)²/MGD was measured for a filtered W target tube and a digital camera employing a CsI scintillator optically coupled to a CCD for which the detective quantum efficiency (DQE) was known. A 3-mm thick acrylic disk was imaged on thicknesses of 3-8 cm of acrylic and the results were compared to the predictions of the model. The relative error between predicted and measured spectra was +/- 2 percent from 24 to 34 kVp. Calculated MGD as a function of breast thickness, half-value layer and beam filter compares very well to published data. Best performance was found for the following combinations: Mo filter with 30 mm breast, Ag filter with 45 mm, Sn filter for 60 mm, and Al filter for 75 mm thick breast. The parametric model agrees well with measurement and provides a means to explore optimum combinations of kVp and beam filter. For a particular detector, these data may be used with the DQE to estimate total system signal-to-noise ratio for a particular imaging task.

  18. [Optimal beam quality for chest digital radiography].

    PubMed

    Oda, Nobuhiro; Tabata, Yoshito; Nakano, Tsutomu

    2014-11-01

    To investigate the optimal beam quality for chest computed radiography (CR), we measured the radiographic contrast and evaluated the image quality of chest CR using various X-ray tube voltages. The contrast between lung and rib or heart increased on CR images obtained by lowering the tube voltage from 140 to 60 kV, but the degree of increase was modest. Scattered radiation was reduced on CR images acquired with a lower tube voltage. The Wiener spectrum of CR images with a low tube voltage showed a low value under identical amounts of photostimulated luminescence. The quality of chest CR images obtained using a lower tube voltage (80 kV and 100 kV) was evaluated as superior to that of images obtained with a higher tube voltage (120 kV and 140 kV). Considering the problems of tube loading and exposure in clinical applications, a tube voltage of 90 to 100 kV (0.1 mm copper filter backed by 0.5 mm aluminum) is recommended for chest CR.

  19. Content dependent selection of image enhancement parameters for mobile displays

    NASA Astrophysics Data System (ADS)

    Lee, Yoon-Gyoo; Kang, Yoo-Jin; Kim, Han-Eol; Kim, Ka-Hee; Kim, Choon-Woo

    2011-01-01

    Mobile devices such as cellular phones and portable multimedia players capable of playing terrestrial digital multimedia broadcasting (T-DMB) content have been introduced into the consumer market. In this paper, a content-dependent image quality enhancement method covering sharpness, colorfulness, and noise reduction is presented to improve perceived image quality on mobile displays. Human visual experiments are performed to analyze viewers' preferences. The relationship between objective content measures and the optimal values of the image control parameters is modeled by simple lookup tables based on the results of the human visual experiments. Content-dependent values of the image control parameters are then determined from the calculated measures and the predetermined lookup tables. Experimental results indicate that dynamic selection of image control parameters yields better image quality.
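
    A minimal sketch of the content-dependent selection idea, assuming a gradient-based sharpness measure for 8-bit frames and a hypothetical lookup table; in the study above the table entries come from the human visual experiments.

      import numpy as np

      lut_measure = np.array([0.00, 0.02, 0.05, 0.10, 0.20])   # sharpness-measure breakpoints (hypothetical)
      lut_gain    = np.array([1.8,  1.5,  1.2,  1.0,  0.8])    # enhancement gain at each breakpoint (hypothetical)

      def select_sharpening_gain(frame):
          gy, gx = np.gradient(frame.astype(float))
          measure = np.mean(np.hypot(gx, gy)) / 255.0           # normalized mean gradient magnitude
          return np.interp(measure, lut_measure, lut_gain)      # piecewise-linear LUT lookup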

  20. The Ultracam Story

    NASA Astrophysics Data System (ADS)

    Leberl, F.; Gruber, M.; Ponticelli, M.; Wiechert, A.

    2012-07-01

    The UltraCam-project created a novel Large Format Digital Aerial Camera. It was inspired by the ISPRS Congress 2000 in Amsterdam. The search for a promising imaging idea succeeded in May 2001, defining a tiling approach with multiple lenses and multiple area CCD arrays to assemble a seamless and geometrically stable monolithic photogrammetric aerial large format image. First resources were spent on the project in September 2001. The initial UltraCam-D was announced and demonstrated in May 2003. By now the imaging principle has resulted in a 4th generation UltraCam Eagle, increasing the original swath width from 11,500 pixels to beyond 20,000. Inspired by the original imaging principle, alternatives have been investigated, and the UltraCam-G carries the swath width even further, to a frame image with nearly 30,000 pixels, albeit with a modified tiling concept optimized for orthophoto production. We explain the advent of digital aerial large format imaging and how it benefits from improvements in computing technology to cope with data flows at a rate of 3 Gigabits per second and a need to deal with Terabytes of imagery within a single aerial sortie. We also address the many benefits of a transition to a fully digital workflow with a paradigm shift away from minimizing a project's number of aerial photographs and towards maximizing the automation of photogrammetric workflows by means of high redundancy imaging strategies. The instant gratification from near-real-time aerial triangulation and dense image matching has led to a reassessment of the value of photogrammetric point clouds, which now successfully compete with direct point cloud measurements by LiDAR.

  1. Routine Digital Pathology Workflow: The Catania Experience

    PubMed Central

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning kept them from sticking to the scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory. PMID:29416914

  2. Routine Digital Pathology Workflow: The Catania Experience.

    PubMed

    Fraggetta, Filippo; Garozzo, Salvatore; Zannoni, Gian Franco; Pantanowitz, Liron; Rossi, Esther Diana

    2017-01-01

    Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning kept them from sticking to the scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  3. Digital imaging biomarkers feed machine learning for melanoma screening.

    PubMed

    Gareau, Daniel S; Correa da Rosa, Joel; Yagerman, Sarah; Carucci, John A; Gulati, Nicholas; Hueto, Ferran; DeFazio, Jennifer L; Suárez-Fariñas, Mayte; Marghoob, Ashfaq; Krueger, James G

    2017-07-01

    We developed an automated approach for generating quantitative image analysis metrics (imaging biomarkers) that are then analysed with a set of 13 machine learning algorithms to generate an overall risk score that is called a Q-score. These methods were applied to a set of 120 "difficult" dermoscopy images of dysplastic nevi and melanomas that were subsequently excised/classified. This approach yielded 98% sensitivity and 36% specificity for melanoma detection, approaching sensitivity/specificity of expert lesion evaluation. Importantly, we found strong spectral dependence of many imaging biomarkers in blue or red colour channels, suggesting the need to optimize spectral evaluation of pigmented lesions. © 2016 The Authors. Experimental Dermatology Published by John Wiley & Sons Ltd.

  4. Comparison of low-light nonmydriatic digital imaging with 35-mm ETDRS seven-standard field stereo color fundus photographs and clinical examination.

    PubMed

    Silva, Paolo S; Walia, Saloni; Cavallerano, Jerry D; Sun, Jennifer K; Dunn, Cheri; Bursell, Sven-Erik; Aiello, Lloyd M; Aiello, Lloyd Paul

    2012-09-01

    To compare the clinical level of diabetic retinopathy (DR) and diabetic macular edema (DME) diagnosed from nonmydriatic fundus images, captured using a digital camera back optimized for low-flash image capture (MegaVision), with standard seven-field Early Treatment Diabetic Retinopathy Study (ETDRS) photographs and dilated clinical examination. Subject comfort and image acquisition time were also evaluated. In total, 126 eyes from 67 subjects with diabetes underwent Joslin Vision Network nonmydriatic retinal imaging. ETDRS photographs were obtained after pupillary dilation, and fundus examination was performed by a retina specialist. There was near-perfect agreement between MegaVision and ETDRS photographs (κ=0.81, 95% confidence interval [CI] 0.73-0.89) for clinical DR severity levels. Substantial agreement was observed with clinical examination (κ=0.71, 95% CI 0.62-0.80). For DME severity level there was near-perfect agreement with ETDRS photographs (κ=0.92, 95% CI 0.87-0.98) and moderate agreement with clinical examination (κ=0.58, 95% CI 0.46-0.71). The wider MegaVision 45° field led to identification of nonproliferative changes in areas not imaged by the 30° field of ETDRS photos. Field area unique to ETDRS photographs identified proliferative changes not visualized with MegaVision. Mean MegaVision acquisition time was 9:52 min. After imaging, 60% of subjects preferred the MegaVision lower flash settings. When evaluated using a rigorous protocol, images captured using a low-light digital camera compared favorably with ETDRS photography and clinical examination for grading level of DR and DME. Furthermore, these data suggest the importance of more extensive peripheral images and suggest that utilization of wide-field retinal imaging may further improve accuracy of DR assessment.

  5. Evaluating RGB photogrammetry and multi-temporal digital surface models for detecting soil erosion

    NASA Astrophysics Data System (ADS)

    Anders, Niels; Keesstra, Saskia; Seeger, Manuel

    2013-04-01

    Photogrammetry is a widely used tool for generating high-resolution digital surface models. Unmanned Aerial Vehicles (UAVs), equipped with a Red Green Blue (RGB) camera, have great potential in quickly acquiring multi-temporal high-resolution orthophotos and surface models. Such datasets would ease the monitoring of geomorphological processes, such as local soil erosion and rill formation after heavy rainfall events. In this study we test a photogrammetric setup to determine data requirements for soil erosion studies with UAVs. We used a rainfall simulator (5 m²) with a rig mounted above it, to which a Panasonic GX1 16-megapixel digital camera with a 20 mm lens was attached. The soil material in the simulator consisted of loamy sand on a 5-degree slope. Stereo pair images were taken before and after rainfall simulation with 75-85% overlap. Acquired images were automatically mosaicked to create high-resolution orthorectified images and digital surface models (DSM). We resampled the DSM to different spatial resolutions to analyze the effect of cell size on the accuracy of measured rill depths and soil loss estimates, and determined an optimal cell size (and thus flight altitude). Furthermore, the high spatial accuracy of the acquired surface models allows further analysis of rill formation and channel initiation related to, e.g., surface roughness. We suggest implementing near-infrared and temperature sensors to combine soil moisture and soil physical properties with surface morphology for future investigations.
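
    A minimal sketch of how soil loss could be estimated from the multi-temporal surface models, assuming the pre- and post-rainfall DSMs are co-registered: difference them, discard changes below a noise threshold, and integrate over the cell area. The cell size and noise threshold are assumptions that would follow from the chosen resolution and flight altitude.

      import numpy as np

      def soil_loss_volume(dsm_before, dsm_after, cell_size, noise_threshold=0.002):
          """dsm_before, dsm_after: 2-D elevation grids; cell_size in metres."""
          dz = dsm_before - dsm_after                  # positive where the surface was lowered
          eroded = np.where(dz > noise_threshold, dz, 0.0)
          return eroded.sum() * cell_size**2           # eroded volume (height units x m^2)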

  6. Stable and accurate methods for identification of water bodies from Landsat series imagery using meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Gamshadzaei, Mohammad Hossein; Rahimzadegan, Majid

    2017-10-01

    Identification of water extents in Landsat images is challenging due to surfaces with reflectance similar to that of water. The objective of this study is to provide stable and accurate methods for identifying water extents in Landsat images based on meta-heuristic algorithms. To this end, seven Landsat images were selected from various environmental regions of Iran. Training of the algorithms was performed using 40 water pixels and 40 nonwater pixels in Operational Land Imager images of Chitgar Lake (one of the study regions). Moreover, high-resolution images from Google Earth were digitized to evaluate the results. Two approaches were considered: index-based and artificial intelligence (AI) algorithms. In the first approach, nine common water spectral indices were investigated. AI algorithms were utilized to acquire coefficients of optimal band combinations to extract water extents. Among the AI algorithms, the artificial neural network algorithm and also the ant colony optimization, genetic algorithm, and particle swarm optimization (PSO) meta-heuristic algorithms were implemented. Index-based methods showed different performances in the various regions. Among the AI methods, PSO had the best performance, with an average overall accuracy and kappa coefficient of 93% and 98%, respectively. The results indicated the applicability of the acquired band combinations to accurately and stably extract water extents in Landsat imagery.
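
    As an example of the index-based approach, the sketch below computes McFeeters' Normalized Difference Water Index (NDWI) from the green and near-infrared bands and thresholds it; this is one of the common indices of the kind investigated above, and the zero threshold is only a typical starting point, not a value from the study.

      import numpy as np

      def ndwi_water_mask(green, nir, threshold=0.0):
          """green, nir: co-registered band arrays; returns a boolean water mask."""
          g = green.astype(float)
          n = nir.astype(float)
          ndwi = (g - n) / (g + n + 1e-9)      # NDWI = (Green - NIR) / (Green + NIR), guard against /0
          return ndwi > threshold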

  7. Threshold Determination for Local Instantaneous Sea Surface Height Derivation with Icebridge Data in Beaufort Sea

    NASA Astrophysics Data System (ADS)

    Zhu, C.; Zhang, S.; Xiao, F.; Li, J.; Yuan, L.; Zhang, Y.; Zhu, T.

    2018-05-01

    The NASA Operation IceBridge (OIB) mission, initiated in 2009, is currently the largest program in Earth polar remote sensing science observation; it collects airborne remote sensing measurements to bridge the gap between NASA's ICESat and the upcoming ICESat-2 mission. This paper develops an improved method that optimizes the selection of Digital Mapping System (DMS) images and uses an optimal threshold, obtained from experiments in the Beaufort Sea, to calculate the local instantaneous sea surface height in this area. The optimal threshold was determined by comparing manual selections with the lowest Airborne Topographic Mapper (ATM) L1B elevation thresholds of 2%, 1%, 0.5%, 0.2%, 0.1%, and 0.05% in sections A, B, and C; the means of the mean differences were 0.166 m, 0.124 m, 0.083 m, 0.018 m, 0.002 m, and -0.034 m. Our study shows that the lowest 0.1% of the L1B data is the optimal threshold. The optimal threshold and the manual selections were also used to calculate the instantaneous sea surface height over images with leads, and we find that the improved method agrees more closely with the L1B manual selections. For images without leads, the local instantaneous sea surface height is estimated using linear relations between distance and the sea surface height calculated over images with leads.
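
    A minimal sketch of the threshold rule found to be optimal above: the local instantaneous sea surface height over a lead is taken as the mean of the lowest 0.1% of the ATM L1B elevations. The function name and input layout are assumptions.

      import numpy as np

      def local_sea_surface_height(elevations, fraction=0.001):
          """elevations: ATM L1B elevations over an image with a lead; fraction: lowest share used."""
          elev = np.sort(np.asarray(elevations).ravel())
          n = max(1, int(round(fraction * elev.size)))
          return elev[:n].mean()                 # mean of the lowest 0.1% by default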

  8. Structure and properties of clinical coralline implants measured via 3D imaging and analysis.

    PubMed

    Knackstedt, Mark Alexander; Arns, Christoph H; Senden, Tim J; Gross, Karlis

    2006-05-01

    The development and design of advanced porous materials for biomedical applications requires a thorough understanding of how material structure impacts on mechanical and transport properties. This paper illustrates a 3D imaging and analysis study of two clinically proven coral bone graft samples (Porites and Goniopora). Images are obtained from X-ray micro-computed tomography (micro-CT) at a resolution of 16.8 μm. A visual comparison of the two images shows very different structure; Porites has a homogeneous structure and consistent pore size while Goniopora has a bimodal pore size and a strongly disordered structure. A number of 3D structural characteristics are measured directly on the images including pore volume-to-surface-area, pore and solid size distributions, chord length measurements and tortuosity. Computational results made directly on the digitized tomographic images are presented for the permeability, diffusivity and elastic modulus of the coral samples. The results allow one to quantify differences between the two samples. 3D digital analysis can provide a more thorough assessment of biomaterial structure including the pore wall thickness, local flow, mechanical properties and diffusion pathways. We discuss the implications of these results to the development of optimal scaffold design for tissue ingrowth.

  9. Diagnosing and ranking retinopathy disease level using diabetic fundus image recuperation approach.

    PubMed

    Somasundaram, K; Rajendran, P Alli

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudates detection, respectively. However, these methods did not incorporate a more detailed feature selection technique for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practically automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images evaluates the DFIR method with respect to sensitivity, ranking efficiency, and feature selection time.

  10. Fluid surface compensation in digital holographic microscopy for topography measurement

    NASA Astrophysics Data System (ADS)

    Lin, Li-Chien; Tu, Han-Yen; Lai, Xin-Ji; Wang, Sheng-Shiun; Cheng, Chau-Jern

    2012-06-01

    A novel technique is presented for surface compensation and topography measurement of a specimen in fluid medium by digital holographic microscopy (DHM). In the measurement, the specimen is preserved in a culture dish full of liquid culture medium and an environmental vibration induces a series of ripples to create a non-uniform background on the reconstructed phase image. A background surface compensation algorithm is proposed to account for this problem. First, we distinguish the cell image from the non-uniform background and a morphological image operation is used to reduce the noise effect on the background surface areas. Then, an adaptive sampling from the background surface is employed, taking dense samples from the high-variation area while leaving the smooth region mostly untouched. A surface fitting algorithm based on the optimal bi-cubic functional approximation is used to establish a whole background surface for the phase image. Once the background surface is found, the background compensated phase can be obtained by subtracting the estimated background from the original phase image. From the experimental results, the proposed algorithm performs effectively in removing the non-uniform background of the phase image and has the ability to obtain the specimen topography inside fluid medium under environmental vibrations.
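
    A simplified version of the background-compensation step, assuming the background mask has already been obtained from the morphological processing described above: a low-order polynomial surface is least-squares fitted to the sampled background pixels and subtracted from the phase image. The plain cubic polynomial here stands in for the paper's optimal bi-cubic approximation with adaptive sampling.

      import numpy as np

      def compensate_background(phase, background_mask, order=3):
          """phase: 2-D phase image; background_mask: boolean mask of background pixels."""
          h, w = phase.shape
          yy, xx = np.mgrid[0:h, 0:w]
          x, y = xx / w, yy / h
          # Design matrix with all monomials x^i * y^j, i + j <= order
          terms = [(x**i) * (y**j) for i in range(order + 1) for j in range(order + 1 - i)]
          A = np.stack([t[background_mask] for t in terms], axis=1)
          coeffs, *_ = np.linalg.lstsq(A, phase[background_mask], rcond=None)
          background = sum(c * t for c, t in zip(coeffs, terms))
          return phase - background              # background-compensated phase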

  11. Diagnosing and Ranking Retinopathy Disease Level Using Diabetic Fundus Image Recuperation Approach

    PubMed Central

    Somasundaram, K.; Alli Rajendran, P.

    2015-01-01

    Retinal fundus images are widely used in diagnosing different types of eye diseases. Existing methods such as Feature Based Macular Edema Detection (FMED) and Optimally Adjusted Morphological Operator (OAMO) effectively detected the presence of exudation in fundus images and identified the true positive ratio of exudates detection, respectively. However, these methods did not incorporate a more detailed feature selection technique for the detection of diabetic retinopathy. To categorize the exudates, a Diabetic Fundus Image Recuperation (DFIR) method based on a sliding window approach is developed in this work to select the features of the optic cup in digital retinal fundus images. The DFIR feature selection uses a collection of sliding windows with varying range to obtain the features based on the histogram value using a Group Sparsity Nonoverlapping Function. Using a support vector model in the second phase, the DFIR method based on a Spiral Basis Function effectively ranks the diabetic retinopathy disease level. The ranking of disease level on each candidate set provides a promising basis for developing a practically automated and assisted diabetic retinopathy diagnosis system. Experimental work on digital fundus images evaluates the DFIR method with respect to sensitivity, ranking efficiency, and feature selection time. PMID:25945362

  12. Lunar UV-visible-IR mapping interferometric spectrometer

    NASA Technical Reports Server (NTRS)

    Smith, W. Hayden; Haskin, L.; Korotev, R.; Arvidson, R.; Mckinnon, W.; Hapke, B.; Larson, S.; Lucey, P.

    1992-01-01

    Ultraviolet-visible-infrared mapping digital array scanned interferometers for lunar compositional surveys were developed. The research has defined a no-moving-parts, low-weight and low-power, high-throughput, and electronically adaptable digital array scanned interferometer that achieves measurement objectives encompassing and improving upon all the requirements defined by the LEXSWIG for lunar mineralogical investigation. In addition, LUMIS provides a new, important ultraviolet spectral mapping capability, a high-spatial-resolution line scan camera, and multispectral camera capabilities. An instrument configuration optimized for spectral mapping and imaging of the lunar surface is described, and spectral results in support of the instrument design are provided.

  13. DCTune Perceptual Optimization of Compressed Dental X-Rays

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Null, Cynthia H. (Technical Monitor)

    1997-01-01

    In current dental practice, x-rays of completed dental work are often sent to the insurer for verification. It is faster and cheaper to instead transmit digital scans of the x-rays. Further economies result if the images are sent in compressed form. DCTune is a technology for optimizing DCT quantization matrices to yield maximum perceptual quality for a given bit-rate, or minimum bit-rate for a given perceptual quality. In addition, the technology provides a means of setting the perceptual quality of compressed imagery in a systematic way. The purpose of this research was, with respect to dental x-rays: (1) to verify the advantage of DCTune over standard JPEG; (2) to verify the quality control feature of DCTune; and (3) to discover regularities in the optimized matrices of a set of images. Additional information is contained in the original extended abstract.
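
    The sketch below shows where a quantization matrix acts in DCT-based compression: an 8×8 block is transformed, divided element-wise by the matrix, and rounded. The uniform placeholder matrix is only an assumption standing in for the perceptually optimized matrices that DCTune would produce for a target quality or bit-rate.

      import numpy as np
      from scipy.fft import dctn, idctn

      Q = np.full((8, 8), 16.0)                       # placeholder quantization matrix

      def quantize_block(block, Q=Q):
          """block: 8x8 array of pixel values in [0, 255]."""
          coeffs = dctn(block - 128.0, norm="ortho")  # level-shift and forward 2-D DCT
          return np.round(coeffs / Q)                 # coarser Q entries discard more detail

      def dequantize_block(qcoeffs, Q=Q):
          return idctn(qcoeffs * Q, norm="ortho") + 128.0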

  14. Fractal analysis of mandibular trabecular bone: optimal tile sizes for the tile counting method.

    PubMed

    Huh, Kyung-Hoe; Baik, Jee-Seon; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Lee, Sun-Bok; Lee, Seung-Pyo

    2011-06-01

    This study was performed to determine the optimal tile size for the fractal dimension of the mandibular trabecular bone using a tile counting method. Digital intraoral radiographic images were obtained at the mandibular angle, molar, premolar, and incisor regions of 29 human dry mandibles. After preprocessing, the parameters representing morphometric characteristics of the trabecular bone were calculated. The fractal dimensions of the processed images were analyzed in various tile sizes by the tile counting method. The optimal range of tile size was 0.132 mm to 0.396 mm for the fractal dimension using the tile counting method. The sizes were closely related to the morphometric parameters. The fractal dimension of mandibular trabecular bone, as calculated with the tile counting method, can be best characterized with a range of tile sizes from 0.132 to 0.396 mm.
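
    A minimal sketch of the tile (box) counting estimate, assuming the radiograph has already been binarized into a trabecular pattern: occupied tiles are counted at several tile sizes, and the fractal dimension is the negative slope of the log-log fit. Tile sizes in pixels would be chosen to cover the reported 0.132-0.396 mm range given the detector's pixel pitch.

      import numpy as np

      def tile_count_dimension(binary, tile_sizes):
          """binary: 2-D boolean trabecular pattern; tile_sizes: iterable of tile sizes in pixels."""
          counts = []
          for s in tile_sizes:
              h, w = binary.shape
              trimmed = binary[:h - h % s, :w - w % s]
              tiles = trimmed.reshape(h // s, s, w // s, s)
              counts.append(tiles.any(axis=(1, 3)).sum())   # tiles containing any trabecular pixel
          slope, _ = np.polyfit(np.log(tile_sizes), np.log(counts), 1)
          return -slope                                     # fractal dimension estimate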

  15. Fractal analysis of mandibular trabecular bone: optimal tile sizes for the tile counting method

    PubMed Central

    Huh, Kyung-Hoe; Baik, Jee-Seon; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Lee, Sun-Bok; Lee, Seung-Pyo

    2011-01-01

    Purpose This study was performed to determine the optimal tile size for the fractal dimension of the mandibular trabecular bone using a tile counting method. Materials and Methods Digital intraoral radiographic images were obtained at the mandibular angle, molar, premolar, and incisor regions of 29 human dry mandibles. After preprocessing, the parameters representing morphometric characteristics of the trabecular bone were calculated. The fractal dimensions of the processed images were analyzed in various tile sizes by the tile counting method. Results The optimal range of tile size was 0.132 mm to 0.396 mm for the fractal dimension using the tile counting method. The sizes were closely related to the morphometric parameters. Conclusion The fractal dimension of mandibular trabecular bone, as calculated with the tile counting method, can be best characterized with a range of tile sizes from 0.132 to 0.396 mm. PMID:21977478

  16. Digital pathology: DICOM-conform draft, testbed, and first results.

    PubMed

    Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes

    2007-09-01

    Hospital information systems are now state of the art. Therefore, Digital Pathology, also known as Virtual Microscopy, has gained increased attention. Driven by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions concerning the huge amount of data as well as the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experiences led to the draft of a DICOM-conform information model that accounted for the extensions, definitions, and technical requirements necessary to integrate digital pathology into a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Compared with a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experiences led to a preference for JPEG2000/JPIP-based streaming of the whole slide image. The results showed that digital pathology is feasible, but strong efforts by users and vendors are still necessary to integrate Digital Pathology into existing information systems.

  17. TOPICAL REVIEW: Digital x-ray tomosynthesis: current state of the art and clinical potential

    NASA Astrophysics Data System (ADS)

    Dobbins, James T., III; Godfrey, Devon J.

    2003-10-01

    Digital x-ray tomosynthesis is a technique for producing slice images using conventional x-ray systems. It is a refinement of conventional geometric tomography, which has been known since the 1930s. In conventional geometric tomography, the x-ray tube and image receptor move in synchrony on opposite sides of the patient to produce a plane of structures in sharp focus at the plane containing the fulcrum of the motion; all other structures above and below the fulcrum plane are blurred and thus less visible in the resulting image. Tomosynthesis improves upon conventional geometric tomography in that it allows an arbitrary number of in-focus planes to be generated retrospectively from a sequence of projection radiographs that are acquired during a single motion of the x-ray tube. By shifting and adding these projection radiographs, specific planes may be reconstructed. This topical review describes the various reconstruction algorithms used to produce tomosynthesis images, as well as approaches used to minimize the residual blur from out-of-plane structures. Historical background and mathematical details are given for the various approaches described. Approaches for optimizing the tomosynthesis image are given. Applications of tomosynthesis to various clinical tasks, including angiography, chest imaging, mammography, dental imaging and orthopaedic imaging, are also described.
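
    A minimal shift-and-add reconstruction sketch along the lines described above: each projection is shifted by the parallax expected for the selected plane and the results are averaged, so structures in that plane reinforce while out-of-plane structures blur. The parallel-shift geometry, parameter names, and sign convention are simplifying assumptions and would need to match the actual acquisition geometry.

      import numpy as np
      from scipy.ndimage import shift as nd_shift

      def shift_and_add(projections, tube_offsets_mm, plane_height_mm,
                        detector_distance_mm, pixel_pitch_mm):
          """projections: list of 2-D radiographs; tube_offsets_mm: lateral tube positions."""
          recon = np.zeros_like(projections[0], dtype=float)
          for proj, x_tube in zip(projections, tube_offsets_mm):
              # Parallax (mm) of structures at the chosen plane height for this tube position;
              # shifting each view by this amount brings that plane into registration.
              dx_mm = x_tube * plane_height_mm / (detector_distance_mm - plane_height_mm)
              recon += nd_shift(proj.astype(float), (0.0, dx_mm / pixel_pitch_mm), order=1)
          return recon / len(projections)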

  18. A Digital Preclinical PET/MRI Insert and Initial Results.

    PubMed

    Weissler, Bjoern; Gebhardt, Pierre; Dueppenbecker, Peter M; Wehner, Jakob; Schug, David; Lerche, Christoph W; Goldschmidt, Benjamin; Salomon, Andre; Verel, Iris; Heijman, Edwin; Perkuhn, Michael; Heberling, Dirk; Botnar, Rene M; Kiessling, Fabian; Schulz, Volkmar

    2015-11-01

    Combining Positron Emission Tomography (PET) with Magnetic Resonance Imaging (MRI) results in a promising hybrid molecular imaging modality as it unifies the high sensitivity of PET for molecular and cellular processes with the functional and anatomical information from MRI. Digital Silicon Photomultipliers (dSiPMs) are the digital evolution in scintillation light detector technology and promise high PET SNR. DSiPMs from Philips Digital Photon Counting (PDPC) were used to develop a preclinical PET/RF gantry with 1-mm scintillation crystal pitch as an insert for clinical MRI scanners. With three exchangeable RF coils, the hybrid field of view has a maximum size of 160 mm × 96.6 mm (transaxial × axial). A 0.1 ppm volume-root-mean-square B₀ homogeneity is kept within a spherical diameter of 96 mm (automatic volume shimming). Depending on the coil, MRI SNR is decreased by 13% or 5% by the PET system. PET count rates, energy resolution of 12.6% FWHM, and spatial resolution of 0.73 mm³ (isometric volume resolution at isocenter) are not affected by applied MRI sequences. PET time resolution of 565 ps (FWHM) degraded by 6 ps during an EPI sequence. Timing-optimized settings yielded 260 ps time resolution. PET and MR images of a hot-rod phantom show no visible differences when the other modality was in operation, and both resolve 0.8-mm rods. Versatility of the insert is shown by successfully combining multi-nuclei MRI (¹H/¹⁹F) with simultaneously measured PET (¹⁸F-FDG). A longitudinal study of a tumor-bearing mouse verifies the operability, stability, and in vivo capabilities of the system. Cardiac- and respiratory-gated PET/MRI motion-capturing (CINE) images of the mouse heart demonstrate the advantage of simultaneous acquisition for temporal and spatial image registration.

  19. Digital tumor fluoroscopy (DTF)--a new direct imaging system in the therapy planning for brain tumors.

    PubMed

    Herbst, M; Fröder, M

    1990-01-01

    Digital Tumor Fluoroscopy is an expanded x-ray video chain optimized for iodine contrast, with an extended gray scale of up to 64,000 gray values. Series of pictures are taken before and after injection of contrast medium. With the most recent unit, up to ten images can be taken and stored. The microprogrammable processor allows the subtraction of images recorded at any moment of the examination. Dynamic views of the distribution of contrast medium in the intravascular and extravascular spaces of brain and tumor tissue are gained by the subtraction of stored images. Tumors can be differentiated by studying the storage and drainage behavior of the contrast medium during the period of examination. Meningiomas store contrast medium very intensively during the whole time of investigation, whereas grade 2-3 astrocytomas take it up less strongly at the beginning and release it within 2 min. Glioblastomas show a massive but delayed accumulation of contrast medium and a decreased flow-off rate. In comparison with radiography and MR imaging, the most important advantage of Digital Tumor Fluoroscopy is that direct information on tumor localization is gained in relation to the skull-cap. This enables the radiotherapist to mark the treatment field directly on the skull. Therefore it is no longer necessary to calculate the tumor volume from several CT scans for localization. In radiotherapy, a Digital Tumor Fluoroscopy unit combined with a simulator can replace CT planning. This would help overcome the disadvantages arising from the lack of a collimating system, and the inaccuracies which result from completely different geometric relationships between a CT unit and a therapy machine.

  20. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

    The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.

  1. Fast words boundaries localization in text fields for low quality document images

    NASA Astrophysics Data System (ADS)

    Ilin, Dmitry; Novikov, Dmitriy; Polevoy, Dmitry; Nikolaev, Dmitry

    2018-04-01

    The paper examines the problem of precise localization of word boundaries in document text zones. Document processing on a mobile device consists of document localization, perspective correction, localization of individual fields, finding words in separate zones, segmentation and recognition. When an image is captured with a mobile digital camera under uncontrolled conditions, digital noise, perspective distortions or glares may occur. Further document processing is complicated by document-specific features: layout elements, complex backgrounds, static text, security elements and a variety of text fonts. However, the word boundary localization problem has to be solved at runtime on a mobile CPU with limited computing capability under the specified restrictions. At the moment, there are several groups of methods optimized for different conditions. Methods for scanned printed text are fast but limited to high-quality images. Methods for text in the wild have excessively high computational complexity and are thus hardly suitable for running on mobile devices as part of a mobile document recognition system. The method presented in this paper solves a more specialized problem than the task of finding text in natural images. It uses local features, a sliding window and a lightweight neural network in order to achieve an optimal speed-precision ratio. The algorithm runs in 12 ms per field on the ARM processor of a mobile device. The error rate for boundary localization on a test sample of 8000 fields is 0.3
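
    The abstract does not detail the network itself, so the sketch below only illustrates a much simpler column-projection baseline for finding inter-word gaps in a binarized field; it is not the paper's neural-network method, and all names and thresholds are hypothetical.

```python
import numpy as np

def word_gaps(field_gray, ink_threshold=0.5, min_gap=3):
    """Toy baseline for word boundary localization in a text field.

    field_gray : (H, W) array with values in [0, 1], dark text on light background.
    Returns a list of (gap_start, gap_end) column intervals separating words.
    """
    ink_per_column = (field_gray < ink_threshold).sum(axis=0)  # ink pixels per column
    is_empty = ink_per_column == 0                              # candidate gap columns
    gaps, run_start = [], None
    for x, empty in enumerate(is_empty):
        if empty and run_start is None:
            run_start = x                                       # a gap run begins
        elif not empty and run_start is not None:
            if x - run_start >= min_gap:                        # keep only wide gaps
                gaps.append((run_start, x))
            run_start = None
    return gaps
```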

  2. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    PubMed

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P < .0001), with higher sensitivity (85% vs. 69%) and specificity (94% vs. 73%) for margin prediction in a 5 × 5-mm grid analysis. A significantly better alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  3. Image quality, threshold contrast and mean glandular dose in CR mammography

    NASA Astrophysics Data System (ADS)

    Jakubiak, R. R.; Gamba, H. R.; Neves, E. B.; Peixoto, J. E.

    2013-09-01

    In many countries, computed radiography (CR) systems represent the majority of equipment used in digital mammography. This study presents a method for optimizing image quality and dose in CR mammography of patients with breast thicknesses between 45 and 75 mm. Initially, clinical images of 67 patients (group 1) were analyzed by three experienced radiologists, who reported on anatomical structures, noise and contrast in low and high pixel value areas, and image sharpness and contrast. Exposure parameters (kV, mAs and target/filter combination) used in the examinations of these patients were reproduced to determine the contrast-to-noise ratio (CNR) and mean glandular dose (MGD). The parameters were also used to radiograph a CDMAM (version 3.4) phantom (Artinis Medical Systems, The Netherlands) for image threshold contrast evaluation. After that, different breast thicknesses were simulated with polymethylmethacrylate layers and various sets of exposure parameters were used in order to determine optimal radiographic parameters. For each simulated breast thickness, the optimal beam quality was defined as that giving a target CNR sufficient to reach the threshold contrast of the CDMAM images at an acceptable MGD. These results were used for adjustments to the automatic exposure control (AEC) by the maintenance team. Using optimized exposure parameters, clinical images of 63 patients (group 2) were evaluated as described above. Threshold contrast, CNR and MGD for such exposure parameters were also determined. Results showed that the proposed optimization method was effective for all breast thicknesses studied in phantoms. The best result was found for breasts of 75 mm. While in group 1 there was no detection of the 0.1 mm critical diameter detail with threshold contrast below 23%, after the optimization, detection occurred in 47.6% of the images. There was also an average MGD reduction of 7.5%. The clinical image quality criteria were met in 91.7% of the images for all breast thicknesses evaluated in both patient groups. Finally, this study also concluded that use of the x-ray unit's AEC based on a constant dose to the detector may make it difficult for CR systems to operate under optimal conditions. More studies must be performed so that the compatibility between systems and optimization methodologies, as well as this optimization method itself, can be evaluated. Most methods are developed for phantoms, so comparative studies including clinical images are needed.
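
    For orientation, the contrast-to-noise ratio used as the optimization target above is typically computed from two regions of interest in a phantom exposure; the sketch below shows that calculation under assumed ROI conventions (the names and the target value are illustrative, not those of the study).

```python
import numpy as np

def contrast_to_noise_ratio(image, signal_roi, background_roi):
    """CNR from two rectangular regions of interest.

    signal_roi, background_roi : (row_slice, col_slice) tuples selecting ROIs.
    CNR = (mean_signal - mean_background) / std_background
    """
    signal = image[signal_roi]
    background = image[background_roi]
    return (signal.mean() - background.mean()) / background.std()

# Example: check a candidate exposure against a breast-thickness-specific target.
# cnr = contrast_to_noise_ratio(phantom_image,
#                               (slice(100, 150), slice(100, 150)),
#                               (slice(200, 250), slice(100, 150)))
# exposure_ok = cnr >= cnr_target_for_thickness   # target chosen so the CDMAM
#                                                 # threshold contrast is reached
#                                                 # at an acceptable MGD
```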

  4. Digital image transformation and rectification of spacecraft and radar images

    NASA Technical Reports Server (NTRS)

    Wu, S. S. C.

    1985-01-01

    The application of digital processing techniques to spacecraft television pictures and radar images is discussed. The use of digital rectification to produce contour maps from spacecraft pictures is described; images with azimuth and elevation angles are converted into point-perspective frame pictures. The digital correction of the slant angle of radar images to ground scale is examined. The development of orthophoto and stereoscopic shaded relief maps from digital terrain and digital image data is analyzed. Digital image transformations and rectifications are utilized on Viking Orbiter and Lander pictures of Mars.

  5. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  6. Digital camera with apparatus for authentication of images produced from an image file

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1993-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely based upon the private key that digital data encrypted with the private key by the processor may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating at any time the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match, since even a one-bit change in the image file will cause the computed image hash to be totally different from the secure hash.
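
    A minimal sketch of the hash-and-sign scheme this patent describes, written with SHA-256 and RSA from the Python cryptography package; the in-software key generation shown here stands in for the camera's embedded private key and housing-printed public key, and is purely illustrative.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Stand-ins for the camera's embedded private key and the public key on the housing.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def sign_image(image_bytes: bytes) -> bytes:
    """Hash the image file and encrypt the hash with the private key,
    producing the digital signature stored alongside the image."""
    return private_key.sign(image_bytes, padding.PKCS1v15(), hashes.SHA256())

def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Recompute the hash and compare it with the secure hash recovered from
    the signature; any single-bit change in the file breaks the match."""
    try:
        public_key.verify(signature, image_bytes, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False
```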

  7. A novel data hiding scheme for block truncation coding compressed images using dynamic programming strategy

    NASA Astrophysics Data System (ADS)

    Chang, Ching-Chun; Liu, Yanjun; Nguyen, Son T.

    2015-03-01

    Data hiding is a technique that embeds information into digital cover data. Work on this technique has concentrated on the uncompressed spatial domain, and it is considered more challenging to perform in compressed domains such as vector quantization, JPEG, and block truncation coding (BTC). In this paper, we propose a new data hiding scheme for BTC-compressed images. In the proposed scheme, a dynamic programming strategy was used to search for the optimal solution of the bijective mapping function for LSB substitution. Then, according to the optimal solution, each mean value embeds three secret bits to obtain high hiding capacity with low distortion. The experimental results indicated that the proposed scheme obtained both higher hiding capacity and hiding efficiency than the other four existing schemes, while ensuring good visual quality of the stego-image. In addition, the proposed scheme achieved a bit rate as low as that of the original BTC algorithm.
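
    The dynamic-programming search for the bijective LSB mapping is specific to the proposed scheme and is not reproduced here; the sketch below only shows the standard moment-preserving BTC baseline into which the secret bits would be embedded, for orientation.

```python
import numpy as np

def btc_encode_block(block):
    """Classic block truncation coding of one image block (e.g. 4x4): keep a
    1-bit map plus two reconstruction levels chosen to preserve the block's
    mean and standard deviation."""
    m = block.size
    mean, std = block.mean(), block.std()
    bitmap = block >= mean
    q = int(bitmap.sum())                       # pixels at or above the mean
    if q in (0, m):                             # flat block: a single level suffices
        return mean, mean, bitmap
    low = mean - std * np.sqrt(q / (m - q))
    high = mean + std * np.sqrt((m - q) / q)
    return low, high, bitmap

def btc_decode_block(low, high, bitmap):
    """Reconstruct the block from its two levels and bitmap."""
    return np.where(bitmap, high, low)
```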

  8. Locally optimum nonlinearities for DCT watermark detection.

    PubMed

    Briassouli, Alexia; Strintzis, Michael G

    2004-12-01

    The issue of copyright protection of digital multimedia data has attracted a lot of attention during the last decade. An efficient copyright protection method that has been gaining popularity is watermarking, i.e., the embedding of a signature in a digital document that can be detected only by its rightful owner. Watermarks are usually blindly detected using correlating structures, which would be optimal in the case of Gaussian data. However, in the case of DCT-domain image watermarking, the data is more heavy-tailed and the correlator is clearly suboptimal. Nonlinear receivers have been shown to be particularly well suited for the detection of weak signals in heavy-tailed noise, as they are locally optimal. This motivates the use of the Gaussian-tailed zero-memory nonlinearity, as well as the locally optimal Cauchy nonlinearity, for the detection of watermarks in DCT-transformed images. We analyze the performance of these schemes theoretically and compare it to that of the traditionally used Gaussian correlator, as well as to the recently proposed generalized Gaussian detector, which outperforms the correlator. The actual performance of these systems is assessed through experiments, which verify the theoretical analysis and also justify the use of nonlinear structures for watermark detection. The performance of the correlator and the nonlinear detectors in the presence of quantization is also analyzed, using results from dither theory, and verified experimentally.
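
    To make the contrast between the detectors concrete, the sketch below implements the two detection statistics in NumPy: the linear correlator and a Cauchy locally optimal detector that passes each DCT coefficient through g(x) = 2x/(gamma^2 + x^2) before correlating with the watermark. Variable names and the threshold step are illustrative.

```python
import numpy as np

def correlator_statistic(dct_coeffs, watermark):
    """Linear correlation detector, optimal only for Gaussian host data."""
    return float(np.sum(dct_coeffs * watermark))

def cauchy_lo_statistic(dct_coeffs, watermark, gamma):
    """Locally optimal detector for Cauchy-distributed (heavy-tailed) host data:
    g(x) = 2x / (gamma^2 + x^2) applied coefficient-wise before correlation."""
    g = 2.0 * dct_coeffs / (gamma ** 2 + dct_coeffs ** 2)
    return float(np.sum(g * watermark))

# Detection compares the statistic against a threshold chosen for a target
# false-alarm probability (threshold selection is omitted in this sketch).
```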

  9. Full field vertical scanning in short coherence digital holographic microscope.

    PubMed

    Monemahghdoust, Zahra; Montfort, Frederic; Cuche, Etienne; Emery, Yves; Depeursinge, Christian; Moser, Christophe

    2013-05-20

    In digital holographic microscopes (DHM) implemented in the so-called "off-axis" configuration, the object and reference wavefronts are not coplanar but form an angle of a few degrees. This results in two main drawbacks. First, the contrast of the interference is not spatially uniform when the light source has low coherence. The interference contrast is optimal along a line, but decreases when moving away from it, resulting in lower image quality. Second, the non-coplanarity between the coherence planes of the two wavefronts impacts the coherence vertical scanning measurement mode: when the optical path difference between the signal and the reference beam is changed, the region of maximum interference contrast shifts laterally in the plane of the objective. This results in more complex calculations to extract the topography of the sample and requires scanning over a much larger vertical range, leading to a longer measurement time. We have previously shown that by placing a volume diffractive optical element (VDOE) in the reference arm, the reference wavefront can be made coplanar with the object wavefront and the image plane of the microscope objective, resulting in a uniform and optimal interferogram. In this paper, we demonstrate a vertical scanning speed improvement by an order of magnitude. Noise in the phase and intensity images caused by scattering and non-uniform diffraction in the VDOE is analyzed quantitatively. Five VDOEs were fabricated with an identical procedure. We observe that VDOEs introduce a small intensity non-uniformity in the reference beam, which results in a 20% noise increase in the extracted phase image as compared to the noise in the extracted phase image when the VDOE is removed. However, the VDOE has no impact on the temporal noise measured from extracted phase images.

  10. Digital Rock Physics Aplications: Visualisation Complex Pore and Porosity-Permeability Estimations of the Porous Sandstone Reservoir

    NASA Astrophysics Data System (ADS)

    Handoyo; Fatkhan; Del, Fourier

    2018-03-01

    Reservoir rock containing oil and gas generally has high porosity and permeability. High porosity is expected to accommodate hydrocarbon fluid in large quantities, and high permeability is associated with the rock’s ability to let hydrocarbon fluid flow optimally. Porosity and permeability measurement of a rock sample is usually performed in the laboratory. We estimate the porosity and permeability of sandstones digitally by using digital images from μCT scans. Advantages of the method are that it is non-destructive, can be applied to small rock pieces, and allows the model to be constructed easily. The porosity values are calculated by comparing the pore volume in the digital image to the total volume of the sandstone, while the permeability values are calculated using Lattice Boltzmann simulations based on the laws of conservation of mass and momentum. To determine variations of the porosity and permeability, the main sandstone sample, with a dimension of 300 × 300 × 300 voxels, is divided into eight sub-cubes with a size of 150 × 150 × 150 voxels. Fluid flow velocities from the digital model are visualized as streamlines. Sandstone porosity varies between 0.30 and 0.38, and permeability lies in the range of 4000 mD to 6200 mD. The results of the calculations show that the sandstone sample in this research is highly porous and permeable. The method, combined with rock physics, can be a powerful tool for determining rock properties from small rock fragments.
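
    The porosity step described above reduces to a voxel count on the segmented volume; the sketch below shows that calculation and the subdivision into eight sub-cubes, assuming pore voxels are labeled 0 in the segmented array (an assumed convention). The Lattice Boltzmann permeability calculation is not sketched.

```python
import numpy as np

def porosity(segmented_volume, pore_value=0):
    """Porosity of a segmented micro-CT volume: pore voxels / total voxels."""
    return np.count_nonzero(segmented_volume == pore_value) / segmented_volume.size

def subcube_porosities(segmented_volume, size=150):
    """Porosity of the eight corner sub-cubes of a cubic sample, mirroring the
    subdivision of a 300 x 300 x 300 volume into 150 x 150 x 150 sub-cubes."""
    values = []
    for i in (0, size):
        for j in (0, size):
            for k in (0, size):
                sub = segmented_volume[i:i + size, j:j + size, k:k + size]
                values.append(porosity(sub))
    return values
```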

  11. Comparison of Thermal Detector Arrays for Off-Axis THz Holography and Real-Time THz Imaging

    PubMed Central

    Hack, Erwin; Valzania, Lorenzo; Gäumann, Gregory; Shalaby, Mostafa; Hauri, Christoph P.; Zolliker, Peter

    2016-01-01

    In terahertz (THz) materials science, imaging by scanning prevails when low power THz sources are used. However, the application of array detectors operating with high power THz sources is increasingly reported. We compare the imaging properties of four different array detectors that are able to record THz radiation directly. Two micro-bolometer arrays are designed for infrared imaging in the 8–14 μm wavelength range, but are based on different absorber materials (i) vanadium oxide; (ii) amorphous silicon; (iii) a micro-bolometer array optimized for recording THz radiation based on silicon nitride; and (iv) a pyroelectric array detector for THz beam profile measurements. THz wavelengths of 96.5 μm, 118.8 μm, and 393.6 μm from a powerful far infrared laser were used to assess the technical performance in terms of signal to noise ratio, detector response and detectivity. The usefulness of the detectors for beam profiling and digital holography is assessed. Finally, the potential and limitation for real-time digital holography are discussed. PMID:26861341

  12. Comparison of Thermal Detector Arrays for Off-Axis THz Holography and Real-Time THz Imaging.

    PubMed

    Hack, Erwin; Valzania, Lorenzo; Gäumann, Gregory; Shalaby, Mostafa; Hauri, Christoph P; Zolliker, Peter

    2016-02-06

    In terahertz (THz) materials science, imaging by scanning prevails when low power THz sources are used. However, the application of array detectors operating with high power THz sources is increasingly reported. We compare the imaging properties of four different array detectors that are able to record THz radiation directly. Two micro-bolometer arrays are designed for infrared imaging in the 8-14 μm wavelength range, but are based on different absorber materials (i) vanadium oxide; (ii) amorphous silicon; (iii) a micro-bolometer array optimized for recording THz radiation based on silicon nitride; and (iv) a pyroelectric array detector for THz beam profile measurements. THz wavelengths of 96.5 μm, 118.8 μm, and 393.6 μm from a powerful far infrared laser were used to assess the technical performance in terms of signal to noise ratio, detector response and detectivity. The usefulness of the detectors for beam profiling and digital holography is assessed. Finally, the potential and limitation for real-time digital holography are discussed.

  13. Rotational digital subtraction angiography of the renal arteries: technique and evaluation in the study of native and transplant renal arteries.

    PubMed

    Seymour, H R; Matson, M B; Belli, A M; Morgan, R; Kyriou, J; Patel, U

    2001-02-01

    Rotational digital subtraction angiography (RDSA) allows multidirectional angiographic acquisitions with a single injection of contrast medium. The role of RDSA was evaluated in 60 patients referred over a 7-month period for diagnostic renal angiography and 12 patients referred for renal transplant studies. All angiograms were assessed for their diagnostic value, the presence of anomalies and the quantity of contrast medium used. The effective dose for native renal RDSA was determined. 41 (68.3%) native renal RDSA images and 8 (66.7%) transplant renal RDSA images were of diagnostic quality. Multiple renal arteries were identified in 9/41 (22%) native renal RDSA diagnostic images. The mean volume of contrast medium in the RDSA runs was 51.2 ml and 50 ml for native and transplant renal studies, respectively. The mean effective dose for 120 degrees native renal RDSA was 2.36 mSv, equivalent to 1 year's mean background radiation. Those RDSA images that were non-diagnostic allowed accurate prediction of the optimal angle for further static angiographic series, which is of great value in transplant renal vessels.

  14. Neighborhood binary speckle pattern for deformation measurements insensitive to local illumination variation by digital image correlation.

    PubMed

    Zhao, Jian; Yang, Ping; Zhao, Yue

    2017-06-01

    The speckle-pattern-based nature of digital image correlation (DIC) restricts its application in engineering fields and nonlaboratory environments, since serious decorrelation occurs under localized, sudden illumination variation. A simple and efficient speckle pattern adjusting and optimizing approach presented in this paper is aimed at providing a novel speckle pattern robust enough to resist local illumination variation. The new speckle pattern, called the neighborhood binary speckle pattern and derived from the original speckle pattern, is obtained by thresholding the pixels of a neighborhood at its central pixel value and considering the result as a binary number. The efficiency of the proposed speckle pattern is evaluated in six experimental scenarios. Experimental results indicate that DIC measurements based on the neighborhood binary speckle pattern provide reliable and accurate results, even when the local brightness and contrast of the deformed images have been seriously changed. It is expected that the new speckle pattern will have considerable potential value in engineering applications.
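
    The neighborhood-binary transform described here is closely related to a local binary pattern; the sketch below is a minimal NumPy version that thresholds the eight neighbors of each pixel at the central value and packs the comparisons into one 8-bit code (border pixels wrap around in this simplified form).

```python
import numpy as np

def neighborhood_binary_pattern(image):
    """8-bit code per pixel: each neighbor contributes one bit set to 1 if the
    neighbor is at least as bright as the central pixel. Because only the sign
    of local intensity differences is kept, the code is largely insensitive to
    local brightness and contrast changes."""
    img = image.astype(float)
    code = np.zeros(img.shape, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = np.roll(np.roll(img, dy, axis=0), dx, axis=1)  # wraps at borders
        code |= (neighbor >= img).astype(np.uint8) << bit
    return code
```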

  15. An adaptive block-based fusion method with LUE-SSIM for multi-focus images

    NASA Astrophysics Data System (ADS)

    Zheng, Jianing; Guo, Yongcai; Huang, Yukun

    2016-09-01

    Because of the lenses' limited depth of field, digital cameras are incapable of acquiring an all-in-focus image of objects at varying distances in a scene. Multi-focus image fusion techniques can effectively solve this problem. Block-based multi-focus image fusion methods, however, often suffer from blocking artifacts. An adaptive block-based fusion method based on lifting undistorted-edge structural similarity (LUE-SSIM) is put forward. In this method, the image quality metric LUE-SSIM is first proposed, which utilizes the characteristics of the human visual system (HVS) and structural similarity (SSIM) to make the metric consistent with human visual perception. A particle swarm optimization (PSO) algorithm that selects LUE-SSIM as the objective function is used to optimize the block size for constructing the fused image. Experimental results on the LIVE image database show that LUE-SSIM outperforms SSIM in quality assessment of Gaussian defocus blur images. In addition, a multi-focus image fusion experiment is carried out to verify the proposed image fusion method in terms of visual and quantitative evaluation. The results show that the proposed method performs better than some other block-based methods, especially in reducing the blocking artifacts of the fused image, and that it can effectively preserve the undistorted-edge details in the focus regions of the source images.
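
    The LUE-SSIM metric and the PSO block-size search are specific to the paper; as a stand-in, the sketch below shows the generic block-selection skeleton that block-based fusion methods share, using plain local variance as the focus measure (an assumption, not the authors' criterion).

```python
import numpy as np

def block_fusion(img_a, img_b, block=16):
    """Simplified block-based multi-focus fusion: for each block, keep the
    source block with the higher local variance (a crude focus measure)."""
    fused = img_a.astype(float).copy()
    height, width = img_a.shape[:2]
    for y in range(0, height, block):
        for x in range(0, width, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            if b.var() > a.var():                  # b is sharper here
                fused[y:y + block, x:x + block] = b
    return fused
```

    Fixing the block size this way is exactly what tends to produce blocking artifacts; the paper's contribution is to let PSO choose the block size that maximizes LUE-SSIM instead.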

  16. Is the linear modeling technique good enough for optimal form design? A comparison of quantitative analysis models.

    PubMed

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.

  17. Is the Linear Modeling Technique Good Enough for Optimal Form Design? A Comparison of Quantitative Analysis Models

    PubMed Central

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961

  18. Towards a Visual Quality Metric for Digital Video

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1998-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  19. Automated Assessment of Visual Quality of Digital Video

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Ellis, Stephen R. (Technical Monitor)

    1997-01-01

    The advent of widespread distribution of digital video creates a need for automated methods for evaluating visual quality of digital video. This is particularly so since most digital video is compressed using lossy methods, which involve the controlled introduction of potentially visible artifacts. Compounding the problem is the bursty nature of digital video, which requires adaptive bit allocation based on visual quality metrics. In previous work, we have developed visual quality metrics for evaluating, controlling, and optimizing the quality of compressed still images[1-4]. These metrics incorporate simplified models of human visual sensitivity to spatial and chromatic visual signals. The challenge of video quality metrics is to extend these simplified models to temporal signals as well. In this presentation I will discuss a number of the issues that must be resolved in the design of effective video quality metrics. Among these are spatial, temporal, and chromatic sensitivity and their interactions, visual masking, and implementation complexity. I will also touch on the question of how to evaluate the performance of these metrics.

  20. Digital processing of radiographic images from PACS to publishing.

    PubMed

    Christian, M E; Davidson, H C; Wiggins, R H; Berges, G; Cannon, G; Jackson, G; Chapman, B; Harnsberger, H R

    2001-03-01

    Several studies have addressed the implications of filmless radiologic imaging on telemedicine, diagnostic ability, and electronic teaching files. However, many publishers still require authors to submit hard-copy images for publication of articles and textbooks. This study compares the quality of digital images directly exported from picture archiving and communication systems (PACS) with that of images digitized from radiographic film. The authors evaluated the quality of publication-grade glossy photographs produced from digital radiographic images using 3 different methods: (1) film images digitized using a desktop scanner and then printed, (2) digital images obtained directly from PACS then printed, and (3) digital images obtained from PACS and processed to improve sharpness prior to printing. Twenty images were printed using each of the 3 different methods and rated for quality by 7 radiologists. The results were analyzed for statistically significant differences among the image sets. Subjective evaluations of the filmless images found them to be of equal or better quality than the digitized images. Direct electronic transfer of PACS images reduces the number of steps involved in creating publication-quality images as well as providing the means to produce high-quality radiographic images in a digital environment.

  1. Digital Camera with Apparatus for Authentication of Images Produced from an Image File

    NASA Technical Reports Server (NTRS)

    Friedman, Gary L. (Inventor)

    1996-01-01

    A digital camera equipped with a processor for authentication of images produced from an image file taken by the digital camera is provided. The digital camera processor has embedded therein a private key unique to it, and the camera housing has a public key that is so uniquely related to the private key that digital data encrypted with the private key may be decrypted using the public key. The digital camera processor comprises means for calculating a hash of the image file using a predetermined algorithm, and second means for encrypting the image hash with the private key, thereby producing a digital signature. The image file and the digital signature are stored in suitable recording means so they will be available together. Apparatus for authenticating the image file as being free of any alteration uses the public key for decrypting the digital signature, thereby deriving a secure image hash identical to the image hash produced by the digital camera and used to produce the digital signature. The authenticating apparatus calculates from the image file an image hash using the same algorithm as before. By comparing this last image hash with the secure image hash, authenticity of the image file is determined if they match. Other techniques to address time-honored methods of deception, such as attaching false captions or inducing forced perspectives, are included.

  2. 21 CFR 892.2030 - Medical image digitizer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Medical image digitizer. 892.2030 Section 892.2030...) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.2030 Medical image digitizer. (a) Identification. A medical image digitizer is a device intended to convert an analog medical image into a digital...

  3. 21 CFR 892.2030 - Medical image digitizer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Medical image digitizer. 892.2030 Section 892.2030...) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.2030 Medical image digitizer. (a) Identification. A medical image digitizer is a device intended to convert an analog medical image into a digital...

  4. 21 CFR 892.2030 - Medical image digitizer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Medical image digitizer. 892.2030 Section 892.2030...) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.2030 Medical image digitizer. (a) Identification. A medical image digitizer is a device intended to convert an analog medical image into a digital...

  5. 21 CFR 892.2030 - Medical image digitizer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Medical image digitizer. 892.2030 Section 892.2030...) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.2030 Medical image digitizer. (a) Identification. A medical image digitizer is a device intended to convert an analog medical image into a digital...

  6. [Digital pathology: The time has come!]

    PubMed

    Grobholz, R

    2018-05-01

    Digital pathology (DP) and whole-slide imaging (WSI) technology have matured substantially over the last few years. Meanwhile, commercial systems are available that can be used in routine practice. Illustration of DP experiences in a routine diagnostic setting. A DP system offers several advantages: 1) glass slides are no longer unique; 2) access to cases is possible from any location; 3) digital image analysis can be applied; and 4) archived WSI can be easily accessed. From this point, several secondary advantages arise: a) the slide compilation of the case and the case assignment is fast and safe; b) carrying cases to the pathologist is obsolete and paperless work is possible; c) WSI can be used for a second opinion and be accessible in remote locations; d) WSI of referred cases are still accessible after returning the slides; e) histological images can easily be provided in tumor boards; f) the office desk is clean; and g) a "home office" is possible. To introduce a DP system, a comprehensive workflow analysis is needed that clarifies the needs and wishes of the respective institute. In order to optimally meet the requirements, open DP platforms are of particular advantage, because they enable the integration of scanners from various manufacturers. Further developments in image analysis, such as virtual tissue reconstruction, could enrich the diagnostic process in the future and improve treatment quality.

  7. Binarization of Gray-Scaled Digital Images Via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve; Voska, Ned (Technical Monitor)

    2002-01-01

    A new fast-computational technique based on fuzzy entropy measure has been developed to find an optimal binary image threshold. In this method, the image pixel membership functions are dependent on the threshold value and reflect the distribution of pixel values in two classes; thus, this technique minimizes the classification error. This new method is compared with two of the best-known threshold selection techniques, Otsu and Huang-Wang. The performance of the proposed method supersedes the performance of Huang-Wang and Otsu methods when the image consists of textured background and poor printing quality. The three methods perform well but yield different binarization approaches if the background and foreground of the image have well-separated gray-level ranges.

  8. Binarization of Gray-Scaled Digital Images Via Fuzzy Reasoning

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve; Voska, Ned (Technical Monitor)

    2002-01-01

    A new fast-computational technique based on fuzzy entropy measure has been developed to find an optimal binary image threshold. In this method, the image pixel membership functions are dependent on the threshold value and reflect the distribution of pixel values in two classes; thus, this technique minimizes the classification error. This new method is compared with two of the best-known threshold selection techniques, Otsu and Huang-Wang. The performance of the proposed method supersedes the performance of Huang-Wang and Otsu methods when the image consists of textured background and poor printing quality. The three methods perform well but yield different binarization approaches if the background and foreground of the image have well-separated gray-level ranges.
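
    For comparison with the fuzzy-entropy method described in the two records above, the sketch below implements a Huang-Wang-style threshold selection that minimizes a fuzziness measure over candidate thresholds; the membership function and constants are illustrative and do not reproduce the NASA method itself.

```python
import numpy as np

def fuzzy_threshold(gray, bins=256):
    """Pick the threshold that minimizes total image fuzziness, with each
    pixel's membership in its own class falling off with distance from the
    class mean (Huang-Wang-style criterion, for 8-bit gray-level images)."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, bins))
    levels = np.arange(bins)
    c = float(levels[-1] - levels[0])              # normalization constant
    best_t, best_fuzziness = None, np.inf
    for t in range(1, bins - 1):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue                               # one class empty: skip
        mu0 = (hist[:t] * levels[:t]).sum() / w0   # background mean
        mu1 = (hist[t:] * levels[t:]).sum() / w1   # foreground mean
        mu = np.where(levels < t, mu0, mu1)
        u = 1.0 / (1.0 + np.abs(levels - mu) / c)  # membership in own class
        u = np.clip(u, 1e-10, 1.0 - 1e-10)
        entropy = -(u * np.log(u) + (1.0 - u) * np.log(1.0 - u))
        fuzziness = float((hist * entropy).sum())
        if fuzziness < best_fuzziness:
            best_t, best_fuzziness = t, fuzziness
    return best_t
```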

  9. Optical Fourier diffractometry applied to degraded bone structure recognition

    NASA Astrophysics Data System (ADS)

    Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej

    1993-09-01

    Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones affected by metabolic bone diseases. The trabecular bone structure, recorded by x-ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. The synthetic image descriptors in discriminant space, constructed by discriminant analysis on the basis of three training groups of images (control, osteoporosis, and osteomalacia groups), allow bone samples with degraded bone structure to be recognized and the disease to be identified. About 89% of the images were classified correctly. After an optimization process, this method will be verified in medical investigations.

  10. Point Cloud Generation from Aerial Image Data Acquired by a Quadrocopter Type Micro Unmanned Aerial Vehicle and a Digital Still Camera

    PubMed Central

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on Bae Systems’ SOCET SET classical commercial photogrammetric software and another is built using Microsoft®’s Photosynth™ service available in the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479

  11. Point cloud generation from aerial image data acquired by a quadrocopter type micro unmanned aerial vehicle and a digital still camera.

    PubMed

    Rosnell, Tomi; Honkavaara, Eija

    2012-01-01

    The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on Bae Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft(®)'s Photosynth™ service available in the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation.

  12. Barriers and facilitators to adoption of soft copy interpretation from the user perspective: Lessons learned from filmless radiology for slideless pathology

    PubMed Central

    Patterson, Emily S.; Rayo, Mike; Gill, Carolina; Gurcan, Metin N.

    2011-01-01

    Background: Adoption of digital images for pathological specimens has been slower than adoption of digital images in radiology, despite a number of anticipated advantages for digital images in pathology. In this paper, we explore the factors that might explain this slower rate of adoption. Materials and Method: Semi-structured interviews on barriers and facilitators to the adoption of digital images were conducted with two radiologists, three pathologists, and one pathologist's assistant. Results: Barriers and facilitators to adoption of digital images were reported in the areas of performance, workflow-efficiency, infrastructure, integration with other software, and exposure to digital images. The primary difference between the settings was that performance with the use of digital images as compared to the traditional method was perceived to be higher in radiology and lower in pathology. Additionally, exposure to digital images was higher in radiology than pathology, with some radiologists exclusively having been trained and/or practicing with digital images. The integration of digital images both improved and reduced efficiency in routine and non-routine workflow patterns in both settings, and was variable across the different organizations. A comparison of these findings with prior research on adoption of other health information technologies suggests that the barriers to adoption of digital images in pathology are relatively tractable. Conclusions: Improving performance using digital images in pathology would likely accelerate adoption of innovative technologies that are facilitated by the use of digital images, such as electronic imaging databases, electronic health records, double reading for challenging cases, and computer-aided diagnostic systems. PMID:21383925

  13. Novel computer-based endoscopic camera

    NASA Astrophysics Data System (ADS)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed, glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to external host media via a network. The patient data included with every image describe essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for the stored image/data by typing any image descriptor. The camera optics have an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor so that the complete field of view of the endoscope can fill the entire screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  14. Selection of optimal spectral sensitivity functions for color filter arrays.

    PubMed

    Parmar, Manu; Reeves, Stanley J

    2010-12-01

    A color image meant for human consumption can be appropriately displayed only if at least three distinct color channels are present. Typical digital cameras acquire three-color images with only one sensor. A color filter array (CFA) is placed on the sensor such that only one color is sampled at a particular spatial location. This sparsely sampled signal is then reconstructed to form a color image with information about all three colors at each location. In this paper, we show that the wavelength sensitivity functions of the CFA color filters affect both the color reproduction ability and the spatial reconstruction quality of recovered images. We present a method to select perceptually optimal color filter sensitivity functions based upon a unified spatial-chromatic sampling framework. A cost function independent of particular scenes is defined that expresses the error between a scene viewed by the human visual system and the reconstructed image that represents the scene. A constrained minimization of the cost function is used to obtain optimal values of color-filter sensitivity functions for several periodic CFAs. The sensitivity functions are shown to perform better than typical RGB and CMY color filters in terms of both the s-CIELAB ∆E error metric and a qualitative assessment.

  15. Image manipulation: Fraudulence in digital dental records: Study and review

    PubMed Central

    Chowdhry, Aman; Sircar, Keya; Popli, Deepika Bablani; Tandon, Ankita

    2014-01-01

    Introduction: Nowadays, freely available software allows dentists to tweak their digital records as never before. However, there is a fine line between acceptable enhancement and scientific delinquency. Aims and Objective: To manipulate digital images (used in forensic dentistry) of casts, lip prints, and bite marks in order to highlight tampering techniques and methods of detecting and preventing manipulation of digital images. Materials and Methods: Digital image records of forensic data (casts, lip prints, and bite marks photographed using a Samsung Techwin L77 digital camera) were manipulated using freely available software. Results: Fake digital images can be created either by merging two or more digital images, or by altering an existing image. Discussion and Conclusion: Retouched digital images can be used for fraudulent purposes in forensic investigations. However, tools are available to detect such digital frauds, which are extremely difficult to assess visually. Thus, all digital content should mandatorily have attached metadata, and preferably watermarking, in order to avert its malicious re-use. Also, computer literacy, especially regarding imaging software, should be promoted among forensic odontologists/dental professionals. PMID:24696587

  16. A Web-based telemedicine system for diabetic retinopathy screening using digital fundus photography.

    PubMed

    Wei, Jack C; Valentino, Daniel J; Bell, Douglas S; Baker, Richard S

    2006-02-01

    The purpose was to design and implement a Web-based telemedicine system for diabetic retinopathy screening using digital fundus cameras and to make the software publicly available through Open Source release. The process of retinal imaging and case reviewing was modeled to optimize workflow and to implement the use of the computer system. The Web-based system was built on Java Servlet and Java Server Pages (JSP) technologies. Apache Tomcat was chosen as the JSP engine, while MySQL was used as the main database and the Laboratory of Neuro Imaging (LONI) Image Storage Architecture, from LONI at UCLA, as the platform for image storage. For security, all data transmissions were carried over encrypted Internet connections such as Secure Socket Layer (SSL) and HyperText Transfer Protocol over SSL (HTTPS). User logins were required and access to patient data was logged for auditing. The system was deployed at Hubert H. Humphrey Comprehensive Health Center and Martin Luther King/Drew Medical Center of the Los Angeles County Department of Health Services. Within 4 months, 1500 images of more than 650 patients were taken at Humphrey's Eye Clinic and successfully transferred to King/Drew's Department of Ophthalmology. This study demonstrates an effective architecture for remote diabetic retinopathy screening.

  17. Portable lensless wide-field microscopy imaging platform based on digital inline holography and multi-frame pixel super-resolution

    PubMed Central

    Sobieranski, Antonio C; Inci, Fatih; Tekin, H Cumhur; Yuksekkaya, Mehmet; Comunello, Eros; Cobra, Daniel; von Wangenheim, Aldo; Demirci, Utkan

    2017-01-01

    In this paper, an irregular displacement-based lensless wide-field microscopy imaging platform is presented, combining digital in-line holography and computational pixel super-resolution using multi-frame processing. The samples are illuminated by a nearly coherent illumination system, and the hologram shadows are projected onto a complementary metal-oxide semiconductor-based imaging sensor. To increase the resolution, a multi-frame pixel super-resolution approach is employed to produce a single holographic image from multiple frame observations of the scene, with small planar displacements. Displacements are resolved by a hybrid approach: (i) alignment of the LR images by a fast feature-based registration method, and (ii) fine adjustment of the sub-pixel information using a continuous optimization approach designed to find the global optimum solution. A numerical phase-retrieval method is applied to decode the signal and reconstruct the morphological details of the analyzed sample. The presented approach was evaluated with various biological samples, including sperm and platelets, whose dimensions are on the order of a few microns. The obtained results demonstrate a spatial resolution of 1.55 µm on a field-of-view of ≈30 mm². PMID:29657866

  18. Digital image transformation and rectification of spacecraft and radar images

    USGS Publications Warehouse

    Wu, S.S.C.

    1985-01-01

    Digital image transformation and rectification can be described in three categories: (1) digital rectification of spacecraft pictures on workable stereoplotters; (2) digital correction of radar image geometry; and (3) digital reconstruction of shaded relief maps and perspective views including stereograms. Digital rectification can make high-oblique pictures workable on stereoplotters that would otherwise not accommodate such extreme tilt angles. It also enables panoramic line-scan geometry to be used to compile contour maps with photogrammetric plotters. Rectifications were digitally processed on both Viking Orbiter and Lander pictures of Mars as well as radar images taken by various radar systems. By merging digital terrain data with image data, perspective and three-dimensional views of Olympus Mons and Tithonium Chasma, also of Mars, are reconstructed through digital image processing. © 1985.

  19. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    NASA Astrophysics Data System (ADS)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the tested approaches for designing such systems, object-based image analysis (OBIA) has stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object-based accuracy assessment is conducted by quantitatively comparing extracted landslide objects with landslide polygons that were visually interpreted by local experts. The applicability and transferability of the mapping system are evaluated by comparing initial accuracies with those achieved for the following two tests: first, usage of a SPOT image from the same year, but for a different area within the Baichi catchment; second, usage of SPOT images from multiple years for the same region. The integration of the common knowledge via digital landslide signatures is new in object-based landslide studies. In combination with strategies to optimize image segmentation this may lead to a more objective, transferable and stable knowledge-based system for the mapping of landslides from optical satellite data and DEMs.

  20. Digital radiographic imaging: is the dental practice ready?

    PubMed

    Parks, Edwin T

    2008-04-01

    Digital radiographic imaging is slowly, but surely, replacing film-based imaging. It has many advantages over traditional imaging, but the technology also has some drawbacks. The author presents an overview of the types of digital image receptors available, image enhancement software and the range of costs for the new technology. PRACTICE IMPLICATIONS. The expenses associated with converting to digital radiographic imaging are considerable. The purpose of this article is to provide the clinician with an overview of digital radiographic imaging technology so that he or she can be an informed consumer when evaluating the numerous digital systems in the marketplace.

  1. WE-FG-207B-05: Iterative Reconstruction Via Prior Image Constrained Total Generalized Variation for Spectral CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niu, S; Zhang, Y; Ma, J

    Purpose: To investigate iterative reconstruction via prior image constrained total generalized variation (PICTGV) for spectral computed tomography (CT) using fewer projections while achieving greater image quality. Methods: The proposed PICTGV method is formulated as an optimization problem, which balances the data fidelity and the prior image constrained total generalized variation of the reconstructed images in one framework. The PICTGV method is based on structure correlations among images in the energy domain and uses high-quality images to guide the reconstruction of energy-specific images. In the PICTGV method, the high-quality image is reconstructed from all detector-collected X-ray signals and is referred to as the broad-spectrum image. Distinct from existing reconstruction methods applied to images with a first-order derivative, a higher-order derivative of the images is incorporated into the PICTGV method. An alternating optimization algorithm is used to minimize the PICTGV objective function. We evaluate the performance of PICTGV on noise and artifact suppression using phantom studies and compare the method with the conventional filtered back-projection method as well as a TGV-based method without a prior image. Results: On the digital phantom, the proposed method outperforms the existing TGV method in terms of noise reduction, artifact suppression, and edge detail preservation. Compared to that obtained by the TGV-based method without a prior image, the relative root mean square error in the images reconstructed by the proposed method is reduced by over 20%. Conclusion: The authors propose an iterative reconstruction via prior image constrained total generalized variation for spectral CT. We have also developed an alternating optimization algorithm and numerically demonstrated the merits of our approach. Results show that the proposed PICTGV method outperforms the TGV method for spectral CT.
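
    The abstract states the objective only in words. A generic prior-image-constrained TGV functional of the kind described can be written as below; the notation and weights are placeholders for illustration, not the authors' formulation. Here A is the system matrix, y the energy-specific projection data, x_prior the broad-spectrum image, and E(v) the symmetrized gradient of the auxiliary field v.

    ```latex
    \min_{x}\;
      \tfrac{1}{2}\,\lVert A x - y \rVert_{2}^{2}
      \;+\; \lambda_{1}\,\operatorname{TGV}^{2}_{\alpha}(x)
      \;+\; \lambda_{2}\,\operatorname{TGV}^{2}_{\alpha}\!\left(x - x_{\mathrm{prior}}\right),
    \qquad
    \operatorname{TGV}^{2}_{\alpha}(u) \;=\;
      \min_{v}\; \alpha_{1}\,\lVert \nabla u - v \rVert_{1}
      \;+\; \alpha_{0}\,\lVert \mathcal{E}(v) \rVert_{1}
    ```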

  2. Lunar Terrain and Albedo Reconstruction from Apollo Imagery

    NASA Technical Reports Server (NTRS)

    Nefian, Ara V.; Kim, Taemin; Broxton, Michael; Moratto, Zach

    2010-01-01

    Generating accurate three-dimensional planetary models and albedo maps is becoming increasingly important as NASA plans more robotic missions to the Moon in the coming years. This paper describes a novel approach for separation of topography and albedo maps from orbital Lunar images. Our method uses an optimal Bayesian correlator to refine the stereo disparity map and generate a set of accurate digital elevation models (DEM). The albedo maps are obtained using a multi-image formation model that relies on the derived DEMs and the Lunar-Lambert reflectance model. The method is demonstrated on a set of high-resolution scanned images from the Apollo-era missions.

  3. CAD - CAM Procedures Used for Rapid Prototyping of Prosthetic Hip Joint Bone

    NASA Astrophysics Data System (ADS)

    Popa, Luminita I.; Popa, Vasile N.

    2016-11-01

    The article addresses rapid prototyping CAD/CAM procedures, based on CT imaging, for custom implants dedicated to hip arthroplasty, together with a correlation study between the femoral canal shape, assessed by modern imaging methods, and the prosthesis form. A set of CT images is transformed into a digital model using one of several software packages available for conversion. The purpose of the research is to obtain a prosthesis with characteristics as close as possible to the physiological ones and with an optimal fit of the prosthesis to the bone in which it is implanted, allowing the physical, mental and social recovery of the patient.

  4. Use of digital micromirror devices as dynamic pinhole arrays for adaptive confocal fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Pozzi, Paolo; Wilding, Dean; Soloviev, Oleg; Vdovin, Gleb; Verhaegen, Michel

    2018-02-01

    In this work, we present a new confocal laser scanning microscope capable of performing sensorless wavefront optimization in real time. The device is a parallelized laser scanning microscope in which the excitation light is structured into a lattice of spots by a spatial light modulator, while a deformable mirror provides aberration correction and scanning. A binary DMD is positioned in an image plane of the detection optical path, acting as a dynamic array of reflective confocal pinholes that is imaged by a high-performance CMOS camera. A second camera records the light rejected by the pinholes for sensorless aberration correction.

  5. Design and validation of a mathematical breast phantom for contrast-enhanced digital mammography

    NASA Astrophysics Data System (ADS)

    Hill, Melissa L.; Mainprize, James G.; Jong, Roberta A.; Yaffe, Martin J.

    2011-03-01

    In contrast-enhanced digital mammography (CEDM) an iodinated contrast agent is employed to increase lesion contrast and to provide tissue functional information. Here, we present the details of a software phantom that can be used as a tool for the simulation of CEDM images, and compare the degree of anatomic noise present in images simulated using the phantom to that associated with breast parenchyma in clinical CEDM images. Such a phantom could be useful for multiparametric investigations including characterization of CEDM imaging performance and system optimization. The phantom has a realistic mammographic appearance based on a clustered lumpy background and models contrast agent uptake according to breast tissue physiology. Fifty unique phantoms were generated and used to simulate regions of interest (ROI) of pre-contrast images and logarithmically subtracted CEDM images using monoenergetic ray tracing. Power law exponents, β, were used as a measure of anatomic noise and were determined using a linear least-squares fit to log-log plots of the square of the modulus of radially averaged image power spectra versus spatial frequency. The power spectra for ROI selected from regions of normal parenchyma in 10 pairs of clinical CEDM pre-contrast and subtracted images were also measured for comparison with the simulated images. There was good agreement between the measured β in the simulated CEDM images and the clinical images. The values of β were consistently lower for the logarithmically subtracted CEDM images compared to the pre-contrast images, indicating that the subtraction process reduced anatomical noise.
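
    As a rough illustration of the anatomic-noise metric described above, the sketch below radially averages the power spectrum of an ROI and fits the exponent β by linear least squares in log-log coordinates. The binning, frequency range and absence of windowing are assumptions of this sketch, not details taken from the study.

    ```python
    import numpy as np

    def power_law_exponent(roi, pixel_pitch_mm=0.1, nbins=50):
        """Fit P(f) ~ 1/f**beta to the radially averaged power spectrum of a 2-D ROI."""
        roi = roi - roi.mean()
        ps = np.abs(np.fft.fftshift(np.fft.fft2(roi))) ** 2          # 2-D power spectrum

        ny, nx = roi.shape
        fy = np.fft.fftshift(np.fft.fftfreq(ny, d=pixel_pitch_mm))
        fx = np.fft.fftshift(np.fft.fftfreq(nx, d=pixel_pitch_mm))
        radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))       # radial frequency (cycles/mm)

        # radial average over concentric frequency bins
        edges = np.linspace(radius.min(), radius.max(), nbins + 1)
        which = np.digitize(radius.ravel(), edges)
        radial = np.array([ps.ravel()[which == b].mean() if np.any(which == b) else np.nan
                           for b in range(1, nbins)])
        freq = 0.5 * (edges[:-2] + edges[1:-1])                      # bin centers

        # fit log P = c - beta * log f over an illustrative mid-frequency range
        keep = (freq > 0.05) & (freq < 2.0) & np.isfinite(radial) & (radial > 0)
        slope, _ = np.polyfit(np.log(freq[keep]), np.log(radial[keep]), 1)
        return -slope
    ```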

  6. Machine vision system for inspecting characteristics of hybrid rice seed

    NASA Astrophysics Data System (ADS)

    Cheng, Fang; Ying, Yibin

    2004-03-01

    Obtaining clear images, which improves classification accuracy, involves many factors; light source, lens extender and background are discussed in this paper. The analysis of rice seed reflectance curves showed that the optimal light source wavelength for discriminating diseased seeds from normal rice seeds in the monochromatic image recognition mode was about 815 nm for jinyou402 and shanyou10. To determine the optimal conditions for acquiring digital images of rice seed with a computer vision system, an adjustable color machine vision system was developed. With a 20 mm to 25 mm lens extender, the machine vision system produces close-up images that ease the recognition of characteristics in hybrid rice seeds. A white background proved better than a black background for inspecting rice seeds infected by disease and for shape-based algorithms. Experimental results indicated good classification for most of the characteristics with the machine vision system, and the same algorithms yielded better results under the optimized conditions for quality inspection of rice seed. In particular, the system can resolve fine details such as small fissures.

  7. Single-snapshot 2D color measurement by plenoptic imaging system

    NASA Astrophysics Data System (ADS)

    Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana

    2014-03-01

    Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high-color-fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision with this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves color accuracy of ΔE<0.01.

  8. Cost-effective forensic image enhancement

    NASA Astrophysics Data System (ADS)

    Dalrymple, Brian E.

    1998-12-01

    In 1977, a paper was presented at the SPIE conference in Reston, Virginia, detailing the computer enhancement of the Zapruder film. The forensic value of this examination in a major homicide investigation was apparent to the viewer. Equally clear was the potential for extracting evidence which is beyond the reach of conventional detection techniques. The cost of this technology in 1976, however, was prohibitive, and well beyond the means of most police agencies. Twenty-two years later, a highly efficient means of image enhancement is easily within the grasp of most police agencies, not only for homicides but for any case application. A PC workstation combined with an enhancement software package allows a forensic investigator to fully exploit digital technology. The goal of this approach is the optimization of the signal to noise ratio in images. Obstructive backgrounds may be diminished or eliminated while weak signals are optimized by the use of algorithms including Fast Fourier Transform, Histogram Equalization and Image Subtraction. An added benefit is the speed with which these processes are completed and the results known. The efficacy of forensic image enhancement is illustrated through case applications.
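
    The processing steps named above can be approximated in a few lines; the sketch below shows a crude FFT notch filter for periodic backgrounds and a global histogram equalization for 8-bit images. The thresholds and notch sizes are illustrative assumptions, not values from the original casework.

    ```python
    import numpy as np

    def suppress_periodic_background(img, notch_radius=3, threshold=99.5):
        """Attenuate strong periodic background (e.g., fabric weave) with an FFT notch filter."""
        F = np.fft.fftshift(np.fft.fft2(img))
        mag = np.abs(F)
        cy, cx = np.array(F.shape) // 2
        peaks = mag > np.percentile(mag, threshold)        # unusually bright spectral peaks
        peaks[cy - 10:cy + 10, cx - 10:cx + 10] = False    # keep the low-frequency core
        for y, x in zip(*np.nonzero(peaks)):
            F[max(0, y - notch_radius):y + notch_radius,
              max(0, x - notch_radius):x + notch_radius] = 0
        return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

    def equalize_histogram(img, nbins=256):
        """Classic global histogram equalization for an 8-bit grayscale image."""
        hist, edges = np.histogram(img.ravel(), bins=nbins, range=(0, 255))
        cdf = hist.cumsum() / hist.sum()
        return np.interp(img.ravel(), edges[:-1], cdf * 255).reshape(img.shape)
    ```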

  9. Combination of image descriptors for the exploration of cultural photographic collections

    NASA Astrophysics Data System (ADS)

    Bhowmik, Neelanjan; Gouet-Brunet, Valérie; Bloch, Gabriel; Besson, Sylvain

    2017-01-01

    The rapid growth of image digitization and collections in recent years makes it challenging and burdensome to organize, categorize, and retrieve similar images from voluminous collections. Content-based image retrieval (CBIR) is immensely convenient in this context. A considerable number of local feature detectors and descriptors are present in the CBIR literature. We propose a model to anticipate the best feature combinations for image retrieval-related applications. Several spatial complementarity criteria of local feature detectors are analyzed and then used in a regression framework to find the combination of detectors that is optimal for a given dataset and better adapted to each given image; the proposed model is also useful for optimally fixing some other parameters, such as the k in k-nearest neighbor retrieval. Three public datasets of various contents and sizes are employed to evaluate the proposal, which is validated by notable improvements in retrieval quality over classical approaches. Finally, the proposed image search engine is applied to the cultural photographic collections of a French museum, where it demonstrates its added value for the exploration and promotion of these contents at different levels, from their archiving up to their exhibition in or ex situ.

  10. 49 CFR 384.227 - Record of digital image or photograph.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 5 2014-10-01 2014-10-01 false Record of digital image or photograph. 384.227... § 384.227 Record of digital image or photograph. The State must: (a) Record the digital color image or.... The digital color image or photograph or black and white laser engraved photograph must either be made...

  11. 49 CFR 384.227 - Record of digital image or photograph.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 5 2013-10-01 2013-10-01 false Record of digital image or photograph. 384.227... § 384.227 Record of digital image or photograph. The State must: (a) Record the digital color image or.... The digital color image or photograph or black and white laser engraved photograph must either be made...

  12. 49 CFR 384.227 - Record of digital image or photograph.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 5 2011-10-01 2011-10-01 false Record of digital image or photograph. 384.227... § 384.227 Record of digital image or photograph. The State must: (a) Record the digital color image or.... The digital color image or photograph or black and white laser engraved photograph must either be made...

  13. 49 CFR 384.227 - Record of digital image or photograph.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 5 2012-10-01 2012-10-01 false Record of digital image or photograph. 384.227... § 384.227 Record of digital image or photograph. The State must: (a) Record the digital color image or.... The digital color image or photograph or black and white laser engraved photograph must either be made...

  14. High-Speed Data Acquisition and Digital Signal Processing System for PET Imaging Techniques Applied to Mammography

    NASA Astrophysics Data System (ADS)

    Martinez, J. D.; Benlloch, J. M.; Cerda, J.; Lerche, Ch. W.; Pavon, N.; Sebastia, A.

    2004-06-01

    This paper is framed within the Positron Emission Mammography (PEM) project, whose aim is to develop an innovative gamma ray sensor for early breast cancer diagnosis. Currently, breast cancer is detected using low-energy X-ray screening. However, functional imaging techniques such as PET/FDG could be employed to detect breast cancer and track disease changes with greater sensitivity. Furthermore, a small and less expensive PET camera can be utilized, minimizing the main problems of whole-body PET. To accomplish these objectives, we are developing a new gamma ray sensor based on a newly released photodetector. However, a dedicated PEM detector requires an adequate data acquisition (DAQ) and processing system. The characterization of gamma events needs a free-running analog-to-digital converter (ADC) with sampling rates of more than 50 Ms/s and must achieve event count rates up to 10 MHz. Moreover, comprehensive data processing must be carried out to obtain the event parameters necessary for performing the image reconstruction. A new-generation digital signal processor (DSP) has been used to comply with these requirements. This device enables us to manage the DAQ system at up to 80 Ms/s and to execute intensive calculations on the detector signals. This paper describes our designed DAQ and processing architecture, whose main features are: very high-speed data conversion, multichannel synchronized acquisition with zero dead time, a digital triggering scheme, and high data throughput with an extensive optimization of the signal processing algorithms.

  15. Single Phase Dual-energy CT Angiography: One-stop-shop Tool for Evaluating Aneurysmal Subarachnoid Hemorrhage.

    PubMed

    Ni, Qian Qian; Tang, Chun Xiang; Zhao, Yan E; Zhou, Chang Sheng; Chen, Guo Zhong; Lu, Guang Ming; Zhang, Long Jiang

    2016-05-25

    Aneurysmal subarachnoid hemorrhage carries an extremely high case-fatality rate in clinical practice, so early and rapid identification of ruptured intracranial aneurysms is especially important. Here we evaluate the clinical value of single-phase contrast-enhanced dual-energy CT angiography (DE-CTA) as a one-stop-shop tool for detecting aneurysmal subarachnoid hemorrhage. One hundred and five patients who underwent true non-enhanced CT (TNCT), contrast-enhanced DE-CTA and digital subtraction angiography (DSA) were included. Image quality and detectability of intracranial hemorrhage were evaluated and compared between virtual non-enhanced CT (VNCT) images reconstructed from DE-CTA and TNCT. There was no statistical difference in image quality (P > 0.05) between VNCT and TNCT. The agreement of VNCT and TNCT in detecting intracranial hemorrhage reached 98.1% on a per-patient basis. With DSA as the reference standard, per-patient sensitivity and specificity for DE-CTA in intracranial aneurysm detection were 98.3% and 97.9%. The effective dose of DE-CTA was reduced by 75.0% compared to conventional digital subtraction CTA. Thus, single-phase contrast-enhanced DE-CTA is an optimal and reliable one-stop-shop tool for detecting intracranial hemorrhage with VNCT and intracranial aneurysms with DE-CTA, with a substantial radiation dose reduction compared with conventional digital subtraction CTA.

  16. Transcranial sonography of brainstem structures in panic disorder.

    PubMed

    Šilhán, Petr; Jelínková, Monika; Walter, Uwe; Pavlov Praško, Ján; Herzig, Roman; Langová, Kateřina; Školoudík, David

    2015-10-30

    Panic disorder has been associated with altered serotonin metabolism in the brainstem raphe. The aim of this study was to evaluate brainstem raphe (BR) echogenicity on transcranial sonography (TCS) in panic disorder. A total of 96 healthy volunteers were enrolled in the "derivation" cohort, and 26 healthy volunteers and 26 panic disorder patients were enrolled in the "validation" cohort. TCS echogenicity of the brainstem raphe and substantia nigra was assessed on anonymized images visually and by means of digitized image analysis. Significantly reduced brainstem raphe echogenicity was detected more frequently in panic disorder patients than in controls using both visual (68% vs. 31%) and digitized image analysis (52% vs. 12%). The optimal cut-off value of digitized brainstem raphe echogenicity indicated the diagnosis of panic disorder with a sensitivity of 64% and a specificity of 73%, and corresponded to the 30th percentile in the derivation cohort. Reduced brainstem raphe echogenicity was associated with shorter treatment duration and, by trend, lower severity of anxiety. No relationship was found between echogenicity of the brainstem raphe or substantia nigra and age, gender, severity of panic disorder, or severity of depression. Patients with panic disorder exhibit changes of the brainstem raphe on TCS, suggesting an alteration of the central serotonergic system. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  17. Single Phase Dual-energy CT Angiography: One-stop-shop Tool for Evaluating Aneurysmal Subarachnoid Hemorrhage

    PubMed Central

    Ni, Qian Qian; Tang, Chun Xiang; Zhao, Yan E; Zhou, Chang Sheng; Chen, Guo Zhong; Lu, Guang Ming; Zhang, Long Jiang

    2016-01-01

    Aneurysmal subarachnoid hemorrhage carries an extremely high case-fatality rate in clinical practice, so early and rapid identification of ruptured intracranial aneurysms is especially important. Here we evaluate the clinical value of single-phase contrast-enhanced dual-energy CT angiography (DE-CTA) as a one-stop-shop tool for detecting aneurysmal subarachnoid hemorrhage. One hundred and five patients who underwent true non-enhanced CT (TNCT), contrast-enhanced DE-CTA and digital subtraction angiography (DSA) were included. Image quality and detectability of intracranial hemorrhage were evaluated and compared between virtual non-enhanced CT (VNCT) images reconstructed from DE-CTA and TNCT. There was no statistical difference in image quality (P > 0.05) between VNCT and TNCT. The agreement of VNCT and TNCT in detecting intracranial hemorrhage reached 98.1% on a per-patient basis. With DSA as the reference standard, per-patient sensitivity and specificity for DE-CTA in intracranial aneurysm detection were 98.3% and 97.9%. The effective dose of DE-CTA was reduced by 75.0% compared to conventional digital subtraction CTA. Thus, single-phase contrast-enhanced DE-CTA is an optimal and reliable one-stop-shop tool for detecting intracranial hemorrhage with VNCT and intracranial aneurysms with DE-CTA, with a substantial radiation dose reduction compared with conventional digital subtraction CTA. PMID:27222163

  18. The AAPM/RSNA physics tutorial for residents: digital fluoroscopy.

    PubMed

    Pooley, R A; McKinney, J M; Miller, D A

    2001-01-01

    A digital fluoroscopy system is most commonly configured as a conventional fluoroscopy system (tube, table, image intensifier, video system) in which the analog video signal is converted to and stored as digital data. Other methods of acquiring the digital data (eg, digital or charge-coupled device video and flat-panel detectors) will become more prevalent in the future. Fundamental concepts related to digital imaging in general include binary numbers, pixels, and gray levels. Digital image data allow the convenient use of several image processing techniques including last image hold, gray-scale processing, temporal frame averaging, and edge enhancement. Real-time subtraction of digital fluoroscopic images after injection of contrast material has led to widespread use of digital subtraction angiography (DSA). Additional image processing techniques used with DSA include road mapping, image fade, mask pixel shift, frame summation, and vessel size measurement. Peripheral angiography performed with an automatic moving table allows imaging of the peripheral vasculature with a single contrast material injection.
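
    Two of the listed operations, temporal frame averaging and logarithmic mask subtraction with a pixel shift, are sketched below under simplifying assumptions (raw intensity frames, integer-pixel mask shift); clinical DSA systems implement these steps with considerably more sophistication.

    ```python
    import numpy as np

    def temporal_average(frames):
        """Average a stack of fluoroscopic frames (N, H, W) to reduce quantum noise."""
        return np.mean(np.asarray(frames, dtype=np.float64), axis=0)

    def dsa_subtract(mask, contrast, shift=(0, 0), eps=1.0):
        """Logarithmic subtraction of a (possibly pixel-shifted) mask from a contrast frame.

        The log difference is approximately proportional to the iodine path length,
        which is why DSA subtracts log-transformed rather than raw intensities."""
        mask = np.roll(np.asarray(mask, dtype=np.float64), shift, axis=(0, 1))
        contrast = np.asarray(contrast, dtype=np.float64)
        return np.log(contrast + eps) - np.log(mask + eps)
    ```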

  19. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Starting from the working principle of the digital calibration instrument for the optical axis parallelism of binocular photoelectric instruments, the various factors affecting system precision are analyzed across all components of the instrument, and a precision analysis model is established. Based on the error distributions, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change in the center coordinate of the circular target image. The method can further guide the error budget, help control the factors that have the greatest influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
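
    A generic Monte Carlo tolerance analysis of the kind described might look like the sketch below. The measurement model and the error distributions are purely illustrative assumptions, not the instrument's actual error budget.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def target_center_error(tilt_err_mrad, focal_err_mm, centroid_noise_px):
        """Toy model of how component errors displace the circular-target centroid (pixels)."""
        focal_mm, pixel_mm = 300.0, 0.005                      # assumed collimator focal length, pixel pitch
        dx = tilt_err_mrad * 1e-3 * focal_mm / pixel_mm + centroid_noise_px
        dy = focal_err_mm / focal_mm * 50.0 + centroid_noise_px
        return np.hypot(dx, dy)

    # draw each error source from its assumed distribution and propagate
    n = 100_000
    errors = target_center_error(
        tilt_err_mrad=rng.normal(0.0, 0.02, n),                # mounting tilt
        focal_err_mm=rng.uniform(-0.1, 0.1, n),                # focal-length tolerance
        centroid_noise_px=rng.normal(0.0, 0.3, n),             # centroiding noise
    )
    print(f"mean = {errors.mean():.3f} px, 95th percentile = {np.percentile(errors, 95):.3f} px")
    ```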

  20. Quality improvement in neonatal digital radiography: implementing the basic quality improvement tools.

    PubMed

    Eslamy, Hedieh K; Newman, Beverley; Weinberger, Ed

    2014-12-01

    A quality improvement (QI) program may be implemented using the plan-do-study-act cycle (as a model for making improvements) and the basic QI tools (used to visually display and analyze variation in data). Managing radiation dose has come to the forefront as a safety goal for radiology departments. This is especially true in the pediatric population, which is more radiosensitive than the adult population. In this article, we use neonatal digital radiography to discuss developing a QI program with the principle goals of decreasing the radiation dose, decreasing variation in radiation dose, and optimizing image quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Enhanced Axial Resolution of Wide-Field Two-Photon Excitation Microscopy by Line Scanning Using a Digital Micromirror Device.

    PubMed

    Park, Jong Kang; Rowlands, Christopher J; So, Peter T C

    2017-01-01

    Temporal focusing multiphoton microscopy is a technique for performing highly parallelized multiphoton microscopy while still maintaining depth discrimination. While the conventional wide-field configuration for temporal focusing suffers from sub-optimal axial resolution, line scanning temporal focusing, implemented here using a digital micromirror device (DMD), can provide substantial improvement. The DMD-based line scanning temporal focusing technique dynamically trades off the degree of parallelization, and hence imaging speed, for axial resolution, allowing performance parameters to be adapted to the experimental requirements. We demonstrate this new instrument in calibration specimens and in biological specimens, including a mouse kidney slice.

  2. Enhanced Axial Resolution of Wide-Field Two-Photon Excitation Microscopy by Line Scanning Using a Digital Micromirror Device

    PubMed Central

    Park, Jong Kang; Rowlands, Christopher J.; So, Peter T. C.

    2017-01-01

    Temporal focusing multiphoton microscopy is a technique for performing highly parallelized multiphoton microscopy while still maintaining depth discrimination. While the conventional wide-field configuration for temporal focusing suffers from sub-optimal axial resolution, line scanning temporal focusing, implemented here using a digital micromirror device (DMD), can provide substantial improvement. The DMD-based line scanning temporal focusing technique dynamically trades off the degree of parallelization, and hence imaging speed, for axial resolution, allowing performance parameters to be adapted to the experimental requirements. We demonstrate this new instrument in calibration specimens and in biological specimens, including a mouse kidney slice. PMID:29387484

  3. Practical algorithms for simulation and reconstruction of digital in-line holograms.

    PubMed

    Latychevskaia, Tatiana; Fink, Hans-Werner

    2015-03-20

    Here we present practical methods for simulation and reconstruction of in-line digital holograms recorded with plane and spherical waves. The algorithms described here are applicable to holographic imaging of an object exhibiting absorption as well as phase-shifting properties. Optimal parameters, related to distances, sampling rate, and other factors for successful simulation and reconstruction of holograms are evaluated and criteria for the achievable resolution are worked out. Moreover, we show that the numerical procedures for the reconstruction of holograms recorded with plane and spherical waves are identical under certain conditions. Experimental examples of holograms and their reconstructions are also discussed.
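
    For the plane-wave case, a generic angular-spectrum back-propagation routine is sketched below. It is not the authors' code, and the wavelength, pixel pitch and propagation distance are placeholders to be replaced by the actual recording parameters.

    ```python
    import numpy as np

    def reconstruct_plane_wave_hologram(hologram, wavelength, dx, z):
        """Back-propagate an in-line hologram recorded with plane-wave illumination
        to the object plane at distance z, using the angular spectrum method."""
        ny, nx = hologram.shape
        fx = np.fft.fftfreq(nx, d=dx)
        fy = np.fft.fftfreq(ny, d=dx)
        FX, FY = np.meshgrid(fx, fy)

        # propagation kernel; evanescent components are suppressed
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kernel = np.exp(1j * 2 * np.pi * z / wavelength * np.sqrt(np.maximum(arg, 0.0)))
        kernel[arg < 0] = 0.0

        # conjugate kernel = propagation back toward the source/object plane
        field = np.fft.ifft2(np.fft.fft2(hologram) * np.conj(kernel))
        return field   # complex field; np.abs/np.angle give amplitude and phase
    ```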

  4. Detecting Copy Move Forgery In Digital Images

    NASA Astrophysics Data System (ADS)

    Gupta, Ashima; Saxena, Nisheeth; Vasistha, S. K.

    2012-03-01

    Several image manipulation software packages are now widely available, and manipulation of digital images has become a serious problem. In many areas, such as medical imaging, digital forensics, journalism and scientific publication, image forgery can be carried out very easily, and determining whether a digital image is original or doctored, and locating the marks of tampering, is a significant challenge. Detection methods are therefore very useful in image forensics, where they can serve as evidence of the authenticity of a digital image. In this paper we propose a method to detect region-duplication forgery by dividing the image into overlapping blocks and then searching for duplicated regions in the image.
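
    A simple version of the overlapping-block search might look like the sketch below. Block size, stride and the minimum duplicate offset are assumptions, and practical detectors typically compare quantized block features (e.g., DCT coefficients) rather than raw pixels in order to tolerate recompression.

    ```python
    import numpy as np

    def detect_copy_move(img, block=8, stride=4, min_offset=16):
        """Flag pairs of identical overlapping blocks that are far apart in a grayscale image.

        Blocks are flattened, lexicographically sorted, and neighbors in the sorted order
        are compared; matching blocks with a large spatial offset suggest region duplication."""
        h, w = img.shape
        feats, coords = [], []
        for y in range(0, h - block + 1, stride):
            for x in range(0, w - block + 1, stride):
                feats.append(img[y:y + block, x:x + block].ravel())
                coords.append((y, x))
        feats = np.asarray(feats)
        coords = np.asarray(coords)

        order = np.lexsort(feats.T[::-1])          # lexicographic sort of block vectors
        matches = []
        for a, b in zip(order[:-1], order[1:]):
            if np.array_equal(feats[a], feats[b]):
                offset = np.hypot(*(coords[a] - coords[b]))
                if offset >= min_offset:           # ignore trivially adjacent duplicates
                    matches.append((tuple(coords[a]), tuple(coords[b])))
        return matches
    ```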

  5. Multiscale image processing and antiscatter grids in digital radiography.

    PubMed

    Lo, Winnie Y; Hornof, William J; Zwingenberger, Allison L; Robertson, Ian D

    2009-01-01

    Scatter radiation is a source of noise and results in decreased signal-to-noise ratio and thus decreased image quality in digital radiography. We determined subjectively whether a digitally processed image made without a grid would be of similar quality to an image made with a grid but without image processing. Additionally, the effects of exposure dose and of using a grid with digital radiography on overall image quality were studied. Thoracic and abdominal radiographs of five dogs of various sizes were made. Four acquisition techniques were included: (1) with a grid, standard exposure dose, digital image processing; (2) without a grid, standard exposure dose, digital image processing; (3) without a grid, half the exposure dose, digital image processing; and (4) with a grid, standard exposure dose, no digital image processing (to mimic a film-screen radiograph). Full-size radiographs as well as magnified images of specific anatomic regions were generated. Nine reviewers rated the overall image quality subjectively using a five-point scale. All digitally processed radiographs had higher overall scores than nondigitally processed radiographs regardless of patient size, exposure dose, or use of a grid. The images made at half the exposure dose had a slightly lower quality than those made at full dose, but this was only statistically significant in magnified images. Using a grid with digital image processing led to a slight but statistically significant increase in overall quality when compared with digitally processed images made without a grid, but whether this increase in quality is clinically significant is unknown.

  6. Investigation of optimal parameters for penalized maximum-likelihood reconstruction applied to iodinated contrast-enhanced breast CT

    NASA Astrophysics Data System (ADS)

    Makeev, Andrey; Ikejimba, Lynda; Lo, Joseph Y.; Glick, Stephen J.

    2016-03-01

    Although digital mammography has reduced breast cancer mortality by approximately 30%, sensitivity and specificity are still far from perfect. In particular, the performance of mammography is especially limited for women with dense breast tissue. Two out of every three biopsies performed in the U.S. are unnecessary, thereby resulting in increased patient anxiety, pain, and possible complications. One promising tomographic breast imaging method that has recently been approved by the FDA is dedicated breast computed tomography (BCT). However, visualizing lesions with BCT can still be challenging for women with dense breast tissue due to the minimal contrast for lesions surrounded by fibroglandular tissue. In recent years there has been renewed interest in improving lesion conspicuity in x-ray breast imaging by administration of an iodinated contrast agent. Due to the fully 3-D imaging nature of BCT, as well as sub-optimal contrast enhancement while the breast is under compression with mammography and breast tomosynthesis, dedicated BCT of the uncompressed breast is likely to offer the best solution for injected contrast-enhanced x-ray breast imaging. It is well known that use of statistically-based iterative reconstruction in CT results in improved image quality at lower radiation dose. Here we investigate possible improvements in image reconstruction for BCT by optimizing the free regularization parameter in a penalized maximum-likelihood method and comparing its performance with the clinical cone-beam filtered backprojection (FBP) algorithm.

  7. Design, optimization and evaluation of a "smart" pixel sensor array for low-dose digital radiography

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Liu, Xinghui; Ou, Hai; Chen, Jun

    2016-04-01

    Amorphous silicon (a-Si:H) thin-film transistors (TFTs) have been widely used to build flat-panel X-ray detectors for digital radiography (DR). As the demand for low-dose X-ray imaging grows, a detector with high signal-to-noise-ratio (SNR) pixel architecture emerges. "Smart" pixel is intended to use a dual-gate photosensitive TFT for sensing, storage, and switch. It differs from a conventional passive pixel sensor (PPS) and active pixel sensor (APS) in that all these three functions are combined into one device instead of three separate units in a pixel. Thus, it is expected to have high fill factor and high spatial resolution. In addition, it utilizes the amplification effect of the dual-gate photosensitive TFT to form a one-transistor APS that leads to a potentially high SNR. This paper addresses the design, optimization and evaluation of the smart pixel sensor and array for low-dose DR. We will design and optimize the smart pixel from the scintillator to TFT levels and validate it through optical and electrical simulation and experiments of a 4x4 sensor array.

  8. Accurate joint space quantification in knee osteoarthritis: a digital x-ray tomosynthesis phantom study

    NASA Astrophysics Data System (ADS)

    Sewell, Tanzania S.; Piacsek, Kelly L.; Heckel, Beth A.; Sabol, John M.

    2011-03-01

    The current imaging standard for diagnosis and monitoring of knee osteoarthritis (OA) is projection radiography. However radiographs may be insensitive to markers of early disease such as osteophytes and joint space narrowing (JSN). Relative to standard radiography, digital X-ray tomosynthesis (DTS) may provide improved visualization of the markers of knee OA without the interference of superimposed anatomy. DTS utilizes a series of low-dose projection images over an arc of +/-20 degrees to reconstruct tomographic images parallel to the detector. We propose that DTS can increase accuracy and precision in JSN quantification. The geometric accuracy of DTS was characterized by quantifying joint space width (JSW) as a function of knee flexion and position using physical and anthropomorphic phantoms. Using a commercially available digital X-ray system, projection and DTS images were acquired for a Lucite rod phantom with known gaps at various source-object-distances, and angles of flexion. Gap width, representative of JSW, was measured using a validated algorithm. Over an object-to-detector-distance range of 5-21cm, a 3.0mm gap width was reproducibly measured in the DTS images, independent of magnification. A simulated 0.50mm (+/-0.13) JSN was quantified accurately (95% CI 0.44-0.56mm) in the DTS images. Angling the rods to represent knee flexion, the minimum gap could be precisely determined from the DTS images and was independent of flexion angle. JSN quantification using DTS was insensitive to distance from patient barrier and flexion angle. Potential exists for the optimization of DTS for accurate radiographic quantification of knee OA independent of patient positioning.

  9. Unified Digital Image Display And Processing System

    NASA Astrophysics Data System (ADS)

    Horii, Steven C.; Maguire, Gerald Q.; Noz, Marilyn E.; Schimpf, James H.

    1981-11-01

    Our institution like many others, is faced with a proliferation of medical imaging techniques. Many of these methods give rise to digital images (e.g. digital radiography, computerized tomography (CT) , nuclear medicine and ultrasound). We feel that a unified, digital system approach to image management (storage, transmission and retrieval), image processing and image display will help in integrating these new modalities into the present diagnostic radiology operations. Future techniques are likely to employ digital images, so such a system could readily be expanded to include other image sources. We presently have the core of such a system. We can both view and process digital nuclear medicine (conventional gamma camera) images, positron emission tomography (PET) and CT images on a single system. Images from our recently installed digital radiographic unit can be added. Our paper describes our present system, explains the rationale for its configuration, and describes the directions in which it will expand.

  10. Analytical optimization of digital subtraction mammography with contrast medium using a commercial unit.

    PubMed

    Rosado-Méndez, I; Palma, B A; Brandan, M E

    2008-12-01

    Contrast-medium-enhanced digital mammography (CEDM) is an image subtraction technique which might help unmasking lesions embedded in very dense breasts. Previous works have stated the feasibility of CEDM and the imperative need of radiological optimization. This work presents an extension of a former analytical formalism to predict contrast-to-noise ratio (CNR) in subtracted mammograms. The goal is to optimize radiological parameters available in a clinical mammographic unit (x-ray tube anode/filter combination, voltage, and loading) by maximizing CNR and minimizing total mean glandular dose (D(gT)), simulating the experimental application of an iodine-based contrast medium and the image subtraction under dual-energy nontemporal, and single- or dual-energy temporal modalities. Total breast-entrance air kerma is limited to a fixed 8.76 mGy (1 R, similar to screening studies). Mathematical expressions obtained from the formalism are evaluated using computed mammographic x-ray spectra attenuated by an adipose/glandular breast containing an elongated structure filled with an iodinated solution in various concentrations. A systematic study of contrast, its associated variance, and CNR for different spectral combinations is performed, concluding in the proposal of optimum x-ray spectra. The linearity between contrast in subtracted images and iodine mass thickness is proven, including the determination of iodine visualization limits based on Rose's detection criterion. Finally, total breast-entrance air kerma is distributed between both images in various proportions in order to maximize the figure of merit CNR2/D(gT). Predicted results indicate the advantage of temporal subtraction (either single- or dual-energy modalities) with optimum parameters corresponding to high-voltage, strongly hardened Rh/Rh spectra. For temporal techniques, CNR was found to depend mostly on the energy of the iodinated image, and thus reduction in D(gT) could be achieved if the spectral energy of the noniodinated image is decreased and the breast-entrance air kerma is evenly distributed between both acquisitions. Predicted limits, in terms of iodine concentration, are found to guarantee the visualization of common clinical angiogenic concentrations in the breast.
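
    For reference, the figure of merit maximized in the final step can be written compactly as below; the mean-signal over background-standard-deviation form of the CNR is a common convention assumed here, not a formula quoted from the paper.

    ```latex
    \mathrm{CNR} \;=\;
      \frac{\left|\bar{S}_{\mathrm{iodine}} - \bar{S}_{\mathrm{background}}\right|}
           {\sigma_{\mathrm{background}}},
    \qquad
    \mathrm{FOM} \;=\; \frac{\mathrm{CNR}^{2}}{D_{gT}}
    ```

    Maximizing this quantity over anode/filter, voltage, loading and the split of the fixed breast-entrance air kerma identifies the acquisition that yields the most contrast-to-noise per unit of mean glandular dose.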

  11. Contrast-enhanced dual-energy digital subtraction mammography: optimization of the beam energy

    NASA Astrophysics Data System (ADS)

    Kwan, Alexander L. C.; Boone, John M.; Le-Petross, Huong; Lindfors, Karen K.; Seibert, J. A.; Lewin, John M.

    2005-04-01

    The implementation of contrast-enhanced dual-energy digital subtraction mammography may lead to better identification of breast tumors, and thus provide a lower cost and more widely available alternative to breast MRI. This technique involves the acquisition of low- and high-energy images after the IV administration of iodinated contrast agent. In this study, the effect of the beam energy (kVp) was examined using the CNR²/dose metric, where CNR is the contrast-to-noise ratio and dose implies the mean glandular dose. The mean glandular dose was calculated using parameterized normalized glandular dose coefficients (DgN), which allowed the computation of the mean glandular dose for the modeled spectra considered in this study, coupled with incident kerma measurements. Optimization studies were performed using a dedicated cone-beam breast CT scanner designed and fabricated in our laboratory, with the system operating in stationary imaging mode. A flat tissue-equivalent phantom (7.5 cm in thickness) was placed at the isocenter of the scanner, and an air gap of 34.5 cm was used in lieu of a grid. Dilute iodine-based contrast agent was introduced into the phantoms using plastic vials. Data were acquired from 40 to 90 kVp at 10 kVp intervals. Due to the low mA available on the breast CT system, a large number of images (1000) were acquired in fluoroscopic mode, which allowed us to match the dose and noise properties for each kVp combination by changing the number of images used for averaging. Preliminary results demonstrate that the best CNR²/dose is achieved with a 50 kVp low-energy image and a 90 kVp high-energy image. Consequently, radiation doses for contrast-enhanced mammography should be far lower than for regular mammography. Since the spatial resolution requirements should also be lower than for regular mammography, dual-energy contrast-enhanced mammography, when performed using the optimal technique factors, may indeed provide diagnostic information very similar to that of breast MRI but at significantly reduced costs.

  12. [Affine transformation-based automatic registration for peripheral digital subtraction angiography (DSA)].

    PubMed

    Kong, Gang; Dai, Dao-Qing; Zou, Lu-Min

    2008-07-01

    In order to remove motion artifacts in peripheral digital subtraction angiography (DSA), an affine transformation-based automatic image registration algorithm is introduced. The process is as follows: first, rectangular feature templates are constructed, centered on Harris corners extracted from the mask image, and the motion vectors of the central feature points are estimated by template matching with maximum histogram energy as the similarity measure. The optimal parameters of the affine transformation are then calculated with the matrix singular value decomposition (SVD) method. Finally, bilinear intensity interpolation is applied to the mask according to the estimated affine transformation. More than 30 peripheral DSA registrations were performed with the presented algorithm; motion artifacts were removed with sub-pixel precision, and the computation time is low enough to satisfy clinical requirements. Experimental results show the efficiency and robustness of the algorithm.
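
    A least-squares affine fit from matched control points, similar in spirit to the SVD step mentioned above, is sketched below; the corner detection and histogram-energy template matching are omitted, and the solver uses numpy's least-squares routine (which relies on an SVD internally) as one of several equivalent formulations.

    ```python
    import numpy as np

    def fit_affine(src_pts, dst_pts):
        """Least-squares 2-D affine transform mapping src_pts -> dst_pts.

        Points are (N, 2) arrays of matched (x, y) control points, e.g. Harris-corner
        template matches. Returns the 2x3 matrix M = [A | t] with dst ~= A @ src + t."""
        src = np.asarray(src_pts, dtype=float)
        dst = np.asarray(dst_pts, dtype=float)
        X = np.hstack([src, np.ones((src.shape[0], 1))])     # (N, 3) design matrix
        P, *_ = np.linalg.lstsq(X, dst, rcond=None)          # SVD-based least squares
        return P.T

    def warp_affine(img, M):
        """Warp a grayscale image with the forward affine M using inverse mapping
        and bilinear intensity interpolation."""
        h, w = img.shape
        yy, xx = np.mgrid[0:h, 0:w]
        inv_A = np.linalg.inv(M[:, :2])
        src = (np.stack([xx, yy], axis=-1) - M[:, 2]) @ inv_A.T
        x0 = np.clip(src[..., 0], 0, w - 2)
        y0 = np.clip(src[..., 1], 0, h - 2)
        ix, iy = x0.astype(int), y0.astype(int)
        fx, fy = x0 - ix, y0 - iy
        top = img[iy, ix] * (1 - fx) + img[iy, ix + 1] * fx
        bot = img[iy + 1, ix] * (1 - fx) + img[iy + 1, ix + 1] * fx
        return top * (1 - fy) + bot * fy
    ```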

  13. The application of digital image analysis for blood typing: the comparison of anti-A and anti-B monoclonal antibodies activity with standard hemagglutinating sera

    NASA Astrophysics Data System (ADS)

    Medvedeva, Maria F.; Doubrovski, Valery A.

    2017-03-01

    The resolution of the acousto-optical method for blood typing was estimated experimentally with two types of reagents: monoclonal antibodies and standard hemagglutinating sera. The distinctive feature of this work is the pixel-level processing of digital photo images previously proposed by the authors. The influence of the reagent concentrations, of the blood sample under test, and of the duration of the ultrasonic action on the biological object upon the resolution of the acousto-optical method was investigated. The optimal experimental conditions for maximizing the resolution of the acousto-optical method were found, which creates the prerequisites for reliable blood typing. The present paper is a further step in the development of the acousto-optical method for determining human blood groups.

  14. Optimally weighted least-squares steganalysis

    NASA Astrophysics Data System (ADS)

    Ker, Andrew D.

    2007-02-01

    Quantitative steganalysis aims to estimate the amount of payload in a stego object, and such estimators seem to arise naturally in steganalysis of Least Significant Bit (LSB) replacement in digital images. However, as with all steganalysis, the estimators are subject to errors, and their magnitude seems heavily dependent on properties of the cover. In very recent work we have given the first derivation of estimation error, for a certain method of steganalysis (the Least-Squares variant of Sample Pairs Analysis) of LSB replacement steganography in digital images. In this paper we make use of our theoretical results to find an improved estimator and detector. We also extend the theoretical analysis to another (more accurate) steganalysis estimator (Triples Analysis) and hence derive an improved version of that estimator too. Experimental results show that the new steganalyzers have improved accuracy, particularly in the difficult case of never-compressed covers.

  15. Generation-3 programmable array microscope (PAM) with digital micro-mirror device (DMD)

    NASA Astrophysics Data System (ADS)

    De Beule, Pieter A. A.; de Vries, Anthony H. B.; Arndt-Jovin, Donna J.; Jovin, Thomas M.

    2011-03-01

    We report progress on the construction of an optical sectioning programmable array microscope (PAM) implemented with a digital micro-mirror device (DMD) spatial light modulator (SLM) utilized for both fluorescence illumination and detection. The introduction of binary intensity modulation at the focal plane of a microscope objective in a computer-controlled pixelated mode allows the recovery of an optically sectioned image. Illumination patterns can be changed very quickly, in contrast to static Nipkow disk or aperture correlation implementations, thereby creating an optical system that can be optimized to the specimen in a convenient manner, e.g. for patterned photobleaching, photobleaching reduction, or spatial superresolution. We present a third-generation (Gen-3) dual-path PAM module incorporating the 25 kHz binary frame rate TI 1080p DMD and a newly developed optical system that offers diffraction-limited imaging with compensation of tilt-angle distortion.

  16. Validating a new methodology for optical probe design and image registration in fNIRS studies

    PubMed Central

    Wijeakumar, Sobanawartiny; Spencer, John P.; Bohache, Kevin; Boas, David A.; Magnotta, Vincent A.

    2015-01-01

    Functional near-infrared spectroscopy (fNIRS) is an imaging technique that relies on the principle of shining near-infrared light through tissue to detect changes in hemodynamic activation. An important methodological issue encountered is the creation of optimized probe geometry for fNIRS recordings. Here, across three experiments, we describe and validate a processing pipeline designed to create an optimized, yet scalable probe geometry based on selected regions of interest (ROIs) from the functional magnetic resonance imaging (fMRI) literature. In experiment 1, we created a probe geometry optimized to record changes in activation from target ROIs important for visual working memory. Positions of the sources and detectors of the probe geometry on an adult head were digitized using a motion sensor and projected onto a generic adult atlas and a segmented head obtained from the subject's MRI scan. In experiment 2, the same probe geometry was scaled down to fit a child's head and later digitized and projected onto the generic adult atlas and a segmented volume obtained from the child's MRI scan. Using visualization tools and by quantifying the amount of intersection between target ROIs and channels, we show that out of 21 ROIs, 17 and 19 ROIs intersected with fNIRS channels from the adult and child probe geometries, respectively. Further, both the adult atlas and adult subject-specific MRI approaches yielded similar results and can be used interchangeably. However, results suggest that segmented heads obtained from MRI scans be used for registering children's data. Finally, in experiment 3, we further validated our processing pipeline by creating a different probe geometry designed to record from target ROIs involved in language and motor processing. PMID:25705757

  17. Radiology on handheld devices: image display, manipulation, and PACS integration issues.

    PubMed

    Raman, Bhargav; Raman, Raghav; Raman, Lalithakala; Beaulieu, Christopher F

    2004-01-01

    Handheld personal digital assistants (PDAs) have undergone continuous and substantial improvements in hardware and graphics capabilities, making them a compelling platform for novel developments in teleradiology. The latest PDAs have processor speeds of up to 400 MHz and storage capacities of up to 80 Gbytes with memory expansion methods. A Digital Imaging and Communications in Medicine (DICOM)-compliant, vendor-independent handheld image access system was developed in which a PDA server acts as the gateway between a picture archiving and communication system (PACS) and PDAs. The system is compatible with most currently available PDA models. It is capable of both wired and wireless transfer of images and includes custom PDA software and World Wide Web interfaces that implement a variety of basic image manipulation functions. Implementation of this system, which is currently undergoing debugging and beta testing, required optimization of the user interface to efficiently display images on smaller PDA screens. The PDA server manages user work lists and implements compression and security features to accelerate transfer speeds, protect patient information, and regulate access. Although some limitations remain, PDA-based teleradiology has the potential to increase the efficiency of the radiologic work flow, increasing productivity and improving communication with referring physicians and patients. Copyright RSNA, 2004

  18. Method for inserting noise in digital mammography to simulate reduction in radiation dose

    NASA Astrophysics Data System (ADS)

    Borges, Lucas R.; de Oliveira, Helder C. R.; Nunes, Polyana F.; Vieira, Marcelo A. C.

    2015-03-01

    The quality of clinical x-ray images is closely related to the radiation dose used in the imaging study. The general principle for selecting the radiation is ALARA ("as low as reasonably achievable"). The practical optimization, however, remains challenging. It is well known that reducing the radiation dose increases the quantum noise, which could compromise the image quality. In order to conduct studies about dose reduction in mammography, it would be necessary to acquire repeated clinical images, from the same patient, with different dose levels. However, such practice would be unethical due to radiation related risks. One solution is to simulate the effects of dose reduction in clinical images. This work proposes a new method, based on the Anscombe transformation, which simulates dose reduction in digital mammography by inserting quantum noise into clinical mammograms acquired with the standard radiation dose. Thus, it is possible to simulate different levels of radiation doses without exposing the patient to new levels of radiation. Results showed that the achieved quality of simulated images generated with our method is the same as when using other methods found in the literature, with the novelty of using the Anscombe transformation for converting signal-independent Gaussian noise into signal-dependent quantum noise.
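
    A simplified, image-domain version of such a dose-reduction simulation is sketched below; the gain model, the absence of electronic noise and the direct variance matching are simplifying assumptions, with the Anscombe pair included only to indicate where the published signal-independent noise injection would take place.

    ```python
    import numpy as np

    def anscombe(x):
        """Variance-stabilizing Anscombe transform: Poisson-like data -> ~unit-variance Gaussian."""
        return 2.0 * np.sqrt(np.maximum(x, 0.0) + 3.0 / 8.0)

    def inverse_anscombe(y):
        """Simple algebraic inverse of the Anscombe transform."""
        return (y / 2.0) ** 2 - 3.0 / 8.0

    def simulate_dose_reduction(img, dose_fraction, gain=1.0, rng=None):
        """Simulate a reduced-dose mammogram from a full-dose one by injecting quantum noise.

        Assumes pixel values are proportional (via `gain`) to detected quanta, so a full-dose
        pixel x has variance gain*x. After scaling by f = dose_fraction the variance is
        gain*f^2*x, while a true low-dose image would have gain*f*x; the missing variance
        gain*f*(1-f)*x is injected as zero-mean noise. (The published method performs this
        injection in the Anscombe domain, where the required noise is signal-independent.)"""
        rng = rng or np.random.default_rng()
        f = float(dose_fraction)
        scaled = f * np.asarray(img, dtype=np.float64)
        extra_sigma = np.sqrt(gain * f * (1.0 - f) * np.maximum(img, 0.0))
        return scaled + rng.normal(0.0, 1.0, scaled.shape) * extra_sigma
    ```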

  19. Intelligent image capture of cartridge cases for firearms examiners

    NASA Astrophysics Data System (ADS)

    Jones, Brett C.; Guerci, Joseph R.

    1997-02-01

    The FBI's DRUGFIRE(TM) system is a nationwide computerized networked image database of ballistic forensic evidence. This evidence includes images of cartridge cases and bullets obtained from both crime scenes and controlled test firings of seized weapons. Currently, the system is installed in over 80 forensic labs across the country and has enjoyed a high degree of success. In this paper, we discuss some of the issues and methods associated with providing a front-end semi-automated image capture system that simultaneously satisfies the often conflicting criteria of the many human examiners' visual perception and those associated with optimizing autonomous digital image correlation. Specifically, we detail the proposed processing chain of an intelligent image capture system (IICS), involving a real-time capture 'assistant' which assesses the quality of the image under test utilizing a custom designed neural network.

  20. Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm

    NASA Astrophysics Data System (ADS)

    Moumen, Abdelkader; Sissaoui, Hocine

    2017-03-01

    Vulnerability of communication of digital images is an extremely important issue nowadays, particularly when the images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It is formulated based on symmetric encryption, asymmetric encryption and steganography theories. The image is encrypted using a symmetric algorithm; then, the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bits steganographic scheme. The analysis results show that, while enjoying faster computation, our method performs close to optimally in terms of accuracy.
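
    A minimal sketch of the hybrid idea is shown below, assuming PyCryptodome and an 8-bit grayscale numpy cover image. Unlike the paper, which hides the wrapped key inside the ciphered image itself, this simplified version embeds it in a separate cover array, and the key sizes and cipher modes are illustrative choices.

```python
import numpy as np
from Crypto.Cipher import AES, PKCS1_OAEP
from Crypto.PublicKey import RSA
from Crypto.Random import get_random_bytes

def encrypt_and_hide(cover_image, recipient_pub, payload):
    """Sketch of the hybrid scheme: AES-encrypt the payload, RSA-encrypt the
    AES key, then hide the wrapped key in the cover image's least significant bits."""
    aes_key = get_random_bytes(16)
    cipher = AES.new(aes_key, AES.MODE_EAX)
    ciphertext, tag = cipher.encrypt_and_digest(payload)

    # Wrap the symmetric key with the recipient's RSA public key (256 bytes for RSA-2048).
    wrapped_key = PKCS1_OAEP.new(recipient_pub).encrypt(aes_key)

    bits = np.unpackbits(np.frombuffer(wrapped_key, dtype=np.uint8))
    flat = cover_image.astype(np.uint8).ravel().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for the wrapped key")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits      # LSB embedding
    return ciphertext, cipher.nonce, tag, flat.reshape(cover_image.shape)

# Usage sketch with hypothetical data:
# rsa = RSA.generate(2048)
# out = encrypt_and_hide(np.zeros((512, 512), np.uint8), rsa.publickey(), b"image bytes")
```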

  1. Validity of plant fiber length measurement : a review of fiber length measurement based on kenaf as a model

    Treesearch

    James S. Han; Theodore Mianowski; Yi-yu Lin

    1999-01-01

    The efficacy of fiber length measurement techniques such as digitizing, the Kajaani procedure, and NIH Image is compared in order to determine the optimal tool. Kenaf bast fibers, aspen fibers, and red pine fibers were collected from different anatomical parts, and the fiber lengths were compared using various analytical tools. A statistical analysis on the validity of the...

  2. Cost-effectiveness prospects of picture archiving and communication systems.

    PubMed

    Hindel, R; Preger, W

    1988-01-01

    PAC (picture archiving and communication) systems are widely discussed and promoted as the organizational solution to digital image management in a radiology department. For approximately two decades digital imaging has increasingly been used for such diagnostic modalities as CT, DSA, MRI, DR (Digital Radiography) and others. PACS are seen as a step toward high technology integration and more efficient management. Although the acquisition of such technology is investment intensive, there are well-founded projections that prolonged operation will prove cost justified. Such justification can only partly be derived from cost reduction through PAC with respect to present department management--the major justification is preparation for future economic pressures which could make survival of a department without modern technology difficult. Especially in the United States the political climate favors 'competitive medicine' and reduced government support. Seen in this context PACS promises to speed the transition of Health Care Services into a business with tight resource management, cost accounting and marketing. The following paper analyzes cost and revenue in a typical larger Radiology Department, projects various scenarios of cost reduction by means of digital technology and concludes with cautious optimism that the investment expenses for a PACS will be justified in the near future by prudent utilization of high technology.

  3. The impact of digital imaging in the field of cytopathology.

    PubMed

    Pantanowitz, Liron; Hornish, Maryanne; Goulart, Robert A

    2009-03-06

    With the introduction of digital imaging, pathology is undergoing a digital transformation. In the field of cytology, digital images are being used for telecytology, automated screening of Pap test slides, training and education (e.g. online digital atlases), and proficiency testing. To date, there has been no systematic review on the impact of digital imaging on the practice of cytopathology. This article critically addresses the emerging role of computer-assisted screening and the application of digital imaging to the field of cytology, including telecytology, virtual microscopy, and the impact of online cytology resources. The role of novel diagnostic techniques like image cytometry is also reviewed.

  4. Incorporating digital imaging into dental hygiene practice.

    PubMed

    Saxe, M J; West, D J

    1997-01-01

    The objective of this paper is to describe digital imaging technology: available modalities, scientific imaging process, advantages and limitations, and applications to dental hygiene practice. Advances in technology have created innovative imaging modalities for intraoral radiography that eliminate film as the traditional image receptor. Digital imaging generates instantaneous radiographic images on a display monitor following exposure. Advantages include lower patient exposure per image and elimination of film processing. Digital imaging enhances diagnostic capabilities and, therefore, treatment decisions by the oral healthcare provider. Utilization of digital imaging technology for intraoral radiography will advance the practice of dental hygiene. Although spatial resolution is inferior to conventional film, digital imaging provides adequate resolution to diagnose oral diseases. Dental hygienists must evaluate new technologies in radiography to continue providing quality care while reducing patient exposure to ionizing radiation.

  5. [Dry view laser imager--a new economical photothermal imaging method].

    PubMed

    Weberling, R

    1996-11-01

    The production of hard copies is currently achieved by means of laser imagers and wet film processing, in systems attached either directly in or to the laser imager or located in a darkroom. Variations in image quality resulting from wet film development that is not always optimal are frequent. A newly developed thermographic film developer for laser films, which requires no liquid or powdered chemicals, is by contrast environmentally preferable and reduces operating costs. The completely dry developing process provides permanent image documentation meeting the quality and safety requirements of RöV and BAK. One of the currently available systems of this type, the DryView Laser Imager, is inexpensive and easy to install. The selective connection principle of the DryView Laser Imager can be expanded as required and accepts digital and/or analog interfaces to all imaging systems (CT, MR, DR, US, NM) from the various manufacturers.

  6. Automatic digital surface model (DSM) generation from aerial imagery data

    NASA Astrophysics Data System (ADS)

    Zhou, Nan; Cao, Shixiang; He, Hongyan; Xing, Kun; Yue, Chunyu

    2018-04-01

    Aerial sensors are widely used to acquire imagery for photogrammetric and remote sensing applications. In general, the images have large overlapping regions, which provide a great deal of redundant geometric and radiometric information for matching. This paper presents a POS-supported dense matching procedure for automatic DSM generation from aerial imagery data. The method uses a coarse-to-fine hierarchical strategy with an effective combination of several image matching algorithms: image radiation pre-processing, image pyramid generation, feature point extraction and grid point generation, multi-image geometrically constrained cross-correlation (MIG3C), global relaxation optimization, multi-image geometrically constrained least squares matching (MIGCLSM), TIN generation and point cloud filtering. The image radiation pre-processing is used to reduce the effects of inherent radiometric problems and to optimize the images. The presented approach essentially consists of three components: a feature point extraction and matching procedure, a grid point matching procedure and a relational matching procedure. The MIGCLSM method is used to achieve potentially sub-pixel accuracy matches and to identify inaccurate and possibly false matches. The feasibility of the method has been tested on aerial images of different scales and land-cover types. The accuracy evaluation is based on the comparison between the automatically extracted DSMs derived from the precise exterior orientation parameters (EOPs) and those derived from the POS.
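
    To make the correlation stage concrete, the following numpy sketch shows plain normalized cross-correlation of a template over a search window. The paper's MIG3C and MIGCLSM steps add multi-image geometric constraints and least-squares refinement that are not modeled here.

```python
import numpy as np

def normalized_cross_correlation(search, template):
    """Brute-force NCC of a template over a search window.

    Illustration of the correlation stage only; returns the best match
    position (row, col) in the search window and its NCC score."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    sh, sw = search.shape
    scores = np.full((sh - th + 1, sw - tw + 1), -np.inf)
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            patch = search[r:r + th, c:c + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-12)
            scores[r, c] = np.mean(p * t)      # correlation of standardized patches
    best = np.unravel_index(np.argmax(scores), scores.shape)
    return best, scores[best]
```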

  7. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E.; Parvin, Bahram

    2009-06-09

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time: quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  8. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E [Martinez, CA; Parvin, Bahram [Mill Valley, CA

    2011-05-24

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time; quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  9. Visual-servoing optical microscopy

    DOEpatents

    Callahan, Daniel E; Parvin, Bahram

    2013-10-01

    The present invention provides methods and devices for the knowledge-based discovery and optimization of differences between cell types. In particular, the present invention provides visual servoing optical microscopy, as well as analysis methods. The present invention provides means for the close monitoring of hundreds of individual, living cells over time; quantification of dynamic physiological responses in multiple channels; real-time digital image segmentation and analysis; intelligent, repetitive computer-applied cell stress and cell stimulation; and the ability to return to the same field of cells for long-term studies and observation. The present invention further provides means to optimize culture conditions for specific subpopulations of cells.

  10. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

    Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage is digitized, scientific and experimental data come from high-speed imaging sensors, national defense satellite images come from governments, medical and healthcare imaging records come from hospitals, and personal photo collections come from digital cameras. With these massive amounts of precious and irreplaceable data and knowledge, what standard technologies can be applied to preserve them and yet provide an interoperable framework for accessing the data across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international standard MPEG technologies to preserve digital image content.

  11. Subpixel edge estimation with lens aberrations compensation based on the iterative image approximation for high-precision thermal expansion measurements of solids

    NASA Astrophysics Data System (ADS)

    Inochkin, F. M.; Kruglov, S. K.; Bronshtein, I. G.; Kompan, T. A.; Kondratjev, S. V.; Korenev, A. S.; Pukhov, N. F.

    2017-06-01

    A new method for precise subpixel edge estimation is presented. The principle of the method is iterative image approximation in 2D with subpixel accuracy until an appropriate match between the simulated and acquired images is found. A numerical image model is presented consisting of three parts: an edge model, an object and background brightness distribution model, and a lens aberrations model including diffraction. The optimal values of the model parameters are determined by means of conjugate-gradient numerical optimization of a merit function corresponding to the L2 distance between the acquired and simulated images. A computationally effective procedure for the merit function calculation, along with a sufficient gradient approximation, is described. Subpixel-accuracy image simulation is performed in the Fourier domain with theoretically unlimited precision of edge point locations. The method is capable of compensating lens aberrations and obtaining edge information with increased resolution. Experimental verification of the method, using a digital micromirror device to physically simulate an object with known edge geometry, is shown. Experimental results for various high-temperature materials within the temperature range of 1000°C to 2400°C are presented.
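
    The sketch below illustrates the general fitting scheme, minimizing an L2 merit function with conjugate gradients via scipy. The edge model `simulate_edge` is a hypothetical stand-in (a blurred vertical step); the paper's model works in the Fourier domain and includes a full aberration description.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.ndimage import gaussian_filter

def simulate_edge(params, shape):
    """Hypothetical image model: a vertical edge at sub-pixel position x0
    between background level b and object level a, blurred by a Gaussian of
    width sigma (a crude proxy for lens aberrations plus diffraction)."""
    x0, a, b, sigma = params
    cols = np.arange(shape[1])
    coverage = np.clip(x0 - cols + 0.5, 0.0, 1.0)   # fractional pixel coverage
    row = b + (a - b) * coverage
    img = np.tile(row, (shape[0], 1))
    return gaussian_filter(img, sigma=max(sigma, 1e-3))

def fit_edge(acquired, initial_guess):
    """L2 merit function minimized with conjugate gradients, following the
    paper's general approach (gradients here are numerical, not analytic)."""
    merit = lambda p: np.sum((simulate_edge(p, acquired.shape) - acquired) ** 2)
    result = minimize(merit, x0=np.asarray(initial_guess, float), method="CG")
    return result.x   # [x0, a, b, sigma]; x0 is the sub-pixel edge location
```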

  12. The Orthanc Ecosystem for Medical Imaging.

    PubMed

    Jodogne, Sébastien

    2018-05-03

    This paper reviews the components of Orthanc, a free and open-source, highly versatile ecosystem for medical imaging. At the core of the Orthanc ecosystem, the Orthanc server is a lightweight vendor neutral archive that provides PACS managers with a powerful environment to automate and optimize the imaging flows that are very specific to each hospital. The Orthanc server can be extended with plugins that provide solutions for teleradiology, digital pathology, or enterprise-ready databases. It is shown how software developers and research engineers can easily develop external software or Web portals dealing with medical images, with minimal knowledge of the DICOM standard, thanks to the advanced programming interface of the Orthanc server. The paper concludes by introducing the Stone of Orthanc, an innovative toolkit for the cross-platform rendering of medical images.

  13. Omniview motionless camera orientation system

    NASA Technical Reports Server (NTRS)

    Martin, H. Lee (Inventor); Kuban, Daniel P. (Inventor); Zimmermann, Steven D. (Inventor); Busko, Nicholas (Inventor)

    2010-01-01

    An apparatus and method is provided for converting digital images for use in an imaging system. The apparatus includes a data memory which stores digital data representing an image having a circular or spherical field of view such as an image captured by a fish-eye lens, a control input for receiving a signal for selecting a portion of the image, and a converter responsive to the control input for converting digital data corresponding to the selected portion into digital data representing a planar image for subsequent display. Various methods include the steps of storing digital data representing an image having a circular or spherical field of view, selecting a portion of the image, and converting the stored digital data corresponding to the selected portion into digital data representing a planar image for subsequent display. In various embodiments, the data converter and data conversion step may use an orthogonal set of transformation algorithms.

  14. Image Format Conversion to DICOM and Lookup Table Conversion to Presentation Value of the Japanese Society of Radiological Technology (JSRT) Standard Digital Image Database.

    PubMed

    Yanagita, Satoshi; Imahana, Masato; Suwa, Kazuaki; Sugimura, Hitomi; Nishiki, Masayuki

    2016-01-01

    The Japanese Society of Radiological Technology (JSRT) standard digital image database contains many useful cases of chest X-ray images and has been used in much state-of-the-art research. However, the pixel values of all the images are simply digitized as relative density values by a scanned-film digitizer. As a result, the pixel values are completely different from the standardized display system input value of digital imaging and communications in medicine (DICOM), called the presentation value (P-value), which can maintain visual consistency when images are observed on displays of different luminance. Therefore, we converted all the images of the JSRT standard digital image database to DICOM format and then converted the pixel values to P-values using an original program that we developed. Consequently, the JSRT standard digital image database has been modified so that the visual consistency of the images is maintained among displays of different luminance.
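
    As an illustration of this kind of conversion, here is a minimal sketch using pydicom and numpy. It assumes the raw JSRT files are 2048 x 2048 unsigned 16-bit big-endian images and that a 4096-entry P-value lookup table (`pvalue_lut`, a hypothetical numpy array indexed by original pixel value) has already been computed; the authors' original program and their exact GSDF-based lookup table are not reproduced.

```python
import numpy as np
from pydicom.dataset import FileDataset, FileMetaDataset
from pydicom.uid import ExplicitVRLittleEndian, generate_uid

SC_STORAGE = "1.2.840.10008.5.1.4.1.1.7"   # Secondary Capture Image Storage

def raw_jsrt_to_dicom(raw_path, out_path, pvalue_lut, rows=2048, cols=2048):
    """Wrap one raw JSRT image in a minimal DICOM file and map its relative
    density values to presentation values through the supplied LUT."""
    raw = np.fromfile(raw_path, dtype=">u2").reshape(rows, cols)   # assumed big-endian
    pvalues = pvalue_lut[np.clip(raw, 0, len(pvalue_lut) - 1)].astype(np.uint16)

    meta = FileMetaDataset()
    meta.MediaStorageSOPClassUID = SC_STORAGE
    meta.MediaStorageSOPInstanceUID = generate_uid()
    meta.TransferSyntaxUID = ExplicitVRLittleEndian

    ds = FileDataset(out_path, {}, file_meta=meta, preamble=b"\x00" * 128)
    ds.is_little_endian = True
    ds.is_implicit_VR = False
    ds.SOPClassUID = SC_STORAGE
    ds.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
    ds.Modality = "CR"
    ds.Rows, ds.Columns = pvalues.shape
    ds.SamplesPerPixel = 1
    ds.PhotometricInterpretation = "MONOCHROME2"
    ds.BitsAllocated, ds.BitsStored, ds.HighBit = 16, 16, 15
    ds.PixelRepresentation = 0
    ds.PixelData = pvalues.tobytes()
    ds.save_as(out_path, write_like_original=False)
```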

  15. Building Extraction Based on an Optimized Stacked Sparse Autoencoder of Structure and Training Samples Using LIDAR DSM and Optical Images.

    PubMed

    Yan, Yiming; Tan, Zhichao; Su, Nan; Zhao, Chunhui

    2017-08-24

    In this paper, a building extraction method is proposed based on a stacked sparse autoencoder with an optimized structure and optimized training samples. Building extraction plays an important role in urban construction and planning. However, some negative effects reduce the accuracy of extraction, such as excessive resolution, poor correction, and terrain influence. Data collected by multiple sensors, such as light detection and ranging (LIDAR) and optical sensors, are used to improve the extraction. Using a digital surface model (DSM) obtained from LIDAR data together with optical images, traditional methods can improve the extraction to a certain extent, but they have shortcomings in feature extraction. Since a stacked sparse autoencoder (SSAE) neural network can learn the essential characteristics of the data in depth, an SSAE was employed to extract buildings from the combined DSM data and optical image. A better strategy for setting the SSAE network structure is given, and an approach to setting the number and proportion of training samples for better training of the SSAE is presented. The optical data and DSM were combined as input to the optimized SSAE, and after training with the optimized samples, the resulting network structure can extract buildings with high accuracy and good robustness.
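
    For orientation, the following Keras sketch builds a sparsity-regularized stacked network for pixel/patch vectors that combine optical bands with a DSM value. The layer sizes, penalty, and training scheme are placeholders; the paper's unsupervised layer-wise pretraining strategy and its sample-selection procedure are not reproduced here.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

def build_ssae_classifier(input_dim, hidden_sizes=(256, 64), l1_penalty=1e-4):
    """Stacked, sparsity-regularized network for building / non-building
    classification of feature vectors (optical bands + DSM value)."""
    inputs = keras.Input(shape=(input_dim,))
    x = inputs
    for units in hidden_sizes:
        # L1 activity regularization encourages sparse hidden activations,
        # the "sparse" ingredient of a sparse autoencoder.
        x = layers.Dense(units, activation="relu",
                         activity_regularizer=regularizers.l1(l1_penalty))(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

# Usage sketch with synthetic data (4 spectral bands + 1 DSM value per sample):
# X = np.random.rand(1000, 5).astype("float32")
# y = np.random.randint(0, 2, size=(1000, 1))
# model = build_ssae_classifier(input_dim=5)
# model.fit(X, y, epochs=5, batch_size=64, verbose=0)
```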

  16. Simulation of a complete X-ray digital radiographic system for industrial applications.

    PubMed

    Nazemi, E; Rokrok, B; Movafeghi, A; Choopan Dastjerdi, M H

    2018-05-19

    Simulating X-ray images is of great importance in industry and medicine. Using such simulation permits us to optimize parameters which affect image's quality without the limitations of an experimental procedure. This study revolves around a novel methodology to simulate a complete industrial X-ray digital radiographic system composed of an X-ray tube and a computed radiography (CR) image plate using Monte Carlo N Particle eXtended (MCNPX) code. In the process of our research, an industrial X-ray tube with maximum voltage of 300 kV and current of 5 mA was simulated. A 3-layer uniform plate including a polymer overcoat layer, a phosphor layer and a polycarbonate backing layer was also defined and simulated as the CR imaging plate. To model the image formation in the image plate, at first the absorbed dose was calculated in each pixel inside the phosphor layer of CR imaging plate using the mesh tally in MCNPX code and then was converted to gray value using a mathematical relationship determined in a separate procedure. To validate the simulation results, an experimental setup was designed and the images of two step wedges created out of aluminum and steel were captured by the experiments and compared with the simulations. The results show that the simulated images are in good agreement with the experimental ones demonstrating the ability of the proposed methodology for simulating an industrial X-ray imaging system. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. A new texture descriptor based on local micro-pattern for detection of architectural distortion in mammographic images

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Moraes, Diego R.; Reche, Gustavo A.; Borges, Lucas R.; Catani, Juliana H.; de Barros, Nestor; Melo, Carlos F. E.; Gonzaga, Adilson; Vieira, Marcelo A. C.

    2017-03-01

    This paper presents a new local micro-pattern texture descriptor for the detection of Architectural Distortion (AD) in digital mammography images. AD is a subtle contraction of breast parenchyma that may represent an early sign of breast cancer. Due to its subtlety and variability, AD is more difficult to detect compared to microcalcifications and masses, and is commonly found in retrospective evaluations of false-negative mammograms. Several computer-based systems have been proposed for automatic detection of AD, but their performance is still unsatisfactory. The proposed descriptor, Local Mapped Pattern (LMP), is a generalization of the Local Binary Pattern (LBP), which is considered one of the most powerful feature descriptors for texture classification in digital images. Compared to LBP, the LMP descriptor captures more effectively the minor differences between the local image pixels. Moreover, LMP is a parametric model which can be optimized for the desired application. In our work, the LMP performance was compared to LBP and to four Haralick texture descriptors for the classification of 400 regions of interest (ROIs) extracted from clinical mammograms. The ROIs were selected and divided into four classes: AD, normal tissue, microcalcifications and masses. Feature vectors were used as input to a multilayer perceptron neural network with a single hidden layer. Results showed that LMP is a good descriptor for distinguishing AD from other anomalies in digital mammography. LMP performance was slightly better than LBP and comparable to the Haralick descriptors (mean classification accuracy = 83%).
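
    The LBP baseline that LMP generalizes can be computed with scikit-image as in the sketch below. The LMP descriptor itself is the paper's parametric extension and is not available in standard libraries, so only the baseline feature extraction is shown; the neighborhood parameters are illustrative.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(roi, points=8, radius=1):
    """Baseline LBP feature vector for a mammographic ROI: per-pixel uniform
    LBP codes summarized as a normalized histogram."""
    codes = local_binary_pattern(roi, P=points, R=radius, method="uniform")
    n_bins = points + 2   # uniform patterns plus one "non-uniform" bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist
```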

  18. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital Imaging is the computer processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments, locates objects within an image and measures them to extract quantitative information. Applications are CAT scanners, radiography, microscopy in medicine as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology, is accurate and cost efficient.

  19. Reduction of false-positives in a CAD scheme for automated detection of architectural distortion in digital mammography

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Mencattini, Arianna; Casti, Paola; Martinelli, Eugenio; di Natale, Corrado; Catani, Juliana H.; de Barros, Nestor; Melo, Carlos F. E.; Gonzaga, Adilson; Vieira, Marcelo A. C.

    2018-02-01

    This paper proposes a method to reduce the number of false positives (FP) in a computer-aided detection (CAD) scheme for automated detection of architectural distortion (AD) in digital mammography. AD is a subtle contraction of breast parenchyma that may represent an early sign of breast cancer. Due to its subtlety and variability, AD is more difficult to detect compared to microcalcifications and masses, and is commonly found in retrospective evaluations of false-negative mammograms. Several computer-based systems have been proposed for automated detection of AD in breast images. The usual approach is to automatically detect possible sites of AD in a mammographic image (segmentation step) and then use a classifier to eliminate the false positives and identify the suspicious regions (classification step). This paper focuses on the optimization of the segmentation step to reduce the number of FPs passed as input to the classifier. The proposal is to use statistical measurements to score the segmented regions and then apply a threshold to select a small number of regions to be submitted to the classification step, improving the detection performance of the CAD scheme. We evaluated 12 image features for scoring and selecting suspicious regions in 74 clinical full-field digital mammography (FFDM) images. All images in this dataset contained at least one region with AD previously marked by an expert radiologist. The results showed that the proposed method can reduce the false positives of the segmentation step of the CAD scheme from 43.4 FP per image to 34.5 FP per image, without increasing the number of false negatives.

  20. Development of an MRI-compatible digital SiPM detector stack for simultaneous PET/MRI.

    PubMed

    Düppenbecker, Peter M; Weissler, Bjoern; Gebhardt, Pierre; Schug, David; Wehner, Jakob; Marsden, Paul K; Schulz, Volkmar

    2016-02-01

    Advances in solid-state photon detectors paved the way to combine positron emission tomography (PET) and magnetic resonance imaging (MRI) into highly integrated, truly simultaneous, hybrid imaging systems. Based on the most recent digital SiPM technology, we developed an MRI-compatible PET detector stack, intended as a building block for next generation simultaneous PET/MRI systems. Our detector stack comprises an array of 8 × 8 digital SiPM channels with 4 mm pitch using Philips Digital Photon Counting DPC 3200-22 devices, an FPGA for data acquisition, a supply voltage control system and a cooling infrastructure. This is the first detector design that allows the operation of digital SiPMs simultaneously inside an MRI system. We tested and optimized the MRI-compatibility of our detector stack on a laboratory test bench as well as in combination with a Philips Achieva 3 T MRI system. Our design clearly reduces distortions of the static magnetic field compared to a conventional design. The MRI static magnetic field causes weak and directional drift effects on voltage regulators, but has no direct impact on detector performance. MRI gradient switching initially degraded energy and timing resolution. Both distortions could be ascribed to voltage variations induced on the bias and the FPGA core voltage supply respectively. Based on these findings, we improved our detector design and our final design shows virtually no energy or timing degradations, even during heavy and continuous MRI gradient switching. In particular, we found no evidence that the performance of the DPC 3200-22 digital SiPM itself is degraded by the MRI system.

  1. Remote sensing for industrial applications in the energy business: digital territorial data integration for planning of overhead power transmission lines (OHTLs)

    NASA Astrophysics Data System (ADS)

    Terrazzino, Alfonso; Volponi, Silvia; Borgogno Mondino, Enrico

    2001-12-01

    An investigation has been carried out concerning remote sensing techniques in order to assess their potential application to the energy system business. The most interesting results concern a new approach, based on digital data from remote sensing, to infrastructures with a large territorial distribution: in particular, overhead transmission lines (OHTLs) for the high-voltage transmission and distribution of electricity over large distances. Remote sensing could in principle be applied to all phases of the system lifetime, from planning to design, construction, management, monitoring and maintenance. In this article, a remote sensing based approach is presented, targeted at line planning: optimization of the OHTL path and layout according to different parameters (technical, environmental and industrial). Planning new OHTLs is of particular interest in emerging markets, where cartography is typically missing or available only at low-accuracy scales (1:50.000 and lower) and often not updated. Multispectral images can be used to generate thematic maps of the region of interest for the planning (soil coverage). Digital Elevation Models (DEMs) allow the planners to easily access the morphologic information of the surface. Other auxiliary information from local laws, environmental instances, and international (IEC) standards can be integrated in order to perform an accurate, optimized path choice and preliminary spotting of the OHTLs. This operation is carried out by an ABB proprietary optimization algorithm: the output is a preliminary path that best fits the optimization parameters of the line in a life cycle approach.

  2. Color correction optimization with hue regularization

    NASA Astrophysics Data System (ADS)

    Zhang, Heng; Liu, Huaping; Quan, Shuxue

    2011-01-01

    Previous work has suggested that observers are capable of judging the quality of an image without any knowledge of the original scene. When no reference is available, observers can extract the apparent objects in an image and compare them with the typical colors of similar objects recalled from their memories. Some generally agreed upon research results indicate that although perfect colorimetric rendering is not conspicuous and color errors can be well tolerated, the appropriate rendition of certain memory colors such as skin, grass, and sky is an important factor in the overall perceived image quality. These colors are appreciated in a fairly consistent manner and are memorized with slightly different hues and higher color saturation. The aim of color correction for a digital color pipeline is to transform the image data from a device dependent color space to a target color space, usually through a color correction matrix which in its most basic form is optimized through linear regressions between the two sets of data in two color spaces in the sense of minimized Euclidean color error. Unfortunately, this method could result in objectionable distortions if the color error biased certain colors undesirably. In this paper, we propose a color correction optimization method with preferred color reproduction in mind through hue regularization and present some experimental results.
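
    For context, the un-regularized starting point, a 3 x 3 color correction matrix fitted by ordinary least squares, can be written in a few lines of numpy as below; the paper's hue regularization term for memory colors is not reproduced.

```python
import numpy as np

def fit_color_correction_matrix(device_rgb, target_values):
    """Ordinary least-squares 3x3 color correction matrix between N device
    RGB samples and their target color values (e.g., in a device-independent
    space). This is the baseline that hue regularization would augment."""
    device_rgb = np.asarray(device_rgb, float)       # shape (N, 3)
    target_values = np.asarray(target_values, float) # shape (N, 3)
    ccm, *_ = np.linalg.lstsq(device_rgb, target_values, rcond=None)
    return ccm.T   # apply as corrected_col = ccm @ rgb_col

# Usage sketch with hypothetical calibration patches:
# ccm = fit_color_correction_matrix(patches_rgb, patches_target)
# corrected = (ccm @ pixels.reshape(-1, 3).T).T
```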

  3. Enhanced optical design by distortion control

    NASA Astrophysics Data System (ADS)

    Thibault, Simon; Gauvin, Jonny; Doucet, Michel; Wang, Min

    2005-09-01

    The control of optical distortion is useful in the design of a variety of optical systems. The most popular example is the F-theta lens used in laser scanning systems to produce a constant scan velocity across the image plane. Over the last 20 years, many authors have designed distortion-control correctors. Today, many challenging digital imaging systems can use distortion to enhance their imaging capability. A well-known example is the reversed telephoto type: if the barrel distortion is increased instead of corrected, the result is the so-called fish-eye lens. However, if the barrel distortion is controlled rather than merely increased, the resulting system can have enhanced imaging capability. This paper presents lens designs and real system examples that clearly demonstrate how distortion control can improve system performance, such as resolution. We present innovative optical systems that increase the resolution in the field of view of interest to meet the needs of specific applications. One critical issue when designing with distortion is optimization management. As in most challenging lens designs, automatic optimization is less reliable. Proper management keeps the lens design within the correct range, which is critical for optimal performance (size, cost, manufacturability). Many of the lens designs presented tailor a custom merit function and approach.

  4. Pre-Hardware Optimization of Spacecraft Image Processing Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Petrick, David J.; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Day, John H. (Technical Monitor)

    2002-01-01

    Spacecraft telemetry rates and telemetry product complexity have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image data processing and color picture generation application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The proposed solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms, and reconfigurable computing hardware (RC) technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processors (DSP). It has been shown that this approach can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft.

  5. Utility of Digital Stereo Images for Optic Disc Evaluation

    PubMed Central

    Ying, Gui-shuang; Pearson, Denise J.; Bansal, Mayank; Puri, Manika; Miller, Eydie; Alexander, Judith; Piltz-Seymour, Jody; Nyberg, William; Maguire, Maureen G.; Eledath, Jayan; Sawhney, Harpreet

    2010-01-01

    Purpose. To assess the suitability of digital stereo images for optic disc evaluations in glaucoma. Methods. Stereo color optic disc images in both digital and 35-mm slide film formats were acquired contemporaneously from 29 subjects with various cup-to-disc ratios (range, 0.26–0.76; median, 0.475). Using a grading scale designed to assess image quality, the ease of visualizing optic disc features important for glaucoma diagnosis, and the comparative diameters of the optic disc cup, experienced observers separately compared the primary digital stereo images to each subject's 35-mm slides, to scanned images of the same 35-mm slides, and to grayscale conversions of the digital images. Statistical analysis accounted for multiple gradings and comparisons and also assessed image formats under monoscopic viewing. Results. Overall, the quality of primary digital color images was judged superior to that of 35-mm slides (P < 0.001), including improved stereo (P < 0.001), but the primary digital color images were mostly equivalent to the scanned digitized images of the same slides. Color seemingly added little to grayscale optic disc images, except that peripapillary atrophy was best seen in color (P < 0.0001); both the nerve fiber layer (P < 0.0001) and the paths of blood vessels on the optic disc (P < 0.0001) were best seen in grayscale. The preference for digital over film images was maintained under monoscopic viewing conditions. Conclusions. Digital stereo optic disc images are useful for evaluating the optic disc in glaucoma and allow the application of advanced image processing applications. Grayscale images, by providing luminance distinct from color, may be informative for assessing certain features. PMID:20505199

  6. Attenuation characteristics of fiberoptic plates for digital mammography and other X-ray imaging applications.

    PubMed

    Vedantham, S; Karellas, A; Suryanarayanan, S

    2003-01-01

    Spatially coherent fiberoptic plates are important components of some charge-coupled device (CCD)-based x-ray imaging systems. These plates efficiently transmit scintillations from the phosphor and also filter out x-rays not absorbed by the phosphor, thus protecting the CCD from direct x-ray interaction. The thickness of the fiberoptic plate and the CCD package presents a significant challenge in the design of a digital x-ray cassette capable of insertion into the existing film-screen cassette holders of digital mammography systems. This study was performed with the aim of optimizing fiberoptic plate thickness. Attenuation measurements were performed on nine fiberoptic plates varying in material composition that exhibit desirable optical characteristics such as good coupling efficiency. Mammographic spectra from a clinical mammographic system and an Americium-241 (Am-241) source (59.54 keV) were used. The spectra were recorded with a high-resolution cadmium zinc telluride (CZT)-based spectrometer and corrected for dead time and pile-up. The linear attenuation coefficients varied by a factor of 3 across the set of tested fiberoptic plates at both mammographic energies and 59.54 keV. Our results suggest that a 3-mm thick high-absorption plate might provide adequate shielding at mammographic energies. A thickness of 2 mm is feasible for mammographic applications with further optimization of the fiberoptic plate composition by incorporating a non-scintillating, high-atomic-number material. This would allow more space for cooling components of the cassette and for a more compact device, which is critical for clinical implementation of the technology.
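
    A quick Beer-Lambert calculation illustrates why plate thickness matters for shielding; the linear attenuation coefficient used below is purely hypothetical and is not one of the measured values reported in the study.

```python
import numpy as np

# Transmitted fraction through a fiberoptic plate of thickness t: exp(-mu * t).
mu = 2.0   # ASSUMED linear attenuation coefficient, mm^-1 (illustrative only)
for thickness_mm in (2.0, 3.0):
    transmitted = np.exp(-mu * thickness_mm)
    print(f"{thickness_mm:.0f} mm plate: {transmitted:.2e} of incident x rays transmitted")
```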

  7. HIPAA, dermatology images, and the law.

    PubMed

    Scheinfeld, Noah; Rothstein, Brooke

    2013-12-01

    From smart phones to iPads, the world has grown increasingly reliant on new technology. In this ever-expanding digital age, medicine is at the forefront of these new technologies. In the field of dermatology and general medicine, digital images have become an important tool used in patient management. Today, one can even find physicians who use their cellular phone cameras to take patient images and transmit them to other physicians. However, as digital imaging technology has become more prevalent so too have concerns about the impact of this technology on the electronic medical record, quality of patient care, and medicolegal issues. This article will discuss the advent of digital imaging technology in dermatology and the legal ramifications digital images have on medical care, abiding by HIPAA, the use of digital images as evidence, and the possible abuses digital images can pose in a health care setting.

  8. Comparative study between quantitative digital image analysis and fluorescence in situ hybridization of breast cancer equivocal human epidermal growth factor receptors 2 score 2(+) cases.

    PubMed

    Ayad, Essam; Mansy, Mina; Elwi, Dalal; Salem, Mostafa; Salama, Mohamed; Kayser, Klaus

    2015-01-01

    Optimization of the workflow for breast cancer samples with equivocal human epidermal growth factor receptor 2 (HER2)/neu score 2(+) results in routine practice remains a central focus of the ongoing efforts to assess HER2 status. According to the College of American Pathologists/American Society of Clinical Oncology guidelines, equivocal HER2/neu score 2(+) cases are subject to further testing, usually by fluorescence in situ hybridization (FISH). It remains an open question whether quantitative digital image analysis of HER2 immunohistochemistry (IHC) stained slides can assist in further refining the HER2 score 2(+). The aim was to assess the utility of quantitative digital analysis of IHC stained slides and compare its performance to FISH in cases of breast cancer with equivocal HER2 score 2(+). Fifteen specimens (previously diagnosed as breast cancer and evaluated as HER2 score 2(+)) represented the study population. New cuts were prepared for re-evaluation by HER2 immunohistochemical staining and FISH examination. All the cases were digitally scanned by iScan (produced by BioImagene [now Roche-Ventana]). The IHC signals of HER2 were measured using an automated image analyzing system (MECES, www.Diagnomx.eu/meces). Finally, a comparative study was done between the results of FISH and the quantitative analysis of the virtual slides. Three out of the 15 cases with equivocal HER2 score 2(+) turned out to be positive (3(+)) by quantitative digital analysis, and 12 were found to be negative by FISH as well. Two of these three positive cases proved to be positive with FISH, and only one was negative. Quantitative digital analysis is highly sensitive and relatively specific when compared to FISH in detecting HER2/neu overexpression. Therefore, it represents a potentially reliable substitute for FISH in breast cancer cases that require further refinement of equivocal IHC results.

  9. Improved Dot Diffusion For Image Halftoning

    DTIC Science & Technology

    1999-01-01

    The dot diffusion method for digital halftoning has the advantage of parallelism, unlike the error diffusion method. The method was recently improved...by optimization of the so-called class matrix so that the resulting halftones are comparable to the error diffused halftones. In this paper we will...first review the dot diffusion method. Previously, 82 class matrices were used for the dot diffusion method. A problem with this size of class matrix is

  10. Optimization and Comparison of Different Digital Mammographic Tomosynthesis Reconstruction Methods

    DTIC Science & Technology

    2008-04-01

    physical measurements of impulse response analysis, modulation transfer function (MTF) and noise power spectrum (NPS). (Months 5-12). This task has...and 2 impulse-added: projection images with simulated impulse and the I/r2 shading difference. Other system blur and noise issues are not...blur, and suppressed high frequency noise. Point-by-point BP rather than traditional SAA should be considered as the basis of further deblurring

  11. WE-EF-207-03: Design and Optimization of a CBCT Head Scanner for Detection of Acute Intracranial Hemorrhage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, J; Sisniega, A; Zbijewski, W

    Purpose: To design a dedicated x-ray cone-beam CT (CBCT) system suitable for deployment at the point-of-care and offering reliable detection of acute intracranial hemorrhage (ICH), traumatic brain injury (TBI), stroke, and other head and neck injuries. Methods: A comprehensive task-based image quality model was developed to guide system design and optimization of a prototype head scanner suitable for imaging of acute TBI and ICH. Previously reported models were expanded to include the effects of x-ray scatter correction necessary for detection of low-contrast ICH and the contribution of bit depth (digitization noise) to imaging performance. Task-based detectability index provided the objective function for optimization of system geometry, x-ray source, detector type, anti-scatter grid, and technique at 10–25 mGy dose. Optimal characteristics were experimentally validated using a custom head phantom with 50 HU contrast ICH inserts imaged on a CBCT imaging bench allowing variation of system geometry, focal spot size, detector, grid selection, and x-ray technique. Results: The model guided selection of a system geometry with a nominal source-detector distance of 1100 mm and an optimal magnification of 1.50. A focal spot size of ∼0.6 mm was sufficient for the spatial resolution requirements of ICH detection. Imaging at 90 kVp yielded the best tradeoff between noise and contrast. The model provided quantitation of tradeoffs between flat-panel and CMOS detectors with respect to electronic noise, field of view, and readout speed required for imaging of ICH. An anti-scatter grid was shown to provide modest benefit in conjunction with post-acquisition scatter correction. Images of the head phantom demonstrate visualization of millimeter-scale simulated ICH. Conclusions: Performance consistent with acute TBI and ICH detection is feasible with model-based system design and robust artifact correction in a dedicated head CBCT system. Further improvements can be achieved with incorporation of model-based iterative reconstruction techniques, also within the scope of the task-based optimization framework. David Foos and Xiaohui Wang are employees of Carestream Health.

  12. Experimental high-speed network

    NASA Astrophysics Data System (ADS)

    McNeill, Kevin M.; Klein, William P.; Vercillo, Richard; Alsafadi, Yasser H.; Parra, Miguel V.; Dallas, William J.

    1993-09-01

    Many existing local area networking protocols currently applied in medical imaging were originally designed for relatively low-speed, low-volume networking. These protocols utilize small packet sizes appropriate for text based communication. Local area networks of this type typically provide raw bandwidth under 125 MHz. These older network technologies are not optimized for the low delay, high data traffic environment of a totally digital radiology department. Some current implementations use point-to-point links when greater bandwidth is required. However, the use of point-to-point communications for a total digital radiology department network presents many disadvantages. This paper describes work on an experimental multi-access local area network called XFT. The work includes the protocol specification, and the design and implementation of network interface hardware and software. The protocol specifies the Physical and Data Link layers (OSI layers 1 & 2) for a fiber-optic based token ring providing a raw bandwidth of 500 MHz. The protocol design and implementation of the XFT interface hardware includes many features to optimize image transfer and provide flexibility for additional future enhancements which include: a modular hardware design supporting easy portability to a variety of host system buses, a versatile message buffer design providing 16 MB of memory, and the capability to extend the raw bandwidth of the network to 3.0 GHz.

  13. Overview of CMOS process and design options for image sensor dedicated to space applications

    NASA Astrophysics Data System (ADS)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of huge-volume markets (mobile phones, digital cameras...), CMOS technologies for image sensors have improved significantly. New process flows have appeared in order to optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with some space applications as targets. These three technologies are a standard, an improved, and a sensor-optimized CMOS process in the 0.35 μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain and noise. Other measurements such as Modulation Transfer Function (MTF) and crosstalk are described in [1]. A comparison between the results has been done and three categories of CMOS process for image sensors have been identified. Radiation tolerance has also been studied for the improved CMOS process, with the aim of hardening the imager by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing-dose radiation tolerance when specific techniques are applied.

  14. Techniques to improve the accuracy of noise power spectrum measurements in digital x-ray imaging based on background trends removal.

    PubMed

    Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin

    2011-03-01

    Noise characterization through estimation of the noise power spectrum (NPS) is a central component of the evaluation of digital x-ray systems. Extensive work has been conducted to achieve accurate and precise measurement of the NPS. One approach to improving the accuracy of the NPS measurement is to reduce the statistical variance of the NPS results by involving more data samples. However, this method is based on the assumption that the noise in a radiographic image arises from stochastic processes. In practical data, artifacts are always superimposed on the stochastic noise as low-frequency background trends and prevent accurate NPS estimation. The purpose of this study was to investigate an appropriate background detrending technique to improve the accuracy of NPS estimation for digital x-ray systems. In order to find the optimal background detrending technique for NPS estimation, four methods for artifact removal were quantitatively studied and compared: (1) subtraction of a low-pass-filtered version of the image, (2) subtraction of a 2-D first-order fit to the image, (3) subtraction of a 2-D second-order polynomial fit to the image, and (4) subtraction of two uniform exposure images. In addition, background trend removal was applied separately within the original region of interest or its partitioned sub-blocks for all four methods. The performance of the background detrending techniques was compared according to the statistical variance of the NPS results and the suppression of the low-frequency systematic rise. Among the four methods, subtraction of a 2-D second-order polynomial fit to the image was most effective in suppressing the low-frequency systematic rise and reducing the variance of the NPS estimate for the authors' digital x-ray system. Subtraction of a low-pass-filtered version of the image led to an increase in NPS variance above the low-frequency components because of the side-lobe effects of the frequency response of the boxcar filtering function. Subtracting two uniform exposure images gave the worst result for the smoothness of the NPS curve, although it was effective in suppressing the low-frequency systematic rise. Subtraction of a 2-D first-order fit to the image was also identified as effective for background detrending, but it was worse than subtraction of a 2-D second-order polynomial fit for the authors' digital x-ray system. As a result of this study, the authors verified that it is necessary and feasible to obtain a better NPS estimate by appropriate background trend removal. Subtraction of a 2-D second-order polynomial fit to the image was the most appropriate technique for background detrending, without consideration of processing time.
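
    The detrending step found most effective, subtraction of a 2-D second-order polynomial fit, can be sketched in numpy as follows; the ROI handling, sub-block partitioning, and NPS normalization details of the study are not reproduced, and the pixel pitches in the trailing comment are placeholders.

```python
import numpy as np

def detrend_second_order(roi):
    """Remove a 2-D second-order polynomial background trend from an ROI
    before NPS estimation: least-squares fit of 1, x, y, x^2, x*y, y^2."""
    rows, cols = roi.shape
    y, x = np.mgrid[0:rows, 0:cols]
    x = x.ravel().astype(float)
    y = y.ravel().astype(float)
    design = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(design, roi.ravel().astype(float), rcond=None)
    trend = design @ coeffs
    return roi - trend.reshape(rows, cols)

# A per-ROI periodogram estimate of the 2-D NPS could then follow, e.g.
# nps = np.abs(np.fft.fft2(detrend_second_order(roi)))**2 * (px * py) / roi.size
# where px, py are the detector pixel pitches.
```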

  15. On the Optimum Architecture of the Biologically Inspired Hierarchical Temporal Memory Model Applied to the Hand-Written Digit Recognition

    NASA Astrophysics Data System (ADS)

    Štolc, Svorad; Bajla, Ivan

    2010-01-01

    In the paper we describe the basic functions of the Hierarchical Temporal Memory (HTM) network, based on a novel biologically inspired model of the large-scale structure of the mammalian neocortex. The focus of this paper is a systematic exploration of how to optimize the important controlling parameters of the HTM model applied to the classification of hand-written digits from the USPS database. The statistical properties of this database are analyzed using the permutation test, which employs a randomization distribution of the training and testing data. Based on the notion of homogeneous usage of input image pixels, a methodology for HTM parameter optimization is proposed. In order to study the effects of two substantial parameters of the architecture, the patch size and the overlap, in more detail, we have restricted ourselves to single-level HTM networks. A novel method for constructing the training sequences by ordering series of static images is developed. A novel method for estimating the parameter maxDist, based on the box counting method, is proposed. The parameter sigma of the inference Gaussian is optimized by maximizing the entropy of the belief distribution. Both optimization algorithms can equally be applied to multi-level HTM networks. The influence of the parameters transitionMemory and requestedGroupCount on HTM network performance has been explored. Altogether, we have investigated 2736 different HTM network configurations. The obtained classification accuracy results have been benchmarked against the published results of several conventional classifiers.

  16. Image processing techniques for digital orthophotoquad production

    USGS Publications Warehouse

    Hood, Joy J.; Ladner, L. J.; Champion, Richard A.

    1989-01-01

    Orthophotographs have long been recognized for their value as supplements or alternatives to standard maps. Recent trends toward digital cartography have resulted in efforts by the US Geological Survey to develop a digital orthophotoquad production system. Digital image files were created by scanning color infrared photographs on a microdensitometer. Rectification techniques were applied to remove tilt and relief displacement, thereby creating digital orthophotos. Image mosaicking software was then used to join the rectified images, producing digital orthophotos in quadrangle format.

  17. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    PubMed

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. Obtaining a space-efficient design over 64 × 64 pixels means that the standard processing electronics used off-chip cannot be implemented at the pixel level. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals, with balanced optimization for signal-to-noise ratio and silicon area. This custom-made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; the low-resource implementation of the digital processor enables on-chip processing; and the data bottleneck that exists between the detector and the processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  18. Validation of the international labour office digitized standard images for recognition and classification of radiographs of pneumoconiosis.

    PubMed

    Halldin, Cara N; Petsonk, Edward L; Laney, A Scott

    2014-03-01

    Chest radiographs are recommended for prevention and detection of pneumoconiosis. In 2011, the International Labour Office (ILO) released a revision of the International Classification of Radiographs of Pneumoconioses that included a digitized standard images set. The present study compared results of classifications of digital chest images performed using the new ILO 2011 digitized standard images to classification approaches used in the past. Underground coal miners (N = 172) were examined using both digital and film-screen radiography (FSR) on the same day. Seven National Institute for Occupational Safety and Health-certified B Readers independently classified all 172 digital radiographs, once using the ILO 2011 digitized standard images (DRILO2011-D) and once using digitized standard images used in the previous research (DRRES). The same seven B Readers classified all the miners' chest films using the ILO film-based standards. Agreement between classifications of FSR and digital radiography was identical, using a standard image set (either DRILO2011-D or DRRES). The overall weighted κ value was 0.58. Some specific differences in the results were seen and noted. However, intrareader variability in this study was similar to the published values and did not appear to be affected by the use of the new ILO 2011 digitized standard images. These findings validate the use of the ILO digitized standard images for classification of small pneumoconiotic opacities. When digital chest radiographs are obtained and displayed appropriately, results of pneumoconiosis classifications using the 2011 ILO digitized standards are comparable to film-based ILO classifications and to classifications using earlier research standards. Published by Elsevier Inc.

  19. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  20. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  1. Study of optical techniques for the Ames unitary wind tunnel: Digital image processing, part 6

    NASA Technical Reports Server (NTRS)

    Lee, George

    1993-01-01

    A survey of digital image processing techniques and processing systems for aerodynamic images has been conducted. These images covered many types of flows and were generated by many types of flow diagnostics. These include laser vapor screens, infrared cameras, laser holographic interferometry, Schlieren, and luminescent paints. Some general digital image processing systems, imaging networks, optical sensors, and image computing chips were briefly reviewed. Possible digital imaging network systems for the Ames Unitary Wind Tunnel were explored.

  2. Going forward through the world: thinking theoretically about first person perspective digital ethnography.

    PubMed

    Pink, Sarah

    2015-06-01

    Engaging first person perspective recording as a type of digital ethnography invites the question of how we might understand the status of the knowledge it produces. To examine this question I will focus on how first person perspective camera recordings might be engaged and made analytically meaningful in disciplines where naturalistic and observational visual recording is uncommon and where the idea of producing naturalistic or optimally objective visual recordings of people's lives is problematized. In doing so I explore the wider possibilities of these technologies for ethnographic research both beyond their existing uses and for interdisciplinary research where the images they produce might be analysed from more than one perspective.

  3. Fast polyenergetic forward projection for image formation using OpenCL on a heterogeneous parallel computing platform.

    PubMed

    Zhou, Lili; Clifford Chao, K S; Chang, Jenghwa

    2012-11-01

    Simulated projection images of digital phantoms constructed from CT scans have been widely used for clinical and research applications, but their quality and computation speed are not optimal for real-time comparison with radiographs acquired with x-ray sources of different energies. In this paper, the authors performed polyenergetic forward projections using open computing language (OpenCL) in a parallel computing ecosystem consisting of CPU and general purpose graphics processing unit (GPGPU) for fast and realistic image formation. The proposed polyenergetic forward projection uses a lookup table containing the NIST published mass attenuation coefficients (μ/ρ) for different tissue types and photon energies ranging from 1 keV to 20 MeV. The CT images of the sites of interest are first segmented into different tissue types based on the CT numbers and converted to a three-dimensional attenuation phantom by linking each voxel to the corresponding tissue type in the lookup table. The x-ray source can be a radioisotope or an x-ray generator with a known spectrum described as weight w(n) for energy bin E(n). The Siddon method is used to compute the x-ray transmission line integral for E(n), and the x-ray fluence is the weighted sum of the exponentials of the line integrals over all energy bins, with added Poisson noise. To validate this method, a digital head and neck phantom constructed from the CT scan of a Rando head phantom was segmented into three regions (air, gray/white matter, and bone) for calculating the polyenergetic projection images for the Mohan 4 MV energy spectrum. To accelerate the calculation, the authors partitioned the workloads using task parallelism and data parallelism and scheduled them in a parallel computing ecosystem consisting of CPU and GPGPU (NVIDIA Tesla C2050) using OpenCL only. The authors explored the task overlapping strategy and the sequential method for generating the first and subsequent digitally reconstructed radiographs (DRRs). A dispatcher was designed to drive the high-degree parallelism of the task overlapping strategy. Numerical experiments were conducted to compare the performance of the OpenCL/GPGPU-based implementation with the CPU-based implementation. The projection images were similar to typical portal images obtained with a 4 or 6 MV x-ray source. For a phantom size of 512 × 512 × 223, the time for calculating the line integrals for a 512 × 512 image panel was 16.2 ms on the GPGPU for one energy bin, in comparison to 8.83 s on the CPU. The total computation time for generating one polyenergetic projection image of 512 × 512 was 0.3 s (141 s on the CPU). The relative difference between the projection images obtained with the CPU-based and OpenCL/GPGPU-based implementations was on the order of 10⁻⁶, and the images were virtually indistinguishable. The task overlapping strategy was 5.84 and 1.16 times faster than the sequential method for the first and the subsequent DRRs, respectively. The authors have successfully built digital phantoms using anatomic CT images and NIST μ/ρ tables for simulating realistic polyenergetic projection images and optimized the processing speed with parallel computing using a GPGPU/OpenCL-based implementation. The computation time was fast enough (0.3 s per projection image) for real-time IGRT (image-guided radiotherapy) applications.
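
    The forward model described above, detected fluence as the spectrum-weighted sum of exponentials of the monoenergetic line integrals with Poisson noise, can be summarized in a minimal sketch. The energy bins, spectrum weights, path lengths and attenuation coefficients below are illustrative placeholders, not the NIST data or the Mohan spectrum used by the authors, and no GPU acceleration is shown.

    ```python
    # Minimal sketch of the polyenergetic forward-projection model for one ray:
    # fluence = n0 * sum_n w(n) * exp(-line_integral(E_n)), plus Poisson noise.
    # All numerical values are illustrative assumptions.
    import numpy as np

    energies = np.array([1.0, 2.0, 4.0])        # energy bins E(n) in MeV
    weights  = np.array([0.2, 0.5, 0.3])        # spectrum weights w(n), sum to 1

    # Path lengths (cm) through each tissue along one ray (a Siddon-type ray
    # tracer would supply these in the real implementation).
    path_cm = {"air": 10.0, "soft": 15.0, "bone": 2.0}

    # Hypothetical linear attenuation coefficients mu(E, tissue) in 1/cm.
    mu = {
        "air":  np.array([1e-4, 8e-5, 6e-5]),
        "soft": np.array([0.070, 0.049, 0.034]),
        "bone": np.array([0.130, 0.090, 0.060]),
    }

    def polyenergetic_counts(n0=1e5):
        """Expected detected counts for one ray, incident fluence n0 per bin."""
        line_integral = sum(path_cm[t] * mu[t] for t in path_cm)  # per energy bin
        expected = n0 * np.sum(weights * np.exp(-line_integral))
        return np.random.poisson(expected)                         # Poisson noise

    print(polyenergetic_counts())
    ```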

  4. 42 CFR 37.51 - Interpreting and classifying chest radiographs-digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... standard digital chest radiographic images provided for use with the Guidelines for the Use of the ILO... NIOSH-approved standard digital images may be used for classifying digital chest images for pneumoconiosis. Modification of the appearance of the standard images using software tools is not permitted. (d...

  5. Confocal mosaicing microscopy of human skin ex vivo: spectral analysis for digital staining to simulate histology-like appearance

    NASA Astrophysics Data System (ADS)

    Bini, Jason; Spain, James; Nehal, Kishwer; Hazelwood, Vikki; Dimarzio, Charles; Rajadhyaksha, Milind

    2011-07-01

    Confocal mosaicing microscopy enables rapid imaging of large areas of fresh tissue, without the processing that is necessary for conventional histology. Mosaicing may offer a means to perform rapid histology at the bedside. A possible barrier toward clinical acceptance is that the mosaics are based on a single mode of grayscale contrast and appear black and white, whereas histology is based on two stains (hematoxylin for nuclei, eosin for cellular cytoplasm and dermis) and appears purple and pink. Toward addressing this barrier, we report advances in digital staining: fluorescence mosaics that show only nuclei, are digitally stained purple and overlaid on reflectance mosaics, which show only cellular cytoplasm and dermis, and are digitally stained pink. With digital staining, the appearance of confocal mosaics mimics the appearance of histology. Using multispectral analysis and color matching functions, red, green, and blue (RGB) components of hematoxylin and eosin stains in tissue were determined. The resulting RGB components were then applied in a linear algorithm to transform fluorescence and reflectance contrast in confocal mosaics to the absorbance contrast seen in pathology. Optimization of staining with acridine orange showed improved quality of digitally stained mosaics, with good correlation to the corresponding histology.
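
    A hedged sketch of the digital staining idea follows: the fluorescence channel (nuclei) is coloured hematoxylin-like purple and the reflectance channel (cytoplasm/dermis) eosin-like pink through a simple Beer-Lambert-style linear mapping. The RGB absorbance vectors are commonly quoted illustrative values, not the colour-matched components derived in the paper, and the blending is a generic stand-in for the authors' linear algorithm.

    ```python
    # Hedged sketch of digital H&E-like staining of two grayscale channels.
    # The absorbance vectors below are illustrative placeholders.
    import numpy as np

    def digital_stain(fluor, refl,
                      hematoxylin_rgb=(0.65, 0.70, 0.29),  # assumed absorbance (R,G,B)
                      eosin_rgb=(0.07, 0.99, 0.11)):
        """fluor, refl: 2-D arrays normalised to [0, 1]; returns an RGB image."""
        h = np.asarray(hematoxylin_rgb)
        e = np.asarray(eosin_rgb)
        # Total absorbance per colour channel, then convert to transmitted colour.
        absorbance = fluor[..., None] * h + refl[..., None] * e
        return np.clip(np.exp(-absorbance), 0.0, 1.0)

    # Example with random stand-in mosaics
    rng = np.random.default_rng(0)
    rgb = digital_stain(rng.random((64, 64)), rng.random((64, 64)))
    print(rgb.shape)   # (64, 64, 3)
    ```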

  6. [Comparative evaluation of six different body regions of the dog using analog and digital radiography].

    PubMed

    Meyer-Lindenberg, Andrea; Ebermaier, Christine; Wolvekamp, Pim; Tellhelm, Bernd; Meutstege, Freek J; Lang, Johann; Hartung, Klaus; Fehr, Michael; Nolte, Ingo

    2008-01-01

    In this study the quality of digital and analog radiography in dogs was compared. For this purpose, three conventional radiographs (varying in exposure) and three digital radiographs (varying in MUSI-contrast [MUSI = MUlti Scale Image Contrast], the main post-processing parameter) of six different body regions of the dog were evaluated (thorax, abdomen, skull, femur, hip joints, elbow). The quality of the radiographs was evaluated by eight veterinary specialists experienced in radiographic imaging, using a questionnaire based on the details of each body region that are significant for obtaining a radiographic diagnosis. In the first part of the study the overall quality of the radiographs was evaluated. When choosing the best image within a region, the evaluators selected a digital radiograph in 89.5% (43/48) of cases. Divided into analog and digital groups, the digital image with the highest MUSI-contrast was most often considered the best, while the analog image considered the best varied between the one with the medium and the one with the longest exposure time. In the second part of the study, each image was rated for the visibility of specific, diagnostically important details. After summarisation of the scores for each criterion, divided into analog and digital imaging, the digital images were rated considerably superior to conventional images. The results of the image comparison revealed that digital radiographs showed better image detail than radiographs taken with the analog technique in all six areas of the body.

  7. GCaMP expression in retinal ganglion cells characterized using a low-cost fundus imaging system

    NASA Astrophysics Data System (ADS)

    Chang, Yao-Chuan; Walston, Steven T.; Chow, Robert H.; Weiland, James D.

    2017-10-01

    Objective. Virus-transduced, intracellular-calcium indicators are effective reporters of neural activity, offering the advantage of cell-specific labeling. Because there is an optimal time window for the expression of calcium indicators, a suitable tool for tracking genetically encoded calcium indicator (GECI) expression in vivo following transduction is highly desirable. Approach. We developed a noninvasive imaging approach based on a custom-modified, low-cost fundus viewing system that allowed us to monitor and characterize in vivo bright-field and fluorescence images of the mouse retina. AAV2-CAG-GCaMP6f was injected into a mouse eye. The fundus imaging system was used to measure fluorescence at several time points post injection. At defined time points, we prepared wholemount retina mounted on a transparent multielectrode array and used calcium imaging to evaluate the responsiveness of retinal ganglion cells (RGCs) to external electrical stimulation. Main results. The noninvasive fundus imaging system clearly resolves individual RGCs and axons. RGC fluorescence intensity and the number of observable fluorescent cells show a similar rising trend from week 1 to week 3 after viral injection, indicating a consistent increase of GCaMP6f expression. Analysis of the in vivo fluorescence intensity trend and in vitro neurophysiological responsiveness shows that the slope of intensity versus days post injection can be used to estimate the optimal time for calcium imaging of RGCs in response to external electrical stimulation. Significance. The proposed fundus imaging system enables high-resolution digital fundus imaging in the mouse eye, based on off-the-shelf components. The long-term tracking experiment with in vitro calcium imaging validation demonstrates that the system can serve as a powerful tool for monitoring the level of GECI expression and for determining the optimal time window for subsequent experiments.
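
    The expression-tracking analysis described above reduces to fitting a line to fluorescence intensity versus days post injection and using the slope as an index of expression rate. A minimal sketch follows; the numbers are invented for illustration and are not the study's measurements.

    ```python
    # Simple sketch: slope of in vivo fluorescence intensity vs days post
    # injection as an index of GCaMP6f expression rate. Values are illustrative.
    import numpy as np

    days      = np.array([3, 7, 10, 14, 17, 21])                  # days post injection
    intensity = np.array([5.0, 9.5, 14.2, 20.1, 24.8, 30.3])      # mean RGC fluorescence (a.u.)

    slope, intercept = np.polyfit(days, intensity, 1)
    print(f"expression rate ~ {slope:.2f} a.u./day")
    ```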

  8. Computerized morphometry as an aid in distinguishing recurrent versus nonrecurrent meningiomas.

    PubMed

    Noy, Shawna; Vlodavsky, Euvgeni; Klorin, Geula; Drumea, Karen; Ben Izhak, Ofer; Shor, Eli; Sabo, Edmond

    2011-06-01

    To use novel digital and morphometric methods to identify variables able to better predict the recurrence of intracranial meningiomas. Histologic images from 30 previously diagnosed meningioma tumors that recurred over 10 years of follow-up were consecutively selected from the Rambam Pathology Archives. Images were captured and morphometrically analyzed. Novel algorithms of digital pattern recognition using Fourier transformation and fractal and nuclear texture analyses were applied to evaluate the overall growth pattern complexity of the tumors, as well as the chromatin texture of individual tumor nuclei. The extracted parameters were then correlated with patient prognosis. Kaplan-Meier analyses revealed statistically significant associations between tumor morphometric parameters and recurrence times. Tumors with less nuclear orientation, more nuclear density, higher fractal dimension, and less regular chromatin textures tended to recur faster than those with a higher degree of nuclear order, less pattern complexity, lower density, and more homogeneous chromatin nuclear textures (p < 0.01). To our knowledge, these digital morphometric methods were used for the first time to accurately predict tumor recurrence in patients with intracranial meningiomas. The use of these methods may bring additional valuable information to the clinician regarding the optimal management of these patients.

  9. Digital mammography, cancer screening: Factors important for image compression

    NASA Technical Reports Server (NTRS)

    Clarke, Laurence P.; Blaine, G. James; Doi, Kunio; Yaffe, Martin J.; Shtern, Faina; Brown, G. Stephen; Winfield, Daniel L.; Kallergi, Maria

    1993-01-01

    The use of digital mammography for breast cancer screening poses several novel problems such as development of digital sensors, computer assisted diagnosis (CAD) methods for image noise suppression, enhancement, and pattern recognition, compression algorithms for image storage, transmission, and remote diagnosis. X-ray digital mammography using novel direct digital detection schemes or film digitizers results in large data sets and, therefore, image compression methods will play a significant role in the image processing and analysis by CAD techniques. In view of the extensive compression required, the relative merit of 'virtually lossless' versus lossy methods should be determined. A brief overview is presented here of the developments of digital sensors, CAD, and compression methods currently proposed and tested for mammography. The objective of the NCI/NASA Working Group on Digital Mammography is to stimulate the interest of the image processing and compression scientific community for this medical application and identify possible dual use technologies within the NASA centers.

  10. Development of a fusion approach selection tool

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Zeng, Y.

    2015-06-01

    During the last decades, the number and quality of remote sensing satellite sensors available for Earth observation have grown significantly. The amount of available multi-sensor images, along with their increased spatial and spectral resolution, provides new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST) the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means to produce images containing information that is not inherent in any single image alone. In the meantime the user has access to sophisticated commercialized image fusion techniques plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, not to mention the selection of appropriate images, resolutions and bands. Image fusion can be a machine- and time-consuming endeavour. In addition it requires knowledge about remote sensing, image fusion, digital image processing and the application. FAST shall provide the user with a quick overview of the processing flows to choose from to reach the target. FAST will ask for the available images, application parameters and desired information, and process this input to produce a workflow for quickly obtaining the best results. It will optimize data and image fusion techniques. It provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to use advanced processing methods to maximize the benefit of multi-sensor image exploitation.

  11. An Interactive Program on Digitizing Historical Seismograms

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Xu, T.

    2013-12-01

    Retrieving information from historical seismograms is of great importance since they are the only sources providing quantitative information on historical earthquakes. Modern techniques of seismology require digital forms of seismograms, which are essentially sequences of time-amplitude pairs. However, historical seismograms, after being scanned into computers, are two-dimensional arrays; each element of the arrays contains the grayscale or RGB value of the corresponding pixel. The problem of digitizing historical seismograms, that is, converting historical seismograms to digital seismograms, can be formulated as an inverse problem of generating sequences of time-amplitude pairs from a two-dimensional array, a problem with infinitely many solutions. The algorithm for automatic digitization of historical seismograms presented here uses several features of seismograms, including continuity and smoothness of the seismic traces, as prior information, and assumes that the amplitude is a single-valued function of time. An interactive program based on the algorithm is also presented. The program is developed using the Matlab GUI and offers both automatic and manual digitization modes; users can easily switch between them and try different combinations to obtain optimal results. Several examples are given to illustrate the results of digitizing seismograms using the program, including a photographic record and a wide-angle reflection/refraction seismogram. (Figure: digitized result of the program, redrawn using Golden Software Surfer; (a) automatic digitization, (b) result after manual correction.)
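
    A minimal sketch of the single-valued-trace assumption is given below: for each time column of a scanned grayscale seismogram, the darkest pixel is taken as the trace amplitude. The real algorithm additionally applies continuity and smoothness priors and interactive correction; the synthetic scan here is purely illustrative.

    ```python
    # Column-wise darkest-pixel trace extraction (amplitude as a single-valued
    # function of time). A toy synthetic scan is used for demonstration.
    import numpy as np

    def digitize_trace(gray, dt=1.0, amp_per_pixel=1.0):
        """gray: 2-D array, 0 = black ink, 255 = paper. Returns (t, amplitude)."""
        rows = np.argmin(gray, axis=0)            # darkest row in each column
        t = np.arange(gray.shape[1]) * dt         # time axis from column index
        amplitude = (gray.shape[0] / 2 - rows) * amp_per_pixel  # centre-referenced
        return t, amplitude

    # Synthetic scan with a sinusoidal trace drawn in black on white paper
    h, w = 200, 500
    img = np.full((h, w), 255, dtype=np.uint8)
    cols = np.arange(w)
    img[h // 2 + (40 * np.sin(cols / 15.0)).astype(int), cols] = 0
    t, amp = digitize_trace(img)
    print(t[:3], amp[:3])
    ```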

  12. Evaluation of LANDSAT-4 Thematic Mapper Data as Applied to Geologic Exploration: Summary of Results. [Death Valley, California, Cement-Velma, Oklahoma; Big Horn and Wind River Basins, Wyoming; Spanish Peaks, Colorado; and the Four Corners area (Paradox Basin of Utah and Colorado)

    NASA Technical Reports Server (NTRS)

    Dykstra, J. D.; Sheffield, C. A.; Everett, J. R.

    1984-01-01

    As with any tool applied to geologic exploration, maximum value results from the innovative integration of optimally processed LANDSAT-4 data with existing pertinent information and perceptive geologic thinking. The synoptic view of the satellite images and the relatively high resolution of the data permit recognition of regional tectonic patterns and their detailed mapping. The refined spatial and spectral characteristics and the digital nature of the TM data aid in detecting surface alterations associated with hydrothermal activity and microseepage of hydrocarbons. In general, as vegetation and soil cover increase, the value of the spectral components of TM data decreases with respect to the value of the spatial component of the data. This observation reinforces the experience from working with MSS data that digital processing must be optimized both for the area and for the application.

  13. Motion correction of PET brain images through deconvolution: II. Practical implementation and algorithm optimization

    NASA Astrophysics Data System (ADS)

    Raghunath, N.; Faber, T. L.; Suryanarayanan, S.; Votaw, J. R.

    2009-02-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. When patient motion is known, deconvolution methods can be used to correct the reconstructed image and reduce motion blur. This paper describes the implementation and optimization of an iterative deconvolution method that uses an ordered subset approach to make it practical and clinically viable. We performed ten separate FDG PET scans using the Hoffman brain phantom and simultaneously measured its motion using the Polaris Vicra tracking system (Northern Digital Inc., Ontario, Canada). The feasibility and effectiveness of the technique was studied by performing scans with different motion and deconvolution parameters. Deconvolution resulted in visually better images and significant improvement as quantified by the Universal Quality Index (UQI) and contrast measures. Finally, the technique was applied to human studies to demonstrate marked improvement. Thus, the deconvolution technique presented here appears promising as a valid alternative to existing motion correction methods for PET. It has the potential for deblurring an image from any modality if the causative motion is known and its effect can be represented in a system matrix.
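
    The Universal Quality Index used above to quantify the deconvolution improvement is the Wang-Bovik index combining correlation loss, luminance distortion and contrast distortion. The sketch below computes it globally over whole images for simplicity, whereas in practice it is usually averaged over sliding windows; the test images are random stand-ins.

    ```python
    # Hedged sketch of the Universal Quality Index (Wang & Bovik), computed
    # globally rather than over sliding windows as is common in practice.
    import numpy as np

    def universal_quality_index(x, y):
        x = x.astype(float).ravel()
        y = y.astype(float).ravel()
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        # Combines loss of correlation, luminance distortion and contrast distortion.
        return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

    a = np.random.rand(64, 64)
    b = a + 0.05 * np.random.randn(64, 64)
    print(universal_quality_index(a, b))
    ```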

  14. Radiologists' preferences for digital mammographic display. The International Digital Mammography Development Group.

    PubMed

    Pisano, E D; Cole, E B; Major, S; Zong, S; Hemminger, B M; Muller, K E; Johnston, R E; Walsh, R; Conant, E; Fajardo, L L; Feig, S A; Nishikawa, R M; Yaffe, M J; Williams, M B; Aylward, S R

    2000-09-01

    To determine the preferences of radiologists among eight different image processing algorithms applied to digital mammograms obtained for screening and diagnostic imaging tasks. Twenty-eight images representing histologically proved masses or calcifications were obtained by using three clinically available digital mammographic units. Images were processed and printed on film by using manual intensity windowing, histogram-based intensity windowing, mixture model intensity windowing, peripheral equalization, multiscale image contrast amplification (MUSICA), contrast-limited adaptive histogram equalization, Trex processing, and unsharp masking. Twelve radiologists compared the processed digital images with screen-film mammograms obtained in the same patient for breast cancer screening and breast lesion diagnosis. For the screening task, screen-film mammograms were preferred to all digital presentations, but the acceptability of images processed with the Trex and MUSICA algorithms was not significantly different. All printed digital images were preferred to screen-film radiographs in the diagnosis of masses; mammograms processed with unsharp masking were significantly preferred. For the diagnosis of calcifications, no processed digital mammogram was preferred to screen-film mammograms. When digital mammograms were preferred to screen-film mammograms, radiologists selected different digital processing algorithms for each of the three mammographic reading tasks and for different lesion types. Soft-copy display will eventually allow radiologists to select among these options more easily.
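
    Unsharp masking, the processing preferred for mass diagnosis above, adds a scaled high-pass component (the original minus a blurred copy) back to the image. A generic sketch follows; the sigma and amount parameters are illustrative, not the study's processing settings.

    ```python
    # Generic unsharp masking: sharpened = image + amount * (image - blurred).
    # Parameters are illustrative assumptions.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image, sigma=3.0, amount=1.0):
        blurred = gaussian_filter(image.astype(float), sigma)
        return image + amount * (image - blurred)

    img = np.random.rand(128, 128)
    sharpened = unsharp_mask(img, sigma=2.0, amount=0.8)
    print(sharpened.shape)
    ```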

  15. Lumbar spine radiography — poor collimation practices after implementation of digital technology

    PubMed Central

    Zetterberg, L G; Espeland, A

    2011-01-01

    Objectives The transition from analogue to digital radiography may have reduced the motivation to perform proper collimation, as digital techniques have made it possible to mask areas irradiated outside the area of diagnostic interest (ADI). We examined the hypothesis that collimation practices have deteriorated since digitalisation. Methods After defining the ADI, we compared the proportion of the irradiated field outside the ADI in 86 digital and 86 analogue frontal lumbar spine radiographs using the Mann–Whitney test. 50 digital images and 50 analogue images were from a Norwegian hospital and the remainder from a Danish hospital. Consecutive digital images were compared with analogue images (from the hospitals' archives) produced in the 4 years prior to digitalisation. Both hospitals' standard radiographic procedures remained unchanged during the study. For digital images, the irradiated field was assessed using non-masked raw-data images. Results The proportion of the irradiated field outside the ADI was larger in digital than in analogue images (mean 61.7% vs 42.4%, p<0.001), and also in a subsample of 39 image pairs that could be matched for patient age (p<0.001). The mean total field size was 46% larger in digital than in analogue images (791 cm2 vs 541 cm2). Conclusion Following the implementation of digital radiography, considerably larger areas were irradiated. This causes unnecessarily high radiation doses to patients. PMID:21606070

  16. Digital Image Analysis System for Monitoring Crack Growth at Elevated Temperature

    DTIC Science & Technology

    1988-05-01

    The objective of the research work reported here was to develop a new concept, based on Digital Image Analysis, for monitoring the crack-tip position ... a 512 x 512 pixel frame. c) Digital Image Analysis software developed to locate and digitize the position of the crack-tip on the observed image

  17. [Improvement of Digital Capsule Endoscopy System and Image Interpolation].

    PubMed

    Zhao, Shaopeng; Yan, Guozheng; Liu, Gang; Kuang, Shuai

    2016-01-01

    Traditional capsule endoscopes collect and transmit analog images, with weak anti-interference ability, low frame rate and low resolution. This paper presents a new digital image capsule, which collects and transmits digital images at a frame rate of up to 30 frames/s and a pixel resolution of 400 x 400. The image is compressed in the capsule and transmitted outside the capsule for decompression and interpolation. A new interpolation algorithm, based on the relationship between the image planes, is proposed to obtain higher quality colour images. Keywords: capsule endoscopy, digital image, SCCB protocol, image interpolation.

  18. A model for a PC-based, universal-format, multimedia digitization system: moving beyond the scanner.

    PubMed

    McEachen, James C; Cusack, Thomas J; McEachen, John C

    2003-08-01

    Digitizing images for use in case presentations based on hardcopy films, slides, photographs, negatives, books, and videos can present a challenging task. Scanners and digital cameras have become standard tools of the trade. Unfortunately, use of these devices to digitize multiple images in many different media formats can be a time-consuming and in some cases unachievable process. The authors' goal was to create a PC-based solution for digitizing multiple media formats in a timely fashion while maintaining adequate image presentation quality. The authors' PC-based solution makes use of off-the-shelf hardware, including a digital document camera (DDC), a VHS video player, and a video-editing kit. With the assistance of five staff radiologists, the authors examined the quality of multiple image types digitized with this equipment. The authors also quantified the speed of digitization of various types of media using the DDC and video-editing kit. With regard to image quality, the five staff radiologists rated the digitized angiography, CT, and MR images as adequate to excellent for use in teaching files and case presentations. With regard to digitized plain films, the average rating was adequate. As for performance, the authors achieved a 68% improvement in the time required to digitize hardcopy films using the DDC instead of a professional-quality scanner. The PC-based solution provides a means for digitizing multiple images from many different types of media in a timely fashion while maintaining adequate image presentation quality.

  19. Pre-Hardware Optimization and Implementation Of Fast Optics Closed Control Loop Algorithms

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Lyon, Richard G.; Herman, Jay R.; Abuhassan, Nader

    2004-01-01

    One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier Integral Transform and its high-performance digital equivalent, the Fast Fourier Transform (FFT). The FFT is particularly useful in two-dimensional (2-D) image processing (FFT2) within optical systems control. However, the timing constraints of a fast closed optics control loop would require a supercomputer to run the software implementation of the FFT2 and its inverse, as well as other representative image processing algorithms, such as numerical image folding and fringe feature extraction. A laboratory supercomputer is not always available even for ground operations and is not feasible for a flight project. However, the computationally intensive algorithms still warrant alternative implementation using reconfigurable computing (RC) technologies such as Digital Signal Processors (DSP) and Field Programmable Gate Arrays (FPGA), which provide low-cost, compact super-computing capabilities. We present a new RC hardware implementation and utilization architecture that significantly reduces the computational complexity of a few basic image-processing algorithms, such as FFT2, image folding and phase diversity, for the NASA Solar Viewing Interferometer Prototype (SVIP) using a cluster of DSPs and FPGAs. The DSP cluster utilization architecture also assures avoidance of a single point of failure, while using commercially available hardware. This, combined with pre-hardware optimization of the control algorithms, for the first time allows construction of image-based 800 Hertz (Hz) closed optics control loops on board a spacecraft, based on the SVIP ground instrument. That spacecraft is the proposed Earth Atmosphere Solar Occultation Imager (EASI), intended to study the greenhouse gases CO2, C2H, H2O, O3, O2, and N2O from the Lagrange-2 point in space. This paper provides an advanced insight into a new type of science capability for future space exploration missions based on on-board image processing for control, and for robotics missions using vision sensors. It presents a top-level description of the technologies required for the design and construction of SVIP and EASI and to advance spatial-spectral imaging and large-scale space interferometry science and engineering.
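
    The FFT2 kernel whose hardware acceleration is discussed above is illustrated below as one would prototype it in software before mapping it to DSP/FPGA hardware: a forward 2-D FFT, a simple frequency-domain operation, and the inverse transform. The low-pass mask is a generic stand-in, not the SVIP image-folding or phase-diversity processing.

    ```python
    # Prototype of a 2-D FFT round trip with a simple frequency-domain mask.
    # The mask is an illustrative placeholder for the actual flight algorithms.
    import numpy as np

    img = np.random.rand(256, 256)
    spectrum = np.fft.fftshift(np.fft.fft2(img))       # centre the zero frequency

    yy, xx = np.mgrid[-128:128, -128:128]
    mask = (xx ** 2 + yy ** 2) < 40 ** 2               # example low-pass mask
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
    print(filtered.shape)
    ```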

  20. Orthographic Stereo Correlator on the Terrain Model for Apollo Metric Images

    NASA Technical Reports Server (NTRS)

    Kim, Taemin; Husmann, Kyle; Moratto, Zachary; Nefian, Ara V.

    2011-01-01

    A stereo correlation method in the object domain is proposed to generate accurate and dense Digital Elevation Models (DEMs) from lunar orbital imagery. The NASA Ames Intelligent Robotics Group (IRG) aims to produce high-quality terrain reconstructions of the Moon from Apollo Metric Camera (AMC) data. In particular, IRG makes use of a stereo vision process, the Ames Stereo Pipeline (ASP), to automatically generate DEMs from consecutive AMC image pairs. Given the camera parameters of an image pair from bundle adjustment in ASP, a correlation window is defined on the terrain, with a predefined surface normal at each post, rather than in the image domain. The squared error between the images back-projected onto the local terrain is minimized with respect to the post elevation. This one-dimensional optimization is solved efficiently and improves the accuracy of the elevation estimate.

  1. 3D Lunar Terrain Reconstruction from Apollo Images

    NASA Technical Reports Server (NTRS)

    Broxton, Michael J.; Nefian, Ara V.; Moratto, Zachary; Kim, Taemin; Lundy, Michael; Segal, Alkeksandr V.

    2009-01-01

    Generating accurate three-dimensional planetary models is becoming increasingly important as NASA plans manned missions to return to the Moon in the next decade. This paper describes a 3D surface reconstruction system called the Ames Stereo Pipeline that is designed to produce such models automatically by processing orbital stereo imagery. We discuss two important core aspects of this system: (1) refinement of satellite station positions and pose estimates through least squares bundle adjustment; and (2) a stochastic plane fitting algorithm that generalizes the Lucas-Kanade method for optimal matching between stereo pair images. These techniques allow us to automatically produce seamless, highly accurate digital elevation models from multiple stereo image pairs while significantly reducing the influence of image noise. Our technique is demonstrated on a set of 71 high resolution scanned images from the Apollo 15 mission.

  2. Exposing exposure: automated anatomy-specific CT radiation exposure extraction for quality assurance and radiation monitoring.

    PubMed

    Sodickson, Aaron; Warden, Graham I; Farkas, Cameron E; Ikuta, Ichiro; Prevedello, Luciano M; Andriole, Katherine P; Khorasani, Ramin

    2012-08-01

    To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine (DICOM) attributes. This institutional review board-approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct in 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools.
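
    The two ingredients described above, reading DICOM attributes and running optical character recognition on a dose-screen capture, can be sketched with common open-source libraries. This is not the authors' toolkit; the file name, attribute choices and regular expression are illustrative assumptions.

    ```python
    # Hedged sketch: DICOM attribute access (pydicom) plus OCR of a dose-screen
    # secondary capture (pytesseract). File name and regex are hypothetical.
    import re
    import pydicom
    import pytesseract
    from PIL import Image

    ds = pydicom.dcmread("dose_screen.dcm")                 # hypothetical dose screen
    print(ds.get("Modality", "?"), ds.get("StationName", "n/a"))

    # Convert the screen-capture pixel data to an image and run OCR.
    text = pytesseract.image_to_string(Image.fromarray(ds.pixel_array))

    # Pull CTDIvol / DLP style numbers out of the recognised text.
    for match in re.finditer(r"(CTDIvol|DLP)\D*([\d.]+)", text):
        print(match.group(1), match.group(2))
    ```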

  3. 50 μm pixel pitch wafer-scale CMOS active pixel sensor x-ray detector for digital breast tomosynthesis.

    PubMed

    Zhao, C; Konstantinidis, A C; Zheng, Y; Anaxagoras, T; Speller, R D; Kanicki, J

    2015-12-07

    Wafer-scale CMOS active pixel sensors (APSs) have been developed recently for x-ray imaging applications. The small pixel pitch and low noise are very promising properties for medical imaging applications such as digital breast tomosynthesis (DBT). In this work, we evaluated experimentally and through modeling the imaging properties of a 50 μm pixel pitch CMOS APS x-ray detector named DynAMITe (Dynamic Range Adjustable for Medical Imaging Technology). A modified cascaded system model was developed for CMOS APS x-ray detectors by taking into account the device's nonlinear signal and noise properties. Imaging properties such as the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) were extracted from both measurements and the nonlinear cascaded system analysis. The results show that the DynAMITe x-ray detector achieves a high spatial resolution of 10 mm⁻¹ and a DQE of around 0.5 at spatial frequencies < 1 mm⁻¹. In addition, the modeling results were used to calculate the image signal-to-noise ratio (SNRi) of microcalcifications at various mean glandular doses (MGD). For an average breast (5 cm thickness, 50% glandular fraction), 165 μm microcalcifications can be distinguished at an MGD 27% lower than the clinical value (~1.3 mGy). To detect 100 μm microcalcifications, further optimization of the CMOS APS x-ray detector, image acquisition geometry and image reconstruction techniques should be considered.
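
    For context, the quantities reported above are usually related through the standard frequency-dependent DQE definition (as used in IEC 62220-1 style analyses). The form below is stated as background under that assumption, not quoted from the paper.

    ```latex
    % Standard frequency-dependent DQE definition (assumed form, not quoted
    % from the paper): relates mean signal, MTF, incident fluence and NPS.
    \mathrm{DQE}(f) \;=\; \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NPS}(f)}
    ```

    Here \bar{d} is the mean large-area detector signal, \bar{q} the incident photon fluence per unit area, MTF the presampled modulation transfer function, and NPS the measured noise power spectrum.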

  4. Joint Chroma Subsampling and Distortion-Minimization-Based Luma Modification for RGB Color Images With Application.

    PubMed

    Chung, Kuo-Liang; Hsu, Tsu-Chun; Huang, Chi-Chao

    2017-10-01

    In this paper, we propose a novel and effective hybrid method, which joins conventional chroma subsampling and distortion-minimization-based luma modification, to improve the quality of the reconstructed RGB full-color image. Assume the input RGB full-color image has been transformed to a YUV image prior to compression. For each 2×2 UV block, 4:2:0 subsampling is applied to determine the subsampled U and V components, U_s and V_s. Based on U_s, V_s, and the corresponding 2×2 original RGB block, a main theorem is provided to determine the ideally modified 2×2 luma block in constant time such that the color peak signal-to-noise ratio (CPSNR) quality distortion between the original 2×2 RGB block and the reconstructed 2×2 RGB block is minimized in a globally optimal sense. Furthermore, the proposed hybrid method and the delivered theorem are adjusted to tackle digital time delay integration images and Bayer mosaic images, whose Bayer CFA structure is widely used in modern commercial digital cameras. Based on the IMAX, Kodak, and screen content test image sets, the experimental results demonstrate that in high efficiency video coding, the proposed hybrid method achieves substantial quality improvement of the reconstructed RGB images, in terms of CPSNR, visual effect, CPSNR-bitrate trade-off, and Bjøntegaard delta PSNR performance, when compared with existing chroma subsampling schemes.
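
    Two building blocks mentioned above, averaging-based 4:2:0 chroma subsampling of 2×2 blocks and the CPSNR quality measure, are sketched below as generic illustrations. This is not the paper's closed-form luma-modification solution; the test data are random stand-ins.

    ```python
    # Generic 4:2:0 averaging subsampling of a chroma plane and a colour PSNR
    # (CPSNR) measure over all three channels of 8-bit images.
    import numpy as np

    def subsample_420(chroma):
        """Average each non-overlapping 2x2 block of a chroma plane."""
        h, w = chroma.shape
        return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def cpsnr(rgb_ref, rgb_rec):
        """Colour PSNR over all three channels of 8-bit images."""
        mse = np.mean((rgb_ref.astype(float) - rgb_rec.astype(float)) ** 2)
        return 10 * np.log10(255.0 ** 2 / mse)

    u = np.random.randint(0, 256, (4, 4)).astype(float)
    print(subsample_420(u).shape)                  # (2, 2)
    a = np.random.randint(0, 256, (8, 8, 3))
    b = np.clip(a + np.random.randint(-3, 4, a.shape), 0, 255)
    print(f"CPSNR = {cpsnr(a, b):.1f} dB")
    ```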

  5. Design of a digital phantom population for myocardial perfusion SPECT imaging research.

    PubMed

    Ghaly, Michael; Du, Yong; Fung, George S K; Tsui, Benjamin M W; Links, Jonathan M; Frey, Eric

    2014-06-21

    Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.
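
    The post-simulation organ summing and noise model described above can be sketched in a few lines: separately simulated noise-free organ projections are scaled by per-organ uptake, summed, and Poisson noise is added at a chosen count level. The organ names, uptakes and count level below are illustrative assumptions, not the study's settings.

    ```python
    # Minimal sketch: scale per-organ noise-free projections by uptake, sum,
    # and add Poisson noise at a target count level. Values are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    organ_projections = {                      # stand-ins for MC-simulated organs
        "myocardium": rng.random((64, 64)),
        "liver":      rng.random((64, 64)),
        "body":       rng.random((64, 64)),
    }
    uptake = {"myocardium": 1.0, "liver": 0.75, "body": 0.1}   # relative activity

    noise_free = sum(uptake[o] * organ_projections[o] for o in organ_projections)

    target_counts = 1e6                        # desired total counts in the projection
    scaled = noise_free * target_counts / noise_free.sum()
    noisy_projection = rng.poisson(scaled)
    print(noisy_projection.sum())
    ```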

  6. Design of a digital phantom population for myocardial perfusion SPECT imaging research

    NASA Astrophysics Data System (ADS)

    Ghaly, Michael; Du, Yong; Fung, George S. K.; Tsui, Benjamin M. W.; Links, Jonathan M.; Frey, Eric

    2014-06-01

    Digital phantoms and Monte Carlo (MC) simulations have become important tools for optimizing and evaluating instrumentation, acquisition and processing methods for myocardial perfusion SPECT (MPS). In this work, we designed a new adult digital phantom population and generated corresponding Tc-99m and Tl-201 projections for use in MPS research. The population is based on the three-dimensional XCAT phantom with organ parameters sampled from the Emory PET Torso Model Database. Phantoms included three variations each in body size, heart size, and subcutaneous adipose tissue level, for a total of 27 phantoms of each gender. The SimSET MC code and angular response functions were used to model interactions in the body and the collimator-detector system, respectively. We divided each phantom into seven organs, each simulated separately, allowing use of post-simulation summing to efficiently model uptake variations. Also, we adapted and used a criterion based on the relative Poisson effective count level to determine the required number of simulated photons for each simulated organ. This technique provided a quantitative estimate of the true noise in the simulated projection data, including residual MC simulation noise. Projections were generated in 1 keV wide energy windows from 48-184 keV assuming perfect energy resolution to permit study of the effects of window width, energy resolution, and crosstalk in the context of dual isotope MPS. We have developed a comprehensive method for efficiently simulating realistic projections for a realistic population of phantoms in the context of MPS imaging. The new phantom population and realistic database of simulated projections will be useful in performing mathematical and human observer studies to evaluate various acquisition and processing methods such as optimizing the energy window width, investigating the effect of energy resolution on image quality and evaluating compensation methods for degrading factors such as crosstalk in the context of single and dual isotope MPS.

  7. Dual-view inverted selective plane illumination microscopy (diSPIM) with improved background rejection for accurate 3D digital pathology

    NASA Astrophysics Data System (ADS)

    Hu, Bihe; Bolus, Daniel; Brown, J. Quincy

    2018-02-01

    Current gold-standard histopathology for cancerous biopsies is destructive, time consuming, and limited to 2D slices, which do not faithfully represent true 3D tumor micro-morphology. Light sheet microscopy has emerged as a powerful tool for 3D imaging of cancer biospecimens. Here, we utilize the versatile dual-view inverted selective plane illumination microscopy (diSPIM) to render digital histological images of cancer biopsies. Dual-view architecture enabled more isotropic resolution in X, Y, and Z; and different imaging modes, such as adding electronic confocal slit detection (eCSD) or structured illumination (SI), can be used to improve degraded image quality caused by background signal of large, scattering samples. To obtain traditional H&E-like images, we used DRAQ5 and eosin (D&E) staining, with 488nm and 647nm laser illumination, and multi-band filter sets. Here, phantom beads and a D&E stained buccal cell sample have been used to verify our dual-view method. We also show that via dual view imaging and deconvolution, more isotropic resolution has been achieved for optical cleared human prostate sample, providing more accurate quantitation of 3D tumor architecture than was possible with single-view SPIM methods. We demonstrate that the optimized diSPIM delivers more precise analysis of 3D cancer microarchitecture in human prostate biopsy than simpler light sheet microscopy arrangements.

  8. Radiometric calibration of wide-field camera system with an application in astronomy

    NASA Astrophysics Data System (ADS)

    Vítek, Stanislav; Nasyrova, Maria; Stehlíková, Veronika

    2017-09-01

    Camera response function (CRF) is widely used for the description of the relationship between scene radiance and image brightness. Most common application of CRF is High Dynamic Range (HDR) reconstruction of the radiance maps of imaged scenes from a set of frames with different exposures. The main goal of this work is to provide an overview of CRF estimation algorithms and compare their outputs with results obtained under laboratory conditions. These algorithms, typically designed for multimedia content, are unfortunately quite useless with astronomical image data, mostly due to their nature (blur, noise, and long exposures). Therefore, we propose an optimization of selected methods to use in an astronomical imaging application. Results are experimentally verified on the wide-field camera system using Digital Single Lens Reflex (DSLR) camera.
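
    Once a CRF has been estimated, the usual next step is the Debevec-Malik style weighted merge of differently exposed frames into a log radiance map. The sketch below assumes the log inverse response g has already been estimated and uses toy data; it is a generic HDR merge, not the optimization proposed for the astronomical camera system.

    ```python
    # Hedged sketch of a weighted HDR radiance merge given a known log inverse
    # camera response g (length-256 lookup). Exposure times and data are toy values.
    import numpy as np

    def merge_radiance(images, exposure_times, g, z_min=0, z_max=255):
        """images: list of uint8 arrays of equal shape; returns log irradiance."""
        w = lambda z: np.minimum(z - z_min, z_max - z) + 1e-6   # hat weighting
        num = np.zeros(images[0].shape, dtype=float)
        den = np.zeros_like(num)
        for img, dt in zip(images, exposure_times):
            weight = w(img.astype(float))
            num += weight * (g[img] - np.log(dt))
            den += weight
        return num / den                                        # ln E per pixel

    g = np.log(np.linspace(1, 256, 256) / 128.0)                # toy near-linear CRF
    imgs = [np.random.randint(0, 256, (32, 32), dtype=np.uint8) for _ in range(3)]
    ln_E = merge_radiance(imgs, [1 / 30, 1 / 8, 1 / 2], g)
    print(ln_E.shape)
    ```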

  9. Overlapped Fourier coding for optical aberration removal

    PubMed Central

    Horstmeyer, Roarke; Ou, Xiaoze; Chung, Jaebum; Zheng, Guoan; Yang, Changhuei

    2014-01-01

    We present an imaging procedure that simultaneously optimizes a camera’s resolution and retrieves a sample’s phase over a sequence of snapshots. The technique, termed overlapped Fourier coding (OFC), first digitally pans a small aperture across a camera’s pupil plane with a spatial light modulator. At each aperture location, a unique image is acquired. The OFC algorithm then fuses these low-resolution images into a full-resolution estimate of the complex optical field incident upon the detector. Simultaneously, the algorithm utilizes redundancies within the acquired dataset to computationally estimate and remove unknown optical aberrations and system misalignments via simulated annealing. The result is an imaging system that can computationally overcome its optical imperfections to offer enhanced resolution, at the expense of taking multiple snapshots over time. PMID:25321982
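
    The forward model behind the acquisition step described above can be sketched simply: each snapshot corresponds to a small aperture panned across the pupil (Fourier) plane, so a low-resolution intensity image is the inverse transform of the masked field spectrum. The recovery step (fusing the snapshots and estimating aberrations via simulated annealing) is not shown, and the field, aperture centre and radius below are illustrative.

    ```python
    # Hedged sketch of the overlapped-aperture forward model only: mask the
    # field spectrum with a small circular pupil aperture and form the
    # corresponding low-resolution intensity image.
    import numpy as np

    def low_res_snapshot(field, centre, radius):
        """field: complex sample field; centre/radius: aperture in pixels."""
        spectrum = np.fft.fftshift(np.fft.fft2(field))
        yy, xx = np.mgrid[:field.shape[0], :field.shape[1]]
        mask = (yy - centre[0]) ** 2 + (xx - centre[1]) ** 2 <= radius ** 2
        return np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * mask))) ** 2

    field = np.random.rand(128, 128) * np.exp(1j * np.random.rand(128, 128))
    snap = low_res_snapshot(field, centre=(64, 80), radius=12)
    print(snap.shape)
    ```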

  10. Patient-generated Digital Images after Pediatric Ambulatory Surgery.

    PubMed

    Miller, Matthew W; Ross, Rachael K; Voight, Christina; Brouwer, Heather; Karavite, Dean J; Gerber, Jeffrey S; Grundmeier, Robert W; Coffin, Susan E

    2016-07-06

    To describe the use of digital images captured by parents or guardians and sent to clinicians for assessment of wounds after pediatric ambulatory surgery. Subjects with digital images of post-operative wounds were identified as part of an on-going cohort study of infections after ambulatory surgery within a large pediatric healthcare system. We performed a structured review of the electronic health record (EHR) to determine how digital images were documented in the EHR and used in clinical care. We identified 166 patients whose parent or guardian reported sending a digital image of the wound to the clinician after surgery. A corresponding digital image was located in the EHR in only 121 of these encounters. A change in clinical management was documented in 20% of these encounters, including referral for in-person evaluation of the wound and antibiotic prescription. Clinical teams have developed ad hoc workflows to use digital images to evaluate post-operative pediatric surgical patients. Because the use of digital images to support follow-up care after ambulatory surgery is likely to increase, it is important that high-quality images are captured and documented appropriately in the EHR to ensure privacy, security, and a high-level of care.

  11. Patient-Generated Digital Images after Pediatric Ambulatory Surgery

    PubMed Central

    Ross, Rachael K.; Voight, Christina; Brouwer, Heather; Karavite, Dean J.; Gerber, Jeffrey S.; Grundmeier, Robert W.; Coffin, Susan E.

    2016-01-01

    Summary Objective To describe the use of digital images captured by parents or guardians and sent to clinicians for assessment of wounds after pediatric ambulatory surgery. Methods Subjects with digital images of post-operative wounds were identified as part of an ongoing cohort study of infections after ambulatory surgery within a large pediatric healthcare system. We performed a structured review of the electronic health record (EHR) to determine how digital images were documented in the EHR and used in clinical care. Results We identified 166 patients whose parent or guardian reported sending a digital image of the wound to the clinician after surgery. A corresponding digital image was located in the EHR in only 121 of these encounters. A change in clinical management was documented in 20% of these encounters, including referral for in-person evaluation of the wound and antibiotic prescription. Conclusion Clinical teams have developed ad hoc workflows to use digital images to evaluate post-operative pediatric surgical patients. Because the use of digital images to support follow-up care after ambulatory surgery is likely to increase, it is important that high-quality images are captured and documented appropriately in the EHR to ensure privacy, security, and a high-level of care. PMID:27452477

  12. Simultaneous acquisition of 3D shape and deformation by combination of interferometric and correlation-based laser speckle metrology.

    PubMed

    Dekiff, Markus; Berssenbrügge, Philipp; Kemper, Björn; Denz, Cornelia; Dirksen, Dieter

    2015-12-01

    A metrology system combining three laser speckle measurement techniques for simultaneous determination of 3D shape and micro- and macroscopic deformations is presented. While microscopic deformations are determined by a combination of Digital Holographic Interferometry (DHI) and Digital Speckle Photography (DSP), macroscopic 3D shape, position and deformation are retrieved by photogrammetry based on digital image correlation of a projected laser speckle pattern. The photogrammetrically obtained data extend the measurement range of the DHI-DSP system and also increase the accuracy of the calculation of the sensitivity vector. Furthermore, a precise assignment of microscopic displacements to the object's macroscopic shape for enhanced visualization is achieved. The approach allows for fast measurements with a simple setup. Key parameters of the system are optimized, and its precision and measurement range are demonstrated. As application examples, the deformation of a mandible model and the shrinkage of dental impression material are measured.
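
    The correlation-based part of such a system (digital speckle photography / digital image correlation) estimates in-plane shifts from the peak of the cross-correlation between two speckle images. The sketch below shows a whole-image, integer-pixel estimate computed via FFT; sub-pixel refinement, windowed local analysis and the interferometric (DHI) channel are not shown.

    ```python
    # Hedged sketch of FFT-based cross-correlation for speckle displacement:
    # returns the integer shift of the first image relative to the second.
    import numpy as np

    def integer_shift(ref, moved):
        R = np.fft.fft2(ref)
        M = np.fft.fft2(moved)
        corr = np.fft.ifft2(R * np.conj(M)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Map the wrapped peak position to a signed shift.
        return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(2)
    speckle = rng.random((128, 128))
    shifted = np.roll(speckle, (5, -3), axis=(0, 1))
    print(integer_shift(shifted, speckle))   # expect (5, -3)
    ```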

  13. Proposals for best-quality immunohistochemical staining of paraffin-embedded brain tissue slides in forensics.

    PubMed

    Trautz, Florian; Dreßler, Jan; Stassart, Ruth; Müller, Wolf; Ondruschka, Benjamin

    2018-01-03

    Immunohistochemistry (IHC) has become an integral part of forensic histopathology over the last decades. However, the underlying methods for IHC vary greatly depending on the institution, creating a lack of comparability. The aim of this study was to assess the optimal approach for different technical aspects of IHC, in order to improve and standardize this procedure. Therefore, qualitative results from manual and automatic IHC staining of brain samples were compared, as well as potential differences in the suitability of common IHC glass slides. Further, possibilities of image digitalization and related issues were investigated. In our study, automatic staining showed more consistent staining results compared to manual staining procedures. Digitalization and digital post-processing considerably facilitated direct analysis and analysis of reproducibility. No differences were found between commercially available microscope glass slides regarding their suitability for IHC of brain tissue, but a certain rate of tissue loss should be expected during the staining process.

  14. A compact radiation source for digital subtractive angiography

    NASA Astrophysics Data System (ADS)

    Wiedemann, H.; Baltay, M.; Carr, R.; Hernandez, M.; Lavender, W.

    1994-08-01

    Beam requirements for 33 keV radiation used in digital subtraction angiography have been established through extended experimentation first at Stanford and later at the National Synchrotron Light Source in Brookhaven. So far research and development of this medical procedure to image coronary blood vessels have been undertaken on large high energy electron storage rings. With progress in this diagnostic procedure, it is interesting to look for an optimum concept for providing a 33 keV radiation source which would fit into the environment of a hospital. A variety of competing effects and technologies to produce 33 keV radiation are available, but none of these processes provides the combination of sufficient photon flux and monochromaticity except for synchrotron radiation from an electron storage ring. The conceptual design of a compact storage ring optimized to fit into a hospital environment and producing sufficient 33 keV radiation for digital subtraction radiography will be discussed.

  15. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for National digital orthophoto generation in National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5 D digital elevation models. However, large-scale city orthophotos using early procedures have disclosed many shortcomings, e.g., ghost image, occlusion, shadow. Thus, to provide the technical base (algorithms, procedure) and experience needed for city large-scale digital orthophoto creation is essential for the near future national large-scale digital orthophoto deployment and the revision of the Standards for National Large-scale City Digital Orthophoto in National Digital Orthophoto Program (NDOP). This paper will report our initial research results as follows: (1) High-precision 3D city DSM generation through LIDAR data processing, (2) Spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) Algorithm development for generation of DTM-based orthophoto, and DBM-based orthophoto, (5) True orthophoto generation by merging DBM-based orthophoto and DTM-based orthophoto, and (6) Automatic mosaic by optimizing and combining imagery from many perspectives.

  16. VHDL Modeling and Simulation of a Digital Image Synthesizer for Countering ISAR

    DTIC Science & Technology

    2003-06-01

    This thesis discusses VHDL modeling and simulation of a full custom Application Specific Integrated Circuit (ASIC) for a Digital Image Synthesizer ... necessary for a given application. With such a digital method, it is possible for a small ship to appear as large as an aircraft carrier or any high ... INTRODUCTION TO DIGITAL IMAGE SYNTHESIZER (DIS) A. BACKGROUND The Digital Image Synthesizer (DIS) is an Application Specific Integrated Circuit

  17. 21 CFR 892.1715 - Full-field digital mammography system.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...

  18. 21 CFR 892.1715 - Full-field digital mammography system.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...

  19. 21 CFR 892.1715 - Full-field digital mammography system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...

  20. 21 CFR 892.1715 - Full-field digital mammography system.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... planar digital x-ray images of the entire breast. This generic type of device may include digital mammography acquisition software, full-field digital image receptor, acquisition workstation, automatic exposure control, image processing and reconstruction programs, patient and equipment supports, component...

  1. Digital Image Compression Using Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Serra-Ricart, M.; Garrido, L.; Gaitan, V.; Aloy, A.

    1993-01-01

    The problem of storing, transmitting, and manipulating digital images is considered. Because of the file sizes involved, large amounts of digitized image information are becoming common in modern projects. Our goal is to describe an image compression transform coder based on artificial neural network techniques (NNCTC). To assess the reliability of the NNCTC, its compression results on digital astronomical images are compared with those of the H-transform-based method used to compress the digitized sky survey of the Space Telescope Science Institute.

  2. Skin colour assessment of replanted fingers in digital images and its reliability for the incorporation of images in nursing progress notes.

    PubMed

    Terashima, Taiko; Yoshimura, Sadako

    2018-03-01

    To determine whether nurses can accurately assess the skin colour of replanted fingers displayed as digital images on a computer screen. Colour measurement and clinical diagnostic methods for medical digital images have been studied, but reproducing skin colour on a computer screen remains difficult. The inter-rater reliability of skin colour assessment scores was evaluated. In May 2014, 21 nurses who worked on a trauma ward in Japan participated in testing. Six digital images with different skin colours were used. Colours were scored from both the digital images and direct observation of the patients. The score from a digital image was defined as the test score, and its difference from the direct assessment score as the difference score. Intraclass correlation coefficients were calculated. Nurses' opinions were classified and summarised. The intraclass correlation coefficients for the test scores were fair. Although the intraclass correlation coefficients for the difference scores were poor, they improved to good when three images that might have contributed to poor reliability were excluded. Most nurses stated that it is difficult to assess skin colour in digital images; they did not think it could be a substitute for direct visual assessment. However, most nurses were in favour of including images in nursing progress notes. Although the inter-rater reliability was fairly high, the reliability of colour reproduction in digital images, as indicated by the difference scores, was poor. Nevertheless, nurses expect the incorporation of digital images in nursing progress notes to be useful. This gap between the reliability of digital colour reproduction and nurses' expectations of it must be addressed. High inter-rater reliability for digital images in nursing progress notes was not observed. Assessments of future improvements in colour reproduction technologies are required. Further digitisation and visualisation of nursing records might pose challenges. © 2017 John Wiley & Sons Ltd.

  3. Autocorrelation techniques for soft photogrammetry

    NASA Astrophysics Data System (ADS)

    Yao, Wu

    In this thesis, research is carried out on image processing, image matching search strategies, feature type and image matching, and optimal window size in image matching. To make comparisons, the soft photogrammetry package SoftPlotter is used. Two aerial photographs from the Iowa State University campus high flight 94 are scanned into digital format. In order to create a stereo model from them, interior orientation, single photograph rectification and stereo rectification are performed. Two new image matching methods, multi-method image matching (MMIM) and unsquare window image matching, are developed and compared. MMIM is used to determine the optimal window size in image matching. Twenty-four check points from four different types of ground features are used for checking the results from image matching. Comparison between these four types of ground feature shows that the methods developed here improve the speed and the precision of image matching. A process called direct transformation is described and compared with the multiple steps in image processing. The results from image processing are consistent with those from SoftPlotter. A modified LAN image header is developed and used to store the information about the stereo model and image matching. A comparison is also made between cross correlation image matching (CCIM), least difference image matching (LDIM) and least squares image matching (LSIM). The quality of image matching in relation to ground features is compared using two methods developed in this study, the coefficient surface for CCIM and the difference surface for LDIM. To reduce the amount of computation in image matching, the best-track searching algorithm, developed in this research, is used instead of the whole-range searching algorithm.
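    As an illustration of the cross-correlation matching discussed above, the following sketch matches a square window from one image against a one-dimensional search range in the other using normalized cross-correlation; the window half-size, search range, and function names are illustrative assumptions rather than the thesis's actual implementation.

```python
# Minimal sketch of window-based image matching, assuming two overlapping
# grayscale images supplied as NumPy arrays.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def match_point(left, right, row, col, half=10, search=20):
    """Find the column in `right` that best matches the window centred at
    (row, col) in `left`, searching +/- `search` pixels along the same row."""
    tmpl = left[row - half:row + half + 1, col - half:col + half + 1]
    best_score, best_col = -1.0, col
    for dc in range(-search, search + 1):
        c = col + dc
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        if cand.shape != tmpl.shape:   # skip windows clipped by the border
            continue
        score = ncc(tmpl, cand)
        if score > best_score:
            best_score, best_col = score, c
    return best_col, best_score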

  4. Image Halftoning and Inverse Halftoning for Optimized Dot Diffusion

    DTIC Science & Technology

    1998-01-01

    systems.caltech.edu, ppvnath@sys.caltech.edu ABSTRACT The dot diffusion method for digital halftoning has the advantage of parallelism unlike the error ... halftoning : ordered dither [3], error diffusion [4], neural-net based methods [2], and more recently direct binary search (DBS) [10]. Ordered dithering is a...patterns. On the other hand error diffused halftones do not suffer from periodicity and offer blue noise characteristic [11] which is found to be

  5. Analysis of identification of digital images from a map of cosmic microwaves

    NASA Astrophysics Data System (ADS)

    Skeivalas, J.; Turla, V.; Jurevicius, M.; Viselga, G.

    2018-04-01

    This paper discusses the identification of digital images from the cosmic microwave background radiation map formed from data of the European Space Agency "Planck" telescope, by applying covariance functions and wavelet theory. The estimates of the covariance functions of two digital images, or of a single image, are calculated from random functions formed from the digital images in the form of pixel vectors. The pixel vectors are formed by expanding the pixel arrays of the digital images into single vectors. When the scale of a digital image is varied, the frequencies of single-pixel color waves remain constant and the procedure for calculating covariance functions is not affected. For identification of the images, the RGB format spectrum has been applied. The impact of the RGB spectrum components and the color tensor on the covariance function estimates was analyzed. The identity of digital images is assessed from the changes in the correlation coefficient values within a certain range, using the computer program developed for this purpose.
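    A minimal sketch of the pixel-vector correlation idea described above, assuming each image is simply flattened into a single pixel vector over a common extent; the cropping rule and function name are illustrative.

```python
# Pearson correlation between the pixel vectors of two images (sketch only).
import numpy as np

def image_correlation(img_a, img_b):
    """Correlation coefficient between two images' pixel vectors, cropped to
    their overlapping extent (a crude stand-in for proper resampling)."""
    a = np.asarray(img_a, dtype=float)
    b = np.asarray(img_b, dtype=float)
    rows, cols = min(a.shape[0], b.shape[0]), min(a.shape[1], b.shape[1])
    va = a[:rows, :cols].ravel()
    vb = b[:rows, :cols].ravel()
    return float(np.corrcoef(va, vb)[0, 1])
```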

  6. Applying a 2D based CAD scheme for detecting micro-calcification clusters using digital breast tomosynthesis images: an assessment

    NASA Astrophysics Data System (ADS)

    Park, Sang Cheol; Zheng, Bin; Wang, Xiao-Hui; Gur, David

    2008-03-01

    Digital breast tomosynthesis (DBT) has emerged as a promising imaging modality for screening mammography. However, visually detecting micro-calcification clusters depicted on DBT images is a difficult task. Computer-aided detection (CAD) schemes for detecting micro-calcification clusters depicted on mammograms can achieve high performance, and the use of CAD results can assist radiologists in detecting subtle micro-calcification clusters. In this study, we compared the performance of an available 2D based CAD scheme with one that includes a new grouping and scoring method when applied to both projection and reconstructed DBT images. We selected a dataset involving 96 DBT examinations acquired on 45 women. Each DBT image set included 11 low dose projection images and a varying number of reconstructed image slices ranging from 18 to 87. In this dataset, 20 true-positive micro-calcification clusters were visually detected on the projection images and 40 on the reconstructed images. We first applied the CAD scheme that was previously developed in our laboratory to the DBT dataset. We then tested a new grouping method that defines an independent cluster by grouping the same cluster detected on different projection or reconstructed images. We then compared four scoring methods to assess the CAD performance. The maximum sensitivity levels observed for the different grouping and scoring methods were 70% and 88% for the projection and reconstructed images, with maximum false-positive rates of 4.0 and 15.9 per examination, respectively. This preliminary study demonstrates that (1) among the maximum, the minimum or the average CAD generated scores, using the maximum score of the grouped cluster regions achieved the highest performance level, (2) the histogram based scoring method is reasonably effective in reducing false-positive detections on the projection images but the overall CAD sensitivity is lower due to lower signal-to-noise ratio, and (3) CAD achieved higher sensitivity and a higher false-positive rate (per examination) on the reconstructed images. We concluded that, without changing the detection threshold or performing pre-filtering to possibly increase detection sensitivity, current CAD schemes developed and optimized for 2D mammograms perform relatively poorly; they need to be re-optimized using DBT datasets, and new grouping and scoring methods need to be incorporated into the schemes if these are to be used on DBT examinations.
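    The grouping-and-scoring idea described above can be sketched roughly as follows: detections of the same cluster on different projection or reconstructed images are merged into one region, which is then scored with the maximum CAD score. The detection tuple format and distance tolerance are assumptions, not the authors' implementation.

```python
# Group per-slice detections into clusters, then score each cluster with the
# maximum CAD score of its members (hedged sketch).
import numpy as np

def group_detections(dets, xy_tol=5.0):
    """dets: iterable of (x, y, slice_index, score).
    Returns one score per grouped cluster region."""
    groups = []  # each group: [ref_x, ref_y, list_of_scores]
    for x, y, _, score in dets:
        for g in groups:
            if np.hypot(g[0] - x, g[1] - y) <= xy_tol:
                g[2].append(score)
                break
        else:
            groups.append([x, y, [score]])
    return [max(g[2]) for g in groups]   # maximum score per grouped cluster
```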

  7. Accurate cytogenetic biodosimetry through automated dicentric chromosome curation and metaphase cell selection

    PubMed Central

    Wilkins, Ruth; Flegal, Farrah; Knoll, Joan H.M.; Rogan, Peter K.

    2017-01-01

    Accurate digital image analysis of abnormal microscopic structures relies on high quality images and on minimizing the rates of false positive (FP) and negative objects in images. Cytogenetic biodosimetry detects dicentric chromosomes (DCs) that arise from exposure to ionizing radiation, and determines radiation dose received based on DC frequency. Improvements in automated DC recognition increase the accuracy of dose estimates by reclassifying FP DCs as monocentric chromosomes or chromosome fragments. We also present image segmentation methods to rank high quality digital metaphase images and eliminate suboptimal metaphase cells. A set of chromosome morphology segmentation methods selectively filtered out FP DCs arising primarily from sister chromatid separation, chromosome fragmentation, and cellular debris. This reduced FPs by an average of 55% and was highly specific to these abnormal structures (≥97.7%) in three samples. Additional filters selectively removed images with incomplete, highly overlapped, or missing metaphase cells, or with poor overall chromosome morphologies that increased FP rates. Image selection is optimized and FP DCs are minimized by combining multiple feature based segmentation filters and a novel image sorting procedure based on the known distribution of chromosome lengths. Applying the same image segmentation filtering procedures to both calibration and test samples reduced the average dose estimation error from 0.4 Gy to <0.2 Gy, obviating the need to first manually review these images. This reliable and scalable solution enables batch processing for multiple samples of unknown dose, and meets current requirements for triage radiation biodosimetry of high quality metaphase cell preparations. PMID:29026522

  8. Repeat analysis of intraoral digital imaging performed by undergraduate students using a complementary metal oxide semiconductor sensor: An institutional case study.

    PubMed

    Yusof, Mohd Yusmiaidil Putera Mohd; Rahman, Nur Liyana Abdul; Asri, Amiza Aqiela Ahmad; Othman, Noor Ilyani; Wan Mokhtar, Ilham

    2017-12-01

    This study was performed to quantify the repeat rate of imaging acquisitions based on different clinical examinations, and to assess the prevalence of error types in intraoral bitewing and periapical imaging using a digital complementary metal-oxide-semiconductor (CMOS) intraoral sensor. A total of 8,030 intraoral images were retrospectively collected from 3 groups of undergraduate clinical dental students. The type of examination, stage of the procedure, and reasons for repetition were analysed and recorded. The repeat rate was calculated as the total number of repeated images divided by the total number of examinations. The weighted Cohen's kappa for inter- and intra-observer agreement was used after calibration and prior to image analysis. The overall repeat rate on intraoral periapical images was 34.4%. A total of 1,978 repeated periapical images were from endodontic assessment, which included working length estimation (WLE), trial gutta-percha (tGP), obturation, and removal of gutta-percha (rGP). In the endodontic imaging, the highest repeat rate was from WLE (51.9%) followed by tGP (48.5%), obturation (42.2%), and rGP (35.6%). In bitewing images, the repeat rate was 15.1% and poor angulation was identified as the most common cause of error. A substantial level of intra- and interobserver agreement was achieved. The repeat rates in this study were relatively high, especially for certain clinical procedures, warranting training in optimization techniques and radiation protection. Repeat analysis should be performed from time to time to enhance quality assurance and hence deliver high-quality health services to patients.
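    For clarity, the repeat-rate definition used above (repeated images divided by total examinations) can be written as a one-line calculation; the numbers below are purely illustrative and are not taken from the study.

```python
# Repeat rate = repeated images / total examinations (illustrative numbers only).
def repeat_rate(n_repeated: int, n_examinations: int) -> float:
    return n_repeated / n_examinations

print(f"{repeat_rate(344, 1000):.1%}")  # -> 34.4%
```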

  9. Digital Longitudinal Tomosynthesis

    NASA Astrophysics Data System (ADS)

    Rimkus, Daniel Steven

    1985-12-01

    The purpose of this dissertation was to investigate the clinical utility of digital longitudinal tomosynthesis in radiology. By acquiring a finite group of digital images during a longitudinal tomographic exposure, and processing these images, tomographic planes other than the fulcrum plane can be reconstructed. This process is now termed "tomosynthesis". A prototype system utilizing this technique was developed. Both phantom and patient studies were done with this system. The phantom studies were evaluated by subjective visual criteria and by quantitative analysis of edge sharpness and noise in the reconstructions. Two groups of patients and one volunteer were studied. The first patient group consisted of 8 patients undergoing intravenous urography (IVU). These patients had digital tomography and film tomography of the abdomen. The second patient group consisted of 4 patients with lung cancer admitted to the hospital for laser resection of endobronchial tumor. These patients had mediastinal digital tomograms to evaluate the trachea and mainstem bronchi. The knee of one volunteer was imaged by film tomography and digital tomography. The results of the phantom studies showed that the digital reconstructions accurately produced images of the desired planes. The edge sharpness of the reconstructions approached that of the acquired images. Adequate reconstructions were achieved with as few as 5 images acquired during the exposure, with the quality of the reconstructions improving as the number of images acquired increased. The IVU patients' digital studies had less contrast and spatial resolution than the film tomograms. The single renal lesion visible on the film tomograms was also visible in the digital images. The digital mediastinal studies were felt by several radiologists to be superior to a standard chest x-ray in evaluating the airways. The digital images of the volunteer's knee showed many of the same anatomic features as the film tomogram, but the digital images had less spatial and contrast resolution. With the equipment improvements discussed in the thesis, digital tomography may have an important role in radiology.
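    The reconstruction of planes other than the fulcrum plane can be illustrated with a basic shift-and-add scheme, which is one common way tomosynthesis is implemented; the per-plane shifts, integer shifting, and array names below are simplifying assumptions, not the dissertation's exact method.

```python
# Shift-and-add tomosynthesis sketch: shift each acquired projection by an
# amount proportional to the desired plane height, then average.
import numpy as np

def shift_and_add(projections, shifts_px):
    """projections: list of 2-D arrays acquired at different tube positions.
    shifts_px: per-projection horizontal shift (pixels) for the target plane."""
    plane = np.zeros_like(projections[0], dtype=float)
    for proj, s in zip(projections, shifts_px):
        plane += np.roll(proj, int(round(s)), axis=1)   # integer shift for brevity
    return plane / len(projections)
```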

  10. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    PubMed

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large number of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, which was found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides as positively or negatively stained. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results demonstrate that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and freely available at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variation in results.
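    A hedged sketch of the kind of histogram-based maximum-likelihood colour classifier described above: the Y channel is dropped, a 128-bin CbCr histogram is trained per class, and each pixel is labelled by the class with the higher likelihood. The RGB-to-CbCr coefficients follow the common ITU-R BT.601 convention; everything else (names, ranges) is illustrative.

```python
# Histogram-based ML stain classifier in the CbCr plane (sketch, not the
# authors' ImageJ implementation).
import numpy as np

BINS = 128

def rgb_to_cbcr(rgb):
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b      # BT.601 chroma
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def train_histogram(pixels_rgb):
    """Normalized 2-D CbCr histogram (class likelihood) from training pixels (N, 3)."""
    cb, cr = rgb_to_cbcr(pixels_rgb)
    h, _, _ = np.histogram2d(cb, cr, bins=BINS, range=[[0, 256], [0, 256]])
    return h / h.sum()

def classify(image_rgb, h_pos, h_neg):
    """Boolean mask: True where the positively stained likelihood wins."""
    cb, cr = rgb_to_cbcr(image_rgb)
    i = np.clip((cb / 256 * BINS).astype(int), 0, BINS - 1)
    j = np.clip((cr / 256 * BINS).astype(int), 0, BINS - 1)
    return h_pos[i, j] > h_neg[i, j]
```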

  11. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    PubMed

    Rong, Xing; Frey, Eric C

    2013-08-01

    Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from that typical for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). To make the optimization results for quantitative 90Y bremsstrahlung SPECT more general, the authors simulated multiple tumors of various sizes in the liver. The authors realistically simulated human anatomy using a digital phantom and the image formation process using a previously validated and computationally efficient method for modeling the image-degrading effects including object scatter, attenuation, and the full collimator-detector response (CDR). The scatter kernels and CDR function tables used in the modeling method were generated using a previously validated Monte Carlo simulation code. The hole length, hole diameter, and septal thickness of the obtained optimal collimator were 84, 3.5, and 1.4 mm, respectively. Compared to a commercial high-energy general-purpose collimator, the optimal collimator improved the resolution and FOM by 27% and 18%, respectively. The proposed collimator optimization method may be useful for improving quantitative SPECT imaging for radionuclides with complex energy spectra. The obtained optimal collimator provided a substantial improvement in quantitative performance for the microsphere radioembolization task considered.
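    The figure of merit described above (RMSE of volume-of-interest activity estimates, weighted by inverse mass) can be sketched as follows; this simplified version ignores the explicit bias/variance decomposition and the null-space bias term, and all array names are assumptions.

```python
# Inverse-mass-weighted RMSE figure of merit (simplified sketch).
import numpy as np

def weighted_rmse_fom(estimates, truth, masses):
    """RMSE of per-VOI activity estimates, each VOI weighted by 1/mass to
    emphasize dosimetric relevance."""
    estimates, truth, masses = map(np.asarray, (estimates, truth, masses))
    w = 1.0 / masses
    mse = np.sum(w * (estimates - truth) ** 2) / np.sum(w)
    return float(np.sqrt(mse))
```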

  12. Cost-effective handling of digital medical images in the telemedicine environment.

    PubMed

    Choong, Miew Keen; Logeswaran, Rajasvaran; Bister, Michel

    2007-09-01

    This paper concentrates on strategies for less costly handling of medical images. Aspects of digitization using conventional digital cameras, lossy compression with good diagnostic quality, and visualization through less costly monitors are discussed. For digitization of film-based media, subjective evaluation of the suitability of digital cameras as an alternative to the digitizer was undertaken. To save on storage, bandwidth and transmission time, the acceptable degree of compression with diagnostically no loss of important data was studied through randomized double-blind tests of the subjective image quality when compression noise was kept lower than the inherent noise. A diagnostic experiment was undertaken to evaluate normal low cost computer monitors as viable viewing displays for clinicians. The results show that conventional digital camera images of X-ray films were diagnostically similar to those from the expensive digitizer. Lossy compression, when used moderately with the imaging noise to compression noise ratio (ICR) greater than four, can bring about image improvement with better diagnostic quality than the original image. Statistical analysis shows that there is no diagnostic difference between expensive high quality monitors and conventional computer monitors. The results presented show good potential in implementing the proposed strategies to promote widespread cost-effective telemedicine and digital medical environments. 2006 Elsevier Ireland Ltd
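    A hedged sketch of the imaging-noise-to-compression-noise ratio (ICR) criterion mentioned above: compression noise is estimated as the RMS difference between the original and decompressed images, while the imaging noise must be estimated separately (for example, from a uniform region); inputs and names are assumptions.

```python
# ICR = imaging noise / compression noise (sketch only).
import numpy as np

def compression_noise(original, decompressed):
    """RMS pixel difference introduced by lossy compression."""
    diff = np.asarray(original, float) - np.asarray(decompressed, float)
    return float(np.sqrt(np.mean(diff ** 2)))

def icr(imaging_noise_sigma, original, decompressed):
    """ICR > 4 was the moderate-compression criterion reported above."""
    return imaging_noise_sigma / compression_noise(original, decompressed)
```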

  13. Processing, mosaicking and management of the Monterey Bay digital sidescan-sonar images

    USGS Publications Warehouse

    Chavez, P.S.; Isbrecht, J.; Galanis, P.; Gabel, G.L.; Sides, S.C.; Soltesz, D.L.; Ross, Stephanie L.; Velasco, M.G.

    2002-01-01

    Sidescan-sonar imaging systems with digital capabilities have now been available for approximately 20 years. In this paper we present several of the various digital image processing techniques developed by the U.S. Geological Survey (USGS) and used to apply intensity/radiometric and geometric corrections, as well as to enhance and digitally mosaic sidescan-sonar images of the Monterey Bay region. New software run by a WWW server was designed and implemented to allow very large image data sets, such as the digital mosaic, to be easily viewed interactively, including the ability to roam throughout the digital mosaic at the web site in either compressed or full 1-m resolution. The processing is separated into two stages: preprocessing and information extraction. In the preprocessing stage, sensor-specific algorithms are applied to correct for both geometric and intensity/radiometric distortions introduced by the sensor. This is followed by digital mosaicking of the track-line strips into quadrangle format, which can be used as input to either visual or digital image analysis and interpretation. An automatic seam removal procedure was used in combination with an interactive digital feathering/stenciling procedure to help minimize tone or seam matching problems between image strips from adjacent track-lines. The sidescan-sonar image processing package is part of the USGS Mini Image Processing System (MIPS) and has been designed to process data collected by any 'generic' digital sidescan-sonar imaging system. The USGS MIPS software, developed over the last 20 years as a public domain package, is available on the WWW at: http://terraweb.wr.usgs.gov/trs/software.html.
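    The feathering step mentioned above can be illustrated with a simple linear weight ramp across the overlap between two adjacent strips; the overlap width, orientation, and function name are assumptions, and the USGS MIPS implementation is certainly more elaborate.

```python
# Linear feathering of two overlapping image strips (sketch only).
import numpy as np

def feather_blend(strip_a, strip_b, overlap):
    """Join two strips sharing `overlap` columns, ramping strip_a's weight
    from 1 to 0 (and strip_b's from 0 to 1) across the overlap."""
    w = np.linspace(1.0, 0.0, overlap)                      # per-column weights
    left = strip_a[:, :-overlap]
    right = strip_b[:, overlap:]
    blend = strip_a[:, -overlap:] * w + strip_b[:, :overlap] * (1.0 - w)
    return np.hstack([left, blend, right])
```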

  14. High-dynamic range imaging techniques based on both color-separation algorithms used in conventional graphic arts and the human visual perception modeling

    NASA Astrophysics Data System (ADS)

    Lo, Mei-Chun; Hsieh, Tsung-Hsien; Perng, Ruey-Kuen; Chen, Jiong-Qiao

    2010-01-01

    The aim of this research is to derive illuminant-independent HDR imaging modules which can optimally reconstruct, multispectrally, every color of concern in high-dynamic-range original images for preferred cross-media color reproduction applications. Each module, based on either a broadband or a multispectral approach, incorporates models of perceptual HDR tone mapping and device characterization. In this study, an xvYCC-format HDR digital camera was used to capture HDR scene images for testing. A tone-mapping module was derived based on a multiscale representation of the human visual system and used equations similar to a Michaelis-Menten type photoreceptor adaptation equation. Additionally, an adaptive bilateral gamut mapping algorithm, using a previously derived multiple converging-points approach, was incorporated with or without adaptive unsharp masking (USM) to carry out the optimization of HDR image rendering. An LCD with the Adobe RGB (D65) standard color space was used as a soft-proofing platform to display the HDR original RGB images and to evaluate both the rendition quality and the prediction performance of the derived modules. Another LCD with the sRGB standard color space was used to test the gamut-mapping algorithms integrated with the derived tone-mapping module.
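    A hedged sketch of a Michaelis-Menten (Naka-Rushton) type photoreceptor tone-mapping operator of the general kind referenced above; the exponent and the use of the geometric mean as the adaptation level are common illustrative choices, not the authors' parameters.

```python
# Photoreceptor-style tone mapping: L^n / (L^n + sigma^n), with sigma tied to
# the scene's geometric-mean luminance (sketch only).
import numpy as np

def photoreceptor_tonemap(luminance, n=0.73):
    L = np.maximum(np.asarray(luminance, dtype=float), 1e-9)
    sigma = np.exp(np.mean(np.log(L)))      # global adaptation level
    return L**n / (L**n + sigma**n)         # mapped to (0, 1)
```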

  15. Low-cost conversion of the Polaroid MD-4 land camera to a digital gel documentation system.

    PubMed

    Porch, Timothy G; Erpelding, John E

    2006-04-30

    A simple, inexpensive design is presented for the rapid conversion of the popular MD-4 Polaroid land camera to a high quality digital gel documentation system. Images of ethidium bromide stained DNA gels captured using the digital system were compared to images captured on Polaroid instant film. Resolution and sensitivity were enhanced using the digital system. In addition to the low cost and superior image quality of the digital system, there is also the added convenience of real-time image viewing through the swivel LCD of the digital camera, wide flexibility of gel sizes, accurate automatic focusing, variable image resolution, and consistent ease of use and quality. Images can be directly imported to a computer by using the USB port on the digital camera, further enhancing the potential of the digital system for documentation, analysis, and archiving. The system is appropriate for use as a start-up gel documentation system and for routine gel analysis.

  16. Development of an MRI-compatible digital SiPM detector stack for simultaneous PET/MRI

    PubMed Central

    Düppenbecker, Peter M; Weissler, Bjoern; Gebhardt, Pierre; Schug, David; Wehner, Jakob; Marsden, Paul K; Schulz, Volkmar

    2016-01-01

    Advances in solid-state photon detectors paved the way to combine positron emission tomography (PET) and magnetic resonance imaging (MRI) into highly integrated, truly simultaneous, hybrid imaging systems. Based on the most recent digital SiPM technology, we developed an MRI-compatible PET detector stack, intended as a building block for next generation simultaneous PET/MRI systems. Our detector stack comprises an array of 8 × 8 digital SiPM channels with 4 mm pitch using Philips Digital Photon Counting DPC 3200-22 devices, an FPGA for data acquisition, a supply voltage control system and a cooling infrastructure. This is the first detector design that allows the operation of digital SiPMs simultaneously inside an MRI system. We tested and optimized the MRI-compatibility of our detector stack on a laboratory test bench as well as in combination with a Philips Achieva 3 T MRI system. Our design clearly reduces distortions of the static magnetic field compared to a conventional design. The MRI static magnetic field causes weak and directional drift effects on voltage regulators, but has no direct impact on detector performance. MRI gradient switching initially degraded energy and timing resolution. Both distortions could be ascribed to voltage variations induced on the bias and the FPGA core voltage supply respectively. Based on these findings, we improved our detector design and our final design shows virtually no energy or timing degradations, even during heavy and continuous MRI gradient switching. In particular, we found no evidence that the performance of the DPC 3200-22 digital SiPM itself is degraded by the MRI system. PMID:28458919

  17. Programmable Remapper with Single Flow Architecture

    NASA Technical Reports Server (NTRS)

    Fisher, Timothy E. (Inventor)

    1993-01-01

    An apparatus for image processing comprising a camera for receiving an original visual image and transforming the original visual image into an analog image, a first converter for transforming the analog image of the camera to a digital image, a processor having a single flow architecture for receiving the digital image and producing, with a single algorithm, an output image, a second converter for transforming the digital image of the processor to an analog image, and a viewer for receiving the analog image, transforming the analog image into a transformed visual image for observing the transformations applied to the original visual image. The processor comprises one or more subprocessors for the parallel reception of a digital image for producing an output matrix of the transformed visual image. More particularly, the processor comprises a plurality of subprocessors for receiving in parallel and transforming the digital image for producing a matrix of the transformed visual image, and an output interface means for receiving the respective portions of the transformed visual image from the respective subprocessor for producing an output matrix of the transformed visual image.

  18. Digital Imaging and the Cognitive Revolution: A Media Challenge.

    ERIC Educational Resources Information Center

    Sartorius, Ute

    This paper discusses the role of digital technology within the cognitive revolution of the perception of images. It analyzes the traditional values placed on images as a source of cognition. These values are discussed in terms of the ethical and social issues raised by the use of digital image manipulation in so far as the digital era is falsely…

  19. Development and Characterization of Embedded Sensory Particles Using Multi-Scale 3D Digital Image Correlation

    NASA Technical Reports Server (NTRS)

    Cornell, Stephen R.; Leser, William P.; Hochhalter, Jacob D.; Newman, John A.; Hartl, Darren J.

    2014-01-01

    A method for detecting fatigue cracks has been explored at NASA Langley Research Center. Microscopic NiTi shape memory alloy (sensory) particles were embedded in a 7050 aluminum alloy matrix to detect the presence of fatigue cracks. Cracks exhibit an elevated stress field near their tip inducing a martensitic phase transformation in nearby sensory particles. Detectable levels of acoustic energy are emitted upon particle phase transformation such that the existence and location of fatigue cracks can be detected. To test this concept, a fatigue crack was grown in a mode-I single-edge notch fatigue crack growth specimen containing sensory particles. As the crack approached the sensory particles, measurements of particle strain, matrix-particle debonding, and phase transformation behavior of the sensory particles were performed. Full-field deformation measurements were performed using a novel multi-scale optical 3D digital image correlation (DIC) system. This information will be used in a finite element-based study to determine optimal sensory material behavior and density.

  20. Integration of Point Clouds from Terrestrial Laser Scanning and Image-Based Matching for Generating High-Resolution Orthoimages

    NASA Astrophysics Data System (ADS)

    Salach, A.; Markiewicza, J. S.; Zawieska, D.

    2016-06-01

    An orthoimage is one of the basic photogrammetric products used for architectural documentation of historical objects; recently, it has become a standard in such work. Considering the increasing popularity of photogrammetric techniques applied in the cultural heritage domain, this research examines the two most popular measuring technologies: terrestrial laser scanning and automatic processing of digital photographs. The basic objective of the work presented in this paper was to optimize the quality of the generated high-resolution orthoimages by integrating data acquired by a Z+F 5006 terrestrial laser scanner and a Canon EOS 5D Mark II digital camera. The subject was one of the walls of the "Blue Chamber" of the Museum of King Jan III's Palace at Wilanów (Warsaw, Poland). The high-resolution images resulting from integration of the point clouds acquired by the different methods were analysed in detail with respect to geometric and radiometric correctness.

  1. Effective 3-D surface modeling for geographic information systems

    NASA Astrophysics Data System (ADS)

    Yüksek, K.; Alparslan, M.; Mendi, E.

    2013-11-01

    In this work, we propose a dynamic, flexible and interactive urban digital terrain platform (DTP) with spatial data and query processing capabilities of Geographic Information Systems (GIS), multimedia database functionality and graphical modeling infrastructure. A new data element, called Geo-Node, which stores image, spatial data and 3-D CAD objects is developed using an efficient data structure. The system effectively handles data transfer of Geo-Nodes between main memory and secondary storage with an optimized Directional Replacement Policy (DRP) based buffer management scheme. Polyhedron structures are used in Digital Surface Modeling (DSM) and smoothing process is performed by interpolation. The experimental results show that our framework achieves high performance and works effectively with urban scenes independent from the amount of spatial data and image size. The proposed platform may contribute to the development of various applications such as Web GIS systems based on 3-D graphics standards (e.g. X3-D and VRML) and services which integrate multi-dimensional spatial information and satellite/aerial imagery.
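    A hedged sketch of what a Geo-Node record bundling raster, vector and 3-D CAD content might look like; the field names and types are assumptions, and the DRP buffer management scheme is not reproduced here.

```python
# Hypothetical Geo-Node record (sketch only, not the authors' data structure).
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class GeoNode:
    node_id: int
    bbox: tuple                  # (xmin, ymin, xmax, ymax) in map coordinates
    image: Any = None            # raster tile, e.g. a NumPy array
    spatial_data: List[Any] = field(default_factory=list)   # vector features
    cad_objects: List[Any] = field(default_factory=list)    # 3-D CAD references
```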

  2. Focusing light through biological tissue and tissue-mimicking phantoms up to 9.6 cm in thickness with digital optical phase conjugation

    NASA Astrophysics Data System (ADS)

    Shen, Yuecheng; Liu, Yan; Ma, Cheng; Wang, Lihong V.

    2016-08-01

    Optical phase conjugation (OPC)-based wavefront shaping techniques focus light through or within scattering media, which is critically important for deep-tissue optical imaging, manipulation, and therapy. However, to date, the sample thickness in OPC experiments has been limited to only a few millimeters. Here, by using a laser with a long coherence length and an optimized digital OPC system that can safely deliver more light power, we focused 532-nm light through tissue-mimicking phantoms up to 9.6 cm thick, as well as through ex vivo chicken breast tissue up to 2.5 cm thick. Our results demonstrate that OPC can be achieved even when photons have experienced on average 1000 scattering events. The demonstrated penetration of nearly 10 cm (˜100 transport mean free paths) has never been achieved before by any optical focusing technique, and it shows the promise of OPC for deep-tissue noninvasive optical imaging, manipulation, and therapy.

  3. Effective 3-D surface modeling for geographic information systems

    NASA Astrophysics Data System (ADS)

    Yüksek, K.; Alparslan, M.; Mendi, E.

    2016-01-01

    In this work, we propose a dynamic, flexible and interactive urban digital terrain platform with spatial data and query processing capabilities of geographic information systems, multimedia database functionality and graphical modeling infrastructure. A new data element, called Geo-Node, which stores image, spatial data and 3-D CAD objects is developed using an efficient data structure. The system effectively handles data transfer of Geo-Nodes between main memory and secondary storage with an optimized directional replacement policy (DRP) based buffer management scheme. Polyhedron structures are used in digital surface modeling and smoothing process is performed by interpolation. The experimental results show that our framework achieves high performance and works effectively with urban scenes independent from the amount of spatial data and image size. The proposed platform may contribute to the development of various applications such as Web GIS systems based on 3-D graphics standards (e.g., X3-D and VRML) and services which integrate multi-dimensional spatial information and satellite/aerial imagery.

  4. Using computer assisted image analysis to determine the optimal Ki67 threshold for predicting outcome of invasive breast cancer

    PubMed Central

    Tay, Timothy Kwang Yong; Thike, Aye Aye; Pathmanathan, Nirmala; Jara-Lazaro, Ana Richelia; Iqbal, Jabed; Sng, Adeline Shi Hui; Ye, Heng Seow; Lim, Jeffrey Chun Tatt; Koh, Valerie Cui Yun; Tan, Jane Sie Yong; Yeong, Joe Poh Sheng; Chow, Zi Long; Li, Hui Hua; Cheng, Chee Leong; Tan, Puay Hoon

    2018-01-01

    Background Ki67 positivity in invasive breast cancers has an inverse correlation with survival outcomes and serves as an immunohistochemical surrogate for molecular subtyping of breast cancer, particularly ER positive breast cancer. The optimal threshold of Ki67 in both settings, however, remains elusive. We use computer assisted image analysis (CAIA) to determine the optimal threshold for Ki67 in predicting survival outcomes and differentiating luminal B from luminal A breast cancers. Methods Quantitative scoring of Ki67 on tissue microarray (TMA) sections of 440 invasive breast cancers was performed using Aperio ePathology ImmunoHistochemistry Nuclear Image Analysis algorithm, with TMA slides digitally scanned via Aperio ScanScope XT System. Results On multivariate analysis, tumours with Ki67 ≥14% had an increased likelihood of recurrence (HR 1.941, p=0.021) and shorter overall survival (HR 2.201, p=0.016). Similar findings were observed in the subset of 343 ER positive breast cancers (HR 2.409, p=0.012 and HR 2.787, p=0.012 respectively). The value of Ki67 associated with ER+HER2-PR<20% tumours (Luminal B subtype) was found to be <17%. Conclusion Using CAIA, we found optimal thresholds for Ki67 that predict a poorer prognosis and an association with the Luminal B subtype of breast cancer. Further investigation and validation of these thresholds are recommended. PMID:29545924

  5. Imaging hadron calorimetry for future Lepton Colliders

    NASA Astrophysics Data System (ADS)

    Repond, José

    2013-12-01

    To fully exploit the physics potential of a future Lepton Collider requires detectors with unprecedented jet energy and dijet-mass resolution. To meet these challenges, detectors optimized for the application of Particle Flow Algorithms (PFAs) are being designed and developed. The application of PFAs, in turn, requires calorimeters with very fine segmentation of the readout, so-called imaging calorimeters. This talk reviews progress in imaging hadron calorimetry as it is being developed for implementation in a detector at a future Lepton Collider. Recent results from the large prototypes built by the CALICE Collaboration, such as the Scintillator Analog Hadron Calorimeter (AHCAL) and the Digital Hadron Calorimeters (DHCAL and SDHCAL) are being presented. In addition, various R&D efforts beyond the present prototypes are being discussed.

  6. Aperture shape dependencies in extended depth of focus for imaging camera by wavefront coding

    NASA Astrophysics Data System (ADS)

    Sakita, Koichi; Ohta, Mitsuhiko; Shimano, Takeshi; Sakemoto, Akito

    2015-02-01

    Optical transfer functions (OTFs) along various directional spatial frequency axes are investigated for a cubic phase mask (CPM) with circular and square apertures. Although the OTF has no zero points, for a circular aperture it comes very close to zero at low frequencies on the diagonal axis, which degrades the restored images. The reason for the close-to-zero OTF values is also analyzed in connection with point spread function profiles using the Fourier slice theorem. To avoid this close-to-zero condition, a square aperture with the CPM is indispensable in wavefront coding (WFC). We optimized the cubic coefficient α of the CPM and the coefficients of the digital filter, and succeeded in obtaining excellent de-blurred images over a large depth of field.
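    The close-to-zero OTF behaviour described above can be examined numerically by building the cubic-phase pupil, forming the incoherent PSF, and inspecting |OTF| along the diagonal frequency axis; the aperture sampling, padding, and the value of α below are illustrative assumptions.

```python
# Numerical OTF of a cubic phase mask for square vs. circular apertures (sketch).
import numpy as np

def cpm_otf(alpha=30.0, n=256, square_aperture=True):
    x = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(x, x)
    if square_aperture:
        aperture = (np.abs(X) <= 1) & (np.abs(Y) <= 1)
    else:
        aperture = X**2 + Y**2 <= 1
    pupil = aperture * np.exp(1j * alpha * (X**3 + Y**3))        # cubic phase
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil, s=(2*n, 2*n))))**2
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf)))
    return otf / otf[n, n]                 # normalize to 1 at zero frequency

otf = cpm_otf(square_aperture=False)
diag = np.abs(np.diagonal(otf))            # |OTF| along the diagonal axis
print(diag.min())                          # close-to-zero values indicate trouble
```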

  7. Accuracy requirements of optical linear algebra processors in adaptive optics imaging systems

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1990-01-01

    A ground-based adaptive optics imaging telescope system attempts to improve image quality by detecting and correcting for atmospherically induced wavefront aberrations. The required control computations during each cycle will take a finite amount of time. Longer time delays result in larger values of residual wavefront error variance since the atmosphere continues to change during that time. Thus an optical processor may be well-suited for this task. This paper presents a study of the accuracy requirements in a general optical processor that will make it competitive with, or superior to, a conventional digital computer for the adaptive optics application. An optimization of the adaptive optics correction algorithm with respect to an optical processor's degree of accuracy is also briefly discussed.

  8. Correlation applied to the recognition of regular geometric figures

    NASA Astrophysics Data System (ADS)

    Lasso, William; Morales, Yaileth; Vega, Fabio; Díaz, Leonardo; Flórez, Daniel; Torres, Cesar

    2013-11-01

    A system capable of recognizing regular geometric figures was developed. Images are captured automatically by the software through a process that validates the presence of a figure in front of the camera lens; the digitized image is compared with a database of previously captured images, then recognized and finally identified using spoken words referring to the name of the identified figure. The contribution of the system set out here is that data acquisition is done in real time using spy smart glasses with a USB interface, offering a system that is equally effective but much more economical. This tool may be useful as a possible application through which visually impaired people can obtain information about the surrounding environment.

  9. Digital photography for the light microscope: results with a gated, video-rate CCD camera and NIH-image software.

    PubMed

    Shaw, S L; Salmon, E D; Quatrano, R S

    1995-12-01

    In this report, we describe a relatively inexpensive method for acquiring, storing and processing light microscope images that combines the advantages of video technology with the powerful medium now termed digital photography. Digital photography refers to the recording of images as digital files that are stored, manipulated and displayed using a computer. This report details the use of a gated video-rate charge-coupled device (CCD) camera and a frame grabber board for capturing 256 gray-level digital images from the light microscope. This camera gives high-resolution bright-field, phase contrast and differential interference contrast (DIC) images and also, with gated on-chip integration, can record low-light-level fluorescence images. The basic components of the digital photography system are described, and examples are presented of fluorescence and bright-field micrographs. Digital processing of images to remove noise, to enhance contrast and to prepare figures for printing is discussed.
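    Two of the processing steps mentioned above, noise reduction and contrast enhancement for 256 gray-level frames, can be sketched as frame averaging followed by a percentile-based linear stretch; the frame source and percentile choices are assumptions.

```python
# Frame averaging and linear contrast stretch for 8-bit frames (sketch only).
import numpy as np

def average_frames(frames):
    """Average a stack of video frames to suppress random noise."""
    return np.mean(np.asarray(frames, dtype=float), axis=0)

def contrast_stretch(img, lo_pct=1, hi_pct=99):
    """Linearly stretch intensities between two percentiles to 0..255."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = (img - lo) / max(hi - lo, 1e-9) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```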

  10. TDC Array Tradeoffs in Current and Upcoming Digital SiPM Detectors for Time-of-Flight PET

    NASA Astrophysics Data System (ADS)

    Tétrault, Marc-André; Therrien, Audrey Corbeil; Lemaire, William; Fontaine, Réjean; Pratte, Jean-François

    2017-03-01

    Radiation detection used in positron emission tomography (PET) exploits the timing information to remove background noise and refine position measurement through time-of-flight information. Fine time resolution on the order of 10 ps full-width at half-maximum (FWHM) would not only improve contrast in the image, but would also enable direct image reconstruction without iterative or back-projected algorithms. Currently, PET experimental setups based on silicon photomultipliers (SiPMs) reach 73 ps FWHM, where the scintillation process plays the larger role in spreading the timing resolution. This will change with the optimization of faster light emission mechanisms (prompt photons), where readout optoelectronics will once more have a noticeable contribution to the timing resolution limit. In addition to reducing electronic jitter as much as possible, other aspects of the design space must also be explored, especially for digital SiPMs. Unlike traditional SiPMs, digital SiPMs can integrate circuits like time-to-digital converters (TDCs) directly with individual or groups of light sensing cells. Designers should consider the number of TDCs to integrate, the area they occupy, their power consumption, their resolution, and the impact of signal processing algorithms, and find a compromise with the figure of merit and the coincidence timing resolution (CTR). This paper presents a parametric simulation flow for digital SiPM microsystems that evaluates CTR based on these aspects and on the best linear unbiased estimator (BLUE) in order to guide their design for present and future PET systems. For a small 1.1 × 1.1 × 3.0 mm³ LYSO crystal, the simulations indicate that for a low jitter digital SiPM microsystem with 18.2% photon detection efficiency, fewer than four timestamps with any multi-TDC configuration scheme nearly obtain the optimal CTR with BLUE (just below 100 ps FWHM), but with a limited 5% improvement over only using the first observed photon. On the other hand, if a similar crystal but with 2.5% prompt photon fraction is considered, BLUE provides an improvement between 80% and 200% (depending on electronic jitter) over using only the first observed photon. In this case, a few tens of timestamps are required, yielding very different design guidelines than for standard LYSO scintillators.
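    The BLUE combination of the first few photon timestamps referred to above has the closed form t̂ = (1ᵀC⁻¹t)/(1ᵀC⁻¹1), where C is the covariance matrix of the ordered timestamps; the sketch below assumes C is supplied (in practice it is measured or simulated for the detector).

```python
# Best linear unbiased estimator for combining ordered photon timestamps.
import numpy as np

def blue_timestamp(timestamps, cov):
    """BLUE estimate of the event time given timestamps and their covariance."""
    t = np.asarray(timestamps, dtype=float)
    ones = np.ones_like(t)
    cinv_t = np.linalg.solve(cov, t)
    cinv_1 = np.linalg.solve(cov, ones)
    return float(ones @ cinv_t) / float(ones @ cinv_1)
```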

  11. A stereo remote sensing feature selection method based on artificial bee colony algorithm

    NASA Astrophysics Data System (ADS)

    Yan, Yiming; Liu, Pigang; Zhang, Ye; Su, Nan; Tian, Shu; Gao, Fengjiao; Shen, Yi

    2014-05-01

    To improve the efficiency of stereo information for remote sensing classification, this paper proposes a stereo remote sensing feature selection method based on the artificial bee colony algorithm. Remote sensing stereo information can be described by a digital surface model (DSM) and an optical image, which contain information on the three-dimensional structure and the optical characteristics, respectively. Firstly, the three-dimensional structure characteristic can be analyzed with 3D Zernike descriptors (3DZD). However, different 3DZD parameters describe the three-dimensional structure at different levels of complexity, and they need to be optimally selected for the various objects on the ground. Secondly, the features representing the optical characteristics also need to be optimized. If not properly handled, a stereo feature vector composed of 3DZD and image features contains a large amount of redundant information; this redundancy may not improve the classification accuracy and can even have adverse effects. To reduce information redundancy while maintaining or improving classification accuracy, an optimization framework for this stereo feature selection problem is created, and the artificial bee colony algorithm is introduced to solve the optimization problem. Experimental results show that the proposed method can effectively improve both the computational efficiency and the classification accuracy.
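    A much-simplified, hedged sketch of a bee-colony-style feature selection loop in the spirit described above: binary food sources are feature subsets, fitness comes from a user-supplied classifier score in (0, 1], and neighbours are produced by flipping one bit. All constants and the evaluate() callback are assumptions, and the employed/onlooker phases are collapsed for brevity.

```python
# Simplified binary artificial-bee-colony feature selection (sketch only).
import numpy as np

rng = np.random.default_rng(0)

def abc_select(n_features, evaluate, n_bees=20, n_iters=50, limit=10):
    """evaluate(subset) must return a positive score, e.g. cross-validated
    accuracy, and should handle the empty subset gracefully."""
    foods = rng.integers(0, 2, size=(n_bees, n_features))        # feature subsets
    fitness = np.array([evaluate(f) for f in foods], dtype=float)
    trials = np.zeros(n_bees, dtype=int)
    for _ in range(n_iters):
        probs = (fitness + 1e-12) / (fitness + 1e-12).sum()
        # employed + onlooker phases collapsed: probe one-bit neighbours,
        # biased toward currently good food sources
        for i in rng.choice(n_bees, size=2 * n_bees, p=probs):
            cand = foods[i].copy()
            cand[rng.integers(n_features)] ^= 1                  # flip one bit
            f = evaluate(cand)
            if f > fitness[i]:
                foods[i], fitness[i], trials[i] = cand, f, 0
            else:
                trials[i] += 1
        # scout phase: abandon stagnant food sources
        for i in np.where(trials > limit)[0]:
            foods[i] = rng.integers(0, 2, size=n_features)
            fitness[i], trials[i] = evaluate(foods[i]), 0
    best = int(np.argmax(fitness))
    return foods[best], float(fitness[best])
```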

  12. Digital management and regulatory submission of medical images from clinical trials: role and benefits of the core laboratory

    NASA Astrophysics Data System (ADS)

    Robbins, William L.; Conklin, James J.

    1995-10-01

    Medical images (angiography, CT, MRI, nuclear medicine, ultrasound, x ray) play an increasingly important role in the clinical development and regulatory review process for pharmaceuticals and medical devices. Since medical images are increasingly acquired and archived digitally, or are readily digitized from film, they can be visualized, processed and analyzed in a variety of ways using digital image processing and display technology. Moreover, with image-based data management and data visualization tools, medical images can be electronically organized and submitted to the U.S. Food and Drug Administration (FDA) for review. The collection, processing, analysis, archival, and submission of medical images in a digital format versus an analog (film-based) format presents both challenges and opportunities for the clinical and regulatory information management specialist. The medical imaging 'core laboratory' is an important resource for clinical trials and regulatory submissions involving medical imaging data. Use of digital imaging technology within a core laboratory can increase efficiency and decrease overall costs in the image data management and regulatory review process.

  13. Magnetic resonance imaging of the normal bovine digit.

    PubMed

    Raji, A R; Sardari, K; Mirmahmoob, P

    2009-08-01

    The purpose of this study was to define the normal structures of the digits and hoof in Holstein dairy cattle using magnetic resonance imaging (MRI). Transverse, sagittal and dorsoplantar MRI images of three isolated cattle cadaver digits were obtained using a Gyroscan T5-NT 0.5 Tesla magnet and a T1-weighted sequence. The MRI images were compared to corresponding frozen cross-sections and dissected specimens of the cadaver digits. Relevant anatomical structures were identified and labeled at each level. The MRI images provided anatomical detail of the digits and hoof in Holstein dairy cattle. Transverse images provided an excellent depiction of anatomical structures when compared to the corresponding frozen cross-sections. The information presented in this paper serves as an initial reference for the evaluation of MRI images of the digits and hoof in Holstein dairy cattle, and can be used by radiologists, clinicians, and surgeons, or for research purposes in bovine lameness.

  14. Digital Forensics Using Local Signal Statistics

    ERIC Educational Resources Information Center

    Pan, Xunyu

    2011-01-01

    With the rapid growth of the Internet and the popularity of digital imaging devices, digital imagery has become our major information source. Meanwhile, the development of digital manipulation techniques employed by most image editing software brings new challenges to the credibility of photographic images as the definite records of events. We…

  15. Optimizing the acquisition geometry for digital breast tomosynthesis using the Defrise phantom

    NASA Astrophysics Data System (ADS)

    Acciavatti, Raymond J.; Chang, Alice; Woodbridge, Laura; Maidment, Andrew D. A.

    2014-03-01

    In cone beam computed tomography (CT), it is common practice to use the Defrise phantom for image quality assessment. The phantom consists of a stack of plastic plates with low frequency spacing. Because the x-ray beam may traverse multiple plates, the spacing between plates can appear blurry in the reconstruction, and hence modulation provides a measure of image quality. This study considers the potential merit of using the Defrise phantom in digital breast tomosynthesis (DBT), a modality with a smaller projection range than CT. To this end, a Defrise phantom was constructed and subsequently imaged with a commercial DBT system. It was demonstrated that modulation is dependent on position and orientation in the reconstruction. Modulation is preserved over a broad range of positions along the chest wall if the input frequency is oriented in the tube travel direction. By contrast, modulation is degraded with increasing distance from the chest wall if the input frequency is oriented in the posteroanterior (PA) direction. A theoretical framework was then developed to model these results. Reconstructions were calculated in an acquisition geometry designed to improve modulation. Unlike current geometries in which the x-ray tube motion is restricted to the plane of the chest wall, we consider a geometry with an additional component of tube motion along the PA direction. In simulations, it is shown that the newly proposed geometry improves modulation at positions distal to the chest wall. In conclusion, this study demonstrates that the Defrise phantom is a tool for optimizing DBT systems.
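    The modulation measure referred to above can be computed from a reconstructed intensity profile drawn across the plate stack; the sketch below uses the simple global peak-to-trough definition, which is an illustrative simplification.

```python
# Modulation of a line profile: (Imax - Imin) / (Imax + Imin) (sketch only).
import numpy as np

def modulation(profile):
    p = np.asarray(profile, dtype=float)
    i_max, i_min = p.max(), p.min()
    return float((i_max - i_min) / (i_max + i_min))
```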

  16. Comparison of digital intraoral scanners by single-image capture system and full-color movie system.

    PubMed

    Yamamoto, Meguru; Kataoka, Yu; Manabe, Atsufumi

    2017-01-01

    The use of dental computer-aided design/computer-aided manufacturing (CAD/CAM) restoration is rapidly increasing. This study was performed to evaluate the marginal and internal cement thickness and the adhesive gap of internal cavities restored with CAD/CAM materials using two digital impression acquisition methods and micro-computed tomography. Images obtained by a single-image acquisition system (Bluecam Ver. 4.0) and a full-color video acquisition system (Omnicam Ver. 4.2) were divided into the BL and OM groups, respectively. Silicone impressions were prepared from an ISO-standard metal mold, and CEREC Stone BC and New Fuji Rock IMP were used to create working models (n=20) in the BL and OM groups (n=10 per group), respectively. Individual inlays were designed in a conventional manner using designated software, and all restorations were prepared using CEREC inLab MC XL. These were assembled with the corresponding working models used for measurement, and the level of fit was examined by three-dimensional analysis based on micro-computed tomography. Significant differences in the marginal and internal cement thickness and adhesive gap spacing were found between the OM and BL groups. The full-color movie capture system appears to be a better option for restorations than the single-image capture system.

  17. Ultra-realistic imaging: a new beginning for display holography

    NASA Astrophysics Data System (ADS)

    Bjelkhagen, Hans I.; Brotherton-Ratcliffe, David

    2014-02-01

    Recent improvements in key foundation technologies are set to potentially transform the field of Display Holography. In particular, new recording systems, based on recent DPSS and semiconductor lasers combined with novel recording materials and processing, have now demonstrated full-color analogue holograms of both lower noise and higher spectral accuracy. Progress in illumination technology is leading to a further major reduction in display noise and to a significant increase in the clear image depth and brightness of such holograms. So too, recent progress in 1-step Direct-Write Digital Holography (DWDH) now opens the way to the creation of High Virtual Volume Displays (HVV) - large format full-parallax DWDH reflection holograms having fundamentally larger clear image depths. In a certain fashion HVV displays can be thought of as providing a high quality full-color digital equivalent to the large-format laser-illuminated transmission holograms of the sixties and seventies. Back then, the advent of such holograms led to much optimism for display holography in the market. However, problems with laser illumination, their monochromatic analogue nature and image noise are widely cited as being responsible for their failure in practice. Is there reason for believing that the latest technology improvements will make their mark this time around? This paper argues that indeed there is.

  18. Registration of ophthalmic images using control points

    NASA Astrophysics Data System (ADS)

    Heneghan, Conor; Maguire, Paul

    2003-03-01

    A method for registering pairs of digital ophthalmic images of the retina is presented, using anatomical features present in both images as control points. The anatomical features chosen are blood vessel crossings and bifurcations. These control points are identified by a combination of local contrast enhancement and morphological processing. In general, however, the matching between control points is unknown, so an automated algorithm is used to determine the matching pairs of control points in the two images as follows. Using two control points from each image, rigid global transform (RGT) coefficients are calculated for all possible combinations of control point pairs, and the set of RGT coefficients most consistent across combinations is identified. Once control point pairs are established, registration of the two images can be achieved by using linear regression to optimize an RGT, bilinear or second-order polynomial global transform. An example of cross-modal image registration using an optical image and a fluorescein angiogram of an eye is presented to illustrate the technique.
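
    Once matching control-point pairs are established, the rigid global transform can be estimated by linear least squares, as the abstract describes. The following sketch assumes a similarity-type RGT (rotation, uniform scale and translation) and hypothetical vessel-bifurcation coordinates; it is not the authors' implementation.

        import numpy as np

        def fit_rigid_global_transform(src, dst):
            """Least-squares fit of x' = a*x - b*y + tx, y' = b*x + a*y + ty
            (rotation + uniform scale + translation) from matched control points."""
            src = np.asarray(src, float)
            dst = np.asarray(dst, float)
            n = len(src)
            A = np.zeros((2 * n, 4))
            A[0::2, 0] = src[:, 0]; A[0::2, 1] = -src[:, 1]; A[0::2, 2] = 1.0
            A[1::2, 0] = src[:, 1]; A[1::2, 1] =  src[:, 0]; A[1::2, 3] = 1.0
            rhs = dst.reshape(-1)
            params, *_ = np.linalg.lstsq(A, rhs, rcond=None)
            return params  # a, b, tx, ty

        # Hypothetical matched vessel-bifurcation coordinates in the two retinal images.
        pts_a = np.array([[120.0, 88.0], [200.0, 150.0], [60.0, 210.0], [170.0, 40.0]])
        pts_b = pts_a @ np.array([[0.99, 0.05], [-0.05, 0.99]]) + np.array([3.0, -2.0])
        print(fit_rigid_global_transform(pts_a, pts_b))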

  19. Training system for digital mammographic diagnoses of breast cancer

    NASA Astrophysics Data System (ADS)

    Thomaz, R. L.; Nirschl Crozara, M. G.; Patrocinio, A. C.

    2013-03-01

    As the technology evolves, analog mammography systems are being replaced by digital systems. Digital systems use video monitors to display mammographic images instead of the screen-film and negatoscope previously used for analog images. The change in the way of visualizing mammographic images may require a different approach for training health care professionals in diagnosing breast cancer with digital mammography. Thus, this paper presents a computational approach to train health care professionals, providing a smooth transition between analog and digital technology and also training them to use the advantages of digital image processing tools to diagnose breast cancer. This computational approach consists of software in which it is possible to open, process and diagnose a full mammogram case from a database containing the digital images of each of the mammographic views. The software communicates with a gold standard digital mammogram cases database. This database contains the digital images in Tagged Image File Format (TIFF) and the respective diagnoses according to BI-RADS™; these files are read by the software and shown to the user as needed. There are also some digital image processing tools that can be used to provide better visualization of each single image. The software was built on a minimalist, user-friendly interface concept that might help in the smooth transition. It also has an interface for inputting diagnoses from the professional being trained, providing feedback on the results. The system has already been completed but has not yet been applied to professional training.

  20. Image Acquisition and Quality in Digital Radiography.

    PubMed

    Alexander, Shannon

    2016-09-01

    Medical imaging has undergone dramatic changes and technological breakthroughs since the introduction of digital radiography. This article presents information on the development of digital radiography and types of digital radiography systems. Aspects of image quality and radiation exposure control are highlighted as well. In addition, the article includes related workplace changes and medicolegal considerations in the digital radiography environment. ©2016 American Society of Radiologic Technologists.

  1. Securing Digital Images Integrity using Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Hajji, Tarik; Itahriouan, Zakaria; Ouazzani Jamil, Mohammed

    2018-05-01

    Digital image signature is a technique used to protect the image integrity. The application of this technique can serve several areas of imaging applied to smart cities. The objective of this work is to propose two methods to protect digital image integrity. We present a description of two approaches using artificial neural networks (ANN) to digitally sign an image. The first one is “Direct Signature without learning” and the second is “Direct Signature with learning”. This paper presents the theory of proposed approaches and an experimental study to test their effectiveness.

  2. Digital image processing: a primer for JVIR authors and readers: Part 3: Digital image editing.

    PubMed

    LaBerge, Jeanne M; Andriole, Katherine P

    2003-12-01

    This is the final installment of a three-part series on digital image processing intended to prepare authors for online submission of manuscripts. In the first two articles of the series, the fundamentals of digital image architecture were reviewed and methods of importing images to the computer desktop were described. In this article, techniques are presented for editing images in preparation for online submission. A step-by-step guide to basic editing with use of Adobe Photoshop is provided and the ethical implications of this activity are explored.

  3. Losing images in digital radiology: more than you think.

    PubMed

    Oglevee, Catherine; Pianykh, Oleg

    2015-06-01

    It is a common belief that the shift to digital imaging some 20 years ago helped medical image exchange and got rid of any potential image loss that was happening with printed image films. Unfortunately, this is not the case: despite the most recent advances in digital imaging, most hospitals still keep losing their imaging data, with these losses going completely unnoticed. As a result, not only does image loss affect the faith in digital imaging but it also affects patient diagnosis and daily quality of clinical work. This paper identifies the origins of invisible image losses, provides methods and procedures to detect image loss, and demonstrates modes of action that can be taken to stop the problem from happening.

  4. Digital processing of radiographic images for print publication.

    PubMed

    Cockerill, James W

    2002-01-01

    Digital imaging of X-rays yields high quality, evenly exposed negatives and prints. This article outlines the materials and methods used in this technique and discusses the advantages of digital radiographic images.

  5. An Efficient Implementation of Deep Convolutional Neural Networks for MRI Segmentation.

    PubMed

    Hoseini, Farnaz; Shahbahrami, Asadollah; Bayat, Peyman

    2018-02-27

    Image segmentation is one of the most common steps in digital image processing, classifying a digital image into different segments. The main goal of this paper is to segment brain tumors in magnetic resonance images (MRI) using deep learning. Tumors having different shapes, sizes, brightness and textures can appear anywhere in the brain. These complexities motivate the choice of a high-capacity Deep Convolutional Neural Network (DCNN) containing more than one layer. The proposed DCNN contains two parts: architecture and learning algorithms. The architecture and the learning algorithms are used to design a network model and to optimize parameters for the network training phase, respectively. The architecture contains five convolutional layers, all using 3 × 3 kernels, and one fully connected layer. Stacking several small kernels reproduces the effect of a larger kernel with a smaller number of parameters and fewer computations. Using the Dice Similarity Coefficient metric, we report accuracy results on the BRATS 2016 brain tumor segmentation challenge dataset for the complete, core, and enhancing regions as 0.90, 0.85, and 0.84, respectively. The learning algorithm includes task-level parallelism. All the pixels of an MR image are classified using a patch-based approach for segmentation. We attain good performance, and the experimental results show that the proposed DCNN increases the segmentation accuracy compared to previous techniques.
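
    The architecture summarized above (five 3 × 3 convolutional layers plus one fully connected layer, applied patch-wise) can be sketched as follows. This is a minimal illustration assuming PyTorch; the channel widths, patch size, number of input modalities and class count are placeholders rather than the paper's exact configuration.

        import torch
        import torch.nn as nn

        class PatchDCNN(nn.Module):
            """Five 3x3 convolutional layers followed by one fully connected layer,
            classifying a small MRI patch by the label of its centre pixel."""
            def __init__(self, in_channels=4, n_classes=4, patch=33):
                super().__init__()
                widths = [in_channels, 32, 32, 64, 64, 128]   # illustrative channel widths
                layers = []
                for c_in, c_out in zip(widths[:-1], widths[1:]):
                    layers += [nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
                               nn.ReLU(inplace=True)]
                self.features = nn.Sequential(*layers)
                self.classifier = nn.Linear(widths[-1] * patch * patch, n_classes)

            def forward(self, x):
                x = self.features(x)
                return self.classifier(x.flatten(1))

        # One batch of hypothetical 33x33 multi-modal MRI patches.
        model = PatchDCNN()
        logits = model(torch.randn(8, 4, 33, 33))
        print(logits.shape)  # torch.Size([8, 4])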

  6. The wavelet/scalar quantization compression standard for digital fingerprint images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J.N.; Brislawn, C.M.

    1994-04-01

    A new digital image compression standard has been adopted by the US Federal Bureau of Investigation for use on digitized gray-scale fingerprint images. The algorithm is based on adaptive uniform scalar quantization of a discrete wavelet transform image decomposition and is referred to as the wavelet/scalar quantization standard. The standard produces archival quality images at compression ratios of around 20:1 and will allow the FBI to replace their current database of paper fingerprint cards with digital imagery.
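
    The core of the standard, uniform scalar quantization of a discrete wavelet decomposition, can be illustrated with a short sketch. It assumes PyWavelets and uses an arbitrary biorthogonal wavelet and a single bin width for all subbands, whereas the actual WSQ specification defines its own filters and per-subband quantization; the sketch is not an implementation of the standard.

        import numpy as np
        import pywt

        def quantize_subbands(image, wavelet="bior4.4", levels=4, bin_width=8.0):
            """Uniform scalar quantization of a discrete wavelet decomposition."""
            coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
            q = [np.round(coeffs[0] / bin_width)]
            for detail in coeffs[1:]:
                q.append(tuple(np.round(band / bin_width) for band in detail))
            return q

        def dequantize_subbands(qcoeffs, wavelet="bior4.4", bin_width=8.0):
            """Rescale the quantized subbands and invert the wavelet transform."""
            coeffs = [qcoeffs[0] * bin_width]
            for detail in qcoeffs[1:]:
                coeffs.append(tuple(band * bin_width for band in detail))
            return pywt.waverec2(coeffs, wavelet)

        img = np.random.rand(512, 512) * 255.0          # stand-in for a fingerprint scan
        restored = dequantize_subbands(quantize_subbands(img))
        print(np.abs(restored[:512, :512] - img).mean())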

  7. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

    A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of the query image. More particularly, the retrieving step can include two data reductions, the first performed based upon a query vector extracted from a query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.

  8. The application of digital image plane holography technology to identify Chinese herbal medicine

    NASA Astrophysics Data System (ADS)

    Wang, Huaying; Guo, Zhongjia; Liao, Wei; Zhang, Zhihui

    2012-03-01

    In this paper, the imaging technology of digital image plane holography to identify Chinese herbal medicine is studied. An optical experimental system for digital image plane holography, which is a special case of pre-magnification digital holography, was built. In the recording system, a plane wave is used as the object beam to illuminate the object, and a spherical wave is used as the reference beam to record the hologram. A microscope objective lens is placed behind the object. The second phase factor, caused by the microscope objective, can be eliminated by choosing the proper position of the reference point source when the digital image plane hologram is recorded with spherical light. In this experiment, Lygodium cells and onion cells are used as the objects. The experimental results show that digital image plane holography avoids the process of finding the recording distance with an auto-focusing approach, and that the phase information of the object can be reconstructed more accurately. Digital image plane holography is thus effective for the microscopic imaging of cells, is well suited to the identification of Chinese herbal medicine, and promotes the practical application of digital holography.

  9. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... effective management, safety, and proper performance of chest image acquisition, digitization, processing... digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object (e.g... radiographic image files from six or more sample chest radiographs that are of acceptable quality to one or...

  10. Hybrid digital-analog coding with bandwidth expansion for correlated Gaussian sources under Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2017-12-01

    Consider communicating a correlated Gaussian source over a Rayleigh fading channel with no knowledge of the channel signal-to-noise ratio (CSNR) at the transmitter. In this case, a digital system cannot be optimal for a range of CSNRs. Analog transmission however is optimal at all CSNRs, if the source and channel are memoryless and bandwidth matched. This paper presents new hybrid digital-analog (HDA) systems for sources with memory and channels with bandwidth expansion, which outperform both digital-only and analog-only systems over a wide range of CSNRs. The digital part is either a predictive quantizer or a transform code, used to achieve a coding gain. Analog part uses linear encoding to transmit the quantization error which improves the performance under CSNR variations. The hybrid encoder is optimized to achieve the minimum AMMSE (average minimum mean square error) over the CSNR distribution. To this end, analytical expressions are derived for the AMMSE of asymptotically optimal systems. It is shown that the outage CSNR of the channel code and the analog-digital power allocation must be jointly optimized to achieve the minimum AMMSE. In the case of HDA predictive quantization, a simple algorithm is presented to solve the optimization problem. Experimental results are presented for both Gauss-Markov sources and speech signals.

  11. Confocal mosaicing microscopy of human skin ex vivo: spectral analysis for digital staining to simulate histology-like appearance

    PubMed Central

    Bini, Jason; Spain, James; Nehal, Kishwer; Hazelwood, Vikki; DiMarzio, Charles; Rajadhyaksha, Milind

    2011-01-01

    Confocal mosaicing microscopy enables rapid imaging of large areas of fresh tissue, without the processing that is necessary for conventional histology. Mosaicing may offer a means to perform rapid histology at the bedside. A possible barrier toward clinical acceptance is that the mosaics are based on a single mode of grayscale contrast and appear black and white, whereas histology is based on two stains (hematoxylin for nuclei, eosin for cellular cytoplasm and dermis) and appears purple and pink. Toward addressing this barrier, we report advances in digital staining: fluorescence mosaics that show only nuclei, are digitally stained purple and overlaid on reflectance mosaics, which show only cellular cytoplasm and dermis, and are digitally stained pink. With digital staining, the appearance of confocal mosaics mimics the appearance of histology. Using multispectral analysis and color matching functions, red, green, and blue (RGB) components of hematoxylin and eosin stains in tissue were determined. The resulting RGB components were then applied in a linear algorithm to transform fluorescence and reflectance contrast in confocal mosaics to the absorbance contrast seen in pathology. Optimization of staining with acridine orange showed improved quality of digitally stained mosaics, with good correlation to the corresponding histology. PMID:21806269
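
    The linear digital-staining step described above can be sketched as a per-pixel mapping from the two confocal contrasts to an H&E-like colour. The RGB colour vectors below are illustrative placeholders; the study derives its components from multispectral analysis and colour matching functions.

        import numpy as np

        # Illustrative H&E colour vectors (RGB in [0, 1]); the paper derives its own
        # components from multispectral analysis and colour matching functions.
        HEMATOXYLIN = np.array([0.65, 0.40, 0.75])   # purple for nuclei
        EOSIN       = np.array([0.95, 0.55, 0.65])   # pink for cytoplasm / dermis

        def digital_stain(fluorescence, reflectance):
            """Linearly map normalised fluorescence and reflectance mosaics to an
            H&E-like RGB image: white background minus stain contributions."""
            f = fluorescence[..., None]   # nuclei channel
            r = reflectance[..., None]    # cytoplasm / dermis channel
            rgb = 1.0 - f * (1.0 - HEMATOXYLIN) - r * (1.0 - EOSIN)
            return np.clip(rgb, 0.0, 1.0)

        # Hypothetical co-registered confocal mosaics scaled to [0, 1].
        fluo = np.random.rand(256, 256)
        refl = np.random.rand(256, 256)
        stained = digital_stain(fluo, refl)
        print(stained.shape)  # (256, 256, 3)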

  12. The Engineer Topographic Laboratories /ETL/ hybrid optical/digital image processor

    NASA Astrophysics Data System (ADS)

    Benton, J. R.; Corbett, F.; Tuft, R.

    1980-01-01

    An optical-digital processor for generalized image enhancement and filtering is described. The optical subsystem is a two-PROM Fourier filter processor. Input imagery is isolated, scaled, and imaged onto the first PROM; this input plane acts like a liquid gate and serves as an incoherent-to-coherent converter. The image is transformed onto a second PROM which also serves as a filter medium; filters are written onto the second PROM with a laser scanner in real time. A solid state CCTV camera records the filtered image, which is then digitized and stored in a digital image processor. The operator can then manipulate the filtered image using the gray scale and color remapping capabilities of the video processor as well as the digital processing capabilities of the minicomputer.

  13. 15 CFR 786.2 - Recordkeeping.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... recognized as complete words or numbers. (iv) The system must preserve the initial image (including both... the system. (3) Requirements applicable to a system based on digital images. For systems based on the storage of digital images, the system must provide accessibility to any digital image in the system. The...

  14. 15 CFR 786.2 - Recordkeeping.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... recognized as complete words or numbers. (iv) The system must preserve the initial image (including both... the system. (3) Requirements applicable to a system based on digital images. For systems based on the storage of digital images, the system must provide accessibility to any digital image in the system. The...

  15. 15 CFR 786.2 - Recordkeeping.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... recognized as complete words or numbers. (iv) The system must preserve the initial image (including both... the system. (3) Requirements applicable to a system based on digital images. For systems based on the storage of digital images, the system must provide accessibility to any digital image in the system. The...

  16. 15 CFR 786.2 - Recordkeeping.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... recognized as complete words or numbers. (iv) The system must preserve the initial image (including both... the system. (3) Requirements applicable to a system based on digital images. For systems based on the storage of digital images, the system must provide accessibility to any digital image in the system. The...

  17. 15 CFR 786.2 - Recordkeeping.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... recognized as complete words or numbers. (iv) The system must preserve the initial image (including both... the system. (3) Requirements applicable to a system based on digital images. For systems based on the storage of digital images, the system must provide accessibility to any digital image in the system. The...

  18. 28 CFR 75.6 - Statement describing location of books and records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...- or computer-manipulated image, digital image, or picture, or other matter (including but not limited... the book, magazine, periodical, film, videotape, digitally- or computer-manipulated image, digital image, picture, or other matter to affix the statement. In this paragraph, the term “copy” includes...

  19. [Development of an ophthalmological clinical information system for inpatient eye clinics].

    PubMed

    Kortüm, K U; Müller, M; Babenko, A; Kampik, A; Kreutzer, T C

    2015-12-01

    In times of increased digitalization in healthcare, departments of ophthalmology are faced with the challenge of introducing electronic clinical health records (EHR); however, specialized software for ophthalmology is not available with most major EHR systems. The aim of this project was to create specific ophthalmological user interfaces for large inpatient eye care providers within a hospital-wide EHR. Additionally, the integration of ophthalmic imaging systems, scheduling and surgical documentation should be achieved. The existing EHR i.s.h.med (Siemens, Germany) was modified using the advanced business application programming (ABAP) language to create specific ophthalmological user interfaces that reproduce and, moreover, optimize the clinical workflow. A user interface for documentation of ambulatory patients with eight tabs was designed. From June 2013 to October 2014 a total of 61,551 patient contacts were documented. For surgical documentation a separate user interface was set up. User interfaces for digital clinical orders, for registration documentation and for the scheduling of operations were also set up. A direct integration of ophthalmic imaging modalities could be established. An ophthalmologist-orientated EHR for outpatient and surgical documentation for inpatient clinics was created and successfully implemented. By incorporating imaging procedures, the foundation for future smart/big data analyses was created.

  20. Advances in biologically inspired on/near sensor processing

    NASA Astrophysics Data System (ADS)

    McCarley, Paul L.

    1999-07-01

    As electro-optic sensors increase in size and frame rate, the data transfer and digital processing resource requirements also increase. In many missions, the spatial area of interest is but a small fraction of the available field of view. Choosing the right region of interest, however, is a challenge and still requires an enormous amount of downstream digital processing resources. In order to filter this ever-increasing amount of data, we look at how nature solves the problem. The Advanced Guidance Division of the Munitions Directorate, Air Force Research Laboratory at Eglin AFB, Florida, has been pursuing research in the area of advanced sensor and image processing concepts based on biologically inspired sensory information processing. A summary of two 'neuromorphic' processing efforts will be presented along with a seeker system concept utilizing this innovative technology. The Neuroseek program is developing a 256 × 256 two-color dual-band IRFPA coupled to an optimized silicon CMOS read-out and processing integrated circuit that provides simultaneous full-frame imaging in the MWIR/LWIR wavebands along with built-in biologically inspired sensor image processing functions. Concepts and requirements for future efforts of this kind will also be discussed.

  1. The use of immunohistochemistry for biomarker assessment--can it compete with other technologies?

    PubMed

    Dunstan, Robert W; Wharton, Keith A; Quigley, Catherine; Lowe, Amanda

    2011-10-01

    A morphology-based assay such as immunohistochemistry (IHC) should be a highly effective means to define the expression of a target molecule of interest, especially if the target is a protein. However, over the past decade, IHC as a platform for biomarkers has been challenged by more quantitative molecular assays with reference standards but that lack morphologic context. For IHC to be considered a "top-tier" biomarker assay, it must provide truly quantitative data on par with non-morphologic assays, which means it needs to be run with reference standards. However, creating such standards for IHC will require optimizing all aspects of tissue collection, fixation, section thickness, morphologic criteria for assessment, staining processes, digitization of images, and image analysis. This will also require anatomic pathology to evolve from a discipline that is descriptive to one that is quantitative. A major step in this transformation will be replacing traditional ocular microscopes with computer monitors and whole slide images, for without digitization, there can be no accurate quantitation; without quantitation, there can be no standardization; and without standardization, the value of morphology-based IHC assays will not be realized.

  2. Soft tissue strain measurement using an optical method

    NASA Astrophysics Data System (ADS)

    Toh, Siew Lok; Tay, Cho Jui; Goh, Cho Hong James

    2008-11-01

    Digital image correlation (DIC) is a non-contact optical technique that allows the full-field estimation of strains on a surface under an applied deformation. In this project, an optimized DIC technique is applied that can achieve efficiency and accuracy in the measurement of two-dimensional deformation fields in soft tissue. This technique relies on matching the random patterns recorded in images to directly obtain surface displacements and to obtain the displacement gradients from which the strain field can be determined. Digital image correlation is a well developed technique that has numerous and varied engineering applications, including applications in soft and hard tissue biomechanics. Chicken drumstick ligaments were harvested and used during the experiments. The surface of the ligament was speckled with black paint to allow correlation to be performed. Results show that the stress-strain curve exhibits a bi-linear behavior, i.e., a "toe region" and a "linear elastic region". The Young's modulus obtained for the toe region is about 92 MPa and the modulus for the linear elastic region is about 230 MPa. The results are within the values for mammalian anterior cruciate ligaments of 150-300 MPa.
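
    The subset-matching step at the heart of DIC can be illustrated with normalized cross-correlation between a speckle subset in the reference image and a search window in the deformed image. The sketch assumes scikit-image's match_template and integer-pixel displacements; practical DIC adds sub-pixel interpolation and strain computation, neither of which is shown here.

        import numpy as np
        from skimage.feature import match_template

        def track_subset(ref_img, def_img, center, half=15, search=10):
            """Estimate the in-plane displacement of one speckle subset by normalized
            cross-correlation between the reference and deformed images."""
            r, c = center
            subset = ref_img[r - half:r + half + 1, c - half:c + half + 1]
            window = def_img[r - half - search:r + half + search + 1,
                             c - half - search:c + half + search + 1]
            corr = match_template(window, subset)
            dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
            return dr - search, dc - search   # row/column displacement in pixels

        # Hypothetical speckle images: the deformed frame is the reference shifted by (2, 3).
        rng = np.random.default_rng(0)
        ref = rng.random((200, 200))
        deformed = np.roll(ref, shift=(2, 3), axis=(0, 1))
        print(track_subset(ref, deformed, center=(100, 100)))  # expect roughly (2, 3)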

  3. [Glossary of terms used by radiologists in image processing].

    PubMed

    Rolland, Y; Collorec, R; Bruno, A; Ramée, A; Morcet, N; Haigron, P

    1995-01-01

    We give the definition of 166 words used in image processing. Adaptivity, aliasing, analog-digital converter, analysis, approximation, arc, artifact, artificial intelligence, attribute, autocorrelation, bandwidth, boundary, brightness, calibration, class, classification, classify, centre, cluster, coding, color, compression, contrast, connectivity, convolution, correlation, data base, decision, decomposition, deconvolution, deduction, descriptor, detection, digitization, dilation, discontinuity, discretization, discrimination, disparity, display, distance, distortion, distribution, dynamic, edge, energy, enhancement, entropy, erosion, estimation, event, extrapolation, feature, file, filter, filter floaters, fitting, Fourier transform, frequency, fusion, fuzzy, Gaussian, gradient, graph, gray level, group, growing, histogram, Hough transform, Hounsfield, image, impulse response, inertia, intensity, interpolation, interpretation, invariance, isotropy, iterative, JPEG, knowledge base, label, laplacian, learning, least squares, likelihood, matching, Markov field, mask, matching, mathematical morphology, merge (to), MIP, median, minimization, model, moiré, moment, MPEG, neural network, neuron, node, noise, norm, normal, operator, optical system, optimization, orthogonal, parametric, pattern recognition, periodicity, photometry, pixel, polygon, polynomial, prediction, pulsation, pyramidal, quantization, raster, reconstruction, recursive, region, rendering, representation space, resolution, restoration, robustness, ROC, thinning, transform, sampling, saturation, scene analysis, segmentation, separable function, sequential, smoothing, spline, split (to), shape, threshold, tree, signal, speckle, spectrum, spline, stationarity, statistical, stochastic, structuring element, support, syntactic, synthesis, texture, truncation, variance, vision, voxel, windowing.

  4. Multimodality imaging and state-of-art GPU technology in discriminating benign from malignant breast lesions on real time decision support system

    NASA Astrophysics Data System (ADS)

    Kostopoulos, S.; Sidiropoulos, K.; Glotsos, D.; Dimitropoulos, N.; Kalatzis, I.; Asvestas, P.; Cavouras, D.

    2014-03-01

    The aim of this study was to design a pattern recognition system for assisting the diagnosis of breast lesions, using image information from Ultrasound (US) and Digital Mammography (DM) imaging modalities. State-of-the-art computer technology was employed based on commercial Graphics Processing Unit (GPU) cards and parallel programming. An experienced radiologist outlined breast lesions on both US and DM images from 59 patients employing a custom designed computer software application. Textural features were extracted from each lesion and were used to design the pattern recognition system. Several classifiers were tested for highest performance in discriminating benign from malignant lesions. Classifiers were also combined into ensemble schemes for further improvement of the system's classification accuracy. Following the pattern recognition system optimization, the final system was designed employing the Probabilistic Neural Network classifier (PNN) on the GPU card (GeForce 580GTX) using the CUDA programming framework and the C++ programming language. The use of such state-of-the-art technology renders the system capable of redesigning itself on site once additional verified US and DM data are collected. A mixture of US and DM features optimized performance, with over 90% accuracy in correctly classifying the lesions.
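
    A Probabilistic Neural Network is essentially a Parzen-window classifier with Gaussian kernels. The sketch below shows the idea on hypothetical textural feature vectors; it runs on the CPU with NumPy, whereas the study's implementation runs the PNN on a GPU via CUDA and C++, and the kernel width used here is arbitrary.

        import numpy as np

        def pnn_classify(train_x, train_y, test_x, sigma=0.5):
            """Probabilistic Neural Network: Gaussian Parzen-window density per class;
            assign each test vector to the class with the largest average kernel response."""
            classes = np.unique(train_y)
            preds = []
            for x in test_x:
                d2 = np.sum((train_x - x) ** 2, axis=1)
                k = np.exp(-d2 / (2.0 * sigma ** 2))
                scores = [k[train_y == c].mean() for c in classes]
                preds.append(classes[int(np.argmax(scores))])
            return np.array(preds)

        # Hypothetical textural feature vectors (rows) for benign (0) / malignant (1) lesions.
        rng = np.random.default_rng(1)
        benign    = rng.normal(0.0, 1.0, size=(30, 8))
        malignant = rng.normal(1.5, 1.0, size=(30, 8))
        X = np.vstack([benign, malignant])
        y = np.array([0] * 30 + [1] * 30)
        print(pnn_classify(X, y, rng.normal(1.5, 1.0, size=(5, 8))))  # mostly 1s expected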

  5. Perceptual Image Compression in Telemedicine

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Ahumada, Albert J., Jr.; Eckstein, Miguel; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    The next era of space exploration, especially the "Mission to Planet Earth", will generate immense quantities of image data. For example, the Earth Observing System (EOS) is expected to generate in excess of one terabyte/day. NASA confronts a major technical challenge in managing this great flow of imagery: in collection, pre-processing, transmission to earth, archiving, and distribution to scientists at remote locations. Expected requirements in most of these areas clearly exceed current technology. Part of the solution to this problem lies in efficient image compression techniques. For much of this imagery, the ultimate consumer is the human eye. In this case image compression should be designed to match the visual capacities of the human observer. We have developed three techniques for optimizing image compression for the human viewer. The first consists of a formula, developed jointly with IBM and based on psychophysical measurements, that computes a DCT quantization matrix for any specified combination of viewing distance, display resolution, and display brightness. This DCT quantization matrix is used in most recent standards for digital image compression (JPEG, MPEG, CCITT H.261). The second technique optimizes the DCT quantization matrix for each individual image, based on the contents of the image. This is accomplished by means of a model of visual sensitivity to compression artifacts. The third technique extends the first two techniques to the realm of wavelet compression. Together these techniques will allow systematic perceptual optimization of image compression in NASA imaging systems. Many of the image management challenges faced by NASA are mirrored in the field of telemedicine. Here too there are severe demands for transmission and archiving of large image databases, and the imagery is ultimately used primarily by human observers, such as radiologists. In this presentation I will describe some of our preliminary explorations of the applications of our technology to the special problems of telemedicine.
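
    The first technique, a DCT quantization matrix applied block-wise, can be sketched as follows. The matrix used here is a flat placeholder; the formula described above would instead vary its entries with spatial frequency according to viewing distance, display resolution and display brightness. The sketch assumes SciPy's DCT routines and is not the authors' code.

        import numpy as np
        from scipy.fftpack import dct, idct

        def dct2(block):
            return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

        def idct2(block):
            return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

        def quantize_blocks(image, qmatrix):
            """Blockwise 8x8 DCT quantization: divide each coefficient by its entry in
            the (perceptually derived) quantization matrix and round."""
            h, w = image.shape
            out = np.zeros_like(image, dtype=float)
            for r in range(0, h - 7, 8):
                for c in range(0, w - 7, 8):
                    coeffs = dct2(image[r:r + 8, c:c + 8].astype(float))
                    q = np.round(coeffs / qmatrix)
                    out[r:r + 8, c:c + 8] = idct2(q * qmatrix)
            return out

        # Illustrative flat quantization matrix; a perceptual model would vary these
        # entries with spatial frequency, viewing distance and display luminance.
        Q = np.full((8, 8), 16.0)
        img = np.random.rand(64, 64) * 255.0
        print(np.abs(quantize_blocks(img, Q) - img).mean())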

  6. Global manipulation of digital images can lead to variation in cytological diagnosis

    PubMed Central

    Prasad, H; Wanjari, Sangeeta; Parwani, Rajkumar

    2011-01-01

    Background: With the adoption of a completely electronic workflow by several journals and the advent of telepathology, digital imaging has become an integral part of scientific research. However, manipulating digital images is very easy, and it can lead to misinterpretations. Aim: To analyse the impact of manipulating digital images on their diagnosis. Design: Digital images were obtained from Papanicolaou-stained smears of dysplastic and normal oral epithelium. They were manipulated using GNU Image Manipulation Program (GIMP) to alter their brightness, contrast and color levels. A PowerPoint presentation composed of slides of these manipulated images along with the unaltered originals arranged randomly was created. The presentation was shown individually to five observers, who rated the images as normal, mild, moderate or severe dysplasia. The weighted κ statistic was used to measure and assess the levels of agreement between observers. Results: Levels of agreement between manipulated images and original images varied greatly among observers. Variation in diagnosis was in the form of overdiagnosis or under-diagnosis, usually by one grade. Conclusion: Global manipulations of digital images of cytological slides can significantly affect their interpretation. Such manipulations should therefore be kept to a minimum, and avoided wherever possible. PMID:21572507

  7. Global manipulation of digital images can lead to variation in cytological diagnosis.

    PubMed

    Prasad, H; Wanjari, Sangeeta; Parwani, Rajkumar

    2011-03-31

    With the adoption of a completely electronic workflow by several journals and the advent of telepathology, digital imaging has become an integral part of scientific research. However, manipulating digital images is very easy, and it can lead to misinterpretations. The aim of this study was to analyse the impact of manipulating digital images on their diagnosis. Digital images were obtained from Papanicolaou-stained smears of dysplastic and normal oral epithelium. They were manipulated using GNU Image Manipulation Program (GIMP) to alter their brightness, contrast and color levels. A PowerPoint presentation composed of slides of these manipulated images along with the unaltered originals arranged randomly was created. The presentation was shown individually to five observers, who rated the images as normal, mild, moderate or severe dysplasia. The weighted κ statistic was used to measure and assess the levels of agreement between observers. Levels of agreement between manipulated images and original images varied greatly among observers. Variation in diagnosis was in the form of overdiagnosis or under-diagnosis, usually by one grade. Global manipulations of digital images of cytological slides can significantly affect their interpretation. Such manipulations should therefore be kept to a minimum, and avoided wherever possible.

  8. Digital holographic image fusion for a larger size object using compressive sensing

    NASA Astrophysics Data System (ADS)

    Tian, Qiuhong; Yan, Liping; Chen, Benyong; Yao, Jiabao; Zhang, Shihua

    2017-05-01

    Digital holographic image fusion for a larger size object using compressive sensing is proposed. In this method, the high frequency component of the digital hologram under discrete wavelet transform is represented sparsely by using compressive sensing, so that the data redundancy of digital holographic recording can be reduced effectively; the low frequency component is retained in full to ensure image quality; and multiple reconstructed images with different clear parts, corresponding to the laser spot size, are fused to produce a high quality reconstructed image of a larger size object. In addition, a filter combining high-pass and low-pass filters is designed to remove the zero-order term from a digital hologram effectively. A digital holographic experimental setup based on off-axis Fresnel digital holography was constructed, and feasibility and comparative experiments were carried out. The fused image was evaluated by using the Tamura texture features. The experimental results demonstrated that the proposed method can improve the processing efficiency and visual characteristics of the fused image and enlarge the size of the measured object effectively.

  9. Multi-task transfer learning deep convolutional neural network: application to computer-aided diagnosis of breast cancer on mammograms

    NASA Astrophysics Data System (ADS)

    Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Helvie, Mark A.; Cha, Kenny H.; Richter, Caleb D.

    2017-12-01

    Transfer learning in deep convolutional neural networks (DCNNs) is an important step in its application to medical imaging tasks. We propose a multi-task transfer learning DCNN with the aim of translating the ‘knowledge’ learned from non-medical images to medical diagnostic tasks through supervised training and increasing the generalization capabilities of DCNNs by simultaneously learning auxiliary tasks. We studied this approach in an important application: classification of malignant and benign breast masses. With Institutional Review Board (IRB) approval, digitized screen-film mammograms (SFMs) and digital mammograms (DMs) were collected from our patient files and additional SFMs were obtained from the Digital Database for Screening Mammography. The data set consisted of 2242 views with 2454 masses (1057 malignant, 1397 benign). In single-task transfer learning, the DCNN was trained and tested on SFMs. In multi-task transfer learning, SFMs and DMs were used to train the DCNN, which was then tested on SFMs. N-fold cross-validation with the training set was used for training and parameter optimization. On the independent test set, the multi-task transfer learning DCNN was found to have significantly (p  =  0.007) higher performance compared to the single-task transfer learning DCNN. This study demonstrates that multi-task transfer learning may be an effective approach for training DCNN in medical imaging applications when training samples from a single modality are limited.

  10. Validation of a new digital breast tomosynthesis medical display

    NASA Astrophysics Data System (ADS)

    Marchessoux, Cédric; Vivien, Nicolas; Kumcu, Asli; Kimpe, Tom

    2011-03-01

    The main objective of this study is to evaluate and validate the new Barco medical display MDMG-5221, which has been optimized for the Digital Breast Tomosynthesis (DBT) imaging modality, and to prove the benefit of the new DBT display in terms of image quality and clinical performance. The clinical performance is evaluated by the detection of micro-calcifications inserted in reconstructed Digital Breast Tomosynthesis slices. The slices are shown in dynamic cine loops, at two frame rates. The statistical analysis chosen for this study is the Receiver Operating Characteristic Multiple-Reader, Multiple-Case methodology, in order to measure the clinical performance of the two displays. Four experienced radiologists are involved in this study. For this clinical study, 50 normal and 50 abnormal independent datasets were used. The result is that the new display outperforms the mammography display for a signal detection task using real DBT images viewed at 25 and 50 slices per second. In the case of 50 slices per second, the p-value = 0.0664. For a cut-off where alpha=0.05, the conclusion is that the null hypothesis cannot be rejected; however, the trend is that the new display performs 6% better than the old display in terms of AUC. At 25 slices per second, the difference between the two displays is very apparent. The new display outperforms the mammography display by 10% in terms of AUC, with a good statistical significance of p=0.0415.

  11. Using the value of Lin's concordance correlation coefficient as a criterion for efficient estimation of areas of leaves of eelgrass from noisy digital images.

    PubMed

    Echavarría-Heras, Héctor; Leal-Ramírez, Cecilia; Villa-Diharce, Enrique; Castillo, Oscar

    2014-01-01

    Eelgrass is a cosmopolitan seagrass species that provides important ecological services in coastal and near-shore environments. Despite its relevance, loss of eelgrass habitats is noted worldwide. Restoration by replanting plays an important role, and accurate measurements of the standing crop and productivity of transplants are important for evaluating restoration of the ecological functions of natural populations. Traditional assessments are destructive, and although they do not harm natural populations, in transplants the destruction of shoots might cause undesirable alterations. Non-destructive assessments of the aforementioned variables are obtained through allometric proxies expressed in terms of measurements of the lengths or areas of leaves. Digital imagery could produce measurements of leaf attributes without the removal of shoots, but sediment attachments, damage inflicted by drag forces, or humidity content induce noise effects, reducing precision. Available techniques for dealing with noise caused by humidity content on leaves use the concepts of adjacency, vicinity, connectivity and tolerance of similarity between pixels. Selection of an interval of tolerance of similarity for efficient measurements requires extended computational routines with tied statistical inferences, making the concomitant tasks complicated and time consuming. The present approach proposes a simplified and cost-effective alternative, and also a general tool aimed at dealing with any sort of noise modifying eelgrass leaf images. Moreover, this selection criterion relies only on a single statistic: the maximum value of the Concordance Correlation Coefficient for reproducibility of observed leaf areas through proxies obtained from digital images. Available data reveal that the present method delivers simplified, consistent estimations of areas of eelgrass leaves taken from noisy digital images. Moreover, the proposed procedure is robust because both the optimal interval of tolerance of similarity and the reproducibility of observed leaf areas through digital image surrogates were independent of sample size. The present method provides simplified, unbiased and non-destructive measurements of eelgrass leaf area. These measurements, in conjunction with allometric methods, can predict the dynamics of eelgrass biomass and leaf growth through indirect techniques, reducing the destructive effect of sampling, which is fundamental to the evaluation of eelgrass restoration projects, thereby contributing to the conservation of this important seagrass species.
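
    The selection criterion described above reduces to computing Lin's concordance correlation coefficient between observed leaf areas and image-derived estimates, and keeping the tolerance-of-similarity value that maximizes it. The sketch below uses the standard CCC formula with hypothetical data and an invented estimator; it is not the authors' code.

        import numpy as np

        def lins_ccc(x, y):
            """Lin's concordance correlation coefficient between observed values x
            and image-derived estimates y."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()
            cov = ((x - mx) * (y - my)).mean()
            return 2.0 * cov / (vx + vy + (mx - my) ** 2)

        def best_tolerance(observed_areas, estimator, tolerances):
            """Pick the similarity-tolerance value whose leaf-area estimates agree best
            with the observed areas, in the sense of maximum CCC."""
            scores = [lins_ccc(observed_areas, estimator(t)) for t in tolerances]
            return tolerances[int(np.argmax(scores))], max(scores)

        # Hypothetical example: estimates degrade away from a 'true' tolerance of 0.4.
        rng = np.random.default_rng(2)
        obs = rng.uniform(2.0, 12.0, size=50)   # observed leaf areas (cm^2)

        def est(t):
            return obs + rng.normal(0.0, 0.2 + 4.0 * abs(t - 0.4), size=obs.size)

        print(best_tolerance(obs, est, np.linspace(0.1, 0.9, 9)))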

  12. Performance evaluation of a retrofit digital detector-based mammography system.

    PubMed

    Marshall, Nicholas W; van Ongeval, Chantal; Bosmans, Hilde

    2016-02-01

    A retrofit flat panel detector was integrated with a GE DMR+ analog mammography system and characterized using detective quantum efficiency (DQE). Technical system performance was evaluated using the European Guidelines protocol, followed by a limited evaluation of clinical image quality for 20 cases using image quality criteria in the European Guidelines. Optimal anode/filter selections were established using signal difference-to-noise ratio measurements. Only small differences in peak DQE were seen between the three anode/filter settings, with an average value of 0.53. For poly(methyl methacrylate) (PMMA) thicknesses above 60 mm, the Rh/Rh setting was the optimal anode/filter setting. The system required a mean glandular dose of 0.54 mGy at 30 kV Rh/Rh to reach the Acceptable gold thickness limit for 0.1 mm details. Imaging performance of the retrofit unit with the GE DMR+ is notably better than that of powder-based computed radiography systems and is comparable to current flat panel FFDM systems. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  13. A graph-based watershed merging using fuzzy C-means and simulated annealing for image segmentation

    NASA Astrophysics Data System (ADS)

    Vadiveloo, Mogana; Abdullah, Rosni; Rajeswari, Mandava

    2015-12-01

    In this paper, we have addressed the issue of over-segmented regions produced by watershed by merging the regions using a global feature. The global feature information is obtained from clustering the image in its feature space using Fuzzy C-Means (FCM) clustering. The over-segmented regions produced by performing watershed on the gradient of the image are then mapped to this global information in the feature space. Further to this, the global feature information is optimized using Simulated Annealing (SA). The optimal global feature information is used to derive the similarity criterion to merge the over-segmented watershed regions, which are represented by the region adjacency graph (RAG). The proposed method has been tested on a simulated digital brain phantom dataset to segment white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) soft tissue regions. The experiments showed that the proposed method performs statistically better than immersion watershed, with an average of 95.242% of regions merged, and achieves an average accuracy improvement of 8.850% in comparison with RAG-based immersion watershed merging using global and local features.
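
    The pipeline described above (watershed over-segmentation on the image gradient, followed by merging regions through a global clustering of their features) can be sketched as follows. The sketch assumes scikit-image for the gradient and watershed, uses a small hand-written fuzzy C-means on region mean intensities, and omits the simulated annealing refinement and the RAG-based similarity criterion of the paper.

        import numpy as np
        from skimage.filters import sobel
        from skimage.segmentation import watershed

        def fuzzy_cmeans(values, n_clusters=3, m=2.0, iters=50, seed=0):
            """Tiny 1-D fuzzy C-means: returns cluster centres and hard labels."""
            rng = np.random.default_rng(seed)
            v = np.asarray(values, float)
            centres = rng.choice(v, n_clusters, replace=False)
            for _ in range(iters):
                d = np.abs(v[:, None] - centres[None, :]) + 1e-9
                u = 1.0 / (d ** (2.0 / (m - 1.0)))          # fuzzy memberships
                u /= u.sum(axis=1, keepdims=True)
                centres = (u ** m * v[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
            return centres, np.argmin(np.abs(v[:, None] - centres[None, :]), axis=1)

        def merge_watershed_by_fcm(image, n_clusters=3):
            """Over-segment with watershed on the gradient, then merge regions whose
            mean intensities fall in the same fuzzy C-means cluster."""
            labels = watershed(sobel(image))                 # over-segmented regions
            region_ids = np.unique(labels)
            means = np.array([image[labels == r].mean() for r in region_ids])
            _, cluster_of_region = fuzzy_cmeans(means, n_clusters)
            lut = dict(zip(region_ids, cluster_of_region))
            return np.vectorize(lut.get)(labels)

        # Hypothetical image with three intensity populations plus noise.
        rng = np.random.default_rng(3)
        img = np.concatenate([np.full((40, 120), v) for v in (0.2, 0.5, 0.8)], axis=0)
        img += rng.normal(0.0, 0.03, img.shape)
        print(np.unique(merge_watershed_by_fcm(img)))   # ideally three merged classes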

  14. Digital Radiographic Image Processing and Analysis.

    PubMed

    Yoon, Douglas C; Mol, André; Benn, Douglas K; Benavides, Erika

    2018-07-01

    This article describes digital radiographic imaging and analysis from the basics of image capture to examples of some of the most advanced digital technologies currently available. The principles underlying the imaging technologies are described to provide a better understanding of their strengths and limitations. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. 42 CFR 37.51 - Interpreting and classifying chest radiographs-digital radiography systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... chest radiographic images provided for use with the Guidelines for the Use of the ILO International... standard digital images may be used for classifying digital chest images for pneumoconiosis. Modification of the appearance of the standard images using software tools is not permitted. (d) Viewing systems...

  16. Integrating Digital Images into the Art and Art History Curriculum.

    ERIC Educational Resources Information Center

    Pitt, Sharon P.; Updike, Christina B.; Guthrie, Miriam E.

    2002-01-01

    Describes an Internet-based image database system connected to a flexible, in-class teaching and learning tool (the Madison Digital Image Database) developed at James Madison University to bring digital images to the arts and humanities classroom. Discusses content, copyright issues, ensuring system effectiveness, instructional impact, sharing the…

  17. The use of digital images in pathology.

    PubMed

    Furness, P N

    1997-11-01

    Digital images are routinely used by the publishing industry, but most diagnostic pathologists are unfamiliar with the technology and its possibilities. This review aims to explain the basic principles of digital image acquisition, storage, manipulation and use, and the possibilities provided not only in research, but also in teaching and in routine diagnostic pathology. Images of natural objects are usually expressed digitally as 'bitmaps'--rectilinear arrays of small dots. The size of each dot can vary, but so can its information content in terms, for example, of colour, greyscale or opacity. Various file formats and compression algorithms are available. Video cameras connected to microscopes are familiar to most pathologists; video images can be converted directly to a digital form by a suitably equipped computer. Digital cameras and scanners are alternative acquisition tools of relevance to pathologists. Once acquired, a digital image can easily be subjected to the digital equivalent of any conventional darkroom manipulation and modern software allows much more flexibility, to such an extent that a new tool for scientific fraud has been created. For research, image enhancement and analysis is an increasingly powerful and affordable tool. Morphometric measurements are, after many predictions, at last beginning to be part of the toolkit of the diagnostic pathologist. In teaching, the potential to create dramatic yet informative presentations is demonstrated daily by the publishing industry; such methods are readily applicable to the classroom. The combination of digital images and the Internet raises many possibilities; for example, instead of seeking one expert diagnostic opinion, one could simultaneously seek the opinion of many, all around the globe. It is inevitable that in the coming years the use of digital images will spread from the laboratory to the medical curriculum and to the whole of diagnostic pathology.

  18. Super-resolution for scanning light stimulation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bitzer, L. A.; Neumann, K.; Benson, N., E-mail: niels.benson@uni-due.de

    Super-resolution (SR) is a technique used in digital image processing to overcome the resolution limitation of imaging systems. In this process, a single high resolution image is reconstructed from multiple low resolution images. SR is commonly used for CCD and CMOS (Complementary Metal-Oxide-Semiconductor) sensor images, as well as for medical applications, e.g., magnetic resonance imaging. Here, we demonstrate that super-resolution can be applied with scanning light stimulation (LS) systems, which are commonly used to obtain space-resolved electro-optical parameters of a sample. For our purposes, the Projection Onto Convex Sets (POCS) algorithm was chosen and modified to suit the needs of LS systems. To demonstrate the SR adaptation, an Optical Beam Induced Current (OBIC) LS system was used. The POCS algorithm was optimized by means of OBIC short circuit current measurements on a multicrystalline solar cell, resulting in a mean square error reduction of up to 61% and improved image quality.
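
    A toy version of POCS super-resolution can be written in a few lines when the low-resolution frames are modelled as integer-shifted decimations of the high-resolution image: each projection forces the current estimate to reproduce one observation. This is a deliberately simplified sketch, not the modified algorithm of the study, and all names and parameters are assumptions.

        import numpy as np

        def pocs_superres(lr_frames, shifts, scale=2, iters=20):
            """Minimal POCS reconstruction: each low-resolution frame k is assumed to be
            the high-resolution image shifted by an integer offset and decimated by
            `scale`. Each projection forces the current HR estimate to reproduce it."""
            h, w = lr_frames[0].shape
            hr = np.kron(lr_frames[0], np.ones((scale, scale)))    # initial estimate
            for _ in range(iters):
                for lr, (dy, dx) in zip(lr_frames, shifts):
                    sampled = hr[dy::scale, dx::scale][:h, :w]
                    residual = lr - sampled
                    hr[dy::scale, dx::scale][:h, :w] += residual   # project onto constraint set
            return hr

        # Hypothetical ground truth and four shifted, decimated observations.
        rng = np.random.default_rng(4)
        truth = rng.random((64, 64))
        offsets = [(0, 0), (0, 1), (1, 0), (1, 1)]
        frames = [truth[dy::2, dx::2] for dy, dx in offsets]
        estimate = pocs_superres(frames, offsets)
        print(np.abs(estimate - truth).mean())   # near zero for this toy case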

  19. Data mining and visualization of average images in a digital hand atlas

    NASA Astrophysics Data System (ADS)

    Zhang, Aifeng; Gertych, Arkadiusz; Liu, Brent J.; Huang, H. K.

    2005-04-01

    We have collected a digital hand atlas containing digitized left hand radiographs of normally developed children grouped by age, sex, and race. A set of features stored in a database, reflecting each patient's stage of skeletal development, has been calculated by automatic image processing procedures. This paper addresses a new concept, the "average" image in the digital hand atlas. The "average" reference image in the digital atlas is selected for each group of normally developed children as the image whose bony features best represent the group's skeletal maturity. A data mining procedure was designed and applied to find the average image through average feature vector matching. It also provides a temporary solution for the missing feature problem through polynomial regression. As more cases are added to the digital hand atlas, it can grow to provide clinicians with accurate reference images to aid the bone age assessment process.

  20. Automatic segmentation of mammogram and tomosynthesis images

    NASA Astrophysics Data System (ADS)

    Sargent, Dusty; Park, Sun Young

    2016-03-01

    Breast cancer is one of the most common forms of cancer in terms of new cases and deaths both in the United States and worldwide. However, the survival rate with breast cancer is high if it is detected and treated before it spreads to other parts of the body. The most common screening methods for breast cancer are mammography and digital tomosynthesis, which involve acquiring X-ray images of the breasts that are interpreted by radiologists. The work described in this paper is aimed at optimizing the presentation of mammography and tomosynthesis images to the radiologist, thereby improving the early detection rate of breast cancer and the resulting patient outcomes. Breast cancer tissue has greater density than normal breast tissue, and appears as dense white image regions that are asymmetrical between the breasts. These irregularities are easily seen if the breast images are aligned and viewed side-by-side. However, since the breasts are imaged separately during mammography, the images may be poorly centered and aligned relative to each other, and may not properly focus on the tissue area. Similarly, although a full three dimensional reconstruction can be created from digital tomosynthesis images, the same centering and alignment issues can occur for digital tomosynthesis. Thus, a preprocessing algorithm that aligns the breasts for easy side-by-side comparison has the potential to greatly increase the speed and accuracy of mammogram reading. Likewise, the same preprocessing can improve the results of automatic tissue classification algorithms for mammography. In this paper, we present an automated segmentation algorithm for mammogram and tomosynthesis images that aims to improve the speed and accuracy of breast cancer screening by mitigating the above mentioned problems. Our algorithm uses information in the DICOM header to facilitate preprocessing, and incorporates anatomical region segmentation and contour analysis, along with a hidden Markov model (HMM) for processing the multi-frame tomosynthesis images. The output of the algorithm is a new set of images that have been processed to show only the diagnostically relevant region and align the breasts so that they can be easily compared side-by-side. Our method has been tested on approximately 750 images, including various examples of mammogram, tomosynthesis, and scanned images, and has correctly segmented the diagnostically relevant image region in 97% of cases.
