Sample records for distributed image processing

  1. Synthetic Foveal Imaging Technology

    NASA Technical Reports Server (NTRS)

    Nikzad, Shouleh (Inventor); Monacos, Steve P. (Inventor); Hoenk, Michael E. (Inventor)

    2013-01-01

    Apparatuses and methods are disclosed that create a synthetic fovea in order to identify and highlight interesting portions of an image for further processing and rapid response. Synthetic foveal imaging implements a parallel processing architecture that uses reprogrammable logic to implement embedded, distributed, real-time foveal image processing from different sensor types while simultaneously allowing for lossless storage and retrieval of raw image data. Real-time, distributed, adaptive processing of multi-tap image sensors with coordinated processing hardware used for each output tap is enabled. In mosaic focal planes, a parallel-processing network can be implemented that treats the mosaic focal plane as a single ensemble rather than a set of isolated sensors. Various applications are enabled for imaging and robotic vision where processing and responding to enormous amounts of data quickly and efficiently is important.

  2. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process includes the steps of retrieving, from a defect image database, a selection of images, each having image content similar to the content extracted from a query image depicting a defect and each having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process-step defect identification includes the steps of characterizing anomalies in a product, the anomalies detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
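
    The core statistical step of the patent — deriving a conditional probability distribution over process steps from the characterization data of retrieved similar-defect images — can be sketched as follows (a minimal illustration; the step labels and function name are hypothetical, not from the patent):

```python
import numpy as np

def errant_step_posterior(retrieved_steps):
    """Estimate P(process step | defect) from the process-step labels of
    images retrieved from the defect database, and return the most
    probable errant step. `retrieved_steps` is a list of step names."""
    steps, counts = np.unique(retrieved_steps, return_counts=True)
    probs = counts / counts.sum()
    dist = dict(zip(steps.tolist(), probs.tolist()))
    best = steps[np.argmax(probs)]
    return best, dist

# Suppose the query returned 10 similar defect images with these labels:
labels = ["etch"] * 6 + ["litho"] * 3 + ["deposition"]
step, dist = errant_step_posterior(labels)
```

The highest-probability step is then flagged as the likely source of the defect.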

  3. Information Acquisition, Analysis and Integration

    DTIC Science & Technology

    2016-08-03

    Topics: theory and applications of sensing and processing; signal processing; image and video processing; machine learning; technology transfer. Reports new approaches to classical problems such as image and video deblurring. Cited work includes: Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolutional factor analysis," IEEE.

  4. Ship Detection in SAR Image Based on the Alpha-stable Distribution

    PubMed Central

    Wang, Changcheng; Liao, Mingsheng; Li, Xiaofeng

    2008-01-01

    This paper describes an improved Constant False Alarm Rate (CFAR) ship detection algorithm for spaceborne synthetic aperture radar (SAR) images based on the Alpha-stable distribution model. Typically, the CFAR algorithm uses the Gaussian distribution model to describe the statistical characteristics of SAR image background clutter. However, the Gaussian distribution is only valid for multilook SAR images in which several radar looks are averaged. As sea clutter in SAR images shows spiky or heavy-tailed characteristics, the Gaussian distribution often fails to describe background sea clutter. In this study, we replace the Gaussian distribution with the Alpha-stable distribution, which is widely used in impulsive or spiky signal processing, to describe the background sea clutter in SAR images. In our proposed algorithm, an initial step detects possible ship targets. Then, similar to the typical two-parameter CFAR algorithm, a local process is applied to each pixel identified as a possible target. A RADARSAT-1 image is used to validate the Alpha-stable-distribution-based algorithm, and known ship locations at the time of the RADARSAT-1 SAR image acquisition are used to validate the ship detection results. Validation shows that the CFAR algorithm based on the Alpha-stable distribution improves on the CFAR algorithm based on the Gaussian distribution. PMID:27873794
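
    For reference, the baseline two-parameter CFAR that the paper improves upon thresholds each pixel against the mean and standard deviation of a surrounding training band (a pure-numpy sketch under Gaussian clutter; the paper's contribution is to replace these Gaussian statistics with an Alpha-stable clutter model):

```python
import numpy as np

def two_parameter_cfar(img, guard=2, train=6, k=4.0):
    """Sliding-window two-parameter CFAR: a pixel is declared a target
    when it exceeds mean + k*std of the surrounding training band; the
    guard band immediately around the pixel is excluded from the stats."""
    h, w = img.shape
    det = np.zeros_like(img, dtype=bool)
    r = guard + train
    for i in range(r, h - r):
        for j in range(r, w - r):
            win = img[i - r:i + r + 1, j - r:j + r + 1].copy()
            # mask out the guard band and the cell under test
            win[train:train + 2 * guard + 1,
                train:train + 2 * guard + 1] = np.nan
            mu, sigma = np.nanmean(win), np.nanstd(win)
            det[i, j] = img[i, j] > mu + k * sigma
    return det

rng = np.random.default_rng(0)
clutter = rng.normal(0.0, 1.0, (40, 40))
clutter[20, 20] = 30.0           # bright ship-like target
det = two_parameter_cfar(clutter)
```

Swapping the Gaussian mean/std threshold for quantiles of a fitted Alpha-stable model gives the heavy-tailed variant the paper proposes.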

  5. A Hadoop-Based Distributed Framework for Efficiently Managing and Processing Big Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors on airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data because the problems are both data- and computing-intensive. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be fetched directly from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be processed directly, in parallel, in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
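
    The MapReduce dataflow the framework relies on can be illustrated without Hadoop: map tasks compute a partial result per image tile, and a reduce task merges them (a toy single-process stand-in, not the actual HDFS/Orfeo pipeline):

```python
import numpy as np

def map_tile(tile, bins=16):
    # Map phase: each worker computes a partial histogram of its tile.
    hist, _ = np.histogram(tile, bins=bins, range=(0, 256))
    return hist

def reduce_hists(partials):
    # Reduce phase: partial histograms are summed into the global one.
    return np.sum(partials, axis=0)

rng = np.random.default_rng(1)
scene = rng.integers(0, 256, size=(256, 256))   # stand-in for a big scene
tiles = [scene[i:i + 64, j:j + 64]
         for i in range(0, 256, 64)
         for j in range(0, 256, 64)]
global_hist = reduce_hists([map_tile(t) for t in tiles])
```

Because the tiles partition the scene, the reduced histogram equals the histogram of the whole image — the property that lets Hadoop distribute the map tasks freely.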

  6. Gaussian Process Interpolation for Uncertainty Estimation in Image Registration

    PubMed Central

    Wachinger, Christian; Golland, Polina; Reuter, Martin; Wells, William

    2014-01-01

    Intensity-based image registration requires resampling images on a common grid to evaluate the similarity function. The uncertainty of interpolation varies across the image, depending on the location of resampled points relative to the base grid. We propose to perform Bayesian inference with Gaussian processes, where the covariance matrix of the Gaussian process posterior distribution estimates the uncertainty in interpolation. The Gaussian process replaces a single image with a distribution over images that we integrate into a generative model for registration. Marginalization over resampled images leads to a new similarity measure that includes the uncertainty of the interpolation. We demonstrate that our approach increases the registration accuracy and propose an efficient approximation scheme that enables seamless integration with existing registration methods. PMID:25333127
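
    The idea — a Gaussian process posterior whose covariance quantifies interpolation uncertainty as a function of distance to the base grid — can be sketched in 1D (assumed RBF kernel and hyperparameters; the paper integrates this posterior into a registration similarity measure):

```python
import numpy as np

def gp_posterior(x_train, y_train, x_star, ell=1.0, sig=1.0, noise=1e-3):
    """GP regression with an RBF kernel: returns posterior mean and
    covariance at the resampled points x_star. The posterior covariance
    is the interpolation-uncertainty estimate."""
    def k(a, b):
        return sig**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = k(x_star, x_train)
    Kss = k(x_star, x_star)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, cov

x = np.arange(0.0, 5.0)            # base grid (pixel locations)
y = np.sin(x)                      # observed intensities
xs = np.array([2.0, 2.5])          # on-grid vs. off-grid resampling points
mean, cov = gp_posterior(x, y, xs)
```

The posterior variance is small at a base-grid point and larger halfway between grid points, which is exactly the spatially varying uncertainty the similarity measure marginalizes over.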

  7. A cost-effective line-based light-balancing technique using adaptive processing.

    PubMed

    Hsia, Shih-Chang; Chen, Ming-Huei; Chen, Yu-Min

    2006-09-01

    Camera imaging systems are widely used; however, the displayed image often has an unequal light distribution. This paper presents novel light-balancing techniques that compensate for uneven illumination using adaptive signal processing. For text images, we first estimate the background level and then process each pixel with a nonuniform gain. This algorithm balances the light distribution while keeping high contrast in the image. For graphic images, adaptive section control using a piecewise nonlinear gain is proposed to equalize the histogram. Simulations show that the light-balancing performance is better than that of other methods. Moreover, we employ line-based processing to efficiently reduce the memory requirement and the computational cost, making the approach applicable to real-time systems.
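
    The background-estimation-plus-nonuniform-gain idea for text images can be sketched line by line (the window size and gain rule are assumptions, not the paper's exact adaptive scheme):

```python
import numpy as np

def balance_lines(img, win=31, target=None):
    """Line-based light balancing: for each row, estimate the slowly
    varying background with a moving average, then apply a per-pixel
    gain that pulls the background to a common level."""
    img = img.astype(float)
    kernel = np.ones(win) / win
    bg = np.vstack([np.convolve(row, kernel, mode="same") for row in img])
    if target is None:
        target = bg.mean()
    gain = target / np.maximum(bg, 1e-6)   # nonuniform per-pixel gain
    return np.clip(img * gain, 0, 255)

x = np.linspace(0.5, 1.5, 128)             # uneven illumination profile
page = np.tile(100.0 * x, (64, 1))         # flat content under that light
balanced = balance_lines(page)
```

Processing one line at a time is what keeps the memory footprint small enough for real-time hardware.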

  8. Estimation of the Scatterer Distribution of the Cirrhotic Liver using Ultrasonic Image

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tadashi; Hachiya, Hiroyuki

    1998-05-01

    In the B-mode image of the liver obtained by an ultrasonic imaging system, the speckle pattern changes with the progression of disease such as liver cirrhosis. In this paper we present the statistical characteristics of the echo envelope of the liver, and a technique to extract information on the scatterer distribution from normal and cirrhotic liver images using constant false alarm rate (CFAR) processing. We analyze the relationship between the extracted scatterer distribution and the stage of liver cirrhosis. The ratio of the area in which the amplitude of the processed signal exceeds the threshold to the entire processed image area is related quantitatively to the stage of liver cirrhosis. It is found that the proposed technique is valid for the quantitative diagnosis of liver cirrhosis.

  9. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
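
    The decomposition step — splitting the image into sub-images padded with ghost (halo) layers so each worker can filter its piece independently and the stitched result matches the global computation — can be sketched as follows (single-process stand-in for the distributed implementation; the strip layout and toy filter are illustrative):

```python
import numpy as np

def split_with_halo(img, n, halo):
    """Decompose an image into n horizontal strips, each padded with a
    halo of rows from its neighbours so a local filter can run without
    inter-process communication at the seams."""
    h = img.shape[0]
    edges = np.linspace(0, h, n + 1).astype(int)
    strips = []
    for a, b in zip(edges[:-1], edges[1:]):
        lo, hi = max(0, a - halo), min(h, b + halo)
        strips.append((img[lo:hi], a - lo, b - a))  # (data, offset, core rows)
    return strips

def process_and_stitch(strips, filt):
    # Each "worker" filters its padded strip; only core rows are kept.
    out = [filt(s)[off:off + core] for s, off, core in strips]
    return np.vstack(out)

def vsmooth(a):
    # toy local filter: 3-row vertical moving average (radius 1)
    k = np.array([1.0, 1.0, 1.0]) / 3.0
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, a)

rng = np.random.default_rng(2)
img = rng.normal(size=(50, 8))
stitched = process_and_stitch(split_with_halo(img, n=4, halo=1), vsmooth)
```

As long as the halo width is at least the filter radius, the stitched result is identical to filtering the whole image at once.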

  10. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  11. A mobile ferromagnetic shape detection sensor using a Hall sensor array and magnetic imaging.

    PubMed

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a mobile Hall sensor array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the mobile Hall sensor array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array setup. Magnetic imaging of the magnetic flux distribution is performed by a signal processing unit before real-time images are displayed on a netbook. Signal processing application software is developed to acquire and process the 1-D Hall sensor array signals and construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as squares, circles and triangles are determined through magnetic images based on the voltage signal and the magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are presented to demonstrate the functionality of the mobile Hall sensor array system for practical shape detection. The results show that the mobile Hall sensor array system is able to perform magnetic imaging that identifies various ferromagnetic materials.

  12. A Mobile Ferromagnetic Shape Detection Sensor Using a Hall Sensor Array and Magnetic Imaging

    PubMed Central

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a Mobile Hall Sensor Array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the Mobile Hall Sensor Array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array setup. Magnetic imaging of the magnetic flux distribution is performed by a signal processing unit before real-time images are displayed on a netbook. Signal processing application software is developed to acquire and process the 1-D Hall sensor array signals and construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as squares, circles and triangles are determined through magnetic images based on the voltage signal and the magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are presented to demonstrate the functionality of the Mobile Hall Sensor Array system for practical shape detection. The results show that the Mobile Hall Sensor Array system is able to perform magnetic imaging that identifies various ferromagnetic materials. PMID:22346653

  13. [Method of correcting sensitivity nonuniformity using gaussian distribution on 3.0 Tesla abdominal MRI].

    PubMed

    Hayashi, Norio; Miyati, Tosiaki; Takanaga, Masako; Ohno, Naoki; Hamaguchi, Takashi; Kozaka, Kazuto; Sanada, Shigeru; Yamamoto, Tomoyuki; Matsui, Osamu

    2011-01-01

    The sensitivity of the phased-array coil used in parallel magnetic resonance imaging (MRI) falls significantly in the direction perpendicular to the coil arrangement. Moreover, in 3.0 tesla (3T) abdominal MRI, image quality is degraded by changes in relaxation times, a stronger magnetic susceptibility effect, and other factors. At the high resonant frequency of 3T MRI, the signal from the depths (central part) of the trunk is reduced. SCIC, a sensitivity-correction method, corrects inadequately: edges are emphasized while the central part is not properly corrected. We therefore investigated correction of the sensitivity nonuniformity of 3T abdominal MR images using a Gaussian distribution. The correction processing consists of the following steps. 1) The center of gravity of the human-body region in the abdominal MR image is calculated. 2) A correction coefficient map is created around the center of gravity using the Gaussian distribution. 3) The sensitivity-corrected image is created from the correction coefficient map and the original image. With Gaussian correction processing, the uniformity of a phantom image, calculated using the NEMA method, improved significantly compared to the original image. In a visual evaluation by radiologists, uniformity was also improved significantly by the Gaussian correction processing. Because it homogeneously improves abdominal images taken with 3T MRI, Gaussian correction processing is considered a very useful technique.
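
    Steps 1)-3) can be sketched as follows (one plausible form of the correction-coefficient map; `sigma` and `strength` are assumed tuning parameters, and the phantom with a central signal dip is simulated):

```python
import numpy as np

def gaussian_correction(img, sigma, strength):
    """Boost the depressed central signal by multiplying the image with
    a coefficient map built from a Gaussian centred on the image's
    centre of gravity (steps 1-3 of the abstract, simplified)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    cy = (ys * img).sum() / img.sum()          # 1) centre of gravity
    cx = (xs * img).sum() / img.sum()
    g = np.exp(-((ys - cy)**2 + (xs - cx)**2)  # 2) coefficient map
               / (2 * sigma**2))
    return img * (1.0 + strength * g)          # 3) corrected image

# Simulated phantom: uniform signal 100 with a Gaussian central dip
h = w = 64
ys, xs = np.mgrid[0:h, 0:w].astype(float)
dip = np.exp(-((ys - 31.5)**2 + (xs - 31.5)**2) / (2 * 20.0**2))
observed = 100.0 * (1.0 - 0.4 * dip)
corrected = gaussian_correction(observed, sigma=20.0, strength=0.5)
```

On the simulated phantom the corrected image is substantially more uniform, mirroring the NEMA-uniformity improvement reported in the abstract.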

  14. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
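
    Particle counting and sizing on a thresholded micrograph reduces to connected-component labelling; a stdlib/numpy sketch of the idea (BFS, 4-connectivity; the thresholded test image is illustrative):

```python
import numpy as np
from collections import deque

def label_particles(binary):
    """4-connected component labelling by breadth-first search; returns
    the label image and the list of particle sizes (pixel counts)."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    sizes = []
    nxt = 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and labels[i, j] == 0:
                nxt += 1                 # found a new particle
                labels[i, j] = nxt
                q = deque([(i, j)])
                count = 0
                while q:
                    y, x = q.popleft()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        v, u = y + dy, x + dx
                        if 0 <= v < h and 0 <= u < w \
                                and binary[v, u] and labels[v, u] == 0:
                            labels[v, u] = nxt
                            q.append((v, u))
                sizes.append(count)
    return labels, sizes

micrograph = np.zeros((10, 10), dtype=bool)
micrograph[1:3, 1:3] = True     # 2x2 particle
micrograph[6:9, 5:9] = True     # 3x4 particle
labels, sizes = label_particles(micrograph)
```

Histogramming `sizes` then gives the particle-size distribution that was previously tallied by hand.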

  15. The application of terahertz pulsed imaging in characterising density distribution of roll-compacted ribbons.

    PubMed

    Zhang, Jianyi; Pei, Chunlei; Schiano, Serena; Heaps, David; Wu, Chuan-Yu

    2016-09-01

    Roll compaction is a commonly used dry granulation process in the pharmaceutical, fine chemical and agrochemical industries for materials sensitive to heat or moisture. The ribbon density distribution plays an important role in controlling the properties of granules (e.g. granule size distribution, porosity and strength), so accurate characterisation of the ribbon density distribution is critical for process control and quality assurance. Terahertz imaging has great application potential here, as terahertz radiation can penetrate most pharmaceutical excipients and the refractive index reflects variations in density and chemical composition. The aim of this study is to explore whether terahertz pulsed imaging is a feasible technique for quantifying ribbon density distribution. Ribbons were made of two grades of microcrystalline cellulose (MCC), Avicel PH102 and DG, using a roll compactor at various process conditions, and the ribbon density variation was investigated using both terahertz imaging and section methods. The density variations obtained from the two methods were compared to explore the reliability and accuracy of the terahertz imaging system. An average refractive index is calculated from the refractive index values in the frequency range between 0.5 and 1.5 THz. It is shown that the refractive index gradually decreases from the middle of the ribbon towards the edges. Variations of the density distribution across the width of the ribbons are also obtained using both the section method and the terahertz imaging system. The terahertz imaging results are in excellent agreement with those obtained using the section method, demonstrating that terahertz imaging is a feasible and rapid tool for characterising ribbon density distributions. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Bidirectional light-scattering image processing method for high-concentration jet sprays

    NASA Astrophysics Data System (ADS)

    Shimizu, I.; Emori, Y.; Yang, W.-J.; Shimoda, M.; Suzuki, T.

    1985-01-01

    In order to study the distributions of droplet size and volume density in high-concentration jet sprays, a new technique is developed, which combines the forward and backward light scattering method and an image processing method. A pulsed ruby laser is used as the light source. The Mie scattering theory is applied to the results obtained from image processing on the scattering photographs. The time history is obtained for the droplet size and volume density distributions, and the method is demonstrated by diesel fuel sprays under various injecting conditions. The validity of the technique is verified by a good agreement in the injected fuel volume distributions obtained by the present method and by injection rate measurements.

  17. Architecture of distributed picture archiving and communication systems for storing and processing high resolution medical images

    NASA Astrophysics Data System (ADS)

    Tokareva, Victoria

    2018-04-01

    New-generation medicine demands better quality of analysis, increasing the amount of data collected during checkups while simultaneously decreasing the invasiveness of procedures. It therefore becomes urgent not only to develop advanced modern hardware, but also to implement the software infrastructure needed to use it in everyday clinical practice: Picture Archiving and Communication Systems (PACS). Developing distributed PACS is a challenging task in modern medical informatics. The paper discusses the architecture of a distributed PACS server for processing large high-quality medical images, with respect to the technical specifications of modern medical imaging hardware as well as international standards for medical imaging software. The MapReduce paradigm is proposed for image reconstruction by the server, and the details of utilizing the Hadoop framework for this task are discussed, in order to make the design of the distributed PACS as ergonomic and adapted to the needs of end users as possible.

  18. Secure distribution for high resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Liu, Jin; Sun, Jing; Xu, Zheng Q.

    2010-09-01

    The use of remote sensing images collected by space platforms is becoming more and more widespread. The increasing value of space data and its use in critical scenarios call for proper security measures to protect these data against unauthorized access and fraudulent use. In this paper, based on the characteristics of remote sensing image data and the application requirements for secure distribution, a secure distribution method is proposed that comprises user and region classification, hierarchical control and key generation, and region-based multi-level encryption. Combining the three parts, the same multi-level-encrypted remote sensing images can be distributed to users of different permission levels through multicast, while each user obtains, after decryption with his own keys, only the degree of information his permission allows. This meets the user access control and security needs that arise in the distribution of high resolution remote sensing images. The experimental results prove the effectiveness of the proposed method, which is suitable for practical use in the secure transmission of remote sensing images containing confidential information over the internet.
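
    The region-based multi-level encryption idea can be illustrated with a toy XOR stream cipher (NOT cryptographically secure, and not the paper's actual scheme): each region is encrypted under its own key, so a user holding only the low-level key recovers only the low-sensitivity region:

```python
import numpy as np

def keystream(key, n):
    # Toy keyed stream from numpy's PRNG -- a stand-in for a real cipher.
    return np.random.default_rng(key).integers(0, 256, n, dtype=np.uint8)

def encrypt_regions(img, regions, keys):
    """XOR each region of the image with a stream derived from its own
    key. Decryption is the same operation (XOR is its own inverse)."""
    out = img.copy()
    for sl, key in zip(regions, keys):
        flat = out[sl].ravel()
        out[sl] = (flat ^ keystream(key, flat.size)).reshape(out[sl].shape)
    return out

decrypt_regions = encrypt_regions   # XOR is self-inverse

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (8, 8), dtype=np.uint8)
regions = [np.s_[:4, :], np.s_[4:, :]]   # low- / high-sensitivity regions
keys = [11, 22]
cipher = encrypt_regions(img, regions, keys)
full = decrypt_regions(cipher, regions, keys)        # all keys
partial = decrypt_regions(cipher, regions[:1], keys[:1])  # low key only
```

The same ciphertext is multicast to everyone; the key set handed to each user class determines how much of the image becomes readable.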

  19. Distributed processing method for arbitrary view generation in camera sensor network

    NASA Astrophysics Data System (ADS)

    Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki

    2003-05-01

    A camera sensor network is a network in which each sensor node can capture video signals, process them, and communicate with other nodes. The processing task in this network is to generate an arbitrary view, which can be requested by the central node or by a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up processing, we distribute the processing tasks among the nodes. In this method, each sensor node executes part of the interpolation algorithm to generate the interpolated image, with only local communication between nodes. The processing task is ray-space interpolation, an object-independent method based on MSE minimization using adaptive filtering. Two methods are proposed for distributing the processing tasks and sharing image data locally: Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP). Comparison of the proposed methods with the Centralized Processing (CP) method shows that FIS-DP has the highest processing speed, followed by PIS-DP, with CP the lowest. The communication rate of CP and PIS-DP is almost the same and better than that of FIS-DP. PIS-DP is therefore recommended for its better overall performance than CP and FIS-DP.

  20. Real time quantitative imaging for semiconductor crystal growth, control and characterization

    NASA Technical Reports Server (NTRS)

    Wargo, Michael J.

    1991-01-01

    A quantitative real time image processing system has been developed which can be software-reconfigured for semiconductor processing and characterization tasks. In thermal imager mode, 2D temperature distributions of semiconductor melt surfaces (900-1600 C) can be obtained with temperature and spatial resolutions better than 0.5 C and 0.5 mm, respectively, as demonstrated by analysis of melt surface thermal distributions. Temporal and spatial image processing techniques and multitasking computational capabilities convert such thermal imaging into a multimode sensor for crystal growth control. A second configuration of the image processing engine in conjunction with bright and dark field transmission optics is used to nonintrusively determine the microdistribution of free charge carriers and submicron sized crystalline defects in semiconductors. The IR absorption characteristics of wafers are determined with 10-micron spatial resolution and, after calibration, are converted into charge carrier density.

  1. Visualizing Chemistry with Infrared Imaging

    ERIC Educational Resources Information Center

    Xie, Charles

    2011-01-01

    Almost all chemical processes release or absorb heat. The heat flow in a chemical system reflects the process it is undergoing. By showing the temperature distribution dynamically, infrared (IR) imaging provides a salient visualization of the process. This paper presents a set of simple experiments based on IR imaging to demonstrate its enormous…

  2. Judging The Effectiveness Of Wool Combing By The Entropy Of The Images Of Wool Slivers

    NASA Astrophysics Data System (ADS)

    Rodrigues, F. Carvalho; Carvalho, Fernando D.; Peixoto, J. Pinto; Silva, M. Santos

    1989-04-01

    In general, it can be said that the textile industry endeavours to render a bunch of fibres chaotically distributed in space into an ordered spatial distribution. This is independent of the nature of the fibres: the aim of reaching higher-order states in the spatial distribution of the fibres dictates different industrial processes depending on whether the fibres are wool, cotton or man-made, but the whole effort is centred on obtaining, at every step of any of the processes, a more ordered state in the spatial distribution of the fibres. Thinking of textile processes as a method of getting order out of chaos, the concept of entropy appears to be the most appropriate parameter for judging the effectiveness of a step in the chain of an industrial process that produces a regular textile. In fact, entropy is the hidden parameter not only for the textile industry but also for the non-woven and paper industries. In these industries the state of order is linked to the spatial distribution of fibres, and obtaining an image of a spatial distribution is an easy matter. Computing the image entropy from the grey-level distribution requires only the Shannon formula. In this paper, to illustrate the usefulness of applying the image-entropy concept to textiles, the evolution of the entropy of wool slivers along the combing process is matched against the state of parallelization of the fibres along the seven steps, as measured by the existing method. The advantage of the entropy method over the previous method based on diffraction is also demonstrated.
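
    Computing the image entropy from the grey-level distribution with the Shannon formula takes only a few lines (a generic sketch, not the authors' code):

```python
import numpy as np

def image_entropy(img, levels=256):
    """Shannon entropy (bits) of the grey-level distribution: a wool
    sliver image becomes more 'ordered' (lower entropy) as combing
    parallelizes the fibres."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]                       # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

flat = np.full((8, 8), 7)              # perfectly ordered: one grey level
ramp = np.arange(256).reshape(16, 16)  # every grey level equally likely
```

A constant image gives entropy 0; an image whose grey levels are uniformly distributed over 256 values gives the maximum of 8 bits.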

  3. The Radon cumulative distribution transform and its application to image classification

    PubMed Central

    Kolouri, Soheil; Park, Se Rim; Rohde, Gustavo K.

    2016-01-01

    Invertible image representation methods (transforms) are routinely employed as low-level image processing operations based on which feature extraction and recognition algorithms are developed. Most transforms in current use (e.g. Fourier, Wavelet, etc.) are linear transforms, and, by themselves, are unable to substantially simplify the representation of image classes for classification. Here we describe a nonlinear, invertible, low-level image processing transform based on combining the well known Radon transform for image data, and the 1D Cumulative Distribution Transform proposed earlier. We describe a few of the properties of this new transform, and with both theoretical and experimental results show that it can often render certain problems linearly separable in transform space. PMID:26685245
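
    The 1D Cumulative Distribution Transform at the heart of the Radon-CDT maps a density I onto a reference I0 through their CDFs: it finds f with F(f(x)) = F0(x). A discrete sketch via inverse-CDF interpolation (the full Radon-CDT applies this to every angular projection of the Radon transform):

```python
import numpy as np

def cdt_1d(I, I0, x):
    """1D Cumulative Distribution Transform: return f sampled on the
    grid x such that F(f(x)) = F0(x), where F and F0 are the CDFs of
    the (non-negative) densities I and I0."""
    F = np.cumsum(I)
    F = F / F[-1]
    F0 = np.cumsum(I0)
    F0 = F0 / F0[-1]
    # invert F by interpolation: f = F^{-1}(F0(x))
    return np.interp(F0, F, x)

x = np.linspace(-5, 5, 2001)
I0 = np.exp(-x**2 / 2)             # reference density
I = np.exp(-(x - 1.0)**2 / 2)      # the same density shifted by 1
f = cdt_1d(I, I0, x)
```

For a pure translation the transport map is f(x) = x + 1, a linear function of x — a small instance of how the CDT can turn certain nonlinear variations into linearly separable ones.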

  4. Multimodal molecular 3D imaging for the tumoral volumetric distribution assessment of folate-based biosensors.

    PubMed

    Ramírez-Nava, Gerardo J; Santos-Cuevas, Clara L; Chairez, Isaac; Aranda-Lara, Liliana

    2017-12-01

    The aim of this study was to characterize the in vivo volumetric distribution of three folate-based biosensors by different imaging modalities (X-ray, fluorescence, Cerenkov luminescence, and radioisotopic imaging) through the development of a tridimensional image reconstruction algorithm. The preclinical multimodal Xtreme imaging system, with a Multimodal Animal Rotation System (MARS), was used to acquire bidimensional images, which were processed to obtain the tridimensional reconstruction. Images of mice at different times (biosensor distribution) were simultaneously obtained from the four imaging modalities. Filtered back projection and the inverse Radon transformation were used as the main image-processing techniques. The algorithm, developed in Matlab, was able to calculate the volumetric profiles of 99mTc-Folate-Bombesin (radioisotopic image), 177Lu-Folate-Bombesin (Cerenkov image), and FolateRSense™ 680 (fluorescence image) in tumors and kidneys of mice, and no significant differences were detected in the volumetric quantifications among measurement techniques. The tridimensional image reconstruction algorithm can easily be extrapolated to different 2D acquisition-type images. This flexibility of the algorithm is a remarkable advantage in comparison with similar reconstruction methods.

  5. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  6. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  7. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a satellite-mounted TDI-CCD comprises four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation and re-sampling due to the TDI-CCD electronics, and 4) data integration. Processes 1) to 3) use diverse data-intensive algorithms such as the FFT, convolution, and Lagrange interpolation, which demand substantial CPU power. Even with an Intel Xeon X5550 processor, a conventional serial method takes more than 30 hours for a simulation whose result image is 1500 × 1462 pixels. A literature study found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation based on WCF[1], which uses a client/server (C/S) architecture and harnesses the free CPU resources in the LAN: the server pushes the tasks of processes 1) to 3) to idle computing nodes, achieving HPC at low cost. In a computing experiment with four symmetric nodes and one server, the framework reduced simulation time by about 74%, and adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can scale its computing capacity as long as the network and the task-management server can support it, offering a new HPC solution for TDI-CCD imaging simulation and similar applications.

  8. A novel image processing workflow for the in vivo quantification of skin microvasculature using dynamic optical coherence tomography.

    PubMed

    Zugaj, D; Chenet, A; Petit, L; Vaglio, J; Pascual, T; Piketty, C; Bourdes, V

    2018-02-04

    Currently, imaging technologies that can accurately assess or provide surrogate markers of the human cutaneous microvessel network are limited. Dynamic optical coherence tomography (D-OCT) allows the detection of blood flow in vivo and visualization of the skin microvasculature. However, image processing is necessary to correct images, filter artifacts, and exclude irrelevant signals. The objective of this study was to develop a novel image processing workflow to enhance the technical capabilities of D-OCT. Single-center, vehicle-controlled study including healthy volunteers aged 18-50 years. A capsaicin solution was applied topically on the subject's forearm to induce local inflammation. Measurements of the capsaicin-induced increase in dermal blood flow, within the region of interest, were performed by laser Doppler imaging (LDI) (the reference method) and D-OCT. Sixteen subjects were enrolled. A good correlation was shown between D-OCT and LDI, using the image processing workflow. Therefore, D-OCT offers an easy-to-use alternative to LDI, with good repeatability, new robust morphological features (dermal-epidermal junction localization), and quantification of the distribution of vessel size and changes in this distribution induced by capsaicin. The visualization of the vessel network was improved through block filtering and artifact removal. Moreover, the assessment of vessel size distribution allows a fine analysis of the vascular patterns. The newly developed image processing workflow enhances the technical capabilities of D-OCT for the accurate detection and characterization of microcirculation in the skin. A direct clinical application of this image processing workflow is the quantification of the effect of topical treatment on skin vascularization. © 2018 The Authors. Skin Research and Technology Published by John Wiley & Sons Ltd.

  9. Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.

    2018-04-01

    The data center is a new concept in data processing and application proposed in recent years. It is a processing method based on data-centric, parallel computing technologies that is compatible with different hardware clusters. While optimizing the data storage management structure, it fully utilizes the computing nodes of cluster resources and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it invokes many computing nodes to process image storage blocks and pyramids in the background, improving the efficiency of image reading and application and meeting the need for concurrent, high-speed access to remotely sensed data by multiple users. The rationality, reliability, and superiority of the system design were verified by building an actual Hadoop service system, testing the storage efficiency for different image data and multiple users, and analyzing how the distributed storage architecture improves the application efficiency of remote sensing images.

  10. Computer Sciences and Data Systems, volume 1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.

  11. Vivaldi: A Domain-Specific Language for Volume Processing and Visualization on Distributed Heterogeneous Systems.

    PubMed

    Choi, Hyungsuk; Choi, Woohyuk; Quan, Tran Minh; Hildebrand, David G C; Pfister, Hanspeter; Jeong, Won-Ki

    2014-12-01

    As the size of image data from microscopes and telescopes increases, the need for high-throughput processing and visualization of large volumetric data has become more pressing. At the same time, many-core processors and GPU accelerators are commonplace, making high-performance distributed heterogeneous computing systems affordable. However, effectively utilizing GPU clusters is difficult for novice programmers, and even experienced programmers often fail to fully leverage the computing power of new parallel architectures due to their steep learning curve and programming complexity. In this paper, we propose Vivaldi, a new domain-specific language for volume processing and visualization on distributed heterogeneous computing systems. Vivaldi's Python-like grammar and parallel processing abstractions provide flexible programming tools for non-experts to easily write high-performance parallel computing code. Vivaldi provides commonly used functions and numerical operators for customized visualization and high-throughput image processing applications. We demonstrate the performance and usability of Vivaldi on several examples ranging from volume rendering to image segmentation.

  12. Calibration-free quantification of interior properties of porous media with x-ray computed tomography.

    PubMed

    Hussein, Esam M A; Agbogun, H M D; Al, Tom A

    2015-03-01

    A method is presented for interpreting the values of x-ray attenuation coefficients reconstructed in computed tomography of porous media, while overcoming the ambiguity caused by the multichromatic nature of x-rays, dilution by void, and material heterogeneity. The method enables determination of porosity without relying on calibration or image segmentation or thresholding to discriminate pores from solid material. It distinguishes between solution-accessible and inaccessible pores, and provides the spatial and frequency distributions of solid-matrix material in a heterogeneous medium. This is accomplished by matching an image of a sample saturated with a contrast solution with that saturated with a transparent solution. Voxels occupied with solid-material and inaccessible pores are identified by the fact that they maintain the same location and image attributes in both images, with voxels containing inaccessible pores appearing empty in both images. Fully porous and accessible voxels exhibit the maximum contrast, while the rest are porous voxels containing mixtures of pore solutions and solid. This matching process is performed with an image registration computer code, and image processing software that requires only simple subtraction and multiplication (scaling) processes. The process is demonstrated in dolomite (non-uniform void distribution, homogeneous solid matrix) and sandstone (nearly uniform void distribution, heterogeneous solid matrix) samples, and its overall performance is shown to compare favorably with a method based on calibration and thresholding. Copyright © 2014 Elsevier Ltd. All rights reserved.
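    The "simple subtraction and multiplication (scaling)" step the authors describe can be illustrated with a toy voxel model. The attenuation values below are invented for illustration: a voxel's accessible porosity follows from the difference between the contrast-saturated and transparent-saturated images, scaled by the response of a fully porous, fully accessible voxel, while voxels containing only inaccessible pores show no difference at all.

    ```python
    import numpy as np

    # invented linear-attenuation values for a perfectly registered image pair
    mu_solid, mu_transparent, mu_contrast = 0.30, 0.02, 0.80

    # per-voxel accessible porosity we will try to recover
    phi_true = np.array([0.00, 0.25, 0.50, 1.00])

    # each voxel is a mixture of solid matrix and pore solution
    img_transparent = (1 - phi_true) * mu_solid + phi_true * mu_transparent
    img_contrast = (1 - phi_true) * mu_solid + phi_true * mu_contrast

    # an inaccessible-pore voxel: the pore never fills, so both images look identical
    img_transparent = np.append(img_transparent, 0.5 * mu_solid)
    img_contrast = np.append(img_contrast, 0.5 * mu_solid)

    # simple subtraction, then scaling by a fully porous accessible voxel's response
    diff = img_contrast - img_transparent
    phi_est = diff / (mu_contrast - mu_transparent)
    ```

    No calibration phantom or threshold enters the calculation; only the matched image pair and the known contrast of the two saturating solutions.
    
    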

  13. Process for thermal imaging scanning of a swaged heater for an anode subassembly of a hollow cathode assembly

    NASA Technical Reports Server (NTRS)

    Patterson, Michael J. (Inventor); Verhey, Timothy R. R. (Inventor); Soulas, George C. (Inventor)

    2004-01-01

    A process for thermal imaging scanning of a swaged heater of an anode subassembly of a hollow cathode assembly, comprising scanning a swaged heater with a thermal imaging radiometer to measure a temperature distribution of the heater; raising the current in a power supply to increase the temperature of the swaged heater; and measuring the swaged heater temperature using the radiometer, whereupon the temperature distribution along the length of the heater shall be less than plus or minus 5 degrees C.

  14. JIP: Java image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Wang, Dongyan; Lin, Bo; Zhang, Jun

    1998-12-01

    In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation, and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources on a central server. As an extended Java applet, JIP allows users to select local image files on their computers or to specify any image on the Internet by its URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can easily be applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, and physics, or to other areas such as employee training and pay-per-use software delivery.

  15. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of current earth-observing technology, the storage, management, and publication of RS image data have become a bottleneck to its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing load of the great volume of RS data stored at different nodes in a distributed environment; a heavy burden is placed on the background server. Second, there is no unified, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data, aiming at an RS data information system based on a parallel background server and a distributed data management system. Toward these two goals, the paper studies the following key techniques and draws some instructive conclusions. It puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For data storage, RS data is not divided into binary large objects stored in a conventional relational database; instead, it is reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common Web process and the parallel process.
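    The "Pyramid, Block, Layer, Epoch" solid index can be pictured as a composite tile key. The encoding below is a hypothetical sketch (the field widths, separators, and epoch format are my own, not the paper's), showing how lexicographic ordering keeps tiles of the same pyramid level and spatial block adjacent regardless of band or acquisition period.

    ```python
    def solid_index_key(pyramid, block_row, block_col, layer, epoch):
        """Hypothetical composite key: pyramid level, spatial block, band layer, epoch."""
        return f"P{pyramid:02d}/B{block_row:04d}_{block_col:04d}/L{layer:02d}/E{epoch}"

    # tiles of one block across bands and epochs sort adjacently...
    keys = [
        solid_index_key(3, 12, 7, layer, epoch)
        for layer in (1, 2) for epoch in ("2005Q1", "2005Q3")
    ]
    # ...while a tile from a neighbouring block sorts after all of them
    other = solid_index_key(3, 12, 8, 1, "2005Q1")
    ordered = sorted(keys + [other])
    ```

    Zero-padded fixed-width fields are what make plain string ordering match the intended spatial grouping; variable-width fields would interleave unrelated tiles.
    
    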

  16. Study on parallel and distributed management of RS data based on spatial data base

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Liu, Shijin

    2006-12-01

    With the rapid development of current earth-observing technology, the storage, management, and publication of RS image data have become a bottleneck to its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing load of the great volume of RS data stored at different nodes in a distributed environment; a heavy burden is placed on the background server. Second, there is no unified, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data, aiming at an RS data information system based on a parallel background server and a distributed data management system. Toward these two goals, the paper studies the following key techniques and draws some instructive conclusions. It puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For data storage, RS data is not divided into binary large objects stored in a conventional relational database; instead, it is reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. For the system architecture, the paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common Web process and the parallel process.

  17. Towards Portable Large-Scale Image Processing with High-Performance Computing.

    PubMed

    Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A

    2018-05-03

    High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure has housed and processed nearly half a million medical image volumes with the Vanderbilt Advanced Computing Center for Research and Education (ACCRE), the HPC facility at Vanderbilt University. The initial deployment was native (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which led to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein we describe recent innovations using containerization techniques with XNAT/DAX that isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability, from the system level to the application level; (2) flexible and dynamic software development and expansion; and (3) scalable spider deployment compatible with HPC clusters and local workstations.

  18. Distributed data collection for a database of radiological image interpretations

    NASA Astrophysics Data System (ADS)

    Long, L. Rodney; Ostchega, Yechiam; Goh, Gin-Hua; Thoma, George R.

    1997-01-01

    The National Library of Medicine, in collaboration with the National Center for Health Statistics and the National Institute for Arthritis and Musculoskeletal and Skin Diseases, has built a system for collecting radiological interpretations for a large set of x-ray images acquired as part of the data gathered in the second National Health and Nutrition Examination Survey. This system is capable of delivering across the Internet 5- and 10-megabyte x-ray images to Sun workstations equipped with X Window based 2048 X 2560 image displays, for the purpose of having these images interpreted for the degree of presence of particular osteoarthritic conditions in the cervical and lumbar spines. The collected interpretations can then be stored in a database at the National Library of Medicine, under control of the Illustra DBMS. This system is a client/server database application which integrates (1) distributed server processing of client requests, (2) a customized image transmission method for faster Internet data delivery, (3) distributed client workstations with high resolution displays, image processing functions and an on-line digital atlas, and (4) relational database management of the collected data.

  19. Full field study of strain distribution near the crack tip in the fracture of solid propellants via large strain digital image correlation and optical microscopy

    NASA Astrophysics Data System (ADS)

    Gonzalez, Javier

    A full field method for visualizing deformation around the crack tip in a fracture process with large strains is developed. A digital image correlation (DIC) program is used to incrementally compute strains and displacements between two consecutive images of a deformation process. Values of strain and displacement for consecutive deformations are added, thereby solving convergence problems in the DIC algorithm when large deformations are investigated. The method developed is used to investigate the strain distribution within 1 mm of the crack tip in a particulate composite solid (propellant) using microscopic visualization of the deformation process.

  20. Nonuniformity correction of imaging systems with a spatially nonhomogeneous radiation source.

    PubMed

    Gutschwager, Berndt; Hollandt, Jörg

    2015-12-20

    We present a novel method of nonuniformity correction of imaging systems in a wide optical spectral range by applying a radiation source with an unknown and spatially nonhomogeneous radiance or radiance temperature distribution. The benefit of this method is that it can be applied with radiation sources of arbitrary spatial radiance or radiance temperature distribution; it only requires sufficient temporal stability of this distribution during the measurement process. The method is based on the recording of several (at least three) images of a radiation source and a purposeful row- and line-shift of these subsequent images in relation to the first, primary image. The mathematical procedure is explained in detail. Its numerical verification with a source of a predefined nonhomogeneous radiance distribution and a thermal imager of a predefined nonuniform focal-plane-array responsivity is presented.
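    A one-dimensional toy version of the shift principle may make the idea concrete (the paper works in 2-D with at least three row- and line-shifted images, and its mathematical procedure is more involved than this sketch): because a pixel-shifted recording makes neighbouring detector elements view the same scene element, the ratio of the two recordings isolates the pixel-to-pixel gain ratios, and the responsivity profile follows up to a common factor, with no knowledge of the scene required.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 16

    gain = 1.0 + 0.1 * rng.standard_normal(n)   # unknown per-pixel responsivity
    scene = 100.0 + 20.0 * rng.random(n + 1)    # unknown but temporally stable radiance

    m0 = gain * scene[:n]        # primary recording
    m1 = gain * scene[1:n + 1]   # recording with the source shifted by one pixel

    # m1[i] / m0[i+1] = gain[i] / gain[i+1]: the unknown scene cancels out
    ratio = m1[:-1] / m0[1:]
    gain_est = np.concatenate(([1.0], 1.0 / np.cumprod(ratio)))  # gain[i] / gain[0]
    gain_est /= gain_est.mean()  # remove the arbitrary common factor
    ```

    The recovered profile matches the true responsivity up to normalization, which is all a nonuniformity correction needs.
    
    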

  1. BOREAS TE-18, 60-m, Radiometrically Rectified Landsat TM Imagery

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David

    2000-01-01

    The BOREAS TE-18 team used a radiometric rectification process to produce standardized DN values for a series of Landsat TM images of the BOREAS SSA and NSA in order to compare images that were collected under different atmospheric conditions. The images for each study area were referenced to an image that had very clear atmospheric qualities. The reference image for the SSA was collected on 02-Sep-1994, while the reference image for the NSA was collected on 21-Jun-1995. The 23 rectified images cover the period of 07-Jul-1985 to 18-Sep-1994 in the SSA and 22-Jun-1984 to 09-Jun-1994 in the NSA. Each of the reference scenes had coincident atmospheric optical thickness measurements made by RSS-11. The radiometric rectification process is described in more detail by Hall et al. (1991). The original Landsat TM data were received from CCRS for use in the BOREAS project. Due to the nature of the radiometric rectification process and copyright issues, the full-resolution (30-m) images may not be publicly distributed. However, this spatially degraded 60-m resolution version of the images may be openly distributed and is available on the BOREAS CD-ROM series. After the radiometric rectification processing, the original data were degraded to a 60-m pixel size from the original 30-m pixel size by averaging the data over a 2- by 2-pixel window. The data are stored in binary image-format files. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Activity Archive Center (DAAC).
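    The 30-m to 60-m degradation step described above (averaging over a 2- by 2-pixel window) is a plain non-overlapping block mean; a minimal NumPy sketch with a made-up 4×4 array:

    ```python
    import numpy as np

    img30 = np.arange(16, dtype=float).reshape(4, 4)  # stand-in for a 30-m band

    # average non-overlapping 2x2 windows to produce the 60-m product
    h, w = img30.shape
    img60 = img30.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    ```

    The reshape groups each axis into (block index, position within block), so the mean over the two inner axes is exactly the 2×2 window average.
    
    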

  2. Cloud Engineering Principles and Technology Enablers for Medical Image Processing-as-a-Service.

    PubMed

    Bao, Shunxing; Plassard, Andrew J; Landman, Bennett A; Gokhale, Aniruddha

    2017-04-01

    Traditional in-house, laboratory-based medical imaging studies use hierarchical data structures (e.g., NFS file stores) or databases (e.g., COINS, XNAT) for storage and retrieval. The resulting performance from these approaches is, however, impeded by standard network switches since they can saturate network bandwidth during transfer from storage to processing nodes for even moderate-sized studies. To that end, a cloud-based "medical image processing-as-a-service" offers promise in utilizing the ecosystem of Apache Hadoop, which is a flexible framework providing distributed, scalable, fault tolerant storage and parallel computational modules, and HBase, which is a NoSQL database built atop Hadoop's distributed file system. Despite this promise, HBase's load distribution strategy of region split and merge is detrimental to the hierarchical organization of imaging data (e.g., project, subject, session, scan, slice). This paper makes two contributions to address these concerns by describing key cloud engineering principles and technology enhancements we made to the Apache Hadoop ecosystem for medical imaging applications. First, we propose a row-key design for HBase, which is a necessary step that is driven by the hierarchical organization of imaging data. Second, we propose a novel data allocation policy within HBase to strongly enforce collocation of hierarchically related imaging data. The proposed enhancements accelerate data processing by minimizing network usage and localizing processing to machines where the data already exist. Moreover, our approach is amenable to the traditional scan, subject, and project-level analysis procedures, and is compatible with standard command line/scriptable image processing software. 
Experimental results for an illustrative sample of imaging data reveal that our new HBase policy results in a three-fold time improvement in conversion from the classic DICOM format to NIfTI when compared with the default HBase region split policy, and nearly a six-fold improvement over a commonly available network file system (NFS) approach, even for relatively small file sets. Moreover, file access latency is lower than with network-attached storage.
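    The row-key idea can be pictured in a few lines: HBase sorts rows lexicographically by key, so prefixing keys with the hierarchy (project, subject, session, scan) keeps hierarchically related images in one contiguous key range, and hence candidates for collocation in one region. The exact key layout below is my own illustration, not the paper's design; fields are assumed fixed-width so that string order matches the intended grouping.

    ```python
    def row_key(project, subject, session, scan, slice_no):
        """Hypothetical hierarchical row key: related images sort (and store) together."""
        return f"{project}|{subject}|{session}|{scan}|{slice_no:05d}"

    # all slices of one scan form one contiguous key range...
    scan_keys = [row_key("ProjA", "S12", "Sess1", "T1w", i) for i in range(3)]
    # ...which a different subject's scan cannot interleave
    outside = row_key("ProjA", "S13", "Sess1", "T1w", 0)
    ordered = sorted(scan_keys + [outside])
    ```

    With such keys, a scan-, subject-, or project-level analysis becomes a prefix scan over one key range instead of scattered random reads.
    
    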

  3. Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method

    PubMed Central

    Lu, Zhaolin

    2017-01-01

    Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using a mechanical sieving approach. Considering the great variation of particle size from micrometers to millimeters, the powders greater than 250 μm were photographed by a flatbed scanner without zoom function, and the others were photographed using scanning electron microscopy (SEM) with high image resolution. Actual imaging tests confirmed the excellent effect of the backscattered electron (BSE) imaging mode of SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method. In sample preparation, singulated arrangement and ultrasonic dispersion were used to separate into individual particles the powders larger and smaller, respectively, than the nominal size of 250 μm. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method is suitable for analysing the particle size and shape distributions of ground biomass materials and resolves the size inconsistencies of sieving analysis. PMID:28298925

  4. Method of simulation and visualization of FDG metabolism based on VHP image

    NASA Astrophysics Data System (ADS)

    Cui, Yunfeng; Bai, Jing

    2005-04-01

    FDG ([18F] 2-fluoro-2-deoxy-D-glucose) is the typical tracer used in clinical PET (positron emission tomography) studies. FDG-PET is an important imaging tool for the early diagnosis and treatment of malignant tumors and functional disease. The main purpose of this work is to propose a method that represents FDG metabolism in the human body through the dynamic simulation and visualization of the 18F distribution process, based on the segmented VHP (Visible Human Project) image dataset. First, the plasma time-activity curve (PTAC) and the tissue time-activity curves (TTACs) are obtained from previous studies and the literature. According to the obtained PTAC and TTACs, a set of corresponding values is assigned to the segmented VHP image; thus, a set of dynamic images is derived that shows the 18F distribution in the tissues of interest for the predetermined sampling schedule. Finally, the simulated FDG distribution images are visualized in 3D and 2D formats, respectively, with the principal interaction functions incorporated. Compared with an original PET image, our visualization result presents higher resolution, owing to the high resolution of the VHP image data, and shows the 18F distribution process dynamically. The results of this work can be used in education and related research, as well as a tool for PET operators to design their PET experiment programs.
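    The frame-construction step the authors describe (assigning each segmented tissue its TTAC value at every sampling time) amounts to a per-voxel table lookup. A toy sketch with an invented 2×2 label slice and made-up activity curves, not the VHP data or the paper's actual TACs:

    ```python
    import numpy as np

    # tiny stand-in for a segmented VHP slice: 0 = background, 1 = tumour, 2 = kidney
    labels = np.array([[0, 1],
                       [2, 1]])

    # made-up time-activity curves: one row per tissue label, one column per frame
    ttac = np.array([[0.0, 0.0, 0.0],   # background
                     [1.0, 3.0, 2.0],   # tumour
                     [2.0, 1.0, 0.5]])  # kidney

    # per-voxel lookup, then move time to the leading axis: frames[t] is one image
    frames = np.moveaxis(ttac[labels], -1, 0)
    ```

    Indexing the curve table with the label volume vectorizes the whole assignment, so a full dynamic series falls out of one expression regardless of how many tissues or frames there are.
    
    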

  5. A data distributed parallel algorithm for ray-traced volume rendering

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu; Painter, James S.; Hansen, Charles D.; Krogh, Michael F.

    1993-01-01

    This paper presents a divide-and-conquer ray-traced volume rendering algorithm and a parallel image compositing method, along with their implementation and performance on the Connection Machine CM-5, and networked workstations. This algorithm distributes both the data and the computations to individual processing units to achieve fast, high-quality rendering of high-resolution data. The volume data, once distributed, is left intact. The processing nodes perform local ray tracing of their subvolume concurrently. No communication between processing units is needed during this locally ray-tracing process. A subimage is generated by each processing unit and the final image is obtained by compositing subimages in the proper order, which can be determined a priori. Test results on both the CM-5 and a group of networked workstations demonstrate the practicality of our rendering algorithm and compositing method.
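    The compositing stage can be sketched with the standard "over" operator on premultiplied RGBA subimages, applied in the depth order that, as the abstract notes, is determined a priori. This is a generic illustration of the operator, not the paper's CM-5 implementation; the three single-pixel subimages are invented.

    ```python
    import numpy as np
    from functools import reduce

    def over(front, back):
        """Porter-Duff 'over' for premultiplied RGBA arrays of shape (..., 4)."""
        a = front[..., 3:4]
        return front + (1.0 - a) * back

    # three 1x1-pixel subimages, one per processing node, ordered front to back
    sub_a = np.array([[0.4, 0.0, 0.0, 0.5]])  # half-transparent red (premultiplied)
    sub_b = np.array([[0.0, 0.6, 0.0, 1.0]])  # opaque green
    sub_c = np.array([[0.0, 0.0, 0.9, 1.0]])  # opaque blue, hidden behind sub_b

    # fold the ordered subimages into the final image
    final = reduce(over, [sub_a, sub_b, sub_c])
    ```

    Because "over" is associative (though not commutative), the subimages can be combined pairwise in any grouping, which is what makes tree-structured parallel compositing across nodes possible.
    
    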

  6. The Use of Multidimensional Image-Based Analysis to Accurately Monitor Cell Growth in 3D Bioreactor Culture

    PubMed Central

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to be >90%. Using the cell distribution mapping process and principal component analysis, we show how cell growth can be quantitatively monitored over a 13-day bioreactor culture period and how changes to manufacturing processes, such as initial cell seeding density, can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated into cell quality control processes, facilitating the transition towards bioreactor-based manufacture of clinical-grade cells. PMID:22028809

  7. The use of multidimensional image-based analysis to accurately monitor cell growth in 3D bioreactor culture.

    PubMed

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor-based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information, making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to be >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13-day bioreactor culture period and how changes to manufacturing processes, such as initial cell seeding density, can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes, facilitating the transition towards bioreactor-based manufacture of clinical-grade cells.

  8. Quantitative image analysis for evaluating the coating thickness and pore distribution in coated small particles.

    PubMed

    Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K

    2009-04-01

    This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB Image Processing Toolbox to images of coated particles taken with confocal laser scanning microscopy (CLSM). The coating thicknesses have been determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of the porosity, thickness and pore size distribution of a coating. These parameters are considered the important coating properties critical to coating functionality. Additionally, the effect of coating process variations on coating quality can straightforwardly be assessed. Enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which ultimately leads to process tailoring.
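The statistical analysis step above reduces per-perimeter thickness measurements to summary properties such as the minimum thickness and the span of the distribution. A minimal sketch, assuming one common span definition, (p90 − p10)/median; the paper's exact definition may differ:

```python
import statistics

def thickness_stats(samples):
    """Summarize coating thickness values measured along a particle
    perimeter. Span is taken here as (p90 - p10) / median, an assumed
    definition for illustration only."""
    s = sorted(samples)
    n = len(s)
    def pct(p):  # simple nearest-rank percentile
        return s[min(n - 1, int(p * n))]
    med = statistics.median(s)
    return {
        "min": s[0],
        "median": med,
        "span": (pct(0.90) - pct(0.10)) / med,
    }
```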

  9. Photofragment image analysis using the Onion-Peeling Algorithm

    NASA Astrophysics Data System (ADS)

    Manzhos, Sergei; Loock, Hans-Peter

    2003-07-01

    With the growing popularity of the velocity map imaging technique, a need for the analysis of photoion and photoelectron images arose. Here, a computer program is presented that allows for the analysis of cylindrically symmetric images. It permits the inversion of the projection of the 3D charged particle distribution using the onion-peeling algorithm. Further analysis includes the determination of radial and angular distributions, from which velocity distributions and spatial anisotropy parameters are obtained. Identification and quantification of the different photolysis channels is therefore straightforward. In addition, the program features geometry correction, centering, and multi-Gaussian fitting routines, as well as a user-friendly graphical interface and the possibility of generating synthetic images using either the fitted or user-defined parameters. Program summary: Title of program: Glass Onion. Catalogue identifier: ADRY. Program Summary URL: http://cpc.cs.qub.ac.uk/summaries/ADRY. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: none. Computer: IBM PC. Operating system under which the program has been tested: Windows 98, Windows 2000, Windows NT. Programming language used: Delphi 4.0. Memory required to execute with typical data: 18 Mwords. No. of bits in a word: 32. No. of bytes in distributed program, including test data, etc.: 9 911 434. Distribution format: zip file. Keywords: photofragment image, onion peeling, anisotropy parameters. Nature of physical problem: Information about the velocity and angular distributions of photofragments is the basis on which the analysis of the photolysis process resides. Reconstructing the three-dimensional distribution from the photofragment image is the first step, with further processing involving angular and radial integration of the inverted image to obtain velocity and angular distributions. 
Provisions have to be made to correct for slight distortions of the image, and to verify the accuracy of the analysis process. Method of solution: The "onion peeling" algorithm described by Helm [Rev. Sci. Instrum. 67 (6) (1996)] is used to perform the image reconstruction. Angular integration with a subsequent multi-Gaussian fit supplies information about the velocity distribution of the photofragments, whereas radial integration with subsequent expansion of the angular distributions over Legendre polynomials gives the spatial anisotropy parameters. Fitting algorithms have been developed to centre the image and to correct for image distortion. Restrictions on the complexity of the problem: The maximum image size (1280×1280) and resolution (16 bit) are restricted by available memory and can be changed in the source code. Initial centre coordinates accurate to within 5 pixels may be required for the correction and centering algorithms to converge. Peaks on the velocity profile separated by less than the peak width may not be deconvolved. In the charged particle image reconstruction, it is assumed that the kinetic energy released in the dissociation process is small compared to the energy acquired in the electric field. For the fitting parameters to be physically meaningful, cylindrical symmetry of the image has to be assumed, but the actual inversion algorithm is stable to distortions of such symmetry in experimental images. Typical running time: The analysis procedure can be divided into three parts: inversion, fitting, and geometry correction. The inversion time grows approximately as R^3, where R is the radius of the region of interest: for R=200 pixels it is less than a minute, for R=400 pixels less than 6 min on a 400 MHz IBM personal computer. The time for the velocity fitting procedure to converge depends strongly on the number of peaks in the velocity profile and the convergence criterion. 
It ranges between less than a second for simple curves and a few minutes for profiles with up to twenty peaks. The time taken for the image correction scales as R^2 and depends on the curve profile. It is on the order of a few minutes for images with R=500 pixels. Unusual features of the program: Our centering and image correction algorithm is based on Fourier analysis of the radial distribution to ensure the sharpest velocity profile and is insensitive to an uneven intensity distribution. An angular averaging option exists to stabilize the inversion algorithm without losing resolution.
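The Legendre expansion mentioned above fits each angular distribution as I(θ) ∝ 1 + βP2(cos θ), where β is the spatial anisotropy parameter. A minimal sketch of recovering β from sampled angular data by linear least squares (illustrative only; this is not the Glass Onion code):

```python
import math

def p2(c):
    """Second Legendre polynomial evaluated at c = cos(theta)."""
    return 0.5 * (3.0 * c * c - 1.0)

def fit_beta(thetas, intensities):
    """Least-squares fit of I(theta) = a * (1 + beta * P2(cos theta)).
    Solves the linear model I = a + b * P2 and returns beta = b / a."""
    xs = [p2(math.cos(t)) for t in thetas]
    n = len(xs)
    sx = sum(xs)
    sy = sum(intensities)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, intensities))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return b / a
```

On noiseless data the fit recovers β exactly; with experimental noise the same normal equations give the least-squares estimate.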

  10. Medical Imaging System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The MD Image System, a true-color image processing system that serves as a diagnostic aid and tool for storage and distribution of images, was developed by Medical Image Management Systems, Huntsville, AL, as a "spinoff from a spinoff." The original spinoff, Geostar 8800, developed by Crystal Image Technologies, Huntsville, incorporates advanced UNIX versions of ELAS (developed by NASA's Earth Resources Laboratory for analysis of Landsat images) for general purpose image processing. The MD Image System is an application of this technology to a medical system that aids in the diagnosis of cancer, and can accept, store and analyze images from other sources such as Magnetic Resonance Imaging.

  11. Imaging Metals in Brain Tissue by Laser Ablation - Inductively Coupled Plasma - Mass Spectrometry (LA-ICP-MS)

    PubMed Central

    Hare, Dominic J.; Kysenius, Kai; Paul, Bence; Knauer, Beate; Hutchinson, Robert W.; O'Connor, Ciaran; Fryer, Fred; Hennessey, Tom P.; Bush, Ashley I.; Crouch, Peter J.; Doble, Philip A.

    2017-01-01

    Metals are found ubiquitously throughout an organism, with their biological role dictated by both their chemical reactivity and abundance within a specific anatomical region. Within the brain, metals have a highly compartmentalized distribution, depending on the primary function they play within the central nervous system. Imaging the spatial distribution of metals has provided unique insight into the biochemical architecture of the brain, allowing direct correlation between neuroanatomical regions and their known function with regard to metal-dependent processes. In addition, several age-related neurological disorders feature disrupted metal homeostasis, which is often confined to small regions of the brain that are otherwise difficult to analyze. Here, we describe a comprehensive method for quantitatively imaging metals in the mouse brain, using laser ablation - inductively coupled plasma - mass spectrometry (LA-ICP-MS) and specially designed image processing software. Focusing on iron, copper and zinc, which are three of the most abundant and disease-relevant metals within the brain, we describe the essential steps in sample preparation, analysis, quantitative measurements and image processing to produce maps of metal distribution within the low micrometer resolution range. This technique, applicable to any cut tissue section, is capable of demonstrating the highly variable distribution of metals within an organ or system, and can be used to identify changes in metal homeostasis and absolute levels within fine anatomical structures. PMID:28190025

  12. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    PubMed

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute of Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high performance computing center. All software is made available in open source for use in combining portable batch scripting (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Content Based Image Retrieval based on Wavelet Transform coefficients distribution

    PubMed Central

    Lamard, Mathieu; Cazuguel, Guy; Quellec, Gwénolé; Bekri, Lynda; Roux, Christian; Cochener, Béatrice

    2007-01-01

    In this paper we propose a content-based image retrieval method for diagnosis aid in medical fields. We characterize images without extracting significant features, by building signatures from the distribution of wavelet transform coefficients. Retrieval is carried out by computing signature distances between the query and database images. Several signatures are proposed; they use a model of the wavelet coefficient distribution. To enhance results, a weighted distance between signatures is used and an adapted wavelet base is proposed. Retrieval efficiency is given for different databases including a diabetic retinopathy, a mammography and a face database. Results are promising: the retrieval efficiency is higher than 95% in some cases using an optimization process. PMID:18003013
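The signature idea above can be sketched concretely: take a one-level 2-D Haar transform, summarize the detail coefficients as a normalized histogram, and compare images by a weighted distance between histograms. Everything below (bin count, range, L1 distance) is an illustrative assumption; the paper uses model-based signatures rather than raw histograms.

```python
def haar2d(img):
    """One-level 2-D Haar transform of an even-sized grayscale image
    (list of rows). Returns (approximation, flat list of detail coeffs)."""
    h, w = len(img), len(img[0])
    approx, details = [], []
    for i in range(0, h, 2):
        row_a = []
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            row_a.append((a + b + c + d) / 4.0)
            details.extend([(a - b + c - d) / 4.0,   # horizontal detail
                            (a + b - c - d) / 4.0,   # vertical detail
                            (a - b - c + d) / 4.0])  # diagonal detail
        approx.append(row_a)
    return approx, details

def signature(details, bins=8, lo=-1.0, hi=1.0):
    """Normalized histogram of detail coefficients used as a signature."""
    hist = [0] * bins
    for v in details:
        k = int((min(max(v, lo), hi) - lo) / (hi - lo) * bins)
        hist[min(k, bins - 1)] += 1
    total = float(len(details))
    return [h / total for h in hist]

def distance(sig1, sig2, weights=None):
    """Weighted L1 distance between two signatures."""
    if weights is None:
        weights = [1.0] * len(sig1)
    return sum(w * abs(p - q) for w, p, q in zip(weights, sig1, sig2))
```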

  14. Image enhancement using MCNP5 code and MATLAB in neutron radiography.

    PubMed

    Tharwat, Montaser; Mohamed, Nader; Mongy, T

    2014-07-01

    This work presents a method that can be used to enhance the neutron radiography (NR) image for objects containing highly scattering materials like hydrogen, carbon and other light materials. The method uses the Monte Carlo code MCNP5 to simulate the NR process, obtain the flux distribution for each pixel of the image and determine the scattered-neutron distribution that causes image blur, and then uses MATLAB to subtract this scattered-neutron distribution from the initial image to improve its quality. This work was performed before the commissioning of the digital NR system in January 2013. The MATLAB enhancement method is quite a good technique in the case of static film-based neutron radiography, while in the neutron imaging (NI) technique, image enhancement and quantitative measurement were efficient using ImageJ software. The enhanced image quality and quantitative measurements are presented in this work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Noise parameter estimation for poisson corrupted images using variance stabilization transforms.

    PubMed

    Jin, Xiaodan; Xu, Zhenyu; Hirakawa, Keigo

    2014-03-01

    Noise is present in all images captured by real-world image sensors. The Poisson distribution models the stochastic nature of the photon arrival process and agrees with the distribution of measured pixel values. We propose a method for estimating unknown noise parameters from Poisson-corrupted images using properties of variance stabilization. With a significantly lower computational complexity and improved stability, the proposed estimation technique yields noise parameters that are comparable in accuracy to state-of-the-art methods.
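A standard variance-stabilizing transform for Poisson data is the Anscombe transform, 2·sqrt(x + 3/8): it maps counts whose variance equals their mean to values with roughly unit variance. The sketch below only illustrates that stabilization property; the paper's actual estimator is more elaborate.

```python
import math
import random
import statistics

def anscombe(x):
    """Anscombe variance-stabilizing transform for Poisson counts."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler (fine for moderate lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
raw = [poisson_sample(20.0, rng) for _ in range(20000)]
stabilized = [anscombe(x) for x in raw]
# Raw variance is ~20 (equal to the mean); stabilized variance is ~1,
# which is what makes Gaussian-style noise estimators applicable.
```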

  16. Infrared thermography quantitative image processing

    NASA Astrophysics Data System (ADS)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in subjects who are healthy with regard to spinal problems.
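The index comparison described above reduces each ROI to first-order statistics and compares symmetric ROIs. A minimal sketch, with illustrative index choices (mean, standard deviation, range); the study's actual indices may differ:

```python
import statistics

def roi_indices(pixels):
    """First-order statistical indices of a ROI temperature sample."""
    return {"mean": statistics.fmean(pixels),
            "std": statistics.pstdev(pixels),
            "range": max(pixels) - min(pixels)}

def asymmetry(left_roi, right_roi):
    """Per-index absolute difference between two symmetric ROIs;
    near-zero values suggest a symmetric temperature pattern."""
    li, ri = roi_indices(left_roi), roi_indices(right_roi)
    return {k: abs(li[k] - ri[k]) for k in li}
```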

  17. A Scalable Distributed Approach to Mobile Robot Vision

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin; Browning, Robert L.; Gribble, William S.

    1997-01-01

    This paper documents our progress during the first year of work on our original proposal entitled 'A Scalable Distributed Approach to Mobile Robot Vision'. We are pursuing a strategy for real-time visual identification and tracking of complex objects which does not rely on specialized image-processing hardware. In this system perceptual schemas represent objects as a graph of primitive features. Distributed software agents identify and track these features, using variable-geometry image subwindows of limited size. Active control of imaging parameters and selective processing makes simultaneous real-time tracking of many primitive features tractable. Perceptual schemas operate independently from the tracking of primitive features, so that real-time tracking of a set of image features is not hurt by latency in recognition of the object that those features make up. The architecture allows semantically significant features to be tracked with limited expenditure of computational resources, and allows the visual computation to be distributed across a network of processors. Early experiments are described which demonstrate the usefulness of this formulation, followed by a brief overview of our more recent progress (after the first year).

  18. Imaging of the Li spatial distribution within V2O5 cathode in a coin cell by neutron computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yuxuan; Chandran, K. S. Ravi; Bilheux, Hassina Z.

    An understanding of Lithium (Li) spatial distribution within the electrodes of a Li-ion cell, during charge and discharge cycles, is essential to optimize the electrode parameters for increased performance under cycling. In this work, it is demonstrated that the spatial distribution of Li within Vanadium Pentoxide (V2O5) electrodes of a small coin cell can be imaged by neutron computed tomography. The neutron attenuation data has been used to construct the three-dimensional Li spatial images. Specifically, it is shown that there is sufficient neutron imaging contrast between lithiated and delithiated regions of the V2O5 electrode, making it possible to map Li distributions even in small electrodes with thicknesses <1 mm. The images reveal that the Li spatial distribution is inhomogeneous and a relatively higher C-rate leads to more non-uniform Li distribution after Li insertion. The non-uniform distribution suggests the limitation of Li diffusion within the electrode during the lithiation process under the relatively high cycling rates.

  19. Imaging of the Li spatial distribution within V2O5 cathode in a coin cell by neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Zhang, Yuxuan; Chandran, K. S. Ravi; Bilheux, Hassina Z.

    2018-02-01

    An understanding of Lithium (Li) spatial distribution within the electrodes of a Li-ion cell, during charge and discharge cycles, is essential to optimize the electrode parameters for increased performance under cycling. In this work, it is demonstrated that the spatial distribution of Li within Vanadium Pentoxide (V2O5) electrodes of a small coin cell can be imaged by neutron computed tomography. The neutron attenuation data has been used to construct the three-dimensional Li spatial images. Specifically, it is shown that there is sufficient neutron imaging contrast between lithiated and delithiated regions of V2O5 electrode making it possible to map Li distributions even in small electrodes with thicknesses <1 mm. The images reveal that the Li spatial distribution is inhomogeneous and a relatively higher C-rate leads to more non-uniform Li distribution after Li insertion. The non-uniform distribution suggests the limitation of Li diffusion within the electrode during the lithiation process under the relatively high cycling rates.

  20. Imaging of the Li spatial distribution within V2O5 cathode in a coin cell by neutron computed tomography

    DOE PAGES

    Zhang, Yuxuan; Chandran, K. S. Ravi; Bilheux, Hassina Z.

    2017-11-30

    An understanding of Lithium (Li) spatial distribution within the electrodes of a Li-ion cell, during charge and discharge cycles, is essential to optimize the electrode parameters for increased performance under cycling. In this work, it is demonstrated that the spatial distribution of Li within Vanadium Pentoxide (V2O5) electrodes of a small coin cell can be imaged by neutron computed tomography. The neutron attenuation data has been used to construct the three-dimensional Li spatial images. Specifically, it is shown that there is sufficient neutron imaging contrast between lithiated and delithiated regions of the V2O5 electrode, making it possible to map Li distributions even in small electrodes with thicknesses <1 mm. The images reveal that the Li spatial distribution is inhomogeneous and a relatively higher C-rate leads to more non-uniform Li distribution after Li insertion. The non-uniform distribution suggests the limitation of Li diffusion within the electrode during the lithiation process under the relatively high cycling rates.

  1. Skeletonization of gray-scale images by gray weighted distance transform

    NASA Astrophysics Data System (ADS)

    Qian, Kai; Cao, Siqi; Bhattacharya, Prabir

    1997-07-01

    In pattern recognition, thinning algorithms are a useful tool for representing a digital pattern by a skeletonized image, consisting of a set of one-pixel-width lines that highlight its significant features. There has been interest in applying thinning directly to gray-scale images, motivated by the desire to process images whose meaningful information is distributed over different levels of gray intensity. In this paper, a new algorithm is presented that can skeletonize both black-and-white and gray-scale pictures. The algorithm is based on the gray-weighted distance transform; it can process gray-scale pictures whose intensity is not uniformly distributed, and it preserves the topology of the original picture. The process includes a preliminary investigation of the 'hollows' in the gray-scale image; these hollows are or are not treated as topological constraints for the skeleton structure, depending on their statistically significant depth. The algorithm can also be executed on a parallel machine, as all operations are local. Some examples are discussed to illustrate the algorithm.
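The gray-weighted distance transform at the core of the algorithm assigns each pixel the minimum accumulated gray cost of a path from a seed set. A minimal sketch, formulated here as Dijkstra's algorithm on a 4-connected grid with per-pixel gray cost; the paper's exact propagation scheme may differ.

```python
import heapq

def gray_weighted_distance(img, seeds):
    """Gray-weighted distance transform: minimum path cost from any seed,
    where entering a pixel costs that pixel's gray value (4-connectivity)."""
    h, w = len(img), len(img[0])
    INF = float("inf")
    dist = [[INF] * w for _ in range(h)]
    heap = []
    for (i, j) in seeds:
        dist[i][j] = img[i][j]
        heapq.heappush(heap, (dist[i][j], i, j))
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i][j]:
            continue  # stale queue entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                nd = d + img[ni][nj]
                if nd < dist[ni][nj]:
                    dist[ni][nj] = nd
                    heapq.heappush(heap, (nd, ni, nj))
    return dist
```

High-gray ridges act as expensive terrain, so minimum-cost paths flow through dark "hollows", which is what lets the skeleton follow intensity structure rather than shape alone.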

  2. Difet: Distributed Feature Extraction Tool for High Spatial Resolution Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Eken, S.; Aydın, E.; Sayar, A.

    2017-11-01

    In this paper, we propose a distributed feature extraction tool for high spatial resolution remote sensing images. The tool is based on the Apache Hadoop framework and the Hadoop Image Processing Interface. Two corner detection algorithms (Harris and Shi-Tomasi) and five feature descriptors (SIFT, SURF, FAST, BRIEF, and ORB) are considered. The robustness of the tool in the task of feature extraction from Landsat-8 imagery is evaluated in terms of horizontal scalability.

  3. An Approach to Knowledge-Directed Image Analysis,

    DTIC Science & Technology

    1977-09-01

    AN APPROACH TO KNOWLEDGE-DIRECTED IMAGE ANALYSIS. D.H. Ballard, C.M. Brown, J.A. Feldman, Computer Science Department, The University of Rochester, Rochester, New York 14627. ... a semantic network model and a distributed control structure to accomplish the image analysis process. The process of "understanding an image" leads to

  4. apART: system for the acquisition, processing, archiving, and retrieval of digital images in an open, distributed imaging environment

    NASA Astrophysics Data System (ADS)

    Schneider, Uwe; Strack, Ruediger

    1992-04-01

    apART reflects the structure of an open, distributed environment. According to the general trend in the area of imaging, network-capable, general-purpose workstations with capabilities of open system image communication and image input are used. Several heterogeneous components like CCD cameras, slide scanners, and image archives can be accessed. The system is driven by an object-oriented user interface where devices (image sources and destinations), operators (derived from a commercial image processing library), and images (of different data types) are managed and presented uniformly to the user. Browsing mechanisms are used to traverse devices, operators, and images. An audit trail mechanism is offered to record interactive operations on low-resolution image derivatives. These operations are processed off-line on the original image. Thus, the processing of extremely high-resolution raster images is possible, and the performance of resolution-dependent operations is enhanced significantly during interaction. An object-oriented database system (APRIL), which can be browsed, is integrated into the system. Attribute retrieval is supported by the user interface. Other essential features of the system include: implementation on top of the X Window System (X11R4) and the OSF/Motif widget set; a SUN4 general-purpose workstation, including Ethernet, magneto-optical disc, etc., as the hardware platform for the user interface; complete graphical-interactive parametrization of all operators; support of different image interchange formats (GIF, TIFF, IIF, etc.); consideration of current IPI standard activities within ISO/IEC for further refinement and extensions.

  5. Distributed decision making in action: diagnostic imaging investigations within the bigger picture.

    PubMed

    Makanjee, Chandra R; Bergh, Anne-Marie; Hoffmann, Willem A

    2018-03-01

    Decision making in the health care system - specifically with regard to diagnostic imaging investigations - occurs at multiple levels. Professional role players from various backgrounds are involved in making these decisions, from the point of referral to the outcomes of the imaging investigation. The aim of this study was to map the decision-making processes and pathways involved when patients are referred for diagnostic imaging investigations and to explore distributed decision-making events at the points of contact with patients within a health care system. A two-phased qualitative study was conducted in an academic public health complex with the district hospital as entry point. The first phase included case studies of 24 conveniently selected patients, and the second phase involved 12 focus group interviews with health care providers. Data analysis was based on Rapley's interpretation of decision making as being distributed across time, situations and actions, and including different role players and technologies. Clinical decisions incorporating imaging investigations are distributed across the three vital points of contact or decision-making events, namely the initial patient consultation, the diagnostic imaging investigation and the post-investigation consultation. Each of these decision-making events is made up of a sequence of discrete decision-making moments based on the transfer of retrospective, current and prospective information and its transformation into knowledge. This paper contributes to the understanding of the microstructural processes (the 'when' and 'where') involved in the distribution of decisions related to imaging investigations. It also highlights the interdependency in decision-making events of medical and non-medical providers within a single medical encounter. © 2017 The Authors. 
Journal of Medical Radiation Sciences published by John Wiley & Sons Australia, Ltd on behalf of Australian Society of Medical Imaging and Radiation Therapy and New Zealand Institute of Medical Radiation Technology.

  6. Detecting defective electrical components in heterogeneous infra-red images by spatial control charts

    NASA Astrophysics Data System (ADS)

    Jamshidieini, Bahman; Fazaee, Reza

    2016-05-01

    Distribution network components connect machines and other loads to electrical sources. If the resistance or current of any component exceeds its specified range, its temperature may exceed the operational limit, which can cause major problems. Therefore, these defects should be found and eliminated according to their severity. Although infra-red cameras have been used for the inspection of electrical components, maintenance prioritization of distribution cubicles is mostly based on personal perception, and a lack of training data prevents engineers from developing image processing methods. New research on the spatial control chart encouraged us to use statistical approaches instead of pattern recognition for the image processing. In the present study, a new scanning pattern that can tolerate heavy autocorrelation among adjacent pixels within an infra-red image was developed, and for the first time a combination of kernel smoothing, spatial control charts and local robust regression was used for finding defects within heterogeneous infra-red images of old distribution cubicles. This method does not need training data, an advantage that is crucially important when training data are not available.
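A control-chart view of defect detection flags pixels whose value falls outside robust limits computed from the image itself, with no training data. The sketch below is a simplified stand-in using median/MAD limits on a stream of (already smoothed) pixel values; it is not the paper's kernel-smoothing and local-regression pipeline.

```python
import statistics

def defect_mask(values, k=3.0):
    """Flag values that deviate from the robust center by more than
    k robust sigmas, i.e. median +/- k * 1.4826 * MAD control limits.
    A simplified stand-in for a spatial control chart on pixel values."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    sigma = 1.4826 * mad  # scales MAD to a std-dev for Gaussian data
    return [abs(v - med) > k * sigma for v in values]
```

Because the median and MAD are insensitive to a small fraction of hot pixels, the limits are not inflated by the very defects being sought.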

  7. G0-WISHART Distribution Based Classification from Polarimetric SAR Images

    NASA Astrophysics Data System (ADS)

    Hu, G. C.; Zhao, Q. H.

    2017-09-01

    Enormous scientific and technical developments have been carried out over recent decades to further improve remote sensing, particularly the Polarimetric Synthetic Aperture Radar (PolSAR) technique, so classification methods based on PolSAR images have received much attention from scholars and related departments around the world. The multilook polarimetric G0-Wishart model is a flexible model that describes homogeneous, heterogeneous and extremely heterogeneous regions in the image. Moreover, the polarimetric G0-Wishart distribution does not include the modified Bessel function of the second kind; it is a simple statistical distribution model with few parameters. To prove its feasibility, a classification process has been tested on a fully polarized Synthetic Aperture Radar (SAR) image. First, multilook polarimetric SAR data processing and a speckle filter are applied to reduce the influence of speckle on the classification result. The image is initially classified into sixteen classes by H/A/α decomposition. The ICM algorithm is then used to classify features based on the G0-Wishart distance. Qualitative and quantitative results show that the proposed method can classify polarimetric SAR data effectively and efficiently.

  8. The imaging node for the Planetary Data System

    USGS Publications Warehouse

    Eliason, E.M.; LaVoie, S.K.; Soderblom, L.A.

    1996-01-01

    The Planetary Data System Imaging Node maintains and distributes the archives of planetary image data acquired from NASA's flight projects with the primary goal of enabling the science community to perform image processing and analysis on the data. The Node provides direct and easy access to the digital image archives through wide distribution of the data on CD-ROM media and on-line remote-access tools by way of Internet services. The Node provides digital image processing tools and the expertise and guidance necessary to understand the image collections. The data collections, now approaching one terabyte in volume, provide a foundation for remote sensing studies for virtually all the planetary systems in our solar system (except for Pluto). The Node is responsible for restoring data sets from past missions in danger of being lost. The Node works with active flight projects to assist in the creation of their archive products and to ensure that their products and data catalogs become an integral part of the Node's data collections.

  9. Cloud Engineering Principles and Technology Enablers for Medical Image Processing-as-a-Service

    PubMed Central

    Bao, Shunxing; Plassard, Andrew J.; Landman, Bennett A.; Gokhale, Aniruddha

    2017-01-01

    Traditional in-house, laboratory-based medical imaging studies use hierarchical data structures (e.g., NFS file stores) or databases (e.g., COINS, XNAT) for storage and retrieval. The resulting performance from these approaches is, however, impeded by standard network switches since they can saturate network bandwidth during transfer from storage to processing nodes for even moderate-sized studies. To that end, a cloud-based “medical image processing-as-a-service” offers promise in utilizing the ecosystem of Apache Hadoop, which is a flexible framework providing distributed, scalable, fault tolerant storage and parallel computational modules, and HBase, which is a NoSQL database built atop Hadoop’s distributed file system. Despite this promise, HBase’s load distribution strategy of region split and merge is detrimental to the hierarchical organization of imaging data (e.g., project, subject, session, scan, slice). This paper makes two contributions to address these concerns by describing key cloud engineering principles and technology enhancements we made to the Apache Hadoop ecosystem for medical imaging applications. First, we propose a row-key design for HBase, which is a necessary step that is driven by the hierarchical organization of imaging data. Second, we propose a novel data allocation policy within HBase to strongly enforce collocation of hierarchically related imaging data. The proposed enhancements accelerate data processing by minimizing network usage and localizing processing to machines where the data already exist. Moreover, our approach is amenable to the traditional scan, subject, and project-level analysis procedures, and is compatible with standard command line/scriptable image processing software. 
Experimental results for an illustrative sample of imaging data reveal that our new HBase policy results in a three-fold time improvement in converting classic DICOM to NIfTI file formats compared with the default HBase region split policy, and nearly a six-fold improvement over a commonly available network file system (NFS) approach, even for relatively small file sets. Moreover, file access latency is lower than with network-attached storage. PMID:28884169
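    The collocation property the row-key design relies on (hierarchically related slices sorting adjacently in HBase's lexicographic byte order) can be illustrated with a small key builder. The field names, widths, and separator below are hypothetical, not the authors' actual layout.

```python
def make_row_key(project, subject, session, scan, slice_idx):
    """Hypothetical row key: fixed-width fields (each assumed <= 8 chars,
    slice index < 10**6) joined so that plain lexicographic ordering groups
    keys by project > subject > session > scan > slice, keeping
    hierarchically related imaging data adjacent."""
    return "{:>8}|{:>8}|{:>8}|{:>8}|{:06d}".format(
        project, subject, session, scan, slice_idx)
```

    Because HBase stores rows sorted by key bytes, any scheme with this prefix structure keeps a scan's slices in one contiguous range, which is the property the paper's data allocation policy then exploits for collocation.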

  10. Determination of differential cross sections and kinetic energy release of co-products from central sliced images in photo-initiated dynamic processes.

    PubMed

    Chen, Kuo-mei; Chen, Yu-wei

    2011-04-07

    For photo-initiated inelastic and reactive collisions, dynamic information can be extracted from central sliced images of state-selected Newton spheres of product species. An analysis framework has been established to determine differential cross sections and the kinetic energy release of co-products from experimental images. When one of the reactants exhibits a high recoil speed in a photo-initiated dynamic process, the present theory can be employed to analyze central sliced images from ion imaging or three-dimensional sliced fluorescence imaging experiments. It is demonstrated that the differential cross section of a scattering process can be determined from the central sliced image by a double Legendre moment analysis, for either a fixed or a continuously distributed recoil speed in the center-of-mass reference frame. Simultaneous equations leading to the determination of the kinetic energy release of co-products can be established from the second-order Legendre moment of the experimental image, once the differential cross section is extracted. The intensity distribution of the central sliced image, along with its outer and inner ring sizes, provides all the clues needed to decipher the differential cross section and the kinetic energy release of co-products.
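    The Legendre moment analysis invoked here rests on the orthogonality of Legendre polynomials: a_n = (2n+1)/2 ∫ I(x) P_n(x) dx with x = cos θ. A minimal numerical sketch for a one-dimensional angular distribution follows; it illustrates coefficient extraction only, not the paper's full double-moment procedure for sliced images.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_moments(theta, intensity, n_max=2):
    """Recover coefficients a_n of I(theta) = sum_n a_n P_n(cos theta)
    via a_n = (2n+1)/2 * integral of I * P_n over cos(theta)."""
    x = np.cos(np.asarray(theta))
    order = np.argsort(x)                 # integrate over increasing x
    x, y = x[order], np.asarray(intensity)[order]
    coeffs = []
    for n in range(n_max + 1):
        f = y * legendre.Legendre.basis(n)(x)
        # trapezoidal rule on the (non-uniform) cos(theta) grid
        integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))
        coeffs.append((2 * n + 1) / 2.0 * integral)
    return np.array(coeffs)
```

    Feeding in a distribution of the form 1 + 2·P2(cos θ) returns approximately (1, 0, 2), confirming the moments are recovered.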

  11. Earth Observation Services (Image Processing Software)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.

  12. Visualization and quantification of three-dimensional distribution of yeast in bread dough.

    PubMed

    Maeda, Tatsuro; DO, Gab-Soo; Sugiyama, Junichi; Araki, Tetsuya; Tsuta, Mizuki; Shiraga, Seizaburo; Ueda, Mitsuyoshi; Yamada, Masaharu; Takeya, Koji; Sagara, Yasuyuki

    2009-07-01

    A three-dimensional (3-D) bio-imaging technique was developed for visualizing and quantifying the 3-D distribution of yeast in frozen bread dough samples as the mixing process of the samples progressed, applying cell-surface engineering to the surfaces of the yeast cells. The fluorescent yeast was recognized as bright spots at the wavelength of 520 nm. Frozen dough samples were sliced at intervals of 1 microm by a micro-slicer image processing system (MSIPS) equipped with a fluorescence microscope for acquiring cross-sectional images of the samples. A set of successive two-dimensional images was reconstructed to analyze the 3-D distribution of the yeast. The average shortest distance between centroids of enhanced green fluorescent protein (EGFP)-labeled yeast cells was 10.7 microm at the pick-up stage, 9.7 microm at the clean-up stage, 9.0 microm at the final stage, and 10.2 microm at the over-mixing stage. The results indicated that the distribution of the yeast cells was the most uniform in the dough of white bread at the final stage, while the heterogeneous distribution at the over-mixing stage was possibly due to the destruction of the gluten network structure within the samples.

  13. Hierarchical storage of large volume of multidetector CT data using distributed servers

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Rosset, Antoine; Heuberger, Joris; Bandon, David

    2006-03-01

    Multidetector scanners and hybrid multimodality scanners can generate large numbers of high-resolution images, resulting in very large data sets. In most cases, these datasets are generated for the sole purpose of producing secondary processed images and 3D-rendered images, as well as oblique and curved multiplanar reformatted images. It is therefore not essential to archive the original images after they have been processed. We have developed an architecture of distributed archive servers for temporary storage of large image datasets for 3D rendering and image processing, without the need for long-term storage in a PACS archive. With the relatively low cost of storage devices, it is possible to configure these servers to hold several months or even years of data, long enough to allow subsequent re-processing if required by specific clinical situations. We tested the latest generation of RAID servers provided by Apple with a capacity of 5 TBytes. We implemented peer-to-peer data access software based on our open-source image management software, OsiriX, allowing remote workstations to directly access DICOM image files located on the server through a technology called "Bonjour". This architecture offers seamless integration of multiple servers and workstations without the need for a central database or complex workflow management tools. It allows efficient access to image data from multiple workstations for image analysis and visualization without the need for image data transfer. It provides a convenient alternative to a centralized PACS architecture while avoiding complex and time-consuming data transfer and storage.

  14. Parallel Wavefront Analysis for a 4D Interferometer

    NASA Technical Reports Server (NTRS)

    Rao, Shanti R.

    2011-01-01

    This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and for distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to acquire several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time, and faster, by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like those in the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.
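    The capture/analysis split described above, with a controller farming per-image work out to a pool and collating results in order, can be mimicked in a few lines. This is a generic sketch with a stand-in analysis function, not the 4Sight interface itself.

```python
from concurrent.futures import ThreadPoolExecutor

def process_frames(frames, analyze, workers=4):
    """Run the (stand-in) per-image analysis function over already-captured
    frames in parallel, returning results collated in capture order
    (Executor.map preserves input ordering)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, frames))
```

    In the real system each worker would be a networked machine rather than a thread, but the ordering guarantee that lets the controller recombine results is the same idea.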

  15. Automated video-microscopic imaging and data acquisition system for colloid deposition measurements

    DOEpatents

    Abdel-Fattah, Amr I.; Reimus, Paul W.

    2004-12-28

    A video microscopic visualization system and image processing and data extraction and processing method for in situ detailed quantification of the deposition of sub-micrometer particles onto an arbitrary surface and determination of their concentration across the bulk suspension. The extracted data includes (a) surface concentration and flux of deposited, attached and detached colloids, (b) surface concentration and flux of arriving and departing colloids, (c) distribution of colloids in the bulk suspension in the direction perpendicular to the deposition surface, and (d) spatial and temporal distributions of deposited colloids.

  16. Colour Based Image Processing Method for Recognizing Ribbed Smoked Sheet Grade

    NASA Astrophysics Data System (ADS)

    Fibriani, Ike; Sumardi; Bayu Satriya, Alfredo; Budi Utomo, Satryo

    2017-03-01

    This research proposes a colour based image processing technique to recognize the Ribbed Smoked Sheet (RSS) grade so that the RSS sorting process can be faster and more accurate than the traditional one. The RSS sheet image captured by the camera is transformed into a grayscale image to simplify the recognition of rust and mould on the RSS sheet. The grayscale image is then transformed into a binary image using a threshold value obtained from the RSS 1 reference colour. The grade recognition is determined by counting the white-pixel percentage. The result shows that the system has 88% accuracy. Most errors occur in RSS 2 recognition, due to the uneven illumination distribution over the RSS image.
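    The grayscale, threshold, white-pixel-count pipeline described above reduces to a few array operations. In this sketch the luminance weights are the common ITU-R 601 defaults and the threshold is a free parameter; the paper derives its threshold from the RSS 1 reference colour, so both are assumptions here.

```python
import numpy as np

def rss_white_pixel_percentage(rgb, threshold):
    """RGB image -> grayscale (standard luminance weights) -> binary image
    at the given threshold -> percentage of white pixels, the quantity the
    grade decision is based on."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    binary = gray >= threshold
    return 100.0 * binary.mean()
```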

  17. A distributed computing system for magnetic resonance imaging: Java-based processing and binding of XML.

    PubMed

    de Beer, R; Graveron-Demilly, D; Nastase, S; van Ormondt, D

    2004-03-01

    Recently we have developed a Java-based heterogeneous distributed computing system for the field of magnetic resonance imaging (MRI). It is a software system for embedding the various image reconstruction algorithms that we have created for handling MRI data sets with sparse sampling distributions. Since these data sets may result from multi-dimensional MRI measurements our system has to control the storage and manipulation of large amounts of data. In this paper we describe how we have employed the extensible markup language (XML) to realize this data handling in a highly structured way. To that end we have used Java packages, recently released by Sun Microsystems, to process XML documents and to compile pieces of XML code into Java classes. We have effectuated a flexible storage and manipulation approach for all kinds of data within the MRI system, such as data describing and containing multi-dimensional MRI measurements, data configuring image reconstruction methods and data representing and visualizing the various services of the system. We have found that the object-oriented approach, possible with the Java programming environment, combined with the XML technology is a convenient way of describing and handling various data streams in heterogeneous distributed computing systems.

  18. The Statistics of Visual Representation

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.

    2002-01-01

    The experience of retinex image processing has prompted us to reconsider fundamental aspects of imaging and image processing. Foremost is the idea that a good visual representation requires a non-linear transformation of the recorded (approximately linear) image data. Further, this transformation appears to converge on a specific distribution. Here we investigate the connection between numerical and visual phenomena. Specifically the questions explored are: (1) Is there a well-defined consistent statistical character associated with good visual representations? (2) Does there exist an ideal visual image? And (3) what are its statistical properties?

  19. AIRSAR Web-Based Data Processing

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne

    2007-01-01

    The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing in which the software can automatically generate a quick-look image from an entire 90-GB SAR raw data 32-MB/s tape overnight without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a Queue management system. Users are able to have automatic generation of coregistered multi-frequency images as the software generates polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for one of the 12 radar modes.

  20. [Research and realization of signal processing algorithms based on FPGA in digital ophthalmic ultrasonography imaging].

    PubMed

    Fang, Simin; Zhou, Sheng; Wang, Xiaochun; Ye, Qingsheng; Tian, Ling; Ji, Jianjun; Wang, Yanqun

    2015-01-01

    To design and improve signal processing algorithms for ophthalmic ultrasonography based on FPGA, three signal processing modules were implemented using the Verilog HDL hardware description language in Quartus II: a fully parallel distributed dynamic filter, digital quadrature demodulation, and logarithmic compression. Compared to the original system, the hardware cost is reduced, the whole image is clearer and contains more information about the deep eyeball, and the depth of detection increases from 5 cm to 6 cm. The new algorithms meet the design requirements and optimize the system; they can effectively improve the image quality of existing equipment.

  1. Sensing system for detection and control of deposition on pendant tubes in recovery and power boilers

    DOEpatents

    Kychakoff, George [Maple Valley, WA; Afromowitz, Martin A [Mercer Island, WA; Hogle, Richard E [Olympia, WA

    2008-10-14

    A system for detection and control of deposition on pendant tubes in recovery and power boilers includes one or more deposit-monitoring sensors operating in infrared regions of about 4 or 8.7 microns, either directly producing images of the interior of the boiler or feeding signals to a data processing system that provides information enabling the distributed control system, by which the boilers are operated, to run them more efficiently. The data processing system includes an image pre-processing circuit in which a 2-D image formed from the video data input is captured, with a low-pass filter for noise filtering of the video input. It also includes an image compensation system for array compensation to correct for pixel variation, dead cells, and similar defects, and for correcting geometric distortion. An image segmentation module receives a cleaned image from the image pre-processing circuit and separates the image of the recovery boiler interior into background, pendant tubes, and deposition. It also performs thresholding/clustering on gray scale/texture, applies morphological transforms to smooth regions, and identifies regions by connected components. An image-understanding unit receives the segmented image from the image segmentation module and matches derived regions to a 3-D model of the boiler. It derives a 3-D structure of the deposition on pendant tubes in the boiler and provides the information about deposits to the plant distributed control system for more efficient operation of the plant's pendant-tube cleaning and operating systems.
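    The segmentation chain the patent describes (low-pass filtering, thresholding, region identification by connected components) can be sketched in simplified form. The 3x3 box filter, the single global threshold, and 4-connectivity below are illustrative choices, not the patent's actual parameters.

```python
import numpy as np

def segment_deposits(img, threshold):
    """Smooth with a 3x3 box low-pass filter (edge-replicating pad), make a
    binary mask by thresholding, then label 4-connected components with an
    explicit-stack flood fill. Returns (label image, number of regions)."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    smooth = sum(padded[i:i + h, j:j + w]
                 for i in range(3) for j in range(3)) / 9.0
    mask = smooth >= threshold
    labels = np.zeros((h, w), dtype=int)
    n_regions = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and labels[y, x] == 0:
                n_regions += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and mask[cy, cx] and labels[cy, cx] == 0):
                        labels[cy, cx] = n_regions
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, n_regions
```

    A real implementation would separate background, tubes, and deposits into distinct classes and add texture features and morphological smoothing, but the labeling step has this shape.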

  2. Transport Imaging of Spatial Distribution of Mobility-Lifetime (Micro Tau) Product in Bulk Semiconductors for Nuclear Radiation Detection

    DTIC Science & Technology

    2012-06-01

    the diffusion length L and the mobility-lifetime product μτ from the luminescence distribution using the 2D model for transport imaging in bulk...C. Scandrett, and N. M. Haegel, "Three-dimensional transport imaging for the spatially resolved determination of carrier diffusion length in bulk...that allows measurements of the diffusion length and extraction of the μτ product in luminescent materials without the need for device processing

  3. New Developments in Hard X-ray Fluorescence Microscopy for In-situ Investigations of Trace Element Distributions in Aqueous Systems of Soil Colloids

    NASA Astrophysics Data System (ADS)

    Gleber, Sophie-Charlotte; Weinhausen, Britta; Köster, Sarah; Ward, Jesse; Vine, David; Finney, Lydia; Vogt, Stefan

    2013-10-01

    The distribution, binding, and release of trace elements on soil colloids determine matter transport through the soil matrix; studying them requires an aqueous environment and short length and time scales. However, not many microscopy techniques allow for that. We previously demonstrated the capability of hard x-ray fluorescence microscopy to image aqueous colloidal soil samples [1]. As this technique provides attogram sensitivity for transition elements like Cu, Zn, and other geochemically relevant trace elements at sub-micrometer spatial resolution (currently down to 150 nm at 2-ID-E [2]; below 50 nm at the Bionanoprobe, cf. G. Woloschak et al., this volume), combined with the capability to penetrate tens of micrometers of water, it is ideally suited for imaging the elemental content of soil colloids. To address the question of binding and release processes of trace elements on the surface of soil colloids, we developed a microfluidics-based XRF flow cytometer and expanded the applied methods of hard x-ray fluorescence microscopy towards three-dimensional imaging. Here, we show (a) the 2-D imaged distributions of Si, K and Fe on soil colloids of Pseudogley samples; (b) how the trace element distribution is a dynamic, pH-dependent process; and (c) x-ray tomographic applications to render the trace elemental distributions in 3-D. We conclude that the approach presented here shows the remarkable potential to image and quantitate elemental distributions from samples within their natural aqueous microenvironment, particularly important in the environmental, medical, and biological sciences.

  4. X-ray Computed Tomography Imaging of the Microstructure of Sand Particles Subjected to High Pressure One-Dimensional Compression

    PubMed Central

    al Mahbub, Asheque; Haque, Asadul

    2016-01-01

    This paper presents the results of X-ray CT imaging of the microstructure of sand particles subjected to high pressure one-dimensional compression leading to particle crushing. A high resolution X-ray CT machine capable of in situ imaging was employed to capture images of the whole volume of a sand sample subjected to compressive stresses up to 79.3 MPa. Images of the whole sample obtained at different load stages were analysed using a commercial image processing software (Avizo) to reveal various microstructural properties, such as pore and particle volume distributions, spatial distribution of void ratios, relative breakage, and anisotropy of particles. PMID:28774011

  5. X-ray Computed Tomography Imaging of the Microstructure of Sand Particles Subjected to High Pressure One-Dimensional Compression.

    PubMed

    Al Mahbub, Asheque; Haque, Asadul

    2016-11-03

    This paper presents the results of X-ray CT imaging of the microstructure of sand particles subjected to high pressure one-dimensional compression leading to particle crushing. A high resolution X-ray CT machine capable of in situ imaging was employed to capture images of the whole volume of a sand sample subjected to compressive stresses up to 79.3 MPa. Images of the whole sample obtained at different load stages were analysed using a commercial image processing software (Avizo) to reveal various microstructural properties, such as pore and particle volume distributions, spatial distribution of void ratios, relative breakage, and anisotropy of particles.

  6. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    NASA Astrophysics Data System (ADS)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology that causes magnetic resonance imaging (MRI) gray scale changes has not been fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal subjects without any evidence of liver abnormality constituted our current database. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then brought back to the original images to identify the distribution of abnormality. Our results indicated that the disease process involved the pallidus, mesencephalon, and subthalamic regions.
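    Decomposing a gray-scale histogram into two classes (here, regular tissue versus lesion) can be illustrated with Otsu's between-class-variance threshold. This is a stand-in method chosen for the sketch; the abstract does not name the decomposition algorithm actually used.

```python
import numpy as np

def otsu_threshold(gray):
    """Pick the gray level that maximizes the between-class variance
    w0*w1*(m0 - m1)^2 of the two classes it induces (Otsu's method),
    over a 256-bin histogram of values in [0, 256)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0   # class means
        m1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

    Pixels above the returned level would be coded as lesion and mapped back onto the original images, mirroring the workflow the abstract describes.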

  7. High-resolution mapping of molecules in an ionic liquid via scanning transmission electron microscopy.

    PubMed

    Miyata, Tomohiro; Mizoguchi, Teruyasu

    2018-03-01

    Understanding structures and spatial distributions of molecules in liquid phases is crucial for the control of liquid properties and to develop efficient liquid-phase processes. Here, real-space mapping of molecular distributions in a liquid was performed. Specifically, the ionic liquid 1-Ethyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide (C2mimTFSI) was imaged using atomic-resolution scanning transmission electron microscopy. Simulations revealed network-like bright regions in the images that were attributed to the TFSI- anion, with minimal contributions from the C2mim+ cation. Simple visualization of the TFSI- distribution in the liquid sample was achieved by binarizing the experimental image.

  8. Super-Resolution of Multi-Pixel and Sub-Pixel Images for the SDI

    DTIC Science & Technology

    1993-06-08

    where the phase of the transmitted signal is not needed. The Wigner-Ville distribution (WVD) of a real signal s(t), associated with the complex...B. Boashash, O. P. Kenny and H. J. Whitehouse, "Radar imaging using the Wigner-Ville distribution", in Real-Time Signal Processing, J. P. Letellier...analytic signal z(t), is a time-frequency distribution defined as W(t, f) = ∫ z(t + τ/2) z*(t − τ/2) exp(−i2πfτ) dτ (45). Note that the WVD is the double Fourier

  9. Color image analysis of contaminants and bacteria transport in porous media

    NASA Astrophysics Data System (ADS)

    Rashidi, Mehdi; Dehmeshki, Jamshid; Daemi, Mohammad F.; Cole, Larry; Dickenson, Eric

    1997-10-01

    Transport of contaminants and bacteria in aqueous heterogeneous saturated porous systems has been studied experimentally using a novel fluorescent microscopic imaging technique. The approach involves color visualization and quantification of bacterium and contaminant distributions within a transparent porous column. By introducing stained bacteria and an organic dye as a contaminant into the column and illuminating the porous regions with a planar sheet of laser light, contaminant and bacterial transport processes through the porous medium can be observed and measured microscopically. A computer-controlled color CCD camera is used to record the fluorescent images as a function of time. These images are recorded by a frame-accurate high-resolution VCR and are then analyzed using a color image analysis code written in our laboratories. In this way the color images are digitized, and simultaneous concentration and velocity distributions of both contaminant and bacteria are evaluated as a function of time and pore characteristics. The approach provides a unique dynamic probe to observe these transport processes microscopically. These results are extremely valuable for in-situ bioremediation problems, since microscopic particle-contaminant-bacterium interactions are the key to understanding and optimizing these processes.

  10. Early Detection of Breast Cancer by Using Handycam Camera Manipulation as Thermal Camera Imaging with Images Processing Method

    NASA Astrophysics Data System (ADS)

    Riantana, R.; Arie, B.; Adam, M.; Aditya, R.; Nuryani; Yahya, I.

    2017-02-01

    One important thing to pay attention to when detecting breast cancer is breast temperature change: symptoms of breast tissue abnormalities are marked by a rise in the temperature of the breast. A Handycam in night-vision mode, aided by external infrared illumination, can penetrate the skin better and makes the infrared image clearer. The program is capable of changing images from a camcorder into night-vision thermal images by decomposing RGB into a grayscale matrix structure. The matrix is rearranged into a new matrix of double data type so that it can be processed into a color contour chart that differentiates the distribution of body temperature. The program also includes a contrast scale setting for the processed image, so that the colors can be set as desired, and an inverse contrast scale feature that reverses the color scale so that colors can be changed to their opposites. There is also an improfile function, used to retrieve the intensity values of pixels along a line, to show the distribution of intensity as a graph of the relationship between intensity and pixel coordinates.
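    An improfile-style line sampler (modeled on MATLAB's improfile, which the abstract mentions) might look like the sketch below. Nearest-neighbor sampling along the segment is an assumption; MATLAB's default also offers interpolation.

```python
import numpy as np

def line_profile(gray, p0, p1, n=50):
    """Sample pixel intensities at n evenly spaced points along the segment
    from p0=(row, col) to p1=(row, col), using nearest-neighbor lookup, so
    intensity can be plotted against position along the line."""
    ys = np.linspace(p0[0], p1[0], n)
    xs = np.linspace(p0[1], p1[1], n)
    return gray[np.round(ys).astype(int), np.round(xs).astype(int)]
```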

  11. A Distributed Compressive Sensing Scheme for Event Capture in Wireless Visual Sensor Networks

    NASA Astrophysics Data System (ADS)

    Hou, Meng; Xu, Sen; Wu, Weiling; Lin, Fei

    2018-01-01

    Image signals acquired by a wireless visual sensor network can be used for specific event capture, realized by image processing at the sink node. A distributed compressive sensing scheme is used for the transmission of these image signals from the camera nodes to the sink node. A measurement and joint reconstruction algorithm for these image signals is proposed in this paper. Taking advantage of the spatial correlation between images within a sensing area, the cluster head node, acting as the image decoder, can accurately co-reconstruct these image signals. The subjective visual quality and the reconstruction error rate are used for the evaluation of reconstructed image quality. Simulation results show that the joint reconstruction algorithm achieves higher image quality at the same image compression rate than the independent reconstruction algorithm.

  12. Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry

    NASA Technical Reports Server (NTRS)

    Hong, Yie-Ming

    1973-01-01

    Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.

  13. Spectral imaging toolbox: segmentation, hyperstack reconstruction, and batch processing of spectral images for the determination of cell and model membrane lipid order.

    PubMed

    Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor

    2017-05-12

    Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction and batch processing facilitates analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with environmentally-sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation, and requires no programming ability.
The Spectral Imaging Toolbox can be downloaded from https://uk.mathworks.com/matlabcentral/fileexchange/62617-spectral-imaging-toolbox .
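    The per-pixel GP calculation that the toolbox performs can be sketched in a few lines; the following NumPy illustration assumes the standard two-channel definition GP = (I_b - I_r)/(I_b + I_r), with the toy intensity values and band choices (~440 nm ordered, ~490 nm disordered for Laurdan) given only as examples:

```python
import numpy as np

def gp_map(i_blue, i_red, eps=1e-12):
    """Per-pixel generalized polarization: GP = (I_b - I_r) / (I_b + I_r).

    For Laurdan, I_b and I_r are typically integrated emission intensities
    around ~440 nm (ordered phase) and ~490 nm (disordered phase); the exact
    bands depend on the probe and instrument.
    """
    i_blue = np.asarray(i_blue, dtype=float)
    i_red = np.asarray(i_red, dtype=float)
    return (i_blue - i_red) / (i_blue + i_red + eps)

# Toy 2x2 image: one blue-shifted ("packed") pixel, one red-shifted pixel,
# and two neutral pixels.
blue = np.array([[200.0, 50.0], [120.0, 80.0]])
red = np.array([[100.0, 150.0], [120.0, 80.0]])
gp = gp_map(blue, red)
```

GP values always fall in [-1, 1]; pseudo-colored GP maps such as those the toolbox generates are simply this array passed through a colormap.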

  14. A Spatiotemporal Clustering Approach to Maritime Domain Awareness

    DTIC Science & Technology

    2013-09-01

    Approved for public release; distribution is unlimited. Spatiotemporal clustering is the process of grouping…
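    Only fragments of this abstract survive, but its subject, grouping observations by both position and time, can be illustrated with a minimal k-means over (x, y, t) feature vectors. This is a generic sketch, not the thesis' method; all data and parameters are made up:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: alternate nearest-center assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated synthetic clusters of (x, y, t) events.
rng = np.random.default_rng(1)
a = rng.normal([0, 0, 0], 0.1, size=(20, 3))
b = rng.normal([5, 5, 10], 0.1, size=(20, 3))
labels, centers = kmeans(np.vstack([a, b]), k=2)
```

In a maritime-awareness setting the features would be, e.g., vessel positions and timestamps; scaling the time axis relative to the spatial axes is the key modeling choice.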

  15. Application of Image Analysis for Characterization of Spatial Arrangements of Features in Microstructure

    NASA Technical Reports Server (NTRS)

    Louis, Pascal; Gokhale, Arun M.

    1995-01-01

    A number of microstructural processes are sensitive to the spatial arrangements of features in microstructure. However, the experimental measurement of descriptors of microstructural distance distributions has received very little attention in the past, owing to the lack of practically feasible methods. We present a digital image analysis procedure to estimate microstructural distance distributions. The application of the technique is demonstrated by estimating the K function, the radial distribution function, and the nearest-neighbor distribution function of hollow spherical carbon particulates in a polymer matrix composite, observed in a metallographic section.
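    Of the descriptors named above, the nearest-neighbor distribution function is the simplest to estimate from feature centroids. A brute-force NumPy sketch (the grid of points is illustrative, and edge-effect corrections used in practice are omitted):

```python
import numpy as np

def nearest_neighbor_distances(points):
    """Distance from each point to its nearest other point."""
    points = np.asarray(points, dtype=float)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # exclude self-distance
    return d.min(axis=1)

def nn_cdf(points, radii):
    """Empirical nearest-neighbor distribution function G(r):
    fraction of features whose nearest neighbor lies within r."""
    nnd = nearest_neighbor_distances(points)
    return np.array([(nnd <= r).mean() for r in radii])

# Feature centroids on a regular unit-square grid with spacing 0.25,
# so every feature's nearest-neighbor distance is exactly 0.25.
xs = np.arange(0, 1.01, 0.25)
grid = np.array([(x, y) for x in xs for y in xs])
g = nn_cdf(grid, radii=[0.1, 0.25, 0.5])
```

For clustered features G(r) rises faster than for a random (Poisson) arrangement, which is what makes it a useful descriptor of spatial arrangement.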

  16. An online detection system for aggregate sizes and shapes based on digital image processing

    NASA Astrophysics Data System (ADS)

    Yang, Jianhong; Chen, Sijia

    2017-02-01

    Traditional aggregate size measuring methods are time-consuming, labor-intensive, and do not deliver online measurements. To overcome these problems, a new online detection system for determining aggregate size and shape, based on a digital camera with a charge-coupled device and subsequent digital image processing, has been developed. The system captures images of aggregates while they are falling and while they are lying flat. From these data, the particle size and shape distributions can be obtained in real time. We calibrate the method using standard globules. Our experiments show that for falling aggregates, which are well dispersed, the maximum particle size distribution error was only 3 wt% and the maximum particle shape distribution error only 2 wt%. In contrast, the data for flat-lying aggregates had a maximum particle size distribution error of 12 wt% and a maximum particle shape distribution error of 10 wt%; their accuracy was clearly lower than for falling aggregates. Flat-lying measurements nevertheless performed well for single-graded aggregates and did not require a dispersion device. Our system is low-cost and easy to install. It achieves online detection of aggregate size and shape with good reliability, and it has great potential for aggregate quality assurance.
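    The weight-percent size distributions quoted above can be derived from per-particle size measurements by weighting each particle by its volume (mass proportional to d³ at constant density). A minimal sketch, with made-up diameters and sieve classes:

```python
import numpy as np

def weight_percent_distribution(diameters, bin_edges):
    """Weight-% of material in each size class, assuming mass ∝ d^3."""
    d = np.asarray(diameters, dtype=float)
    mass = d ** 3  # relative mass per particle (constant density assumed)
    idx = np.digitize(d, bin_edges) - 1  # size-class index per particle
    totals = np.zeros(len(bin_edges) - 1)
    for i, m in zip(idx, mass):
        if 0 <= i < len(totals):
            totals[i] += m
    return 100.0 * totals / totals.sum()

# Illustrative aggregate diameters (mm) and sieve class edges (mm).
diam = np.array([6.0, 8.0, 8.0, 12.0, 15.0, 21.0])
wt = weight_percent_distribution(diam, bin_edges=[5, 10, 20, 30])
```

Note how a single large particle dominates the mass distribution even though most particles are small, which is why number-based and weight-based distributions can differ sharply.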

  17. Clinical image processing engine

    NASA Astrophysics Data System (ADS)

    Han, Wei; Yao, Jianhua; Chen, Jeremy; Summers, Ronald

    2009-02-01

    Our group provides clinical image processing services to various institutes at NIH. We develop or adapt image processing programs for a variety of applications. However, each program requires a human operator to select a specific set of images, execute the program, and store the results appropriately for later use. To improve efficiency, we designed a parallelized clinical image processing engine (CIPE) to streamline and parallelize our service. The engine takes DICOM images from a PACS server, sorts and distributes the images to different applications, multithreads the execution of the applications, and collects their results. The engine consists of four modules: a listener, a router, a job manager, and a data manager. A template filter in XML format specifies the image requirements for each application, and a MySQL database stores and manages the incoming DICOM images and application results. The engine achieves two important goals: it reduces the time and manpower required to process medical images, and it reduces turnaround time. We tested the engine on three different applications with 12 datasets and demonstrated that it improved efficiency dramatically.
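    The multithreaded job dispatch described for the engine follows a standard worker-pool pattern, which can be sketched as below. This is a generic illustration, not the CIPE code; the application and dataset names are made up:

```python
import queue
import threading

def run_jobs(jobs, num_workers=4):
    """Dispatch (application, dataset) jobs to a pool of worker threads
    and collect per-job results, mimicking a job-manager module."""
    tasks = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                app, dataset = tasks.get_nowait()
            except queue.Empty:
                return  # no work left for this thread
            outcome = f"{app} processed {dataset}"  # stand-in for real work
            with lock:
                results.append(outcome)
            tasks.task_done()

    for job in jobs:
        tasks.put(job)
    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

out = run_jobs([("segmenter", f"series-{i}") for i in range(12)])
```

In a real engine the stand-in line would launch an external image processing program, and a data-manager module would persist `results` to the database.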

  18. Industrial application of thermal image processing and thermal control

    NASA Astrophysics Data System (ADS)

    Kong, Lingxue

    2001-09-01

    Industrial applications of infrared thermography are virtually boundless, as it can be used in any situation where temperature differences exist. The technology has been widely used in the automotive industry in particular, for process evaluation and system design. In this work, a thermal image processing technique is introduced to quantitatively calculate the heat stored in a warm or hot object; based on the heat calculated from the thermal images, a thermal control system is then proposed to accurately and actively manage the thermal distribution within the object.

  19. Confocal microscopy and 3-D distribution of dead cells in cryopreserved pancreatic islets

    NASA Astrophysics Data System (ADS)

    Merchant, Fatima A.; Aggarwal, Shanti J.; Diller, Kenneth R.; Bartels, Keith A.; Bovik, Alan C.

    1992-06-01

    Our laboratory studies changes in the shape and size of biological specimens under osmotic stress at ambient and sub-zero temperatures. This paper describes confocal microscopy, image processing, and analysis of the 3-D distribution of cells in acridine orange/propidium iodide (AO/PI) fluorescently stained frozen-thawed islets of Langerhans. Isolated and cultured rat pancreatic islets were frozen and thawed in 2 M dimethylsulfoxide and examined under a Zeiss laser scanning confocal microscope. Serial sections of the islets, 2-5 μm thick, were acquired and processed to obtain high-contrast images, which were then processed in two steps. The first step isolated the region of interest by template masking, followed by grey-level thresholding to obtain a binary image. A three-dimensional blob-coloring algorithm was then applied, and the number of voxels in each region and the number of regions were counted. The volumetric distribution of dead cells in the islets was computed by calculating the distance from the center of each blob to the centroid of the 3-D image. An increase in the number of blobs from the center toward the periphery of the islet was observed, indicating that freeze damage was concentrated in the outer edges of the islet.
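    The blob-coloring step, labeling connected voxel regions in a binary volume and measuring each blob's distance to the volume centroid, can be sketched in pure Python with a 6-connected flood fill; the tiny synthetic volume below is only illustrative:

```python
import numpy as np
from collections import deque

def label_blobs_3d(vol):
    """Label 6-connected foreground regions in a binary 3-D array."""
    vol = np.asarray(vol, dtype=bool)
    labels = np.zeros(vol.shape, dtype=int)
    current = 0
    for start in zip(*np.nonzero(vol)):
        if labels[start]:
            continue  # voxel already belongs to a labeled blob
        current += 1
        q = deque([start])
        labels[start] = current
        while q:
            z, y, x = q.popleft()
            for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0),
                               (0,-1,0), (0,0,1), (0,0,-1)):
                n = (z + dz, y + dy, x + dx)
                if all(0 <= c < s for c, s in zip(n, vol.shape)) \
                        and vol[n] and not labels[n]:
                    labels[n] = current
                    q.append(n)
    return labels, current

# Two separated blobs in an 8x8x8 binary volume.
vol = np.zeros((8, 8, 8), dtype=bool)
vol[1:3, 1:3, 1:3] = True
vol[6, 6, 6] = True
labels, n_blobs = label_blobs_3d(vol)

# Distance from each blob's centroid to the volume centroid.
center = (np.array(vol.shape) - 1) / 2.0
dists = [np.linalg.norm(np.mean(np.argwhere(labels == i), axis=0) - center)
         for i in range(1, n_blobs + 1)]
```

Binning the per-blob distances (here in `dists`) gives exactly the kind of center-to-periphery distribution the study reports.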

  20. Ghost image in enhanced self-heterodyne synthetic aperture imaging ladar

    NASA Astrophysics Data System (ADS)

    Zhang, Guo; Sun, Jianfeng; Zhou, Yu; Lu, Zhiyong; Li, Guangyuan; Xu, Mengmeng; Zhang, Bo; Lao, Chenzhe; He, Hongyu

    2018-03-01

    The enhanced self-heterodyne synthetic aperture imaging ladar (SAIL) self-heterodynes two polarization-orthogonal echo signals to eliminate the phase disturbances caused by atmospheric turbulence and mechanical trembling, and uses a heterodyne receiver instead of a self-heterodyne receiver to improve the signal-to-noise ratio. The principle and structure of the enhanced self-heterodyne SAIL are presented, and its imaging process for a distributed target is analyzed. In the enhanced self-heterodyne SAIL, the phases of the two orthogonally polarized beams are modulated by four cylindrical lenses in the transmitter to improve the resolution in the orthogonal and travel directions, which generates a ghost image. The generation of the ghost image is derived mathematically, and a method for eliminating it is presented, which is significant for far-distance imaging. A number of experiments on a distributed target are reported, and the results verify the theoretical analysis. The enhanced self-heterodyne SAIL can eliminate the influence of atmospheric turbulence and mechanical trembling, offers a clear advantage in detecting weak signals, and has promising applications in far-distance ladar imaging.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Donald F.; Schulz, Carl; Konijnenburg, Marco

    High-resolution Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry imaging enables the spatial mapping and identification of biomolecules from complex surfaces. The need for long time-domain transients, and thus large raw file sizes, results in a large amount of raw data (“big data”) that must be processed efficiently and rapidly. This can be compounded by large-area imaging and/or high spatial resolution imaging. For FT-ICR, data processing and data reduction must not compromise the high mass resolution afforded by the mass spectrometer. The continuous-mode “Mosaic Datacube” approach allows high mass resolution visualization (0.001 Da) of mass spectrometry imaging data, but requires additional processing compared to feature-based processing. We describe the use of distributed computing for processing FT-ICR MS imaging datasets, with generation of continuous-mode Mosaic Datacubes for high mass resolution visualization. An eight-fold improvement in processing time is demonstrated using a Dutch nationally available cloud service.

  2. A Method of Visualizing Three-Dimensional Distribution of Yeast in Bread Dough

    NASA Astrophysics Data System (ADS)

    Maeda, Tatsurou; Do, Gab-Soo; Sugiyama, Junichi; Oguchi, Kosei; Shiraga, Seizaburou; Ueda, Mitsuyoshi; Takeya, Koji; Endo, Shigeru

    A novel technique was developed to monitor the change in three-dimensional (3D) distribution of yeast in frozen bread dough samples in accordance with the progress of mixing process. Application of a surface engineering technology allowed the identification of yeast in bread dough by bonding EGFP (Enhanced Green Fluorescent Protein) to the surface of yeast cells. The fluorescent yeast (a biomarker) was recognized as bright spots at the wavelength of 520 nm. A Micro-Slicer Image Processing System (MSIPS) with a fluorescence microscope was utilized to acquire cross-sectional images of frozen dough samples sliced at intervals of 1 μm. A set of successive two-dimensional images was reconstructed to analyze 3D distribution of yeast. Samples were taken from each of four normal mixing stages (i.e., pick up, clean up, development, and final stages) and also from over mixing stage. In the pick up stage yeast distribution was uneven with local areas of dense yeast. As the mixing progressed from clean up to final stages, the yeast became more evenly distributed throughout the dough sample. However, the uniformity in yeast distribution was lost in the over mixing stage possibly due to the breakdown of gluten structure within the dough sample.

  3. Instantaneous three-dimensional visualization of concentration distributions in turbulent flows with crossed-plane laser-induced fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Hoffmann, A.; Zimmermann, F.; Scharr, H.; Krömker, S.; Schulz, C.

    2005-01-01

    A laser-based technique for measuring instantaneous three-dimensional species concentration distributions in turbulent flows is presented. The laser beam from a single laser is formed into two crossed light sheets that illuminate the area of interest. The laser-induced fluorescence (LIF) signal emitted from excited species within both planes is detected with a single camera via a mirror arrangement. Image processing enables the reconstruction of the three-dimensional data set in close proximity to the cutting line of the two light sheets. Three-dimensional intensity gradients are computed and compared to the two-dimensional projections obtained from the two directly observed planes. Volume visualization by digital image processing gives unique insight into the three-dimensional structures within the turbulent processes. We apply this technique to measurements of toluene-LIF in a turbulent, non-reactive mixing process of toluene and air and to hydroxyl (OH) LIF in a turbulent methane-air flame upon excitation at 248 nm with a tunable KrF excimer laser.

  4. Visualization and understanding of the granulation liquid mixing and distribution during continuous twin screw granulation using NIR chemical imaging.

    PubMed

    Vercruysse, Jurgen; Toiviainen, Maunu; Fonteyne, Margot; Helkimo, Niko; Ketolainen, Jarkko; Juuti, Mikko; Delaet, Urbain; Van Assche, Ivo; Remon, Jean Paul; Vervaet, Chris; De Beer, Thomas

    2014-04-01

    Over the last decade, there has been increased interest in the application of twin screw granulation as a continuous wet granulation technique for pharmaceutical drug formulations. However, the mixing of granulation liquid and powder material during the short residence time inside the screw chamber, and the atypical particle size distribution (PSD) of granules produced by twin screw granulation, are not yet fully understood. Therefore, this study aims at visualizing the granulation liquid mixing and distribution during continuous twin screw granulation using NIR chemical imaging. First, the residence time of material inside the barrel was investigated as a function of screw speed and moisture content, followed by visualization of the granulation liquid distribution as a function of different formulation and process parameters (liquid feed rate, liquid addition method, screw configuration, moisture content, and barrel filling degree). The link between moisture uniformity and granule size distributions was also studied. For residence time analysis, increased screw speed and lower moisture content resulted in a shorter mean residence time and a narrower residence time distribution. In addition, the distribution of granulation liquid was more homogeneous at higher moisture content and with more kneading zones on the granulator screws. After optimization of the screw configuration, a two-level full factorial experimental design was performed to evaluate the influence of moisture content, screw speed, and powder feed rate on the mixing efficiency of the powder and liquid phases. From these results, it was concluded that only increasing the moisture content significantly improved the granulation liquid distribution. This study demonstrates that NIR chemical imaging is a fast and adequate measurement tool for process visualization, and hence for providing a better understanding of a continuous twin screw granulation process. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Coherent X-Ray Imaging of Collagen Fibril Distributions within Intact Tendons

    PubMed Central

    Berenguer, Felisa; Bean, Richard J.; Bozec, Laurent; Vila-Comamala, Joan; Zhang, Fucai; Kewish, Cameron M.; Bunk, Oliver; Rodenburg, John M.; Robinson, Ian K.

    2014-01-01

    The characterization of the structure of highly hierarchical biosamples, such as collagen-based tissues, at the scale of tens of nanometers is essential to correlate tissue structure with its growth processes. Coherent x-ray Bragg ptychography is an innovative imaging technique that gives high-resolution images of the ordered parts of such samples. Herein, we report how we used this method to image the collagen fibrillar ultrastructure of intact rat tail tendons. The images show ordered fibrils extending over 10-20 μm in length, with a quantifiable D-banding spacing variation of 0.2%. Occasional defects in the fibril distribution have also been observed, likely indicating fibrillar fusion events. PMID:24461021

  6. Research based on the SoPC platform of feature-based image registration

    NASA Astrophysics Data System (ADS)

    Shi, Yue-dong; Wang, Zhi-hui

    2015-12-01

    This paper focuses on implementing feature-based image registration on a System on a Programmable Chip (SoPC) hardware platform. We implement the image registration algorithm on an FPGA, where the embedded Nios II soft-core processor accelerates the image processing system. This frees image registration from dependence on a PC and should broaden the technology's range of applications. Experimental results indicate that the system performs stably, with good noise immunity in the matching stage, and that the detected feature points are reasonably distributed across the images.

  7. Characterization of the Distance Relationship Between Localized Serotonin Receptors and Glia Cells on Fluorescence Microscopy Images of Brain Tissue.

    PubMed

    Jacak, Jaroslaw; Schaller, Susanne; Borgmann, Daniela; Winkler, Stephan M

    2015-08-01

    Here we present two new methods for the characterization of fluorescence localization microscopy images obtained from immunostained brain tissue sections. Direct stochastic optical reconstruction microscopy images of 5-HT1A serotonin receptors and glial fibrillary acidic proteins in healthy cryopreserved brain tissues are analyzed. In detail, we present two image processing methods for characterizing differences between the receptor distribution on glial cells and that on neural cells: one variant relies on skeleton extraction and adaptive thresholding, the other on k-means-based discrete layer segmentation. Experimental results show that both methods can distinguish classes of images with respect to serotonin receptor distribution. Quantification of nanoscopic changes in relative protein expression on particular cell types can be used to analyze degeneration in tissues caused by disease or medical treatment.

  8. Signal processing in urodynamics: towards high definition urethral pressure profilometry.

    PubMed

    Klünder, Mario; Sawodny, Oliver; Amend, Bastian; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Stenzl, Arnulf; Feuer, Ronny

    2016-03-22

    Urethral pressure profilometry (UPP) is used in the diagnosis of stress urinary incontinence (SUI) which is a significant medical, social, and economic problem. Low spatial pressure resolution, common occurrence of artifacts, and uncertainties in data location limit the diagnostic value of UPP. To overcome these limitations, high definition urethral pressure profilometry (HD-UPP) combining enhanced UPP hardware and signal processing algorithms has been developed. In this work, we present the different signal processing steps in HD-UPP and show experimental results from female minipigs. We use a special microtip catheter with high angular pressure resolution and an integrated inclination sensor. Signals from the catheter are filtered and time-correlated artifacts removed. A signal reconstruction algorithm processes pressure data into a detailed pressure image on the urethra's inside. Finally, the pressure distribution on the urethra's outside is calculated through deconvolution. A mathematical model of the urethra is contained in a point-spread-function (PSF) which is identified depending on geometric and material properties of the urethra. We additionally investigate the PSF's frequency response to determine the relevant frequency band for pressure information on the urinary sphincter. Experimental pressure data are spatially located and processed into high resolution pressure images. Artifacts are successfully removed from data without blurring other details. The pressure distribution on the urethra's outside is reconstructed and compared to the one on the inside. Finally, the pressure images are mapped onto the urethral geometry calculated from inclination and position data to provide an integrated image of pressure distribution, anatomical shape, and location. With its advanced sensing capabilities, the novel microtip catheter collects an unprecedented amount of urethral pressure data. 
Through sequential signal processing steps, physicians are provided with detailed information on the pressure distribution in and around the urethra. Therefore, HD-UPP overcomes many current limitations of conventional UPP and offers the opportunity to evaluate urethral structures, especially the sphincter, in context of the correct anatomical location. This could enable the development of focal therapy approaches in the treatment of SUI.
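    The final HD-UPP step, recovering the outside pressure distribution from the inside pressure image given an identified point-spread function, is a deconvolution; it can be sketched in one dimension with a regularized frequency-domain (Wiener-style) inverse. The PSF and pressure profile below are synthetic stand-ins, not the urethral model from the paper:

```python
import numpy as np

def wiener_deconvolve(measured, psf, noise_power=1e-3):
    """Regularized 1-D deconvolution: divide in the frequency domain,
    damping frequencies where the PSF response is weak."""
    n = len(measured)
    h = np.fft.fft(psf, n)
    y = np.fft.fft(measured, n)
    g = np.conj(h) / (np.abs(h) ** 2 + noise_power)  # Wiener-style filter
    return np.real(np.fft.ifft(g * y))

# Synthetic smooth "outside" pressure profile and a normalized blur PSF.
true = np.exp(-0.5 * ((np.arange(64) - 32) / 4.0) ** 2)
psf = np.array([0.25, 0.5, 0.25])
# Forward model: circular convolution of the profile with the PSF.
measured = np.real(np.fft.ifft(np.fft.fft(psf, 64) * np.fft.fft(true)))
estimate = wiener_deconvolve(measured, psf, noise_power=1e-4)
```

The `noise_power` term plays the role of regularization: without it, frequencies where the PSF response approaches zero would amplify measurement noise without bound.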

  9. MALDI imaging facilitates new topical drug development process by determining quantitative skin distribution profiles.

    PubMed

    Bonnel, David; Legouffe, Raphaël; Eriksson, André H; Mortensen, Rasmus W; Pamelard, Fabien; Stauber, Jonathan; Nielsen, Kim T

    2018-04-01

    Generation of skin distribution profiles and reliable determination of drug molecule concentration in the target region are crucial during the development of topical products for the treatment of skin diseases like psoriasis and atopic dermatitis. Imaging techniques like mass spectrometric imaging (MSI) offer sufficient spatial resolution to generate meaningful distribution profiles of a drug molecule across a skin section. In this study, we use matrix-assisted laser desorption/ionization mass spectrometry imaging (MALDI-MSI) to generate quantitative skin distribution profiles, based on tissue extinction coefficient (TEC) determinations, of four different molecules in cross sections of human skin explants after topical administration. The four drug molecules, roflumilast, tofacitinib, ruxolitinib, and LEO 29102, have different physicochemical properties. In addition, tofacitinib was administered in two different formulations. The study reveals that with MALDI-MSI we were able to observe differences in penetration profiles for both the four drug molecules and the two formulations, demonstrating its applicability as a screening tool when developing a topical drug product. Furthermore, the study reveals that the sensitivity of the MALDI-MSI technique appears to be inversely correlated with the drug molecules' ability to bind to the surrounding tissue, which can be estimated from their Log D values.

  10. Real-time implementations of image segmentation algorithms on shared memory multicore architecture: a survey (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Akil, Mohamed

    2017-05-01

    Real-time processing is becoming more and more important in many image processing applications. Image segmentation is one of the most fundamental tasks in image analysis, and many different approaches to it have been proposed. The watershed transform is a well-known image segmentation tool, and a very data-intensive one. To accelerate watershed algorithms and achieve real-time processing, parallel architectures and programming models for multicore computing have been developed. This paper surveys approaches for the parallel implementation of sequential watershed algorithms on general-purpose multicore CPUs: homogeneous multicore processors with shared memory. To achieve an efficient parallel implementation, different strategies (parallelization, distribution, distributed scheduling) must be explored, combined with different acceleration and optimization techniques to enhance parallelism. We compare various parallelizations of sequential watershed algorithms on shared-memory multicore architectures, analyzing the performance measurements of each parallel implementation and the impact of the different sources of overhead on performance. In this comparison, we also discuss the advantages and disadvantages of the parallel programming models, comparing OpenMP (Open Multi-Processing) with Pthreads (POSIX Threads) to illustrate the impact of each programming model on the performance of the parallel implementations.

  11. Imaging articular cartilage using second harmonic generation microscopy

    NASA Astrophysics Data System (ADS)

    Mansfield, Jessica C.; Winlove, C. Peter; Knapp, Karen; Matcher, Stephen J.

    2006-02-01

    Subcellular-resolution images of equine articular cartilage have been obtained using both second harmonic generation microscopy (SHGM) and two-photon fluorescence microscopy (TPFM). The SHGM images clearly map the distribution of the collagen II fibers within the extracellular matrix, while the TPFM images show the distribution of endogenous two-photon fluorophores in both the cells and the extracellular matrix, highlighting especially the pericellular matrix and bright 2-3 μm diameter features within the cells. To investigate the source of TPF in the extracellular matrix, experiments were carried out to determine whether it originates from the proteoglycans. Pure solutions of the proteoglycans hyaluronan, chondroitin sulfate, and aggrecan were imaged; only aggrecan produced any TPF, and its intensity was not great enough to account for the TPF in the extracellular matrix. Cartilage samples were also subjected to a process removing proteoglycans and cellular components; after this process, the TPF from the samples decreased by a factor of two with respect to the SHG intensity.

  12. Statistical Deconvolution for Superresolution Fluorescence Microscopy

    PubMed Central

    Mukamel, Eran A.; Babcock, Hazen; Zhuang, Xiaowei

    2012-01-01

    Superresolution microscopy techniques based on the sequential activation of fluorophores can achieve image resolution of ∼10 nm but require a sparse distribution of simultaneously activated fluorophores in the field of view. Image analysis procedures for this approach typically discard data from crowded molecules with overlapping images, wasting valuable image information that is only partly degraded by overlap. A data analysis method that exploits all available fluorescence data, regardless of overlap, could increase the number of molecules processed per frame and thereby accelerate superresolution imaging speed, enabling the study of fast, dynamic biological processes. Here, we present a computational method, referred to as deconvolution-STORM (deconSTORM), which uses iterative image deconvolution in place of single- or multiemitter localization to estimate the sample. DeconSTORM approximates the maximum likelihood sample estimate under a realistic statistical model of fluorescence microscopy movies comprising numerous frames. The model incorporates Poisson-distributed photon-detection noise, the sparse spatial distribution of activated fluorophores, and temporal correlations between consecutive movie frames arising from intermittent fluorophore activation. We first quantitatively validated this approach with simulated fluorescence data and showed that deconSTORM accurately estimates superresolution images even at high densities of activated fluorophores where analysis by single- or multiemitter localization methods fails. We then applied the method to experimental data of cellular structures and demonstrated that deconSTORM enables an approximately fivefold or greater increase in imaging speed by allowing a higher density of activated fluorophores/frame. PMID:22677393
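    The core ingredient of the approach above, maximum-likelihood deconvolution under Poisson noise, reduces in its simplest form to the classic Richardson-Lucy iteration; deconSTORM adds sparsity and inter-frame priors on top of this. A 1-D NumPy sketch with synthetic, noiseless data:

```python
import numpy as np

def richardson_lucy(measured, psf, iters=200):
    """Richardson-Lucy deconvolution: multiplicative updates that converge
    toward the maximum-likelihood estimate under Poisson noise."""
    psf = np.asarray(psf, dtype=float)
    psf_flip = psf[::-1]  # adjoint of convolution is correlation
    est = np.full_like(np.asarray(measured, dtype=float), measured.mean())
    for _ in range(iters):
        blurred = np.convolve(est, psf, mode="same")
        ratio = measured / np.maximum(blurred, 1e-12)
        est *= np.convolve(ratio, psf_flip, mode="same")
    return est

# Two nearby point emitters blurred by a normalized PSF.
true = np.zeros(32)
true[12] = 5.0
true[18] = 3.0
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
measured = np.convolve(true, psf, mode="same")
est = richardson_lucy(measured, psf, iters=300)
```

With a normalized PSF the iteration conserves total flux, and on clean data the estimate sharpens back toward the two point emitters.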

  13. Size-frequency distribution of boulders ≥7 m on comet 67P/Churyumov-Gerasimenko

    NASA Astrophysics Data System (ADS)

    Pajola, Maurizio; Vincent, Jean-Baptiste; Güttler, Carsten; Lee, Jui-Chi; Bertini, Ivano; Massironi, Matteo; Simioni, Emanuele; Marzari, Francesco; Giacomini, Lorenza; Lucchetti, Alice; Barbieri, Cesare; Cremonese, Gabriele; Naletto, Giampiero; Pommerol, Antoine; El-Maarry, Mohamed R.; Besse, Sébastien; Küppers, Michael; La Forgia, Fiorangela; Lazzarin, Monica; Thomas, Nicholas; Auger, Anne-Thérèse; Sierks, Holger; Lamy, Philippe; Rodrigo, Rafael; Koschny, Detlef; Rickman, Hans; Keller, Horst U.; Agarwal, Jessica; A'Hearn, Michael F.; Barucci, Maria A.; Bertaux, Jean-Loup; Da Deppo, Vania; Davidsson, Björn; De Cecco, Mariolino; Debei, Stefano; Ferri, Francesca; Fornasier, Sonia; Fulle, Marco; Groussin, Olivier; Gutierrez, Pedro J.; Hviid, Stubbe F.; Ip, Wing-Huen; Jorda, Laurent; Knollenberg, Jörg; Kramm, J.-Rainer; Kürt, Ekkehard; Lara, Luisa M.; Lin, Zhong-Yi; Lopez Moreno, Jose J.; Magrin, Sara; Marchi, Simone; Michalik, Harald; Moissl, Richard; Mottola, Stefano; Oklay, Nilda; Preusker, Frank; Scholten, Frank; Tubiana, Cecilia

    2015-11-01

    Aims: We derive for the first time the size-frequency distribution of boulders on a comet, 67P/Churyumov-Gerasimenko (67P), computed from the images taken by the Rosetta/OSIRIS imaging system. We highlight the possible physical processes that lead to these boulder size distributions. Methods: We used images acquired by the OSIRIS Narrow Angle Camera, NAC, on 5 and 6 August 2014. The scale of these images (2.44-2.03 m/px) is such that boulders ≥7 m can be identified and manually extracted from the datasets with the software ArcGIS. We derived both global and localized size-frequency distributions. The three-pixel sampling detection, coupled with the favorable shadowing of the surface (observation phase angle ranging from 48° to 53°), enables unequivocally detecting boulders scattered all over the illuminated side of 67P. Results: We identify 3546 boulders larger than 7 m on the imaged surface (36.4 km2), with a global number density of nearly 100/km2 and a cumulative size-frequency distribution represented by a power-law with index of -3.6 +0.2/-0.3. The two lobes of 67P appear to have slightly different distributions, with an index of -3.5 +0.2/-0.3 for the main lobe (body) and -4.0 +0.3/-0.2 for the small lobe (head). The steeper distribution of the small lobe might be due to a more pervasive fracturing. The difference of the distribution for the connecting region (neck) is much more significant, with an index value of -2.2 +0.2/-0.2. We propose that the boulder field located in the neck area is the result of blocks falling from the contiguous Hathor cliff. The lower slope of the size-frequency distribution we see today in the neck area might be due to the concurrent processes acting on the smallest boulders, such as i) disintegration or fragmentation and vanishing through sublimation; ii) uplifting by gas drag and consequent redistribution; and iii) burial beneath a debris blanket. 
We also derived the cumulative size-frequency distribution per km2 of localized areas on 67P. By comparing the cumulative size-frequency distributions of similar geomorphological settings, we derived similar power-law index values. This suggests that despite the selected locations on different and often opposite sides of the comet, similar sublimation or activity processes, pit formation or collapses, as well as thermal stresses or fracturing events occurred on multiple areas of the comet, shaping its surface into the appearance we see today.
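    A cumulative size-frequency distribution of the kind derived above, N(>D) ∝ D^a, can be fit for its power-law index by least squares in log-log space. The sketch below uses synthetic boulder diameters drawn (deterministically, via quantiles) from an ideal D^-3.6 distribution; it is a generic illustration, not the paper's fitting procedure:

```python
import numpy as np

def cumulative_power_index(diameters):
    """Fit the slope of the cumulative size-frequency distribution
    N(>=D) versus D on a log-log plot."""
    d = np.sort(np.asarray(diameters, dtype=float))
    n_greater = np.arange(len(d), 0, -1)  # boulders at least as large as each D
    slope, _ = np.polyfit(np.log10(d), np.log10(n_greater), 1)
    return slope

# Synthetic sample following P(D > x) = (x / 7)^-3.6 for x >= 7 m,
# built from evenly spaced quantiles (inverse-CDF construction).
u = (np.arange(1, 20001) - 0.5) / 20000.0
diam = 7.0 * u ** (-1.0 / 3.6)
index = cumulative_power_index(diam)
```

In practice a maximum-likelihood estimator is often preferred over log-log least squares, since the cumulative counts are correlated; on clean data like this, both recover the input index.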

  14. Clinical evaluation of a commercial orthopedic metal artifact reduction tool for CT simulations in radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Hua; Noel, Camille; Chen, Haijian

    Purpose: Severe artifacts in kilovoltage-CT simulation images caused by large metallic implants can significantly degrade the conspicuity and apparent CT Hounsfield number of targets and anatomic structures, jeopardize the confidence of anatomical segmentation, and introduce inaccuracies into the radiation therapy treatment planning process. This study evaluated the performance of the first commercial orthopedic metal artifact reduction function (O-MAR) for radiation therapy, and investigated its clinical applications in treatment planning. Methods: Both phantom and clinical data were used for the evaluation. The CIRS electron density phantom with known physical (and electron) density plugs and removable titanium implants was scanned on a Philips Brilliance Big Bore 16-slice CT simulator. The CT Hounsfield numbers of density plugs on both uncorrected and O-MAR corrected images were compared. Treatment planning accuracy was evaluated by comparing simulated dose distributions computed using the true density images, uncorrected images, and O-MAR corrected images. Ten CT image sets of patients with large hip implants were processed with the O-MAR function and evaluated by two radiation oncologists using a five-point score for overall image quality, anatomical conspicuity, and CT Hounsfield number accuracy. By utilizing the same structure contours delineated from the O-MAR corrected images, clinical IMRT treatment plans for five patients were computed on the uncorrected and O-MAR corrected images, respectively, and compared. Results: Results of the phantom study indicated that CT Hounsfield number accuracy and noise were improved on the O-MAR corrected images, especially for images with bilateral metal implants.
The {gamma} pass rates of the simulated dose distributions computed on the uncorrected and O-MAR corrected images referenced to those of the true densities were higher than 99.9% (even when using 1% and 3 mm distance-to-agreement criterion), suggesting that dose distributions were clinically identical. In all patient cases, radiation oncologists rated O-MAR corrected images as higher quality. Formerly obscured critical structures were able to be visualized. The overall image quality and the conspicuity in critical organs were significantly improved compared with the uncorrected images: overall quality score (1.35 vs 3.25, P= 0.0022); bladder (2.15 vs 3.7, P= 0.0023); prostate and seminal vesicles/vagina (1.3 vs 3.275, P= 0.0020); rectum (2.8 vs 3.9, P= 0.0021). The noise levels of the selected ROIs were reduced from 93.7 to 38.2 HU. On most cases (8/10), the average CT Hounsfield numbers of the prostate/vagina on the O-MAR corrected images were closer to the referenced value (41.2 HU, an average measured from patients without metal implants) than those on the uncorrected images. High {gamma} pass rates of the five IMRT dose distribution pairs indicated that the dose distributions were not significantly affected by the CT image improvements. Conclusions: Overall, this study indicated that the O-MAR function can remarkably reduce metal artifacts and improve both CT Hounsfield number accuracy and target and critical structure visualization. Although there was no significant impact of the O-MAR algorithm on the calculated dose distributions, we suggest that O-MAR corrected images are more suitable for the entire treatment planning process by offering better anatomical structure visualization, improving radiation oncologists' confidence in target delineation, and by avoiding subjective density overrides of artifact regions on uncorrected images.« less

  15. Clinical evaluation of a commercial orthopedic metal artifact reduction tool for CT simulations in radiation therapy

    PubMed Central

    Li, Hua; Noel, Camille; Chen, Haijian; Harold Li, H.; Low, Daniel; Moore, Kevin; Klahr, Paul; Michalski, Jeff; Gay, Hiram A.; Thorstad, Wade; Mutic, Sasa

    2012-01-01

    Purpose: Severe artifacts in kilovoltage-CT simulation images caused by large metallic implants can significantly degrade the conspicuity and apparent CT Hounsfield number of targets and anatomic structures, jeopardize the confidence of anatomical segmentation, and introduce inaccuracies into the radiation therapy treatment planning process. This study evaluated the performance of the first commercial orthopedic metal artifact reduction function (O-MAR) for radiation therapy, and investigated its clinical applications in treatment planning. Methods: Both phantom and clinical data were used for the evaluation. The CIRS electron density phantom with known physical (and electron) density plugs and removable titanium implants was scanned on a Philips Brilliance Big Bore 16-slice CT simulator. The CT Hounsfield numbers of density plugs on both uncorrected and O-MAR corrected images were compared. Treatment planning accuracy was evaluated by comparing simulated dose distributions computed using the true density images, uncorrected images, and O-MAR corrected images. Ten CT image sets of patients with large hip implants were processed with the O-MAR function and evaluated by two radiation oncologists using a five-point score for overall image quality, anatomical conspicuity, and CT Hounsfield number accuracy. By utilizing the same structure contours delineated from the O-MAR corrected images, clinical IMRT treatment plans for five patients were computed on the uncorrected and O-MAR corrected images, respectively, and compared. Results: Results of the phantom study indicated that CT Hounsfield number accuracy and noise were improved on the O-MAR corrected images, especially for images with bilateral metal implants. 
The γ pass rates of the simulated dose distributions computed on the uncorrected and O-MAR corrected images, referenced to those of the true densities, were higher than 99.9% (even with a 1% dose-difference and 3 mm distance-to-agreement criterion), suggesting that the dose distributions were clinically identical. In all patient cases, radiation oncologists rated O-MAR corrected images as higher quality, and formerly obscured critical structures became visible. The overall image quality and the conspicuity of critical organs were significantly improved compared with the uncorrected images: overall quality score (1.35 vs 3.25, P = 0.0022); bladder (2.15 vs 3.7, P = 0.0023); prostate and seminal vesicles/vagina (1.3 vs 3.275, P = 0.0020); rectum (2.8 vs 3.9, P = 0.0021). The noise levels of the selected ROIs were reduced from 93.7 to 38.2 HU. In most cases (8/10), the average CT Hounsfield numbers of the prostate/vagina on the O-MAR corrected images were closer to the reference value (41.2 HU, an average measured from patients without metal implants) than those on the uncorrected images. High γ pass rates of the five IMRT dose distribution pairs indicated that the dose distributions were not significantly affected by the CT image improvements. Conclusions: Overall, this study indicated that the O-MAR function can markedly reduce metal artifacts and improve both CT Hounsfield number accuracy and target and critical structure visualization. Although there was no significant impact of the O-MAR algorithm on the calculated dose distributions, we suggest that O-MAR corrected images are more suitable for the entire treatment planning process: they offer better anatomical structure visualization, improve radiation oncologists' confidence in target delineation, and avoid subjective density overrides of artifact regions on uncorrected images. PMID:23231300
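    The γ pass-rate evaluation reported above can be sketched in a simplified 1-D form (the clinical analysis is 3-D and interpolated; the dose profiles, criteria values and helper names here are illustrative only, not the study's implementation):

```python
import math

def gamma_index(ref, test, spacing, dta=3.0, dd=0.03):
    """Simplified 1-D gamma analysis (Low et al. formulation).
    ref, test: dose profiles on the same grid; spacing: grid step in mm;
    dta: distance-to-agreement criterion in mm; dd: dose-difference
    criterion as a fraction of the reference maximum."""
    dmax = max(ref)
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dt in enumerate(test):
            dist = (i - j) * spacing
            dose = dt - dr
            g2 = (dist / dta) ** 2 + (dose / (dd * dmax)) ** 2
            best = min(best, g2)
        gammas.append(math.sqrt(best))
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)

ref = [0, 10, 50, 100, 50, 10, 0]    # toy reference profile (true density)
test = [0, 11, 52, 99, 49, 10, 0]    # toy evaluated profile (O-MAR image)
print(pass_rate(gamma_index(ref, test, spacing=1.0)))
```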

  16. Advances in Measurement of Skin Friction in Airflow

    NASA Technical Reports Server (NTRS)

    Brown, James L.; Naughton, Jonathan W.

    2006-01-01

The surface interferometric skin-friction (SISF) measurement system is an instrument for determining the distribution of surface shear stress (skin friction) on a wind-tunnel model. The SISF system utilizes the established oil-film interference method, along with advanced image-data-processing techniques and mathematical models that express the relationship between interferograms and skin friction, to determine the distribution of skin friction over an observed region of the surface of a model during a single wind-tunnel test. In the oil-film interference method, a wind-tunnel model is coated with a thin film of oil of known viscosity and is illuminated with quasi-monochromatic, collimated light, typically from a mercury lamp. The light reflected from the outer surface of the oil film interferes with the light reflected from the oil-covered surface of the model. In the present version of the oil-film interference method, a camera captures an image of the illuminated model and the image in the camera is modulated by the interference pattern. The interference pattern depends on the oil-thickness distribution on the observed surface, and this distribution can be extracted through analysis of the image acquired by the camera. The oil-film technique is augmented by a tracer technique for observing the streamline pattern. To make the streamlines visible, small dots of fluorescent-chalk/oil mixture are placed on the model just before a test. During the test, the chalk particles are embedded in the oil flow and produce chalk streaks that mark the streamlines. The instantaneous rate of thinning of the oil film at a given position on the surface of the model can be expressed as a function of the instantaneous thickness, the skin-friction distribution on the surface, and the streamline pattern on the surface; the functional relationship is expressed by a mathematical model that is nonlinear in the oil-film thickness and is known simply as the thin-oil-film equation.
From the image data acquired as described, the time-dependent oil-thickness distribution and streamline pattern are extracted, and by inversion of the thin-oil-film equation it is then possible to determine the skin-friction distribution. In addition to a quasi-monochromatic light source, the SISF system includes a beam splitter and two video cameras equipped with filters for observing the same area on a model in different wavelength ranges, plus a frame grabber and a computer for digitizing the video images and processing the image data. One video camera acquires the interference pattern in a narrow wavelength range of the quasi-monochromatic source. The other video camera acquires the streamline image of fluorescence from the chalk in a nearby but wider wavelength range. The interference-pattern and fluorescence images are digitized, and the resulting data are processed by an algorithm that inverts the thin-oil-film equation to find the skin-friction distribution.
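    The thin-oil-film equation mentioned above is commonly written, at leading order in the film thickness h, with oil viscosity μ and wall shear-stress components τx, τy, as:

```latex
\frac{\partial h}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{\tau_x\, h^{2}}{2\mu}\right)
  + \frac{\partial}{\partial y}\!\left(\frac{\tau_y\, h^{2}}{2\mu}\right) = 0
```

    Given the measured time history of h from the interferograms and the streamline directions from the chalk streaks, inverting this relation yields the skin-friction components; pressure-gradient and gravity terms, neglected in this leading-order form, are sometimes retained.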

  17. Image-guided convection-enhanced delivery of muscimol to the primate brain

    PubMed Central

    Heiss, John D.; Walbridge, Stuart; Asthagiri, Ashok R.; Lonser, Russell R.

    2009-01-01

Object Muscimol is a potent γ-aminobutyric acid-A (GABAA) receptor agonist that temporarily and selectively suppresses neurons. Targeted muscimol-suppression of neuronal structures could provide insight into the pathophysiology and treatment of a variety of neurologic disorders. To determine if muscimol delivered to the brain by convection-enhanced delivery (CED) could be monitored using a co-infused surrogate magnetic resonance (MR)-imaging tracer, we perfused the striata of primates with tritiated muscimol and gadolinium-DTPA. Methods Three primates underwent convective co-infusion of 3H-muscimol (0.8 μM) and gadolinium-DTPA (~5 mM) into the bilateral striata. Primates underwent serial MR-imaging during infusion, and animals were sacrificed immediately after infusion. Post-mortem quantitative autoradiography and histological analyses were performed. Results MR-imaging revealed that infusate (tritiated muscimol and gadolinium-DTPA) distribution was clearly discernible from the non-infused parenchyma. Real-time MR-imaging of the infusion revealed the precise region of anatomic perfusion in each animal. Imaging analysis during infusion revealed that the distribution volume of infusate increased linearly (R=0.92) with volume of infusion. Overall, the mean (±S.D.) ratio of volume of distribution to volume of infusion was 8.2±1.3. Autoradiographic analysis revealed that MR-imaging of gadolinium-DTPA closely correlated with the distribution of 3H-muscimol and precisely estimated its volume of distribution (mean difference in volume of distribution, 7.4%). Quantitative autoradiograms revealed that muscimol was homogeneously distributed over the perfused region in a square-shaped concentration profile. Conclusions Muscimol can be effectively delivered to clinically relevant volumes of the primate brain. Moreover, the distribution of muscimol can be tracked by co-infusion of gadolinium-DTPA using MR-imaging.
The ability to accurately monitor and control the anatomic extent of muscimol distribution during its convection-enhanced delivery will enhance safety, permit correlations of muscimol distribution with clinical effect, and should lead to an improved understanding of the pathophysiologic processes underlying a variety of neurologic disorders. PMID:19715424

  18. Thread concept for automatic task parallelization in image analysis

    NASA Astrophysics Data System (ADS)

    Lueckenhaus, Maximilian; Eckstein, Wolfgang

    1998-09-01

Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of an automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking into account the available hardware. Tests with our system prototype show that the thread concept, combined with the agent paradigm, is suitable for speeding up image processing by an automatic parallelization of image analysis tasks.
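    The idea of threads that share objects but work on different data in parallel can be sketched with a thread pool dispatching one image strip per thread. This is a hand-rolled illustration, not the authors' agent-based system; the tile splitting and the thresholding subtask are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def split_into_tiles(image, n):
    """Split an image (list of rows) into n horizontal strips."""
    rows = len(image)
    step = max(1, rows // n)
    return [image[i:i + step] for i in range(0, rows, step)]

def threshold_tile(tile, t=128):
    """A toy image-analysis subtask: binarize one strip."""
    return [[1 if p >= t else 0 for p in row] for row in tile]

def parallel_threshold(image, workers=4):
    # each strip becomes one thread of execution over different data;
    # map() preserves strip order, so the result reassembles cleanly
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(threshold_tile, split_into_tiles(image, workers))
    return [row for part in parts for row in part]

image = [[40, 200], [130, 90], [255, 0], [128, 127]]
print(parallel_threshold(image))
```

    An automatic parallelizer would generate such task decompositions itself, sized to the available hardware.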

  19. Optical Processing of Speckle Images with Bacteriorhodopsin for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Downie, John D.; Tucker, Deanne (Technical Monitor)

    1994-01-01

    Logarithmic processing of images with multiplicative noise characteristics can be utilized to transform the image into one with an additive noise distribution. This simplifies subsequent image processing steps for applications such as image restoration or correlation for pattern recognition. One particularly common form of multiplicative noise is speckle, for which the logarithmic operation not only produces additive noise, but also makes it of constant variance (signal-independent). We examine the optical transmission properties of some bacteriorhodopsin films here and find them well suited to implement such a pointwise logarithmic transformation optically in a parallel fashion. We present experimental results of the optical conversion of speckle images into transformed images with additive, signal-independent noise statistics using the real-time photochromic properties of bacteriorhodopsin. We provide an example of improved correlation performance in terms of correlation peak signal-to-noise for such a transformed speckle image.
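    The pointwise transformation that the bacteriorhodopsin film implements optically is numerically just a logarithm; a small sketch of how it converts multiplicative speckle (modeled here, as is common for fully developed speckle, with unit-mean exponential noise) into additive noise:

```python
import math
import random

def log_transform(image):
    """Pointwise logarithm: I = R * n  ->  log I = log R + log n,
    so multiplicative speckle becomes an additive noise term whose
    distribution no longer depends on the signal level."""
    return [[math.log(p) for p in row] for row in image]

random.seed(0)
reflectance = [[100.0, 50.0], [25.0, 200.0]]
# fully developed speckle: unit-mean exponential multiplicative noise
speckled = [[r * random.expovariate(1.0) for r in row] for row in reflectance]
logged = log_transform(speckled)
```

    After this step, standard additive-noise restoration or correlation filters apply directly, which is the simplification the abstract exploits.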

  20. A virtual clinical trial comparing static versus dynamic PET imaging in measuring response to breast cancer therapy

    NASA Astrophysics Data System (ADS)

    Wangerin, Kristen A.; Muzi, Mark; Peterson, Lanell M.; Linden, Hannah M.; Novakova, Alena; Mankoff, David A.; E Kinahan, Paul

    2017-05-01

We developed a method to evaluate variations in the PET imaging process in order to characterize the relative ability of static and dynamic metrics to measure breast cancer response to therapy in a clinical trial setting. We performed a virtual clinical trial by generating 540 independent and identically distributed PET imaging study realizations for each of 22 original dynamic fluorodeoxyglucose (18F-FDG) breast cancer patient studies pre- and post-therapy. Each noise realization accounted for known sources of uncertainty in the imaging process, such as biological variability and SUV uptake time. Four definitions of SUV were analyzed: SUVmax, SUVmean, SUVpeak, and SUV50%. We performed a ROC analysis on the resulting SUV and kinetic parameter uncertainty distributions to assess the impact of the variability on the measurement capabilities of each metric. The kinetic macro parameter, Ki, showed more variability than SUV (mean CV: Ki = 17%, SUV = 13%), but the Ki pre- and post-therapy distributions also showed increased separation compared to the SUV pre- and post-therapy distributions (mean normalized difference: Ki = 0.54, SUV = 0.27). For the patients who did not show perfect separation between the pre- and post-therapy parameter uncertainty distributions (ROC AUC < 1), dynamic imaging outperformed SUV in distinguishing metabolic change in response to therapy, ranging from 12 to 14 of 16 patients over all SUV definitions and uptake time scenarios (p < 0.05). For the patient cohort in this study, which comprises non-high-grade ER+ tumors, Ki outperformed SUV in an ROC analysis of the parameter uncertainty distributions pre- and post-therapy. This methodology can be applied to different scenarios with the ability to inform the design of clinical trials using PET imaging.
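    An ROC analysis on two parameter uncertainty distributions reduces, when the threshold is swept over all values, to the Mann-Whitney rank statistic; a minimal sketch (the SUV realizations below are invented, not trial data):

```python
def roc_auc(pre, post):
    """Area under the ROC curve separating two uncertainty
    distributions, via the Mann-Whitney U formulation: the fraction
    of (pre, post) pairs in which the post-therapy value is lower."""
    wins = 0.0
    for a in pre:
        for b in post:
            if b < a:          # metric decreased after therapy
                wins += 1.0
            elif b == a:       # ties count half
                wins += 0.5
    return wins / (len(pre) * len(post))

pre_therapy = [5.1, 4.8, 5.6, 5.0]    # toy SUV realizations before therapy
post_therapy = [2.0, 2.4, 1.9, 2.2]   # toy realizations after therapy
print(roc_auc(pre_therapy, post_therapy))
```

    AUC = 1 corresponds to the "perfect separation" case the abstract mentions; overlap between the distributions pushes the AUC toward 0.5.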

  1. A Versatile Image Processor For Digital Diagnostic Imaging And Its Application In Computed Radiography

    NASA Astrophysics Data System (ADS)

    Blume, H.; Alexandru, R.; Applegate, R.; Giordano, T.; Kamiya, K.; Kresina, R.

    1986-06-01

In a digital diagnostic imaging department, the majority of operations for handling and processing of images can be grouped into a small set of basic operations, such as image data buffering and storage, image processing and analysis, image display, image data transmission and image data compression. These operations occur in almost all nodes of the diagnostic imaging communications network of the department. An image processor architecture was developed in which each of these functions has been mapped into hardware and software modules. The modular approach has advantages in terms of economics, service, expandability and upgradeability. The architectural design is based on the principles of hierarchical functionality, distributed and parallel processing, and aims at real-time response. Parallel processing and real-time response are facilitated in part by a dual bus system: a VME control bus and a high-speed image data bus, consisting of 8 independent parallel 16-bit busses, capable of a combined throughput of up to 144 MBytes/sec. The presented image processor is versatile enough to meet the video-rate processing needs of digital subtraction angiography, the large pixel matrix processing requirements of static projection radiography, or the broad range of manipulation and display needs of a multi-modality diagnostic workstation. Several hardware modules are described in detail. For illustrating the capabilities of the image processor, processed 2000 x 2000 pixel computed radiographs are shown and estimated computation times for executing the processing operations are presented.

  2. Detection of triterpene acids distribution in loquat (Eriobotrya japonica) leaf using hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Shi, Jiyong; Chen, Wu; Zou, Xiaobo; Xu, Yiwei; Huang, Xiaowei; Zhu, Yaodi; Shen, Tingting

    2018-01-01

Hyperspectral images (431-962 nm) and partial least squares (PLS) were used to detect the distribution of triterpene acids within loquat (Eriobotrya japonica) leaves. Seventy-two fresh loquat leaves from the young, mature and old groups were collected for hyperspectral imaging, and the triterpene acids content of the loquat leaves was analyzed using high performance liquid chromatography (HPLC). The spectral data of the loquat leaf hyperspectral images and the triterpene acids content were then employed to build calibration models. After spectra pre-processing and wavelength selection, an optimum calibration model (Rp = 0.8473, RMSEP = 2.61 mg/g) for predicting triterpene acids was obtained by synergy interval partial least squares (siPLS). Finally, the spectral data of each pixel in the loquat leaf hyperspectral image were extracted and substituted into the optimum calibration model to predict the triterpene acids content of each pixel, yielding the distribution map of triterpene acids content. As shown in the distribution map, triterpene acids accumulate mainly in the leaf mesophyll regions near the main veins, and the triterpene acids concentration of the young group is lower than that of the mature and old groups. This study showed that hyperspectral imaging is suitable for determining the distribution of active constituent content in medicinal herbs in a rapid and non-invasive manner.
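    Applying the calibration model pixel-by-pixel to build the distribution map reduces, at prediction time, to a linear model evaluated on each pixel spectrum; a sketch with invented 3-band coefficients (not the paper's siPLS model):

```python
def predict_pixel(spectrum, coeffs, intercept):
    """Apply a linear calibration model (the form a fitted PLS model
    reduces to at prediction time) to one pixel spectrum."""
    return intercept + sum(c * s for c, s in zip(coeffs, spectrum))

def distribution_map(cube, coeffs, intercept):
    """cube: rows x cols x bands hyperspectral image; returns the
    per-pixel predicted content, i.e. the distribution map."""
    return [[predict_pixel(px, coeffs, intercept) for px in row]
            for row in cube]

# hypothetical 3-band calibration; coefficients are illustrative only
coeffs, intercept = [0.5, -0.2, 0.1], 1.0
cube = [[[10.0, 2.0, 4.0], [6.0, 1.0, 2.0]]]   # a 1 x 2 pixel toy cube
print(distribution_map(cube, coeffs, intercept))
```

    A real pipeline would first apply the same spectral pre-processing and wavelength selection to each pixel that was used when the calibration model was built.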

  3. Uniform competency-based local feature extraction for remote sensing images

    NASA Astrophysics Data System (ADS)

    Sedaghat, Amin; Mohammadi, Nazila

    2018-01-01

Local feature detectors are widely used in many photogrammetry and remote sensing applications. The quantity and distribution of the local features play a critical role in the quality of the image matching process, particularly for multi-sensor high resolution remote sensing image registration. However, conventional local feature detectors cannot extract desirable matched features either in terms of the number of correct matches or the spatial and scale distribution in multi-sensor remote sensing images. To address this problem, this paper proposes a novel method for uniform and robust local feature extraction for remote sensing images, which is based on a novel competency criterion and scale and location distribution constraints. The proposed method, called uniform competency (UC) local feature extraction, can be easily applied to any local feature detector for various kinds of applications. The proposed competency criterion is based on a weighted ranking process using three quality measures, including robustness, spatial saliency and scale parameters, which is performed in a multi-layer gridding schema. For evaluation, five state-of-the-art local feature detector approaches, namely, scale-invariant feature transform (SIFT), speeded up robust features (SURF), scale-invariant feature operator (SFOP), maximally stable extremal region (MSER) and hessian-affine, are used. The proposed UC-based feature extraction algorithms were successfully applied to match various synthetic and real satellite image pairs, and the results demonstrate its capability to increase matching performance and to improve the spatial distribution. The code to carry out the UC feature extraction is available from https://www.researchgate.net/publication/317956777_UC-Feature_Extraction.
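    The gridding-plus-weighted-ranking idea described above can be sketched as keeping the top-scoring features per grid cell; the weights and single-layer grid here are illustrative, not the exact UC criterion:

```python
def uniform_select(features, grid=2, per_cell=1, weights=(0.5, 0.3, 0.2)):
    """Keep the top-ranked features in each grid cell so that the
    survivors are spread over the image.  Each feature is a tuple
    (x, y, robustness, saliency, scale_quality) with x, y in [0, 1);
    the weighted score is an illustrative competency criterion."""
    cells = {}
    for f in features:
        x, y, *quality = f
        key = (int(x * grid), int(y * grid))         # grid cell index
        score = sum(w * q for w, q in zip(weights, quality))
        cells.setdefault(key, []).append((score, f))
    kept = []
    for ranked in cells.values():
        ranked.sort(key=lambda t: -t[0])             # best score first
        kept.extend(f for _, f in ranked[:per_cell])
    return kept

feats = [(0.1, 0.1, 0.9, 0.5, 0.5),   # strong feature, cell (0, 0)
         (0.2, 0.2, 0.1, 0.1, 0.1),   # weak feature, same cell
         (0.8, 0.8, 0.7, 0.7, 0.7)]   # only feature in cell (1, 1)
print(len(uniform_select(feats)))
```

    Repeating this over several grid resolutions gives the multi-layer schema; a dense cluster of detections then cannot crowd out features elsewhere in the image.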

  4. Data Processing Factory for the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan

    2002-12-01

The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
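    The phase-chaining of wrapper scripts (preparation, submission, checking, quality control) can be sketched as a simple pipeline runner; the phase functions below are toy stand-ins for the real SDSS scripts:

```python
def run_pipeline(name, data, phases):
    """Run one pipeline through its consecutive phases, stopping the
    chain at the first failure.  Each phase returns (ok, data)."""
    for phase in phases:
        ok, data = phase(data)
        if not ok:
            return False, f"{name}: failed in {phase.__name__}", data
    return True, f"{name}: complete", data

# toy phases standing in for the real wrapper scripts
def prepare(d):
    return True, {**d, "prepared": True}

def submit(d):
    return True, {**d, "submitted": True}

def check(d):
    return d.get("submitted", False), d   # fails if nothing was submitted

def qc(d):
    return True, {**d, "qc": "pass"}

ok, msg, result = run_pipeline("imaging", {}, [prepare, submit, check, qc])
print(ok, msg)
```

    Chaining the factory then amounts to feeding one pipeline's output data into the next pipeline's phase list, with the operational database holding the intermediate catalogs.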

  5. BOREAS TE-18, 30-m, Radiometrically Rectified Landsat TM Imagery

    NASA Technical Reports Server (NTRS)

    Hall, Forrest G. (Editor); Knapp, David

    2000-01-01

The BOREAS TE-18 team used a radiometric rectification process to produce standardized DN values for a series of Landsat TM images of the BOREAS SSA and NSA in order to compare images that were collected under different atmospheric conditions. The images for each study area were referenced to an image that had very clear atmospheric qualities. The reference image for the SSA was collected on 02-Sep-1994, while the reference image for the NSA was collected on 21-Jun-1995. The 23 rectified images cover the period of 07-Jul-1985 to 18-Sep-1994 in the SSA and from 22-Jun-1984 to 09-Jun-1994 in the NSA. Each of the reference scenes had coincident atmospheric optical thickness measurements made by RSS-11. The radiometric rectification process is described in more detail by Hall et al. (1991). The original Landsat TM data were received from CCRS for use in the BOREAS project. The data are stored in binary image-format files. Due to the nature of the radiometric rectification process and copyright issues, these full-resolution images may not be publicly distributed. However, a spatially degraded 60-m resolution version of the images is available on the BOREAS CD-ROM series. See Sections 15 and 16 for information about how to acquire the full-resolution data. Information about the full-resolution images is provided in an inventory listing on the CD-ROMs. The data files are available on a CD-ROM (see document number 20010000884), or from the Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC).
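    Radiometric rectification of this kind reduces, per band, to a linear DN mapping whose gain and offset come from dark and bright control sets common to the subject and reference scenes (Hall et al., 1991); the DN values below are illustrative, not BOREAS data:

```python
def fit_gain_offset(subject_dark, subject_bright, ref_dark, ref_bright):
    """Solve the two-point linear mapping from the mean DN of the dark
    and bright control sets in the subject and reference scenes."""
    gain = (ref_bright - ref_dark) / (subject_bright - subject_dark)
    offset = ref_dark - gain * subject_dark
    return gain, offset

def rectify(dn_values, gain, offset):
    """Map subject-image digital numbers onto the reference image's
    radiometric scale: DN' = gain * DN + offset."""
    return [gain * dn + offset for dn in dn_values]

# illustrative control-set means for one band
gain, offset = fit_gain_offset(10.0, 110.0, 20.0, 220.0)
print(rectify([10.0, 60.0, 110.0], gain, offset))
```

    After this transform, the control sets of the subject scene match the reference scene by construction, so residual DN differences between dates reflect surface change rather than atmospheric state.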

  6. Application of a distributed systems architecture for increased speed in image processing on an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Wright, Adam A.; Momin, Orko; Shin, Young Ho; Shakya, Rahul; Nepal, Kumud; Ahlgren, David J.

    2010-01-01

This paper presents the application of a distributed systems architecture to an autonomous ground vehicle, Q, that participates in both the autonomous and navigation challenges of the Intelligent Ground Vehicle Competition. In the autonomous challenge the vehicle is required to follow a course while avoiding obstacles and staying within the course boundaries, which are marked by white lines. For the navigation challenge, the vehicle is required to reach a set of target destinations, known as waypoints, with given GPS coordinates and avoid obstacles that it encounters in the process. Previously the vehicle utilized a single laptop to execute all processing activities, including image processing, sensor interfacing and data processing, path planning and navigation algorithms, and motor control. National Instruments' (NI) LabVIEW served as the programming language for software implementation. As an upgrade to the previous year's design, an NI compact Reconfigurable Input/Output system (cRIO) was incorporated into the system architecture. The cRIO is NI's solution for rapid prototyping and is equipped with a real-time processor, an FPGA, and modular input/output. Under the current system, the real-time processor handles the path planning and navigation algorithms, while the FPGA gathers and processes sensor data. This setup leaves the laptop to focus on running the image processing algorithm. Image processing, as previously presented by Nepal et al., is a multi-step line extraction algorithm and constitutes the largest processor load. This distributed approach results in a faster image processing algorithm, which was previously Q's bottleneck. Additionally, the path planning and navigation algorithms are executed more reliably on the real-time processor due to the deterministic nature of its operation. The implementation of this architecture required exploration of various inter-system communication techniques.
After testing various options, data transfer between the laptop and the real-time processor using UDP packets was established as the most reliable approach. The system could be improved further by migrating more algorithms to the hardware-based FPGA to speed up the operation of the vehicle.
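    A fixed-layout binary payload is the usual way to make such UDP transfers deterministic on the receiving side; a sketch with Python's struct module (the field layout is invented for illustration, not Q's actual protocol):

```python
import struct

# hypothetical waypoint-command packet for the laptop -> cRIO link:
# sequence number, latitude, longitude (little-endian, fixed size)
PACKET = struct.Struct("<Idd")

def encode(seq, lat, lon):
    """Build the UDP datagram payload."""
    return PACKET.pack(seq, lat, lon)

def decode(payload):
    """Recover the fields on the receiving side."""
    return PACKET.unpack(payload)

# the payload would be handed to socket.sendto() on the laptop and
# read back with socket.recvfrom() on the real-time processor
datagram = encode(7, 41.7478, -72.6923)
print(decode(datagram))
```

    Because every datagram has the same known size and layout, the receiver can parse each packet in constant time, which suits the real-time processor's deterministic scheduling.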

  7. Medical gamma ray imaging

    DOEpatents

    Osborne, Louis S.; Lanza, Richard C.

    1984-01-01

A method and apparatus for determining the distribution of a positron-emitting radioisotope within an object, the apparatus consisting of a wire mesh radiation converter, an ionizable gas for propagating ionization events caused by electrons released by the converter, a drift field, a spatial position detector, and signal processing circuitry for correlating near-simultaneous ionization events and determining their time differences, whereby the positions of sources of back-to-back collinear radiation can be located and a distribution image constructed.

  8. Calculating bathymetric and spatial distributions of estuarine eelgrass

    EPA Science Inventory

    Distributions of native eelgrass Zostera marina L. within the intertidal and shallow subtidal zones of three Oregon estuaries (Tillamook, Yaquina, and Alsea) were classified from color infrared aerial orthophotography acquired at extreme low tide. Image processing software, Spati...

  9. Diffraction effects and inelastic electron transport in angle-resolved microscopic imaging applications.

    PubMed

    Winkelmann, A; Nolze, G; Vespucci, S; Naresh-Kumar, G; Trager-Cowan, C; Vilalta-Clemente, A; Wilkinson, A J; Vos, M

    2017-09-01

We analyse the signal formation process for scanning electron microscopic imaging applications on crystalline specimens. In accordance with previous investigations, we find nontrivial effects of incident beam diffraction on the backscattered electron distribution in energy and momentum. Specifically, incident beam diffraction causes angular changes of the backscattered electron distribution which we identify as the dominant mechanism underlying pseudocolour orientation imaging using multiple, angle-resolving detectors. Consequently, diffraction effects of the incident beam and their impact on the subsequent coherent and incoherent electron transport need to be taken into account for an in-depth theoretical modelling of the energy and momentum distribution of electrons backscattered from crystalline sample regions. Our findings have implications for the level of theoretical detail that can be necessary for the interpretation of complex imaging modalities such as electron channelling contrast imaging (ECCI) of defects in crystals. If the solid angle of detection is limited to specific regions of the backscattered electron momentum distribution, the image contrast that is observed in ECCI and similar applications can be strongly affected by incident beam diffraction and topographic effects from the sample surface. As an application, we demonstrate characteristic changes in the resulting images if different properties of the backscattered electron distribution are used for the analysis of a GaN thin film sample containing dislocations. © 2017 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.

  10. Combined X-ray CT and mass spectrometry for biomedical imaging applications

    NASA Astrophysics Data System (ADS)

    Schioppa, E., Jr.; Ellis, S.; Bruinen, A. L.; Visser, J.; Heeren, R. M. A.; Uher, J.; Koffeman, E.

    2014-04-01

    Imaging technologies play a key role in many branches of science, especially in biology and medicine. They provide invaluable insight into both the internal structure of and the processes within a broad range of samples. There are many techniques that allow one to obtain images of an object. Each technique is based on the analysis of a particular sample property by means of a dedicated imaging system, and as such, each imaging modality provides the researcher with different information. The use of multimodal imaging (imaging with several different techniques) can provide additional and complementary information that cannot be obtained with any single imaging technique alone. In this study, we present for the first time a multimodal imaging technique in which X-ray computerized tomography (CT) is combined with mass spectrometry imaging (MSI). While X-ray CT provides 3-dimensional information regarding the internal structure of the sample based on X-ray absorption coefficients, MSI of thin sections acquired from the same sample allows the spatial distribution of many elements/molecules, each distinguished by its unique mass-to-charge ratio (m/z), to be determined within a single measurement and with a spatial resolution of 1 μm or better. The aim of this work is to demonstrate how molecular information from MSI can be spatially correlated with 3D structural information acquired from X-ray CT. In these experiments, frozen samples are imaged in an X-ray CT setup using Medipix-based detectors equipped with a CO2-cooled sample holder. Single projections are pre-processed before tomographic reconstruction using a signal-to-thickness calibration. In the second step, the object is sliced into thin sections (circa 20 μm) that are then imaged using both matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) and secondary ion mass spectrometry (SIMS), where the spatial distribution of specific molecules within the sample is determined.
The combination of two vastly different imaging approaches provides complementary information (i.e., anatomical and molecular distributions) that allows distinct structural features to be correlated with the distributions of specific molecules, leading to unique insights into disease development.

  11. Digital image modification detection using color information and its histograms.

    PubMed

    Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na

    2016-09-01

    The rapid development of many open source and commercial image editing software packages makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve robustness against the use of JPEG compression, blurring, noise, or other types of post-processing operations. These post-processing operations are frequently used with the intention to conceal tampering and reduce tampering clues. A robust method based on color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distribution. Blocks from the tampered regions will reside within the same cluster, since both copied and moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image was distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring. Copyright © 2016. Published by Elsevier Ireland Ltd.
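    The abstract names color moments among the block descriptors but does not give an implementation. As a rough sketch of the block-feature stage only (block size, step, and the use of the first three moments per channel are illustrative assumptions; the five additional descriptors, the clustering step, and the network are omitted):

```python
import numpy as np

def color_moments(block):
    """First three color moments (mean, std, cube-root skew) per channel."""
    feats = []
    for c in range(block.shape[2]):
        ch = block[:, :, c].astype(float)
        mu = ch.mean()
        feats.extend([mu, ch.std(), np.cbrt(((ch - mu) ** 3).mean())])
    return np.array(feats)

def block_features(img, size=16, step=4):
    """Slide a fixed-size window over the image, collecting per-block features."""
    h, w, _ = img.shape
    feats, coords = [], []
    for y in range(0, h - size + 1, step):
        for x in range(0, w - size + 1, step):
            feats.append(color_moments(img[y:y + size, x:x + size]))
            coords.append((y, x))
    return np.array(feats), coords

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)   # stand-in image
F, C = block_features(img)   # one 9-dim feature vector per overlapping block
```

    Blocks with near-identical feature vectors (and a consistent spatial offset) would then be flagged as candidate copy-move pairs.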

  12. Fault diagnosis for diesel valve trains based on time frequency images

    NASA Astrophysics Data System (ADS)

    Wang, Chengdong; Zhang, Youyun; Zhong, Zhenyuan

    2008-11-01

    In this paper, the Wigner-Ville distributions (WVD) of vibration acceleration signals acquired from the cylinder head in eight different states of the valve train were calculated and displayed as grey-scale images, and probabilistic neural networks (PNN) were used to classify the time-frequency images directly after the images were normalized. In this way, the fault diagnosis of the valve train was transformed into the classification of time-frequency images. As there is no need to extract further fault features (such as eigenvalues or symptom parameters) from the time-frequency distributions before classification, the fault diagnosis process is greatly simplified. The experimental results show that the faults of diesel valve trains can be classified accurately by the proposed method.

  13. Observation of FeGe skyrmions by electron phase microscopy with hole-free phase plate

    NASA Astrophysics Data System (ADS)

    Kotani, Atsuhiro; Harada, Ken; Malac, Marek; Salomons, Mark; Hayashida, Misa; Mori, Shigeo

    2018-05-01

    We report application of hole-free phase plate (HFPP) to imaging of magnetic skyrmion lattices. Using HFPP imaging, we observed skyrmions in FeGe, and succeeded in obtaining phase contrast images that reflect the sample magnetization distribution. According to the Aharonov-Bohm effect, the electron phase is shifted by the magnetic flux due to sample magnetization. The differential processing of the intensity in a HFPP image allows us to successfully reconstruct the magnetization map of the skyrmion lattice. Furthermore, the calculated phase shift due to the magnetization of the thin film was consistent with that measured by electron holography experiment, which demonstrates that HFPP imaging can be utilized for analysis of magnetic fields and electrostatic potential distribution at the nanoscale.

  14. Three-dimensional characterization of pigment dispersion in dried paint films using focused ion beam-scanning electron microscopy.

    PubMed

    Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John

    2012-04-01

    The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual-beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging, providing high-quality serial cross-section images for 3D reconstruction. Application of the Euclidean distance map and ultimate eroded points image analysis methods can provide quantitative characterization of the 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective for characterizing the degree of pigment dispersion in dried paint films.
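    A minimal 2-D sketch of the Euclidean distance map / ultimate eroded points idea on synthetic binary "particles", using SciPy's distance transform; the window size and the use of local EDM maxima as ultimate eroded points are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def ultimate_eroded_points(binary, min_distance=3):
    """Euclidean distance map plus its local maxima: roughly one marker
    per convex particle, usable for counting and spacing statistics."""
    edm = ndimage.distance_transform_edt(binary)
    footprint = np.ones((2 * min_distance + 1,) * 2)
    local_max = (edm == ndimage.maximum_filter(edm, footprint=footprint)) & (edm > 0)
    return edm, local_max

# Two well-separated synthetic "pigment particles"
img = np.zeros((40, 40), dtype=bool)
img[5:15, 5:15] = True      # 10 x 10 particle
img[22:34, 22:34] = True    # 12 x 12 particle
edm, uep = ultimate_eroded_points(img)
n_particles = ndimage.label(uep)[1]   # expected: 2
```

    In the paper this analysis runs on the 3-D stack reconstructed from the serial sections; the same scipy calls generalize by passing 3-D arrays and footprints.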

  15. Optical granulometric analysis of sedimentary deposits by color segmentation-based software: OPTGRAN-CS

    NASA Astrophysics Data System (ADS)

    Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.

    2015-12-01

    The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks down to sand grains) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages with respect to classical ones: speed and detailed information content (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually by the same Rosiwal method applied by experts. The new algorithm has the same accuracy as a classical manual count, but the innovative methodology is much easier to apply and dramatically less time-consuming. The productivity of analyzing clast deposits from recorded field outcrop images can thus be increased significantly.
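    The Rosiwal intercept-counting step can be sketched on a toy labeled segmentation; the transect count and the synthetic labels are assumptions, and the paper's Markov segmentation stage is not reproduced:

```python
import numpy as np

def rosiwal_intercepts(labels, n_transects=8):
    """Measure chord (intercept) lengths of segmented clasts along evenly
    spaced horizontal transects (Rosiwal lineal analysis); label 0 is matrix."""
    h, w = labels.shape
    rows = np.linspace(0, h - 1, n_transects).astype(int)
    lengths = []
    for r in rows:
        line = labels[r]
        run_label, run_len = line[0], 1
        for v in line[1:]:
            if v == run_label:
                run_len += 1
            else:
                if run_label != 0:
                    lengths.append(run_len)
                run_label, run_len = v, 1
        if run_label != 0:
            lengths.append(run_len)
    return np.array(lengths)

# Synthetic segmentation: label 1 is a 20-px-wide clast, label 2 a 10-px clast
seg = np.zeros((50, 60), dtype=int)
seg[:, 5:25] = 1
seg[:, 40:50] = 2
chords = rosiwal_intercepts(seg)   # chord-length sample for the size histogram
```

    The chord-length sample is what feeds the grain size distribution, mirroring what an expert would tally manually along the same transects.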

  16. Deducing Electron Properties from Hard X-Ray Observations

    NASA Technical Reports Server (NTRS)

    Kontar, E. P.; Brown, J. C.; Emslie, A. G.; Hajdas, W.; Holman, G. D.; Hurford, G. J.; Kasparova, J.; Mallik, P. C. V.; Massone, A. M.; McConnell, M. L.; hide

    2011-01-01

    X-radiation from energetic electrons is the prime diagnostic of flare-accelerated electrons. The observed X-ray flux (and polarization state) is fundamentally a convolution of the cross-section for the hard X-ray emission process(es) in question with the electron distribution function, which is in turn a function of energy, direction, spatial location and time. To address the problems of particle propagation and acceleration, one needs to infer as much information as possible on this electron distribution function through a deconvolution of this fundamental relationship. This review presents recent progress toward this goal using spectroscopic, imaging and polarization measurements, primarily from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI). Previous conclusions regarding the energy, angular (pitch angle) and spatial distributions of energetic electrons in solar flares are critically reviewed. We discuss the role and the observational evidence of several radiation processes: free-free electron-ion, free-free electron-electron, free-bound electron-ion, photoelectric absorption and Compton backscatter (albedo), using both spectroscopic and imaging techniques. This unprecedented quality of data allows, for the first time, inference of the angular distributions of the X-ray-emitting electrons and improved model-independent inference of electron energy spectra and emission measures of thermal plasma. Moreover, imaging spectroscopy has revealed hitherto unknown details of solar flare morphology and detailed spectroscopy of coronal, footpoint and extended sources in flaring regions. Additional attempts to measure hard X-ray polarization were not sufficient to constrain the degree of anisotropy of the electrons, but they point to the importance of obtaining good-quality polarization data in the future.

  17. Correlation between Charge Contrast Imaging and the Distribution of Some Trace Level Impurities in Gibbsite

    NASA Astrophysics Data System (ADS)

    Baroni, Travis C.; Griffin, Brendan J.; Browne, James R.; Lincoln, Frank J.

    2000-01-01

    Charge contrast images (CCI) of synthetic gibbsite obtained on an environmental scanning electron microscope give information on the crystallization process. Furthermore, X-ray mapping of the same grains shows that impurities are localized during the initial stages of growth and that the resulting composition images have features similar to those observed in CCI. This suggests a possible correlation between impurity distributions and the emission detected during CCI. X-ray line profiles, simulating the spatial distribution of impurities derived from the Monte Carlo program CASINO, have been compared with experimental line profiles and give an estimate of the localization. The model suggests that a main impurity, Ca, is depleted from the solution within approximately 3-4 μm of growth.

  18. Preliminary investigation of foot pressure distribution variation in men and women adults while standing.

    PubMed

    Periyasamy, R; Mishra, A; Anand, Sneh; Ammini, A C

    2011-09-01

    Women and men are anatomically and physiologically different in a number of ways. They differ in both shape and size. These differences could potentially mean foot pressure distribution variation between men and women. The purpose of this study was to analyze standing foot pressure images to obtain a foot pressure distribution parameter, the power ratio (PR), and its variation between men and women using image processing in the frequency domain. Twenty-eight healthy adult subjects (14 men and 14 women) aged between 20 and 45 years were recruited for our study. Foot pressure distribution patterns while standing were obtained using a PedoPowerGraph plantar pressure measurement system for foot image formation, a digital camera for image capture, a TV tuner PC add-on card, WinDvr software for still capture, and Matlab software with dedicated image processing algorithms developed for this study. Various PedoPowerGraphic parameters such as percentage medial impulse (PMI), forefoot to hindfoot pressure distribution ratio (F/H), big toe to forefoot pressure distribution ratio (B/F) and power ratio (PR) were evaluated. In men, contact area was significantly larger in all regions of the foot compared with women. There were significant differences in plantar pressure distribution, but there was no significant difference in the F/H and B/F ratios. The mean PR value was significantly greater in men than women under the hindfoot and forefoot. The PMI value was greater in women than men. Compared to men, women have maximum PR variations in the midfoot. Hence there is a significant difference (p<0.05) in the medial midfoot and midfoot PR of women as compared to men. There was variation in plantar pressure distribution because the contact area of the men's feet was larger than that of the women's feet.
Hence, knowledge of the pressure distribution variation of both feet can provide suitable guidelines to biomedical engineers and doctors for designing orthotic devices for relieving areas of excessively high pressure. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Information Processing Concepts: A Cure for "Technofright." Information Processing in the Electronic Office. Part 1: Concepts.

    ERIC Educational Resources Information Center

    Popyk, Marilyn K.

    1986-01-01

    Discusses the new automated office and its six major technologies (data processing, word processing, graphics, image, voice, and networking), the information processing cycle (input, processing, output, distribution/communication, and storage and retrieval), ergonomics, and ways to expand office education classes (versus class instruction). (CT)

  20. Iterative optimizing quantization method for reconstructing three-dimensional images from a limited number of views

    DOEpatents

    Lee, Heung-Rae

    1997-01-01

    A three-dimensional image reconstruction method comprises treating the object of interest as a group of elements with a size that is determined by the resolution of the projection data, e.g., as determined by the size of each pixel. One of the projections is used as a reference projection. A fictitious object is arbitrarily defined that is constrained by such reference projection. The method modifies the known structure of the fictitious object by comparing and optimizing its four projections to those of the unknown structure of the real object and continues to iterate until the optimization is limited by the residual sum of background noise. The method is composed of several sub-processes that acquire four projections from the real data and the fictitious object: generate an arbitrary distribution to define the fictitious object, optimize the four projections, generate a new distribution for the fictitious object, and enhance the reconstructed image. The sub-process for the acquisition of the four projections from the input real data is simply the function of acquiring the four projections from the data of the transmitted intensity. The transmitted intensity represents the density distribution, that is, the distribution of absorption coefficients through the object.
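    In the spirit of the patent's iterate-and-compare scheme, the following is a hedged SIRT-style sketch that redistributes the residuals of four projections (rows, columns, and both diagonal directions) over a flat initial "fictitious object"; the relaxation factor, iteration count, and exact update rule are illustrative assumptions, not the patented procedure:

```python
import numpy as np

def four_projections(a):
    """Row sums, column sums, and sums along both diagonal directions."""
    n = a.shape[0]
    offs = range(-(n - 1), n)
    return [a.sum(axis=1), a.sum(axis=0),
            np.array([np.trace(a, k) for k in offs]),
            np.array([np.trace(np.fliplr(a), k) for k in offs])]

def reconstruct(projs, n, iters=500, relax=0.5):
    """Start from a flat fictitious object and repeatedly spread each
    projection's residual back along its rays, clipping negative values."""
    est = np.full((n, n), projs[0].sum() / n**2)
    i, j = np.indices((n, n))
    # Ray index of each pixel for the four projection directions
    ray = [i, j, (j - i) + (n - 1), 2 * (n - 1) - i - j]
    for _ in range(iters):
        for p, m in zip(projs, ray):
            counts = np.bincount(m.ravel(), minlength=len(p))
            cur = np.bincount(m.ravel(), weights=est.ravel(), minlength=len(p))
            est += relax * ((p - cur) / counts)[m]
            np.clip(est, 0.0, None, out=est)   # density cannot be negative
    return est

phantom = np.zeros((8, 8))
phantom[2:6, 3:7] = 1.0
recon = reconstruct(four_projections(phantom), 8)
```

    Iteration stops in practice when the projection residuals fall to the level of the background noise, which is the stopping criterion the patent describes.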

  1. Simulation of speckle patterns with pre-defined correlation distributions.

    PubMed

    Song, Lipei; Zhou, Zhen; Wang, Xueyan; Zhao, Xing; Elson, Daniel S

    2016-03-01

    We put forward a method to easily generate a single or a sequence of fully developed speckle patterns with pre-defined correlation distribution by utilizing the principle of coherent imaging. The few-to-one mapping between the input correlation matrix and the correlation distribution between simulated speckle patterns is realized and there is a simple square relationship between the values of these two correlation coefficient sets. This method is demonstrated both theoretically and experimentally. The square relationship enables easy conversion from any desired correlation distribution. Since the input correlation distribution can be defined by a digital matrix or a gray-scale image acquired experimentally, this method provides a convenient way to simulate real speckle-related experiments and to evaluate data processing techniques.
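    A hedged sketch of the underlying principle: fully developed speckle generated from a random-phase aperture via an FFT, and the square relationship between field and intensity correlations (the mixing coefficient, aperture size, and grid size are arbitrary choices, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(1)

def speckle_field(n):
    """Fully developed speckle field: FFT of a random-phase aperture,
    following the coherent-imaging principle mentioned in the abstract."""
    y, x = np.indices((n, n)) - n // 2
    pupil = np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
    pupil[x**2 + y**2 > (n // 4)**2] = 0.0   # finite aperture -> speckle grains
    return np.fft.fft2(pupil)

def mixed_field(base, alpha, n):
    """Field correlated with `base` at level alpha; for circular-Gaussian
    speckle the *intensity* correlation is then alpha**2 (the 'square
    relationship' reported in the paper)."""
    return alpha * base + np.sqrt(1 - alpha**2) * speckle_field(n)

n, alpha = 128, 0.8
E0 = speckle_field(n)
I0, I1 = np.abs(E0)**2, np.abs(mixed_field(E0, alpha, n))**2
c = np.corrcoef(I0.ravel(), I1.ravel())[0, 1]   # close to alpha**2 = 0.64
```

    Sweeping alpha according to a desired correlation matrix yields a sequence of speckle patterns whose pairwise intensity correlations follow the squares of the prescribed field correlations.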

  2. Simulation of speckle patterns with pre-defined correlation distributions

    PubMed Central

    Song, Lipei; Zhou, Zhen; Wang, Xueyan; Zhao, Xing; Elson, Daniel S.

    2016-01-01

    We put forward a method to easily generate a single or a sequence of fully developed speckle patterns with pre-defined correlation distribution by utilizing the principle of coherent imaging. The few-to-one mapping between the input correlation matrix and the correlation distribution between simulated speckle patterns is realized and there is a simple square relationship between the values of these two correlation coefficient sets. This method is demonstrated both theoretically and experimentally. The square relationship enables easy conversion from any desired correlation distribution. Since the input correlation distribution can be defined by a digital matrix or a gray-scale image acquired experimentally, this method provides a convenient way to simulate real speckle-related experiments and to evaluate data processing techniques. PMID:27231589

  3. Two-dimensional thermography image retrieval from zig-zag scanned data with TZ-SCAN

    NASA Astrophysics Data System (ADS)

    Okumura, Hiroshi; Yamasaki, Ryohei; Arai, Kohei

    2008-10-01

    TZ-SCAN is a simple and low-cost thermal imaging device that consists of a single-point radiation thermometer on a tripod with a pan-tilt rotator, a DC motor controller board with a USB interface, and a laptop computer for rotator control, data acquisition, and data processing. TZ-SCAN acquires a series of zig-zag scanned data and stores the data as a CSV file. A 2-D thermal distribution image can be retrieved by using the second quefrency peak calculated from the TZ-SCAN data. An experiment was conducted to confirm the validity of the thermal retrieval algorithm. The experimental result shows sufficient accuracy for 2-D thermal distribution image retrieval.
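    The quefrency idea can be illustrated on a simulated raster stream in which adjacent 100-sample lines are strongly correlated: the cepstral peak then falls at the line length, after which the 1-D stream is reshaped into a 2-D map. The AR-style scene model and the peak-search window are assumptions, and the zig-zag un-reversal of return strokes is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(2)

def period_by_cepstrum(signal, qmin=10):
    """Locate the dominant quefrency peak of the real cepstrum; for a
    line-scanned stream this is the number of samples per scan line."""
    log_spec = np.log(np.abs(np.fft.rfft(signal - signal.mean())) + 1e-9)
    ceps = np.fft.irfft(log_spec)
    half = len(ceps) // 2
    return qmin + int(np.argmax(ceps[qmin:half]))

# Simulated line-scan stream: each 100-sample line resembles the previous one
line = rng.standard_normal(100)
lines = [line]
for _ in range(29):
    line = 0.9 * line + 0.44 * rng.standard_normal(100)
    lines.append(line)
data = np.concatenate(lines)

period = period_by_cepstrum(data)     # expected: 100 samples per line
image = data.reshape(-1, period)      # 30 x 100 thermal map
```

    In the actual device the reshaped rows corresponding to return strokes would additionally be flipped to undo the zig-zag.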

  4. New Processing of Spaceborne Imaging Radar-C (SIR-C) Data

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Gracheva, V.; Arko, S. A.; Labelle-Hamer, A. L.

    2017-12-01

    The Spaceborne Imaging Radar-C (SIR-C) was a radar system that successfully operated on two separate shuttle missions in April and October 1994. During these two missions, a total of 143 hours of radar data were recorded. SIR-C was the first multifrequency and polarimetric spaceborne radar system, operating at dual frequency (L- and C-band) and with quad polarization. SIR-C had a variety of different operating modes, which are innovative even from today's point of view. Depending on the mode, it was possible to acquire data with different polarization and carrier frequency combinations. Additionally, different swaths and bandwidths could be used during data collection, and it was possible to receive data with two antennas in the along-track direction. The United States Geological Survey (USGS) distributes the synthetic aperture radar (SAR) images as single-look complex (SLC) and multi-look complex (MLC) products. Unfortunately, since June 2005 the SIR-C processor has been inoperable and not repairable. All acquired SLC and MLC images were processed at a coarse resolution of 100 m, with the goal of generating quick looks. These images are, however, not well suited for scientific analysis. Only a small percentage of the acquired data has been processed into full-resolution SAR images, and the unprocessed high-resolution data cannot currently be processed at all. At the Alaska Satellite Facility (ASF), a new processor was developed to process binary SIR-C data into full-resolution SAR images. ASF is planning to process the entire recoverable SIR-C archive to full-resolution SLC, MLC and high-resolution geocoded image products. ASF will make these products available to the science community through its existing data archiving and distribution system. The final paper will describe the new processor and analyze the challenges of reprocessing the SIR-C data.

  5. Distributed Fusion in Sensor Networks with Information Genealogy

    DTIC Science & Technology

    2011-06-28

    image processing [2], acoustic and speech recognition [3], multitarget tracking [4], distributed fusion [5], and Bayesian inference [6-7]. [...] used in speech recognition and other classification applications [8], but their use in underwater mine classification is limited. In this paper, we

  6. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over the image processing and the display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though it was originally designed for the Spitzer Space Telescope mission, many of its functions are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities.
The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.

  7. Remote-Sensing Data Distribution and Processing in the Cloud at the ASF DAAC

    NASA Astrophysics Data System (ADS)

    Stoner, C.; Arko, S. A.; Nicoll, J. B.; Labelle-Hamer, A. L.

    2016-12-01

    The Alaska Satellite Facility (ASF) Distributed Active Archive Center (DAAC) has been tasked to archive and distribute data from both SENTINEL-1 satellites and from the NASA-ISRO Synthetic Aperture Radar (NISAR) satellite in a cost-effective manner. In order to best support processing and distribution of these large data sets for users, the ASF DAAC enhanced its data system in a number of ways that will be detailed in this presentation. The SENTINEL-1 mission comprises a constellation of two polar-orbiting satellites performing C-band Synthetic Aperture Radar (SAR) imaging day and night, enabling them to acquire imagery regardless of the weather. SENTINEL-1A was launched by the European Space Agency (ESA) in April 2014. SENTINEL-1B is scheduled to launch in April 2016. The NISAR satellite is designed to observe and take measurements of some of the planet's most complex processes, including ecosystem disturbances, ice-sheet collapse, and natural hazards such as earthquakes, tsunamis, volcanoes and landslides. NISAR will employ radar imaging, polarimetry, and interferometry techniques using the SweepSAR technology for full-resolution wide-swath imaging. NISAR data files are large, making storage and processing a challenge for conventional store-and-download systems. To effectively process, store, and distribute petabytes of data in a high-performance computing environment, ASF took a long view with regard to technology choices and picked a path of maximum flexibility and software re-use. To that end, this presentation on software tools and services will cover Web Object Storage (WOS) and the ability to move seamlessly from local sunk-cost hardware to a public cloud such as Amazon Web Services (AWS). A prototype of the SENTINEL-1A system in AWS, as well as a local hardware solution, will be examined to explain the pros and cons of each.
In preparation for NISAR files, which will be even larger than those of SENTINEL-1A, ASF has embarked on a number of cloud initiatives, including processing in the cloud at scale, processing data on demand, and processing end-user computations on DAAC data in the cloud.

  8. Multistage Spatial Property Based Segmentation for Quantification of Fluorescence Distribution in Cells

    NASA Astrophysics Data System (ADS)

    Zhang, Guangyun; Jia, Xiuping; Pham, Tuan D.; Crane, Denis I.

    2010-01-01

    The interpretation of the distribution of fluorescence in cells is often by simple visualization of microscope-derived images for qualitative studies. In other cases, however, it is desirable to be able to quantify the distribution of fluorescence using digital image processing techniques. In this paper, the challenges of fluorescence segmentation due to the noise present in the data are addressed. We report that intensity measurements alone do not allow separation of overlapping data between target and background. Consequently, spatial properties derived from the neighborhood profile were included. Mathematical morphology operations were implemented for cell boundary extraction, and a window-based contrast measure was developed for identification of fluorescence puncta. All of these operations were applied in the proposed multistage processing scheme. The testing results show that the spatial measures effectively enhance target separability.
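    The abstract does not define the window-based contrast measure; one plausible sketch uses (local max - local mean) / local mean on a synthetic cell image with two bright puncta, where the window size and threshold are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def local_contrast(img, win=7):
    """Window-based contrast: the excess of the local maximum over the local
    mean, relative to the local mean; puncta stand out against background."""
    img = img.astype(float)
    mean = ndimage.uniform_filter(img, win)
    peak = ndimage.maximum_filter(img, win)
    return (peak - mean) / (mean + 1e-6)

# Synthetic cell: dim noisy background with two bright puncta
rng = np.random.default_rng(3)
cell = 10 + rng.normal(0, 0.5, (64, 64))
cell[20, 20] += 50
cell[40, 45] += 50
contrast = local_contrast(cell)
puncta = contrast > 2.0
n_puncta = ndimage.label(puncta)[1]   # expected: 2
```

    An intensity threshold alone would miss dim puncta on a bright background; the contrast measure normalizes by the neighborhood, which is the point of adding spatial properties.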

  9. Analyzing Protein Clusters on the Plasma Membrane: Application of Spatial Statistical Analysis Methods on Super-Resolution Microscopy Images.

    PubMed

    Paparelli, Laura; Corthout, Nikky; Pavie, Benjamin; Annaert, Wim; Munck, Sebastian

    2016-01-01

    The spatial distribution of proteins within the cell affects their capability to interact with other molecules and directly influences cellular processes and signaling. At the plasma membrane, multiple factors drive protein compartmentalization into specialized functional domains, leading to the formation of clusters in which intermolecular interactions are facilitated. Therefore, quantifying protein distributions is a necessity for understanding their regulation and function. The recent advent of super-resolution microscopy has opened up the possibility of imaging protein distributions at the nanometer scale. In parallel, new spatial analysis methods have been developed to quantify distribution patterns in super-resolution images. In this chapter, we provide an overview of super-resolution microscopy and summarize the factors influencing protein arrangements on the plasma membrane. Finally, we highlight methods for analyzing clusterization of plasma membrane proteins, including examples of their applications.
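    As one example of the spatial statistics used for such cluster analysis (the chapter's specific methods are not listed in the abstract), a Clark-Evans nearest-neighbour ratio distinguishes a clustered point pattern from complete spatial randomness; all point patterns below are synthetic:

```python
import numpy as np

def mean_nn_distance(points):
    """Mean nearest-neighbour distance of a 2-D point pattern."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1).mean()

def clark_evans(points, area):
    """Clark-Evans ratio R: R << 1 indicates clustering, R near 1 randomness."""
    expected = 0.5 / np.sqrt(len(points) / area)   # CSR expectation
    return mean_nn_distance(points) / expected

rng = np.random.default_rng(4)
csr = rng.uniform(0, 1, (400, 2))                      # random pattern
centers = rng.uniform(0.1, 0.9, (8, 2))
clustered = (centers[rng.integers(0, 8, 400)]
             + rng.normal(0, 0.01, (400, 2)))          # tight clusters
R_csr = clark_evans(csr, 1.0)        # near 1
R_clu = clark_evans(clustered, 1.0)  # much less than 1
```

    On real single-molecule localization data the "points" would be the fitted molecule coordinates, and edge corrections would typically be applied.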

  10. Fluorescence Imaging Reveals Surface Contamination

    NASA Technical Reports Server (NTRS)

    Schirato, Richard; Polichar, Raulf

    1992-01-01

    In this technique to detect surface contamination, the object being inspected is illuminated by ultraviolet light to make contaminants fluoresce, and a low-light-level video camera views the fluorescence. Image-processing techniques quantify the distribution of contaminants. If the fluorescence of a material expected to contaminate the surface is not intense, the material is tagged with a low concentration of dye.

  11. Planning applications in image analysis

    NASA Technical Reports Server (NTRS)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  12. Statistical distributions of ultra-low dose CT sinograms and their fundamental limits

    NASA Astrophysics Data System (ADS)

    Lee, Tzu-Cheng; Zhang, Ruoqiao; Alessio, Adam M.; Fu, Lin; De Man, Bruno; Kinahan, Paul E.

    2017-03-01

    Low-dose CT imaging is typically constrained to be diagnostic. However, there are applications for even lower-dose CT imaging, including image registration across multi-frame CT images and attenuation correction for PET/CT imaging. We define this as the ultra-low-dose (ULD) CT regime, where the exposure level is a factor of 10 lower than current low-dose CT technique levels. In the ULD regime it is possible to use statistically principled image reconstruction methods that make full use of the raw data information. Since most statistical iterative reconstruction methods are based on the assumption that the post-log noise distribution is close to Poisson or Gaussian, our goal is to understand the statistical distribution of ULD CT data with different non-positivity correction methods, and to understand when iterative reconstruction methods may be effective in producing images that are useful for image registration or attenuation correction in PET/CT imaging. We first used phantom measurements and calibrated simulation to reveal how the noise distribution deviates from the normal assumption in the ULD CT flux environment. In summary, our results indicate that there are three general regimes: (1) diagnostic CT, where post-log data are well modeled by a normal distribution; (2) low-dose CT, where the normal distribution remains a reasonable approximation and statistically principled (post-log) methods that assume a normal distribution have an advantage; and (3) a ULD regime that is photon-starved, where the quadratic approximation is no longer effective. For instance, a total integral density of 4.8 (ideal pi for 24 cm of water) for a 120 kVp, 0.5 mAs radiation source is the maximum pi value at which a definitive maximum likelihood value could be found. This leads to fundamental limits in the estimation of ULD CT data when using a standard data processing stream.
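    The deviation of post-log data from normality at ULD flux can be illustrated with a small Monte Carlo sketch; the flux levels, the clip-to-one non-positivity correction, and the skewness criterion are illustrative assumptions, not the paper's calibrated simulation:

```python
import numpy as np

rng = np.random.default_rng(5)

def post_log_samples(i0, line_integral, n=200_000):
    """Simulated post-log CT data: Poisson counts at incident flux i0 through
    a path with the given attenuation line integral; non-positive counts are
    clipped to 1 before the log (a simple non-positivity correction)."""
    counts = rng.poisson(i0 * np.exp(-line_integral), n)
    return np.log(i0 / np.maximum(counts, 1))

def skewness(x):
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# Same path (line integral 2.0), two flux regimes
sk_hi = skewness(post_log_samples(1e5, 2.0))   # ~13500 counts: near-Gaussian
sk_ulod = skewness(post_log_samples(50, 2.0))  # ~7 counts: strongly skewed
```

    At diagnostic flux the post-log samples are nearly symmetric, while in the photon-starved regime the clipped log transform produces a heavily skewed distribution, which is why a Gaussian post-log model breaks down there.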

  13. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system, which consists of a language translator that simplifies execution of image processing programs and provides a general-purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs, is described.
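    As a sketch of the simplest of the subjective enhancements mentioned, here is a percentile-based linear contrast stretch; the percentile limits and the synthetic band are arbitrary choices, not the VICAR implementation:

```python
import numpy as np

def contrast_stretch(band, lo_pct=2, hi_pct=98):
    """Linear percentile stretch: map the central range of the histogram
    onto the full 8-bit display range, clipping the tails."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    out = (band.astype(float) - lo) / (hi - lo)
    return (255 * np.clip(out, 0, 1)).astype(np.uint8)

rng = np.random.default_rng(6)
band = rng.normal(60, 10, (128, 128))   # low-contrast synthetic "band"
enh = contrast_stretch(band)            # full 0-255 display range
```

    Real Landsat bands cover only a fraction of the sensor's dynamic range, so this kind of stretch is usually the first step before visual interpretation.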

  14. Intelligent distributed medical image management

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.

    1995-05-01

    The rapid advancements in high performance global communication have accelerated cooperative image-based medical services to a new frontier. Traditional image-based medical services such as radiology and diagnostic consultation can now fully utilize multimedia technologies in order to provide novel services, including remote cooperative medical triage, distributed virtual simulation of operations, as well as cross-country collaborative medical research and training. Fast (efficient) and easy (flexible) retrieval of relevant images remains a critical requirement for the provision of remote medical services. This paper describes the database system requirements, identifies technological building blocks for meeting the requirements, and presents a system architecture for our target image database system, MISSION-DBS, which has been designed to fulfill the goals of Project MISSION (medical imaging support via satellite integrated optical network) -- an experimental high performance gigabit satellite communication network with access to remote supercomputing power, medical image databases, and 3D visualization capabilities in addition to medical expertise anywhere and anytime around the country. The MISSION-DBS design employs a synergistic fusion of techniques in distributed databases (DDB) and artificial intelligence (AI) for storing, migrating, accessing, and exploring images. The efficient storage and retrieval of voluminous image information is achieved by integrating DDB modeling and AI techniques for image processing while the flexible retrieval mechanisms are accomplished by combining attribute-based and content-based retrievals.

  15. LA-iMageS: a software for elemental distribution bioimaging using LA-ICP-MS data.

    PubMed

    López-Fernández, Hugo; de S Pessôa, Gustavo; Arruda, Marco A Z; Capelo-Martínez, José L; Fdez-Riverola, Florentino; Glez-Peña, Daniel; Reboiro-Jato, Miguel

    2016-01-01

    The spatial distribution of chemical elements in different types of samples is an important field in several research areas such as biology, paleontology or biomedicine, among others. Elemental distribution imaging by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is an effective technique for qualitative and quantitative imaging due to its high spatial resolution and sensitivity. By applying this technique, vast amounts of raw data are generated to obtain high-quality images, essentially making the use of specific LA-ICP-MS imaging software that can process such data absolutely mandatory. Since existing solutions are usually commercial or hard-to-use for average users, this work introduces LA-iMageS, an open-source, free-to-use multiplatform application for fast and automatic generation of high-quality elemental distribution bioimages from LA-ICP-MS data in the PerkinElmer Elan XL format, whose results can be directly exported to external applications for further analysis. A key strength of LA-iMageS is its substantial added value for users, with particular regard to the customization of the elemental distribution bioimages, which allows, among other features, the ability to change color maps, increase image resolution or toggle between 2D and 3D visualizations.

  16. Study of temperature distributions in wafer exposure process

    NASA Astrophysics Data System (ADS)

    Lin, Zone-Ching; Wu, Wen-Jang

    During the exposure process of photolithography, the wafer absorbs the exposure energy, which results in rising temperature and thermal expansion. This phenomenon was often neglected due to its limited effect in previous process generations. However, in the new generation of processes, it may very likely become a factor to be considered. In this paper, a finite element model for analyzing the transient distribution of wafer temperature during exposure was established under the assumption that the wafer was clamped by a vacuum chuck without warpage. The model is capable of simulating the distribution of wafer temperature under different exposure conditions. The analysis proceeds from the simulation of transient behavior in a single exposure region to the variation of exposure energy, the interval between exposure locations, and the interval of exposure time under continuous exposure, in order to investigate the distribution of wafer temperature. The simulation results indicate that widening the interval between exposure locations improves the distribution of wafer temperature more than extending the interval of exposure time between neighboring image fields. Moreover, as long as the distance between the field centers of two neighboring exposure regions exceeds a straight-line distance of three image-field widths, the interacting thermal effect during wafer exposure can be ignored. The analysis flow proposed in this paper can serve as a supporting reference tool for engineers in planning exposure paths.

  17. Detection and imaging of moving objects with SAR by a joint space-time-frequency processing

    NASA Astrophysics Data System (ADS)

    Barbarossa, Sergio; Farina, Alfonso

    This paper proposes a joint space-time-frequency processing scheme for the detection and imaging of moving targets by Synthetic Aperture Radar (SAR). The method is based on the availability of an array antenna. The signals received by the array elements are combined in a space-time processor to cancel the clutter. They are then analyzed in the time-frequency domain by computing their Wigner-Ville Distribution (WVD) in order to estimate the instantaneous frequency, which is used for the subsequent phase compensation necessary to produce a high-resolution image.
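
    The WVD-based instantaneous-frequency estimation step can be sketched for a linear-FM signal; this is a generic discrete WVD with assumed signal parameters, not the paper's array processing chain:

```python
import numpy as np

def wigner_ville(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.
    Entry [k, n] is frequency bin k (k/(2N) cycles/sample) at time n."""
    N = len(x)
    W = np.empty((N, N))
    for n in range(N):
        tmax = min(n, N - 1 - n)
        tau = np.arange(-tmax, tmax + 1)
        acf = np.zeros(N, dtype=complex)          # instantaneous autocorrelation
        acf[tau % N] = x[n + tau] * np.conj(x[n - tau])
        W[:, n] = np.fft.fft(acf).real
    return W

# Linear-FM test signal: instantaneous frequency f(n) = f0 + a*n cycles/sample.
N, f0, a = 256, 0.05, 0.3 / 256
n = np.arange(N)
chirp = np.exp(2j * np.pi * (f0 * n + 0.5 * a * n**2))

W = wigner_ville(chirp)
f_est = np.argmax(W, axis=0) / (2.0 * N)   # ridge of the WVD = IF estimate
f_true = f0 + a * n
```

    The ridge of the WVD tracks the instantaneous frequency, which is the quantity used for the phase compensation step.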

  18. 3-D readout-electronics packaging for high-bandwidth massively paralleled imager

    DOEpatents

    Kwiatkowski, Kris; Lyke, James

    2007-12-18

    Dense, massively parallel signal processing electronics are co-packaged behind associated sensor pixels. Microchips containing a linear or bilinear arrangement of photo-sensors, together with associated complex electronics, are integrated into a simple 3-D structure (a "mirror cube"). An array of photo-sensitive cells is disposed on a stacked CMOS chip's surface at a 45.degree. angle from light-reflecting mirror surfaces formed on a neighboring CMOS chip surface. Image processing electronics are held within the stacked CMOS chip layers. Electrical connections couple each of said stacked CMOS chip layers and a distribution grid, the connections distributing power and signals to components associated with each stacked CMOS chip layer.

  19. A distributed pipeline for DIDSON data processing

    USGS Publications Warehouse

    Li, Liling; Danner, Tyler; Eickholt, Jesse; McCann, Erin L.; Pangle, Kevin; Johnson, Nicholas

    2018-01-01

    Technological advances in the field of ecology allow data on ecological systems to be collected at high resolution, both temporally and spatially. Devices such as Dual-frequency Identification Sonar (DIDSON) can be deployed in aquatic environments for extended periods and easily generate several terabytes of underwater surveillance data which may need to be processed multiple times. Due to the large amount of data generated and need for flexibility in processing, a distributed pipeline was constructed for DIDSON data making use of the Hadoop ecosystem. The pipeline is capable of ingesting raw DIDSON data, transforming the acoustic data to images, filtering the images, detecting and extracting motion, and generating feature data for machine learning and classification. All of the tasks in the pipeline can be run in parallel and the framework allows for custom processing. Applications of the pipeline include monitoring migration times, determining the presence of a particular species, estimating population size and other fishery management tasks.

  20. Digital image processing of Seabeam bathymetric data for structural studies of seamounts near the East Pacific Rise

    NASA Technical Reports Server (NTRS)

    Edwards, M. H.; Arvidson, R. E.; Guinness, E. A.

    1984-01-01

    The problem of displaying information on seafloor morphology is addressed by utilizing digital image processing techniques to generate images from Seabeam data covering three young seamounts on the eastern flank of the East Pacific Rise. Errors in locations between crossing tracks are corrected by interactively identifying features and translating tracks relative to a control track. Spatial interpolation techniques using moving averages are used to interpolate between gridded depth values to produce images in shaded-relief and color-coded forms. The digitally processed images clarify the structural control on seamount growth and clearly show the lateral extent of volcanic materials, including the distribution and fault control of subsidiary volcanic constructional features. The image presentations also clearly show artifacts related both to residual navigational errors and to depth or location differences that depend on ship heading relative to slope orientation in regions with steep slopes.
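
    The moving-average interpolation step can be sketched as an iterative window-mean fill of gaps in a gridded depth field; the 3x3 window and the uniform test grid are assumptions for illustration:

```python
import numpy as np

def fill_gaps(depth):
    """Fill NaN grid cells with the mean of valid neighbors in a 3x3
    window, repeating passes until the grid is complete."""
    depth = depth.copy()
    while np.isnan(depth).any():
        nan_r, nan_c = np.where(np.isnan(depth))
        for r, c in zip(nan_r, nan_c):
            win = depth[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if not np.isnan(win).all():
                depth[r, c] = np.nanmean(win)   # filled cells seed later passes
    return depth

grid = np.full((8, 8), 4000.0)        # metres; uniform seafloor for the sketch
grid[2:5, 3:6] = np.nan               # a hole between survey tracks
filled = fill_gaps(grid)
```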

  1. Analysis of electroluminescence images in small-area circular CdTe solar cells

    NASA Astrophysics Data System (ADS)

    Bokalič, Matevž; Raguse, John; Sites, James R.; Topič, Marko

    2013-09-01

    The electroluminescence (EL) imaging process of small area solar cells is investigated in detail to expose optical and electrical effects that influence image acquisition and corrupt the acquired image. An approach to correct the measured EL images and to extract the exact EL radiation as emitted from the photovoltaic device is presented. EL images of circular cadmium telluride (CdTe) solar cells are obtained under different conditions. The power-law relationship between forward injection current and EL emission and a negative temperature coefficient of EL radiation are observed. The distributed Simulation Program with Integrated Circuit Emphasis (SPICE®) model of the circular CdTe solar cell is used to simulate the dark J-V curve and current distribution under the conditions used during EL measurements. Simulation results are presented as circularly averaged EL intensity profiles, which clearly show that the ratio between resistive parameters determines the current distribution in thin-film solar cells. The exact resistance values for front and back contact layers and for CdTe bulk layer are determined at different temperatures, and a negative temperature coefficient for the CdTe bulk resistance is observed.
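
    The reported power-law dependence of EL emission on injection current can be recovered by a log-log linear fit; a minimal sketch with synthetic, noise-free data (the exponent 1.4 and prefactor 3.0 are assumptions for illustration):

```python
import numpy as np

# Power law EL ∝ J^m: linear in log-log coordinates, so the exponent m
# is the slope of a straight-line fit to (log J, log EL).
J = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # injection current, illustrative units
EL = 3.0 * J**1.4                            # synthetic EL signal, no noise
m, log_c = np.polyfit(np.log(J), np.log(EL), 1)
```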

  2. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    PubMed

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
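
    A minimal sketch of one of the measures, percent leaf area covered by lesions, using a fixed gray-value threshold on a synthetic leaf image; the published macro relies on ImageJ's segmentation, and the threshold of 120 here is an assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
leaf = rng.integers(160, 220, size=(100, 100)).astype(np.uint8)  # healthy tissue
leaf[20:40, 30:70] = rng.integers(40, 100, size=(20, 40))        # dark lesion patch

lesion_mask = leaf < 120                     # lesions are darker than tissue
pct_lesion_area = 100.0 * lesion_mask.mean() # percent leaf area covered
```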

  3. Surface rupture and slip distribution of the 2016 Mw7.8 Kaikoura earthquake (New Zealand) from optical satellite image correlation using MicMac

    NASA Astrophysics Data System (ADS)

    Champenois, Johann; Klinger, Yann; Grandin, Raphaël; Satriano, Claudio; Baize, Stéphane; Delorme, Arthur; Scotti, Oona

    2017-04-01

    Remote sensing techniques such as optical satellite image correlation are very efficient methods to localize and quantify surface displacements due to earthquakes. In this study, we use the French sub-pixel correlator MicMac (Multi Images Correspondances par Méthodes Automatiques de Corrélation). This free, open-source software, developed by IGN, was recently adapted to process satellite images. The correlator uses regularization, which provides good results, especially in the near-fault area, with a high spatial resolution. We use a co-seismic pair of ortho-images to measure the horizontal displacement field of the recent 2016 Mw7.8 Kaikoura earthquake. Optical satellite images from different satellites (Sentinel-2A, Landsat8, etc.) are processed to present a dense map of the surface ruptures and to analyze the high-density slip distribution along all major ruptures. We also provide a detailed pattern of deformation along these main surface ruptures. Moreover, the 2D displacement from optical correlation is compared with co-seismic measurements from GPS, static displacements from accelerometric records, geodetic marks and field investigations. Last but not least, we investigate the reconstruction of the 3D displacement by combining InSAR, GPS and optical data.

  4. Fixed-Cell Imaging of Schizosaccharomyces pombe.

    PubMed

    Hagan, Iain M; Bagley, Steven

    2016-07-01

    The acknowledged genetic malleability of fission yeast has been matched by impressive cytology to drive major advances in our understanding of basic molecular cell biological processes. In many of the more recent studies, traditional approaches of fixation followed by processing to accommodate classical staining procedures have been superseded by live-cell imaging approaches that monitor the distribution of fusion proteins between a molecule of interest and a fluorescent protein. Although such live-cell imaging is uniquely informative for many questions, fixed-cell imaging remains the better option for others and is an important-sometimes critical-complement to the analysis of fluorescent fusion proteins by live-cell imaging. Here, we discuss the merits of fixed- and live-cell imaging as well as specific issues for fluorescence microscopy imaging of fission yeast. © 2016 Cold Spring Harbor Laboratory Press.

  5. Automatic rocks detection and classification on high resolution images of planetary surfaces

    NASA Astrophysics Data System (ADS)

    Aboudan, A.; Pacifici, A.; Murana, A.; Cannarsa, F.; Ori, G. G.; Dell'Arciprete, I.; Allemand, P.; Grandjean, P.; Portigliotti, S.; Marcer, A.; Lorenzoni, L.

    2013-12-01

    High-resolution images can be used to obtain rock locations and sizes on planetary surfaces. In particular, the rock size-frequency distribution is a key parameter to evaluate surface roughness, to investigate the geologic processes that formed the surface and to assess the hazards related to spacecraft landing. The manual search for rocks on high-resolution images (even for small areas) can be very labor-intensive. An automatic or semi-automatic algorithm to identify rocks is mandatory to enable further processing such as determining the presence, size, height (by means of shadows) and spatial distribution of rocks over an area of interest. Accurate localization of rock and shadow contours is the key step for rock detection. An approach to contour detection based on morphological operators and statistical thresholding is presented in this work. The identified contours are then fitted using a proper geometric model of the rocks or shadows and used to estimate salient rock parameters (position, size, area, height). The performance of this approach has been evaluated both on images of a Martian analogue area of the Morocco desert and on HiRISE images. Results have been compared with ground truth obtained by means of manual rock mapping and prove the effectiveness of the algorithm. The rock abundance and rock size-frequency distributions derived from selected HiRISE images have been compared with the results of similar analyses performed for the landing site certification of Mars landers (Viking, Pathfinder, MER, MSL) and with the available thermal data from IRTM and TES.
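
    The statistical-thresholding step can be sketched as follows: pixels darker than mean - k*std are flagged as shadow, and an equivalent rock diameter is estimated from the shadow area. This is a toy illustration with a synthetic scene; k = 2 is an assumed tuning constant, and the actual algorithm also applies morphological operators and contour fitting:

```python
import numpy as np

rng = np.random.default_rng(3)
terrain = rng.normal(128, 5, size=(200, 200))     # bland background brightness
rr, cc = np.ogrid[:200, :200]
shadow = (rr - 100)**2 + (cc - 100)**2 <= 10**2   # circular shadow, radius 10 px
terrain[shadow] = 60                              # shadows are much darker

k = 2.0
mask = terrain < terrain.mean() - k * terrain.std()   # statistical threshold
diameter_px = 2.0 * np.sqrt(mask.sum() / np.pi)       # equivalent circular diameter
```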

  6. Mapping three-dimensional oil distribution with π-EPI MRI measurements at low magnetic field

    NASA Astrophysics Data System (ADS)

    Li, Ming; Xiao, Dan; Romero-Zerón, Laura; Marica, Florea; MacMillan, Bryce; Balcom, Bruce J.

    2016-08-01

    Magnetic resonance imaging (MRI) is a robust tool to image oil saturation distribution in rock cores during oil displacement processes. However, the lengthy measurement time for 3D measurements at low magnetic field can hinder monitoring of the displacement. 1D and 2D MRI measurements are instead often undertaken to monitor the oil displacement since they are faster. However, 1D and 2D images may not completely reflect the oil distribution in heterogeneous rock cores. In this work, a high-speed 3D MRI technique, π Echo Planar Imaging (π-EPI), was employed at 0.2 T to monitor oil displacement. Centric-scan interleaved sampling with view sharing in k-t space was employed to improve the temporal resolution of the π-EPI measurements. A D2O brine was employed to distinguish the hydrocarbon and water phases. A relatively homogeneous glass bead pack and a heterogeneous Spynie core plug were employed to show different oil displacement behaviors. High-quality 3D images were acquired with π-EPI MRI measurements. Fluid quantification with π-EPI compared favorably with FID, CPMG, 1D-DHK-SPRITE, 3D Fast Spin Echo (FSE) and 3D Conical SPRITE measurements. π-EPI greatly reduced the gradient duty cycle and improved sensitivity compared to FSE and Conical SPRITE measurements, enabling dynamic monitoring of oil displacement processes. For core plug samples with sufficiently long-lived T2 and T2*, π-EPI is an ideal method for rapid 3D saturation imaging.

  7. The infrared video image pseudocolor processing system

    NASA Astrophysics Data System (ADS)

    Zhu, Yong; Zhang, JiangLing

    2003-11-01

    The infrared video image pseudo-color processing system, with emphasis on the algorithm and its implementation for measuring an object's 2D temperature distribution using pseudo-color technology, is introduced in this paper. The data of a measured object's thermal image are the objective presentation of its surface temperature distribution, but color has a close relationship with people's subjective cognition. The so-called pseudo-color technology bridges the gap between subjectivity and objectivity, and represents the measured object's temperature distribution directly and intelligibly. The pseudo-color algorithm is based on distance in IHS space, from which the definition of pseudo-color visual resolution is put forward. The software (which realizes the map from the sample data to the color space) and the hardware (which carries out the conversion from the color space to the palette in HDL) cooperate, so two levels of mapping, a logic map and a physical map, are presented. The system has been widely used in failure diagnosis of electric power devices, fire protection for lifesaving, and even SARS detection in China recently.

  8. Random-access scanning microscopy for 3D imaging in awake behaving animals

    PubMed Central

    Nadella, K. M. Naga Srinivas; Roš, Hana; Baragli, Chiara; Griffiths, Victoria A.; Konstantinou, George; Koimtzis, Theo; Evans, Geoffrey J.; Kirkby, Paul A.; Silver, R. Angus

    2018-01-01

    Understanding how neural circuits process information requires rapid measurements from identified neurons distributed in 3D space. Here we describe an acousto-optic lens two-photon microscope that performs high-speed focussing and line-scanning within a volume spanning hundreds of micrometres. We demonstrate its random access functionality by selectively imaging cerebellar interneurons sparsely distributed in 3D and by simultaneously recording from the soma, proximal and distal dendrites of neocortical pyramidal cells in behaving mice. PMID:27749836

  9. Mirion--a software package for automatic processing of mass spectrometric images.

    PubMed

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the life sciences. In recent years, the development of new instruments employing ion sources tailored for spatial scanning has allowed the acquisition of large data sets. Subsequent data processing, however, is still a bottleneck in the analytical process, as manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds has turned out to be the most appropriate way to visualize the results of such scans, as humans are able to interpret images faster and more easily than plain numbers. Image generation, thus, is a time-consuming and complex yet very effective task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
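
    The core transformation, turning a list of per-pixel mass spectral peaks into an ion image for one target m/z window, can be sketched as follows; the peak list and window are made up, and real imzML parsing is omitted:

```python
import numpy as np

peaks = np.array([
    # x, y, m/z,   intensity
    [0, 0, 104.1, 10.0],
    [1, 0, 104.1, 40.0],
    [0, 1, 760.6, 25.0],   # different compound, outside the window
    [1, 1, 104.2, 30.0],
])

def ion_image(peaks, mz_center, mz_tol, shape):
    """Accumulate intensities of peaks within +/- mz_tol of mz_center
    into a 2D image indexed by the (x, y) pixel coordinates."""
    img = np.zeros(shape)
    sel = np.abs(peaks[:, 2] - mz_center) <= mz_tol
    for x, y, _, inten in peaks[sel]:
        img[int(y), int(x)] += inten
    return img

img = ion_image(peaks, mz_center=104.1, mz_tol=0.2, shape=(2, 2))
```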

  10. Combustion behaviors of GO2/GH2 swirl-coaxial injector using non-intrusive optical diagnostics

    NASA Astrophysics Data System (ADS)

    GuoBiao, Cai; Jian, Dai; Yang, Zhang; NanJia, Yu

    2016-06-01

    This research evaluates the combustion behaviors of a single-element swirl-coaxial injector in an atmospheric combustion chamber with gaseous oxygen and gaseous hydrogen (GO2/GH2) as the propellants. A brief schematic comparison of the simulated flow fields of a shear-coaxial injector and the swirl-coaxial injector reveals the distribution characteristics of the temperature field and streamline patterns. Advanced optical diagnostics, i.e., OH planar laser-induced fluorescence and high-speed imaging, are employed simultaneously to determine the OH radical spatial distribution and flame fluctuations, respectively. The present study focuses on the flame structures under varying O/F mixing ratios and center oxygen swirl intensities. The combined use of several image-processing methods applied to the OH instantaneous images, including time-averaging, root-mean-square, and gradient transformation, provides detailed information on the distribution of the flow field. The results indicate that the shear layers anchored on the oxygen injector lip are the main zones of chemical heat release and that the O/F mixing ratio significantly affects the flame shape. Furthermore, high-speed imaging of the ignition process and several consecutive steady-state images reveals that lean conditions make it easier to drive combustion instabilities and that the center swirl intensity has a moderate influence on the flame oscillation strength. The results of this study provide a visualized analysis for future optimal swirl-coaxial injector designs.
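
    The three image-processing operations named for the OH images can be sketched on a synthetic frame stack; np.gradient stands in for whatever gradient operator the authors used:

```python
import numpy as np

rng = np.random.default_rng(4)
frames = rng.random((50, 32, 32))    # 50 synthetic "OH instantaneous images"

mean_img = frames.mean(axis=0)       # time-averaged image
rms_img = frames.std(axis=0)         # root-mean-square of fluctuations
gy, gx = np.gradient(mean_img)       # gradient transformation of the mean
grad_mag = np.hypot(gx, gy)          # gradient magnitude highlights shear layers
```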

  11. Magnetosphere - ionosphere coupling process in the auroral region estimated from auroral tomography

    NASA Astrophysics Data System (ADS)

    Tanaka, Y.; Ogawa, Y.; Kadokura, A.; Gustavsson, B.; Kauristie, K.; Whiter, D. K.; Enell, C. F. T.; Brandstrom, U.; Sergienko, T.; Partamies, N.; Kozlovsky, A.; Miyaoka, H.; Kosch, M. J.

    2016-12-01

    We have studied the magnetosphere-ionosphere coupling process by using multiple auroral images and ionospheric data obtained in a campaign observation with multi-point imagers and the EISCAT UHF radar in Northern Europe. We observed a wavy structure of discrete arcs around the magnetic zenith at Tromso, Norway, from 22:00 to 23:15 UT on March 14, 2015, followed by auroral breakup, poleward expansion, and pulsating auroras. During this interval, monochromatic (427.8 nm) images were taken at a sampling interval of 2 seconds by three EMCCD imagers and at an interval of 10 seconds by six imagers in total. The EISCAT UHF radar at Tromso measured the ionospheric parameters along the magnetic field line from 20 to 24 UT. We applied the tomographic inversion technique to this data set to retrieve the 3D distribution of the 427.8 nm emission, which enabled us to obtain the following quantities for the auroras as they change from moment to moment: (1) the relation between the 427.8 nm emission and the electron density enhancement along the field line, (2) the horizontal distribution of the energy flux of auroral precipitating electrons, and (3) the horizontal distribution of height-integrated ionospheric conductivity. By combining these with the ionospheric equivalent current estimated from the ground-based magnetometer network, we discuss the current system of a sequence of auroral events in terms of magnetosphere-ionosphere coupling.

  12. Digital Correlation In Laser-Speckle Velocimetry

    NASA Technical Reports Server (NTRS)

    Gilbert, John A.; Mathys, Donald R.

    1992-01-01

    Periodic recording helps to eliminate spurious results. An improved digital-correlation process extracts the velocity field of a two-dimensional flow from laser-speckle images of seed particles distributed sparsely in the flow. The method, which involves digital correlation of images recorded at unequal intervals, is completely automated and has the potential to be the fastest yet.
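
    The digital-correlation step can be sketched as FFT-based cross-correlation of two speckle images, with the correlation peak giving the integer-pixel displacement; a generic illustration, not the paper's unequal-interval scheme:

```python
import numpy as np

def speckle_shift(a, b):
    """Estimate the integer-pixel displacement of image b relative to a
    via FFT-based circular cross-correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around indices to signed shifts.
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(5)
speckle = rng.random((64, 64))
moved = np.roll(speckle, shift=(3, -5), axis=(0, 1))  # particles moved (3, -5) px
dy, dx = speckle_shift(speckle, moved)
```

    Repeating this per interrogation window over the image pair yields the 2D velocity field.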

  13. Radar signal analysis of ballistic missile with micro-motion based on time-frequency distribution

    NASA Astrophysics Data System (ADS)

    Wang, Jianming; Liu, Lihua; Yu, Hua

    2015-12-01

    The micro-motion of ballistic missile targets induces micro-Doppler modulation on the radar return signal, which is a unique feature for warhead discrimination during flight. In order to extract the micro-Doppler feature of ballistic missile targets, time-frequency analysis is employed to process the micro-Doppler-modulated, time-varying radar signal. The images of the time-frequency distribution (TFD) reveal the micro-Doppler modulation characteristic very well. However, many time-frequency analysis methods exist to generate time-frequency distribution images, including the short-time Fourier transform (STFT), the Wigner distribution (WD) and the Cohen class of distributions. Against the background of ballistic missile defence, this paper aims at working out an effective time-frequency analysis method for discriminating ballistic missile warheads from decoys.

  14. Model-based VQ for image data archival, retrieval and distribution

    NASA Technical Reports Server (NTRS)

    Manohar, Mareboyana; Tilton, James C.

    1995-01-01

    An ideal image compression technique for image data archival, retrieval and distribution would be one with the asymmetrical computational requirements of Vector Quantization (VQ), but without the complications arising from VQ codebooks. Codebook generation and maintenance are stumbling blocks which have limited the use of VQ as a practical image compression algorithm. Model-based VQ (MVQ), a variant of VQ described here, has the computational properties of VQ but does not require explicit codebooks. The codebooks are internally generated using mean-removed error and Human Visual System (HVS) models. The error model assumed is the Laplacian distribution with mean lambda, computed from a sample of the input image. Laplacian-distributed random numbers with mean lambda are generated with a uniform random number generator and grouped into vectors. These vectors are further conditioned to make them perceptually meaningful by filtering the DCT coefficients of each vector. The DCT coefficients are filtered by multiplying by a weight matrix that is found to be optimal for human perception. The inverse DCT is performed to produce the conditioned vectors for the codebook. The only image-dependent parameter used in the generation of the codebook is the mean, lambda, which is included in the coded file so that the codebook generation process can be repeated for decoding.
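
    The codebook-generation pipeline described above can be sketched as follows; the linear-ramp weight matrix here is a made-up stand-in for the paper's HVS-optimal weights, and the vector size and codebook size are assumptions:

```python
import numpy as np

def mvq_codebook(lam, n_vectors=256, dim=16, seed=0):
    """MVQ-style codebook: Laplacian residual vectors -> DCT-domain
    perceptual weighting -> inverse DCT."""
    rng = np.random.default_rng(seed)
    # Laplacian(0, lam) samples via inverse-CDF of uniform random numbers.
    u = rng.random((n_vectors, dim)) - 0.5
    vecs = -lam * np.sign(u) * np.log1p(-2.0 * np.abs(u))

    # Orthonormal DCT-II basis (rows are frequency components).
    x = np.arange(dim)
    f = np.arange(dim)[:, None]
    C = np.sqrt(2.0 / dim) * np.cos(np.pi * (2 * x + 1) * f / (2 * dim))
    C[0] /= np.sqrt(2.0)

    weights = np.linspace(1.0, 0.2, dim)   # assumed low-pass perceptual weights
    coeffs = vecs @ C.T                    # forward DCT of each vector
    return (coeffs * weights) @ C          # weight, then inverse DCT

cb = mvq_codebook(lam=4.0)
cb_again = mvq_codebook(lam=4.0)           # decoder regenerates the same book
```

    Because the codebook depends only on lambda and a shared generator seed, the decoder can regenerate it from the transmitted lambda alone, which is the property that removes explicit codebook storage and transmission.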

  15. Parallel Processing of Images in Mobile Devices using BOINC

    NASA Astrophysics Data System (ADS)

    Curiel, Mariela; Calle, David F.; Santamaría, Alfredo S.; Suarez, David F.; Flórez, Leonardo

    2018-04-01

    Medical image processing helps health professionals make decisions for the diagnosis and treatment of patients. Since some algorithms for processing images require substantial amounts of resources, one could take advantage of distributed or parallel computing. A mobile grid can be an adequate computing infrastructure for this problem. A mobile grid is a grid that includes mobile devices as resource providers. In a previous step of this research, we selected BOINC as the infrastructure to build our mobile grid. However, parallel processing of images in mobile devices poses at least two important challenges: the execution of standard libraries for processing images, and obtaining adequate performance compared to desktop computer grids. By the time we started our research, the use of BOINC on mobile devices also involved two issues: a) the execution of programs on mobile devices required modifying the code to insert calls to the BOINC API, and b) the division of the image among the mobile devices, as well as its merging, required additional code in some BOINC components. This article presents answers to these four challenges.
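
    The image-division and merging issue can be illustrated with a simple row-band split; a sketch only, with an inversion as a stand-in workload — BOINC's actual work distribution is more involved:

```python
import numpy as np

def split_rows(img, n):
    """Divide an image into n row bands, one per worker device."""
    return np.array_split(img, n, axis=0)

def merge_rows(bands):
    """Reassemble the processed bands in their original order."""
    return np.vstack(bands)

img = np.arange(20 * 10, dtype=np.uint8).reshape(20, 10)
bands = split_rows(img, 3)
processed = [255 - b for b in bands]   # stand-in per-device processing step
result = merge_rows(processed)
```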

  16. Multichannel blind deconvolution of spatially misaligned images.

    PubMed

    Sroubek, Filip; Flusser, Jan

    2005-07-01

    Existing multichannel blind restoration techniques assume perfect spatial alignment of channels, correct estimation of blur size, and are prone to noise. We developed an alternating minimization scheme based on a maximum a posteriori estimation with a priori distribution of blurs derived from the multichannel framework and a priori distribution of original images defined by the variational integral. This stochastic approach enables us to recover the blurs and the original image from channels severely corrupted by noise. We observe that the exact knowledge of the blur size is not necessary, and we prove that translation misregistration up to a certain extent can be automatically removed in the restoration process.

  17. Glossiness and perishable food quality: visual freshness judgment of fish eyes based on luminance distribution.

    PubMed

    Murakoshi, Takuma; Masuda, Tomohiro; Utsumi, Ken; Tsubota, Kazuo; Wada, Yuji

    2013-01-01

    Previous studies have reported the effects of statistics of luminance distribution on visual freshness perception using pictures which included the degradation process of food samples. However, these studies did not examine the effect of individual differences between the same kinds of food. Here we elucidate whether luminance distribution would continue to have a significant effect on visual freshness perception even if visual stimuli included individual differences in addition to the degradation process of foods. We took pictures of the degradation of three fishes over 3.29 hours in a controlled environment, then cropped square patches of their eyes from the original images as visual stimuli. Eleven participants performed paired comparison tests judging the visual freshness of the fish eyes at three points of degradation. Perceived freshness scores (PFS) were calculated using the Bradley-Terry Model for each image. The ANOVA revealed that the PFS for each fish decreased as the degradation time increased; however, the differences in the PFS between individual fish was larger for the shorter degradation time, and smaller for the longer degradation time. A multiple linear regression analysis was conducted in order to determine the relative importance of the statistics of luminance distribution of the stimulus images in predicting PFS. The results show that standard deviation and skewness in luminance distribution have a significant influence on PFS. These results show that even if foodstuffs contain individual differences, visual freshness perception and changes in luminance distribution correlate with degradation time.
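
    The two predictive statistics, standard deviation and skewness of the luminance distribution, can be computed as follows; the synthetic "fresh" and "degraded" eye patches are assumptions standing in for the paper's photographs:

```python
import numpy as np

def luminance_stats(img):
    """Standard deviation and skewness of an image's luminance values."""
    v = img.astype(float).ravel()
    d = v - v.mean()
    std = v.std()
    return std, np.mean(d**3) / std**3

rng = np.random.default_rng(6)
# Glossy "fresh" eye: dark pupil plus sparse specular highlights,
# which produce a long right tail (positive skewness).
fresh = np.clip(rng.normal(60, 10, (50, 50)), 0, 255)
fresh[rng.random((50, 50)) < 0.02] = 250
# "Degraded" eye: highlights faded, histogram nearly symmetric.
degraded = np.clip(rng.normal(90, 10, (50, 50)), 0, 255)

std_f, skew_f = luminance_stats(fresh)
std_d, skew_d = luminance_stats(degraded)
```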

  18. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable fast turn-around times, which are often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionality. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
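
    The scalability analysis rests on Amdahl's law. A sketch of the classic form is below; the record refers to the symmetric-multicore variant, which reduces to this expression when per-core performance is fixed. The parallel fractions in the example are illustrative, not the paper's measurements.

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Classic Amdahl's-law speedup for a workload in which
    `parallel_fraction` (0..1) of the work scales perfectly
    across `n_cores` and the rest stays serial."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# A 99%-parallel workload on 12 cores:
s = amdahl_speedup(0.99, 12)   # ~10.8x, short of the ideal 12x
```

    The gap between the ideal 12x and the achievable speedup quantifies how even a 1% serial fraction caps multicore scaling.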

  19. Physical reconstruction of packed beds and their morphological analysis: core-shell packings as an example.

    PubMed

    Bruns, Stefan; Tallarek, Ulrich

    2011-04-08

    We report a fast, nondestructive, and quantitative approach to characterize the morphology of packed beds of fine particles by their three-dimensional reconstruction from confocal laser scanning microscopy images, demonstrated here for a 100 μm i.d. fused-silica capillary packed with 2.6 μm core-shell particles. The presented method is generally applicable to silica-based capillary columns, monolithic or particulate, and comprises column pretreatment, image acquisition, image processing, and statistical analysis of the image data. It defines a unique platform for fundamental comparisons of particulate and monolithic supports using the statistical measures derived from their reconstructions. The resulting morphological data are column cross-sectional porosity profiles and chord length distributions from the interparticle macropore space, which are a descriptor of local density and can be characterized by a simplified k-gamma distribution. This distribution function provides a parameter of location and a parameter of dispersion, which can be correlated to individual chromatographic band-broadening processes (i.e., to transchannel and short-range interchannel contributions to eddy dispersion, respectively). Together with the transcolumn porosity profile, the presented approach allows the packing microstructure to be analyzed and quantified from pore to column scale and therefore holds great promise for comparative studies of packing conditions and particle properties, particularly for characterizing and minimizing the packing-process-specific heterogeneities in the final bed structure.

  20. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  1. Minimum risk wavelet shrinkage operator for Poisson image denoising.

    PubMed

    Cheng, Wu; Hirakawa, Keigo

    2015-05-01

    The pixel values of images taken by an image sensor are corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients, where the modeling of the coefficients is enabled by Skellam distribution analysis. We extend these results by solving for shrinkage operators for the Skellam distribution that minimize the risk functional in the multiscale Poisson image denoising setting. The resulting minimum-risk shrinkage operator produces denoised wavelet coefficients with the minimum attainable L2 error.
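
    The paper derives a minimum-risk shrinkage operator for Skellam-distributed coefficients. As a simpler stand-in that illustrates the same multiscale pipeline, the widely used Anscombe variance-stabilizing transform followed by soft thresholding of one-level Haar detail coefficients can be sketched as below; this is a generic baseline, not the paper's operator.

```python
import numpy as np

def haar2(x):
    """One-level 2-D orthonormal Haar transform (even-sized input)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    cols = lambda y: ((y[:, 0::2] + y[:, 1::2]) / np.sqrt(2),
                      (y[:, 0::2] - y[:, 1::2]) / np.sqrt(2))
    ll, lh = cols(a)
    hl, hh = cols(d)
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Inverse of haar2."""
    def icols(lo, hi):
        y = np.empty((lo.shape[0], lo.shape[1] * 2))
        y[:, 0::2] = (lo + hi) / np.sqrt(2)
        y[:, 1::2] = (lo - hi) / np.sqrt(2)
        return y
    a, d = icols(ll, lh), icols(hl, hh)
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise_poisson(img, thresh=3.0):
    """Anscombe-stabilize (noise std ~1), soft-threshold the Haar
    details, invert algebraically. A baseline, not the Skellam
    minimum-risk operator of the paper."""
    y = 2.0 * np.sqrt(img + 3.0 / 8.0)
    ll, lh, hl, hh = haar2(y)
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - thresh, 0.0)
    y_hat = ihaar2(ll, soft(lh), soft(hl), soft(hh))
    return (y_hat / 2.0) ** 2 - 3.0 / 8.0
```

    On a flat region of Poisson-noisy pixels, this reduces the variance while preserving the mean intensity, which is the behavior the minimum-risk operator optimizes exactly.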

  2. Relationships between digital signal processing and control and estimation theory

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1978-01-01

    Research areas associated with digital signal processing and control and estimation theory are identified. Particular attention is given to image processing, system identification problems (parameter identification, linear prediction, least squares, Kalman filtering), stability analyses (the use of the Liapunov theory, frequency domain criteria, passivity), and multiparameter systems, distributed processes, and random fields.

  3. IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING

    NASA Technical Reports Server (NTRS)

    Roth, D. J.

    1994-01-01

    IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines, within which further routines are also selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage, and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of a cursor; display of the grey-level histogram of an image; and display of the variation of grey-level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor, which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.
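
    One of the enhancement functions listed, contrast expansion, amounts to a linear grey-level stretch. A generic modern sketch (not IMAGEP's FORTRAN code) follows:

```python
import numpy as np

def contrast_expand(img, out_min=0.0, out_max=255.0):
    """Linear contrast expansion: stretch the observed grey-level
    range of `img` onto [out_min, out_max]."""
    img = np.asarray(img, dtype=float)
    lo, hi = img.min(), img.max()
    if hi == lo:                       # flat image: nothing to stretch
        return np.full_like(img, out_min)
    return (img - lo) / (hi - lo) * (out_max - out_min) + out_min
```

    An image whose grey levels occupy only a narrow band (say 50-100) is remapped to the full 0-255 display range, which is the visible effect of the IMAGEP enhancement step.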

  4. Comparison of numerical simulations to experiments for atomization in a jet nebulizer.

    PubMed

    Lelong, Nicolas; Vecellio, Laurent; Sommer de Gélicourt, Yann; Tanguy, Christian; Diot, Patrice; Junqua-Moullet, Alexandra

    2013-01-01

    The development of jet nebulizers for medical purposes is an important challenge of aerosol therapy. The performance of a nebulizer is characterized by its output rate of droplets with a diameter under 5 µm. However, the optimization of this parameter through experiments has reached a plateau. The purpose of this study is to design a numerical model simulating the nebulization process and to compare it with experimental data. Such a model could provide a better understanding of the atomization process and the parameters influencing the nebulizer output. A model based on the Updraft nebulizer (Hudson) was designed with ANSYS Workbench. Boundary conditions were set with experimental data, then transient 3D calculations were run on a 4 µm mesh with ANSYS Fluent. Two air flow rates (2 L/min and 8 L/min, the limits of the operating range) were considered to account for different turbulence regimes. Numerical and experimental results were compared according to phenomenology and droplet size. The behavior of the liquid was compared to images acquired through shadowgraphy with a CCD camera. Three experimental methods, laser diffractometry, phase Doppler anemometry (PDA), and shadowgraphy, were used to characterize the droplet size distributions. Camera images showed patterns similar to the numerical results. Droplet sizes obtained numerically are overestimated relative to PDA and diffractometry, which only consider spherical droplets. However, at both flow rates, size distributions extracted from numerical image processing were similar to distributions obtained from shadowgraphy image processing. The simulation thus provides a good understanding and prediction of the phenomena involved in the fragmentation of droplets over 10 µm. The laws of dynamics apply to droplets down to 1 µm, so we can assume the continuity of the distribution and extrapolate the results for droplets between 1 and 10 µm. This model could therefore help predict nebulizer output for defined geometrical and physical parameters.

  5. Comparison of Numerical Simulations to Experiments for Atomization in a Jet Nebulizer

    PubMed Central

    Lelong, Nicolas; Vecellio, Laurent; Sommer de Gélicourt, Yann; Tanguy, Christian; Diot, Patrice; Junqua-Moullet, Alexandra

    2013-01-01

    The development of jet nebulizers for medical purposes is an important challenge of aerosol therapy. The performance of a nebulizer is characterized by its output rate of droplets with a diameter under 5 µm. However, the optimization of this parameter through experiments has reached a plateau. The purpose of this study is to design a numerical model simulating the nebulization process and to compare it with experimental data. Such a model could provide a better understanding of the atomization process and the parameters influencing the nebulizer output. A model based on the Updraft nebulizer (Hudson) was designed with ANSYS Workbench. Boundary conditions were set with experimental data, then transient 3D calculations were run on a 4 µm mesh with ANSYS Fluent. Two air flow rates (2 L/min and 8 L/min, the limits of the operating range) were considered to account for different turbulence regimes. Numerical and experimental results were compared according to phenomenology and droplet size. The behavior of the liquid was compared to images acquired through shadowgraphy with a CCD camera. Three experimental methods, laser diffractometry, phase Doppler anemometry (PDA), and shadowgraphy, were used to characterize the droplet size distributions. Camera images showed patterns similar to the numerical results. Droplet sizes obtained numerically are overestimated relative to PDA and diffractometry, which only consider spherical droplets. However, at both flow rates, size distributions extracted from numerical image processing were similar to distributions obtained from shadowgraphy image processing. The simulation thus provides a good understanding and prediction of the phenomena involved in the fragmentation of droplets over 10 µm. The laws of dynamics apply to droplets down to 1 µm, so we can assume the continuity of the distribution and extrapolate the results for droplets between 1 and 10 µm. This model could therefore help predict nebulizer output for defined geometrical and physical parameters. PMID:24244334

  6. Riding the Right Wavelet: Quantifying Scale Transitions in Fractured Rocks

    NASA Astrophysics Data System (ADS)

    Rizzo, Roberto E.; Healy, David; Farrell, Natalie J.; Heap, Michael J.

    2017-12-01

    The mechanics of brittle failure is a well-described multiscale process that involves a rapid transition from distributed microcracks to localization along a single macroscopic rupture plane. However, considerable uncertainty exists regarding both the length scale at which this transition occurs and the underlying causes that prompt this shift from a distributed to a localized assemblage of cracks or fractures. For the first time, we used an image analysis tool, based on a two-dimensional continuous wavelet analysis, developed to investigate orientation changes at different scales in images of fracture patterns in faulted materials. We detected the abrupt change in the fracture pattern from distributed tensile microcracks to localized shear failure in a fracture network produced by triaxial deformation of a sandstone core plug. The presented method will contribute to our ability to unravel the physical processes underlying catastrophic rock failure, including the nucleation of earthquakes, landslides, and volcanic eruptions.

  7. Nonuniformity correction of infrared cameras by reading radiance temperatures with a spatially nonhomogeneous radiation source

    NASA Astrophysics Data System (ADS)

    Gutschwager, Berndt; Hollandt, Jörg

    2017-01-01

    We present a novel method of nonuniformity correction (NUC) of infrared cameras and focal plane arrays (FPAs) over a wide optical spectral range by reading radiance temperatures and by applying a radiation source with an unknown and spatially nonhomogeneous radiance temperature distribution. The benefit of this novel method is that it works with the display and calculation of radiance temperatures, it can be applied to radiation sources of arbitrary spatial radiance temperature distribution, and it only requires sufficient temporal stability of this distribution during the measurement process. In contrast, a previously presented method calculated the NUC from readings of monitored radiance values. Both methods are based on the recording of several (at least three) images of a radiation source and a deliberate row and column shift of these subsequent images relative to the first primary image. The mathematical procedure is explained in detail. Its numerical verification with a source of predefined nonhomogeneous radiance temperature distribution and a thermal imager of predefined nonuniform FPA responsivity is presented.
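
    For context, the baseline that shift-based NUC methods improve upon is the classic two-point correction, which requires two spatially uniform reference temperatures and derives a per-pixel gain and offset. A sketch of that baseline (an illustration only, not the paper's method) follows:

```python
import numpy as np

def two_point_nuc(raw_cold, raw_hot, t_cold, t_hot):
    """Classic two-point NUC: per-pixel gain/offset computed from two
    spatially uniform reference temperatures. Returns a function that
    maps raw detector counts to radiance temperature."""
    gain = (t_hot - t_cold) / (raw_hot - raw_cold)
    offset = t_cold - gain * raw_cold
    return lambda raw: gain * raw + offset
```

    With a perfectly linear detector, the corrected image of any uniform scene becomes uniform, which is exactly the fixed-pattern noise removal the shift-based method achieves without needing uniform references.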

  8. [Image fusion: use in the control of the distribution of prostatic biopsies].

    PubMed

    Mozer, Pierre; Baumann, Michaël; Chevreau, Grégoire; Troccaz, Jocelyne

    2008-02-01

    Prostate biopsies are performed under 2D TransRectal UltraSound (US) guidance by sampling the prostate according to a predefined pattern. Modern image processing tools allow better control of biopsy distribution. We evaluated the accuracy of a single operator performing a pattern of 12 ultrasound-guided biopsies by registering 3D ultrasound control images acquired after each biopsy. For each patient, prostate image alignment was performed automatically with a voxel-based registration algorithm allowing visualization of each biopsy trajectory in a single ultrasound reference volume. On average, the operator reached the target in 60% of all cases. This study shows that it is difficult to accurately reach targets in the prostate using 2D ultrasound. In the near future, real-time fusion of MRI and US images will allow selection of a target in previously acquired MR images and biopsy of this target by US guidance.

  9. Label-free optical imaging of membrane patches for atomic force microscopy

    PubMed Central

    Churnside, Allison B.; King, Gavin M.; Perkins, Thomas T.

    2010-01-01

    In atomic force microscopy (AFM), finding sparsely distributed regions of interest can be difficult and time-consuming. Typically, the tip is scanned until the desired object is located. This process can mechanically or chemically degrade the tip, as well as damage fragile biological samples. Protein assemblies can be detected using the back-scattered light from a focused laser beam. We previously used back-scattered light from a pair of laser foci to stabilize an AFM. In the present work, we integrate these techniques to optically image patches of purple membranes prior to AFM investigation. These rapidly acquired optical images were aligned to the subsequent AFM images to ~40 nm, since the tip position was aligned to the optical axis of the imaging laser. Thus, this label-free imaging efficiently locates sparsely distributed protein assemblies for subsequent AFM study while simultaneously minimizing degradation of the tip and the sample. PMID:21164738

  10. Analysis of the fractal dimension of volcano geomorphology through Synthetic Aperture Radar (SAR) amplitude images acquired in C and X band.

    NASA Astrophysics Data System (ADS)

    Pepe, S.; Di Martino, G.; Iodice, A.; Manzo, M.; Pepe, A.; Riccio, D.; Ruello, G.; Sansosti, E.; Tizzani, P.; Zinno, I.

    2012-04-01

    In the last two decades, several aspects of volcanic activity have been analyzed in terms of fractal parameters that effectively describe the geometry of natural objects. More specifically, this research has aimed at the identification of (1) the power laws that govern the magma fragmentation processes, (2) the energy of explosive eruptions, and (3) the distribution of the associated earthquakes. In this paper, we address the study of volcano morphology via satellite images; in particular, we use the complete forward model developed by some of the authors (Di Martino et al., 2012) that links the stochastic characterization of amplitude Synthetic Aperture Radar (SAR) images to the fractal dimension of the imaged surfaces, modelled via fractional Brownian motion (fBm) processes. Based on the inversion of such a model, a SAR image post-processing has been implemented (Di Martino et al., 2010) that allows retrieving the fractal dimension of the observed surfaces, dictating the distribution of the roughness over different spatial scales. The fractal dimension of volcanic structures has been related to the specific nature of materials and to the effects of active geodynamic processes. Hence, the possibility to estimate the fractal dimension from a single amplitude-only SAR image is of fundamental importance for the characterization of volcano structures and, moreover, can be very helpful for monitoring and crisis management activities in case of eruptions and other similar natural hazards. The implemented SAR image processing performs the extraction of the point-by-point fractal dimension of the scene observed by the sensor, providing as an output product the map of the fractal dimension of the area of interest. In this work, such an analysis is performed on Cosmo-SkyMed, ERS-1/2 and ENVISAT images of active stratovolcanoes in different geodynamic contexts: Mt. Somma-Vesuvio, Mt. Etna, Vulcano, and Stromboli in Southern Italy; Shinmoe in Japan; and Merapi in Indonesia. Preliminary results reveal that the fractal dimension of natural areas, being related only to the roughness of the observed surface, is very stable as the radar illumination geometry, the resolution, and the wavelength change, a very useful property in SAR data inversion. This behavior does not hold for non-natural objects. As a matter of fact, when the fractal estimation is performed in the presence of either man-made objects or SAR image features that depend on geometrical distortions due to the SAR acquisition (i.e., layover, shadowing), fractal dimension (D) values outside the range of fractality of natural surfaces (2 < D < 3) are retrieved. These non-fractal characteristics prove to be heavily dependent on sensor acquisition parameters (e.g., view angle, resolution). In this work, the behaviour of the maps generated from the C- and X-band SAR data for all the considered volcanoes is analyzed: the distribution of the obtained fractal dimension values is investigated in different zones of the maps. In particular, it is verified that the fore-slope and back-slope areas of the image share a very similar fractal dimension distribution, centered around the mean value of D = 2.3. We conclude that, in this context, the fractal dimension could be considered a signature of volcano growth as a natural process. The COSMO-SkyMed data used in this study have been processed at IREA-CNR within the SAR4Volcanoes project under Italian Space Agency agreement n. I/034/11/0.
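
    A generic way to estimate a fractal dimension from a binary image is box counting. The sketch below is a much simpler alternative to the fBm-model inversion used in the record, but it illustrates the quantity being mapped and why a filled, non-fractal region returns D = 2:

```python
import numpy as np

def box_counting_dimension(mask):
    """Box-counting fractal dimension of a square binary mask whose
    side is a power of two: count occupied boxes at dyadic scales and
    fit the log-log slope."""
    n = mask.shape[0]
    sizes, counts = [], []
    s = 1
    while s < n:
        # group pixels into s-by-s boxes; a box counts if any pixel is set
        blocks = mask.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        sizes.append(s)
        counts.append(blocks.sum())
        s *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

    A fully occupied region yields D = 2 and a one-pixel line yields D = 1, bracketing the 2 < D < 3 range of fractality that the record cites for natural surfaces (surface dimension, estimated there from radar amplitude rather than a binary mask).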

  11. Prostate seed implant quality assessment using MR and CT image fusion.

    PubMed

    Amdur, R J; Gladstone, D; Leopold, K A; Harris, R D

    1999-01-01

    After a seed implant of the prostate, computerized tomography (CT) is ideal for determining seed distribution but soft tissue anatomy is frequently not well visualized. Magnetic resonance (MR) images soft tissue anatomy well but seed visualization is problematic. We describe a method of fusing CT and MR images to exploit the advantages of both of these modalities when assessing the quality of a prostate seed implant. Eleven consecutive prostate seed implant patients were imaged with axial MR and CT scans. MR and CT images were fused in three dimensions using the Pinnacle 3.0 version of the ADAC treatment planning system. The urethra and bladder base were used to "line up" MR and CT image sets during image fusion. Alignment was accomplished using translation and rotation in the three orthonormal planes. Accuracy of image fusion was evaluated by calculating the maximum deviation in millimeters between the center of the urethra on axial MR versus CT images. Implant quality was determined by comparing dosimetric results to previously set parameters. Image fusion was performed with a high degree of accuracy. When lining up the urethra and base of bladder, the maximum difference in axial position of the urethra between MR and CT averaged 2.5 mm (range 1.3-4.0 mm, SD 0.9 mm). By projecting CT-derived dose distributions over MR images of soft tissue structures, qualitative and quantitative evaluation of implant quality is straightforward. The image-fusion process we describe provides a sophisticated way of assessing the quality of a prostate seed implant. Commercial software makes the process time-efficient and available to any clinical practice with a high-quality treatment planning system. While we use MR to image soft tissue structures, the process could be used with any imaging modality that is able to visualize the prostatic urethra (e.g., ultrasound).

  12. Contour-Driven Atlas-Based Segmentation

    PubMed Central

    Wachinger, Christian; Fritscher, Karl; Sharp, Greg; Golland, Polina

    2016-01-01

    We propose new methods for automatic segmentation of images based on an atlas of manually labeled scans and contours in the image. First, we introduce a Bayesian framework for creating initial label maps from manually annotated training images. Within this framework, we model various registration- and patch-based segmentation techniques by changing the deformation field prior. Second, we perform contour-driven regression on the created label maps to refine the segmentation. Image contours and image parcellations give rise to non-stationary kernel functions that model the relationship between image locations. Setting the kernel to the covariance function in a Gaussian process establishes a distribution over label maps supported by image structures. Maximum a posteriori estimation of the distribution over label maps conditioned on the outcome of the atlas-based segmentation yields the refined segmentation. We evaluate the segmentation in two clinical applications: the segmentation of parotid glands in head and neck CT scans and the segmentation of the left atrium in cardiac MR angiography images. PMID:26068202

  13. PCI bus content-addressable-memory (CAM) implementation on FPGA for pattern recognition/image retrieval in a distributed environment

    NASA Astrophysics Data System (ADS)

    Megherbi, Dalila B.; Yan, Yin; Tanmay, Parikh; Khoury, Jed; Woods, C. L.

    2004-11-01

    Surveillance and Automatic Target Recognition (ATR) applications are increasing as the cost of the computing power needed to process the massive amount of information continues to fall. This computing power has been made possible partly by the latest advances in FPGAs and SOPCs. In particular, to design and implement state-of-the-art electro-optical imaging systems that provide advanced surveillance capabilities, there is a need to integrate several technologies (e.g., telescopes, precise optics, cameras, and image/computer vision algorithms, which can be geographically distributed or share distributed resources) into programmable and DSP systems. Additionally, pattern recognition techniques and fast information retrieval are often important components of intelligent systems. The aim of this work is to use an embedded FPGA as a fast, configurable, and synthesizable search engine for fast image pattern recognition/retrieval in a distributed hardware/software co-design environment. In particular, we propose and demonstrate a low-cost content-addressable memory (CAM)-based distributed embedded FPGA hardware architecture with real-time recognition capabilities for pattern look-up, pattern recognition, and image retrieval. We show how the distributed CAM-based architecture offers an order-of-magnitude performance advantage over a random access memory (RAM)-based search for implementing high-speed pattern recognition for image retrieval. The methods of designing, implementing, and analyzing the proposed CAM-based embedded architecture are described here, and other SOPC solutions/design issues are covered. Finally, experimental results, hardware verification, and performance evaluations using both the Xilinx Virtex-II and the Altera Apex20k are provided to show the potential and power of the proposed method for low-cost reconfigurable fast image pattern recognition/retrieval at the hardware/software co-design level.
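
    The CAM-versus-RAM distinction can be illustrated in software: a CAM answers "at which address is this content stored?" in a single step, while a RAM-style search must scan addresses linearly. This is a conceptual analogy only, not the FPGA implementation described in the record.

```python
def build_cam(patterns):
    """Software analogy of a CAM: content -> address, O(1) per lookup."""
    return {p: addr for addr, p in enumerate(patterns)}

def ram_search(patterns, key):
    """RAM-style search: linear scan over addresses, O(N) per lookup."""
    for addr, p in enumerate(patterns):
        if p == key:
            return addr
    return None
```

    For N stored patterns, the hash-table lookup cost is independent of N while the linear scan grows with N, the same asymptotic gap that gives the hardware CAM its order-of-magnitude advantage (in hardware the CAM compares all entries in parallel rather than hashing).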

  14. Iterative optimizing quantization method for reconstructing three-dimensional images from a limited number of views

    DOEpatents

    Lee, H.R.

    1997-11-18

    A three-dimensional image reconstruction method comprises treating the object of interest as a group of elements with a size that is determined by the resolution of the projection data, e.g., as determined by the size of each pixel. One of the projections is used as a reference projection. A fictitious object is arbitrarily defined that is constrained by such reference projection. The method modifies the known structure of the fictitious object by comparing and optimizing its four projections to those of the unknown structure of the real object and continues to iterate until the optimization is limited by the residual sum of background noise. The method is composed of several sub-processes that acquire four projections from the real data and the fictitious object: generate an arbitrary distribution to define the fictitious object, optimize the four projections, generate a new distribution for the fictitious object, and enhance the reconstructed image. The sub-process for the acquisition of the four projections from the input real data is simply the function of acquiring the four projections from the data of the transmitted intensity. The transmitted intensity represents the density distribution, that is, the distribution of absorption coefficients through the object. 5 figs.

  15. Quantitative imaging reveals heterogeneous growth dynamics and treatment-dependent residual tumor distributions in a three-dimensional ovarian cancer model

    NASA Astrophysics Data System (ADS)

    Celli, Jonathan P.; Rizvi, Imran; Evans, Conor L.; Abu-Yousif, Adnan O.; Hasan, Tayyaba

    2010-09-01

    Three-dimensional tumor models have emerged as valuable in vitro research tools, though the power of such systems as quantitative reporters of tumor growth and treatment response has not been adequately explored. We introduce an approach combining a 3-D model of disseminated ovarian cancer with high-throughput processing of image data for quantification of growth characteristics and cytotoxic response. We developed custom MATLAB routines to analyze longitudinally acquired dark-field microscopy images containing thousands of 3-D nodules. These data reveal a reproducible bimodal log-normal size distribution. Growth behavior is driven by migration and assembly, causing an exponential decay in spatial density concomitant with increasing mean size. At day 10, cultures are treated with either carboplatin or photodynamic therapy (PDT). We quantify size-dependent cytotoxic response for each treatment on a nodule-by-nodule basis using automated segmentation combined with ratiometric batch processing of calcein and ethidium bromide fluorescence intensity data (indicating live and dead cells, respectively). Both treatments reduce viability, though carboplatin leaves micronodules largely structurally intact with a size distribution similar to untreated cultures. In contrast, PDT treatment disrupts micronodular structure, causing punctate regions of toxicity, shifting the distribution toward smaller sizes, and potentially increasing vulnerability to subsequent chemotherapeutic treatment.
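
    A log-normal size distribution is fully characterized by the mean and standard deviation of log(size). A minimal moment-based fit is sketched below; the study reports a bimodal mixture, which would be fit one mode at a time (or with a two-component mixture model), and the sample sizes here are synthetic.

```python
import numpy as np

def fit_lognormal(sizes):
    """Moment estimates of a log-normal distribution:
    returns (mu, sigma) of log(size)."""
    logs = np.log(np.asarray(sizes, dtype=float))
    return logs.mean(), logs.std()

# Synthetic nodule sizes drawn from a single log-normal mode:
rng = np.random.default_rng(2)
mu_hat, sigma_hat = fit_lognormal(rng.lognormal(1.0, 0.5, 20000))
```

    Shifts of the fitted (mu, sigma) between treated and untreated cultures quantify the distribution shift toward smaller nodules that the study observes after PDT.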

  16. Photoelectron imaging of doped helium nanodroplets

    NASA Astrophysics Data System (ADS)

    Neumark, Daniel

    2008-03-01

    Photoelectron images of helium nanodroplets doped with Kr and Ne atoms are reported. The images and resulting photoelectron spectra were obtained using tunable synchrotron radiation to ionize the droplets. Droplets were excited at 21.6 eV, corresponding to a strong droplet electronic excitation. The rare gas dopant is then ionized via a Penning excitation transfer process. The electron kinetic energy distributions reflect complex ionization and electron escape dynamics.

  17. Column ratio mapping: a processing technique for atomic resolution high-angle annular dark-field (HAADF) images.

    PubMed

    Robb, Paul D; Craven, Alan J

    2008-12-01

    An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated measurement of atomic column intensity ratios in high-resolution HAADF images. It was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique, as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende structured AlAs/GaAs superlattice using the 1 Å-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.

  18. Determination of Hydrodynamic Parameters of Two-Phase Gas-Liquid Flow in Pipes with Different Inclination Angles Using an Image Processing Algorithm

    NASA Astrophysics Data System (ADS)

    Montoya, Gustavo; Valecillos, María; Romero, Carlos; Gonzáles, Dosinda

    2009-11-01

    In the present research, a digital-image-processing-based automated algorithm was developed to determine the heights of the phases, the holdup, and the statistical distribution of drop size in a two-phase water-air system, using pipes with 0°, 10°, and 90° of inclination. Digital images were acquired with a high-speed camera (up to 4500 fps), using an apparatus that consists of three acrylic pipes with diameters of 1.905, 3.175, and 4.445 cm. Each pipe is arranged in two sections of 8 m in length. Various flow patterns were visualized for different superficial velocities of water and air. Finally, using the image processing program designed in Matlab/Simulink, the captured images were processed to establish the parameters mentioned above. The image processing algorithm is based on frequency-domain analysis of the source pictures, which allows the interface between water and air to be found as an edge, through a Sobel filter that extracts the high-frequency components of the image. The drop size was found by calculating the Feret diameter. Three flow patterns were observed: annular, ST, and ST&MI.
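
    The two image processing steps named above, Sobel edge extraction for the interface and the Feret diameter for drop size, can be sketched with plain NumPy. These are hypothetical re-implementations of the generic operators, not the authors' Matlab code.

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via the 3x3 Sobel kernels; large values mark
    edges such as the water-air interface."""
    img = np.asarray(img, dtype=float)
    p = np.pad(img, 1, mode="edge")
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.hypot(gx, gy)

def feret_diameter(mask):
    """Maximum caliper (Feret) diameter of a binary blob: the largest
    pairwise distance between its pixels (fine for small blobs)."""
    pts = np.argwhere(mask).astype(float)
    diff = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((diff ** 2).sum(-1)).max()
```

    On a synthetic frame with a sharp vertical interface, the Sobel magnitude peaks exactly at the boundary columns; applying the Feret measure to each segmented drop mask yields the per-drop size entering the distribution.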

  19. Automatic tissue image segmentation based on image processing and deep learning

    NASA Astrophysics Data System (ADS)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in multimodality imaging, especially in fusing structural images offered by CT and MRI with functional images collected by optical or other novel imaging technologies. In addition, image segmentation provides a detailed structural description for quantitative visualization of the distribution of treatment light in the human body when incorporated with a 3D light transport simulation method. Here we used image enhancement, operators, and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM), and white matter (WM), on 5 fMRI head image datasets. Then we utilized a convolutional neural network to perform automatic segmentation of the images in a deep learning way. We also introduced parallel computing. Such approaches greatly reduced the processing time compared to manual and semi-automatic segmentation and are of great importance in improving speed and accuracy as more and more samples are learned. Our results can be used as a criterion when diagnosing diseases such as cerebral atrophy, which is caused by pathological changes in gray matter or white matter. We demonstrated the great potential of such combined image processing and deep learning automatic tissue segmentation in personalized medicine, especially in monitoring and treatment.
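
    As a toy illustration of morphology-based contour extraction of the kind mentioned above (not the paper's pipeline, which combines enhancement, morphometry, and a CNN), a one-pixel tissue boundary can be taken as the difference between a binary mask and its erosion; the mask shape and 3x3 structuring element are assumptions.

```python
import numpy as np

def binary_erode(mask):
    """3x3 erosion: a pixel stays set only if its full neighborhood is set."""
    padded = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask, dtype=bool)
    for di in range(3):
        for dj in range(3):
            out &= padded[di:di + mask.shape[0], dj:dj + mask.shape[1]]
    return out

def contour(mask):
    """One-pixel-wide boundary of a binary tissue mask."""
    return mask & ~binary_erode(mask)

# Toy "tissue" mask: a filled square; the contour keeps only its border ring.
img = np.zeros((10, 10), dtype=bool)
img[2:8, 2:8] = True
edge = contour(img)
```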

  20. Technology for Elevated Temperature Tests of Structural Panels

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.

    1999-01-01

    A technique for full-field measurement of surface temperature and in-plane strain using a single grid imaging technique was demonstrated on a sample subjected to thermally induced strain. The technique is based on digital imaging of a sample marked by an alternating line array of La2O2S:Eu3+ thermographic phosphor and chromium, illuminated by a UV lamp. Digital images of this array in unstrained and strained states were processed using a modified spin filter. The normal strain distribution was determined by combining unstrained and strained grid images using a single-grid digital moire technique. The temperature distribution was determined by ratioing images of phosphor intensity at two wavelengths. The combined strain and temperature measurements demonstrated on the thermally heated sample were Δε = ±250 με and ΔT = ±5 K, respectively, with a spatial resolution of 0.8 mm.

  1. Regionally adaptive histogram equalization of the chest.

    PubMed

    Sherrier, R H; Johnson, G A

    1987-01-01

    Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique, histograms are calculated locally and then modified according to both the mean pixel value of the region and certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.
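
    A minimal sketch of regionally adaptive histogram equalization follows, assuming non-overlapping tiles and plain per-tile equalization; the published method additionally modifies each local histogram according to the regional mean pixel value and properties of the cumulative distribution function, which is omitted here.

```python
import numpy as np

def hist_equalize(tile, levels=256):
    """Map tile values through the tile's own cumulative distribution function."""
    hist = np.bincount(tile.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / tile.size
    return (cdf[tile] * (levels - 1)).astype(np.uint8)

def regional_equalize(img, tiles=4):
    """Equalize each rectangular region using its local histogram."""
    out = np.empty_like(img)
    h, w = img.shape
    hs, ws = h // tiles, w // tiles
    for i in range(tiles):
        for j in range(tiles):
            r, c = i * hs, j * ws
            out[r:r + hs, c:c + ws] = hist_equalize(img[r:r + hs, c:c + ws])
    return out

rng = np.random.default_rng(0)
img = rng.integers(100, 140, size=(64, 64), dtype=np.uint8)  # low-contrast image
eq = regional_equalize(img)  # local contrast is stretched per region
```

    Production implementations typically also interpolate between neighboring tile mappings to avoid visible block boundaries.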

  2. Skeletonization with hollow detection on gray image by gray weighted distance transform

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Prabir; Qian, Kai; Cao, Siqi; Qian, Yi

    1998-10-01

    A skeletonization algorithm that can be used to process non-uniformly distributed gray-scale images with hollows is presented. This algorithm is based on the Gray Weighted Distance Transformation. The process includes a preliminary phase of investigating the hollows in the gray-scale image; whether these hollows are treated as topological constraints for the skeleton structure depends on whether their depth is statistically significant. We then extract the resulting skeleton, which carries meaningful information for understanding the object in the image. This improved algorithm can overcome possible misinterpretations of some complicated images in the extracted skeleton, especially in images with asymmetric hollows and asymmetric features. The algorithm can be executed on a parallel machine since all of its operations are local. Some examples are discussed to illustrate the algorithm.
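
    The gray-weighted distance transform at the core of such algorithms can be sketched as a shortest-path problem in which the cost of entering a pixel is its gray value. This Dijkstra-based version with 4-connectivity and an arbitrary seed set is an illustrative assumption, not the paper's implementation.

```python
import heapq
import numpy as np

def gray_weighted_distance(img, seeds):
    """Minimal cumulative gray value from any seed pixel to every pixel
    (4-connected): a gray-weighted distance transform via Dijkstra."""
    h, w = img.shape
    dist = np.full((h, w), np.inf)
    heap = [(float(img[r, c]), r, c) for r, c in seeds]
    for d, r, c in heap:
        dist[r, c] = d
    heapq.heapify(heap)
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                nd = d + float(img[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist

# A bright "hollow" in the center is expensive to cross, so the cheapest
# path around it follows the dark border pixels.
img = np.array([[1, 1, 1],
                [1, 9, 1],
                [1, 1, 1]])
d = gray_weighted_distance(img, [(0, 0)])
```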

  3. Radiography for intensive care: participatory process analysis in a PACS-equipped and film/screen environment

    NASA Astrophysics Data System (ADS)

    Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes

    2002-05-01

    If new technology is introduced into medical practice, it must be shown to make a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still under debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews with involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone cannot cover the complete process of 'radiography for intensive care', from the ordering of an image to the provision of the final product (image plus report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved very helpful for evaluating complex work processes linking radiology and the ICU.

  4. Okayama optical polarimetry and spectroscopy system (OOPS) II. Network-transparent control software.

    NASA Astrophysics Data System (ADS)

    Sasaki, T.; Kurakami, T.; Shimizu, Y.; Yutani, M.

    The control system of the OOPS (Okayama Optical Polarimetry and Spectroscopy system) is designed to integrate several instruments whose controllers are distributed over a network: the OOPS instrument, a CCD camera and data acquisition unit, the 91 cm telescope, an autoguider, a weather monitor, and the image display tool SAOimage. With the help of message-based communication, the control processes cooperate with related processes to perform an astronomical observation under the supervisory control of a scheduler process. A logger process collects status data from all the instruments and distributes them to related processes upon request. The software structure of each process is described.

  5. Parallel programming of gradient-based iterative image reconstruction schemes for optical tomography.

    PubMed

    Hielscher, Andreas H; Bartel, Sebastian

    2004-02-01

    Optical tomography (OT) is a fast-developing novel imaging modality that uses near-infrared (NIR) light to obtain cross-sectional views of optical properties inside the human body. A major challenge remains the time-consuming, computationally intensive image reconstruction problem that converts NIR transmission measurements into cross-sectional images. To increase the speed of the iterative image reconstruction schemes that are commonly applied for OT, we have developed and implemented several parallel algorithms on a cluster of workstations. Static process distribution schemes as well as dynamic load balancing schemes suitable for heterogeneous clusters and varying machine performance are introduced and tested. The resulting algorithms are shown to accelerate the reconstruction process to various degrees, substantially reducing the computation times for clinically relevant problems.
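
    The contrast between static process distribution and dynamic load balancing on a heterogeneous cluster can be illustrated with a small scheduling simulation; the task sizes, worker speeds, and greedy earliest-free policy below are assumptions for illustration, not the authors' algorithms.

```python
import heapq

def static_makespan(tasks, speeds):
    """Static distribution: equal contiguous blocks, one block per worker."""
    n = len(speeds)
    block = len(tasks) // n
    times = []
    for w, s in enumerate(speeds):
        chunk = tasks[w * block:(w + 1) * block] if w < n - 1 else tasks[(n - 1) * block:]
        times.append(sum(chunk) / s)
    return max(times)

def dynamic_makespan(tasks, speeds):
    """Dynamic balancing: each task goes to the worker that frees up first
    (ties broken in favor of the faster worker)."""
    heap = [(0.0, -s) for s in speeds]  # (time when free, -speed)
    heapq.heapify(heap)
    makespan = 0.0
    for t in tasks:
        free, neg_s = heapq.heappop(heap)
        free += t / -neg_s
        makespan = max(makespan, free)
        heapq.heappush(heap, (free, neg_s))
    return makespan

tasks = [4.0] * 8              # eight equal reconstruction sub-problems
speeds = [1.0, 1.0, 1.0, 4.0]  # heterogeneous cluster: one node is 4x faster
t_static = static_makespan(tasks, speeds)    # slow nodes dominate the runtime
t_dynamic = dynamic_makespan(tasks, speeds)  # the fast node absorbs extra work
```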

  6. Surface conversion techniques for low energy neutral atom imagers

    NASA Technical Reports Server (NTRS)

    Quinn, J. M.

    1995-01-01

    This investigation has focused on development of key technology elements for low energy neutral atom imaging. More specifically, we have investigated the conversion of low energy neutral atoms to negatively charged ions upon reflection from specially prepared surfaces. This 'surface conversion' technique appears to offer a unique capability of detecting, and thus imaging, neutral atoms at energies of 0.01 - 1 keV with high enough efficiencies to make practical its application to low energy neutral atom imaging in space. Such imaging offers the opportunity to obtain the first instantaneous global maps of macroscopic plasma features and their temporal variation. Through previous in situ plasma measurements, we have a statistical picture of large scale morphology and local measurements of dynamic processes. However, with in situ techniques it is impossible to characterize or understand many of the global plasma transport and energization processes. A series of global plasma images would greatly advance our understanding of these processes and would provide the context for interpreting previous and future in situ measurements. Fast neutral atoms, created from ions that are neutralized in collisions with exospheric neutrals, offer the means for remotely imaging plasma populations. Energy and mass analysis of these neutrals provides critical information about the source plasma distribution. The flux of neutral atoms available for imaging depends upon a convolution of the ambient plasma distribution with the charge exchange cross section for the background neutral population. Some of the highest signals are at relatively low energies (well below 1 keV). This energy range also includes some of the most important plasma populations to be imaged, for example the base of the cleft ion fountain.

  7. Estimation of the processes controlling variability in phytoplankton pigment distributions on the southeastern U.S. continental shelf

    NASA Technical Reports Server (NTRS)

    Mcclain, Charles R.; Ishizaka, Joji; Hofmann, Eileen E.

    1990-01-01

    Five coastal-zone-color-scanner images from the southeastern U.S. continental shelf are combined with concurrent moored current meter measurements to assess the processes controlling the variability in chlorophyll concentration and distribution in this region. An equation governing the space and time distribution of a nonconservative quantity such as chlorophyll is used in the calculations. The terms of the equation, estimated from observations, show that advective, diffusive, and local processes contribute to the plankton distributions and vary with time and location. The results from this calculation are compared with similar results obtained using a numerical physical-biological model with circulation fields derived from an optimal interpolation of the current meter observations and it is concluded that the two approaches produce different estimates of the processes controlling phytoplankton variability.

  8. Wave Processes in Arctic Seas, Observed from TerraSAR-X

    DTIC Science & Technology

    2015-09-30

    Report on wave processes in Arctic seas observed from TerraSAR-X (TS-X), aimed at improving wave models as well as ice models applicable to a changing Arctic wave and ice climate. This includes observation and modelling of wave fields retrieved from the TS-X image swaths, including "Wave Climate and Wave Mixing in the Marginal Ice Zones of Arctic Seas, Observations and Modelling". DISTRIBUTION STATEMENT A: Approved for public release; distribution is unlimited.

  9. Cryomilling for the fabrication of doxorubicin-containing silica-nanoparticle/polycaprolactone nanocomposite films

    NASA Astrophysics Data System (ADS)

    Gao, Yu; Lim, Jing; Han, Yiyuan; Wang, Lifeng; Chong, Mark Seow Khoon; Teoh, Swee-Hin; Xu, Chenjie

    2016-01-01

    Bionanocomposites need to have a homogeneous distribution of nanomaterials in the polymeric matrix to achieve consistent mechanical and biological functions. However, a significant challenge lies in achieving the homogeneous distribution of nanomaterials, particularly through a solvent-free approach. This report introduces a technology to address this need. Specifically, cryomilling, a solvent-free, low-temperature processing method, was applied to generate a bionanocomposite film with well-dispersed nanoparticles. As a proof-of-concept, polycaprolactone (PCL) and doxorubicin-containing silica nanoparticles (Si-Dox) were processed through cryomilling and subsequently heat pressed to form the PCL/Si-Dox (cPCL/Si-Dox) film. Homogeneous distribution of Si-Dox was observed under both confocal imaging and atomic force microscopy imaging. The mechanical properties of cPCL/Si-Dox were comparable to those of the pure PCL film. Subsequent in vitro release profiles suggested that sustained release of Dox from the cPCL/Si-Dox film was achievable over 50 days. When human cervical cancer cells were seeded directly on these films, uptake of Dox was observed as early as day 1 and significant inhibition of cell growth was recorded on day 5. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07287e

  10. Fast Laser Holographic Interferometry For Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Lee, George

    1989-01-01

    Proposed system makes holographic interferograms quickly in wind tunnels. Holograms reveal two-dimensional flows around airfoils and provide information on distributions of pressure, structures of wakes and boundary layers, and density contours of flow fields. Holograms form quickly in thermoplastic plates in the wind tunnel. Plates are rigid and left in place, so neither vibrations nor the photographic-development process degrades the accuracy of the holograms. System processes and analyzes images quickly. Semiautomatic, microcomputer-based desktop image-processing unit now under development moves easily to the wind tunnel, and its speed and memory are adequate for flows about airfoils.

  11. Inferring river bathymetry via Image-to-Depth Quantile Transformation (IDQT)

    USGS Publications Warehouse

    Legleiter, Carl

    2016-01-01

    Conventional, regression-based methods of inferring depth from passive optical image data undermine the advantages of remote sensing for characterizing river systems. This study introduces and evaluates a more flexible framework, Image-to-Depth Quantile Transformation (IDQT), that involves linking the frequency distribution of pixel values to that of depth. In addition, a new image processing workflow involving deep water correction and Minimum Noise Fraction (MNF) transformation can reduce a hyperspectral data set to a single variable related to depth and thus suitable for input to IDQT. Applied to a gravel bed river, IDQT avoided negative depth estimates along channel margins and underpredictions of pool depth. Depth retrieval accuracy (R² = 0.79) and precision (0.27 m) were comparable to an established band ratio-based method, although a small shallow bias (0.04 m) was observed. Several ways of specifying distributions of pixel values and depths were evaluated but had negligible impact on the resulting depth estimates, implying that IDQT was robust to these implementation details. In essence, IDQT uses frequency distributions of pixel values and depths to achieve an aspatial calibration; the image itself provides information on the spatial distribution of depths. The approach thus reduces sensitivity to misalignment between field and image data sets and allows greater flexibility in the timing of field data collection relative to image acquisition, a significant advantage in dynamic channels. IDQT also creates new possibilities for depth retrieval in the absence of field data if a model could be used to predict the distribution of depths within a reach.
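
    The quantile-matching idea behind IDQT can be sketched as follows. This toy version assumes pixel value increases monotonically with depth (for real imagery deeper water is usually darker, so the ordering would be reversed) and is an illustration of the principle only, not the published workflow, which also includes deep-water correction and an MNF transform.

```python
import numpy as np

def idqt(pixels, depth_sample):
    """Image-to-Depth Quantile Transformation (toy form): assign each pixel
    the depth located at the same quantile of the depth distribution as the
    pixel occupies in the pixel-value distribution."""
    ranks = np.argsort(np.argsort(pixels.ravel()))  # 0..n-1 rank of each pixel
    quantiles = (ranks + 0.5) / ranks.size          # mid-rank quantiles in (0, 1)
    depths = np.quantile(depth_sample, quantiles)   # matching depth quantiles
    return depths.reshape(pixels.shape)

# Synthetic example: image brightness increases monotonically with depth.
rng = np.random.default_rng(1)
true_depth = rng.uniform(0.1, 2.0, size=1000)             # field-measured depths
pixels = 10.0 * true_depth + rng.normal(0.0, 0.05, 1000)  # noisy image values
est = idqt(pixels, true_depth)
```

    Note that the calibration is aspatial: only the two frequency distributions are linked, so the field depths need not be co-located with the image pixels.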

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Domenico, Giovanni, E-mail: didomenico@fe.infn.it; Cardarelli, Paolo; Taibi, Angelo

    Purpose: The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of this size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot, through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. Methods: The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that makes it possible to obtain an image of the focal spot through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to the simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. Results: In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing excellent agreement with regard to both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method allowed images of the focal spot to be obtained and compared with the results of standard techniques, namely, the pin-hole camera and the slit camera.
    Conclusions: The method was proven to be effective for simulated images, and the results of the experimental test suggest that it could be considered an alternative technique for focal spot distribution evaluation. The method offers the possibility of measuring the actual focal spot size and emission distribution at the same exposure conditions as in clinical routine, avoiding the high tube loading required by the pin-hole imaging technique.

  13. X-ray focal spot reconstruction by circular penumbra analysis-Application to digital radiography systems.

    PubMed

    Di Domenico, Giovanni; Cardarelli, Paolo; Contillo, Adriano; Taibi, Angelo; Gambaccini, Mauro

    2016-01-01

    The quality of a radiography system is affected by several factors, a major one being the focal spot size of the x-ray tube. In fact, the measurement of this size is recognized to be of primary importance during acceptance tests and image quality evaluations of clinical radiography systems. The most common device providing an image of the focal spot emission distribution is a pin-hole camera, which requires a high tube loading in order to produce a measurable signal. This work introduces an alternative technique to obtain an image of the focal spot, through the processing of a single radiograph of a simple test object, acquired with a suitable magnification. The radiograph of a magnified sharp edge is a well-established method to evaluate the extension of the focal spot profile along the direction perpendicular to the edge. From a single radiograph of a circular x-ray absorber, it is possible to extract simultaneously the radial profiles of several sharp edges with different orientations. The authors propose a technique that makes it possible to obtain an image of the focal spot through the processing of these radial profiles by means of a pseudo-CT reconstruction technique. In order to validate this technique, the reconstruction has been applied to the simulated radiographs of an ideal disk-shaped absorber, generated by various simulated focal spot distributions. Furthermore, the method has been applied to the focal spot of a commercially available mammography unit. In the case of simulated radiographs, the results of the reconstructions have been compared to the original distributions, showing excellent agreement with regard to both the overall distribution and the full width at half maximum measurements. In the case of the experimental test, the method allowed images of the focal spot to be obtained and compared with the results of standard techniques, namely, the pin-hole camera and the slit camera. 
    The method was proven to be effective for simulated images, and the results of the experimental test suggest that it could be considered an alternative technique for focal spot distribution evaluation. The method offers the possibility of measuring the actual focal spot size and emission distribution at the same exposure conditions as in clinical routine, avoiding the high tube loading required by the pin-hole imaging technique.

  14. Correlation processing of polarization inhomogenous images in laser diagnostics of biological tissues

    NASA Astrophysics Data System (ADS)

    Trifonyuk, L.

    2012-10-01

    A model of the interaction of laser radiation with biological tissue as a two-component amorphous-crystalline matrix is proposed. The processes of formation of the polarization of laser radiation are considered, taking into account the birefringence of networks of protein fibrils. The coordinate distribution of polarization states was measured with a laser micropolarimeter. The results of investigating the interrelation between correlation parameters (correlation area, asymmetry coefficient, and excess of the autocorrelation function) and fractal parameters (dispersion of the logarithmic dependencies of power spectra) are presented. These parameters characterize the coordinate distributions of the polarization azimuth of laser images of histological sections of tissues of the female reproductive system and of pathological changes in the human organism. Diagnostic criteria for the onset of vaginal tissue prolapse are determined.

  15. ID card number detection algorithm based on convolutional neural network

    NASA Astrophysics Data System (ADS)

    Zhu, Jian; Ma, Hanjie; Feng, Jie; Dai, Leiyan

    2018-04-01

    In this paper, a new detection algorithm based on a convolutional neural network is presented to realize fast and convenient ID information extraction in multiple scenarios. The algorithm uses a mobile device running the Android operating system to locate and extract the ID number. It exploits the distinctive color distribution of the ID card to select the appropriate channel component; applies image thresholding, noise processing, and morphological processing to binarize the image; and uses image rotation and the projection method for horizontal correction when the image is tilted. Finally, single characters are extracted by the projection method and recognized using the convolutional neural network. Tests show that processing a single ID-number image, from extraction to identification, takes about 80 ms with an accuracy of about 99%, so the method can be applied in real production and living environments.
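
    The projection-based character segmentation step can be sketched as follows, using a synthetic binarized line in place of a real ID-card crop; the blob layout and the zero-column splitting rule are illustrative assumptions.

```python
import numpy as np

def split_by_projection(binary):
    """Cut a binarized text line into characters at empty columns
    of the vertical projection profile."""
    profile = binary.sum(axis=0)
    chars, start = [], None
    for x, v in enumerate(profile):
        if v > 0 and start is None:
            start = x            # a character begins
        elif v == 0 and start is not None:
            chars.append(binary[:, start:x])  # a character ends
            start = None
    if start is not None:
        chars.append(binary[:, start:])      # character touching the right edge
    return chars

# Toy "digits": three filled blobs separated by blank columns.
line = np.zeros((8, 20), dtype=np.uint8)
for x0 in (1, 8, 15):
    line[1:7, x0:x0 + 4] = 1
chars = split_by_projection(line)
```

    Each extracted crop would then be resized and fed to the recognition network.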

  16. Matrix decomposition graphics processing unit solver for Poisson image editing

    NASA Astrophysics Data System (ADS)

    Lei, Zhao; Wei, Li

    2012-10-01

    In recent years, gradient-domain methods have been widely discussed in the image processing field, including seamless cloning and image stitching. These algorithms are commonly carried out by solving a large sparse linear system: the Poisson equation. However, solving the Poisson equation is a computation- and memory-intensive task, which makes it unsuitable for real-time image editing. A new matrix decomposition graphics processing unit (GPU) solver (MDGS) is proposed to address the problem. A matrix decomposition method is used to distribute the work among GPU threads, so that MDGS takes full advantage of the computing power of current GPUs. Additionally, MDGS is a hybrid solver (combining both direct and iterative techniques) and has a two-level architecture. These features enable MDGS to generate solutions identical to those of common Poisson methods and to achieve a high convergence rate in most cases. The approach is advantageous in terms of parallelizability and low memory consumption, enabling real-time image processing and extensive applications.
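
    For reference, the Poisson equation such solvers target can be illustrated with a plain Jacobi iteration. MDGS itself is a hybrid direct/iterative GPU solver, so this single-threaded NumPy sketch (with an assumed mask, guidance field, and iteration count) only shows the problem being solved, not the paper's method.

```python
import numpy as np

def poisson_blend(src, dst, mask, iters=2000):
    """Jacobi iteration for the discrete Poisson equation inside `mask`:
    Laplacian(out) = Laplacian(src), with dst supplying the boundary values."""
    out = dst.astype(float).copy()
    lap = np.zeros_like(out)  # 5-point Laplacian of the source (guidance field)
    lap[1:-1, 1:-1] = (4 * src[1:-1, 1:-1] - src[:-2, 1:-1] - src[2:, 1:-1]
                       - src[1:-1, :-2] - src[1:-1, 2:])
    inner = mask.copy()
    inner[0, :] = inner[-1, :] = inner[:, 0] = inner[:, -1] = False
    for _ in range(iters):
        nb = (np.roll(out, 1, 0) + np.roll(out, -1, 0)
              + np.roll(out, 1, 1) + np.roll(out, -1, 1))
        out[inner] = (nb[inner] + lap[inner]) / 4.0  # Jacobi update
    return out

# Toy check: a source with zero Laplacian blended into a constant image
# should reproduce the constant inside the mask.
dst = np.full((16, 16), 10.0)
src = np.tile(np.arange(16.0), (16, 1))  # linear ramp, zero interior Laplacian
mask = np.zeros((16, 16), dtype=bool)
mask[4:12, 4:12] = True
out = poisson_blend(src, dst, mask)
```

    Jacobi converges slowly on large grids, which is precisely why direct and multilevel components are attractive in a real-time GPU solver.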

  17. IKONOS geometric characterization

    USGS Publications Warehouse

    Helder, Dennis; Coan, Michael; Patrick, Kevin; Gaska, Peter

    2003-01-01

    The IKONOS spacecraft acquired images on July 3, 17, and 25, and August 13, 2001, of Brookings, SD, a small city in east central South Dakota, and on May 22, June 30, and July 30, 2000, of the rural area around the EROS Data Center. South Dakota State University (SDSU) evaluated the Brookings scenes and the USGS EROS Data Center (EDC) evaluated the other scenes. The images evaluated by SDSU utilized various natural objects and man-made features as identifiable targets randomly distributed throughout the scenes, while the images evaluated by EDC utilized pre-marked artificial points (panel points) to provide the best possible targets distributed in a grid pattern. Space Imaging provided products at different processing levels to each institution. For each scene, the pixel (line, sample) locations of the various targets were compared to field-observed, survey-grade Global Positioning System locations. Patterns of error distribution for each product were plotted, and a variety of statistical statements of accuracy are made. The IKONOS sensor also acquired 12 pairs of stereo images of globally distributed scenes between April 2000 and April 2001. For each scene, analysts at the National Imagery and Mapping Agency (NIMA) compared derived photogrammetric coordinates to their corresponding NIMA field-surveyed ground control points (GCPs). NIMA analysts determined horizontal and vertical accuracies by averaging the differences between the derived photogrammetric points and the field-surveyed GCPs for all 12 stereo pairs. Patterns of error distribution for each scene are presented.

  18. Three-Dimensional Optical Mapping of Nanoparticle Distribution in Intact Tissues.

    PubMed

    Sindhwani, Shrey; Syed, Abdullah Muhammad; Wilhelm, Stefan; Glancy, Dylan R; Chen, Yih Yang; Dobosz, Michael; Chan, Warren C W

    2016-05-24

    The role of tissue architecture in mediating nanoparticle transport, targeting, and biological effects is unknown due to the lack of tools for imaging nanomaterials in whole organs. Here, we developed a rapid optical mapping technique to image nanomaterials in intact organs ex vivo and in three-dimensions (3D). We engineered a high-throughput electrophoretic flow device to simultaneously transform up to 48 tissues into optically transparent structures, allowing subcellular imaging of nanomaterials more than 1 mm deep into tissues which is 25-fold greater than current techniques. A key finding is that nanomaterials can be retained in the processed tissue by chemical cross-linking of surface adsorbed serum proteins to the tissue matrix, which enables nanomaterials to be imaged with respect to cells, blood vessels, and other structures. We developed a computational algorithm to analyze and quantitatively map nanomaterial distribution. This method can be universally applied to visualize the distribution and interactions of materials in whole tissues and animals including such applications as the imaging of nanomaterials, tissue engineered constructs, and biosensors within their intact biological environment.

  19. Microscale Effects from Global Hot Plasma Imagery

    NASA Technical Reports Server (NTRS)

    Moore, T. E.; Fok, M.-C.; Perez, J. D.; Keady, J. P.

    1995-01-01

    We have used a three-dimensional model of recovery phase storm hot plasmas to explore the signatures of pitch angle distributions (PADs) in global fast atom imagery of the magnetosphere. The model computes mass-, energy-, and position-dependent PADs based on drift effects, charge exchange losses, and Coulomb drag. The hot plasma PAD strongly influences both the storm current system carried by the hot plasma and its time evolution. In turn, the PAD is strongly influenced by plasma waves through pitch angle diffusion, a microscale effect. We report the first simulated neutral atom images that account for anisotropic PADs within the hot plasma. They exhibit spatial distribution features that correspond directly to the PADs along the lines of sight. We investigate the use of image brightness distributions along tangent-shell field lines to infer equatorial PADs. In tangent-shell regions with minimal spatial gradients, reasonably accurate PADs are inferred from simulated images. They demonstrate the importance of modeling PADs for image inversion and show that comparisons of models with real storm plasma images will reveal the global effects of these microscale processes.

  20. Challenges of implementing digital technology in motion picture distribution and exhibition: testing and evaluation methodology

    NASA Astrophysics Data System (ADS)

    Swartz, Charles S.

    2003-05-01

    The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.

  1. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved great success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. We further discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.
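
    One common construction alluded to above (a hash built from discrete-time quantum walks) can be sketched by simulating the walk classically: each message bit selects one of two coin operators, and the final position probability distribution of the walker is discretized into a digest. The node count, coin angles, and discretization below are illustrative choices, not the authors' parameters.

```python
import numpy as np

def qw_hash(message: bytes, n_nodes: int = 17, digest_len: int = 16) -> bytes:
    """Toy quantum-walk hash: message bits drive the coin of a walk on a cycle."""
    # State: amplitude[node, coin], coin basis {0, 1}; walker starts at node 0.
    state = np.zeros((n_nodes, 2), dtype=complex)
    state[0, 0] = 1.0

    def coin(theta):
        # A real unitary coin operator parameterized by an angle.
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, s], [s, -c]])

    coins = [coin(np.pi / 4), coin(np.pi / 3)]  # bit 0 / bit 1 coins (arbitrary angles)
    for byte in message:
        for i in range(8):
            bit = (byte >> i) & 1
            state = state @ coins[bit].T            # apply the selected coin at every node
            shifted = np.zeros_like(state)
            shifted[:, 0] = np.roll(state[:, 0], -1)  # coin 0 steps left on the cycle
            shifted[:, 1] = np.roll(state[:, 1], 1)   # coin 1 steps right on the cycle
            state = shifted

    probs = np.sum(np.abs(state) ** 2, axis=1)       # final position distribution
    # Discretize the probabilities into a fixed-length digest.
    return bytes(int(p * 1e8) % 256 for p in probs)[:digest_len]
```

The digest is deterministic for a given message, and small input changes perturb the interference pattern of the walk, which is the chaotic sensitivity the abstract exploits.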

  2. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    PubMed Central

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved great success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. We further discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information. PMID:26823196

  3. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption.

    PubMed

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-29

    Quantum information and quantum computation have achieved great success in recent years. In this paper, we investigate the capability of the quantum Hash function, which can be constructed by subtly modifying quantum walks, a well-known quantum computation model. It is found that the quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, the quantum Hash function can also be used for pseudo-random number generation owing to its inherent chaotic dynamics. We further discuss the application of the quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that the quantum Hash function is suitable for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  4. Concrete thawing studied by single-point ramped imaging.

    PubMed

    Prado, P J; Balcom, B J; Beyea, S D; Armstrong, R L; Bremner, T W

    1997-12-01

    A series of two-dimensional images of the proton distribution in a hardened concrete sample was obtained during the thawing process (from -50 degrees C up to 11 degrees C). The SPRITE sequence is optimal for this study given the characteristically short relaxation times of water in this porous medium (T2* < 200 μs and T1 < 3.6 ms). The relaxation parameters of the sample were determined in order to optimize the time efficiency of the sequence, permitting a 4-scan 64 x 64 acquisition in under 3 min. The image acquisition is fast on the time scale of the temperature evolution of the specimen. The frozen water distribution is quantified through a position-based study of the image contrast. A multiple-point acquisition method is presented and the signal sensitivity improvement is discussed.

  5. GPR-Based Water Leak Models in Water Distribution Systems

    PubMed Central

    Ayala-Cabrera, David; Herrera, Manuel; Izquierdo, Joaquín; Ocaña-Levario, Silvia J.; Pérez-García, Rafael

    2013-01-01

    This paper addresses the problem of leakage in water distribution systems through the use of ground penetrating radar (GPR) as a nondestructive method. Laboratory tests are performed to extract features of water leakage from the obtained GPR images. Moreover, a test in a real-world urban system under real conditions is performed. Feature extraction is performed by interpreting GPR images with the support of a pre-processing methodology based on an appropriate combination of statistical methods and multi-agent systems. The results of these tests are presented, interpreted, analyzed and discussed in this paper.

  6. USGS remote sensing coordination for the 2010 Haiti earthquake

    USGS Publications Warehouse

    Duda, Kenneth A.; Jones, Brenda

    2011-01-01

    In response to the devastating 12 January 2010, earthquake in Haiti, the US Geological Survey (USGS) provided essential coordinating services for remote sensing activities. Communication was rapidly established between the widely distributed response teams and data providers to define imaging requirements and sensor tasking opportunities. Data acquired from a variety of sources were received and archived by the USGS, and these products were subsequently distributed using the Hazards Data Distribution System (HDDS) and other mechanisms. Within six weeks after the earthquake, over 600,000 files representing 54 terabytes of data were provided to the response community. The USGS directly supported a wide variety of groups in their use of these data to characterize post-earthquake conditions and to make comparisons with pre-event imagery. The rapid and continuing response achieved was enabled by existing imaging and ground systems, and skilled personnel adept in all aspects of satellite data acquisition, processing, distribution and analysis. The information derived from image interpretation assisted senior planners and on-site teams to direct assistance where it was most needed.

  7. Optical disk processing of solar images.

    NASA Astrophysics Data System (ADS)

    Title, A.; Tarbell, T.

    The current generation of space and ground-based experiments in solar physics produces many megabyte-sized image data arrays. Optical disk technology is the leading candidate for convenient analysis, distribution, and archiving of these data. The authors have been developing data analysis procedures which use both analog and digital optical disks for the study of solar phenomena.

  8. Southwest U.S. Imagery (GOES-WEST) - Satellite Services Division / Office of Satellite Data Processing and Distribution

    Science.gov Websites

    NOAA Office of Satellite and Product Operations (OSPO) page providing GOES-WEST satellite imagery of the southwestern United States, with links to related products such as sea/lake ice, sea surface height, sea surface temperatures, and tropical systems.

  9. An Effective Post-Filtering Framework for 3-D PET Image Denoising Based on Noise and Sensitivity Characteristics

    NASA Astrophysics Data System (ADS)

    Kim, Ji Hye; Ahn, Il Jun; Nam, Woo Hyun; Ra, Jong Beom

    2015-02-01

    Positron emission tomography (PET) images usually suffer from a noticeable amount of statistical noise. In order to reduce this noise, a post-filtering process is usually adopted. However, the performance of this approach is limited because the denoising process is mostly performed under the assumption of Gaussian random noise. It has been reported that in a PET image reconstructed by expectation-maximization (EM), the noise variance of each voxel depends on its mean value, unlike in the case of Gaussian noise. In addition, we observe that the variance also varies with the spatial sensitivity distribution in a PET system, which reflects both the solid angle determined by a given scanner geometry and the attenuation information of a scanned object. Thus, if a post-filtering process based on Gaussian random noise is applied to PET images without consideration of these noise characteristics along with the spatial sensitivity distribution, the spatially variant non-Gaussian noise cannot be reduced effectively. In the proposed framework, to effectively reduce the noise in PET images reconstructed by the 3-D ordinary Poisson ordered-subset EM (3-D OP-OSEM), we first denormalize an image according to the sensitivity of each voxel so that the voxel mean value can represent its statistical properties reliably. Based on our observation that each noisy denormalized voxel has a linear relationship between its mean and variance, we convert this non-Gaussian noise image to a Gaussian noise image. We then apply a block-matching 4-D algorithm that is optimized for noise reduction of Gaussian noise images, and reconvert and renormalize the result to obtain a final denoised image. Using simulated phantom data and clinical patient data, we demonstrate that the proposed framework can effectively suppress the noise over the whole region of a PET image while minimizing degradation of the image resolution.
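
    The pipeline described above (sensitivity denormalization, variance stabilization of mean-dependent noise, Gaussian-domain filtering, then inversion and renormalization) can be sketched as follows. This is an illustrative sketch only: a Gaussian filter stands in for the BM4D step, and the linear mean-variance coefficients `a` and `b` are assumed inputs rather than values from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def denoise_pet(img, sens, a=1.0, b=0.0, sigma=1.5):
    """Sketch of the framework: img is a reconstructed PET volume, sens a
    strictly positive per-voxel sensitivity map of the same shape, and the
    noise is assumed to follow the linear relation var = a*mean + b."""
    x = img / sens                        # denormalize by sensitivity
    # Variance-stabilizing (generalized Anscombe-type) transform: if
    # var = a*mean + b, then (2/a)*sqrt(a*x + 3a^2/8 + b) has ~unit variance.
    t = (2.0 / a) * np.sqrt(np.maximum(a * x + 3.0 * a**2 / 8.0 + b, 0.0))
    t_f = gaussian_filter(t, sigma)       # Gaussian-noise filter (BM4D in the paper)
    # Invert the transform, then renormalize by the sensitivity.
    u = ((a * t_f / 2.0) ** 2 - 3.0 * a**2 / 8.0 - b) / a
    return np.maximum(u, 0.0) * sens
```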

  10. Polarization-singular processing of biological layers laser images to diagnose and classify their optical properties

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Telenga, O. Y.

    2011-09-01

    This work presents the results of an investigation analyzing the coordinate distributions of polarization azimuth and ellipticity (polarization maps) in laser images of blood plasma layers from three groups of patients: healthy (group 1), with dysplasia (group 2), and with cancer of the cervix uteri (group 3). To characterize the polarization maps of all groups of samples, the authors propose three groups of parameters: statistical moments of the first to fourth orders, autocorrelation functions, and logarithmic dependences of the power spectra of the distributions of polarization azimuth and ellipticity in the blood plasma laser images. Criteria for the diagnosis and differentiation of pathological changes of the cervix uteri are established.

  11. Distributed Kernelized Locality-Sensitive Hashing for Faster Image Based Navigation

    DTIC Science & Technology

    2015-03-26

    Facebook, Google, and Yahoo!. Current methods for image retrieval become problematic when implemented on image datasets that can easily reach billions of...correlations. Tech industry leaders like Facebook, Google, and Yahoo! sort and index even larger volumes of “big data” daily. When attempting to process...open source implementation of Google’s MapReduce programming paradigm [13] which has been used for many different things. Using Apache Hadoop, Yahoo

  12. Flame colour characterization in the visible and infrared spectrum using a digital camera and image processing

    NASA Astrophysics Data System (ADS)

    Huang, Hua-Wei; Zhang, Yang

    2008-08-01

    An attempt has been made to characterize the colour spectrum of methane flames under various burning conditions using RGB and HSV colour models instead of resolving the real physical spectrum. The results demonstrate that each type of flame has its own characteristic distribution in both the RGB and HSV space. It has also been observed that the averaged B and G values in the RGB model represent well the CH* and C2* emission of a premixed methane flame. These features may be utilized for flame measurement and monitoring. The great advantage of using a conventional camera for monitoring flame properties based on the colour spectrum is that it is readily available, easy to interface with a computer, cost effective and has a certain spatial resolution. Furthermore, it has been demonstrated that a conventional digital camera is able to image flames not only in the visible spectrum but also in the infrared. This feature is useful in avoiding the problem of image saturation typically encountered in capturing very bright sooty flames. As a result, further digital image processing and quantitative information extraction is possible. It has been identified that an infrared image also has its own distribution in both the RGB and HSV colour space in comparison with a flame image in the visible spectrum.
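
    The kind of colour-space statistics used above (mean channel values in RGB and HSV) takes only a few lines of NumPy. The vectorized RGB-to-HSV conversion and the summary function below are a generic sketch, not the authors' processing chain.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorized RGB -> HSV for an (H, W, 3) float image with values in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    v = img.max(axis=-1)                      # value = max channel
    c = v - img.min(axis=-1)                  # chroma
    s = np.where(v > 0, c / np.where(v > 0, v, 1), 0.0)
    safe_c = np.where(c > 0, c, 1)
    # Hue: piecewise by which channel attains the maximum (ties resolved last-wins).
    h = np.zeros_like(v)
    h = np.where((v == r) & (c > 0), ((g - b) / safe_c) % 6, h)
    h = np.where((v == g) & (c > 0), (b - r) / safe_c + 2, h)
    h = np.where((v == b) & (c > 0), (r - g) / safe_c + 4, h)
    return np.stack([h / 6.0, s, v], axis=-1)

def flame_colour_stats(img):
    """Mean channel values in both colour spaces, as used to characterize a flame."""
    hsv = rgb_to_hsv(img)
    return {"mean_rgb": img.mean(axis=(0, 1)), "mean_hsv": hsv.mean(axis=(0, 1))}
```

Comparing these per-frame statistics across burning conditions is how a characteristic RGB/HSV "signature" per flame type would be built up.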

  13. Sparsity-based image monitoring of crystal size distribution during crystallization

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Huo, Yan; Ma, Cai Y.; Wang, Xue Z.

    2017-07-01

    To facilitate monitoring crystal size distribution (CSD) during a crystallization process by using an in-situ imaging system, a sparsity-based image analysis method is proposed for real-time implementation. To cope with image degradation arising from in-situ measurement subject to particle motion, solution turbulence, and uneven illumination background in the crystallizer, sparse representation of a real-time captured crystal image is developed based on using an in-situ image dictionary established in advance, such that the noise components in the captured image can be efficiently removed. Subsequently, the edges of a crystal shape in a captured image are determined in terms of the salience information defined from the denoised crystal images. These edges are used to derive a blur kernel for reconstruction of a denoised image. A non-blind deconvolution algorithm is given for the real-time reconstruction. Consequently, image segmentation can be easily performed for evaluation of CSD. The crystal image dictionary and blur kernels are timely updated in terms of the imaging conditions to improve the restoration efficiency. An experimental study on the cooling crystallization of α-type L-glutamic acid (LGA) is shown to demonstrate the effectiveness and merit of the proposed method.
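
    Sparse representation over a pre-built dictionary, the core of the denoising step above, can be illustrated with a minimal orthogonal matching pursuit. This is a sketch: the dictionary here is just a matrix of unit-norm columns, whereas the paper builds it from in-situ crystal images, and the atom count is an arbitrary choice.

```python
import numpy as np

def omp_denoise(patch, dictionary, n_atoms=5):
    """Approximate a (flattened) image patch by a sparse combination of
    dictionary atoms; components not representable by few atoms (noise)
    are discarded. dictionary has shape (patch_len, n_atoms_total)."""
    residual = patch.copy()
    idx, coefs = [], None
    for _ in range(n_atoms):
        # Greedily pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(dictionary.T @ residual)))
        if j not in idx:
            idx.append(j)
        sub = dictionary[:, idx]
        # Re-fit all selected atoms jointly by least squares.
        coefs, *_ = np.linalg.lstsq(sub, patch, rcond=None)
        residual = patch - sub @ coefs
    return dictionary[:, idx] @ coefs
```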

  14. Removal of intensity bias in magnitude spin-echo MRI images by nonlinear diffusion filtering

    NASA Astrophysics Data System (ADS)

    Samsonov, Alexei A.; Johnson, Chris R.

    2004-05-01

    MRI data analysis is routinely done on the magnitude part of complex images. While both real and imaginary image channels contain Gaussian noise, magnitude MRI data are characterized by a Rician distribution. However, conventional filtering methods often assume image noise to be zero-mean and Gaussian distributed. Estimation of an underlying image using magnitude data therefore produces a biased result. The bias may lead to significant image errors, especially in areas of low signal-to-noise ratio (SNR). The incorporation of the Rician PDF into a noise filtering procedure can significantly complicate the method both algorithmically and computationally. In this paper, we demonstrate that the inherent image phase smoothness of spin-echo MRI images can be utilized for separate filtering of the real and imaginary complex image channels to achieve unbiased image denoising. The concept is demonstrated with a novel nonlinear diffusion filtering scheme developed for complex image filtering. In our proposed method, the separate diffusion processes are coupled through combined diffusion coefficients determined from the image magnitude. The new method has been validated with simulated and real MRI data. It provided efficient denoising and bias removal in conventional and black-blood angiography MRI images obtained using fast spin-echo acquisition protocols.
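
    The bias the authors describe is easy to demonstrate numerically: at zero true signal, the magnitude of complex Gaussian noise is Rayleigh-distributed with mean σ·√(π/2) rather than zero, whereas averaging (filtering) the real and imaginary channels separately, the strategy exploited above, is unbiased. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma, n = 1.0, 200_000

# A background voxel (true signal = 0): zero-mean Gaussian noise in each channel.
real = rng.normal(0, sigma, n)
imag = rng.normal(0, sigma, n)

# Averaging the magnitude image is biased: for zero signal the magnitude is
# Rayleigh-distributed with mean sigma*sqrt(pi/2) ~= 1.2533*sigma, not 0.
mag_mean = np.abs(real + 1j * imag).mean()

# Averaging each channel separately, then taking the magnitude of the result,
# recovers the true zero signal (no Rician bias).
chan_mean = abs(real.mean() + 1j * imag.mean())
```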

  15. Near-infrared chemical imaging (NIR-CI) as a process monitoring solution for a production line of roll compaction and tableting.

    PubMed

    Khorasani, Milad; Amigo, José M; Sun, Changquan Calvin; Bertelsen, Poul; Rantanen, Jukka

    2015-06-01

    In the present study, the application of near-infrared chemical imaging (NIR-CI) supported by chemometric modeling as a non-destructive tool for monitoring and assessing the roller compaction and tableting processes was investigated. Based on a preliminary risk assessment, discussion with experts and current work from the literature, the critical process parameters (roll pressure and roll speed) and critical quality attributes (ribbon porosity, granule size, amount of fines, tablet tensile strength) were identified and a design space was established. Five experimental runs with different process settings were carried out, which yielded intermediates (ribbons, granules) and final products (tablets) with different properties. A principal component analysis (PCA) based model of the NIR images was applied to map the ribbon porosity distribution. The ribbon porosity distribution gained from the PCA-based NIR-CI was used to develop predictive models for granule size fractions. Predictive methods with acceptable R(2) values could be used to predict the granule particle size. A partial least squares regression (PLS-R) based model of the NIR-CI was used to map and predict the chemical distribution and content of the active compound for both roller-compacted ribbons and the corresponding tablets. In order to select the optimal process setting, the standard deviation of tablet tensile strength and tablet weight for each tablet batch was considered. Strong linear correlations were established between tablet tensile strength and the amount of fines, and between tablet tensile strength and granule size. These approaches are considered to have a potentially large impact on quality monitoring and control of continuously operating manufacturing lines, such as roller compaction and tableting processes. Copyright © 2015 Elsevier B.V. All rights reserved.
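
    The PCA step used above to map porosity contrast from the hyperspectral cube can be sketched with a plain SVD. The function below is a generic per-pixel score-map computation on mean-centered spectra; it makes no claim to reproduce the paper's preprocessing or calibration models.

```python
import numpy as np

def pca_score_map(cube, n_components=2):
    """PCA of an NIR chemical image. cube has shape (rows, cols, wavelengths);
    returns per-pixel score maps (e.g. to visualize porosity/chemical contrast)
    and the corresponding spectral loadings."""
    h, w, k = cube.shape
    X = cube.reshape(-1, k)
    Xc = X - X.mean(axis=0)                      # mean-center the spectra
    # SVD of the centered data: rows of Vt are loadings, U*S are scores.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    return scores.reshape(h, w, n_components), Vt[:n_components]
```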

  16. Dynamic imaging model and parameter optimization for a star tracker.

    PubMed

    Yan, Jinyun; Jiang, Jie; Zhang, Guangjun

    2016-03-21

    Under dynamic conditions, star spots move across the image plane of a star tracker and form a smeared star image. This smearing effect increases errors in star position estimation and degrades attitude accuracy. First, an analytical energy distribution model of a smeared star spot is established based on a line segment spread function because the dynamic imaging process of a star tracker is equivalent to the static imaging process of linear light sources. The proposed model, which has a clear physical meaning, explicitly reflects the key parameters of the imaging process, including incident flux, exposure time, velocity of a star spot in an image plane, and Gaussian radius. Furthermore, an analytical expression of the centroiding error of the smeared star spot is derived using the proposed model. An accurate and comprehensive evaluation of centroiding accuracy is obtained based on the expression. Moreover, analytical solutions of the optimal parameters are derived to achieve the best performance in centroid estimation. Finally, we perform numerical simulations and a night sky experiment to validate the correctness of the dynamic imaging model, the centroiding error expression, and the optimal parameters.
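
    The analytical model described, where dynamic imaging is equivalent to static imaging of a line segment, amounts to convolving a Gaussian PSF with a segment of length v·t_exp: a difference of error functions along the motion direction and a plain Gaussian across it. A sketch under those assumptions (the symbols follow the text, but the normalization convention is my own):

```python
import numpy as np
from scipy.special import erf

def smeared_spot(x, y, flux=1.0, t_exp=1.0, v=0.0, sigma=1.0):
    """Energy distribution of a star spot smeared along +x at speed v
    (pixels/s) during exposure t_exp, for a Gaussian PSF of radius sigma.
    Integrates to flux * t_exp over the image plane."""
    L = v * t_exp
    gy = np.exp(-y**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    if L == 0:  # static case: an ordinary 2-D Gaussian spot
        gx = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
        return flux * t_exp * gx * gy
    # Box of length L convolved with the Gaussian: difference of error functions.
    gx = (erf((x + L / 2) / (sigma * np.sqrt(2)))
          - erf((x - L / 2) / (sigma * np.sqrt(2)))) / (2 * L)
    return flux * t_exp * gx * gy
```

Summing this distribution over a pixel grid gives simulated smeared spots on which centroiding error can be evaluated against exposure time and angular velocity.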

  17. Electronic Photography at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Holm, Jack; Judge, Nancianne

    1995-01-01

    An electronic photography facility has been established in the Imaging & Photographic Technology Section, Visual Imaging Branch, at the NASA Langley Research Center (LaRC). The purpose of this facility is to provide the LaRC community with access to digital imaging technology. In particular, capabilities have been established for image scanning, direct image capture, optimized image processing for storage, image enhancement, and optimized device dependent image processing for output. Unique approaches include: evaluation and extraction of the entire film information content through scanning; standardization of image file tone reproduction characteristics for optimal bit utilization and viewing; education of digital imaging personnel on the effects of sampling and quantization to minimize image processing related information loss; investigation of the use of small kernel optimal filters for image restoration; characterization of a large array of output devices and development of image processing protocols for standardized output. Currently, the laboratory has a large collection of digital image files which contain essentially all the information present on the original films. These files are stored at 8-bits per color, but the initial image processing was done at higher bit depths and/or resolutions so that the full 8-bits are used in the stored files. The tone reproduction of these files has also been optimized so the available levels are distributed according to visual perceptibility. Look up tables are available which modify these files for standardized output on various devices, although color reproduction has been allowed to float to some extent to allow for full utilization of output device gamut.
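
    Standardized output through look-up tables, as described above, reduces to a per-pixel table lookup on the 8-bit values. The plain gamma curve below merely stands in for the lab's device-characterization-derived tables; the exponent is an illustrative assumption.

```python
import numpy as np

def make_gamma_lut(gamma=1 / 2.2):
    """Build an 8-bit look-up table that redistributes levels along a gamma
    curve (a stand-in for a perceptually derived device-characterization LUT)."""
    x = np.arange(256) / 255.0
    return np.clip(np.round(255.0 * x**gamma), 0, 255).astype(np.uint8)

def apply_lut(img8, lut):
    """Apply the LUT to a uint8 image via vectorized table lookup."""
    return lut[img8]
```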

  18. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Vho, Alice; Bistacchi, Andrea

    2015-04-01

    A quantitative analysis of fault-rock distribution is of paramount importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation along faults at depth. Here we present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM). This workflow has been developed on a real case study: the strike-slip Gole Larghe Fault Zone (GLFZ), a fault zone exhumed from ca. 10 km depth, hosted in granitoid rocks of the Adamello batholith (Italian Southern Alps). Individual seismogenic slip surfaces generally show green cataclasites (cemented by the precipitation of epidote and K-feldspar from hydrothermal fluids) and more or less well preserved pseudotachylytes (black when well preserved, greenish to white when altered). First, a digital model of the outcrop is reconstructed with photogrammetric techniques, using a large number of high-resolution digital photographs processed with the VisualSFM software. By using high-resolution photographs the DOM can reach a much higher resolution than with LIDAR surveys, up to 0.2 mm/pixel. Then, image processing is performed to map the fault-rock distribution with the ImageJ-Fiji package. Green cataclasites and epidote/K-feldspar veins can be quite easily separated from the host rock (tonalite) using spectral analysis. In particular, band ratio and principal component analysis have been tested successfully. The mapping of black pseudotachylyte veins is trickier because the differences between the pseudotachylyte and biotite spectral signatures are not appreciable. For this reason we have tested different morphological processing tools aimed at identifying (and subtracting) the tiny biotite grains. We propose a solution based on binary images involving a combination of size and circularity thresholds. Comparing the results with manually segmented images, we noticed that major problems occur only when pseudotachylyte veins are very thin and discontinuous. After testing and refining the image-analysis processing on some typical images, we recorded a macro with ImageJ-Fiji that processes all the images for a given DOM. As a result, the three different types of rocks can be semi-automatically mapped on large DOMs using a simple and efficient procedure. This enables quantitative analyses of fault-rock distribution and thickness, fault trace roughness/curvature and length, fault zone architecture, and alteration halos due to hydrothermal fluid-rock interaction. To improve our workflow, additional or different morphological operators could be integrated into our procedure to yield a better resolution on small and thin pseudotachylyte veins (e.g. perimeter/area ratio).
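
    Screening binary-image components by size and circularity, as proposed above, can be sketched outside ImageJ-Fiji as well. The SciPy version below approximates each component's perimeter by counting boundary pixels, and the thresholds are illustrative rather than the values used in the workflow.

```python
import numpy as np
from scipy import ndimage

def filter_by_size_and_circularity(mask, min_area=20, min_circ=0.3):
    """Keep connected components of a binary image that pass both an area
    threshold and a circularity threshold (circularity = 4*pi*area/perimeter^2,
    ~1 for a disc, small for elongated grains such as biotite)."""
    labels, n = ndimage.label(mask)
    out = np.zeros_like(mask, dtype=bool)
    for i in range(1, n + 1):
        comp = labels == i
        area = int(comp.sum())
        # Approximate the perimeter by the number of boundary pixels.
        boundary = comp & ~ndimage.binary_erosion(comp)
        perim = int(boundary.sum())
        circ = 4 * np.pi * area / perim**2 if perim > 0 else 0.0
        if area >= min_area and circ >= min_circ:
            out |= comp
    return out
```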

  19. Neural network and its application to CT imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nikravesh, M.; Kovscek, A.R.; Patzek, T.W.

    We present an integrated approach to imaging the progress of air displacement by spontaneous imbibition of oil into sandstone. We combine Computerized Tomography (CT) scanning and neural network image processing. The main aspects of our approach are (I) visualization of the distribution of oil and air saturation by CT, (II) interpretation of CT scans using neural networks, and (III) reconstruction of 3-D images of oil saturation from the CT scans with a neural network model. Excellent agreement between the actual images and the neural network predictions is found.

  20. Small-Animal Imaging Using Diffuse Fluorescence Tomography.

    PubMed

    Davis, Scott C; Tichauer, Kenneth M

    2016-01-01

    Diffuse fluorescence tomography (DFT) has been developed to image the spatial distribution of fluorescence-tagged tracers in living tissue. This capability facilitates the recovery of any number of functional parameters, including enzymatic activity, receptor density, blood flow, and gene expression. However, deploying DFT effectively is complex and often requires years of know-how, especially for newer multimodal systems that combine DFT with conventional imaging systems. In this chapter, we step through the process of MRI-DFT imaging of a receptor-targeted tracer in small animals.

  1. [Investigation on remote measurement of air pollution by a method of infrared passive scanning imaging].

    PubMed

    Jiao, Yang; Xu, Liang; Gao, Min-Guang; Feng, Ming-Chun; Jin, Ling; Tong, Jing-Jing; Li, Sheng

    2012-07-01

    Passive remote sensing by Fourier-transform infrared (FTIR) spectrometry allows the detection of air pollution. However, for the localization of a leak and a complete assessment of the situation in the case of the release of a hazardous cloud, information about the position and the distribution of the cloud is essential. Therefore, an imaging passive remote sensing system comprising an interferometer, data acquisition and processing software, a scanning system, a video system, and a personal computer has been developed. Remote sensing of SF6 was demonstrated. The column densities of all directions in which a target compound has been identified are retrieved by a nonlinear least-squares fitting algorithm and a radiative transfer algorithm, and a false-color image is displayed. The results were visualized by a video image overlaid by a false-color concentration distribution image. The system has high selectivity, and allows visualization and quantification of pollutant clouds.
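
    The retrieval step, fitting a column density per viewing direction by nonlinear least squares against a forward model, can be illustrated with a deliberately simplified Beer-Lambert model. The band shape, units, and parameter values below are toy assumptions for illustration, not the system's actual radiative-transfer algorithm.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "measured" spectrum: background attenuated by a gas with a known
# (toy, Gaussian-shaped) absorption cross-section near an SF6 band.
wavenumber = np.linspace(940, 960, 200)                  # cm^-1 (illustrative)
xsec = np.exp(-((wavenumber - 948.0) ** 2) / 2.0)        # toy cross-section
true_cl, background = 0.8, 1.0                           # column density (arb.), background
measured = background * np.exp(-true_cl * xsec)
measured += np.random.default_rng(3).normal(0, 0.002, measured.size)

def residual(p):
    """Forward model minus measurement; p = (background level, column density)."""
    bg, cl = p
    return bg * np.exp(-cl * xsec) - measured

fit = least_squares(residual, x0=[0.9, 0.1])
bg_fit, cl_fit = fit.x
```

Repeating such a fit for every scan direction in which the compound is identified yields the column-density values rendered as the false-color image.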

  2. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2013-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, clusters, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10 times performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
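
    The distributable block-volume idea can be sketched in a few lines: split the volume into blocks, process each block with a halo of surrounding voxels so filter output at block borders matches the full-volume result, and fan the blocks out to a worker pool. The thread pool and Gaussian filter below are generic stand-ins for the platform's scheduler and algorithms.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from scipy.ndimage import gaussian_filter

def process_blocks(volume, block_size=32, halo=4, workers=4, sigma=1.0):
    """Filter a 3-D volume block by block, in parallel. Each block is extracted
    with 'halo' extra slices of context so the filter sees correct neighborhoods
    at block borders; only the block core is written back."""
    out = np.empty_like(volume)
    nz = volume.shape[0]

    def work(z0):
        z1 = min(z0 + block_size, nz)
        lo, hi = max(0, z0 - halo), min(nz, z1 + halo)
        filtered = gaussian_filter(volume[lo:hi], sigma)
        out[z0:z1] = filtered[z0 - lo : z0 - lo + (z1 - z0)]  # keep the core only

    with ThreadPoolExecutor(max_workers=workers) as ex:
        list(ex.map(work, range(0, nz, block_size)))  # force completion, surface errors
    return out
```

With a halo at least as large as the filter's kernel radius, the blockwise result matches filtering the whole volume at once, which is what makes the decomposition transparent to the algorithms running on top of it.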

  3. Advancements of labelled radio-pharmaceutics imaging with the PIM-MPGD

    NASA Astrophysics Data System (ADS)

    Donnard, J.; Arlicot, N.; Berny, R.; Carduner, H.; Leray, P.; Morteau, E.; Servagent, N.; Thers, D.

    2009-11-01

    The Beta autoradiography is widely used in pharmacology or in biological fields to study the response of an organism to a certain kind of molecule. The image of the distribution is processed by studying the concentration of the radioactivity into different organs. We report on the development of an integrated apparatus based on a PIM device (Parallel Ionization Multiplier) able to process the image of 10 microscope slides at the same time over an area of 18*18 cm2. Thanks to a vacuum pump and a regulation gas circuit, 5 minutes is sufficient to begin an acquisition. All the electronics and the gas distribution are included in the structure leading to a transportable device. Special software has been developed to process data in real time with image visualization. Biological samples can be labelled with β emitters of low energy like 3H/14C or Auger electrons of 125I/99mTc. The measured spatial resolution is 30 μm in 3H and the trigger and the charge rate are constant over more than 6 days of acquisition showing good stability of the device. Moreover, collaboration with doctors and biologists of INSERM (National Institute for Medical Research in France) has started in order to demonstrate that MPGD's can be easily proposed outside a physics laboratory.

  4. Better Pictures in a Snap

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Retinex Image Processing, winner of NASA's 1999 Space Act Award, is commercially available through TruView Imaging Company. With this technology, amateur photographers use their personal computers to improve the brightness, scene contrast, detail, and overall sharpness of images with increased ease. The process was originally developed for remote sensing of the Earth by researchers at Langley Research Center and Science and Technology Corporation (STC). It automatically enhances a digital image in terms of dynamic range compression, color independence from the spectral distribution of the scene illuminant, and color/lightness rendition. As a result, the enhanced digital image is much closer to the scene perceived by the human visual system under all kinds and levels of lighting variations. TruView believes there are other applications for the software in medical imaging, forensics, security, reconnaissance, mining, assembly, and other industrial areas.
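
    The enhancement described follows the classic retinex form: the log of the image minus the log of a large-scale Gaussian-surround estimate of the illumination, which yields dynamic-range compression and illuminant independence. The single-scale sketch below (with an ad hoc display stretch) illustrates the idea only; it is not NASA's actual implementation, which uses a multi-scale formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def single_scale_retinex(img, sigma=30.0, eps=1e-6):
    """Single-scale retinex on a grayscale float image: log(image) minus the
    log of its Gaussian-surround illumination estimate, then stretched to
    [0, 1] for display. Multi-scale retinex averages several sigmas."""
    img = img.astype(float) + eps                 # avoid log(0)
    surround = gaussian_filter(img, sigma) + eps  # large-scale illumination estimate
    r = np.log(img) - np.log(surround)
    return (r - r.min()) / (r.max() - r.min() + eps)
```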

  5. Novel wavelength diversity technique for high-speed atmospheric turbulence compensation

    NASA Astrophysics Data System (ADS)

    Arrasmith, William W.; Sullivan, Sean F.

    2010-04-01

    The defense, intelligence, and homeland security communities are driving a need for software-dominant, real-time or near-real-time atmospheric-turbulence-compensated imagery. The development of parallel processing capabilities is finding application in diverse areas including image processing, target tracking, pattern recognition, and image fusion, to name a few. A novel approach to the computationally intensive case of software-dominant optical and near-infrared imaging through atmospheric turbulence is addressed in this paper. Previously, the somewhat conventional wavelength diversity method has been used to compensate for atmospheric turbulence with great success. We apply a new correlation-based approach to the wavelength diversity methodology using a parallel processing architecture, enabling high-speed atmospheric turbulence compensation. Methods for optical imaging through distributed turbulence are discussed, simulation results are presented, and computational and performance assessments are provided.

  6. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD-interferometry-based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the Internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: the specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between the equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
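    As a sketch of the component communication described above, the following illustrates a predefined, length-prefixed TCP/IP socket message exchange between a client and a cell-controller-style server. The message names, JSON framing, and field names are hypothetical illustrations, not the actual MIT protocol:

```python
import json
import socket
import threading

def _recv_exact(sock, n):
    """Read exactly n bytes from a stream socket."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed early")
        data += chunk
    return data

def send_msg(sock, msg):
    """Length-prefixed JSON framing (an illustrative convention)."""
    payload = json.dumps(msg).encode()
    sock.sendall(len(payload).to_bytes(4, "big") + payload)

def recv_msg(sock):
    length = int.from_bytes(_recv_exact(sock, 4), "big")
    return json.loads(_recv_exact(sock, length))

def cell_controller(server_sock):
    # Accepts one connection and answers a (hypothetical) status query.
    conn, _ = server_sock.accept()
    with conn:
        msg = recv_msg(conn)
        if msg["type"] == "ETCH_STATUS_REQ":
            send_msg(conn, {"type": "ETCH_STATUS_ACK", "endpoint_reached": False})

server = socket.socket()
server.bind(("127.0.0.1", 0))       # ephemeral port on localhost
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=cell_controller, args=(server,))
t.start()

client = socket.socket()
client.connect(("127.0.0.1", port))
send_msg(client, {"type": "ETCH_STATUS_REQ"})
reply = recv_msg(client)
client.close()
t.join()
server.close()
```

    Any framing convention works as long as every component agrees on it; a length prefix keeps message boundaries explicit on a stream socket, which is what lets components run on the same or different machines.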

  7. TANGO: a generic tool for high-throughput 3D image analysis for studying nuclear organization.

    PubMed

    Ollion, Jean; Cochennec, Julien; Loll, François; Escudé, Christophe; Boudier, Thomas

    2013-07-15

    The cell nucleus is a highly organized cellular organelle that contains the genetic material. The study of nuclear architecture has become an important field of cellular biology. Extracting quantitative data from 3D fluorescence imaging helps in understanding the functions of different nuclear compartments. However, such approaches are limited by the requirement for processing and analyzing large sets of images. Here, we describe Tools for Analysis of Nuclear Genome Organization (TANGO), an image analysis tool dedicated to the study of nuclear architecture. TANGO is a coherent framework allowing biologists to perform the complete analysis process of 3D fluorescence images by combining two environments: ImageJ (http://imagej.nih.gov/ij/) for image processing and quantitative analysis, and R (http://cran.r-project.org) for statistical processing of measurement results. It includes an intuitive user interface providing the means to precisely build a segmentation procedure and set up analyses, without requiring programming skills. TANGO is a versatile tool able to process large sets of images, allowing quantitative study of nuclear organization. TANGO is composed of two programs: (i) an ImageJ plug-in and (ii) a package (rtango) for R. Both are free and open source, available (http://biophysique.mnhn.fr/tango) for Linux, Microsoft Windows and Macintosh OSX. Distribution is under the GPL v.2 licence. thomas.boudier@snv.jussieu.fr Supplementary data are available at Bioinformatics online.

  8. Autonomous chemical and biological miniature wireless-sensor

    NASA Astrophysics Data System (ADS)

    Goldberg, Bar-Giora

    2005-05-01

    The presentation discusses a new concept and a paradigm shift in biological, chemical, and explosive sensor system design and deployment: from large, heavy, centralized, and expensive systems to distributed wireless sensor networks utilizing miniature platforms (nodes) that are lightweight, low cost, and wirelessly connected. These new systems are possible due to the emergence and convergence of new innovative radio, imaging, networking, and sensor technologies. Miniature integrated radio-sensor networks are a technology whose time has come. These network systems are based on large numbers of distributed low-cost, short-range wireless platforms that sense and process their environment and communicate data through a network to a command center. The recent emergence of chemical and explosive sensor technology based on silicon nanostructures, coupled with the fast evolution of low-cost CMOS imagers, low-power DSP engines, and integrated radio chips, has created an opportunity to realize the vision of autonomous wireless networks. These threat detection networks will perform sophisticated analysis at the sensor node and convey alarm information up the command chain. Sensor networks of this type are expected to revolutionize the ability to detect and locate biological, chemical, or explosive threats. The ability to distribute large numbers of low-cost sensors over large areas enables these devices to be close to the targeted threats and therefore improve detection efficiencies and enable rapid counter-responses. These sensor networks will be used for homeland security, shipping container monitoring, and other applications such as laboratory medical analysis, drug discovery, automotive, environmental, and/or in-vivo monitoring. Avaak's system concept is to image a chromatic biological, chemical, and/or explosive sensor utilizing a digital imager, analyze the images, and distribute alarm or image data wirelessly through the network.
All the imaging, processing, and communications would take place within the miniature, low-cost distributed sensor platforms. This concept, however, presents a significant challenge due to the combination and convergence of required new technologies mentioned above. Passive biological and chemical sensors with very high sensitivity, and which require no assaying, are in development using a technique to optically and chemically encode silicon wafers with tailored nanostructures. The silicon wafer is patterned with nano-structures designed to change colors and patterns when exposed to the target analytes (TICs, TIMs, VOCs). A small video camera detects the color and pattern changes on the sensor. To determine if an alarm condition is present, an on-board DSP processor, using specialized image processing algorithms and statistical analysis, determines if color gradient changes occurred on the sensor array. These sensors can detect several agents simultaneously. This system is currently under development by Avaak, with funding from DARPA through an SBIR grant.
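A minimal sketch of the on-board alarm decision described above, comparing a baseline image of the sensor array against the current one, might look as follows. The metric, threshold, and simulated color shift are illustrative assumptions, not Avaak's actual algorithm:

```python
import numpy as np

def color_change_alarm(baseline, current, threshold=0.1):
    """Flag an alarm if the mean per-channel color shift between the
    baseline and current sensor images exceeds a threshold (both the
    metric and the threshold are illustrative, not the real algorithm)."""
    base = baseline.astype(float) / 255.0
    cur = current.astype(float) / 255.0
    return bool(np.abs(cur - base).mean() > threshold)

rng = np.random.default_rng(0)
baseline = rng.integers(100, 120, size=(64, 64, 3), dtype=np.uint8)
exposed = baseline.copy()
# simulate a strong red shift after analyte exposure (no clipping occurs here)
exposed[..., 0] = np.clip(exposed[..., 0].astype(int) + 120, 0, 255).astype(np.uint8)
```

A real node would additionally localize which patch of the patterned sensor changed, since different nanostructure patches respond to different analytes.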

  9. Multispectral Imaging Broadens Cellular Analysis

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Amnis Corporation, a Seattle-based biotechnology company, developed ImageStream to produce sensitive fluorescence images of cells in flow. The company responded to an SBIR solicitation from Ames Research Center, and proposed to evaluate several methods of extending the depth of field for its ImageStream system and implement the best as an upgrade to its commercial products. This would allow users to view whole cells at the same time, rather than just one section of each cell. Through Phase I and II SBIR contracts, Ames provided Amnis the funding the company needed to develop this extended functionality. For NASA, the resulting high-speed image flow cytometry process made its way into Medusa, a life-detection instrument built to collect, store, and analyze sample organisms from erupting hydrothermal vents, and has the potential to benefit space flight health monitoring. On the commercial end, Amnis has implemented the process in ImageStream, combining high-resolution microscopy and flow cytometry in a single instrument, giving researchers the power to conduct quantitative analyses of individual cells and cell populations at the same time, in the same experiment. ImageStream is also built for many other applications, including cell signaling and pathway analysis; classification and characterization of peripheral blood mononuclear cell populations; quantitative morphology; apoptosis (cell death) assays; gene expression analysis; analysis of cell conjugates; molecular distribution; and receptor mapping and distribution.

  10. SU-E-T-04: 3D Dose Based Patient Compensator QA Procedure for Proton Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, W; Reyhan, M; Zhang, M

    2015-06-15

    Purpose: In proton double-scattering radiotherapy, compensators are essential patient-specific devices that contour the distal dose distribution to the tumor target. Traditional compensator QA is limited to checking the drilled surface profiles against the plan. In our work, a compensator QA process was established that assesses the entire compensator, including its internal structure, for patient 3D dose verification. Methods: The fabricated patient compensators were CT scanned. Through mathematical image processing and geometric transformations, the CT images of the proton compensator were combined with the patient simulation CT images into a new series of CT images, in which the imaged compensator is placed at the planned location along the corresponding beam line. The new CT images were input into the Eclipse treatment planning system. The original plan was calculated on the combined CT image series without the plan compensator. The newly computed patient 3D dose from the combined patient-compensator images was verified against the original plan dose. Test plans included compensators with defects intentionally created inside the fabricated compensators. Results: The calculated 3D dose with the combined compensator and patient CT images reflects the impact of the fabricated compensator on the patient. For the test cases in which no defects were created, the dose distributions were in agreement between our method and the corresponding original plans. For the compensators with defects, the purposely changed material and a purposely created internal defect were successfully detected, which is not possible with traditional compensator profile checks alone. Conclusion: We present here a 3D dose verification process to qualify the fabricated proton double-scattering compensator.
This compensator QA process assesses the patient 3D impact of the fabricated compensator's surface profile as well as changes in the compensator's internal material and structure. This research receives funding support from CURA Medical Technologies.
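The key step above, inserting the scanned compensator volume into the patient CT grid at the planned beam-line location, can be sketched as follows. Voxel-aligned placement and the air threshold are simplifying assumptions; the actual workflow involves full geometric transformations:

```python
import numpy as np

def embed_compensator(patient_ct, comp_ct, origin):
    """Place the scanned compensator volume into the patient CT grid at the
    planned beam-line location (origin, in voxel indices). Air voxels in the
    compensator scan (HU <= -900 here, an illustrative threshold) are left
    transparent so only compensator material overwrites the grid."""
    combined = patient_ct.copy()
    z, y, x = origin
    dz, dy, dx = comp_ct.shape
    region = combined[z:z+dz, y:y+dy, x:x+dx]   # view into the copy
    material = comp_ct > -900
    region[material] = comp_ct[material]
    return combined

patient = np.full((40, 40, 40), -1000, dtype=np.int16)  # air-filled grid
comp = np.full((8, 8, 8), 100, dtype=np.int16)          # acrylic-like HU
comp[3:5, 3:5, 3:5] = -1000                             # internal air defect
combined = embed_compensator(patient, comp, (2, 16, 16))
```

Because the defect voxels survive into the combined series, the recomputed dose downstream of the defect differs from the plan, which is how the internal flaw becomes detectable.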

  11. 15 CFR 960.11 - Conditions for operation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... During such limitations, the licensee shall, on request, provide unenhanced restricted images on a..., processing, archiving and dissemination. (i) If the operating license restricts the distribution of certain...

  12. Northern Everglades, Florida, satellite image map

    USGS Publications Warehouse

    Thomas, Jean-Claude; Jones, John W.

    2002-01-01

    These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program with support from the Everglades National Park. The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.

  13. Rapid Measurements of Aerosol Size Distribution and Hygroscopic Growth via Image Processing with a Fast Integrated Mobility Spectrometer (FIMS)

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Pinterich, T.; Spielman, S. R.; Hering, S. V.; Wang, J.

    2017-12-01

    Aerosol size distribution and hygroscopicity are among the key parameters in determining the impact of atmospheric aerosols on global radiation and climate change. In situ submicron aerosol size distribution measurements commonly involve a scanning mobility particle sizer (SMPS). The SMPS scanning time is on the scale of minutes, which is often too slow to capture the variation of aerosol size distribution, such as for aerosols formed via nucleation processes or for measurements onboard research aircraft. To solve this problem, a Fast Integrated Mobility Spectrometer (FIMS) based on image processing was developed for rapid measurements of aerosol size distributions from 10 to 500 nm. The FIMS consists of a parallel plate classifier, a condenser, and a CCD detector array. Inside the classifier, an electric field separates charged aerosols based on their electrical mobilities. Upon exiting the classifier, the aerosols pass through a three-stage growth channel (Pinterich et al. 2017; Spielman et al. 2017), where aerosols as small as 7 nm are enlarged to above 1 μm through water or heptanol condensation. Finally, the grown aerosols are illuminated by a laser sheet and imaged onto a CCD array. The images provide both aerosol concentration and position, which directly relate to the aerosol size distribution. By this simultaneous measurement of aerosols with different sizes, the FIMS provides aerosol size spectra nearly 100 times faster than the SMPS. Recent deployment onboard research aircraft demonstrated that the FIMS is capable of measuring aerosol size distributions in 1 s (Figure), thereby offering a great advantage in applications requiring high time resolution (Wang et al. 2016). In addition, the coupling of the FIMS with other conventional aerosol instruments provides orders-of-magnitude more rapid characterization of aerosol optical and microphysical properties.
For example, the combination of a differential mobility analyzer, a relative humidity control unit, and a FIMS was used to measure aerosol hygroscopic growth. Such a system reduced the time needed to measure the hygroscopic properties of submicron aerosols (six sizes) to less than three minutes in total, with an error within 1%. References: Pinterich et al. (2017), Aerosol Sci. Technol., accepted; Spielman et al. (2017), Aerosol Sci. Technol., accepted; Wang et al. (2016), Nature 539:416-419.
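The core FIMS inversion, mapping detected particle image positions to size channels, can be sketched as follows. The position-to-diameter calibration used here is an illustrative power law, not the instrument's actual calibration:

```python
import numpy as np

def position_to_diameter(x):
    """Hypothetical calibration: normalized CCD position 0..1 maps
    logarithmically onto 10-500 nm (not the real FIMS calibration)."""
    return 10.0 * (500.0 / 10.0) ** np.asarray(x)

def positions_to_distribution(diameters, edges):
    """Bin detected particles into size channels: one CCD frame in,
    one size distribution out."""
    counts, _ = np.histogram(diameters, bins=edges)
    return counts

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, size=10_000)                    # detected particle positions
diam = position_to_diameter(x)
edges = np.logspace(np.log10(10), np.log10(500), 31)  # 30 size channels
dist = positions_to_distribution(diam, edges)
```

Because one frame yields the whole spectrum at once, there is no voltage scan, which is where the roughly 100-fold speedup over the SMPS comes from.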

  14. Rapid Processing of Radio Interferometer Data for Transient Surveys

    NASA Astrophysics Data System (ADS)

    Bourke, S.; Mooley, K.; Hallinan, G.

    2014-05-01

    We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-band data archive, imaging each integration to search for short-duration transients.

  15. Collaborative Research and Development (CR&D) III Task Order 0090: Image Processing Framework: From Acquisition and Analysis to Archival Storage

    DTIC Science & Technology

    2013-05-01

    contract or a PhD dissertation typically are a "proof-of-concept" code base that can only read a single set of inputs and are not designed ... AFRL-RX-WP-TR-2013-0210 COLLABORATIVE RESEARCH AND DEVELOPMENT (CR&D) III Task Order 0090: Image Processing Framework: From... public release; distribution unlimited. See additional restrictions described on inside pages. STINFO COPY AIR FORCE RESEARCH LABORATORY

  16. PAT: From Western solid dosage forms to Chinese materia medica preparations using NIR-CI.

    PubMed

    Zhou, Luwei; Xu, Manfei; Wu, Zhisheng; Shi, Xinyuan; Qiao, Yanjiang

    2016-01-01

    Near-infrared chemical imaging (NIR-CI) is an emerging technology that combines traditional near-infrared spectroscopy with chemical imaging. Therefore, NIR-CI can extract spectral information from pharmaceutical products and simultaneously visualize the spatial distribution of chemical components. The rapid and non-destructive features of NIR-CI make it an attractive process analytical technology (PAT) for identifying and monitoring critical control parameters during the pharmaceutical manufacturing process. This review mainly focuses on the pharmaceutical applications of NIR-CI in each unit operation during the manufacturing processes, from the Western solid dosage forms to the Chinese materia medica preparations. Finally, future applications of chemical imaging in the pharmaceutical industry are discussed. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random field (MRF)-based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continues to grow larger and more complex, making it more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general-purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining an accuracy of the segmentation results higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
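    The data-distribution idea behind such a framework can be sketched as follows: split the image into row blocks with one-row halos, let each worker run a local update, then stitch the interior rows back together. Here a thread pool stands in for MPI ranks and a 3x3 mean filter stands in for the per-block MRF optimization; both substitutions are for illustration only:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def smooth_block(block):
    """3x3 mean filter with edge-replicated borders (MRF-update stand-in)."""
    padded = np.pad(block, 1, mode="edge")
    out = np.zeros(block.shape, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1+dy:1+dy+block.shape[0], 1+dx:1+dx+block.shape[1]]
    return out / 9.0

def distributed_smooth(image, n_workers=4):
    h = image.shape[0]
    bounds = np.linspace(0, h, n_workers + 1).astype(int)
    jobs = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        # extend each block by one halo row on each side (clipped at edges)
        halo_lo, halo_hi = max(lo - 1, 0), min(hi + 1, h)
        jobs.append((lo - halo_lo, hi - halo_lo, image[halo_lo:halo_hi]))
    with ThreadPoolExecutor(n_workers) as pool:
        # each worker processes its block, then keeps only interior rows
        results = list(pool.map(lambda j: smooth_block(j[2])[j[0]:j[1]], jobs))
    return np.vstack(results)

img = np.random.default_rng(2).random((64, 64))
parallel = distributed_smooth(img)
serial = smooth_block(img)
```

The halo rows are exactly the ghost cells an MPI implementation would exchange between ranks each iteration; with them in place, the blocked result matches the serial one bit-for-bit.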

  18. Particle size distribution of brown and white rice during gastric digestion measured by image analysis.

    PubMed

    Bornhorst, Gail M; Kostlan, Kevin; Singh, R Paul

    2013-09-01

    The particle size distribution of foods during gastric digestion indicates the amount of physical breakdown that occurred due to the peristaltic movement of the stomach walls in addition to the breakdown that initially occurred during oral processing. The objective of this study was to present an image analysis technique that was rapid, simple, and could distinguish between food components (that is, rice kernel and bran layer in brown rice). The technique was used to quantify particle breakdown of brown and white rice during gastric digestion in growing pigs (used as a model for an adult human) over 480 min of digestion. The particle area distributions were fit to a Rosin-Rammler distribution function. Brown and white rice exhibited considerable breakdown as the number of particles per image decreased over time. The median particle area (x(50)) increased during digestion, suggesting a gastric sieving phenomenon, where small particles were emptied and larger particles were retained for additional breakdown. Brown rice breakdown was further quantified by an examination of the bran layer fragments and rice grain pieces. The percentage of total particle area composed of bran layer fragments was greater in the distal stomach than the proximal stomach in the first 120 min of digestion. The results of this study showed that image analysis may be used to quantify particle breakdown of a soft food product during gastric digestion, discriminate between different food components, and help to clarify the role of food structure and processing in food breakdown during gastric digestion. © 2013 Institute of Food Technologists®
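    The Rosin-Rammler fit mentioned above, F(x) = 1 - exp(-(x/x')^n), can be sketched via the standard linearization ln(-ln(1 - F)) = n ln(x) - n ln(x'). The sampling and parameter values below are synthetic, purely for illustration of the fitting step:

```python
import numpy as np

def fit_rosin_rammler(areas):
    """Fit F(x) = 1 - exp(-(x/x')^n) by linear regression in the
    transformed coordinates ln(-ln(1 - F)) vs ln(x)."""
    x = np.sort(np.asarray(areas, dtype=float))
    f = (np.arange(1, len(x) + 1) - 0.5) / len(x)  # empirical CDF, keeps F < 1
    lhs = np.log(-np.log(1.0 - f))
    slope, intercept = np.polyfit(np.log(x), lhs, 1)
    return np.exp(-intercept / slope), slope       # (x', n)

# synthetic particle areas drawn from a known Rosin-Rammler distribution
rng = np.random.default_rng(3)
true_x, true_n = 2.0, 1.5
u = rng.uniform(1e-12, 1.0, size=20_000)
samples = true_x * (-np.log(1.0 - u)) ** (1.0 / true_n)  # inverse-CDF draw
x_char, n_hat = fit_rosin_rammler(samples)
```

Here x' is the size at which F = 1 - 1/e; the study reports the median x(50) instead, which for this distribution is x' (ln 2)^(1/n).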

  19. The leaf angle distribution of natural plant populations: assessing the canopy with a novel software tool.

    PubMed

    Müller-Linow, Mark; Pinto-Espinosa, Francisco; Scharr, Hanno; Rascher, Uwe

    2015-01-01

    Three-dimensional canopies form complex architectures with temporally and spatially changing leaf orientations. Variations in canopy structure are linked to canopy function, and they occur within the scope of genetic variability as well as in reaction to environmental factors like light, water and nutrient supply, and stress. An important key measure to characterize these structural properties is the leaf angle distribution, which in turn requires knowledge of the 3-dimensional single-leaf surface. Despite a large number of 3-d sensors and methods, only a few systems are applicable for fast and routine measurements in plants and natural canopies. A suitable approach is stereo imaging, which combines depth and color information and allows for easy segmentation of green leaf material and the extraction of plant traits, such as leaf angle distribution. We developed a software package which provides tools for the quantification of leaf surface properties within natural canopies via 3-d reconstruction from stereo images. Our approach includes a semi-automatic selection process for single leaves and different modes of surface characterization via polygon smoothing or surface model fitting. Based on the resulting surface meshes, leaf angle statistics are computed on the whole-leaf level or from local derivations. We include a case study to demonstrate the functionality of our software: 48 images of small sugar beet populations (4 varieties) were analyzed on the basis of their leaf angle distributions in order to investigate seasonal, genotypic, and fertilization effects. We could show that leaf angle distributions change during the course of the season, with all varieties having a comparable development. Additionally, different varieties had different leaf angle orientations that could be separated by principal component analysis. In contrast, nitrogen treatment had no effect on leaf angles.
We show that a stereo imaging setup, together with the appropriate image processing tools, is capable of retrieving the geometric leaf surface properties of plants and canopies. Our software package provides whole-leaf statistics as well as local estimates of leaf angles, which may have great potential for better understanding and quantifying structural canopy traits for guided breeding and optimized crop management.
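Leaf angles follow directly from the reconstructed surface meshes: the inclination of each mesh triangle is the angle between its normal and the vertical. A minimal sketch of that computation (a standard definition; the package's exact local derivation may differ):

```python
import numpy as np

def leaf_angles(vertices, triangles):
    """Per-triangle inclination angle (degrees) between the surface
    normal and the vertical axis; 0 = horizontal leaf, 90 = vertical."""
    v = np.asarray(vertices, dtype=float)
    angles = []
    for i, j, k in triangles:
        normal = np.cross(v[j] - v[i], v[k] - v[i])
        normal /= np.linalg.norm(normal)
        cos_tilt = abs(normal @ np.array([0.0, 0.0, 1.0]))
        angles.append(np.degrees(np.arccos(cos_tilt)))
    return np.array(angles)

# a horizontal triangle (0 deg) and one tilted 45 deg about the x-axis
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0),
         (0, 0, 0), (1, 0, 0), (0, 1, 1)]
tris = [(0, 1, 2), (3, 4, 5)]
angles = leaf_angles(verts, tris)
```

Aggregating these per-triangle angles over one segmented leaf gives the whole-leaf statistic; histogramming them over all leaves gives the canopy's leaf angle distribution.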

  20. Optimization of digital image processing to determine quantum dots' height and density from atomic force microscopy.

    PubMed

    Ruiz, J E; Paciornik, S; Pinto, L D; Ptak, F; Pires, M P; Souza, P L

    2018-01-01

    An optimized method of digital image processing to interpret quantum dots' height measurements obtained by atomic force microscopy is presented. The method was developed by combining well-known digital image processing techniques and particle recognition algorithms. The properties of quantum dot structures strongly depend on the dots' height, among other features. Determination of their height is sensitive to small variations in the digital image processing parameters, which can generate misleading results. Comparing the results obtained with two image processing techniques - a conventional method and the new method proposed herein - with data obtained by determining the height of quantum dots one by one within a fixed area showed that the optimized method leads to more accurate results. Moreover, the log-normal distribution, which is often used to represent natural processes, shows a better fit to the quantum dots' height histogram obtained with the proposed method. Finally, the quantum dot heights obtained were used to calculate the predicted photoluminescence peak energies, which were compared with the experimental data. Again, a better match was observed when using the proposed method to evaluate the quantum dots' height. Copyright © 2017 Elsevier B.V. All rights reserved.
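    The log-normal comparison mentioned above can be sketched as follows: the maximum-likelihood fit of a log-normal is simply the mean and standard deviation of the log heights. The median height used below is an arbitrary illustrative value, not from the paper:

```python
import numpy as np

def fit_lognormal(heights):
    """MLE of a log-normal: mean and std of the log-transformed data."""
    logs = np.log(np.asarray(heights, dtype=float))
    return logs.mean(), logs.std()

rng = np.random.default_rng(4)
mu, sigma = np.log(5.0), 0.3          # median height ~5 nm (illustrative)
heights = rng.lognormal(mu, sigma, size=50_000)
mu_hat, sigma_hat = fit_lognormal(heights)
```

A goodness-of-fit statistic between this fitted curve and the measured height histogram is one way to compare the two image processing pipelines, as the paper does qualitatively.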

  1. 3D tensor-based blind multispectral image decomposition for tumor demarcation

    NASA Astrophysics Data System (ADS)

    Kopriva, Ivica; Peršin, Antun

    2010-03-01

    Blind decomposition of a multi-spectral fluorescent image for tumor demarcation is formulated by exploiting the tensorial structure of the image. The first contribution of the paper is the identification of the matrix of spectral responses and the 3D tensor of spatial distributions of the materials present in the image from Tucker3 or PARAFAC models of the 3D image tensor. The second contribution is the clustering-based estimation of the number of materials present in the image as well as of the matrix of their spectral profiles. The 3D tensor of the spatial distributions of the materials is recovered through 3-mode multiplication of the multi-spectral image tensor and the inverse of the matrix of spectral profiles. Tensor representation of the multi-spectral image preserves its local spatial structure, which is lost, due to the vectorization process, when matrix factorization-based decomposition methods (such as non-negative matrix factorization and independent component analysis) are used. Superior performance of the tensor-based image decomposition over matrix factorization-based decompositions is demonstrated on an experimental red-green-blue (RGB) image with known ground truth as well as on RGB fluorescent images of skin tumors (basal cell carcinoma).
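    The recovery step, 3-mode multiplication of the image tensor with the (pseudo-)inverse of the spectral profile matrix, can be sketched with synthetic data as follows (dimensions and values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
I, J, K, M = 16, 16, 6, 3           # image rows/cols, spectral bands, materials
S_true = rng.random((I, J, M))      # ground-truth spatial distributions
A = rng.random((K, M))              # spectral profiles (one column per material)

# forward model: X = S x_3 A, i.e. X[i,j,k] = sum_m A[k,m] * S[i,j,m]
X = np.einsum("ijm,km->ijk", S_true, A)

# recovery: 3-mode multiplication of X with the pseudo-inverse of A
S_rec = np.einsum("ijk,mk->ijm", X, np.linalg.pinv(A))
```

Because only the spectral mode is inverted, each pixel's spatial neighborhood is untouched, which is the local-structure preservation the abstract contrasts with vectorized matrix factorizations.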

  2. Vis-NIR hyperspectral imaging in visualizing moisture distribution of mango slices during microwave-vacuum drying.

    PubMed

    Pu, Yuan-Yuan; Sun, Da-Wen

    2015-12-01

    Mango slices were dried by microwave-vacuum drying using a domestic microwave oven equipped with a vacuum desiccator inside. Two lab-scale hyperspectral imaging (HSI) systems were employed for moisture prediction. The Page and the Two-term thin-layer drying models were suitable for describing the drying process (R(2)=0.978). Partial least squares (PLS) regression was applied to correlate the mean spectrum of each slice with the reference moisture content. With three waveband selection strategies, optimal wavebands corresponding to moisture prediction were identified. The best model, RC-PLS-2 (Rp(2)=0.972 and RMSEP=4.611%), was implemented in the moisture visualization procedure. The moisture distribution map clearly showed that the moisture content in the central part of the mango slices was lower than that of other parts. The present study demonstrated that hyperspectral imaging is a useful tool for non-destructively and rapidly measuring and visualizing moisture content during the drying process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. A robust nonlinear filter for image restoration.

    PubMed

    Koivunen, V

    1995-01-01

    A class of nonlinear regression filters based on robust estimation theory is introduced. The goal of the filtering is to recover a high-quality image from degraded observations. Models for desired image structures and contaminating processes are employed, but deviations from strict assumptions are allowed since the assumptions on signal and noise are typically only approximately true. The robustness of filters is usually addressed only in a distributional sense, i.e., the actual error distribution deviates from the nominal one. In this paper, the robustness is considered in a broad sense since the outliers may also be due to inappropriate signal model, or there may be more than one statistical population present in the processing window, causing biased estimates. Two filtering algorithms minimizing a least trimmed squares criterion are provided. The design of the filters is simple since no scale parameters or context-dependent threshold values are required. Experimental results using both real and simulated data are presented. The filters effectively attenuate both impulsive and nonimpulsive noise while recovering the signal structure and preserving interesting details.
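    The least trimmed squares criterion can be sketched as a location estimate in a sliding window: among contiguous subsets of h sorted samples, keep the mean of the one with the smallest sum of squared residuals, so impulses fall into the trimmed portion. The window width and h below are illustrative choices, not the paper's design:

```python
import numpy as np

def lts_location(window, h):
    """LTS location estimate: mean of the contiguous h-subset of the
    sorted samples with the smallest sum of squared residuals."""
    x = np.sort(np.asarray(window, dtype=float))
    best_mean, best_score = x[:h].mean(), np.inf
    for i in range(len(x) - h + 1):
        sub = x[i:i + h]
        score = np.sum((sub - sub.mean()) ** 2)
        if score < best_score:
            best_mean, best_score = sub.mean(), score
    return best_mean

def lts_filter(signal, width=5, h=3):
    """Apply the LTS location estimate in a sliding window."""
    half = width // 2
    padded = np.pad(signal, half, mode="edge")
    return np.array([lts_location(padded[i:i + width], h)
                     for i in range(len(signal))])

sig = np.full(20, 10.0)
sig[7] = 200.0                      # impulsive outlier
out = lts_filter(sig)
```

No scale parameter or threshold appears anywhere, which mirrors the design-simplicity point made in the abstract; trimming alone rejects the impulse.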

  4. A Electro-Optical Image Algebra Processing System for Automatic Target Recognition

    NASA Astrophysics Data System (ADS)

    Coffield, Patrick Cyrus

    The proposed electro-optical image algebra processing system is designed specifically for image processing and other related computations. The design is a hybridization of an optical correlator and a massively parallel, single-instruction multiple-data processor. The architecture of the design consists of three tightly coupled components: a spatial configuration processor (the optical analog portion), a weighting processor (digital), and an accumulation processor (digital). The systolic flow of data and image processing operations is directed by a control buffer and pipelined to each of the three processing components. The image processing operations are defined in terms of the basic operations of an image algebra developed by the University of Florida. The algebra is capable of describing all common image-to-image transformations. The merit of this architectural design is how it implements the natural decomposition of algebraic functions into spatially distributed, point-wise operations. The effect of this particular decomposition allows convolution-type operations to be computed strictly as a function of the number of elements in the template (mask, filter, etc.) instead of the number of picture elements in the image. Thus, a substantial increase in throughput is realized. The implementation of the proposed design may be accomplished in many ways. While a hybrid electro-optical implementation is of primary interest, the benefits and design issues of an all-digital implementation are also discussed. The potential utility of this architectural design lies in its ability to control a large variety of the arithmetic and logic operations of the image algebra's generalized matrix product. The generalized matrix product is the most powerful fundamental operation in the algebra, thus allowing a wide range of applications. No other known device or design has made this claim of processing speed and general implementation of a heterogeneous image algebra.
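    The decomposition that makes the cost depend on template size can be sketched as follows: the convolution is accumulated one template element at a time, each element contributing a weighted, shifted copy of the entire image, with the shift-and-sum being the spatially parallel part. This is a plain software illustration of the idea (with circular boundaries), not the hybrid architecture itself:

```python
import numpy as np

def convolve_by_template(image, template):
    """Cross-correlate by accumulating one weighted, shifted copy of the
    image per template element; the sequential step count is the number
    of template elements, not the number of pixels."""
    th, tw = template.shape
    cy, cx = th // 2, tw // 2
    out = np.zeros(image.shape, dtype=float)
    for dy in range(th):
        for dx in range(tw):
            # circular shift so element (dy, dx) aligns with the center
            shifted = np.roll(image, (cy - dy, cx - dx), axis=(0, 1))
            out += template[dy, dx] * shifted   # one accumulation per element
    return out

img = np.arange(25.0).reshape(5, 5)
tmpl = np.array([[0., 1., 0.], [1., 2., 1.], [0., 1., 0.]])
res = convolve_by_template(img, tmpl)

# direct definition checked at an interior pixel
direct = sum(tmpl[dy, dx] * img[2 + dy - 1, 2 + dx - 1]
             for dy in range(3) for dx in range(3))
```

In the hybrid system the shifted copy and the accumulation are handled optically and in parallel, so only the loop over template elements remains as sequential work.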

  5. A Photo Storm Report Mobile Application, Processing/Distribution System, and AWIPS-II Display Concept

    NASA Astrophysics Data System (ADS)

    Longmore, S. P.; Bikos, D.; Szoke, E.; Miller, S. D.; Brummer, R.; Lindsey, D. T.; Hillger, D.

    2014-12-01

    The increasing use of mobile phones equipped with digital cameras and the ability to post images and information to the Internet in real time has significantly improved the ability to report events almost instantaneously. In the context of severe weather reports, a representative digital image conveys significantly more information than a simple text or phone-relayed report to a weather forecaster issuing severe weather warnings. It also allows the forecaster to reasonably discern the validity and quality of a storm report. Posting geo-located, time-stamped storm report photographs to NWS social media weather forecast office pages utilizing a mobile phone application has generated recent positive feedback from forecasters. Building upon this feedback, this discussion advances the concept, development, and implementation of a formalized Photo Storm Report (PSR) mobile application, a processing and distribution system, and Advanced Weather Interactive Processing System II (AWIPS-II) plug-in display software. The PSR system would be composed of three core components: i) a mobile phone application, ii) processing and distribution software and hardware, and iii) AWIPS-II data, exchange, and visualization plug-in software. i) The mobile phone application would allow web-registered users to send geo-location, view direction, and time-stamped PSRs along with severe weather type and comments to the processing and distribution servers. ii) The servers would receive PSRs, convert images and information to NWS network-bandwidth-manageable sizes in an AWIPS-II data format, distribute them on the NWS data communications network, and archive the original PSRs for possible future research datasets. iii) The AWIPS-II data and exchange plug-ins would archive PSRs, and the visualization plug-in would display PSR locations, times, and directions by hour, similar to surface observations.
Hovering over individual PSRs would reveal photo thumbnails, and clicking on them would display the full-resolution photograph. Here, we present initial NWS forecaster feedback received from social media posted PSRs, motivating the possible advantages of PSRs within AWIPS-II, the details of developing and implementing a PSR system, and possible future applications beyond severe weather reports and AWIPS-II.
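As a rough illustration of the kind of record component i) would transmit, the sketch below defines a hypothetical PSR payload; all field names, types, and the JSON wire format are assumptions for illustration, not the NWS or AWIPS-II specification.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of the metadata a PSR mobile client might attach to each
# photograph; field names are assumptions, not an NWS specification.
@dataclass
class PhotoStormReport:
    latitude: float          # geo-location of the observer
    longitude: float
    view_azimuth_deg: float  # compass direction the camera was pointing
    timestamp_utc: str       # ISO-8601 time stamp
    weather_type: str        # e.g. "hail", "tornado", "flooding"
    comments: str

def to_wire(report: PhotoStormReport) -> str:
    """Serialize a report for upload to the processing/distribution servers."""
    return json.dumps(asdict(report))

psr = PhotoStormReport(38.99, -77.03, 245.0, "2014-06-03T21:14:00Z",
                       "hail", "quarter-size hail")
payload = to_wire(psr)
assert json.loads(payload)["weather_type"] == "hail"
```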

  6. Spatial Data Exploring by Satellite Image Distributed Processing

    NASA Astrophysics Data System (ADS)

    Mihon, V. D.; Colceriu, V.; Bektas, F.; Allenbach, K.; Gvilava, M.; Gorgan, D.

    2012-04-01

    Societal needs and environmental prediction goals encourage the development of applications for monitoring and analyzing various Earth Science related phenomena. Satellite images can be explored to discover information concerning land cover, hydrology, air quality, and water and soil pollution. Spatial and environmental data can be acquired by imagery classification, which consists of data mining across the multispectral bands. The process takes into account a large set of variables such as satellite image types (e.g. MODIS, Landsat), the particular geographic area, soil composition, vegetation cover, and generally the context (e.g. clouds, snow, and season). All these specific and variable conditions require flexible tools and applications to support an optimal search for the appropriate solutions, as well as high-performance computation resources. This research experiments with flexible, visual descriptions of satellite image processing over distributed infrastructures (e.g. Grid, Cloud, and GPU clusters). This presentation highlights the Grid based implementation of the GreenLand application. The GreenLand application development is based on simple, but powerful, notions of mathematical operators and workflows that are used in distributed and parallel executions over the Grid infrastructure. Currently it is used in three major case studies concerning the Istanbul geographical area, the Rioni River in Georgia, and the Black Sea catchment region. The GreenLand application offers a friendly user interface for viewing and editing workflows and operators. The description involves the basic operators provided by the GRASS [1] library as well as many other image related operators supported by the ESIP platform [2].
The processing workflows are represented as directed graphs, giving the user a fast and easy way to describe complex parallel algorithms without prior knowledge of any programming language or application commands. Moreover, this Web application requires no installation on the end user's side; it is a remote application accessed over the Internet. Currently the GreenLand application is available through the BSC-OS Portal provided by the enviroGRIDS FP7 project [3]. This presentation aims to highlight the challenges and issues of flexible description of the Grid based processing of satellite images, interoperability with other software platforms available in the portal, as well as the particular requirements of the Black Sea related use cases.
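The workflow-as-directed-graph idea above can be sketched with a toy operator graph executed in dependency order; the operator names and the scheduling are illustrative assumptions, not the GreenLand or Grid implementation:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each key is an operator; its set contains the operators it depends on.
# A Grid scheduler could run independent nodes in parallel; here we just
# execute them sequentially in a valid dependency order.
workflow = {
    "classify": {"ndvi", "cloud_mask"},   # classify consumes both products
    "ndvi": {"load_bands"},
    "cloud_mask": {"load_bands"},
    "load_bands": set(),
}

def execute(workflow, run_op):
    """Run operators so that every dependency finishes before its consumers."""
    order = list(TopologicalSorter(workflow).static_order())
    for op in order:
        run_op(op)
    return order

log = []
order = execute(workflow, log.append)
assert order[0] == "load_bands" and order[-1] == "classify"
```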

  7. Resting-state blood oxygen level-dependent functional magnetic resonance imaging for presurgical planning.

    PubMed

    Kamran, Mudassar; Hacker, Carl D; Allen, Monica G; Mitchell, Timothy J; Leuthardt, Eric C; Snyder, Abraham Z; Shimony, Joshua S

    2014-11-01

    Resting-state functional MR imaging (rsfMR imaging) measures spontaneous fluctuations in the blood oxygen level-dependent (BOLD) signal and can be used to elucidate the brain's functional organization. It is used to simultaneously assess multiple distributed resting-state networks. Unlike task-based functional MR imaging, rsfMR imaging does not require task performance. This article presents a brief introduction of rsfMR imaging processing methods followed by a detailed discussion on the use of rsfMR imaging in presurgical planning. Example cases are provided to highlight the strengths and limitations of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Image Transform Based on the Distribution of Representative Colors for Color Deficient

    NASA Astrophysics Data System (ADS)

    Ohata, Fukashi; Kudo, Hiroaki; Matsumoto, Tetsuya; Takeuchi, Yoshinori; Ohnishi, Noboru

    This paper proposes a method to convert digital images that contain sets of colors difficult for color-deficient viewers to distinguish into images with high visibility. We set up four criteria: fully automatic processing by a computer; retaining continuity in color space; not lowering the visibility for people with normal color vision; and not lowering the visibility of images that originally contain no hard-to-distinguish color sets. We conducted a psychological experiment and found that the visibility of the converted images improved in 60% of 40 test images, and we confirmed that the main criterion, continuity in color space, was preserved.

  9. Readout models for BaFBr0.85I0.15:Eu image plates

    NASA Astrophysics Data System (ADS)

    Stoeckl, M.; Solodov, A. A.

    2018-06-01

    The linearity of the photostimulated luminescence process makes repeated image-plate scanning a viable technique to extract an extended dynamic range. In order to obtain a response estimate, two semi-empirical models for the readout fading of an image plate are introduced; they relate the depth distribution of activated photostimulated luminescence centers within an image plate to the recorded signal. Model parameters are estimated from image-plate scan series with BAS-MS image plates and the Typhoon FLA 7000 scanner for the hard x-ray image-plate diagnostic over a collection of experiments providing x-ray energy spectra whose approximate shape is a double exponential.
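A toy numeric illustration of why repeated scanning extends dynamic range, assuming a single constant per-readout fading fraction (a simplification of the depth-resolved models introduced in the paper):

```python
import numpy as np

# Each readout erases a fixed fraction of the stored PSL signal, so a pixel
# that saturates the scanner on the first pass may fall on-scale on a later
# pass and can be rescaled back. The single fading factor f is an assumed
# simplification; the paper's models resolve the depth distribution of
# activated PSL centers instead.
f = 0.30          # fraction of signal surviving each readout (assumed)
sat = 1000.0      # scanner saturation level (assumed)

true_signal = np.array([50.0, 3000.0])      # second pixel saturates scan 1
scan1 = np.minimum(true_signal, sat)
scan2 = np.minimum(true_signal * f, sat)    # signal faded by one readout

# Reconstruct: keep scan-1 values where on-scale, else rescale scan 2.
recovered = np.where(scan1 < sat, scan1, scan2 / f)
assert np.allclose(recovered, true_signal)
```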

  10. Positron Emission Tomography Methods with Potential for Increased Understanding of Mental Retardation and Developmental Disabilities

    ERIC Educational Resources Information Center

    Sundaram, Senthil K.; Chugani, Harry T.; Chugani, Diane C.

    2005-01-01

    Positron emission tomography (PET) is a technique that enables imaging of the distribution of radiolabeled tracers designed to track biochemical and molecular processes in the body after intravenous injection or inhalation. New strategies for the use of radiolabeled tracers hold potential for imaging gene expression in the brain during development…

  11. Observations of breakup processes of liquid jets using real-time X-ray radiography

    NASA Technical Reports Server (NTRS)

    Char, J. M.; Kuo, K. K.; Hsieh, K. C.

    1988-01-01

    To unravel the liquid-jet breakup process in the nondilute region, a newly developed system of real-time X-ray radiography, an advanced digital image processor, and a high-speed video camera were used. Based upon recorded X-ray images, the inner structure of a liquid jet during breakup was observed. The jet divergence angle, jet breakup length, and fraction distributions along the axial and transverse directions of the liquid jets were determined in the near-injector region. Both wall- and free-jet tests were conducted to study the effect of wall friction on the jet breakup process.

  12. Collaborative, Rapid Mapping of Water Extents During Hurricane Harvey Using Optical and Radar Satellite Sensors

    NASA Astrophysics Data System (ADS)

    Muench, R.; Jones, M.; Herndon, K. E.; Bell, J. R.; Anderson, E. R.; Markert, K. N.; Molthan, A.; Adams, E. C.; Shultz, L.; Cherrington, E. A.; Flores, A.; Lucey, R.; Munroe, T.; Layne, G.; Pulla, S. T.; Weigel, A. M.; Tondapu, G.

    2017-12-01

    On August 25, 2017, Hurricane Harvey made landfall between Port Aransas and Port O'Connor, Texas, bringing with it unprecedented amounts of rainfall and flooding. In times of natural disasters of this nature, emergency responders require timely and accurate information about the hazard in order to assess and plan for disaster response. Due to the extreme flooding impacts associated with Hurricane Harvey, delineations of water extent were crucial to inform resource deployment. Through the USGS's Hazards Data Distribution System, government and commercial vendors were able to acquire and distribute various satellite imagery to analysts to create value-added products that can be used by these emergency responders. Rapid-response water extent maps were created through a collaborative multi-organization and multi-sensor approach. One team of researchers created Synthetic Aperture Radar (SAR) water extent maps using modified Copernicus Sentinel data (2017), processed by ESA. This group used backscatter images, pre-processed by the Alaska Satellite Facility's Hybrid Pluggable Processing Pipeline (HyP3), to identify and apply a threshold to identify water in the image. Quality control was conducted by manually examining the image and correcting for potential errors. Another group of researchers and graduate student volunteers derived water masks from high resolution DigitalGlobe and SPOT images. Through a system of standardized image processing, quality control measures, and communication channels the team provided timely and fairly accurate water extent maps to support a larger NASA Disasters Program response. The optical imagery was processed through a combination of band thresholds using the Normalized Difference Water Index (NDWI), the Modified Normalized Difference Water Index (MNDWI), the Normalized Difference Vegetation Index (NDVI), and cloud masking.
Several aspects of the pre-processing and image access were run on internal servers to expedite the provision of images to analysts, who could focus on manipulating thresholds and quality control checks for maximum accuracy within the time constraints. The combined results of the radar- and optical-derived value-added products, through the coordination of multiple organizations, provided timely information for emergency response and recovery efforts.
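The optical water-masking step described above can be sketched as an NDWI computation followed by a threshold; the threshold value here is an assumed starting point that an analyst would tune per scene, as the text notes:

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """NDWI = (green - nir) / (green + nir); water tends toward positive values."""
    green = np.asarray(green, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # Guard the denominator against zero-reflectance pixels.
    ndwi = (green - nir) / np.maximum(green + nir, 1e-12)
    return ndwi > threshold

# Toy 2x2 reflectance patches: column 0 is vegetation-like, column 1 water-like.
green = np.array([[0.08, 0.30], [0.05, 0.28]])
nir   = np.array([[0.30, 0.05], [0.25, 0.04]])
mask = ndwi_water_mask(green, nir)
assert mask.tolist() == [[False, True], [False, True]]
```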

  13. Collaborative, Rapid Mapping of Water Extents During Hurricane Harvey Using Optical and Radar Satellite Sensors

    NASA Technical Reports Server (NTRS)

    Muench, Rebekke; Jones, Madeline; Herndon, Kelsey; Schultz, Lori; Bell, Jordan; Anderson, Eric; Markert, Kel; Molthan, Andrew; Adams, Emily; Cherrington, Emil; hide

    2017-01-01

    On August 25, 2017, Hurricane Harvey made landfall between Port Aransas and Port O'Connor, Texas, bringing with it unprecedented amounts of rainfall and record flooding. In times of natural disasters of this nature, emergency responders require timely and accurate information about the hazard in order to assess and plan for disaster response. Due to the extreme flooding impacts associated with Hurricane Harvey, delineations of water extent were crucial to inform resource deployment. Through the USGS's Hazards Data Distribution System, government and commercial vendors were able to acquire and distribute various satellite imagery to analysts to create value-added products that can be used by these emergency responders. Rapid-response water extent maps were created through a collaborative multi-organization and multi-sensor approach. One team of researchers created Synthetic Aperture Radar (SAR) water extent maps using modified Copernicus Sentinel data (2017), processed by ESA. This group used backscatter images, pre-processed by the Alaska Satellite Facility's Hybrid Pluggable Processing Pipeline (HyP3), to identify and apply a threshold to identify water in the image. Quality control was conducted by manually examining the image and correcting for potential errors. Another group of researchers and graduate student volunteers derived water masks from high resolution DigitalGlobe and SPOT images. Through a system of standardized image processing, quality control measures, and communication channels the team provided timely and fairly accurate water extent maps to support a larger NASA Disasters Program response. The optical imagery was processed through a combination of band thresholds using the Normalized Difference Water Index (NDWI), the Modified Normalized Difference Water Index (MNDWI), the Normalized Difference Vegetation Index (NDVI), and cloud masking.
Several aspects of the pre-processing and image access were run on internal servers to expedite the provision of images to analysts who could focus on manipulating thresholds and quality control checks for maximum accuracy within the time constraints. The combined results of the radar- and optical-derived value-added products through the coordination of multiple organizations provided timely information for emergency response and recovery efforts.

  14. Low dose reconstruction algorithm for differential phase contrast imaging.

    PubMed

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Marco, Stampanoni

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refraction index rather than the attenuation coefficient in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT that benefits from compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing reconstruction problem of DPCI can be transformed into an already solved problem in transmission CT. Our algorithm has the potential to reconstruct the refraction index distribution of the sample from highly undersampled projection data, and thus can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.
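The core DART idea, modeling differential phase projections as a discrete partial derivative applied to ordinary line-integral projections so that the whole forward model is one linear matrix, can be sketched as follows; the tiny sizes, the random stand-in system matrix, and the plain first-difference stencil are illustrative assumptions, not the authors' discretization:

```python
import numpy as np

n_det, n_pix = 5, 4
A = np.random.rand(n_det, n_pix)   # stand-in projection (system) matrix

# First-difference operator along the detector coordinate.
D = np.zeros((n_det - 1, n_det))
for i in range(n_det - 1):
    D[i, i], D[i, i + 1] = -1.0, 1.0

x = np.random.rand(n_pix)          # refraction-index image (vectorized)
p = A @ x                          # conventional line-integral projections
p_diff = D @ p                     # differential phase projections

# The combined operator D @ A is the single linear model that an iterative
# (e.g. ART / compressed-sensing) solver would invert.
assert np.allclose(p_diff, (D @ A) @ x)
```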

  15. A simple and robust method for artifacts correction on X-ray microtomography images

    NASA Astrophysics Data System (ADS)

    Timofey, Sizonenko; Marina, Karsanina; Dina, Gilyazetdinova; Irina, Bayuk; Kirill, Gerke

    2017-04-01

    X-ray microtomography images of rock material often exhibit several kinds of distortion arising from X-ray attenuation, beam hardening, and irregular distribution of liquid/solid phases. Further distortions can arise from image processing and from stitching together images from different measurements. Beam hardening is a well-known and well-studied distortion that is relatively easy to describe, fit, and correct using a number of equations. However, this is not the case for other grey scale intensity distortions. Shading caused by irregular distribution of liquid phases, incorrect scanner operation or parameter choice, as well as numerous artefacts from mathematical reconstruction from projections, including stitching of separate scans, cannot be described by a single mathematical model. To correct grey scale intensities on large 3D images, we developed a software package. The traditional method for removing beam hardening [1] has been modified to locate the center of the distortion. The main contribution of this work is the development of a method for arbitrary image correction. This method is based on fitting the distortion with Bezier curves using the image histogram. The distortion along the image is represented by a number of Bezier curves and one base line that characterizes the natural distribution of grey values along the image. All of these curves are set manually by the operator. We have tested our approaches on different X-ray microtomography images of porous media. The arbitrary correction removes all principal distortions. After correction, the images were binarized and pore networks were subsequently extracted. An equal distribution of pore-network elements along the image was the criterion used to verify the proposed grey scale intensity correction. [1] Iassonov, P. and Tuller, M., 2010. Application of segmentation for correction of intensity bias in X-ray computed tomography images. Vadose Zone Journal, 9(1), pp.187-191.
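The fitting primitive described above, a Bezier curve through operator-placed control points serving as a smooth grey-scale baseline along one image axis, can be sketched with De Casteljau evaluation; the control values are invented for illustration:

```python
def bezier(control_points, t):
    """De Casteljau evaluation of a Bezier curve at parameter t in [0, 1]."""
    pts = list(control_points)
    while len(pts) > 1:
        # Repeated linear interpolation between adjacent points.
        pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
    return pts[0]

# Brightness baseline along one image axis, set by four operator-chosen values.
ctrl = [120.0, 140.0, 150.0, 130.0]
assert bezier(ctrl, 0.0) == 120.0 and bezier(ctrl, 1.0) == 130.0
mid = bezier(ctrl, 0.5)  # smooth interpolated level between the end points
assert 120.0 < mid < 150.0
```

Dividing (or subtracting) the observed intensity profile by such a baseline is one common way to flatten a smooth shading distortion.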

  16. Adaptive nonlinear L2 and L3 filters for speckled image processing

    NASA Astrophysics Data System (ADS)

    Lukin, Vladimir V.; Melnik, Vladimir P.; Chemerovsky, Victor I.; Astola, Jaakko T.

    1997-04-01

    Here we propose adaptive nonlinear filters based on the calculation and analysis of two or three order statistics in a scanning window. They are designed for processing images corrupted by severe speckle noise with non-symmetrical (Rayleigh or one-sided exponential) distribution laws; impulsive noise can also be present. The proposed filtering algorithms provide a trade-off between efficient speckle noise suppression, robustness, good edge/detail preservation, low computational complexity, and preservation of the average level in homogeneous regions of images. Quantitative evaluations of the characteristics of the proposed filters are presented, as well as the results of their application to real synthetic aperture radar and ultrasound medical images.
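A minimal sketch of an L-filter of the kind described, computing two order statistics in each scanning window and outputting their weighted sum; the chosen ranks and weights are illustrative, whereas the paper adapts them to the (non-symmetrical) speckle distribution:

```python
import numpy as np

def l2_filter(image, win=3, ranks=(2, 4), weights=(0.5, 0.5)):
    """Output each pixel as a weighted sum of two order statistics in its window."""
    pad = win // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            # Sort the win*win neighborhood and pick the selected ranks.
            window = np.sort(padded[i:i + win, j:j + win], axis=None)
            out[i, j] = sum(w * window[r] for r, w in zip(ranks, weights))
    return out

img = np.array([[10., 10., 10.],
                [10., 200., 10.],   # impulsive outlier in the center
                [10., 10., 10.]])
filtered = l2_filter(img)
assert filtered[1, 1] == 10.0   # the outlier is rejected by the order statistics
```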

  17. [Development of a secure and cost-effective infrastructure for the access of arbitrary web-based image distribution systems].

    PubMed

    Hackländer, T; Kleber, K; Schneider, H; Demabre, N; Cramer, B M

    2004-08-01

    To build an infrastructure that gives on-call radiologists and external users teleradiological access to the HTML-based image distribution system inside the hospital via the Internet. In addition, no investment costs should arise on the user side, and the image data should be pseudonymized using cryptographic techniques before being sent. A pure HTML-based system manages the image distribution inside the hospital, and an open source project extends this system through a secure gateway outside the firewall of the hospital. The gateway handles the communication between the external users and the HTML server within the network of the hospital. A second firewall is installed between the gateway and the external users and builds up a virtual private network (VPN). A connection between the gateway and an external user is only acknowledged if the computers involved authenticate each other via certificates and the external users authenticate themselves via a multi-stage password system. All data are transferred encrypted. External users only get access to images that have previously been renamed to a pseudonym by automated processing. With an ADSL Internet connection, external users achieve an image load rate of 0.4 CT images per second. More than 90% of the delay during image transfer results from security checks within the firewalls. Data passing the gateway incur no measurable delay. The project goals were realized by means of an infrastructure that works vendor-independently with any HTML-based image distribution system. The requirements of data security were met using state-of-the-art web techniques. Adequate access and transfer speed led to widespread acceptance of the system by external users.

  18. Digital Image Support in the ROADNet Real-time Monitoring Platform

    NASA Astrophysics Data System (ADS)

    Lindquist, K. G.; Hansen, T. S.; Newman, R. L.; Vernon, F. L.; Nayak, A.; Foley, S.; Fricke, T.; Orcutt, J.; Rajasekar, A.

    2004-12-01

    The ROADNet real-time monitoring infrastructure has allowed researchers to integrate geophysical monitoring data from a wide variety of signal domains. Antelope-based data transport, relational-database buffering and archiving, backup/replication/archiving through the Storage Resource Broker, and a variety of web-based distribution tools create a powerful monitoring platform. In this work we discuss our use of the ROADNet system for the collection and processing of digital image data. Remote cameras have been deployed at approximately 32 locations as of September 2004, including the SDSU Santa Margarita Ecological Reserve, the Imperial Beach pier, and the Pinon Flats geophysical observatory. Fire monitoring imagery has been obtained through a connection to the HPWREN project. Near-real-time images obtained from the R/V Roger Revelle include records of seafloor operations by the JASON submersible, as part of a maintenance mission for the H2O underwater seismic observatory. We discuss acquisition mechanisms and the packet architecture for image transport via Antelope orbservers, including multi-packet support for arbitrarily large images. Relational database storage supports archiving of timestamped images, image-processing operations, grouping of related images and cameras, support for motion-detect triggers, thumbnail images, pre-computed video frames, support for time-lapse movie generation and storage of time-lapse movies. Available ROADNet monitoring tools include both orbserver-based display of incoming real-time images and web-accessible searching and distribution of images and movies driven by the relational database (http://mercali.ucsd.edu/rtapps/rtimbank.php). An extension to the Kepler Scientific Workflow System also allows real-time image display via the Ptolemy project. Custom time-lapse movies may be made from the ROADNet web pages.

  19. Secure Display of Space-Exploration Images

    NASA Technical Reports Server (NTRS)

    Cheng, Cecilia; Thornhill, Gillian; McAuley, Michael

    2006-01-01

    Java EDR Display Interface (JEDI) is software for either local display or secure Internet distribution, to authorized clients, of image data acquired from cameras aboard spacecraft engaged in exploration of remote planets. (EDR signifies experimental data record, which, in effect, signifies image data.) Processed at NASA's Multimission Image Processing Laboratory (MIPL), the data can be from either near-real-time processing streams or stored files. JEDI uses the Java Advanced Imaging application program interface, plus input/output packages that are parts of the Video Image Communication and Retrieval software of the MIPL, to display images. JEDI can be run as either a standalone application program or within a Web browser as a servlet with an applet front end. In either operating mode, JEDI communicates using the HTTP(s) protocol(s). In the Web-browser case, the user must provide a password to gain access. For each user and/or image data type, there is a configuration file, called a "personality file," containing parameters that control the layout of the displays and the information to be included in them. Once JEDI has accepted the user's password, it processes the requested EDR (provided that the user is authorized to receive the specific EDR) to create a display according to the user's personality file.

  20. Distributed nuclear medicine applications using World Wide Web and Java technology.

    PubMed

    Knoll, P; Höll, K; Mirzaei, S; Koriska, K; Köhn, H

    2000-01-01

    At present, medical applications applying World Wide Web (WWW) technology are mainly used to view static images and to retrieve some information. The Java platform is a relatively new way of computing, especially designed for network computing and distributed applications, which enables an interactive connection between user and information via the WWW. The Java 2 Software Development Kit (SDK), including the Java2D API, Java Remote Method Invocation (RMI) technology, Object Serialization and the Java Advanced Imaging (JAI) extension, was used to achieve a robust, platform independent and network centric solution. Medical image processing software based on this technology is presented, and the adequate performance capability of Java is demonstrated by an iterative reconstruction algorithm for single photon emission computerized tomography (SPECT).

  1. Linking brain, mind and behavior.

    PubMed

    Makeig, Scott; Gramann, Klaus; Jung, Tzyy-Ping; Sejnowski, Terrence J; Poizner, Howard

    2009-08-01

    Cortical brain areas and dynamics that evolved to organize motor behavior in our three-dimensional environment also support more general human cognitive processes. Yet traditional brain imaging paradigms typically allow and record only minimal participant behavior, then reduce the recorded data to single map features of averaged responses. To more fully investigate the complex links between distributed brain dynamics and motivated natural behavior, we propose the development of wearable mobile brain/body imaging (MoBI) systems that continuously capture the wearer's high-density electrical brain and muscle signals, three-dimensional body movements, audiovisual scene and point of regard, plus new data-driven analysis methods to model their interrelationships. The new imaging modality should allow new insights into how spatially distributed brain dynamics support natural human cognition and agency.

  2. Suppression of fixed pattern noise for infrared image system

    NASA Astrophysics Data System (ADS)

    Park, Changhan; Han, Jungsoo; Bae, Kyung-Hoon

    2008-04-01

    In this paper, we propose the suppression of fixed pattern noise (FPN) and the compensation of soft defects to improve object tracking in a cooled staring infrared focal plane array (IRFPA) imaging system. FPN appears in the observed image even when temperature-dependent non-uniformity compensation (NUC) is applied. Soft defects appear as glittering black and white points caused by the time-varying non-uniformity characteristics of the IR detector. This problem is important because it causes serious errors in object tracking as well as degradation of image quality. The signal processing architecture in a cooled staring IRFPA imaging system consists of three tables of reference gain and offset values: low, normal, and high temperature. The proposed method operates two offset tables for each of these tables, covering six temperature ranges in total. The proposed soft defect compensation consists of three stages: (1) separating the image into sub-images, (2) determining the motion distribution of objects between the sub-images, and (3) analyzing the statistical characteristics of each stationary fixed pixel. Experimental results show that the proposed method yields an improved image in which FPN caused by changes in the temperature distribution is suppressed in real-time.
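The table-based NUC that the method builds on can be sketched as a per-pixel gain/offset correction selected by temperature range; the table values and range boundaries below are invented for illustration:

```python
import numpy as np

# Per-pixel gain/offset tables for three temperature ranges (values invented).
tables = {
    "low":    {"gain": np.array([[1.00, 1.05]]), "offset": np.array([[2.0, -1.0]])},
    "normal": {"gain": np.array([[0.98, 1.02]]), "offset": np.array([[1.0,  0.0]])},
    "high":   {"gain": np.array([[0.95, 1.10]]), "offset": np.array([[0.0,  3.0]])},
}

def select_table(temp_c):
    """Pick the reference table for the current operating temperature (assumed bounds)."""
    if temp_c < 10:
        return tables["low"]
    if temp_c < 30:
        return tables["normal"]
    return tables["high"]

def nuc(raw, temp_c):
    """Two-point correction: corrected = gain * raw + offset, per pixel."""
    t = select_table(temp_c)
    return t["gain"] * raw + t["offset"]

raw = np.array([[100.0, 100.0]])
corrected = nuc(raw, temp_c=20.0)
assert np.allclose(corrected, [[99.0, 102.0]])
```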

  3. Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm.

    PubMed

    Yang, Mengzhao; Song, Wei; Mei, Haibin

    2017-07-23

    The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved in not only storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model, based on the maximum hierarchical layer algorithm, is proposed and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusing it with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method can achieve better performance for data storage than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear changes with an increase of RS images, which proves that image retrieval using our method is efficient.
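The mode-seeking step at the heart of mean-shift can be sketched in one dimension; the features, bandwidth, and flat-kernel update are illustrative assumptions, whereas the paper runs a canopy-fused variant over Hadoop MapReduce:

```python
import numpy as np

def mean_shift_modes(points, bandwidth=1.0, iters=50):
    """Shift each point to the mean of its neighbors until it settles on a mode."""
    shifted = np.array(points, dtype=float)
    for _ in range(iters):
        for i, p in enumerate(shifted):
            # Flat kernel: average all points within the bandwidth.
            neighbors = shifted[np.abs(shifted - p) <= bandwidth]
            shifted[i] = neighbors.mean()
    # Converged points collapse onto a small set of modes.
    return np.unique(np.round(shifted, 3))

# Two well-separated groups of 1-D image features (invented).
modes = mean_shift_modes([1.0, 1.2, 0.9, 8.0, 8.1])
assert len(modes) == 2          # one mode per cluster
assert modes[0] < 2.0 < modes[1]
```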

  4. Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm

    PubMed Central

    Song, Wei; Mei, Haibin

    2017-01-01

    The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved in not only storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model, based on the maximum hierarchical layer algorithm, is proposed and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusing it with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method can achieve better performance for data storage than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear changes with an increase of RS images, which proves that image retrieval using our method is efficient. PMID:28737699

  5. Assessment of body fat based on potential function clustering segmentation of computed tomography images

    NASA Astrophysics Data System (ADS)

    Zhang, Lixin; Lin, Min; Wan, Baikun; Zhou, Yu; Wang, Yizhong

    2005-01-01

    In this paper, a new method for assessing body fat and its distribution is proposed based on CT image processing. As it is more sensitive to slight differences in attenuation than standard radiography, CT depicts soft tissues with better clarity, and body fat occupies a distinct grey-level range compared with its neighboring tissues in a CT image. An effective multi-threshold image segmentation method based on potential function clustering is used to deal with the multiple peaks in the grey-level histogram of a CT image. The abdominal CT images of 14 volunteers of different fatness were processed with the proposed method. The method yields not only the total fat area but also the differentiation of subcutaneous fat from intra-abdominal fat. The results show the adaptability and stability of the proposed method, which will be a useful tool for diagnosing obesity.
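The multi-threshold idea, treating histogram bins as sources whose summed potential peaks at cluster centers, can be sketched as follows; the kernel shape, its width, and the toy two-population histogram are illustrative assumptions, not the paper's exact potential function:

```python
import numpy as np

def potential(hist, alpha=0.1):
    """P(g) = sum_h hist[h] / (1 + alpha * (g - h)^2), a smooth 'potential' field."""
    levels = np.arange(len(hist), dtype=float)
    diff2 = (levels[:, None] - levels[None, :]) ** 2
    return (hist[None, :] / (1.0 + alpha * diff2)).sum(axis=1)

# Toy grey-level histogram with two populations (e.g. fat vs. other soft tissue).
hist = np.zeros(32)
hist[6], hist[7], hist[22], hist[23] = 100, 80, 90, 70

P = potential(hist)
# Cluster centers are local maxima of the potential ...
peaks = [g for g in range(1, 31) if P[g] > P[g - 1] and P[g] > P[g + 1]]
assert len(peaks) == 2
# ... and a threshold is placed at the potential minimum between adjacent peaks.
threshold = peaks[0] + int(np.argmin(P[peaks[0]:peaks[1]]))
assert 7 < threshold < 22
```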

  6. Analysis of a new phase and height algorithm in phase measurement profilometry

    NASA Astrophysics Data System (ADS)

    Bian, Xintian; Zuo, Fen; Cheng, Ju

    2018-04-01

    Traditional phase measurement profilometry adopts divergent illumination to obtain the height distribution of a measured object accurately. However, the mapping relation between reference plane coordinates and phase distribution must be calculated before measurement, and the data are then stored in a computer in the form of a lookup table for later use. This study improved the distribution of the projected fringes and derived the phase-height mapping algorithm for the case in which the two pupils of the projection and imaging systems are at unequal heights and the projection and imaging axes lie on different planes. With this algorithm, calculating the mapping relation between reference plane coordinates and phase distribution prior to measurement is unnecessary. Thus, the measurement process is simplified, and the construction of an experimental system is made easier. Computer simulation and experimental results confirm the effectiveness of the method.
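For reference, in the conventional crossed-optical-axes geometry (projector and camera pupils at the same height $l_0$ above the reference plane, separated by $d$, with reference-plane fringe frequency $f_0$), the classical phase-height mapping is commonly written, under one sign convention, as

```latex
h(x, y) = \frac{l_0 \, \Delta\phi(x, y)}{\Delta\phi(x, y) + 2\pi f_0 d}
```

where $\Delta\phi(x, y)$ is the phase difference between the object and the reference plane. The algorithm analyzed here generalizes this mapping to pupils at unequal heights and non-coplanar projection and imaging axes.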

  7. Personal Computer (PC) based image processing applied to fluid mechanics

    NASA Technical Reports Server (NTRS)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC-based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation, a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
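The gridding step described above, interpolating scattered streak velocities onto uniform grid points with an adaptive Gaussian window, can be sketched as follows; the rule for growing the window when samples are sparse is an illustrative assumption:

```python
import numpy as np

def gaussian_interp(xy, v, grid_pts, sigma0=0.5, min_neighbors=3):
    """Gaussian-weighted average of scattered values v at positions xy onto grid_pts."""
    out = np.empty(len(grid_pts))
    for k, g in enumerate(grid_pts):
        sigma = sigma0
        d2 = ((xy - g) ** 2).sum(axis=1)
        # Adaptive window: widen until enough samples contribute.
        while (d2 < (2 * sigma) ** 2).sum() < min_neighbors:
            sigma *= 1.5
        w = np.exp(-d2 / (2 * sigma ** 2))
        out[k] = (w * v).sum() / w.sum()
    return out

# Four streak positions placed symmetrically around the grid point (1, 1),
# carrying a synthetic velocity field u = x.
xy = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
v = xy[:, 0]
grid = np.array([[1.0, 1.0]])
u = gaussian_interp(xy, v, grid)
assert abs(u[0] - 1.0) < 1e-9   # symmetric weights recover the exact mean
```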

  8. Remote sensing characterization of the Animas River watershed, southwestern Colorado, by AVIRIS imaging spectroscopy

    USGS Publications Warehouse

    Dalton, J.B.; Bove, D.J.; Mladinich, C.S.

    2005-01-01

    Visible-wavelength and near-infrared image cubes of the Animas River watershed in southwestern Colorado have been acquired by the Jet Propulsion Laboratory's Airborne Visible and InfraRed Imaging Spectrometer (AVIRIS) instrument and processed using the U.S. Geological Survey Tetracorder v3.6a2 implementation. The Tetracorder expert system utilizes a spectral reference library containing more than 400 laboratory and field spectra of end-member minerals, mineral mixtures, vegetation, manmade materials, atmospheric gases, and additional substances to generate maps of mineralogy, vegetation, snow, and other material distributions. Major iron-bearing, clay, mica, carbonate, sulfate, and other minerals were identified, among which are several minerals associated with acid rock drainage, including pyrite, jarosite, alunite, and goethite. Distributions of minerals such as calcite and chlorite indicate a relationship between acid-neutralizing assemblages and stream geochemistry within the watershed. Images denoting material distributions throughout the watershed have been orthorectified against digital terrain models to produce georeferenced image files suitable for inclusion in Geographic Information System databases. Results of this study are of use to land managers, stakeholders, and researchers interested in understanding a number of characteristics of the Animas River watershed.

  9. Artifacts in magnetic spirals retrieved by transport of intensity equation (TIE)

    NASA Astrophysics Data System (ADS)

    Cui, J.; Yao, Y.; Shen, X.; Wang, Y. G.; Yu, R. C.

    2018-05-01

    The artifacts in magnetic structures reconstructed from Lorentz transmission electron microscopy (LTEM) images with the TIE method have been analyzed in detail. Processing of simulated images of Bloch and Néel spirals indicated that improper parameters in TIE may overestimate high-frequency information and induce false features in the retrieved images. Specimen tilting further complicates the analysis of the images because the LTEM image contrast is not the result of the magnetization distribution within the specimen alone but of the integral projection of the magnetic induction filling the entire space, including the specimen.
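    The reconstruction underlying these artifacts can be illustrated with a minimal Fourier-space TIE solver for the uniform-intensity case, where the transport-of-intensity equation reduces to an inverse Laplacian. The Tikhonov regularizer `alpha` below is a stand-in for the "parameters" whose improper choice distorts the retrieved spectrum; all values are illustrative, not from the paper.

```python
import numpy as np

def tie_phase(dIdz, k=1.0, I0=1.0, dx=1.0, alpha=1e-3):
    """Minimal TIE phase retrieval for near-uniform intensity I0: solve
    I0 * laplacian(phi) = -k * dI/dz by Fourier inversion. A poorly chosen
    alpha (or defocus step used to estimate dI/dz) distorts the retrieved
    spectrum and can create false features."""
    ny, nx = dIdz.shape
    fy = np.fft.fftfreq(ny, d=dx)
    fx = np.fft.fftfreq(nx, d=dx)
    q2 = (2 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
    rhs = -k * dIdz / I0
    phi_hat = np.fft.fft2(rhs) / -(q2 + alpha)   # regularized inverse Laplacian
    phi_hat[0, 0] = 0.0                          # fix the arbitrary mean phase
    return np.real(np.fft.ifft2(phi_hat))

# Check on a single Fourier mode: for phi = cos(2*pi*m*x/N),
# laplacian(phi) = -(2*pi*m/N)**2 * phi, so dI/dz = (2*pi*m/N)**2 * phi
# when k = I0 = 1.
N, m = 64, 4
phi_true = np.tile(np.cos(2 * np.pi * m * np.arange(N) / N), (N, 1))
dIdz = (2 * np.pi * m / N) ** 2 * phi_true
phi_rec = tie_phase(dIdz, alpha=1e-9)
```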

  10. Enterprise Imaging Governance: HIMSS-SIIM Collaborative White Paper.

    PubMed

    Roth, Christopher J; Lannum, Louis M; Joseph, Carol L

    2016-10-01

    Enterprise imaging governance is an emerging need in health enterprises today. This white paper highlights the decision-making body, framework, and process for optimal enterprise imaging governance inclusive of five areas of focus: program governance, technology governance, information governance, clinical governance, and financial governance. It outlines relevant parallels and differences when forming or optimizing imaging governance as compared with other established broad horizontal governance groups, such as for the electronic health record. It is intended for CMIOs and health informatics leaders looking to grow and govern a program to optimally capture, store, index, distribute, view, exchange, and analyze the images of their enterprise.

  11. Optical noise-free image encryption based on quick response code and high dimension chaotic system in gyrator transform domain

    NASA Astrophysics Data System (ADS)

    Sui, Liansheng; Xu, Minjie; Tian, Ailing

    2017-04-01

    A novel optical image encryption scheme is proposed based on a quick response code and a high-dimension chaotic system, where only the intensity distribution of the encoded information is recorded as the ciphertext. Initially, the quick response code is generated from the plain image and placed in the input plane of the double random phase encoding architecture. Then, the code is encrypted to a ciphertext with a noise-like distribution by using two cascaded gyrator transforms. In the process of encryption, parameters such as the rotation angles and random phase masks are generated as interim variables and functions based on the Chen system. A new phase retrieval algorithm is designed to reconstruct the initial quick response code in the process of decryption, in which a priori information such as the three position detection patterns is used as the support constraint. The original image can be obtained without any energy loss by scanning the decrypted code with mobile devices. The ciphertext image is a real-valued function, which is more convenient for storage and transmission. Meanwhile, the security of the proposed scheme is greatly enhanced due to the high sensitivity to the initial values of the Chen system. Extensive cryptanalysis and simulation have been performed to demonstrate the feasibility and effectiveness of the proposed scheme.
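    Classic double random phase encoding, the architecture this scheme builds on, can be sketched with ordinary Fourier transforms standing in for the paper's gyrator transforms and pseudorandom masks standing in for the chaos-generated ones. Unlike the paper, this round-trip sketch keeps the full complex field rather than only its intensity, so no phase retrieval step is needed.

```python
import numpy as np

rng = np.random.default_rng(42)

def drpe_encrypt(img, phase1, phase2):
    """Double random phase encoding: mask in the input plane, transform,
    mask in the transform plane, transform back."""
    field = img * np.exp(2j * np.pi * phase1)
    spec = np.fft.fft2(field) * np.exp(2j * np.pi * phase2)
    return np.fft.ifft2(spec)

def drpe_decrypt(cipher, phase1, phase2):
    """Undo both masks in reverse order; works here because the full
    complex ciphertext is kept."""
    spec = np.fft.fft2(cipher) * np.exp(-2j * np.pi * phase2)
    return np.abs(np.fft.ifft2(spec) * np.exp(-2j * np.pi * phase1))

img = rng.random((32, 32))          # stand-in for the QR-coded plaintext
p1, p2 = rng.random((32, 32)), rng.random((32, 32))
cipher = drpe_encrypt(img, p1, p2)
restored = drpe_decrypt(cipher, p1, p2)
```

In the actual scheme only the intensity of the ciphertext would be recorded, and the QR code's position detection patterns serve as the support constraint for recovering the lost phase.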

  12. A web-based institutional DICOM distribution system with the integration of the Clinical Trial Processor (CTP).

    PubMed

    Aryanto, K Y E; Broekema, A; Langenhuysen, R G A; Oudkerk, M; van Ooijen, P M A

    2015-05-01

    To develop and test a fast and easy rule-based web environment with optional de-identification of imaging data to facilitate data distribution within a hospital environment. A web interface was built using Hypertext Preprocessor (PHP), an open-source scripting language for web development, and Java, with SQL Server to handle the database. The system allows for the selection of patient data and for de-identifying these when necessary. Using the services provided by the RSNA Clinical Trial Processor (CTP), the selected images were pushed to the appropriate services using a protocol based on the module created for the associated task. Five pipelines, each performing a different task, were set up in the server. In a 75-month period, more than 2,000,000 images were transferred and de-identified in a proper manner, while 20,000,000 images were moved from one node to another without de-identification. While maintaining a high level of security and stability, the proposed system is easy to set up, integrates well with our clinical and research practice, and provides a fast and accurate vendor-neutral process of transferring, de-identifying, and storing DICOM images. Its ability to run different de-identification processes in parallel pipelines is a major advantage in both clinical and research settings.
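    The rule-based de-identification step can be illustrated with a plain dictionary standing in for a DICOM dataset. The tag names and keep/remove rules below are hypothetical examples, not the actual CTP anonymizer scripts used by the authors.

```python
# Illustrative rule sets; a real profile (e.g., DICOM PS3.15) is far larger.
REMOVE = {"PatientName", "PatientBirthDate", "PatientAddress"}
KEEP = {"Modality", "StudyDate", "PixelData"}

def deidentify(ds, remove=REMOVE, keep=KEEP):
    """Return a copy with identifying tags blanked, whitelisted tags kept,
    and all other tags dropped (a conservative default)."""
    out = {}
    for tag, value in ds.items():
        if tag in remove:
            out[tag] = ""          # blank identifying fields
        elif tag in keep:
            out[tag] = value       # pass through whitelisted fields
        # anything else is silently dropped
    return out

ds = {"PatientName": "Doe^John", "Modality": "CT",
      "StudyDate": "20150501", "SeriesComment": "internal note"}
anon = deidentify(ds)
```

In the actual system, one such rule set would be attached to each of the five CTP pipelines, with the non-de-identifying pipelines simply forwarding the dataset untouched.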

  13. The integrated design and archive of space-borne signal processing and compression coding

    NASA Astrophysics Data System (ADS)

    He, Qiang-min; Su, Hao-hang; Wu, Wen-bo

    2017-10-01

    With the increasing demand from users for the extraction of remote sensing image information, it is urgent to significantly enhance the whole system's imaging quality and imaging ability through an integrated design that achieves a compact structure, low mass, and higher attitude maneuverability. At present, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed across different devices. The volume, weight, and power consumption of these two units are relatively large, which makes them unable to meet the requirements of a high-mobility remote sensing camera. In accordance with the technical requirements of the high-mobility remote sensing camera, this paper designs a space-borne integrated signal processing and compression circuit by drawing on several technologies, including high-speed, high-density mixed analog-digital PCB design, embedded DSP technology, and image compression based on special-purpose chips. This circuit lays a solid foundation for the development of the high-mobility remote sensing camera.

  14. Modified interferometric imaging condition for reverse-time migration

    NASA Astrophysics Data System (ADS)

    Guo, Xue-Bao; Liu, Hong; Shi, Ying

    2018-01-01

    For reverse-time migration, high-resolution imaging mainly depends on the accuracy of the velocity model and the imaging condition. In practice, however, the small-scale components of the velocity model cannot be estimated by tomographic methods; therefore, the wavefields are not accurately reconstructed from the background velocity, and the imaging process generates artefacts. Some of this noise is due to the cross-correlation of unrelated seismic events. The interferometric imaging condition suppresses imaging noise very effectively, especially the unknown random disturbance of the small-scale part. The conventional interferometric imaging condition is extended in this study to obtain a new imaging condition based on the pseudo-Wigner distribution function (WDF). Numerical examples show that the modified interferometric imaging condition improves imaging precision.

  15. An Explorative Study to Use DBD Plasma Generation for Aircraft Icing Mitigation

    NASA Astrophysics Data System (ADS)

    Hu, Hui; Zhou, Wenwu; Liu, Yang; Kolbakir, Cem

    2017-11-01

    An explorative investigation was performed to demonstrate the feasibility of utilizing the thermal effect induced by Dielectric-Barrier-Discharge (DBD) plasma generation for aircraft icing mitigation. The experimental study was performed in the Icing Research Tunnel at Iowa State University (ISU-IRT). A NACA0012 airfoil/wing model embedded with DBD plasma actuators was installed in ISU-IRT under typical glaze icing conditions pertinent to aircraft inflight icing phenomena. While a high-speed imaging system was used to record the dynamic ice accretion process over the airfoil surface for the test cases with and without the DBD plasma actuators switched on, an infrared (IR) thermal imaging system was utilized to map the corresponding temperature distributions to quantify the unsteady heat transfer and phase-changing process over the airfoil surface. The thermal effect induced by DBD plasma generation was demonstrated to be able to keep the airfoil surface free of ice during the entire ice accretion experiment. The measured quantitative surface temperature distributions were correlated with the acquired images of the dynamic ice accretion and water runback processes to elucidate the underlying physics. National Science Foundation CBET-1064196 and CBET-1435590.

  16. Laser-induced acoustic imaging of underground objects

    NASA Astrophysics Data System (ADS)

    Li, Wen; DiMarzio, Charles A.; McKnight, Stephen W.; Sauermann, Gerhard O.; Miller, Eric L.

    1999-02-01

    This paper introduces a new demining technique based on the photo-acoustic interaction, together with results from photo-acoustic experiments. We have buried different types of targets (metal, rubber and plastic) in different media (sand, soil and water) and imaged them by measuring reflection of acoustic waves generated by irradiation with a CO2 laser. Research has been focused on the signal acquisition and signal processing. A deconvolution method using Wiener filters is utilized in data processing. Using a uniform spatial distribution of laser pulses at the ground's surface, we obtained 3D images of buried objects. The images give us a clear representation of the shapes of the underground objects. The quality of the images depends on the mismatch of acoustic impedance of the buried objects, the bandwidth and center frequency of the acoustic sensors and the selection of filter functions.
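    The Wiener-filter deconvolution used in the data processing can be sketched in the frequency domain. The constant noise-to-signal ratio and the small synthetic point spread function below are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    """Frequency-domain Wiener filter W = H* / (|H|^2 + NSR), where H is the
    transfer function of the point spread function and NSR is an assumed
    constant noise-to-signal power ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))

# Illustrative round trip with a small synthetic blur (circular convolution).
rng = np.random.default_rng(1)
x = rng.random((16, 16))
psf = np.zeros((16, 16))
psf[0, 0], psf[0, 1], psf[1, 0] = 0.6, 0.25, 0.15
blurred = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(psf)))
restored = wiener_deconvolve(blurred, psf, nsr=1e-8)
```

With a larger `nsr` the filter trades resolution for noise suppression, which is the knob the authors' filter-function selection controls.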

  17. South Florida Everglades: satellite image map

    USGS Publications Warehouse

    Jones, John W.; Thomas, Jean-Claude; Desmond, G.B.

    2001-01-01

    These satellite image maps are one product of the USGS Land Characteristics from Remote Sensing project, funded through the USGS Place-Based Studies Program (http://access.usgs.gov/) with support from the Everglades National Park (http://www.nps.gov/ever/). The objective of this project is to develop and apply innovative remote sensing and geographic information system techniques to map the distribution of vegetation, vegetation characteristics, and related hydrologic variables through space and over time. The mapping and description of vegetation characteristics and their variations are necessary to accurately simulate surface hydrology and other surface processes in South Florida and to monitor land surface changes. As part of this research, data from many airborne and satellite imaging systems have been georeferenced and processed to facilitate data fusion and analysis. These image maps were created using image fusion techniques developed as part of this project.

  18. Raman Spectroscopic Imaging of the Whole Ciona intestinalis Embryo during Development

    PubMed Central

    Nakamura, Mitsuru J.; Hotta, Kohji; Oka, Kotaro

    2013-01-01

    Intracellular composition and the distribution of bio-molecules play central roles in the specification of cell fates and morphogenesis during embryogenesis. Consequently, investigation of changes in the expression and distribution of bio-molecules, especially mRNAs and proteins, is an important challenge in developmental biology. Raman spectroscopic imaging, a non-invasive and label-free technique, allows simultaneous imaging of the intracellular composition and distribution of multiple bio-molecules. In this study, we explored the application of Raman spectroscopic imaging in the whole Ciona intestinalis embryo during development. Analysis of Raman spectra scattered from C. intestinalis embryos revealed a number of localized patterns of high Raman intensity within the embryo. Based on the observed distribution of bio-molecules, we succeeded in identifying the location and structure of differentiated muscle and endoderm within the whole embryo, up to the tailbud stage, in a label-free manner. Furthermore, during cell differentiation, we detected significant differences in cell state between muscle/endoderm daughter cells and daughter cells with other fates that had divided from the same mother cells; this was achieved by focusing on the Raman intensity of single Raman bands at 1002 or 1526 cm−1, respectively. This study reports the first application of Raman spectroscopic imaging to the study of identifying and characterizing differentiating tissues in a whole chordate embryo. Our results suggest that Raman spectroscopic imaging is a feasible label-free technique for investigating the developmental process of the whole embryo of C. intestinalis. PMID:23977129

  19. Multispectral UV imaging for surface analysis of MUPS tablets with special focus on the pellet distribution.

    PubMed

    Novikova, Anna; Carstensen, Jens M; Rades, Thomas; Leopold, Prof Dr Claudia S

    2016-12-30

    In the present study the applicability of multispectral UV imaging in combination with multivariate image analysis for surface evaluation of MUPS tablets was investigated with respect to the differentiation of the API pellets from the excipients matrix, estimation of the drug content as well as pellet distribution, and influence of the coating material and tablet thickness on the predictive model. Different formulations consisting of coated drug pellets with two coating polymers (Aquacoat ® ECD and Eudragit ® NE 30 D) at three coating levels each were compressed to MUPS tablets with various amounts of coated pellets and different tablet thicknesses. The coated drug pellets were clearly distinguishable from the excipients matrix using a partial least squares approach regardless of the coating layer thickness and coating material used. Furthermore, the number of the detected drug pellets on the tablet surface allowed an estimation of the true drug content in the respective MUPS tablet. In addition, the pellet distribution in the MUPS formulations could be estimated by UV image analysis of the tablet surface. In conclusion, this study revealed that UV imaging in combination with multivariate image analysis is a promising approach for the automatic quality control of MUPS tablets during the manufacturing process. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Image Based Hair Segmentation Algorithm for the Application of Automatic Facial Caricature Synthesis

    PubMed Central

    Peng, Zhenyun; Zhang, Yaohui

    2014-01-01

    Hair is a salient feature of the human face region and one of the important cues for face analysis. Accurate detection and representation of the hair region is one of the key components for automatic synthesis of human facial caricature. In this paper, an automatic hair detection algorithm for the application of automatic synthesis of facial caricature based on a single image is proposed. Firstly, hair regions in training images are labeled manually, and the hair position prior distributions and hair color likelihood distribution function are estimated from these labels efficiently. Secondly, the energy function of the test image is constructed according to the estimated prior distributions of hair location and hair color likelihood. This energy function is further optimized using the graph cuts technique, and an initial hair region is obtained. Finally, the K-means algorithm and image postprocessing techniques are applied to the initial hair region so that the final hair region can be segmented precisely. Experimental results show that the average processing time for each image is about 280 ms and the average hair region detection accuracy is above 90%. The proposed algorithm is applied to a facial caricature synthesis system. Experiments proved that with the proposed hair segmentation algorithm the facial caricatures are vivid and satisfying. PMID:24592182

  1. Applying local binary patterns in image clustering problems

    NASA Astrophysics Data System (ADS)

    Skorokhod, Nikolai N.; Elizarov, Alexey I.

    2017-11-01

    Because cloudiness plays a critical role in the Earth's radiative balance, the study of the distribution of different types of clouds and their movements is relevant. The main sources of such information are artificial satellites, which provide data in the form of images. The most commonly used approach to processing and classifying cloud images is based on the description of texture features. The use of a set of local binary patterns is proposed to describe the image texture.
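    The texture description by local binary patterns can be sketched as follows. The basic 8-neighbor LBP and its histogram descriptor below are the generic textbook formulation, not necessarily the specific variant used by the authors.

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbor local binary pattern: each interior pixel gets a
    byte whose bits mark which neighbors are >= the center value."""
    img = np.asarray(img, dtype=float)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    c = img[1:-1, 1:-1]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def lbp_histogram(img, bins=256):
    """Texture descriptor: normalized histogram of the LBP codes."""
    codes = lbp_image(img)
    hist = np.bincount(codes.ravel(), minlength=bins).astype(float)
    return hist / hist.sum()
```

Cloud images would then be clustered by comparing these normalized histograms, e.g. with a chi-squared or histogram-intersection distance.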

  2. Analysis of Morphological Features of Benign and Malignant Breast Cell Extracted From FNAC Microscopic Image Using the Pearsonian System of Curves.

    PubMed

    Rajbongshi, Nijara; Bora, Kangkana; Nath, Dilip C; Das, Anup K; Mahanta, Lipi B

    2018-01-01

    Cytological changes in the shape and size of nuclei are among the common morphometric features used to study breast cancer, and they can be observed by careful screening of fine needle aspiration cytology (FNAC) images. This study attempts to categorize a collection of FNAC microscopic images into benign and malignant classes based on a family of probability distributions, using some morphometric features of cell nuclei. For this study, the features area, perimeter, eccentricity, compactness, and circularity of cell nuclei were extracted from FNAC images of both benign and malignant samples using an image processing technique. All experiments were performed on a generated FNAC image database containing 564 malignant (cancerous) and 693 benign (noncancerous) cell-level images. The five extracted features were reduced to three (area, perimeter, and circularity) based on the mean statistic. Finally, the data were fitted to the generalized Pearsonian system of frequency curves, so that the resulting distribution can be used as a statistical model. The Pearsonian system is a family of distributions in which kappa (κ), computed as a function of the first four central moments, is the selection criterion. For the benign group, kappa (κ) corresponding to area, perimeter, and circularity was -0.00004, 0.0000, and 0.04155, and for the malignant group it was 1016942, 0.01464, and -0.3213, respectively. Thus, the families of distributions related to these features differ between the benign and malignant groups, and therefore the characterization of their probability curves will also differ.
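    Pearson's selection criterion can be computed directly from sample moments. The formula below is the standard kappa criterion of the Pearson system (with b1 the squared skewness and b2 the kurtosis); the sample data in the comments are illustrative, not from the study.

```python
import numpy as np

def pearson_kappa(x):
    """Pearson's selection criterion:
    kappa = b1*(b2+3)^2 / (4*(4*b2 - 3*b1)*(2*b2 - 3*b1 - 6)),
    with b1 = mu3^2/mu2^3 and b2 = mu4/mu2^2 computed from the first four
    central moments of the sample."""
    x = np.asarray(x, dtype=float)
    mu = x - x.mean()
    m2, m3, m4 = (np.mean(mu ** k) for k in (2, 3, 4))
    b1 = m3 ** 2 / m2 ** 3
    b2 = m4 / m2 ** 2
    return b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))

# For a symmetric sample the skewness term b1 vanishes, so kappa = 0,
# which selects the symmetric branch of the Pearson family.
symmetric = pearson_kappa([-2, -1, 0, 1, 2])
```

The sign and magnitude of kappa then pick out the Pearson type (e.g., kappa < 0 for Type I, 0 < kappa < 1 for Type IV), which is how the benign and malignant feature distributions are distinguished.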

  3. a Kml-Based Approach for Distributed Collaborative Interpretation of Remote Sensing Images in the Geo-Browser

    NASA Astrophysics Data System (ADS)

    Huang, L.; Zhu, X.; Guo, W.; Xiang, L.; Chen, X.; Mei, Y.

    2012-07-01

    Existing implementations of collaborative image interpretation have many limitations for very large satellite imagery, such as inefficient browsing and slow transmission. This article presents a KML-based approach to support distributed, real-time, synchronous collaborative interpretation of remote sensing images in the geo-browser. As an OGC standard, KML (Keyhole Markup Language) has the advantage of organizing various types of geospatial data (including imagery, annotations, geometry, etc.) in the geo-browser. Existing KML elements can be used to describe simple interpretation results indicated by vector symbols. To enlarge its application, this article extends KML elements to describe some complex image processing operations, including band combination, grey-level transformation, geometric correction, etc. The improved KML is employed to describe and share interpretation operations and results among interpreters. Further, this article develops several collaboration-related services: a collaboration launch service, a perceiving service, and a communication service. The launch service creates a collaborative interpretation task and provides a unified interface for all participants. The perceiving service allows interpreters to share collaboration awareness. The communication service provides interpreters with written-text communication. Finally, the GeoGlobe geo-browser (an extensible and flexible geospatial platform developed at LIESMARS) is selected to perform experiments on collaborative image interpretation. The geo-browser, which manages and visualizes massive geospatial information, can provide distributed users with quick browsing and transmission. Meanwhile, in the geo-browser, GIS data (for example DEM, DTM, thematic maps, etc.) can be integrated to assist in improving the accuracy of interpretation. Results show that the proposed method is able to support distributed collaborative interpretation of remote sensing images.

  4. Determination of the microbolometric FPA's responsivity with imaging system's radiometric considerations

    NASA Astrophysics Data System (ADS)

    Gogler, Slawomir; Bieszczad, Grzegorz; Krupinski, Michal

    2013-10-01

    Thermal imagers and the infrared array sensors used in them are subject to a calibration procedure and evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. Detectors used in a thermal camera are illuminated by infrared radiation transmitted through an infrared-transmitting optical system. Often an optical system, when exposed to a uniform Lambertian source, forms a non-uniform irradiation distribution in its image plane. In order to carry out an accurate non-uniformity correction, it is essential to correctly predict the irradiation distribution from a uniform source. In the article a non-uniformity correction method is presented that takes into account the optical system's radiometry. Predictions of the irradiation distribution have been confronted with measured irradiance values. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
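    A minimal sketch of the idea, assuming the classic cos^4(theta) falloff as the radiometric model (a real lens needs its full radiometric prescription): predict the irradiance a uniform Lambertian source produces on the focal plane, then fit a per-pixel gain and offset against that prediction rather than against a flat field.

```python
import numpy as np

def cos4_irradiance(ny, nx, focal=1.0, pitch=0.01):
    """Predicted relative irradiance on the focal plane from a uniform
    Lambertian source, using the classic cos^4(theta) falloff model."""
    y = (np.arange(ny) - (ny - 1) / 2) * pitch
    x = (np.arange(nx) - (nx - 1) / 2) * pitch
    r = np.hypot(x[None, :], y[:, None])     # distance from optical axis
    theta = np.arctan2(r, focal)             # field angle per pixel
    return np.cos(theta) ** 4

def two_point_nuc(raw_low, raw_high, pred_low, pred_high):
    """Per-pixel gain/offset such that gain*raw + offset reproduces the
    radiometrically predicted irradiance at two reference flux levels."""
    gain = (pred_high - pred_low) / (raw_high - raw_low)
    offset = pred_low - gain * raw_low
    return gain, offset

# Simulated detector with per-pixel gain a and offset b observing the
# predicted pattern at two flux levels.
pred_high = cos4_irradiance(8, 8)
pred_low = 0.3 * pred_high
rng = np.random.default_rng(2)
a = 1.0 + 0.1 * rng.standard_normal((8, 8))
b = 0.05 * rng.standard_normal((8, 8))
gain, offset = two_point_nuc(a * pred_low + b, a * pred_high + b,
                             pred_low, pred_high)
corrected = gain * (a * pred_high + b) + offset
```

Correcting toward the predicted (non-flat) pattern instead of a constant is precisely what keeps the optical falloff from being mistaken for detector non-uniformity.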

  5. The effect of signal variability on the histograms of anthropomorphic channel outputs: factors resulting in non-normally distributed data

    NASA Astrophysics Data System (ADS)

    Elshahaby, Fatma E. A.; Ghaly, Michael; Jha, Abhinav K.; Frey, Eric C.

    2015-03-01

    Model observers are widely used in medical imaging for the optimization and evaluation of instrumentation, acquisition parameters, and image reconstruction and processing methods. The channelized Hotelling observer (CHO) is a commonly used model observer in nuclear medicine and has seen increasing use in other modalities. An anthropomorphic CHO consists of a set of channels that model some aspects of the human visual system and the Hotelling observer, which is the optimal linear discriminant. The optimality of the CHO is based on the assumption that the channel outputs for data with and without the signal present have a multivariate normal distribution with equal class covariance matrices. The channel outputs result from the dot product of channel templates with input images and are thus the sum of a large number of random variables. The central limit theorem is thus often used to justify the assumption that the channel outputs are normally distributed. In this work, we aim to examine this assumption for realistically simulated nuclear medicine images when various types of signal variability are present.
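    The CHO construction described above can be sketched in a few lines. The random channel templates below merely exercise the machinery (anthropomorphic channels would be, e.g., difference-of-Gaussians profiles), and all data are synthetic.

```python
import numpy as np

def channelize(images, templates):
    """Channel outputs: dot products of channel templates with input images
    (images: N x P pixels, templates: C x P) -> N x C."""
    return images @ templates.T

def hotelling_template(v_signal, v_noise):
    """Hotelling discriminant in channel space: w = S^-1 (m1 - m0), with S
    the average of the two class covariance matrices."""
    s = 0.5 * (np.cov(v_signal.T) + np.cov(v_noise.T))
    return np.linalg.solve(s, v_signal.mean(axis=0) - v_noise.mean(axis=0))

# Synthetic study: signal-present vs signal-absent images in white noise.
rng = np.random.default_rng(3)
P, C, N = 64, 4, 500
templates = rng.standard_normal((C, P))      # stand-in channel profiles
signal = 0.5 * rng.standard_normal(P)
v_absent = channelize(rng.standard_normal((N, P)), templates)
v_present = channelize(rng.standard_normal((N, P)) + signal, templates)
w = hotelling_template(v_present, v_absent)
t_absent, t_present = v_absent @ w, v_present @ w
```

The paper's question is whether the histograms of `t_absent` and `t_present` remain normal once realistic signal variability replaces the fixed `signal` used here.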

  6. A Machine Learning Ensemble Classifier for Early Prediction of Diabetic Retinopathy.

    PubMed

    S K, Somasundaram; P, Alli

    2017-11-09

    Diabetic retinopathy (DR), a retinal vascular disease, is the main complication of diabetes and can lead to blindness. Regular screening for early DR detection is a labor- and resource-intensive task, so automatic, computational detection of DR is an attractive solution. An automatic method is more reliable for determining the presence of an abnormality in fundus images (FI), but the classification process is often performed poorly. Recently, a few research works have been designed to analyze texture discrimination capacity in FI to distinguish healthy images. However, the feature extraction (FE) process was not performed well due to high dimensionality. Therefore, to identify retinal features for DR disease diagnosis and early detection, a machine learning and ensemble classification method, called Machine Learning Bagging Ensemble Classifier (ML-BEC), is designed. The ML-BEC method comprises two stages. The first stage comprises extraction of candidate objects from retinal images (RI). The candidate objects, or features, for DR disease diagnosis include blood vessels, the optic nerve, neural tissue, the neuroretinal rim, and optic disc size, thickness, and variance. These features are initially extracted by applying a machine learning technique called t-distributed Stochastic Neighbor Embedding (t-SNE). t-SNE generates a probability distribution across high-dimensional images in which the images are separated into similar and dissimilar pairs, and then describes a similar probability distribution across the points in a low-dimensional map. This lessens the Kullback-Leibler divergence between the two distributions with respect to the locations of the points on the map. The second stage comprises the application of ensemble classifiers to the extracted features to provide accurate analysis of digital FI using machine learning. In this stage, automatic detection for a DR screening system using a Bagging Ensemble Classifier (BEC) is investigated. Through the voting process in ML-BEC, bagging minimizes the error due to the variance of the base classifier. With the publicly available retinal image databases, the classifier is trained with 25% of the RI. Results show that the ensemble classifier can achieve better classification accuracy (CA) than single classification models. Empirical experiments suggest that the machine-learning-based ensemble classifier is efficient for further reducing DR classification time (CT).

  7. High-performance image processing architecture

    NASA Astrophysics Data System (ADS)

    Coffield, Patrick C.

    1992-04-01

    The proposed architecture is a logical design specifically for image processing and other related computations. The design is a hybrid electro-optical concept consisting of three tightly coupled components: a spatial configuration processor (the optical analog portion), a weighting processor (digital), and an accumulation processor (digital). The systolic flow of data and image processing operations is directed by a control buffer and pipelined to each of the three processing components. The image processing operations are defined by an image algebra developed by the University of Florida. The algebra is capable of describing all common image-to-image transformations. The merit of this architectural design is how elegantly it handles the natural decomposition of algebraic functions into spatially distributed, point-wise operations. The effect of this particular decomposition allows convolution type operations to be computed strictly as a function of the number of elements in the template (mask, filter, etc.) instead of the number of picture elements in the image. Thus, a substantial increase in throughput is realized. The logical architecture may take any number of physical forms. While a hybrid electro-optical implementation is of primary interest, the benefits and design issues of an all digital implementation are also discussed. The potential utility of this architectural design lies in its ability to control all the arithmetic and logic operations of the image algebra's generalized matrix product. This is the most powerful fundamental formulation in the algebra, thus allowing a wide range of applications.

  8. Potential use of MCR-ALS for the identification of coeliac-related biochemical changes in hyperspectral Raman maps from pediatric intestinal biopsies.

    PubMed

    Fornasaro, Stefano; Vicario, Annalisa; De Leo, Luigina; Bonifacio, Alois; Not, Tarcisio; Sergo, Valter

    2018-05-14

    Raman hyperspectral imaging is an emerging practice in biological and biomedical research for label-free analysis of tissues and cells. Using this method, both spatial distribution and spectral information of the analyzed samples can be obtained. The current study reports the first Raman microspectroscopic characterisation of colon tissues from patients with Coeliac Disease (CD). The aim was to assess whether Raman imaging coupled with hyperspectral multivariate image analysis is capable of detecting the alterations in the biochemical composition of intestinal tissues associated with CD. The analytical approach was based on a multi-step methodology: duodenal biopsies from healthy and coeliac patients were measured and processed with Multivariate Curve Resolution Alternating Least Squares (MCR-ALS). Based on the distribution maps and the pure spectra of the image constituents obtained from MCR-ALS, interesting biochemical differences between healthy and coeliac patients have been derived. Notably, a reduced distribution of complex lipids in the pericryptic space, and a different distribution and abundance of proteins rich in beta-sheet structures, was found in CD patients. The output of the MCR-ALS analysis was then used as a starting point for two clustering algorithms (the k-means and hierarchical clustering methods). Both methods converged to similar results, providing precise segmentation over multiple Raman images of the studied tissues.
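    The MCR-ALS decomposition of the hyperspectral data matrix D into concentration maps C and pure spectra S (D ≈ C·Sᵀ) can be sketched with a bare-bones alternating least squares loop. Nonnegativity clipping is the only constraint applied here (real MCR-ALS toolboxes support many more), and the synthetic data are illustrative.

```python
import numpy as np

def mcr_als(D, C0, n_iter=200):
    """Bare-bones MCR-ALS: alternately solve D ~ C @ S.T for S and C in the
    least-squares sense, clipping both factors to be nonnegative."""
    C = C0.copy()
    for _ in range(n_iter):
        # spectra given concentrations, then concentrations given spectra
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S

# Synthetic two-component dataset: 100 pixels x 20 spectral channels.
rng = np.random.default_rng(4)
C_true = rng.random((100, 2))
S_true = rng.random((20, 2))
D = C_true @ S_true.T
C, S = mcr_als(D, C0=C_true + 0.1 * rng.random((100, 2)))
```

In the study, the rows of C (reshaped to the image grid) would be the distribution maps and the columns of S the pure constituent spectra fed to the clustering step.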

  9. Field signatures of non-Fickian transport processes: transit time distributions, spatial correlations, reversibility and hydrogeophysical imaging

    NASA Astrophysics Data System (ADS)

    Le Borgne, T.; Kang, P. K.; Guihéneuf, N.; Shakas, A.; Bour, O.; Linde, N.; Dentz, M.

    2015-12-01

    Non-Fickian transport phenomena are observed in a wide range of scales across hydrological systems. They are generally manifested by a broad range of transit time distributions, as measured for instance in tracer breakthrough curves. However, similar transit time distributions may be caused by different origins, including broad velocity distributions, flow channeling or diffusive mass transfer [1,2]. The identification of these processes is critical for defining relevant transport models. How can we distinguish the different origins of non-Fickian transport in the field? In this presentation, we will review recent experimental developments to decipher the different causes of anomalous transport, based on tracer tests performed at different scales in cross borehole and push pull conditions, and time lapse hydrogeophysical imaging of tracer motion [3,4]. References: [1] de Anna, P., T. Le Borgne, M. Dentz, A. M. Tartakovsky, D. Bolster, P. Davy (2013) Flow Intermittency, Dispersion and Correlated Continuous Time Random Walks in Porous Media, Phys. Rev. Lett., 110, 184502 [2] Le Borgne T., Dentz M., and Carrera J. (2008) Lagrangian Statistical Model for Transport in Highly Heterogeneous Velocity Fields. Phys. Rev. Lett. 101, 090601 [3] Kang, P. K., T. Le Borgne, M. Dentz, O. Bour, and R. Juanes (2015), Impact of velocity correlation and distribution on transport in fractured media: Field evidence and theoretical model, Water Resour. Res., 51, 940-959 [4] Dorn C., Linde N., Le Borgne T., O. Bour and L. Baron (2011) Single-hole GPR reflection imaging of solute transport in a granitic aquifer, Geophys. Res. Lett., Vol. 38, L08401

  10. Photofragment Coincidence Imaging of Small I- (H2O)n Clusters Excited to the Charge-transfer-to-solvent State

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szpunar, D. E.; Kautzman, K. E.; Faulhaber, A. E.; Neumark, D. M.

    2005-11-09

    The photodissociation dynamics of small I-(H2O)n (n = 2-5) clusters excited to their charge-transfer-to-solvent (CTTS) states have been studied using photofragment coincidence imaging. Upon excitation to the CTTS state, two photodissociation channels were observed. The major channel (~90%) is a two-body process forming neutral I + (H2O)n photofragments, and the minor channel is a three-body process forming I + (H2O)n-1 + H2O fragments. Both processes display translational energy (P(ET)) distributions peaking at ET = 0, with little available energy partitioned into translation. Clusters excited to the detachment continuum rather than to the CTTS state display the same two channels with similar P(ET) distributions. The observation of similar P(ET) distributions from the two sets of experiments suggests that in the CTTS experiments, I atom loss occurs after autodetachment of the excited (I(H2O)n-)* cluster, or, less probably, that the presence of the excess electron has little effect on the departing I atom.

  11. Improving the scalability of hyperspectral imaging applications on heterogeneous platforms using adaptive run-time data compression

    NASA Astrophysics Data System (ADS)

    Plaza, Antonio; Plaza, Javier; Paz, Abel

    2010-10-01

    Latest generation remote sensing instruments (called hyperspectral imagers) are now able to generate hundreds of images, corresponding to different wavelength channels, for the same area on the surface of the Earth. In previous work, we have reported that the scalability of parallel processing algorithms dealing with these high-dimensional data volumes is affected by the amount of data to be exchanged through the communication network of the system. However, large messages are common in hyperspectral imaging applications since processing algorithms are pixel-based, and each pixel vector to be exchanged through the communication network is made up of hundreds of spectral values. Thus, decreasing the amount of data to be exchanged could improve the scalability and parallel performance. In this paper, we propose a new framework based on intelligent utilization of wavelet-based data compression techniques for improving the scalability of a standard hyperspectral image processing chain on heterogeneous networks of workstations. This type of parallel platform is quickly becoming a standard in hyperspectral image processing due to the distributed nature of collected hyperspectral data as well as its flexibility and low cost. Our experimental results indicate that adaptive lossy compression can lead to improvements in the scalability of the hyperspectral processing chain without sacrificing analysis accuracy, even at sub-pixel precision levels.
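    The idea of shrinking each pixel vector before it crosses the network can be sketched with a one-level Haar transform and magnitude thresholding. This is a minimal numpy-only illustration of lossy wavelet compression on an assumed smooth spectral signature, not the authors' adaptive codec.

```python
import numpy as np

def haar_1d(x):
    """One level of the orthonormal Haar transform: averages and details."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def inv_haar_1d(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def lossy_roundtrip(x, keep=0.6):
    """Zero all but the largest-magnitude fraction `keep` of Haar coefficients."""
    a, d = haar_1d(x)
    coeffs = np.concatenate([a, d])
    k = max(1, int(keep * coeffs.size))
    thresh = np.sort(np.abs(coeffs))[-k]
    coeffs[np.abs(coeffs) < thresh] = 0.0
    half = coeffs.size // 2
    return inv_haar_1d(coeffs[:half], coeffs[half:])

# A smooth 224-channel "pixel vector" (hypothetical spectral signature).
spectrum = 0.5 + np.sin(np.linspace(0.0, 3.0, 224))
recon = lossy_roundtrip(spectrum, keep=0.6)
rel_err = np.linalg.norm(recon - spectrum) / np.linalg.norm(spectrum)
```

    For smooth spectra most detail coefficients are negligible, so a sizable fraction can be dropped with a small reconstruction error; an adaptive scheme would tune `keep` per message against an accuracy budget.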

  12. Exploiting spectral content for image segmentation in GPR data

    NASA Astrophysics Data System (ADS)

    Wang, Patrick K.; Morton, Kenneth D., Jr.; Collins, Leslie M.; Torrione, Peter A.

    2011-06-01

    Ground-penetrating radar (GPR) sensors provide an effective means for detecting changes in the sub-surface electrical properties of soils, such as changes indicative of landmines or other buried threats. However, most GPR-based pre-screening algorithms only localize target responses along the surface of the earth and do not provide information regarding an object's position in depth. As a result, feature extraction algorithms are forced to process entire cubes of data around pre-screener alarms, which can reduce feature fidelity and hamper performance. In this work, spectral analysis is investigated as a method for locating subsurface anomalies in GPR data. In particular, a 2-D spatial/frequency decomposition is applied to pre-screener-flagged GPR B-scans. Analysis of these spatial/frequency regions suggests that aspects (e.g. moments, maxima, mode) of the frequency distribution of GPR energy can be indicative of the presence of target responses. After translating a GPR image into a function of the spatial/frequency distributions at each pixel, several image segmentation approaches can be applied in this new transformed feature space. To illustrate the efficacy of the approach, a performance comparison between feature processing with and without the image segmentation algorithm is provided.
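    A minimal version of the "moments of the frequency distribution" idea is to compute the spectral centroid of each B-scan column. The synthetic B-scan below, with an assumed higher-frequency ringing over a band of columns, is illustrative only and unrelated to the paper's data or decomposition.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic B-scan (depth samples x scan positions): low-frequency background
# plus higher-frequency ringing over columns 24-39, standing in for a target.
t = np.arange(256)
bscan = 0.05 * rng.standard_normal((256, 64))
bscan += np.sin(2 * np.pi * 0.05 * t)[:, None]
bscan[:, 24:40] += np.sin(2 * np.pi * 0.20 * t)[:, None]

def spectral_centroid(col):
    """First moment (mean frequency) of the column's amplitude spectrum."""
    spec = np.abs(np.fft.rfft(col))
    freqs = np.fft.rfftfreq(col.size, d=1.0)
    return (freqs * spec).sum() / spec.sum()

centroids = np.array([spectral_centroid(bscan[:, j]) for j in range(64)])
target_mean = centroids[24:40].mean()
background_mean = np.r_[centroids[:24], centroids[40:]].mean()
```

    Columns over the simulated target show an elevated mean frequency, which is the kind of per-pixel feature a segmentation algorithm could then operate on.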

  13. Imaging surface contacts: Power law contact distributions and contact stresses in quartz, calcite, glass and acrylic plastic

    USGS Publications Warehouse

    Dieterich, J.H.; Kilgore, B.D.

    1996-01-01

    A procedure has been developed to obtain microscope images of regions of contact between roughened surfaces of transparent materials while the surfaces are subjected to static loads or undergoing frictional slip. Static loading experiments with quartz, calcite, soda-lime glass and acrylic plastic at normal stresses up to 30 MPa yield power law distributions of contact areas, from the smallest contacts that can be resolved (3.5 μm²) up to a limiting size that correlates with the grain size of the abrasive grit used to roughen the surfaces. In each material, increasing normal stress results in a roughly linear increase of the real area of contact. Contact area increases through growth of existing contacts, coalescence of contacts, and the appearance of new contacts. Mean contact stresses are consistent with the indentation strength of each material. Contact size distributions are insensitive to normal stress, indicating that the increase of contact area is approximately self-similar. The contact images and contact distributions are modeled using simulations of surfaces with random fractal topographies. The contact process for model fractal surfaces is represented by the simple expedient of removing material at regions where surface irregularities overlap. Synthetic contact images created by this approach reproduce observed characteristics of the contacts and demonstrate that the exponent in the power law distributions depends on the scaling exponent used to generate the surface topography.
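    Estimating the exponent of a power-law contact-area distribution can be sketched with the maximum-likelihood (Hill) estimator on synthetic Pareto-distributed areas. The exponent and sample size are assumed; only the lower cutoff echoes the abstract's 3.5 μm² resolution limit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Draw synthetic contact areas from a Pareto (power-law) distribution with an
# assumed tail exponent; a_min plays the role of the smallest resolvable contact.
alpha_true = 1.5
a_min = 3.5
areas = a_min * (1.0 - rng.random(20000)) ** (-1.0 / alpha_true)

# Maximum-likelihood (Hill) estimator of the tail exponent.
alpha_hat = areas.size / np.log(areas / a_min).sum()
```

    The MLE avoids the binning bias of a log-log histogram fit; the recovered exponent converges to the true value as the sample grows.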

  14. A data grid for imaging-based clinical trials

    NASA Astrophysics Data System (ADS)

    Zhou, Zheng; Chao, Sander S.; Lee, Jasper; Liu, Brent; Documet, Jorge; Huang, H. K.

    2007-03-01

    Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) a well-defined, rigorous clinical trial protocol; 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administrate and quickly distribute information to participating radiologists/clinicians worldwide. The Data Grid can satisfy the aforementioned requirements of imaging-based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

  15. Building a high-performance system for processing a large daily volume of Chinese satellite imagery

    NASA Astrophysics Data System (ADS)

    Deng, Huawu; Huang, Shicun; Wang, Qi; Pan, Zhiqiang; Xin, Yubin

    2014-10-01

    The number of Earth observation satellites from China has increased dramatically in recent years, and those satellites acquire a large volume of imagery daily. As the main portal for image processing and distribution from those Chinese satellites, the China Centre for Resources Satellite Data and Application (CRESDA) has been working with PCI Geomatics during the last three years to solve two issues in this regard: processing the large volume of data (about 1,500 scenes or 1 TB per day) in a timely manner, and generating geometrically accurate orthorectified products. After three years of research and development, a high-performance system has been built and successfully delivered. The system has a service-oriented architecture and can be deployed to a cluster of computers that may be configured with high-end computing power. The high performance is gained, first, by parallelizing image processing algorithms using high-performance graphics processing unit (GPU) cards and multiple cores from multiple CPUs and, second, by distributing processing tasks across a cluster of computing nodes. While achieving up to thirty (and even more) times faster performance compared with traditional practice, a particular methodology was developed to improve the geometric accuracy of images acquired from Chinese satellites (including HJ-1 A/B, ZY-1-02C, ZY-3, GF-1, etc.). The methodology consists of fully automatic collection of dense ground control points (GCPs) from various resources and application of those points to improve the photogrammetric model of the images. The delivered system is up and running at CRESDA for pre-operational production and has been generating a good return on investment by eliminating a great amount of manual labor and increasing daily data throughput more than tenfold with fewer operators. Future work, such as development of more performance-optimized algorithms, robust image matching methods and application workflows, is identified to improve the system in the coming years.

  16. The Far Ultra-Violet Imager on the ICON Mission

    NASA Astrophysics Data System (ADS)

    Mende, S. B.; Frey, H. U.; Rider, K.; Chou, C.; Harris, S. E.; Siegmund, O. H. W.; England, S. L.; Wilkins, C.; Craig, W.; Immel, T. J.; Turin, P.; Darling, N.; Loicq, J.; Blain, P.; Syrstad, E.; Thompson, B.; Burt, R.; Champagne, J.; Sevilla, P.; Ellis, S.

    2017-10-01

    The ICON Far UltraViolet (FUV) imager contributes to the ICON science objectives by providing remote sensing measurements of the daytime and nighttime atmosphere/ionosphere. During sunlit atmospheric conditions, ICON FUV images the limb altitude profile in the shortwave (SW) band at 135.6 nm and the longwave (LW) band at 157 nm perpendicular to the satellite motion to retrieve the atmospheric O/N2 ratio. In conditions of atmospheric darkness, ICON FUV measures the 135.6 nm recombination emission of O+ ions, used to compute the nighttime ionospheric altitude distribution. The instrument is a Czerny-Turner spectrographic imager with two exit slits and corresponding back-imager cameras that produce two independent images in separate wavelength bands on two detectors. All observations will be processed as limb altitude profiles. In addition, the ionospheric 135.6 nm data will be processed as longitude-latitude spatial maps to obtain images of ion distributions around regions of equatorial spread F. The ICON FUV optic axis is pointed 20 degrees below local horizontal and has a steering mirror that allows the field of view to be steered up to 30 degrees forward and aft to keep the local magnetic meridian in the field of view. The detectors are microchannel plate (MCP) intensified FUV tubes with the phosphor fiber-optically coupled to charge-coupled devices (CCDs). The dual-stack MCPs amplify the photoelectron signals to overcome the CCD noise, and the rapidly scanned frames are co-added to digitally create 12-second integrated images. Digital on-board signal processing is used to compensate for geometric distortion and satellite motion and to achieve data compression. The instrument was originally aligned in visible light by using a special grating and visible cameras. Final alignment, functional and environmental testing, and calibration were performed in a large vacuum chamber with a UV source. The test and calibration program showed that ICON FUV meets its design requirements and is ready to be launched on the ICON spacecraft.

  17. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    PubMed Central

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging. PMID:28106118
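    The small-window entropy map can be sketched as follows: slide a window over an envelope image and record the Shannon entropy of the amplitude histogram inside each window. The two-region synthetic envelope, window size, and bin count are assumptions for illustration, not the paper's phantoms or parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic envelope image: left half nearly homogeneous, right half fully
# developed speckle (Rayleigh-like envelope) -- hypothetical test data.
left = 1.0 + 0.01 * rng.standard_normal((64, 32))
right = np.hypot(rng.standard_normal((64, 32)),
                 rng.standard_normal((64, 32)))
env = np.hstack([left, right])
lo, hi = env.min(), env.max()

def window_entropy(patch, bins=32):
    """Shannon entropy (bits) of the amplitude histogram in one small window."""
    counts, _ = np.histogram(patch, bins=bins, range=(lo, hi))
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

w = 7                                   # small window side length, in pixels
ent = np.zeros((64 - w, 64 - w))
for i in range(64 - w):
    for j in range(64 - w):
        ent[i, j] = window_entropy(env[i:i + w, j:j + w])

left_mean = ent[:, : 32 - w].mean()     # windows fully inside the left half
right_mean = ent[:, 32:].mean()         # windows fully inside the right half
```

    Unlike a Nakagami fit, the histogram entropy makes no distributional assumption, which is what lets the window shrink to about one pulse length.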

  18. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    NASA Astrophysics Data System (ADS)

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images by using a sliding window is a widely adopted strategy for characterizing tissues. Deficiency in spatial resolution, the appearance of boundary artifacts, and the prerequisite data distribution limit the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome the above problems. Simulations and measurements of phantoms were executed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging in tissue characterization, measurements of benign and malignant breast tumors were conducted (n = 63) to compare performances of conventional statistical parametric (based on Nakagami distribution) and entropy imaging by the receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed using a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC for using small-window entropy imaging to classify tumors was 0.89, which was higher than 0.79 obtained using statistical parametric imaging. In particular, boundary artifacts were largely suppressed in the proposed imaging technique. Entropy enables using a small window for implementing ultrasound parametric imaging.

  19. Wavelet-based statistical classification of skin images acquired with reflectance confocal microscopy

    PubMed Central

    Halimi, Abdelghafour; Batatia, Hadj; Le Digabel, Jimmy; Josse, Gwendal; Tourneret, Jean Yves

    2017-01-01

    Detecting skin lentigo in reflectance confocal microscopy images is an important and challenging problem. This imaging modality has not yet been widely investigated for this problem, and few automatic processing techniques exist. They are mostly based on machine learning approaches and rely on numerous classical image features that lead to high computational costs given the very large resolution of these images. This paper presents a detection method with very low computational complexity that is able to identify the skin depth at which the lentigo can be detected. The proposed method performs multiresolution decomposition of the image obtained at each skin depth. The distribution of image pixels at a given depth can be approximated accurately by a generalized Gaussian distribution whose parameters depend on the decomposition scale, resulting in a very-low-dimension parameter space. SVM classifiers are then investigated to classify the scale parameter of this distribution, allowing real-time detection of lentigo. The method is applied to 45 healthy and lentigo patients from a clinical study, where a sensitivity of 81.4% and a specificity of 83.3% are achieved. Our results show that lentigo is identifiable at depths between 50 μm and 60 μm, corresponding to the average location of the dermoepidermal junction. This result is in agreement with clinical practice, which characterizes lentigo by assessing the disorganization of the dermoepidermal junction. PMID:29296480
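    Fitting a generalized Gaussian distribution (GGD) to subband coefficients can be sketched with moment matching: the ratio E|x|/sqrt(E[x²]) is a monotone function of the shape parameter and can be inverted by bisection. The shape value, scale, and synthetic coefficients below are assumptions, not the paper's estimator or data.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(4)

def ggd_ratio(beta):
    """E|x| / sqrt(E[x^2]) for a generalized Gaussian with shape beta."""
    return gamma(2.0 / beta) / np.sqrt(gamma(1.0 / beta) * gamma(3.0 / beta))

def fit_ggd_shape(x, lo=0.1, hi=5.0, iters=60):
    """Moment-matching shape estimate; the ratio is increasing in beta."""
    r = np.abs(x).mean() / np.sqrt((x ** 2).mean())
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < r:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic "wavelet subband" coefficients from a GGD with a known shape:
# |x| = (Gamma(1/beta) variate)^(1/beta), with random sign and scale.
beta_true, scale = 1.2, 0.7
mag = rng.gamma(1.0 / beta_true, 1.0, 50000) ** (1.0 / beta_true)
coeffs = scale * np.where(rng.random(50000) < 0.5, -1.0, 1.0) * mag
beta_hat = fit_ggd_shape(coeffs)
```

    Beta = 2 recovers the Gaussian (ratio about 0.798) and beta = 1 the Laplacian (about 0.707); the per-depth GGD parameters form the very-low-dimension feature space mentioned above.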

  20. Hospital integrated parallel cluster for fast and cost-efficient image analysis: clinical experience and research evaluation

    NASA Astrophysics Data System (ADS)

    Erberich, Stephan G.; Hoppe, Martin; Jansen, Christian; Schmidt, Thomas; Thron, Armin; Oberschelp, Walter

    2001-08-01

    In the last few years more and more University Hospitals as well as private hospitals changed to digital information systems for patient record, diagnostic files and digital images. Not only that patient management becomes easier, it is also very remarkable how clinical research can profit from Picture Archiving and Communication Systems (PACS) and diagnostic databases, especially from image databases. Since images are available on the finger tip, difficulties arise when image data needs to be processed, e.g. segmented, classified or co-registered, which usually demands a lot computational power. Today's clinical environment does support PACS very well, but real image processing is still under-developed. The purpose of this paper is to introduce a parallel cluster of standard distributed systems and its software components and how such a system can be integrated into a hospital environment. To demonstrate the cluster technique we present our clinical experience with the crucial but cost-intensive motion correction of clinical routine and research functional MRI (fMRI) data, as it is processed in our Lab on a daily basis.

  1. SU-E-J-92: Validating Dose Uncertainty Estimates Produced by AUTODIRECT, An Automated Program to Evaluate Deformable Image Registration Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Pouliot, J

    2015-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool with the potential to deformably map dose from one computed-tomography (CT) image to another. Errors in the DIR, however, will produce errors in the transferred dose distribution. We have proposed a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), which predicts voxel-specific dose mapping errors on a patient-by-patient basis. This work validates the effectiveness of AUTODIRECT to predict dose mapping errors with virtual and physical phantom datasets. Methods: AUTODIRECT requires four inputs: moving and fixed CT images and two noise scans of a water phantom (for noise characterization). AUTODIRECT then uses algorithms to generate test deformations and applies them to the moving and fixed images (along with processing) to digitally create sets of test images, with known ground-truth deformations that are similar to the actual one. The clinical DIR algorithm is then applied to these test image sets (currently four). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student's t distribution. This work compares these uncertainty estimates to the actual errors made by the Velocity Deformable Multi Pass algorithm on 11 virtual and 1 physical phantom datasets. Results: For 11 of the 12 tests, the predicted dose error distributions from AUTODIRECT are well matched to the actual error distributions, within 1-6% for 10 virtual phantoms and 9% for the physical phantom. For one of the cases, though, the predictions underestimated the errors in the tail of the distribution. Conclusion: Overall, the AUTODIRECT algorithm performed well on the 12 phantom cases for Velocity and was shown to generate accurate estimates of dose warping uncertainty. AUTODIRECT is able to automatically generate patient-, organ-, and voxel-specific DIR uncertainty estimates. This ability would be useful for patient-specific DIR quality assurance.
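    The Student's-t step can be illustrated in isolation: from a handful of per-voxel test-deformation errors, form a t-based uncertainty interval. The four error values below are hypothetical numbers for arithmetic only, not AUTODIRECT's outputs.

```python
import numpy as np

# Hypothetical dose-mapping errors at one voxel from four test deformations.
errors = np.array([0.8, 1.4, 0.6, 1.1])
n = errors.size
mean = errors.mean()
sd = errors.std(ddof=1)            # sample standard deviation

# Two-sided 95% critical value of Student's t with n - 1 = 3 degrees of freedom.
t_crit = 3.182
half_width = t_crit * sd / np.sqrt(n)
interval = (mean - half_width, mean + half_width)
```

    With only four test deformations, the heavy-tailed t critical value (3.182 rather than the Gaussian 1.96) widens the interval appropriately for the small sample.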

  2. Quantification of the heterogeneity of prognostic cellular biomarkers in ewing sarcoma using automated image and random survival forest analysis.

    PubMed

    Bühnemann, Claudia; Li, Simon; Yu, Haiyue; Branford White, Harriet; Schäfer, Karl L; Llombart-Bosch, Antonio; Machado, Isidro; Picci, Piero; Hogendoorn, Pancras C W; Athanasou, Nicholas A; Noble, J Alison; Hassan, A Bassim

    2014-01-01

    Driven by genomic somatic variation, tumour tissues are typically heterogeneous, yet unbiased quantitative methods are rarely used to analyse heterogeneity at the protein level. Motivated by this problem, we developed automated image segmentation of images of multiple biomarkers in Ewing sarcoma to generate distributions of biomarkers between and within tumour cells. We further integrate high dimensional data with patient clinical outcomes utilising random survival forest (RSF) machine learning. Using material from cohorts of genetically diagnosed Ewing sarcoma with EWSR1 chromosomal translocations, confocal images of tissue microarrays were segmented with level sets and watershed algorithms. Each cell nucleus and cytoplasm were identified in relation to DAPI and CD99, respectively, and protein biomarkers (e.g. Ki67, pS6, Foxo3a, EGR1, MAPK) localised relative to nuclear and cytoplasmic regions of each cell in order to generate image feature distributions. The image distribution features were analysed with RSF in relation to known overall patient survival from three separate cohorts (185 informative cases). Variation in pre-analytical processing resulted in elimination of a high number of non-informative images that had poor DAPI localisation or biomarker preservation (67 cases, 36%). The distribution of image features for biomarkers in the remaining high quality material (118 cases, 104 features per case) were analysed by RSF with feature selection, and performance assessed using internal cross-validation, rather than a separate validation cohort. A prognostic classifier for Ewing sarcoma with low cross-validation error rates (0.36) was comprised of multiple features, including the Ki67 proliferative marker and a sub-population of cells with low cytoplasmic/nuclear ratio of CD99. 
By eliminating bias, random survival forest analysis enabled the evaluation of high-dimensional biomarker distributions within the cell populations of a tumour in quality-controlled material. Such an automated and integrated methodology has potential application in the identification of prognostic classifiers based on tumour cell heterogeneity.

  3. Systemic localization of seven major types of carbohydrates on cell membranes by dSTORM imaging.

    PubMed

    Chen, Junling; Gao, Jing; Zhang, Min; Cai, Mingjun; Xu, Haijiao; Jiang, Junguang; Tian, Zhiyuan; Wang, Hongda

    2016-07-25

    Carbohydrates on the cell surface control intercellular interactions and play a vital role in various physiological processes. However, their systemic distribution patterns are poorly understood. Through the direct stochastic optical reconstruction microscopy (dSTORM) strategy, we systematically revealed that several types of representative carbohydrates are found in clustered states. Interestingly, the results from dual-color dSTORM imaging indicate that these carbohydrate clusters are prone to connect with one another and eventually form conjoined platforms where different functional glycoproteins aggregate (e.g., epidermal growth factor receptor (EGFR) and band 3 protein). A thorough understanding of the ensemble distribution of carbohydrates on the cell surface paves the way for elucidating the structure-function relationship of cell membranes and the critical roles of carbohydrates in various physiological and pathological cell processes.

  4. Systemic localization of seven major types of carbohydrates on cell membranes by dSTORM imaging

    PubMed Central

    Chen, Junling; Gao, Jing; Zhang, Min; Cai, Mingjun; Xu, Haijiao; Jiang, Junguang; Tian, Zhiyuan; Wang, Hongda

    2016-01-01

    Carbohydrates on the cell surface control intercellular interactions and play a vital role in various physiological processes. However, their systemic distribution patterns are poorly understood. Through the direct stochastic optical reconstruction microscopy (dSTORM) strategy, we systematically revealed that several types of representative carbohydrates are found in clustered states. Interestingly, the results from dual-color dSTORM imaging indicate that these carbohydrate clusters are prone to connect with one another and eventually form conjoined platforms where different functional glycoproteins aggregate (e.g., epidermal growth factor receptor (EGFR) and band 3 protein). A thorough understanding of the ensemble distribution of carbohydrates on the cell surface paves the way for elucidating the structure-function relationship of cell membranes and the critical roles of carbohydrates in various physiological and pathological cell processes. PMID:27453176

  5. Automatic and accurate segmentation of cerebral tissues in fMRI datasets with a combination of image processing and deep learning

    NASA Astrophysics Data System (ADS)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in medical science. One application is multimodality imaging, especially the fusion of structural imaging with functional imaging, which includes CT, MRI and newer modalities such as optical imaging. The fusion process requires precisely extracted structural information in order to register the functional image to it. Here we used image enhancement and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), on 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images in a deep-learning fashion. This approach greatly reduced processing time compared to manual and semi-automatic segmentation, and its speed and accuracy improve as more samples are learned. The contours of the borders of different tissues on all images were accurately extracted and visualized in 3D. This can be used in low-level light therapy and in optical simulation software such as MCVM. We obtained a precise three-dimensional distribution of the brain, which offers doctors and researchers quantitative volume data and detailed morphological characterization for personalized precision medicine of cerebral atrophy/expansion. We hope this technique can bring convenience to medical visualization and personalized medicine.
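    The classical half of such a pipeline (threshold, then trace tissue borders) can be sketched with numpy alone. The disc-shaped "head slice", noise level, and threshold are toy assumptions, not the paper's fMRI data or CNN.

```python
import numpy as np

# Toy "head slice": a bright disc (tissue) on a dark background with mild noise.
yy, xx = np.mgrid[:64, :64]
slice_img = ((xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2).astype(float)
slice_img += 0.05 * np.random.default_rng(6).standard_normal((64, 64))

mask = slice_img > 0.5              # threshold into tissue vs background

# 4-neighbour erosion done by hand; the contour is the mask minus its erosion.
core = mask.copy()
core[1:, :] &= mask[:-1, :]
core[:-1, :] &= mask[1:, :]
core[:, 1:] &= mask[:, :-1]
core[:, :-1] &= mask[:, 1:]
contour = mask & ~core
```

    Per-slice contours extracted this way can serve as training labels or cross-checks for the learned segmentation.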

  6. Current Status and Future Perspectives of Mass Spectrometry Imaging

    PubMed Central

    Nimesh, Surendra; Mohottalage, Susantha; Vincent, Renaud; Kumarathasan, Prem

    2013-01-01

    Mass spectrometry imaging is employed for mapping proteins, lipids and metabolites in biological tissues in a morphological context. Although initially developed as a tool for biomarker discovery by imaging the distribution of proteins/peptides in tissue sections, the high sensitivity and molecular specificity of this technique have enabled its application to biomolecules other than proteins, even in cells, latent fingerprints and whole organisms. Relatively simple, with no requirement for labelling, homogenization, extraction or reconstitution, the technique has found a variety of applications in molecular biology, pathology, pharmacology and toxicology. By discriminating the spatial distribution of biomolecules in serial sections of tissues, biomarkers of lesions and the biological responses to stressors or diseases can be better understood in the context of structure and function. In this review, we discuss advances in the different aspects of mass spectrometry imaging processes, applications in different disciplines, and relevance to the field of toxicology. PMID:23759983

  7. Combined optimization of image-gathering and image-processing systems for scene feature detection

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Arduini, Robert F.; Samms, Richard W.

    1987-01-01

    The relationship between the image gathering and image processing systems for minimum mean squared error estimation of scene characteristics is investigated. A stochastic optimization problem is formulated where the objective is to determine a spatial characteristic of the scene rather than a feature of the already blurred, sampled and noisy image data. An analytical solution for the optimal characteristic image processor is developed. The Wiener filter for the sampled image case is obtained as a special case, where the desired characteristic is scene restoration. Optimal edge detection is investigated using the Laplacian operator ∇²G as the desired characteristic, where G is a two-dimensional Gaussian distribution function. It is shown that the optimal edge detector compensates for the blurring introduced by the image gathering optics and, notably, that it is not circularly symmetric. The lack of circular symmetry is largely due to the geometric effects of the sampling lattice used in image acquisition. The optimal image gathering optical transfer function is also investigated and the results of a sensitivity analysis are shown.
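    A baseline Laplacian-of-Gaussian detector (the circularly symmetric case that the paper's optimal, lattice-aware detector improves on) can be sketched by sampling ∇²G on a grid and correlating it with a step-edge image. Kernel size and sigma are assumed values.

```python
import numpy as np

def log_kernel(size=9, sigma=1.4):
    """Sample the Laplacian of a 2-D Gaussian on a square grid, zero-summed."""
    r = np.arange(size) - size // 2
    xx, yy = np.meshgrid(r, r)
    s2 = sigma ** 2
    g = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * s2))
    k = (xx ** 2 + yy ** 2 - 2.0 * s2) / s2 ** 2 * g
    return k - k.mean()             # flat regions then give exactly zero response

k = log_kernel()
img = np.zeros((32, 32))
img[:, 16:] = 1.0                   # vertical step edge

# Valid-mode correlation of the kernel with the image.
resp = np.array([[(img[i:i + 9, j:j + 9] * k).sum() for j in range(24)]
                 for i in range(24)])
edge_energy = np.abs(resp[:, 10:14]).max()   # windows straddling the edge
flat_energy = np.abs(resp[:, :4]).max()      # windows in the flat region
```

    The response is large near the edge and exactly zero on constant regions; the paper's result is that shaping this kernel to the sampling lattice and gathering optics, rather than keeping it circularly symmetric, is optimal.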

  8. The Saturn hydrogen plume

    NASA Astrophysics Data System (ADS)

    Shemansky, D. E.; Liu, X.; Melin, H.

    2009-12-01

    Images of the Saturn atmosphere and magnetosphere in H Lyα emission during the Cassini spacecraft pre- and post-Saturn orbit insertion (SOI) events, obtained using the UVIS experiment FUV spectrograph, have revealed definitive evidence for the escape of H I atoms from the top of the thermosphere. An image at 0.1×0.1 Saturn equatorial radii (RS) pixel resolution with an edge-on view of the rings shows a distinctive structure (plume) with full width at half maximum (FWHM) of 0.56 RS at the exobase sub-solar limb at approximately −13.5° latitude as part of the distributed outflow of H I from the sunlit hemisphere, with a counterpart on the antisolar side peaking near the equator above the exobase limb. The structure of the image indicates that part of the outflowing population is sub-orbital and re-enters the thermosphere on an approximately 5 h time scale. An evident larger, more broadly distributed component fills the magnetosphere to beyond 45 RS in the orbital plane in an asymmetric distribution in local time, similar to an image obtained at Voyager 1 post-encounter in a different observational geometry. It has been found that H2 singlet ungerade Rydberg EUV/FUV emission spectra collected with the H Lyα into the image mosaic show a distinctive resonance property correlated with the H Lyα plume. The inferred approximate globally averaged energy deposition at the top of the thermosphere from the production of the hot atomic hydrogen accounts for the measured atmospheric temperature. The only known process capable of producing the atoms at the required few eV/atom kinetic energy appears to be direct electron excitation of non-LTE H2 X¹Σg⁺(v:J) into the repulsive b³Σu⁺ state, although details of the processes need to be examined under the constraints imposed by the observations to determine compatibility with current knowledge of hydrogen rate processes.

  9. View planetary differentiation process through high-resolution 3D imaging

    NASA Astrophysics Data System (ADS)

    Fei, Y.

    2011-12-01

    Core-mantle separation is one of the most important processes in planetary evolution, defining the structure and chemical distribution in the planets. Iron-dominated core materials could migrate through the silicate mantle to the core by efficient liquid-liquid separation and/or by percolation of liquid metal through a solid silicate matrix. We can experimentally simulate these processes to examine the efficiency and timing of core formation and its geochemical signatures. The quantitative measure of the efficiency of percolation is usually the dihedral angle, related to the interfacial energies of the liquid and solid phases. To determine the true dihedral angle at high pressures and temperatures, it is necessary to measure the relative frequency distributions of apparent dihedral angles between the quenched liquid metal and silicate grains for each experiment. Here I present a new imaging technique to visualize the distribution of liquid metal in a silicate matrix in 3D by a combination of focused ion beam (FIB) milling and high-resolution SEM imaging. The 3D volume rendering provides precise determination of the dihedral angle and a quantitative measure of volume fraction and connectivity. I have conducted a series of experiments using mixtures of San Carlos olivine and Fe-S (10 wt% S) metal with different metal-silicate ratios, up to 25 GPa and at temperatures above 1800 °C. High-quality 3D volume renderings were reconstructed from FIB serial sectioning and imaging with 10-nm slice thickness and 14-nm image resolution for each quenched sample. The unprecedented spatial resolution at the nanoscale allows detailed examination of textural features and precise determination of the dihedral angle as a function of pressure, temperature and composition. The 3D reconstruction also allows direct assessment of connectivity in a multi-phase matrix, providing a new way to investigate the efficiency of metal percolation in a real silicate mantle.
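    The two quantities the 3D reconstruction delivers can be sketched numerically; a minimal illustration (function names and the sample angles are hypothetical), assuming the standard stereological result that the median of the apparent-angle distribution estimates the true dihedral angle:

```python
import statistics

def true_dihedral_angle(apparent_angles_deg):
    # Stereological approximation (an assumption of this sketch):
    # the median of apparent angles measured on 2D sections
    # estimates the true dihedral angle.
    return statistics.median(apparent_angles_deg)

def volume_fraction(phase_voxels, total_voxels):
    # Metal volume fraction read directly off a segmented 3D volume.
    return phase_voxels / total_voxels

# Hypothetical apparent angles (degrees) from serial sections:
print(true_dihedral_angle([48, 55, 60, 62, 65, 70, 74, 81, 90]))  # 65
print(volume_fraction(120_000, 1_000_000))  # 0.12
```

With a full 3D rendering, as in the paper, the angle can instead be measured directly, which is why the nanoscale reconstruction is more precise than sectioning statistics alone.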

  10. Three-dimensional video imaging of drainage and imbibition processes in model porous medium

    NASA Astrophysics Data System (ADS)

    Sharma, Prerna; Aswathi, P.; Sane, Anit; Ghosh, Shankar; Bhattacharya, Sabyasachi

    2011-03-01

    We report experimental results where we have performed three-dimensional video imaging of the displacement of an oil phase by an aqueous phase, and vice versa, in a model porous medium. The stability of the oil-water interface was studied as a function of the viscosity ratio of the phases, the wettability of the porous medium and the variation in the pore size distribution. Our experiments capture the pore-scale information of the displacement process and its role in determining the long-time structure of the interface.

  11. Phase Composition Maps integrate mineral compositions with rock textures from the micrometer to the thin-section scale

    NASA Astrophysics Data System (ADS)

    Willis, Kyle V.; Srogi, LeeAnn; Lutz, Tim; Monson, Frederick C.; Pollock, Meagen

    2017-12-01

    Textures and compositions are critical information for interpreting rock formation. Existing methods to integrate both types of information favor high-resolution images of mineral compositions over small areas or low-resolution images of larger areas for phase identification. The method in this paper produces images of individual phases in which textural and compositional details are resolved over three orders of magnitude, from tens of micrometers to tens of millimeters. To construct these images, called Phase Composition Maps (PCMs), we make use of the resolution in backscattered electron (BSE) images and calibrate the gray scale values with mineral analyses by energy-dispersive X-ray spectrometry (EDS). The resulting images show the area of a standard thin section (roughly 40 mm × 20 mm) with spatial resolution as good as 3.5 μm/pixel, or more than 81 000 pixels/mm², comparable to the resolution of X-ray element maps produced by wavelength-dispersive spectrometry (WDS). Procedures to create PCMs for mafic igneous rocks with multivariate linear regression models for minerals with solid solution (olivine, plagioclase feldspar, and pyroxenes) are presented and are applicable to other rock types. PCMs are processed using threshold functions based on the regression models to image specific composition ranges of minerals. PCMs are constructed using widely available instrumentation: a scanning-electron microscope (SEM) with BSE and EDS X-ray detectors and standard image processing software such as ImageJ and Adobe Photoshop. Three brief applications illustrate the use of PCMs as petrologic tools: to reveal mineral composition patterns at multiple scales; to generate crystal size distributions for intracrystalline compositional zones and compare growth over time; and to image spatial distributions of minerals at different stages of magma crystallization by integrating textures and compositions with thermodynamic modeling.
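    The gray-value calibration behind a PCM can be sketched as a least-squares fit followed by thresholding; the calibration numbers below are invented for illustration, and the paper's actual regression models are multivariate:

```python
def linear_fit(gray, comp):
    # Ordinary least squares: composition = a + b * gray_value.
    n = len(gray)
    mg, mc = sum(gray) / n, sum(comp) / n
    b = (sum((g - mg) * (c - mc) for g, c in zip(gray, comp))
         / sum((g - mg) ** 2 for g in gray))
    return mc - b * mg, b

def phase_mask(bse_image, a, b, lo, hi):
    # Keep pixels whose predicted composition falls in [lo, hi],
    # mimicking the threshold functions applied to PCMs.
    return [[lo <= a + b * g <= hi for g in row] for row in bse_image]

# Hypothetical calibration: BSE gray values vs. anorthite content (mol%).
a, b = linear_fit([100, 120, 140, 160], [50, 60, 70, 80])
mask = phase_mask([[110, 150], [90, 170]], a, b, 60, 80)
```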

  12. Monte Carlo based toy model for fission process

    NASA Astrophysics Data System (ADS)

    Kurniadi, R.; Waris, A.; Viridi, S.

    2014-09-01

    There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely the macroscopic approach and the microscopic approach. This work proposes another calculation approach in which the nucleus is treated as a toy model; hence, the fission process does not completely represent the real fission process in nature. The toy model is formed by Gaussian distributions of random numbers that represent distances, such as the distance between a particle and a central point. The scission process is started by smashing the compound-nucleus central point into two parts, the left and right central points. These three points have different Gaussian distribution parameters, namely the means (μCN, μL, μR) and standard deviations (σCN, σL, σR). By overlaying the three distributions, the numbers of particles (NL, NR) trapped by the central points can be obtained. This process is iterated until (NL, NR) become constant. The smashing process is repeated by changing σL and σR randomly.
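    The smashing step can be sketched as a single Monte Carlo pass; the capture rule and all parameter values below are assumptions of this sketch, and the iteration to constant (NL, NR) is omitted:

```python
import random

def smash(n_particles, mu_l, sigma_l, mu_r, sigma_r,
          sigma_cn=1.0, capture=1.0, seed=42):
    # Particles are Gaussian-distributed positions around the
    # compound-nucleus centre (mu_CN = 0 here). After "smashing",
    # a particle counts as trapped by a fragment centre if it lies
    # within `capture` standard deviations of that centre.
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, sigma_cn) for _ in range(n_particles)]
    n_left = sum(1 for x in xs if abs(x - mu_l) <= capture * sigma_l)
    n_right = sum(1 for x in xs if abs(x - mu_r) <= capture * sigma_r)
    return n_left, n_right

nl, nr = smash(10_000, mu_l=-1.0, sigma_l=0.8, mu_r=1.0, sigma_r=0.8)
```

Repeating the call with randomly perturbed σL and σR, as the paper describes, would generate a distribution of (NL, NR) fragment yields.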

  13. High-performance computing in image registration

    NASA Astrophysics Data System (ADS)

    Zanin, Michele; Remondino, Fabio; Dalla Mura, Mauro

    2012-10-01

    Thanks to recent technological advances, a large variety of image data is at our disposal with variable geometric, radiometric and temporal resolution. In many applications the processing of such images needs high-performance computing techniques in order to deliver timely responses, e.g. for rapid decisions or real-time actions. Thus, parallel or distributed computing methods, Digital Signal Processor (DSP) architectures, Graphical Processing Unit (GPU) programming and Field-Programmable Gate Array (FPGA) devices have become essential tools for the challenging task of processing large amounts of geo-data. The article focuses on the processing and registration of large datasets of terrestrial and aerial images for 3D reconstruction, diagnostic purposes and monitoring of the environment. For the image alignment procedure, sets of corresponding feature points need to be automatically extracted in order to subsequently compute the geometric transformation that aligns the data. Feature extraction and matching are among the most computationally demanding operations in the processing chain; thus a great degree of automation and speed is mandatory. The details of the implemented operations (named LARES) exploiting parallel architectures and the GPU are thus presented. The innovative aspects of the implementation are (i) its effectiveness on a large variety of unorganized and complex datasets, (ii) its capability to work with high-resolution images and (iii) the speed of the computations. Examples and comparisons with standard CPU processing are also reported and commented on.
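    The matching stage that such systems accelerate on the GPU can be illustrated by its CPU baseline: brute-force nearest-neighbour descriptor matching with a ratio test (a generic sketch, not the LARES implementation):

```python
def match_features(desc_a, desc_b, ratio=0.8):
    # For each descriptor in image A, find its two nearest
    # neighbours in image B; accept the match only if the best is
    # clearly closer than the second best (squared distances, so
    # the ratio is squared too).
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((sum((x - y) ** 2 for x, y in zip(da, db)), j)
                       for j, db in enumerate(desc_b))
        if len(dists) > 1 and dists[0][0] < (ratio ** 2) * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

pairs = match_features([(0, 0), (5, 5)], [(0.1, 0), (5, 5.1), (9, 9)])
```

The O(n²) distance loop is exactly the part that maps well onto GPU parallelism, since every descriptor pair can be evaluated independently.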

  14. Cryo-imaging of fluorescently labeled single cells in a mouse

    NASA Astrophysics Data System (ADS)

    Steyer, Grant J.; Roy, Debashish; Salvado, Olivier; Stone, Meredith E.; Wilson, David L.

    2009-02-01

    We developed a cryo-imaging system to provide single-cell detection of fluorescently labeled cells in mouse, with particular applicability to stem cells and metastatic cancer. The Case cryo-imaging system consists of a fluorescence microscope, robotic imaging positioner, customized cryostat, PC-based control system, and visualization/analysis software. The system alternates between sectioning (10-40 μm) and imaging, collecting color brightfield and fluorescent blockface image volumes >60 GB. In mouse experiments, we imaged quantum-dot-labeled stem cells, GFP-labeled cancer and stem cells, and cell-size fluorescent microspheres. To remove subsurface fluorescence, we used a simplified model of light-tissue interaction whereby the next image was scaled, blurred, and subtracted from the current image. We estimated scaling and blurring parameters by minimizing the entropy of the subtracted images. Tissue-specific attenuation parameters were found [μT: heart (267 ± 47.6 μm), liver (218 ± 27.1 μm), brain (161 ± 27.4 μm)] to be within the range of estimates in the literature. "Next image" processing removed subsurface fluorescence equally well across multiple tissues (brain, kidney, liver, adipose tissue, etc.), and analysis of 200 microsphere images in the brain gave a 97 ± 2% reduction of subsurface fluorescence. Fluorescent signals were determined to arise from single cells based upon geometric and integrated intensity measurements. Next-image processing greatly improved axial resolution, enabled high-quality 3D volume renderings, and improved enumeration of single cells with connected-component analysis by up to 24%. Analysis of image volumes identified metastatic cancer sites, found homing of stem cells to injury sites, and showed that microsphere distribution correlated with blood flow patterns. We developed and evaluated cryo-imaging to provide single-cell detection of fluorescently labeled cells in mouse.
Our cryo-imaging system provides extreme (>60 GB), micron-scale, fluorescence and bright-field image data. Here we describe our image preprocessing, analysis, and visualization techniques. Processing improves axial resolution, reduces subsurface fluorescence by 97%, and enables single-cell detection and counting. High-quality 3D volume renderings enable us to evaluate cell distribution patterns. Applications include the myriad of biomedical experiments using fluorescent reporter genes and exogenous fluorophore labeling of cells, in areas such as stem cell regenerative medicine, cancer, and tissue engineering.
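    The "next image" subtraction can be sketched as follows; here the blur is a fixed 3×3 box filter and the scale factor is illustrative, whereas the paper estimates both parameters by minimizing the entropy of the subtracted image:

```python
def box_blur3(img):
    # Crude 3x3 box blur standing in for the paper's blur model.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - 1), min(h, y + 2))
                    for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def remove_subsurface(current, nxt, scale=1.0):
    # Scale and blur the next block-face image, then subtract it
    # from the current image, clamping at zero.
    est = box_blur3(nxt)
    return [[max(0.0, c - scale * e) for c, e in zip(crow, erow)]
            for crow, erow in zip(current, est)]

cleaned = remove_subsurface([[5.0] * 3 for _ in range(3)],
                            [[0, 0, 0], [0, 9, 0], [0, 0, 0]])
```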

  15. Processing, Cataloguing and Distribution of Uas Images in Near Real Time

    NASA Astrophysics Data System (ADS)

    Runkel, I.

    2013-08-01

    Why do UAS generate such hype? UAS make data capture flexible, fast and easy. For many applications this is more important than a perfect photogrammetric aerial image block. To ensure that the advantage of fast data capture holds up to the end of the processing chain, all intermediate steps such as data processing and data dissemination to the customer need to be flexible and fast as well. GEOSYSTEMS has established the whole processing workflow as a server/client solution, which is the focus of this presentation. Depending on the image acquisition system, the image data can be downlinked during the flight to the data processing computer, or it is stored on a mobile device and hooked up to the data processing computer after the flight campaign. The image project manager reads the data from the device and georeferences the images according to the position data. The metadata is converted into an ISO-conformant format and subsequently all georeferenced images are catalogued in the raster data management system ERDAS APOLLO. APOLLO provides the data, i.e. the images, as OGC-conformant services to the customer. Within seconds the UAV images are ready to use for GIS applications, image processing or direct interpretation via web applications, wherever you want. The whole processing chain is built in a generic manner and can be adapted to a multitude of applications. The UAV imagery can be processed and catalogued as single ortho images or as an image mosaic. Furthermore, image data from various cameras can be fused. By using WPS (web processing services), image enhancement and image analysis workflows, such as change detection layers, can be calculated and provided to the image analysts. The WPS processing runs directly on the raster data management server; the image analyst needs no data and no software on his local computer. This workflow has proven to be fast, stable and accurate. 
It is designed to support time-critical applications for security demands: the images can be checked and interpreted in near real time. For sensitive areas it offers the possibility to inform remote decision makers or interpretation experts in order to provide them with situational awareness, wherever they are. For monitoring and inspection tasks it speeds up the process of data capture and data interpretation. The fully automated workflow of data pre-processing, georeferencing, cataloguing and dissemination in near real time was developed based on the Intergraph products ERDAS IMAGINE, ERDAS APOLLO and GEOSYSTEMS METAmorph!IT. It is offered as an adaptable solution by GEOSYSTEMS GmbH.

  16. Liquid crystal thermography and true-colour digital image processing

    NASA Astrophysics Data System (ADS)

    Stasiek, J.; Stasiek, A.; Jewartowski, M.; Collins, M. W.

    2006-06-01

    In the last decade, thermochromic liquid crystals (TLCs) and true-colour digital image processing have been successfully used in non-intrusive technical, industrial and biomedical studies and applications. Thin coatings of TLCs at surfaces are utilized to obtain detailed temperature distributions and heat transfer rates for steady or transient processes. Liquid crystals can also be used to make visible the temperature and velocity fields in liquids by the simple expedient of directly mixing the liquid crystal material into the liquid (water, glycerol, glycol, or silicone oils) in very small quantities, to serve as thermal and hydrodynamic tracers. In biomedical situations, e.g. skin diseases, breast cancer, blood circulation and other medical applications, TLCs and image processing are successfully used as an additional non-invasive diagnostic method, especially useful for screening large groups of potential patients. The history of this technique is reviewed, principal methods and tools are described and some examples are presented.

  17. Imaging whole Escherichia coli bacteria by using single-particle x-ray diffraction

    NASA Astrophysics Data System (ADS)

    Miao, Jianwei; Hodgson, Keith O.; Ishikawa, Tetsuya; Larabell, Carolyn A.; Legros, Mark A.; Nishino, Yoshinori

    2003-01-01

    We report the first experimental recording, to our knowledge, of the diffraction pattern from intact Escherichia coli bacteria using coherent x-rays with a wavelength of 2 Å. By using the oversampling phasing method, a real space image at a resolution of 30 nm was directly reconstructed from the diffraction pattern. An R factor used for characterizing the quality of the reconstruction was in the range of 5%, which demonstrated the reliability of the reconstruction process. The distribution of proteins inside the bacteria labeled with manganese oxide has been identified and this distribution confirmed by fluorescence microscopy images. Compared with lens-based microscopy, this diffraction-based imaging approach can examine thicker samples, such as whole cultured cells, in three dimensions with resolution limited only by radiation damage. Looking forward, the successful recording and reconstruction of diffraction patterns from biological samples reported here represent an important step toward the potential of imaging single biomolecules at near-atomic resolution by combining single-particle diffraction with x-ray free electron lasers.

  18. Evaluation of the image quality of telescopes using the star test

    NASA Astrophysics Data System (ADS)

    Vazquez y Monteil, Sergio; Salazar Romero, Marcos A.; Gale, David M.

    2004-10-01

    The Point Spread Function (PSF) or star test is one of the main criteria to be considered in assessing the quality of the image formed by a telescope. In a real system the distribution of irradiance in the image of a point source is given by the PSF, a function which is highly sensitive to aberrations. The PSF of a telescope may be determined by measuring the intensity distribution in the image of a star. Alternatively, if we already know the aberrations present in the optical system, we may use diffraction theory to calculate the function. In this paper we propose a method for determining the wavefront aberrations from the PSF, using genetic algorithms to perform an optimization process starting from the PSF instead of the more traditional method of adjusting an aberration polynomial. We show that this method of phase retrieval is immune to noise-induced errors arising during image acquisition and registration. Some practical results are shown.

  19. Parallel volume ray-casting for unstructured-grid data on distributed-memory architectures

    NASA Technical Reports Server (NTRS)

    Ma, Kwan-Liu

    1995-01-01

    As computing technology continues to advance, computational modeling of scientific and engineering problems produces data of increasing complexity: large in size and unstructured in shape. Volume visualization of such data is a challenging problem. This paper proposes a distributed parallel solution that makes ray-casting volume rendering of unstructured-grid data practical. Both the data and the rendering process are distributed among processors. At each processor, ray-casting of local data is performed independently of the other processors. The global image compositing processes, which require inter-processor communication, are overlapped with the local ray-casting processes to achieve maximum parallel efficiency. This algorithm differs from previous ones in four ways: it is completely distributed, less view-dependent, reasonably scalable, and flexible. Without using dynamic load balancing, test results on the Intel Paragon using from two to 128 processors show, on average, about 60% parallel efficiency.
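    The per-ray merging that the global communication phase must perform can be sketched with the front-to-back "over" operator (a generic sketch of the operator, not the paper's overlapped implementation):

```python
def composite(fragments):
    # Front-to-back "over" compositing of (colour, opacity) ray
    # fragments that are already sorted by depth, with early-ray
    # termination once accumulated opacity saturates.
    color, alpha = 0.0, 0.0
    for c, a in fragments:
        color += (1.0 - alpha) * a * c
        alpha += (1.0 - alpha) * a
        if alpha >= 0.999:
            break
    return color, alpha

# An opaque black fragment behind a half-transparent white one:
c, a = composite([(1.0, 0.5), (0.0, 1.0)])
```

Because the operator is associative, partial results computed on different processors can be merged in depth order, which is what makes overlapping compositing with local ray-casting possible.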

  20. From nociception to pain perception: imaging the spinal and supraspinal pathways

    PubMed Central

    Brooks, Jonathan; Tracey, Irene

    2005-01-01

    Functional imaging techniques have allowed researchers to look within the brain, and revealed the cortical representation of pain. Initial experiments, performed in the early 1990s, revolutionized pain research, as they demonstrated that pain was not processed in a single cortical area, but in several distributed brain regions. Over the last decade, the roles of these pain centres have been investigated and a clearer picture has emerged of the medial and lateral pain system. In this brief article, we review the imaging literature to date that has allowed these advances to be made, and examine the new frontiers for pain imaging research: imaging the brainstem and other structures involved in the descending control of pain; functional and anatomical connectivity studies of pain processing brain regions; imaging models of neuropathic pain-like states; and going beyond the brain to image spinal function. The ultimate goal of such research is to take these new techniques into the clinic, to investigate and provide new remedies for chronic pain sufferers. PMID:16011543

  1. Thin layer imaging process for microlithography using radiation at strongly attenuated wavelengths

    DOEpatents

    Wheeler, David R.

    2004-01-06

    A method for patterning of resist surfaces which is particularly advantageous for systems having low photon flux and highly energetic, strongly attenuated radiation. A thin imaging layer is created with uniform silicon distribution in a bilayer format. An image is formed by exposing selected regions of the silylated imaging layer to radiation. The radiation incident upon the silylated resist material results in acid generation which either catalyzes cleavage of Si--O bonds to produce moieties that are volatile enough to be driven off in a post-exposure bake step, or produces a resist material in which the exposed portions of the imaging layer are soluble in a basic solution, thereby desilylating the exposed areas of the imaging layer. The process is self-limiting due to the limited quantity of silyl groups within each region of the pattern. Following the post-exposure bake step, an etching step, generally an oxygen plasma etch, removes the resist material from the desilylated areas of the imaging layer.

  2. Complex noise suppression using a sparse representation and 3D filtering of images

    NASA Astrophysics Data System (ADS)

    Kravchenko, V. F.; Ponomaryov, V. I.; Pustovoit, V. I.; Palacios-Enriquez, A.

    2017-08-01

    A novel method for the filtering of images corrupted by complex noise composed of randomly distributed impulses and additive Gaussian noise has been substantiated for the first time. The method consists of three main stages: the detection and filtering of pixels corrupted by impulsive noise; the subsequent image processing to suppress the additive noise, based on 3D filtering and a sparse representation of signals in a wavelet basis; and a concluding image processing procedure to clean the final image of errors that emerged at the previous stages. A physical interpretation of the filtering method under complex noise conditions is given. A filtering block diagram has been developed in accordance with the novel approach. Simulations of the novel image filtering method have shown the advantage of the proposed filtering scheme in terms of generally recognized criteria, such as the structural similarity index measure and the peak signal-to-noise ratio, and when visually comparing the filtered images.
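    The first stage, impulse detection and filtering, can be sketched with a simple median-deviation rule on one image row; the detector and its threshold are assumptions of this sketch, and the wavelet-based 3D filtering of the later stages is not reproduced:

```python
import statistics

def suppress_impulses(signal, window=3, thresh=20):
    # A sample that deviates from its local median by more than
    # `thresh` is treated as an impulse and replaced by that
    # median; all other samples pass through unchanged, so the
    # Gaussian noise is left for the later filtering stages.
    half = window // 2
    out = list(signal)
    for i in range(len(signal)):
        med = statistics.median(signal[max(0, i - half): i + half + 1])
        if abs(signal[i] - med) > thresh:
            out[i] = med
    return out

row = suppress_impulses([10, 10, 250, 10, 10])
```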

  3. Annotating images by mining image search results.

    PubMed

    Wang, Xin-Jing; Zhang, Lei; Li, Xirong; Ma, Wei-Ying

    2008-11-01

    Although it has been studied for years by the computer vision and machine learning communities, image annotation is still far from practical. In this paper, we propose a novel attempt at model-free image annotation, a data-driven approach that annotates images by mining their search results. Some 2.4 million images with their surrounding text were collected from a few photo forums to support this approach. The entire process is formulated in a divide-and-conquer framework where a query keyword is provided along with the uncaptioned image to improve both effectiveness and efficiency. This is helpful when the collected data set is not dense everywhere. In this sense, our approach contains three steps: 1) the search process to discover visually and semantically similar search results, 2) the mining process to identify salient terms from textual descriptions of the search results, and 3) the annotation rejection process to filter out noisy terms yielded by step 2. To ensure real-time annotation, two key techniques are leveraged: one is to map the high-dimensional image visual features into hash codes, the other is to implement the system as a distributed one, of which the search and mining processes are provided as Web services. As a typical result, the entire process finishes in less than 1 second. Since no training data set is required, our approach enables annotation with an unlimited vocabulary and is highly scalable and robust to outliers. Experimental results on both real Web images and a benchmark image data set show the effectiveness and efficiency of the proposed algorithm. It is also worth noting that, although the entire approach is illustrated within the divide-and-conquer framework, a query keyword is not crucial to our current implementation. We provide experimental results to prove this.
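    Mapping visual features to hash codes can be sketched with random-hyperplane hashing, one common scheme; the paper's exact hashing method is not specified here, and all names are illustrative:

```python
import random

def random_hyperplanes(dim, n_bits, seed=0):
    # One random Gaussian hyperplane per output bit.
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dim)]
            for _ in range(n_bits)]

def hash_code(feature, planes):
    # Each bit records which side of a hyperplane the feature
    # vector falls on, so nearby vectors tend to share bit
    # patterns, enabling fast approximate lookup.
    bits = 0
    for plane in planes:
        dot = sum(f * p for f, p in zip(feature, plane))
        bits = (bits << 1) | int(dot >= 0)
    return bits

planes = random_hyperplanes(dim=4, n_bits=16)
code = hash_code([0.3, -1.2, 0.8, 0.1], planes)
```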

  4. In-line monitoring of pellet coating thickness growth by means of visual imaging.

    PubMed

    Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan

    2014-08-15

    Coating thickness is the most important attribute of coated pharmaceutical pellets, as it directly affects the release profile and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end-product characteristics. A visual imaging technique is presented and examined as a process analytical technology (PAT) tool for noninvasive, continuous, in-line and real-time monitoring of the coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of the pellets' coating thickness during a coating process. The accuracy of the results for pellet coating thickness growth obtained in real time was evaluated through comparison with an off-line reference method, and good agreement was found. Information about the inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line and real-time monitoring of the coating process of pharmaceutical pellets. Copyright © 2014 Elsevier B.V. All rights reserved.
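    The relationship between measured pellet diameters and coating thickness can be sketched as follows; a simplified estimator with invented diameters, not the published image-analysis pipeline:

```python
import statistics

def coating_thickness(core_diams_um, coated_diams_um):
    # Half the difference of mean coated and mean core pellet
    # diameter, since the coating grows on both sides.
    return (statistics.mean(coated_diams_um)
            - statistics.mean(core_diams_um)) / 2

def uniformity_cv(diams_um):
    # Coefficient of variation of the size distribution as a
    # simple inter-pellet coating-uniformity measure.
    return statistics.stdev(diams_um) / statistics.mean(diams_um)

t = coating_thickness([1000, 1010, 990], [1100, 1110, 1090])
```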

  5. Quantitative Analysis of Rat Dorsal Root Ganglion Neurons Cultured on Microelectrode Arrays Based on Fluorescence Microscopy Image Processing.

    PubMed

    Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo

    2015-12-01

    Microelectrode Arrays (MEAs) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activity in in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs, by processing their corresponding images acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, the watershed transform, and object classification. The positions of the microelectrodes are obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative aspects. The proposed framework also aims to standardize the image processing and to compute quantitative measures useful for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification ones (with a correct segmentation rate of up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
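    The circular Hough transform used for microelectrode localization can be sketched with a minimal voting scheme at a known radius (a generic sketch with synthetic edge points, not the paper's implementation, which would sweep a radius range over an edge map):

```python
import math
from collections import Counter

def hough_circle_centre(edge_points, radius):
    # Every edge point votes for candidate centres lying one
    # radius away from it; the accumulator cell with the most
    # votes is taken as the circle centre.
    votes = Counter()
    for x, y in edge_points:
        for deg in range(0, 360, 10):
            t = math.radians(deg)
            cx = round(x - radius * math.cos(t))
            cy = round(y - radius * math.sin(t))
            votes[(cx, cy)] += 1
    return votes.most_common(1)[0][0]

# Synthetic edge points on a circle of radius 5 centred at (10, 10):
pts = [(10 + 5 * math.cos(math.radians(d)),
        10 + 5 * math.sin(math.radians(d))) for d in range(0, 360, 30)]
centre = hough_circle_centre(pts, 5)
```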

  6. Three-beam interferogram analysis method for surface flatness testing of glass plates and wedges

    NASA Astrophysics Data System (ADS)

    Sunderland, Zofia; Patorski, Krzysztof

    2015-09-01

    When testing transparent plates with high-quality flat surfaces and a small angle between them, the three-beam interference phenomenon is observed. Since the reference beam and the object beams reflected from both the front and back surface of a sample are detected, the recorded intensity distribution may be regarded as a sum of three fringe patterns. Images of that type cannot be successfully analyzed with standard interferogram analysis methods. They contain, however, useful information on the tested plate's surface flatness and its optical thickness variations. Several methods have been elaborated to decode the plate parameters. Our technique represents a competitive solution which allows for retrieval of the phase components of the three-beam interferogram. It requires recording two images: a three-beam interferogram and the two-beam one with the reference beam blocked. Mutually subtracting these images leads to an intensity distribution which, under some assumptions, provides access to the two component fringe sets that encode surface flatness. At various stages of processing we take advantage of nonlinear operations as well as single-frame interferogram analysis methods. The two-dimensional continuous wavelet transform (2D CWT) is used to separate a particular fringe family from the overall interferogram intensity distribution as well as to estimate the phase distribution from a pattern. We distinguish two processing paths depending on the relative density of the fringe sets, which is connected with the geometry of the sample and the optical setup. The proposed method is tested on simulated data.
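    The image-subtraction step can be illustrated at a single pixel: subtracting the reference-blocked two-beam intensity from the three-beam intensity cancels the object-beam cross term, leaving a bias plus the two fringe families modulated by the reference. A numeric check (amplitudes and phases are arbitrary):

```python
import cmath
import math

def fringe_difference(r, a, b, phi_a, phi_b):
    # Pixel-wise difference of the three-beam interferogram and
    # the reference-blocked two-beam one. With a real reference
    # amplitude r, the a-b cross term cancels and the result is
    # r**2 + 2*r*a*cos(phi_a) + 2*r*b*cos(phi_b).
    ua = a * cmath.exp(1j * phi_a)
    ub = b * cmath.exp(1j * phi_b)
    i3 = abs(r + ua + ub) ** 2
    i2 = abs(ua + ub) ** 2
    return i3 - i2

d = fringe_difference(1.0, 0.6, 0.4, 0.3, 1.1)
```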

  7. Earth imaging and scientific observations by SSTI ``Clark'' a NASA technology demonstration spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1997-01-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of ``Clark,'' a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  8. VIEWCACHE: An incremental pointer-based access method for autonomous interoperable databases

    NASA Technical Reports Server (NTRS)

    Roussopoulos, N.; Sellis, Timos

    1992-01-01

    One of the biggest problems facing NASA today is providing scientists efficient access to a large number of distributed databases. Our pointer-based incremental database access method, VIEWCACHE, provides such an interface for accessing distributed data sets and directories. VIEWCACHE allows database browsing and search, performing inter-database cross-referencing with no actual data movement between database sites. This organization and processing is especially suitable for managing astrophysics databases which are physically distributed all over the world. Once the search is complete, the set of collected pointers to the desired data is cached. VIEWCACHE includes spatial access methods for accessing image data sets, which provide much easier query formulation by referring directly to the image, and very efficient search for objects contained within a two-dimensional window. We will develop and optimize a VIEWCACHE External Gateway Access to database management systems to facilitate distributed database search.

  9. Production and Distribution of Global Products From MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Smith, David E. (Technical Monitor)

    2000-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was launched on the EOS Terra spacecraft in December 1999 and will also fly on EOS Aqua in December 2000. With 36 spectral bands from the visible through the thermal infrared and spatial resolutions of 250 m to 1 km, each MODIS instrument will image the entire Earth's surface every 2 days. This paper traces the flow of MODIS data products from the receipt of Level 0 data at the EDOS facility, through the production and quality assurance process, to the Distributed Active Archive Centers (DAACs), which ship products to the user community. It describes where to obtain products and the plans for reprocessing MODIS products. As most components of the ground system are severely limited in their capacity to distribute MODIS products, it also describes the key characteristics of MODIS products and their metadata that allow users to optimize their selection of products given anticipated bottlenecks in distribution.

  10. Hierarchical Bayesian method for mapping biogeochemical hot spots using induced polarization imaging

    DOE PAGES

    Wainwright, Haruko M.; Flores Orozco, Adrian; Bucker, Matthias; ...

    2016-01-29

    In floodplain environments, a naturally reduced zone (NRZ) is considered to be a common biogeochemical hot spot, having distinct microbial and geochemical characteristics. Although important for understanding their role in mediating floodplain biogeochemical processes, mapping the subsurface distribution of NRZs over the dimensions of a floodplain is challenging, as conventional wellbore data are typically spatially limited and the distribution of NRZs is heterogeneous. In this work, we present an innovative methodology for the probabilistic mapping of NRZs within a three-dimensional (3-D) subsurface domain using induced polarization imaging, which is a noninvasive geophysical technique. Measurements consist of surface geophysical surveys and drilling-recovered sediments at the U.S. Department of Energy field site near Rifle, CO (USA). Inversion of surface time domain-induced polarization (TDIP) data yielded 3-D images of the complex electrical resistivity, in terms of magnitude and phase, which are associated with mineral precipitation and other lithological properties. By extracting the TDIP data values colocated with wellbore lithological logs, we found that the NRZs have a different distribution of resistivity and polarization from the other aquifer sediments. To estimate the spatial distribution of NRZs, we developed a Bayesian hierarchical model to integrate the geophysical and wellbore data. In addition, the resistivity images were used to estimate hydrostratigraphic interfaces under the floodplain. Validation results showed that the integration of electrical imaging and wellbore data using a Bayesian hierarchical model was capable of mapping spatially heterogeneous interfaces and NRZ distributions, thereby providing a minimally invasive means to parameterize a hydrobiogeochemical model of the floodplain.

  11. Pore-scale Simulation and Imaging of Multi-phase Flow and Transport in Porous Media (Invited)

    NASA Astrophysics Data System (ADS)

    Crawshaw, J.; Welch, N.; Daher, I.; Yang, J.; Shah, S.; Grey, F.; Boek, E.

    2013-12-01

    We combine multi-scale imaging and computer simulation of multi-phase flow and reactive transport in rock samples to enhance our fundamental understanding of long-term CO2 storage in rock formations. The imaging techniques include Confocal Laser Scanning Microscopy (CLSM), micro-CT and medical CT scanning, with spatial resolutions ranging from sub-micron to mm, respectively. First, we report a new sample preparation technique to study micro-porosity in carbonates using CLSM in three dimensions. Second, we use micro-CT scanning to generate high-resolution 3D pore space images of carbonate and cap rock samples. In addition, we employ micro-CT to image the processes of evaporation in fractures and cap rock degradation due to exposure to CO2 flow. Third, we use medical CT scanning to image spontaneous imbibition in carbonate rock samples. Our imaging studies are complemented by computer simulations of multi-phase flow and transport, using the 3D pore space images obtained from the scanning experiments. We have developed a massively parallel lattice-Boltzmann (LB) code to calculate the single-phase flow field in these pore space images. The resulting flow fields are then used to calculate hydrodynamic dispersion with a novel scheme that predicts probability distributions of molecular displacements using the LB method and a streamline algorithm modified for optimal solid boundary conditions. We calculate solute transport on pore-space images of rock cores with increasing degrees of heterogeneity: a bead pack, Bentheimer sandstone and Portland carbonate. We observe that for homogeneous rock samples, such as bead packs, the displacement distribution remains Gaussian as time increases. In the more heterogeneous rocks, on the other hand, the displacement distribution develops a stagnant part. We observe that the fraction of trapped solute increases from the bead pack (0%) to Bentheimer sandstone (1.5%) to Portland carbonate (8.1%), in excellent agreement with PFG-NMR experiments. We then use our preferred multi-phase model to directly calculate flow in pore space images of two different sandstones and observe excellent agreement with experimental relative permeabilities. We also calculate cluster size distributions in good agreement with experimental studies. Our analysis shows that the simulations are able to predict both multi-phase flow and transport properties directly on large 3D pore space images of real rocks. [Figure: pore space images (left) and velocity distributions (right); Yang and Boek, 2013]

  12. Quantum Color Image Encryption Algorithm Based on A Hyper-Chaotic System and Quantum Fourier Transform

    NASA Astrophysics Data System (ADS)

    Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong

    2016-12-01

    To improve on the slow processing speed of classical image encryption algorithms and enhance the security of private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which sequences generated by Chen's hyper-chaotic system are used to scramble and diffuse the three components of the original color image. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses a large key space to resist illegal attacks, sensitive dependence on initial keys, a uniform distribution of gray values in the encrypted image, and weak correlation between adjacent pixels in the cipher-image.
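As a rough sketch of the chaotic scrambling stage (the quantum Fourier transform stage is omitted), the code below integrates a Chen-type hyper-chaotic system and uses the ordering of its output sequence to permute the pixels of one color channel. The parameter values, step size, and initial state are illustrative assumptions, not those of the paper.

```python
import numpy as np

def chen_hyperchaotic_sequence(n, x0=(0.3, -0.4, 1.2, 1.0),
                               a=35.0, b=3.0, c=12.0, d=7.0, r=0.5,
                               dt=0.001, discard=500):
    """Euler integration of a 4-D Chen-type hyper-chaotic system.

    Parameters and initial state are illustrative assumptions; the first
    `discard` samples are dropped to move past transients."""
    x, y, z, w = x0
    seq = []
    for i in range(n + discard):
        dx = a * (y - x) + w
        dy = d * x - x * z + c * y
        dz = x * y - b * z
        dw = y * z + r * w
        x, y, z, w = x + dt * dx, y + dt * dy, z + dt * dz, w + dt * dw
        if i >= discard:
            seq.append(x)
    return np.array(seq)

def scramble_channel(channel, key_seq):
    """Permute pixels according to the sort order of the chaotic sequence."""
    flat = channel.ravel()
    perm = np.argsort(key_seq[:flat.size])
    return flat[perm].reshape(channel.shape), perm

def unscramble_channel(scrambled, perm):
    """Invert the permutation to recover the original channel."""
    flat = np.empty(scrambled.size, dtype=scrambled.dtype)
    flat[perm] = scrambled.ravel()
    return flat.reshape(scrambled.shape)
```

Because the scrambling is a pure permutation keyed by the chaotic trajectory, recovering the image requires reproducing the exact initial state and parameters.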

  13. A novel image enhancement algorithm based on stationary wavelet transform for infrared thermography to the de-bonding defect in solid rocket motors

    NASA Astrophysics Data System (ADS)

    Liu, Tao; Zhang, Wei; Yan, Shaoze

    2015-10-01

    In this paper, a multi-scale image enhancement algorithm based on low-pass filtering and nonlinear transformation is proposed for infrared testing images of de-bonding defects in solid propellant rocket motors. Infrared testing images with high noise levels and low contrast are the foundation for identifying defects and calculating defect sizes. To improve the quality of the infrared image, and in accordance with the distribution properties of the detection image, the approximation coefficients at a suitable decomposition level within the stationary wavelet transform framework are processed with an ideal (index) low-pass filter via the Fourier transform; a nonlinear transformation is then applied to further improve the image contrast. To verify the validity of the algorithm, it is applied to infrared testing images of two specimens with de-bonding defects: one made of a high-strength steel and the other of a carbon fiber composite. The results show that in the images processed by the proposed algorithm most of the noise is eliminated and the contrast between defect areas and normal areas is greatly improved; in addition, continuous defect edges can be extracted from the binarized version of the processed image. Together these results demonstrate the validity of the algorithm, which provides a well-performing image enhancement method for infrared thermography.
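The two processing steps named in the abstract, an ideal (index) low-pass filter applied in the Fourier domain and a contrast-enhancing nonlinear transformation, can be illustrated with a plain-NumPy sketch. The cutoff radius and the power-law nonlinearity below are assumptions, and the stationary-wavelet decomposition that precedes them in the paper is omitted.

```python
import numpy as np

def lowpass_filter(img, cutoff=0.2):
    """Ideal low-pass filter via FFT: zero out all frequencies whose
    normalized radius exceeds `cutoff` (an assumed value)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)
    F[radius > cutoff] = 0  # hard cutoff: the "index" low-pass
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

def nonlinear_stretch(img, gamma=0.6):
    """Power-law (gamma) transform as a simple contrast-enhancing
    nonlinearity; the paper's exact transformation is not specified here."""
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)
    return norm ** gamma
```

In practice the filter would be applied to the wavelet approximation coefficients rather than the raw image, but the mechanics are the same.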

  14. Optical image encryption method based on incoherent imaging and polarized light encoding

    NASA Astrophysics Data System (ADS)

    Wang, Q.; Xiong, D.; Alfalou, A.; Brosseau, C.

    2018-05-01

    We propose an incoherent encoding system for image encryption based on a polarized encoding method combined with incoherent imaging. Incoherent imaging is the core component of this proposal: the incoherent point-spread function (PSF) of the imaging system serves as the main key, encoding the input intensity distribution through a convolution operation. An array of retarders and polarizers is placed on the input plane of the imaging structure to encrypt the polarization state of light based on Mueller polarization calculus. The proposal makes full use of the randomness of the polarization parameters and the incoherent PSF, so that a multidimensional key space is generated to deal with illegal attacks. Mueller polarization calculus and incoherent illumination of the imaging structure ensure that only intensity information is manipulated. Another key advantage is that the complicated processing and recording associated with a complex-valued signal are avoided. The encoded information is just an intensity distribution, which is advantageous for data storage and transmission because the information expansion accompanying conventional encryption methods is also avoided. The decryption procedure can be performed digitally or using optoelectronic devices. Numerical simulation tests demonstrate the validity of the proposed scheme.
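The core encode/decode idea, convolving the input intensity with the incoherent PSF and inverting with the PSF as the key, can be sketched as follows. The regularized inverse filter is an illustrative choice, not the authors' decryption procedure, and the polarization-encoding stage is omitted.

```python
import numpy as np

def encode_intensity(image, psf):
    """Encode an input intensity distribution by (circular) convolution
    with the incoherent PSF, implemented via FFT for simplicity."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf)))

def decode_intensity(encoded, psf, eps=1e-6):
    """Regularized inverse filtering with the PSF as the decryption key.
    The regularization constant `eps` is an illustrative assumption."""
    H = np.fft.fft2(psf)
    return np.real(np.fft.ifft2(np.fft.fft2(encoded) * np.conj(H) /
                                (np.abs(H) ** 2 + eps)))
```

Only intensities (real, nonnegative arrays) are manipulated here, mirroring the paper's point that no complex-valued signal needs to be recorded.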

  15. Multicolor pyrometer for materials processing in space, phase 2

    NASA Technical Reports Server (NTRS)

    Frish, Michael; Frank, Jonathan; Beerman, Henry

    1988-01-01

    The program goals were to design, construct, and program a prototype passive imaging pyrometer capable of measuring, as accurately as possible, the temperature distribution across the surface of a moving object suspended in space.

  16. Quantitative 3-D Imaging, Segmentation and Feature Extraction of the Respiratory System in Small Mammals for Computational Biophysics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trease, Lynn L.; Trease, Harold E.; Fowler, John

    2007-03-15

    One of the critical steps toward performing computational biology simulations using mesh-based integration methods is using topologically faithful geometry derived from experimental digital image data as the basis for generating the computational meshes. Digital image data representations contain both the topology of the geometric features and experimental field data distributions. The geometric features that need to be captured from the digital image data are three-dimensional; therefore, the process and tools we have developed work with volumetric image data represented as data cubes. This allows us to take advantage of 2D curvature information during the segmentation and feature extraction process. The process is basically: 1) segmenting to isolate and enhance the contrast of the features that we wish to extract and reconstruct, 2) extracting the geometry of the features with an isosurfacing technique, and 3) building the computational mesh using the extracted feature geometry. "Quantitative" image reconstruction and feature extraction is done for the purpose of generating computational meshes, not just for producing graphics "screen"-quality images. For example, the surface geometry that we extract must represent a closed, water-tight surface.

  17. Tomographic data fusion with CFD simulations associated with a planar sensor

    NASA Astrophysics Data System (ADS)

    Liu, J.; Liu, S.; Sun, S.; Zhou, W.; Schlaberg, I. H. I.; Wang, M.; Yan, Y.

    2017-04-01

    Tomographic techniques have a great ability to interrogate combustion processes, especially when combined with physical models of the combustion itself. In this study, a data fusion algorithm is developed to investigate the flame distribution of a swirl-induced environmental (EV) burner, a new type of burner for low-NOx combustion. An electrical capacitance tomography (ECT) system is used to acquire 3D flame images, and computational fluid dynamics (CFD) is applied to calculate an initial distribution of the temperature profile for the EV burner. Experiments were also carried out to visualize flames at a series of locations above the burner. While the ECT images essentially agree with the CFD temperature distribution, discrepancies exist at certain heights. When data fusion is applied, the discrepancy is visibly reduced and the ECT images are improved. The methods used in this study can lead to a new route where combustion visualization is much improved and applied to clean energy conversion and new burner development.

  18. Simulating Picosecond X-ray Diffraction from shocked crystals by Post-processing Molecular Dynamics Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimminau, G; Nagler, B; Higginbotham, A

    2008-06-19

    Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
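The pipeline described, depositing atoms onto a grid to form a discrete electron density and taking its FFT to obtain a reciprocal-space image, can be sketched as below. Using a constant scalar form factor is an illustrative simplification of the atomic form factors mentioned in the abstract.

```python
import numpy as np

def density_grid(positions, shape, form_factor=1.0):
    """Deposit point-like atoms (each weighted by a scalar form factor)
    onto a regular grid to form a discrete electron-density distribution."""
    rho = np.zeros(shape)
    for p in positions:
        rho[tuple(np.mod(np.round(p).astype(int), shape))] += form_factor
    return rho

def diffraction_pattern(rho):
    """Reciprocal-space intensity: squared modulus of the FFT of the
    electron density (the kinematic diffraction approximation)."""
    return np.abs(np.fft.fftn(rho)) ** 2
```

For a perfect lattice the intensity concentrates in sharp Bragg peaks; a shocked (strained) configuration from NEMD would shift and broaden those peaks.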

  19. FibrilJ: ImageJ plugin for fibrils' diameter and persistence length determination

    NASA Astrophysics Data System (ADS)

    Sokolov, P. A.; Belousov, M. V.; Bondarev, S. A.; Zhouravleva, G. A.; Kasyanenko, N. A.

    2017-05-01

    Application of microscopy to evaluate the morphology and size of filamentous proteins and amyloids requires new and creative approaches to simplify and automate the image processing. Estimating the mean diameter, length and bending stiffness of fibrils from micrographs is a major challenge. For this purpose we developed an open-source FibrilJ plugin for the ImageJ/FiJi program. It automatically recognizes fibrils on the surface of mica, silicon, gold or formvar film and further analyzes them to calculate the distributions of fibril diameters, lengths and persistence lengths. The plugin has been validated by processing TEM images of fibrils formed by the Sup35NM yeast protein and artificially created images of rod-shaped objects with predefined parameters. Novel data obtained by SEM for Sup35NM protein fibrils immobilized on silicon and gold substrates are also presented and analyzed.

  20. Effect of random phase mask on input plane in photorefractive authentic memory with two-wave encryption method

    NASA Astrophysics Data System (ADS)

    Mita, Akifumi; Okamoto, Atsushi; Funakoshi, Hisatoshi

    2004-06-01

    We have proposed an all-optical authentic memory with a two-wave encryption method. In the recording process, the image data are encrypted to white noise by random phase masks added on the input beam carrying the image data and on the reference beam. Only a reading beam with the phase-conjugate distribution of the reference beam can decrypt the encrypted data. If the encrypted data are read out with an incorrect phase distribution, the output data are transformed into white noise. Moreover, during readout, reconstructions of the encrypted data interfere destructively, resulting in zero intensity. Therefore our memory has the merit that unlawful accesses can be detected easily by measuring the output beam intensity. In our encryption method, the random phase mask on the input plane plays important roles in transforming the input image into white noise and in preventing the white noise from being decrypted back to the input image by blind deconvolution. Without this mask, when unauthorized users observe the output beam with a CCD during readout with a plane wave, exactly the same intensity distribution as that of the Fourier transform of the input image is obtained, and the encrypted image can then be decrypted easily by blind deconvolution. With the mask, even if unauthorized users observe the output beam by the same method, the encrypted image cannot be decrypted because the observed intensity distribution is dispersed at random by the mask; thus the robustness is increased. In this report, we compare the correlation coefficients between the output image and the input image, which represent the degree to which the output image is white noise, with and without this mask. We show that the robustness of the encryption method is increased, as the correlation coefficient improves from 0.3 to 0.1 when the mask is used.

  1. Development and Operation of the Americas ALOS Data Node

    NASA Astrophysics Data System (ADS)

    Arko, S. A.; Marlin, R. H.; La Belle-Hamer, A. L.

    2004-12-01

    In the spring of 2005, the Japanese Aerospace Exploration Agency (JAXA) will launch the next generation of advanced remote sensing satellites. The Advanced Land Observing Satellite (ALOS) includes three sensors, two visible imagers and one L-band polarimetric SAR, providing high-quality remote sensing data to the scientific and commercial communities throughout the world. Focusing on remote sensing and scientific pursuits, ALOS will image nearly the entire Earth using all three instruments during its expected three-year lifetime. These data sets offer the potential for data continuation of older satellite missions as well as new products for the growing user community. One of the unique features of the ALOS mission is the data distribution approach: JAXA has created a worldwide cooperative data distribution network. The data nodes are NOAA/ASF, representing the Americas ALOS Data Node (AADN); ESA, representing the ALOS European and African Node (ADEN); Geoscience Australia, representing Oceania; and JAXA, representing the Asian continent. The AADN is the sole agency responsible for archival, processing and distribution of L0 and L1 products to users in both North and South America. In support of this mission, AADN is currently developing a processing and distribution infrastructure to provide easy access to these data sets. Utilizing a custom, grid-based process controller and media generation system, the overall infrastructure has been designed to provide maximum throughput while requiring a minimum of operator input and maintenance. This paper will present an overview of the ALOS system, details of each sensor's capabilities, and the processing and distribution system being developed by AADN to provide these valuable data sets to users throughout North and South America.

  2. Imaging the distribution of transient viscosity after the 2016 Mw 7.1 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Moore, James D. P.; Yu, Hang; Tang, Chi-Hsien; Wang, Teng; Barbot, Sylvain; Peng, Dongju; Masuti, Sagar; Dauwels, Justin; Hsu, Ya-Ju; Lambert, Valère; Nanjundiah, Priyamvada; Wei, Shengji; Lindsey, Eric; Feng, Lujia; Shibazaki, Bunichiro

    2017-04-01

    The deformation of mantle and crustal rocks in response to stress plays a crucial role in the distribution of seismic and volcanic hazards, controlling tectonic processes ranging from continental drift to earthquake triggering. However, the spatial variation of these dynamic properties is poorly understood as they are difficult to measure. We exploited the large stress perturbation incurred by the 2016 earthquake sequence in Kumamoto, Japan, to directly image localized and distributed deformation. The earthquakes illuminated distinct regions of low effective viscosity in the lower crust, notably beneath the Mount Aso and Mount Kuju volcanoes, surrounded by larger-scale variations of viscosity across the back-arc. This study demonstrates a new potential for geodesy to directly probe rock rheology in situ across many spatial and temporal scales.

  3. Contour-based object orientation estimation

    NASA Astrophysics Data System (ADS)

    Alpatov, Boris; Babayan, Pavel

    2016-04-01

    Real-time object orientation estimation is a current problem in computer vision. In this paper we propose an approach to estimating the orientation of objects lacking axial symmetry. The proposed algorithm is intended to estimate the orientation of a specific, known 3D object, so a 3D model is required for learning. The algorithm consists of two stages: learning and estimation. The learning stage explores the studied object: using the 3D model, we gather a set of training images by capturing the model from viewpoints evenly distributed on a sphere. The viewpoint distribution follows the geosphere principle, which minimizes the size of the training image set. The gathered training images are used to calculate descriptors, which are then used in the estimation stage of the algorithm. The estimation stage focuses on matching an observed image descriptor against the training image descriptors. The experimental research was performed using a set of images of an Airbus A380. The proposed orientation estimation algorithm showed good accuracy (mean error below 6°) in all case studies. The real-time performance of the algorithm was also demonstrated.
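A stand-in for the evenly distributed viewpoint sampling is sketched below. The paper uses a geosphere subdivision; the Fibonacci spiral here is an assumed substitute with comparable uniformity, useful for generating the training viewpoints.

```python
import numpy as np

def sphere_viewpoints(n):
    """Approximately uniform viewpoint directions on the unit sphere via
    a Fibonacci spiral (an assumed stand-in for geosphere subdivision)."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i   # golden-angle azimuth steps
    z = 1.0 - 2.0 * (i + 0.5) / n            # evenly spaced heights
    r = np.sqrt(1.0 - z * z)
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)
```

Each returned unit vector defines a camera direction from which the 3D model would be rendered to produce one training image.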

  4. Actinide bioimaging in tissues: Comparison of emulsion and solid track autoradiography techniques with the iQID camera

    PubMed Central

    Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.

    2017-01-01

    This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based and plastic-based autoradiography, and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues, as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performances of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution and feasibility of activity quantification. The obtained autoradiographs presented similar actinide distributions at low magnification. Of the three techniques, emulsion autoradiography is the only one to provide a highly resolved image of the actinide distribution inherently superimposed on the biological sample, and it is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy. The most recent technique, the iQID camera, presents several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary, and the choice of technique depends on the purpose of the imaging experiment. PMID:29023595

  5. Adaptive histogram equalization in digital radiography of destructive skeletal lesions.

    PubMed

    Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R

    1988-03-01

    Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.
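The core mapping behind histogram equalization, pushing pixel values through the normalized cumulative histogram so they spread over the full gray scale, can be sketched as below. Note that the study applies an adaptive, region-based variant; this sketch shows only the global mapping.

```python
import numpy as np

def equalize(img, levels=256):
    """Global histogram equalization for a uint8 image: map each gray
    level through the normalized cumulative histogram so pixel values
    spread across the full dynamic range."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist).astype(float)
    cdf_min = cdf[cdf > 0].min()  # cumulative count at first occupied level
    lut = np.clip((cdf - cdf_min) / (cdf[-1] - cdf_min + 1e-12), 0.0, 1.0)
    return np.round(lut * (levels - 1)).astype(np.uint8)[img]
```

The adaptive variant computes such a mapping within local regions (interpolating between them), which is what lets it enhance both dense cortical bone and soft tissue in the same radiograph.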

  6. Iteration and superposition encryption scheme for image sequences based on multi-dimensional keys

    NASA Astrophysics Data System (ADS)

    Han, Chao; Shen, Yuzhen; Ma, Wenlin

    2017-12-01

    An iteration and superposition encryption scheme for image sequences based on multi-dimensional keys is proposed for high-security, high-capacity, low-noise information transmission. The multiple images to be encrypted are transformed into phase-only images with an iterative algorithm and then encrypted with different random phases, respectively. The encrypted phase-only images are inverse Fourier transformed, generating new object functions. These functions are located in different blocks and zero-padded for a sparse distribution; they then propagate to a specific region over different distances by angular spectrum diffraction and are superposed to form a single image. The single image is multiplied by a random phase in the frequency domain, after which the phase part of the frequency spectrum is truncated and the amplitude information retained. The random phases, propagation distances, and truncated phase information in the frequency domain serve as multi-dimensional keys. The iterative processing and sparse distribution greatly reduce crosstalk among the multiple encrypted images, and the superposition of image sequences greatly improves the capacity of the encrypted information. Several numerical experiments based on a designed optical system demonstrate that the proposed scheme can enhance encrypted-information capacity and keep image transmission at a highly desired security level.

  7. Algorithms and programming tools for image processing on the MPP, part 2

    NASA Technical Reports Server (NTRS)

    Reeves, Anthony P.

    1986-01-01

    A number of algorithms were developed for image warping and pyramid image filtering. Techniques were investigated for the parallel processing of a large number of independent irregular shaped regions on the MPP. In addition some utilities for dealing with very long vectors and for sorting were developed. Documentation pages for the algorithms which are available for distribution are given. The performance of the MPP for a number of basic data manipulations was determined. From these results it is possible to predict the efficiency of the MPP for a number of algorithms and applications. The Parallel Pascal development system, which is a portable programming environment for the MPP, was improved and better documentation including a tutorial was written. This environment allows programs for the MPP to be developed on any conventional computer system; it consists of a set of system programs and a library of general purpose Parallel Pascal functions. The algorithms were tested on the MPP and a presentation on the development system was made to the MPP users group. The UNIX version of the Parallel Pascal System was distributed to a number of new sites.

  8. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, rib movement is assessed with fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). The bone suppression technique, based on a massive-training artificial neural network (MTANN), was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movements appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movement without additional radiation dose.

  9. Magnetic Levitation Coupled with Portable Imaging and Analysis for Disease Diagnostics.

    PubMed

    Knowlton, Stephanie M; Yenilmez, Bekir; Amin, Reza; Tasoglu, Savas

    2017-02-19

    Currently, many clinical diagnostic procedures are complex, costly, inefficient, and inaccessible to a large population in the world. The requirements for specialized equipment and trained personnel mean that many diagnostic tests must be performed at remote, centralized clinical laboratories. Magnetic levitation is a simple yet powerful technique that can be applied to levitate cells, which are suspended in a paramagnetic solution and placed in a magnetic field, at a position determined by equilibrium between a magnetic force and a buoyancy force. Here, we present a versatile platform technology designed for point-of-care diagnostics which uses magnetic levitation coupled with microscopic imaging and automated analysis to determine the density distribution of a patient's cells as a useful diagnostic indicator. We present two platforms operating on this principle: (i) a smartphone-compatible version of the technology, where the built-in smartphone camera is used to image cells in the magnetic field and a smartphone application processes the images and measures the density distribution of the cells, and (ii) a self-contained version where a camera board captures images and an embedded processing unit with an attached thin-film-transistor (TFT) screen measures and displays the results. Demonstrated applications include: (i) measuring the altered distribution of a cell population with a disease phenotype compared to a healthy phenotype, which is applied to sickle cell disease diagnosis, and (ii) separation of different cell types based on their characteristic densities, which is applied to separate white blood cells from red blood cells for white blood cell cytometry. These applications, as well as future extensions of the essential density-based measurements enabled by this portable, user-friendly platform technology, will significantly enhance disease diagnostic capabilities at the point of care.

  10. Temperature as a third dimension in column-density mapping of dusty astrophysical structures associated with star formation

    NASA Astrophysics Data System (ADS)

    Marsh, K. A.; Whitworth, A. P.; Lomax, O.

    2015-12-01

    We present point process mapping (PPMAP), a Bayesian procedure that uses images of dust continuum emission at multiple wavelengths to produce resolution-enhanced image cubes of differential column density as a function of dust temperature and position. PPMAP is based on the generic 'point process' formalism, whereby the system of interest (in this case, a dusty astrophysical structure such as a filament or pre-stellar core) is represented by a collection of points in a suitably defined state space. It can be applied to a variety of observational data, such as Herschel images, provided only that the image intensity is delivered by optically thin dust in thermal equilibrium. PPMAP takes full account of the instrumental point-spread functions and does not require all images to be degraded to the same resolution. We present the results of testing using simulated data for a pre-stellar core and a fractal turbulent cloud, and demonstrate its performance with real data from the Herschel infrared Galactic Plane Survey (Hi-GAL). Specifically, we analyse observations of a large filamentary structure in the CMa OB1 giant molecular cloud. Histograms of differential column density indicate that the warm material (T ≳ 13 K) is distributed lognormally, consistent with turbulence, but the column densities of the cooler material are distributed with a high-density tail, consistent with the effects of self-gravity. The results illustrate the potential of PPMAP to aid in distinguishing between different physical components along the line of sight in star-forming clouds, and aid the interpretation of the associated probability distribution functions (PDFs) of column density.

  11. Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing

    NASA Astrophysics Data System (ADS)

    Meng, X.

    2012-07-01

    Field Ground Truthing Data Collector is one of the four key components of the NASA funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit provides a comprehensive set of functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in the field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., having video-conference calls in the field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging in discussions on other learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained at the presentation. A pilot case study will also be demonstrated.

  12. lop-DWI: A Novel Scheme for Pre-Processing of Diffusion-Weighted Images in the Gradient Direction Domain.

    PubMed

    Sepehrband, Farshid; Choupan, Jeiran; Caruyer, Emmanuel; Kurniawan, Nyoman D; Gal, Yaniv; Tieng, Quang M; McMahon, Katie L; Vegh, Viktor; Reutens, David C; Yang, Zhengyi

    2014-01-01

    We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion-gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo-periodic, with characteristics that allow separation of low-frequency signal from high-frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fiber tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. Both the signal-to-noise ratio and the accuracy of local reconstruction of fiber tracks were significantly improved using our method.
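The separation of low-frequency signal from high-frequency noise described above amounts to low-pass filtering each voxel's signal across the (spiral-ordered) gradient directions. A toy sketch of that idea, with a synthetic pseudo-periodic signal and an arbitrarily chosen cutoff `keep` (both assumptions, not the paper's parameters):

```python
import numpy as np

def lowpass_gradient_domain(signal, keep=8):
    """Suppress high-frequency noise in the gradient-direction domain.

    Assumes the diffusion-weighted signal, sampled along a periodic
    spiral of gradient directions, is pseudo-periodic, so its energy
    concentrates in the low Fourier coefficients; higher coefficients
    are treated as noise and zeroed.
    """
    spec = np.fft.rfft(signal)
    spec[keep:] = 0.0                       # discard high-frequency content
    return np.fft.irfft(spec, n=len(signal))

rng = np.random.default_rng(0)
n = 128
t = np.arange(n)
clean = 1.0 + 0.3 * np.cos(2 * np.pi * 3 * t / n)   # pseudo-periodic signal
noisy = clean + 0.1 * rng.standard_normal(n)
denoised = lowpass_gradient_domain(noisy)
```

Because most Fourier bins carry only noise, zeroing them reduces the mean squared error relative to the clean signal while preserving the pseudo-periodic component.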

  13. Digital image processing of nanometer-size metal particles on amorphous substrates

    NASA Technical Reports Server (NTRS)

    Soria, F.; Artal, P.; Bescos, J.; Heinemann, K.

    1989-01-01

    The task of differentiating very small metal aggregates supported on amorphous films from the phase contrast image features inherently stemming from the support is extremely difficult in the nanometer particle size range. Digital image processing was employed to overcome some of the ambiguities in evaluating such micrographs. It was demonstrated that such processing allowed positive particle detection and a limited degree of statistical size analysis, even for micrographs where, on visual examination, the distinction between particles and spurious substrate features would seem highly ambiguous. The smallest size class detected for Pd/C samples peaks at 0.8 nm. This size class was found in various samples prepared under different evaporation conditions, and it is concluded that these particles consist of a 'magic number' of 13 atoms and have cuboctahedral or icosahedral crystal structure.

  14. Surface topography analysis and performance on post-CMP images (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lee, Jusang; Bello, Abner F.; Kakita, Shinichiro; Pieniazek, Nicholas; Johnson, Timothy A.

    2017-03-01

    Surface topography on post-CMP processing can be measured with white light interference microscopy to determine the planarity. Results are used to avoid under- or over-polishing and to decrease dishing. The numerical output of the surface topography is the RMS (root-mean-square) of the height. Beyond RMS, the topography image is visually examined but not further quantified: subjective comparisons of the height maps are used to determine optimum CMP process conditions, and while visual comparison of height maps can detect excursions, it does so only through manual inspection of the images. In this work we describe methods of quantifying post-CMP surface topography characteristics that are used in other technical fields such as geography and facial recognition. The topography image is divided into small surface patches of 7x7 pixels. Each surface patch is fitted to an analytic surface equation, in this case a third-order polynomial, from which the gradient, directional derivatives, and other characteristics are calculated. Based on these characteristics, the surface patch is labeled as peak, ridge, flat, saddle, ravine, pit, or hillside. The number of each label, and thus the associated histogram, is then used as a quantified characteristic of the surface topography, and could be used as a parameter for SPC (statistical process control) charting. In addition, the gradient for each surface patch is calculated, so the average, maximum, and other characteristics of the gradient distribution can be used for SPC. Repeatability measurements indicate high confidence, with individual label counts showing relative standard deviations below 2%. When the histogram is considered, an associated chi-squared value can be defined with which to compare other measurements. The chi-squared value of the histogram is a sensitive and quantifiable parameter for determining within-wafer and wafer-to-wafer topography non-uniformity. The chi-squared of the gradient histogram distribution can likewise be calculated and used as yet another quantifiable parameter for SPC.

    In this work we measured the post-Cu-CMP topography of a die designed for 14nm technology. A region of interest (ROI) known to be indicative of the CMP processing was chosen for the topography analysis. The ROI, of size 1800 x 2500 pixels where each pixel represents 2um, was repeatably measured. We show the sensitivity based on measurements and the comparison between center and edge die measurements. The topography measurements and surface patch analysis were applied to hundreds of images representing the periodic process qualification runs required to control and verify CMP performance and tool matching. The analysis is shown to be sensitive to process conditions that vary in polishing time, type of slurry, CMP tool manufacturer, and CMP pad lifetime.

    Keywords: CMP, topography, image processing, metrology, interference microscopy, surface processing

    [1] De Lega, Xavier Colonna, and Peter De Groot. "Optical topography measurement of patterned wafers." Characterization and Metrology for ULSI Technology 2005 788 (2005): 432-436.
    [2] de Groot, Peter. "Coherence scanning interferometry." Optical Measurement of Surface Topography. Springer Berlin Heidelberg, 2011. 187-208.
    [3] Watson, Layne T., Thomas J. Laffey, and Robert M. Haralick. "Topographic classification of digital image intensity surfaces using generalized splines and the discrete cosine transformation." Computer Vision, Graphics, and Image Processing 29.2 (1985): 143-167.
    [4] Wang, Jun, et al. "3D facial expression recognition based on primitive surface feature distribution." Computer Vision and Pattern Recognition, 2006 IEEE Computer Society Conference on. Vol. 2. IEEE, 2006.
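The patch-labeling step described above can be illustrated with a simplified version: a quadratic (rather than the paper's third-order) least-squares fit per patch, classified from the fitted gradient and curvature. Thresholds and the reduced label set are illustrative assumptions:

```python
import numpy as np

def classify_patch(patch, grad_tol=0.05, curv_tol=1e-3):
    """Label a square height patch as peak, pit, saddle, flat, or hillside.

    Simplified sketch of the surface-patch labeling idea: fit a quadratic
    surface by least squares and classify from the gradient magnitude and
    the eigenvalues of the Hessian at the patch centre.
    """
    n = patch.shape[0]
    y, x = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
    A = np.column_stack([np.ones(n * n), x.ravel(), y.ravel(),
                         x.ravel() ** 2, x.ravel() * y.ravel(), y.ravel() ** 2])
    c = np.linalg.lstsq(A, patch.ravel(), rcond=None)[0]
    grad = np.hypot(c[1], c[2])                      # gradient at centre
    H = np.array([[2 * c[3], c[4]], [c[4], 2 * c[5]]])
    ev = np.linalg.eigvalsh(H)                       # curvatures, ascending
    if grad > grad_tol:
        return "hillside"
    if all(abs(e) < curv_tol for e in ev):
        return "flat"
    if ev[1] < 0:                                    # both curvatures negative
        return "peak"
    if ev[0] > 0:                                    # both curvatures positive
        return "pit"
    return "saddle"

yy, xx = np.mgrid[0:7, 0:7] - 3.0
labels = {
    "peak": classify_patch(-(xx ** 2 + yy ** 2) * 0.01),
    "pit": classify_patch((xx ** 2 + yy ** 2) * 0.01),
    "saddle": classify_patch((xx ** 2 - yy ** 2) * 0.01),
}
```

A per-image histogram of such labels, and a chi-squared distance between histograms, would then serve as the SPC parameters the abstract describes.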

  15. Deviations from Rayleigh statistics in ultrasonic speckle.

    PubMed

    Tuthill, T A; Sperry, R H; Parker, K J

    1988-04-01

    The statistics of speckle patterns in ultrasound images have potential for tissue characterization. In "fully developed speckle" from many random scatterers, the amplitude is widely recognized as possessing a Rayleigh distribution. This study examines how scattering populations and signal processing can produce non-Rayleigh distributions. The first order speckle statistics are shown to depend on random scatterer density and the amplitude and spacing of added periodic scatterers. Envelope detection, amplifier compression, and signal bandwidth are also shown to cause distinct changes in the signal distribution.
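The Rayleigh baseline can be checked numerically: for fully developed speckle the envelope's point SNR (mean divided by standard deviation) approaches sqrt(pi/(4-pi)) ≈ 1.91, and adding a coherent (e.g. periodic-scatterer) component raises it toward a Rician regime. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def envelope_snr(coherent=0.0, n_scatterers=200, n_samples=20000):
    """Point SNR (mean/std) of the echo envelope.

    Each sample sums many unit scatterers with random phases; with no
    coherent component the envelope is Rayleigh-distributed, while a
    strong coherent component makes it Rician and raises the SNR.
    """
    phases = rng.uniform(0, 2 * np.pi, size=(n_samples, n_scatterers))
    field = np.exp(1j * phases).sum(axis=1) + coherent
    env = np.abs(field)
    return env.mean() / env.std()

rayleigh_snr = envelope_snr()              # ~1.91 for fully developed speckle
rician_snr = envelope_snr(coherent=60.0)   # coherent component raises the SNR
```

Deviations of the measured point SNR from 1.91 are one simple first-order statistic for detecting non-Rayleigh speckle of the kind the study examines.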

  16. A Semi-Automatic Method for Image Analysis of Edge Dynamics in Living Cells

    PubMed Central

    Huang, Lawrence; Helmke, Brian P.

    2011-01-01

    Spatial asymmetry of actin edge ruffling contributes to the process of cell polarization and directional migration, but mechanisms by which external cues control actin polymerization near cell edges remain unclear. We designed a quantitative image analysis strategy to measure the spatiotemporal distribution of actin edge ruffling. Time-lapse images of endothelial cells (ECs) expressing mRFP-actin were segmented using an active contour method. In intensity line profiles oriented normal to the cell edge, peak detection identified the angular distribution of polymerized actin within 1 µm of the cell edge, which was localized to lamellipodia and edge ruffles. Edge features associated with filopodia and peripheral stress fibers were removed. Circular statistical analysis enabled detection of cell polarity, indicated by a unimodal distribution of edge ruffles. To demonstrate the approach, we detected a rapid, nondirectional increase in edge ruffling in serum-stimulated ECs and a change in constitutive ruffling orientation in quiescent, nonpolarized ECs. Error analysis using simulated test images demonstrates robustness of the method to variations in image noise levels, edge ruffle arc length, and edge intensity gradient. These quantitative measurements of edge ruffling dynamics enable investigation at the cellular length scale of the underlying molecular mechanisms regulating actin assembly and cell polarization. PMID:21643526
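The circular-statistics step can be sketched with the mean resultant vector, whose length separates a unimodal (polarized) angular distribution of ruffles from a nondirectional one. The angle samples below are synthetic stand-ins, not the paper's data:

```python
import numpy as np

def polarity(angles_rad):
    """Mean resultant vector of edge-ruffle angles (circular statistics).

    Returns (R, mean_angle): R near 1 indicates a unimodal, polarized
    ruffle distribution; R near 0 indicates nondirectional ruffling.
    """
    z = np.exp(1j * np.asarray(angles_rad)).mean()
    return np.abs(z), np.angle(z)

rng = np.random.default_rng(2)
uniform = rng.uniform(0, 2 * np.pi, 1000)        # nonpolarized cell
polarized = rng.normal(np.pi / 4, 0.3, 1000)     # ruffles clustered near 45 deg
R_uni, _ = polarity(uniform)
R_pol, ang = polarity(polarized)
```

Thresholding R (or testing it against the null distribution for uniform angles, e.g. a Rayleigh test) gives a quantitative polarity call per cell per time point.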

  17. Algorithm for automatic image dodging of unmanned aerial vehicle images using two-dimensional radiometric spatial attributes

    NASA Astrophysics Data System (ADS)

    Li, Wenzhuo; Sun, Kaimin; Li, Deren; Bai, Ting

    2016-07-01

    Unmanned aerial vehicle (UAV) remote sensing technology has come into wide use in recent years. The poor stability of the UAV platform, however, produces more inconsistencies in hue and illumination among UAV images than other more stable platforms. Image dodging is a process used to reduce these inconsistencies caused by different imaging conditions. We propose an algorithm for automatic image dodging of UAV images using two-dimensional radiometric spatial attributes. We use object-level image smoothing to smooth foreground objects in images and acquire an overall reference background image by relative radiometric correction. We apply the Contourlet transform to separate high- and low-frequency sections for every single image, and replace the low-frequency section with the low-frequency section extracted from the corresponding region in the overall reference background image. We apply the inverse Contourlet transform to reconstruct the final dodged images. In this process, a single image must be split into reasonable block sizes with overlaps due to large pixel size. Experimental mosaic results show that our proposed method reduces the uneven distribution of hue and illumination. Moreover, it effectively eliminates dark-bright interstrip effects caused by shadows and vignetting in UAV images while maximally protecting image texture information.
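The low-frequency replacement at the core of the dodging method can be sketched with a Gaussian low-pass standing in for the Contourlet low-frequency band (the paper's transform, block splitting, and reference-background construction are more involved; this is a minimal illustration of the band swap only):

```python
import numpy as np

def gaussian_blur(img, sigma=8.0):
    """Separable Gaussian low-pass, a stand-in for the Contourlet
    low-frequency band used in the paper."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(img, r, mode="edge")
    tmp = np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 0, pad)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="valid"), 1, tmp)

def dodge(img, reference):
    """Keep the image's own high-frequency texture, but take its
    low-frequency illumination from the reference background."""
    return (img - gaussian_blur(img)) + gaussian_blur(reference)

rng = np.random.default_rng(3)
texture = rng.standard_normal((64, 64))
illum = np.linspace(0.0, 10.0, 64)[None, :] * np.ones((64, 1))  # uneven lighting
img = texture + illum
reference = np.full((64, 64), 5.0)                               # even reference
out = dodge(img, reference)
```

After the swap, the strong left-to-right illumination ramp is largely removed while the texture is retained, which is the behavior the mosaic experiments evaluate.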

  18. New technology of functional infrared imaging and its clinical applications

    NASA Astrophysics Data System (ADS)

    Yang, Hongqin; Xie, Shusen; Lu, Zukang; Liu, Zhongqi

    2006-01-01

    With improvements in infrared camera technology, the promise of reduced costs, and its noninvasive character, infrared thermal imaging is resurging in medicine. This paper introduces a new technology of functional infrared imaging, thermal texture maps (TTM), which is not only an apparatus for thermal radiation imaging but also a new method for revealing the relationship between the temperature distribution of the skin surface and the emission field inside the body. The skin temperature distribution of a healthy human body exhibits a contralateral symmetry. Any disease in the body is associated with an alteration of the thermal distribution of the human body. Infrared thermography is noninvasive, so it is the best choice for studying the physiology of thermoregulation and the thermal dysfunction associated with diseases. Reading and extracting information from the thermograms is a complex and subjective task that can be greatly facilitated by computerized techniques. Through image processing and measurement technology, surface or internal radiation sources can be non-invasively distinguished through extrapolation. We discuss the principle, the evaluation procedure, and the effectiveness of TTM technology in the clinical detection and diagnosis of cancers, especially in their early stages, and of other diseases, by comparison with other imaging technologies such as ultrasound. Several study cases are given to show the effectiveness of this method. Finally, we point out the applications of TTM technology in the research field of traditional medicine.

  19. An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.; Swadley, Casey L.

    2002-01-01

    Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.

  20. Quantitative imaging of volcanic plumes — Results, needs, and future trends

    USGS Publications Warehouse

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-01-01

    Recent technology allows two-dimensional “imaging” of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques that are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques where first field experiments were conducted (LED-LIDAR, tomographic mapping), and describe some techniques where only theoretical studies with application to volcanology exist (e.g. Fabry–Pérot imaging, gas correlation spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  1. Fitting-free algorithm for efficient quantification of collagen fiber alignment in SHG imaging applications.

    PubMed

    Hall, Gunnsteinn; Liang, Wenxuan; Li, Xingde

    2017-10-01

    Collagen fiber alignment derived from second harmonic generation (SHG) microscopy images can be important for disease diagnostics. Image processing algorithms are needed to robustly quantify the alignment in images with high sensitivity and reliability. Fourier transform (FT) magnitude, 2D power spectrum, and image autocorrelation have previously been used to extract fiber information from images by assuming a certain mathematical model (e.g. a Gaussian distribution of the fiber-related parameters) and fitting. The fitting process is slow and fails to converge when the data is not Gaussian. Herein we present an efficient constant-time deterministic algorithm which characterizes the symmetry of the FT magnitude image in terms of a single parameter, named the fiber alignment anisotropy R, ranging from 0 (randomized fibers) to 1 (perfect alignment). This represents an important improvement of the technology and may bring us one step closer to utilizing the technology for various applications in real time. In addition, we present a digital image phantom-based framework for characterizing and validating the algorithm, as well as assessing the robustness of the algorithm against different perturbations.
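A fitting-free anisotropy parameter of this kind can be sketched from second moments of the power spectrum; this is not the paper's exact formula, just one common way to get an R in [0, 1] from the FT magnitude without any model fitting:

```python
import numpy as np

def alignment_anisotropy(img):
    """Fitting-free alignment measure from the FT magnitude.

    Computes second moments of the centred power spectrum and returns a
    normalized eigenvalue anisotropy in [0, 1]: 0 for an isotropic
    (random-fiber) spectrum, 1 for a single dominant orientation.
    """
    f = np.fft.fftshift(np.abs(np.fft.fft2(img - img.mean())))
    n = img.shape[0]
    y, x = np.mgrid[0:n, 0:n] - n // 2
    w = f ** 2
    m = w.sum()
    mxx = (w * x * x).sum() / m
    myy = (w * y * y).sum() / m
    mxy = (w * x * y).sum() / m
    ev = np.linalg.eigvalsh(np.array([[mxx, mxy], [mxy, myy]]))
    return (ev[1] - ev[0]) / (ev[1] + ev[0])

n = 64
yy, xx = np.mgrid[0:n, 0:n]
aligned = np.sin(2 * np.pi * 6 * xx / n)   # perfectly oriented "fibers"
rng = np.random.default_rng(4)
random_img = rng.standard_normal((n, n))   # no preferred orientation
```

Like the paper's R, this is deterministic and constant-time in the sense that it involves no iterative fitting, only one FFT and a 2x2 eigenproblem.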

  2. ScipionCloud: An integrative and interactive gateway for large scale cryo electron microscopy image processing on commercial and academic clouds.

    PubMed

    Cuenca-Alba, Jesús; Del Cano, Laura; Gómez Blanco, Josué; de la Rosa Trevín, José Miguel; Conesa Mingo, Pablo; Marabini, Roberto; S Sorzano, Carlos Oscar; Carazo, Jose María

    2017-10-01

    New instrumentation for cryo electron microscopy (cryoEM) has significantly increased data collection rate as well as data quality, creating bottlenecks at the image processing level. The current image processing model of moving the acquired images from the data source (electron microscope) to desktops or local clusters for processing is encountering many practical limitations. However, computing may also take place in distributed and decentralized environments. In this way, the cloud is a new form of accessing computing and storage resources on demand. Here, we evaluate how this new computational paradigm can be effectively used by extending our current integrative framework for image processing, creating ScipionCloud. This new development has resulted in a full installation of Scipion both in public and private clouds, accessible as public "images", with all the required preinstalled cryoEM software, requiring just a Web browser to access all Graphical User Interfaces. We have profiled the performance of different configurations on Amazon Web Services and the European Federated Cloud, always on architectures incorporating GPUs, and compared them with a local facility. We have also analyzed the economic convenience of different scenarios, so cryoEM scientists have a clearer picture of the setup that is best suited for their needs and budgets.

  3. Image segmentation using hidden Markov Gauss mixture models.

    PubMed

    Pyun, Kyungsuk; Lim, Johan; Won, Chee Sun; Gray, Robert M

    2007-07-01

    Image segmentation is an important tool in image processing and can serve as an efficient front end to sophisticated algorithms and thereby simplify subsequent processing. We develop a multiclass image segmentation method using hidden Markov Gauss mixture models (HMGMMs) and provide examples of segmentation of aerial images and textures. HMGMMs incorporate supervised learning, fitting the observation probability distribution given each class by a Gauss mixture estimated using vector quantization with a minimum discrimination information (MDI) distortion. We formulate the image segmentation problem using a maximum a posteriori criterion and find the hidden states that maximize the posterior density given the observation. We estimate both the hidden Markov parameters and hidden states using a stochastic expectation-maximization algorithm. Our results demonstrate that HMGMM provides better classification in terms of Bayes risk and spatial homogeneity of the classified objects than do several popular methods, including classification and regression trees, learning vector quantization, causal hidden Markov models (HMMs), and multiresolution HMMs. The computational load of HMGMM is similar to that of the causal HMM.
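The maximum a posteriori step can be illustrated in miniature with one Gaussian per class instead of a fitted Gauss mixture, and without the hidden Markov spatial model; everything here is a simplified stand-in for the HMGMM pipeline:

```python
import numpy as np

def map_segment(pixels, class_params, priors):
    """MAP pixel classification with a single Gaussian per class.

    Minimal sketch of the supervised front end: evaluate each class's
    likelihood at every pixel and pick the class maximizing
    posterior = likelihood * prior. The full HMGMM replaces each single
    Gaussian with an MDI-fitted Gauss mixture and adds a hidden Markov
    model over neighbouring labels.
    """
    pixels = np.asarray(pixels, float)
    post = np.stack([
        p * np.exp(-(pixels - mu) ** 2 / (2 * s ** 2)) / s
        for (mu, s), p in zip(class_params, priors)
    ])
    return post.argmax(axis=0)

# Two hypothetical intensity classes: dark (mean 0.0) and bright (mean 1.0).
labels = map_segment([0.1, 0.9, 0.45], [(0.0, 0.2), (1.0, 0.2)], [0.5, 0.5])
```

The hidden Markov layer then trades some of this per-pixel optimality for spatial homogeneity, which is what drives the Bayes-risk results reported above.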

  4. The infection algorithm: an artificial epidemic approach for dense stereo correspondence.

    PubMed

    Olague, Gustavo; Fernández, Francisco; Pérez, Cynthia B; Lutton, Evelyne

    2006-01-01

    We present a new bio-inspired approach applied to a problem of stereo image matching. This approach is based on an artificial epidemic process, which we call the infection algorithm. The problem at hand is a basic one in computer vision for 3D scene reconstruction, has many complex aspects, and is known to be extremely difficult. The aim is to match the contents of two images in order to obtain 3D information that allows the generation of simulated projections from a viewpoint that is different from the ones of the initial photographs. This process is known as view synthesis. The algorithm we propose exploits the image contents in order to produce only the necessary 3D depth information, while saving computational time. It is based on a set of distributed rules, which propagate like an artificial epidemic over the images. Experiments on a pair of real images are presented, and realistic reprojected images have been generated.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wainwright, Haruko M.; Flores Orozco, Adrian; Bucker, Matthias

    In floodplain environments, a naturally reduced zone (NRZ) is considered to be a common biogeochemical hot spot, having distinct microbial and geochemical characteristics. Although important for understanding their role in mediating floodplain biogeochemical processes, mapping the subsurface distribution of NRZs over the dimensions of a floodplain is challenging, as conventional wellbore data are typically spatially limited and the distribution of NRZs is heterogeneous. In this work, we present an innovative methodology for the probabilistic mapping of NRZs within a three-dimensional (3-D) subsurface domain using induced polarization imaging, which is a noninvasive geophysical technique. Measurements consist of surface geophysical surveys and drilling-recovered sediments at the U.S. Department of Energy field site near Rifle, CO (USA). Inversion of surface time domain-induced polarization (TDIP) data yielded 3-D images of the complex electrical resistivity, in terms of magnitude and phase, which are associated with mineral precipitation and other lithological properties. By extracting the TDIP data values colocated with wellbore lithological logs, we found that the NRZs have a different distribution of resistivity and polarization from the other aquifer sediments. To estimate the spatial distribution of NRZs, we developed a Bayesian hierarchical model to integrate the geophysical and wellbore data. In addition, the resistivity images were used to estimate hydrostratigraphic interfaces under the floodplain. Validation results showed that the integration of electrical imaging and wellbore data using a Bayesian hierarchical model was capable of mapping spatially heterogeneous interfaces and NRZ distributions, thereby providing a minimally invasive means to parameterize a hydrobiogeochemical model of the floodplain.

  6. Advanced Image Processing for NASA Applications

    NASA Technical Reports Server (NTRS)

    LeMoign, Jacqueline

    2007-01-01

    The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.

  7. Satellite land remote sensing advancements for the eighties; Proceedings of the Eighth Pecora Symposium, Sioux Falls, SD, October 4-7, 1983

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Among the topics discussed are NASA's land remote sensing plans for the 1980s, the evolution of Landsat 4 and the performance of its sensors, the Landsat 4 thematic mapper image processing system radiometric and geometric characteristics, data quality, image data radiometric analysis and spectral/stratigraphic analysis, and thematic mapper agricultural, forest resource and geological applications. Also covered are geologic applications of side-looking airborne radar, digital image processing, the large format camera, the RADARSAT program, the SPOT 1 system's program status, distribution plans, and simulation program, Space Shuttle multispectral linear array studies of the optical and biological properties of terrestrial land cover, orbital surveys of solar-stimulated luminescence, the Space Shuttle imaging radar research facility, and Space Shuttle-based polar ice sounding altimetry.

  8. Direct visualization of polarization reversal of organic ferroelectric memory transistor by using charge modulated reflectance imaging

    NASA Astrophysics Data System (ADS)

    Otsuka, Takako; Taguchi, Dai; Manaka, Takaaki; Iwamoto, Mitsumasa

    2017-11-01

    By using the charge modulated reflectance (CMR) imaging technique, charge distribution in the pentacene organic field-effect transistor (OFET) with a ferroelectric gate insulator [P(VDF-TrFE)] was investigated in terms of polarization reversal of the P(VDF-TrFE) layer. We studied the polarization reversal process and the carrier spreading process in the OFET channel. The I-V measurement showed a hysteresis behavior caused by the spontaneous polarization of P(VDF-TrFE), but the hysteresis I-V curve changes depending on the applied drain bias, possibly due to the gradual shift of the polarization reversal position in the OFET channel. CMR imaging visualized the gradual shift of the polarization reversal position and showed that the electrostatic field formed by the polarization of P(VDF-TrFE) contributes to hole and electron injection into the pentacene layer and the carrier distribution is significantly dependent on the direction of the polarization. The polarization reversal position in the channel region is governed by the electrostatic potential, and it happens where the potential reaches the coercive voltage of P(VDF-TrFE). The transmission line model developed on the basis of the Maxwell-Wagner effect element analysis well accounts for this polarization reversal process in the OFET channel.

  9. Peer-to-peer architecture for multi-departmental distributed PACS

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Heuberger, Joris; Pysher, Lance; Ratib, Osman

    2006-03-01

    We have elected to explore peer-to-peer technology as an alternative to centralized PACS architecture for the increasing requirements for wide access to images inside and outside a radiology department, the goal being to allow users across the enterprise to access any study anytime without the need for prefetching or routing of images from a central archive. Images can be accessed between different workstations and local storage nodes. We implemented "Bonjour," a remote file access technology developed by Apple allowing applications to share data and files remotely with optimized data access and data transfer. Our open-source image display platform called OsiriX was adapted to allow sharing of local DICOM images through direct access of each local SQL database, accessible from any other OsiriX workstation over the network. A server version of the OsiriX Core Data database also allows access to distributed archive servers in the same way. The infrastructure implemented allows fast and efficient access to any image anywhere, anytime, independently of the actual physical location of the data. It also benefits from the performance of distributed low-cost and high-capacity storage servers that can provide efficient caching of PACS data, which was found to be 10 to 20 times faster than accessing the same data from the central PACS archive. It is particularly suitable for large hospitals and academic environments where clinical conferences, interdisciplinary discussions, and successive sessions of image processing are often part of complex workflow or patient management and decision making.

  10. Line-edge roughness performance targets for EUV lithography

    NASA Astrophysics Data System (ADS)

    Brunner, Timothy A.; Chen, Xuemei; Gabor, Allen; Higgins, Craig; Sun, Lei; Mack, Chris A.

    2017-03-01

    Our paper will use stochastic simulations to explore how EUV pattern roughness can cause device failure through rare events, so-called "black swans". We examine the impact of stochastic noise on the yield of simple wiring patterns with 36nm pitch, corresponding to 7nm node logic, using a local Critical Dimension (CD)-based fail criterion. Contact hole failures are examined in a similar way. For our nominal EUV process, local CD uniformity variation and local Pattern Placement Error variation were observed, but no pattern failures were seen in the modest (few thousand) number of features simulated. We degraded the image quality by incorporating Moving Standard Deviation (MSD) blurring to degrade the Image Log-Slope (ILS), and were able to find conditions where pattern failures were observed. We determined the Line Width Roughness (LWR) value as a function of the ILS. By use of an artificial "step function" image degraded by various MSD blur, we were able to extend the LWR vs ILS curve into regimes that might be available for future EUV imagery. As we decreased the image quality, we observed LWR grow and also began to see pattern failures. For high image quality, we saw CD distributions that were symmetrical and close to Gaussian in shape. Lower image quality caused CD distributions that were asymmetric, with "fat tails" on the low CD side (under-exposed) which were associated with pattern failures. Similar non-Gaussian CD distributions were associated with image conditions that caused missing contact holes, i.e. CD=0.
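The qualitative link between ILS, roughness, and rare failures can be illustrated with a toy model: local dose noise maps to CD error inversely with the image log-slope, so degrading the ILS widens the CD distribution until a local-CD fail criterion starts firing. All numbers are illustrative, and this Gaussian toy does not reproduce the asymmetric fat tails a full stochastic resist simulation shows:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_cds(ils, dose_sigma=0.15, n=100_000, nominal=18.0):
    """Toy stochastic-CD model (not the paper's resist simulator).

    CD error scales as dose noise / ILS; a feature fails when its local
    CD falls below half the nominal value.
    """
    cd = nominal + rng.standard_normal(n) * (dose_sigma / ils) * nominal
    fail_rate = float(np.mean(cd < 0.5 * nominal))  # local-CD fail criterion
    return cd.std(), fail_rate

sigma_good, fail_good = simulate_cds(ils=3.0)   # sharp image: tight CDs
sigma_bad, fail_bad = simulate_cds(ils=0.5)     # degraded ILS: failures appear
```

Even in this crude model, the failure rate stays at zero for the sharp image and becomes appreciable once the CD spread grows, mirroring the onset of pattern failures as image quality degrades.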

  11. Investigation of dust formations in the atmosphere on the basis of satellite observations

    NASA Astrophysics Data System (ADS)

    Ivanchik, M. V.; Kliushnikov, S. I.; Krovotyntsev, V. A.; Serebrennikov, A. N.

    1984-06-01

    A method for the computer processing of space photographs is described which makes it possible to determine dust formations in the atmosphere. Dust formations are identified according to the character of contrast-density distribution. Processed images are compared with actinometric data collected in a dust storm area (Conakry, Guinea, May 1983).

  12. DAX - The Next Generation: Towards One Million Processes on Commodity Hardware.

    PubMed

    Damon, Stephen M; Boyd, Brian D; Plassard, Andrew J; Taylor, Warren; Landman, Bennett A

    2017-01-01

    Large-scale image processing demands not only a standardized way to store data but also a method for job distribution and scheduling. The eXtensible Neuroimaging Archive Toolkit (XNAT) is one of several platforms that seek to solve the storage issue. Distributed Automation for XNAT (DAX) is a job control and distribution manager. Recent massive data projects have revealed several bottlenecks for projects with >100,000 assessors (i.e., data processing pipelines in XNAT). To address these concerns, we have developed a new API that uses a direct connection to the database, rather than REST API calls, to generate assessors. This method, consistent with XNAT, keeps a full history for auditing purposes. Additionally, we have optimized DAX to track processing status on disk (called DISKQ) rather than on XNAT, which greatly reduces load on XNAT by vastly dropping the number of API calls. Finally, we have integrated DAX into a Docker container, with the idea of using it as a Docker controller to launch Docker containers of image processing pipelines. Using our new API, we reduced the time to create 1,000 assessors (a sub-cohort of our case project) from 65,040 seconds to 229 seconds (a more than 270-fold decrease). DISKQ, using pyXnat, allows 400 jobs to be launched in under 10 seconds, which previously took 2,000 seconds. Together these updates position DAX to support projects with hundreds of thousands of scans and to run them in a time-efficient manner.
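    The DISKQ idea — tracking job status as small files on local disk so that polling costs no API call to the archive — can be sketched as below. The file layout and status names are illustrative guesses, not DAX's actual on-disk format.

```python
import os
import tempfile

class DiskQueue:
    """Minimal DISKQ-style queue: one small status file per job on local
    disk, so checking or updating status never touches the archive server."""

    def __init__(self, root):
        self.root = root

    def _path(self, job_id):
        return os.path.join(self.root, job_id + ".status")

    def set_status(self, job_id, status):
        with open(self._path(job_id), "w") as f:
            f.write(status)

    def get_status(self, job_id):
        try:
            with open(self._path(job_id)) as f:
                return f.read()
        except FileNotFoundError:
            return "UNKNOWN"

    def pending(self):
        """List jobs awaiting launch, purely from the local file system."""
        return sorted(
            name[:-7] for name in os.listdir(self.root)
            if name.endswith(".status")
            and self.get_status(name[:-7]) == "PENDING")

with tempfile.TemporaryDirectory() as root:
    q = DiskQueue(root)
    q.set_status("assessor-0001", "PENDING")
    q.set_status("assessor-0002", "RUNNING")
    print(q.pending())  # ['assessor-0001']
```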

  13. DAX - the next generation: towards one million processes on commodity hardware

    NASA Astrophysics Data System (ADS)

    Damon, Stephen M.; Boyd, Brian D.; Plassard, Andrew J.; Taylor, Warren; Landman, Bennett A.

    2017-03-01

    Large-scale image processing demands not only a standardized way to store data but also a method for job distribution and scheduling. The eXtensible Neuroimaging Archive Toolkit (XNAT) is one of several platforms that seek to solve the storage issue. Distributed Automation for XNAT (DAX) is a job control and distribution manager. Recent massive data projects have revealed several bottlenecks for projects with >100,000 assessors (i.e., data processing pipelines in XNAT). To address these concerns, we have developed a new API that uses a direct connection to the database, rather than REST API calls, to generate assessors. This method, consistent with XNAT, keeps a full history for auditing purposes. Additionally, we have optimized DAX to track processing status on disk (called DISKQ) rather than on XNAT, which greatly reduces load on XNAT by vastly dropping the number of API calls. Finally, we have integrated DAX into a Docker container, with the idea of using it as a Docker controller to launch Docker containers of image processing pipelines. Using our new API, we reduced the time to create 1,000 assessors (a sub-cohort of our case project) from 65,040 seconds to 229 seconds (a more than 270-fold decrease). DISKQ, using pyXnat, allows 400 jobs to be launched in under 10 seconds, which previously took 2,000 seconds. Together these updates position DAX to support projects with hundreds of thousands of scans and to run them in a time-efficient manner.

  14. DAX - The Next Generation: Towards One Million Processes on Commodity Hardware

    PubMed Central

    Boyd, Brian D.; Plassard, Andrew J.; Taylor, Warren; Landman, Bennett A.

    2017-01-01

    Large-scale image processing demands not only a standardized way to store data but also a method for job distribution and scheduling. The eXtensible Neuroimaging Archive Toolkit (XNAT) is one of several platforms that seek to solve the storage issue. Distributed Automation for XNAT (DAX) is a job control and distribution manager. Recent massive data projects have revealed several bottlenecks for projects with >100,000 assessors (i.e., data processing pipelines in XNAT). To address these concerns, we have developed a new API that uses a direct connection to the database, rather than REST API calls, to generate assessors. This method, consistent with XNAT, keeps a full history for auditing purposes. Additionally, we have optimized DAX to track processing status on disk (called DISKQ) rather than on XNAT, which greatly reduces load on XNAT by vastly dropping the number of API calls. Finally, we have integrated DAX into a Docker container, with the idea of using it as a Docker controller to launch Docker containers of image processing pipelines. Using our new API, we reduced the time to create 1,000 assessors (a sub-cohort of our case project) from 65,040 seconds to 229 seconds (a more than 270-fold decrease). DISKQ, using pyXnat, allows 400 jobs to be launched in under 10 seconds, which previously took 2,000 seconds. Together these updates position DAX to support projects with hundreds of thousands of scans and to run them in a time-efficient manner. PMID:28919661

  15. Breakup phenomena of a coaxial jet in the non-dilute region using real-time X-ray radiography

    NASA Astrophysics Data System (ADS)

    Cheung, F. B.; Kuo, K. K.; Woodward, R. D.; Garner, K. N.

    1990-07-01

    An innovative approach to the investigation of liquid jet breakup processes in the near-injector region has been developed to overcome the experimental difficulties associated with optically opaque, dense sprays. Real-time X-ray radiography (RTR) has been employed to observe the inner structure and breakup phenomena of coaxial jets. In the atomizing regime, droplets much smaller than the exit diameter are formed beginning essentially at the injector exit. Through the use of RTR, the instantaneous contour of the liquid core was visualized. Experimental results consist of controlled-exposure digital video images of the liquid jet breakup process. Time-averaged video images have also been recorded for comparison. A digital image processing system is used to analyze the recorded images by creating radiance level distributions of the jet. A rudimentary method for deducing intact-liquid-core length has been suggested. The technique of real-time X-ray radiography has been shown to be a viable approach to the study of the breakup processes of high-speed liquid jets.

  16. Smartphone snapshot mapping of skin chromophores under triple-wavelength laser illumination

    NASA Astrophysics Data System (ADS)

    Spigulis, Janis; Oshina, Ilze; Berzina, Anna; Bykov, Alexander

    2017-09-01

    Chromophore distribution maps are useful tools for assessing the severity of skin malformations and for monitoring skin recovery after burns, surgeries, and other interventions. Chromophore maps can be obtained by processing several spectral images of skin, e.g., captured by hyperspectral or multispectral cameras over seconds or even minutes. To avoid motion artifacts and simplify the procedure, a single-snapshot technique is proposed and examined for mapping melanin, oxyhemoglobin, and deoxyhemoglobin of in-vivo skin with a smartphone under simultaneous three-wavelength (448-532-659 nm) laser illumination. Three monochromatic spectral images corresponding to the illumination wavelengths were extracted from the smartphone camera RGB image data set, accounting for crosstalk between the RGB detection bands. The spectral images were further processed according to Beer's law in a three-chromophore approximation. Photon absorption path lengths in skin at the exploited wavelengths were estimated by means of Monte Carlo simulations. The technique was validated clinically on three kinds of skin lesions: nevi, hemangiomas, and seborrheic keratosis. The design of the developed add-on laser illumination system, image-processing details, and the results of clinical measurements are presented and discussed.
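    The three-chromophore Beer's-law step amounts to solving, per pixel, a 3x3 linear system: absorbance at each of the three wavelengths equals a weighted sum of the three chromophore amounts. A minimal sketch, with a made-up (not clinically calibrated) coefficient matrix:

```python
def solve3(m, b):
    """Solve a 3x3 linear system m @ c = b by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = b[i]  # replace column j with the right-hand side
        out.append(det(mj) / d)
    return out

# Illustrative extinction-times-path-length matrix (invented values):
# rows = 448, 532, 659 nm; columns = melanin, oxyhemoglobin, deoxyhemoglobin.
E = [[1.9, 0.6, 0.5],
     [1.1, 1.3, 1.2],
     [0.6, 0.1, 0.9]]

# Per-pixel absorbances A = -ln(I/I0) at the three laser wavelengths.
A = [1.04, 1.07, 0.45]
c = solve3(E, A)  # relative chromophore amounts for this pixel
print([round(x, 2) for x in c])  # [0.4, 0.3, 0.2]
```

    In the real method the matrix entries come from chromophore extinction spectra combined with the Monte Carlo path-length estimates; here they only demonstrate the inversion.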

  17. Non-Destructive Monitoring of Charge-Discharge Cycles on Lithium Ion Batteries using 7Li Stray-Field Imaging

    PubMed Central

    Tang, Joel A.; Dugar, Sneha; Zhong, Guiming; Dalal, Naresh S.; Zheng, Jim P.; Yang, Yong; Fu, Riqiang

    2013-01-01

    Magnetic resonance imaging provides a noninvasive method for in situ monitoring of electrochemical processes involved in charge/discharge cycling of batteries. Determining how the electrochemical processes become irreversible, ultimately resulting in degraded battery performance, will aid in developing new battery materials and designing better batteries. Here we introduce the use of an alternative in situ diagnostic tool to monitor the electrochemical processes. Utilizing a very large field gradient in the fringe field of a magnet, the stray-field imaging (STRAFI) technique significantly improves the image resolution. These STRAFI images enable real-time monitoring of the electrodes at the micron level. It is demonstrated by two prototype half-cells, graphite∥Li and LiFePO4∥Li, that the high-resolution 7Li STRAFI profiles allow one to visualize in situ Li-ion transfer between the electrodes during charge/discharge cycling as well as the formation and changes of irreversible microstructures of the Li components, and particularly reveal a non-uniform Li-ion distribution in the graphite. PMID:24005580

  18. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    PubMed

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547 (1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator-dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors when correcting line shape distortions. These occur due to differences between the water reference and the metabolite distributions.
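    The core of water reference deconvolution can be sketched in the time domain: the metabolite FID is multiplied point-by-point by the ratio of an ideal reference decay to the measured water FID, so that lineshape and frequency distortions common to both signals cancel. The decay rates and the 12 Hz instrumental shift below are invented demonstration values, not parameters from the paper.

```python
import cmath

N, dt = 256, 1e-3
ideal_r2 = 5.0                       # target (ideal) reference decay, 1/s
distortion = 2j * cmath.pi * 12.0    # assumed 12 Hz instrumental shift

t = [k * dt for k in range(N)]
# Measured water reference carries the same instrumental distortion
# as the metabolite FID.
water = [cmath.exp((-ideal_r2 + distortion) * tk) for tk in t]
metab = [cmath.exp((-20.0 + distortion) * tk) for tk in t]
ideal = [cmath.exp(-ideal_r2 * tk) for tk in t]

# Deconvolution: multiply by ideal reference, divide by measured reference.
corrected = [m * i / w for m, i, w in zip(metab, ideal, water)]

# The shared 12 Hz shift cancels: the corrected FID is a pure real decay.
print(max(abs(c.imag) for c in corrected) < 1e-9)  # True
```

    The caveat noted in the abstract shows up here too: cancellation is only exact when the distortion really is common to the water and metabolite signals; spatial differences between their distributions leave residual lineshape errors.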

  19. Networked vision system using a Prolog controller

    NASA Astrophysics Data System (ADS)

    Batchelor, B. G.; Caton, S. J.; Chatburn, L. T.; Crowther, R. A.; Miller, J. W. V.

    2005-11-01

    Prolog offers a very different style of programming compared to conventional languages; it can define object properties and abstract relationships in a way that Java, C, C++, etc. find awkward. In an accompanying paper, the authors describe how a distributed web-based vision system can be built using elements that may even be located on different continents. One particular system of this general type is described here. The top-level controller is a Prolog program, which operates one or more image processing engines. This type of function is natural to Prolog, since it is able to reason logically using symbolic (non-numeric) data. Although Prolog is not suitable for programming image processing functions directly, it is ideal for analysing the results derived by an image processor. This article describes the implementation of two systems in which a Prolog program controls several image processing engines, a simple robot, a pneumatic pick-and-place arm, LED illumination modules, and various mains-powered devices.

  20. From image captioning to video summary using deep recurrent networks and unsupervised segmentation

    NASA Astrophysics Data System (ADS)

    Morosanu, Bogdan-Andrei; Lemnaru, Camelia

    2018-04-01

    Automatic captioning systems based on recurrent neural networks have been tremendously successful at providing realistic natural language captions for complex and varied image data. We explore methods for adapting existing models trained on large image caption data sets to a similar problem, that of summarising videos using natural language descriptions and frame selection. These architectures create internal high level representations of the input image that can be used to define probability distributions and distance metrics on these distributions. Specifically, we interpret each hidden unit inside a layer of the caption model as representing the un-normalised log probability of some unknown image feature of interest for the caption generation process. We can then apply well understood statistical divergence measures to express the difference between images and create an unsupervised segmentation of video frames, classifying consecutive images of low divergence as belonging to the same context, and those of high divergence as belonging to different contexts. To provide a final summary of the video, we provide a group of selected frames and a text description accompanying them, allowing a user to perform a quick exploration of large unlabeled video databases.
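    The segmentation step described above — treating hidden activations as un-normalised log probabilities, then cutting the video where consecutive frames diverge — can be sketched directly. The softmax/KL choice and the threshold below are assumptions for illustration; the paper considers a family of divergence measures.

```python
import math

def softmax(logits):
    """Hidden-unit activations interpreted as un-normalised log probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q):
    """KL divergence between two frame representations."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def segment(frames, threshold=0.5):
    """Label each frame with a context id; a new context starts whenever
    consecutive frames diverge strongly. Threshold is an assumed knob."""
    dists = [softmax(f) for f in frames]
    labels, ctx = [0], 0
    for prev, cur in zip(dists, dists[1:]):
        if kl(cur, prev) > threshold:
            ctx += 1
        labels.append(ctx)
    return labels

# Toy hidden-unit activations: two similar frames, then an abrupt change.
frames = [[2.0, 0.1, 0.1], [2.1, 0.2, 0.1], [0.1, 0.1, 3.0]]
print(segment(frames))  # [0, 0, 1]
```

    A summary would then pick one representative frame per context label and caption it with the existing caption model.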

  1. Development of a user-friendly system for image processing of electron microscopy by integrating a web browser and PIONE with Eos.

    PubMed

    Tsukamoto, Takafumi; Yasunaga, Takuo

    2014-11-01

    Eos (Extensible object-oriented system) is a powerful application suite for image processing of electron micrographs. Eos normally offers only character user interfaces (CUI) under operating systems such as OS X or Linux, which is not user-friendly: users need to be experts in image processing of electron micrographs and to have some knowledge of computer science as well, yet not everyone who needs Eos is comfortable with a CUI. We therefore extended Eos into an OS-independent web system with graphical user interfaces (GUI) by integrating a web browser. The advantage of using a web browser is not only that it gives Eos a GUI, but also that it lets Eos work in a distributed computational environment. Using Ajax (Asynchronous JavaScript and XML) technology, we implemented a more comfortable user interface in the web browser. Eos has more than 400 commands related to image processing for electron microscopy, each with its own usage. Since the beginning of development, Eos has managed its user interfaces through an interface definition file called "OptionControlFile", written in CSV (Comma-Separated Value) format: each command has an OptionControlFile that records the information needed to generate its interface. The developed GUI system, called "Zephyr" (Zone for Easy Processing of HYpermedia Resources), also reads the OptionControlFile and produces a web user interface automatically, because this mechanism is mature and convenient. The basic client-side actions were implemented properly and support auto-generation of web forms, with functions for execution, image preview, and file upload to a web server. The system can thus execute Eos commands with the options unique to each command and carry out image analysis.
    Two problems remained: the image file format for visualization, and a workspace for analysis. The file format information is needed to check whether an input/output file is correct, and a common workspace is needed because the client is physically separated from the server. We solved the file format problem by extending the rules of the Eos OptionControlFile. To solve the workspace problem, we developed two types of systems. The first uses only the local environment: the user runs a web server provided by Eos, accesses a web client through a browser, and manipulates local files through the GUI in the browser. The second employs PIONE (Process-rule for Input/Output Negotiation Environment), our platform for heterogeneous distributed environments. Users can put their resources, such as microscopic images and text files, into the server-side environment supported by PIONE, and experts can write PIONE rule definitions describing a workflow of image processing. PIONE runs each image processing step on a suitable computer, following the defined rules. PIONE supports interactive manipulation, so a user can try a command with various setting values; here, too, we contribute auto-generation of the GUI for a PIONE workflow. As an advanced function, we developed a module to log user actions. The logs include information such as the setting values used in image processing and the sequence of commands. Used effectively, such logs offer many advantages: for example, when an expert discovers some know-how of image processing, other users can share the logs containing it, and by analyzing logs we may derive recommended workflows for image analysis. To implement a social platform of image processing for electron microscopists, we have developed the system infrastructure as well. © The Author 2014. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved.
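    The auto-generation idea — parse each command's CSV interface definition and emit a web form — can be sketched as below. The column layout (flag, type, default, description) and the command name are guesses for illustration, not Eos's actual OptionControlFile schema.

```python
import csv
import io

# Hypothetical OptionControlFile contents for one command:
# flag, value type, default, description.
OPTION_CONTROL = """\
-i,filename,,input image file
-o,filename,,output image file
-sigma,float,1.5,smoothing width
"""

def generate_form(command, option_csv):
    """Turn a CSV interface definition into an HTML form, the way the
    Zephyr layer auto-generates web interfaces from OptionControlFiles."""
    rows = csv.reader(io.StringIO(option_csv))
    fields = []
    for flag, ftype, default, desc in rows:
        itype = "number" if ftype == "float" else "text"
        fields.append(
            f'<label>{desc} ({flag}) '
            f'<input name="{flag}" type="{itype}" value="{default}"></label>')
    body = "\n".join(fields)
    return f'<form action="/run/{command}" method="post">\n{body}\n</form>'

html = generate_form("mrcImageSomeFilter", OPTION_CONTROL)
print('name="-sigma"' in html)  # True
```

    Because every command carries its own definition file, one generator covers all 400+ commands without per-command GUI code.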

  2. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    DOE PAGES

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; ...

    2016-07-08

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. Additionally, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  3. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy.

    PubMed

    Tremsin, Anton S; Gao, Yan; Dial, Laura C; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  4. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    NASA Astrophysics Data System (ADS)

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with 100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  5. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. Additionally, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components.

  6. Investigation of microstructure in additive manufactured Inconel 625 by spatially resolved neutron transmission spectroscopy

    PubMed Central

    Tremsin, Anton S.; Gao, Yan; Dial, Laura C.; Grazzi, Francesco; Shinohara, Takenao

    2016-01-01

    Non-destructive testing techniques based on neutron imaging and diffraction can provide information on the internal structure of relatively thick metal samples (up to several cm), which are opaque to other conventional non-destructive methods. Spatially resolved neutron transmission spectroscopy is an extension of traditional neutron radiography, where multiple images are acquired simultaneously, each corresponding to a narrow range of energy. The analysis of transmission spectra enables studies of bulk microstructures at the spatial resolution comparable to the detector pixel. In this study we demonstrate the possibility of imaging (with ~100 μm resolution) distribution of some microstructure properties, such as residual strain, texture, voids and impurities in Inconel 625 samples manufactured with an additive manufacturing method called direct metal laser melting (DMLM). Although this imaging technique can be implemented only in a few large-scale facilities, it can be a valuable tool for optimization of additive manufacturing techniques and materials and for correlating bulk microstructure properties to manufacturing process parameters. In addition, the experimental strain distribution can help validate finite element models which many industries use to predict the residual stress distributions in additive manufactured components. PMID:27877885

  7. Hadoop-based implementation of processing medical diagnostic records for visual patient system

    NASA Astrophysics Data System (ADS)

    Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo

    2018-03-01

    We introduced the Visual Patient (VP) concept and a method to visually represent and index patient imaging diagnostic records (IDR) at last year's SPIE Medical Imaging (SPIE MI 2017); the approach enables a doctor to review a large amount of IDR for a patient within a limited appointment time slot. Here, we present a new data processing architecture for the VP system (VPS) that acquires, processes, and stores various kinds of IDR to build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. The system architecture, called the Medical Information Processing System (MIPS), combines the Hadoop batch processing architecture with the Storm stream processing architecture. The MIPS implements parallel processing of various kinds of clinical data with high efficiency, drawn from disparate hospital information systems such as PACS, RIS, LIS, and HIS.

  8. A simultaneous beta and coincidence-gamma imaging system for plant leaves

    NASA Astrophysics Data System (ADS)

    Ranjbar, Homayoon; Wen, Jie; Mathews, Aswin J.; Komarov, Sergey; Wang, Qiang; Li, Ke; O'Sullivan, Joseph A.; Tai, Yuan-Chuan

    2016-05-01

    Positron emitting isotopes, such as 11C, 13N, and 18F, can be used to label molecules. The tracers, such as 11CO2, are delivered to plants to study their biological processes, particularly metabolism and photosynthesis, which may contribute to the development of plants that have a higher yield of crops and biomass. Measurements and resulting images from PET scanners are not quantitative in young plant structures or in plant leaves due to poor positron annihilation in thin objects. To address this problem we have designed, assembled, modeled, and tested a nuclear imaging system (simultaneous beta-gamma imager). The imager can simultaneously detect positrons (β+) and coincidence-gamma rays (γ). The imaging system employs two planar detectors; one is a regular gamma detector which has a LYSO crystal array, and the other is a phoswich detector which has an additional BC-404 plastic scintillator for beta detection. A forward model for positrons is proposed along with a joint image reconstruction formulation to utilize the beta and coincidence-gamma measurements for estimating radioactivity distribution in plant leaves. The joint reconstruction algorithm first reconstructs beta and gamma images independently to estimate the thickness component of the beta forward model and afterward jointly estimates the radioactivity distribution in the object. We have validated the physics model and reconstruction framework through a phantom imaging study and imaging a tomato leaf that has absorbed 11CO2. The results demonstrate that the simultaneously acquired beta and coincidence-gamma data, combined with our proposed joint reconstruction algorithm, improved the quantitative accuracy of estimating radioactivity distribution in thin objects such as leaves. We used the structural similarity (SSIM) index for comparing the leaf images from the simultaneous beta-gamma imager with the ground truth image. 
The jointly reconstructed images yield SSIM indices of 0.69 and 0.63, whereas the separately reconstructed beta alone and gamma alone images had indices of 0.33 and 0.52, respectively.
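    The SSIM comparison used above can be sketched in its single-window (global) form: one luminance/contrast/structure statistic over the whole image, with the standard stabilising constants. The full SSIM index averages this statistic over local windows; the pixel values below are invented demonstration data.

```python
import statistics

def ssim_global(x, y, data_range=1.0):
    """Global (single-window) SSIM between two equal-size images given as
    flat lists of pixel values. C1/C2 use the conventional (0.01*L)^2 and
    (0.03*L)^2 stabilising constants."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx = statistics.fmean((xi - mx) ** 2 for xi in x)
    vy = statistics.fmean((yi - my) ** 2 for yi in y)
    cov = statistics.fmean((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return ((2 * mx * my + c1) * (2 * cov + c2) /
            ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

truth = [0.0, 0.2, 0.8, 1.0, 0.9, 0.1]   # ground-truth activity (toy data)
close = [0.0, 0.25, 0.75, 1.0, 0.85, 0.1]  # good reconstruction
noisy = [0.5, 0.1, 0.9, 0.2, 0.3, 0.8]     # poor reconstruction

print(ssim_global(truth, truth))                              # 1.0
print(ssim_global(truth, close) > ssim_global(truth, noisy))  # True
```

    An SSIM near 1 means the reconstruction preserves the ground-truth structure, which is how the jointly reconstructed images (0.69, 0.63) are ranked above the beta-alone (0.33) and gamma-alone (0.52) results.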

  9. A simultaneous beta and coincidence-gamma imaging system for plant leaves.

    PubMed

    Ranjbar, Homayoon; Wen, Jie; Mathews, Aswin J; Komarov, Sergey; Wang, Qiang; Li, Ke; O'Sullivan, Joseph A; Tai, Yuan-Chuan

    2016-05-07

    Positron emitting isotopes, such as (11)C, (13)N, and (18)F, can be used to label molecules. The tracers, such as (11)CO2, are delivered to plants to study their biological processes, particularly metabolism and photosynthesis, which may contribute to the development of plants that have a higher yield of crops and biomass. Measurements and resulting images from PET scanners are not quantitative in young plant structures or in plant leaves due to poor positron annihilation in thin objects. To address this problem we have designed, assembled, modeled, and tested a nuclear imaging system (simultaneous beta-gamma imager). The imager can simultaneously detect positrons (β+) and coincidence-gamma rays (γ). The imaging system employs two planar detectors; one is a regular gamma detector which has a LYSO crystal array, and the other is a phoswich detector which has an additional BC-404 plastic scintillator for beta detection. A forward model for positrons is proposed along with a joint image reconstruction formulation to utilize the beta and coincidence-gamma measurements for estimating radioactivity distribution in plant leaves. The joint reconstruction algorithm first reconstructs beta and gamma images independently to estimate the thickness component of the beta forward model and afterward jointly estimates the radioactivity distribution in the object. We have validated the physics model and reconstruction framework through a phantom imaging study and imaging a tomato leaf that has absorbed (11)CO2. The results demonstrate that the simultaneously acquired beta and coincidence-gamma data, combined with our proposed joint reconstruction algorithm, improved the quantitative accuracy of estimating radioactivity distribution in thin objects such as leaves. We used the structural similarity (SSIM) index for comparing the leaf images from the simultaneous beta-gamma imager with the ground truth image. 
The jointly reconstructed images yield SSIM indices of 0.69 and 0.63, whereas the separately reconstructed beta alone and gamma alone images had indices of 0.33 and 0.52, respectively.

  10. High-Contrast Near-Infrared Imaging Polarimetry of the Protoplanetary Disk around RY Tau

    NASA Technical Reports Server (NTRS)

    Takami, Michihiro; Karr, Jennifer L.; Hashimoto, Jun; Kim, Hyosun; Wisniewski, John; Henning, Thomas; Grady, Carol; Kandori, Ryo; Hodapp, Klaus W.; Kudo, Tomoyuki; et al.

    2013-01-01

    We present near-infrared coronagraphic imaging polarimetry of RY Tau. The scattered light in the circumstellar environment was imaged at H band at high resolution (approx. 0.05 arcsec) for the first time, using Subaru-HiCIAO. The observed polarized intensity (PI) distribution shows a butterfly-like distribution of bright emission with an angular scale similar to the disk observed at millimeter wavelengths. This distribution is offset toward the blueshifted jet, indicating the presence of a geometrically thick disk or a remnant envelope, and therefore the earliest stage of the Class II evolutionary phase. We perform comparisons between the observed PI distribution and disk models with (1) a full radiative transfer code, using the spectral energy distribution (SED) to constrain the disk parameters; and (2) monochromatic simulations of scattered light, which explore a wide parameter space to constrain the disk and dust parameters. We show that these models cannot consistently explain the observed PI distribution, the SED, and the viewing angle inferred from millimeter interferometry. We suggest that the scattered light in the near-infrared is associated with an optically thin and geometrically thick layer above the disk surface, with the surface responsible for the infrared SED. Half of the scattered light and thermal radiation in this layer illuminates the disk surface, and this process may significantly affect the thermal structure of the disk.

  11. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻/Å²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope, and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
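    As a hedged illustration of the sub-sample-then-inpaint idea (not the sophisticated dictionary-learning inpainting the authors use), one can randomly retain a fraction of pixel dwell positions and fill the rest by iterative diffusion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Smooth synthetic "micrograph" (periodic, so wrap-around neighbours are valid).
yy, xx = np.mgrid[0:64, 0:64]
img = np.sin(2 * np.pi * xx / 64) * np.cos(2 * np.pi * yy / 64)

# Random (non-adaptive) sub-sampling: keep only 20% of the pixel positions.
mask = rng.random(img.shape) < 0.20

# Naive inpainting by iterative diffusion: repeatedly replace unmeasured pixels
# with the average of their four neighbours (measured pixels stay fixed).
recon = np.where(mask, img, img[mask].mean())
for _ in range(300):
    avg = (np.roll(recon, 1, 0) + np.roll(recon, -1, 0) +
           np.roll(recon, 1, 1) + np.roll(recon, -1, 1)) / 4.0
    recon[~mask] = avg[~mask]

err = np.abs(recon - img)[~mask].mean()
print(f"mean reconstruction error on unmeasured pixels: {err:.3f}")
```

    Diffusion fill only works for smooth signals; the attraction of learned inpainting is that it recovers atomic-scale texture that diffusion would blur away.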

  12. Multifaceted free-space image distributor for optical interconnects in massively parallel processing

    NASA Astrophysics Data System (ADS)

    Zhao, Feng; Frietman, Edward E. E.; Han, Zhong; Chen, Ray T.

    1999-04-01

    A characteristic feature of a conventional von Neumann computer is that computing power is delivered by a single processing unit. Although increasing the clock frequency improves the performance of the computer, the switching speed of the semiconductor devices and the finite speed at which electrical signals propagate along the bus set the boundaries. Architectures containing large numbers of nodes can solve this performance dilemma, although the main obstacles in designing such systems are the difficulty of ensuring efficient communication among the nodes. Data exchange becomes a real bottleneck when all nodes are connected by a shared resource. Only optics, due to its inherent parallelism, could solve that bottleneck. Here, we explore a multifaceted free-space image distributor to be used in optical interconnects for massively parallel processing. In this paper, physical and optical models of the image distributor are developed, from the diffraction theory of light waves to optical simulations. The general features and the performance of the image distributor are also described, and the new structure of an image distributor and the simulations for it are discussed. From the digital simulations and experiments, it is found that the multifaceted free-space image distributing technique is quite suitable for free-space optical interconnection in massively parallel processing, and that the new structure of the multifaceted free-space image distributor performs better.

  13. High-Throughput Image Analysis of Fibrillar Materials: A Case Study on Polymer Nanofiber Packing, Alignment, and Defects in Organic Field Effect Transistors.

    PubMed

    Persson, Nils E; Rafshoon, Joshua; Naghshpour, Kaylie; Fast, Tony; Chu, Ping-Hsun; McBride, Michael; Risteen, Bailey; Grover, Martha; Reichmanis, Elsa

    2017-10-18

    High-throughput discovery of process-structure-property relationships in materials through an informatics-enabled empirical approach is an increasingly utilized technique in materials research due to the rapidly expanding availability of data. Here, process-structure-property relationships are extracted for the nucleation, growth, and deposition of semiconducting poly(3-hexylthiophene) (P3HT) nanofibers used in organic field effect transistors, via high-throughput image analysis. This study is performed using an automated image analysis pipeline combining existing open-source software and new algorithms, enabling the rapid evaluation of structural metrics for images of fibrillar materials, including local orientational order, fiber length density, and fiber length distributions. We observe that microfluidic processing leads to fibers that pack with unusually high density, while sonication yields fibers that pack sparsely with low alignment. This is attributed to differences in their crystallization mechanisms. P3HT nanofiber packing during thin film deposition exhibits behavior suggesting that fibers are confined to packing in two-dimensional layers. We find that fiber alignment, a feature correlated with charge carrier mobility, is driven by increasing fiber length, and that shorter fibers tend to segregate to the buried dielectric interface during deposition, creating potentially performance-limiting defects in alignment. Another barrier to perfect alignment is the curvature of P3HT fibers; we propose a mechanistic simulation of fiber growth that reconciles both this curvature and the log-normal distribution of fiber lengths inherent to the fiber populations under consideration.
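    The log-normal fiber-length distribution mentioned above can be fit by maximum likelihood simply by fitting a normal distribution to the log-lengths. The sample below is synthetic, with invented parameters, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic fiber-length sample (µm); the log-normal parameters (1.0, 0.4)
# are invented for illustration.
lengths = rng.lognormal(mean=1.0, sigma=0.4, size=5000)

# Maximum-likelihood fit of a log-normal: fit a normal to log(lengths).
logs = np.log(lengths)
mu_hat, sigma_hat = logs.mean(), logs.std()

print(f"mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")   # close to (1.0, 0.4)
print(f"median length = {np.exp(mu_hat):.2f} µm")      # exp(mu) is the median
```

    A quantile-quantile plot of `logs` against a normal distribution is the usual visual check that the log-normal model is appropriate for a measured fiber population.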

  14. Systems and methods for optically measuring properties of hydrocarbon fuel gases

    DOEpatents

    Adler-Golden, S.; Bernstein, L.S.; Bien, F.; Gersh, M.E.; Goldstein, N.

    1998-10-13

    A system and method for optical interrogation and measurement of a hydrocarbon fuel gas includes a light source generating light at near-visible wavelengths. A cell containing the gas is optically coupled to the light source, and the light is partially transmitted by the gas sample. A spectrometer disperses the transmitted light and captures an image thereof. The image is captured by a low-cost silicon-based two-dimensional CCD array. The captured spectral image is processed by electronics for determining the energy (BTU) content and composition of the gas. The innovative optical approach provides a relatively inexpensive, durable, maintenance-free sensor and method which is reliable in the field and relatively simple to calibrate. In view of the above, accurate monitoring is possible at a plurality of locations along the distribution chain, leading to more efficient distribution. 14 figs.
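    A minimal sketch of how a captured absorbance spectrum could be unmixed into gas composition and energy content via least squares, assuming Beer-Lambert linearity. The absorptivity matrix is random and the heating values are approximate literature figures; nothing here is taken from the patent:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical absorptivity matrix: rows = 40 wavelength bins, columns =
# methane, ethane, propane.  Values are invented for illustration only.
E = rng.random((40, 3))
c_true = np.array([0.90, 0.07, 0.03])          # true mole fractions

# Measured absorbance spectrum (Beer-Lambert, unit path length) plus noise.
absorbance = E @ c_true + 1e-3 * rng.normal(size=40)

# Least-squares unmixing recovers the composition from the spectrum.
c_hat, *_ = np.linalg.lstsq(E, absorbance, rcond=None)

# Approximate volumetric higher heating values (MJ/m^3) give energy content.
hv = np.array([39.8, 70.3, 101.0])
print("estimated composition:", np.round(c_hat, 3))
print(f"estimated heating value: {c_hat @ hv:.1f} MJ/m^3")
```

    A real instrument would calibrate the absorptivity matrix against reference gas mixtures and may add nonlinear or temperature-dependent corrections.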

  15. Lessons Learned through the Development and Publication of AstroImageJ

    NASA Astrophysics Data System (ADS)

    Collins, Karen

    2018-01-01

    As lead author of the scientific image processing software package AstroImageJ (AIJ), I will discuss the reasoning behind why we decided to release AIJ to the public, and the lessons we learned related to the development, publication, distribution, and support of AIJ. I will also summarize the AIJ code language selection, code documentation and testing approaches, code distribution, update, and support facilities used, and the code citation and licensing decisions. Since AIJ was initially developed as part of my graduate research and was my first scientific open source software publication, many of my experiences and difficulties encountered may parallel those of others new to scientific software publication. Finally, I will discuss the benefits and disadvantages of releasing scientific software that I now recognize after having AIJ in the public domain for more than five years.

  16. Systems and methods for optically measuring properties of hydrocarbon fuel gases

    DOEpatents

    Adler-Golden, Steven; Bernstein, Lawrence S.; Bien, Fritz; Gersh, Michael E.; Goldstein, Neil

    1998-10-13

    A system and method for optical interrogation and measurement of a hydrocarbon fuel gas includes a light source generating light at near-visible wavelengths. A cell containing the gas is optically coupled to the light source, and the light is partially transmitted by the gas sample. A spectrometer disperses the transmitted light and captures an image thereof. The image is captured by a low-cost silicon-based two-dimensional CCD array. The captured spectral image is processed by electronics for determining the energy (BTU) content and composition of the gas. The innovative optical approach provides a relatively inexpensive, durable, maintenance-free sensor and method which is reliable in the field and relatively simple to calibrate. In view of the above, accurate monitoring is possible at a plurality of locations along the distribution chain, leading to more efficient distribution.

  17. PET-radioimmunodetection of integrins: imaging acute colitis using a ⁶⁴Cu-labeled anti-β₇ integrin antibody.

    PubMed

    Dearling, Jason L J; Packard, Alan B

    2012-01-01

    Integrins are involved in a wide range of cell interactions. Imaging their distribution using high-resolution noninvasive techniques that are directly translatable to the clinic can provide new insights into disease processes and presents the opportunity to directly monitor new therapies. In this chapter, we describe a protocol to image the in vivo distribution of the integrin β(7), expressed by lymphocytes recruited to and retained by the inflamed gut, using a radiolabeled whole antibody. The antibody is purified, conjugated with a bifunctional chelator for labeling with a radiometal, labeled with the positron-emitting radionuclide (64)Cu, and injected into mice for microPET studies. Mice with DSS-induced colitis were found to have higher uptake of the (64)Cu-labeled antibody in the gut than control groups.

  18. Microstructure Evolution and Composition Control during the Processing of Thin-gage Metallic Foil (Preprint)

    DTIC Science & Technology

    2012-02-01

    the presence of somewhat randomly-distributed carbides and borides (white particles in BSE images), this grain size was comparable to that observed...pinned by carbide/boride particles (imaging white in Figure 8c). The very fine gamma-prime precipitates likely produced during magnetron sputtering...sputtered material. First, the carbide/boride particles were nucleated and hence located preferentially at the grain boundaries in the sputtered

  19. Image Guided Biodistribution and Pharmacokinetic Studies of Theranostics

    PubMed Central

    Ding, Hong; Wu, Fang

    2012-01-01

    Image-guided techniques are playing an increasingly important role in the investigation of the biodistribution and pharmacokinetics of drugs or drug delivery systems in various diseases, especially cancers. Besides anatomical imaging modalities such as computed tomography (CT) and magnetic resonance imaging (MRI), molecular imaging strategies including optical imaging, positron emission tomography (PET), and single-photon emission computed tomography (SPECT) facilitate the localization and quantification of radioisotope- or optical-probe-labeled nanoparticle delivery systems in the category of theranostics. This review summarizes the quantitative measurement of the biodistribution and pharmacokinetics of theranostics in the fields of new drug/probe development, diagnosis, and treatment-process monitoring, as well as the tracking of blood-brain barrier (BBB) penetration by highly sensitive imaging methods, and the applications of the representative imaging modalities. PMID:23227121

  20. Advanced Image Processing for Defect Visualization in Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Plotnikov, Yuri A.; Winfree, William P.

    1997-01-01

    Results of a defect visualization process based on pulse infrared thermography are presented. Algorithms have been developed to reduce the amount of operator participation required in the process of interpreting thermographic images. The algorithms determine the defect's depth and size from the temporal and spatial thermal distributions that exist on the surface of the investigated object following thermal excitation. A comparison of the results from thermal contrast, time derivative, and phase analysis methods for defect visualization is presented. These comparisons are based on three-dimensional simulations of a test case representing a plate with multiple delaminations. Comparisons are also based on experimental data obtained from a specimen with flat-bottom holes and a composite panel with delaminations.
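    The depth dependence exploited by thermal-contrast methods can be illustrated with the classic 1D reflected-thermal-wave series: the normalized contrast above a defect at depth L rises later for deeper defects. The diffusivity and depths below are illustrative values, not taken from the paper:

```python
import numpy as np

# 1D pulse-thermography model: the surface temperature above a defect at depth
# L deviates from the semi-infinite (sound) response by reflected thermal
# waves; the normalized contrast is 2 * sum_n exp(-n^2 L^2 / (alpha t)).
alpha = 1e-7        # thermal diffusivity, m^2/s (illustrative value)
t = np.linspace(0.01, 200.0, 5000)

def contrast(L, n_terms=5):
    return 2 * sum(np.exp(-n**2 * L**2 / (alpha * t))
                   for n in range(1, n_terms + 1))

def half_rise_time(L):
    """Time at which the contrast first reaches half its late-time value."""
    c = contrast(L)
    return t[np.argmax(c >= 0.5 * c[-1])]

t_shallow = half_rise_time(0.5e-3)   # hypothetical defect at 0.5 mm
t_deep = half_rise_time(1.0e-3)      # hypothetical defect at 1.0 mm
print(t_shallow, t_deep)             # the deeper defect responds later
```

    Depth-inversion algorithms run this logic in reverse: a characteristic time extracted from the measured contrast curve, which scales as L²/α, yields an estimate of the defect depth.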

  1. What can we learn from in-soil imaging of a live plant: X-ray Computed Tomography and 3D numerical simulation of root-soil system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaofan; Varga, Tamas; Liu, Chongxuan

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for farming, forest management, and climate change. X-ray computed tomography (XCT) has proven to be an effective tool for non-invasive root imaging and analysis. A combination of XCT, open-source software, and our own code was used to noninvasively image a prairie dropseed (Sporobolus heterolepis) specimen, segment the root data to obtain a 3D image of the root structure at 31 µm resolution, and extract quantitative information (root volume and surface area) from the 3D data, respectively. Based on the mesh generated from the root structure, computational fluid dynamics (CFD) simulations were applied to numerically investigate the root-soil-groundwater system. The plant root conductivity, soil hydraulic conductivity, and transpiration rate were shown to control the groundwater distribution. The flow variability and soil water distributions under different scenarios were investigated. Parameterizations were evaluated to show their impacts on the average conductivity. The pore-scale modeling approach provides realistic simulations of rhizosphere flow processes and yields useful information that can be linked to upscaled models.
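    The quantitative step (root volume and surface area from the segmented 3D data) can be sketched by voxel counting. This is a minimal NumPy version, validated on a solid cube rather than real root data, and is not the authors' actual pipeline:

```python
import numpy as np

VOXEL = 0.031  # mm, matching the 31 µm resolution quoted above

def volume_and_surface(seg, voxel=VOXEL):
    """Volume and surface area of a binary 3D segmentation (voxel counting)."""
    vol = seg.sum() * voxel**3
    padded = np.pad(seg.astype(np.int8), 1)
    faces = 0
    for axis in range(3):
        d = np.abs(np.diff(padded, axis=axis))
        faces += d.sum()          # each 0/1 transition is one exposed face
    return vol, faces * voxel**2

# Sanity check on a solid 10x10x10 voxel cube: 1000 voxels, 600 exposed faces.
cube = np.ones((10, 10, 10), dtype=bool)
vol, area = volume_and_surface(cube)
print(vol, area)
```

    Voxel-face counting overestimates the area of smooth surfaces; mesh-based estimates such as marching cubes (as used when generating a CFD mesh) are more accurate for real root geometry.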

  2. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    PubMed

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer provide a coherent architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications, and that can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge of radiology operations. Our design adopts the "4+1" architectural view. In this new architecture, PACS and RIS become one, and user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This work could potentially lead to a new direction in image information management. We illustrate the innovative design with examples taken from an implemented system.

  3. Dealloying in Individual Nanoparticles and Thin Film Grains: A Bragg Coherent Diffractive Imaging Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cha, Wonsuk; Liu, Yihua; You, Hoydoo

    Dealloying is a process whereby selective dissolution results in a porous, strained structure, often with new properties. The process is of both intrinsic and applied interest, and recently has been used to make highly active catalysts. The porosity has been studied using electron microscopy, while the dealloying-induced strain has been studied at the ensemble level using X-ray diffraction. Despite the importance of local strain, for example at the individual particle or grain level, in controlling the properties of the dealloyed material, it remains unresolved due to the difficulty of imaging 3D strain distributions with nanometer resolution in reactive environments. This information could play an integral role in understanding and controlling lattice strain for a variety of applications. Here, 3D strain distributions in individual nanoparticles and thin film grains in silver-gold alloys undergoing nitric acid-induced dealloying are imaged by Bragg coherent diffractive imaging. Particles exhibit dramatic changes in their local strains due to dealloying, but grains do not. Furthermore, the average lattice in both grains and particles contracts during dealloying. In general, the results reveal significant dealloying-induced strain heterogeneity at the nanoscale in both isolated and extended samples, which may be utilized to develop advanced nanostructures for a variety of important applications.

  4. Dealloying in Individual Nanoparticles and Thin Film Grains: A Bragg Coherent Diffractive Imaging Study

    DOE PAGES

    Cha, Wonsuk; Liu, Yihua; You, Hoydoo; ...

    2017-05-09

    Dealloying is a process whereby selective dissolution results in a porous, strained structure, often with new properties. The process is of both intrinsic and applied interest, and recently has been used to make highly active catalysts. The porosity has been studied using electron microscopy, while the dealloying-induced strain has been studied at the ensemble level using X-ray diffraction. Despite the importance of local strain, for example at the individual particle or grain level, in controlling the properties of the dealloyed material, it remains unresolved due to the difficulty of imaging 3D strain distributions with nanometer resolution in reactive environments. This information could play an integral role in understanding and controlling lattice strain for a variety of applications. Here, 3D strain distributions in individual nanoparticles and thin film grains in silver-gold alloys undergoing nitric acid-induced dealloying are imaged by Bragg coherent diffractive imaging. Particles exhibit dramatic changes in their local strains due to dealloying, but grains do not. Furthermore, the average lattice in both grains and particles contracts during dealloying. In general, the results reveal significant dealloying-induced strain heterogeneity at the nanoscale in both isolated and extended samples, which may be utilized to develop advanced nanostructures for a variety of important applications.

  5. Hierarchical representation of shapes in visual cortex—from localized features to figural shape segregation

    PubMed Central

    Tschechne, Stephan; Neumann, Heiko

    2014-01-01

    Visual structures in the environment are segmented into image regions, and these are combined into a representation of surfaces and prototypical objects. Such perceptual organization is performed by complex neural mechanisms in the visual cortex of primates. Multiple mutually connected areas in the ventral cortical pathway receive visual input and extract local form features that are subsequently grouped into increasingly complex, more meaningful image elements. Such a distributed processing network must be capable of making accessible highly articulated changes in shape boundary as well as the very subtle curvature changes that contribute to the perception of an object. We propose a recurrent computational network architecture that utilizes hierarchical distributed representations of shape features to encode surface and object boundaries over different scales of resolution. Our model makes use of neural mechanisms that model the processing capabilities of early and intermediate stages in visual cortex, namely areas V1–V4 and IT. We suggest that multiple specialized component representations interact by feedforward hierarchical processing combined with feedback signals driven by representations generated at higher stages. Based on this, global configurational as well as local information is made available to distinguish changes in the object's contour. Once the outline of a shape has been established, contextual contour configurations are used to assign border-ownership directions and thus achieve segregation of figure and ground. The model thus proposes how separate mechanisms contribute to distributed hierarchical cortical shape representation and combine with processes of figure-ground segregation. Our model is probed with a selection of stimuli to illustrate processing results at different processing stages. We especially highlight how modulatory feedback connections contribute to the processing of visual input at various stages in the processing hierarchy.
PMID:25157228

  6. Hierarchical representation of shapes in visual cortex-from localized features to figural shape segregation.

    PubMed

    Tschechne, Stephan; Neumann, Heiko

    2014-01-01

    Visual structures in the environment are segmented into image regions, and these are combined into a representation of surfaces and prototypical objects. Such perceptual organization is performed by complex neural mechanisms in the visual cortex of primates. Multiple mutually connected areas in the ventral cortical pathway receive visual input and extract local form features that are subsequently grouped into increasingly complex, more meaningful image elements. Such a distributed processing network must be capable of making accessible highly articulated changes in shape boundary as well as the very subtle curvature changes that contribute to the perception of an object. We propose a recurrent computational network architecture that utilizes hierarchical distributed representations of shape features to encode surface and object boundaries over different scales of resolution. Our model makes use of neural mechanisms that model the processing capabilities of early and intermediate stages in visual cortex, namely areas V1-V4 and IT. We suggest that multiple specialized component representations interact by feedforward hierarchical processing combined with feedback signals driven by representations generated at higher stages. Based on this, global configurational as well as local information is made available to distinguish changes in the object's contour. Once the outline of a shape has been established, contextual contour configurations are used to assign border-ownership directions and thus achieve segregation of figure and ground. The model thus proposes how separate mechanisms contribute to distributed hierarchical cortical shape representation and combine with processes of figure-ground segregation. Our model is probed with a selection of stimuli to illustrate processing results at different processing stages. We especially highlight how modulatory feedback connections contribute to the processing of visual input at various stages in the processing hierarchy.

  7. Faxed document image restoration method based on local pixel patterns

    NASA Astrophysics Data System (ADS)

    Akiyama, Teruo; Miyamoto, Nobuo; Oguro, Masami; Ogura, Kenji

    1998-04-01

    A method for restoring degraded faxed document images using the patterns of pixels that construct small areas in a document is proposed. The method effectively restores faxed images that contain halftone textures and/or high-density salt-and-pepper noise, both of which degrade OCR system performance. In the halftone image restoration process, white-centered 3 × 3 pixel regions in which black and white pixels alternate are first identified as halftone textures using the distribution of the pixel values, and then the white center pixels are inverted to black. To remove high-density salt-and-pepper noise, it is assumed that the degradation is caused by an ill-balanced bias and inappropriate thresholding of the sensor output, which results in the addition of random noise. The restored image can then be estimated using an approximation that inverts this assumed degradation process. To process degraded faxed images, the algorithms above are combined. An experiment was conducted using 24 especially poor-quality examples selected from data sets that exemplify what practical fax-based OCR systems cannot handle. The maximum recovery rate in terms of mean square error was 98.8 percent.
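    The white-centered checkerboard test can be sketched in NumPy. This is a hedged reading of the paper's qualitative description, assuming a binary convention of 1 = black, 0 = white, and is not the authors' implementation:

```python
import numpy as np

def remove_halftone(img):
    """Invert white-centered 3x3 checkerboard pixels to black (1=black, 0=white)."""
    c  = img[1:-1, 1:-1] == 0                              # white center
    n4 = (img[:-2, 1:-1] == 1) & (img[2:, 1:-1] == 1) & \
         (img[1:-1, :-2] == 1) & (img[1:-1, 2:] == 1)      # black 4-neighbours
    d4 = (img[:-2, :-2] == 0) & (img[:-2, 2:] == 0) & \
         (img[2:, :-2] == 0) & (img[2:, 2:] == 0)          # white diagonals
    out = img.copy()
    out[1:-1, 1:-1][c & n4 & d4] = 1                       # fill the halftone hole
    return out

# A checkerboard patch emulates a halftone texture.
yy, xx = np.mgrid[0:8, 0:8]
halftone = ((yy + xx) % 2).astype(np.uint8)
cleaned = remove_halftone(halftone)
print(cleaned[1:-1, 1:-1].min())   # interior is now solid black
```

    The mask is computed from the original image and applied afterwards, so detections do not interfere with each other within a single pass.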

  8. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    NASA Astrophysics Data System (ADS)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
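    The principal component analysis that VIP performs to model and subtract the stellar point spread function can be illustrated with a toy NumPy sketch. The synthetic cube and component count below are invented for illustration and do not reflect VIP's actual API:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic cube: 30 frames sharing one stellar PSF pattern with varying
# amplitude, plus weak noise (a stand-in for a coronagraphic sequence).
n_frames, npix = 30, 32 * 32
psf = rng.random(npix)
amps = 1.0 + 0.1 * rng.normal(size=n_frames)
cube = amps[:, None] * psf[None, :] + 1e-3 * rng.normal(size=(n_frames, npix))

# PSF subtraction via principal component analysis (the core idea behind
# KLIP-style algorithms):
X = cube - cube.mean(axis=0)              # center the frames
_, _, Vt = np.linalg.svd(X, full_matrices=False)
K = 1                                     # number of components for the PSF model
model = (X @ Vt[:K].T) @ Vt[:K]           # projection onto the first K PCs
residual = X - model                      # starlight removed

print(np.linalg.norm(residual) / np.linalg.norm(X))   # small: PC 1 captures the PSF
```

    In angular differential imaging the sky rotates between frames, so a faint planet is poorly represented by the leading components and survives the subtraction, while the quasi-static stellar pattern is removed.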

  9. Fast variogram analysis of remotely sensed images in HPC environment

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Cortés, Anna; Masó, Joan; Pons, Xavier

    2013-04-01

    Exploring and describing the spatial variation of images is one of the main applications of geostatistics to remote sensing. The variogram is a very suitable tool for carrying out this spatial pattern analysis. Variogram analysis is composed of two steps: empirical variogram generation and fitting a variogram model. Empirical variogram generation is a very quick procedure for most analyses of irregularly distributed samples, but computation time increases quite significantly for remotely sensed images, because the number of samples (pixels) involved is usually huge (more than 30 million for a Landsat TM scene), depending basically on the extent and spatial resolution of the images. In several remote sensing applications this type of analysis is repeated for each image, sometimes for hundreds of scenes and sometimes for each radiometric band (a high number in the case of hyperspectral images), so there is a need for a fast implementation. In order to reduce this high execution time, we developed a parallel solution for the variogram analysis. The solution adopted is the master/worker programming paradigm, in which the master process distributes and coordinates the tasks executed by the worker processes. The code is written in the ANSI-C language, using MPI (Message Passing Interface) as the message-passing library for communication between the master and the workers. This solution (ANSI-C + MPI) guarantees portability between different computer platforms. The High Performance Computing (HPC) environment is formed by 32 nodes, each with two Dual-Core Intel(R) Xeon(R) 3.0 GHz processors and 12 GB of RAM, communicating over integrated dual gigabit Ethernet. This IBM cluster is located in the research laboratory of the Computer Architecture and Operating Systems Department of the Universitat Autònoma de Barcelona. The performance results for a 15 km x 15 km subscene of the 198-31 path-row Landsat TM image are shown in Table 1.
    The proximity between the empirical speedup behaviour and the theoretical linear speedup confirms that the parallel design and implementation are suitable.

    Table 1. Performance for the Landsat TM subscene.

        N workers   Time (s)   Speedup
        0           2975.03       -
        2           2112.33     1.41
        4           1067.45     2.79
        8            534.18     5.57
        12           357.54     8.32
        16           269.00    11.06
        20           216.24    13.76
        24           186.31    15.97

    Furthermore, very similar performance results are obtained for CASI images (hyperspectral, with finer spatial resolution than Landsat), shown in Table 2, demonstrating that the distributed-load design is not defined and optimized for a single type of image, but is a flexible design that maintains good balance and scalability across a range of image dimensions.

    Table 2. Performance for the CASI image.

        N workers   Time (s)   Speedup
        0           5485.03       -
        2           3847.47     1.43
        4           1921.62     2.85
        8            965.55     5.68
        12           644.26     8.51
        16           483.40    11.35
        20           393.67    13.93
        24           347.15    15.80
        28           306.33    17.91
        32           304.39    18.02

    Finally, we conclude that this significant time reduction underlines the utility of distributed environments for processing large amounts of data such as remotely sensed images.
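    The empirical variogram generation that dominates the execution time reduces, on a regular image grid, to averaging squared pixel differences at each lag. A minimal single-process NumPy sketch (the record's MPI master/worker version would distribute these lag or row computations across workers):

```python
import numpy as np

def empirical_variogram(z, max_lag):
    """Semivariance along the x axis: gamma(h) = 0.5 * E[(z(x+h) - z(x))^2]."""
    return np.array([0.5 * np.mean((z[:, h:] - z[:, :-h]) ** 2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(5)

# Spatially correlated toy "image": a random walk along each row, whose
# theoretical variogram grows linearly with lag (gamma(h) = h/2 here).
z = np.cumsum(rng.normal(size=(256, 256)), axis=1)
gamma = empirical_variogram(z, max_lag=10)
print(np.round(gamma, 2))
```

    For a full Landsat scene the same computation runs over tens of millions of pixels and multiple directions, which is exactly why the master/worker decomposition pays off.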

  10. Evaluation of noise limits to improve image processing in soft X-ray projection microscopy.

    PubMed

    Jamsranjav, Erdenetogtokh; Kuge, Kenichi; Ito, Atsushi; Kinjo, Yasuhito; Shiina, Tatsuo

    2017-03-03

    Soft X-ray microscopy has been developed for high-resolution imaging of hydrated biological specimens thanks to the availability of the water-window region. In particular, projection-type microscopy has the advantages of a wide viewing area, an easy zooming function, and easy extensibility to computed tomography (CT). The blur of the projection image due to the Fresnel diffraction of X-rays, which eventually reduces spatial resolution, can be corrected by an iteration procedure, i.e., repetition of Fresnel and inverse Fresnel transformations. However, it was found that the correction is not effective for all images, especially for images with low contrast. In order to improve the effectiveness of image correction by computer processing, in this study we evaluated the influence of background noise on the iteration procedure through a simulation study. Images of a model specimen with known morphology were used as a substitute for the chromosome images, one of the targets of our microscope. With artificial noise distributed randomly over the images, we introduced two different parameters to evaluate noise effects for each situation in which the iteration procedure was unsuccessful, and proposed an upper limit on the noise within which an effective iteration procedure for the chromosome images is possible. The study indicated that the new simulation and noise evaluation method is useful for image processing in which background noise cannot be ignored relative to the specimen images.

  11. Application of imaging technology to improve the laboratory and field compaction of HMA.

    DOT National Transportation Integrated Search

    2009-04-01

    Field compaction of asphalt mixtures is an important process that influences performance of asphalt : pavements. This study evaluates the relationship between different field compaction patterns and the : uniformity of air void distribution in asphal...

  12. Web-Enabled Distributed Health-Care Framework for Automated Malaria Parasite Classification: an E-Health Approach.

    PubMed

    Maity, Maitreya; Dhane, Dhiraj; Mungle, Tushar; Maiti, A K; Chakraborty, Chandan

    2017-10-26

A web-enabled e-healthcare system, or computer-assisted disease diagnosis, has the potential to improve the quality and service of the conventional healthcare delivery approach. The article describes the design and development of a web-based distributed healthcare management system for medical information and for quantitative evaluation of microscopic images using a machine learning approach for malaria. In the proposed study, all the health-care centres are connected in a distributed computer network. Each peripheral centre manages its own health-care service independently and communicates with the central server for remote assistance. The proposed methodology for automated evaluation of parasites includes pre-processing of blood smear microscopic images followed by erythrocyte segmentation. To differentiate between parasite types, a total of 138 quantitative features characterising colour, morphology, and texture are extracted from segmented erythrocytes. An integrated pattern classification framework is designed in which four feature selection methods, viz. Correlation-based Feature Selection (CFS), Chi-square, Information Gain, and RELIEF, are employed individually with three different classifiers, i.e. Naive Bayes, C4.5, and Instance-Based Learning (IB1). The optimal feature subset with the best classifier is selected to achieve maximum diagnostic precision. The proposed method achieved 99.2% sensitivity and 99.6% specificity by combining CFS and C4.5, outperforming the other combinations. Moreover, the web-based tool is entirely designed using open technologies: Java for the web application, ImageJ for image processing, and WEKA for data mining, considering its feasibility in rural places with minimal health-care facilities.

  13. Novel image processing method study for a label-free optical biosensor

    NASA Astrophysics Data System (ADS)

    Yang, Chenhao; Wei, Li'an; Yang, Rusong; Feng, Ying

    2015-10-01

Optical biosensors are generally divided into labeled and label-free types; the former mainly comprises fluorescence-labeled and radioactive-labeled methods, of which the fluorescence-labeled method is the more mature in application. The main image processing methods for fluorescence-labeled biosensors include smoothing filters, artificial gridding, and constant thresholding. Since some fluorescent molecules may influence the biological reaction, label-free methods have become the main development direction of optical biosensors. The use of a wider field of view and a larger angle of incidence in the light path, which can effectively improve the sensitivity of a label-free biosensor, also brings more difficulties in image processing compared with the fluorescence-labeled biosensor. Otsu's method, widely applied in machine vision, chooses the threshold that minimizes the intraclass variance of the thresholded black and white pixels. As a global threshold segmentation, however, it is limited when the intensity distribution of the image is asymmetrical. In order to handle the irregularity of light intensity on the transducer, we improved the algorithm. In this paper, we present a new image processing algorithm based on a reflectance modulation biosensor platform, mainly comprising a sliding normalization algorithm designed for image rectification and an improved Otsu's method for image segmentation, in order to implement automatic recognition of target areas. Finally, we used an adaptive gridding method to extract the target parameters for analysis. These methods can improve the efficiency of image processing, reduce human intervention, enhance the reliability of experiments, and lay the foundation for realizing high-throughput label-free optical biosensors.
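The abstract does not specify the authors' sliding normalization or their improved Otsu variant; as background, the plain global Otsu threshold they start from can be sketched as follows (a minimal NumPy implementation, maximizing between-class variance over the grey-level histogram):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Global Otsu threshold: choose the grey level that maximizes
    the between-class variance of the histogram."""
    hist, edges = np.histogram(np.asarray(img).ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)            # probability of class 0 (below threshold)
    mu = np.cumsum(p * centers)  # cumulative mean
    mu_t = mu[-1]                # global mean
    w1 = 1 - w0
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros(nbins)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]
```

Applying this routine tile by tile is one simple way to approximate the local adaptation that the sliding-normalization step is meant to provide for non-uniform illumination.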

  14. Muon tomography imaging improvement using optimized limited angle data

    NASA Astrophysics Data System (ADS)

    Bai, Chuanyong; Simon, Sean; Kindem, Joel; Luo, Weidong; Sossong, Michael J.; Steiger, Matthew

    2014-05-01

Image resolution of muon tomography is limited by the range of zenith angles of cosmic ray muons and the flux rate at sea level. The low flux rate limits the use of advanced data rebinning and processing techniques to improve image quality. By optimizing the limited angle data, however, image resolution can be improved. To demonstrate the idea, physical data of tungsten blocks were acquired on a muon tomography system. The angular distribution and energy spectrum of muons measured on the system were also used to generate simulation data of tungsten blocks in different arrangements (geometries). The data were grouped into subsets using the zenith angle, and volume images were reconstructed from the data subsets using two algorithms: a distributed PoCA (point of closest approach) algorithm and an accelerated iterative maximum likelihood/expectation maximization (MLEM) algorithm. Image resolution was compared for different subsets. Results showed that image resolution was better in the vertical direction for subsets with greater zenith angles and better in the horizontal plane for subsets with smaller zenith angles. The overall image resolution appeared to be a compromise of that of the different subsets. This work suggests that the acquired data can be grouped into different limited angle data subsets for optimized image resolution in desired directions. Use of multiple images with resolution optimized in different directions can improve overall imaging fidelity for the intended applications.

  15. Exploiting range imagery: techniques and applications

    NASA Astrophysics Data System (ADS)

    Armbruster, Walter

    2009-07-01

Practically no applications exist for which automatic processing of 2D intensity imagery can equal human visual perception. This is not the case for range imagery. The paper gives examples of 3D laser radar applications for which automatic data processing can exceed human visual cognition capabilities and describes basic processing techniques for attaining these results. The examples are drawn from the fields of helicopter obstacle avoidance, object detection in surveillance applications, object recognition at high range, multi-object tracking, and object re-identification in range image sequences. Processing times and recognition performances are summarized. The techniques used exploit the bijective continuity of the imaging process as well as its independence from object reflectivity, emissivity and illumination. This allows precise formulations of the probability distributions involved in figure-ground segmentation, feature-based object classification and model-based object recognition. The probabilistic approach guarantees optimal solutions for single images and enables Bayesian learning in range image sequences. Finally, due to recent results in 3D-surface completion, no prior model libraries are required for recognizing and re-identifying objects of quite general object categories, opening the way to unsupervised learning and fully autonomous cognitive systems.

  16. Simulating dispersion in porous media and the influence of segmentation on stagnancy in carbonates

    NASA Astrophysics Data System (ADS)

    Gray, F.; Cen, J.; Shah, S. M.; Crawshaw, J. P.; Boek, E. S.

    2016-11-01

    Understanding the transport of chemical components in porous media is fundamentally important to many reservoir processes such as contaminant transport and reactive flows involved in CO2 sequestration. Carbonate rocks in particular present difficulties for pore-scale simulations because they contain large amounts of sub-micron porosity. In this work, we introduce a new hybrid simulation model to calculate hydrodynamic dispersion in pore-scale images of real porous media and use this to elucidate the origins and behaviour of stagnant zones arising in transport simulations using micro-CT images of carbonates. For this purpose a stochastic particle model for simulating the transport of a solute is coupled to a Lattice-Boltzmann algorithm to calculate the flow field. The particle method incorporates second order spatial and temporal resolution to resolve finer features of the domain. We demonstrate how dispersion coefficients can be accurately obtained in capillaries, where corresponding analytical solutions are available, even when these are resolved to just a few lattice units. Then we compute molecular displacement distributions for pore-spaces of varying complexity: a pack of beads; a Bentheimer sandstone; and a Portland carbonate. Our calculated propagator distributions are compared directly with recent experimental PFG-NMR propagator distributions (Scheven et al., 2005; Mitchell et al., 2008), the latter excluding spin relaxation mechanisms. We observe that the calculated transport propagators can be quantitatively compared with the experimental distribution, provided that spin relaxations in the experiment are excluded, and good agreement is found for both the sandstone and the carbonate. However, due to the absence of explicit micro-porosity from the carbonate pore space image used for flow field simulations we note that there are fundamental differences in the physical origins of the stagnant zones for micro-porous rocks between simulation and experiment. 
We show that for a given micro-CT image of a carbonate, small variations in the parameters chosen for the segmentation process lead to different amounts of stagnancy which diffuse away at different rates. Finally, we use a filtering method to show that this is due to the presence of spurious isolated pores which arise from the segmentation process and suggest an approach to overcome this limitation.

  17. Labelling plants the Chernobyl way: A new approach for mapping rhizodeposition and biopore reuse

    NASA Astrophysics Data System (ADS)

    Banfield, Callum; Kuzyakov, Yakov

    2016-04-01

    A novel approach for mapping root distribution and rhizodeposition using 137Cs and 14C was applied. By immersing cut leaves into vials containing 137CsCl solution, the 137Cs label is taken up and partly released into the rhizosphere, where it strongly binds to soil particles, thus labelling the distribution of root channels in the long term. Reuse of root channels in crop rotations can be determined by labelling the first crop with 137Cs and the following crop with 14C. Imaging of the β- radiation with strongly differing energies differentiates active roots growing in existing root channels (14C + 137Cs activity) from roots growing in bulk soil (14C activity only). The feasibility of the approach was shown in a pot experiment with ten plants of two species, Cichorium intybus L., and Medicago sativa L. The same plants were each labelled with 100 kBq of 137CsCl and after one week with 500 kBq of 14CO2. 96 h later pots were cut horizontally at 6 cm depth. After the first 137Cs + 14C imaging of the cut surface, imaging was repeated with three layers of plastic film between the cut surface and the plate for complete shielding of 14C β- radiation to the background level, producing an image of the 137Cs distribution. Subtracting the second image from the first gave the 14C image. Both species allocated 18 - 22% of the 137Cs and about 30 - 40% of 14C activity below ground. Intensities far above the detection limit suggest that this approach is applicable to map the root system by 137Cs and to obtain root size distributions through image processing. The rhizosphere boundary was defined by the point at which rhizodeposited 14C activity declined to 5% of the activity of the root centre. Medicago showed 25% smaller rhizosphere extension than Cichorium, demonstrating that plant-specific rhizodeposition patterns can be distinguished. 
Our new approach is appropriate to visualise processes and hotspots on multiple scales: Heterogeneous rhizodeposition, as well as size and counts of roots and biopores formed by these in various soil depths can be determined. Finally, biopore reuse in crop rotations can be visualised.
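The two-exposure arithmetic described above (a combined 137Cs + 14C image, then a shielded 137Cs-only image) amounts to a per-pixel subtraction followed by a classification. A schematic sketch with hypothetical activity arrays, not the authors' actual processing pipeline:

```python
import numpy as np

def separate_isotope_images(combined, cs_only, threshold=0.0):
    """Recover the 14C image by subtracting the shielded (137Cs-only)
    exposure from the combined exposure, then classify pixels:
    roots reusing old channels carry both activities, while roots
    growing in bulk soil carry 14C only."""
    combined = np.asarray(combined, dtype=float)
    cs_only = np.asarray(cs_only, dtype=float)
    c14 = np.clip(combined - cs_only, 0, None)   # 14C-only image
    reused = (c14 > threshold) & (cs_only > threshold)
    bulk = (c14 > threshold) & (cs_only <= threshold)
    return c14, reused, bulk
```

In practice the threshold would be set from the background activity level rather than zero.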

  18. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, the inclusion of multiple operators determining feasibility under a wider array of clinical environments and work-flows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  19. Imaging the distribution of transient viscosity after the 2016 Mw 7.1 Kumamoto earthquake.

    PubMed

    Moore, James D P; Yu, Hang; Tang, Chi-Hsien; Wang, Teng; Barbot, Sylvain; Peng, Dongju; Masuti, Sagar; Dauwels, Justin; Hsu, Ya-Ju; Lambert, Valère; Nanjundiah, Priyamvada; Wei, Shengji; Lindsey, Eric; Feng, Lujia; Shibazaki, Bunichiro

    2017-04-14

    The deformation of mantle and crustal rocks in response to stress plays a crucial role in the distribution of seismic and volcanic hazards, controlling tectonic processes ranging from continental drift to earthquake triggering. However, the spatial variation of these dynamic properties is poorly understood as they are difficult to measure. We exploited the large stress perturbation incurred by the 2016 earthquake sequence in Kumamoto, Japan, to directly image localized and distributed deformation. The earthquakes illuminated distinct regions of low effective viscosity in the lower crust, notably beneath the Mount Aso and Mount Kuju volcanoes, surrounded by larger-scale variations of viscosity across the back-arc. This study demonstrates a new potential for geodesy to directly probe rock rheology in situ across many spatial and temporal scales. Copyright © 2017, American Association for the Advancement of Science.

  20. A novel key management solution for reinforcing compliance with HIPAA privacy/security regulations.

    PubMed

    Lee, Chien-Ding; Ho, Kevin I-J; Lee, Wei-Bin

    2011-07-01

    Digitizing medical records facilitates the healthcare process. However, it can also cause serious security and privacy problems, which are the major concern in the Health Insurance Portability and Accountability Act (HIPAA). While various conventional encryption mechanisms can solve some aspects of these problems, they cannot address the illegal distribution of decrypted medical images, which violates the regulations defined in the HIPAA. To protect decrypted medical images from being illegally distributed by an authorized staff member, the model proposed in this paper provides a way to integrate several cryptographic mechanisms. In this model, the malicious staff member can be tracked by a watermarked clue. By combining several well-designed cryptographic mechanisms and developing a key management scheme to facilitate the interoperation among these mechanisms, the risk of illegal distribution can be reduced.

  1. The edge detection method of the infrared imagery of the laser spot

    NASA Astrophysics Data System (ADS)

    Che, Jinxi; Zhang, Jinchun; Li, Zhongmin

    2016-01-01

In jamming effectiveness experiments in which a thermal infrared imager was interfered with by a CO2 laser, evaluating the jamming effect requires analysing the obtained infrared imagery of the laser spot. Because the laser spot images obtained from the thermal infrared imager are irregular, edge detection is an important step. The image edge is one of the most basic characteristics of an image and contains much of its information. Generally, because of thermal equilibrium, local temperature differences across an object are small; the ability of infrared imagery to reflect local detail of an object is therefore notably weak. At the same time, the heat distribution information in a thermal image becomes far more valuable when it is combined with basic target information such as object size, relative position in the field of view, shape, and outline. Hence, extracting object edges from infrared imagery is an important image processing step; it forms part of the image processing procedure and is a prerequisite for much subsequent processing. To extract outline information of the target from the original thermal imagery, and to overcome disadvantages such as low image contrast and serious noise interference, the edges of the thermal imagery must be detected and processed. The principles of the Roberts, Sobel, Prewitt and Canny operators were analyzed, and the operators were then used to perform edge detection on the thermal images of laser spots obtained from the experiments in which a CO2 laser jammed the thermal infrared imager. On the basis of the detection results, their performances were compared. Finally, the characteristics of the operators are summarized, providing a reference for the choice of edge detection operators in future thermal image processing.
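The operators compared in this record are all small derivative kernels (Canny adds Gaussian smoothing, non-maximum suppression, and hysteresis on top of a Sobel-like gradient). The three first-derivative operators can be sketched directly, assuming only NumPy:

```python
import numpy as np

def filter2d(img, k):
    """Naive 'valid' sliding-window filter (cross-correlation; the
    sign convention is irrelevant for the gradient magnitude)."""
    kh, kw = k.shape
    out = np.zeros((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

# Pairs of horizontal/vertical derivative kernels.
KERNELS = {
    "roberts": (np.array([[1, 0], [0, -1]]), np.array([[0, 1], [-1, 0]])),
    "prewitt": (np.array([[-1, 0, 1]] * 3), np.array([[-1, 0, 1]] * 3).T),
    "sobel":   (np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]),
                np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])),
}

def edge_magnitude(img, operator):
    """Gradient magnitude of `img` under the chosen operator."""
    kx, ky = KERNELS[operator]
    return np.hypot(filter2d(img, kx), filter2d(img, ky))
```

Thresholding the resulting magnitude map yields a binary edge image of the laser spot.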

  2. Optical encryption interface

    NASA Technical Reports Server (NTRS)

    Jackson, Deborah J. (Inventor)

    1998-01-01

    An analog optical encryption system based on phase scrambling of two-dimensional optical images and holographic transformation for achieving large encryption keys and high encryption speed. An enciphering interface uses a spatial light modulator for converting a digital data stream into a two dimensional optical image. The optical image is further transformed into a hologram with a random phase distribution. The hologram is converted into digital form for transmission over a shared information channel. A respective deciphering interface at a receiver reverses the encrypting process by using a phase conjugate reconstruction of the phase scrambled hologram.

  3. The Fourier Imaging X-ray Spectrometer (FIXS) for the Argentinian, Scout-launched Satélite de Aplicaciones Científicas-1 (SAC-1)

    NASA Technical Reports Server (NTRS)

    Dennis, Brian R.; Crannell, Carol Jo; Desai, Upendra D.; Orwig, Larry E.; Kiplinger, Alan L.; Schwartz, Richard A.; Hurford, Gordon J.; Emslie, A. Gordon; Machado, Marcos; Wood, Kent

    1988-01-01

    The Fourier Imaging X-ray Spectrometer (FIXS) is one of four instruments on SAC-1, the Argentinian satellite being proposed for launch by NASA on a Scout rocket in 1992/3. The FIXS is designed to provide solar flare images at X-ray energies between 5 and 35 keV. Observations will be made on arcsecond size scales and subsecond time scales of the processes that modify the electron spectrum and the thermal distribution in flaring magnetic structures.

  4. Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.

    PubMed

    Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C

    2004-01-01

    Interface software was developed to generate the input file to run the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.

  5. Typical effects of laser dazzling CCD camera

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Zhang, Jianmin; Shao, Bibo; Cheng, Deyan; Ye, Xisheng; Feng, Guobin

    2015-05-01

    In this article, an overview of the laser dazzling effect on buried-channel CCD cameras is given. CCDs are sorted into staring and scanning types: the former includes frame transfer and interline transfer types; the latter includes linear and time delay integration types. All CCDs must perform four primary tasks in generating an image: charge generation, charge collection, charge transfer, and charge measurement. In a camera, lenses are needed to deliver the optical signal to the CCD sensor, using techniques for suppressing stray light, and electronic circuits are needed to process the CCD output signal, employing many electronic techniques. The dazzling effects are the combined result of light distribution distortion and charge distribution distortion, which derive from the lens and the sensor, respectively. Strictly speaking, the light distribution is not distorted in the lens: lenses are generally so well designed and fabricated that their stray light can be neglected, but a laser is intense enough to make its stray light obvious. In CCD image sensors, a laser can induce very large electron generation; charge transfer inefficiency and charge blooming then distort the charge distribution. Commonly, the largest signal output from the CCD sensor is restricted by the capacity of the CCD collection well and cannot exceed the dynamic range within which the subsequent electronic circuits maintain normal operation, so the signal is not distorted in the post-processing circuits. However, some techniques in the circuits can make dazzling effects appear as different phenomena in the final image.

  6. Method for Assessment of Changes in the Width of Cracks in Cement Composites with Use of Computer Image Processing and Analysis

    NASA Astrophysics Data System (ADS)

    Tomczak, Kamil; Jakubowski, Jacek; Fiołek, Przemysław

    2017-06-01

    Crack width measurement is an important element of research on the progress of self-healing cement composites. Due to the nature of this research, the method of measuring the width of cracks and their changes over time must meet specific requirements. The article presents a novel method of measuring crack width based on images from a scanner with an optical resolution of 6400 dpi, subject to initial image processing in the ImageJ development environment and further processing and analysis of results. After registering a series of images of the cracks at different times using the SIFT (Scale-Invariant Feature Transform) conversion, a dense network of line segments is created in all images, intersecting the cracks perpendicular to their local axes. Along these line segments, brightness profiles are extracted, which are the basis for determining crack width. The distribution and rotation of the lines of intersection in a regular layout, the automation of transformations, the management of images and brightness profiles, and the data analysis to determine crack widths and their changes over time are performed automatically by our own code in the ImageJ and VBA environments. The article describes the method, tests of its properties, and sources of measurement uncertainty. It also presents an example application of the method in research on autogenous self-healing of concrete, specifically the ability to reduce a sample crack width and achieve its full closure within 28 days of the self-healing process.
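A brightness profile sampled perpendicular to a crack can be turned into a width estimate by a simple threshold crossing. A minimal sketch; the half-depth criterion used here is an illustrative assumption, not the authors' exact rule, while the pixel scale follows from the stated 6400 dpi optical resolution:

```python
import numpy as np

PX_PER_MM = 6400 / 25.4  # scanner optical resolution: 6400 dpi

def crack_width_mm(profile, threshold=0.5, px_per_mm=PX_PER_MM):
    """Estimate crack width (mm) from a brightness profile sampled
    perpendicular to the crack (dark crack on bright background):
    the width is the extent over which brightness falls below a
    fractional threshold between the background level and the
    profile minimum."""
    profile = np.asarray(profile, dtype=float)
    background = np.median(profile)
    depth = background - profile.min()
    if depth <= 0:  # flat profile: no crack visible
        return 0.0
    cut = background - threshold * depth
    idx = np.flatnonzero(profile < cut)
    return (idx[-1] - idx[0] + 1) / px_per_mm
```

Repeating the estimate over the dense network of profiles at successive times gives the width-change series the self-healing study needs.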

  7. Use of satellite images to determine surface-water cover during the flood event of September 13, 2013, in Lyons and western Longmont, Colorado

    USGS Publications Warehouse

    Cole, Christopher J.; Friesen, Beverly A.; Wilson, Earl M.; Wilds, Stanley R.; Noble, Suzanne M.

    2015-01-01

    This surface-water cover dataset was created as a timely representation of post-flood ground conditions to support response efforts. This dataset and all processed imagery and derived products were uploaded to the USGS Hazards Data Distribution System (HDDS) website (http://hddsexplorer.usgs.gov/uplift/hdds/) for distribution to those responding to the flood event.

  8. Speckle measurements of density and temperature profiles in a model gas circuit breaker

    NASA Astrophysics Data System (ADS)

    Stoller, P. C.; Panousis, E.; Carstensen, J.; Doiron, C. B.; Färber, R.

    2015-01-01

    Speckle imaging was used to measure the density and temperature distribution in the arc zone of a model high voltage circuit breaker during the high current phase and under conditions simulating those present during current-zero crossings (current-zero-like arc); the arc was stabilized by a transonic, axial flow of synthetic air. A single probe beam was used; thus, accurate reconstruction was only possible for axially symmetric gas flows and arc channels. The displacement of speckles with respect to a reference image was converted to a line-of-sight integrated deflection angle, which was in turn converted into an axially symmetric refractive index distribution using a multistep process that made use of the inverse Radon transform. The Gladstone-Dale relation, which gives the index of refraction as a function of density, was extended to high temperatures by taking into account dissociation and ionization processes. The temperature and density were determined uniquely by assuming that the pressure distribution in the case of cold gas flow (in the absence of an arc) is not modified significantly by the arc. The electric conductivity distribution was calculated from the temperature profile and compared to measurements of the arc voltage and to previous results published in the literature for similar experimental conditions.

  9. A statistical model for radar images of agricultural scenes

    NASA Technical Reports Server (NTRS)

    Frost, V. S.; Shanmugan, K. S.; Holtzman, J. C.; Stiles, J. A.

    1982-01-01

    The presently derived and validated statistical model for radar images containing many different homogeneous fields predicts the probability density functions of radar images of entire agricultural scenes, thereby allowing histograms of large scenes composed of a variety of crops to be described. Seasat-A SAR images of agricultural scenes are accurately predicted by the model on the basis of three assumptions: each field has the same SNR, all target classes cover approximately the same area, and the true reflectivity characterizing each individual target class is a uniformly distributed random variable. The model is expected to be useful in the design of data processing algorithms and for scene analysis using radar images.

  10. Extraction of lead and ridge characteristics from SAR images of sea ice

    NASA Technical Reports Server (NTRS)

    Vesecky, John F.; Smith, Martha P.; Samadani, Ramin

    1990-01-01

    Image-processing techniques for extracting the characteristics of lead and pressure ridge features in SAR images of sea ice are reported. The methods are applied to a SAR image of the Beaufort Sea collected from the Seasat satellite on October 3, 1978. Estimates of lead and ridge statistics are made, e.g., lead and ridge density (number of lead or ridge pixels per unit area of image) and the distribution of lead area and orientation as well as ridge length and orientation. The information derived is useful in both ice science and polar operations for such applications as albedo and heat and momentum transfer estimates, as well as ship routing and offshore engineering.

  11. Development of alternative data analysis techniques for improving the accuracy and specificity of natural resource inventories made with digital remote sensing data

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Meisner, D. E. (Principal Investigator)

    1980-01-01

    An investigation was conducted into ways to improve the involvement of state and local user personnel in the digital image analysis process by isolating those elements of the analysis process which require extensive involvement by field personnel and providing means for performing those activities apart from a computer facility. In this way, the analysis procedure can be converted from a centralized activity focused on a computer facility to a distributed activity in which users can interact with the data at the field office level or in the field itself. General-purpose image processing software was developed on the University of Minnesota computer system (Control Data Cyber models 172 and 74). The use of color hardcopy image data as a primary medium in supervised training procedures was investigated, and digital display equipment and a coordinate digitizer were procured.

  12. Quantitative fluorescence imaging of protein diffusion and interaction in living cells.

    PubMed

    Capoulade, Jérémie; Wachsmuth, Malte; Hufnagel, Lars; Knop, Michael

    2011-08-07

    Diffusion processes and local dynamic equilibria inside cells lead to nonuniform spatial distributions of molecules, which are essential for processes such as nuclear organization and signaling in cell division, differentiation and migration. To understand these mechanisms, spatially resolved quantitative measurements of protein abundance, mobilities and interactions are needed, but current methods have limited capabilities to study dynamic parameters. Here we describe a microscope based on light-sheet illumination that allows massively parallel fluorescence correlation spectroscopy (FCS) measurements and use it to visualize the diffusion and interactions of proteins in mammalian cells and in isolated fly tissue. Imaging the mobility of heterochromatin protein HP1α (ref. 4) in cell nuclei we could provide high-resolution diffusion maps that reveal euchromatin areas with heterochromatin-like HP1α-chromatin interactions. We expect that FCS imaging will become a useful method for the precise characterization of cellular reaction-diffusion processes.

  13. Modeling the UO2 ex-AUC pellet process and predicting the fuel rod temperature distribution under steady-state operating condition

    NASA Astrophysics Data System (ADS)

    Hung, Nguyen Trong; Thuan, Le Ba; Thanh, Tran Chi; Nhuan, Hoang; Khoai, Do Van; Tung, Nguyen Van; Lee, Jin-Young; Jyothi, Rajesh Kumar

    2018-06-01

    The modeling of the uranium dioxide pellet process from ammonium uranyl carbonate-derived uranium dioxide powder (UO2 ex-AUC powder) and the prediction of the fuel rod temperature distribution are reported in this paper. Response surface methodology (RSM) and the FRAPCON-4.0 code were used to model the process and to predict the fuel rod temperature under steady-state operating conditions. The AP-1000 fuel rod design by Westinghouse Electric Corporation, with pellet fabrication parameters taken from this study, provided the input data for the code. The predictions suggest a relationship between the fabrication parameters of the UO2 pellets and their temperature distribution in the nuclear reactor.

  14. A Plenoptic Multi-Color Imaging Pyrometer

    NASA Technical Reports Server (NTRS)

    Danehy, Paul M.; Hutchins, William D.; Fahringer, Timothy; Thurow, Brian S.

    2017-01-01

    A three-color pyrometer has been developed based on plenoptic imaging technology. Three bandpass filters placed in front of a camera lens allow separate 2D images to be obtained on a single image sensor at three different and adjustable wavelengths selected by the user. Images were obtained of different black- or grey-bodies including a calibration furnace, a radiation heater, and a luminous sulfur match flame. The images obtained of the calibration furnace and radiation heater were processed to determine 2D temperature distributions. Calibration results in the furnace showed that the instrument can measure temperature with an accuracy and precision of 10 Kelvins between 1100 and 1350 K. Time-resolved 2D temperature measurements of the radiation heater are shown.
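The principle behind multi-color pyrometry is that the ratio of spectral intensities at two wavelengths fixes the temperature of a grey body independently of its (wavelength-flat) emissivity. A textbook Wien-approximation sketch, not the paper's own calibration procedure:

```python
import math

C2 = 1.4388e-2  # second radiation constant h*c/k_B, in m*K

def ratio_temperature(ratio, lam1, lam2):
    """Grey-body temperature from the ratio I(lam1)/I(lam2) of spectral
    intensities at two wavelengths, using the Wien approximation
    I(lam) ~ eps * lam**-5 * exp(-C2 / (lam * T)):
        ln(ratio) = 5 ln(lam2/lam1) + (C2/T) (1/lam2 - 1/lam1)."""
    num = C2 * (1.0 / lam2 - 1.0 / lam1)
    den = math.log(ratio) - 5.0 * math.log(lam2 / lam1)
    return num / den
```

A third wavelength, as in the instrument described above, adds a redundant ratio that can flag non-grey behavior or calibration drift.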

  15. Model-based optimization of near-field binary-pixelated beam shapers

    DOE PAGES

    Dorrer, C.; Hassett, J.

    2017-01-23

    The optimization of components that rely on spatially dithered distributions of transparent or opaque pixels and an imaging system with far-field filtering for transmission control is demonstrated. The binary-pixel distribution can be iteratively optimized to lower an error function that takes into account the design transmission and the characteristics of the required far-field filter. Simulations using a design transmission chosen in the context of high-energy lasers show that the beam-fluence modulation at an image plane can be reduced by a factor of 2, leading to performance similar to using a non-optimized spatial-dithering algorithm with pixels of size reduced by a factor of 2, without the additional fabrication complexity or cost. The optimization process preserves the pixel distribution statistical properties. Analysis shows that the optimized pixel distribution starting from a high-noise distribution defined by a random-draw algorithm should be more resilient to fabrication errors than the optimized pixel distributions starting from a low-noise, error-diffusion algorithm, while leading to similar beam-shaping performance. Furthermore, this is confirmed by experimental results obtained with various pixel distributions and induced fabrication errors.

  16. Imaging of Selenium by Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) in 2-D Electrophoresis Gels and Biological Tissues.

    PubMed

    Cruz, Elisa Castañeda Santa; Susanne Becker, J; Sabine Becker, J; Sussulini, Alessandra

    2018-01-01

    Selenium and selenoproteins are important components of living organisms that play a role in different biological processes. Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is a powerful analytical technique that has been employed to obtain distribution maps of selenium in biological tissues in a direct manner, as well as in selenoproteins previously separated by their molecular masses and isoelectric points using two-dimensional polyacrylamide gel electrophoresis (2-D PAGE). In this chapter, we present the protocols to perform LA-ICP-MS imaging experiments, allowing the visualization of the distribution and the determination of selenium and/or selenoproteins in biological systems.

  17. [Three-dimensional finite element analysis of three conjunctive methods of free iliac bone graft for established mandibular body defects].

    PubMed

    Wang, Dong; Yang, Zhuang-qun; Hu, Xiao-yi

    2007-08-01

    To analyze the stress and displacement distributions of 3D-FE models of three conjunctive methods of vascularized iliac bone graft for established mandibular body defects. Using computer image processing techniques, a series of spiral CT images was imported into the Ansys preprocessing program to establish three 3D-FE models of the different conjunctions. The three 3D-FE models of established mandibular body defects repaired by vascularized iliac bone graft were built up, and the distributions of Von Mises stress and displacement around the mandibular segment, grafted ilium, plates and screws were obtained. The results indicate that the on-lay conjunction is the optimal conjunctive shape.

  18. Nanometer-scale sizing accuracy of particle suspensions on an unmodified cell phone using elastic light scattering.

    PubMed

    Smith, Zachary J; Chu, Kaiqin; Wachsmann-Hogiu, Sebastian

    2012-01-01

    We report on the construction of a Fourier plane imaging system attached to a cell phone. By illuminating particle suspensions with a collimated beam from an inexpensive diode laser, angularly resolved scattering patterns are imaged by the phone's camera. Analyzing these patterns with Mie theory results in predictions of size distributions of the particles in suspension. Despite using consumer grade electronics, we extracted size distributions of sphere suspensions with better than 20 nm accuracy in determining the mean size. We also show results from milk, yeast, and blood cells. Performing these measurements on a portable device presents opportunities for field-testing of food quality, process monitoring, and medical diagnosis.

  19. Nanoscale strain mapping in battery nanostructures

    NASA Astrophysics Data System (ADS)

    Ulvestad, A.; Cho, H. M.; Harder, R.; Kim, J. W.; Dietze, S. H.; Fohtung, E.; Meng, Y. S.; Shpyrko, O. G.

    2014-02-01

    Coherent x-ray diffraction imaging is used to map the local three dimensional strain inhomogeneity and electron density distribution of two individual LiNi0.5Mn1.5O4-δ cathode nanoparticles in both ex-situ and in-situ environments. Our reconstructed images revealed a maximum strain of 0.4%. We observed different variations in strain inhomogeneity due to multiple competing effects. The compressive/tensile component of the strain is connected to the local lithium content and, on the surface, interpreted in terms of a local Jahn-Teller distortion of Mn3+. Finally, the measured strain distributions are discussed in terms of their impact on competing theoretical models of the lithiation process.

  20. Reducing Interpolation Artifacts for Mutual Information Based Image Registration

    PubMed Central

    Soleimani, H.; Khosravifard, M.A.

    2011-01-01

    Medical image registration methods that use mutual information as a similarity measure have improved in recent decades. Mutual information is a basic concept of information theory that indicates the dependency of two random variables (or two images). To evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, such as partial volume (PV) and bilinear interpolation, are used to estimate the joint probability distribution. Both of these methods introduce artifacts into the mutual information function. The partial volume-Hanning window (PVH) and generalized partial volume (GPV) methods were introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel functions but to the number of pixels that take part in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method that uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of computed tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
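The mutual-information similarity measure discussed above is computed from the joint intensity histogram of the two images. A minimal NumPy sketch of the histogram-based estimator (the binning choice is illustrative; the paper's contribution concerns how samples are distributed into these bins during interpolation):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate the mutual information of two equally shaped images
    from their joint intensity histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()            # joint probability distribution
    px = pxy.sum(axis=1, keepdims=True)  # marginal distribution of image a
    py = pxy.sum(axis=0, keepdims=True)  # marginal distribution of image b
    nz = pxy > 0                         # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

An image has maximal mutual information with itself and much less with an independent image, which is what registration exploits: the measure peaks when the two images are aligned.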

  1. The implementation of contour-based object orientation estimation algorithm in FPGA-based on-board vision system

    NASA Astrophysics Data System (ADS)

    Alpatov, Boris; Babayan, Pavel; Ershov, Maksim; Strotov, Valery

    2016-10-01

    This paper describes the implementation of an orientation estimation algorithm in an FPGA-based vision system. An approach to estimating the orientation of objects lacking axial symmetry is proposed. The suggested algorithm estimates the orientation of a specific known 3D object based on its 3D model and consists of two stages: learning and estimation. The learning stage explores the studied object: using the 3D model, a set of training images is gathered by capturing the model from viewpoints evenly distributed on a sphere, with the sphere points distributed according to the geosphere principle. The gathered training image set is used to calculate descriptors, which are then used in the estimation stage of the algorithm. The estimation stage matches an observed image descriptor against the training image descriptors. The experimental research was performed using a set of images of an Airbus A380. The proposed orientation estimation algorithm showed good accuracy in all case studies, and its real-time performance in the FPGA-based vision system was demonstrated.

  2. Fluorescence lifetime imaging microscopy using near-infrared contrast agents.

    PubMed

    Nothdurft, R; Sarder, P; Bloch, S; Culver, J; Achilefu, S

    2012-08-01

    Although single-photon fluorescence lifetime imaging microscopy (FLIM) is widely used to image molecular processes using a wide range of excitation wavelengths, the captured emission of this technique is confined to the visible spectrum. Here, we explore the feasibility of utilizing near-infrared (NIR) fluorescent molecular probes with emission >700 nm for FLIM of live cells. The confocal microscope is equipped with a 785 nm laser diode, a red-enhanced photomultiplier tube, and a time-correlated single photon counting card. We demonstrate that our system reports the lifetime distributions of NIR fluorescent dyes, cypate and DTTCI, in cells. In cells labelled separately or jointly with these dyes, NIR FLIM successfully distinguishes their lifetimes, providing a method to sort different cell populations. In addition, lifetime distributions of cells co-incubated with these dyes allow estimate of the dyes' relative concentrations in complex cellular microenvironments. With the heightened interest in fluorescence lifetime-based small animal imaging using NIR fluorophores, this technique further serves as a bridge between in vitro spectroscopic characterization of new fluorophore lifetimes and in vivo tissue imaging. © 2012 The Author Journal of Microscopy © 2012 Royal Microscopical Society.

  3. Fluorescence Lifetime Imaging Microscopy Using Near-Infrared Contrast Agents

    PubMed Central

    Nothdurft, Ralph; Sarder, Pinaki; Bloch, Sharon; Culver, Joseph; Achilefu, Samuel

    2013-01-01

    Although single-photon fluorescence lifetime imaging microscopy (FLIM) is widely used to image molecular processes using a wide range of excitation wavelengths, the captured emission of this technique is confined to the visible spectrum. Here, we explore the feasibility of utilizing near-infrared (NIR) fluorescent molecular probes with emission >700 nm for FLIM of live cells. The confocal microscope is equipped with a 785 nm laser diode, a red-enhanced photomultiplier tube, and a time-correlated single photon counting card. We demonstrate that our system reports the lifetime distributions of NIR fluorescent dyes, cypate and DTTCI, in cells. In cells labeled separately or jointly with these dyes, NIR FLIM successfully distinguishes their lifetimes, providing a method to sort different cell populations. In addition, lifetime distributions of cells co-incubated with these dyes allow estimate of the dyes’ relative concentrations in complex cellular microenvironments. With the heightened interest in fluorescence lifetime-based small animal imaging using NIR fluorophores, this technique further serves as a bridge between in vitro spectroscopic characterization of new fluorophore lifetimes and in vivo tissue imaging. PMID:22788550

  4. ASI aurora search: an attempt of intelligent image processing for circular fisheye lens.

    PubMed

    Yang, Xi; Gao, Xinbo; Song, Bin; Wang, Nannan; Yang, Dong

    2018-04-02

    The circular fisheye lens exhibits an approximately 180° angular field-of-view (FOV), which is much larger than that of an ordinary lens. Thus, images captured with a circular fisheye lens are distributed non-uniformly, with spherical deformation. Along with the fast development of deep neural networks for normal images, how to apply them to achieve intelligent image processing for circular fisheye lenses is a new task of significant importance. In this paper, we take aurora images captured with all-sky imagers (ASI) as a typical example. By analyzing the imaging principle of ASI and the magnetic characteristics of the aurora, a deformed region division (DRD) scheme is proposed to replace the region proposal network (RPN) in the advanced mask regional convolutional neural network (Mask R-CNN) framework. Thus, each image can be regarded as a "bag" of deformed regions represented by CNN features. After clustering all CNN features to generate a vocabulary, each deformed region is quantized to its nearest center for indexing. At the online search stage, a similarity score is computed by measuring the distances between the regions in the query image and all regions in the data set, and the image with the highest score is output as the top-ranked search result. Experimental results show that the proposed method greatly improves search accuracy and efficiency, demonstrating that it is a valuable attempt at intelligent image processing for circular fisheye lenses.

  5. Ligament Mediated Fragmentation of Viscoelastic Liquids

    NASA Astrophysics Data System (ADS)

    Keshavarz, Bavand; Houze, Eric C.; Moore, John R.; Koerner, Michael R.; McKinley, Gareth H.

    2016-10-01

    The breakup and atomization of complex fluids can be markedly different than the analogous processes in a simple Newtonian fluid. Atomization of paint, combustion of fuels containing antimisting agents, as well as physiological processes such as sneezing are common examples in which the atomized liquid contains synthetic or biological macromolecules that result in viscoelastic fluid characteristics. Here, we investigate the ligament-mediated fragmentation dynamics of viscoelastic fluids in three different canonical flows. The size distributions measured in each viscoelastic fragmentation process show a systematic broadening from the Newtonian solvent. In each case, the droplet sizes are well described by Gamma distributions which correspond to a fragmentation-coalescence scenario. We use a prototypical axial step strain experiment together with high-speed video imaging to show that this broadening results from the pronounced change in the corrugated shape of viscoelastic ligaments as they separate from the liquid core. These corrugations saturate in amplitude and the measured distributions for viscoelastic liquids in each process are given by a universal probability density function, corresponding to a Gamma distribution with nmin=4 . The breadth of this size distribution for viscoelastic filaments is shown to be constrained by a geometrical limit which can not be exceeded in ligament-mediated fragmentation phenomena.
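The mean-normalized Gamma distribution invoked in this abstract for droplet sizes has a standard closed form; a sketch with a crude numerical sanity check (the check itself is illustrative, not from the paper):

```python
import math

def gamma_pdf(x, n):
    """Gamma distribution of droplet sizes x normalized by the mean size;
    the shape parameter n sets the breadth of the distribution
    (n -> infinity is monodisperse), with n_min = 4 for viscoelastic
    ligaments per the abstract above."""
    return n**n * x**(n - 1) * math.exp(-n * x) / math.gamma(n)

# crude Riemann-sum check that the n = 4 distribution is normalized
# and has unit mean (sizes are measured in units of the mean)
dx = 0.001
xs = [i * dx for i in range(1, 30000)]
total = sum(gamma_pdf(x, 4) * dx for x in xs)
mean = sum(x * gamma_pdf(x, 4) * dx for x in xs)
```

Smaller n means a broader droplet-size distribution, which is the "systematic broadening from the Newtonian solvent" the measurements report.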

  6. Ligament Mediated Fragmentation of Viscoelastic Liquids.

    PubMed

    Keshavarz, Bavand; Houze, Eric C; Moore, John R; Koerner, Michael R; McKinley, Gareth H

    2016-10-07

    The breakup and atomization of complex fluids can be markedly different than the analogous processes in a simple Newtonian fluid. Atomization of paint, combustion of fuels containing antimisting agents, as well as physiological processes such as sneezing are common examples in which the atomized liquid contains synthetic or biological macromolecules that result in viscoelastic fluid characteristics. Here, we investigate the ligament-mediated fragmentation dynamics of viscoelastic fluids in three different canonical flows. The size distributions measured in each viscoelastic fragmentation process show a systematic broadening from the Newtonian solvent. In each case, the droplet sizes are well described by Gamma distributions which correspond to a fragmentation-coalescence scenario. We use a prototypical axial step strain experiment together with high-speed video imaging to show that this broadening results from the pronounced change in the corrugated shape of viscoelastic ligaments as they separate from the liquid core. These corrugations saturate in amplitude and the measured distributions for viscoelastic liquids in each process are given by a universal probability density function, corresponding to a Gamma distribution with n_{min}=4. The breadth of this size distribution for viscoelastic filaments is shown to be constrained by a geometrical limit which can not be exceeded in ligament-mediated fragmentation phenomena.

  7. Assessment of Restoration Methods of X-Ray Images with Emphasis on Medical Photogrammetric Usage

    NASA Astrophysics Data System (ADS)

    Hosseinian, S.; Arefi, H.

    2016-06-01

    Nowadays, various medical X-ray imaging methods such as digital radiography, computed tomography and fluoroscopy are used as important tools in diagnostic and operative processes, especially in computer- and robotic-assisted surgeries. The procedures for extracting information from these images require appropriate deblurring and denoising of the pre- and intra-operative images in order to obtain more accurate information. This issue becomes more considerable when the X-ray images are to be employed in photogrammetric processes for 3D reconstruction from multi-view X-ray images, since accurate data must be extracted from the images for 3D modelling and the quality of the X-ray images directly affects the results of the algorithms. For the restoration of X-ray images, it is essential to consider the nature and characteristics of this kind of image. X-ray images exhibit severe quantum noise because of the limited number of X-ray photons involved. The assumptions of Gaussian modelling are not appropriate for photon-limited images such as X-ray images, because of the nature of signal-dependent quantum noise. These images are generally modelled by the Poisson distribution, which is the most common model for low-intensity imaging. In this paper, existing methods are evaluated. For this purpose, after demonstrating the properties of medical X-ray images, the more efficient and recommended methods for the restoration of X-ray images are described and assessed, and then implemented on samples from different kinds of X-ray images. Considering the results, it is concluded that PURE-LET provides more effective and efficient denoising than the other methods examined in this research.

  8. Bayesian inference on multiscale models for poisson intensity estimation: applications to photon-limited image denoising.

    PubMed

    Lefkimmiatis, Stamatios; Maragos, Petros; Papandreou, George

    2009-08-01

    We present an improved statistical model for analyzing Poisson processes, with applications to photon-limited imaging. We build on previous work, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities (rates) in adjacent scales are modeled as mixtures of conjugate parametric distributions. Our main contributions include: 1) a rigorous and robust regularized expectation-maximization (EM) algorithm for maximum-likelihood estimation of the rate-ratio density parameters directly from the noisy observed Poisson data (counts); 2) extension of the method to work under a multiscale hidden Markov tree model (HMT) which couples the mixture label assignments in consecutive scales, thus modeling interscale coefficient dependencies in the vicinity of image edges; 3) exploration of a 2-D recursive quad-tree image representation, involving Dirichlet-mixture rate-ratio densities, instead of the conventional separable binary-tree image representation involving beta-mixture rate-ratio densities; and 4) a novel multiscale image representation, which we term Poisson-Haar decomposition, that better models the image edge structure, thus yielding improved performance. Experimental results on standard images with artificially simulated Poisson noise and on real photon-limited images demonstrate the effectiveness of the proposed techniques.
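The multiscale representation this model builds on splits the total Poisson count recursively and works with the ratios of child to parent sums; the mixture priors and hidden Markov tree coupling are beyond a sketch, but the rate-ratio coefficients themselves can be illustrated in a few lines (function name and 1-D setting are illustrative):

```python
import numpy as np

def multiscale_ratios(counts):
    """Decompose a 1-D Poisson count signal (length a power of two) into
    per-scale ratios of the left-child sum to the parent sum, the
    coefficients modeled by mixture priors in multiscale Poisson
    intensity estimation."""
    scales = []
    c = np.asarray(counts, dtype=float)
    while c.size > 1:
        parent = c[0::2] + c[1::2]            # sums at the next coarser scale
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(parent > 0, c[0::2] / parent, 0.5)
        scales.append(ratio)                  # rate-ratio coefficients
        c = parent
    return scales, c[0]                       # ratios per scale + total count
```

Conditioned on the parent sum, each left-child count is binomial with success probability given by the underlying intensity ratio, which is why estimating these ratios (rather than raw counts) is statistically convenient.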

  9. Estimating occupancy and abundance using aerial images with imperfect detection

    USGS Publications Warehouse

    Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Bower, Michael R.

    2017-01-01

    Species distribution and abundance are critical population characteristics for efficient management, conservation, and ecological insight. Point process models are a powerful tool for modelling distribution and abundance, and can incorporate many data types, including count data, presence-absence data, and presence-only data. Aerial photographic images are a natural tool for collecting data to fit point process models, but aerial images do not always capture all animals that are present at a site. Methods for estimating detection probability for aerial surveys usually include collecting auxiliary data to estimate the proportion of time animals are available to be detected. We developed an approach for fitting point process models using an N-mixture model framework to estimate detection probability for aerial occupancy and abundance surveys. Our method uses multiple aerial images taken of animals at the same spatial location to provide temporal replication of sample sites; the intersection of the images provides multiple counts of individuals at different times. We examined this approach using both simulated and real data on sea otters (Enhydra lutris kenyoni) in Glacier Bay National Park, southeastern Alaska. Using our proposed methods, we estimated the detection probability of sea otters to be 0.76, the same as in the visual aerial surveys that have been used in the past. Further, simulations demonstrated that our approach is a promising tool for estimating occupancy, abundance, and detection probability from aerial photographic surveys. Our methods can be readily extended to data collected using unmanned aerial vehicles, as technology and regulations permit. The generality of our methods for other aerial surveys depends on how well surveys can be designed to meet the assumptions of N-mixture models.
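The binomial N-mixture likelihood underlying this framework marginalizes a latent abundance over repeated counts at a site. A minimal single-site sketch (the function name, truncation bound, and parameter values are illustrative; the paper fits this within a spatial point process model):

```python
import math

def site_likelihood(counts, lam, p, n_max=200):
    """Likelihood of repeated counts y_1..y_T at one site under a binomial
    N-mixture model: latent abundance N ~ Poisson(lam), and each replicate
    count y_t | N ~ Binomial(N, p). The latent N is marginalized out by
    truncated summation."""
    like = 0.0
    pois = math.exp(-lam)                 # P(N = 0), updated recursively
    for n in range(0, n_max + 1):
        if n > 0:
            pois *= lam / n               # P(N = n) from P(N = n - 1)
        if n >= max(counts):              # Binomial(n, p) needs n >= y
            binom = 1.0
            for y in counts:
                binom *= math.comb(n, y) * p**y * (1 - p)**(n - y)
            like += pois * binom
    return like
```

Maximizing the product of such terms over sites jointly identifies the abundance rate lam and the detection probability p, which is how the repeated aerial images yield the 0.76 estimate reported above.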

  10. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation: performing a high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for the simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded software operators and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft operators in a high-level processing chain and code his or her own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  11. A robust close-range photogrammetric target extraction algorithm for size and type variant targets

    NASA Astrophysics Data System (ADS)

    Nyarko, Kofi; Thomas, Clayton; Torres, Gilbert

    2016-05-01

    The Photo-G program conducted by Naval Air Systems Command at the Atlantic Test Range in Patuxent River, Maryland, uses photogrammetric analysis of large amounts of real-world imagery to characterize the motion of objects in a 3-D scene. Current approaches involve several independent processes including target acquisition, target identification, 2-D tracking of image features, and 3-D kinematic state estimation. Each process has its own inherent complications and corresponding degrees of both human intervention and computational complexity. One approach being explored for automated target acquisition relies on exploiting the pixel intensity distributions of photogrammetric targets, which tend to be patterns with bimodal intensity distributions. The bimodal distribution partitioning algorithm utilizes this distribution to automatically deconstruct a video frame into regions of interest (ROI) that are merged and expanded to target boundaries, from which ROI centroids are extracted to mark target acquisition points. This process has proved to be scale, position and orientation invariant, as well as fairly insensitive to global uniform intensity disparities.

  12. Phase retrieval using regularization method in intensity correlation imaging

    NASA Astrophysics Data System (ADS)

    Li, Xiyu; Gao, Xin; Tang, Jia; Lu, Changming; Wang, Jianli; Wang, Bin

    2014-11-01

    The intensity correlation imaging (ICI) method can obtain high-resolution images with ground-based, low-precision mirrors; in the imaging process, a phase retrieval algorithm must be used to reconstruct the object's image. However, the algorithms now in use (such as the hybrid input-output algorithm) are sensitive to noise and prone to stagnation, while the signal-to-noise ratio of intensity interferometry is low, especially when imaging astronomical objects. In this paper, we build a mathematical model of phase retrieval and simplify it into a constrained optimization problem over a multi-dimensional function. A new error function was designed from the noise distribution and prior information using a regularization method. The simulation results show that the regularization method can improve the performance of the phase retrieval algorithm and obtain better images, especially under low-SNR conditions.

  13. Pseudoinverse Decoding Process in Delay-Encoded Synthetic Transmit Aperture Imaging.

    PubMed

    Gong, Ping; Kolios, Michael C; Xu, Yuan

    2016-09-01

    Recently, we proposed a new method to improve the signal-to-noise ratio of the prebeamformed radio-frequency data in synthetic transmit aperture (STA) imaging: the delay-encoded STA (DE-STA) imaging. In the decoding process of DE-STA, the equivalent STA data were obtained by directly inverting the coding matrix. This is usually regarded as an ill-posed problem, especially under high noise levels. Pseudoinverse (PI) is usually used instead for seeking a more stable inversion process. In this paper, we apply singular value decomposition to the coding matrix to conduct the PI. Our numerical studies demonstrate that the singular values of the coding matrix have a special distribution, i.e., all the values are the same except for the first and last ones. We compare the PI in two cases: complete PI (CPI), where all the singular values are kept, and truncated PI (TPI), where the last and smallest singular value is ignored. The PI (both CPI and TPI) DE-STA processes are tested against noise with both numerical simulations and experiments. The CPI and TPI can restore the signals stably, and the noise mainly affects the prebeamformed signals corresponding to the first transmit channel. The difference in the overall enveloped beamformed image qualities between the CPI and TPI is negligible. Thus, it demonstrates that DE-STA is a relatively stable encoding and decoding technique. Also, according to the special distribution of the singular values of the coding matrix, we propose a new efficient decoding formula that is based on the conjugate transpose of the coding matrix. We also compare the computational complexity of the direct inverse and the new formula.
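The CPI/TPI distinction above is a standard construction on the SVD: invert every singular value, or zero out the smallest before inverting. A minimal NumPy sketch (function name illustrative; the paper applies this to the DE-STA coding matrix):

```python
import numpy as np

def truncated_pinv(A, drop=1):
    """Pseudoinverse of A via singular value decomposition, discarding the
    `drop` smallest singular values (TPI); drop=0 keeps all of them (CPI)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv = np.zeros_like(s)
    keep = len(s) - drop
    inv[:keep] = 1.0 / s[:keep]          # s is sorted in descending order
    return Vt.T @ np.diag(inv) @ U.T     # V diag(1/s) U^T
```

Dropping the smallest singular value trades a small bias for stability, since that value contributes the largest factor 1/s when the inversion amplifies noise.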

  14. In situ observation of dynamic electrodeposition processes by soft x-ray fluorescence microspectroscopy and keyhole coherent diffractive imaging

    NASA Astrophysics Data System (ADS)

    Bozzini, Benedetto; Kourousias, George; Gianoncelli, Alessandra

    2017-03-01

    This paper describes two novel in situ microspectroscopic approaches to the dynamic study of electrodeposition processes: x-ray fluorescence (XRF) mapping with submicrometric space resolution and keyhole coherent diffractive imaging (kCDI) with nanometric lateral resolution. As a case study, we consider the pulse-plating of nanocomposites with a polypyrrole matrix and MnxCoyOz dispersoids, a prospective cathode material for zinc-air batteries. This study is centred on the detailed measurement of the elemental distributions developing in two representative subsequent growth steps, based on the combination of in situ identical-location XRF microspectroscopy, accompanied by soft x-ray absorption microscopy, and kCDI. XRF discloses the space and time distributions of the two electrodeposited metals, while kCDI on the one hand allows nanometric resolution and on the other hand provides complementary absorption as well as phase contrast modes. The joint information derived from these two microspectroscopies allows measurement of otherwise inaccessible observables that are a prerequisite for electrodeposition modelling and control accounting for dynamic localization processes.

  15. Coal Layer Identification using Electrical Resistivity Imaging Method in Sinjai Area South Sulawesi

    NASA Astrophysics Data System (ADS)

    Ilham Samanlangi, Andi

    2018-03-01

    The purpose of this research is to image subsurface resistivity for coal identification in Panaikang Village, Sinjai, South Sulawesi. Resistivity measurements were conducted along three lines, 400 and 300 meters in length, using the resistivity imaging method with a dipole-dipole configuration. The resistivity data were processed using the Res2DInv software to image resistivity variations and interpret the lithology. The results show that the coal resistivity is about 70-200 Ωm in Line 1, about 70-90 Ωm in Line 2, and about 70-200 Ωm in Line 3, with an average thickness of about 10 meters, and that the coal is distributed toward the east of the research area.

  16. Foodomics imaging by mass spectrometry and magnetic resonance.

    PubMed

    Canela, Núria; Rodríguez, Miguel Ángel; Baiges, Isabel; Nadal, Pedro; Arola, Lluís

    2016-07-01

    This work explores the use of advanced imaging MS (IMS) and magnetic resonance imaging (MRI) techniques in food science and nutrition to evaluate food sensory characteristics, nutritional value and health benefits. Determining the chemical content and applying imaging tools to food metabolomics offer detailed information about food quality, safety, processing, storage and authenticity assessment. IMS and MRI are powerful analytical systems with an excellent capability for mapping the distribution of many molecules, and recent advances in these platforms are reviewed and discussed, showing the great potential of these techniques for small molecule-based food metabolomics research. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. A portable detection instrument based on DSP for beef marbling

    NASA Astrophysics Data System (ADS)

    Zhou, Tong; Peng, Yankun

    2014-05-01

    Beef marbling is one of the most important indices for assessing beef quality. Marbling is graded by measuring the density of the fat distribution in the rib-eye region. However, in most beef slaughterhouses and businesses, beef quality grades depend on trainees using their visual senses or comparing a beef slice to the Chinese standard sample cards. Manual grading not only demands great labor but also lacks objectivity and accuracy. To meet the needs of beef slaughterhouses and businesses, a beef marbling detection instrument was designed. The instrument employs charge-coupled device (CCD) imaging, digital image processing, digital signal processor (DSP) control and processing, and liquid crystal display (LCD) techniques. The TMS320DM642 digital signal processor from Texas Instruments (TI) is the core, combining high-speed data processing capabilities with real-time processing features. All processes, such as image acquisition, data transmission, image processing algorithms and display, were implemented on this instrument for quick, efficient, and non-invasive detection of beef marbling. The structure of the system, its working principle, and its hardware and software are introduced in detail. The device is compact and easy to transport, and it can determine the grade of beef marbling reliably and correctly.

  18. Maximizing Total QoS-Provisioning of Image Streams with Limited Energy Budget

    NASA Astrophysics Data System (ADS)

    Lee, Wan Yeon; Kim, Kyong Hoon; Ko, Young Woong

    To fully utilize the limited battery energy of mobile electronic devices, we propose an adaptive method for adjusting the processing quality of multiple image stream tasks whose execution times vary widely. The method completes the worst-case executions of the tasks within a given energy budget, and maximizes the total reward value of processing quality obtained during their executions by exploiting the probability distribution of task execution times. It derives the maximum reward value for tasks whose processing quality can be adjusted arbitrarily, and a near-maximum value for tasks restricted to a finite number of quality levels. Our evaluation on a prototype system shows that the proposed method achieves reward values up to 57% larger than those of the previous method.
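    As an illustration of reward-maximizing quality selection under an energy budget, the following greedy sketch raises task quality levels in order of best marginal reward per unit of worst-case energy. This is not the paper's method (which exploits the execution-time distribution); the greedy rule is optimal only when each task's reward-energy curve is concave, and all data structures here are hypothetical.

```python
def allocate_quality(tasks, budget):
    """Greedy sketch: repeatedly raise the quality level of whichever task
    offers the best marginal reward per unit of energy, within the budget.
    tasks[i] is a list of (energy, reward) options, ascending in both."""
    choice = [0] * len(tasks)               # start every task at lowest quality
    spent = sum(t[0][0] for t in tasks)
    if spent > budget:
        raise ValueError("budget cannot cover even the minimum quality levels")
    while True:
        best = None
        for i, opts in enumerate(tasks):
            k = choice[i]
            if k + 1 < len(opts):
                d_e = opts[k + 1][0] - opts[k][0]   # extra energy for next level
                d_r = opts[k + 1][1] - opts[k][1]   # extra reward for next level
                if spent + d_e <= budget and (best is None or d_r / d_e > best[0]):
                    best = (d_r / d_e, i, d_e)
        if best is None:
            break
        _, i, d_e = best
        choice[i] += 1
        spent += d_e
    return choice, spent

# Two image-stream tasks and an energy budget of 5 units
tasks = [[(1, 0), (2, 5), (3, 7)],   # task 0: three quality levels
         [(1, 0), (2, 3)]]           # task 1: two quality levels
choice, spent = allocate_quality(tasks, budget=5)
```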

  19. An Estimation Approach to Extract Multimedia Information in Distributed Steganographic Images

    DTIC Science & Technology

    2007-07-01

    image steganography (DIS) [8] is a new method of concealing secret information in several host images, leaving ... distributed image steganography, steganalysis, estimation, image quality matrix. 1 Introduction. Steganography is a method that hides secret information ... used to sufficiently hide a secret image. Another emerging image steganographic technique is referred to as distributed image steganography

  20. An embedded multi-core parallel model for real-time stereo imaging

    NASA Astrophysics Data System (ADS)

    He, Wenjing; Hu, Jian; Niu, Jingyu; Li, Chuanrong; Liu, Guangyu

    2018-04-01

    Real-time processing based on embedded systems will enhance the application capability of stereo imaging for LiDAR and hyperspectral sensors. Research on task partitioning and scheduling strategies for embedded multiprocessor systems started relatively late compared with that for PC platforms. In this paper, a parallel model for stereo imaging targeting an embedded multi-core processing platform is studied and verified. After analyzing the computational load, throughput capacity and buffering requirements, a two-stage pipeline parallel model based on message passing is established. This model can be applied to fast stereo imaging for airborne sensors with various characteristics. To demonstrate the feasibility and effectiveness of the parallel model, parallel software was designed using test flight data, based on the 8-core DSP processor TMS320C6678. The results indicate that the design performed well in workload distribution and achieved a speed-up ratio of up to 6.4.

  1. Personal computer (PC) based image processing applied to fluid mechanics research

    NASA Technical Reports Server (NTRS)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
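    The interpolation step described above, convolving scattered streak velocities onto uniform grid points with a Gaussian window, might be sketched as follows. A fixed window width h stands in for the adaptive width (which would typically track the local mean sample spacing), and all names are illustrative.

```python
import numpy as np

def gaussian_window_interp(xs, ys, vs, grid_x, grid_y, h):
    """Interpolate scattered samples vs measured at (xs, ys) onto a regular
    grid using Gaussian weights w = exp(-r^2 / h^2)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.empty_like(gx, dtype=float)
    for idx in np.ndindex(gx.shape):
        r2 = (xs - gx[idx]) ** 2 + (ys - gy[idx]) ** 2
        w = np.exp(-r2 / h ** 2)
        out[idx] = np.sum(w * vs) / np.sum(w)   # weighted mean of nearby samples
    return out

# Scattered "streak" velocities sampled from a linear field u(x, y) = x
rng = np.random.default_rng(0)
xs, ys = rng.uniform(0, 1, 1000), rng.uniform(0, 1, 1000)
vs = xs
grid = np.linspace(0.2, 0.8, 5)
u = gaussian_window_interp(xs, ys, vs, grid, grid, h=0.1)
```

    For a linear velocity field the Gaussian-weighted mean recovers the field closely at interior grid points; near the domain boundary the truncated window introduces a small bias.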

  2. Respiratory motion correction in emission tomography image reconstruction.

    PubMed

    Reyes, Mauricio; Malandain, Grégoire; Koulibaly, Pierre Malick; González Ballester, Miguel A; Darcourt, Jacques

    2005-01-01

    In emission tomography imaging, respiratory motion causes artifacts in reconstructed lung and cardiac images, which lead to misinterpretation and imprecise diagnosis. Solutions such as respiratory gating, correlated dynamic PET techniques, list-mode-data-based techniques and others have been tested, with improvements in the spatial activity distribution of lung lesions, but with the disadvantage of requiring additional instrumentation or discarding part of the projection data used for reconstruction. The objective of this study is to incorporate respiratory motion correction directly into the image reconstruction process, without any additional acquisition protocol. To this end, we propose an extension of the Maximum Likelihood Expectation Maximization (MLEM) algorithm that includes a respiratory motion model, which takes into account the displacements and volume deformations produced by respiratory motion during the data acquisition process. We present results from synthetic simulations incorporating real respiratory motion as well as from phantom and patient data.
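    For context, a minimal sketch of the baseline MLEM update that the paper extends (without the respiratory motion model; incorporating motion would roughly amount to composing the system matrix with per-phase warping operators) might look like this. The tiny system matrix is purely illustrative.

```python
import numpy as np

def mlem(A, y, n_iter=500, eps=1e-12):
    """Plain MLEM iteration: lam <- lam / (A^T 1) * A^T (y / (A lam)),
    where A is the system matrix and y the measured projections."""
    lam = np.ones(A.shape[1])                 # uniform initial activity estimate
    sens = A.T @ np.ones(A.shape[0]) + eps    # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ lam + eps                  # forward projection
        lam *= (A.T @ (y / proj)) / sens      # multiplicative update
    return lam

# Tiny 2-pixel / 2-detector example with noiseless projections
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
truth = np.array([2.0, 4.0])
y = A @ truth
est = mlem(A, y)
```

    With noiseless data and a well-conditioned system, the multiplicative updates converge to the true activity; in practice the iteration is stopped early or regularized because MLEM amplifies noise at high iteration counts.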

  3. Combined endeavor of Neutrosophic Set and Chan-Vese model to extract accurate liver image from CT scan.

    PubMed

    Siri, Sangeeta K; Latte, Mrityunjaya V

    2017-11-01

    Many different diseases can occur in the liver, including infections such as hepatitis, cirrhosis, cancer, and damage from medications or toxins. The first stage of computer-aided diagnosis of the liver is the identification of the liver region. Liver segmentation algorithms extract the liver image from scan images, which helps in virtual surgery simulation, speeds up diagnosis, and supports accurate investigation and surgery planning. Existing liver segmentation algorithms attempt to extract the exact liver image from abdominal Computed Tomography (CT) scan images. This is an open problem because of ambiguous boundaries, large variation in intensity distribution, variability of liver geometry from patient to patient, and the presence of noise. A novel approach is proposed to meet the challenges of extracting the exact liver image from abdominal CT scan images. The proposed approach consists of three phases: (1) pre-processing, (2) CT scan image transformation to a Neutrosophic Set (NS), and (3) post-processing. In pre-processing, noise is removed by a median filter. A "new structure" is designed to transform a CT scan image into the neutrosophic domain, which is expressed using three membership subsets: the True subset (T), the False subset (F) and the Indeterminacy subset (I). This transform approximately extracts the liver image structure. In the post-processing phase, a morphological operation is performed on the indeterminacy subset (I), and the Chan-Vese (C-V) model is applied with detection of an initial contour within the liver without user intervention. This results in liver boundary identification with high accuracy. Experiments show that the proposed method is effective, robust and comparable with existing algorithms for liver segmentation of CT scan images. Copyright © 2017 Elsevier B.V. All rights reserved.
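    The neutrosophic transform described in phase (2) maps each pixel into True, False and Indeterminacy memberships. A minimal sketch, assuming the common formulation in which T is the normalized local mean, F = 1 - T, and I is the normalized deviation of a pixel from its local mean (the paper's "new structure" may differ in detail):

```python
import numpy as np

def to_neutrosophic(img, w=3):
    """Map a grayscale image into (T, F, I) neutrosophic subsets.
    T: normalized local mean; F = 1 - T; I: normalized |pixel - local mean|."""
    img = np.asarray(img, dtype=np.float64)
    pad = w // 2
    padded = np.pad(img, pad, mode='edge')
    # local mean over a w x w window, accumulated by shifting the padded image
    mean = np.zeros_like(img)
    for di in range(w):
        for dj in range(w):
            mean += padded[di:di + img.shape[0], dj:dj + img.shape[1]]
    mean /= w * w
    T = (mean - mean.min()) / (mean.max() - mean.min() + 1e-12)
    delta = np.abs(img - mean)
    I = (delta - delta.min()) / (delta.max() - delta.min() + 1e-12)
    F = 1.0 - T
    return T, F, I

img = np.arange(25, dtype=np.float64).reshape(5, 5)
T, F, I = to_neutrosophic(img)
```

    High values of I mark pixels whose intensity disagrees with their neighborhood, which is why the post-processing phase operates on the indeterminacy subset.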

  4. Rupture Processes of the Mw8.3 Sea of Okhotsk Earthquake and Aftershock Sequences from 3-D Back Projection Imaging

    NASA Astrophysics Data System (ADS)

    Jian, P. R.; Hung, S. H.; Meng, L.

    2014-12-01

    On May 24, 2013, the largest deep earthquake ever recorded occurred at the southern tip of the Kamchatka Peninsula, where the Pacific Plate subducts underneath the Okhotsk Plate. Previous 2D beamforming back projection (BP) of P-coda waves suggested that the mainshock ruptured bilaterally along a horizontal fault plane determined by the global centroid moment tensor solution. On the other hand, multiple point source inversion of P and SH waveforms argued that the earthquake comprised a sequence of 6 subevents not located on a single plane but distributed in a zone that extends 64 km horizontally and 35 km in depth. We therefore apply a three-dimensional MUSIC BP approach to resolve the rupture processes of the mainshock and two large aftershocks (M6.7) with no a priori assumption about the orientation of the rupture plane. The maximum pseudo-spectrum of high-frequency P waves in a sequence of time windows, recorded by the densely distributed stations of the US and EU arrays, is used to image the 3-D temporal and spatial rupture distribution. The resulting image confirms a nearly N-S striking rupture consisting of two antiparallel stages. The first, subhorizontal rupture initially propagates toward the NNE, while about 18 s later it reverses toward the SSW and concurrently shifts about 35 km deeper, lasting for about 20 s. The rupture lengths of the first NNE-ward and second SSW-ward stages are about 30 km and 85 km, and the estimated rupture velocities are 3 km/s and 4.25 km/s, respectively. Synthetic experiments are undertaken to assess the capability of the 3D MUSIC BP to recover spatio-temporal rupture processes. In addition, high-frequency BP images based on the EU-Array data show that the two M6.7 aftershocks are more likely to have ruptured on vertical fault planes.

  5. Poisson-Gaussian Noise Reduction Using the Hidden Markov Model in Contourlet Domain for Fluorescence Microscopy Images

    PubMed Central

    Yang, Sejung; Lee, Byung-Uk

    2015-01-01

    In certain image acquisition processes, such as fluorescence microscopy or astronomy, only a limited number of photons can be collected due to various physical constraints. The resulting images suffer from signal-dependent noise, which can be modeled as a Poisson distribution, and a low signal-to-noise ratio. However, the majority of research on noise reduction algorithms focuses on signal-independent Gaussian noise. In this paper, we model noise as a combination of Poisson and Gaussian probability distributions to construct a more accurate model, and adopt the contourlet transform, which provides a sparse representation of the directional components in images. We also apply hidden Markov models within a framework that neatly describes the spatial and interscale dependencies that characterize the transform coefficients of natural images. An effective denoising algorithm for Poisson-Gaussian noise is proposed using the contourlet transform, hidden Markov models and noise estimation in the transform domain. We supplement the algorithm with cycle spinning and Wiener filtering for further improvement. We finally show experimental results with simulations and fluorescence microscopy images which demonstrate the improved performance of the proposed approach. PMID:26352138
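    A small sketch of the Poisson-Gaussian noise model discussed above, together with the generalized Anscombe transform, a common variance-stabilizing alternative to transform-domain noise modeling (this is not the contourlet-HMM method proposed in the paper):

```python
import numpy as np

def add_poisson_gaussian(img, gain=1.0, sigma=2.0, seed=None):
    """Simulate signal-dependent noise: y = gain * Poisson(img / gain) + N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    shot = gain * rng.poisson(np.asarray(img, dtype=float) / gain)
    return shot + rng.normal(0.0, sigma, np.shape(img))

def generalized_anscombe(y, gain=1.0, sigma=2.0):
    """Variance-stabilizing transform: maps Poisson-Gaussian noise to
    approximately unit-variance Gaussian noise (accurate for large counts)."""
    arg = gain * y + 0.375 * gain ** 2 + sigma ** 2
    return (2.0 / gain) * np.sqrt(np.maximum(arg, 0.0))

clean = np.full(20000, 100.0)            # bright, flat fluorescence patch
noisy = add_poisson_gaussian(clean, seed=1)
stabilized = generalized_anscombe(noisy)  # noise variance is now close to 1
```

    After stabilization, any Gaussian-noise denoiser can be applied, followed by an (exact or algebraic) inverse transform; the paper instead models the mixed noise directly in the contourlet domain.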

  6. Constraints on Circumstellar Dust Grain Sizes from High Spatial Resolution Observations in the Thermal Infrared

    NASA Technical Reports Server (NTRS)

    Bloemhof, E. E.; Danen, R. M.; Gwinn, C. R.

    1996-01-01

    We describe how high spatial resolution imaging of circumstellar dust at a wavelength of about 10 micron, combined with knowledge of the source spectral energy distribution, can yield useful information about the sizes of the individual dust grains responsible for the infrared emission. Much can be learned even when only upper limits to source size are available. In parallel with high-resolution single-telescope imaging that may resolve the more extended mid-infrared sources, we plan to apply these less direct techniques to interpretation of future observations from two-element optical interferometers, where quite general arguments may be made despite only crude imaging capability. Results to date indicate a tendency for circumstellar grain sizes to be rather large compared to the Mathis-Rumpl-Nordsieck size distribution traditionally thought to characterize dust in the general interstellar medium. This may mean that processing of grains after their initial formation and ejection from circumstellar atmospheres adjusts their size distribution to the ISM curve; further mid-infrared observations of grains in various environments would help to confirm this conjecture.

  7. Independent motion detection with a rival penalized adaptive particle filter

    NASA Astrophysics Data System (ADS)

    Becker, Stefan; Hübner, Wolfgang; Arens, Michael

    2014-10-01

    Aggregation of pixel-based motion detection into regions of interest, each containing a view of a single moving object in the scene, is an essential pre-processing step in many vision systems. Motion events of this type provide significant information about the object type, or form the basis for action recognition. Further, motion is an essential saliency measure that can effectively support high-level image analysis. For static cameras, background subtraction methods achieve good results. Motion aggregation on freely moving cameras, on the other hand, is still a largely unsolved problem. The image flow measured by a freely moving camera results from two major motion types: first, the ego-motion of the camera, and second, object motion that is independent of the camera motion. When capturing a scene with such a camera, these two motion types are blended together. In this paper, we propose an approach to detect multiple moving objects from a mobile monocular camera system in an outdoor environment. The overall processing pipeline consists of a fast ego-motion compensation algorithm in the preprocessing stage. Real-time performance is achieved by using a sparse optical flow algorithm as an initial processing stage and a densely applied probabilistic filter in the post-processing stage. Thereby, we follow the idea proposed by Jung and Sukhatme. Normalized intensity differences originating from a sequence of ego-motion compensated difference images represent the probability of moving objects. Noise and registration artefacts are filtered out using a Bayesian formulation. The resulting a posteriori distribution is concentrated on image regions showing strong amplitudes in the difference image that are in accordance with the motion prediction. In order to effectively estimate the a posteriori distribution, a particle filter is used.
In addition to the fast ego-motion compensation, the main contribution of this paper is the design of the probabilistic filter for real-time detection and tracking of independently moving objects. The proposed approach introduces a competition scheme between particles in order to ensure improved multi-modality. Further, the filter design helps to generate a particle distribution that is homogeneous even in the presence of multiple targets showing non-rigid motion patterns. The effectiveness of the method is shown on exemplary outdoor sequences.

  8. NDSI products system based on Hadoop platform

    NASA Astrophysics Data System (ADS)

    Zhou, Yan; Jiang, He; Yang, Xiaoxia; Geng, Erhui

    2015-12-01

    Snow is the solid state of water resources on Earth and plays an important role in human life. Satellite remote sensing is significant for snow extraction, with the advantages of periodicity, macro-scale coverage, comprehensiveness, objectivity and timeliness. With the continuous development of remote sensing technology, remote sensing data are now acquired from multiple platforms, multiple sensors and multiple viewing angles, and the demand for compute-intensive processing of remote sensing data is growing steadily. However, current production systems for remote sensing products run in a serial mode; such systems are mostly used by professional remote sensing researchers, and systems that achieve automatic or semi-automatic production are relatively rare. Facing massive remote sensing data, the traditional serial production mode, with its low efficiency, can hardly meet the requirement of timely and efficient processing. In order to improve the production efficiency of NDSI products and to meet the demand for timely, efficient processing of large-scale remote sensing data, this paper builds an NDSI product production system based on the Hadoop platform. The system mainly comprises a remote sensing image management module, an NDSI production module, and a system service module. The main research contents and results are: (1) The remote sensing image management module includes two parts: image import and image metadata management. Massive base IRS images and NDSI product images (the output of the system's production tasks) are imported into the HDFS file system; at the same time, the corresponding orbit row/column number, maximum/minimum longitude and latitude, product date, HDFS storage path, Hadoop task ID (for NDSI products) and other metadata are read, thumbnails are created, a unique ID is assigned to each record, and the records are imported into the base/product image metadata database.
(2) The NDSI production module includes two parts: index calculation, and production task submission and monitoring. HDF images related to a production task are read as byte streams, and the Beam library is used to parse each image byte stream into a Product object. The MapReduce distributed framework performs the production tasks while the task status is monitored; when a production task completes, the remote sensing image management module is called to store the NDSI products. (3) The system service module includes image search and NDSI product download. Given image metadata attributes described in JSON format, it returns the sequence IDs of matching images in the HDFS file system; for a given MapReduce task ID, it packages the NDSI products output by the task into a ZIP file and returns the download link. (4) System evaluation: massive remote sensing data were downloaded and processed with the system to produce NDSI products, and the results show that the system has high extensibility, strong fault tolerance and fast production speed, and that the image processing results have high accuracy.
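    The index calculation at the heart of the production module is the NDSI itself, which for snow mapping is commonly computed from green and shortwave-infrared reflectances. A minimal sketch (the band choice and the 0.4 snow threshold are the conventional ones, not necessarily those of this particular system):

```python
import numpy as np

def ndsi(green, swir, eps=1e-12):
    """Normalized Difference Snow Index: (green - swir) / (green + swir).
    Snow reflects strongly in green and weakly in SWIR, so NDSI is high for snow."""
    green = np.asarray(green, dtype=np.float64)
    swir = np.asarray(swir, dtype=np.float64)
    return (green - swir) / (green + swir + eps)

green = np.array([[0.8, 0.2], [0.7, 0.1]])   # green-band reflectances
swir  = np.array([[0.1, 0.3], [0.15, 0.4]])  # SWIR-band reflectances
index = ndsi(green, swir)
snow_mask = index > 0.4                       # a commonly used snow threshold
```

    In a MapReduce setting, each map task would apply this per-pixel computation to one tile of the scene, which is why the index parallelizes so naturally.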

  9. Lung Parenchymal Signal Intensity in MRI: A Technical Review with Educational Aspirations Regarding Reversible Versus Irreversible Transverse Relaxation Effects in Common Pulse Sequences.

    PubMed

    Mulkern, Robert; Haker, Steven; Mamata, Hatsuho; Lee, Edward; Mitsouras, Dimitrios; Oshio, Koichi; Balasubramanian, Mukund; Hatabu, Hiroto

    2014-03-01

    Lung parenchyma is challenging to image with proton MRI. The large air space results in ~1/5th as many signal-generating protons compared to other organs. Air/tissue magnetic susceptibility differences lead to strong magnetic field gradients throughout the lungs and to broad frequency distributions, much broader than within other organs. Such distributions have been the subject of experimental and theoretical analyses which may reveal aspects of lung microarchitecture useful for diagnosis. Their most immediate relevance to current imaging practice is to cause rapid signal decays, commonly discussed in terms of short T2* values of 1 ms or lower at typical imaging field strengths. Herein we provide a brief review of previous studies describing and interpreting proton lung spectra. We then link these broad frequency distributions to rapid signal decays, though not necessarily the exponential decays generally used to define T2* values. We examine how these decays influence observed signal intensities and spatial mapping features associated with the most prominent torso imaging sequences, including spoiled gradient and spin echo sequences. Effects of imperfect refocusing pulses on the multiple echo signal decays in single shot fast spin echo (SSFSE) sequences and effects of broad frequency distributions on balanced steady state free precession (bSSFP) sequence signal intensities are also provided. The theoretical analyses are based on the concept of explicitly separating the effects of reversible and irreversible transverse relaxation processes, thus providing a somewhat novel and more general framework from which to estimate lung signal intensity behavior in modern imaging practice.

  10. Lung Parenchymal Signal Intensity in MRI: A Technical Review with Educational Aspirations Regarding Reversible Versus Irreversible Transverse Relaxation Effects in Common Pulse Sequences

    PubMed Central

    MULKERN, ROBERT; HAKER, STEVEN; MAMATA, HATSUHO; LEE, EDWARD; MITSOURAS, DIMITRIOS; OSHIO, KOICHI; BALASUBRAMANIAN, MUKUND; HATABU, HIROTO

    2014-01-01

    Lung parenchyma is challenging to image with proton MRI. The large air space results in ~1/5th as many signal-generating protons compared to other organs. Air/tissue magnetic susceptibility differences lead to strong magnetic field gradients throughout the lungs and to broad frequency distributions, much broader than within other organs. Such distributions have been the subject of experimental and theoretical analyses which may reveal aspects of lung microarchitecture useful for diagnosis. Their most immediate relevance to current imaging practice is to cause rapid signal decays, commonly discussed in terms of short T2* values of 1 ms or lower at typical imaging field strengths. Herein we provide a brief review of previous studies describing and interpreting proton lung spectra. We then link these broad frequency distributions to rapid signal decays, though not necessarily the exponential decays generally used to define T2* values. We examine how these decays influence observed signal intensities and spatial mapping features associated with the most prominent torso imaging sequences, including spoiled gradient and spin echo sequences. Effects of imperfect refocusing pulses on the multiple echo signal decays in single shot fast spin echo (SSFSE) sequences and effects of broad frequency distributions on balanced steady state free precession (bSSFP) sequence signal intensities are also provided. The theoretical analyses are based on the concept of explicitly separating the effects of reversible and irreversible transverse relaxation processes, thus providing a somewhat novel and more general framework from which to estimate lung signal intensity behavior in modern imaging practice. PMID:25228852

  11. The PDS-based Data Processing, Archiving and Management Procedures in Chang'e Mission

    NASA Astrophysics Data System (ADS)

    Zhang, Z. B.; Li, C.; Zhang, H.; Zhang, P.; Chen, W.

    2017-12-01

    PDS is adopted as the standard format of scientific data and the foundation of all data-related procedures in the Chang'e mission. Unlike the geographically distributed nature of the Planetary Data System, all procedures of data processing, archiving, management and distribution are carried out at the headquarters of the Ground Research and Application System of the Chang'e mission in a centralized manner. The raw data acquired by the ground stations are transmitted to and processed by the data preprocessing subsystem (DPS) for the production of PDS-compliant Level 0-Level 2 data products using established algorithms, with each product file described by an attached label. All products with the same orbit number are then assembled into a scheduled archiving task, along with an XML archive list file recording the properties of all product files, such as file name and file size. After receiving the archive request from the DPS, the data management subsystem (DMS) is invoked to parse the XML list file, to validate all the claimed files and their compliance with PDS using a prebuilt data dictionary, and then to extract the metadata of each data product file from its PDS label and the fields of its normalized filename. Various requirements of data management, retrieval, distribution and application can be well met using flexible combinations of the rich metadata empowered by the PDS.
In the forthcoming CE-5 mission, the design of data structures and procedures will be updated from PDS version 3, used in the previous CE-1, CE-2 and CE-3 missions, to the new version 4. The main changes are: 1) a dedicated detached XML label will be used to describe the scientific data acquired by the 4 instruments carried; the XML parsing framework used for archive list validation will be reused for this label after some necessary adjustments; 2) image data acquired by the panorama camera, landing camera and lunar mineralogical spectrometer will use an Array_2D_Image/Array_3D_Image object to store image data and a Table_Character object to store the image frame header, while the tabulated data acquired by the lunar regolith penetrating radar will use a Table_Binary object to store measurements.

  12. Finite slice analysis (FINA) of sliced and velocity mapped images on a Cartesian grid

    NASA Astrophysics Data System (ADS)

    Thompson, J. O. F.; Amarasinghe, C.; Foley, C. D.; Rombes, N.; Gao, Z.; Vogels, S. N.; van de Meerakker, S. Y. T.; Suits, A. G.

    2017-08-01

    Although time-sliced imaging yields improved signal-to-noise ratio and resolution compared with unsliced velocity-mapped ion images, for the finite slice widths encountered in real experiments there is a loss of resolution and of recovered intensity for the slow fragments. Recently, we reported a new approach that permits correction of these effects for an arbitrarily sliced distribution of a 3D charged-particle cloud. This finite slice analysis (FinA) method utilizes basis functions that model the out-of-plane contribution of a given velocity component to the image, for sequential subtraction in a spherical polar coordinate system. However, the original approach suffers from slow processing due to the weighting procedure needed to accurately model the out-of-plane projection of an anisotropic angular distribution. To overcome this issue we present a variant in which the FinA approach is performed in a cylindrical coordinate system (Cartesian in the image plane) rather than a spherical polar coordinate system. Dubbed C-FinA, this variant is applied in much the same manner. We compare it to the polar FinA method and find that the processing time (for a 510 × 510 pixel image) improves in the most extreme case by a factor of 100. We also show that although the resulting velocity resolution is not quite as high as that of the polar version, the new approach shows superior resolution for fine structure in the differential cross sections. We demonstrate the method on a range of experimental and synthetic data at different effective slice widths.

  13. High-resolution, continuous field-of-view (FOV), non-rotating imaging system

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L. (Inventor); Stirbl, Robert C. (Inventor); Aghazarian, Hrand (Inventor); Padgett, Curtis W. (Inventor)

    2010-01-01

    A high resolution CMOS imaging system especially suitable for use in a periscope head. The imaging system includes a sensor head for scene acquisition, and a control apparatus inclusive of distributed processors and software for device-control, data handling, and display. The sensor head encloses a combination of wide field-of-view CMOS imagers and narrow field-of-view CMOS imagers. Each bank of imagers is controlled by a dedicated processing module in order to handle information flow and image analysis of the outputs of the camera system. The imaging system also includes automated or manually controlled display system and software for providing an interactive graphical user interface (GUI) that displays a full 360-degree field of view and allows the user or automated ATR system to select regions for higher resolution inspection.

  14. Retinex enhancement of infrared images.

    PubMed

    Li, Ying; He, Renjie; Xu, Guizhi; Hou, Changzhi; Sun, Yunyan; Guo, Lei; Rao, Liyun; Yan, Weili

    2008-01-01

    With its ability to image the temperature distribution of the body, infrared imaging is promising for the diagnosis and prognosis of diseases. However, the poor quality of raw infrared images has hindered applications, and one of the essential problems is the low contrast of the imaged object. In this paper, image enhancement based on the Retinex theory is studied, a process that automatically restores visual realism to images. The algorithms, including the Frankle-McCann algorithm, the McCann99 algorithm, the single-scale Retinex algorithm, the multi-scale Retinex algorithm and the multi-scale Retinex algorithm with color restoration (MSRCR), are applied to the enhancement of infrared images. Entropy measurements were compared alongside visual inspection, and the results show that algorithms based on the Retinex theory are able to enhance infrared images. Of the algorithms compared, MSRCR demonstrated the best performance.
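    A minimal numpy-only sketch of the single-scale and multi-scale Retinex algorithms evaluated above (the surround scales are illustrative defaults, not the parameters used in the paper):

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with edge padding, implemented with numpy only."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    padded = np.pad(img, radius, mode='edge')
    rows = np.apply_along_axis(np.convolve, 1, padded, kernel, mode='valid')
    return np.apply_along_axis(np.convolve, 0, rows, kernel, mode='valid')

def single_scale_retinex(img, sigma=30.0, eps=1e-6):
    """SSR: reflectance estimate log(I) - log(Gaussian-smoothed illumination)."""
    img = np.asarray(img, dtype=np.float64) + eps
    return np.log(img) - np.log(gaussian_blur(img, sigma))

def multi_scale_retinex(img, sigmas=(5.0, 15.0, 45.0)):
    """MSR: average of SSR outputs over several surround scales."""
    return np.mean([single_scale_retinex(img, s) for s in sigmas], axis=0)
```

    Subtracting the log of the smoothed image removes the slowly varying illumination (here, the background temperature gradient), which is what boosts local contrast in low-contrast infrared images.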

  15. Imaging of current distributions in superconducting thin film structures

    NASA Astrophysics Data System (ADS)

    Dönitz, Dietmar

    2006-10-01

    Local analysis plays an important role in many fields of scientific research, yet imaging methods are not very common in the investigation of superconductors. For more than 20 years, Low Temperature Scanning Electron Microscopy (LTSEM) has been successfully used at the University of Tübingen for studying condensed matter phenomena, especially superconductivity. In this thesis LTSEM was used for imaging current distributions in different superconducting thin film structures: - Imaging of current distributions in Josephson junctions with a ferromagnetic interlayer, also known as SIFS junctions, showed inhomogeneous current transport across the junctions, which directly led to an improvement in the fabrication process. An investigation of improved samples showed a very homogeneous current distribution without any trace of magnetic domains; either such domains were not present, or they were too small to be imaged with the LTSEM. - An investigation of Nb/YBCO zigzag Josephson junctions yielded important information on signal formation in the LTSEM for Josephson junctions in both the short and the long limit. Using a reference junction, our signal formation model could be verified, thus confirming earlier results on short zigzag junctions. These results, which could be reproduced in this work, support the theory of d-wave symmetry in the superconducting order parameter of YBCO. Furthermore, investigations of quasiparticle tunneling in the zigzag junctions showed the existence of Andreev bound states, another indication of d-wave symmetry in YBCO. - The LTSEM study of Hot Electron Bolometers (HEB) allowed the first successful imaging of a stable 'hot spot', a self-heating region in HEB structures. Moreover, the electron beam was used to induce an otherwise unstable hot spot. Both investigations yielded information on the homogeneity of the samples.
- An entirely new method of imaging the current distribution in superconducting quantum interference devices (SQUIDs) was developed. It is based on vortex imaging by LTSEM, established several years earlier. The vortex signals can be used as local detectors of the vortex-free circulating sheet-current distribution J. Compared to previous inversion methods that infer J from the measured magnetic field, this method gives a more direct measurement of the current distribution. The experimental results were in very good agreement with numerical calculations of J. The presented investigations show how versatile and useful Low Temperature Scanning Electron Microscopy can be for studying superconducting thin film structures; one may thus expect that many more important results can be obtained with this method.

  16. 3D optical imagery for motion compensation in a limb ultrasound system

    NASA Astrophysics Data System (ADS)

    Ranger, Bryan J.; Feigin, Micha; Zhang, Xiang; Mireault, Al; Raskar, Ramesh; Herr, Hugh M.; Anthony, Brian W.

    2016-04-01

    Conventional processes for prosthetic socket fabrication are heavily subjective, often resulting in an interface to the human body that is neither comfortable nor completely functional. With nearly 100% of amputees reporting that they experience discomfort with the wearing of their prosthetic limb, designing an effective interface to the body can significantly affect quality of life and future health outcomes. Active research in medical imaging and biomechanical tissue modeling of residual limbs has led to significant advances in computer aided prosthetic socket design, demonstrating an interest in moving toward more quantifiable processes that are still patient-specific. In our work, medical ultrasonography is being pursued to acquire data that may quantify and improve the design process and fabrication of prosthetic sockets while greatly reducing cost compared to an MRI-based framework. This paper presents a prototype limb imaging system that uses a medical ultrasound probe, mounted to a mechanical positioning system and submerged in a water bath. The limb imaging is combined with three-dimensional optical imaging for motion compensation. Images are collected circumferentially around the limb and combined into cross-sectional axial image slices, resulting in a compound image that shows tissue distributions and anatomical boundaries similar to magnetic resonance imaging. In this paper we provide a progress update on our system development, along with preliminary results as we move toward full volumetric imaging of residual limbs for prosthetic socket design. This demonstrates a novel multi-modal approach to residual limb imaging.

  17. Detection of high molecular weight proteins by MALDI imaging mass spectrometry.

    PubMed

    Mainini, Veronica; Bovo, Giorgio; Chinello, Clizia; Gianazza, Erica; Grasso, Marco; Cattoretti, Giorgio; Magni, Fulvio

    2013-06-01

    MALDI imaging mass spectrometry (IMS) is a unique technology for exploring the spatial distribution of biomolecules directly on tissue. It allows the in situ investigation of a large number of small proteins and peptides. Detecting high molecular weight proteins by MALDI IMS still represents an important challenge; solving it would allow direct investigation of the distribution of more proteins involved in biological processes, such as cytokines, enzymes, neuropeptide precursors and receptors. In this work we compare the traditional protocol based on sinapinic acid with a comparable protocol using ferulic acid as the matrix. The data show a remarkable increase in signal acquisition in the mass range of 20k to 150k Th. Moreover, we report molecular images of biomolecules above 70k Th, demonstrating the possibility of expanding the application of this technology in both clinical investigations and basic science.

  18. Physics Based Modeling and Rendering of Vegetation in the Thermal Infrared

    NASA Technical Reports Server (NTRS)

    Smith, J. A.; Ballard, J. R., Jr.

    1999-01-01

    We outline a procedure for rendering physically based thermal infrared images of simple vegetation scenes. Our approach incorporates the biophysical processes that affect the temperature distribution of the elements within a scene. Computer graphics plays a key role in two respects: first, in computing the distribution of shaded and sunlit facets in the scene, and second, in the final image rendering once the temperatures of all elements have been computed. We illustrate our approach for a simple corn scene whose three-dimensional geometry is constructed from measured morphological attributes of the row crop; statistical methods are used to build a scene representation consistent with the measured characteristics. The rendered images exhibit realistic directional behavior as a function of view and sun angle, and the root-mean-square error between measured and predicted brightness temperatures for the scene was 2.1 deg C.
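    The comparison quoted in the abstract — an RMS error between measured and predicted brightness temperatures — rests on Planck's law and its inversion. A minimal sketch of that machinery (the function names and this particular validation style are our own, not taken from the paper):

    ```python
    import math

    H = 6.626e-34   # Planck constant, J s
    C = 2.998e8     # speed of light, m/s
    K = 1.381e-23   # Boltzmann constant, J/K

    def planck_radiance(wavelength_m, temp_k):
        """Spectral radiance, W / (m^2 sr m), from Planck's law."""
        a = 2.0 * H * C**2 / wavelength_m**5
        b = H * C / (wavelength_m * K * temp_k)
        return a / (math.exp(b) - 1.0)

    def brightness_temperature(wavelength_m, radiance):
        """Invert Planck's law: the blackbody temperature that would
        produce the observed radiance at this wavelength."""
        a = 2.0 * H * C**2 / wavelength_m**5
        return H * C / (wavelength_m * K * math.log(1.0 + a / radiance))

    def rmse(predicted, measured):
        """Root-mean-square error between paired temperature lists."""
        n = len(predicted)
        return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured)) / n)

    # Round trip at 10 um (thermal infrared) for a 300 K facet.
    L = planck_radiance(10e-6, 300.0)
    T = brightness_temperature(10e-6, L)   # recovers 300 K
    ```

    Rendering a thermal image then amounts to evaluating `planck_radiance` per facet at its computed temperature; comparing against measurements inverts the sensor radiances back to brightness temperatures before taking the RMSE.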

  19. Imaging natural materials with a quasi-microscope. [spectrophotometry of granular materials

    NASA Technical Reports Server (NTRS)

    Bragg, S.; Arvidson, R.

    1977-01-01

    A Viking lander camera with auxiliary optics mounted inside the dust post was evaluated to determine its capability for imaging the inorganic properties of granular materials. During mission operations, prepared samples would be delivered to a plate positioned within the camera's field of view and depth of focus. The auxiliary optics would then allow soil samples to be imaged with an 11 μm pixel size in the broadband (high-resolution, black and white) mode, and a 33 μm pixel size in the multispectral mode. The equipment would be used to characterize: (1) the size distribution of grains produced by igneous (intrusive and extrusive) processes or by shock metamorphism; (2) the size distribution resulting from crushing, chemical alteration, or hydraulic or aerodynamic sorting; (3) the shape, degree of grain roundness, and surface texture induced by mechanical and chemical alteration; and (4) the mineralogy and chemistry of grains.

  20. Componential distribution analysis of food using near infrared ray image

    NASA Astrophysics Data System (ADS)

    Yamauchi, Hiroki; Kato, Kunihito; Yamamoto, Kazuhiko; Ogawa, Noriko; Ohba, Kimie

    2008-11-01

    The components of food related to "deliciousness" are usually evaluated by componential analysis, which determines the content and type of each component. However, conventional componential analysis cannot resolve the spatial distribution of a component, and the measurement is time-consuming. We propose a method to measure the two-dimensional distribution of a component in food using a near infrared (IR) image. The advantage of our method is that it visualizes components that are invisible to the eye. Many food components have characteristic absorption and reflection of light in the IR range, so the component content can be measured by subtraction between images taken at two near-IR wavelengths. In this paper, we describe a method to measure food components using near IR image processing, and we show an application visualizing the saccharose distribution in a pumpkin.
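    The two-wavelength subtraction idea can be sketched directly: one image is taken at a wavelength the component absorbs, one at a nearby reference wavelength it does not, and the normalised difference highlights where the component is concentrated. A minimal illustration under those assumptions (the function name and normalisation choice are ours, not from the paper):

    ```python
    import numpy as np

    def component_map(img_absorbing, img_reference):
        """Estimate a component's 2-D distribution from two near-IR images.

        `img_absorbing`: image at a wavelength the component absorbs.
        `img_reference`: image at a nearby wavelength it does not.
        Returns a map normalised to [0, 1]; higher means more component.
        """
        a = np.asarray(img_absorbing, dtype=float)
        r = np.asarray(img_reference, dtype=float)
        diff = r - a                      # stronger absorption -> larger value
        span = diff.max() - diff.min()
        if span == 0:
            return np.zeros_like(diff)
        return (diff - diff.min()) / span

    # Toy example: uniform reference, absorption strongest at lower right.
    ref = np.full((2, 2), 10.0)
    absb = np.array([[10.0, 8.0], [6.0, 2.0]])
    m = component_map(absb, ref)
    ```

    In practice one would also correct for illumination and surface-reflectance differences between the two wavelengths, but the core of the method is this per-pixel difference.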
