Sample records for correlation image sensor

  1. Generating Artificial Reference Images for Open Loop Correlation Wavefront Sensors

    NASA Astrophysics Data System (ADS)

    Townson, M. J.; Love, G. D.; Saunter, C. D.

    2018-05-01

Shack-Hartmann wavefront sensors for both solar and laser guide star adaptive optics (with elongated spots) need to observe extended objects. Correlation techniques have been successfully employed to measure the wavefront gradient in solar adaptive optics systems and have been proposed for laser guide star systems. In this paper we describe a method for synthesising reference images for correlation Shack-Hartmann wavefront sensors with a larger field of view than individual sub-apertures. We then show how these supersized reference images can increase the performance of correlation wavefront sensors in regimes where large relative shifts are induced between sub-apertures, such as those observed in open-loop wavefront sensors. The technique we describe requires no information beyond the wavefront-sensor images themselves, making it available as an entirely "software" upgrade to an existing adaptive optics system. For solar adaptive optics we show that the supersized reference images extend the magnitude of shifts which can be accurately measured from 12% to 50% of the field of view of a sub-aperture; in laser guide star wavefront sensors, the magnitude of centroids that can be accurately measured increases from 12% to 25% of the total field of view of the sub-aperture.
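The correlation step these sensors rely on can be sketched simply: each sub-aperture image is cross-correlated with the (possibly supersized) reference, and the correlation peak location gives the local shift. A minimal integer-pixel sketch (real systems add sub-pixel interpolation around the peak; function and variable names here are illustrative):

```python
import numpy as np

def subap_shift(sub_img, ref_img):
    # FFT-based cross-correlation; the peak location is the relative shift.
    f = np.fft.fft2(sub_img - sub_img.mean())
    g = np.fft.fft2(ref_img - ref_img.mean())
    xc = np.fft.ifft2(f * np.conj(g)).real
    peak = np.unravel_index(np.argmax(xc), xc.shape)
    # np.fft wraps negative shifts to the far edge; unwrap them.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xc.shape))
```

For a sub-aperture image that is a pure translation of the reference, the peak sits exactly at the shift; extended-scene structure broadens it, which is why the supersized references described above help for large shifts.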

  2. Time-of-flight camera via a single-pixel correlation image sensor

    NASA Astrophysics Data System (ADS)

    Mao, Tianyi; Chen, Qian; He, Weiji; Dai, Huidong; Ye, Ling; Gu, Guohua

    2018-04-01

A time-of-flight imager based on a single-pixel correlation image sensor is proposed for noise-free depth-map acquisition in the presence of ambient light. A digital micro-mirror device and a time-modulated IR laser provide spatial and temporal illumination of the unknown object. Compressed sensing and the 'four bucket principle' method are combined to reconstruct the depth map from a sequence of measurements at a low sampling rate. A second-order correlation transform is also introduced to reduce the noise from the detector itself and from direct ambient light. Computer simulations are presented to validate the computational models and the improvement of the reconstructions.
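The 'four bucket principle' recovers the modulation phase, and hence depth, from four correlation samples taken at 0°, 90°, 180° and 270° phase offsets. A sketch of that arithmetic (the bucket sign convention below is our assumption, not stated in the record):

```python
import math

C = 299792458.0  # speed of light, m/s

def four_bucket_depth(a0, a1, a2, a3, f_mod):
    # Buckets assumed to follow a_i = A + B*cos(phi + i*pi/2), so
    # a3 - a1 = 2B*sin(phi) and a0 - a2 = 2B*cos(phi); ambient offset
    # A and amplitude B cancel in the differences.
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    # Round trip: phase = 4*pi*f_mod*d / c  ->  d = c*phase / (4*pi*f_mod)
    return C * phase / (4 * math.pi * f_mod)
```

Because the offset A drops out of both differences, ambient light affects only the noise, not the phase estimate, which is the property the abstract exploits.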

  3. Influence of the internal wall thickness of electrical capacitance tomography sensors on image quality

    NASA Astrophysics Data System (ADS)

    Liang, Shiguo; Ye, Jiamin; Wang, Haigang; Wu, Meng; Yang, Wuqiang

    2018-03-01

    In the design of electrical capacitance tomography (ECT) sensors, the internal wall thickness can vary with specific applications, and it is a key factor that influences the sensitivity distribution and image quality. This paper will discuss the effect of the wall thickness of ECT sensors on image quality. Three flow patterns are simulated for wall thicknesses of 2.5 mm to 15 mm on eight-electrode ECT sensors. The sensitivity distributions and potential distributions are compared for different wall thicknesses. Linear back-projection and Landweber iteration algorithms are used for image reconstruction. Relative image error and correlation coefficients are used for image evaluation using both simulation and experimental data.
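The reconstruction and evaluation steps named above (linear back-projection, relative image error, correlation coefficient) are standard in ECT and can be sketched with a sensitivity matrix S whose rows are electrode-pair measurements and columns are image pixels; shapes and names below are illustrative:

```python
import numpy as np

def lbp_reconstruct(S, lam):
    # Linear back-projection: back-project normalised capacitances lam
    # through the sensitivity matrix, then normalise per pixel.
    return (S.T @ lam) / (S.T @ np.ones(S.shape[0]))

def relative_image_error(g_hat, g):
    return float(np.linalg.norm(g_hat - g) / np.linalg.norm(g))

def correlation_coefficient(g_hat, g):
    return float(np.corrcoef(g_hat, g)[0, 1])
```

An 8-electrode sensor such as the one studied gives 28 independent electrode-pair measurements, so S would be 28 × n_pixels there.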

  4. Attitude-correlated frames approach for a star sensor to improve attitude accuracy under highly dynamic conditions.

    PubMed

    Ma, Liheng; Zhan, Dejun; Jiang, Guangwen; Fu, Sihua; Jia, Hui; Wang, Xingshu; Huang, Zongsheng; Zheng, Jiaxing; Hu, Feng; Wu, Wei; Qin, Shiqiao

    2015-09-01

The attitude accuracy of a star sensor decreases rapidly when star images become motion-blurred under dynamic conditions. Existing techniques concentrate on a single frame of star images to solve this problem, and improvements are obtained to a certain extent. An attitude-correlated frames (ACF) approach, which concentrates on the features of the attitude transforms of adjacent star image frames, is proposed to improve upon the existing techniques. The attitude transforms between different star image frames are measured precisely by the strap-down gyro unit. With the ACF method, a much larger star image frame is obtained through the combination of adjacent frames. As a result, the degradation of attitude accuracy caused by motion blurring is compensated for. The improvement of the attitude accuracy is approximately proportional to the square root of the number of correlated star image frames. Simulations and experimental results indicate that the ACF approach is effective in removing random noise and improving the attitude determination accuracy of the star sensor under highly dynamic conditions.
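The quoted square-root law is the usual averaging argument: random attitude noise across N gyro-registered frames averages down by 1/√N. A quick Monte-Carlo check of that scaling (a sketch of the statistics, not of the ACF registration itself):

```python
import numpy as np

def predicted_error(sigma_single, n_frames):
    # ACF prediction: accuracy improves with the square root of the
    # number of correlated frames.
    return sigma_single / np.sqrt(n_frames)

def empirical_error(sigma_single, n_frames, trials=20000, seed=0):
    # Average n_frames noisy attitude samples per trial and measure
    # the spread of the averages across trials.
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma_single, size=(trials, n_frames))
    return float(noise.mean(axis=1).std())
```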

  5. Fixed Pattern Noise pixel-wise linear correction for crime scene imaging CMOS sensor

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Messinger, David W.; Dube, Roger R.; Ientilucci, Emmett J.

    2017-05-01

Filtered multispectral imaging might be a useful method for crime scene documentation and evidence detection due to its abundant spectral information as well as its non-contact and non-destructive nature. A low-cost, portable multispectral crime scene imaging device would therefore be highly useful and efficient. The second-generation crime scene imaging system uses a CMOS imaging sensor to capture the spatial scene and bandpass Interference Filters (IFs) to capture spectral information. Unfortunately, CMOS sensors suffer from severe spatial non-uniformity compared to CCD sensors, and the major cause is Fixed Pattern Noise (FPN). IFs suffer from a "blue shift" effect and introduce spatially and spectrally correlated errors. Therefore, FPN correction is critical to enhance crime scene image quality and is also helpful for spatial-spectral noise de-correlation. In this paper, a pixel-wise linear radiance-to-Digital-Count (DC) conversion model is constructed for the crime scene imaging CMOS sensor. The pixel-wise conversion gain Gi,j and Dark Signal Non-Uniformity (DSNU) Zi,j are calculated. The conversion gain is further divided into four components: an FPN row component, an FPN column component, a defects component and the effective photo-response signal component. The conversion gain is then corrected by averaging out the FPN row and column components and the defects component, so that the sensor conversion gain is uniform. Based on the corrected conversion gain and the incident radiance estimated by inverting the pixel-wise linear radiance-to-DC model, image spatial uniformity can be enhanced up to 7 times relative to the raw image, and the larger the image DC value within its dynamic range, the greater the enhancement.
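The pixel-wise linear model above inverts directly: with per-pixel gain G and dark-signal offset Z, the radiance estimate is (DC − Z)/G. A minimal sketch of that inversion (array names are illustrative):

```python
import numpy as np

def correct_fpn(dc, gain, dsnu):
    # Pixel-wise linear model: DC = G * L + Z. Inverting it gives
    # L_hat = (DC - Z) / G: subtracting Z removes the DSNU, and
    # dividing by the per-pixel gain flat-fields the FPN.
    return (dc - dsnu) / gain
```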

  6. Study the performance of star sensor influenced by space radiation damage of image sensor

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Yudong; Wen, Lin; Guo, Qi; Zhang, Xingyao

    2018-03-01

A star sensor is an essential component of a spacecraft attitude control system. Space radiation can degrade star sensor performance, cause abnormal operation, and reduce attitude measurement accuracy and reliability. Many studies have been dedicated to radiation effects on Charge-Coupled Device (CCD) image sensors, but fewer focus on the radiation effects on the star sensor as a whole. The innovation of this paper is to study radiation effects from the device level to the system level. The influence of radiation-induced degradation of the CCD image sensor's sensitive parameters on the performance parameters of the star sensor is studied. The correlation among the proton radiation effect, the non-uniformity noise of the CCD image sensor and the performance parameters of the star sensor is analyzed. This paper establishes a foundation for the study of error prediction and correction techniques for on-orbit star sensor attitude measurement, and provides a theoretical basis for the design of high-performance star sensors.

  7. Analysis of remote sensing data collected for detection and mapping of oil spills: Reduction and analysis of multi-sensor airborne data of the NASA Wallops oil spill exercise of November 1978

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Airborne, remotely sensed data of the NASA Wallops controlled oil spill were corrected, reduced and analysed. Sensor performance comparisons were made by registering data sets from different sensors, which were near-coincident in time and location. Multispectral scanner images were, in turn, overlayed with profiles of correlation between airborne and laboratory-acquired fluorosensor spectra of oil; oil-thickness contours derived (by NASA) from a scanning fluorosensor and also from a two-channel scanning microwave radiometer; and synthetic aperture radar X-HH images. Microwave scatterometer data were correlated with dual-channel (UV and TIR) line scanner images of the oil slick.

  8. A Dual-Mode Large-Arrayed CMOS ISFET Sensor for Accurate and High-Throughput pH Sensing in Biomedical Diagnosis.

    PubMed

    Huang, Xiwei; Yu, Hao; Liu, Xu; Jiang, Yu; Yan, Mei; Wu, Dongping

    2015-09-01

Existing ISFET-based DNA sequencing detects hydrogen ions released during the polymerization of DNA strands on microbeads, which are scattered into the microwell array above the ISFET sensor with unknown distribution. However, false pH detection occurs at empty microwells due to crosstalk from neighboring microbeads. In this paper, a dual-mode CMOS ISFET sensor is proposed to achieve accurate pH detection for DNA sequencing. Dual-mode sensing, optical and chemical, is realized by integrating a CMOS image sensor (CIS) with an ISFET pH sensor, fabricated in a standard 0.18-μm CIS process. With microbead physical locations accurately determined by the CIS pixels through contact imaging, the dual-mode sensor can correlate the local pH for one DNA slice with one location-determined microbead, which improves pH detection accuracy. Moreover, toward high-throughput DNA sequencing, a correlated-double-sampling readout that supports a large array in both modes is deployed to reduce pixel-to-pixel non-uniformity such as threshold-voltage mismatch. The proposed CMOS dual-mode sensor is experimentally shown to produce a well-correlated pH map and optical image for microbeads, with a pH sensitivity of 26.2 mV/pH, a fixed pattern noise (FPN) reduction from 4% to 0.3%, and a readout speed of 1200 frames/s. A dual-mode CMOS ISFET sensor with suppressed FPN for accurate large-arrayed pH sensing is thus proposed and demonstrated with state-of-the-art measured results toward accurate and high-throughput DNA sequencing. The developed sensor has great potential for future personal genome diagnostics with high accuracy and low cost.
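The correlated-double-sampling readout mentioned above reduces FPN with one subtraction per pixel: taking the difference between each pixel's signal sample and its reset sample cancels offsets (such as threshold-voltage mismatch) that are common to both. A one-line sketch:

```python
import numpy as np

def cds_readout(reset_frame, signal_frame):
    # Correlated double sampling: offset terms present in both samples
    # (threshold-voltage mismatch, reset-level variation) cancel in the
    # difference, leaving the per-pixel signal.
    return signal_frame - reset_frame
```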

  9. Toward CMOS image sensor based glucose monitoring.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2012-09-07

The complementary metal oxide semiconductor (CMOS) image sensor is a powerful tool for biosensing applications. In the present study, a CMOS image sensor has been exploited for detecting glucose levels through simple photon-count variation with high sensitivity. Various concentrations of glucose (100 mg/dL to 1000 mg/dL) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip with varying color density and hit the sensor surface. The photon count registered by the CMOS image sensor depends on the color density, and hence on the glucose concentration, and is converted into digital form. By correlating the obtained digital results with glucose concentration, it is possible to measure a wide range of blood glucose levels with great linearity, and this technique could therefore enable convenient point-of-care diagnosis.
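The final correlation step can be sketched as a linear calibration between photon counts and known glucose concentrations; the linear form is suggested by the "great linearity" the record reports, and the closed-form least-squares fit below (with illustrative numbers) is our sketch, not the authors' procedure:

```python
def fit_linear_calibration(counts, concentrations):
    # Ordinary least-squares line conc = slope * count + intercept,
    # using the closed-form formulas (pure Python, no dependencies).
    n = len(counts)
    mx = sum(counts) / n
    my = sum(concentrations) / n
    sxx = sum((x - mx) ** 2 for x in counts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(counts, concentrations))
    slope = sxy / sxx
    return slope, my - slope * mx
```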

  10. General Model of Photon-Pair Detection with an Image Sensor

    NASA Astrophysics Data System (ADS)

    Defienne, Hugo; Reichert, Matthew; Fleischer, Jason W.

    2018-05-01

    We develop an analytic model that relates intensity correlation measurements performed by an image sensor to the properties of photon pairs illuminating it. Experiments using an effective single-photon counting camera, a linear electron-multiplying charge-coupled device camera, and a standard CCD camera confirm the model. The results open the field of quantum optical sensing using conventional detectors.
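The intensity-correlation measurement such a model describes amounts to estimating, over many frames, the pixel-pair covariance ⟨I_i I_j⟩ − ⟨I_i⟩⟨I_j⟩, which peaks at pixel pairs illuminated by correlated photons. A toy estimator (a sketch of the measurement, not of the paper's analytic model):

```python
import numpy as np

def pair_covariance(frames):
    # frames: (n_frames, n_pixels). Returns the pixel-pair covariance
    # matrix <I_i I_j> - <I_i><I_j> estimated over the frame stack.
    x = np.asarray(frames, dtype=float)
    mean = x.mean(axis=0)
    return x.T @ x / x.shape[0] - np.outer(mean, mean)
```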

  11. Earth Surface Monitoring with COSI-Corr, Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Leprince, S.; Ayoub, F.; Avouac, J.

    2009-12-01

Co-registration of Optically Sensed Images and Correlation (COSI-Corr) is a software package developed at the California Institute of Technology (USA) for accurate geometrical processing of optical satellite and aerial imagery. Initially developed for the measurement of co-seismic ground deformation using optical imagery, COSI-Corr is now used for a wide range of applications in Earth Sciences, which take advantage of the software's capability to co-register, with very high accuracy, images taken from different sensors and acquired at different times. As long as a sensor is supported in COSI-Corr, all images between the supported sensors can be accurately orthorectified and co-registered. For example, it is possible to co-register a series of SPOT images, a series of aerial photographs, or a series of aerial photographs with a series of SPOT images. Currently supported sensors include the SPOT 1-5, Quickbird, WorldView-1 and Formosat-2 satellites, the ASTER instrument, and frame camera acquisitions from, e.g., aerial surveys or declassified satellite imagery. Potential applications include accurate change detection between multi-temporal and multi-spectral images, and the calibration of pushbroom cameras. In particular, COSI-Corr provides a powerful correlation tool, which allows for accurate estimation of surface displacement. The accuracy depends on many factors (e.g., cloud, snow, and vegetation cover, shadows, temporal changes in general, steadiness of the imaging platform, and defects of the imaging system), but in practice the standard deviation of the measurements obtained from the correlation of multi-temporal images is typically around 1/20 to 1/10 of the pixel size. The software package also includes post-processing tools such as denoising, destriping, and stacking tools to facilitate data interpretation. Examples drawn from current research in, e.g., seismotectonics, glaciology, and geomorphology will be presented.
COSI-Corr is developed in IDL (Interactive Data Language), integrated under the user-friendly ENVI interface (Environment for Visualizing Images), and is distributed free of charge for academic research purposes.
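The 1/20 to 1/10-pixel accuracy quoted above comes from sub-pixel interpolation of the correlation surface. COSI-Corr itself uses phase-plane correlation; shown here, only as an illustration of the idea, is the simplest textbook alternative, a parabola fitted through the correlation samples around the integer peak:

```python
import numpy as np

def subpixel_peak(corr):
    # Three-point parabolic interpolation around the integer peak of a
    # 1-D correlation profile; returns a fractional peak position.
    i = int(np.argmax(corr))
    c0, c1, c2 = corr[i - 1], corr[i], corr[i + 1]
    return i + 0.5 * (c0 - c2) / (c0 - 2.0 * c1 + c2)
```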

  12. Photon counting phosphorescence lifetime imaging with TimepixCam

    DOE PAGES

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus; ...

    2017-01-12

TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window, and read out by a Timepix ASIC. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting (TCSPC) imaging. We have characterised the photon detection capabilities of this detector system, and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.
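Mapping phosphorescence decays from TCSPC data reduces, per pixel, to fitting a decay time to the photon-arrival histogram. A log-linear sketch for a mono-exponential decay (real lifetime-imaging analysis typically uses maximum-likelihood fitting and accounts for the instrument response function):

```python
import numpy as np

def fit_lifetime(t, counts):
    # counts ~ A * exp(-t / tau) is a straight line in log space, so an
    # ordinary least-squares fit of log(counts) vs t has slope -1/tau.
    mask = counts > 0
    slope, _ = np.polyfit(t[mask], np.log(counts[mask]), 1)
    return -1.0 / slope
```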

  13. Photon counting phosphorescence lifetime imaging with TimepixCam.

    PubMed

    Hirvonen, Liisa M; Fisher-Levine, Merlin; Suhling, Klaus; Nomerotski, Andrei

    2017-01-01

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window and read out by a Timepix Application Specific Integrated Circuit. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting imaging. We have characterised the photon detection capabilities of this detector system and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  14. Photon counting phosphorescence lifetime imaging with TimepixCam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus

TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window, and read out by a Timepix ASIC. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting (TCSPC) imaging. We have characterised the photon detection capabilities of this detector system, and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  15. Photon counting phosphorescence lifetime imaging with TimepixCam

    NASA Astrophysics Data System (ADS)

    Hirvonen, Liisa M.; Fisher-Levine, Merlin; Suhling, Klaus; Nomerotski, Andrei

    2017-01-01

    TimepixCam is a novel fast optical imager based on an optimized silicon pixel sensor with a thin entrance window and read out by a Timepix Application Specific Integrated Circuit. The 256 × 256 pixel sensor has a time resolution of 15 ns at a sustained frame rate of 10 Hz. We used this sensor in combination with an image intensifier for wide-field time-correlated single photon counting imaging. We have characterised the photon detection capabilities of this detector system and employed it on a wide-field epifluorescence microscope to map phosphorescence decays of various iridium complexes with lifetimes of about 1 μs in 200 μm diameter polystyrene beads.

  16. Sensor-based auto-focusing system using multi-scale feature extraction and phase correlation matching.

    PubMed

    Jang, Jinbeum; Yoo, Yoonjong; Kim, Jongheon; Paik, Joonki

    2015-03-10

    This paper presents a novel auto-focusing system based on a CMOS sensor containing pixels with different phases. Robust extraction of features in a severely defocused image is the fundamental problem of a phase-difference auto-focusing system. In order to solve this problem, a multi-resolution feature extraction algorithm is proposed. Given the extracted features, the proposed auto-focusing system can provide the ideal focusing position using phase correlation matching. The proposed auto-focusing (AF) algorithm consists of four steps: (i) acquisition of left and right images using AF points in the region-of-interest; (ii) feature extraction in the left image under low illumination and out-of-focus blur; (iii) the generation of two feature images using the phase difference between the left and right images; and (iv) estimation of the phase shifting vector using phase correlation matching. Since the proposed system accurately estimates the phase difference in the out-of-focus blurred image under low illumination, it can provide faster, more robust auto focusing than existing systems.
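Step (iv) above, phase correlation matching between the left- and right-phase images, can be sketched in 1-D: normalise the cross-spectrum to unit magnitude and locate the delta-like peak of its inverse transform. Names below are illustrative:

```python
import numpy as np

def phase_correlation_shift(left, right):
    # Whitened cross-spectrum -> near-delta peak at the relative shift;
    # whitening makes the estimate robust to illumination differences.
    R = np.fft.fft(left) * np.conj(np.fft.fft(right))
    R /= np.abs(R) + 1e-12
    r = np.fft.ifft(R).real
    k = int(np.argmax(r))
    return k if k <= len(r) // 2 else k - len(r)
```

The recovered shift plays the role of the phase shifting vector: it tells the lens drive how far, and in which direction, to move toward focus.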

  17. Sensor-Based Auto-Focusing System Using Multi-Scale Feature Extraction and Phase Correlation Matching

    PubMed Central

    Jang, Jinbeum; Yoo, Yoonjong; Kim, Jongheon; Paik, Joonki

    2015-01-01

    This paper presents a novel auto-focusing system based on a CMOS sensor containing pixels with different phases. Robust extraction of features in a severely defocused image is the fundamental problem of a phase-difference auto-focusing system. In order to solve this problem, a multi-resolution feature extraction algorithm is proposed. Given the extracted features, the proposed auto-focusing system can provide the ideal focusing position using phase correlation matching. The proposed auto-focusing (AF) algorithm consists of four steps: (i) acquisition of left and right images using AF points in the region-of-interest; (ii) feature extraction in the left image under low illumination and out-of-focus blur; (iii) the generation of two feature images using the phase difference between the left and right images; and (iv) estimation of the phase shifting vector using phase correlation matching. Since the proposed system accurately estimates the phase difference in the out-of-focus blurred image under low illumination, it can provide faster, more robust auto focusing than existing systems. PMID:25763645

  18. Automatic Methods in Image Processing and Their Relevance to Map-Making.

    DTIC Science & Technology

    1981-02-11

[The scanned abstract is OCR-garbled beyond reconstruction. The recoverable fragments concern image and correlator models describing the behavior of correlation processors under conditions of low image contrast or signal-to-noise ratio, with contents entries on sensor noise, self noise, machine noise, and fixed-point processing.]

  19. Yield variability prediction by remote sensing sensors with different spatial resolution

    NASA Astrophysics Data System (ADS)

    Kumhálová, Jitka; Matějková, Štěpánka

    2017-04-01

Currently, remote sensing sensors are very popular for crop monitoring and yield prediction. This paper describes how satellite images with moderate (Landsat) and very high (QuickBird and WorldView-2) spatial resolution, together with a GreenSeeker handheld crop sensor, can be used to estimate yield and crop growth variability. Winter barley (2007 and 2015) and winter wheat (2009 and 2011) were chosen because cloud-free data for the experimental field were available in the same time period from both the Landsat images and the QuickBird or WorldView-2 images. The very high spatial resolution images were resampled to the coarser spatial resolution. The normalised difference vegetation index was derived from each satellite image data set, and for 2015 it was also measured with the GreenSeeker handheld crop sensor. Results showed that each satellite image data set can be used to estimate yield and plant variability. Nevertheless, better results in comparison with crop yield were obtained for images acquired in later phenological phases: e.g., an average correlation coefficient of 0.856 in 2007 (BBCH 59) and of 0.784 in 2011 (BBCH 59). The GreenSeeker handheld crop sensor was not suitable for yield estimation due to its different measuring method.
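The vegetation index used throughout is a simple band ratio; for any sensor that provides near-infrared and red reflectance:

```python
import numpy as np

def ndvi(nir, red):
    # Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    # Works element-wise on whole band images or on scalar readings.
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)
```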

  20. CMOS Imaging of Pin-Printed Xerogel-Based Luminescent Sensor Microarrays.

    PubMed

    Yao, Lei; Yung, Ka Yi; Khan, Rifat; Chodavarapu, Vamsy P; Bright, Frank V

    2010-12-01

We present the design and implementation of a luminescence-based miniaturized multisensor system using pin-printed xerogel materials which act as host media for chemical recognition elements. We developed a CMOS imager integrated circuit (IC) to image the luminescence response of the xerogel-based sensor array. The imager IC uses a 26 × 20 (520 elements) array of active pixel sensors, and each active pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. The imager includes a correlated double sampling circuit and a pixel address/digital control circuit; the image data are read out as a coded serial signal. The sensor system uses a light-emitting diode (LED) to excite the target analyte responsive luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 4 × 4 (16 elements) array of oxygen (O2) sensors. Each group of 4 sensor elements in the array (arranged in a row) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a strategic mix of two oxygen-sensitive luminophores ([Ru(dpp)3]2+ and [Ru(bpy)3]2+) in each pin-printed xerogel sensor element. The CMOS imager consumes an average power of 8 mW operating at a 1 kHz sampling frequency driven at 5 V. The developed prototype demonstrates a low-cost, miniaturized luminescence multisensor system.
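The oxygen response of luminophores like these is conventionally described by the Stern-Volmer relation I0/I = 1 + Ksv·[O2]. The record does not state the calibration model, so the relation below is an assumption of the standard one, with illustrative numbers:

```python
def o2_from_intensity(i0, i, ksv):
    # Invert Stern-Volmer: [O2] = (I0/I - 1) / Ksv. A different Ksv per
    # luminophore mix is what gives each sensor-array row its distinct
    # sensitivity (assumed model, not stated in the record).
    return (i0 / i - 1.0) / ksv
```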

  21. Correlation plenoptic imaging

    NASA Astrophysics Data System (ADS)

    Pepe, Francesco V.; Di Lena, Francesco; Garuccio, Augusto; D'Angelo, Milena

    2017-06-01

Plenoptic Imaging (PI) is a novel optical technique for achieving tridimensional imaging in a single shot. In conventional PI, a microlens array is inserted in the native image plane and the sensor array is moved behind the microlenses. On the one hand, the microlenses act as imaging pixels to reproduce the image of the scene; on the other hand, each microlens reproduces on the sensor array an image of the camera lens, thus providing the angular information associated with each imaging pixel. The recorded propagation direction is exploited, in post-processing, to computationally retrace the geometrical light path, thus enabling the refocusing of different planes within the scene, the extension of the depth of field of the acquired image, as well as the 3D reconstruction of the scene. However, a trade-off between spatial and angular resolution is built in the standard plenoptic imaging process. We demonstrate that the second-order spatio-temporal correlation properties of light can be exploited to overcome this fundamental limitation. Using two correlated beams, from either a chaotic or an entangled photon source, we can perform imaging in one arm and simultaneously obtain the angular information in the other arm. In fact, we show that the second order correlation function possesses plenoptic imaging properties (i.e., it encodes both spatial and angular information), and is thus characterized by a key re-focusing and 3D imaging capability. From a fundamental standpoint, the plenoptic application is the first situation where the counterintuitive properties of correlated systems are effectively used to beat intrinsic limits of standard imaging systems. From a practical standpoint, our protocol can dramatically enhance the potentials of PI, paving the way towards its promising applications.

  22. CMOS Imaging of Temperature Effects on Pin-Printed Xerogel Sensor Microarrays.

    PubMed

    Lei Yao; Ka Yi Yung; Chodavarapu, Vamsy P; Bright, Frank V

    2011-04-01

In this paper, we study the effect of temperature on the operation and performance of xerogel-based sensor microarrays coupled to a complementary metal-oxide-semiconductor (CMOS) imager integrated circuit (IC) that images the photoluminescence response from the sensor microarray. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors, and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. A correlated double sampling circuit and a pixel address/digital control/signal integration circuit are also implemented on-chip. The CMOS imager data are read out as a serial coded signal. The sensor system uses a light-emitting diode to excite target analyte responsive organometallic luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 3 × 3 (9 elements) array of oxygen (O2) sensors. Each group of three sensor elements in the array (arranged in a column) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a mix of two O2-sensitive luminophores in each pin-printed xerogel sensor element. The CMOS imager is designed to be low noise and consumes a static power of 320.4 μW and an average dynamic power of 624.6 μW when operating at a 100-Hz sampling frequency and a 1.8-V dc power supply.

  23. Exploiting the speckle-correlation scattering matrix for a compact reference-free holographic image sensor

    PubMed Central

    Lee, KyeoReh; Park, YongKeun

    2016-01-01

    The word ‘holography' means a drawing that contains all of the information for light—both amplitude and wavefront. However, because of the insufficient bandwidth of current electronics, the direct measurement of the wavefront of light has not yet been achieved. Though reference-field-assisted interferometric methods have been utilized in numerous applications, introducing a reference field raises several fundamental and practical issues. Here we demonstrate a reference-free holographic image sensor. To achieve this, we propose a speckle-correlation scattering matrix approach; light-field information passing through a thin disordered layer is recorded and retrieved from a single-shot recording of speckle intensity patterns. Self-interference via diffusive scattering enables access to impinging light-field information, when light transport in the diffusive layer is precisely calibrated. As a proof-of-concept, we demonstrate direct holographic measurements of three-dimensional optical fields using a compact device consisting of a regular image sensor and a diffusor. PMID:27796290

  24. Exploiting the speckle-correlation scattering matrix for a compact reference-free holographic image sensor.

    PubMed

    Lee, KyeoReh; Park, YongKeun

    2016-10-31

    The word 'holography' means a drawing that contains all of the information for light-both amplitude and wavefront. However, because of the insufficient bandwidth of current electronics, the direct measurement of the wavefront of light has not yet been achieved. Though reference-field-assisted interferometric methods have been utilized in numerous applications, introducing a reference field raises several fundamental and practical issues. Here we demonstrate a reference-free holographic image sensor. To achieve this, we propose a speckle-correlation scattering matrix approach; light-field information passing through a thin disordered layer is recorded and retrieved from a single-shot recording of speckle intensity patterns. Self-interference via diffusive scattering enables access to impinging light-field information, when light transport in the diffusive layer is precisely calibrated. As a proof-of-concept, we demonstrate direct holographic measurements of three-dimensional optical fields using a compact device consisting of a regular image sensor and a diffusor.

  25. Evaluation of excitation strategy with multi-plane electrical capacitance tomography sensor

    NASA Astrophysics Data System (ADS)

    Mao, Mingxu; Ye, Jiamin; Wang, Haigang; Zhang, Jiaolong; Yang, Wuqiang

    2016-11-01

    Electrical capacitance tomography (ECT) is an imaging technique for measuring the permittivity change of materials. Using a multi-plane ECT sensor, three-dimensional (3D) distribution of permittivity may be represented. In this paper, three excitation strategies, including single-electrode excitation, dual-electrode excitation in the same plane, and dual-electrode excitation in different planes are investigated by numerical simulation and experiment for two three-plane ECT sensors with 12 electrodes in total. In one sensor, the electrodes on the middle plane are in line with the others. In the other sensor, they are rotated 45° with reference to the other two planes. A linear back projection algorithm is used to reconstruct the images and a correlation coefficient is used to evaluate the image quality. The capacitance data and sensitivity distribution with each measurement strategy and sensor model are analyzed. Based on simulation and experimental results using noise-free and noisy capacitance data, the performance of the three strategies is evaluated.

  6. Self-calibration of a noisy multiple-sensor system with genetic algorithms

    NASA Astrophysics Data System (ADS)

    Brooks, Richard R.; Iyengar, S. Sitharama; Chen, Jianhua

    1996-01-01

    This paper explores an image-processing application of optimization techniques which entails interpreting noisy sensor data. The application is a generalization of image correlation; we attempt to find the optimal congruence which matches two overlapping gray-scale images corrupted with noise. Both tabu search and genetic algorithms are used to find the parameters which match the two images. A genetic algorithm approach using an elitist reproduction scheme is found to provide significantly superior results. The presentation includes a graphic depiction of the paths taken by tabu search and the genetic algorithm when trying to find the best possible match between two corrupted images.
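A minimal sketch of the idea, under strong assumptions: the congruence is restricted to integer translations, the match score is normalised cross-correlation over the overlap, and a small genetic algorithm with elitist reproduction searches the shift space. All function names and parameter values here are hypothetical illustrations, not the paper's implementation:

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

def overlap_ncc(ref, img, dy, dx):
    """Score a candidate translation by the NCC of the overlapping region."""
    h, w = ref.shape
    y0, y1 = max(0, dy), min(h, h + dy)
    x0, x1 = max(0, dx), min(w, w + dx)
    if (y1 - y0) * (x1 - x0) < 16:
        return -1.0
    return ncc(ref[y0:y1, x0:x1], img[y0 - dy:y1 - dy, x0 - dx:x1 - dx])

def ga_match(ref, img, radius=4, pop_size=20, gens=15, seed=0):
    """Genetic algorithm with elitist reproduction over integer shifts."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(-radius, radius + 1, size=(pop_size, 2))
    best, best_score = (0, 0), overlap_ncc(ref, img, 0, 0)
    for _ in range(gens):
        scores = np.array([overlap_ncc(ref, img, dy, dx) for dy, dx in pop])
        order = np.argsort(scores)[::-1]
        pop = pop[order]
        if scores[order[0]] > best_score:       # track the best ever seen
            best_score = scores[order[0]]
            best = (int(pop[0, 0]), int(pop[0, 1]))
        elite = pop[:2].copy()                  # elitist reproduction
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - 2:
            p, q = parents[rng.integers(len(parents), size=2)]
            child = np.array([p[0], q[1]])      # crossover: mix dy and dx
            if rng.random() < 0.3:              # small integer mutation
                child = child + rng.integers(-1, 2, size=2)
            children.append(np.clip(child, -radius, radius))
        pop = np.vstack([elite, np.array(children)])
    return best
```

The elitist step carries the two fittest individuals unchanged into the next generation, which is the feature the abstract credits for the GA's superiority over tabu search on this noisy objective.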

  7. Proposal of a Method to Determine the Correlation between Total Suspended Solids and Dissolved Organic Matter in Water Bodies from Spectral Imaging and Artificial Neural Networks

    PubMed Central

    Kupssinskü, Lucas S.; T. Guimarães, Tainá; Koste, Emilie C.; da Silva, Juarez M.; de Souza, Laís V.; Oliverio, William F. M.; Jardim, Rogélio S.; Koch, Ismael É.; de Souza, Jonas G.; Mauad, Frederico F.

    2018-01-01

    Water quality monitoring through remote sensing with UAVs is best conducted using multispectral sensors; however, these sensors are expensive. We aimed to predict multispectral bands from a low-cost sensor (R, G, B bands) using artificial neural networks (ANN). We studied a lake located on the campus of Unisinos University, Brazil, using a low-cost sensor mounted on a UAV. Simultaneously, we collected water samples during the UAV flight to determine total suspended solids (TSS) and dissolved organic matter (DOM). We correlated the three predicted bands with TSS and DOM. The results show that in the validation process the ANN predicted the three bands of the multispectral sensor from the three bands of the low-cost sensor with a low average error of 19%. The correlations with TSS and DOM resulted in R2 values greater than 0.60, consistent with literature values. PMID:29315219
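The band-prediction step can be illustrated numerically. As a hedged simplification, an ordinary least-squares linear map stands in for the paper's ANN, and synthetic data replaces the lake measurements; `fit_band_mapping`, `predict_bands`, and `r_squared` are hypothetical helper names:

```python
import numpy as np

def fit_band_mapping(rgb, bands):
    """Fit per-band linear models bands ~= [rgb, 1] @ W.
    A least-squares stand-in for the paper's trained ANN."""
    X = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    W, *_ = np.linalg.lstsq(X, bands, rcond=None)
    return W

def predict_bands(rgb, W):
    """Predict multispectral bands from low-cost R, G, B values."""
    X = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
    return X @ W

def r_squared(y, y_hat):
    """Coefficient of determination per band, as used to report R2 > 0.60."""
    ss_res = ((y - y_hat) ** 2).sum(axis=0)
    ss_tot = ((y - y.mean(axis=0)) ** 2).sum(axis=0)
    return 1.0 - ss_res / ss_tot
```

A real reproduction would substitute a trained multilayer network for the linear fit, but the validation logic — predict bands, then compute per-band R2 against measured values — is the same.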

  8. Evaluation of electrical capacitance tomography sensor based on the coupling of fluid field and electrostatic field

    NASA Astrophysics Data System (ADS)

    Ye, Jiamin; Wang, Haigang; Yang, Wuqiang

    2016-07-01

    Electrical capacitance tomography (ECT) is based on capacitance measurements from electrode pairs mounted outside of a pipe or vessel. The structure of ECT sensors is vital to image quality. In this paper, issues with the number of electrodes and the electrode covering ratio for complex liquid-solids flows in a rotating device are investigated based on a new coupling simulation model. The number of electrodes is increased from 4 to 32 while the electrode covering ratio is changed from 0.1 to 0.9. Using the coupling simulation method, real permittivity distributions and the corresponding capacitance data at 0, 0.5, 1, 2, 3, 5, and 8 s with a rotation speed of 96 rotations per minute (rpm) are collected. Linear back projection (LBP) and Landweber iteration algorithms are used for image reconstruction. The quality of reconstructed images is evaluated by correlation coefficient compared with the real permittivity distributions obtained from the coupling simulation. The sensitivity for each sensor is analyzed and compared with the correlation coefficient. The capacitance data with a range of signal-to-noise ratios (SNRs) of 45, 50, 55 and 60 dB are generated to evaluate the effect of data noise on the performance of ECT sensors. Furthermore, the SNRs of experimental data are analyzed for a stationary pipe with permittivity distribution. Based on the coupling simulation, 16-electrode ECT sensors are recommended to achieve good image quality.
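The Landweber iteration used above can be sketched directly; this is a generic projected-gradient form with a hypothetical sensitivity matrix `S` and capacitance vector `c`, not the paper's coupled-field data. The permittivity estimate is clipped to [0, 1], a common constraint for normalised ECT reconstructions:

```python
import numpy as np

def landweber(S, c, n_iter=300, alpha=None, g0=None):
    """Projected Landweber iteration: g_{k+1} = P[g_k + alpha * S^T (c - S g_k)],
    where P clips the estimate to the physical range [0, 1]."""
    if alpha is None:
        # step size below 2 / sigma_max^2 guarantees convergence
        alpha = 1.0 / np.linalg.norm(S, 2) ** 2
    g = np.zeros(S.shape[1]) if g0 is None else g0.copy()
    for _ in range(n_iter):
        g = g + alpha * (S.T @ (c - S @ g))
        g = np.clip(g, 0.0, 1.0)
    return g
```

LBP is effectively the first step of this iteration; the extra iterations are what buy Landweber its sharper images at higher computational cost, which is why the abstract evaluates both.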

  9. Contact CMOS imaging of gaseous oxygen sensor array

    PubMed Central

    Daivasagaya, Daisy S.; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C.; Chodavarapu, Vamsy P.; Bright, Frank V.

    2014-01-01

    We describe a compact luminescent gaseous oxygen (O2) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter and placed on a low power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O2-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline) ruthenium(II) ([Ru(dpp)3]2+) encapsulated within sol–gel derived xerogel thin films. The polymeric optical filter is made with polydimethylsiloxane (PDMS) that is mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals on to the imager pixels. The molded PDMS membrane is then attached with the PDMS color filter. The xerogel sensor arrays are contact printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. Correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data is read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors. PMID:24493909

  10. Contact CMOS imaging of gaseous oxygen sensor array.

    PubMed

    Daivasagaya, Daisy S; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-10-01

    We describe a compact luminescent gaseous oxygen (O2) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter and placed on a low power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O2-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline) ruthenium(II) ([Ru(dpp)3]2+) encapsulated within sol-gel derived xerogel thin films. The polymeric optical filter is made with polydimethylsiloxane (PDMS) that is mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals on to the imager pixels. The molded PDMS membrane is then attached with the PDMS color filter. The xerogel sensor arrays are contact printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. Correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data is read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors.

  11. Shear sensing in bonded composites with cantilever beam microsensors and dual-plane digital image correlation

    NASA Astrophysics Data System (ADS)

    Baur, Jeffery W.; Slinker, Keith; Kondash, Corey

    2017-04-01

    Understanding the shear strain, viscoelastic response, and onset of damage within bonded composites is critical to their design, processing, and reliability. This presentation will discuss the multidisciplinary research that led to the conception, development, and demonstration of two methods for measuring the shear within a bonded joint: dual-plane digital image correlation (DIC) and a micro-cantilever shear sensor. The dual-plane DIC method was developed to measure the strain field on opposing sides of a transparent single-lap joint in order to spatially quantify the joint shear strain. The sensor consists of a single glass fiber cantilever beam with a radially-grown forest of carbon nanotubes (CNTs) within a capillary pore. When the fiber is deflected, the internal radial CNT array is compressed against an electrode within the pore and the corresponding decrease in electrical resistance is correlated with the external loading. When this small, simple, and low-cost sensor was integrated within a composite bonded joint and cycled in tension, the onset of damage prior to joint failure was observed. In a second sample configuration, both the dual-plane DIC and the hair sensor detected viscoplastic changes in the strain of the sample in response to continued loading.

  12. Imaging through turbulence using a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-09-01

    Atmospheric turbulence can significantly affect imaging through paths near the ground. Atmospheric turbulence is generally treated as a time-varying inhomogeneity of the refractive index of the air, which disrupts the propagation of optical signals from the object to the viewer. Under conditions of deep or strong turbulence, the object is hard to recognize through direct imaging, and conventional imaging methods cannot handle these problems efficiently: the time required for lucky imaging can increase significantly, and image-processing approaches require much more complex and iterative de-blurring algorithms. We propose an alternative approach using a plenoptic sensor to resample and analyze the image distortions. The plenoptic sensor uses a shared objective lens and a microlens array (MLA) to form a mini Keplerian telescope array. Therefore, the image obtained by a conventional method is separated into an array of images that contain multiple copies of the object's image and less correlated turbulence disturbances. A high-dimensional lucky imaging algorithm can then be performed on the video collected by the plenoptic sensor. The algorithm selects the most stable pixels from the various image cells and reconstructs the object's image as if only a weak turbulence effect were present. Then, by comparing the reconstructed image with the recorded images in each MLA cell, the difference can be regarded as the turbulence effect. As a result, retrieval of the object's image and extraction of the turbulence effect can be performed simultaneously.
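The "select the most stable pixels" step can be sketched with a simplified, hypothetical rule: across a stack of co-registered frames (or image-cell copies), keep for each pixel the value from the frame where it deviates least from the temporal median. This is a stand-in for the paper's selection criterion, not its actual algorithm:

```python
import numpy as np

def lucky_pixels(frames):
    """From a stack of co-registered frames (T, H, W), keep per pixel the
    value of the frame where that pixel is closest to the temporal median,
    a toy stand-in for a lucky-imaging stability criterion."""
    frames = np.asarray(frames, dtype=float)
    med = np.median(frames, axis=0)
    dev = (frames - med) ** 2
    best = np.argmin(dev, axis=0)            # (H, W) winning frame per pixel
    h, w = best.shape
    return frames[best, np.arange(h)[:, None], np.arange(w)[None, :]]
```

In the plenoptic setting the stack would come from the multiple MLA cells rather than from time alone, which is what makes the selection "high-dimensional".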

  13. Optical flows method for lightweight agile remote sensor design and instrumentation

    NASA Astrophysics Data System (ADS)

    Wang, Chong; Xing, Fei; Wang, Hongjian; You, Zheng

    2013-08-01

    Lightweight agile remote sensors have become one of the most important types of payload and are widely utilized in space reconnaissance and resource survey. These imaging sensors are designed to obtain imagery of high spatial, temporal and spectral resolution. Key techniques in their instrumentation include flexible maneuvering, advanced imaging control algorithms and integrative measuring techniques, which are closely correlated and can even act as bottlenecks for one another. Therefore, these mutually restrictive problems must be solved and optimized together. Optical flow is the critical model for fully representing both information transfer and radiation energy flow in dynamic imaging. For agile sensors, especially those with a wide field of view, imaging optical flows may distort and deviate seriously during large-angle attitude-maneuvering imaging. These phenomena are mainly attributed to the geometrical characteristics of the three-dimensional Earth surface as well as coupling effects due to the complicated relative motion between the sensor and the scene. Under these circumstances, the velocity fields are distributed nonlinearly; the imagery may be badly smeared, or its geometrical structure may be changed, if the image velocity matching errors are not eliminated. In this paper, a precise imaging optical flow model is established for agile remote sensors, in which the evolution of the optical flow is factorized into two components, due respectively to translational movement and to image shape change. Based on this model, the instrumentation of agile remote sensors is investigated. The main techniques concerning optical flow modeling include integrative design with lightweight star sensors along with micro inertial measurement units and corresponding data fusion, focal plane layout and control, and post-processing of imagery for agile remote sensors. Experiments show that the optical flow analysis method is effective in eliminating limitations on the performance indexes and has been successfully applied to integrative system design. Finally, a principle prototype of an agile remote sensor designed with this method is discussed.

  14. Compliant finger sensor for sensorimotor studies in MEG and MR environment

    NASA Astrophysics Data System (ADS)

    Li, Y.; Yong, X.; Cheung, T. P. L.; Menon, C.

    2016-07-01

    Magnetoencephalography (MEG) and functional magnetic resonance imaging (fMRI) are widely used for functional brain imaging. The correlations between the sensorimotor functions of the hand and brain activities have been investigated in MEG/fMRI studies. Currently, limited information can be drawn from these studies due to the limitations of existing motion sensors that are used to detect hand movements. One major challenge in designing these motion sensors is to limit the signal interference between the motion sensors and the MEG/fMRI. In this work, a novel finger motion sensor, which contains low-ferromagnetic and non-conductive materials, is introduced. The finger sensor consists of four air-filled chambers. When compressed by finger(s), the pressure change in the chambers can be detected by the electronics of the finger sensor. Our study has validated that the interference between the finger sensor and an MEG is negligible. Also, by applying a support vector machine algorithm to the data obtained from the finger sensor, at least 11 finger patterns can be discriminated. Compared to the use of traditional electromyography (EMG) in detecting finger motion, our proposed finger motion sensor is not only MEG/fMRI compatible, it is also easy to use. As the signals acquired from the sensor have a higher SNR than those of EMG, no complex algorithms are required to detect different finger movement patterns. Future studies can utilize this motion sensor to investigate brain activations during different finger motions and correlate the activations with the sensory and motor functions respectively.

  15. Column-parallel correlated multiple sampling circuits for CMOS image sensors and their noise reduction effects.

    PubMed

    Suh, Sungho; Itoh, Shinya; Aoyama, Satoshi; Kawahito, Shoji

    2010-01-01

    For low-noise complementary metal-oxide-semiconductor (CMOS) image sensors, the reduction of pixel source follower noise is becoming very important. Column-parallel high-gain readout circuits are useful for low-noise CMOS image sensors. This paper presents column-parallel high-gain signal readout circuits, correlated multiple sampling (CMS) circuits, and their noise reduction effects. In CMS, the gain of the noise cancelling is controlled by the number of samplings. It has an effect similar to that of an amplified CDS for thermal noise, but is a little more effective for 1/f and RTS noises. Two types of CMS, with simple integration and folding integration, are proposed. In the folding integration, the output signal swing is suppressed by negative feedback using a comparator and a one-bit D-to-A converter. The CMS circuit using the folding integration technique makes it possible to achieve a very low noise level while maintaining a wide dynamic range. The noise reduction effects of these circuits have been investigated with a noise analysis and an implementation in a 1-Mpixel pinned-photodiode CMOS image sensor. Using 16 samplings, a dynamic range of 59.4 dB and a noise level of 1.9 e− are obtained for the simple-integration CMS, and 75 dB and 2.2 e− for the folding-integration CMS.
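The thermal-noise behaviour of CMS is easy to see in a toy Monte Carlo model: average M noisy samples of the reset level and M of the signal level, then difference them. The signal level, noise figure, and trial counts below are hypothetical illustrative values:

```python
import numpy as np

def cms_readout(signal, read_noise, n_samples, rng):
    """Correlated multiple sampling, white-noise-only toy model: average M
    samples of the reset level and M of the signal level, then difference.
    Residual thermal read noise falls roughly as 1/sqrt(M)."""
    reset = rng.normal(0.0, read_noise, size=n_samples).mean()
    sig = rng.normal(signal, read_noise, size=n_samples).mean()
    return sig - reset
```

With M = 1 this reduces to plain CDS. For thermal noise the M-sample CMS behaves like an amplified CDS; the abstract's point is that for 1/f and RTS noise it is slightly more effective than that, which this white-noise model does not capture.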

  16. Multi-Sensor Registration of Earth Remotely Sensed Imagery

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Cole-Rhodes, Arlene; Eastman, Roger; Johnson, Kisha; Morisette, Jeffrey; Netanyahu, Nathan S.; Stone, Harold S.; Zavorin, Ilya; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    Assuming that approximate registration is given within a few pixels by a systematic correction system, we develop automatic image registration methods for multi-sensor data with the goal of achieving sub-pixel accuracy. Automatic image registration is usually defined by three steps: feature extraction, feature matching, and data resampling or fusion. Our previous work focused on image correlation methods based on the use of different features. In this paper, we study different feature matching techniques and present five algorithms where the features are either original gray levels or wavelet-like features, and the feature matching is based on gradient descent optimization, statistical robust matching, and mutual information. These algorithms are tested and compared on several multi-sensor datasets covering one of the EOS Core Sites, the Konza Prairie in Kansas, from four different sensors: IKONOS (4 m), Landsat-7/ETM+ (30 m), MODIS (500 m), and SeaWiFS (1000 m).
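Of the matching criteria listed, mutual information is the least obvious to compute; a minimal histogram-based estimator (a generic sketch, not the paper's implementation) looks like this:

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two images, estimated from their joint
    intensity histogram: I(A;B) = sum p(x,y) log( p(x,y) / (p(x) p(y)) )."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)        # marginal of image a
    py = p.sum(axis=0, keepdims=True)        # marginal of image b
    nz = p > 0                               # avoid log(0)
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())
```

Unlike correlation, this score needs no linear relationship between the two sensors' gray levels, which is why it is a popular criterion for multi-sensor registration: it only rewards statistical dependence between the intensity distributions.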

  17. Architecture and applications of a high resolution gated SPAD image sensor

    PubMed Central

    Burri, Samuel; Maruyama, Yuki; Michalet, Xavier; Regazzoni, Francesco; Bruschini, Claudio; Charbon, Edoardo

    2014-01-01

    We present the architecture and three applications of the largest resolution image sensor based on single-photon avalanche diodes (SPADs) published to date. The sensor, fabricated in a high-voltage CMOS process, has a resolution of 512 × 128 pixels and a pitch of 24 μm. The fill-factor of 5% can be increased to 30% with the use of microlenses. For precise control of the exposure and for time-resolved imaging, we use fast global gating signals to define exposure windows as small as 4 ns. The uniformity of the gate edges location is ∼140 ps (FWHM) over the whole array, while in-pixel digital counting enables frame rates as high as 156 kfps. Currently, our camera is used as a highly sensitive sensor with high temporal resolution, for applications ranging from fluorescence lifetime measurements to fluorescence correlation spectroscopy and generation of true random numbers. PMID:25090572

  18. Finite Element Analysis of Film Stack Architecture for Complementary Metal-Oxide-Semiconductor Image Sensors.

    PubMed

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-05-02

    Image sensors are the core components of computer, communication, and consumer electronic products. Complementary metal oxide semiconductor (CMOS) image sensors have become the mainstay of image-sensing developments, but are prone to leakage current. In this study, we simulate the CMOS image sensor (CIS) film stacking process by finite element analysis. To elucidate the relationship between the leakage current and stack architecture, we compare the simulated and measured leakage currents in the elements. Based on the analysis results, we further improve the performance by optimizing the architecture of the film stacks or changing the thin-film material. The material parameters are then corrected to improve the accuracy of the simulation results. The simulated and experimental results confirm a positive correlation between measured leakage current and stress. This trend is attributed to the structural defects induced by high stress, which generate leakage. Using this relationship, we can change the structure of the thin-film stack to reduce the leakage current and thereby improve the component life and reliability of the CIS components.

  19. Finite Element Analysis of Film Stack Architecture for Complementary Metal-Oxide–Semiconductor Image Sensors

    PubMed Central

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-01-01

    Image sensors are the core components of computer, communication, and consumer electronic products. Complementary metal oxide semiconductor (CMOS) image sensors have become the mainstay of image-sensing developments, but are prone to leakage current. In this study, we simulate the CMOS image sensor (CIS) film stacking process by finite element analysis. To elucidate the relationship between the leakage current and stack architecture, we compare the simulated and measured leakage currents in the elements. Based on the analysis results, we further improve the performance by optimizing the architecture of the film stacks or changing the thin-film material. The material parameters are then corrected to improve the accuracy of the simulation results. The simulated and experimental results confirm a positive correlation between measured leakage current and stress. This trend is attributed to the structural defects induced by high stress, which generate leakage. Using this relationship, we can change the structure of the thin-film stack to reduce the leakage current and thereby improve the component life and reliability of the CIS components. PMID:28468324

  20. A Distributed Compressive Sensing Scheme for Event Capture in Wireless Visual Sensor Networks

    NASA Astrophysics Data System (ADS)

    Hou, Meng; Xu, Sen; Wu, Weiling; Lin, Fei

    2018-01-01

    Image signals acquired by a wireless visual sensor network can be used for specific event capture, with the event capture realized by image processing at the sink node. A distributed compressive sensing scheme is used for the transmission of these image signals from the camera nodes to the sink node. A measurement and joint reconstruction algorithm for these image signals is proposed in this paper. Taking advantage of the spatial correlation between images within a sensing area, the cluster head node, which acts as the image decoder, can accurately co-reconstruct these image signals. Subjective visual quality and the reconstruction error rate are used to evaluate reconstructed image quality. Simulation results show that the joint reconstruction algorithm achieves higher image quality at the same image compression rate than the independent reconstruction algorithm.

  1. Intercomparison of Evapotranspiration Over the Savannah Volta Basin in West Africa Using Remote Sensing Data

    PubMed Central

    Opoku-Duah, S.; Donoghue, D.N.M.; Burt, T. P.

    2008-01-01

    This paper compares evapotranspiration estimates from two complementary satellite sensors – NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and ESA's ENVISAT Advanced Along-Track Scanning Radiometer (AATSR) – over the savannah area of the Volta basin in West Africa. This was achieved by solving for evapotranspiration on the basis of the regional energy balance equation, computationally driven by the Surface Energy Balance Algorithm for Land (SEBAL). The results showed that both sensors are potentially good sources of evapotranspiration estimates over large heterogeneous landscapes. The MODIS sensor measured daily evapotranspiration reasonably well, with a strong spatial correlation (R2=0.71) with Landsat ETM+, but underperformed, with deviations up to ∼2.0 mm day−1 when compared with local eddy correlation observations and the Penman-Monteith method, mainly because of scale mismatch. The AATSR sensor produced much poorer correlations (R2=0.13) with Landsat ETM+ and conventional ET methods, also because of differences in atmospheric correction and sensor calibration over land. PMID:27879847
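The SEBAL closure behind these estimates fits in two lines: the latent heat flux is the residual of the surface energy balance, LE = Rn − G − H, and a daily-average flux converts to millimetres of water per day via the latent heat of vaporisation. The numbers in the test are hypothetical illustrative values, not the paper's results:

```python
def latent_heat_flux(rn, g, h):
    """SEBAL closes the surface energy balance as a residual:
    LE = Rn - G - H, all fluxes in W m^-2 (net radiation, soil heat
    flux, sensible heat flux)."""
    return rn - g - h

def et_mm_per_day(le, lam=2.45e6):
    """Convert a daily-average latent heat flux (W m^-2) to evapotranspiration
    in mm day^-1; lam is the latent heat of vaporisation (J kg^-1), and
    1 kg of water per m^2 equals a 1 mm water column."""
    return le * 86400.0 / lam
```

For example, a daily-average residual of 150 W m^-2 corresponds to roughly 5.3 mm day^-1, the order of magnitude against which the ∼2.0 mm day−1 deviations above should be read.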

  2. Optical fibres in pre-detector signal processing

    NASA Astrophysics Data System (ADS)

    Flinn, A. R.

    The basic form of conventional electro-optic sensors is described. The main drawback of these sensors is their inability to deal with the background radiation which usually accompanies the signal. This 'clutter' limits the sensor's performance long before other noise such as 'shot' noise. Pre-detector signal processing using the complex amplitude of the light is introduced as a means to discriminate between the signal and 'clutter'. Further improvements to pre-detector signal processors can be made by the inclusion of optical fibres allowing radiation to be used with greater efficiency and enabling certain signal processing tasks to be carried out with an ease unequalled by any other method. The theory of optical waveguides and their application in sensors, interferometers, and signal processors is reviewed. Geometrical aspects of the formation of linear and circular interference fringes are described along with temporal and spatial coherence theory and their relationship to Michelson's visibility function. The requirements for efficient coupling of a source into singlemode and multimode fibres are given. We describe interference experiments between beams of light emitted from a few metres of two or more, singlemode or multimode, optical fibres. Fresnel's equation is used to obtain expressions for Fresnel and Fraunhofer diffraction patterns which enable electro-optic (E-O) sensors to be analysed by Fourier optics. Image formation is considered when the aperture plane of an E-O sensor is illuminated with partially coherent light. This allows sensors to be designed using optical transfer functions which are sensitive to the spatial coherence of the illuminating light. Spatial coherence sensors which use gratings as aperture plane reticles are discussed. By using fibre arrays, spatial coherence processing enables E-O sensors to discriminate between a spatially coherent source and an incoherent background. 
The sensors enable the position and wavelength of the source to be determined. Experiments are described which use optical fibre arrays as masks for correlation with spatial distributions of light in image planes of E-O sensors. Correlations between laser light from different points in a scene are investigated by interfering the light emitted from an array of fibres, placed in the image plane of a sensor, with each other. Temporal signal processing experiments show that the visibility of interference fringes gives information about path differences in a scene or through an optical system. Most E-O sensors employ wavelength filtering of the detected radiation to improve their discrimination and this is shown to be less selective than temporal coherence filtering, which is sensitive to spectral bandwidth. Experiments using fibre interferometers to discriminate between red and blue laser light by their bandwidths are described. In most cases the path difference need only be a few tens of centimetres. We consider spatial and temporal coherence in fibres. We show that high visibility interference fringes can be produced by red and blue laser light transmitted through over 100 metres of singlemode or multimode fibre. The effect of detector size, relative to speckle size, is considered for fringes produced by multimode fibres. The effect of dispersion on the coherence of the light emitted from fibres is considered in terms of correlation and interference between modes. We describe experiments using a spatial light modulator called SIGHT-MOD. The device is used in various systems as a fibre optic switch and as a programmable aperture plane reticle. The contrast of the device is measured using red and green HeNe sources. Fourier transform images of patterns on the SIGHT-MOD are obtained and used to demonstrate the geometrical manipulation of images using 2D fibre arrays. Correlation of Fourier transform images of the SIGHT-MOD with 2D fibre arrays is demonstrated.

  3. Dependence of Adaptive Cross-correlation Algorithm Performance on the Extended Scene Image Quality

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2008-01-01

    Recently, we reported an adaptive cross-correlation (ACC) algorithm to estimate with high accuracy shifts as large as several pixels between two extended-scene sub-images captured by a Shack-Hartmann wavefront sensor. It determines the positions of all extended-scene image cells relative to a reference cell in the same frame using an FFT-based iterative image-shifting algorithm. It works with both point-source spot images and extended-scene images. We have demonstrated previously, based on some measured images, that the ACC algorithm can determine image shifts with an accuracy as high as 0.01 pixel for shifts as large as 3 pixels, and yields similar results for both point-source spot images and extended-scene images. The shift estimation accuracy of the ACC algorithm depends on illumination level, background, and scene content in addition to the amount of the shift between two image cells. In this paper we investigate how the performance of the ACC algorithm depends on the quality and the frequency content of extended-scene images captured by a Shack-Hartmann camera. We also compare the performance of the ACC algorithm with those of several other approaches, and introduce a failsafe criterion for ACC-based extended-scene Shack-Hartmann sensors.
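The FFT-based core of such shift estimation can be sketched at integer-pixel precision; the ACC algorithm iterates and refines an estimate like this one down to ~0.01 pixel, so treat the snippet as a simplified first step, not the published method:

```python
import numpy as np

def fft_shift_estimate(ref, img):
    """Integer-pixel shift between two image cells from the peak of their
    FFT-based circular cross-correlation (mean-removed to suppress the DC
    background term)."""
    R = np.fft.fft2(ref - ref.mean())
    I = np.fft.fft2(img - img.mean())
    xcorr = np.fft.ifft2(np.conj(R) * I).real
    peak = np.unravel_index(int(np.argmax(xcorr)), xcorr.shape)
    # unwrap circular peak coordinates into signed shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, xcorr.shape))
```

Sub-pixel accuracy is then obtained by shifting one cell in the Fourier domain toward the current estimate and re-correlating until the residual shift converges, which is the "iterative image-shifting" the abstract refers to.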

  4. Multiple Hypothesis Correlation for Space Situational Awareness

    DTIC Science & Technology

    2011-08-29

    formulations with anti-aliasing through hybrid approaches such as the Drizzle algorithm [43] all the way up through to image superresolution techniques. Most... superresolution techniques. Second, given a set of images, either directly from the sensor or preprocessed using the above techniques, we showed how

  5. Co-Registration Between Multisource Remote-Sensing Images

    NASA Astrophysics Data System (ADS)

    Wu, J.; Chang, C.; Tsai, H.-Y.; Liu, M.-C.

    2012-07-01

    Image registration is essential for geospatial information systems analysis, which usually involves integrating multitemporal and multispectral datasets from remote optical and radar sensors. An algorithm that deals with feature extraction, keypoint matching, outlier detection and image warping is evaluated in this study. The methods currently available in the literature rely on techniques such as the scale-invariant feature transform, between-edge cost minimization, normalized cross correlation, least-squares image matching, random sample consensus, iterated data snooping and thin-plate splines. Their basics are highlighted and encoded into a computer program. The test images are excerpts from digital files created by the multispectral SPOT-5 and Formosat-2 sensors, and by the panchromatic IKONOS and QuickBird sensors. Suburban areas, housing rooftops, the countryside and hilly plantations are studied. The co-registered images are displayed with block subimages in a criss-cross pattern. Besides the imagery, the registration accuracy is expressed by the root mean square error. Toward the end, this paper also includes a few opinions on issues that are believed to hinder a correct correspondence between diverse images.

  6. Smartphone-based quantitative measurements on holographic sensors.

    PubMed

    Khalili Moghaddam, Gita; Lowe, Christopher Robin

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method were applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: recognition of the hologram in the image with a complex background using a template-based approach; conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values; use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case; and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals.
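The colour-space conversion in the second step can be sketched with the standard sRGB formulas. Note the hedge: the paper fits a polynomial model of the specific camera, whereas this sketch substitutes the generic sRGB matrix and D65 white point, so it illustrates the RGB → CIEXYZ → CIEL*a*b* chain rather than the calibrated pipeline:

```python
import numpy as np

# standard sRGB (D65) linear-RGB-to-XYZ matrix and reference white
M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])
WHITE_D65 = np.array([0.95047, 1.0, 1.08883])

def srgb_to_xyz(rgb):
    """Linearise sRGB values in [0, 1], then apply the sRGB matrix
    (a generic stand-in for the paper's polynomial camera model)."""
    rgb = np.asarray(rgb, dtype=float)
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    return M_SRGB_TO_XYZ @ lin

def xyz_to_lab(xyz, white=WHITE_D65):
    """CIEXYZ to CIEL*a*b* relative to a reference white."""
    t = np.asarray(xyz, dtype=float) / white
    d = 6.0 / 29.0
    f = np.where(t > d ** 3, np.cbrt(t), t / (3 * d * d) + 4.0 / 29.0)
    L = 116.0 * f[1] - 16.0
    return np.array([L, 500.0 * (f[0] - f[1]), 200.0 * (f[1] - f[2])])
```

Working in L*a*b* rather than raw RGB is what makes the downstream segmentation and the colour-to-concentration regression approximately device-independent and perceptually uniform.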

  7. Smartphone-based quantitative measurements on holographic sensors

    PubMed Central

    Khalili Moghaddam, Gita

    2017-01-01

    The research reported herein integrates a generic holographic sensor platform and a smartphone-based colour quantification algorithm in order to standardise and improve the determination of the concentration of analytes of interest. The utility of this approach has been exemplified by analysing the replay colour of the captured image of a holographic pH sensor in near real-time. Personalised image encryption followed by a wavelet-based image compression method was applied to secure the image transfer across a bandwidth-limited network to the cloud. The decrypted and decompressed image was processed through four principal steps: recognition of the hologram in the image with a complex background using a template-based approach; conversion of device-dependent RGB values to device-independent CIEXYZ values using a polynomial model of the camera and computation of the CIEL*a*b* values; use of the colour coordinates of the captured image to segment the image, select the appropriate colour descriptors and, ultimately, locate the region of interest (ROI), i.e. the hologram in this case; and finally, application of a machine learning-based algorithm to correlate the colour coordinates of the ROI to the analyte concentration. Integrating holographic sensors and the colour image processing algorithm potentially offers a cost-effective platform for the remote monitoring of analytes in real time in readily accessible body fluids by minimally trained individuals. PMID:29141008

  8. Nondestructive testing of advanced materials using sensors with metamaterials

    NASA Astrophysics Data System (ADS)

    Rozina, Steigmann; Narcis Andrei, Danila; Nicoleta, Iftimie; Catalin-Andrei, Tugui; Frantisek, Novy; Stanislava, Fintova; Petrica, Vizureanu; Adriana, Savin

    2016-11-01

    This work presents a method for nondestructive evaluation (NDE) of advanced materials that makes use of near-field images and the concentration of flux, exploiting the phenomenon of spatial resolution. The method allows the detection of flaws such as cracks, nonadhesion of coatings, degradation, or the presence of delamination stresses, correlated with the response of an electromagnetic sensor.

  9. Detection of spectral line curvature in imaging spectrometer data

    NASA Astrophysics Data System (ADS)

    Neville, Robert A.; Sun, Lixin; Staenz, Karl

    2003-09-01

    A procedure has been developed to measure the band-centers and bandwidths for imaging spectrometers using data acquired by the sensor in flight. This is done for each across-track pixel, thus allowing the measurement of the instrument's slit curvature or spectral 'smile'. The procedure uses spectral features present in the at-sensor radiance which are common to all pixels in the scene. These are principally atmospheric absorption lines. The band-center and bandwidth determinations are made by correlating the sensor measured radiance with a modelled radiance, the latter calculated using MODTRAN 4.2. Measurements have been made for a number of instruments including Airborne Visible and Infra-Red Imaging Spectrometer (AVIRIS), SWIR Full Spectrum Imager (SFSI), and Hyperion. The measurements on AVIRIS data were performed as a test of the procedure; since AVIRIS is a whisk-broom scanner it is expected to be free of spectral smile. SFSI is an airborne pushbroom instrument with considerable spectral smile. Hyperion is a satellite pushbroom sensor with a relatively small degree of smile. Measurements of Hyperion were made using three different data sets to check for temporal variations.

  10. Analysis and correction of Landsat 4 and 5 Thematic Mapper Sensor Data

    NASA Technical Reports Server (NTRS)

    Bernstein, R.; Hanson, W. A.

    1985-01-01

    Procedures for the correction and registration of Landsat TM image data are examined. The registration of Landsat-4 TM images of San Francisco to Landsat-5 TM images of San Francisco using the interactive geometric correction program and the cross-correlation technique is described. The geometric correction program and cross-correlation results are presented. The corrections of the TM data to a map reference and to a cartographic database are discussed; geometric and cartographic analyses are applied to the registration results.

  11. Time-to-impact sensors in robot vision applications based on the near-sensor image processing concept

    NASA Astrophysics Data System (ADS)

    Åström, Anders; Forchheimer, Robert

    2012-03-01

    Based on the Near-Sensor Image Processing (NSIP) concept and recent results concerning optical flow and Time-to-Impact (TTI) computation with this architecture, we show how these results can be used and extended for robot vision applications. The first case involves estimation of the tilt of an approaching planar surface. The second case concerns the use of two NSIP cameras to estimate absolute distance and speed, similar to a stereo-matching system but without the need to perform image correlations. Going back to a one-camera system, the third case deals with the problem of estimating the shape of the approaching surface. It is shown that the previously developed TTI method not only gives a very compact solution with respect to hardware complexity, but also delivers surprisingly high performance.

  12. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    PubMed

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by a separate denoising process. This strategy generates many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including those sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
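The core PCA-shrinkage idea behind such denoisers can be sketched on a stack of vectorized patches. This is a minimal illustration, not the paper's CFA-aware algorithm: the rank-1 test signal and the Wiener-style per-component gain are assumptions for demonstration.

```python
import numpy as np

def pca_denoise_patches(patches, noise_var):
    """Denoise vectorized patches (n_patches, dim) by PCA shrinkage.

    Projects onto the principal components of the patch ensemble and
    attenuates each component with a Wiener-like gain based on how far
    its energy exceeds the assumed noise floor.
    """
    mean = patches.mean(axis=0)
    x = patches - mean
    cov = x.T @ x / len(x)
    evals, evecs = np.linalg.eigh(cov)        # ascending eigenvalues
    proj = x @ evecs                          # PCA coefficients
    signal = np.maximum(evals - noise_var, 0.0)
    gain = signal / (signal + noise_var)      # Wiener shrinkage per component
    return (proj * gain) @ evecs.T + mean

rng = np.random.default_rng(1)
# Rank-1 clean signal plus white noise: shrinkage should cut the error.
clean = np.outer(rng.normal(size=200), rng.normal(size=16))
noisy = clean + rng.normal(scale=0.5, size=clean.shape)
den = pca_denoise_patches(noisy, noise_var=0.25)
err_noisy = np.mean((noisy - clean) ** 2)
err_den = np.mean((den - clean) ** 2)
```

Components whose eigenvalue barely exceeds the noise variance get gains near zero, which is how noise-only directions are suppressed while strong image structure is preserved.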

  13. Wavelet compression techniques for hyperspectral data

    NASA Technical Reports Server (NTRS)

    Evans, Bruce; Ringer, Brian; Yeates, Mathew

    1994-01-01

    Hyperspectral sensors are electro-optic sensors which typically operate in visible and near infrared bands. Their characteristic property is the ability to resolve a relatively large number (i.e., tens to hundreds) of contiguous spectral bands to produce a detailed profile of the electromagnetic spectrum. In contrast, multispectral sensors measure relatively few non-contiguous spectral bands. Like multispectral sensors, hyperspectral sensors are often also imaging sensors, measuring spectra over an array of spatial resolution cells. The data produced may thus be viewed as a three dimensional array of samples in which two dimensions correspond to spatial position and the third to wavelength. Because they multiply the already large storage/transmission bandwidth requirements of conventional digital images, hyperspectral sensors generate formidable torrents of data. Their fine spectral resolution typically results in high redundancy in the spectral dimension, so that hyperspectral data sets are excellent candidates for compression. Although there have been a number of studies of compression algorithms for multispectral data, we are not aware of any published results for hyperspectral data. Three algorithms for hyperspectral data compression are compared. They were selected as representatives of three major approaches for extending conventional lossy image compression techniques to hyperspectral data. The simplest approach treats the data as an ensemble of images and compresses each image independently, ignoring the correlation between spectral bands. The second approach transforms the data to decorrelate the spectral bands, and then compresses the transformed data as a set of independent images. The third approach directly generalizes two-dimensional transform coding by applying a three-dimensional transform as part of the usual transform-quantize-entropy code procedure. The algorithms studied all use the discrete wavelet transform. 
In the first two cases, a wavelet transform coder was used for the two-dimensional compression. The third case used a three dimensional extension of this same algorithm.
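The first approach above (independent two-dimensional transform coding of each band) can be illustrated with a single-level Haar transform and coefficient thresholding. The smooth synthetic "band" below is an assumption for demonstration; real codecs add multi-level decomposition, quantization and entropy coding.

```python
import numpy as np

def haar2d(img):
    """One level of the orthonormal 2-D Haar wavelet transform."""
    a = (img[:, 0::2] + img[:, 1::2]) / np.sqrt(2)   # row averages
    d = (img[:, 0::2] - img[:, 1::2]) / np.sqrt(2)   # row details
    row = np.hstack([a, d])
    a2 = (row[0::2, :] + row[1::2, :]) / np.sqrt(2)  # column averages
    d2 = (row[0::2, :] - row[1::2, :]) / np.sqrt(2)  # column details
    return np.vstack([a2, d2])

def compress(img, keep=0.25):
    """Zero all but the largest `keep` fraction of wavelet coefficients."""
    coeffs = haar2d(img)
    thresh = np.quantile(np.abs(coeffs), 1.0 - keep)
    return np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)

# A smooth synthetic band: energy compacts into few coefficients.
y, x = np.mgrid[0:32, 0:32]
band = np.sin(x / 8.0) + np.cos(y / 8.0)
kept = compress(band, keep=0.25)
energy_ratio = (kept ** 2).sum() / (haar2d(band) ** 2).sum()
```

Because the transform is orthonormal, retained coefficient energy directly measures retained image energy; for smooth data a small fraction of coefficients carries nearly all of it, which is the premise of all three compared approaches.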

  14. Radiometric cross-calibration of the Terra MODIS and Landsat 7 ETM+ using an invariant desert site

    USGS Publications Warehouse

    Choi, T.; Angal, A.; Chander, G.; Xiong, X.

    2008-01-01

    A methodology for long-term radiometric cross-calibration between the Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) sensors was developed. The approach involves calibration of near-simultaneous surface observations between 2000 and 2007. Fifty-seven cloud-free image pairs were carefully selected over the Libyan desert for this study. The Libyan desert site (+28.55°, +23.39°), located in northern Africa, is a high-reflectance site with high spatial, spectral, and temporal uniformity. Because the test site covers about 12 km x 13 km, accurate geometric preprocessing is required to match the footprint size between the two sensors and avoid uncertainties due to residual image misregistration. MODIS Level 1B radiometrically corrected products were reprojected to the corresponding ETM+ image's Universal Transverse Mercator (UTM) grid projection. The 30 m pixels from the ETM+ images were aggregated to match the MODIS spatial resolution (250 m in Bands 1 and 2, or 500 m in Bands 3 to 7). The image data from both sensors were converted to absolute units of at-sensor radiance and top-of-atmosphere (TOA) reflectance for the spectrally matching band pairs. For each band pair, a set of fitted coefficients (slope and offset) is provided to quantify the relationship between the two sensors. This work focuses on the long-term stability and correlation of the Terra MODIS and L7 ETM+ sensors using absolute calibration results over the entire mission of the two sensors. Possible uncertainties are also discussed, such as spectral differences in matching band pairs, solar zenith angle change during a collection, and differences in solar irradiance models.
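The per-band-pair fit of slope and offset reduces to a simple linear regression over the paired TOA reflectances. The values below are synthetic stand-ins for illustration, not measurements from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical TOA reflectance samples for one matching band pair
# over 57 image pairs (synthetic; true slope 0.97, offset 0.01).
etm_toa = rng.uniform(0.3, 0.6, size=57)
modis_toa = 0.97 * etm_toa + 0.01 + rng.normal(scale=0.003, size=57)

# Fitted coefficients quantifying the cross-calibration relationship.
slope, offset = np.polyfit(etm_toa, modis_toa, 1)
```

A slope near 1 and offset near 0 would indicate radiometric agreement between the two sensors for that band pair; departures quantify the relative calibration bias.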

  15. Three-dimensional estimates of tree canopies: Scaling from high-resolution UAV data to satellite observations

    NASA Astrophysics Data System (ADS)

    Sankey, T.; Donald, J.; McVay, J.

    2015-12-01

    High-resolution remote sensing images and datasets are typically acquired at a large cost, which poses a big challenge for many scientists. Northern Arizona University recently acquired a custom-engineered, cutting-edge UAV, and we can now generate our own images with the instrument. The UAV has a unique capability to carry a large payload, including a hyperspectral sensor, which images the Earth surface in over 350 spectral bands at 5 cm resolution, and a lidar scanner, which images the land surface and vegetation in 3 dimensions. Both sensors represent the newest available technology with very high resolution, precision, and accuracy. Using the UAV sensors, we are monitoring the effects of regional forest restoration treatment efforts. Individual tree canopy width and height are measured in the field and via the UAV sensors. The high-resolution UAV images are then used to segment individual tree canopies and to derive 3-dimensional estimates. The UAV image-derived variables are then correlated to the field-based measurements and scaled to satellite-derived tree canopy measurements. The relationships between the field-based and UAV-derived estimates are then extrapolated to a larger area to scale the tree canopy dimensions and to estimate tree density within restored and control forest sites.

  16. Effect of anisoplanatism on the measurement accuracy of an extended-source Hartmann-Shack wavefront sensor

    NASA Astrophysics Data System (ADS)

    Woeger, Friedrich; Rimmele, Thomas

    2009-10-01

    We analyze the effect of anisoplanatic atmospheric turbulence on the measurement accuracy of an extended-source Hartmann-Shack wavefront sensor (HSWFS). We have numerically simulated an extended-source HSWFS, using a scenery of the solar surface that is imaged through anisoplanatic atmospheric turbulence and imaging optics. Solar extended-source HSWFSs often use cross-correlation algorithms in combination with subpixel shift finding algorithms to estimate the wavefront gradient, two of which were tested for their effect on the measurement accuracy. We find that the measurement error of an extended-source HSWFS is governed mainly by the optical geometry of the HSWFS, the employed subpixel finding algorithm, and phase anisoplanatism. Our results show that effects of scintillation anisoplanatism are negligible when cross-correlation algorithms are used.
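One common subpixel shift-finding step of the kind tested here is a three-point parabolic fit around the integer correlation peak. A minimal sketch with a synthetic quadratic peak (the curve and peak location are assumptions for illustration):

```python
import numpy as np

def parabolic_subpixel(corr):
    """Refine the integer peak of a 1-D correlation curve to subpixel
    precision by fitting a parabola through the peak and its neighbors.
    """
    i = int(np.argmax(corr))
    y0, y1, y2 = corr[i - 1], corr[i], corr[i + 1]
    # Vertex of the parabola through (i-1, y0), (i, y1), (i+1, y2).
    return i + 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)

# Correlation curve that is exactly quadratic with its maximum at 5.3.
x = np.arange(11)
corr = -(x - 5.3) ** 2
peak = parabolic_subpixel(corr)
```

For a truly quadratic peak the fit is exact; on real correlation surfaces the residual bias of the chosen interpolator is precisely the kind of algorithm-dependent error this record quantifies.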

  17. A complete passive blind image copy-move forensics scheme based on compound statistics features.

    PubMed

    Peng, Fei; Nie, Yun-ying; Long, Min

    2011-10-10

    Since most sensor pattern noise based image copy-move forensics methods require a known reference sensor pattern noise, they generally result in non-blind passive forensics, which significantly limits the application scenarios. In view of this, a novel passive-blind image copy-move forensics scheme is proposed in this paper. Firstly, a color image is transformed into a grayscale one, and a wavelet transform based de-noising filter is used to extract the sensor pattern noise; then the variance of the pattern noise, the signal-to-noise ratio between the de-noised image and the pattern noise, the information entropy and the average energy gradient of the original grayscale image are chosen as features, and non-overlapping sliding window operations are applied to the images to divide them into different sub-blocks. Finally, the tampered areas are detected by analyzing the correlation of the features between the sub-blocks and the whole image. Experimental results and analysis show that the proposed scheme is completely passive-blind, has a good detection rate, and is robust against JPEG compression, noise, rotation, scaling and blurring. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  18. Joint reconstruction of multiview compressed images.

    PubMed

    Thirumalai, Vijayaraghavan; Frossard, Pascal

    2013-05-01

    Distributed representation of correlated multiview images is an important problem that arises in vision sensor networks. This paper concentrates on the joint reconstruction problem, where the distributively compressed images are decoded together in order to benefit from the image correlation. We consider a scenario where the images captured at different viewpoints are encoded independently using common coding solutions (e.g., JPEG) with a balanced rate distribution among different cameras. A central decoder first estimates the inter-view image correlation from the independently compressed data. The joint reconstruction is then cast as a constrained convex optimization problem that reconstructs total-variation (TV) smooth images, which comply with the estimated correlation model. At the same time, we add constraints that force the reconstructed images to be as close as possible to their compressed versions. We show through experiments that the proposed joint reconstruction scheme outperforms independent reconstruction in terms of image quality, for a given target bit rate. In addition, the decoding performance of our algorithm compares advantageously to state-of-the-art distributed coding schemes based on motion learning and on the DISCOVER algorithm.

  19. Diffractive-optical correlators: chances to make optical image preprocessing as intelligent as human vision

    NASA Astrophysics Data System (ADS)

    Lauinger, Norbert

    2004-10-01

    The human eye is a good model for the engineering of optical correlators. Three prominent intelligent functionalities in human vision could be realized in the near future by a new diffractive-optical hardware design of optical imaging sensors: (1) illuminant-adaptive RGB-based color vision, (2) monocular 3D vision based on RGB data processing, and (3) patchwise Fourier-optical object classification and identification. The hardware design of the human eye has specific diffractive-optical elements (DOEs) in aperture and in image space and seems to execute the three jobs at -- or not far behind -- the loci of the images of objects.

  20. Analysis of sensor network observations during some simulated landslide experiments

    NASA Astrophysics Data System (ADS)

    Scaioni, M.; Lu, P.; Feng, T.; Chen, W.; Wu, H.; Qiao, G.; Liu, C.; Tong, X.; Li, R.

    2012-12-01

    A multi-sensor network was tested during experiments on a landslide simulation platform established at Tongji University (Shanghai, P.R. China), where landslides were triggered by means of artificial rainfall (see Figure 1). The sensor network currently incorporates contact sensors and two imaging systems. This represents a novel solution, because the spatial sensor network incorporates both contact sensors and remote sensors (video-cameras). In the future, these sensors will be installed on two real ground slopes in Sichuan province (South-West China), where the Wenchuan earthquake occurred in 2008. This earthquake caused the immediate activation of several landslides, while other areas became unstable and still menace people and property. The platform incorporates the reconstructed scale slope, sensor network, communication system, database and visualization system. Some landslide simulation experiments allowed ascertaining which sensors could be most suitable for deployment in the Wenchuan area. The poster will focus on the analysis of results coming from down-scale simulations, in which the different steps of the landslide evolution can be followed on the basis of sensor observations. These include underground sensors to detect the water table level and the pressure in the ground, a set of accelerometers and two inclinometers. In the first part of the analysis, the full data series are investigated to look for correlations and common patterns, as well as to link them to the physical processes. In the second, four subsets of sensors located in neighboring positions are analyzed. The analysis of low- and high-speed image sequences allowed tracking of a dense displacement field on the slope surface. These outcomes have been compared to the ones obtained from accelerometers for cross-validation. Images were also used for the photogrammetric reconstruction of the slope topography during the experiment. Consequently, volume computation and mass movements could be evaluated on the basis of the processed images. Figure 1 - The landslide simulation platform at Tongji University at the end of an experiment. The picture shows the body of the simulated landslide.

  1. Adaptive Cross-correlation Algorithm and Experiment of Extended Scene Shack-Hartmann Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Morgan, Rhonda M.; Green, Joseph J.; Ohara, Catherine M.; Redding, David C.

    2007-01-01

    We have developed a new, adaptive cross-correlation (ACC) algorithm to estimate with high accuracy shifts as large as several pixels between two extended-scene images captured by a Shack-Hartmann wavefront sensor (SH-WFS). It determines the positions of all of the extended-scene image cells relative to a reference cell using an FFT-based iterative image shifting algorithm. It works with point-source spot images as well as extended-scene images. We have also set up a testbed for an extended-scene SH-WFS, and tested the ACC algorithm with measured data from both point-source and extended-scene images. In this paper we describe our algorithm and present our experimental results.
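The FFT-based shift estimation at the core of such an algorithm can be sketched as follows. This recovers integer shifts only; the ACC paper's iterative subpixel refinement is not reproduced, and the random test scene is an assumption for demonstration.

```python
import numpy as np

def fft_shift_estimate(ref, img):
    """Estimate the integer (row, col) shift of `img` relative to `ref`
    via FFT-based circular cross-correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)).real
    idx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak indices to signed shifts.
    return tuple(int(i) if i <= n // 2 else int(i) - n
                 for i, n in zip(idx, corr.shape))

rng = np.random.default_rng(3)
ref = rng.normal(size=(32, 32))
img = np.roll(ref, shift=(3, -2), axis=(0, 1))   # apply a known shift
shift = fft_shift_estimate(ref, img)
```

Computing the correlation in the Fourier domain costs O(N log N) instead of O(N^2) per cell, which is what makes per-subaperture correlation practical for a full Shack-Hartmann grid.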

  2. Characterization of microcracks by application of digital image correlation to SPM images

    NASA Astrophysics Data System (ADS)

    Keller, Juergen; Gollhardt, Astrid; Vogel, Dietmar; Michel, Bernd

    2004-07-01

    With the development of micro- and nanotechnological products such as sensors and MEMS/NEMS, and their broad application in a variety of market segments, new reliability issues will arise. The increasing interface-to-volume ratio in highly integrated systems and nanoparticle-filled materials, and unsolved questions of the size effect of nanomaterials, are challenges for experimental reliability evaluation. To fulfill these needs, the authors developed the nanoDAC method (nano Deformation Analysis by Correlation), which allows the determination and evaluation of 2D displacement fields based on scanning probe microscopy (SPM) data. In-situ SPM scans of the analyzed object are carried out at different thermo-mechanical load states. The obtained topography, phase or error images are compared utilizing grayscale cross-correlation algorithms. This allows the tracking of local image patterns of the analyzed surface structure. The measurement results of the nanoDAC method are full-field displacement and strain fields. Due to the application of SPM equipment, deformations in the micro- and nanometer range can be easily detected. The method can be performed on bulk materials, thin films and on devices, i.e., microelectronic components, sensors or MEMS/NEMS. Furthermore, the characterization and evaluation of micro- and nanocracks or defects in bulk materials, thin layers and at material interfaces can be carried out.

  3. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors.

    PubMed

    Kawahito, Shoji; Seo, Min-Woong

    2016-11-06

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction with respect to the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e− rms) when compared with the CMS gain of two (2.4 e− rms) or 16 (1.1 e− rms).
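The basic 1/sqrt(M) scaling of uncorrelated read noise with the sampling number M can be illustrated with a toy simulation. The noise values are arbitrary units, and this idealized model (white noise, perfect averaging) is an assumption, not the paper's circuit analysis.

```python
import numpy as np

def cms_readout(signal, read_noise, m, rng):
    """Correlated multiple sampling: average `m` noisy samples of the
    reset level and of the signal level, then take their difference."""
    n = len(signal)
    reset = rng.normal(0.0, read_noise, size=(n, m)).mean(axis=1)
    sig = (signal[:, None]
           + rng.normal(0.0, read_noise, size=(n, m))).mean(axis=1)
    return sig - reset

rng = np.random.default_rng(4)
dark = np.zeros(20000)                    # dark frame: output std = read noise
noise_m1 = cms_readout(dark, 1.0, 1, rng).std()
noise_m16 = cms_readout(dark, 1.0, 16, rng).std()
```

With M = 1 the difference of two samples gives sqrt(2) times the single-sample noise; averaging 16 samples of each level cuts that by a factor of 4, mirroring the trend the paper measures (until 1/f noise limits further gains).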

  4. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    PubMed Central

    Kawahito, Shoji; Seo, Min-Woong

    2016-01-01

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction with respect to the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e− rms) when compared with the CMS gain of two (2.4 e− rms) or 16 (1.1 e− rms). PMID:27827972

  5. Development of Ferrite-Based Temperature Sensors for Magnetic Resonance Imaging: A Study of Cu1-xZnxFe2O4

    NASA Astrophysics Data System (ADS)

    Alghamdi, N. A.; Hankiewicz, J. H.; Anderson, N. R.; Stupic, K. F.; Camley, R. E.; Przybylski, M.; Żukrowski, J.; Celinski, Z.

    2018-05-01

    We investigate the use of Cu1-xZnxFe2O4 ferrites (0.60

  6. Hadamard multimode optical imaging transceiver

    DOEpatents

    Cooke, Bradly J; Guenther, David C; Tiee, Joe J; Kellum, Mervyn J; Olivas, Nicholas L; Weisse-Bernstein, Nina R; Judd, Stephen L; Braun, Thomas R

    2012-10-30

    Disclosed is a method and system for simultaneously acquiring and producing results for multiple image modes using a common sensor without optical filtering, scanning, or other moving parts. The system and method utilize the Walsh-Hadamard correlation detection process (e.g., functions/matrix) to provide an all-binary structure that permits seamless bridging between analog and digital domains. An embodiment may capture an incoming optical signal at an optical aperture, convert the optical signal to an electrical signal, pass the electrical signal through a Low-Noise Amplifier (LNA) to create an LNA signal, pass the LNA signal through one or more correlators where each correlator has a corresponding Walsh-Hadamard (WH) binary basis function, calculate a correlation output coefficient for each correlator as a function of the corresponding WH binary basis function in accordance with Walsh-Hadamard mathematical principles, digitize each of the correlation output coefficients by passing them through an Analog-to-Digital Converter (ADC), and perform image mode processing on the digitized correlation output coefficients as desired to produce one or more image modes. Some, but not all, potential image modes include: multi-channel access, temporal, range, three-dimensional, and synthetic aperture.
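The Walsh-Hadamard correlate-and-recover idea can be sketched in a few lines using the Sylvester construction; the 8-sample signal is an arbitrary example, and the analog front end (LNA, ADC) is of course not modeled.

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of the n x n Walsh-Hadamard matrix
    (n must be a power of two)."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

# Each row is a binary (+/-1) basis function. Correlating the sampled
# signal against every row yields the WH coefficients; correlating the
# coefficients against the same rows recovers the signal exactly.
H = hadamard(8)
signal = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])
coeffs = H @ signal / 8.0      # correlation output coefficients
recovered = H.T @ coeffs       # H is symmetric and H @ H.T = 8 * I
```

Because the basis functions take only the values +1 and -1, each correlator reduces to sign flips and accumulation, which is what makes the all-binary analog/digital bridging in the patent attractive.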

  7. Denoising Algorithm for CFA Image Sensors Considering Inter-Channel Correlation.

    PubMed

    Lee, Min Seok; Park, Sang Wook; Kang, Moon Gi

    2017-05-28

    In this paper, a spatio-spectral-temporal filter considering an inter-channel correlation is proposed for the denoising of a color filter array (CFA) sequence acquired by CCD/CMOS image sensors. Owing to the alternating under-sampled grid of the CFA pattern, the inter-channel correlation must be considered in the direct denoising process. The proposed filter is applied in the spatial, spectral, and temporal domain, considering the spatio-spectral-temporal correlation. First, nonlocal means (NLM) spatial filtering with patch-based difference (PBD) refinement is performed by considering both the intra-channel correlation and inter-channel correlation to overcome the spatial resolution degradation occurring with the alternating under-sampled pattern. Second, a motion-compensated temporal filter that employs inter-channel correlated motion estimation and compensation is proposed to remove the noise in the temporal domain. Then, a motion adaptive detection value controls the ratio of the spatial filter and the temporal filter. The denoised CFA sequence can thus be obtained without motion artifacts. Experimental results for both simulated and real CFA sequences are presented with visual and numerical comparisons to several state-of-the-art denoising methods combined with a demosaicing method. Experimental results confirmed that the proposed framework outperformed the other techniques in terms of the objective criteria and subjective visual perception in CFA sequences.
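The patch-based-difference weighting at the heart of NLM filtering can be sketched for a single pixel. This grayscale toy is an assumption for illustration only; the paper's CFA-aware, inter-channel and temporal extensions are considerably more involved.

```python
import numpy as np

def nlm_pixel(img, r, c, patch=1, search=3, h=0.5):
    """Nonlocal-means estimate of one pixel: average pixels in a search
    window, weighted by the patch-based difference (PBD) between each
    candidate patch and the reference patch around (r, c)."""
    ref = img[r - patch:r + patch + 1, c - patch:c + patch + 1]
    num = den = 0.0
    for i in range(r - search, r + search + 1):
        for j in range(c - search, c + search + 1):
            p = img[i - patch:i + patch + 1, j - patch:j + patch + 1]
            w = np.exp(-((p - ref) ** 2).mean() / h ** 2)
            num += w * img[i, j]
            den += w
    return num / den

rng = np.random.default_rng(6)
# Flat region plus noise: similar patches everywhere, so NLM averages
# heavily and the estimate should land near the true level of 5.0.
flat = np.full((16, 16), 5.0) + rng.normal(scale=0.3, size=(16, 16))
est = nlm_pixel(flat, 8, 8)
```

In a CFA sequence the same weighting must compare patches on the under-sampled per-channel grids, which is why the inter-channel correlation enters the PBD computation in the proposed method.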

  8. Development of a video-based slurry sensor for on-line ash analysis. Fifth quarterly technical progress report, October 1, 1995--December 31, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adel, G.T.; Luttrell, G.H.

    Automatic control of fine coal cleaning circuits has traditionally been limited by the lack of sensors for on-line ash analysis. Although several nuclear-based analyzers are available, none has seen widespread acceptance. This is largely due to the fact that nuclear sensors are expensive and tend to be influenced by changes in seam type and pyrite content. Recently, researchers at VPI&SU have developed an optical sensor for phosphate analysis. The sensor uses image processing technology to analyze video images of phosphate ore. It is currently being used by PCS Phosphate for off-line analysis of dry flotation concentrate. The primary advantages of optical sensors over nuclear sensors are that they are significantly cheaper, are not subject to measurement variations due to changes in high-atomic-number materials, are inherently safer and require no special radiation permitting. The purpose of this work is to apply the knowledge gained in the development of an optical phosphate analyzer to the development of an on-line ash analyzer for fine coal slurries. During the past quarter, the current prototype of the on-line optical ash analyzer was subjected to extensive testing at the Middlefork coal preparation plant. Initial work focused on obtaining correlations between ash content and mean gray level, while developmental work on the more comprehensive neural network calibration approach continued. Test work to date shows a promising trend in the correlation between ash content and mean gray level. Unfortunately, data scatter remains significant. Recent tests seem to eliminate variations in percent solids, particle size distribution, measurement angle and light setting as causes for the data scatter; however, equipment warm-up time and number of images taken per measurement appear to have a significant impact on the gray-level values obtained. 8 figs., 8 tabs.

  9. Optimal full motion video registration with rigorous error propagation

    NASA Astrophysics Data System (ADS)

    Dolloff, John; Hottel, Bryant; Doucette, Peter; Theiss, Henry; Jocher, Glenn

    2014-06-01

    Optimal full motion video (FMV) registration is a crucial need for the Geospatial community. It is required for subsequent and optimal geopositioning with simultaneous and reliable accuracy prediction. An overall approach being developed for such registration is presented that models relevant error sources in terms of the expected magnitude and correlation of sensor errors. The corresponding estimator is selected based on the level of accuracy of the a priori information of the sensor's trajectory and attitude (pointing) information, in order to best deal with non-linearity effects. Estimator choices include near real-time Kalman Filters and batch Weighted Least Squares. Registration solves for corrections to the sensor a priori information for each frame. It also computes and makes available a posteriori accuracy information, i.e., the expected magnitude and correlation of sensor registration errors. Both the registered sensor data and its a posteriori accuracy information are then made available to "down-stream" Multi-Image Geopositioning (MIG) processes. An object of interest is then measured on the registered frames and a multi-image optimal solution, including reliable predicted solution accuracy, is then performed for the object's 3D coordinates. This paper also describes a robust approach to registration when a priori information of sensor attitude is unavailable. It makes use of structure-from-motion principles, but does not use standard Computer Vision techniques, such as estimation of the Essential Matrix which can be very sensitive to noise. The approach used instead is a novel, robust, direct search-based technique.
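The batch weighted least squares option, with the a priori sensor information folded in as a prior weight, can be sketched as follows. This is a hypothetical one-parameter example for illustration; the real problem estimates trajectory and attitude corrections per frame with full error correlation.

```python
import numpy as np

def wls_with_prior(A, z, meas_var, prior, prior_var):
    """Batch weighted least squares with an a priori parameter estimate.

    Solves (A'WA + P^-1) x = A'Wz + P^-1 prior, with diagonal weights.
    Returns the estimate and its a posteriori covariance.
    """
    W = np.diag(1.0 / meas_var)
    Pinv = np.diag(1.0 / prior_var)
    lhs = A.T @ W @ A + Pinv
    rhs = A.T @ W @ z + Pinv @ prior
    x = np.linalg.solve(lhs, rhs)
    post_cov = np.linalg.inv(lhs)   # a posteriori accuracy information
    return x, post_cov

# Hypothetical scalar case: 50 noisy observations of a constant bias
# of 2.0, with a weak prior of 0.0 (prior variance 100).
rng = np.random.default_rng(5)
A = np.ones((50, 1))
z = 2.0 + rng.normal(scale=0.1, size=50)
x, cov = wls_with_prior(A, z, np.full(50, 0.01),
                        np.array([0.0]), np.array([100.0]))
```

The same a posteriori covariance that the solver returns is what gets handed to the down-stream Multi-Image Geopositioning step as the registration accuracy prediction.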

  10. German Radar Observation Shuttle Experiment (ROSE)

    NASA Technical Reports Server (NTRS)

    Sleber, A. J.; Hartl, P.; Haydn, R.; Hildebrandt, G.; Konecny, G.; Muehlfeld, R.

    1984-01-01

    The success of radar sensors in several different application areas of interest depends on the knowledge of the backscatter of radar waves from the targets of interest, the variance of these interaction mechanisms with respect to changing measurement parameters, and the determination of the influence of the measuring systems on the results. The incidence-angle dependency of the radar cross section of different natural targets is derived. Problems involved in combining data gained with different sensors, e.g., MSS, TM, SPOT, and SAR images, are analyzed. Radar cross-section values gained with ground-based radar spectrometers, spaceborne imaging and non-imaging scatterometers, and spaceborne radar images from the same areal target are correlated. The penetration of L-band radar waves into vegetated and non-vegetated surfaces is analyzed.

  11. A novel, optical, on-line bacteria sensor for monitoring drinking water quality

    PubMed Central

    Højris, Bo; Christensen, Sarah Christine Boesgaard; Albrechtsen, Hans-Jørgen; Smith, Christian; Dahlqvist, Mathis

    2016-01-01

    Today, microbial drinking water quality is monitored through either time-consuming laboratory methods or indirect on-line measurements. Results are thus either delayed or insufficient to support proactive action. A novel, optical, on-line bacteria sensor with a 10-minute time resolution has been developed. The sensor is based on 3D image recognition, and the obtained pictures are analyzed with algorithms considering 59 quantified image parameters. The sensor counts individual suspended particles and classifies them as either bacteria or abiotic particles. The technology is capable of distinguishing and quantifying bacteria and particles in pure and mixed suspensions, and the quantification correlates with total bacterial counts. Several field applications have demonstrated that the technology can monitor changes in the concentration of bacteria, and is thus well suited for rapid detection of critical conditions such as pollution events in drinking water. PMID:27040142

  12. A novel, optical, on-line bacteria sensor for monitoring drinking water quality.

    PubMed

    Højris, Bo; Christensen, Sarah Christine Boesgaard; Albrechtsen, Hans-Jørgen; Smith, Christian; Dahlqvist, Mathis

    2016-04-04

    Today, microbial drinking water quality is monitored through either time-consuming laboratory methods or indirect on-line measurements. Results are thus either delayed or insufficient to support proactive action. A novel, optical, on-line bacteria sensor with a 10-minute time resolution has been developed. The sensor is based on 3D image recognition, and the obtained pictures are analyzed with algorithms considering 59 quantified image parameters. The sensor counts individual suspended particles and classifies them as either bacteria or abiotic particles. The technology is capable of distinguishing and quantifying bacteria and particles in pure and mixed suspensions, and the quantification correlates with total bacterial counts. Several field applications have demonstrated that the technology can monitor changes in the concentration of bacteria, and is thus well suited for rapid detection of critical conditions such as pollution events in drinking water.

  13. RadMAP: The Radiological Multi-sensor Analysis Platform

    NASA Astrophysics Data System (ADS)

    Bandstra, Mark S.; Aucott, Timothy J.; Brubaker, Erik; Chivers, Daniel H.; Cooper, Reynold J.; Curtis, Joseph C.; Davis, John R.; Joshi, Tenzing H.; Kua, John; Meyer, Ross; Negut, Victor; Quinlan, Michael; Quiter, Brian J.; Srinivasan, Shreyas; Zakhor, Avideh; Zhang, Richard; Vetter, Kai

    2016-12-01

    The variability of gamma-ray and neutron background during the operation of a mobile detector system greatly limits the ability of the system to detect weak radiological and nuclear threats. The natural radiation background measured by a mobile detector system is the result of many factors, including the radioactivity of nearby materials, the geometric configuration of those materials and the system, the presence of absorbing materials, and atmospheric conditions. Background variations tend to be highly non-Poissonian, making it difficult to set robust detection thresholds using knowledge of the mean background rate alone. The Radiological Multi-sensor Analysis Platform (RadMAP) system is designed to allow the systematic study of natural radiological background variations and to serve as a development platform for emerging concepts in mobile radiation detection and imaging. To do this, RadMAP has been used to acquire extensive, systematic background measurements and correlated contextual data that can be used to test algorithms and detector modalities at low false alarm rates. By combining gamma-ray and neutron detector systems with data from contextual sensors, the system enables the fusion of data from multiple sensors into novel data products. The data are curated in a common format that allows for rapid querying across all sensors, creating detailed multi-sensor datasets that are used to study correlations between radiological and contextual data, and develop and test novel techniques in mobile detection and imaging. In this paper we will describe the instruments that comprise the RadMAP system, the effort to curate and provide access to multi-sensor data, and some initial results on the fusion of contextual and radiological data.

  14. Demonstration of the CDMA-mode CAOS smart camera.

    PubMed

    Riza, Nabeel A; Mazhar, Mohsin A

    2017-12-11

    Demonstrated is the code division multiple access (CDMA)-mode coded access optical sensor (CAOS) smart camera suited for bright target scenarios. Deploying a silicon CMOS sensor and a silicon point detector within a digital micro-mirror device (DMD)-based spatially isolating hybrid camera design, this smart imager first engages the DMD staring mode with a controlled factor-of-200 optical attenuation of the scene irradiance to provide a classic unsaturated CMOS sensor-based image for target intelligence gathering. Next, this CMOS sensor-provided image data is used to acquire a more robust, un-attenuated true target image of a focused zone using the time-modulated CDMA-mode of the CAOS camera. Using four different bright-light test target scenes, successfully demonstrated is a proof-of-concept visible-band CAOS smart camera operating in the CDMA-mode using up to 4096-bit Walsh design CAOS pixel codes with a maximum 10 kHz code bit rate, giving a 0.4096 second CAOS frame acquisition time. A 16-bit analog-to-digital converter (ADC) with time-domain correlation digital signal processing (DSP) generates the CDMA-mode images with a 3600 CAOS pixel count and a best spatial resolution of one square micro-mirror pixel of 13.68 μm side. The CDMA-mode of the CAOS smart camera is suited for applications where robust high dynamic range (DR) imaging is needed for un-attenuated, un-spoiled bright-light spectrally diverse targets.
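The CDMA decoding principle this camera relies on — each agile pixel modulated by an orthogonal Walsh code, all light summed on one point detector, and individual pixels recovered by time-domain correlation — can be illustrated with an idealized bipolar-code sketch (real hardware uses unipolar light modulation with a bias; names here are illustrative, not the authors' code):

```python
import numpy as np

def walsh_matrix(n):
    """Sylvester-construction Hadamard matrix; its rows serve as
    mutually orthogonal Walsh-type codes (n must be a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def cdma_decode(detector_signal, code):
    """Recover one pixel's irradiance by correlating the summed
    point-detector time series with that pixel's bipolar code;
    orthogonality cancels the contributions of all other pixels."""
    return float(detector_signal @ code) / len(code)
```

With `codes = walsh_matrix(n)`, the simulated detector signal is `pixels @ codes`, and `cdma_decode(signal, codes[i])` returns pixel `i` exactly in this noiseless idealization.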

  15. Tactile sensor is useful for estimating liver hardness and liver fibrosis compared with ultrasonography and computed tomography.

    PubMed

    Suzuki, Satoshi; Watanabe, Yohei; Yazawa, Takashi; Ishigame, Teruhide; Sassa, Motoki; Monma, Tomoyuki; Takawa, Tadashi; Kumamoto, Kensuke; Nakamura, Izumi; Ohoki, Shinji; Hatakeyama, Yuichi; Sakuma, Hiroshi; Ono, Toshiyuki; Omata, Sadao; Takenoshita, Seiichi

    2014-01-01

    We examined whether conventional ultrasonography (US) and computed tomography (CT) were useful to evaluate liver hardness and hepatic fibrosis by comparing the results with those obtained by a tactile sensor, using rats with liver fibrosis. We used 44 Wistar rats in which liver fibrosis was induced by intraperitoneal administration of thioacetamide. The CT and US values of each liver were measured before laparotomy. After laparotomy, a tactile sensor was used to measure liver hardness. We prepared Azan-stained sections of each excised liver specimen and calculated the degree of liver fibrosis (HFI: hepatic fibrosis index) by computed color image analysis. The stiffness values and HFI showed a positive correlation (r=0.690, p<0.001), as did the tactile values and HFI (r=0.709, p<0.001). In addition, the stiffness and tactile values correlated positively with each other (r=0.814, p<0.001). There was no correlation between the CT values and HFI, nor between the US values and HFI. We confirmed that it was difficult to evaluate liver hardness and HFI by CT or US examination, and considered that, at present, a tactile sensor is a useful method for evaluating HFI.
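The r values reported above are Pearson product-moment correlations; for reference, a self-contained implementation of that statistic:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```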

  16. A forestry GIS-based study on evaluating the potential of imaging spectroscopy in mapping forest land fertility

    NASA Astrophysics Data System (ADS)

    Mõttus, Matti; Takala, Tuure

    2014-12-01

    Fertility, or the availability of nutrients and water, controls forest productivity. It thereby affects the forest's carbon sequestration, and thus its effect on climate, as well as its commercial value. Although the availability of nutrients cannot be measured directly using remote sensing methods, fertility alters several vegetation traits detectable from the reflectance spectra of the forest stand, including its pigment content and water stress. However, forest reflectance is also influenced by other factors, such as species composition and stand age. Here, we present a case study demonstrating how data obtained using imaging spectroscopy are correlated with site fertility. The study was carried out in Hyytiälä, Finland, in the southern boreal forest zone. We used a database of state-owned forest stands including basic forestry variables and a site fertility index. To test the suitability of imaging spectroscopy with different spatial and spectral resolutions for site fertility mapping, we performed two airborne acquisitions using different sensor configurations. First, the sensor was flown at a high altitude with high spectral resolution, resulting in a pixel size in the order of a tree crown. Next, the same area was flown to provide reflectance data with sub-meter spatial resolution. However, to maintain usable signal-to-noise ratios, several spectral channels inside the sensor were combined, thus reducing spectral resolution. We correlated a number of narrowband vegetation indices (describing canopy biochemical composition, structure, and photosynthetic activity) against site fertility. Overall, site fertility had a significant influence on the vegetation indices, but the strength of the correlation depended on the dominant species. We found that high spatial resolution data calculated from the spectra of sunlit parts of tree crowns had the strongest correlation with site fertility.

  17. High-resolution depth profiling using a range-gated CMOS SPAD quanta image sensor.

    PubMed

    Ren, Ximing; Connolly, Peter W R; Halimi, Abderrahim; Altmann, Yoann; McLaughlin, Stephen; Gyongy, Istvan; Henderson, Robert K; Buller, Gerald S

    2018-03-05

    A CMOS single-photon avalanche diode (SPAD) quanta image sensor is used to reconstruct depth and intensity profiles when operating in a range-gated mode in conjunction with pulsed laser illumination. By designing the CMOS SPAD array to acquire photons within a pre-determined temporal gate, the need for timing circuitry was avoided, and it was therefore possible to achieve an enhanced fill factor (61% in this case) and a frame rate (100,000 frames per second) that are difficult to achieve in a SPAD array that uses time-correlated single-photon counting. When coupled with appropriate image reconstruction algorithms, millimeter-resolution depth profiles were achieved by iterating through a sequence of temporal delay steps in synchronization with the laser illumination pulses. For photon data with high signal-to-noise ratios, depth images with millimeter-scale depth uncertainty can be estimated using a standard cross-correlation approach. To enhance the estimation of depth and intensity images in the sparse photon regime, we used a bespoke clustering-based image restoration strategy, taking into account the binomial statistics of the photon data and non-local spatial correlations within the scene. For sparse photon data with total exposure times of 75 ms or less, the bespoke algorithm can reconstruct depth images with millimeter-scale depth uncertainty at a stand-off distance of approximately 2 meters. We demonstrate a new approach to single-photon depth and intensity profiling using different target scenes, taking full advantage of the high fill factor, high frame rate and large array format of this range-gated CMOS SPAD array.
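In the high signal-to-noise case, the standard cross-correlation depth estimate mentioned above amounts to sliding an instrument-response template over the gate-scan photon histogram and converting the best-match delay into range. A simplified single-pixel sketch (assuming a known, symmetric instrument response; not the authors' code):

```python
import numpy as np

def depth_from_gate_scan(counts, delays_s, irf, c=3.0e8):
    """Cross-correlate a gate-scan photon histogram with the instrument
    response function (IRF) and convert the best-match delay to range,
    halving for the two-way travel of the laser pulse."""
    xc = np.correlate(counts, irf, mode="same")   # template match over delays
    t_peak = delays_s[int(np.argmax(xc))]         # delay of best correlation
    return 0.5 * c * t_peak
```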

  18. Enhance the Quality of Crowdsensing for Fine-Grained Urban Environment Monitoring via Data Correlation

    PubMed Central

    Kang, Xu; Liu, Liang; Ma, Huadong

    2017-01-01

    Monitoring the status of urban environments, which provides fundamental information for a city, yields crucial insights into various fields of urban research. Recently, with the popularity of smartphones and vehicles equipped with onboard sensors, a people-centric scheme, namely “crowdsensing”, for city-scale environment monitoring is emerging. This paper proposes a data correlation based crowdsensing approach for fine-grained urban environment monitoring. To demonstrate urban status, we generate sensing images via crowdsensing network, and then enhance the quality of sensing images via data correlation. Specifically, to achieve a higher quality of sensing images, we not only utilize temporal correlation of mobile sensing nodes but also fuse the sensory data with correlated environment data by introducing a collective tensor decomposition approach. Finally, we conduct a series of numerical simulations and a real dataset based case study. The results validate that our approach outperforms the traditional spatial interpolation-based method. PMID:28054968

  19. A simple and low-cost biofilm quantification method using LED and CMOS image sensor.

    PubMed

    Kwak, Yeon Hwa; Lee, Junhee; Lee, Junghoon; Kwak, Soo Hwan; Oh, Sangwoo; Paek, Se-Hwan; Ha, Un-Hwan; Seo, Sungkyu

    2014-12-01

    A novel biofilm detection platform, which consists of a cost-effective red, green, and blue light-emitting diode (RGB LED) as a light source and a lens-free CMOS image sensor as a detector, is designed. This system can measure the diffraction patterns of cells from their shadow images and gather light absorbance information according to the concentration of biofilms through a simple image processing procedure. Compared to a bulky and expensive commercial spectrophotometer, this platform can provide accurate and reproducible biofilm concentration detection and is simple, compact, and inexpensive. Biofilms originating from various bacterial strains, including Pseudomonas aeruginosa (P. aeruginosa), were tested to demonstrate the efficacy of this new biofilm detection approach. The results were compared with the results obtained from a commercial spectrophotometer. To utilize a cost-effective light source (i.e., an LED) for biofilm detection, the illumination conditions were optimized. For accurate and reproducible biofilm detection, a simple, custom-coded image processing algorithm was developed and applied to a five-megapixel CMOS image sensor, which is a cost-effective detector. The concentration of biofilms formed by P. aeruginosa was detected and quantified by varying the indole concentration, and the results were compared with the results obtained from a commercial spectrophotometer. The correlation value of the results from those two systems was 0.981 (N = 9, P < 0.01) and the coefficients of variation (CVs) were approximately threefold lower with the CMOS image-sensor platform. Copyright © 2014 Elsevier B.V. All rights reserved.
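The light-absorbance reading extracted from the shadow images follows the usual Beer-Lambert form; a minimal sketch, assuming the sample and blank intensities are mean pixel values from the CMOS images (function name is illustrative):

```python
import math

def absorbance(sample_intensity, blank_intensity):
    """Beer-Lambert absorbance from mean image intensities:
    A = -log10(I_sample / I_blank); higher biofilm concentration
    absorbs more light and yields a larger A."""
    return -math.log10(sample_intensity / blank_intensity)
```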

  20. Characterisation of GaAs:Cr pixel sensors coupled to Timepix chips in view of synchrotron applications

    NASA Astrophysics Data System (ADS)

    Ponchut, C.; Cotte, M.; Lozinskaya, A.; Zarubin, A.; Tolbanov, O.; Tyazhev, A.

    2017-12-01

    In order to meet the needs of some ESRF beamlines for highly efficient 2D X-ray detectors in the 20-50 keV range, GaAs:Cr pixel sensors coupled to TIMEPIX readout chips were implemented into a MAXIPIX detector. Use of the GaAs:Cr sensor material is intended to overcome the limitations of Si (low absorption) and of CdTe (fluorescence) in this energy range. The GaAs:Cr sensor assemblies were characterised with both laboratory X-ray sources and monochromatic synchrotron X-ray beams. The sensor response as a function of bias voltage was compared to a theoretical model, leading to an estimation of the μτ product of electrons in the GaAs:Cr sensor material of 1.6×10⁻⁴ cm²/V. The spatial homogeneity of X-ray images obtained with the sensors was measured in different irradiation conditions, showing a particular sensitivity to small variations in the incident beam spectrum. 2D-resolved elemental mapping of the sensor surface was carried out to investigate a possible relation between the noise pattern observed in X-ray images and local fluctuations in chemical composition. A scanning of the sensor response at subpixel scale revealed that these irregularities can be correlated with a distortion of the effective pixel shapes.

  1. On the influence of noise correlations in measurement data on basis image noise in dual-energylike x-ray imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roessl, Ewald; Ziegler, Andy; Proksa, Roland

    2007-03-15

    In conventional dual-energy systems, two transmission measurements with distinct spectral characteristics are performed. These measurements are used to obtain the line integrals of two basis material densities. Usually, the measurement process is such that the two measured signals can be treated as independent and therefore uncorrelated. Recently, however, a readout system for x-ray detectors has been introduced for which this is no longer the case. The readout electronics is designed to obtain simultaneous measurements of the total number of photons N and the total energy E they deposit in the sensor material. Practically, this is realized by a signal replication and separate counting and integrating processing units. Since the quantities N and E are (electronically) derived from one and the same physical sensor signal, they are statistically correlated. Nevertheless, the pair N and E can be used to perform a dual-energy processing following the well-known approach by Alvarez and Macovski. Formally, this means that N is to be identified with the first dual-energy measurement M₁ and E with the second measurement M₂. In the presence of input correlations between M₁=N and M₂=E, however, the corresponding analytic expressions for the basis image noise have to be modified. The main observation made in this paper is that for positively correlated data, as is the case for the simultaneous counting and integrating device mentioned above, the basis image noise is suppressed through the influence of the covariance between the two signals. We extend the previously published relations for the basis image noise to the case where the original measurements are not independent and illustrate the importance of the input correlations by comparing dual-energy basis image noise resulting from the device mentioned above and a device measuring the photon numbers and the deposited energies consecutively.
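The modification referred to above follows from standard first-order error propagation with a covariance term; for a basis line integral A_i(M₁, M₂), the generic identity (not the paper's exact expressions) reads:

```latex
\sigma^2(A_i)
= \left(\frac{\partial A_i}{\partial M_1}\right)^{2}\sigma^2(M_1)
+ \left(\frac{\partial A_i}{\partial M_2}\right)^{2}\sigma^2(M_2)
+ 2\,\frac{\partial A_i}{\partial M_1}\,\frac{\partial A_i}{\partial M_2}\,
  \operatorname{cov}(M_1, M_2)
```

When cov(M₁, M₂) > 0 and the two partial derivatives have opposite signs, as is typical in basis material decomposition, the cross term is negative and the basis image noise is suppressed, consistent with the observation above.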

  2. Evaluation of a linear spectral mixture model and vegetation indices (NDVI and EVI) in a study of schistosomiasis mansoni and Biomphalaria glabrata distribution in the state of Minas Gerais, Brazil.

    PubMed

    Guimarães, Ricardo J P S; Freitas, Corina C; Dutra, Luciano V; Scholte, Ronaldo G C; Amaral, Ronaldo S; Drummond, Sandra C; Shimabukuro, Yosio E; Oliveira, Guilherme C; Carvalho, Omar S

    2010-07-01

    This paper analyses the associations between Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index (EVI) on the prevalence of schistosomiasis and the presence of Biomphalaria glabrata in the state of Minas Gerais (MG), Brazil. Additionally, vegetation, soil and shade fraction images were created using a Linear Spectral Mixture Model (LSMM) from the blue, red and infrared channels of the Moderate Resolution Imaging Spectroradiometer spaceborne sensor and the relationship between these images and the prevalence of schistosomiasis and the presence of B. glabrata was analysed. First, we found a high correlation between the vegetation fraction image and EVI and second, a high correlation between soil fraction image and NDVI. The results also indicate that there was a positive correlation between prevalence and the vegetation fraction image (July 2002), a negative correlation between prevalence and the soil fraction image (July 2002) and a positive correlation between B. glabrata and the shade fraction image (July 2002). This paper demonstrates that the LSMM variables can be used as a substitute for the standard vegetation indices (EVI and NDVI) to determine and delimit risk areas for B. glabrata and schistosomiasis in MG, which can be used to improve the allocation of resources for disease control.
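For reference, the two standard indices compared in this study can be written directly from per-band reflectances, using the usual MODIS EVI coefficients (a generic sketch, not the authors' processing chain):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index with the standard MODIS coefficients."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)
```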

  3. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques

    PubMed Central

    Maestre-Rendon, J. Rodolfo; Sierra-Hernandez, Juan M.; Contreras-Medina, Luis M.; Fernandez-Jaramillo, Arturo A.

    2017-01-01

    Manual measurements of foot anthropometry can lead to errors, since this task involves the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image and the interpretation of the results through a quantitative evaluation. The smart sensor implemented required the use of a camera (Logitech C920) connected to a Raspberry Pi 3, where a graphical interface was made for the capture and processing of the image, and it was adapted to a podoscope conventionally used by specialists such as orthopedists, physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive, and correlated at 0.99 with the measurements from the digitalized image of the ink mat. PMID:29165397

  4. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques.

    PubMed

    Maestre-Rendon, J Rodolfo; Rivera-Roman, Tomas A; Sierra-Hernandez, Juan M; Cruz-Aceves, Ivan; Contreras-Medina, Luis M; Duarte-Galvan, Carlos; Fernandez-Jaramillo, Arturo A

    2017-11-22

    Manual measurements of foot anthropometry can lead to errors, since this task involves the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image and the interpretation of the results through a quantitative evaluation. The smart sensor implemented required the use of a camera (Logitech C920) connected to a Raspberry Pi 3, where a graphical interface was made for the capture and processing of the image, and it was adapted to a podoscope conventionally used by specialists such as orthopedists, physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive, and correlated at 0.99 with the measurements from the digitalized image of the ink mat.

  5. A 7 ke-SD-FWC 1.2 e-RMS Temporal Random Noise 128×256 Time-Resolved CMOS Image Sensor With Two In-Pixel SDs for Biomedical Applications.

    PubMed

    Seo, Min-Woong; Kawahito, Shoji

    2017-12-01

    A lock-in pixel CMOS image sensor (CIS) embedded with two in-pixel storage diodes (SDs), offering a large full well capacity (FWC) for a wide signal detection range and low temporal random noise for high sensitivity, has been developed and is presented in this paper. For fast charge transfer from the photodiode to the SDs, a lateral electric field charge modulator (LEFM) is used in the developed lock-in pixel. As a result, the time-resolved CIS achieves a very large SD-FWC of approximately 7 ke⁻, low temporal random noise of 1.2 e⁻ rms at 20 fps with true correlated double sampling operation, and a fast intrinsic response of less than 500 ps at 635 nm. The proposed imager has an effective pixel array of 128×256. The sensor chip is fabricated in a Dongbu HiTek 1P4M 0.11 μm CIS process.

  6. Gold nanoparticle flow sensors designed for dynamic X-ray imaging in biofluids.

    PubMed

    Ahn, Sungsook; Jung, Sung Yong; Lee, Jin Pyung; Kim, Hae Koo; Lee, Sang Joon

    2010-07-27

    X-ray-based imaging is one of the most powerful and convenient methods in terms of versatility in applicable energy and high performance in use. Unlike conventional nuclear medicine imaging, X-ray imaging requires contrast agents, especially for effectively targeted and molecularly specific functions. Here, in contrast to the much-reported static accumulation of contrast agents in targeted organs, dynamic visualization in a living organism is successfully accomplished by particle-traced X-ray imaging for the first time. Flow phenomena across perforated end walls of xylem vessels in rice are monitored by a gold nanoparticle (AuNP) (approximately 20 nm in diameter) acting as a flow-tracing sensor working in non-transparent biofluids. AuNPs are surface-modified to control hydrodynamic properties such as hydrodynamic size (DH), zeta potential, and surface plasmonic properties in aqueous conditions. Transmission electron microscopy (TEM), scanning electron microscopy (SEM), X-ray nanoscopy (XN), and X-ray microscopy (XM) are used to correlate the interparticle interactions with X-ray absorption ability. Cluster formation and X-ray contrast ability of the AuNPs, evaluated as flow-tracing sensors, are successfully modulated by controlling the interparticle interactions.

  7. Adaptation of reference volumes for correlation-based digital holographic particle tracking

    NASA Astrophysics Data System (ADS)

    Hesseling, Christina; Peinke, Joachim; Gülker, Gerd

    2018-04-01

    Numerically reconstructed reference volumes tailored to particle images are used for particle position detection by means of three-dimensional correlation. After a first tracking of these positions, the experimentally recorded particle images are retrieved as a posteriori knowledge about the particle images in the system. This knowledge is used for a further refinement of the detected positions. A transparent description of the individual algorithm steps, including the results retrieved with experimental data, completes the paper. The work employs extraordinarily small particles, smaller than the pixel pitch of the camera sensor. It is the first approach known to the authors that combines numerical knowledge about particle images with particle images retrieved from the experimental system in an iterative particle tracking approach for digital holographic particle tracking velocimetry.

  8. Ionizing radiation effects on CMOS imagers manufactured in deep submicron process

    NASA Astrophysics Data System (ADS)

    Goiffon, Vincent; Magnan, Pierre; Bernard, Frédéric; Rolland, Guy; Saint-Pé, Olivier; Huger, Nicolas; Corbière, Franck

    2008-02-01

    We present here a study on both CMOS sensors and elementary structures (photodiodes and in-pixel MOSFETs) manufactured in a deep submicron process dedicated to imaging. We designed a test chip made of one 128×128-3T-pixel array with 10 μm pitch and more than 120 isolated test structures including photodiodes and MOSFETs with various implants and different sizes. All these devices were exposed to ionizing radiation up to 100 krad and their responses were correlated to identify the CMOS sensor weaknesses. Characterizations in darkness and under illumination demonstrated that dark current increase is the major sensor degradation. Shallow trench isolation was identified to be responsible for this degradation as it increases the number of generation centers in photodiode depletion regions. Consequences on hardness assurance and hardening-by-design are discussed.

  9. Optimal distance of multi-plane sensor in three-dimensional electrical impedance tomography.

    PubMed

    Hao, Zhenhua; Yue, Shihong; Sun, Benyuan; Wang, Huaxiang

    2017-12-01

    Electrical impedance tomography (EIT) is a visual imaging technique for obtaining the conductivity and permittivity distributions in the domain of interest. As an advanced technique, EIT has the potential to be a valuable tool for continuous bedside monitoring of pulmonary function. EIT applications in any three-dimensional (3D) field are severely limited by 3D effects, i.e. the distribution of the electric field spreads far beyond the electrode plane. The 3D effects can result in measurement errors and image distortion. An important way to overcome the 3D effect is to use multiple groups of sensors. The aim of this paper is to find the best spatial resolution of the EIT image over various electrode planes, select an optimal plane spacing in a 3D EIT sensor, and provide guidance for 3D EIT electrode placement in monitoring lung function. In simulation and experiment, several typical conductivity distribution models, such as one rod (central, midway and edge), two rods and three rods, are set at different plane spacings between the two electrode planes. A Tikhonov regularization algorithm is utilized for reconstructing the images; the relative error and the correlation coefficient are utilized for evaluating the image quality. Based on numerical simulation and experimental results, the image performance at different spacing conditions is evaluated. The results demonstrate that there exists an optimal plane spacing between the two electrode planes for a 3D EIT sensor. The selection of the optimal plane spacing between the electrode planes is then suggested for the electrode placement of a multi-plane EIT sensor.
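The two figures of merit used above — relative image error and correlation coefficient between reconstructed and reference conductivity maps — are commonly defined as follows (a generic sketch, not the authors' code):

```python
import numpy as np

def relative_error(recon, true):
    """Relative image error ||recon - true|| / ||true||."""
    return float(np.linalg.norm(recon - true) / np.linalg.norm(true))

def correlation_coefficient(recon, true):
    """Pearson correlation between reconstructed and reference maps."""
    return float(np.corrcoef(np.ravel(recon), np.ravel(true))[0, 1])
```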

  10. Joint demosaicking and zooming using moderate spectral correlation and consistent edge map

    NASA Astrophysics Data System (ADS)

    Zhou, Dengwen; Dong, Weiming; Chen, Wengang

    2014-07-01

    The recently published joint demosaicking and zooming algorithms for single-sensor digital cameras all overfit the popular Kodak test images, which have been found to have higher spectral correlation than typical color images. Their performance may therefore degrade significantly on other datasets, such as the McMaster test images, which have weak spectral correlation. A new joint demosaicking and zooming algorithm is proposed for the Bayer color filter array (CFA) pattern, in which the edge direction information (edge map) extracted from the raw CFA data is consistently used in demosaicking and zooming. It also moderately utilizes the spectral correlation between color planes. The experimental results confirm that the proposed algorithm produces an excellent performance on both the Kodak and McMaster datasets in terms of both subjective and objective measures. Our algorithm also has high computational efficiency. It provides a better tradeoff among adaptability, performance, and computational cost compared to the existing algorithms.

  11. Hierarchical classification in high dimensional numerous class cases

    NASA Technical Reports Server (NTRS)

    Kim, Byungyong; Landgrebe, D. A.

    1990-01-01

    As progress in new sensor technology continues, increasingly high resolution imaging sensors are being developed. These sensors give more detailed and complex data for each picture element and greatly increase the dimensionality of data over past systems. Three methods for designing a decision tree classifier are discussed: a top down approach, a bottom up approach, and a hybrid approach. Three feature extraction techniques are implemented. Canonical and extended canonical techniques are mainly dependent upon the mean difference between two classes. An autocorrelation technique is dependent upon the correlation differences. The mathematical relationship between sample size, dimensionality, and risk value is derived.

  12. A Novel Multi-Aperture Based Sun Sensor Based on a Fast Multi-Point MEANSHIFT (FMMS) Algorithm

    PubMed Central

    You, Zheng; Sun, Jian; Xing, Fei; Zhang, Gao-Fei

    2011-01-01

    With the current widespread interest in the development and applications of micro/nanosatellites, a small, high-accuracy satellite attitude determination system is needed, because the star trackers widely used in large satellites are too large and heavy for installation on micro/nanosatellites. A Sun sensor plus magnetometer has proven to be a better alternative, but conventional sun sensors have low accuracy and cannot meet the requirements of micro/nanosatellite attitude determination systems, so the development of a small, high-accuracy, highly reliable sun sensor is very significant. This paper presents a multi-aperture sun sensor composed of a micro-electro-mechanical system (MEMS) mask with 36 apertures and a CMOS active pixel sensor (APS) placed below the mask at a certain distance. A novel fast multi-point MEANSHIFT (FMMS) algorithm is proposed to improve the accuracy and reliability, the two key performance features, of an APS sun sensor. When sunlight illuminates the sensor, a sun-spot array image is formed on the APS detector, and the sun angles are derived by analyzing the aperture image locations on the detector via the FMMS algorithm. With this system, the centroid accuracy of the sun image reaches 0.01 pixels without increased weight or power consumption, even when missing apertures and bad pixels appear on the detector due to device aging and operation in a harsh space environment, whereas the pointing accuracy of a single-aperture sun sensor using the conventional correlation algorithm is only 0.05 pixels. PMID:22163770
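
    The spot-location step can be illustrated with a plain intensity-weighted centroid over one aperture's neighborhood; this is a hypothetical simplified stand-in, not the FMMS algorithm itself:

```python
def spot_centroid(img):
    # img: 2-D list of pixel intensities around one aperture's sun spot.
    # Returns the intensity-weighted centroid (x, y) in pixel coordinates.
    total = sx = sy = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            total += v
            sx += x * v
            sy += y * v
    return sx / total, sy / total
```

    Averaging such centroids over many apertures is what lets the multi-aperture design tolerate missing spots and bad pixels: a corrupted aperture simply drops out of the average.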

  13. Interference data correction methods for lunar observation with a large-aperture static imaging spectrometer.

    PubMed

    Zhang, Geng; Wang, Shuang; Li, Libo; Hu, Xiuqing; Hu, Bingliang

    2016-11-01

    The lunar spectrum has been used in radiometric calibration and sensor stability monitoring for spaceborne optical sensors. A ground-based large-aperture static image spectrometer (LASIS) can be used to acquire the lunar spectral image for lunar radiance model improvement when the moon orbits over its viewing field. The lunar orbiting behavior is not consistent with the desired scanning speed and direction of LASIS. To correctly extract interferograms from the obtained data, a translation correction method based on image correlation is proposed. This method registers the frames to a reference frame to reduce accumulative errors. Furthermore, we propose a circle-matching-based approach to achieve even higher accuracy during observation of the full moon. To demonstrate the effectiveness of our approaches, experiments are run on true lunar observation data. The results show that the proposed approaches outperform the state-of-the-art methods.
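
    The frame-to-reference registration by correlation described above can be sketched in one dimension (pure Python with illustrative names; the paper's method operates on 2-D frames and adds a circle-matching refinement):

```python
def best_shift(frame, ref, max_shift):
    # Brute-force search for the integer translation of `frame` that
    # maximizes its correlation with the reference signal `ref`.
    best_score, best_s = None, 0
    n = len(ref)
    for s in range(-max_shift, max_shift + 1):
        score = sum(frame[i + s] * ref[i]
                    for i in range(n) if 0 <= i + s < len(frame))
        if best_score is None or score > best_score:
            best_score, best_s = score, s
    return best_s
```

    Registering every frame to a single reference, rather than to its predecessor, is what prevents the accumulative drift the abstract mentions.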

  14. Spatio-Temporal Variations in the Associations between Hourly PM2.5 and Aerosol Optical Depth (AOD) from MODIS Sensors on Terra and Aqua*

    PubMed Central

    Kim, Minho; Zhang, Xingyou; Holt, James B.; Liu, Yang

    2015-01-01

    Recent studies have explored the relationship between aerosol optical depth (AOD) measurements by satellite sensors and concentrations of particulate matter with aerodynamic diameters less than 2.5 μm (PM2.5). However, relatively little is known about spatial and temporal patterns in this relationship across the contiguous United States. In this study, we investigated the relationship between US Environmental Protection Agency estimates of PM2.5 concentrations and Moderate Resolution Imaging Spectroradiometer (MODIS) AOD measurements provided by two NASA satellites (Terra and Aqua) across the contiguous United States during 2005. We found that the combined use of both satellite sensors provided more AOD coverage than the use of either satellite sensor alone, that the correlation between AOD measurements and PM2.5 concentrations varied substantially by geographic location, and that this correlation was stronger in the summer and fall than in the winter and spring. PMID:26336576

  15. Mobile, Multimodal, Label-Free Imaging Probe Analysis of Choroidal Oximetry and Retinal Hypoxia

    DTIC Science & Technology

    2017-12-01

    these same eye injuries. Primary blast-induced injury (PBI), which can occur in eyes that are not punctured or ruptured by the blast, is correlated ... optimization. (1A-F) The components of our PBI devices: output pressure detection sensor, amplifier, and input pressure panel. (1G) Correlation between ... by changing the setting of the blast generator. (2A-B) Correlation between output pressure and blast time duration. (2C) After PBI treatment, the eyes of

  16. Multi-Beam Radio Frequency (RF) Aperture Arrays Using Multiplierless Approximate Fast Fourier Transform (FFT)

    DTIC Science & Technology

    2017-08-01

    filtering, correlation and radio-astronomy. In this report approximate transforms that closely follow the DFT have been studied and found. The approximate ... communications, data networks, sensor networks, cognitive radio, radar and beamforming, imaging, filtering, correlation and radio-astronomy. FFTs efficiently ... public release; distribution is unlimited. 4.3 Digital Hardware and Design Architectures Collaboration for Astronomy Signal Processing and Electronics

  17. Ranging through Gabor logons-a consistent, hierarchical approach.

    PubMed

    Chang, C; Chatterjee, S

    1993-01-01

    In this work, the correspondence problem in stereo vision is handled by matching two sets of dense feature vectors. Inspired by biological evidence, these feature vectors are generated by a correlation between a bank of Gabor sensors and the intensity image. The sensors consist of two-dimensional Gabor filters at various scales (spatial frequencies) and orientations, which bear close resemblance to the receptive field profiles of simple V1 cells in visual cortex. A hierarchical, stochastic relaxation method is then used to obtain the dense stereo disparities. Unlike traditional hierarchical methods for stereo, feature based hierarchical processing yields consistent disparities. To avoid false matchings due to static occlusion, a dual matching, based on the imaging geometry, is used.
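
    A single sensor in such a bank is a 2-D Gabor filter; a minimal sketch of its real part (a Gaussian envelope times an oriented cosine carrier, with illustrative parameter names) is:

```python
import math

def gabor_kernel(size, wavelength, theta, sigma):
    # Real part of a 2-D Gabor filter: Gaussian-windowed cosine grating
    # at orientation `theta` (radians) and the given `wavelength` (pixels).
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)   # rotate coords
            yr = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * xr / wavelength))
        kernel.append(row)
    return kernel
```

    Correlating the image with kernels at several scales (wavelengths) and orientations yields the dense feature vector at each pixel that the matching stage then compares across the stereo pair.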

  18. Optical diagnostics in gas turbine combustors

    NASA Astrophysics Data System (ADS)

    Woodruff, Steven D.

    1999-01-01

    Deregulation of the power industry and increasingly tight emission controls are pushing gas turbine manufacturers to develop engines operating at high pressure for efficiency and with lean fuel mixtures to control NOx. This combination also gives rise to combustion instabilities, which threaten engine integrity through acoustic pressure oscillations and flashback. High-speed imaging and OH emission sensors have been demonstrated to be invaluable tools in characterizing and monitoring unstable combustion processes. Asynchronous imaging techniques permit detailed viewing of cyclic flame structure in an acoustic environment, which may be modeled or utilized in burner design. The response of the flame front to the acoustic pressure cycle may be tracked with an OH emission monitor using a sapphire light pipe for optical access. The OH optical emission can be correlated with pressure sensor data for a better understanding of the acoustic coupling of the flame. Active control of the combustion cycle can be implemented using an OH emission sensor for feedback.

  19. Fiber optic sensors and systems at the Federal University of Rio de Janeiro

    NASA Astrophysics Data System (ADS)

    Werneck, Marcelo M.; dos Santos, Paulo A. M.; Ferreira, Aldo P.; Maggi, Luis E.; de Carvalho, Carlos R., Jr.; Ribeiro, R. M.

    1998-08-01

    As widely known, fiberoptics (FO) are used in a large variety of sensors and systems, particularly for their small dimensions, low cost, large bandwidth and favorable dielectric properties. These properties have allowed us to develop sensors and systems for general applications and, particularly, for biomedical engineering. The intravascular pressure sensor was designed for small dimensions and high bandwidth. The system is based on a light-intensity modulation technique and uses a 2 mm-diameter elastomer membrane as the sensor element and a pigtailed laser as the light source. The optical power output curve was linear for pressures within the range of 0 to 300 mmHg. The real-time optical biosensor uses the evanescent-field technique for monitoring Escherichia coli growth in culture media. The optical biosensor monitors interactions between the analyte (bacteria) and the evanescent field of an optical fiber passing through it. The FO-based high-voltage and current sensor is a measuring system designed for monitoring voltage and current in high-voltage transmission lines. The linearity of the system is better than 2% in both ranges of 0 to 25 kV and 0 to 1000 A. The optical flowmeter uses a cross-correlation technique that analyzes two light beams crossing the flow separated by a fixed distance. The x-ray image sensor uses a scintillating FO array, one fiber for each image pixel, to form an image of the x-ray field. The systems described in this paper use general-purpose components, including optical fibers and optoelectronic devices, which are readily available and of low cost.

  20. Research progress in fiber optic sensors and systems at the Federal University of Rio de Janeiro

    NASA Astrophysics Data System (ADS)

    Werneck, Marcelo M.; Ferreira, Aldo P.; Maggi, Luis E.; De Carvalho, C. C.; Ribeiro, R. M.

    1999-02-01

    As widely known, fiberoptics (FO) are used in a large variety of sensors and systems, particularly for their small dimensions, low cost, large bandwidth and favorable dielectric properties. These properties have allowed us to develop sensors and systems for general applications and, particularly, for biomedical engineering. The intravascular pressure sensor was designed for small dimensions and high bandwidth. The system is based on a light-intensity modulation technique and uses a 2 mm-diameter elastomer membrane as the sensor element and a pigtailed laser as the light source. The optical power output curve was linear for pressures within the range of 0 to 300 mmHg. The real-time optical biosensor uses the evanescent-field technique for monitoring Escherichia coli growth in culture media. The optical biosensor monitors interactions between the analyte and the evanescent field of an optical fiber passing through it. The FO-based high-voltage and current sensor is a measuring system designed for monitoring voltage and current in high-voltage transmission lines. The linearity of the system is better than 2% in both ranges of 0 to 25 kV and 0 to 1000 A. The optical flowmeter uses a cross-correlation technique that analyzes two light beams crossing the flow separated by a fixed distance. The x-ray image sensor uses a scintillating FO array, one fiber for each image pixel, to form an image of the x-ray field. The systems described in this paper use general-purpose components, including optical fibers and optoelectronic devices, which are readily available and of low cost.

  1. A 256×256 low-light-level CMOS imaging sensor with digital CDS

    NASA Astrophysics Data System (ADS)

    Zou, Mei; Chen, Nan; Zhong, Shengyou; Li, Zhengfen; Zhang, Jicun; Yao, Li-bin

    2016-10-01

    In order to achieve high sensitivity in low-light-level CMOS image sensors (CIS), a capacitive transimpedance amplifier (CTIA) pixel circuit with a small integration capacitor is used. As the pixel and column areas are highly constrained, it is difficult to implement analog correlated double sampling (CDS) to remove the noise in a low-light-level CIS. Therefore a digital CDS is adopted, which performs the subtraction between the reset signal and the pixel signal off-chip. The pixel reset noise and part of the column fixed-pattern noise (FPN) can thus be greatly reduced. A 256×256 CIS with a CTIA array and digital CDS is implemented in 0.35 μm CMOS technology. The chip size is 7.7 mm × 6.75 mm, and the pixel size is 15 μm × 15 μm with a fill factor of 20.6%. The measured pixel noise under dark conditions is 24 LSB (RMS) with digital CDS, a 7.8× reduction compared to the same sensor without digital CDS. Running at 7 fps, this low-light-level CIS can capture recognizable images at illumination levels down to 0.1 lux.
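
    The off-chip subtraction at the heart of digital CDS is simply a per-pixel difference between the sampled reset frame and the signal frame; a minimal sketch, assuming both frames are available as 2-D arrays of raw ADC codes:

```python
def digital_cds(reset_frame, signal_frame):
    # Off-chip digital CDS: subtract the sampled reset level from the
    # pixel signal, cancelling reset (kTC) noise and per-column offsets
    # that are common to both samples.
    return [[s - r for r, s in zip(r_row, s_row)]
            for r_row, s_row in zip(reset_frame, signal_frame)]
```

    Any noise or offset present identically in both samples (the reset level, column amplifier offsets) cancels in the difference, which is why the technique suppresses reset noise and part of the column FPN.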

  2. Dosimetry of heavy ions by use of CCD detectors

    NASA Technical Reports Server (NTRS)

    Schott, J. U.

    1994-01-01

    The design and atomic composition of charge-coupled devices (CCDs) make them unique for investigations of single energetic particle events. As a detector system for ionizing particles, they detect single particles with local resolution and near-real-time particle tracking. Combined with their properties as optical sensors, traversals of single particles can be correlated with any objects attached to the light-sensitive surface of the sensor by simply imaging their shadow and subsequently analyzing both the optical image and the particle effects observed in the affected pixels. With biological objects it is possible for the first time to investigate the effects of single heavy ions in tissue or excised organs of metabolizing (i.e. moving) systems with a local resolution better than 15 microns. Calibration data for particle detection in CCDs are presented for low-energy protons and heavy ions.

  3. Results from the Two-Year Infrared Cloud Imager Deployment at ARM's NSA Observatory in Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Shaw, J. A.; Nugent, P. W.

    2016-12-01

    Ground-based longwave-infrared (LWIR) cloud imaging can provide continuous cloud measurements in the Arctic. This is of particular importance during the Arctic winter, when visible-wavelength cloud imaging systems cannot operate. The method uses a thermal infrared camera to observe clouds and produce measurements of cloud amount and cloud optical depth. The Montana State University Optical Remote Sensor Laboratory deployed an infrared cloud imager (ICI) at the Atmospheric Radiation Measurement (ARM) North Slope of Alaska site at Barrow, AK from July 2012 through July 2014. This study was used both to understand the long-term operation of an ICI in the Arctic and to study the consistency of the ICI data products in relation to co-located active and passive sensors. The ICI was found to have a high correlation (> 0.92) with collocated cloud instruments and to produce an unbiased data product. However, the ICI also detects thin clouds that are not detected by most operational cloud sensors; comparisons with high-sensitivity actively sensed cloud products confirm the existence of these thin clouds. Infrared cloud imaging systems can serve a critical role in developing our understanding of cloud cover in the Arctic by providing continuous annual measurements of clouds at sites of interest.

  4. Comparison between DMSP-OLS and S-NPP Day-Night Band in Correlating with Regional Socio-economic Variables

    NASA Astrophysics Data System (ADS)

    Jing, X.; Shao, X.; Cao, C.; Fu, X.

    2013-12-01

    Night-time light imagery offers a unique view of the Earth's surface. In the past, night-time light data collected by the DMSP-OLS sensors have been used as an efficient means of correlating with global socio-economic activities. With the launch of the Suomi National Polar-orbiting Partnership (S-NPP) satellite in October 2011, the Day Night Band (DNB) of the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard S-NPP represents a major advancement in night-time imaging capabilities, surpassing its predecessor DMSP-OLS in radiometric accuracy, spatial resolution, and geometric quality. In this paper, we compared the performance of DNB and DMSP images in correlating with regional socio-economic activities and analyzed the leading causes of the differences. The correlation coefficients between socio-economic variables, such as population and regional GDP, and the characteristic variables derived from the night-time light images of DNB and DMSP at the provincial level in China were computed as performance metrics for comparison. In general, the correlation between DNB data and socio-economic data is better than that of DMSP data. To explain the difference in correlation, we further analyzed the effects of several factors, such as radiometric saturation and quantization of DMSP data, low spatial resolution, different data acquisition times between DNB and DMSP images, and differences in the transformation used to convert digital number (DN) values to radiance.

  5. Laser speckle strain and deformation sensor using linear array image cross-correlation method for specifically arranged triple-beam triple-camera configuration

    NASA Technical Reports Server (NTRS)

    Sarrafzadeh-Khoee, Adel K. (Inventor)

    2000-01-01

    The invention provides a method of triple-beam and triple-sensor in a laser speckle strain/deformation measurement system. The triple-beam/triple-camera configuration combined with sequential timing of laser beam shutters is capable of providing indications of surface strain and structure deformations. The strain and deformation quantities, the four variables of surface strain, in-plane displacement, out-of-plane displacement and tilt, are determined in closed form solutions.

  6. Multi sensor satellite imagers for commercial remote sensing

    NASA Astrophysics Data System (ADS)

    Cronje, T.; Burger, H.; Du Plessis, J.; Du Toit, J. F.; Marais, L.; Strumpfer, F.

    2005-10-01

    This paper will discuss and compare recent refractive and catadioptric imager designs developed and manufactured at SunSpace for multi-sensor satellite imagers with panchromatic, multi-spectral, area and hyperspectral sensors on a single focal plane array (FPA). These satellite optical systems were designed with applications such as food supply monitoring, crop yield estimation and disaster monitoring in mind. The aim of these imagers is to achieve medium to high resolution (2.5 m to 15 m) spatial sampling, wide swaths (up to 45 km) and noise-equivalent reflectance (NER) values of less than 0.5%. State-of-the-art FPA designs are discussed, addressing the choice of detectors to achieve these performances. Special attention is given to thermal robustness and compactness, the use of folding prisms to place multiple detectors in a large FPA, and a specially developed process to customize the spectral selection while minimizing mass, power and cost. A refractive imager with up to 6 spectral bands (6.25 m GSD) and a catadioptric imager with panchromatic (2.7 m GSD), multi-spectral (6 bands, 4.6 m GSD) and hyperspectral (400 nm to 2.35 μm, 200 bands, 15 m GSD) sensors on the same FPA will be discussed. Both of these imagers are also equipped with real-time video viewfinding capabilities. The electronic units can be subdivided into the front-end electronics and control electronics with analogue and digital signal processing. A dedicated analogue front-end is used for correlated double sampling (CDS), black-level correction, variable gain, up to 12-bit digitizing and a high-speed LVDS data link to a mass memory unit.

  7. 3D digital image correlation methods for full-field vibration measurement

    NASA Astrophysics Data System (ADS)

    Helfrick, Mark N.; Niezrecki, Christopher; Avitabile, Peter; Schmidt, Timothy

    2011-04-01

    In the area of modal test/analysis/correlation, significant effort has been expended over the past twenty years in order to make reduced models and to expand test data for correlation and eventual updating of the finite element models. This has been restricted by vibration measurements which are traditionally limited to the location of relatively few applied sensors. Advances in computers and digital imaging technology have allowed 3D digital image correlation (DIC) methods to measure the shape and deformation of a vibrating structure. This technique allows for full-field measurement of structural response, thus providing a wealth of simultaneous test data. This paper presents some preliminary results for the test/analysis/correlation of data measured using the DIC approach along with traditional accelerometers and a scanning laser vibrometer for comparison to a finite element model. The results indicate that all three approaches correlated well with the finite element model and provide validation for the DIC approach for full-field vibration measurement. Some of the advantages and limitations of the technique are presented and discussed.

  8. Diffraction-Limited Plenoptic Imaging with Correlated Light

    NASA Astrophysics Data System (ADS)

    Pepe, Francesco V.; Di Lena, Francesco; Mazzilli, Aldo; Edrei, Eitan; Garuccio, Augusto; Scarcelli, Giuliano; D'Angelo, Milena

    2017-12-01

    Traditional optical imaging faces an unavoidable trade-off between resolution and depth of field (DOF). To increase resolution, high numerical apertures (NAs) are needed, but the associated large angular uncertainty results in a limited range of depths that can be put in sharp focus. Plenoptic imaging was introduced a few years ago to remedy this trade-off. To this aim, plenoptic imaging reconstructs the path of light rays from the lens to the sensor. However, the improvement offered by standard plenoptic imaging is practical and not fundamental: The increased DOF leads to a proportional reduction of the resolution well above the diffraction limit imposed by the lens NA. In this Letter, we demonstrate that correlation measurements enable pushing plenoptic imaging to its fundamental limits of both resolution and DOF. Namely, we demonstrate maintaining the imaging resolution at the diffraction limit while increasing the depth of field by a factor of 7. Our results represent the theoretical and experimental basis for the effective development of promising applications of plenoptic imaging.

  9. Diffraction-Limited Plenoptic Imaging with Correlated Light.

    PubMed

    Pepe, Francesco V; Di Lena, Francesco; Mazzilli, Aldo; Edrei, Eitan; Garuccio, Augusto; Scarcelli, Giuliano; D'Angelo, Milena

    2017-12-15

    Traditional optical imaging faces an unavoidable trade-off between resolution and depth of field (DOF). To increase resolution, high numerical apertures (NAs) are needed, but the associated large angular uncertainty results in a limited range of depths that can be put in sharp focus. Plenoptic imaging was introduced a few years ago to remedy this trade-off. To this aim, plenoptic imaging reconstructs the path of light rays from the lens to the sensor. However, the improvement offered by standard plenoptic imaging is practical and not fundamental: The increased DOF leads to a proportional reduction of the resolution well above the diffraction limit imposed by the lens NA. In this Letter, we demonstrate that correlation measurements enable pushing plenoptic imaging to its fundamental limits of both resolution and DOF. Namely, we demonstrate maintaining the imaging resolution at the diffraction limit while increasing the depth of field by a factor of 7. Our results represent the theoretical and experimental basis for the effective development of promising applications of plenoptic imaging.

  10. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    NASA Astrophysics Data System (ADS)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities, whose conduct utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretability Rating Scale). It also generates images representing sensor output, using the scene-generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data compliant with STANAG 7023, which may be used to test ground-station functionality.

  11. SU-E-J-66: Evaluation of a Real-Time Positioning Assistance Simulator System for Skull Radiography Using the Microsoft Kinect

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurata, T; Ono, M; Kozono, K

    2014-06-01

    Purpose: The purpose of this study is to investigate the feasibility of a low-cost, small-size positioning assistance simulator system for skull radiography using the Microsoft Kinect sensor. A conventional radiographic simulator system can only measure the three-dimensional coordinates of an x-ray tube using angle sensors, but cannot measure the movement of the subject. Therefore, in this study, we developed a real-time simulator system using the Microsoft Kinect to measure both the x-ray tube and the subject, and evaluated its accuracy and feasibility by comparing the simulated and the measured x-ray images. Methods: This system can track a head phantom by using Face Tracking, one of the functions of the Kinect. The relative relationship between the Kinect and the head phantom was measured, and the projection image was calculated using the ray-casting method and three-dimensional CT head data with 220 slices at 512 × 512 pixels. X-ray images were obtained using a computed radiography (CR) system. We could then compare the simulated projection images with the measured x-ray images from 0 degrees to 45 degrees at increments of 15 degrees by calculating the cross-correlation coefficient C. Results: The calculation of the simulated projection images was almost real-time (within 1 second) using the Graphics Processing Unit (GPU). The cross-correlation coefficients C are 0.916, 0.909, 0.891, and 0.886 at 0, 15, 30, and 45 degrees, respectively. As a result, there were strong correlations between the simulated and measured images. Conclusion: This system can be used to perform head positioning more easily and accurately. It is expected that this system will be useful for learning radiographic techniques by students. Moreover, it could also be used for predicting the actual x-ray image prior to x-ray exposure in clinical environments.
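
    A cross-correlation coefficient between two equal-size images, in its common zero-mean normalized form, can be computed as follows (a sketch; the record does not publish the exact formula used):

```python
import math

def ncc(a, b):
    # Zero-mean normalized cross-correlation between two equal-size
    # images (2-D lists), flattened to 1-D; 1.0 means a perfect match.
    x = [v for row in a for v in row]
    y = [v for row in b for v in row]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((p - mx) * (q - my) for p, q in zip(x, y))
    den = math.sqrt(sum((p - mx) ** 2 for p in x)
                    * sum((q - my) ** 2 for q in y))
    return num / den
```

    Values near 0.9, as reported above, indicate a strong linear agreement between the simulated projection and the measured radiograph, independent of overall brightness and contrast.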

  12. Electronic still camera

    NASA Astrophysics Data System (ADS)

    Holland, S. Douglas

    1992-09-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  13. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to ensure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  14. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    NASA Astrophysics Data System (ADS)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

    This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking diffraction effects into account. Following market trends and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which are now equipped with auto-focus and, in the near future, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose modeling based on Maxwell's equations to compute the propagation of light, and software with a finite-difference time-domain (FDTD) engine to solve this propagation. We present in this article the complete methodology of this modeling: on one hand, incoherent plane waves are propagated to approximate a product-use diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory usage and computation time. After presenting the correlation of the model with measurements, we illustrate its use in the optimization of a 1.75 μm pixel.
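    The FDTD engine referred to above advances Maxwell's equations on a staggered grid with leapfrog time stepping. A minimal 1-D sketch in normalized units (an illustration of the numerical scheme, not the commercial solver the authors used):

```python
import numpy as np

def fdtd_1d(steps=200, n=200, src=50):
    """1-D FDTD (Yee scheme): E and H live on staggered grids and are
    updated alternately each time step (Courant factor 0.5)."""
    ez = np.zeros(n)   # electric field
    hy = np.zeros(n)   # magnetic field
    for t in range(steps):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])   # H update from the curl of E
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])    # E update from the curl of H
        ez[src] += np.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
    return ez

pulse = fdtd_1d()  # Gaussian pulse propagated away from the source cell
```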

  15. Fluorescence Intensity- and Lifetime-Based Glucose Sensing Using Glucose/Galactose-Binding Protein

    PubMed Central

    Pickup, John C.; Khan, Faaizah; Zhi, Zheng-Liang; Coulter, Jonathan; Birch, David J. S.

    2013-01-01

    We review progress in our laboratories toward developing in vivo glucose sensors for diabetes that are based on fluorescence labeling of glucose/galactose-binding protein. Measurement strategies have included both monitoring glucose-induced changes in fluorescence resonance energy transfer and labeling with the environmentally sensitive fluorophore, badan. Measuring fluorescence lifetime rather than intensity has particular potential advantages for in vivo sensing. A prototype fiber-optic-based glucose sensor using this technology is being tested.

    Fluorescence is one of the major techniques for achieving a continuous and noninvasive glucose sensor for diabetes. In this article, a highly sensitive nanostructured sensor is developed to detect extremely small amounts of aqueous glucose by applying fluorescence resonance energy transfer (FRET). A one-pot method is applied to produce the dextran-fluorescein isothiocyanate (FITC)-conjugated mesoporous silica nanoparticles (MSNs), which afterward interact with the tetramethylrhodamine isothiocyanate (TRITC)-labeled concanavalin A (Con A) to form the FRET nanoparticles (FITC-dextran-Con A-TRITC@MSNs). The nanostructured glucose sensor is then formed via the self-assembly of the FRET nanoparticles on a transparent, flexible, and biocompatible substrate, e.g., poly(dimethylsiloxane). Our results indicate the diameter of the MSNs is 60 ± 5 nm. The difference in the images before and after adding 20 μl of glucose (0.10 mmol/liter) on the FRET sensor can be detected in less than 2 min by confocal laser scanning microscopy. The correlation between the ratio of fluorescence intensity, I(donor)/I(acceptor), of the FRET sensor and the concentration of aqueous glucose in the range of 0.04–4 mmol/liter has been investigated; a linear relationship is found. Furthermore, the durability of the nanostructured FRET sensor is evaluated for 5 days. In addition, the recorded images can be converted to digital images by extracting the pixel matrix with MATLAB image-processing functions. We have also studied the in vitro cytotoxicity of the device. The nanostructured FRET sensor may provide an alternative method to help patients manage the disease continuously. PMID:23439161
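    The reported linear relationship between the intensity ratio I(donor)/I(acceptor) and glucose concentration amounts to a straight-line calibration that can be inverted to read out glucose. A sketch with hypothetical calibration points (made up for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical calibration: intensity ratio vs glucose (mmol/liter)
conc  = np.array([0.04, 0.5, 1.0, 2.0, 4.0])
ratio = np.array([1.02, 1.25, 1.50, 2.00, 3.00])

slope, intercept = np.polyfit(conc, ratio, 1)  # least-squares line fit

def glucose_from_ratio(r):
    """Invert the linear calibration: estimate glucose from a measured ratio."""
    return (r - intercept) / slope
```

    With these made-up points the fit is exact (slope 0.5, intercept 1.0), so a measured ratio of 2.0 maps back to 2.0 mmol/liter.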

  16. The influence of the in situ camera calibration for direct georeferencing of aerial imagery

    NASA Astrophysics Data System (ADS)

    Mitishita, E.; Barrios, R.; Centeno, J.

    2014-11-01

    The direct determination of exterior orientation parameters (EOPs) of aerial images via GNSS/INS technologies is an essential prerequisite in photogrammetric mapping nowadays. Although direct sensor orientation technologies provide a high degree of automation, the accuracy of the results depends on the quality of a group of parameters that accurately models the conditions of the system at the moment the job is performed. One sub-group of parameters (lever arm offsets and boresight misalignments) models the position and orientation of the sensors with respect to the IMU body frame, owing to the impossibility of having all sensors at the same position and orientation on the airborne platform. Another sub-group of parameters models the internal characteristics of the sensor (IOP). A system calibration procedure has been recommended by worldwide studies to obtain accurate parameters (mounting and sensor characteristics) for direct sensor orientation applications. Commonly, mounting and sensor characteristics are not stable; they can vary under different flight conditions. System calibration requires a geometric arrangement of the flight and/or control points to decouple correlated parameters, which is not available in a conventional photogrammetric flight. Considering this difficulty, this study investigates the feasibility of in situ camera calibration to improve the accuracy of the direct georeferencing of aerial images. The camera calibration uses a minimum image block, extracted from the conventional photogrammetric flight, and a control point arrangement. A digital Vexcel UltraCam XP camera connected to a POS AV™ system was used to acquire two photogrammetric image blocks. The blocks have different flight directions and opposite flight lines. In situ calibration procedures to compute different sets of IOPs are performed, and their results are analyzed and used in photogrammetric experiments. The IOPs from the in situ camera calibration significantly improve the accuracy of the direct georeferencing. The obtained results from the experiments are shown and discussed.

  17. Intelligent Network-Centric Sensors Development Program

    DTIC Science & Technology

    2012-07-31

    Image sensor configurations: Cone 360 degree LWIR PFx sensor; image MWIR configuration; Cone 360 degree ... LWIR PFx sensor video configuration; Cone 360 degree SWIR. 2. Reasoning Process to Match Sensor Systems to Algorithms. The ontological ... effects of coherent imaging because of aberrations. Another reason is the specular nature of active imaging. Both contribute to the nonuniformity

  18. Long-range time-of-flight scanning sensor based on high-speed time-correlated single-photon counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarthy, Aongus; Collins, Robert J.; Krichel, Nils J.

    2009-11-10

    We describe a scanning time-of-flight system which uses the time-correlated single-photon counting technique to produce three-dimensional depth images of distant, noncooperative surfaces when these targets are illuminated by a kHz to MHz repetition rate pulsed laser source. The data for the scene are acquired using a scanning optical system and an individual single-photon detector. Depth images have been successfully acquired with centimeter xyz resolution, in daylight conditions, for low-signature targets in field trials at distances of up to 325 m using an output illumination with an average optical power of less than 50 μW.
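    The ranging principle behind the abstract above is simple: histogram the photon arrival times, find the peak bin, and convert the round-trip time to distance via d = c·t/2. A toy sketch with made-up arrival times (real systems fit the histogram peak rather than just taking its maximum):

```python
C = 299_792_458.0  # speed of light (m/s)

def depth_from_arrivals(arrival_times_ns, bin_ns=0.1):
    """Histogram single-photon round-trip times into timing bins and
    convert the most-populated bin to a one-way range: d = c * t / 2."""
    counts = {}
    for t in arrival_times_ns:
        b = round(t / bin_ns)           # timing bin index
        counts[b] = counts.get(b, 0) + 1
    peak = max(counts, key=counts.get)  # bin with the most photons
    t_seconds = peak * bin_ns * 1e-9
    return C * t_seconds / 2.0

# Signal photons cluster near a 2167 ns round trip; the rest is background.
times = [2167.0, 2167.04, 2167.02, 500.0, 2167.03, 1800.0]
print(round(depth_from_arrivals(times)))  # → 325 (m), like the field trials
```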

  19. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  20. Nanophotonic Image Sensors

    PubMed Central

    Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R. S.

    2016-01-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulating techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed including image sensors with nanophotonic color filters and polarizers, metamaterial‐based THz image sensors, filter‐free nanowire image sensors and nanostructured‐based multispectral image sensors. This novel combination of cutting edge photonics research and well‐developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. PMID:27239941

  1. Optical Demonstration of a Medical Imaging System with an EMCCD-Sensor Array for Use in a High Resolution Dynamic X-ray Imager

    PubMed Central

    Qu, Bin; Huang, Ying; Wang, Weiyuan; Sharma, Prateek; Kuhls-Gilcrist, Andrew T.; Cartwright, Alexander N.; Titus, Albert H.; Bednarek, Daniel R.; Rudin, Stephen

    2011-01-01

    Use of an extensible array of Electron Multiplying CCDs (EMCCDs) in medical x-ray imager applications was demonstrated for the first time. The large variable electronic-gain (up to 2000) and small pixel size of EMCCDs provide effective suppression of readout noise compared to signal, as well as high resolution, enabling the development of an x-ray detector with far superior performance compared to conventional x-ray image intensifiers and flat panel detectors. We are developing arrays of EMCCDs to overcome their limited field of view (FOV). In this work we report on an array of two EMCCD sensors running simultaneously at a high frame rate and optically focused on a mammogram film showing calcified ducts. The work was conducted on an optical table with a pulsed LED bar used to provide a uniform diffuse light onto the film to simulate x-ray projection images. The system can be selected to run at up to 17.5 frames per second or even higher frame rate with binning. Integration time for the sensors can be adjusted from 1 ms to 1000 ms. Twelve-bit correlated double sampling AD converters were used to digitize the images, which were acquired by a National Instruments dual-channel Camera Link PC board in real time. A user-friendly interface was programmed using LabVIEW to save and display 2K × 1K pixel matrix digital images. The demonstration tiles a 2 × 1 array to acquire increased-FOV stationary images taken at different gains and fluoroscopic-like videos recorded by scanning the mammogram simultaneously with both sensors. The results show high resolution and high dynamic range images stitched together with minimal adjustments needed. The EMCCD array design allows for expansion to an M×N array for arbitrarily larger FOV, yet with high resolution and large dynamic range maintained. PMID:23505330

  2. Measurement of Vibrated Bulk Density of Coke Particle Blends Using Image Texture Analysis

    NASA Astrophysics Data System (ADS)

    Azari, Kamran; Bogoya-Forero, Wilinthon; Duchesne, Carl; Tessier, Jayson

    2017-09-01

    A rapid and nondestructive machine vision sensor was developed for predicting the vibrated bulk density (VBD) of petroleum coke particles based on image texture analysis. It could be used for making corrective adjustments to a paste plant operation to reduce green anode variability (e.g., changes in binder demand). Wavelet texture analysis (WTA) and gray level co-occurrence matrix (GLCM) algorithms were used jointly for extracting the surface textural features of coke aggregates from images. These were correlated with the VBD using partial least-squares (PLS) regression. Coke samples of several sizes and from different sources were used to test the sensor. Variations in the coke surface texture introduced by coke size and source allowed for making good predictions of the VBD of individual coke samples and mixtures of them (blends involving two sources and different sizes). Promising results were also obtained for coke blends collected from an industrial-baked carbon anode manufacturer.
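    The GLCM half of the texture pipeline can be sketched compactly; the wavelet features and PLS regression the sensor also relies on are omitted here. A minimal co-occurrence matrix with two classic texture features:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Gray-level co-occurrence matrix over horizontally adjacent pixel
    pairs, plus two classic Haralick-style features (contrast, energy)."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)  # quantize
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()   # joint probabilities
    di, dj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    contrast = float(((di - dj) ** 2 * glcm).sum())
    energy = float((glcm ** 2).sum())
    return contrast, energy

# A checkerboard is the extreme case of high-contrast texture:
board = (np.indices((8, 8)).sum(axis=0) % 2) * 255
print(glcm_features(board))  # → (49.0, 0.5)
```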

  3. Image-based topology for sensor gridlocking and association

    NASA Astrophysics Data System (ADS)

    Stanek, Clay J.; Javidi, Bahram; Yanni, Philip

    2002-07-01

    Correlation engines have been evolving since the implementation of radar. In modern sensor fusion architectures, correlation and gridlock filtering are required to produce common, continuous, and unambiguous tracks of all objects in the surveillance area. The objective is to provide a unified picture of the theatre or area of interest to battlefield decision makers, ultimately enabling them to make better inferences for future action and eliminate fratricide by reducing ambiguities. Here, correlation refers to association, which in this context is track-to-track association. A related process, gridlock filtering or gridlocking, refers to the reduction in navigation errors and sensor misalignment errors so that one sensor's track data can be accurately transformed into another sensor's coordinate system. As platforms gain multiple sensors, the correlation and gridlocking of tracks become significantly more difficult. Much of the existing correlation technology revolves around various interpretations of the generalized Bayesian decision rule: choose the action that minimizes conditional risk. One implementation of this principle equates the risk minimization statement to the comparison of ratios of a priori probability distributions to thresholds. The binary decision problem phrased in terms of likelihood ratios is also known as the famed Neyman-Pearson hypothesis test. Using another restatement of the principle for a symmetric loss function, risk minimization leads to a decision that maximizes the a posteriori probability distribution. Even for deterministic decision rules, situations can arise in correlation where there are ambiguities. For these situations, a common algorithm used is a sparse assignment technique such as the Munkres or JVC algorithm. Furthermore, associated tracks may be combined with the hope of reducing the positional uncertainty of a target or object identified by an existing track from the information of several fused/correlated tracks. 
Gridlocking is typically accomplished with some type of least-squares algorithm, such as the Kalman filtering technique, which attempts to locate the best bias error vector estimate from a set of correlated/fused track pairs. Here, we will introduce a new approach to this longstanding problem by adapting many of the familiar concepts from pattern recognition, ones certainly familiar to target recognition applications. Furthermore, we will show how this technique can lend itself to specialized processing, such as that available through an optical or hybrid correlator.
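    The assignment step used to resolve ambiguous track pairings can be illustrated with an exhaustive minimum-cost assignment; this brute-force stand-in for the Munkres/JVC algorithms mentioned above is only practical for a handful of tracks:

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive minimum-cost assignment over an n x n cost matrix:
    returns (total_cost, mapping) where mapping[i] is the column
    assigned to row i. O(n!) — a stand-in for Munkres/JVC."""
    n = len(cost)
    best_total, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_total:
            best_total, best_perm = total, perm
    return best_total, best_perm

# Hypothetical statistical distances between local and remote tracks:
cost = [[1.0, 9.0, 6.0],
        [8.0, 2.0, 7.0],
        [5.0, 4.0, 0.5]]
print(best_assignment(cost))  # → (3.5, (0, 1, 2))
```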

  4. A preliminary comparison of Landsat Thematic Mapper and SPOT-1 HRV multispectral data for estimating coniferous forest volume

    NASA Technical Reports Server (NTRS)

    Ripple, W. J.; Wang, S.; Isaacson, D. L.; Paine, D. P.

    1991-01-01

    Digital Landsat Thematic Mapper (TM) and SPOT high-resolution visible (HRV) images of coniferous forest canopies were compared in their relationship to forest wood volume using correlation and regression analyses. Significant inverse relationships were found between softwood volume and the spectral bands from both sensors (P less than 0.01). The highest correlations were between the log of softwood volume and the near-infrared bands.

  5. Image quality evaluation of eight complementary metal-oxide semiconductor intraoral digital X-ray sensors.

    PubMed

    Teich, Sorin; Al-Rawi, Wisam; Heima, Masahiro; Faddoul, Fady F; Goldzweig, Gil; Gutmacher, Zvi; Aizenbud, Dror

    2016-10-01

    To evaluate the image quality generated by eight commercially available intraoral sensors. Eighteen clinicians ranked the quality of a bitewing acquired from one subject using eight different intraoral sensors. Analytical methods used to evaluate clinical image quality included the Visual Grading Characteristics method, which helps to quantify subjective opinions to make them suitable for analysis. The Dexis sensor was ranked significantly better than Sirona and Carestream-Kodak sensors; and the image captured using the Carestream-Kodak sensor was ranked significantly worse than those captured using Dexis, Schick and Cyber Medical Imaging sensors. The Image Works sensor image was rated the lowest by all clinicians. Other comparisons resulted in non-significant results. None of the sensors was considered to generate images of significantly better quality than the other sensors tested. Further research should be directed towards determining the clinical significance of the differences in image quality reported in this study. © 2016 FDI World Dental Federation.

  6. Landsat 7 thermal-IR image sharpening using an artificial neural network and sensor model

    USGS Publications Warehouse

    Lemeshewsky, G.P.; Schowengerdt, R.A.; ,

    2001-01-01

    The Enhanced Thematic Mapper Plus (ETM+) instrument on Landsat 7 shares the same basic design as the TM sensors on Landsats 4 and 5, with some significant improvements. In common are six multispectral bands with a 30-m ground-projected instantaneous field of view (GIFOV). However, the thermal-IR (TIR) band now has a 60-m GIFOV, instead of 120-m. Also, a 15-m panchromatic band has been added. The artificial neural network (NN) image sharpening method described here uses data from the higher spatial resolution ETM+ bands to enhance (sharpen) the spatial resolution of the TIR imagery. It is based on an assumed correlation, over multiple scales of resolution, between image edge contrast patterns in the TIR band and several other spectral bands. A multilayer, feedforward NN is trained to approximate TIR data at 60-m, given degraded (from 30-m to 60-m) spatial resolution input from spectral bands 7, 5, and 2. After training, the NN output for full-resolution input generates an approximation of a TIR image at 30-m resolution. Two methods are used to degrade the spatial resolution of the imagery used for NN training, and the corresponding sharpening results are compared. One degradation method uses a published sensor transfer function (TF) for Landsat 5 to simulate coarser resolution sensor imagery from higher resolution imagery. For comparison, the second degradation method is simply Gaussian low-pass filtering and subsampling, wherein the Gaussian filter approximates the full width at half maximum amplitude characteristics of the TF-based spatial filter. Two fixed-size NNs (that is, with a fixed number of weights and processing elements) were trained separately with the degraded resolution data, and the sharpening results compared. The comparison evaluates the relative influence of the degradation technique employed and whether or not it is desirable to incorporate a sensor TF model. Preliminary results indicate some improvements for the sensor model-based technique.
Further evaluation using a higher resolution reference image and strict application of sensor model to data is recommended.
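    Of the two degradation schemes above, the Gaussian one is easy to sketch: low-pass filter, then subsample. Parameter values here are illustrative, not the paper's TF-matched settings:

```python
import numpy as np

def gaussian_degrade(img, sigma=1.0, factor=2):
    """Degrade spatial resolution: separable Gaussian low-pass filter
    (edge-padded, 'same'-sized output) followed by subsampling."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()                               # normalized 1-D kernel
    pad = np.pad(img.astype(float), r, mode="edge")
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, "valid"), 1, pad)
    out = np.apply_along_axis(lambda m: np.convolve(m, k, "valid"), 0, rows)
    return out[::factor, ::factor]

low = gaussian_degrade(np.full((8, 8), 5.0))  # a flat image stays flat
```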

  7. Observation of a Large Landslide on La Reunion Island Using Differential Sar Interferometry (JERS and Radarsat) and Correlation of Optical (Spot5 and Aerial) Images

    PubMed Central

    Delacourt, Christophe; Raucoules, Daniel; Le Mouélic, Stéphane; Carnec, Claudie; Feurer, Denis; Allemand, Pascal; Cruchet, Marc

    2009-01-01

    Slope instabilities are one of the most important geo-hazards in terms of socio-economic costs. The island of La Réunion (Indian Ocean) is affected by constant slope movements and huge landslides due to a combination of rough topography, wet tropical climate and its specific geological context. We show that remote sensing techniques (Differential SAR Interferometry and correlation of optical images) provide complementary means to characterize landslides on a regional scale. The vegetation cover generally hampers the analysis of C-band interferograms. We used JERS-1 images to show that the L-band can be used to overcome the loss of coherence observed in Radarsat C-band interferograms. Image correlation was applied to optical airborne and SPOT 5 sensor images. The two techniques were applied to a landslide near the town of Hellbourg in order to assess their performance for detecting and quantifying the ground motion associated with this landslide. They allowed the mapping of the unstable areas. Ground displacement of about 0.5 m yr⁻¹ was measured. PMID:22389620
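    The optical image correlation used above slides a reference patch over a search window and reads the ground displacement off the best-matching shift. A minimal integer-pixel sketch using normalized cross-correlation (operational chains add sub-pixel refinement and georeferencing):

```python
import numpy as np

def ncc_shift(ref, cur, patch=16, search=4):
    """Estimate the integer-pixel shift of the central patch between two
    images by maximizing normalized cross-correlation."""
    h, w = ref.shape
    y0, x0 = (h - patch) // 2, (w - patch) // 2
    p = ref[y0:y0 + patch, x0:x0 + patch]
    p = (p - p.mean()) / (p.std() + 1e-9)
    best, shift = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            q = cur[y0 + dy:y0 + dy + patch, x0 + dx:x0 + dx + patch]
            q = (q - q.mean()) / (q.std() + 1e-9)
            score = (p * q).mean()              # correlation coefficient
            if score > best:
                best, shift = score, (dy, dx)
    return shift

rng = np.random.default_rng(0)
ref = rng.random((40, 40))
cur = np.roll(ref, (2, 1), axis=(0, 1))  # simulate 2 px / 1 px ground motion
print(ncc_shift(ref, cur))  # → (2, 1)
```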

  9. CMOS sensors for atmospheric imaging

    NASA Astrophysics Data System (ADS)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different to those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixel sizes of up to 250 μm square achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue read-out chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors and both of these sensors have been designed to be fully radiation hard with high latch-up and single-event-upset tolerances, which is now silicon proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements but with thicker than standard epitaxial silicon to give improved IR response (note that this makes latch up protection even more important). However with narrow band illumination reflections from the front and back of the dielectric stack on the top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow band filters inside the MTG package to provide a complete imaging subsystem. 
Control of reflected light is also critical in obtaining the required optical performance and this has driven the development of a black coating layer that can be applied between the active silicon regions.

  10. SU-F-T-525: Monitor Deep-Inspiratory Breathhold with a Laser Sensor for Radiation Therapy of Left Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tai, A; Currey, A; Li, X Allen

    2016-06-15

    Purpose: Radiation therapy (RT) of left-sided breast cancers with deep-inspiratory breathhold (DIBH) can reduce the dose to the heart. The purpose of this study is to develop and test a new laser-based tool to improve the ease of RT delivery using DIBH. Methods: A laser sensor together with a breathing monitor device (Anzai Inc., Japan) was used to record the surface breathing motion of a phantom and of volunteers. The device projects a laser beam onto the chest wall, and the reflected light creates a focal spot on a light-detecting element. The position change of the focal spot correlates with the patient's breathing motion and is measured through the change of current in the light-detecting element. The signal is amplified and displayed on a computer screen, which is used to trigger radiation gating. The laser sensor can be easily mounted to the simulation/treatment couch with a fixing plate and a magnet base, and has a sensitivity range of 10 to 40 cm from the patient. The correlation of breathing signals detected by the laser sensor and by VisionRT is also investigated. Results: The measured breathing signal from the laser sensor is stable and reproducible and has no noticeable delay. It correlates well with the VisionRT surface imaging system. The DIBH reference level does not change with movement of the couch because the laser sensor and couch move together. Conclusion: The Anzai laser sensor provides a cost-effective way to improve beam gating with DIBH for treating left breast cancer. It can be used alone or together with VisionRT to determine the correct DIBH level during radiation treatment of left breast cancer with DIBH.
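    The gating logic the abstract describes reduces to a window comparator on the laser-measured chest-wall position: the beam is on only while the breathing signal sits inside the DIBH window. A toy sketch (window limits are hypothetical, not clinical values):

```python
def gate_beam(signal_mm, low=18.0, high=22.0):
    """Beam-on decision per sample: True only while the measured
    chest-wall position lies inside the DIBH window [low, high] mm."""
    return [low <= s <= high for s in signal_mm]

# Free breathing, then a held deep inspiration with a brief overshoot:
trace = [5.0, 12.0, 19.5, 20.1, 21.8, 23.0, 20.0]
print(gate_beam(trace))  # → [False, False, True, True, True, False, True]
```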

  11. Radioactive Quality Evaluation and Cross Validation of Data from the HJ-1A/B Satellites' CCD Sensors

    PubMed Central

    Zhang, Xin; Zhao, Xiang; Liu, Guodong; Kang, Qian; Wu, Donghai

    2013-01-01

    Data from multiple sensors are frequently used in Earth science to gain a more complete understanding of spatial information changes. Higher quality and mutual consistency are prerequisites when multiple sensors are jointly used. The HJ-1A/B satellites were successfully launched on 6 September 2008. There are four charge-coupled device (CCD) sensors with uniform spatial resolution and spectral range onboard the HJ-1A/B satellites. Whether these data remain mutually consistent is a major issue to settle before they are used. This research aims to evaluate the data consistency and radioactive quality of the four CCDs. First, images of urban, desert, lake and ocean scenes are chosen as the objects of evaluation. Second, objective evaluation variables, such as mean, variance and angular second moment, are used to characterize image performance. Finally, a cross-validation method is used to assess the correlation between the data from the four HJ-1A/B CCDs and data gathered by the Moderate Resolution Imaging Spectroradiometer (MODIS). The results show that the image quality of the HJ-1A/B CCDs is stable, and the digital number distribution of the CCD data is relatively low. In the cross validation with MODIS, the root mean square errors of bands 1, 2 and 3 range from 0.055 to 0.065, and for band 4 it is 0.101. The data from the HJ-1A/B CCDs show good consistency. PMID:23881127
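    The band-by-band cross validation against MODIS comes down to a root-mean-square error over paired observations; a sketch with made-up reflectance samples:

```python
import math

def rmse(a, b):
    """Root-mean-square error between paired samples, as used to
    cross-validate each CCD band against the matching MODIS band."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

ccd   = [0.10, 0.22, 0.35, 0.41]  # hypothetical CCD band reflectances
modis = [0.12, 0.20, 0.30, 0.45]  # matched MODIS reflectances
print(round(rmse(ccd, modis), 3))  # → 0.035
```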

  13. Fusion of Laser Altimetry Data with Dems Derived from Stereo Imaging Systems

    NASA Astrophysics Data System (ADS)

    Schenk, T.; Csatho, B. M.; Duncan, K.

    2016-06-01

    During the last two decades surface elevation data have been gathered over the Greenland Ice Sheet (GrIS) from a variety of different sensors, including spaceborne and airborne laser altimetry, such as NASA's Ice, Cloud, and land Elevation Satellite (ICESat), Airborne Topographic Mapper (ATM) and Laser Vegetation Imaging Sensor (LVIS), as well as from stereo satellite imaging systems, most notably from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Worldview. The spatio-temporal resolution, the accuracy, and the spatial coverage of all these data differ widely. For example, laser altimetry systems are much more accurate than DEMs derived by correlation from imaging systems. On the other hand, DEMs usually have a superior spatial resolution and extended spatial coverage. We present in this paper an overview of the SERAC (Surface Elevation Reconstruction And Change detection) system, designed to cope with the data complexity and the computation of elevation change histories. SERAC simultaneously determines the ice sheet surface shape and the time series of elevation changes for surface patches whose size depends on the ruggedness of the surface and the point distribution of the sensors involved. By incorporating different sensors, SERAC is a true fusion system that generates the best plausible result (a time series of elevation changes), a result that is better than the sum of its individual parts. We follow this up with an example from Helheim Glacier, involving ICESat, ATM and LVIS laser altimetry data, together with ASTER DEMs.

  14. Fixed-pattern noise correction method based on improved moment matching for a TDI CMOS image sensor.

    PubMed

    Xu, Jiangtao; Nie, Huafeng; Nie, Kaiming; Jin, Weimin

    2017-09-01

    In this paper, an improved moment matching method based on a spatial correlation filter (SCF) and a bilateral filter (BF) is proposed to correct the fixed-pattern noise (FPN) of a time-delay-integration CMOS image sensor (TDI-CIS). First, the values of row FPN (RFPN) and column FPN (CFPN) are estimated and added to the original image through the SCF and BF, respectively. The filtered image is then processed by an improved moment matching method with a moving window. Experimental results based on a 128-stage TDI-CIS show that, after correcting the FPN in an image captured under uniform illumination, the standard deviation of the row mean vector (SDRMV) decreases from 5.6761 LSB to 0.1948 LSB, while the standard deviation of the column mean vector (SDCMV) decreases from 15.2005 LSB to 13.1949 LSB. In addition, for different images captured by different TDI-CISs, the average decreases of SDRMV and SDCMV are 5.4922 LSB and 2.0357 LSB, respectively. Comparative experimental results indicate that the proposed method can effectively correct the FPN of different TDI-CISs while maintaining image details, without any auxiliary equipment.
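The two summary metrics are simply the standard deviations of the row-mean and column-mean vectors of the image; a minimal sketch (toy 3x3 image, not sensor data):

```python
import statistics

def sd_row_mean(img):
    """Standard deviation of the row mean vector (SDRMV): a measure of
    residual row fixed-pattern noise."""
    return statistics.pstdev([sum(row) / len(row) for row in img])

def sd_col_mean(img):
    """Standard deviation of the column mean vector (SDCMV)."""
    return statistics.pstdev([sum(col) / len(col) for col in zip(*img)])

# Toy image with column FPN but no row FPN
img = [[10, 12, 11],
       [10, 12, 11],
       [10, 12, 11]]
print(sd_row_mean(img), round(sd_col_mean(img), 3))   # → 0.0 0.816
```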

  15. A Semi-Empirical Topographic Correction Model for Multi-Source Satellite Images

    NASA Astrophysics Data System (ADS)

    Xiao, Sa; Tian, Xinpeng; Liu, Qiang; Wen, Jianguang; Ma, Yushuang; Song, Zhenwei

    2018-04-01

    Topographic correction of surface reflectance in rugged terrain is a prerequisite for quantitative remote sensing applications in mountainous areas. A physics-based radiative transfer model can be applied to correct the topographic effect and accurately retrieve the reflectance of a slope surface from high-quality satellite imagery such as Landsat 8 OLI. However, as more image data become available from a variety of sensors, the accurate sensor calibration parameters and atmospheric conditions required by physics-based topographic correction models are sometimes unavailable. This paper proposes a semi-empirical atmospheric and topographic correction model for multi-source satellite images that does not require accurate calibration parameters. Based on this model, topographically corrected surface reflectance can be obtained directly from DN data; we tested and verified the model with image data from the Chinese satellites HJ and GF. The results show that, for HJ, the correlation factor was reduced by almost 85 % for the near-infrared bands and the overall classification accuracy increased by 14 % after correction. The reflectance difference between slopes facing toward and away from the sun is also reduced after correction.
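As one concrete member of this family of models, the classic semi-empirical C-correction can be sketched as follows; this is a standard stand-in, not necessarily the paper's exact formulation, and the numbers are illustrative only:

```python
import math

def c_correction(rho, cos_i, sz_deg, c):
    """Semi-empirical C-correction of slope reflectance:
        rho_corr = rho * (cos(theta_sz) + c) / (cos_i + c)
    cos_i:  cosine of the local solar incidence angle on the slope
    sz_deg: solar zenith angle in degrees
    c:      empirical band-specific constant, normally fitted by
            regressing observed reflectance against cos_i."""
    cos_sz = math.cos(math.radians(sz_deg))
    return rho * (cos_sz + c) / (cos_i + c)

# A sun-facing slope (cos_i > cos_sz) is darkened toward its flat-terrain value
print(round(c_correction(0.30, 0.9, 30.0, 0.2), 4))   # → 0.2907
```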

  16. Extended Scene SH Wavefront Sensor Algorithm: Minimization of Scene Content Dependent Shift Estimation Errors

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    The Adaptive Periodic-Correlation (APC) algorithm was developed for use in extended-scene Shack-Hartmann wavefront sensors. It provides high accuracy even when the sub-images in a frame captured by a Shack-Hartmann camera are not only shifted but also distorted relative to each other. Recently we found that the shift-estimate error of the APC algorithm has a component that depends on the content of the extended scene. In this paper we assess the magnitude of that error and propose a method to minimize it.
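The core operation behind any correlation wavefront-sensing algorithm is locating the cross-correlation peak between a reference image and a sub-aperture image. A minimal 1-D integer-shift sketch (APC itself adds periodic correlation and sub-pixel refinement, which are omitted here):

```python
def shift_estimate(ref, sub):
    """Integer shift of `sub` relative to `ref`, from the location of
    the peak of their cross-correlation."""
    n = len(ref)
    best_s, best_c = 0, float("-inf")
    for s in range(-n + 1, n):
        # correlate sub against ref shifted by s samples
        c = sum(sub[i] * ref[i - s] for i in range(n) if 0 <= i - s < n)
        if c > best_c:
            best_s, best_c = s, c
    return best_s

ref = [0, 1, 4, 1, 0, 0, 0]
sub = [0, 0, 0, 1, 4, 1, 0]   # ref shifted right by two pixels
print(shift_estimate(ref, sub))   # → 2
```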

  17. Nanophotonic Image Sensors.

    PubMed

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative way to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements in nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next-generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Advances in detection of diffuse seafloor venting using structured light imaging.

    NASA Astrophysics Data System (ADS)

    Smart, C.; Roman, C.; Carey, S.

    2016-12-01

    Systematic, remote detection and high-resolution mapping of low-temperature diffuse hydrothermal venting is inefficient and not currently tractable using traditional remotely operated vehicle (ROV) mounted sensors. Preliminary results for hydrothermal vent detection using a structured light laser sensor were presented in 2011 and published in 2013 (Smart et al.), with continual advancements occurring in the interim. As the structured light laser passes over active venting, the projected laser line effectively blurs due to the associated turbulence and density anomalies in the vent fluid. The degree of laser disturbance is captured by a camera collecting images of the laser line at 20 Hz. Advancements in the detection of the laser and fluid interaction have included extensive normalization of the collected laser data and the implementation of a support vector machine algorithm to develop a classification routine. The image data collected over a hydrothermal vent field are then labeled as seafloor, bacteria, or a location of venting. The results can then be correlated with stereo images, bathymetry, and backscatter data. This sensor is a component of an ROV-mounted imaging suite which also includes stereo cameras and a multibeam sonar system. Originally developed for bathymetric mapping, the structured light laser sensor and the other imaging suite components are capable of creating visual and bathymetric maps with centimeter-level resolution. Surveys follow a standard "mowing the lawn" pattern, covering a 30 m x 30 m area in under an hour. Resulting co-registered data include multibeam and structured light laser bathymetry and backscatter, stereo images, and vent detections. This system allows for efficient exploration of areas with diffuse and small point-source hydrothermal venting, increasing the effectiveness of scientific sampling and observation.
Recent vent detection results collected during the 2013-2015 E/V Nautilus seasons will be presented. Smart, C. J., Roman, C., and Carey, S. N. (2013) Detection of diffuse seafloor venting using structured light imaging, Geochemistry, Geophysics, Geosystems, 14, 4743-4757.

  19. A novel dual gating approach using joint inertial sensors: implications for cardiac PET imaging

    NASA Astrophysics Data System (ADS)

    Jafari Tadi, Mojtaba; Teuho, Jarmo; Lehtonen, Eero; Saraste, Antti; Pänkäälä, Mikko; Koivisto, Tero; Teräs, Mika

    2017-10-01

    Positron emission tomography (PET) is a non-invasive imaging technique which may be considered the state of the art for the examination of cardiac inflammation due to atherosclerosis. A fundamental limitation of PET is that cardiac and respiratory motion reduces the quality of the achieved images. Current approaches for motion compensation involve gating the PET data based on the timing of quiescent periods of the cardiac and respiratory cycles. In this study, we present a novel gating method called microelectromechanical (MEMS) dual gating which relies on joint non-electrical sensors, i.e. a tri-axial accelerometer and a gyroscope. This approach can be used for optimized selection of the quiescent phases of the cardiac and respiratory cycles. Cardiomechanical activity was investigated against echocardiography observations to confirm whether this dual-sensor solution can provide accurate trigger timings for cardiac gating. Additionally, longitudinal chest motions originating from breathing were measured by accelerometer-derived and gyroscope-derived respiratory (ADR and GDR) tracking. The ADR and GDR signals were evaluated against Varian real-time position management (RPM) signals in terms of amplitude and phase. High linear correlation and agreement were achieved between the reference electrocardiography, RPM, and measured MEMS signals. We also performed a Ge-68 phantom study to evaluate possible metal artifacts caused by the integrated read-out electronics, including mechanical sensors and semiconductors. The reconstructed phantom images did not reveal any image artifacts. Thus, it was concluded that MEMS-driven dual gating can be used in PET studies without an effect on the quantitative or visual accuracy of the PET images. Finally, the applicability of MEMS dual gating for cardiac PET imaging was investigated with two atherosclerosis patients. 
Dual gated PET images were successfully reconstructed using only MEMS signals and both qualitative and quantitative assessments revealed encouraging results that warrant further investigation of this method.
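The dual-gating logic amounts to keeping only the events that fall within the quiescent window of both cycles at once. A sketch under assumed, illustrative criteria; the window length and phase limits below are placeholders, not the study's values:

```python
def dual_gate(events, cardiac_triggers, resp_phase,
              q_window=0.1, resp_lo=0.0, resp_hi=0.3):
    """Keep event timestamps that are (a) within q_window seconds after
    a cardiac trigger (assumed quiescent cardiac phase) and (b) at a
    respiratory phase inside [resp_lo, resp_hi] (assumed end-expiration).
    resp_phase maps a timestamp to a phase value."""
    return [t for t in events
            if any(0.0 <= t - trig <= q_window for trig in cardiac_triggers)
            and resp_lo <= resp_phase(t) <= resp_hi]

events = [0.05, 0.42, 0.85, 1.02]          # event timestamps (s)
triggers = [0.0, 0.8]                      # e.g. gyroscope-detected heartbeats
phase = lambda t: t % 1.0                  # toy respiratory phase signal
print(dual_gate(events, triggers, phase))  # → [0.05]
```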

  20. Simulation of Detecting Damage in Composite Stiffened Panel Using Lamb Waves

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Ross, Richard W.; Huang, Guo L.; Yuan, Fuh G.

    2013-01-01

    Lamb wave damage detection in a composite stiffened panel is simulated by performing explicit transient dynamic finite element analyses and using signal imaging techniques. This virtual test process does not require real structures, actuators/sensors, or laboratory equipment. Quasi-isotropic laminates are used for the stiffened panels. Two types of damage are studied: damage in the skin bay, and a debond between the stiffener flange and the skin. Innovative approaches for identifying the damage location and imaging the damage were developed. The damage location is identified by finding the intersection of the damage locus and the path of the time-reversal wave packet re-emitted from the sensor nodes. The damage locus is a circle that envelops the potential damage locations. Its center is at the actuator location and its radius is computed by multiplying the group velocity by the time of flight to the damage. To create a damage image for estimating the size of the damage, a group of nodes in the neighborhood of the damage location is identified for applying an image condition. The image condition, computed at a finite element node, is the zero-lag cross-correlation (ZLCC) of the time-reversed incident wave signal and the time-reversal wave signal from the sensor nodes. This damage imaging process is computationally efficient since only the ZLCC values of a small number of nodes in the neighborhood of the identified damage location are computed, instead of those of the full model.
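The image condition can be written compactly as a normalized zero-lag cross-correlation of two time signals at a node; a minimal sketch:

```python
def zlcc(a, b):
    """Normalized zero-lag cross-correlation of two sampled signals:
    close to 1 where the time-reversed incident wave and the wave
    re-emitted from the sensor nodes coincide (i.e. at the damage)."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

incident    = [0.0, 1.0, 0.5, -0.5]
rebroadcast = [0.0, 2.0, 1.0, -1.0]   # same waveform, scaled
print(zlcc(incident, rebroadcast))    # → 1.0
```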

  1. Innovative approach for in-vivo ablation validation on multimodal images

    NASA Astrophysics Data System (ADS)

    Shahin, O.; Karagkounis, G.; Carnegie, D.; Schlaefer, A.; Boctor, E.

    2014-03-01

    Radiofrequency ablation (RFA) is an important therapeutic procedure for small hepatic tumors. To make sure that the target tumor is effectively treated, RFA monitoring is essential. While several imaging modalities can observe the ablation procedure, it is not clear how ablated lesions on the images correspond to actual necroses. This uncertainty contributes to the high local recurrence rates (up to 55%) after radiofrequency ablative therapy. This study investigates a novel approach to correlating images of ablated lesions with actual necroses. We mapped both intraoperative images of the lesion and a slice through the actual necrosis in a common reference frame. An electromagnetic tracking system was used to accurately match lesion slices from different imaging modalities. To minimize the effect of liver deformation, the tracking reference frame was defined inside the tissue by anchoring an electromagnetic sensor adjacent to the lesion. A validation test performed on a phantom showed that the end-to-end accuracy of the approach was within 2 mm. In an in-vivo experiment, intraoperative magnetic resonance imaging (MRI) and ultrasound (US) ablation images were correlated to gross and histopathology. The results indicate that the proposed method can accurately correlate in-vivo ablations across different modalities. Ultimately, this will improve the interpretation of ablation monitoring and reduce the recurrence rates associated with RFA.

  2. Correlating objective and subjective evaluation of texture appearance with applications to camera phone imaging

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.

    2009-01-01

    Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.

  3. Photoacoustic imaging with planoconcave optical microresonator sensors: feasibility studies based on phantom imaging

    NASA Astrophysics Data System (ADS)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2017-03-01

    The planar Fabry-Pérot (FP) sensor provides high-quality photoacoustic (PA) images, but beam walk-off limits sensitivity, and thus penetration depth, to approximately 1 cm. Planoconcave microresonator sensors eliminate beam walk-off, enabling sensitivity to be increased by an order of magnitude whilst retaining the highly favourable frequency response and directional characteristics of the FP sensor. The first tomographic PA images obtained in a tissue-realistic phantom using the new sensors are described. These show that the microresonator sensors provide nearly identical image quality to the planar FP sensor but with significantly greater penetration depth (e.g. 2-3 cm) due to their higher sensitivity. This offers the prospect of whole-body small-animal imaging and clinical imaging to depths previously unattainable using the planar FP sensor.

  4. Carbon nanotube based respiratory gated micro-CT imaging of a murine model of lung tumors with optical imaging correlation

    NASA Astrophysics Data System (ADS)

    Burk, Laurel M.; Lee, Yueh Z.; Heathcote, Samuel; Wang, Ko-han; Kim, William Y.; Lu, Jianping; Zhou, Otto

    2011-03-01

    Current optical imaging techniques can successfully measure tumor load in murine models of lung carcinoma but lack structural detail. We demonstrate that respiratory-gated micro-CT imaging of such models gives information about structure and correlates with tumor load measurements by optical methods. Four mice with multifocal, Kras-induced tumors expressing firefly luciferase were imaged against four controls using both optical imaging and respiratory-gated micro-CT. CT images of anesthetized animals were acquired with a custom CNT-based system using 30 ms x-ray pulses during peak inspiration; respiratory motion was tracked with a pressure sensor beneath each animal's abdomen. Optical imaging based on the Luc+ signal correlating with tumor load was performed on a Xenogen IVIS Kinetix. Micro-CT images were post-processed using OsiriX, measuring lung volume with region growing. The diameters of the largest three tumors were measured, and the relationships between tumor size, lung volumes, and optical signal were compared. CT images and optical signals were obtained for all animals at two time points. Tumors were visible in all lobes of the Kras+ mice in all images; the smallest readily identified tumor measured approximately 300 microns in diameter. CT-derived tumor volumes and optical signals were linearly related, with r=0.94 for all animals; when derived for only the tumor-bearing animals, r=0.3. The trend of each individual animal's optical signal tracked correctly based on the CT volumes. Interestingly, lung volumes also correlated positively with optical imaging data and tumor volume burden, suggesting active remodeling.

  5. Self-adaptive calibration for staring infrared sensors

    NASA Astrophysics Data System (ADS)

    Kendall, William B.; Stocker, Alan D.

    1993-10-01

    This paper presents a new, self-adaptive technique for the correction of non-uniformities (fixed-pattern noise) in high-density infrared focal-plane detector arrays. We have developed a new approach to non-uniformity correction in which we use multiple image frames of the scene itself and take advantage of the aim-point wander caused by jitter, residual tracking errors, or deliberately induced motion. Such wander causes each detector in the array to view multiple scene elements, and each scene element to be viewed by multiple detectors. It is therefore possible to formulate (and solve) a set of simultaneous equations from which correction parameters can be computed for the detectors. We have tested our approach with actual images collected by the ARPA-sponsored MUSIC infrared sensor. For these tests we employed a 60-frame (0.75-second) sequence of terrain images for which an out-of-date calibration was deliberately used. The sensor was aimed at a point on the ground via an operator-assisted tracking system having a maximum aim-point wander on the order of ten pixels. With these data, we were able to improve the calibration accuracy by a factor of approximately 100.
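The simultaneous-equation idea can be illustrated in one dimension with an offset-only model: frames[f][d] = scene[d + shifts[f]] + off[d]. Because the aim point wanders, several detectors see each scene element, and the per-detector offsets become solvable up to a common constant. The alternating-averaging solver below is a hypothetical minimal stand-in for the paper's (unspecified) solution method, which also handles gains and two dimensions:

```python
def estimate_offsets(frames, shifts, iters=100):
    """Estimate per-detector offsets from shifted views of an unknown
    scene by alternating between a scene estimate and an offset
    estimate; the unobservable common constant is fixed by forcing
    detector 0 to zero offset."""
    n = len(frames[0])
    off = [0.0] * n
    for _ in range(iters):
        # scene estimate: average the offset-corrected observations per position
        samples = {}
        for f, s in enumerate(shifts):
            for d in range(n):
                samples.setdefault(d + s, []).append(frames[f][d] - off[d])
        scene = {p: sum(v) / len(v) for p, v in samples.items()}
        # offset re-estimate from the residuals, then fix the gauge
        off = [sum(frames[f][d] - scene[d + s] for f, s in enumerate(shifts))
               / len(frames) for d in range(n)]
        off = [o - off[0] for o in off]
    return off

true_off = [0.0, 2.0, -1.0]                # fixed-pattern offsets to recover
scene = [5.0, 7.0, 6.0, 8.0]               # unknown scene radiances
shifts = [0, 1]                            # one-pixel aim-point wander
frames = [[scene[d + s] + true_off[d] for d in range(3)] for s in shifts]
print([round(o, 3) for o in estimate_offsets(frames, shifts)])   # → [0.0, 2.0, -1.0]
```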

  6. Sensor- and Scene-Guided Integration of TLS and Photogrammetric Point Clouds for Landslide Monitoring

    NASA Astrophysics Data System (ADS)

    Zieher, T.; Toschi, I.; Remondino, F.; Rutzinger, M.; Kofler, Ch.; Mejia-Aguilar, A.; Schlögel, R.

    2018-05-01

    Terrestrial and airborne 3D imaging sensors are well-suited data acquisition systems for the area-wide monitoring of landslide activity. State-of-the-art surveying techniques, such as terrestrial laser scanning (TLS) and photogrammetry based on unmanned aerial vehicle (UAV) imagery or terrestrial acquisitions, have advantages and limitations associated with their individual measurement principles. In this study we present an integration approach for 3D point clouds derived from these techniques, aiming at improving the topographic representation of landslide features while enabling a more accurate assessment of landslide-induced changes. Four expert-based rules, involving local morphometric features computed from eigenvectors, elevation, and the agreement of the individual point clouds, are used to choose, within voxels of selectable size, which sensor's data to keep. Based on the integrated point clouds, digital surface models and shaded reliefs are computed. Using an image correlation technique, displacement vectors are finally derived from the multi-temporal shaded reliefs. All results show comparable patterns of landslide movement rates and directions. However, depending on the applied integration rule, differences in spatial coverage and correlation strength emerge.

  7. Detection of potato beetle damage using remote sensing from small unmanned aircraft systems

    NASA Astrophysics Data System (ADS)

    Hunt, E. Raymond; Rondon, Silvia I.

    2017-04-01

    Colorado potato beetle (CPB) adults and larvae devour leaves of potato and other solanaceous crops and weeds, and may quickly develop resistance to pesticides. With early detection of CPB damage, more options are available for precision integrated pest management, which reduces the amount of pesticides applied in a field. Remote sensing with small unmanned aircraft systems (sUAS) has potential for CPB detection because low flight altitudes allow image acquisition at very high spatial resolution. A five-band multispectral sensor and up-looking incident light sensor were mounted on a six-rotor sUAS, which was flown at altitudes of 60 and 30 m in June 2014. Plants went from visibly undamaged to having some damage in just 1 day. Whole-plot normalized difference vegetation index (NDVI) and the number of pixels classified as damaged (0.70≤NDVI≤0.80) were not correlated with visible CPB damage ranked from least to most. Area of CPB damage estimated using object-based image analysis was highly correlated to the visual ranking of damage. Furthermore, plant height calculated using structure-from-motion point clouds was related to CPB damage, but this method required extensive operator intervention for success. Object-based image analysis has potential for early detection based on high spatial resolution sUAS remote sensing.
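The plot metric and the damage class used above reduce to a per-pixel NDVI and a band-pass threshold; a minimal sketch (the band reflectances are illustrative):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel."""
    return (nir - red) / (nir + red)

def damaged(nir, red, lo=0.70, hi=0.80):
    """Damage classification rule from the study: a pixel is counted
    as damaged where 0.70 <= NDVI <= 0.80."""
    return lo <= ndvi(nir, red) <= hi

# Illustrative NIR/red reflectance pairs: NDVI = 0.75 and 0.67
print(damaged(0.35, 0.05), damaged(0.50, 0.10))   # → True False
```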

  8. Sensors management in robotic neurosurgery: the ROBOCAST project.

    PubMed

    Vaccarella, Alberto; Comparetti, Mirko Daniele; Enquobahrie, Andinet; Ferrigno, Giancarlo; De Momi, Elena

    2011-01-01

    Robot and computer-aided surgery platforms bring a variety of sensors into the operating room. These sensors generate information to be synchronized and merged to improve the accuracy and safety of the surgical procedure for both patients and operators. In this paper, we present our work on the development of a sensor management architecture that is used to gather and fuse data from localization systems, such as optical and electromagnetic trackers, and from ultrasound imaging devices. The architecture follows a modular client-server approach and was implemented within the EU-funded project ROBOCAST (FP7 ICT 215190). Furthermore, it is based on well-maintained open-source libraries such as OpenCV and the Image-Guided Surgery Toolkit (IGSTK), which are supported by a worldwide community of developers and allow a significant reduction of software costs. We conducted experiments to evaluate the performance of the sensor manager module. We computed the response time needed for a client to receive tracking data or video images, and the time lag between synchronous acquisitions with an optical tracker and an ultrasound machine. Results showed a median delay of 1.9 ms for a client request of tracking data and about 40 ms for US images; these values are compatible with the data generation rates (20-30 Hz for the tracking system and 25 fps for PAL video). Simultaneous acquisitions were performed with an optical tracking system and a US imaging device: the data were aligned according to the timestamp associated with each sample, and the delay was estimated with a cross-correlation study. A median delay of 230 ms was calculated, showing that real-time 3D reconstruction is not feasible (an offline temporal calibration is needed), although a slow exploration is possible. 
In conclusion, as far as asleep-patient neurosurgery is concerned, the proposed setup is indeed useful for registration error correction, because brain shift occurs with a time constant of a few tens of minutes.
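The delay figure comes from a cross-correlation study of the two timestamp-aligned streams; a minimal sketch of lag estimation with toy waveforms (the 20 Hz rate matches the tracking system above, the signal values do not):

```python
def estimate_delay(x, y, fs):
    """Delay of y relative to x, in seconds, from the location of the
    peak of their cross-correlation.  fs is the sample rate in Hz."""
    n = len(x)
    best_lag, best_c = 0, float("-inf")
    for lag in range(-n + 1, n):
        c = sum(x[i] * y[i + lag] for i in range(n) if 0 <= i + lag < n)
        if c > best_c:
            best_lag, best_c = lag, c
    return best_lag / fs

x = [0, 0, 1, 3, 1, 0, 0, 0]
y = [0, 0, 0, 0, 1, 3, 1, 0]   # same waveform, two samples later
print(estimate_delay(x, y, fs=20.0))   # → 0.1  (i.e. 100 ms)
```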

  9. Robotic Vehicle Communications Interoperability

    DTIC Science & Technology

    1988-08-01

    [Record excerpt not recoverable as prose: it consists of fragments of two tables from the report, a vehicle-controls matrix (starter/cold start, fire suppression, fording control, fuel control, fuel tank selector, garage toggle, gear selector, hazard warning) and a sensor-suite list (video, radar, IR thermal imaging system, image intensifier, laser ranger; video camera selector: forward, stereo, rear; sensor control).]

  10. Establishment of a sensor testbed at NIST for plant productivity monitoring

    NASA Astrophysics Data System (ADS)

    Allen, D. W.; Hutyra, L.; Reinmann, A.; Trlica, A.; Marrs, J.; Jones, T.; Whetstone, J. R.; Logan, B.; Reblin, J.

    2017-12-01

    Accurate assessment of biogenic carbon fluxes is challenging. Correlating optical signatures to plant activity allows for the monitoring of large regions. New methods, including solar-induced fluorescence (SIF), promise to provide more timely and accurate estimates of plant activity, but we are still developing a full understanding of the mechanistic linkage between plant assimilation of carbon and SIF. We have initiated a testbed to facilitate the evaluation of sensors and methods for remote monitoring of plant activity at the NIST headquarters. The testbed utilizes a forested area of mature trees in a mixed urban environment. A 1-hectare plot within the 26-hectare forest has been instrumented for ecophysiological measurements, with an edge (100 m long) that is persistently monitored with multimodal optical sensors (SIF spectrometers, hyperspectral imagers, thermal infrared imaging, and lidar). This biological testbed has the advantage of direct access to the national scales maintained by NIST for both the physical and optical measurements of interest. We offer a description of the test site, the sensors, and preliminary results from the first season of observations for ecological, physiological, and remote-sensing-based estimates of ecosystem productivity.

  11. A novel camera localization system for extending three-dimensional digital image correlation

    NASA Astrophysics Data System (ADS)

    Sabato, Alessandro; Reddy, Narasimha; Khan, Sameer; Niezrecki, Christopher

    2018-03-01

    The monitoring of civil, mechanical, and aerospace structures is important, especially as these systems approach or surpass their design life. Often, Structural Health Monitoring (SHM) relies on sensing techniques for condition assessment. Advancements in camera technology and optical sensors have made three-dimensional (3D) Digital Image Correlation (DIC) a valid technique for extracting structural deformations and geometry profiles. Prior to making stereophotogrammetry measurements, a calibration has to be performed to obtain the vision system's extrinsic and intrinsic parameters; that is, the position of the cameras relative to each other (i.e. separation distance, camera angle, etc.) must be determined. Typically, cameras are placed on a rigid bar to prevent any relative motion between them. This constraint limits the utility of the 3D-DIC technique, especially as it is applied to monitor large structures from various fields of view. In this preliminary study, the design of a multi-sensor system is proposed to extend 3D-DIC's capability and allow for easier calibration and measurement. The suggested system relies on a MEMS-based Inertial Measurement Unit (IMU) and a 77 GHz radar sensor for measuring the orientation and relative distance of the stereo cameras. The feasibility of the proposed combined IMU-radar system is evaluated through laboratory tests, demonstrating its ability to determine the cameras' positions in space for performing accurate 3D-DIC calibration and measurements.

  12. Damage extraction of buildings in the 2015 Gorkha, Nepal earthquake from high-resolution SAR data

    NASA Astrophysics Data System (ADS)

    Yamazaki, Fumio; Bahri, Rendy; Liu, Wen; Sasagawa, Tadashi

    2016-05-01

    Satellite remote sensing is recognized as an effective tool for detecting and monitoring areas affected by natural disasters. Since SAR sensors can capture images not only in daytime but also at nighttime and under cloud cover, they are especially useful during an emergency response period. In this study, multi-temporal high-resolution TerraSAR-X images were used for damage inspection of the Kathmandu area, which was severely affected by the April 25, 2015 Gorkha earthquake. The SAR images obtained before and after the earthquake were used to calculate the difference and correlation coefficient of backscatter. The affected areas were identified by high absolute values of the difference and low values of the correlation coefficient. Post-event high-resolution optical satellite images were employed as ground-truth data to verify our results. Although it was difficult to estimate damage levels for individual buildings, the high-resolution SAR images demonstrated their capability to detect collapsed buildings during emergency response.
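The two change indicators reduce to a mean backscatter difference and a Pearson correlation coefficient computed over co-registered pre-/post-event windows; a minimal sketch with toy backscatter values:

```python
def change_metrics(pre, post):
    """Absolute difference of mean backscatter and Pearson correlation
    coefficient for one window of co-registered SAR pixels.  Collapsed
    buildings show a large |difference| and a low correlation."""
    n = len(pre)
    mp, mq = sum(pre) / n, sum(post) / n
    diff = abs(mq - mp)
    cov = sum((p - mp) * (q - mq) for p, q in zip(pre, post))
    den = (sum((p - mp) ** 2 for p in pre)
           * sum((q - mq) ** 2 for q in post)) ** 0.5
    return diff, cov / den if den else 0.0

unchanged = change_metrics([3, 5, 4, 6], [3.1, 5.2, 4.0, 6.1])
collapsed = change_metrics([3, 5, 4, 6], [6.0, 3.5, 5.5, 3.0])
print(unchanged[1] > 0.9, collapsed[1] < 0.0)   # → True True
```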

  13. A back-illuminated megapixel CMOS image sensor

    NASA Technical Reports Server (NTRS)

    Pain, Bedabrata; Cunningham, Thomas; Nikzad, Shouleh; Hoenk, Michael; Jones, Todd; Wrigley, Chris; Hancock, Bruce

    2005-01-01

    In this paper, we present test and characterization results for a back-illuminated megapixel CMOS imager. The imager pixel consists of a standard junction photodiode coupled to a three-transistor-per-pixel switched source-follower readout [1]. The imager also includes integrated timing, control, and bias-generation circuits, and provides analog output. The analog column-scan circuits were implemented in such a way that the imager can be configured to run in off-chip correlated double-sampling (CDS) mode. The imager was originally designed for normal front-illuminated operation and was fabricated in a commercially available 0.5 μm triple-metal CMOS-imager-compatible process. For backside illumination, the imager was thinned by etching away the substrate in a post-fabrication processing step.
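Off-chip correlated double sampling is, at its core, a per-pixel difference of two samples; a minimal sketch with hypothetical digital numbers:

```python
def cds(reset_sample, signal_sample):
    """Correlated double sampling: subtracting the post-integration
    readout from the post-reset sample cancels the per-pixel offset
    (and kTC reset noise) common to both samples."""
    return reset_sample - signal_sample

# Two pixels with different offsets observing the same light level (DN)
print(cds(1800, 1300), cds(1950, 1450))   # → 500 500
```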

  14. An infrared/video fusion system for military robotics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, A.W.; Roberts, R.S.

    1997-08-05

    Sensory information is critical to the telerobotic operation of mobile robots. In particular, visual sensors are a key component of the sensor package on a robot engaged in urban military operations. Visual sensors provide the robot operator with a wealth of information, including robot navigation and threat assessment. However, simple countermeasures such as darkness, smoke, or blinding by a laser can easily neutralize visual sensors. In order to provide a robust visual sensing system, an infrared sensor is required to augment the primary visual sensor. An infrared sensor can acquire useful imagery in conditions that incapacitate a visual sensor. A simple approach to incorporating an infrared sensor into the visual sensing system is to display two images to the operator: side-by-side visual and infrared images. However, dual images might overwhelm the operator with information and result in degraded robot performance. A better solution is to combine the visual and infrared images into a single image that maximizes scene information. Fusing visual and infrared images into a single image demands balancing the mixture of visual and infrared information. Humans are accustomed to viewing and interpreting visual images. They are not accustomed to viewing or interpreting infrared images. Hence, the infrared image must be used to enhance the visual image, not obfuscate it.

  15. A Disposable Tear Glucose Biosensor—Part 4

    PubMed Central

    Engelschall, Erica; Lan, Kenneth; Shah, Pankti; Saez, Neil; Maxwell, Stephanie; Adamson, Teagan; Abou-Eid, Michelle; McAferty, Kenyon; Patel, Dharmendra R.; Cook, Curtiss B.

    2014-01-01

    Objective: A prototype tear glucose (TG) sensor was tested in New Zealand white rabbits to assess eye irritation, blood glucose (BG) and TG lag time, and correlation with BG. Methods: A total of 4 animals were used. Eye irritation was monitored by Lissamine green dye and analyzed using image analysis software. Lag time was assessed with an oral glucose load while recording TG and BG readings. TG and BG readings were plotted against one another to form a correlation diagram, using a Yellow Springs Instrument (YSI) analyzer and self-monitoring of blood glucose as the reference measurements. Finally, TG levels were calculated using analytically derived expressions. Results: From repeated testing carried out over the course of 12 months, little to no eye irritation was detected. TG fluctuations over time visually appeared to trace the same pattern as BG, with an average lag time of 13 minutes. TG levels calculated from the device current measurements ranged from 4 to 20 mg/dL and correlated linearly with BG levels of 75-160 mg/dL (TG = 0.1723 BG - 7.9448 mg/dL; R2 = .7544). Conclusion: The first steps were taken toward preliminary development of a sensor for self-monitoring of tear glucose (SMTG). No conjunctival irritation was noted in any of the animals. The lag time between TG and BG was found to be noticeable, but quantitative modeling of the lag time was deemed unnecessary in this study. Measured currents from the sensors and the calculated TG showed promising correlation to BG levels. Previous analytical benchmarking showed BG and TG levels consistent with other literature. PMID:24876546
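    The reported calibration is an ordinary least-squares line. With hypothetical paired readings in the reported ranges (the data below are invented for illustration, not the study's measurements), the fit can be reproduced as:

    ```python
    import numpy as np

    # Hypothetical paired readings in the reported ranges (mg/dL).
    bg = np.array([75.0, 95.0, 115.0, 135.0, 160.0])
    tg = 0.1723 * bg - 7.9448 + np.array([0.3, -0.2, 0.1, -0.4, 0.2])  # noise

    # Ordinary least squares: slope, intercept, and coefficient of determination.
    slope, intercept = np.polyfit(bg, tg, 1)
    pred = slope * bg + intercept
    r2 = 1.0 - ((tg - pred) ** 2).sum() / ((tg - tg.mean()) ** 2).sum()
    ```

    The fitted slope recovers the assumed 0.1723 to within the noise level, and R2 is near 1 because the synthetic noise is small relative to the TG range.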

  16. CORRELATED ERRORS IN EARTH POINTING MISSIONS

    NASA Technical Reports Server (NTRS)

    Bilanow, Steve; Patt, Frederick S.

    2005-01-01

    Two different Earth-pointing missions dealing with attitude control and dynamics changes illustrate concerns with correlated error sources and coupled effects that can occur. On the OrbView-2 (OV-2) spacecraft, the assumption of a nearly-inertially-fixed momentum axis was called into question when a residual dipole bias apparently changed magnitude. The possibility that alignment adjustments and/or sensor calibration errors may compensate for actual motions of the spacecraft is discussed, and uncertainties in the dynamics are considered. Particular consideration is given to basic orbit frequency and twice orbit frequency effects and their high correlation over the short science observation data span. On the Tropical Rainfall Measuring Mission (TRMM) spacecraft, the switch to a contingency Kalman filter control mode created changes in the pointing error patterns. Results from independent checks on the TRMM attitude using science instrument data are reported, and bias shifts and error correlations are discussed. Various orbit frequency effects are common with the flight geometry for Earth pointing instruments. In both dual-spin momentum stabilized spacecraft (like OV-2) and three axis stabilized spacecraft with gyros (like TRMM under Kalman filter control), changes in the initial attitude state propagate into orbit frequency variations in attitude and some sensor measurements. At the same time, orbit frequency measurement effects can arise from dynamics assumptions, environment variations, attitude sensor calibrations, or ephemeris errors. Also, constant environment torques for dual spin spacecraft have similar effects to gyro biases on three axis stabilized spacecraft, effectively shifting the one-revolution-per-orbit (1-RPO) body rotation axis. Highly correlated effects can create a risk for estimation errors particularly when a mission switches an operating mode or changes its normal flight environment. 
Some error effects will not be obvious from attitude sensor measurement residuals, so some independent checks using imaging sensors are essential and derived science instrument attitude measurements can prove quite valuable in assessing the attitude accuracy.

  17. CMOS Image Sensors: Electronic Camera On A Chip

    NASA Technical Reports Server (NTRS)

    Fossum, E. R.

    1995-01-01

    Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On-chip analog-to-digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low-cost uses.

  18. Stereo Cloud Height and Wind Determination Using Measurements from a Single Focal Plane

    NASA Astrophysics Data System (ADS)

    Demajistre, R.; Kelly, M. A.

    2014-12-01

    We present here a method for extracting cloud heights and winds from an aircraft or orbital platform using measurements from a single focal plane, exploiting the motion of the platform to provide multiple views of the cloud tops. To illustrate this method we use data acquired during aircraft flight tests of a set of simple stereo imagers that are well suited to this purpose. Each of these imagers has three linear arrays on the focal plane, one looking forward, one looking aft, and one looking down. Push-broom images from each of these arrays are constructed, and then a spatial correlation analysis is used to deduce the delays and displacements required for wind and cloud height determination. We will present the algorithms necessary for the retrievals, as well as the methods used to determine the uncertainties of the derived cloud heights and winds. We will apply the retrievals and uncertainty determination to a number of image sets acquired by the airborne sensors. We then generalize these results to potential space based observations made by similar types of sensors.
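    The delay between the forward- and aft-looking push-broom rows can be found as the peak of a normalized cross-correlation over candidate lags. A one-dimensional sketch on synthetic brightness profiles (hypothetical signals, not flight data):

    ```python
    import numpy as np

    def best_lag(a, b, max_lag):
        # Returns the integer lag at the peak of the normalized
        # cross-correlation; a negative lag means b is delayed relative to a.
        def ncc(x, y):
            x = x - x.mean(); y = y - y.mean()
            return (x * y).sum() / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)
        lags = list(range(-max_lag, max_lag + 1))
        scores = [ncc(a[L:] if L >= 0 else a[:L],
                      b[:len(b) - L] if L >= 0 else b[-L:]) for L in lags]
        return lags[int(np.argmax(scores))]

    track = np.sin(np.linspace(0, 12, 300))    # cloud-top brightness profile
    delayed = np.r_[np.zeros(5), track[:-5]]   # aft view sees it 5 samples later
    ```

    With known platform velocity and sampling rate, the recovered lag converts directly into the along-track parallax used for cloud height and wind determination.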

  19. Crack Detection in Fibre Reinforced Plastic Structures Using Embedded Fibre Bragg Grating Sensors: Theory, Model Development and Experimental Validation

    PubMed Central

    Pereira, G. F.; Mikkelsen, L. P.; McGugan, M.

    2015-01-01

    In a fibre-reinforced polymer (FRP) structure designed using the emerging damage tolerance and structural health monitoring philosophy, sensors and models that describe crack propagation will enable a structure to operate despite the presence of damage by fully exploiting the material’s mechanical properties. When applying this concept to different structures, sensor systems and damage types, a combination of damage mechanics, monitoring technology, and modelling is required. The primary objective of this article is to demonstrate such a combination. This article is divided into three main topics: the damage mechanism (delamination of FRP), the structural health monitoring technology (fibre Bragg gratings to detect delamination), and the finite element method model of the structure that incorporates these concepts into a final and integrated damage-monitoring concept. A novel method for assessing a crack growth/damage event in fibre-reinforced polymer or structural adhesive-bonded structures using embedded fibre Bragg grating (FBG) sensors is presented by combining conventional measured parameters, such as wavelength shift, with parameters associated with measurement errors, typically ignored by the end-user. Conjointly, a novel model for sensor output prediction (virtual sensor) was developed using this FBG sensor crack monitoring concept and implemented in a finite element method code. The monitoring method was demonstrated and validated using glass fibre double cantilever beam specimens instrumented with an array of FBG sensors embedded in the material and tested using an experimental fracture procedure. The digital image correlation technique was used to validate the model prediction by correlating the specific sensor response caused by the crack with the developed model. PMID:26513653

  20. 77 FR 26787 - Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... INTERNATIONAL TRADE COMMISSION [Docket No. 2895] Certain CMOS Image Sensors and Products.... International Trade Commission has received a complaint entitled Certain CMOS Image Sensors and Products... importation, and the sale within the United States after importation of certain CMOS image sensors and...

  1. Life cycle monitoring of lithium-ion polymer batteries using cost-effective thermal infrared sensors with applications for lifetime prediction

    NASA Astrophysics Data System (ADS)

    Zhou, Xunfei; Malik, Anav; Hsieh, Sheng-Jen

    2017-05-01

    Lithium-ion batteries have become indispensable parts of our lives for their high energy density and long lifespan. However, failures due to abusive usage conditions, flawed manufacturing processes, and aging can adversely affect battery performance and even endanger people and property. Therefore, battery cells that are failing or reaching their end-of-life need to be replaced. Traditionally, battery lifetime prediction is achieved by analyzing data from current, voltage, and impedance sensors. However, such a prognostic system is expensive to implement and requires direct contact. In this study, low-cost thermal infrared sensors were used to acquire thermographic images throughout the entire lifetime of small-scale lithium-ion polymer batteries (410 cycles). The infrared system (non-destructive) took temperature readings from multiple batteries during charging and discharging cycles at 1C. Thermal characteristics of the batteries were derived from the thermographic images. A time-dependent and spatially resolved temperature mapping was obtained and quantitatively analyzed. The developed model can predict cycle number using the first 10 minutes of surface temperature data acquired through infrared imaging at the beginning of the cycle, with an average error rate of less than 10%. This approach can be used to correlate thermal characteristics of the batteries with life cycles, and to propose cost-effective thermal infrared imaging applications in battery prognostic systems.
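    The study's prediction model is its own; the underlying idea of regressing cycle number on an early-cycle thermal feature can be sketched with synthetic numbers (the feature choice and the aging trend below are assumptions for illustration):

    ```python
    import numpy as np

    # Synthetic data: assume the initial temperature-rise rate (degC/min over
    # the first 10 minutes of a cycle) grows slowly as the cell ages.
    cycles = np.array([50.0, 120.0, 190.0, 260.0, 330.0, 400.0])
    rise = 0.02 + 0.0005 * cycles + np.array([1, -1, 0.5, -0.5, 1, -1]) * 1e-3

    # Invert the relationship: fit cycle number against the thermal feature.
    slope, intercept = np.polyfit(rise, cycles, 1)
    pred = slope * rise + intercept
    rel_err = np.abs(pred - cycles) / cycles   # per-cycle relative error
    ```

    On this synthetic trend the relative prediction error stays well under the 10% figure reported in the abstract, which is what a usable prognostic would require.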

  2. Sea surface velocities from visible and infrared multispectral atmospheric mapping sensor imagery

    NASA Technical Reports Server (NTRS)

    Pope, P. A.; Emery, W. J.; Radebaugh, M.

    1992-01-01

    High resolution (100 m), sequential Multispectral Atmospheric Mapping Sensor (MAMS) images were used in a study to calculate advective surface velocities using the Maximum Cross Correlation (MCC) technique. Radiance and brightness temperature gradient magnitude images were formed from visible (0.48 microns) and infrared (11.12 microns) image pairs, respectively, of Chandeleur Sound, which is a shallow body of water northeast of the Mississippi delta, at 145546 GMT and 170701 GMT on 30 Mar. 1989. The gradient magnitude images enhanced the surface water feature boundaries, and a lower cutoff on the gradient magnitudes calculated allowed the undesirable sunglare and backscatter gradients in the visible images, and the water vapor absorption gradients in the infrared images, to be reduced in strength. Requiring high (greater than 0.4) maximum cross correlation coefficients and spatial coherence of the vector field aided in the selection of an optimal template size of 10 x 10 pixels (first image) and search limit of 20 pixels (second image) to use in the MCC technique. Use of these optimum input parameters to the MCC algorithm, and high correlation and spatial coherence filtering of the resulting velocity field from the MCC calculation yielded a clustered velocity distribution over the visible and infrared gradient images. The velocity field calculated from the visible gradient image pair agreed well with a subjective analysis of the motion, but the velocity field from the infrared gradient image pair did not. This was attributed to the changing shapes of the gradient features, their nonuniqueness, and large displacements relative to the mean distance between them. These problems implied a lower repeat time for the imagery was needed in order to improve the velocity field derived from gradient imagery. Suggestions are given for optimizing the repeat time of sequential imagery when using the MCC method for motion studies. 
Applying the MCC method to the infrared brightness temperature imagery yielded a velocity field which did agree with the subjective analysis of the motion and that derived from the visible gradient imagery. Differences between the visible and infrared derived velocities were 14.9 cm/s in speed and 56.7 degrees in direction. Both of these velocity fields also agreed well with the motion expected from considerations of the ocean bottom topography and wind and tidal forcing in the study area during the 2.175 hour time interval.
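    The MCC step itself is a template match: a template from the first image is correlated against every offset within a search window of the second image, and the peak gives the displacement. A compact sketch with the abstract's parameter roles (template size, search limit) on synthetic imagery, not MAMS data:

    ```python
    import numpy as np

    def mcc_displacement(img1, img2, r0, c0, tsize, search):
        # Correlate a template from img1 against each offset in a search
        # window of img2; the peak offset is the feature displacement.
        t = img1[r0:r0 + tsize, c0:c0 + tsize].astype(float)
        t = t - t.mean()
        best = (-2.0, 0, 0)
        for dr in range(-search, search + 1):
            for dc in range(-search, search + 1):
                r, c = r0 + dr, c0 + dc
                if r < 0 or c < 0 or r + tsize > img2.shape[0] or c + tsize > img2.shape[1]:
                    continue
                w = img2[r:r + tsize, c:c + tsize].astype(float)
                w = w - w.mean()
                denom = np.linalg.norm(t) * np.linalg.norm(w)
                if denom > 0:
                    cc = (t * w).sum() / denom
                    if cc > best[0]:
                        best = (cc, dr, dc)
        return best  # (max correlation coefficient, row shift, col shift)

    rng = np.random.default_rng(0)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, (3, 2), axis=(0, 1))   # advect features by (3, 2)
    cc, dr, dc = mcc_displacement(frame1, frame2, 20, 20, 10, 5)
    # velocity = displacement * pixel_size / time_between_images
    ```

    Rejecting vectors whose peak coefficient falls below a cutoff (0.4 in the abstract) is what filters out the ambiguous matches caused by feature deformation between acquisitions.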

  3. Photon Counting Imaging with an Electron-Bombarded Pixel Image Sensor

    PubMed Central

    Hirvonen, Liisa M.; Suhling, Klaus

    2016-01-01

    Electron-bombarded pixel image sensors, where a single photoelectron is accelerated directly into a CCD or CMOS sensor, allow wide-field imaging at extremely low light levels as they are sensitive enough to detect single photons. This technology allows the detection of up to hundreds or thousands of photon events per frame, depending on the sensor size, and photon event centroiding can be employed to recover resolution lost in the detection process. Unlike photon events from electron-multiplying sensors, the photon events from electron-bombarded sensors have a narrow, acceleration-voltage-dependent pulse height distribution. Thus a gain voltage sweep during exposure in an electron-bombarded sensor could allow photon arrival time determination from the pulse height with sub-frame exposure time resolution. We give a brief overview of our work with electron-bombarded pixel image sensor technology and recent developments in this field for single photon counting imaging, and examples of some applications. PMID:27136556
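    Photon event centroiding, as commonly implemented (a generic sketch, not the authors' pipeline), finds local maxima above a noise threshold and refines each to sub-pixel precision with an intensity-weighted centroid over the surrounding neighbourhood:

    ```python
    import numpy as np

    def centroid_events(frame, threshold):
        # Detect events as 3x3 local maxima above `threshold`, then refine
        # each position with an intensity-weighted centroid of the patch.
        h, w = frame.shape
        dy, dx = np.mgrid[-1:2, -1:2]
        events = []
        for r in range(1, h - 1):
            for c in range(1, w - 1):
                patch = frame[r - 1:r + 2, c - 1:c + 2].astype(float)
                if frame[r, c] >= threshold and frame[r, c] == patch.max():
                    s = patch.sum()
                    events.append((r + (dy * patch).sum() / s,
                                   c + (dx * patch).sum() / s))
        return events

    frame = np.zeros((9, 9))
    frame[4, 4] = 10.0
    frame[4, 5] = 5.0               # one event with asymmetric charge spread
    ev = centroid_events(frame, threshold=8)
    ```

    The asymmetric charge spread pulls the recovered position a third of a pixel toward the brighter neighbour, which is exactly the resolution recovery the abstract refers to.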

  4. Hemispherical Field-of-View Above-Water Surface Imager for Submarines

    NASA Technical Reports Server (NTRS)

    Hemmati, Hamid; Kovalik, Joseph M.; Farr, William H.; Dannecker, John D.

    2012-01-01

    A document discusses solutions to the problem of submarines having to rise above water to detect airplanes in the general vicinity. Two solutions are provided: in the first, a sensor is located just under the water surface; in the second, at a depth of a few to tens of meters below the surface. The first option is a Fish Eye Lens (FEL) digital-camera combination, situated just under the water surface, that has a near-full-hemisphere (360° azimuth and 90° elevation) field of view for detecting objects on the water surface. This sensor can provide a three-dimensional picture of the airspace both in the marine and in the land environment. The FEL is coupled to a camera and can continuously look at the entire sky above it. The camera can have an Active Pixel Sensor (APS) focal plane array that allows logic circuitry to be built directly into the sensor. The logic circuitry allows data processing to occur on the sensor head without the need for any other external electronics. In the second option, a single-photon-sensitive (photon-counting) detector array is used at depth, without the need for any optics in front of it, since at this location optical signals are scattered and arrive over a wide (tens of degrees) range of angles. Beam scattering through clouds and seawater effectively negates optical imaging at depths below a few meters under cloudy or turbulent conditions. Under those conditions, maximum collection efficiency can be achieved by using a non-imaging photon-counting detector behind narrowband filters. In either case, signals from these sensors may be fused and correlated or decorrelated with other sensor data to get an accurate picture of the object(s) above the submarine. These devices can complement traditional submarine periscopes, which have a limited field of view in the elevation direction. Also, these techniques circumvent the need for exposing the entire submarine or its periscopes to the outside environment.

  5. Evaluation and comparison of the IRS-P6 and the landsat sensors

    USGS Publications Warehouse

    Chander, G.; Coan, M.J.; Scaramuzza, P.L.

    2008-01-01

    The Indian Remote Sensing Satellite (IRS-P6), also called ResourceSat-1, was launched into a polar sun-synchronous orbit on October 17, 2003. It carries three sensors: the high-resolution Linear Imaging Self-Scanner (LISS-IV), the medium-resolution Linear Imaging Self-Scanner (LISS-III), and the Advanced Wide-Field Sensor (AWiFS). These three sensors provide images of different resolutions and coverage. To understand the absolute radiometric calibration accuracy of the IRS-P6 AWiFS and LISS-III sensors, image pairs from these sensors were compared to images from the Landsat-5 Thematic Mapper (TM) and Landsat-7 Enhanced TM Plus (ETM+) sensors. The approach involves calibration of surface observations based on image statistics from areas observed nearly simultaneously by the two sensors. This paper also evaluated the viability of data from these next-generation imagers for use in creating three National Land Cover Dataset (NLCD) products: land cover, percent tree canopy, and percent impervious surface. Individual products were consistent with previous studies but had slightly lower overall accuracies as compared to data from the Landsat sensors.

  6. Enhanced tactical radar correlator (ETRAC): true interoperability of the 1990s

    NASA Astrophysics Data System (ADS)

    Guillen, Frank J.

    1994-10-01

    The enhanced tactical radar correlator (ETRAC) system is under development at Westinghouse Electric Corporation for the Army Space Program Office (ASPO). ETRAC is a real-time synthetic aperture radar (SAR) processing system that provides tactical IMINT to the corps commander. It features an open architecture comprising ruggedized commercial-off-the-shelf (COTS), UNIX-based workstations and processors. The architecture features the DoD common SAR processor (CSP), a multisensor computing platform that accommodates a variety of current and future imaging needs. ETRAC's principal functions include: (1) Mission planning and control -- ETRAC provides mission planning and control for the U-2R and ASARS-2 sensor, including capability for auto replanning, retasking, and immediate spot. (2) Image formation -- the image formation processor (IFP) provides the CPU-intensive processing capability to produce real-time imagery for all ASARS imaging modes of operation. (3) Image exploitation -- two exploitation workstations are provided for first-phase image exploitation, manipulation, and annotation. Products include INTEL reports, annotated NITF SID imagery, high-resolution hard-copy prints, and targeting data. ETRAC is transportable via two C-130 aircraft, with autonomous drive-on/off capability for high mobility. Other autonomous capabilities include rapid setup/teardown, extended stand-alone support, internal environmental control units (ECUs), and power generation. ETRAC's mission is to provide the Army field commander with accurate, reliable, and timely imagery intelligence derived from collections made by the ASARS-2 sensor, located on board the U-2R aircraft. To accomplish this mission, ETRAC receives video phase history (VPH) directly from the U-2R aircraft and converts it in real time into soft-copy imagery for immediate exploitation and dissemination to tactical users.

  7. Performance test and image correction of CMOS image sensor in radiation environment

    NASA Astrophysics Data System (ADS)

    Wang, Congzheng; Hu, Song; Gao, Chunming; Feng, Chang

    2016-09-01

    CMOS image sensors rival CCDs in domains that include strong radiation resistance as well as simple drive signals, so they are widely applied in high-energy radiation environments, such as space optical imaging and video monitoring of nuclear power equipment. However, the silicon material of CMOS image sensors suffers from the total ionizing dose effect under high-energy rays, and indicators of the image sensor, such as signal-to-noise ratio (SNR), non-uniformity (NU), and bad points (BP), are degraded by the radiation. The radiation environment for the test experiments was generated by a 60Co γ-ray source. A camera module based on the CMV2000 image sensor from CMOSIS Inc. was chosen as the research object, and the experiments used a dose rate of 20 krad/h. In the test experiments, the output signals of the image sensor's pixels were measured at different total doses. Data analysis showed that with the accumulation of irradiation dose, the SNR of the image sensor decreased, the NU increased, and the number of BPs increased. Correction of these indicators was necessary, as they are the main factors affecting image quality. An image-processing algorithm, which combined a local-threshold method with NU correction based on the non-local means (NLM) method, was applied to the experimental data. The results of image processing showed that the correction can effectively suppress BPs, improve the SNR, and reduce the NU.
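    A local-threshold bad-pixel repair of the kind the paper combines with NLM-based NU correction can be sketched as follows (an illustrative scheme with an assumed 3x3 neighbourhood and threshold factor, not the authors' code):

    ```python
    import numpy as np

    def repair_bad_pixels(img, k=3.0):
        # A pixel deviating from its 3x3 neighbourhood median by more than
        # k * (neighbourhood std) is declared bad and replaced by the median.
        out = img.astype(float).copy()
        for r in range(1, img.shape[0] - 1):
            for c in range(1, img.shape[1] - 1):
                nb = np.delete(img[r - 1:r + 2, c - 1:c + 2].astype(float).ravel(), 4)
                med = np.median(nb)
                if abs(img[r, c] - med) > k * nb.std() + 1e-9:
                    out[r, c] = med
        return out

    img = np.full((8, 8), 100.0)
    img[4, 4] = 255.0            # radiation-induced hot pixel
    fixed = repair_bad_pixels(img)
    ```

    Using the local neighbourhood rather than a global threshold keeps legitimate scene gradients intact while still catching isolated radiation-induced outliers.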

  8. High speed three-dimensional laser scanner with real time processing

    NASA Technical Reports Server (NTRS)

    Lavelle, Joseph P. (Inventor); Schuet, Stefan R. (Inventor)

    2008-01-01

    A laser scanner computes a range from a laser line to an imaging sensor. The laser line illuminates a detail within an area covered by the imaging sensor, the area having a first dimension and a second dimension. The detail has a dimension perpendicular to the area. A traverse moves a laser emitter, coupled to the imaging sensor, at a height above the area. The laser emitter is positioned at an offset along the scan direction with respect to the imaging sensor, and is oriented at a depression angle with respect to the area. The laser emitter projects the laser line along the second dimension of the area at a position where an image frame is acquired. The imaging sensor is sensitive to laser reflections from the detail produced by the laser line. The imaging sensor images the laser reflections from the detail to generate the image frame. A computer having a pipeline structure is connected to the imaging sensor for reception of the image frame, and for computing the range to the detail using the height, depression angle, and/or offset. The computer displays the range to the area and the detail thereon covered by the image frame.
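    The range geometry reduces to laser-line triangulation. A minimal height calculation under a flat-surface assumption (a sketch of the geometry only, not the patented processing pipeline):

    ```python
    import math

    def detail_height(line_shift, depression_angle_deg):
        # A surface detail of height z intercepts the oblique laser sheet
        # early, shifting the imaged line along the scan direction by
        # z / tan(theta); inverting gives z = shift * tan(theta).
        return line_shift * math.tan(math.radians(depression_angle_deg))

    # At a 45-degree depression angle the line shift equals the height.
    z = detail_height(2.0, 45.0)
    ```

    Steeper depression angles trade height sensitivity for occlusion: a larger theta gives a smaller line shift per unit height but fewer shadowed regions.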

  9. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    NASA Technical Reports Server (NTRS)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.

  10. A fiber-optic sensor based on no-core fiber and Faraday rotator mirror structure

    NASA Astrophysics Data System (ADS)

    Lu, Heng; Wang, Xu; Zhang, Songling; Wang, Fang; Liu, Yufang

    2018-05-01

    An optical fiber sensor based on the single-mode/no-core/single-mode (SNS) core-offset technology along with a Faraday rotator mirror structure has been proposed and experimentally demonstrated. A transverse optical field distribution of self-imaging has been simulated and experimental parameters have been selected under theoretical guidance. Results of the experiments demonstrate that the temperature sensitivity of the sensor is 0.0551 nm/°C for temperatures between 25 and 80 °C, and the correlation coefficient is 0.99582. The concentration sensitivity of the device for sucrose and glucose solutions was found to be as high as 12.5416 and 6.02248 nm/(g/ml), respectively. Curves demonstrating a linear fit between wavelength shift and solution concentration for three different heavy metal solutions have also been derived on the basis of experimental results. The proposed fiber-optic sensor design provides valuable guidance for the measurement of concentration and temperature.

  11. Noninvasive, three-dimensional full-field body sensor for surface deformation monitoring of human body in vivo

    NASA Astrophysics Data System (ADS)

    Chen, Zhenning; Shao, Xinxing; He, Xiaoyuan; Wu, Jialin; Xu, Xiangyang; Zhang, Jinlin

    2017-09-01

    Noninvasive, three-dimensional (3-D), full-field surface deformation measurements of the human body are important for biomedical investigations. We proposed a 3-D noninvasive, full-field body sensor based on stereo digital image correlation (stereo-DIC) for surface deformation monitoring of the human body in vivo. First, by applying an improved water-transfer printing (WTP) technique to transfer optimized speckle patterns onto the skin, the body sensor was conveniently and harmlessly fabricated directly onto the human body. Then, stereo-DIC was used to achieve 3-D noncontact and noninvasive surface deformation measurements. The accuracy and efficiency of the proposed body sensor were verified and discussed by considering different complexions. Moreover, the fabrication of speckle patterns on human skin, which has always been considered a challenging problem, was shown to be feasible, effective, and harmless as a result of the improved WTP technique. An application of the proposed stereo-DIC-based body sensor was demonstrated by measuring the pulse wave velocity of human carotid artery.

  12. Adaptive optics system application for solar telescope

    NASA Astrophysics Data System (ADS)

    Lukin, V. P.; Grigor'ev, V. M.; Antoshkin, L. V.; Botugina, N. N.; Emaleev, O. N.; Konyaev, P. A.; Kovadlo, P. G.; Krivolutskiy, N. P.; Lavrionova, L. N.; Skomorovski, V. I.

    2008-07-01

    The possibility of applying adaptive correction to ground-based solar astronomy is considered. Several experimental systems for image stabilization are described along with the results of their tests. Drawing on our work over several years and on worldwide experience in solar adaptive optics (AO), we expect to obtain first light by the end of 2008 for the first Russian low-order ANGARA solar AO system on the Big Solar Vacuum Telescope (BSVT), with a 37-subaperture Shack-Hartmann wavefront sensor based on our modified correlation-tracker algorithm, a DALSTAR video camera, a 37-element deformable bimorph mirror, and a home-made fast tip-tilt mirror with a separate correlation tracker. Daytime turbulence at the BSVT site is too strong, so we plan to obtain partial correction over part of the solar surface image.

  13. Flash LIDAR Systems for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Dissly, Richard; Weinberg, J.; Weimer, C.; Craig, R.; Earhart, P.; Miller, K.

    2009-01-01

    Ball Aerospace offers a mature, highly capable 3D flash-imaging LIDAR system for planetary exploration. Multi-mission applications include orbital, standoff, and surface terrain mapping; long-distance and rapid close-in ranging; descent and surface navigation; and rendezvous and docking. Our flash LIDAR is an optical, time-of-flight, topographic imaging system, leveraging innovations in focal plane arrays, real-time processing in the readout integrated circuit, and compact, efficient pulsed laser sources. Due to its modular design, it can be easily tailored to satisfy a wide range of mission requirements. Flash LIDAR offers several distinct advantages over traditional scanning systems. The entire scene within the sensor's field of view is imaged with a single laser flash. This directly produces an image with each pixel already correlated in time, making the sensor resistant to the relative motion of a target subject. Additionally, images may be produced at rates much faster than are possible with a scanning system. And because the system captures a new complete image with each flash, optical glint and clutter are easily filtered and discarded. This allows for imaging under any lighting condition and makes the system virtually insensitive to stray light. Finally, because there are no moving parts, our flash LIDAR system is highly reliable and has a long life expectancy. As an industry leader in laser active sensor system development, Ball Aerospace has been working for more than four years to mature flash LIDAR systems for space applications, and is now under contract to provide the Vision Navigation System for NASA's Orion spacecraft. Our system uses heritage optics and electronics from our star tracker products, and space-qualified lasers similar to those used in our CALIPSO LIDAR, which has been in continuous operation since 2006, providing more than 1.3 billion laser pulses to date.

  14. [Present status and trend of heart fluid mechanics research based on medical image analysis].

    PubMed

    Gan, Jianhong; Yin, Lixue; Xie, Shenghua; Li, Wenhua; Lu, Jing; Luo, Anguo

    2014-06-01

    After introducing the current main methods for heart fluid mechanics research, we examine the characteristics and weaknesses of three primary analysis methods based on magnetic resonance imaging, color Doppler ultrasound, and grayscale ultrasound imaging, respectively. It is pointed out that particle image velocimetry (PIV), speckle tracking, and block matching share the same nature: all three algorithms adopt block correlation. Further analysis shows that, with the development of information technology and sensors, future research on cardiac function and fluid mechanics will focus on the energy transfer process of heart fluid, characteristics of the chamber wall related to blood flow, and fluid-structure interaction.

  15. Design on the x-ray oral digital image display card

    NASA Astrophysics Data System (ADS)

    Wang, Liping; Gu, Guohua; Chen, Qian

    2009-10-01

    Based on the main characteristics of X-ray imaging, an X-ray display card was designed and debugged using the basic principle of correlated double sampling (CDS) combined with embedded computer technology. The CCD sensor drive circuit and the corresponding procedures were designed, along with the filtering and sample-and-hold circuits. Data exchange over the PC104 bus was implemented. Using a complex programmable logic device to provide gating and timing logic, the functions of counting, reading CPU control instructions, triggering exposure, and controlling the sample-and-hold were completed. Based on analysis of the resulting images and noise, the circuit components were adjusted, and high-quality images were obtained.

  16. Non-contact imaging of venous compliance in humans using an RGB camera

    NASA Astrophysics Data System (ADS)

    Nakano, Kazuya; Satoh, Ryota; Hoshi, Akira; Matsuda, Ryohei; Suzuki, Hiroyuki; Nishidate, Izumi

    2015-04-01

    We propose a technique for non-contact imaging of venous compliance that uses a red, green, and blue (RGB) camera. Any change in blood concentration is estimated from an RGB image of the skin, and a regression formula is calculated from that change. Venous compliance is obtained from a differential form of the regression formula. In vivo experiments with human subjects confirmed that the proposed method does differentiate venous compliance among individuals. In addition, an image of venous compliance is obtained by performing the above procedures for each pixel. Thus, we can measure venous compliance without physical contact with sensors and, from the resulting images, observe the spatial distribution of venous compliance, which correlates with the distribution of veins.

  17. Evaluation of Sun Glint Correction Algorithms for High-Spatial Resolution Hyperspectral Imagery

    DTIC Science & Technology

    2012-09-01

    ACRONYMS AND ABBREVIATIONS AISA Airborne Imaging Spectrometer for Applications AVIRIS Airborne Visible/Infrared Imaging Spectrometer BIL Band...sensor bracket mount combining Airborne Imaging Spectrometer for Applications ( AISA ) Eagle and Hawk sensors into a single imaging system (SpecTIR 2011...The AISA Eagle is a VNIR sensor with a wavelength range of approximately 400–970 nm and the AISA Hawk sensor is a SWIR sensor with a wavelength

  18. Application and evaluation of ISVR method in QuickBird image fusion

    NASA Astrophysics Data System (ADS)

    Cheng, Bo; Song, Xiaolu

    2014-05-01

    QuickBird satellite images are widely used in many fields, and applications have put forward high requirements for integrating the spatial and spectral information of the imagery. A fusion method for high-resolution remote sensing images based on ISVR is identified in this study. The core principle of ISVR is to take advantage of conversion to radiance to remove the effect of the differing gains and errors of the satellites' sensors. After transformation from DN to radiance, the multispectral image's energy is used to simulate the panchromatic band. Linear regression analysis is carried out in the simulation process to find a new synthetic panchromatic image that is highly linearly correlated with the original panchromatic image. In order to evaluate, test and compare the algorithm results, this paper used ISVR and two other fusion methods in a comparative study of spatial and spectral information, taking the average gradient and the correlation coefficient as indicators. Experiments showed that this method can significantly improve the quality of the fused image, especially in preserving spectral information, maximizing the spectral information of the original multispectral images while maintaining abundant spatial information.
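
    The central step, simulating a panchromatic band from the multispectral radiances via linear regression against the real panchromatic image, can be sketched with ordinary least squares. This is a generic illustration on synthetic data; the band count and mixing weights are invented, and the paper's exact ISVR formulation is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)
h, w, bands = 16, 16, 4
ms = rng.random((h, w, bands))             # multispectral radiance image

# Pretend the "true" panchromatic image is a fixed mix of the MS bands.
true_w = np.array([0.1, 0.3, 0.4, 0.2])
pan = ms @ true_w

# Least-squares band weights so the simulated pan matches the real one.
A = ms.reshape(-1, bands)
coef, *_ = np.linalg.lstsq(A, pan.ravel(), rcond=None)
pan_sim = (A @ coef).reshape(h, w)

corr = np.corrcoef(pan_sim.ravel(), pan.ravel())[0, 1]
print(corr > 0.999)  # → True (perfect by construction here)
```

    In practice the fit is done between co-registered real images, so the correlation is high but below 1; that residual is exactly what the fusion quality metrics in the paper measure.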

  19. Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors.

    PubMed

    Ge, Xiaoliang; Theuwissen, Albert J P

    2018-02-27

    This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated double sampling (CDS) technique has been proposed to efficiently embed a tunable conversion gain along the readout path. Such a readout topology, however, operates with non-stationary large-signal behavior, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristics of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models.

  20. Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors †

    PubMed Central

    Theuwissen, Albert J. P.

    2018-01-01

    This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated double sampling (CDS) technique has been proposed to efficiently embed a tunable conversion gain along the readout path. Such a readout topology, however, operates with non-stationary large-signal behavior, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristics of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models. PMID:29495496

  1. Smart sensors II; Proceedings of the Seminar, San Diego, CA, July 31, August 1, 1980

    NASA Astrophysics Data System (ADS)

    Barbe, D. F.

    1980-01-01

    Topics discussed include technology for smart sensors, smart sensors for tracking and surveillance, and techniques and algorithms for smart sensors. Papers are presented on the application of very large scale integrated circuits to smart sensors, imaging charge-coupled devices for deep-space surveillance, ultra-precise star tracking using charge coupled devices, and automatic target identification of blurred images with super-resolution features. Attention is also given to smart sensors for terminal homing, algorithms for estimating image position, and the computational efficiency of multiple image registration algorithms.

  2. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel.

    PubMed

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-11-01

    A CMOS image sensor-based implantable glucose sensor employing an optical sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments.

  3. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    NASA Astrophysics Data System (ADS)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor's properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling - with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images - allows for adaptation to each sensor's geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate for radiometric differences of various origins, making up for shortcomings of the preceding radiometric sensor calibration as well as BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image's histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points, with bilinear interpolation for corrections in-between. The distribution of the radiometric fix points is adaptive to each image and generally increases with image size, hence enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in HxMap software. It has been successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing as well as radiometric adjustment for ortho-image mosaic generation.
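
    A drastically simplified stand-in for the relative normalization idea is fitting a least-squares gain and offset that maps one image's radiometric tie-point values onto an overlapping reference image (the paper's hierarchical, location-dependent global adjustment is far richer than this two-image sketch):

```python
import numpy as np

# Tie points: the same ground patches observed in image A and image B.
rng = np.random.default_rng(4)
true_radiance = rng.random(50) * 100
obs_a = 1.00 * true_radiance + 0.0     # reference image
obs_b = 0.85 * true_radiance + 12.0    # image with gain/offset drift

# Least-squares gain and offset mapping B's tie-point values onto A's.
A = np.column_stack([obs_b, np.ones_like(obs_b)])
(gain, offset), *_ = np.linalg.lstsq(A, obs_a, rcond=None)
corrected_b = gain * obs_b + offset

# After correction, the two images agree at the tie points.
print(np.abs(corrected_b - obs_a).max() < 1e-6)  # → True
```

    A full radiometric aerial triangulation solves for all images' parameters simultaneously in one global adjustment, with extra unknowns per session, strip, and fix point.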

  4. Roughness and pH changes of enamel surface induced by soft drinks in vitro-applications of stylus profilometry, focus variation 3D scanning microscopy and micro pH sensor.

    PubMed

    Fujii, Mie; Kitasako, Yuichi; Sadr, Alireza; Tagami, Junji

    2011-01-01

    This study aimed to evaluate enamel surface roughness (Ra) and pH before and after erosion by soft drinks. Enamel was exposed to a soft drink (cola, orange juice or green tea) for 1, 5 or 60 min; Ra was measured using contact-stylus surface profilometry (SSP) and non-contact focus variation 3D microscope (FVM). Surface pH was measured using a micro pH sensor. Data were analyzed at significance level of alpha=0.05. There was a significant correlation in Ra between SSP and FVM. FVM images showed no changes in the surface morphology after various periods of exposure to green tea. Unlike cola and orange juice, exposure to green tea did not significantly affect Ra or pH. A significant correlation was observed between surface pH and Ra change after exposure to the drinks. Optical surface analysis and micro pH sensor may be useful tools for non-damaging, quantitative assessment of soft drinks erosion on enamel.

  5. Compensated Row-Column Ultrasound Imaging System Using Fisher Tippett Multilayered Conditional Random Field Model

    PubMed Central

    Ben Daya, Ibrahim; Chen, Albert I. H.; Shafiee, Mohammad Javad; Wong, Alexander; Yeow, John T. W.

    2015-01-01

    3-D ultrasound imaging offers unique opportunities in the field of non-destructive testing that cannot easily be found in A-mode and B-mode images. To acquire a 3-D ultrasound image without a mechanically moving transducer, a 2-D array can be used. The row-column technique is preferred over a fully addressed 2-D array as it requires a significantly lower number of interconnections. Recent advances in 3-D row-column ultrasound imaging systems have largely focused on sensor design. However, these imaging systems face three intrinsic challenges that cannot be addressed by improving sensor design alone: speckle noise, sparsity of data in the imaged volume, and the spatially dependent point spread function of the imaging system. In this paper, we propose a compensated row-column ultrasound image reconstruction system using a Fisher-Tippett multilayered conditional random field model. Tests carried out on both simulated and real row-column ultrasound images show the effectiveness of our proposed system compared to other published systems. Visual assessment of the results shows our proposed system's potential at preserving detail and reducing speckle. Quantitative analysis shows that our proposed system outperforms previously published systems when evaluated with metrics such as Peak Signal to Noise Ratio, Coefficient of Correlation, and Effective Number of Looks. These results show the potential of our proposed system as an effective tool for enhancing 3-D row-column imaging. PMID:26658577

  6. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.

  7. Microwave Sensors for Breast Cancer Detection

    PubMed Central

    2018-01-01

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rate significantly. Microwave breast imaging has been reported as having the most potential to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensor; indeed, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improved image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed. PMID:29473867

  8. Microwave Sensors for Breast Cancer Detection.

    PubMed

    Wang, Lulu

    2018-02-23

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rate significantly. Microwave breast imaging has been reported as having the most potential to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensor; indeed, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improved image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed.

  9. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    PubMed

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    In determining position and attitude, vision navigation via real-time image processing of data collected from imaging sensors is advanced without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image searches and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is considered to calculate the 3D navigation parameter of multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and has navigation accuracies of 1.2 m in a plane and 1.8 m in height under GPS loss in 5 min and within 1500 m.

  10. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database

    PubMed Central

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-01

    In determining position and attitude, vision navigation via real-time image processing of data collected from imaging sensors is advanced without a high-performance global positioning system (GPS) and an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image searches and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is considered to calculate the 3D navigation parameter of multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and has navigation accuracies of 1.2 m in a plane and 1.8 m in height under GPS loss in 5 min and within 1500 m. PMID:26828496

  11. NRL Fact Book

    DTIC Science & Technology

    1988-05-01

    interests are centered on signal processing and the physics of underwater acoustic propagation, ambient noise, and reverberation. Mr. Rojas is a member...Airborne underwater acoustics Bottom-limited acoustics Arctic underwater acoustics Propagation Noise Ambient noise measurements and modeling Spectral...Multispectral image correlation Space sensor and mission analysis [figure residue: time-history of radar backscatter]

  12. A Disposable Tear Glucose Biosensor-Part 4: Preliminary Animal Model Study Assessing Efficacy, Safety, and Feasibility.

    PubMed

    La Belle, Jeffrey T; Engelschall, Erica; Lan, Kenneth; Shah, Pankti; Saez, Neil; Maxwell, Stephanie; Adamson, Teagan; Abou-Eid, Michelle; McAferty, Kenyon; Patel, Dharmendra R; Cook, Curtiss B

    2014-01-01

    A prototype tear glucose (TG) sensor was tested in New Zealand white rabbits to assess eye irritation, blood glucose (BG) and TG lag time, and correlation with BG. A total of 4 animals were used. Eye irritation was monitored by Lissamine green dye and analyzed using image analysis software. Lag time was assessed against an oral glucose load while recording TG and BG readings. TG and BG values were plotted against one another to form a correlation diagram, using a Yellow Springs Instrument (YSI) and self-monitoring of blood glucose as the reference measurements. Finally, TG levels were calculated using analytically derived expressions. From repeated testing carried out over the course of 12 months, little to no eye irritation was detected. TG fluctuations over time visually appeared to trace the same pattern as BG, with an average lag time of 13 minutes. TG levels calculated from the device current measurements ranged from 4 to 20 mg/dL and correlated linearly with BG levels of 75-160 mg/dL (TG = 0.1723 BG − 7.9448 mg/dL; R² = 0.7544). The first steps were taken toward preliminary development of a sensor for self-monitoring of tear glucose (SMTG). No conjunctival irritation in any of the animals was noted. The lag time between TG and BG was found to be noticeable, but quantitative modeling to correlate the lag time was deemed unnecessary in this study. Measured currents from the sensors and the calculated TG showed promising correlation with BG levels. Previous analytical benchmarking showed BG and TG levels consistent with other literature. © 2014 Diabetes Technology Society.
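
    Taking the reported fit as TG = 0.1723·BG − 7.9448 mg/dL (the minus sign inferred here from the reported TG range of 4-20 mg/dL against BG of 75-160 mg/dL), the regression can be checked with a few lines of arithmetic:

```python
def tear_glucose(bg_mg_dl):
    """Estimated tear glucose (mg/dL) from blood glucose via the reported fit."""
    return 0.1723 * bg_mg_dl - 7.9448

# The reported BG range 75-160 mg/dL maps onto roughly the reported TG 4-20.
print(round(tear_glucose(75), 1), round(tear_glucose(160), 1))  # → 5.0 19.6
```

    The endpoints landing near 5 and 20 mg/dL is what makes the negative intercept the only reading of the garbled published formula consistent with the stated ranges.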

  13. High speed imaging for assessment of impact damage in natural fibre biocomposites

    NASA Astrophysics Data System (ADS)

    Ramakrishnan, Karthik Ram; Corn, Stephane; Le Moigne, Nicolas; Ienny, Patrick; Leger, Romain; Slangen, Pierre R.

    2017-06-01

    The use of Digital Image Correlation has generally been limited to the estimation of mechanical properties and fracture behaviour at low to moderate strain rates. High-speed cameras dedicated to ballistic testing are often used to measure the initial and residual velocities of the projectile but rarely for damage assessment. The evaluation of impact damage is frequently carried out post-impact using visual inspection, ultrasonic C-scan or other NDI methods. Ultra-high-speed cameras and developments in image processing have made possible the measurement of surface deformations and stresses in real time during dynamic cracking. In this paper, a method is presented to correlate the force-displacement data from the sensors with the slow-motion tracking of the transient failure cracks using real-time high-speed imaging. A natural fibre reinforced composite made of flax fibres and a polypropylene matrix was chosen for the study. The creation of macro-cracks during the impact results in a loss of stiffness and a corresponding drop in the force history. However, optical instrumentation shows that the initiation of damage is not always evident, so the assessment of damage requires the use of a local approach. Digital Image Correlation is used to study the strain history of the composite and to identify the initiation and progression of damage. The effect of fly-speckled texture on strain measurement by image correlation is also studied. The developed method can be used for the evaluation of impact damage in different composite materials.

  14. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    PubMed

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembling of cell groups on CMOS sensor surface allows large-field (6.66 mm×5.32 mm in entire active area of CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time and high-content analysis of single cells in a large-field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel

    PubMed Central

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-01-01

    A CMOS image sensor-based implantable glucose sensor employing an optical sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments. PMID:25426316

  16. Source-space ICA for MEG source imaging.

    PubMed

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) on the component extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both high spatial resolution of beamformer and be able to take on multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA both in simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli were also used to compare performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  17. Quantification of Water Quality Parameters for the Wabash River Using Hyperspectral Remote Sensing

    NASA Astrophysics Data System (ADS)

    Tan, J.; Cherkauer, K. A.; Chaubey, I.

    2011-12-01

    Increasingly impaired water bodies in the agriculturally dominated Midwestern United States pose a risk to water supplies, aquatic ecology and contribute to the eutrophication of the Gulf of Mexico. Improving regional water quality calls for new techniques for monitoring and managing water quality over large river systems. Optical indicators of water quality enable a timely and cost-effective method for observing and quantifying water quality conditions by remote sensing. Compared to broad spectral sensors such as Landsat, which observe reflectance over limited spectral bands, hyperspectral sensors should have significant advantages in their ability to estimate water quality parameters because they are designed to split the spectral signature into hundreds of very narrow spectral bands increasing their ability to resolve optically sensitive water quality indicators. Two airborne hyperspectral images were acquired over the Wabash River using a ProSpecTIR-VS2 sensor system on May 15th, 2010. These images were analyzed together with concurrent in-stream water quality data collected to assess our ability to extract optically sensitive constituents. Utilizing the correlation between in-stream data and reflectance from the hyperspectral images, models were developed to estimate the concentrations of chlorophyll a, dissolved organic carbon and total suspended solids. Models were developed using the full array of hyperspectral bands, as well as Landsat bands synthesized by averaging hyperspectral bands within the Landsat spectral range. Higher R2 and lower RMSE values were found for the models taking full advantage of the hyperspectral sensor, supporting the conclusion that the hyperspectral sensor was better at predicting the in-stream concentrations of chlorophyll a, dissolved organic carbon and total suspended solids in the Wabash River. Results also suggest that predictive models may not be the same for the Wabash River as for its tributaries.

  18. Physical Interpretation of the Correlation Between Multi-Angle Spectral Data and Canopy Height

    NASA Technical Reports Server (NTRS)

    Schull, M. A.; Ganguly, S.; Samanta, A.; Huang, D.; Shabanov, N. V.; Jenkins, J. P.; Chiu, J. C.; Marshak, A.; Blair, J. B.; Myneni, R. B.

    2007-01-01

    Recent empirical studies have shown that multi-angle spectral data can be useful for predicting canopy height, but the physical reason for this correlation was not understood. We follow the concept of canopy spectral invariants, specifically escape probability, to gain insight into the observed correlation. Airborne Multi-Angle Imaging Spectrometer (AirMISR) and airborne Laser Vegetation Imaging Sensor (LVIS) data acquired during a NASA Terrestrial Ecology Program aircraft campaign underlie our analysis. Two multivariate linear regression models were developed to estimate LVIS height measures from 28 AirMISR multi-angle spectral reflectances and from the spectrally invariant escape probability at 7 AirMISR view angles. Both models achieved nearly the same accuracy, suggesting that canopy spectral invariant theory can explain the observed correlation. We hypothesize that the escape probability is sensitive to the aspect ratio (crown diameter to crown height). The multi-angle spectral data alone therefore may not provide enough information to retrieve canopy height globally.

  19. Beam imaging sensor and method for using same

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McAninch, Michael D.; Root, Jeffrey J.

    The present invention relates generally to the field of sensors for beam imaging and, in particular, to a new and useful beam imaging sensor for use in determining, for example, the power density distribution of a beam including, but not limited to, an electron beam or an ion beam. In one embodiment, the beam imaging sensor of the present invention comprises, among other items, a circumferential slit that is either circular, elliptical or polygonal in nature. In another embodiment, the beam imaging sensor of the present invention comprises, among other things, a discontinuous partially circumferential slit. Also disclosed is a method for using the various beam sensor embodiments of the present invention.

  20. Analysis on the Effect of Sensor Views in Image Reconstruction Produced by Optical Tomography System Using Charge-Coupled Device.

    PubMed

    Jamaludin, Juliza; Rahim, Ruzairi Abdul; Fazul Rahiman, Mohd Hafiz; Mohd Rohani, Jemmy

    2018-04-01

    Optical tomography (OPT) is a method for capturing a cross-sectional image from data obtained by sensors distributed around the periphery of the analyzed system. The system is based on measuring the final light attenuation, or absorption of radiation, after the light crosses the measured objects. The number of sensor views affects the result of image reconstruction: a higher number of sensor views per projection gives higher image quality. This research presents an application of a charge-coupled device (CCD) linear sensor and a laser diode in an OPT system. Experiments in detecting solid and transparent objects in crystal-clear water were conducted. Two numbers of sensor views, 160 and 320, were evaluated for reconstructing the images. The image reconstruction algorithm used was a filtered linear back-projection algorithm. Comparison of the simulated and experimental image results shows that 320 views give a smaller area error than 160 views, suggesting that a higher number of views yields higher-resolution image reconstruction.
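The back-projection step underlying this kind of reconstruction can be illustrated with a minimal parallel-beam sketch. This is not the authors' implementation: the geometry, the normalisation, and the omission of the filtering step are simplifying assumptions made here for brevity. Each view's attenuation profile is simply smeared back across the image along its ray direction.

```python
import numpy as np

def linear_back_projection(sinogram, angles, n=64):
    """Reconstruct an n x n image from parallel-beam projections.

    sinogram: (n_views, n_detectors) attenuation measurements
    angles:   view angles in radians, one per sinogram row
    """
    n_views, n_det = sinogram.shape
    # pixel-centre coordinates, normalised to [-1, 1]
    xs = np.linspace(-1, 1, n)
    X, Y = np.meshgrid(xs, xs)
    image = np.zeros((n, n))
    det = np.linspace(-1, 1, n_det)
    for a, proj in zip(angles, sinogram):
        # signed distance of each pixel from the central ray of view `a`
        t = X * np.cos(a) + Y * np.sin(a)
        # smear this view's projection values back along their rays
        image += np.interp(t, det, proj)
    return image / n_views
```

With a synthetic sinogram of a centred disc, the reconstruction peaks at the image centre, as expected; more views reduce the streaking that back-projection alone leaves behind.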

  1. Retrieval Algorithm for Broadband Albedo at the Top of the Atmosphere

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Ho; Lee, Kyu-Tae; Kim, Bu-Yo; Zo, ll-Sung; Jung, Hyun-Seok; Rim, Se-Hun

    2018-05-01

    The objective of this study is to develop an algorithm that retrieves the broadband albedo at the top of the atmosphere (TOA albedo) for radiation-budget and climate analysis of Earth's atmosphere using Geostationary Korea Multi-Purpose Satellite/Advanced Meteorological Imager (GK-2A/AMI) data. Because the GK-2A satellite will launch in 2018, we used data from the Japanese weather satellite Himawari-8 and its onboard sensor, the Advanced Himawari Imager (AHI), which has sensor properties and an observation area similar to those of GK-2A. TOA albedo was retrieved from the reflectances and regression coefficients of AHI shortwave channels 1 to 6. The regression coefficients were calculated using the results of a radiative transfer model (SBDART) and ridge regression. SBDART was used to simulate the correlation between TOA albedo and the reflectance of each channel under each atmospheric condition (solar zenith angle, viewing zenith angle, relative azimuth angle, surface type, and absence/presence of clouds). The TOA albedo from Himawari-8/AHI was compared to that from the Clouds and the Earth's Radiant Energy System (CERES) sensor on the National Aeronautics and Space Administration (NASA) Terra satellite. The correlation coefficients between the two datasets, computed for the week containing the first day of each month between 1 August 2015 and 1 July 2016, were high, ranging between 0.934 and 0.955, with root mean square errors in the 0.053-0.068 range.
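The retrieval step — ridge-regressing TOA albedo onto the six shortwave channel reflectances — can be sketched as follows. This is an illustrative stand-in only: the reflectance values, the "true" coefficients, and the regularisation strength below are invented, not taken from SBDART simulations or AHI data.

```python
import numpy as np

def ridge_fit(X, y, lam=1e-2):
    """Closed-form ridge regression: minimises ||Xw - y||^2 + lam * ||w||^2."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Hypothetical stand-in for radiative-transfer output: six channel
# reflectances per sample and a simulated TOA albedo.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 0.8, size=(500, 6))            # channels 1-6 reflectance
true_w = np.array([0.25, 0.20, 0.15, 0.15, 0.15, 0.10])
y = X @ true_w + rng.normal(0, 0.005, 500)          # TOA albedo + noise

w = ridge_fit(X, y)
albedo_est = X @ w
```

The ridge penalty stabilises the solution when the channel reflectances are highly correlated with one another, which is the usual motivation for preferring it over ordinary least squares in this setting.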

  2. Integration of piezo-capacitive and piezo-electric nanoweb based pressure sensors for imaging of static and dynamic pressure distribution.

    PubMed

    Jeong, Y J; Oh, T I; Woo, E J; Kim, K J

    2017-07-01

    Recently, highly flexible and soft pressure-distribution imaging sensors have been in great demand for tactile sensing, gait analysis, ubiquitous life care based on activity recognition, and therapeutics. In this study, we integrate piezo-capacitive and piezo-electric nanowebs with conductive fabric sheets for detecting static and dynamic pressure distributions over a large sensing area. Electrical impedance tomography (EIT) and electric source imaging are applied to reconstruct pressure-distribution images from current-voltage data measured on the boundary of the hybrid fabric sensor. We evaluated the piezo-capacitive nanoweb sensor, the piezo-electric nanoweb sensor, and the hybrid fabric sensor. The results show the feasibility of static and dynamic pressure-distribution imaging from boundary measurements of the fabric sensors.

  3. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    PubMed

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance-stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct this mismatch in tail behavior. Based on the fact that noise model mismatch results in denoising that undersmoothes real sensor data, we propose a mixture-of-Poisson denoising method to remove denoising artifacts without affecting image details such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
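The tail-mismatch argument can be illustrated numerically with a toy sketch. The component rates and weights below are invented; the paper's actual mixture model is fitted to real sensor data. The point is simply that a two-component Poisson mixture with the same mean as a single Poisson has a markedly heavier upper tail.

```python
import numpy as np

rng = np.random.default_rng(1)

# Single Poisson model vs. a two-component Poisson mixture with the
# same overall mean -- the mixture produces heavier tails, as the
# paper argues real sensor noise does.
mean = 20.0
single = rng.poisson(mean, 100_000)

weights = np.array([0.9, 0.1])
rates = np.array([18.0, 38.0])        # mixture mean = 0.9*18 + 0.1*38 = 20
comp = rng.choice(2, size=100_000, p=weights)
mixture = rng.poisson(rates[comp])

# Compare upper-tail quantiles of the two samples.
q_single = np.quantile(single, 0.999)
q_mix = np.quantile(mixture, 0.999)
```

A denoiser tuned to the single-Poisson tail would treat the mixture's legitimate bright samples as outliers, which is the undersmoothing/artifact mechanism the abstract describes.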

  4. A preliminary comparison of Landsat Thematic Mapper and SPOT-1 HRV multispectral data for estimating coniferous forest volume

    NASA Technical Reports Server (NTRS)

    Ripple, William J.; Wang, S.; Isaacson, Dennis L.; Paine, D. P.

    1995-01-01

    Digital Landsat Thematic Mapper (TM) and Système Probatoire d'Observation de la Terre (SPOT) High Resolution Visible (HRV) images of coniferous forest canopies were compared in their relationship to forest wood volume using correlation and regression analyses. Significant inverse relationships were found between softwood volume and the spectral bands from both sensors (P less than 0.01). The highest correlations were between the log of softwood volume and the near-infrared bands (HRV band 3, r = -0.89; TM band 4, r = -0.83).

  5. Long-term stability of GOES-8 and -9 attitude control

    NASA Astrophysics Data System (ADS)

    Carr, James L.

    1996-10-01

    An independent audit of the in-orbit behavior of the GOES-8 and GOES-9 satellites has been conducted for NASA/GSFC. This audit utilized star and landmark observations from the GOES imager to determine long-term histories for spacecraft attitude, orbital position, and instrument internal misalignments. The paper presents results from this audit. Long-term drifts are found in the attitude histories, whereas the misalignment histories are shown to be diurnally stable. The GOES image navigation and registration system is designed to compensate for instrument internal misalignments and for both the diurnally repeatable and drift components of the attitude. Correlations between GOES-8 and GOES-9 long-term roll and pitch drifts implicate the Earth sensor as the origin of these observed drifts. This result clearly demonstrates the enhanced registration stability to be obtained with stellar inertial attitude determination replacing or supplementing Earth sensor control on future GOES missions.

  6. Comparisons between wave directional spectra from SAR and pressure sensor arrays

    NASA Technical Reports Server (NTRS)

    Pawka, S. S.; Inman, D. L.; Hsiao, S. V.; Shemdin, O. H.

    1980-01-01

    Simultaneous directional wave measurements were made at Torrey Pines Beach, California, by a synthetic aperture radar (SAR) and a linear array of pressure sensors. The measurements were conducted during the West Coast Experiment in March 1977. Quantitative comparisons of the normalized directional spectra from the two systems were made for wave periods of 6.9-17.0 s. The comparison results were variable but generally showed good agreement of the primary mode of the normalized directional energy. An attempt was made to quantify the physical criteria for good wave imaging in the SAR. A frequency band analysis of wave parameters such as band energy, slope, and orbital velocity did not show good correlation with the directional comparisons. It is noted that absolute values of the wave height spectrum cannot be derived from the SAR images yet and, consequently, no comparisons of absolute energy levels with corresponding array measurements were intended.

  7. Apparatus and method for a light direction sensor

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B. (Inventor)

    2011-01-01

    The present invention provides a light direction sensor for determining the direction of a light source. The system includes an image sensor; a spacer attached to the image sensor; and a pattern mask attached to the spacer. The pattern mask has a slit pattern such that light passing through it casts a diffraction pattern onto the image sensor. The method operates by receiving a beam of light onto a patterned mask, wherein the patterned mask has a plurality of slit segments, then diffracting the beam of light onto an image sensor and determining the direction of the light source.

  8. Performance Evaluation Modeling of Network Sensors

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Jennings, Esther H.; Gao, Jay L.

    2003-01-01

    Substantial benefits are promised by operating many spatially separated sensors collectively. Such systems are envisioned to consist of sensor nodes that are connected by a communications network. A simulation tool is being developed to evaluate the performance of networked sensor systems, incorporating such metrics as target detection probabilities, false alarm rates, and classification confusion probabilities. The tool will be used to determine configuration impacts associated with such aspects as spatial laydown, mixture of different types of sensors (acoustic, seismic, imaging, magnetic, RF, etc.), and fusion architecture. The QualNet discrete-event simulation environment serves as the underlying basis for model development and execution. This platform is recognized for its capabilities in efficiently simulating networking among mobile entities that communicate via wireless media. We are extending QualNet's communications modeling constructs to capture the sensing aspects of multi-target sensing (analogous to multiple-access communications), unimodal multi-sensing (broadcast), and multi-modal sensing (multiple channels and correlated transmissions). Methods are also being developed for modeling the sensor signal sources (transmitters), signal propagation through the media, and sensors (receivers) in a manner consistent with the discrete-event paradigm needed for performance determination of sensor network systems. This work is supported under the Microsensors Technical Area of the Army Research Laboratory (ARL) Advanced Sensors Collaborative Technology Alliance.

  9. Leg edema quantification for heart failure patients via 3D imaging.

    PubMed

    Hayn, Dieter; Fruhwald, Friedrich; Riedel, Arthur; Falgenhauer, Markus; Schreier, Günter

    2013-08-14

    Heart failure is a common cardiac disease in elderly patients. After discharge, approximately 50% of all patients are readmitted to a hospital within six months. Recent studies show that home monitoring of heart failure patients can reduce the number of readmissions. Still, a large number of false positive alarms, as well as underdiagnoses in other cases, calls for more accurate alarm generation algorithms. New low-cost sensors for leg edema detection could be the missing link to help home monitoring to its breakthrough. We evaluated a 3D camera-based measurement setup to geometrically detect and quantify leg edemas. 3D images of legs were taken and geometric parameters were extracted semi-automatically from the images. Intra-subject variability was evaluated for five healthy subjects. Thereafter, the correlation of 3D parameters with body weight and leg circumference was assessed during a clinical study at the Medical University of Graz. Strong correlation was found between both reference values and instep height, while the correlation between the curvature of the lower leg and the references was very low. We conclude that 3D imaging might be a useful and cost-effective extension of home monitoring for heart failure patients, though further (prospective) studies are needed.

  10. Satellite Ocean Color Sensor Design Concepts and Performance Requirements

    NASA Technical Reports Server (NTRS)

    McClain, Charles R.; Meister, Gerhard; Monosmith, Bryan

    2014-01-01

    In late 1978, the National Aeronautics and Space Administration (NASA) launched the Nimbus-7 satellite with the Coastal Zone Color Scanner (CZCS) and several other sensors, all of which provided major advances in Earth remote sensing. The inspiration for the CZCS is usually attributed to an article in Science by Clarke et al., who demonstrated that large changes in open ocean spectral reflectance are correlated to chlorophyll-a concentrations. Chlorophyll-a is the primary photosynthetic pigment in green plants (marine and terrestrial) and is used in estimating primary production, i.e., the amount of carbon fixed into organic matter during photosynthesis. Thus, accurate estimates of global and regional primary production are key to studies of the Earth's carbon cycle. Because the investigators used an airborne radiometer, they were able to demonstrate the increased radiance contribution of the atmosphere with altitude that would be a major issue for spaceborne measurements. Since 1978, there has been much progress in satellite ocean color remote sensing such that the technique is well established and is used for climate change science and routine operational environmental monitoring. Also, the science objectives and accompanying methodologies have expanded and evolved through a succession of global missions, e.g., the Ocean Color and Temperature Sensor (OCTS), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), the Moderate Resolution Imaging Spectroradiometer (MODIS), the Medium Resolution Imaging Spectrometer (MERIS), and the Global Imager (GLI). With each advance in science objectives, new and more stringent requirements for sensor capabilities (e.g., spectral coverage) and performance (e.g., signal-to-noise ratio, SNR) are established. The CZCS had four bands for chlorophyll and aerosol corrections.
The Ocean Color Imager (OCI) recommended for the NASA Pre-Aerosol, Cloud, and Ocean Ecosystems (PACE) mission includes hyperspectral coverage from 350 to 800 nanometers at 5 nanometer resolution, with three additional discrete near-infrared (NIR) and shortwave infrared (SWIR) bands for ocean aerosol corrections. Also, to avoid drift in sensor sensitivity being interpreted as environmental change, climate change research requires rigorous monitoring of sensor stability. For SeaWiFS, monthly lunar imaging tracked stability at an accuracy of approximately 0.1%, which allowed the data to be used for climate studies [2]. It is now acknowledged by the international community that future missions and sensor designs need to accommodate lunar calibrations. A review of the progress made in ocean color remote sensing and of the variety of research applications derived from global satellite ocean color data is provided. The purpose of this chapter is to discuss the design options for ocean color satellite radiometers, performance and testing criteria, and the sensor components (optics, detectors, electronics, etc.) that must be integrated into an instrument concept. These ultimately dictate the quality and quantity of data that can be delivered as a trade against mission cost. Historically, science and sensor technology have advanced in a "leap-frog" manner: sensor design requirements for a mission are defined many years before the sensor is launched, and by the end of the mission, perhaps 15-20 years later, science applications and requirements are well beyond the capabilities of the sensor. Section 3 provides a summary of historical mission science objectives and sensor requirements. This progression is expected to continue as long as sensor costs can be constrained to affordable levels while still allowing the incorporation of new technologies without incurring unacceptable risk to mission success.
The IOCCG Report Number 13 discusses future ocean biology mission Level-1 requirements in depth.

  11. New Optical Sensing Materials for Application in Marine Research

    NASA Astrophysics Data System (ADS)

    Borisov, S.; Klimant, I.

    2012-04-01

    Optical chemosensors are versatile analytical tools that find application in numerous fields of science and technology. They have proved to be a promising alternative to electrochemical methods and are applied increasingly often in marine research. However, not all state-of-the-art optical chemosensors are suitable for these demanding applications, since they do not fully fulfil the requirements of high luminescence brightness and high chemical and photochemical stability, or their spectral properties are not adequate. Therefore, the development of new advanced sensing materials remains of utmost importance. Here we present a set of novel optical sensing materials recently developed at the Institute of Analytical Chemistry and Food Chemistry which are optimized for marine applications. In particular, we present new NIR indicators and sensors for oxygen and pH which feature high brightness and a low level of autofluorescence. The oxygen sensors rely on highly photostable metal complexes of benzoporphyrins and azabenzoporphyrins and enable several important applications, such as simultaneous monitoring of oxygen and chlorophyll or ultra-fast oxygen monitoring (eddy correlation). We have also developed ultra-sensitive oxygen optodes which enable monitoring in the nM range and are primarily designed for the investigation of oxygen minimum zones. The dynamic range of our new NIR pH indicators based on aza-BODIPY dyes is optimized for the marine environment. A highly sensitive NIR luminescent phosphor (chromium(III)-doped yttrium aluminium borate) can be used for non-invasive temperature measurements. Notably, the oxygen, pH, and temperature sensors are fully compatible with commercially available fiber-optic readers (Firesting from PyroScience). An optical CO2 sensor for marine applications employs novel diketopyrrolopyrrole indicators and enables ratiometric imaging using a CCD camera.
Oxygen, pH, and temperature sensors suitable for lifetime and ratiometric imaging of analyte distributions have also been realized. To enable versatility of application, we have also obtained a range of nano- and microparticles suitable for intra- and extracellular imaging of the above analytes. Bright ratiometric two-photon-excitable probes were also developed. Magnetic microparticles are demonstrated to be very promising tools for imaging of oxygen, temperature, and other parameters in biofilms, corals, etc., since they combine the sensing function with the possibility of external manipulation.

  12. Towards real-time topical detection and characterization of FDG dose infiltration prior to PET imaging.

    PubMed

    Williams, Jason M; Arlinghaus, Lori R; Rani, Sudheer D; Shone, Martha D; Abramson, Vandana G; Pendyala, Praveen; Chakravarthy, A Bapsi; Gorge, William J; Knowland, Joshua G; Lattanze, Ronald K; Perrin, Steven R; Scarantino, Charles W; Townsend, David W; Abramson, Richard G; Yankeelov, Thomas E

    2016-12-01

    To dynamically detect and characterize 18F-fluorodeoxyglucose (FDG) dose infiltrations and evaluate their effects on positron emission tomography (PET) standardized uptake values (SUV) at the injection site and in control tissue. Investigational gamma scintillation sensors were topically applied to patients with locally advanced breast cancer scheduled to undergo limited whole-body FDG-PET as part of an ongoing clinical study. Relative to the affected breast, sensors were placed on the contralateral injection arm and ipsilateral control arm during the resting uptake phase prior to each patient's PET scan. Time-activity curves (TACs) from the sensors were integrated at varying intervals (0-10, 0-20, 0-30, 0-40, and 30-40 min) post-FDG and the resulting areas under the curve (AUCs) were compared to SUVs obtained from PET. In cases of infiltration, observed in three sensor recordings (30%), the injection arm TAC shape varied depending on the extent and severity of infiltration. In two of these cases, TAC characteristics suggested the infiltration was partially resolving prior to image acquisition, although it was still apparent on subsequent PET. Areas under the TAC 0-10 and 0-20 min post-FDG were significantly different in infiltrated versus non-infiltrated cases (Mann-Whitney, p < 0.05). When normalized to control, all TAC integration intervals from the injection arm were significantly correlated with SUVpeak and SUVmax measured over the infiltration site (Spearman ρ ≥ 0.77, p < 0.05). Receiver operating characteristic (ROC) analyses, testing the ability of the first 10 min of post-FDG sensor data to predict infiltration visibility on the ensuing PET, yielded an area under the ROC curve of 0.92. Topical sensors applied near the injection site provide dynamic information from the time of FDG administration through the uptake period and may be useful in detecting infiltrations regardless of PET image field of view.
This dynamic information may also complement the static PET image to better characterize the true extent of infiltrations.
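The interval-AUC comparison can be sketched with synthetic time-activity curves. The curve shapes, rates, and amplitudes below are invented for illustration, not patient data: an infiltration adds early depot activity whose effect on the AUC is largest in the earliest integration windows, matching the abstract's finding that the 0-10 and 0-20 min intervals discriminate best.

```python
import numpy as np

def tac_auc(t, counts, t0, t1):
    """Area under a time-activity curve between t0 and t1 (minutes),
    via trapezoidal integration on the samples inside the window."""
    mask = (t >= t0) & (t <= t1)
    tt, cc = t[mask], counts[mask]
    return float(np.sum(0.5 * (cc[1:] + cc[:-1]) * np.diff(tt)))

# Hypothetical sensor recording: one sample per second over 40 min post-FDG.
t = np.linspace(0, 40, 2401)
normal = 100 * (1 - np.exp(-t / 5))             # uptake-like curve
infiltrated = normal + 400 * np.exp(-t / 8)     # early depot activity on top

intervals = [(0, 10), (0, 20), (0, 30), (0, 40), (30, 40)]
aucs = {iv: (tac_auc(t, normal, *iv), tac_auc(t, infiltrated, *iv))
        for iv in intervals}
```

Because the depot term decays, the infiltrated-to-normal AUC ratio shrinks from the 0-10 min window to the 30-40 min window, which is why late windows alone can miss a partially resolving infiltration.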

  13. A High-Speed Vision-Based Sensor for Dynamic Vibration Analysis Using Fast Motion Extraction Algorithms.

    PubMed

    Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan

    2016-04-22

    The development of image sensors and optics enables the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows remote measurement and does not add any mass to the measured object, in contrast with traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structural vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely a modified Taylor approximation refinement algorithm and a localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches while achieving satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and by a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structural vibration signals by tracking either artificial targets or natural features.
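For context, the conventional baseline the authors compare against — cross-correlation with sub-pixel peak refinement — can be sketched as below. This is a simple FFT correlation with a parabolic peak fit, not the authors' modified Taylor-approximation or localization-refinement algorithms.

```python
import numpy as np

def subpixel_shift(a, b):
    """Estimate the translation between patches `a` and `b` by FFT
    cross-correlation, refined to sub-pixel level with a 1-D parabolic
    fit around the correlation peak along each axis.

    Returns the shift r such that np.roll(b, r, axis=(0, 1)) aligns b with a.
    """
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for axis, p in enumerate(peak):
        n = corr.shape[axis]
        c0 = corr[peak]
        # neighbours of the peak along this axis (wrapping at the borders)
        idx_m = list(peak); idx_m[axis] = (p - 1) % n
        idx_p = list(peak); idx_p[axis] = (p + 1) % n
        cm, cp = corr[tuple(idx_m)], corr[tuple(idx_p)]
        denom = cm - 2 * c0 + cp
        delta = 0.0 if denom == 0 else 0.5 * (cm - cp) / denom
        s = p + delta
        if s > n / 2:          # map to a signed shift
            s -= n
        shift.append(s)
    return tuple(shift)
```

The parabolic fit interpolates the correlation surface around its integer peak; the refinement algorithms in the paper replace the expensive upsampling step that a higher-accuracy version of this baseline would need.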

  14. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2004-12-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects and phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces the correlation between the spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.
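The per-pixel correlation principle can be sketched numerically. This is a toy model with invented spectra and sinusoidal carriers; the actual CIS performs the demodulation in analogue circuitry on-chip. Because the carriers are orthogonal, integrating the product of the detected intensity and a reference-weighted carrier sum over one frame recovers (up to a constant) the inner product of the pixel and reference spectra.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ch, n_t = 12, 256                     # spectral channels, time samples/frame

# Orthogonal AM carriers: one sinusoid per spectral channel.
t = np.arange(n_t)
freqs = np.arange(1, n_ch + 1)
carriers = np.cos(2 * np.pi * freqs[:, None] * t[None, :] / n_t)  # (ch, t)

# Hypothetical spectral reflectances: a reference and two test pixels.
reference = rng.uniform(0, 1, n_ch)
pixel_match = reference + rng.normal(0, 0.02, n_ch)   # close to the reference
pixel_other = rng.uniform(0, 1, n_ch)                  # unrelated spectrum

def demodulate(pixel_spectrum, reference):
    """Per-pixel lock-in demodulation: the detected intensity is the sum of
    AM carriers weighted by the pixel's spectrum; multiplying by a
    reference-weighted carrier sum and averaging over the frame yields
    0.5 * (pixel . reference), thanks to carrier orthogonality."""
    intensity = pixel_spectrum @ carriers          # what the pixel detects
    ref_signal = reference @ carriers              # on-chip reference signal
    return (intensity * ref_signal).mean()

corr_match = demodulate(pixel_match, reference)
corr_other = demodulate(pixel_other, reference)
```

Normalising the demodulated output by the pixel and reference self-correlations gives a cosine similarity, which is how a matching spectrum can be thresholded against unrelated ones frame by frame.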

  15. Real-time detection of natural objects using AM-coded spectral matching imager

    NASA Astrophysics Data System (ADS)

    Kimachi, Akira

    2005-01-01

    This paper describes the application of the amplitude-modulation (AM)-coded spectral matching imager (SMI) to real-time detection of natural objects such as human beings, animals, vegetables, or geological objects and phenomena, which are much more liable to change with time than artificial products while often exhibiting characteristic spectral functions associated with specific activity states. The AM-SMI produces the correlation between the spectral functions of the object and a reference at each pixel of the correlation image sensor (CIS) in every frame, based on orthogonal amplitude modulation of each spectral channel and simultaneous demodulation of all channels on the CIS. This principle makes the SMI suitable for monitoring the dynamic behavior of natural objects in real time by looking at a particular spectral reflectance or transmittance function. A twelve-channel multispectral light source was developed with improved spatial uniformity of spectral irradiance compared to a previous one. Experimental results of spectral matching imaging of human skin and vegetable leaves are demonstrated, as well as a preliminary feasibility test of imaging a reflective object using a test color chart.

  16. Identifying the effects of microsaccades in tripolar EEG signals.

    PubMed

    Bellisle, Rachel; Steele, Preston; Bartels, Rachel; Lei Ding; Sunderam, Sridhar; Besio, Walter

    2017-07-01

    Microsaccades are tiny, involuntary eye movements that occur during fixation; they are necessary for human sight to maintain a sharp image and to correct the effects of other fixational movements. Researchers have theorized about and studied the effects of microsaccades on electroencephalography (EEG) signals in order to understand and eliminate these unwanted artifacts from EEG. Tripolar concentric ring electrode (TCRE) sensors are used to acquire TCRE EEG (tEEG). The tEEG detects extremely focal signals from directly below the TCRE sensor. We have noticed a slow-wave component in some tEEG recordings. Therefore, we conducted the current work to determine whether there was a correlation between the slow wave in the tEEG and the microsaccades. This was done by analyzing the coherence of the frequency spectra of both tEEG and eye movement in recordings where microsaccades are present. Our preliminary findings show that there is a correlation between the two.
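The coherence analysis can be sketched as follows. The signals are invented stand-ins (a shared ~1 Hz slow-wave component plus independent noise in each channel, at an assumed 250 Hz sampling rate), and the magnitude-squared coherence is computed with a minimal Welch-style estimator rather than the authors' actual pipeline.

```python
import numpy as np

def msc(x, y, fs, nperseg):
    """Magnitude-squared coherence via Welch averaging of windowed,
    non-overlapping segments (a minimal hand-rolled estimator)."""
    nseg = x.size // nperseg
    win = np.hanning(nperseg)
    Pxx = Pyy = Pxy = 0
    for k in range(nseg):
        seg = slice(k * nperseg, (k + 1) * nperseg)
        X = np.fft.rfft(win * x[seg])
        Y = np.fft.rfft(win * y[seg])
        Pxx = Pxx + np.abs(X) ** 2
        Pyy = Pyy + np.abs(Y) ** 2
        Pxy = Pxy + X * np.conj(Y)
    f = np.fft.rfftfreq(nperseg, 1 / fs)
    return f, np.abs(Pxy) ** 2 / (Pxx * Pyy)

rng = np.random.default_rng(3)
fs = 250.0                                  # Hz, an assumed EEG sampling rate
t = np.arange(0, 60, 1 / fs)                # 60 s of recording

# Hypothetical shared slow-wave component driving both channels,
# plus independent noise in each.
slow = np.sin(2 * np.pi * 1.0 * t)
teeg = 0.5 * slow + rng.normal(0, 1, t.size)
eye = 0.8 * slow + rng.normal(0, 1, t.size)

f, Cxy = msc(teeg, eye, fs, nperseg=1024)
band = (f >= 0.8) & (f <= 1.2)              # around the slow-wave frequency
```

Averaging over segments is what makes the estimate meaningful: with a single segment the coherence is identically 1 at every frequency, whereas with many segments only frequencies with a consistent phase relationship between the two channels stay high.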

  17. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    PubMed Central

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
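The peak-separation-and-width idea can be illustrated with a simulated deep sub-electron read noise pixel. The gain, read noise, and photon flux below are invented numbers, and a real PSW analysis estimates the conversion gain from the histogram itself rather than assuming it as done here.

```python
import numpy as np

rng = np.random.default_rng(4)
gain = 1.0            # DN per photoelectron (assumed known in this sketch)
read_noise = 0.15     # e- rms -> deep sub-electron regime
counts = rng.poisson(1.5, 200_000)                  # photons per exposure
samples = gain * counts + rng.normal(0, read_noise * gain, counts.size)

# Photon counting histogram (PCH): with DSERN, distinct peaks appear
# at integer multiples of the conversion gain.
hist, edges = np.histogram(samples, bins=400, range=(-1, 8))
centers = 0.5 * (edges[:-1] + edges[1:])

# Peak-width idea in miniature: samples are assigned to the nearest
# integer photon number (valid only when the peaks are well resolved),
# and the spread of the residuals estimates the read noise.
assignments = np.rint(samples / gain)
residuals = samples - assignments * gain
sigma_est = residuals.std() / gain                  # estimated read noise, e-
```

When the read noise grows toward half the gain, the peaks merge and this nearest-integer assignment breaks down, which is exactly the regime boundary the PCH peak-separation-and-width analysis quantifies.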

  18. Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications

    NASA Astrophysics Data System (ADS)

    Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David

    2017-10-01

    The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors depends strongly on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable of acquiring illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for an HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications w.r.t. the display of images and w.r.t. image analysis techniques. Regarding the display, we investigated the image intensity statistics over time; regarding image analysis, we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.

  19. The lucky image-motion prediction for simple scene observation based soft-sensor technology

    NASA Astrophysics Data System (ADS)

    Li, Yan; Su, Yun; Hu, Bin

    2015-08-01

    High resolution is important for Earth remote sensors, but vibration of the sensor platform is a major factor restricting high-resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes utilizing soft-sensor technology for image-motion prediction and focuses on algorithm optimization for the prediction. Simulation results indicate that an improved lucky image-motion stabilization algorithm combining a back-propagation neural network (BP NN) and a support vector machine (SVM) is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on the soft-sensor technology is below 5%, and the training speed of the mathematical prediction model is fast enough for real-time image stabilization in aerial photography.

  20. Fusion: ultra-high-speed and IR image sensors

    NASA Astrophysics Data System (ADS)

    Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.

    2015-08-01

    Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collisions, plasma, spark discharge, an air bag in a car accident, and a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor. Each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels. Image signals stored in each pixel are read out after an image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps 1). However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside-illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency 2). The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the greater wiring freedom on the front side 3). The BSI structure has the further advantage that an additional layer, such as a scintillator, can be attached to the backside with less difficulty. This paper proposes the development of an ultra-high-speed IR image sensor that combines advanced nanotechnologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, with discussion of issues in the integration.
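The in-situ storage principle — per-pixel memory that is written continuously during capture and only read out afterwards — can be caricatured in a few lines. This is a toy behavioural model, not the sensor's circuit design; the class name and memory depth are invented.

```python
from collections import deque

class InSituStoragePixel:
    """Toy model of one pixel of an in-situ storage image sensor: a small
    ring buffer of on-pixel memory elements records every frame's signal,
    and the buffer is read out only after capture stops."""

    def __init__(self, n_memories=128):
        # each pixel carries its own bank of memory elements
        self.memories = deque(maxlen=n_memories)

    def expose(self, signal):
        """Record one frame; the oldest value is overwritten when full."""
        self.memories.append(signal)

    def read_out(self):
        """Off-chip readout after the capture operation (oldest first)."""
        return list(self.memories)

# Continuous recording: after a trigger event (the jump in signal below),
# the pixel retains the most recent n_memories frames with no readout
# bottleneck during capture.
pixel = InSituStoragePixel(n_memories=4)
for frame_signal in [10, 11, 12, 55, 60, 58]:
    pixel.expose(frame_signal)
```

This is why the frame rate is limited by the on-pixel write path rather than by off-chip bandwidth, and why the storage area competes with the photodiode for pixel area in the front-illuminated design (the 15% fill factor mentioned above).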

  1. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (Visible/Infrared Sensor Trades, Analyses, and Simulations), combines classical image processing techniques with detailed sensor models to produce static and time-dependent simulations of a variety of sensor systems, including imaging, tracking, and point-target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring two-dimensional array sensors, which can be used for either imaging or point-source detection.

  2. Flexible phosphor sensors: a digital supplement or option to rigid sensors.

    PubMed

    Glazer, Howard S

    2014-01-01

    An increasing number of dental practices are upgrading from film radiography to digital radiography, for reasons that include faster image processing, easier image access, better patient education, enhanced data storage, and improved office productivity. Most practices that have converted to digital technology use rigid, or direct, sensors. Another digital option is flexible phosphor sensors, also called indirect sensors or phosphor storage plates (PSPs). Flexible phosphor sensors can be advantageous for use with certain patients who may be averse to direct sensors, and they can deliver a larger image area. Additionally, sensor cost for replacement PSPs is considerably lower than for hard sensors. As such, flexible phosphor sensors appear to be a viable supplement or option to direct sensors.

  3. Spaceborne imaging radar research in the 90's

    NASA Technical Reports Server (NTRS)

    Elachi, Charles

    1986-01-01

    The imaging radar experiments on SEASAT and on the space shuttle (SIR-A and SIR-B) have led to a wide interest in the use of spaceborne imaging radars in Earth and planetary sciences. The radar sensors provide unique and complementary information to what is acquired with visible and infrared imagers. This includes subsurface imaging in arid regions, all-weather observation of ocean surface dynamic phenomena, structural mapping, soil moisture mapping, stereo imaging and resulting topographic mapping. However, experiments up to now have exploited only a very limited range of the generic capability of radar sensors. With planned sensor developments in the late 80's and early 90's, a quantum jump will be made in our ability to fully exploit the potential of these sensors. These developments include: multiparameter research sensors such as SIR-C and X-SAR; long-term and global monitoring sensors such as ERS-1, JERS-1, EOS, Radarsat, GLORI and the spaceborne sounder; planetary mapping sensors such as the Magellan and Cassini/Titan mappers; topographic three-dimensional imagers such as the scanning radar altimeter; and three-dimensional rain mapping. These sensors and their associated research are briefly described.

  4. Star centroiding error compensation for intensified star sensors.

    PubMed

    Jiang, Jie; Xiong, Kun; Yu, Wenbo; Yan, Jinyun; Zhang, Guangjun

    2016-12-26

    A star sensor provides high-precision attitude information by capturing a stellar image; however, the traditional star sensor has poor dynamic performance, which is attributed to its low sensitivity. In the intensified star sensor, an image intensifier is utilized to improve the sensitivity and thereby the dynamic performance of the star sensor. However, the introduction of the image intensifier decreases star centroiding accuracy, which in turn degrades the attitude measurement precision of the star sensor. A star centroiding error compensation method for intensified star sensors is proposed in this paper to reduce these influences. First, the imaging model of the intensified detector, which includes the deformation parameter of the optical fiber panel, is established based on orthographic projection through an analysis of the errors introduced by the image intensifier. Then, the position errors at the target points are obtained from the model by using the Levenberg-Marquardt (LM) optimization method. Finally, a nearest trigonometric interpolation method is presented to compensate for the centroiding error at arbitrary positions on the image plane. Laboratory calibration and night-sky experiment results show that the compensation method effectively eliminates the error introduced by the image intensifier, remarkably improving the precision of intensified star sensors.
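The LM fit used to recover model parameters from point errors can be illustrated with a minimal numpy implementation. The single-coefficient radial distortion model, the parameter values, and all names below are our illustrative assumptions, not the paper's actual imaging model of the intensified detector:

```python
import numpy as np

def levenberg_marquardt(residual, p0, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt loop with a numerical Jacobian.
    residual(p) returns a residual vector for parameter vector p."""
    p = np.asarray(p0, dtype=float)
    for _ in range(n_iter):
        r = residual(p)
        J = np.empty((r.size, p.size))       # numerical Jacobian
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-6
            J[:, j] = (residual(p + dp) - r) / 1e-6
        A = J.T @ J + lam * np.eye(p.size)   # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        if np.sum(residual(p + step) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam * 0.5     # accept: move toward Gauss-Newton
        else:
            lam *= 2.0                       # reject: increase damping
    return p

# Hypothetical one-coefficient radial distortion: observed = ideal*(1 + k*r^2).
rng = np.random.default_rng(0)
ideal = rng.uniform(-1, 1, size=(40, 2))     # undistorted target points
r2 = np.sum(ideal ** 2, axis=1, keepdims=True)
k_true = 0.08
observed = ideal * (1 + k_true * r2)         # synthetic distorted points

def res(p):
    return (ideal * (1 + p[0] * r2) - observed).ravel()

k_est = levenberg_marquardt(res, [0.0])
print(round(float(k_est[0]), 3))  # -> 0.08
```

Once the distortion coefficient is recovered, the position error at any target point follows from the fitted model; the paper's interpolation step then extends the correction to arbitrary image positions.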

  5. Continued Development of Meandering Winding Magnetometer (MWM (Register Trademark)) Eddy Current Sensors for the Health Monitoring, Modeling and Damage Detection of Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Russell, Richard; Wincheski, Russell; Jablonski, David; Washabaugh, Andy; Sheiretov, Yanko; Martin, Christopher; Goldfine, Neil

    2011-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are used in essentially all NASA spacecraft, launch vehicles and payloads to contain high-pressure fluids for propulsion, life support systems and science experiments. Failure of any COPV, either in flight or during ground processing, would result in catastrophic damage to the spacecraft or payload and could lead to loss of life. Therefore, NASA continues to investigate new methods for nondestructive evaluation (NDE) of COPVs for structural anomalies and for in-situ structural health monitoring (SHM) during operational service. Partnering with JENTEK Sensors, engineers at NASA Kennedy Space Center have successfully conducted a proof-of-concept study to develop Meandering Winding Magnetometer (MWM) eddy current sensors designed to make direct measurements of the stresses in the internal layers of a carbon-fiber-wrapped COPV. During this study, three different MWM sensors were tested at three orientations to demonstrate the ability of the technology to measure stresses at various fiber orientations and depths. The results showed good correlation with actual surface strain gage measurements. MWM-Array technology for scanning COPVs can reliably be used to image and detect mechanical damage. To validate this conclusion, several COPVs were scanned to obtain a baseline; each COPV was then impacted at varying energy levels and rescanned. The baseline-subtracted images were used to demonstrate damage detection. These scans were performed with two MWM-Arrays with different geometries, for near-surface and deeper-penetration imaging, at multiple frequencies and in multiple orientations of the linear MWM drive. This presentation will include a review of micromechanical models that relate measured sensor responses to composite material constituent properties, validated by the proof-of-concept study, as the basis for SHM and NDE data analysis, as well as potential improvements, including design changes to miniaturize the sensors and make them durable in the vacuum of space.

  6. Beam imaging sensor

    DOEpatents

    McAninch, Michael D.; Root, Jeffrey J.

    2016-07-05

    The present invention relates generally to the field of sensors for beam imaging and, in particular, to a new and useful beam imaging sensor for use in determining, for example, the power density distribution of a beam including, but not limited to, an electron beam or an ion beam. In one embodiment, the beam imaging sensor of the present invention comprises, among other items, a circumferential slit that is either circular, elliptical or polygonal in nature.

  7. Automated site characterization for robotic sample acquisition systems

    NASA Astrophysics Data System (ADS)

    Scholl, Marija S.; Eberlein, Susan J.

    1993-04-01

    A mobile, semiautonomous vehicle with multiple sensors and on-board intelligence is proposed for performing preliminary scientific investigations on extraterrestrial bodies prior to human exploration. Two technologies, a hybrid optical-digital computer system based on optical correlator technology and an image and instrument data analysis system, provide complementary capabilities that might be part of an instrument package for an intelligent robotic vehicle. The hybrid digital-optical vision system could perform real-time image classification tasks using an optical correlator with programmable matched filters under control of a digital microcomputer. The data analysis system would analyze visible and multiband imagery to extract mineral composition and textural information for geologic characterization. Together these technologies would support the site characterization needs of a robotic vehicle for both navigational and scientific purposes.

  8. Fundamental performance differences between CMOS and CCD imagers: Part II

    NASA Astrophysics Data System (ADS)

    Janesick, James; Andrews, James; Tower, John; Grygon, Mark; Elliott, Tom; Cheng, John; Lesser, Michael; Pinter, Jeff

    2007-09-01

    A new class of CMOS imagers that compete with scientific CCDs is presented. The sensors are based on deep depletion backside illuminated technology to achieve high near infrared quantum efficiency and low pixel cross-talk. The imagers deliver very low read noise suitable for single photon counting - Fano-noise limited soft x-ray applications. Digital correlated double sampling signal processing necessary to achieve low read noise performance is analyzed and demonstrated for CMOS use. Detailed experimental data products generated by different pixel architectures (notably 3TPPD, 5TPPD and 6TPG designs) are presented including read noise, charge capacity, dynamic range, quantum efficiency, charge collection and transfer efficiency and dark current generation. Radiation damage data taken for the imagers is also reported.

  9. Arctic Clouds Infrared Imaging Field Campaign Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, J. A.

    2016-03-01

    The Infrared Cloud Imager (ICI), a passive thermal imaging system, was deployed at the North Slope of Alaska site in Barrow, Alaska, from July 2012 to July 2014 for measuring spatial-temporal cloud statistics. Thermal imaging of the sky from the ground provides high radiometric contrast during night and polar winter when visible sensors and downward-viewing thermal sensors experience low contrast. In addition to demonstrating successful operation in the Arctic for an extended period and providing data for Arctic cloud studies, a primary objective of this deployment was to validate novel instrument calibration algorithms that will allow more compact ICI instruments to be deployed without the added expense, weight, size, and operational difficulty of a large-aperture onboard blackbody calibration source. This objective was successfully completed with a comparison of the two-year data set calibrated with and without the onboard blackbody. The two different calibration methods produced daily-average cloud amount data sets with correlation coefficient = 0.99, mean difference = 0.0029 (i.e., 0.29% cloudiness), and a difference standard deviation = 0.054. Finally, the ICI instrument generally detected more thin clouds than reported by other ARM cloud products available as of late 2015.
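The agreement statistics quoted above (correlation coefficient, mean difference, and difference standard deviation) can be computed from two daily-average cloud-amount series in a few lines. The data below are synthetic stand-ins for the two ICI calibration paths, not the campaign's actual measurements:

```python
import numpy as np

# Hypothetical daily-average cloud-amount series (fractions in [0, 1]) from
# the two calibration paths over a two-year deployment (730 days).
rng = np.random.default_rng(1)
with_bb = rng.uniform(0, 1, 730)                     # onboard-blackbody calibration
without_bb = with_bb + rng.normal(0.003, 0.05, 730)  # blackbody-free calibration

diff = without_bb - with_bb
corr = np.corrcoef(with_bb, without_bb)[0, 1]        # correlation coefficient
print(f"r = {corr:.2f}, mean diff = {diff.mean():.4f}, "
      f"sd = {diff.std(ddof=1):.3f}")
```

With noise of this magnitude the synthetic comparison lands near the reported values (r close to 0.99, mean difference of a few thousandths, standard deviation near 0.05).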

  10. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications.

    PubMed

    Tokuda, Takashi; Noda, Toshihiko; Sasagawa, Kiyotaka; Ohta, Jun

    2010-12-29

    In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors' architecture on the basis of the type of electric measurement or imaging functionalities.

  11. A 100 Mfps image sensor for biological applications

    NASA Astrophysics Data System (ADS)

    Etoh, T. Goji; Shimonomura, Kazuhiro; Nguyen, Anh Quang; Takehara, Kosei; Kamakura, Yoshinari; Goetschalckx, Paul; Haspeslagh, Luc; De Moor, Piet; Dao, Vu Truong Son; Nguyen, Hoang Dung; Hayashi, Naoki; Mitsui, Yo; Inumaru, Hideo

    2018-02-01

    Two ultrahigh-speed CCD image sensors with different characteristics were fabricated for applications in advanced scientific measurement apparatuses. The sensors are BSI MCG (backside-illuminated multi-collection-gate) image sensors with multiple collection gates around the center of the front side of each pixel, placed like petals of a flower. One has five collection gates and one drain gate at the center, and can capture five consecutive frames at 100 Mfps with a pixel count of about 600 kpixels (512 x 576 x 2 pixels). In-pixel signal accumulation is possible for repetitive image capture of reproducible events. The target application is FLIM. The other is equipped with four collection gates, each connected to an in-situ CCD memory with 305 elements, which enables capture of 1,220 (4 x 305) consecutive images at 50 Mfps. The CCD memory is folded and looped with the first element connected to the last element, which also makes in-pixel signal accumulation possible. This sensor is a small test sensor with 32 x 32 pixels. The target applications are imaging TOF MS, pulsed neutron tomography and dynamic PSP. The paper also briefly explains an expression for the temporal resolution of silicon image sensors theoretically derived by the authors in 2017. It is shown that the image sensor designed based on the theoretical analysis achieves imaging of consecutive frames at a frame interval of 50 ps.

  12. Smart image sensors: an emerging key technology for advanced optical measurement and microsystems

    NASA Astrophysics Data System (ADS)

    Seitz, Peter

    1996-08-01

    Optical microsystems typically include photosensitive devices, analog preprocessing circuitry and digital signal processing electronics. The advances in semiconductor technology have made it possible today to integrate all photosensitive and electronic devices on one 'smart image sensor' or photo-ASIC (application-specific integrated circuit containing photosensitive elements). It is even possible to provide each 'smart pixel' with additional photoelectronic functionality, without compromising the fill factor substantially. This technological capability is the basis for advanced cameras and optical microsystems showing novel on-chip functionality: single-chip cameras with on-chip analog-to-digital converters for less than $10 are advertised; image sensors have been developed including novel functionality such as real-time selectable pixel size and shape, the capability of performing arbitrary convolutions simultaneously with the exposure, as well as variable, programmable offset and sensitivity of the pixels, leading to image sensors with a dynamic range exceeding 150 dB. Smart image sensors have been demonstrated offering synchronous detection and demodulation capabilities in each pixel (lock-in CCD), and conventional image sensors are combined with an on-chip digital processor for complete, single-chip image acquisition and processing systems. Technological problems of the monolithic integration of smart image sensors include offset non-uniformities, temperature variations of electronic properties, imperfect matching of circuit parameters, etc. These problems can often be overcome either by designing additional compensation circuitry or by providing digital correction routines. Where necessary for technological or economic reasons, smart image sensors can also be combined with or realized as hybrids, making use of commercially available electronic components.
It is concluded that the possibilities offered by custom smart image sensors will influence the design and the performance of future electronic imaging systems in many disciplines, reaching from optical metrology to machine vision on the factory floor and in robotics applications.

  13. Testing and evaluation of tactical electro-optical sensors

    NASA Astrophysics Data System (ADS)

    Middlebrook, Christopher T.; Smith, John G.

    2002-07-01

    As integrated electro-optical sensor payloads (multi-sensors) comprising infrared imagers, visible imagers, and lasers advance in performance, the tests and testing methods must also advance in order to fully evaluate them. Future operational requirements will require integrated sensor payloads to perform missions at longer ranges and with increased targeting accuracy. To meet these requirements, sensors will require advanced imaging algorithms, advanced tracking capability, high-powered lasers, and high-resolution imagers. To meet the U.S. Navy's testing requirements for such multi-sensors, the test and evaluation group in the Night Vision and Chemical Biological Warfare Department at NAVSEA Crane is developing automated testing methods and improved tests to evaluate imaging algorithms, and is procuring advanced testing hardware to measure high-resolution imagers and the line-of-sight stabilization of targeting systems. This paper describes the multi-sensor payloads tested, the testing methods used and under development, and the different types of testing hardware and specific payload tests being developed and used at NAVSEA Crane.

  14. A multi-sensor data-driven methodology for all-sky passive microwave inundation retrieval

    NASA Astrophysics Data System (ADS)

    Takbiri, Zeinab; Ebtehaj, Ardeshir M.; Foufoula-Georgiou, Efi

    2017-06-01

    We present a multi-sensor Bayesian passive microwave retrieval algorithm for flood inundation mapping at high spatial and temporal resolutions. The algorithm takes advantage of observations from multiple sensors in optical, short-infrared, and microwave bands, thereby allowing for detection and mapping of the sub-pixel fraction of inundated areas under almost all-sky conditions. The method relies on a nearest-neighbor search and a modern sparsity-promoting inversion method that make use of an a priori dataset in the form of two joint dictionaries. These dictionaries contain almost overlapping observations by the Special Sensor Microwave Imager and Sounder (SSMIS) on board the Defense Meteorological Satellite Program (DMSP) F17 satellite and the Moderate Resolution Imaging Spectroradiometer (MODIS) on board the Aqua and Terra satellites. Evaluation of the retrieval algorithm over the Mekong Delta shows that it captures, to a good degree, the diurnal variability of inundation due to localized convective precipitation. At longer timescales, the results demonstrate consistency with ground-based water level observations, indicating that the method properly captures seasonal inundation patterns in response to regional monsoonal rain. The calculated Euclidean distance, rank correlation, and copula quantile analysis demonstrate good agreement between the outputs of the algorithm and the observed water levels at monthly and daily timescales. The current inundation products are at a resolution of 12.5 km, available twice per day, but a higher resolution (on the order of 5 km and every 3 h) can be achieved using the same algorithm with the dictionary populated by Global Precipitation Mission (GPM) Microwave Imager (GMI) products.
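The joint-dictionary idea can be sketched as a nearest-neighbour lookup: a new microwave observation is matched against stored brightness-temperature vectors, and the paired optically derived inundation fractions of its neighbours supply the estimate. The dictionary contents, sizes, and function names below are illustrative assumptions, not the authors' actual SSMIS/MODIS data or inversion method:

```python
import numpy as np

# Hypothetical joint dictionaries: dict_tb holds microwave brightness-
# temperature vectors (one row per stored scene), dict_frac the co-located
# optically derived inundated fractions.
rng = np.random.default_rng(2)
n, ch = 500, 7                                # dictionary size, channel count
dict_tb = rng.normal(250, 20, (n, ch))        # brightness temperatures (K)
dict_frac = rng.uniform(0, 1, n)              # paired inundation fractions

def retrieve_fraction(tb_obs, k=15):
    """Estimate the inundated fraction for a new observation by averaging
    over its k nearest dictionary entries (Euclidean distance)."""
    d = np.linalg.norm(dict_tb - tb_obs, axis=1)
    nearest = np.argsort(d)[:k]
    return dict_frac[nearest].mean()

obs = dict_tb[42] + rng.normal(0, 1, ch)      # noisy repeat of entry 42
print(round(float(retrieve_fraction(obs, k=1)), 3))  # ≈ dict_frac[42]
```

The paper's sparsity-promoting inversion refines this simple average, but the lookup above captures the role the two joint dictionaries play.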

  15. Wireless image-data transmission from an implanted image sensor through a living mouse brain by intra body communication

    NASA Astrophysics Data System (ADS)

    Hayami, Hajime; Takehara, Hiroaki; Nagata, Kengo; Haruta, Makito; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Ohta, Jun

    2016-04-01

    Intra-body communication technology allows the fabrication of more compact implantable biomedical sensors than RF wireless technology. In this paper, we report the fabrication of an implantable image sensor of 625 µm width and 830 µm length and demonstrate wireless image-data transmission through the brain tissue of a living mouse. The sensor was designed to transmit output signals of pixel values by pulse width modulation (PWM). The PWM signals from the sensor, transmitted through the brain tissue, were detected by a receiver electrode. Wireless data transmission of a two-dimensional image was successfully demonstrated in a living mouse brain. The technique reported here is expected to provide useful methods of data transmission for micro-sized implantable biomedical sensors.
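Pulse width modulation of pixel values, as used by the sensor's output stage, can be illustrated in a few lines. This is a simplified sketch; the slot count and function names are ours, not the paper's circuit design:

```python
def pwm_encode(pixel, n_slots=16):
    """Encode a pixel value (0..n_slots) as a binary pulse train: the
    pulse stays high for `pixel` time slots out of n_slots."""
    return [1 if t < pixel else 0 for t in range(n_slots)]

def pwm_decode(pulse):
    """Recover the pixel value by counting high slots (the pulse width)."""
    return sum(pulse)

image_row = [3, 15, 0, 8]
tx = [pwm_encode(p) for p in image_row]    # pulse trains sent through tissue
rx = [pwm_decode(pulse) for pulse in tx]   # values recovered at the electrode
print(rx)  # -> [3, 15, 0, 8]
```

Because the information is carried in pulse duration rather than amplitude, the scheme tolerates the attenuation the signal suffers on its way through tissue, which is one reason PWM suits intra-body links.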

  16. Improving slowness estimate stability and visualization using limited sensor pair correlation on seismic arrays

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Näsholm, S. P.; Ruigrok, E.; Kværna, T.

    2018-04-01

    Seismic arrays enhance signal detection and parameter estimation by exploiting the time-delays between arriving signals on sensors at nearby locations. Parameter estimates can suffer due to both signal incoherence, with diminished waveform similarity between sensors, and aberration, with time-delays between coherent waveforms poorly represented by the wave-front model. Sensor-to-sensor correlation approaches to parameter estimation have an advantage over direct beamforming approaches in that individual sensor-pairs can be omitted without necessarily omitting entirely the data from each of the sensors involved. Specifically, we can omit correlations between sensors for which signal coherence in an optimal frequency band is anticipated to be poor or for which anomalous time-delays are anticipated. In practice, this usually means omitting correlations between more distant sensors. We present examples from International Monitoring System seismic arrays with poor parameter estimates resulting when classical f-k analysis is performed over the full array aperture. We demonstrate improved estimates and slowness grid displays using correlation beamforming restricted to correlations between sufficiently closely spaced sensors. This limited sensor-pair correlation (LSPC) approach has lower slowness resolution than would ideally be obtained by considering all sensor-pairs. However, this ideal estimate may be unattainable due to incoherence and/or aberration and the LSPC estimate can often exploit all channels, with the associated noise-suppression, while mitigating the complications arising from correlations between very distant sensors. The greatest need for the method is for short-period signals on large aperture arrays although we also demonstrate significant improvement for secondary regional phases on a small aperture array. LSPC can also provide a robust and flexible approach to parameter estimation on three-component seismic arrays.
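The LSPC idea, correlation beamforming in which sensor pairs beyond a maximum separation are omitted, can be sketched on synthetic array data. The integer-sample delays, array geometry, and all parameter values below are illustrative assumptions; a real implementation would interpolate fractional lags:

```python
import numpy as np

rng = np.random.default_rng(3)
pos = rng.uniform(-5, 5, (8, 2))        # sensor coordinates (km)
s_true = np.array([0.12, -0.08])        # true slowness vector (s/km)
fs = 100                                # samples per second
wavelet = np.convolve(rng.normal(size=200), np.ones(5) / 5, mode="same")

def make_trace(p):
    """Synthetic trace: the wavelet delayed by the plane-wave arrival time."""
    lag = int(round(fs * (p @ s_true)))
    x = np.zeros(400)
    x[100 + lag:100 + lag + wavelet.size] = wavelet
    return x + 0.05 * rng.normal(size=400)

traces = [make_trace(p) for p in pos]

def lspc_power(s, max_sep=np.inf):
    """Sum cross-correlations at the lags predicted by slowness s,
    restricted to sensor pairs closer than max_sep."""
    power = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            if np.linalg.norm(pos[i] - pos[j]) > max_sep:
                continue                # omit distant (incoherent) pairs
            lag = int(round(fs * ((pos[j] - pos[i]) @ s)))
            power += np.dot(traces[i], np.roll(traces[j], -lag))
    return power

grid = [(sx, sy) for sx in np.arange(-0.2, 0.21, 0.04)
                 for sy in np.arange(-0.2, 0.21, 0.04)]
best = max(grid, key=lambda s: lspc_power(np.array(s), max_sep=6.0))
print(round(best[0], 2), round(best[1], 2))  # -> 0.12 -0.08
```

Setting `max_sep` below the array aperture drops the pairs most likely to suffer incoherence or aberration while every sensor can still contribute through its nearer neighbours, which is the trade-off the abstract describes.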

  17. Cross-calibration of the Landsat-7 ETM+ and Landsat-5 TM with the ResourceSat-1 (IRS-P6) AWiFS and LISS-III sensors

    USGS Publications Warehouse

    Chander, G.; Scaramuzza, P.L.

    2006-01-01

    Increasingly, data from multiple sensors are used to gain a more complete understanding of land surface processes at a variety of scales. The Landsat suite of satellites has collected the longest continuous archive of multispectral data. The ResourceSat-1 satellite (also called IRS-P6) was launched into a polar sun-synchronous orbit on Oct 17, 2003. It carries three remote sensing sensors: the High Resolution Linear Imaging Self-Scanner (LISS-IV), the Medium Resolution Linear Imaging Self-Scanner (LISS-III), and the Advanced Wide Field Sensor (AWiFS). These three sensors are used together to provide images with different resolution and coverage. To assess the absolute radiometric calibration accuracy of the IRS-P6 AWiFS and LISS-III sensors, image pairs from these sensors were compared to the Landsat-5 TM and Landsat-7 ETM+ sensors. The approach involved cross-calibration of nearly simultaneous surface observations based on image statistics from areas observed simultaneously by the two sensors.

  18. Proceedings of the Augmented VIsual Display (AVID) Research Workshop

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K. (Editor); Sweet, Barbara T. (Editor)

    1993-01-01

    The papers, abstracts, and presentations were presented at a three day workshop focused on sensor modeling and simulation, and image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.

  19. Integrated Spectral Low Noise Image Sensor with Nanowire Polarization Filters for Low Contrast Imaging

    DTIC Science & Technology

    2015-11-05

    AFRL-AFOSR-VA-TR-2015-0359. Integrated Spectral Low Noise Image Sensor with Nanowire Polarization Filters for Low Contrast Imaging. Viktor Gruev. Dates covered: 02/15/2011 - 08/15/2015. Abstract fragment: "... investigate alternative spectral imaging architectures based on my previous experience in this research area. I will develop nanowire polarization ..."

  20. Application of passive imaging polarimetry in the discrimination and detection of different color targets of identical shapes using color-blind imaging sensors

    NASA Astrophysics Data System (ADS)

    El-Saba, A. M.; Alam, M. S.; Surpanani, A.

    2006-05-01

    Important aspects of automatic pattern recognition systems are their ability to efficiently discriminate and detect proper targets with low false alarm rates. In this paper we extend the application of passive imaging polarimetry to effectively discriminate and detect different-color targets of identical shape using a color-blind imaging sensor. For this case study, we demonstrate that traditional color-blind, polarization-insensitive imaging sensors that rely only on the spatial distribution of targets suffer from high false detection rates, especially in scenarios where multiple identically shaped targets are present. On the other hand, we show that color-blind, polarization-sensitive imaging sensors can successfully and efficiently discriminate and detect true targets based on their color alone. We highlight the main advantages of the proposed polarization-encoded imaging sensor.

  1. Dental non-linear image registration and collection method with 3D reconstruction and change detection

    NASA Astrophysics Data System (ADS)

    Rahmes, Mark; Fagan, Dean; Lemieux, George

    2017-03-01

    The capability of a software algorithm to automatically align same-patient dental bitewing and panoramic x-rays over time is complicated by differences in collection perspectives. We successfully used image correlation with an affine transform for each pixel to discover common image borders, followed by a non-linear homography perspective adjustment to closely align the images. However, significant improvements in image registration could be realized if images were collected from the same perspective, thus facilitating change analysis. The perspective differences due to current dental image collection devices are so significant that straightforward change analysis is not possible. To address this, a new custom dental tray could be used to provide the standard reference needed for consistent positioning of a patient's mouth. Similar to sports mouth guards, the dental tray could be fabricated in standard sizes from plastic and use integrated electronics that have been miniaturized. In addition, the x-ray source needs to be consistently positioned in order to collect images with similar angles and scales. Solving this pose correction is similar to solving for collection angle in aerial imagery for change detection. A standard collection system would provide a method for consistent source positioning using real-time sensor position feedback from a digital x-ray image reference. Automated, robotic sensor positioning could replace manual adjustments. Given an image set from a standard collection, a disparity map between images can be created using parallax from overlapping viewpoints to enable change detection. This perspective data can be rectified and used to create a three-dimensional dental model reconstruction.

  2. CMOS image sensor with organic photoconductive layer having narrow absorption band and proposal of stack type solid-state image sensors

    NASA Astrophysics Data System (ADS)

    Takada, Shunji; Ihama, Mikio; Inuiya, Masafumi

    2006-02-01

    Digital still cameras overtook film cameras in the Japanese market in 2000 in terms of sales volume, owing to their versatile functions. However, image-capturing capabilities such as the sensitivity and latitude of color films are still superior to those of digital image sensors. In this paper, we attribute the high performance of color films to their multi-layered structure, and propose solid-state image sensors with stacked organic photoconductive layers having narrow absorption bands on CMOS read-out circuits.

  3. Comparative analysis of respiratory motion tracking using Microsoft Kinect v2 sensor.

    PubMed

    Silverstein, Evan; Snyder, Michael

    2018-05-01

    To present and evaluate a straightforward implementation of a marker-less respiratory motion-tracking process utilizing the Kinect v2 camera as a gating tool during 4DCT or during radiotherapy treatments. Utilizing the depth sensor on the Kinect and author-written C# code, the respiratory motion of a subject was tracked by recording depth values at user-selected points on the subject, with each point representing one pixel of the depth image. As a patient breathes, specific anatomical points on the chest/abdomen move slightly across pixels within the depth image. By tracking how the depth value changes for a specific pixel, instead of how the anatomical point moves throughout the image, a respiratory trace can be obtained from the changing depth values of the selected pixel; this tracking was implemented without markers. Varian's RPM system and the Anzai belt system were used in tandem with the Kinect to compare the respiratory traces obtained by each for two different subjects. Analysis of the depth information from the Kinect for purposes of phase- and amplitude-based binning correlated well with the RPM and Anzai systems. Interquartile range (IQR) values were obtained comparing the times associated with specific amplitude and phase percentages for each product. The IQR time spans indicated that the Kinect would measure specific percentage values within 0.077 s for Subject 1 and 0.164 s for Subject 2 compared with the values obtained by RPM or Anzai. For 4DCT scans, these times correspond to less than 1 mm of couch movement and would create an offset of half an acquired slice. By tracking the depth values of user-selected pixels within the depth image, rather than tracking specific anatomical locations, respiratory motion can be tracked and visualized with the Kinect, with results comparable to those of the Varian RPM and Anzai belt. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
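The pixel-tracking idea can be sketched in a few lines of numpy: sample one fixed pixel of each depth frame and treat the resulting series as the respiratory trace. Synthetic frames stand in for Kinect v2 depth images here; the chest geometry, amplitude, and breathing rate are illustrative assumptions:

```python
import numpy as np

fs = 30.0                                    # depth-frame rate (Hz)
n = 600                                      # 20 s of frames
t = np.arange(n) / fs
breathing = 10 * np.sin(2 * np.pi * 0.25 * t)  # 15 breaths/min, ±10 mm

def depth_frame(k):
    """Synthetic Kinect v2 depth frame (424 x 512, values in mm): a chest
    region whose depth decreases as the chest rises toward the camera."""
    frame = np.full((424, 512), 800.0)
    frame[200:260, 240:300] -= breathing[k]
    return frame

pixel = (230, 270)                           # user-selected pixel on the chest
trace = np.array([depth_frame(k)[pixel] for k in range(n)])

# Recover the breathing rate from the trace's dominant frequency.
spec = np.abs(np.fft.rfft(trace - trace.mean()))
freq = np.fft.rfftfreq(n, 1 / fs)
print(round(float(freq[spec.argmax()]), 3))  # -> 0.25 (Hz)
```

Because only depth values at fixed pixels are read, no anatomical landmark needs to be segmented or followed between frames, which is what makes the setup marker-less.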

  4. Noninvasive, three-dimensional full-field body sensor for surface deformation monitoring of human body in vivo.

    PubMed

    Chen, Zhenning; Shao, Xinxing; He, Xiaoyuan; Wu, Jialin; Xu, Xiangyang; Zhang, Jinlin

    2017-09-01

    Noninvasive, three-dimensional (3-D), full-field surface deformation measurements of the human body are important for biomedical investigations. We proposed a 3-D noninvasive, full-field body sensor based on stereo digital image correlation (stereo-DIC) for surface deformation monitoring of the human body in vivo. First, by applying an improved water-transfer printing (WTP) technique to transfer optimized speckle patterns onto the skin, the body sensor was conveniently and harmlessly fabricated directly onto the human body. Then, stereo-DIC was used to achieve 3-D noncontact and noninvasive surface deformation measurements. The accuracy and efficiency of the proposed body sensor were verified and discussed considering different skin complexions. Moreover, the fabrication of speckle patterns on human skin, which has always been considered a challenging problem, was shown to be feasible, effective, and harmless as a result of the improved WTP technique. An application of the proposed stereo-DIC-based body sensor was demonstrated by measuring the pulse wave velocity of the human carotid artery. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  5. Strengthening of back muscles using a module of flexible strain sensors.

    PubMed

    Chuang, Wan-Chun; Lin, Hwai-Ting; Chen, Wei-Long

    2015-02-09

    This research aims at developing a flexible strain module applied to the strengthening of back muscles. Silver films were sputtered onto flexible substrates to produce a flexible sensor. Assuming that back muscle elongation is positively correlated with variations in skin surface length, real-time resistance changes exhibited by the sensor during simulated training sessions were measured. The results were used to identify the relationship between the resistance change of the sensors and skin surface stretch. In addition, muscle length changes from ultrasound images were used to determine the feasibility of a proof-of-concept sensor. Furthermore, this module is capable of detecting large muscle contractions, some of which may be undesirable for the prescribed training strategy. Therefore, the developed module can facilitate real-time assessments of the movement accuracy of users during training, and the results are instantly displayed on a screen. People using the developed training system can immediately adjust their posture to the appropriate position. Thus, the training mechanism can be constructed to help users improve the efficiency of back muscle strengthening.

  6. Blur spot limitations in distal endoscope sensors

    NASA Astrophysics Data System (ADS)

    Yaron, Avi; Shechterman, Mark; Horesh, Nadav

    2006-02-01

    In years past, the picture quality of electronic video systems was limited by the image sensor. At present, the resolution of miniature image sensors, as in medical endoscopy, is typically superior to the resolution of the optical system. This "excess resolution" is utilized by Visionsense to create stereoscopic vision. Visionsense has developed a single-chip stereoscopic camera that multiplexes the horizontal dimension of the image sensor into two (left and right) images, compensates for blur, and provides additional depth resolution without sacrificing planar resolution. The camera is based on a dual-pupil imaging objective and an image sensor coated with an array of microlenses (a plenoptic camera). The camera has the advantage of being compact, providing simultaneous acquisition of left and right images, and offering resolution comparable to a dual-chip stereoscopic camera with low- to medium-resolution imaging lenses. A stereoscopic vision system provides an improved 3-dimensional perspective of intra-operative sites that is crucial for advanced minimally invasive surgery and contributes to surgeon performance. An additional advantage of single-chip stereo sensors is improved tolerance to electronic signal noise.

  7. Compensation method for the influence of angle of view on animal temperature measurement using thermal imaging camera combined with depth image.

    PubMed

    Jiao, Leizi; Dong, Daming; Zhao, Xiande; Han, Pengcheng

    2016-12-01

    In this study, we propose an animal surface-temperature measurement method based on a Kinect sensor and an infrared thermal imager to facilitate the screening of animals with febrile diseases. Due to the random motion and small surface-temperature variation of animals, the influence of the angle of view on temperature measurement is significant. The method proposed in the present study compensates for the temperature measurement error caused by the angle of view. First, we analyzed the relationship between measured temperature and angle of view and established a mathematical model for compensating for the influence of the angle of view, with a correlation coefficient above 0.99. Second, a fusion method for depth and infrared thermal images was established for synchronous image capture with the Kinect sensor and infrared thermal imager, and the angle of view of each pixel was calculated. According to the experimental results, without compensation, the temperature image measured at an angle of view of 74° to 76° differed by more than 2°C from that measured at an angle of view of 0°. After compensation, the temperature difference range was only 0.03-1.2°C. This method is applicable for real-time compensation of errors caused by the angle of view during temperature measurement with an infrared thermal imager. Copyright © 2016 Elsevier Ltd. All rights reserved.
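    The abstract does not give the paper's functional form for the angle-of-view model, but the calibrate-then-correct workflow it describes can be sketched as follows (the quadratic model and all calibration numbers are assumptions):

```python
import numpy as np

def fit_compensation(theta_deg, t_meas, t_true):
    """Fit a quadratic correction factor f(theta) = T_meas / T_true
    from calibration data taken at known viewing angles."""
    return np.polyfit(theta_deg, t_meas / t_true, 2)

def compensate(t_meas, theta_deg, coeffs):
    """Recover the 0-degree-equivalent temperature per pixel."""
    return t_meas / np.polyval(coeffs, theta_deg)

# Synthetic calibration: a target at 38.0 degC whose apparent temperature
# drops by about 2 degC at the largest viewing angles (assumed falloff).
theta = np.linspace(0.0, 80.0, 20)
t_true = np.full_like(theta, 38.0)
t_meas = 38.0 * (1.0 - 9e-6 * theta ** 2)
coeffs = fit_compensation(theta, t_meas, t_true)
corrected = compensate(t_meas, theta, coeffs)
```

    In the paper's pipeline, the per-pixel angle would come from the fused Kinect depth image rather than being known a priori.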

  8. Multi-mode Observations of Cloud-to-Ground Lightning Strokes

    NASA Astrophysics Data System (ADS)

    Smith, M. W.; Smith, B. J.; Clemenson, M. D.; Zollweg, J. D.

    2015-12-01

    We present hyper-temporal and hyper-spectral data collected using a suite of three Phantom high-speed cameras configured to observe cloud-to-ground lightning strokes. The first camera functioned as a contextual imager to show the location and structure of the strokes. The other two cameras were operated as slit-less spectrometers, with resolutions of 0.2 to 1.0 nm. The imaging camera was operated at a readout rate of 48,000 frames per second and provided an image-based trigger mechanism for the spectrometers. Each spectrometer operated at a readout rate of 400,000 frames per second. The sensors were deployed on the southern edge of Albuquerque, New Mexico, and collected data over a 4-week period during the thunderstorm season in the summer of 2015. Strikes observed by the sensor suite were correlated to specific strikes recorded by the National Lightning Data Network (NLDN) and thereby geo-located. Sensor calibration factors, distance to each strike, and calculated values of atmospheric transmission were used to estimate absolute radiometric intensities for the spectral-temporal data. The data that we present show the intensity and time evolution of broadband and line emission features for both leader and return strokes. We highlight several key features and overall statistics of the observations. A companion poster describes a lightning model that is being developed at Sandia National Laboratories.

  9. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  10. An Approach to the Use of Depth Cameras for Weed Volume Estimation

    PubMed Central

    Andújar, Dionisio; Dorado, José; Fernández-Quintanilla, César; Ribeiro, Angela

    2016-01-01

    The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for the plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor can capture the details of plants. The use of a dual methodology using height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency among the 3D depth images and soil measurements obtained from the actual structural parameters. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes by the construction of a new system integrating these sensors and the development of algorithms to properly process the information provided by them. PMID:27347972

  11. An Approach to the Use of Depth Cameras for Weed Volume Estimation.

    PubMed

    Andújar, Dionisio; Dorado, José; Fernández-Quintanilla, César; Ribeiro, Angela

    2016-06-25

    The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for the plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor can capture the details of plants. The use of a dual methodology using height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency among the 3D depth images and soil measurements obtained from the actual structural parameters. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes by the construction of a new system integrating these sensors and the development of algorithms to properly process the information provided by them.

  12. A design of driving circuit for star sensor imaging camera

    NASA Astrophysics Data System (ADS)

    Li, Da-wei; Yang, Xiao-xu; Han, Jun-feng; Liu, Zhao-hui

    2016-01-01

    The star sensor is a high-precision attitude-sensitive measuring instrument that determines spacecraft attitude by detecting star positions on the celestial sphere. The imaging camera is an important part of a star sensor. The purpose of this study is to design a driving circuit based on a Kodak CCD sensor. The design of a driving circuit based on the Kodak KAI-04022 is discussed, and the timing of this CCD sensor is analyzed. Laboratory tests of the driving circuit and imaging experiments show that the driving circuit meets the requirements of the Kodak CCD sensor.

  13. Configuration and Specifications of AN Unmanned Aerial Vehicle for Precision Agriculture

    NASA Astrophysics Data System (ADS)

    Erena, M.; Montesinos, S.; Portillo, D.; Alvarez, J.; Marin, C.; Fernandez, L.; Henarejos, J. M.; Ruiz, L. A.

    2016-06-01

    Unmanned Aerial Vehicles (UAVs) with multispectral sensors are increasingly attractive in geosciences for data capture and map updating at high spatial and temporal resolutions. These autonomously-flying systems can be equipped with different sensors, such as a six-band multispectral camera (Tetracam mini-MCA-6), a Ublox M8N GPS, MEMS gyroscopes, and miniaturized sensor systems for navigation, positioning, and mapping purposes. These systems can be used for data collection in precision viticulture. In this study, the efficiency of a light UAV system for data collection, processing, and map updating in small areas is evaluated, generating correlations between classification maps derived from remote sensing and production maps. Based on the comparison of the indices derived from UAVs incorporating infrared sensors with those obtained by satellites (Sentinel 2A and Landsat 8), UAVs show promise for the characterization of vineyard plots with high spatial variability, despite the low vegetative coverage of these crops. Consequently, a procedure for zoning map production based on UAV images could provide important information for farmers.

  14. Multispectral image-fused head-tracked vision system (HTVS) for driving applications

    NASA Astrophysics Data System (ADS)

    Reese, Colin E.; Bender, Edward J.

    2001-08-01

    Current military thermal driver vision systems consist of a single Long Wave Infrared (LWIR) sensor mounted on a manually operated gimbal, which is normally locked forward during driving. The sensor video imagery is presented on a large area flat panel display for direct view. The Night Vision and Electronics Sensors Directorate and Kaiser Electronics are cooperatively working to develop a driver's Head Tracked Vision System (HTVS) which directs dual waveband sensors in a more natural head-slewed imaging mode. The HTVS consists of LWIR and image intensified sensors, a high-speed gimbal, a head mounted display, and a head tracker. The first prototype systems have been delivered and have undergone preliminary field trials to characterize the operational benefits of a head tracked sensor system for tactical military ground applications. This investigation will address the advantages of head tracked vs. fixed sensor systems regarding peripheral sightings of threats, road hazards, and nearby vehicles. An additional thrust will investigate the degree to which additive (A+B) fusion of LWIR and image intensified sensors enhances overall driving performance. Typically, LWIR sensors are better for detecting threats, while image intensified sensors provide more natural scene cues, such as shadows and texture. This investigation will examine the degree to which the fusion of these two sensors enhances the driver's overall situational awareness.

  15. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    NASA Astrophysics Data System (ADS)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used for active-learning purposes: they help students understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On the one hand, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models obtained from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other hand, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model may provide a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software adopted for the photogrammetric reconstruction has been based on low-cost and open-source solutions.
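    The DIC point-tracking step rests on template matching by normalized cross-correlation; a brute-force sketch (the subset size and the synthetic shift are illustrative, not the facility's actual parameters):

```python
import numpy as np

def ncc_track(template, search):
    """Locate `template` inside `search` by exhaustive normalized
    cross-correlation, the matching criterion at the core of DIC."""
    th, tw = template.shape
    sh, sw = search.shape
    tz = template - template.mean()
    tnorm = np.sqrt((tz ** 2).sum())
    best_score, best_pos = -2.0, (0, 0)
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            win = search[r:r + th, c:c + tw]
            wz = win - win.mean()
            denom = tnorm * np.sqrt((wz ** 2).sum())
            if denom == 0.0:
                continue
            score = float((tz * wz).sum()) / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Two synthetic frames: the second is the first shifted by (3, 2) pixels.
rng = np.random.default_rng(0)
frame0 = rng.random((40, 40))
frame1 = np.roll(np.roll(frame0, 3, axis=0), 2, axis=1)
subset = frame0[10:20, 10:20]      # 10x10 subset around a surface point
pos, score = ncc_track(subset, frame1)
# displacement of the tracked point = (pos[0] - 10, pos[1] - 10)
```

    Production DIC codes refine this integer-pixel result to sub-pixel accuracy, which is omitted here.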

  16. Microlens array processor with programmable weight mask and direct optical input

    NASA Astrophysics Data System (ADS)

    Schmid, Volker R.; Lueder, Ernst H.; Bader, Gerhard; Maier, Gert; Siegordner, Jochen

    1999-03-01

    We present an optical feature extraction system with a microlens array processor. The system is suitable for online implementation of a variety of transforms such as the Walsh transform and DCT. Operating with incoherent light, our processor accepts direct optical input. Employing a sandwich-like architecture, we obtain a very compact design of the optical system. The key elements of the microlens array processor are a square array of 15 × 15 spherical microlenses on an acrylic substrate and a spatial light modulator as a transmissive mask. The light distribution behind the mask is imaged onto the pixels of a customized a-Si image sensor with adjustable gain. We obtain one output sample for each microlens image and its corresponding weight mask area as a summation of the transmitted intensity within one sensor pixel. The resulting architecture is very compact and robust like a conventional camera lens while incorporating a high degree of parallelism. We successfully demonstrate a Walsh transform into the spatial frequency domain as well as the implementation of a discrete cosine transform with digitized gray values. We provide results showing the transformation performance for both synthetic image patterns and images of natural texture samples. The extracted frequency features are suitable for neural classification of the input image. Other transforms and correlations can be implemented in real time, allowing adaptive optical signal processing.
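    Each output sample of the processor is the summation of one microlens image weighted by its mask area; for a ±1 Walsh mask this is equivalent, up to scaling, to the digital transform below (array size and normalization are assumed for illustration):

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of the n x n +/-1 Hadamard (Walsh) matrix."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def walsh2d(img):
    """2-D Walsh transform: each coefficient is the sum of the image
    weighted by one +/-1 basis pattern, as a weight mask realizes."""
    n = img.shape[0]
    H = hadamard(n)
    return H @ img @ H.T / n

img = np.outer(np.arange(8.0), np.ones(8))   # simple 8x8 test pattern
coeffs = walsh2d(img)
```

    With the 1/n normalization the transform is orthogonal up to scale, so signal energy is preserved between the image and its Walsh coefficients.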

  17. Leica ADS40 Sensor for Coastal Multispectral Imaging

    NASA Technical Reports Server (NTRS)

    Craig, John C.

    2007-01-01

    The Leica ADS40 Sensor as it is used for coastal multispectral imaging is presented. The contents include: 1) Project Area Overview; 2) Leica ADS40 Sensor; 3) Focal Plate Arrangements; 4) Trichroid Filter; 5) Gradient Correction; 6) Image Acquisition; 7) Remote Sensing and ADS40; 8) Band comparisons of Satellite and Airborne Sensors; 9) Impervious Surface Extraction; and 10) Impervious Surface Details.

  18. Establishing imaging sensor specifications for digital still cameras

    NASA Astrophysics Data System (ADS)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras (DSCs) have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a Full Frame CCD, an Interline CCD, a CMOS sensor, or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude, and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range, and exposure latitude based on the physical nature of the imaging optics and sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.

  19. Design and implementation of non-linear image processing functions for CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Musa, Purnawarman; Sudiro, Sunny A.; Wibowo, Eri P.; Harmanto, Suryadi; Paindavoine, Michel

    2012-11-01

    Today, solid-state image sensors are used in many applications such as mobile phones, video surveillance systems, embedded medical imaging, and industrial vision systems. These image sensors require the integration in the focal plane (or near the focal plane) of complex image processing algorithms. Such devices must meet the constraints related to the quality of acquired images, speed and performance of embedded processing, as well as low power consumption. To achieve these objectives, low-level analog processing allows extracting the useful information in the scene directly. For example, an edge detection step followed by a local maxima extraction will facilitate high-level processing like object pattern recognition in a visual scene. Our goal was to design an intelligent image sensor prototype achieving high-speed image acquisition and non-linear image processing (like local minima and maxima calculations). For this purpose, we present in this article the design and test of a 64×64 pixel image sensor built in a standard 0.35 μm CMOS technology including non-linear image processing. The architecture of our sensor, named nLiRIC (non-Linear Rapid Image Capture), is based on the implementation of an analog Minima/Maxima Unit (MMU). This MMU calculates the minimum and maximum values (non-linear functions), in real time, in a 2×2 pixel neighbourhood. Each MMU needs 52 transistors, and the pitch of one pixel is 40×40 μm. The total area of the 64×64 pixel array is 12.5 mm². Our tests have shown the validity of the main functions of our new image sensor, like fast image acquisition (10K frames per second) and minima/maxima calculations in less than 1 ms.
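    The MMU's 2×2 minima/maxima operation can be expressed digitally as follows (a behavioral sketch of the function only, not the 52-transistor analog circuit):

```python
import numpy as np

def minmax_2x2(img):
    """Minimum and maximum over every sliding 2x2 neighbourhood —
    the non-linear function each analog MMU evaluates in real time."""
    stack = np.stack([img[:-1, :-1], img[:-1, 1:],
                      img[1:, :-1], img[1:, 1:]])
    return stack.min(axis=0), stack.max(axis=0)

img = np.array([[1.0, 2.0, 3.0],
                [4.0, 5.0, 6.0],
                [7.0, 8.0, 9.0]])
mn, mx = minmax_2x2(img)
# mn = [[1, 2], [4, 5]], mx = [[5, 6], [8, 9]]
```

    Min and max over small neighbourhoods are the building blocks of morphological erosion and dilation, which is what makes this primitive useful for later high-level processing.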

  20. Wave analysis of a plenoptic system and its applications

    NASA Astrophysics Data System (ADS)

    Shroff, Sapna A.; Berkner, Kathrin

    2013-03-01

    Traditional imaging systems directly image a 2D object plane on to the sensor. Plenoptic imaging systems contain a lenslet array at the conventional image plane and a sensor at the back focal plane of the lenslet array. In this configuration the data captured at the sensor is not a direct image of the object. Each lenslet effectively images the aperture of the main imaging lens at the sensor. Therefore the sensor data retains angular light-field information which can be used for a posteriori digital computation of multi-angle images and axially refocused images. If a filter array, containing spectral filters or neutral density or polarization filters, is placed at the pupil aperture of the main imaging lens, then each lenslet images the filters on to the sensor. This enables the digital separation of multiple filter modalities giving single snapshot, multi-modal images. Due to the diversity of potential applications of plenoptic systems, their investigation is increasing. As the application space moves towards microscopes and other complex systems, and as pixel sizes become smaller, the consideration of diffraction effects in these systems becomes increasingly important. We discuss a plenoptic system and its wave propagation analysis for both coherent and incoherent imaging. We simulate a system response using our analysis and discuss various applications of the system response pertaining to plenoptic system design, implementation and calibration.

  1. Detection of Obstacles in Monocular Image Sequences

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia

    1997-01-01

    The ability to detect and locate runways/taxiways and obstacles in images captured using on-board sensors is an essential first step in the automation of low-altitude flight, landing, takeoff, and taxiing phase of aircraft navigation. Automation of these functions under different weather and lighting situations, can be facilitated by using sensors of different modalities. An aircraft-based Synthetic Vision System (SVS), with sensors of different modalities mounted on-board, complements the current ground-based systems in functions such as detection and prevention of potential runway collisions, airport surface navigation, and landing and takeoff in all weather conditions. In this report, we address the problem of detection of objects in monocular image sequences obtained from two types of sensors, a Passive Millimeter Wave (PMMW) sensor and a video camera mounted on-board a landing aircraft. Since the sensors differ in their spatial resolution, and the quality of the images obtained using these sensors is not the same, different approaches are used for detecting obstacles depending on the sensor type. These approaches are described separately in two parts of this report. The goal of the first part of the report is to develop a method for detecting runways/taxiways and objects on the runway in a sequence of images obtained from a moving PMMW sensor. Since the sensor resolution is low and the image quality is very poor, we propose a model-based approach for detecting runways/taxiways. We use the approximate runway model and the position information of the camera provided by the Global Positioning System (GPS) to define regions of interest in the image plane to search for the image features corresponding to the runway markers. Once the runway region is identified, we use histogram-based thresholding to detect obstacles on the runway and regions outside the runway. This algorithm is tested using image sequences simulated from a single real PMMW image.

  2. Linear high-boost fusion of Stokes vector imagery for effective discrimination and recognition of real targets in the presence of multiple identical decoys

    NASA Astrophysics Data System (ADS)

    El-Saba, Aed; Sakla, Wesam A.

    2010-04-01

    Recently, the use of imaging polarimetry has received considerable attention for use in automatic target recognition (ATR) applications. In military remote sensing applications, there is a great demand for sensors that are capable of discriminating between real targets and decoys. Accurate discrimination of decoys from real targets is a challenging task and often requires the fusion of various sensor modalities that operate simultaneously. In this paper, we use a simple linear fusion technique known as high-boost fusion (HBF) for effective discrimination of real targets in the presence of multiple decoys. The HBF assigns more weight to the polarization-based imagery in forming the final fused image that is used for detection. We have captured both intensity and polarization-based imagery from an experimental laboratory arrangement containing a mixture of sand/dirt, rocks, vegetation, and other objects for the purpose of simulating scenery that would be acquired in a remote sensing military application. A target object and three decoys that are identical in physical appearance (shape, surface structure, and color) but different in material composition have also been placed in the scene. We use the wavelet-filter joint transform correlation (WFJTC) technique to perform detection between input scenery and the target object. Our results show that use of the HBF method increases the correlation performance metrics associated with WFJTC-based detection when compared to using either the traditional intensity or polarization-based images alone.
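    In essence, HBF forms a weighted linear combination that boosts the polarization channel before detection; a minimal sketch (the weight k and the rescaling are illustrative assumptions, not the paper's values):

```python
import numpy as np

def high_boost_fusion(intensity, polarization, k=2.0):
    """Linear high-boost fusion: F = I + k * P, with the polarization
    channel weighted more heavily, then rescaled to [0, 1]."""
    fused = intensity + k * polarization
    fused = fused - fused.min()
    if fused.max() > 0:
        fused = fused / fused.max()
    return fused

rng = np.random.default_rng(2)
i_img = rng.random((64, 64))    # stand-in intensity image
p_img = rng.random((64, 64))    # stand-in polarization (e.g., DoLP) image
fused = high_boost_fusion(i_img, p_img)
```

    The fused image, not the individual channels, would then be fed to the WFJTC correlator.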

  3. Operational calibration and validation of landsat data continuity mission (LDCM) sensors using the image assessment system (IAS)

    USGS Publications Warehouse

    Micijevic, Esad; Morfitt, Ron

    2010-01-01

    Systematic characterization and calibration of the Landsat sensors and the assessment of image data quality are performed using the Image Assessment System (IAS). The IAS was first introduced as an element of the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) ground segment and recently extended to the Landsat 4 (L4) and 5 (L5) Thematic Mappers (TM) and the Multispectral Scanners (MSS) on board the Landsat 1-5 satellites. In preparation for the Landsat Data Continuity Mission (LDCM), the IAS was developed for the Earth Observing-1 (EO-1) Advanced Land Imager (ALI) with a capability to assess pushbroom sensors. This paper describes the LDCM version of the IAS and how it relates to unique calibration and validation attributes of its on-board imaging sensors. The LDCM IAS system will have to handle a significantly larger number of detectors, and an associated database, than the previous IAS versions. An additional challenge is that the LDCM IAS must handle data from two sensors, as LDCM products will combine the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) spectral bands.

  4. Onboard Image Processing System for Hyperspectral Sensor

    PubMed Central

    Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

    2015-01-01

    Onboard image processing systems for a hyperspectral sensor have been developed in order to maximize image data transmission efficiency for large volume and high speed data downlink capacity. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression capability is essential for reducing the size and weight of a sensor system. A fast lossless image compression algorithm has been developed, and is implemented in the onboard correction circuitry of sensitivity and linearity of Complementary Metal Oxide Semiconductor (CMOS) sensors in order to maximize the compression ratio. The employed image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), which is a hierarchical predictive coding method with resolution scaling. To improve FELICS's performance in image decorrelation and entropy coding, we apply a two-dimensional interpolation prediction and adaptive Golomb-Rice coding. It supports progressive decompression using resolution scaling while still maintaining superior performance measured as speed and complexity. Coding efficiency and compression speed enlarge the effective capacity of signal transmission channels, which leads to reduced onboard hardware by multiplexing sensor signals into a reduced number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost. PMID:26404281
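    The Golomb-Rice stage codes each mapped, non-negative prediction residual as a unary quotient plus k binary remainder bits; a minimal sketch (the 2-D interpolation predictor and the adaptation of k are omitted):

```python
def golomb_rice_encode(value, k):
    """Code a non-negative integer as a unary quotient (q ones and a
    terminating zero) followed by the k low-order remainder bits."""
    q = value >> k
    bits = "1" * q + "0"
    if k:
        bits += format(value & ((1 << k) - 1), "0{}b".format(k))
    return bits

def golomb_rice_decode(bits, k):
    """Invert golomb_rice_encode for a single codeword."""
    q = 0
    while bits[q] == "1":
        q += 1
    r = int(bits[q + 1:q + 1 + k], 2) if k else 0
    return (q << k) | r

code = golomb_rice_encode(19, 2)   # quotient 4 -> "11110", remainder 3 -> "11"
value = golomb_rice_decode(code, 2)
```

    Small residuals from a good predictor yield short codewords, which is where the compression gain comes from; the hardware implementation emits the same bit pattern without string handling.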

  5. A time-resolved image sensor for tubeless streak cameras

    NASA Astrophysics Data System (ADS)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, the device requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11 μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  6. First validation of satellite microwave liquid water path with ship-based observations in marine low clouds

    NASA Astrophysics Data System (ADS)

    Painemal, D.; Cadeddu, M. P.; Greenwald, T. J.; Minnis, P.

    2015-12-01

    We present the first validation study of satellite microwave liquid water path, from four operational sensors, against in-situ observations from a ship-borne three-channel microwave radiometer collected over the northeast Pacific during May-August of 2013, along a ship transect 40° long (33.7°N, 118.2°W to 21.3°N, 157.8°W). The satellite sensors analyzed here are: the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), the Special Sensor Microwave Imager/Sounder (SSMIS) on the Defense Meteorological Satellite Program F16 and F17 satellites, and the Advanced Microwave Scanning Radiometer (AMSR-2) on board the Global Change Observation Mission - Water (GCOM-W1). Satellite retrievals show an overall correlation with hourly-averaged in-situ observations of 0.86 and a positive bias of 10.0 g m-2, which decreases to 1.0 g m-2, with the correlation increasing to 0.91, when selecting overcast scenes. The satellite bias for broken scenes remains below 22.2 g m-2, although the removal of clear-sky in-situ samples yields an unbiased relationship. Satellites produce a diurnal cycle with amplitudes (35-47 g m-2) consistent with ship-based observations. Longitudinal biases remain below 17.4 g m-2, and they are negligible in overcast scenes and when clear-sky samples are removed from the in-situ hourly average. Our study indicates that satellite microwave retrievals are a reliable dataset for climate studies in marine warm low clouds. The implications for satellite visible/infrared retrievals will also be discussed.

  7. A Low-Power Wireless Image Sensor Node with Noise-Robust Moving Object Detection and a Region-of-Interest Based Rate Controller

    DTIC Science & Technology

    2017-03-01

    A Low-Power Wireless Image Sensor Node with Noise-Robust Moving Object Detection and a Region-of-Interest Based Rate Controller. Jong Hwan Ko ... Atlanta, GA 30332 USA. Contact author email: jonghwan.ko@gatech.edu. Abstract: This paper presents a low-power wireless image sensor node with a noise-robust moving object detection and a region-of-interest based rate controller [Fig. 1].

  8. Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications.

    PubMed

    Chen, Chia-Wei; Chow, Chi-Wai; Liu, Yang; Yeh, Chien-Hung

    2017-10-02

    Recently, even low-end mobile phones have been equipped with a high-resolution complementary-metal-oxide-semiconductor (CMOS) image sensor. This motivates using a CMOS image sensor for visible light communication (VLC). Here we propose and demonstrate an efficient demodulation scheme to synchronize and demodulate the rolling shutter pattern in image sensor based VLC. The implementation algorithm is discussed. The bit-error-rate (BER) performance and processing latency are evaluated and compared with those of other thresholding schemes.

  9. An Adaptive Cross-Correlation Algorithm for Extended-Scene Shack-Hartmann Wavefront Sensing

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin; Green, Joseph J.; Ohara, Catherine M.; Redding, David C.

    2007-01-01

    This viewgraph presentation reviews the Adaptive Cross-Correlation (ACC) algorithm for extended-scene Shack-Hartmann wavefront (WF) sensing. A Shack-Hartmann sensor places a lenslet array at a plane conjugate to the WF error source. Each sub-aperture lenslet samples the WF in the corresponding patch of the WF. A description of the ACC algorithm is included. The ACC algorithm has several benefits, among them: it requires only about 4 image-shifting iterations to achieve 0.01-pixel accuracy, and it is insensitive to both background light and noise, making it much more robust than centroiding.
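
The iterative image-shifting idea can be sketched as follows. The Fourier-domain counter-shift and the 3-point parabolic peak fit are illustrative choices (the function names are my own), not the exact ACC implementation:

```python
import numpy as np

def _peak_offset(c):
    """Integer peak of a circular correlation profile, refined by a
    3-point parabolic fit; returns a signed (possibly wrapped) offset."""
    n = c.size
    p = int(np.argmax(c))
    cm, c0, cp = c[(p - 1) % n], c[p], c[(p + 1) % n]
    denom = cm - 2 * c0 + cp
    frac = 0.5 * (cm - cp) / denom if denom else 0.0
    off = p + frac
    return off - n if off > n / 2 else off

def estimate_shift(ref, img, n_iter=4):
    """Estimate the (row, col) shift of `img` relative to `ref`:
    correlate, locate the peak, counter-shift the image in the Fourier
    domain by the running estimate, and repeat so the residual shift
    shrinks toward sub-pixel accuracy."""
    ky = np.fft.fftfreq(ref.shape[0])[:, None]
    kx = np.fft.fftfreq(ref.shape[1])[None, :]
    F_ref, F_img = np.fft.fft2(ref), np.fft.fft2(img)
    total = np.zeros(2)
    for _ in range(n_iter):
        # counter-shift the image by the current estimate, then correlate
        F_shift = F_img * np.exp(2j * np.pi * (ky * total[0] + kx * total[1]))
        corr = np.fft.ifft2(F_shift * np.conj(F_ref)).real
        # refine the estimate along each axis through the current peak
        py, px = np.unravel_index(np.argmax(corr), corr.shape)
        total[0] += _peak_offset(corr[:, px])
        total[1] += _peak_offset(corr[py, :])
    return total
```

For a pure integer circular shift the first iteration already lands on the peak; subsequent iterations refine the fractional part.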

  10. Model based approach to UXO imaging using the time domain electromagnetic method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lavely, E.M.

    1999-04-01

    Time domain electromagnetic (TDEM) sensors have emerged as a field-worthy technology for UXO detection in a variety of geological and environmental settings. This success has been achieved with commercial equipment that was not optimized for UXO detection and discrimination. The TDEM response displays a rich spatial and temporal behavior which is not currently utilized. Therefore, in this paper the author describes a research program for enhancing the effectiveness of the TDEM method for UXO detection and imaging. Fundamental research is required in at least three major areas: (a) model-based imaging capability, i.e., the forward and inverse problems; (b) detector modeling and instrument design; and (c) target recognition and discrimination algorithms. These research problems are coupled and demand a unified treatment. For example: (1) the inverse solution depends on solution of the forward problem and knowledge of the instrument response; (2) instrument design with improved diagnostic power requires forward and inverse modeling capability; and (3) improved target recognition algorithms (such as neural nets) must be trained with data collected from the new instrument and with synthetic data computed using the forward model. Further, the design of the appropriate input and output layers of the net will be informed by the results of the forward and inverse modeling. A more fully developed model of the TDEM response would enable the joint inversion of data collected from multiple sensors (e.g., TDEM sensors and magnetometers). Finally, the author suggests that a complementary approach to joint inversions is the statistical recombination of data using principal component analysis. The decomposition into principal components is useful since the first principal component contains those features that are most strongly correlated from image to image.

  11. Mathematical models and photogrammetric exploitation of image sensing

    NASA Astrophysics Data System (ADS)

    Puatanachokchai, Chokchai

    Mathematical models of image sensing are generally categorized into physical/geometrical sensor models and replacement sensor models. While the former is determined from image sensing geometry, the latter is based on knowledge of the physical/geometric sensor models and on using such models for its implementation. The main thrust of this research is in replacement sensor models which have three important characteristics: (1) Highly accurate ground-to-image functions; (2) Rigorous error propagation that is essentially of the same accuracy as the physical model; and, (3) Adjustability, or the ability to upgrade the replacement sensor model parameters when additional control information becomes available after the replacement sensor model has replaced the physical model. In this research, such replacement sensor models are considered as True Replacement Models or TRMs. TRMs provide a significant advantage of universality, particularly for image exploitation functions. There have been several writings on replacement sensor models, and except for the so-called RSM (Replacement Sensor Model, a product described in the Manual of Photogrammetry), almost all of them pay little or no attention to errors and their propagation. This is suspected to be because the few physical sensor parameters are usually replaced by many more parameters, presenting a potential error-estimation difficulty. The third characteristic, adjustability, is perhaps the most demanding. It provides an equivalent flexibility to that of triangulation using the physical model. Primary contributions of this thesis include not only "the eigen-approach", a novel means of replacing the original sensor parameter covariance matrices at the time of estimating the TRM, but also the implementation of the hybrid approach that combines the eigen-approach with the added parameters approach used in the RSM.
Using either the eigen-approach or the hybrid approach, rigorous error propagation can be performed during image exploitation. Further, adjustability can be performed when additional control information becomes available after the TRM has been implemented. The TRM is shown to apply to imagery from sensors having different geometries, including an aerial frame camera, a spaceborne linear array sensor, an airborne pushbroom sensor, and an airborne whiskbroom sensor. TRM results show essentially negligible differences as compared to those from rigorous physical sensor models, both for geopositioning from single and overlapping images. Simulated as well as real image data are used to address all three characteristics of the TRM.

  12. Non-contact respiration monitoring for in-vivo murine micro computed tomography: characterization and imaging applications

    NASA Astrophysics Data System (ADS)

    Burk, Laurel M.; Lee, Yueh Z.; Wait, J. Matthew; Lu, Jianping; Zhou, Otto Z.

    2012-09-01

    A cone beam micro-CT has previously been utilized along with a pressure-tracking respiration sensor to acquire prospectively gated images of both wild-type mice and various adult murine disease models. While the pressure applied to the abdomen of the subject by this sensor is small and is generally without physiological effect, certain disease models of interest, as well as very young animals, are prone to atelectasis with added pressure, or they generate too weak a respiration signal with this method to achieve optimal prospective gating. In this work we present a new fibre-optic displacement sensor which monitors respiratory motion of a subject without requiring physical contact. The sensor outputs an analogue signal which can be used for prospective respiration gating in micro-CT imaging. The device was characterized and compared against a pneumatic air chamber pressure sensor for the imaging of adult wild-type mice. The resulting images were found to be of similar quality with respect to physiological motion blur; the quality of the respiration signal trace obtained using the non-contact sensor was comparable to that of the pressure sensor and was superior for gating purposes due to its better signal-to-noise ratio. The non-contact sensor was then used to acquire in-vivo micro-CT images of a murine model for congenital diaphragmatic hernia and of 11-day-old mouse pups. In both cases, quality CT images were successfully acquired using this new respiration sensor. Despite the presence of beam hardening artefacts arising from the presence of a fibre-optic cable in the imaging field, we believe this new technique for respiration monitoring and gating presents an opportunity for in-vivo imaging of disease models which were previously considered too delicate for established animal handling methods.

  13. Estimation of Image Sensor Fill Factor Using a Single Arbitrary Image

    PubMed Central

    Wen, Wei; Khatibi, Siamak

    2017-01-01

    Achieving a high fill factor is a bottleneck problem for capturing high-quality images. There are hardware and software solutions to overcome this problem. In these solutions, the fill factor is known. However, the fill factor is kept an industrial secret by most image sensor manufacturers due to its direct effect on the assessment of sensor quality. In this paper, we propose a method to estimate the fill factor of a camera sensor from an arbitrary single image. The virtual response function of the imaging process and the sensor irradiance are estimated from the generation of virtual images. Then the global intensity values of the virtual images are obtained, which are the result of fusing the virtual images into a single, high dynamic range radiance map. A non-linear function is inferred from the original and global intensity values of the virtual images. The fill factor is estimated by the conditional minimum of the inferred function. The method is verified using images of two datasets. The results show that our method estimates the fill factor correctly, with significant stability and accuracy, from one single arbitrary image, as indicated by the low standard deviation of the fill factors estimated from each image and for each camera. PMID:28335459

  14. A review of supervised object-based land-cover image classification

    NASA Astrophysics Data System (ADS)

    Ma, Lei; Li, Manchun; Ma, Xiaoxue; Cheng, Liang; Du, Peijun; Liu, Yongxue

    2017-08-01

    Object-based image classification for land-cover mapping purposes using remote-sensing imagery has attracted significant attention in recent years. Numerous studies conducted over the past decade have investigated a broad array of sensors, feature selection, classifiers, and other factors of interest. However, these research results have not yet been synthesized to provide coherent guidance on the effect of different supervised object-based land-cover classification processes. In this study, we first construct a database with 28 fields using qualitative and quantitative information extracted from 254 experimental cases described in 173 scientific papers. Second, the results of the meta-analysis are reported, including general characteristics of the studies (e.g., the geographic range of relevant institutes, preferred journals) and the relationships between factors of interest (e.g., spatial resolution and study area or optimal segmentation scale, accuracy and number of targeted classes), especially with respect to the classification accuracy of different sensors, segmentation scale, training set size, supervised classifiers, and land-cover types. Third, useful data on supervised object-based image classification are determined from the meta-analysis. For example, we find that supervised object-based classification is currently experiencing rapid advances, while development of the fuzzy technique is limited in the object-based framework. Furthermore, spatial resolution correlates with the optimal segmentation scale and study area, and Random Forest (RF) shows the best performance in object-based classification. The area-based accuracy assessment method can obtain stable classification performance, and indicates a strong correlation between accuracy and training set size, while the accuracy of the point-based method is likely to be unstable due to mixed objects. 
In addition, the overall accuracy benefits from higher spatial resolution images (e.g., unmanned aerial vehicle) or agricultural sites where it also correlates with the number of targeted classes. More than 95.6% of studies involve an area less than 300 ha, and the spatial resolution of images is predominantly between 0 and 2 m. Furthermore, we identify some methods that may advance supervised object-based image classification. For example, deep learning and type-2 fuzzy techniques may further improve classification accuracy. Lastly, scientists are strongly encouraged to report results of uncertainty studies to further explore the effects of varied factors on supervised object-based image classification.

  15. A 3D image sensor with adaptable charge subtraction scheme for background light suppression

    NASA Astrophysics Data System (ADS)

    Shin, Jungsoon; Kang, Byongmin; Lee, Keechang; Kim, James D. K.

    2013-02-01

    We present a 3D ToF (Time-of-Flight) image sensor with an adaptive charge subtraction scheme for background light suppression. The proposed sensor can alternately capture a high resolution color image and a high quality depth map in each frame. In depth mode, the sensor requires a long enough integration time for accurate depth acquisition, but saturation will occur under strong background illumination. We propose to adaptively divide the integration time into N sub-integration times. In each sub-integration time, our sensor captures an image without saturation and subtracts the charge to prevent the pixel from saturating. The subtraction results are then accumulated over the N sub-integrations, yielding a final image free of background illumination at the full integration time. Experimental results with our own ToF sensor show high background suppression performance. We also propose an in-pixel storage and column-level subtraction circuit for chip-level implementation of the proposed method. We believe the proposed scheme will enable 3D sensors to be used in outdoor environments.
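
The sub-integration idea can be illustrated with a toy two-tap ToF pixel model. The full-well value, the rates, and the rule of subtracting the common (ambient) charge after each sub-integration are assumptions for illustration, not the paper's circuit:

```python
FULL_WELL = 1000.0  # assumed pixel full-well capacity, in electrons

def capture(sig_a, sig_b, ambient, total_time, n_sub):
    """Two-tap ToF pixel model: ambient light adds equally to both taps.
    Splitting the integration into n_sub pieces and subtracting the
    common (ambient) charge after each piece keeps both taps below full
    well, while digital accumulators preserve the full-integration
    signal difference. All rates and numbers are illustrative."""
    dt = total_time / n_sub
    acc_a = acc_b = 0.0
    for _ in range(n_sub):
        a = min((sig_a + ambient) * dt, FULL_WELL)  # tap saturates here
        b = min((sig_b + ambient) * dt, FULL_WELL)
        common = min(a, b)          # charge subtracted from both taps
        acc_a += a - common         # accumulate the residue digitally
        acc_b += b - common
    return acc_a, acc_b             # depth uses the difference acc_a - acc_b
```

With a single integration (n_sub = 1) a bright ambient saturates both taps and the tap difference collapses to zero; with enough sub-integrations the ambient-free difference is recovered.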

  16. A rondo in three flats

    NASA Astrophysics Data System (ADS)

    Hatheway, Alson E.

    2008-08-01

    "Rondo: an instrumental composition typically with a refrain recurring four times in the tonic and with three couplets in contrasting keys." --G. & C. Merriam Co. New York 1973 The composition will be played on three instruments, a cryogenic space surveillance sensor, a document scanner and an optical image correlator. The performer will take the liberty of including an introduction and a coda. After the first couplet the listeners may sing along with the performer.

  17. Image acquisition system using on sensor compressed sampling technique

    NASA Astrophysics Data System (ADS)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
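
As a sketch of how sensor-level CS measurements can be decoded, the example below encodes a sparse signal with a cheap binary mask (the kind of per-pixel modulation a simplified CS pixel could implement) and recovers it with ISTA. The sizes, sparsity level, and decoder choice are illustrative assumptions, not the paper's design:

```python
import numpy as np

def ista(Phi, y, lam=0.05, n_iter=300):
    """Recover a sparse signal from y = Phi @ x via iterative
    soft-thresholding (ISTA), one standard CS decoder; the sensor-side
    encoding only needs the cheap Phi @ x products."""
    L = np.linalg.norm(Phi, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        x = x + Phi.T @ (y - Phi @ x) / L        # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(1)
n, m, k = 128, 48, 4                             # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1, 2, k)
Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)  # binary sensing mask
y = Phi @ x_true                                 # what the pixel array would output
x_hat = ista(Phi, y)
```

The point of the on-sensor scheme is that only `m` measurements (here 48 instead of 128 samples) leave the chip; the heavier decoding runs off-sensor.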

  18. On computer vision in wireless sensor networks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Nina M.; Ko, Teresa H.

    Wireless sensor networks allow detailed sensing of otherwise unknown and inaccessible environments. While it would be beneficial to include cameras in a wireless sensor network because images are so rich in information, the power cost of transmitting an image across the wireless network can dramatically shorten the lifespan of the sensor nodes. This paper describes a new paradigm for the incorporation of imaging into wireless networks. Rather than focusing on transmitting images across the network, we show how an image can be processed locally for key features using simple detectors. Contrasted with traditional event detection systems that trigger an image capture, this enables a new class of sensors which uses a low-power imaging sensor to detect a variety of visual cues. Sharing these features among relevant nodes cues specific actions to better provide information about the environment. We report on various existing techniques developed for traditional computer vision research which can aid in this work.

  19. MUNSELL COLOR ANALYSIS OF LANDSAT COLOR-RATIO-COMPOSITE IMAGES OF LIMONITIC AREAS IN SOUTHWEST NEW MEXICO.

    USGS Publications Warehouse

    Kruse, Fred A.

    1984-01-01

    Green areas on Landsat 4/5 - 4/6 - 6/7 (red - blue - green) color-ratio-composite (CRC) images represent limonite on the ground. Color variation on such images was analyzed to determine the causes of the color differences within and between the green areas. Digital transformation of the CRC data into the modified cylindrical Munsell color coordinates - hue, value, and saturation - was used to correlate image color characteristics with properties of surficial materials. The amount of limonite visible to the sensor is the primary cause of color differences in green areas on the CRCs. Vegetation density is a secondary cause of color variation of green areas on Landsat CRC images. Digital color analysis of Landsat CRC images can be used to map unknown areas. Color variations of green pixels allow discrimination among limonitic bedrock, nonlimonitic bedrock, nonlimonitic alluvium, and limonitic alluvium.

  20. Image-Based Environmental Monitoring Sensor Application Using an Embedded Wireless Sensor Network

    PubMed Central

    Paek, Jeongyeup; Hicks, John; Coe, Sharon; Govindan, Ramesh

    2014-01-01

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet's built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and the ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of and compression schemes for these embedded wireless imaging systems. Our three month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data for answering their biological questions. PMID:25171121

  1. Image-based environmental monitoring sensor application using an embedded wireless sensor network.

    PubMed

    Paek, Jeongyeup; Hicks, John; Coe, Sharon; Govindan, Ramesh

    2014-08-28

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet's built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and the ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of and compression schemes for these embedded wireless imaging systems. Our three month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data for answering their biological questions.

  2. Weak beacon detection for air-to-ground optical wireless link establishment.

    PubMed

    Han, Yaoqiang; Dang, Anhong; Tang, Junxiong; Guo, Hong

    2010-02-01

    In an air-to-ground free-space optical communication system, strong background interference seriously affects beacon detection, which makes it difficult to establish the optical link. In this paper, we propose a correlation beacon detection scheme for strong background interference conditions. As opposed to traditional beacon detection schemes, the beacon is modulated by an m-sequence at the transmitting terminal, and a digital differential matched filter (DDMF) array is introduced at the receiving end to detect the modulated beacon. This scheme is capable of suppressing both strong interference and noise by correlation reception of the received image sequence. In addition, the DDMF array gives each pixel of the image sensor its own DDMF of the same structure to process its received image sequence in parallel, making fast beacon detection possible. Theoretical analysis and an outdoor experiment show that the proposed scheme can realize fast and effective beacon detection under strong background interference conditions. Consequently, the required beacon transmission power can also be reduced dramatically.
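
The correlation-reception idea can be sketched per pixel: generate the m-sequence with an LFSR, first-difference the frame stack so constant background cancels, and correlate against the differenced code. The tap set, frame sizes, and scoring rule below are illustrative assumptions rather than the paper's exact DDMF design:

```python
import numpy as np

def m_sequence(taps=(7, 6), n=7):
    """Generate a length 2**n - 1 m-sequence from an n-bit Fibonacci
    LFSR; the tap pair (7, 6) is a known maximal-length choice for n=7."""
    state = [1] * n
    seq = []
    for _ in range(2 ** n - 1):
        seq.append(state[-1])
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]
        state = [fb] + state[:-1]
    return np.array(seq)

def ddmf_detect(frames, seq):
    """Per-pixel differential matched filtering: first-difference the
    (T, H, W) frame stack to cancel constant background, then correlate
    every pixel's time series with the differenced m-sequence.
    Returns an (H, W) score map; the beacon pixel scores highest."""
    d_frames = np.diff(frames, axis=0)          # (T-1, H, W)
    d_seq = np.diff(seq.astype(float))          # (T-1,)
    return np.tensordot(d_seq, d_frames, axes=(0, 0))
```

Because the differencing removes any per-pixel constant offset exactly, even very strong static interference contributes nothing to the score map.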

  3. 3D digital image correlation using single color camera pseudo-stereo system

    NASA Astrophysics Data System (ADS)

    Li, Junrui; Dan, Xizuo; Xu, Wan; Wang, Yonghong; Yang, Guobiao; Yang, Lianxiang

    2017-10-01

    Three dimensional digital image correlation (3D-DIC) has been widely used by industry to measure 3D contours and whole-field displacement/strain. In this paper, a novel single color camera 3D-DIC setup, using a reflection-based pseudo-stereo system, is proposed. Compared to the conventional single camera pseudo-stereo system, which splits the CCD sensor into two halves to capture the stereo views, the proposed system achieves both views using the whole CCD chip and without reducing the spatial resolution. In addition, as in the conventional 3D-DIC system, the center of the two views lies at the center of the CCD chip, which minimizes image distortion relative to the conventional pseudo-stereo system. The two overlapped views on the CCD are separated in the color domain, and the standard 3D-DIC algorithm can be utilized directly to perform the evaluation. The system's principle and experimental setup are described in detail, and multiple tests are performed to validate the system.
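
The correlation core that standard DIC algorithms build on can be sketched as an integer-displacement search maximizing zero-mean normalized cross-correlation over a subset. The window and search sizes are illustrative, and sub-pixel refinement is omitted:

```python
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size subsets
    (insensitive to offset and gain changes in intensity)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def match_subset(ref, cur, center, half=7, search=5):
    """Find the integer displacement of the subset around `center`
    between reference and current images by exhaustive ZNCC search --
    the matching step DIC displacement fields are built from."""
    cy, cx = center
    tpl = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]
    best, best_d = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[cy + dy - half:cy + dy + half + 1,
                      cx + dx - half:cx + dx + half + 1]
            score = zncc(tpl, win)
            if score > best:
                best, best_d = score, (dy, dx)
    return best_d, best
```

Repeating this over a grid of subset centers yields the whole-field displacement map; a full 3D-DIC pipeline then triangulates matched points between the two views.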

  4. Bioinspired Polarization Imaging Sensors: From Circuits and Optics to Signal Processing Algorithms and Biomedical Applications

    PubMed Central

    York, Timothy; Powell, Samuel B.; Gao, Shengkui; Kahan, Lindsey; Charanya, Tauseef; Saha, Debajit; Roberts, Nicholas W.; Cronin, Thomas W.; Marshall, Justin; Achilefu, Samuel; Lake, Spencer P.; Raman, Baranidharan; Gruev, Viktor

    2015-01-01

    In this paper, we present recent work on bioinspired polarization imaging sensors and their applications in biomedicine. In particular, we focus on three different aspects of these sensors. First, we describe the electro–optical challenges in realizing a bioinspired polarization imager, and in particular, we provide a detailed description of a recent low-power complementary metal–oxide–semiconductor (CMOS) polarization imager. Second, we focus on signal processing algorithms tailored for this new class of bioinspired polarization imaging sensors, such as calibration and interpolation. Third, the emergence of these sensors has enabled rapid progress in characterizing polarization signals and environmental parameters in nature, as well as several biomedical areas, such as label-free optical neural recording, dynamic tissue strength analysis, and early diagnosis of flat cancerous lesions in a murine colorectal tumor model. We highlight results obtained from these three areas and discuss future applications for these sensors. PMID:26538682
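
As background for the signal-processing stage such imagers feed, the textbook reconstruction of linear Stokes parameters from the four analyzer orientations (0°, 45°, 90°, 135°) that division-of-focal-plane polarization sensors typically sample looks like this. It is a generic formulation, not the paper's calibration pipeline:

```python
import numpy as np

def stokes_from_polarizers(i0, i45, i90, i135):
    """Compute the linear Stokes parameters, degree of linear
    polarization (DoLP) and angle of polarization (AoP) from intensities
    measured behind 0/45/90/135-degree analyzers; inputs may be scalars
    or per-pixel arrays."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)       # total intensity
    s1 = i0 - i90                            # horizontal vs vertical
    s2 = i45 - i135                          # +45 vs -45 degrees
    dolp = np.hypot(s1, s2) / np.where(s0 == 0, 1, s0)
    aop = 0.5 * np.arctan2(s2, s1)           # radians
    return s0, s1, s2, dolp, aop
```

Interpolation and per-pixel gain calibration, which the abstract highlights, would be applied to the four channel images before this step.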

  5. Multi-Image Registration for an Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn; Rahman, Zia-Ur; Jobson, Daniel; Woodell, Glenn

    2002-01-01

    An Enhanced Vision System (EVS) utilizing multi-sensor image fusion is currently under development at the NASA Langley Research Center. The EVS will provide enhanced images of the flight environment to assist pilots in poor visibility conditions. Multi-spectral images obtained from a short wave infrared (SWIR), a long wave infrared (LWIR), and a color visible band CCD camera, are enhanced and fused using the Retinex algorithm. The images from the different sensors do not have a uniform data structure: the three sensors not only operate at different wavelengths, but they also have different spatial resolutions, optical fields of view (FOV), and bore-sighting inaccuracies. Thus, in order to perform image fusion, the images must first be co-registered. Image registration is the task of aligning images taken at different times, from different sensors, or from different viewpoints, so that all corresponding points in the images match. In this paper, we present two methods for registering multiple multi-spectral images. The first method performs registration using sensor specifications to match the FOVs and resolutions directly through image resampling. In the second method, registration is obtained through geometric correction based on a spatial transformation defined by user selected control points and regression analysis.
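
The regression step of the second method can be sketched as a least-squares fit of a spatial transformation to user-selected control points. An affine model is assumed here for illustration; the abstract does not specify the transformation class:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping control points src -> dst.
    Solves the overdetermined system [x, y, 1] @ coeffs = [x', y']."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])       # design matrix
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)   # (3, 2) parameters
    return coeffs

def apply_affine(coeffs, pts):
    """Map points through a fitted affine transform."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coeffs
```

With three or more non-collinear control point pairs the fit is determined; extra pairs average out point-selection error, which is why regression rather than an exact solve is used.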

  6. The analysis and rationale behind the upgrading of existing standard definition thermal imagers to high definition

    NASA Astrophysics Data System (ADS)

    Goss, Tristan M.

    2016-05-01

    With 640x512 pixel format IR detector arrays having been on the market for the past decade, Standard Definition (SD) thermal imaging sensors have been developed and deployed across the world. Now, with 1280x1024 pixel format IR detector arrays becoming readily available, designers of thermal imager systems face new challenges as pixel sizes shrink and the demand for, and applications of, High Definition (HD) thermal imaging sensors increase. In many instances, upgrading an existing under-sampled SD thermal imaging sensor into a more optimally sampled or oversampled HD thermal imaging sensor is a more cost effective option, with a shorter time to market, than designing and developing a completely new sensor. This paper presents the analysis and rationale behind the selection of the best suited HD pixel format MWIR detector for the upgrade of an existing SD thermal imaging sensor to a higher performing HD thermal imaging sensor. Several commercially available and "soon to be" commercially available HD small pixel IR detector options are included as part of the analysis and are considered for this upgrade. The impact the proposed detectors have on the sensor's overall sensitivity, noise and resolution is analyzed, and the improved range performance is predicted. Furthermore, with reduced dark currents due to the smaller pixel sizes, the candidate HD MWIR detectors are operated at higher temperatures when compared to their SD predecessors. Therefore, as an additional constraint and as a design goal, the feasibility of achieving upgraded performance without any increase in the size, weight and power consumption of the thermal imager is discussed herein.

  7. Automatic target recognition and detection in infrared imagery under cluttered background

    NASA Astrophysics Data System (ADS)

    Gundogdu, Erhan; Koç, Aykut; Alatan, A. Aydın.

    2017-10-01

    Visual object classification has long been studied in the visible spectrum using conventional cameras. Since the number of labeled images has recently increased, it is possible to train deep Convolutional Neural Networks (CNNs) with significant numbers of parameters. As infrared (IR) sensor technology has improved over the last two decades, labeled images from IR sensors have begun to be used for object detection and recognition tasks. We address the problem of infrared object recognition and detection by exploiting 15K real-field images from long-wave and mid-wave IR sensors. For feature learning, a stacked denoising autoencoder is trained on this IR dataset. To recognize the objects, the trained stacked denoising autoencoder is fine-tuned according to the binary classification loss of the target object. Once training is completed, the test samples are propagated through the network, and the probability of each test sample belonging to a class is computed. Moreover, the trained classifier is utilized in a detect-by-classification method, where classification is performed over a set of candidate object boxes and the maximum confidence score at a particular location is accepted as the score of the detected object. To decrease the computational complexity, the detection step at every frame is avoided by running an efficient correlation filter based tracker; detection is performed only when the tracker confidence falls below a pre-defined threshold. The experiments conducted on the real-field images demonstrate that the proposed detection and tracking framework presents satisfactory results for detecting tanks under cluttered background.

  8. Experimental single-chip color HDTV image acquisition system with 8M-pixel CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Shimamoto, Hiroshi; Yamashita, Takayuki; Funatsu, Ryohei; Mitani, Kohji; Nojiri, Yuji

    2006-02-01

    We have developed an experimental single-chip color HDTV image acquisition system using an 8M-pixel CMOS image sensor. The sensor has 3840 × 2160 effective pixels and is progressively scanned at 60 frames per second. We describe the color filter array and interpolation method used to improve image quality with a high-pixel-count single-chip sensor. We also describe the experimental image acquisition system used to measure spatial frequency characteristics in the horizontal direction. The results indicate good prospects for achieving a high-quality single-chip HDTV camera that reduces pseudo signals and maintains high spatial frequency response within the HDTV frequency band.

  9. The challenge of sCMOS image sensor technology to EMCCD

    NASA Astrophysics Data System (ADS)

    Chang, Weijing; Dai, Fang; Na, Qiyue

    2018-02-01

    In the field of low-illumination image sensors, the noise of the latest scientific-grade CMOS (sCMOS) image sensors is close to that of EMCCDs, and the industry considers sCMOS a potential competitor to, and even replacement for, EMCCD. We therefore selected several typical sCMOS and EMCCD image sensors and cameras and compared their performance parameters. The results show that the signal-to-noise ratio of sCMOS is close to that of EMCCD, and its other parameters are superior. However, signal-to-noise ratio is critical for low-illumination imaging, and the actual imaging results of sCMOS are not ideal. EMCCD therefore remains the first choice for high-performance applications.
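    The trade-off the abstract alludes to can be made concrete with a standard camera SNR model. The sketch below is illustrative only: the excess-noise factor (about 1.41 for EMCCD electron multiplication), EM gain, and read-noise figures in the test values are assumed textbook-style numbers, not measurements from the compared cameras.

```python
def snr(photons, qe=0.9, read_noise_e=1.0, em_gain=1.0, excess=1.0, dark_e=0.0):
    """Per-pixel signal-to-noise ratio for a given photon count.

    Shot noise is inflated by the multiplication excess-noise factor;
    effective read noise is divided by the EM gain.
    """
    signal_e = qe * photons                  # detected signal in electrons
    variance = excess ** 2 * (signal_e + dark_e) + (read_noise_e / em_gain) ** 2
    return signal_e / variance ** 0.5
```

With such assumed numbers, the EM gain wins in near-darkness (read-noise dominated), while at higher fluxes the excess-noise factor lets a low-read-noise sCMOS overtake the EMCCD, consistent with the abstract's conclusion that EMCCD remains preferable mainly in the lowest-light regime.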

  10. Evaluation of physical properties of different digital intraoral sensors.

    PubMed

    Al-Rawi, Wisam; Teich, Sorin

    2013-09-01

    Digital technologies provide clinically acceptable results comparable to traditional films while offering other advantages, such as the ability to store and manipulate images, immediate evaluation of image diagnostic quality, and possible reduction in patient radiation exposure. The purpose of this paper is to present the results of an evaluation of the physical design of eight CMOS digital intraoral sensors. Sensors tested included: XDR (Cyber Medical Imaging, Los Angeles, CA, USA), RVG 6100 (Carestream Dental LLC, Atlanta, GA, USA), Platinum (DEXIS LLC, Hatfield, PA, USA), CDR Elite (Schick Technologies, Long Island City, NY, USA), ProSensor (Planmeca, Helsinki, Finland), EVA (ImageWorks, Elmsford, NY, USA), XIOS Plus (Sirona, Bensheim, Germany), and GXS-700 (Gendex Dental Systems, Hatfield, PA, USA). The sensors were evaluated for cable configuration, connectivity interface, presence of a back-scattering radiation shield, plate thickness, and active sensor area, and the active imaging area was compared with the outside casing and with conventional radiographic films. There were variations among the physical designs of the different sensors, and for most parameters tested a lack of standardization exists in the industry. The results of this study revealed that these details are not always available in the material provided by the manufacturers and are often not advertised. For all sensor sizes, the active imaging area was smaller than that of conventional films. No sensor in the group had the best physical design overall. The data presented in this paper establish a benchmark for comparing the physical design of digital intraoral sensors.

  11. Wavefront sensorless adaptive optics ophthalmoscopy in the human eye

    PubMed Central

    Hofer, Heidi; Sredar, Nripun; Queener, Hope; Li, Chaohong; Porter, Jason

    2011-01-01

    Wavefront sensor noise and fidelity place a fundamental limit on achievable image quality in current adaptive optics ophthalmoscopes. Additionally, the wavefront sensor ‘beacon’ can interfere with visual experiments. We demonstrate real-time (25 Hz), wavefront sensorless adaptive optics imaging in the living human eye with image quality rivaling that of wavefront sensor based control in the same system. A stochastic parallel gradient descent algorithm directly optimized the mean intensity in retinal image frames acquired with a confocal adaptive optics scanning laser ophthalmoscope (AOSLO). When imaging through natural, undilated pupils, both control methods resulted in comparable mean image intensities. However, when imaging through dilated pupils, image intensity was generally higher following wavefront sensor-based control. Despite the typically reduced intensity, image contrast was higher, on average, with sensorless control. Wavefront sensorless control is a viable option for imaging the living human eye and future refinements of this technique may result in even greater optical gains. PMID:21934779

  12. Photodiode area effect on performance of X-ray CMOS active pixel sensors

    NASA Astrophysics Data System (ADS)

    Kim, M. S.; Kim, Y.; Kim, G.; Lim, K. T.; Cho, G.; Kim, D.

    2018-02-01

    Compared to conventional TFT-based X-ray imaging devices, CMOS-based X-ray imaging sensors are considered next generation because they can be manufactured with very small pixel pitches and can acquire images at high speed. In addition, CMOS-based sensors have the advantage of integrating various functional circuits within the sensor, and image quality can be improved by the high fill factor achievable in large pixels. If the size of the subject is small, the pixel size must be reduced accordingly, and the fill factor must also be reduced to accommodate the various functional circuits within the pixel. In this study, 3T APS (active pixel sensor) devices with photodiodes of four different sizes were fabricated and evaluated. It is well known that a larger photodiode leads to improved overall performance. Nonetheless, once the photodiode area exceeds 1000 μm², the performance gain from further increases in photodiode size diminishes. As a result, considering the fill factor, a pixel pitch greater than 32 μm is not necessary to achieve high-efficiency image quality. In addition, poor image quality is to be expected unless special sensor-design techniques are employed for sensors with a pixel pitch of 25 μm or less.

  13. A Review of Digital Image Correlation Applied to Structural Dynamics

    NASA Astrophysics Data System (ADS)

    Niezrecki, Christopher; Avitabile, Peter; Warren, Christopher; Pingle, Pawan; Helfrick, Mark

    2010-05-01

    A significant amount of interest exists in performing non-contacting, full-field surface velocity measurement. For many years, traditional non-contacting surface velocity measurements have been made using scanning laser Doppler vibrometry, shearography, pulsed laser interferometry, pulsed holography, or electronic speckle pattern interferometry (ESPI). Three-dimensional (3D) digital image correlation (DIC) methods utilize the alignment of a stereo pair of images to obtain full-field geometry data in three dimensions. Information about the change in geometry of an object over time can be found by comparing a sequence of images, and virtual strain gages (or position sensors) can be created over the entire visible surface of the object of interest. Digital imaging techniques were first developed in the 1980s, but the technology has only recently been exploited in industry and research thanks to advances in digital cameras and personal computers. The use of DIC for structural dynamic measurement has only very recently been investigated. Within this paper, the advantages and limitations of using DIC for dynamic measurement are reviewed, and several examples of its use on vibrating and rotating structures are presented.

  14. First correlated measurements of the shape and light scattering properties of cloud particles using the new Particle Habit Imaging and Polar Scattering (PHIPS) probe

    NASA Astrophysics Data System (ADS)

    Abdelmonem, A.; Schnaiter, M.; Amsler, P.; Hesse, E.; Meyer, J.; Leisner, T.

    2011-10-01

    Studying the radiative impact of cirrus clouds requires knowledge of the relationship between their microphysics and the single scattering properties of cloud particles. Usually, this relationship is obtained by modeling the optical scattering properties from in situ measurements of ice crystal size distributions. The measured size distribution and the assumed particle shape might be erroneous in the case of non-spherical ice particles. We present here a novel optical sensor (the Particle Habit Imaging and Polar Scattering probe, PHIPS) designed to measure simultaneously the 3-D morphology and the corresponding optical and microphysical parameters of individual cloud particles. Clouds containing particles ranging from a few micrometers to about 800 μm in diameter can be characterized systematically with an optical resolving power of 2 μm and a polar scattering resolution of 1° for forward scattering directions (from 1° to 10°) and 8° for side and backscattering directions (from 18° to 170°). The maximum acquisition rates for scattering phase functions and images are 262 kHz and 10 Hz, respectively. Some preliminary results collected in two ice cloud campaigns conducted in the AIDA cloud simulation chamber are presented. PHIPS operated reliably and produced size distributions and images comparable to those given by other certified cloud particle instruments. A 3-D model of a hexagonal ice plate was constructed and the corresponding scattering phase function compared to that modeled using the Ray Tracing with Diffraction on Facets (RTDF) program. PHIPS is a highly promising airborne optical sensor for studying the radiative impact of cirrus clouds and for correlating particle habit with scattering properties, which will serve as a reference for other single, or multi-independent, measurement instruments.

  15. High-Speed Binary-Output Image Sensor

    NASA Technical Reports Server (NTRS)

    Fossum, Eric; Panicacci, Roger A.; Kemeny, Sabrina E.; Jones, Peter D.

    1996-01-01

    Photodetector outputs digitized by circuitry on same integrated-circuit chip. Developmental special-purpose binary-output image sensor designed to capture up to 1,000 images per second, with resolution greater than 10 to the 6th power pixels per image. Lower-resolution but higher-frame-rate prototype of sensor contains 128 x 128 array of photodiodes on complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. In application for which it is being developed, sensor used to examine helicopter oil to determine whether amount of metal and sand in oil sufficient to warrant replacement.

  16. Radiographic endodontic working length estimation: comparison of three digital image receptors.

    PubMed

    Athar, Anas; Angelopoulos, Christos; Katz, Jerald O; Williams, Karen B; Spencer, Paulette

    2008-10-01

    This in vitro study was conducted to evaluate the accuracy of the Schick wireless image receptor compared with 2 other types of digital image receptors for measuring the radiographic landmarks pertinent to endodontic treatment. Fourteen human cadaver mandibles with retained molars were selected. A fine endodontic file (#10) was introduced into the canal at random distances from the apex and at the apex of the tooth; images were made with 3 different #2-size image receptors: DenOptix storage phosphor plates, a Gendex CCD sensor (wired), and a Schick CDR sensor (wireless). Six raters viewed the images to identify the radiographic apex of the tooth and the tip of the fine (#10) endodontic file. Inter-rater reliability was also assessed. Repeated-measures analysis of variance revealed a significant main effect for the type of image receptor. Raters' error in identifying structures of interest was significantly higher for DenOptix storage phosphor plates, whereas the least error was noted with the Schick CDR sensor. A significant interaction effect was observed for rater and type of image receptor used, but this effect contributed only 6% (P < .01; η² = 0.06) toward the outcome of the results. The Schick CDR wireless sensor may be preferable to other solid-state sensors because there is no cable connecting the sensor to the computer. Further testing of this sensor for other diagnostic tasks is recommended, as well as evaluation of patient acceptance.

  17. Precise calibration of pupil images in pyramid wavefront sensor.

    PubMed

    Liu, Yong; Mu, Quanquan; Cao, Zhaoliang; Hu, Lifa; Yang, Chengliang; Xuan, Li

    2017-04-20

    The pyramid wavefront sensor (PWFS) is a novel wavefront sensor with several appealing advantages over the Shack-Hartmann wavefront sensor. The PWFS uses four pupil images to calculate the local tilt of the incoming wavefront. The pupil images are conjugated with the telescope pupil, so each pixel in a pupil image is diffraction-limited by the telescope pupil diameter; thus the sensing error of the PWFS is much lower than that of the Shack-Hartmann sensor and depends on the extraction and alignment accuracy of the pupil images. However, precise extraction of these images is difficult in practice. Aiming to improve sensing accuracy, we analyzed the physical model of PWFS calibration and put forward an extraction algorithm. The approach was verified in a closed-loop correction experiment. The results showed that the sensing accuracy of the PWFS increased after applying the calibration and extraction method.
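    For reference, the usual slope estimate from the four PWFS pupil images is a normalized difference of intensities. The sign convention below is one common choice and depends in practice on pyramid orientation; treat it as a sketch rather than the authors' exact formula:

```python
def pwfs_slopes(i1, i2, i3, i4, eps=1e-12):
    """Local wavefront slopes from four registered pupil images (flat lists).

    sx, sy are normalized intensity differences per pupil pixel; eps guards
    against division by zero in unilluminated pixels.
    """
    sx, sy = [], []
    for a, b, c, d in zip(i1, i2, i3, i4):
        total = a + b + c + d + eps
        sx.append((a + b - c - d) / total)
        sy.append((a - b + c - d) / total)
    return sx, sy
```

This per-pixel combination is exactly why the extraction and alignment accuracy of the four pupil images matters: a registration error mixes light between quadrants and biases both slope estimates.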

  18. Advances in multi-sensor data fusion: algorithms and applications.

    PubMed

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. This paper presents an overview of recent advances in multi-sensor satellite image fusion. First, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in the main application fields in remote sensing, including object identification, classification, change detection and maneuvering target tracking, are then described, and both the advantages and limitations of those applications are discussed. Recommendations are offered regarding: (1) improvement of fusion algorithms; (2) development of "algorithm fusion" methods; (3) establishment of an automatic quality assessment scheme.
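    At its simplest, pixel-level fusion of co-registered images is a normalized weighted average. The sketch below shows only this baseline; the wavelet-, PCA- and IHS-based algorithms surveyed in reviews like this one are considerably more involved:

```python
def fuse_images(images, weights):
    """Weighted-average fusion of co-registered, equally sized images
    (given as flat pixel lists). Weights are normalized to sum to one."""
    total = float(sum(weights))
    n = len(images[0])
    return [sum(w * img[i] for img, w in zip(images, weights)) / total
            for i in range(n)]
```

Even this trivial rule illustrates the prerequisite all fusion methods share: the inputs must be accurately co-registered, or the "combined" pixel mixes unrelated ground locations.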

  19. Camera sensor arrangement for crop/weed detection accuracy in agronomic images.

    PubMed

    Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

    2013-04-02

    In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning on the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination of outdoor environments is also an important factor affecting image accuracy. This paper focuses exclusively on two main issues, always with the goal of achieving the highest image accuracy in Precision Agriculture applications, and makes the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters, and (b) design of strategies for controlling adverse illumination effects.

  20. Commercial CMOS image sensors as X-ray imagers and particle beam monitors

    NASA Astrophysics Data System (ADS)

    Castoldi, A.; Guazzoni, C.; Maffessanti, S.; Montemurro, G. V.; Carraresi, L.

    2015-01-01

    CMOS image sensors are widely used in several applications such as mobile handsets, webcams and digital cameras, among others. Furthermore, they are available across a wide range of resolutions with excellent spectral and chromatic responses. To fulfill the need for cheap beam monitors and high-resolution image sensors for scientific applications, we exploited the possibility of using commercial CMOS image sensors as X-ray and proton detectors. Two different sensors have been mounted and tested. An Aptina MT9V034, featuring 752 × 480 pixels with a 6 μm × 6 μm pixel size, was mounted and successfully tested as a bi-dimensional beam profile monitor, able to take pictures of the incoming proton bunches at the DeFEL beamline (1-6 MeV pulsed proton beam) of the LaBeC of INFN in Florence. The naked sensor is able to successfully detect the interactions of single protons. The sensor point-spread function (PSF) was characterized with 1 MeV protons and is equal to one pixel (6 μm) r.m.s. in both directions. A second sensor, an MT9M032, featuring 1472 × 1096 pixels with a 2.2 μm × 2.2 μm pixel size, was mounted on a dedicated board as a high-resolution imager to be used in X-ray imaging experiments with table-top generators. To ease and simplify data transfer and image acquisition, the system is controlled by a dedicated micro-processor board (DM3730 1 GHz SoC ARM Cortex-A8) on which a modified Linux kernel has been implemented. The paper presents the architecture of the sensor systems and the results of the experimental measurements.

  1. Apparatus and method for imaging metallic objects using an array of giant magnetoresistive sensors

    DOEpatents

    Chaiken, Alison

    2000-01-01

    A portable, low-power, metallic object detector and method for providing an image of a detected metallic object. In one embodiment, the present portable low-power metallic object detector includes an array of giant magnetoresistive (GMR) sensors. The array of GMR sensors is adapted for detecting the presence of, and compiling image data of, a metallic object. In this embodiment, the array of GMR sensors is arranged in a checkerboard configuration such that the axes of sensitivity of alternate GMR sensors are orthogonally oriented. An electronics portion is coupled to the array of GMR sensors and is adapted to receive and process the image data compiled by the array. The embodiment also includes a display unit, coupled to the electronics portion, which is adapted to display a graphical representation of the metallic object detected by the array of GMR sensors. In so doing, a graphical representation of the detected metallic object is provided.

  2. Cross-calibration of MODIS with ETM+ and ALI sensors for long-term monitoring of land surface processes

    USGS Publications Warehouse

    Meyer, D.; Chander, G.

    2006-01-01

    Increasingly, data from multiple sensors are used to gain a more complete understanding of land surface processes at a variety of scales. Although higher-level products (e.g., vegetation cover, albedo, surface temperature) derived from different sensors can be validated independently, the degree to which these sensors and their products can be compared to one another is vastly improved if their relative spectroradiometric responses are known. Most often, sensors are directly calibrated to diffuse solar irradiation or vicariously to ground targets. However, space-based targets are not traceable to metrological standards, and vicarious calibrations are expensive and provide a poor sampling of a sensor's full dynamic range. Cross-calibration of two sensors can augment these methods if certain conditions can be met: (1) the spectral responses are similar, (2) the observations are reasonably concurrent (similar atmospheric and solar illumination conditions), (3) errors due to misregistration of inhomogeneous surfaces can be minimized (including scale differences), and (4) the viewing geometry is similar (or some reasonable knowledge of surface bi-directional reflectance distribution functions is available). This study explores the impacts of cross-calibrating sensors when such conditions are met to some degree but not perfectly. In order to constrain the range of conditions, the analysis is limited to sensors for which cross-calibration studies have been conducted (the Enhanced Thematic Mapper Plus (ETM+) on Landsat-7 (L7), and the Advanced Land Imager (ALI) and Hyperion on Earth Observing-1 (EO-1)), including systems having somewhat dissimilar geometry, spatial resolution and spectral response characteristics but that are still part of the so-called "A.M. constellation" (with the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra platform). Measures of spectral response differences and methods for cross-calibrating such sensors are provided in this study.
These instruments are cross calibrated using the Railroad Valley playa in Nevada. Best fit linear coefficients (slope and offset) are provided for ALI-to-MODIS and ETM+-to-MODIS cross calibrations, and root-mean-squared errors (RMSEs) and correlation coefficients are provided to quantify the uncertainty in these relationships. In theory, the linear fits and uncertainties can be used to compare radiance and reflectance products derived from each instrument.
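    The cross-calibration itself reduces to an ordinary least-squares fit between coincident radiance (or reflectance) samples from the two sensors, reported as slope, offset, and RMSE. A self-contained sketch of that fit (not the authors' processing code):

```python
def cross_calibrate(x, y):
    """Least-squares linear fit y ~ slope * x + offset, plus the RMSE of the
    fit; x and y are matched samples from the two sensors over the target."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    offset = my - slope * mx
    rmse = (sum((b - (slope * a + offset)) ** 2
                for a, b in zip(x, y)) / n) ** 0.5
    return slope, offset, rmse
```

The RMSE and the correlation coefficient of the fit are what quantify the cross-calibration uncertainty quoted in the study.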

  3. Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry.

    PubMed

    Andújar, Dionisio; Fernández-Quintanilla, César; Dorado, José

    2015-06-04

    In energy crops for biomass production, a proper plant structure is important to optimize wood yields, and precise crop characterization in early stages may inform the choice of cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor, determining the best viewing angle of the sensor for estimating plant biomass from poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downward view, front view (90°) and ground upward view (-45°). The ground-truth used to validate the sensor readings consisted of destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured on each individual plant. The depth image models agreed well with the 45°, 90° and -45° measurements in one-year-old poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found, and plant height was estimated to within a few centimeters. The comparison between viewing angles revealed that top views gave poorer results because the top leaves occluded the rest of the tree, whereas the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters. 
The results of this study indicate that Kinect is a promising tool for a rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate (frames per second) when dynamic measurements are required.

  4. Comparison of optics and performance of a distal sensor high definition cystoscope, a distal sensor standard definition cystoscope, and a fiberoptic cystoscope.

    PubMed

    Lusch, Achim; Liss, Michael A; Greene, Peter; Abdelshehid, Corollos; Menhadji, Ashleigh; Bucur, Philip; Alipanah, Reza; McDougall, Elspeth; Landman, Jaime

    2013-12-01

    To evaluate performance characteristics and optics of a new-generation high-definition distal sensor (HD-DS) flexible cystoscope, a standard-definition distal sensor (SD-DS) cystoscope, and a standard fiberoptic (FO) cystoscope. Three new cystoscopes (HD-DS, SD-DS, and FO) were compared for active deflection, irrigation flow, and optical characteristics. Each cystoscope was evaluated with an empty working channel and with various accessories. Optical characteristics (resolution, grayscale imaging, color representation, depth of field, and image brightness) were measured using United States Air Force (USAF)/Edmund Optics test targets and an illumination meter. We digitally recorded a porcine cystoscopy in both clear and bloody fields, with subsequent video analysis by 8 surgeons via questionnaire. The HD-DS had a higher resolution than the SD-DS and the FO at both 20 mm (6.35 vs 4.00 vs 2.24 line pairs/mm) and 10 mm (14.3 vs 7.13 vs 4.00 line pairs/mm), respectively (P <.001 and P <.001). Color representation and depth of field (P = .001 and P <.001) were better with the HD-DS. Compared with the FO, the HD-DS and SD-DS demonstrated superior upward deflection and irrigant flow with and without an accessory present in the working channel, whereas image brightness was superior with the FO (P <.001, P = .001, and P <.001, respectively). Observers deemed the HD-DS cystoscope superior in visualization in clear and bloody fields, as well as in illumination. The new HD-DS provided significantly improved visualization in clear and bloody fields, and improved resolution, color representation, and depth of field compared with the SD-DS and FO. Clinical correlation of these findings is pending. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Sea ice motion from low-resolution satellite sensors: An alternative method and its validation in the Arctic

    NASA Astrophysics Data System (ADS)

    Lavergne, T.; Eastwood, S.; Teffah, Z.; Schyberg, H.; Breivik, L.-A.

    2010-10-01

    The retrieval of sea ice motion with the Maximum Cross-Correlation (MCC) method from low-resolution (10-15 km) spaceborne imaging sensors is challenged by dominant quantization noise as the time span of displacement vectors is shortened. To allow investigation of shorter displacements from these instruments, we introduce an alternative sea ice motion tracking algorithm that builds on the MCC method but relies on a continuous optimization step for computing the motion vector. The prime effect of this method is to effectively dampen the quantization noise, an artifact of the MCC. It allows retrieval of spatially smooth 48 h sea ice motion vector fields in the Arctic. Strategies to detect and correct erroneous vectors, as well as to optimally merge the several polarization channels of a given instrument, are also described. A test processing chain is implemented and run with several active and passive microwave imagers (Advanced Microwave Scanning Radiometer-EOS (AMSR-E), Special Sensor Microwave Imager, and Advanced Scatterometer) over three Arctic autumn, winter, and spring seasons. Ice motion vectors are collocated with and compared to GPS positions of in situ drifters. Error statistics range from 2.5 to 4.5 km (standard deviation of the vector components) depending on the sensor, without significant bias. We discuss the relative contributions of measurement and representativeness errors by analyzing monthly validation statistics. The 37 GHz channels of the AMSR-E instrument yield the best validation statistics. The operational low-resolution sea ice drift product of the EUMETSAT OSI SAF (European Organisation for the Exploitation of Meteorological Satellites Ocean and Sea Ice Satellite Application Facility) is based on the algorithms presented in this paper.
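    The quantization-noise problem and its remedy can be illustrated in one dimension: integer-lag maximum cross-correlation yields displacements quantized to whole pixels, and a continuous refinement of the correlation peak removes most of that quantization. The parabolic peak interpolation below is a simple stand-in for the paper's continuous optimization step, not the OSI SAF algorithm itself:

```python
def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) *
           sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0

def mcc_displacement(ref, cur, max_lag=5):
    """Integer-lag MCC search followed by parabolic sub-pixel refinement
    of the correlation peak."""
    n = len(ref)
    scores = {}
    for lag in range(-max_lag, max_lag + 1):
        idx = [i for i in range(n) if 0 <= i + lag < n]
        scores[lag] = zncc([ref[i] for i in idx], [cur[i + lag] for i in idx])
    best = max(scores, key=scores.get)
    s_prev, s_next = scores.get(best - 1), scores.get(best + 1)
    if s_prev is None or s_next is None:
        return float(best)                    # peak at the search boundary
    denom = s_prev - 2.0 * scores[best] + s_next
    shift = 0.5 * (s_prev - s_next) / denom if denom else 0.0
    return best + shift
```

Because the refined estimate is continuous rather than snapped to the pixel grid, short-time-span motion fields computed this way are spatially much smoother, which is the effect the paper exploits.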

  6. Characterization of total ionizing dose damage in COTS pinned photodiode CMOS image sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zujun, E-mail: wangzujun@nint.ac.cn; Ma, Wuying; Huang, Shaoyan

    The characterization of total ionizing dose (TID) damage in COTS pinned photodiode (PPD) CMOS image sensors (CISs) is investigated. The radiation experiments are carried out at a 60Co γ-ray source. The CISs are produced in 0.18-μm CMOS technology, and the pixel architecture is an 8T global shutter pixel with correlated double sampling (CDS) based on a 4T PPD front end. The temporal-, spatial-, and spectral-domain parameters of the CISs are measured on a CIS test system per the EMVA 1288 standard before and after irradiation. The dark current, random noise, dark signal non-uniformity (DSNU), photo response non-uniformity (PRNU), overall system gain, saturation output, dynamic range (DR), signal to noise ratio (SNR), quantum efficiency (QE), and responsivity versus TID are reported. The tested CISs show remarkable degradations after irradiation. The degradation mechanisms induced by TID damage are also analyzed.

  7. Characterization of total ionizing dose damage in COTS pinned photodiode CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Wang, Zujun; Ma, Wuying; Huang, Shaoyan; Yao, Zhibin; Liu, Minbo; He, Baoping; Liu, Jing; Sheng, Jiangkun; Xue, Yuan

    2016-03-01

    The characterization of total ionizing dose (TID) damage in COTS pinned photodiode (PPD) CMOS image sensors (CISs) is investigated. The radiation experiments are carried out at a 60Co γ-ray source. The CISs are produced in 0.18-μm CMOS technology, and the pixel architecture is an 8T global shutter pixel with correlated double sampling (CDS) based on a 4T PPD front end. The temporal-, spatial-, and spectral-domain parameters of the CISs are measured on a CIS test system per the EMVA 1288 standard before and after irradiation. The dark current, random noise, dark signal non-uniformity (DSNU), photo response non-uniformity (PRNU), overall system gain, saturation output, dynamic range (DR), signal to noise ratio (SNR), quantum efficiency (QE), and responsivity versus TID are reported. The tested CISs show remarkable degradations after irradiation. The degradation mechanisms induced by TID damage are also analyzed.
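    Two of the EMVA 1288 spatial parameters mentioned, DSNU and PRNU, are computed from temporally averaged dark and illuminated frame stacks. The sketch below uses simplified relative units (the standard expresses DSNU in DN or electrons and PRNU as a percentage) and flat pixel lists for brevity:

```python
def _pixel_means(frames):
    """Temporal mean per pixel over a stack of frames (flat pixel lists)."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

def _spatial_std(vals):
    m = sum(vals) / len(vals)
    return (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5

def dsnu_prnu(dark_frames, light_frames):
    """DSNU: spatial std of the averaged dark image (fixed-pattern offset).
    PRNU: spatial std of the dark-corrected signal, relative to its mean."""
    mean_dark = _pixel_means(dark_frames)
    mean_light = _pixel_means(light_frames)
    dsnu = _spatial_std(mean_dark)
    signal = [l - d for l, d in zip(mean_light, mean_dark)]
    prnu = _spatial_std(signal) / (sum(signal) / len(signal))
    return dsnu, prnu
```

Temporal averaging first is what separates these fixed-pattern metrics from the random (temporal) noise also reported in the paper: averaging suppresses the temporal component and leaves the pixel-to-pixel pattern.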

  8. Improving the Ability of Image Sensors to Detect Faint Stars and Moving Objects Using Image Deconvolution Techniques

    PubMed Central

    Fors, Octavi; Núñez, Jorge; Otazu, Xavier; Prades, Albert; Cardinal, Robert D.

    2010-01-01

    In this paper we show how image deconvolution techniques can increase the ability of image sensors, such as CCD imagers, to detect faint stars or faint orbital objects (small satellites and space debris). In the case of faint stars, we show that this benefit is equivalent to doubling the quantum efficiency of the image sensor used, or to increasing the effective telescope aperture by more than 30%, without decreasing the astrometric precision or introducing artificial bias. In the case of orbital objects, the deconvolution technique can double the signal-to-noise ratio of the image, which helps to discover and control dangerous objects such as space debris or lost satellites. The benefits obtained using CCD detectors can be extrapolated to any kind of image sensor. PMID:22294896

  10. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    PubMed Central

    Jiang, Hao; Zhao, Dehua; Cai, Ying; An, Shuqing

    2012-01-01

    In previous attempts to identify aquatic vegetation from remotely sensed images using classification trees (CT), the images to which CT models were applied at different times or locations had to originate from the same satellite sensor as the images used in model development, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from the 2009 ground-truth data and images from the Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent, floating-leaf and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using images normalized by 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent, floating-leaf and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our results suggest that 0.1% index scaling provides a feasible way to apply CT models directly to images from sensors or time periods that differ from those of the images used to develop the original models.
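
    The best-performing normalization can be sketched as percentile-based rescaling of a spectral-index image. The exact formula is not given in the record, so the clipping scheme and parameter names below are illustrative:

```python
import numpy as np

def normalize_index(si_image, lower_pct=0.1, upper_pct=99.9):
    """Rescale a spectral-index image to [0, 1], anchoring the scale on
    the 0.1% extreme pixel values so that CT thresholds learned on one
    sensor's images can be reused on another's."""
    lo = np.percentile(si_image, lower_pct)
    hi = np.percentile(si_image, upper_pct)
    return np.clip((si_image - lo) / (hi - lo), 0.0, 1.0)
```

    Anchoring on extreme percentiles rather than the absolute min/max makes the rescaling robust to a handful of outlier pixels, which is the point of using a tailored percentage.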

  11. A mobile ferromagnetic shape detection sensor using a Hall sensor array and magnetic imaging.

    PubMed

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a mobile Hall sensor array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the mobile Hall sensor array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array setup. Magnetic imaging of the magnetic flux distribution is performed by a signal processing unit before the real-time images are displayed on a netbook. Signal processing application software was developed for 1-D Hall sensor array signal acquisition and processing to construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as squares, circles and triangles are determined through magnetic images based on the voltage signal and magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are also presented to prove the functionality of the mobile Hall sensor array system for actual shape detection. The results prove that the mobile Hall sensor array system is able to perform magnetic imaging to identify various ferromagnetic materials.

  13. Depth map generation using a single image sensor with phase masks.

    PubMed

    Jang, Jinbeum; Park, Sangwoo; Jo, Jieun; Paik, Joonki

    2016-06-13

    Conventional stereo matching systems generate a depth map using two or more digital imaging sensors, which are difficult to accommodate in small camera systems because of their high cost and bulk. In order to solve this problem, this paper presents a stereo matching system that uses a single image sensor with phase masks for phase-difference auto-focusing. A novel pattern of phase mask array is proposed to simultaneously acquire two pairs of stereo images. Furthermore, a noise-invariant depth map is generated from the raw-format sensor output. The proposed method consists of four steps to compute the depth map: (i) acquisition of stereo images using the proposed mask array, (ii) variational segmentation using merging criteria to simplify the input image, (iii) disparity map generation using hierarchical block matching for disparity measurement, and (iv) image matting to fill holes and generate the dense depth map. The proposed system can be used in small digital cameras without additional lenses or sensors.
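
    Step (iii) can be sketched as sum-of-absolute-differences (SAD) block matching along a scanline. The hierarchical refinement of the actual method is omitted, and all parameters below are illustrative:

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=10):
    """Per-pixel disparity along one scanline by SAD block matching:
    for each block in the left view, find the best-matching block in the
    right view within the allowed disparity range."""
    half = block // 2
    disp = np.zeros(len(left), dtype=int)
    for i in range(half, len(left) - half - max_disp):
        ref = left[i - half:i + half + 1]
        sads = [np.abs(ref - right[i + d - half:i + d + half + 1]).sum()
                for d in range(max_disp + 1)]
        disp[i] = int(np.argmin(sads))  # disparity with the lowest SAD
    return disp
```

    A constant recovered disparity across a region corresponds to a single depth plane; holes and ambiguous regions are what step (iv) then fills in.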

  14. A Chip and Pixel Qualification Methodology on Imaging Sensors

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Petkov, Mihail; Nguyen, Duc N.; Novak, Frank

    2004-01-01

    This paper presents a qualification methodology for imaging sensors. In addition to overall chip reliability characterization based on a sensor's overall figures of merit, such as Dark Rate, Linearity, Dark Current Non-Uniformity, Fixed Pattern Noise and Photon Response Non-Uniformity, a simulation technique is proposed and used to project pixel reliability. The projected pixel reliability is directly related to imaging quality and provides additional sensor reliability information and performance control.

  15. Noise-based body-wave seismic tomography in an active underground mine.

    NASA Astrophysics Data System (ADS)

    Olivier, G.; Brenguier, F.; Campillo, M.; Lynch, R.; Roux, P.

    2014-12-01

    Over the last decade, ambient noise tomography has become increasingly popular for imaging the earth's upper crust. The seismic noise recorded in the earth's crust is dominated by surface waves emanating from the interaction of the ocean with the solid earth. These surface waves are low frequency in nature (< 1 Hz) and not usable for imaging the smaller structures associated with mining or oil and gas applications. The seismic noise recorded at higher frequencies is typically from anthropogenic sources, which are short lived, spatially unstable and not well suited for constructing seismic Green's functions between sensors with conventional cross-correlation methods. To examine the use of ambient noise tomography for smaller-scale applications, continuous data were recorded for 5 months with 18 high-frequency seismic sensors in an active underground mine in Sweden, located more than 1 km below the surface. A wide variety of broadband (10–3000 Hz) seismic noise sources are present in an active underground mine, ranging from drilling, scraping, trucks and ore crushers to ventilation fans. Some of these sources generate favorable seismic noise, while others are peaked in frequency and not usable. In this presentation, I will show that the noise generated by mining activity can be useful if periods of seismic noise are carefully selected. Although the noise sources are not temporally stable and not evenly distributed around the sensor array, good estimates of the seismic Green's functions between sensors can be retrieved over a broad frequency range (20–400 Hz) when a selective stacking scheme is used. For frequencies below 100 Hz, the reconstructed Green's functions show clear body-wave arrivals for almost all of the 153 sensor pairs. The arrival times of these body waves are picked and used to image the local velocity structure. The resulting 3-dimensional image shows a high-velocity structure that overlaps with a known ore body. The material properties of the ore body differ from those of the host rock and are likely the cause of the observed high-velocity structure. For frequencies above 200 Hz, the seismic waves are multiply scattered by the tunnels and excavations and are used to determine the scattering properties of the medium. The results of this study should be useful for future imaging and exploration projects in the mining and oil and gas industries.
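
    The core of the noise-correlation step can be sketched as windowed cross-correlation with stacking. The selective stacking scheme used in the study is more elaborate, and the window length below is an illustrative assumption:

```python
import numpy as np

def noise_crosscorrelation(rec_a, rec_b, win_len):
    """Estimate the inter-sensor Green's function by cross-correlating
    synchronous noise windows from two sensors and stacking the results."""
    n_win = min(len(rec_a), len(rec_b)) // win_len
    stack = np.zeros(2 * win_len - 1)
    for i in range(n_win):
        a = rec_a[i * win_len:(i + 1) * win_len]
        b = rec_b[i * win_len:(i + 1) * win_len]
        a = (a - a.mean()) / (a.std() + 1e-12)  # normalize each window
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += np.correlate(a, b, mode="full")
    return stack / n_win
```

    A coherent arrival shared by both sensors shows up as a peak at the corresponding lag, which supplies the travel time used for tomography.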

  16. Compact survey and inspection day/night image sensor suite for small unmanned aircraft systems (EyePod)

    NASA Astrophysics Data System (ADS)

    Bird, Alan; Anderson, Scott A.; Linne von Berg, Dale; Davidson, Morgan; Holt, Niel; Kruer, Melvin; Wilson, Michael L.

    2010-04-01

    EyePod is a compact survey and inspection day/night imaging sensor suite for small unmanned aircraft systems (UAS). EyePod generates georeferenced image products in real-time from visible near infrared (VNIR) and long wave infrared (LWIR) imaging sensors and was developed under the ONR funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program. FEATHAR is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL) and FEATHAR's goal is to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The EyePod suite consists of two VNIR/LWIR (day/night) gimbaled sensors that, combined, provide broad area survey and focused inspection capabilities. Each EyePod sensor pairs an HD visible EO sensor with a LWIR bolometric imager providing precision geo-referenced and fully digital EO/IR NITFS output imagery. The LWIR sensor is mounted to a patent-pending jitter-reduction stage to correct for the high-frequency motion typically found on small aircraft and unmanned systems. Details will be presented on both the wide-area and inspection EyePod sensor systems, their modes of operation, and results from recent flight demonstrations.

  17. Towards an improved LAI collection protocol via simulated field-based PAR sensing

    DOE PAGES

    Yao, Wei; Van Leeuwen, Martin; Romanczyk, Paul; ...

    2016-07-14

    In support of NASA's next-generation spectrometer, the Hyperspectral Infrared Imager (HyspIRI), we are working towards assessing sub-pixel vegetation structure from imaging spectroscopy data. Of particular interest is Leaf Area Index (LAI), which is an informative, yet notoriously challenging parameter to efficiently measure in situ. While photosynthetically-active radiation (PAR) sensors have been validated for measuring crop LAI, there is limited literature on the efficacy of PAR-based LAI measurement in the forest environment. This study (i) validates PAR-based LAI measurement in forest environments, and (ii) proposes a suitable collection protocol, which balances efficiency with measurement variation, e.g., due to sun flecks and various-sized canopy gaps. A synthetic PAR sensor model was developed in the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and used to validate LAI measurement based on first principles and explicitly known leaf geometry. Simulated collection parameters were adjusted to empirically identify optimal collection protocols. These collection protocols were then validated in the field by correlating PAR-based LAI measurement with the normalized difference vegetation index (NDVI) extracted from the "classic" Airborne Visible Infrared Imaging Spectrometer (AVIRIS-C) data (R2 = 0.61). The results indicate that our proposed collection protocol is suitable for measuring the LAI of sparse forest (LAI < 3–5 m2/m2).

  18. Compressive light field imaging

    NASA Astrophysics Data System (ADS)

    Ashok, Amit; Neifeld, Mark A.

    2010-04-01

    Light field imagers such as the plenoptic and the integral imagers inherently measure projections of the four dimensional (4D) light field scalar function onto a two dimensional sensor and therefore, suffer from a spatial vs. angular resolution trade-off. Programmable light field imagers, proposed recently, overcome this spatioangular resolution trade-off and allow high-resolution capture of the (4D) light field function with multiple measurements at the cost of a longer exposure time. However, these light field imagers do not exploit the spatio-angular correlations inherent in the light fields of natural scenes and thus result in photon-inefficient measurements. Here, we describe two architectures for compressive light field imaging that require relatively few photon-efficient measurements to obtain a high-resolution estimate of the light field while reducing the overall exposure time. Our simulation study shows that, compressive light field imagers using the principal component (PC) measurement basis require four times fewer measurements and three times shorter exposure time compared to a conventional light field imager in order to achieve an equivalent light field reconstruction quality.

  19. A novel method to increase LinLog CMOS sensors' performance in high dynamic range scenarios.

    PubMed

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor's maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method.
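
    The control idea can be sketched as a PID loop that scales the exposure time toward a target saturation level. The gains, target, and multiplicative update rule below are illustrative assumptions, not the authors' tuned controller:

```python
class ExposureController:
    """Adaptive PID loop nudging sensor exposure time toward a target
    saturation level (sketch; gains and target are illustrative)."""

    def __init__(self, target=0.9, kp=0.5, ki=0.1, kd=0.05):
        self.target, self.kp, self.ki, self.kd = target, kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, saturation, exposure):
        """Return the next exposure time given the measured saturation."""
        error = self.target - saturation      # positive -> underexposed
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        correction = (self.kp * error + self.ki * self.integral
                      + self.kd * derivative)
        return exposure * (1.0 + correction)
```

    In the paper the measured quantities driving the loop are the saturation level and the image entropy; the sketch keeps only the saturation term to show the control structure.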

  20. Single-event transient imaging with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor.

    PubMed

    Mochizuki, Futa; Kagawa, Keiichiro; Okihara, Shin-ichiro; Seo, Min-Woong; Zhang, Bo; Takasawa, Taishi; Yasutomi, Keita; Kawahito, Shoji

    2016-02-22

    In the work described in this paper, an image reproduction scheme with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor was demonstrated. The sensor captures an object by compressing a sequence of images with focal-plane temporally random-coded shutters, followed by reconstruction of time-resolved images. Because signals are modulated pixel-by-pixel during capturing, the maximum frame rate is defined only by the charge transfer speed and can thus be higher than those of conventional ultra-high-speed cameras. The frame rate and optical efficiency of the multi-aperture scheme are discussed. To demonstrate the proposed imaging method, a 5×3 multi-aperture image sensor was fabricated. The average rising and falling times of the shutters were 1.53 ns and 1.69 ns, respectively. The maximum skew among the shutters was 3 ns. The sensor observed plasma emission by compressing it to 15 frames, and a series of 32 images at 200 Mfps was reconstructed. In the experiment, by correcting disparities and considering temporal pixel responses, artifacts in the reconstructed images were reduced. An improvement in PSNR from 25.8 dB to 30.8 dB was confirmed in simulations.
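
    The capture model amounts to summing a frame sequence under per-aperture random-coded shutters and then inverting the coding. The sketch below shows the determined case, recoverable by least squares; the sensor itself reconstructs more frames than apertures, which additionally requires sparsity priors. All sizes and names are illustrative:

```python
import numpy as np

# Each aperture integrates a T-frame scene under its own random binary
# shutter code: y[a] = sum_t S[a, t] * x[t]  (per pixel).
rng = np.random.default_rng(0)
T, A = 8, 15                                       # frames, apertures (5x3)
x = rng.random(T)                                  # true per-pixel time course
S = rng.integers(0, 2, size=(A, T)).astype(float)  # random shutter codes
y = S @ x                                          # compressed measurements
x_hat, *_ = np.linalg.lstsq(S, y, rcond=None)      # recover the time course
```

    Because the modulation happens pixel-by-pixel on the focal plane, the temporal resolution of the recovered sequence is set by the shutter switching speed rather than by the readout rate, which is the point made in the abstract.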

  1. MEG source imaging method using fast L1 minimum-norm and its applications to signals with brain noise and human resting-state source amplitude images.

    PubMed

    Huang, Ming-Xiong; Huang, Charles W; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L; Baker, Dewleen G; Song, Tao; Harrington, Deborah L; Theilmann, Rebecca J; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M; Edgar, J Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T; Drake, Angela; Lee, Roland R

    2014-01-01

    The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1 minimum norm (Fast-VESTAL) and then used the method to obtain source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions, including SNRs with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with human MEG responses, the results obtained using the conventional beamformer technique were compared with those from Fast-VESTAL, which highlighted the beamformer's problems of signal leakage and distorted source time-courses.

  3. Fingerprint enhancement using a multispectral sensor

    NASA Astrophysics Data System (ADS)

    Rowe, Robert K.; Nixon, Kristin A.

    2005-03-01

    The level of performance of a biometric fingerprint sensor is critically dependent on the quality of the fingerprint images. One of the most common types of optical fingerprint sensors relies on the phenomenon of total internal reflectance (TIR) to generate an image. Under ideal conditions, a TIR fingerprint sensor can produce high-contrast fingerprint images with excellent feature definition. However, images produced by the same sensor under conditions that include dry skin, dirt on the skin, and marginal contact between the finger and the sensor, are likely to be severely degraded. This paper discusses the use of multispectral sensing as a means to collect additional images with new information about the fingerprint that can significantly augment the system performance under both normal and adverse sample conditions. In the context of this paper, "multispectral sensing" is used to broadly denote a collection of images taken under different illumination conditions: different polarizations, different illumination/detection configurations, as well as different wavelength illumination. Results from three small studies using an early-stage prototype of the multispectral-TIR (MTIR) sensor are presented along with results from the corresponding TIR data. The first experiment produced data from 9 people, 4 fingers from each person and 3 measurements per finger under "normal" conditions. The second experiment provided results from a study performed to test the relative performance of TIR and MTIR images when taken under extreme dry and dirty conditions. The third experiment examined the case where the area of contact between the finger and sensor is greatly reduced.

  4. 77 FR 33488 - Certain CMOS Image Sensors and Products Containing Same; Institution of Investigation Pursuant to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-846] Certain CMOS Image Sensors and..., the sale for importation, and the sale within the United States after importation of certain CMOS image sensors and products containing same by reason of infringement of certain claims of U.S. Patent No...

  5. Radiation tolerant compact image sensor using CdTe photodiode and field emitter array (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Masuzawa, Tomoaki; Neo, Yoichiro; Mimura, Hidenori; Okamoto, Tamotsu; Nagao, Masayoshi; Akiyoshi, Masafumi; Sato, Nobuhiro; Takagi, Ikuji; Tsuji, Hiroshi; Gotoh, Yasuhito

    2016-10-01

    A growing demand for incident detection has been recognized since the Great East Japan Earthquake and the subsequent accidents at the Fukushima nuclear power plant in 2011. Radiation-tolerant image sensors are powerful tools for collecting crucial information in the initial stages of such incidents. However, semiconductor-based image sensors such as CMOS and CCD devices have limited tolerance to radiation exposure. The image sensors used in nuclear facilities are conventional vacuum tubes using thermal cathodes, which are large and have high power consumption. In this study, we propose a compact image sensor composed of a CdTe-based photodiode and a matrix-driven Spindt-type electron beam source called a field emitter array (FEA). The basic principle of FEA-based image sensors is similar to that of conventional Vidicon-type camera tubes, but the thermal cathode is replaced by an FEA. The use of a field emitter as the electron source should enable significant size reduction while maintaining high radiation tolerance. Current research on radiation-tolerant FEAs and the development of CdTe-based photoconductive films will be presented.

  6. Holographic leaky-wave metasurfaces for dual-sensor imaging.

    PubMed

    Li, Yun Bo; Li, Lian Lin; Cai, Ben Geng; Cheng, Qiang; Cui, Tie Jun

    2015-12-10

    Metasurfaces have great potential for developing new types of imaging systems owing to their ability to control electromagnetic waves. Here, we propose a new method for dual-sensor imaging based on cross-like holographic leaky-wave metasurfaces composed of hybrid isotropic and anisotropic surface impedance textures. The holographic leaky-wave radiation is generated by special impedance modulation of surface waves excited at the sensor ports. For one sensor, the main leaky-wave radiation beam can be scanned by frequency in one spatial dimension, while frequency scanning in the orthogonal spatial dimension is accomplished by the other sensor. Thus, for a probed object, the imaging plane can be illuminated adequately, and the two-dimensional backward-scattered fields obtained by the dual sensors are used to reconstruct the object. The correlation between beams at different frequencies is very low, because the beams scan deterministically with frequency rather than radiating randomly, and such weakly correlated multiple illuminations are well suited to multi-mode imaging with high resolution and noise robustness. Good reconstruction results are given to validate the proposed imaging method.

  7. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    PubMed

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distribution of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with a read-out circuit. Apyrase and acetylcholinesterase (AChE) are used as selective elements to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their prevention capability is demonstrated. The results are used to design the spacing between the enzyme-immobilized pixels and the null H+ sensor so as to minimize undesired signal overlap caused by H+ diffusion. Using this bio-image sensor, we can obtain H+-diffusion-independent imaging of concentration gradients of ATP and ACh in real time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. With the proposed bio-image sensor, it is possible to customize the monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or proton-generating enzymes.

  8. MTF evaluation of white pixel sensors

    NASA Astrophysics Data System (ADS)

    Lindner, Albrecht; Atanassov, Kalin; Luo, Jiafu; Goma, Sergio

    2015-01-01

    We present a methodology for comparing image sensors with traditional Bayer RGB layouts to sensors with alternative layouts containing white pixels. We focused on the sensors' resolving powers, which we measured in the form of a modulation transfer function for variations in both the luma and chroma channels. We present the design of the test chart, the acquisition of images, the image analysis, and an interpretation of the results. We demonstrate the approach using the example of two sensors that differ only in their color filter arrays. We confirmed that the sensor with white pixels and the corresponding demosaicing result in a higher resolving power in the luma channel, but a lower resolving power in the chroma channels, when compared to the traditional Bayer sensor.
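
    A resolving-power comparison of this kind rests on the standard definition of the MTF as the normalized Fourier magnitude of a line-spread function. The chart design and luma/chroma channel separation of the record are not reproduced here; this is only the core definition:

```python
import numpy as np

def mtf_from_lsf(lsf):
    """Modulation transfer function: normalized magnitude of the Fourier
    transform of a measured line-spread function."""
    mag = np.abs(np.fft.rfft(lsf))
    return mag / mag[0]  # normalize so that MTF(0) = 1
```

    A sensor/demosaic combination with higher resolving power keeps the MTF closer to 1 out to higher spatial frequencies.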

  9. Analysis of simulated image sequences from sensors for restricted-visibility operations

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar

    1991-01-01

    A real-time model of the visible output from a 94 GHz sensor, based on a radiometric simulation of the sensor, was developed. A sequence of images as seen from an aircraft during a landing approach was simulated using this model. Thirty frames from this sequence of 200 x 200 pixel images were analyzed to identify and track objects in the images using the Cantata image processing package within the visual programming environment provided by the Khoros software system. The image analysis operations are described.

  10. Fully wireless pressure sensor based on endoscopy images

    NASA Astrophysics Data System (ADS)

    Maeda, Yusaku; Mori, Hirohito; Nakagawa, Tomoaki; Takao, Hidekuni

    2018-04-01

    In this paper, the result of developing a fully wireless pressure sensor based on endoscopy images for endoscopic surgery is reported for the first time. The sensor device exhibits structural color from a nm-scale narrow gap, and the gap changes with air pressure. The structural color of the sensor is read out from camera images, so pressure detection can be realized with existing endoscope configurations alone. The sensor is intended to measure the internal air pressure of the human body during flexible-endoscope operation. Air pressure monitoring has two important purposes. The first is to quantitatively measure tumor size under constant air pressure for treatment selection. The second is to prevent endangering the patient through over-transmission of air. The developed sensor was evaluated, and the detection principle based only on endoscopy images has been successfully demonstrated.

  11. Hybrid MRI-Ultrasound acquisitions, and scannerless real-time imaging.

    PubMed

    Preiswerk, Frank; Toews, Matthew; Cheng, Cheng-Chieh; Chiou, Jr-Yuan George; Mei, Chang-Sheng; Schaefer, Lena F; Hoge, W Scott; Schwartz, Benjamin M; Panych, Lawrence P; Madore, Bruno

    2017-09-01

    To combine MRI, ultrasound, and computer science methodologies toward generating MRI contrast at the high frame rates of ultrasound, inside and even outside the MRI bore. A small transducer, held onto the abdomen with an adhesive bandage, collected ultrasound signals during MRI. Based on these ultrasound signals and their correlations with MRI, a machine-learning algorithm created synthetic MR images at frame rates up to 100 per second. In one particular implementation, volunteers were taken out of the MRI bore with the ultrasound sensor still in place, and MR images were generated on the basis of the ultrasound signal and learned correlations alone, in a "scannerless" manner. Hybrid ultrasound-MRI data were acquired in eight separate imaging sessions. Locations of liver features in synthetic images were compared with those from acquired images: the mean error was 1.0 pixel (2.1 mm), with a best case of 0.4 and a worst case of 4.1 pixels (in the presence of heavy coughing). For results from outside the bore, qualitative validation involved optically tracked ultrasound imaging with and without coughing. The proposed setup can generate an accurate stream of high-speed MR images, up to 100 frames per second, inside or even outside the MR bore. Magn Reson Med 78:897-908, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  12. Transmission-Type 2-Bit Programmable Metasurface for Single-Sensor and Single-Frequency Microwave Imaging

    PubMed Central

    Li, Yun Bo; Li, Lian Lin; Xu, Bai Bing; Wu, Wei; Wu, Rui Yuan; Wan, Xiang; Cheng, Qiang; Cui, Tie Jun

    2016-01-01

    The programmable and digital metamaterials or metasurfaces presented recently have huge potential for designing real-time-controlled electromagnetic devices. Here, we propose the first transmission-type 2-bit programmable coding metasurface for single-sensor and single-frequency imaging at microwave frequency. Compared with existing single-sensor imagers composed of active spatial modulators whose units are controlled independently, we introduce a randomly programmable metasurface to generate the modulator masks, in which rows and columns are controlled simultaneously, so that the complexity and cost of the imaging system are reduced drastically. Unlike the single-sensor approach based on frequency agility, the proposed imaging system uses variable modulators at a single frequency, which avoids object dispersion. To realize the transmission-type 2-bit programmable metasurface, we propose a two-layer binary coding unit, which is convenient for changing the voltages on rows and columns to switch the diodes in the top and bottom layers, respectively. In our imaging measurements, we generate random codes by computer to achieve different transmission patterns, which provide enough measurement modes to solve the inverse-scattering problem in single-sensor imaging. Simple experimental results are presented at microwave frequency, validating the new single-sensor and single-frequency imaging system. PMID:27025907
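    The single-sensor scheme described in this record amounts to inverting a linear system: each random mask pattern yields one scalar sensor reading, and the scene is recovered from many such readings. A minimal noiseless sketch under assumed dimensions; the 4 × 4 scene, the number of masks, and the four-level (2-bit) transmission values are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas = 16, 64                      # 4x4 scene, overdetermined mask set
scene = rng.random(n_pix)

# Each mask row holds one random 2-bit (four-level) transmission pattern.
masks = rng.integers(0, 4, size=(n_meas, n_pix)) / 3.0

readings = masks @ scene                    # one scalar reading per mask
recovered, *_ = np.linalg.lstsq(masks, readings, rcond=None)
print(np.allclose(recovered, scene))
```

    With noisy readings or fewer masks than pixels, the least-squares solve would be replaced by a regularized inverse-scattering solver, which is where the "enough measurement modes" requirement in the abstract comes from.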

  14. CMOS image sensor-based immunodetection by refractive-index change.

    PubMed

    Devadhasan, Jasmine P; Kim, Sanghyo

    2012-01-01

    A complementary metal oxide semiconductor (CMOS) image sensor is an intriguing technology for the development of novel biosensors. Indeed, the mechanism by which a CMOS image sensor detects the antigen-antibody (Ag-Ab) interaction at the nanoscale has so far been ambiguous, and more extensive research has been necessary to understand it and to achieve point-of-care diagnostic devices. This research demonstrates CMOS image sensor-based analysis of Ag-Ab interactions of cardiovascular disease markers, such as C-reactive protein (CRP) and troponin I, on indium nanoparticle (InNP) substrates through simple photon-count variation. The developed sensor can detect proteins even at fg/mL concentrations under ordinary room light. Possible mechanisms, such as dielectric-constant and refractive-index changes, have been studied and proposed. A dramatic change in the refractive index after protein adsorption on the InNP substrate was observed to be the predominant factor in CMOS image sensor-based immunoassay.

  15. The advanced along track scanning radiometer (aatsr) on esa's envisat satellite - an early assessment

    NASA Astrophysics Data System (ADS)

    Llewellyn-Jones, D.; Mutlow, C.; Smith, D.; Edwards, M.

    The AATSR sensor is an imaging radiometer designed to measure top-of-the-atmosphere brightness temperature in seven thermal-infrared, reflected-infrared and visible wavelength channels. The main objective of the AATSR mission is to generate fields of global sea-surface temperature to the high levels of accuracy required for the monitoring and detection of climate change, and to support a broad range of associated research into the marine, terrestrial, cryospheric and atmospheric environments. An essential component of this objective is to maintain continuity with the high-quality data sets already collected from the two predecessor sensors, ATSR-1 and ATSR-2, on ESA's ERS-1 and ERS-2 satellites respectively. Following the successful launch of ENVISAT on 1 March 2002, the AATSR sensor was activated and systematically brought up to full operating configuration in accordance with the agreed Switch-On and Data Acquisition Plan (SODAP). The early images from AATSR are of a quality consistent with its objective of effective data continuity. Since the instrument began returning data, a programme of quality assessment has been taking place. This has included a systematic assessment of instrumental aspects such as signal-to-noise performance and image stability, as well as the initial observations in the AATSR validation programme. In this programme, AATSR data products are compared with correlative observations from other sources, which include sea-borne radiometers, meteorological analysis fields, and data from other satellites. This paper reports early results from some of these activities.

  16. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications

    PubMed Central

    Tokuda, Takashi; Noda, Toshihiko; Sasagawa, Kiyotaka; Ohta, Jun

    2010-01-01

    In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors’ architecture on the basis of the type of electric measurement or imaging functionalities. PMID:28879978

  17. Practical design and evaluation methods of omnidirectional vision sensors

    NASA Astrophysics Data System (ADS)

    Ohte, Akira; Tsuzuki, Osamu

    2012-01-01

    A practical omnidirectional vision sensor, consisting of a curved mirror, a mirror-supporting structure, and a megapixel digital imaging system, can view a field of 360 deg horizontally and 135 deg vertically. The authors theoretically analyzed and evaluated several curved mirrors, namely, a spherical mirror, an equidistant mirror, and a single viewpoint mirror (hyperboloidal mirror). The focus of their study was mainly on the image-forming characteristics, position of the virtual images, and size of blur spot images. The authors propose here a practical design method that satisfies the required characteristics. They developed image-processing software for converting circular images to images of the desired characteristics in real time. They also developed several prototype vision sensors using spherical mirrors. Reports dealing with virtual images and blur-spot size of curved mirrors are few; therefore, this paper will be very useful for the development of omnidirectional vision sensors.

  18. Principles and Applications of Imaging Radar, Manual of Remote Sensing, 3rd Edition, Volume 2

    NASA Astrophysics Data System (ADS)

    Moran, M. Susan

    Aerial photographs and digital images from orbiting optical scanners are a daily source of information for the general public through newspapers, television, magazines, and posters. Such images are just as prevalent in scientific journal literature. In the last 6 months, more than half of the weekly issues of Eos published an image acquired by a remote digital sensor. As a result, most geoscientists are familiar with the characteristics and even the acronyms of the current satellites and their optical sensors, common detector filters, and image presentation. In many cases, this familiarity has bred contempt. This is so because the limitations of optical sensors (imaging in the visible and infrared portions of the electromagnetic spectrum) can be quite formidable. Images of the surface cannot be acquired through clouds, and image quality is impaired with low-light conditions (such as at polar regions), atmospheric scattering and absorption, and variations in sun/sensor/surface geometry.

  19. Positron emission imaging device and method of using the same

    DOEpatents

    Bingham, Philip R.; Mullens, James Allen

    2013-01-15

    An imaging system and method of imaging are disclosed. The imaging system can include an external radiation source producing pairs of substantially simultaneous radiation emissions, a picturization emission and a verification emission, at an emission angle. The imaging system can also include a plurality of picturization sensors and at least one verification sensor for detecting the picturization and verification emissions, respectively. The imaging system also includes an object stage arranged such that a picturization emission can pass through an object supported on said object stage before being detected by one of said plurality of picturization sensors. A coincidence system and a reconstruction system can also be included. The coincidence system can receive information from the picturization and verification sensors and determine whether a detected picturization emission is direct radiation or scattered radiation. The reconstruction system can produce a multi-dimensional representation of an object imaged with the imaging system.

  20. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array, and a RISC core. The pixel-parallel PE array is responsible for transferring, storing, and processing raw image data in SIMD fashion with its own programming language. The RPs are a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a great amount of computation in few instruction cycles and therefore satisfy low- and middle-level high-speed image processing requirements. The RISC core controls the whole system operation and performs some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect the major components. A programming language and corresponding tool chain for this computational image sensor have also been developed.

  1. Digital and optical shape representation and pattern recognition; Proceedings of the Meeting, Orlando, FL, Apr. 4-6, 1988

    NASA Technical Reports Server (NTRS)

    Juday, Richard D. (Editor)

    1988-01-01

    The present conference discusses topics in pattern-recognition correlator architectures, digital stereo systems, geometric image transformations and their applications, topics in pattern recognition, filter algorithms, object detection and classification, shape representation techniques, and model-based object recognition methods. Attention is given to edge-enhancement preprocessing using liquid crystal TVs, massively-parallel optical data base management, three-dimensional sensing with polar exponential sensor arrays, the optical processing of imaging spectrometer data, hybrid associative memories and metric data models, the representation of shape primitives in neural networks, and the Monte Carlo estimation of moment invariants for pattern recognition.

  2. Optical and digital pattern recognition; Proceedings of the Meeting, Los Angeles, CA, Jan. 13-15, 1987

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Editor); Schenker, Paul (Editor)

    1987-01-01

    The papers presented in this volume provide an overview of current research in both optical and digital pattern recognition, with a theme of identifying overlapping research problems and methodologies. Topics discussed include image analysis and low-level vision, optical system design, object analysis and recognition, real-time hybrid architectures and algorithms, high-level image understanding, and optical matched filter design. Papers are presented on synthetic estimation filters for a control system; white-light correlator character recognition; optical AI architectures for intelligent sensors; interpreting aerial photographs by segmentation and search; and optical information processing using a new photopolymer.

  3. Approximating SIR-B response characteristics and estimating wave height and wavelength for ocean imagery

    NASA Technical Reports Server (NTRS)

    Tilley, David G.

    1987-01-01

    NASA Space Shuttle Challenger SIR-B ocean scenes are used to derive directional wave spectra for which speckle noise is modeled as a function of Rayleigh random phase coherence downrange and Poisson random amplitude errors inherent in the Doppler measurement of along-track position. A Fourier filter that preserves SIR-B image phase relations is used to correct the stationary and dynamic response characteristics of the remote sensor and scene correlator, as well as to subtract an estimate of the speckle noise component. A two-dimensional map of sea surface elevation is obtained after the filtered image is corrected for both random and deterministic motions.

  4. In situ microscopy for on-line determination of biomass.

    PubMed

    Bittner, C; Wehnert, G; Scheper, T

    1998-10-05

    A sensor is presented which allows on-line microscopic observation of microorganisms during fermentations in bioreactors. This sensor, an In Situ Microscope (ISM), consists of a direct-light microscope with a measuring chamber, integrated in a 25 mm stainless steel tube, two CCD cameras, and two frame-grabbers. The data obtained are processed by an automatic image analysis system. The ISM is connected to the bioreactor via a standard port and is immersed directly in the culture liquid, in our case Saccharomyces cerevisiae in a synthetic medium. The microscopic examination of the liquid is performed in the measuring chamber, which is situated near the front end of the sensor head. The measuring chamber is opened and closed periodically. In the open state, the liquid in the bioreactor flows unrestricted through the chamber. On closing, a defined volume of 2.2 × 10⁻⁸ mL of the liquid is enclosed. After a few seconds, when the movement of the cells in the enclosed culture has stopped, they are examined with the microscope. The microscopic images of the cells are registered with the CCD cameras and visualized on a monitor, allowing a direct view of the cell population. After detection, the measuring chamber reopens and the enclosed liquid is released. The images obtained are evaluated for cell concentration, cell size, cell volume, biomass, and other relevant parameters simultaneously by automatic image analysis. With a PC (486/33 MHz), image processing takes about 15 s per image. The detection range tested when measuring cells of S. cerevisiae is about 10⁶ to 10⁹ cells/mL (equivalent to a biomass of 0.01 g/L to 12 g/L). The calculated biomass values correlate very well with those obtained using dry-weight analysis. Furthermore, histograms can be calculated that are comparable to those obtained by flow cytometry. Copyright 1998 John Wiley & Sons, Inc.
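    Because the closed chamber encloses a defined volume (quoted above as 2.2 × 10⁻⁸ mL), converting a per-image cell count to a concentration is a single division. A toy sketch; the helper name and the example count of 22 cells are illustrative, not from the paper:

```python
def cells_per_ml(count, chamber_volume_ml=2.2e-8):
    """Per-image cell count -> concentration, given the enclosed chamber volume."""
    return count / chamber_volume_ml

# 22 cells counted in one enclosed volume corresponds to 1e9 cells/mL,
# the upper end of the detection range quoted for S. cerevisiae.
print(f"{cells_per_ml(22):.1e}")  # 1.0e+09
```

    In practice the count would be averaged over many chamber fills, since a single 2.2 × 10⁻⁸ mL sample contains very few cells at the low end of the detection range.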

  5. The NASA Airborne Earth Science Microwave Imaging Radiometer (AESMIR): A New Sensor for Earth Remote Sensing

    NASA Technical Reports Server (NTRS)

    Kim, Edward

    2003-01-01

    The Airborne Earth Science Microwave Imaging Radiometer (AESMIR) is a versatile new airborne imaging radiometer recently developed by NASA. The AESMIR design is unique in that it performs dual-polarized imaging at all standard passive microwave frequency bands (6-89 GHz) using only one sensor head/scanner package, providing an efficient solution for Earth remote sensing applications (snow, soil moisture/land parameters, precipitation, ocean winds, sea surface temperature, water vapor, sea ice, etc.). The microwave radiometers themselves incorporate state-of-the-art receivers, with particular attention given to instrument calibration for the best possible accuracy and sensitivity. The single-package design of AESMIR makes it compatible with high-altitude aircraft platforms such as the NASA ER-2s. The arbitrary two-axis gimbal can perform conical and cross-track scanning, as well as fixed-beam staring. This compatibility with high-altitude platforms, coupled with the flexible scanning configuration, opens up previously unavailable science opportunities for convection/precipitation/cloud science and for co-flying with complementary instruments, as well as providing wider swath coverage for all science applications. By designing AESMIR to be compatible with these high-altitude platforms, we are also compatible with the NASA P-3, the NASA DC-8, C-130s, and ground-based deployments. Thus AESMIR can provide low-, mid-, and high-altitude microwave imaging. Parallel filter banks allow AESMIR to simultaneously simulate the exact passbands of multiple satellite radiometers: SSM/I, TMI, AMSR, Windsat, SSMI/S, and the upcoming GPM/GMI and NPOESS/CMIS instruments, a unique capability among aircraft radiometers. An L-band option is also under development, again using the same scanner; with this option, simultaneous imaging from 1.4 to 89 GHz will be feasible. In the near future, all receivers except the sounding channels will be configured for 4-Stokes polarimetric operation using high-speed digital correlators. The capabilities and unique design features of this new sensor are described, and example imagery is presented.

  6. Characterisation of a smartphone image sensor response to direct solar 305nm irradiation at high air masses.

    PubMed

    Igoe, D P; Amar, A; Parisi, A V; Turner, J

    2017-06-01

    This research reports, for the first time, the sensitivity, properties and response of a smartphone image sensor used to characterise the photobiologically important direct UVB solar irradiance at 305 nm in clear-sky conditions at high air masses. Solar images taken from autumn to spring were analysed using a custom Python script, written to develop and apply an adaptive threshold that mitigates the effects of both noise and hot-pixel aberrations in the images. The images were taken in an unobstructed area, observing from a solar zenith angle as high as 84° (air mass = 9.6) to the local solar maximum (up to a solar zenith angle of 23°) to fully develop the calibration model, in temperatures that varied from 2 °C to 24 °C. The mean ozone thickness throughout all observations was 281 ± 18 DU (2 standard deviations). A Langley plot was used to confirm that atmospheric conditions were constant throughout the observations. The quadratic calibration model developed shows a strong correlation between the red colour channel of the smartphone and the Microtops measurements of direct-sun 305 nm UV, with a coefficient of determination of 0.998 and very low standard errors. Validation verified the robustness of the method and the model, with an average discrepancy of only 5% between smartphone-derived and Microtops-observed direct solar irradiances at 305 nm. The results demonstrate the effectiveness of the smartphone image sensor as a means of measuring photobiologically important solar UVB radiation. The use of ubiquitous portable technologies, such as smartphones and laptop computers, to collect and analyse solar UVB observations is an example of how scientific investigations can be performed by citizen scientists, community groups, and schools. Copyright © 2017 Elsevier B.V. All rights reserved.
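    The paper's Python script is not reproduced here; the following is a hedged sketch of one way an adaptive threshold could suppress both background noise and isolated hot pixels in a solar frame: a 3 × 3 median filter knocks out single-pixel aberrations, then a robust data-driven threshold (median + k × MAD-derived sigma) zeroes the background. The constant k, the synthetic frame, and the function names are assumptions for illustration only:

```python
import numpy as np

def median3x3(img):
    """3x3 median filter (edge-padded); removes isolated hot pixels."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    shifts = [padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)]
    return np.median(np.stack(shifts), axis=0)

def adaptive_threshold(img, k=3.0):
    """Zero out everything below median + k * robust sigma of the filtered frame."""
    filtered = median3x3(img)
    med = np.median(filtered)
    sigma = 1.4826 * np.median(np.abs(filtered - med))  # MAD -> Gaussian sigma
    return np.where(filtered > med + k * sigma, filtered, 0.0)

rng = np.random.default_rng(0)
frame = rng.normal(10.0, 2.0, (64, 64))  # background noise
frame[30:34, 30:34] = 200.0              # bright (solar) region
frame[5, 5] = 150.0                      # isolated hot pixel
cleaned = adaptive_threshold(frame)      # solar region kept, hot pixel removed
```

    Deriving the threshold from the frame itself is what makes it "adaptive": it tracks the changing background level as the solar zenith angle and exposure vary across the observing campaign.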

  7. Automated assembly of camera modules using active alignment with up to six degrees of freedom

    NASA Astrophysics Data System (ADS)

    Bräuniger, K.; Stickler, D.; Winters, D.; Volmer, C.; Jahn, M.; Krey, S.

    2014-03-01

    With the upcoming Ultra High Definition (UHD) cameras, accurate alignment of the optical system with respect to the UHD image sensor becomes increasingly important. Even with a perfect objective lens, image quality deteriorates when the lens is poorly aligned to the sensor. The Modulation Transfer Function (MTF) is used as the most widely accepted test of imaging quality. The first part describes how the alignment errors that lead to low imaging quality can be measured. Collimators with crosshairs at defined field positions, or a test chart, are used as object generators for infinite-finite or finite-finite conjugation, respectively. The process of accurately aligning the image sensor to the optical system is then described. The focus position, shift, tilt, and rotation of the image sensor are automatically corrected to obtain an optimized MTF at all field positions, including the center. The software algorithm that grabs images, calculates the MTF, and adjusts the image sensor in six degrees of freedom within less than 30 seconds per UHD camera module is described. The resulting accuracy of the image sensor rotation is better than 2 arcmin, and the positional alignment accuracy in x, y, z is better than 2 μm. Finally, the process of gluing and UV-curing, and how it is managed in the integrated process, is described.

  8. Fast range estimation based on active range-gated imaging for coastal surveillance

    NASA Astrophysics Data System (ADS)

    Kong, Qingshan; Cao, Yinan; Wang, Xinwei; Tong, Youwan; Zhou, Yan; Liu, Yuliang

    2012-11-01

    Coastal surveillance is important for search and rescue, detection of illegal immigration, harbor security, and so on, and range estimation is critical for precisely detecting targets. A range-gated laser imaging sensor is well suited to high-accuracy ranging, especially at night with no moonlight. Generally, before the target can be detected, the gate delay time must be swept until the target is captured. The range-gated imaging sensor has two operating modes: a passive imaging mode and a gate-viewing mode. First, the sensor operates in passive mode, only capturing scenes with the ICCD; once an object appears in the monitored area, its coarse range is obtained from the imaging geometry/projective transform. The sensor then switches to gate-viewing mode: applying microsecond laser pulses and a matched sensor gate width, the target range is obtained from at least two consecutive images with trapezoid-shaped range-intensity profiles. Based on the first step, the rough range is calculated and the delay time at which the target is detected can be fixed quickly. This technique overcomes the depth-resolution limitation of 3D active imaging and enables super-resolution depth mapping with reduced imaging data processing. With these two steps, the distance between the object and the sensor can be obtained quickly.
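    The gate delay in the scheme above maps directly to range through the round-trip time of light, which is what the delay sweep exploits. A minimal sketch; the 2 μs example delay is invented for illustration:

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_delay(delay_s):
    """Target range implied by a round-trip gate delay."""
    return C * delay_s / 2.0

# A 2 microsecond gate delay corresponds to a target roughly 300 m away.
print(round(range_from_delay(2e-6)))  # 300
```

    The coarse range from the passive-mode geometry gives a starting delay for this conversion, so the gate-viewing sweep only has to refine it rather than search the whole delay axis.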

  9. Technical guidance for the development of a solid state image sensor for human low vision image warping

    NASA Technical Reports Server (NTRS)

    Vanderspiegel, Jan

    1994-01-01

    This report surveys different technologies and approaches to realize sensors for image warping. The goal is to study the feasibility, technical aspects, and limitations of making an electronic camera with special geometries which implements certain transformations for image warping. This work was inspired by the research done by Dr. Juday at NASA Johnson Space Center on image warping. The study has looked into different solid-state technologies to fabricate image sensors. It is found that among the available technologies, CMOS is preferred over CCD technology. CMOS provides more flexibility to design different functions into the sensor, is more widely available, and is a lower cost solution. By using an architecture with row and column decoders one has the added flexibility of addressing the pixels at random, or read out only part of the image.

  10. Got Point Clouds: Characterizing Canopy Structure With Active and Passive Sensors

    NASA Astrophysics Data System (ADS)

    Popescu, S. C.; Malambo, L.; Sheridan, R.; Putman, E.; Murray, S.; Rooney, W.; Rajan, N.

    2016-12-01

    Unmanned Aerial Systems (UAS) provide the means to acquire highly customized aerial data at local scale with a multitude of sensors. UAS allow us to affordably obtain repeated observations of canopy structure for agricultural and natural resources applications by using passive optical sensors, such as cameras and photogrammetric techniques, and active sensors, such as lidar (Light Detection and Ranging). The objectives of this presentation are to: (1) offer a brief overview of UAS used for agriculture and natural resources studies, (2) describe experiences in conducting agriculture phenotyping and forest vegetation measurements, and (3) give details on the methodology developed for image and lidar data processing for characterizing the three-dimensional structure of plant canopies. The UAS types used for this purpose included rotary platforms, such as quadcopters, hexacopters, and octocopters, with a payload capacity of up to 19 lbs. The sensors that collected data over two crop seasons include multispectral cameras in the visible color spectrum and near infrared, and UAS-lidar. For ground reference data we used terrestrial lidar scanners and field measurements. Results comparing UAS and terrestrial measurements show high correlation and open new areas of scientific investigation of crop canopies previously not possible with affordable techniques.

  11. Multiocular image sensor with on-chip beam-splitter and inner meta-micro-lens for single-main-lens stereo camera.

    PubMed

    Koyama, Shinzo; Onozawa, Kazutoshi; Tanaka, Keisuke; Saito, Shigeru; Kourkouss, Sahim Mohamed; Kato, Yoshihisa

    2016-08-08

    We developed multiocular 1/3-inch, 2.75-μm-pixel-size, 2.1M-pixel image sensors through co-design of an on-chip beam-splitter and a 100-nm-width, 800-nm-depth patterned inner meta-micro-lens for single-main-lens stereo camera systems. A camera with the multiocular image sensor can capture a horizontally one-dimensional light field: the on-chip beam-splitter horizontally divides rays according to incidence angle, and the inner meta-micro-lens collects the divided rays into pixels with small optical loss. Cross-talk between adjacent light-field images is as low as 6% for a fabricated binocular image sensor and 7% for a quad-ocular image sensor. By selecting two images from the one-dimensional light-field images, a selectable baseline for stereo vision is realized to view close objects with a single main lens. In addition, by adding multiple light-field images with different ratios, the baseline distance can be tuned within the aperture of the main lens. We suggest that this electrically selectable or tunable baseline stereo vision can reduce the 3D fatigue of viewers.

  12. LANDSAT-D Thematic Mapper image dimensionality reduction and geometric correction accuracy. [Walnut Creek Watershed, Texas

    NASA Technical Reports Server (NTRS)

    Ford, G. E. (Principal Investigator)

    1984-01-01

    A principal components transformation was applied to a Walnut Creek, Texas subscene to reduce the dimensionality of the multispectral sensor data. This transformation was also applied to a LANDSAT 3 MSS subscene of the same area acquired in a different season and year. Results of both procedures are tabulated and allow comparisons between TM and MSS data. The TM correlation matrix shows that visible bands 1 to 3 exhibit a high degree of correlation, in the range 0.92 to 0.96. The correlation for bands 5 and 7 is 0.93. Band 4 is not highly correlated with any other band, with correlations in the range 0.13 to 0.52. The thermal band (6) is not highly correlated with the other bands, with correlations in the range 0.13 to 0.46. The MSS correlation matrix shows that bands 4 and 5 are highly correlated (0.96), as are bands 6 and 7, with a correlation of 0.92.
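    The inter-band correlation structure reported in this record, with visible bands highly correlated with one another and nearly uncorrelated with the near-infrared band, is exactly what makes a principal components transformation effective for dimensionality reduction: most of the variance collapses into the first component. A synthetic sketch; the simulated bands and their noise levels are illustrative, not TM data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
shared = rng.normal(size=n)                 # variance common to the visible bands
band1 = shared + 0.2 * rng.normal(size=n)   # visible band
band2 = shared + 0.2 * rng.normal(size=n)   # visible band, highly correlated with band1
band4 = rng.normal(size=n)                  # near-infrared analogue: independent

corr = np.corrcoef(np.stack([band1, band2, band4]))
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]

print(corr[0, 1].round(2))                    # high, near 0.96
print((eigvals[0] / eigvals.sum()).round(2))  # first PC carries most variance
```

    In the correlated pair, the leading eigenvalue of the correlation matrix absorbs the shared variance, so the corresponding principal component can stand in for both visible bands with little information loss.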

  13. Improved target detection algorithm using Fukunaga-Koontz transform and distance classifier correlation filter

    NASA Astrophysics Data System (ADS)

    Bal, A.; Alam, M. S.; Aslan, M. S.

    2006-05-01

    Often sensor ego-motion or fast target movement causes the target to temporarily go out of the field of view, leading to the reappearing-target detection problem in target tracking applications. Since the target leaves the current frame and reenters at a later frame, the reentry location and the variations in rotation, scale, and other 3D orientations of the target are not known, which complicates detection. A detection algorithm has therefore been developed using the Fukunaga-Koontz transform (FKT) and a distance classifier correlation filter (DCCF). The detection algorithm uses target and background information, extracted from training samples, to detect possible candidate target images. The detected candidate target images are then introduced into the second algorithm, the DCCF-based clutter rejection module, to determine the target coordinates and initiate the tracking algorithm. The performance of the proposed FKT-DCCF based target detection algorithm has been tested using real-world forward-looking infrared (FLIR) video sequences.
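
    A minimal sketch of the Fukunaga-Koontz transform at the core of the detection stage, using toy covariance matrices in place of real target and clutter training statistics. The key property is that the two whitened class covariances share eigenvectors, with eigenvalues summing to one:

```python
import numpy as np

def fukunaga_koontz(Q1, Q2):
    """Fukunaga-Koontz transform of two class covariance matrices.

    Whitens Q1 + Q2 with operator P, then eigendecomposes the whitened
    Q1.  Its eigenvalues lie in [0, 1] and the whitened Q2 shares the
    same eigenvectors with complementary eigenvalues, so the most
    discriminative directions for one class are the least for the other.
    """
    d, Phi = np.linalg.eigh(Q1 + Q2)
    P = Phi @ np.diag(d ** -0.5)    # whitening operator: P.T (Q1+Q2) P = I
    lam, V = np.linalg.eigh(P.T @ Q1 @ P)
    return lam, V, P

# Toy covariances standing in for target and clutter training statistics.
rng = np.random.default_rng(1)
A, B = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
Q1, Q2 = A @ A.T, B @ B.T
lam, V, P = fukunaga_koontz(Q1, Q2)
```

    Candidate detection then projects image patches onto the eigenvectors with eigenvalues nearest 1 (target-dominant) and nearest 0 (clutter-dominant).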

  14. Resolution Enhancement of Hyperion Hyperspectral Data using Ikonos Multispectral Data

    DTIC Science & Technology

    2007-09-01

    ...spatial-resolution hyperspectral image to produce a sharpened product. The result is a product that has the spectral properties of the ...multispectral sensors. In this work, we examine the benefits of combining data from high-spatial-resolution, low-spectral-resolution spectral imaging ...sensors with data obtained from high-spectral-resolution, low-spatial-resolution spectral imaging sensors.

  15. Estimating Morning Change in Land Surface Temperature from MODIS Day/Night Observations: Applications for Surface Energy Balance Modeling.

    PubMed

    Hain, Christopher R; Anderson, Martha C

    2017-10-16

    Observations of land surface temperature (LST) are crucial for monitoring surface energy fluxes from satellite. Methods that require high-temporal-resolution LST observations (e.g., from geostationary orbit) can be difficult to apply globally because several geostationary sensors are required to attain near-global coverage (60°N to 60°S). While LST observations from polar-orbiting sensors provide global coverage at higher spatial resolutions, their temporal sampling (twice-daily observations) can pose significant limitations. For example, the Atmosphere Land Exchange Inverse (ALEXI) surface energy balance model, used for monitoring evapotranspiration and drought, requires an observation of the morning change in LST - a quantity not directly observable from polar-orbiting sensors. Therefore, we have developed and evaluated a data-mining approach to estimate the mid-morning rise in LST from the twice-daily observations of a single sensor, the Moderate Resolution Imaging Spectroradiometer (MODIS) on the Aqua platform. In general, the data-mining approach produced estimates with low relative error (5 to 10%) and statistically significant correlations when compared against geostationary observations. This approach will facilitate global, near-real-time applications of ALEXI at higher spatial and temporal coverage from a single sensor than is achievable with current geostationary datasets.

  16. Design and fabrication of vertically-integrated CMOS image sensors.

    PubMed

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors.

  17. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    NASA Astrophysics Data System (ADS)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still places a fundamental limit on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (i.e., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of √N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests are performed on the image sensor's innovative high dynamic range technology to determine its effects on range accuracy. As expected, experimental results show that the sensor provides a trade-off between dynamic range and range accuracy.
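
    The √N improvement from averaging N spatially uncorrelated profiles can be checked with a short Monte-Carlo sketch. The noise model (Gaussian centroid error per profile) is a modeling assumption for illustration, not the paper's speckle model:

```python
import numpy as np

# Monte-Carlo check of the sqrt(N) argument: each of N spatially
# uncorrelated spot profiles yields one centroid estimate with standard
# deviation sigma, and the N estimates are averaged.
rng = np.random.default_rng(2)
sigma = 1.0       # std-dev of a single-profile centroid estimate
N = 16            # uncorrelated profiles captured per spot location
trials = 20000

single = rng.normal(0.0, sigma, size=trials)
averaged = rng.normal(0.0, sigma, size=(trials, N)).mean(axis=1)

improvement = single.std() / averaged.std()   # expect about sqrt(N) = 4
```

    The averaging gain only holds while the profiles remain uncorrelated, which is exactly what the lateral repositioning of the imaging system provides.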

  18. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    PubMed Central

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  19. Toward one Giga frames per second--evolution of in situ storage image sensors.

    PubMed

    Etoh, Takeharu G; Son, Dao V T; Yamada, Tetsuo; Charbon, Edoardo

    2013-04-08

    The ISIS is an ultra-fast image sensor with in-pixel storage. The evolution of the ISIS, past and near-future, is reviewed and forecast. Because the storage area must be covered with a light shield, the conventional frontside-illuminated ISIS has a limited fill factor. To achieve higher sensitivity, a backside-illuminated (BSI) ISIS was developed. To avoid direct intrusion of light and migration of signal electrons into the storage area on the frontside, a cross-sectional sensor structure with thick pnpn layers was developed and named the "Tetratified structure". By folding and looping the in-pixel storage CCDs, an image signal accumulation sensor, the ISAS, is proposed. The ISAS adds a new function, in-pixel signal accumulation, to the ultra-high-speed imaging. To achieve a much higher frame rate, a multi-collection-gate (MCG) BSI image sensor architecture is proposed, in which the photoreceptive area forms a honeycomb-like shape. The performance of a hexagonal CCD-type MCG BSI sensor is examined by simulations; the highest frame rate is theoretically more than 1 Gfps. For the near future, a stacked hybrid CCD/CMOS MCG image sensor seems most promising. The associated problems are discussed; a fine TSV process is the key technology to realize the structure.

  20. Road sign recognition using Viapix module and correlation

    NASA Astrophysics Data System (ADS)

    Ouerhani, Y.; Desthieux, M.; Alfalou, A.

    2015-03-01

    In this paper, we propose and validate a new system for surveying road assets, focusing on vertical road signs. The approach combines road sign detection, recognition, and identification using data provided by sensors. It relies on panoramic views provided by an innovative device, VIAPIX®, developed by our company ACTRIS, together with an optimized correlation technique for recognizing and identifying road signs in the images. The results obtained show the benefit of using panoramic views compared with images provided by a single camera.
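
    A hedged sketch of the correlation-based recognition step: normalized cross-correlation of an observed patch against a reference library. The template names and image sizes are hypothetical stand-ins; the paper's optimized correlator is more elaborate:

```python
import numpy as np

def norm_corr(patch, template):
    """Normalized cross-correlation score of two equal-size images."""
    a = patch - patch.mean()
    b = template - template.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def identify(patch, templates):
    """Return the best-matching reference sign and all scores."""
    scores = {name: norm_corr(patch, t) for name, t in templates.items()}
    return max(scores, key=scores.get), scores

# Hypothetical two-sign reference library (random textures as stand-ins).
rng = np.random.default_rng(3)
templates = {"stop": rng.random((16, 16)), "yield": rng.random((16, 16))}

# A noisy, brightness- and contrast-shifted view still matches "stop".
obs = 0.8 * templates["stop"] + 0.1 + 0.05 * rng.normal(size=(16, 16))
label, scores = identify(obs, templates)
```

    Because the score is invariant to brightness and contrast shifts, it tolerates the illumination changes typical of road imagery.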

  1. A portable NMR sensor to measure dynamic changes in the amount of water in living stems or fruit and its potential to measure sap flow.

    PubMed

    Windt, Carel W; Blümler, Peter

    2015-04-01

    Nuclear magnetic resonance (NMR) and NMR imaging (magnetic resonance imaging) offer the possibility to quantitatively and non-invasively measure the presence and movement of water. Unfortunately, traditional NMR hardware is expensive, poorly suited for plants, and because of its bulk and complexity, not suitable for use in the field. But does it need to be? We here explore how novel, small-scale portable NMR devices can be used as a flow sensor to directly measure xylem sap flow in a poplar tree (Populus nigra L.), or in a dendrometer-like fashion to measure dynamic changes in the absolute water content of fruit or stems. For the latter purpose we monitored the diurnal pattern of growth, expansion and shrinkage in a model fruit (bean pod, Phaseolus vulgaris L.) and in the stem of an oak tree (Quercus robur L.). We compared changes in absolute stem water content, as measured by the NMR sensor, against stem diameter variations as measured by a set of conventional point dendrometers, to test how well the sensitivities of the two methods compare and to investigate how well diurnal changes in trunk absolute water content correlate with the concomitant diurnal variations in stem diameter. Our results confirm the existence of a strong correlation between the two parameters, but also suggest that dynamic changes in oak stem water content could be larger than is apparent on the basis of the stem diameter variation alone.

  2. Illumination adaptation with rapid-response color sensors

    NASA Astrophysics Data System (ADS)

    Zhang, Xinchi; Wang, Quan; Boyer, Kim L.

    2014-09-01

    Smart lighting solutions based on imaging sensors such as webcams or time-of-flight sensors suffer from rising privacy concerns. In this work, we use low-cost non-imaging color sensors to measure local luminous flux of different colors in an indoor space. These sensors have much higher data acquisition rates and are much cheaper than many off-the-shelf commercial products. We have developed several applications with these sensors, including illumination feedback control and occupancy-driven lighting.

  3. Vision communications based on LED array and imaging sensor

    NASA Astrophysics Data System (ADS)

    Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    In this paper, we propose a brand new communication concept, called "vision communication", based on an LED array and an image sensor. The system consists of an LED array as the transmitter and a digital device that includes an image sensor, such as a CCD or CMOS sensor, as the receiver. To transmit data, the proposed communication scheme simultaneously uses digital image processing and optical wireless communication techniques, so a cognitive communication scheme becomes possible with the help of recognition techniques used in vision systems. To increase the data rate, our scheme can use an LED array consisting of several multi-spectral LEDs. Because each LED can emit a multi-spectral optical signal, such as visible, infrared, or ultraviolet light, the data rate can be increased in a manner similar to the WDM and MIMO techniques used in traditional optical and wireless communications. In addition, this multi-spectral capability also makes it possible to avoid optical noise in the communication environment. In our vision communication scheme, the data packet is composed of sync data and information data. The sync data are used to detect the transmitter area and calibrate the distorted image snapshots obtained by the image sensor. By matching the optical modulation rate of the LED array to the frame rate (frames per second) of the image sensor, we can decode the information data included in each image snapshot based on image processing and optical wireless communication techniques. Through experiments on a practical test bed system, we confirm the feasibility of the proposed vision communications based on an LED array and an image sensor.
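
    Assuming the transmitter area has already been detected and the snapshot rectified, decoding one frame reduces to averaging each LED cell and thresholding. This toy sketch assumes a known grid geometry and grayscale imaging, both simplifications of the proposed scheme:

```python
import numpy as np

def encode_frame(bits, cell=8):
    """Render an LED-array state as a grayscale image, one cell per LED."""
    return np.kron(bits.astype(float), np.ones((cell, cell)))

def decode_frame(img, shape, cell=8, thresh=0.5):
    """Recover the LED on/off pattern by averaging each cell."""
    h, w = shape
    cells = img.reshape(h, cell, w, cell).mean(axis=(1, 3))
    return (cells > thresh).astype(int)

bits = np.array([[1, 0, 1, 1],
                 [0, 1, 0, 0]])      # one snapshot's worth of data
rng = np.random.default_rng(4)
frame = encode_frame(bits) + 0.1 * rng.normal(size=(16, 32))  # sensor noise
decoded = decode_frame(frame, bits.shape)
```

    Per-cell averaging is what makes the scheme noise-tolerant: the effective noise on each decision shrinks with the number of pixels per LED cell.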

  4. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    NASA Astrophysics Data System (ADS)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large-format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD to CMOS sensor technology for the development of this new aerial mapping camera. In 2002 the first-generation DMC was developed by Z/I Imaging; it was the first large-format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, the first digital aerial mapping camera to use a single ultra-large CCD sensor and thereby avoid stitching of smaller CCDs. The DMC III is now the third generation of large-format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II, using the same system design with one large monolithic panchromatic sensor and four multispectral camera heads for R, G, B and NIR. For the first time, a 391-megapixel CMOS sensor has been used as the panchromatic sensor, an industry record. CMOS technology brings a range of technical benefits: the dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights are presented and compared with other CCD-based aerial sensors.

  5. On the Character and Mitigation of Atmospheric Noise in InSAR Time Series Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Barnhart, W. D.; Fielding, E. J.; Fishbein, E.

    2013-12-01

    Time series analysis of interferometric synthetic aperture radar (InSAR) data, with its broad spatial coverage and ability to image regions that are sometimes very difficult to access, is a powerful tool for characterizing continental surface deformation and its temporal variations. With the impending launch of dedicated SAR missions such as Sentinel-1, ALOS-2, and the planned NASA L-band SAR mission, large volume data sets will allow researchers to further probe ground displacement processes with increased fidelity. Unfortunately, the precision of measurements in individual interferograms is impacted by several sources of noise, notably spatially correlated signals caused by path delays through the stratified and turbulent atmosphere and ionosphere. Spatial and temporal variations in atmospheric water vapor often introduce several to tens of centimeters of apparent deformation in the radar line-of-sight, correlated over short spatial scales (<10 km). Signals resulting from atmospheric path delays are particularly problematic because, like the subsidence and uplift signals associated with tectonic deformation, they are often spatially correlated with topography. In this talk, we provide an overview of the effects of spatially correlated tropospheric noise in individual interferograms and InSAR time series analysis, and we highlight where common assumptions of the temporal and spatial characteristics of tropospheric noise fail. Next, we discuss two classes of methods for mitigating the effects of tropospheric water vapor noise in InSAR time series analysis and single interferograms: noise estimation and characterization with independent observations from multispectral sensors such as MODIS and MERIS; and noise estimation and removal with weather models, multispectral sensor observations, and GPS. 
Each of these techniques can provide independent assessments of the contribution of water vapor in interferograms, but each also suffers from several pitfalls that we outline. Multispectral near-infrared (NIR) sensors provide high-spatial-resolution (~1 km) estimates of total column tropospheric water vapor by measuring the absorption of reflected solar illumination, and may provide excellent estimates of wet delay. The Online Services for Correcting Atmosphere in Radar (OSCAR) project currently provides water vapor products through web services (http://oscar.jpl.nasa.gov). Unfortunately, such sensors require daytime and cloudless observations. Global and regional numerical weather models can provide an additional estimate of both the dry and wet atmospheric delays with spatial resolutions of 3-100 km and time scales of 1-3 hours, though these models are of lower accuracy than imaging observations and benefit from independent observations of atmospheric water vapor. Despite these issues, the integration of these techniques for InSAR correction and uncertainty estimation may contribute substantially to the reduction and rigorous characterization of uncertainty in InSAR time series analysis - helping to expand the range of tectonic displacements imaged with InSAR, to robustly constrain geophysical models, and to generate a priori assessments of satellite acquisition goals.
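
    One simple instance of topography-correlated noise mitigation, a generic illustration rather than any specific method discussed in the talk, is fitting and removing a linear phase-versus-elevation trend from an interferogram:

```python
import numpy as np

def remove_topo_correlated_delay(phase, elevation):
    """Fit and remove a linear phase-versus-elevation trend.

    A first-order proxy for stratified tropospheric delay: fit
    phase = a*elevation + b over all pixels, then subtract the fitted
    ramp.  Returns the corrected phase and the fitted slope.
    """
    a, b = np.polyfit(elevation.ravel(), phase.ravel(), 1)
    return phase - (a * elevation + b), a

# Synthetic interferogram: an elevation-correlated stratified delay plus
# white noise (toy numbers, not from any real SAR scene).
rng = np.random.default_rng(5)
elev = np.linspace(0.0, 2000.0, 100).reshape(10, 10)   # toy DEM, metres
phase = 0.003 * elev + 0.05 * rng.normal(size=(10, 10))
corrected, slope = remove_topo_correlated_delay(phase, elev)
```

    The danger the talk highlights is precisely that tectonic deformation can also correlate with topography, in which case such a fit removes signal along with the delay.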

  6. A Real Time System for Multi-Sensor Image Analysis through Pyramidal Segmentation

    DTIC Science & Technology

    1992-01-30

    A Real Time System for Multi-Sensor Image Analysis through Pyramidal Segmentation. L. Rudin, S. Osher, G. Koepfler, J.M. Morel. ...experiments with reconnaissance photography, multi-sensor satellite imagery, medical CT and MRI multi-band data have shown a great practical potential...

  7. CMOS image sensor with lateral electric field modulation pixels for fluorescence lifetime imaging with sub-nanosecond time response

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Seo, Min-Woong; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2016-04-01

    This paper presents the design and implementation of a time-resolved CMOS image sensor with a high-speed lateral electric field modulation (LEFM) gating structure for time-domain fluorescence lifetime measurement. Time-windowed signal charge can be transferred from a pinned photodiode (PPD) to a pinned storage diode (PSD) by turning on a pair of transfer gates, which are situated beside the channel. Unwanted signal charge can be drained from the PPD to the drain by turning on another pair of gates. The pixel array contains 512 (V) × 310 (H) pixels with 5.6 × 5.6 µm² pixel size. The imager chip was fabricated using 0.11 µm CMOS image sensor process technology. The prototype sensor has a time response of 150 ps at 374 nm. The fill factor of the pixels is 5.6%. The usefulness of the prototype sensor is demonstrated for fluorescence lifetime imaging through simulation and measurement results.
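
    Given two gated charge samples like those the sensor produces, a mono-exponential lifetime can be recovered with the standard two-gate rapid lifetime determination formula τ = Δt / ln(S₁/S₂). This is a generic illustration; the gate widths and delays below are arbitrary, not the sensor's actual timing:

```python
import numpy as np

def lifetime_two_gate(s1, s2, gate_delay):
    """Two-gate rapid lifetime determination for a mono-exponential decay.

    With two equal-width gates whose openings are gate_delay apart,
    s2/s1 = exp(-gate_delay / tau), so tau = gate_delay / ln(s1 / s2).
    """
    return gate_delay / np.log(s1 / s2)

# Simulate gated charge integration of exp(-t / tau) with tau = 4 ns,
# using two 2 ns windows opening at t = 0 and t = 2 ns (arbitrary values).
tau_true = 4.0                      # ns
width, delay = 2.0, 2.0             # ns
t = np.linspace(0.0, 20.0, 200001)
dt = t[1] - t[0]
decay = np.exp(-t / tau_true)
s1 = decay[(t >= 0.0) & (t < width)].sum() * dt
s2 = decay[(t >= delay) & (t < delay + width)].sum() * dt
tau_est = lifetime_two_gate(s1, s2, delay)   # recovers about 4 ns
```

    In hardware, s1 and s2 correspond to the charges accumulated in the storage diode under two different gate timings across repeated excitation cycles.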

  8. Active pixel sensors with substantially planarized color filtering elements

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R. (Inventor); Kemeny, Sabrina E. (Inventor)

    1999-01-01

    A semiconductor imaging system preferably having an active pixel sensor array compatible with a CMOS fabrication process. Color-filtering elements such as polymer filters and wavelength-converting phosphors can be integrated with the image sensor.

  9. A Wireless Sensor Network for Vineyard Monitoring That Uses Image Processing

    PubMed Central

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

    The first step in detecting whether a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. Placing a sensor on each leaf of every vineyard is obviously not feasible in terms of cost and deployment, so new methods are needed to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images from the field and internally uses image processing techniques to detect any unusual status in the leaves. Such a symptom could be caused by a deficiency, pest, disease or other harmful agent. When it is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the farmer of the problem. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections over large distances in open air. This paper describes the wireless sensor network design, the wireless sensor deployment, how the nodes process the images in order to monitor the vineyard, and the sensor network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiency, pest, disease or other harmful agents, a symptom image database and a neural network could be added in order to learn from experience and provide an accurate problem diagnosis. PMID:22163948

  10. A wireless sensor network for vineyard monitoring that uses image processing.

    PubMed

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

    The first step in detecting whether a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. Placing a sensor on each leaf of every vineyard is obviously not feasible in terms of cost and deployment, so new methods are needed to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images from the field and internally uses image processing techniques to detect any unusual status in the leaves. Such a symptom could be caused by a deficiency, pest, disease or other harmful agent. When it is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the farmer of the problem. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections over large distances in open air. This paper describes the wireless sensor network design, the wireless sensor deployment, how the nodes process the images in order to monitor the vineyard, and the sensor network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiency, pest, disease or other harmful agents, a symptom image database and a neural network could be added in order to learn from experience and provide an accurate problem diagnosis.

  11. A data-management system using sensor technology and wireless devices for port security

    NASA Astrophysics Data System (ADS)

    Saldaña, Manuel; Rivera, Javier; Oyola, Jose; Manian, Vidya

    2014-05-01

    Sensor technologies such as infrared sensors, hyperspectral imaging, and video camera surveillance have proven viable in port security. Drawing from sources such as infrared sensor data, digital camera images and processed hyperspectral images, this article explores the implementation of a real-time data delivery system. In an effort to improve the manner in which anomaly detection data is delivered to interested parties in port security, this system explores how a client-server architecture can provide protected access to data, reports, and device status. Sensor data and hyperspectral image data are kept in a monitored directory, where the system links them to existing users in the database. Since this system renders processed hyperspectral images that are dynamically added to the server - which often occupy a large amount of space - the resolution of these images is trimmed down to around 1024×768 pixels. Any change in an image, or any data modification originating from a sensor, triggers a message to all users associated with that source. These messages are sent to the corresponding users through automatic email generation and through push notifications using Google Cloud Messaging for Android. Moreover, this paper presents the complete architecture for data reception from the sensors, processing, and storage, and discusses how users of this system, such as port security personnel, can benefit from this service by receiving secure real-time notifications when their designated sensors have detected anomalies and/or by remotely accessing results from processed hyperspectral imagery relevant to their assigned posts.

  12. Engineering workstation: Sensor modeling

    NASA Technical Reports Server (NTRS)

    Pavel, M.; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation is comprised of subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term solutions and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real-time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.

  13. Estimation in Linear Systems Featuring Correlated Uncertain Observations Coming from Multiple Sensors

    NASA Astrophysics Data System (ADS)

    Caballero-Águila, R.; Hermoso-Carazo, A.; Linares-Pérez, J.

    2009-08-01

    In this paper, the state least-squares linear estimation problem from correlated uncertain observations coming from multiple sensors is addressed. It is assumed that, at each sensor, the state is measured in the presence of additive white noise and that the uncertainty in the observations is characterized by a set of Bernoulli random variables which are only correlated at consecutive time instants. Assuming that the statistical properties of such variables are not necessarily the same for all the sensors, a recursive filtering algorithm is proposed, and the performance of the estimators is illustrated by a numerical simulation example wherein a signal is estimated from correlated uncertain observations coming from two sensors with different uncertainty characteristics.
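
    A simplified scalar sketch of least-squares filtering with uncertain observations. It assumes independent Bernoulli variables and a single sensor; the paper's contribution, handling variables correlated at consecutive instants and multiple sensors with different statistics, is omitted here:

```python
import numpy as np

def uncertain_obs_filter(ys, a, q, h, r, p):
    """Scalar LMMSE filter for uncertain observations (simplified sketch).

    Model: x_k = a*x_{k-1} + w_k,  y_k = gamma_k*h*x_k + v_k, where
    gamma_k ~ Bernoulli(p) says whether the sensor output contains the
    signal or only noise.  The gammas are assumed independent here.
    """
    x = 0.0
    P = m2 = q / (1.0 - a * a)   # stationary prior variance / second moment
    est = []
    for y in ys:
        x, P = a * x, a * a * P + q              # time update
        m2 = a * a * m2 + q                      # unconditional E[x^2]
        S = p * p * h * h * P + p * (1 - p) * h * h * m2 + r
        K = p * h * P / S                        # LMMSE gain
        x += K * (y - p * h * x)                 # measurement update
        P -= K * p * h * P
        est.append(x)
    return np.array(est)

# Simulate the model and compare against using no measurements at all.
rng = np.random.default_rng(6)
a, q, h, r, p = 0.95, 0.1, 1.0, 0.05, 0.7
n = 5000
xs, ys = np.zeros(n), np.zeros(n)
xk = 0.0
for k in range(n):
    xk = a * xk + rng.normal(0.0, np.sqrt(q))
    gamma = rng.random() < p                 # observation contains signal?
    ys[k] = (h * xk if gamma else 0.0) + rng.normal(0.0, np.sqrt(r))
    xs[k] = xk
est = uncertain_obs_filter(ys, a, q, h, r, p)
mse_filter = np.mean((est - xs) ** 2)
mse_prior = np.mean(xs ** 2)   # error of the best measurement-free estimate
```

    Note the gain uses the unconditional second moment m2: with unknown gammas, an observation is less trustworthy than a regular Kalman update would assume, so the filter down-weights it accordingly.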

  14. Autonomous chemical and biological miniature wireless-sensor

    NASA Astrophysics Data System (ADS)

    Goldberg, Bar-Giora

    2005-05-01

    The presentation discusses a new concept and a paradigm shift in biological, chemical and explosive sensor system design and deployment: from large, heavy, centralized and expensive systems to distributed wireless sensor networks utilizing miniature platforms (nodes) that are lightweight, low cost and wirelessly connected. These new systems are possible due to the emergence and convergence of new innovative radio, imaging, networking and sensor technologies. Miniature integrated radio-sensor networks are a technology whose time has come. These network systems are based on large numbers of distributed low-cost, short-range wireless platforms that sense and process their environment and communicate data through a network to a command center. The recent emergence of chemical and explosive sensor technology based on silicon nanostructures, coupled with the fast evolution of low-cost CMOS imagers, low-power DSP engines and integrated radio chips, has created an opportunity to realize the vision of autonomous wireless networks. These threat detection networks will perform sophisticated analysis at the sensor node and convey alarm information up the command chain. Sensor networks of this type are expected to revolutionize the ability to detect and locate biological, chemical, or explosive threats. The ability to distribute large numbers of low-cost sensors over large areas enables these devices to be close to the targeted threats, improving detection efficiency and enabling rapid counter-responses. These sensor networks will be used for homeland security, shipping container monitoring, and other applications such as laboratory medical analysis, drug discovery, automotive, environmental and/or in-vivo monitoring. Avaak's system concept is to image a chromatic biological, chemical and/or explosive sensor using a digital imager, analyze the images, and distribute alarm or image data wirelessly through the network.
All the imaging, processing and communications take place within the miniature, low-cost distributed sensor platforms. This concept, however, presents a significant challenge due to the combination and convergence of the required new technologies mentioned above. Passive biological and chemical sensors with very high sensitivity, which require no assaying, are in development using a technique to optically and chemically encode silicon wafers with tailored nanostructures. The silicon wafer is patterned with nanostructures designed to change colors and patterns when exposed to the target analytes (TICs, TIMs, VOCs). A small video camera detects the color and pattern changes on the sensor. To determine if an alarm condition is present, an on-board DSP processor, using specialized image processing algorithms and statistical analysis, determines whether color gradient changes have occurred on the sensor array. These sensors can detect several agents simultaneously. The system is currently under development by Avaak, with funding from DARPA through an SBIR grant.
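
    The alarm decision can be illustrated with a deliberately simple stand-in for the statistical analysis described above, comparing per-channel mean intensities between a reference snapshot and the current one; the threshold and image sizes are arbitrary assumptions:

```python
import numpy as np

def color_change_alarm(ref_img, cur_img, thresh=0.1):
    """Flag a colorimetric change between two RGB sensor snapshots.

    Compares per-channel mean intensities; a shift beyond `thresh` in
    any channel raises the alarm.  A simple stand-in for the
    unpublished statistical analysis mentioned in the abstract.
    """
    diff = np.abs(cur_img.mean(axis=(0, 1)) - ref_img.mean(axis=(0, 1)))
    return bool(np.any(diff > thresh)), diff

ref_img = np.full((8, 8, 3), 0.5)   # baseline snapshot of the sensor array
cur_img = ref_img.copy()
cur_img[..., 1] += 0.2              # green channel shifts after exposure
alarm, diff = color_change_alarm(ref_img, cur_img)
```

    Running such a test on the node's DSP keeps the radio traffic down to occasional alarm messages rather than raw image streams.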

  15. Automated Geo/Co-Registration of Multi-Temporal Very-High-Resolution Imagery.

    PubMed

    Han, Youkyung; Oh, Jaehong

    2018-05-17

    For time-series analysis using very-high-resolution (VHR) multi-temporal satellite images, both accurate georegistration to the map coordinates and subpixel-level co-registration among the images should be conducted. However, applying well-known matching methods, such as scale-invariant feature transform and speeded up robust features for VHR multi-temporal images, has limitations. First, they cannot be used for matching an optical image to heterogeneous non-optical data for georegistration. Second, they produce a local misalignment induced by differences in acquisition conditions, such as acquisition platform stability, the sensor's off-nadir angle, and relief displacement of the considered scene. Therefore, this study addresses the problem by proposing an automated geo/co-registration framework for full-scene multi-temporal images acquired from a VHR optical satellite sensor. The proposed method comprises two primary steps: (1) a global georegistration process, followed by (2) a fine co-registration process. During the first step, two-dimensional multi-temporal satellite images are matched to three-dimensional topographic maps to assign the map coordinates. During the second step, a local analysis of registration noise pixels extracted between the multi-temporal images that have been mapped to the map coordinates is conducted to extract a large number of well-distributed corresponding points (CPs). The CPs are finally used to construct a non-rigid transformation function that enables minimization of the local misalignment existing among the images. Experiments conducted on five Kompsat-3 full scenes confirmed the effectiveness of the proposed framework, showing that the georegistration performance resulted in an approximately pixel-level accuracy for most of the scenes, and the co-registration performance further improved the results among all combinations of the georegistered Kompsat-3 image pairs by increasing the calculated cross-correlation values.
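    As a toy illustration of the transformation-fitting step, the sketch below fits a global affine transform to corresponding points by least squares. The paper itself constructs a non-rigid transformation from many well-distributed CPs, so the affine model here is only the simplest stand-in, and the function names are ours.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of dst_i ~ A[:, :2] @ src_i + A[:, 2].

    `src` and `dst` are (n, 2) arrays of corresponding points (CPs).
    Returns a 2x3 affine matrix A. Illustrative stand-in for the
    paper's non-rigid transformation.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    # Augment source points with a constant column for the translation term.
    X = np.hstack([src, np.ones((len(src), 1))])   # (n, 3)
    coef, *_ = np.linalg.lstsq(X, dst, rcond=None)  # (3, 2)
    return coef.T                                   # (2, 3)

def apply_affine(A, pts):
    """Map (n, 2) points through the fitted 2x3 affine matrix."""
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]
```

    With four or more non-collinear CPs the fit is exact for truly affine misalignments; residuals after the fit indicate the local misalignment that motivates the paper's non-rigid model.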

  16. Curvature-Based Wavefront Sensor for Use on Extended, Arbitrary, Low-Contrast Scenes Final Technical Report August 2004

    NASA Technical Reports Server (NTRS)

    LaBonte, Barry J.

    2004-01-01

    A small amount of work has been done on this project; the strategy to be adopted has been better defined, though no experimental work has been started. 1) Wavefront error signals: The best choice appears to be to use a lenslet array at a pupil image to produce defocused image pairs for each subaperture, then use the method proposed by Molodij et al. to produce subaperture curvature signals. Basically, this method samples a moderate number of locations in the image where the value of the image Laplacian is high, then takes the curvature signal from the difference of the Laplacians of the extrafocal images at those locations. The tip-tilt error is obtained from the temporal dependence of the first spatial derivatives of an in-focus image, at selected locations where these derivatives are significant. The wavefront tilt can be obtained from the full-aperture image. 2) Extrafocal image generation: The important aspect here is to generate symmetrically defocused images, with dynamically adjustable defocus. The adjustment is needed because larger defocus is required before the feedback loop is closed, and at times when the seeing is worse. It may be that the usual membrane mirror is the best choice, though other options should be explored. 3) Detector: Since the proposed sensor is to work on solar granulation, rather than a point source, an array detector for each subaperture is required. A fast CMOS camera such as that developed by the National Solar Observatory would be a satisfactory choice. 4) Processing: Processing requirements have not been defined in detail, though significantly fewer operations per cycle are required than for a correlation tracker.
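    The curvature-signal idea in step 1) can be sketched as below. This is our illustrative reading of the Molodij et al. approach, not the report's implementation; the function name, the sampling rule, and the normalization are all assumptions.

```python
import numpy as np

def _laplacian(img):
    """Discrete 5-point Laplacian; borders are left at zero."""
    img = np.asarray(img, float)
    lap = np.zeros_like(img)
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1]
                       + img[1:-1, :-2] + img[1:-1, 2:]
                       - 4.0 * img[1:-1, 1:-1])
    return lap

def curvature_signal(pre_focus, post_focus, n_points=50):
    """Hypothetical subaperture curvature signal: sample the n_points
    locations where the Laplacian response is strongest, then take the
    normalized difference of the two extrafocal Laplacians there."""
    lap_pre = _laplacian(pre_focus)
    lap_post = _laplacian(post_focus)
    strength = np.abs(lap_pre) + np.abs(lap_post)
    # Indices of the strongest-Laplacian pixels (flattened, then unraveled).
    idx = np.unravel_index(np.argsort(strength, axis=None)[-n_points:],
                           strength.shape)
    diff = lap_pre[idx] - lap_post[idx]
    norm = np.abs(lap_pre[idx]) + np.abs(lap_post[idx]) + 1e-12
    return float(np.mean(diff / norm))
```

    Symmetrically defocused images of an aberration-free scene give a zero signal; an imbalance between the two Laplacians signals local wavefront curvature.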

  17. High-speed imaging using CMOS image sensor with quasi pixel-wise exposure

    NASA Astrophysics Data System (ADS)

    Sonoda, T.; Nagahara, H.; Endo, K.; Sugiyama, Y.; Taniguchi, R.

    2017-02-01

    Several recent studies in compressive video sensing have realized scene capture beyond the fundamental trade-off limit between spatial resolution and temporal resolution using random space-time sampling. However, most of these studies showed results for higher frame rate video that were produced by simulation experiments or using an optically simulated random sampling camera, because there are currently no commercially available image sensors with random exposure or sampling capabilities. We fabricated a prototype complementary metal oxide semiconductor (CMOS) image sensor with quasi pixel-wise exposure timing that can realize nonuniform space-time sampling. The prototype sensor can reset exposures independently by column and fix the amount of exposure by row for each 8x8 pixel block. This CMOS sensor is not fully controllable at the pixel level and has line-dependent controls, but it offers flexibility when compared with regular CMOS or charge-coupled device sensors with global or rolling shutters. We propose a method to realize pseudo-random sampling for high-speed video acquisition that uses the flexibility of the CMOS sensor. We reconstruct the high-speed video sequence from the images produced by pseudo-random sampling using an over-complete dictionary.
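    A line-dependent sampling pattern of this kind can be mimicked in software. The sketch below is our speculative reading of the constraint (per-column exposure starts and per-row exposure lengths inside each 8x8 block), not the sensor's actual timing; `quasi_pixelwise_mask` is a name we made up.

```python
import numpy as np

def quasi_pixelwise_mask(h, w, frames, block=8, seed=0):
    """Boolean (frames, h, w) space-time sampling mask.

    Within each block x block tile, every column draws a random
    exposure start frame and every row a random exposure length, so a
    pixel's exposure window is set by its (row, column) lines rather
    than individually, mimicking line-dependent control.
    """
    rng = np.random.default_rng(seed)
    mask = np.zeros((frames, h, w), dtype=bool)
    for by in range(0, h, block):
        for bx in range(0, w, block):
            starts = rng.integers(0, frames, size=block)       # per column
            lengths = rng.integers(1, frames + 1, size=block)  # per row
            for r in range(block):
                for c in range(block):
                    s = int(starts[c])
                    e = min(frames, s + int(lengths[r]))
                    mask[s:e, by + r, bx + c] = True
    return mask
```

    Every pixel is exposed in at least one frame, while the exposure windows still vary across space and time, which is the property compressive reconstruction relies on.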

  18. Initial test of MITA/DIMM with an operational CBP system

    NASA Astrophysics Data System (ADS)

    Baldwin, Kevin; Hanna, Randall; Brown, Andrea; Brown, David; Moyer, Steven; Hixson, Jonathan G.

    2018-05-01

    The MITA (Motion Imagery Task Analyzer) project was conceived by CBP OA (Customs and Border Protection - Office of Acquisition) and executed by JHU/APL (Johns Hopkins University/Applied Physics Laboratory) and CERDEC NVESD MSD (Communications and Electronics Research Development Engineering Command Night Vision and Electronic Sensors Directorate Modeling and Simulation Division). The intent was to develop an efficient methodology whereby imaging system performance could be quickly and objectively characterized in a field setting. The initial design, development, and testing spanned a period of approximately 18 months with the initial project coming to a conclusion after testing of the MITA system in June 2017 with a fielded CBP system. The NVESD contribution to MITA was thermally heated target resolution boards deployed to support a range close to the sensor and, when possible, at range with the targets of interest. JHU/APL developed a laser DIMM (Differential Image Motion Monitor) system designed to measure the optical turbulence present along the line of sight of the imaging system during the time of image collection. The imagery collected of the target board was processed to calculate the in situ system resolution. This in situ imaging system resolution and the time-correlated turbulence measured by the DIMM system were used in NV-IPM (Night Vision Integrated Performance Model) to calculate the theoretical imaging system performance. Overall, this proves the MITA concept feasible. However, MITA is still in the initial phases of development and requires further verification and validation to ensure accuracy and reliability of both the instrument and the imaging system performance predictions.

  19. Application of 3-D imaging sensor for tracking minipigs in the open field test.

    PubMed

    Kulikov, Victor A; Khotskin, Nikita V; Nikitin, Sergey V; Lankin, Vasily S; Kulikov, Alexander V; Trapezov, Oleg V

    2014-09-30

    The minipig is a promising model in neurobiology and psychopharmacology. However, automated tracking of minipig behavior is still an unresolved problem. The study was carried out on white, agouti and black (or spotted) minipiglets (n=108) bred in the Institute of Cytology and Genetics. A new method of automated tracking of minipig behavior is based on the Microsoft Kinect 3-D image sensor and 3-D image reconstruction with EthoStudio software. The algorithms for evaluating distance run and time in the center were adapted for 3-D image data, and a new algorithm for quantifying vertical activity was developed. The 3-D imaging system successfully detects white, black, spotted and agouti pigs in the open field test (OFT). No effect of sex or color on horizontal activity (distance run), vertical activity or time in the center was shown. Agouti pigs explored the arena more intensively than white or black animals. The OFT behavioral traits were compared with the fear reaction to the experimenter. Time in the center of the OFT was positively correlated with fear reaction rank (ρ=0.21, p<0.05). Black pigs were significantly more fearful compared with white or agouti animals. The 3-D imaging system has three advantages over existing automated tracking systems: it avoids perspective distortion, distinguishes animals of any color from any background and automatically evaluates vertical activity. The 3-D imaging system can be successfully applied for automated measurement of minipig behavior in neurobiological and psychopharmacological experiments. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders - from Optical Triangulation to the Automotive Field.

    PubMed

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-03-13

    With their significant features, the applications of complementary metal-oxide-semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments were conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field.
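    For an active range finder, the simple triangulation method reduces to the classic relation D = f * B / x, where B is the baseline between the light source and the camera, f the focal length expressed in pixels, and x the measured offset of the light spot on the sensor. A minimal sketch, with parameter names of our own choosing (the paper's calibration details are not reproduced here):

```python
def triangulation_distance(baseline_m, focal_px, spot_offset_px):
    """Distance to the illuminated point by similar triangles:
    D = f * B / x. baseline_m in metres, focal_px and spot_offset_px
    in pixels; returns metres."""
    if spot_offset_px <= 0:
        raise ValueError("spot must be offset from the optical axis")
    return focal_px * baseline_m / spot_offset_px
```

    The relation also shows why resolution degrades with range: at large D the spot offset x is small, so a one-pixel centroiding error corresponds to a larger distance error, motivating the paper's exposure-time tuning to sharpen the spot.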

  1. Organic-on-silicon complementary metal-oxide-semiconductor colour image sensors.

    PubMed

    Lim, Seon-Jeong; Leem, Dong-Seok; Park, Kyung-Bae; Kim, Kyu-Sik; Sul, Sangchul; Na, Kyoungwon; Lee, Gae Hwang; Heo, Chul-Joon; Lee, Kwang-Hee; Bulliard, Xavier; Satoh, Ryu-Ichi; Yagi, Tadao; Ro, Takkyun; Im, Dongmo; Jung, Jungkyu; Lee, Myungwon; Lee, Tae-Yon; Han, Moon Gyu; Jin, Yong Wan; Lee, Sangyoon

    2015-01-12

    Complementary metal-oxide-semiconductor (CMOS) colour image sensors are representative examples of light-detection devices. To achieve extremely high resolutions, the pixel sizes of the CMOS image sensors must be reduced to less than a micron, which in turn significantly limits the number of photons that can be captured by each pixel using silicon (Si)-based technology (i.e., this reduction in pixel size results in a loss of sensitivity). Here, we demonstrate a novel and efficient method of increasing the sensitivity and resolution of the CMOS image sensors by superposing an organic photodiode (OPD) onto a CMOS circuit with Si photodiodes, which consequently doubles the light-input surface area of each pixel. To realise this concept, we developed organic semiconductor materials with absorption properties selective to green light and successfully fabricated highly efficient green-light-sensitive OPDs without colour filters. We found that such a top light-receiving OPD, which is selective to specific green wavelengths, demonstrates great potential when combined with a newly designed Si-based CMOS circuit containing only blue and red colour filters. To demonstrate the effectiveness of this state-of-the-art hybrid colour image sensor, we acquired a real full-colour image using a camera that contained the organic-on-Si hybrid CMOS colour image sensor.

  2. Organic-on-silicon complementary metal–oxide–semiconductor colour image sensors

    PubMed Central

    Lim, Seon-Jeong; Leem, Dong-Seok; Park, Kyung-Bae; Kim, Kyu-Sik; Sul, Sangchul; Na, Kyoungwon; Lee, Gae Hwang; Heo, Chul-Joon; Lee, Kwang-Hee; Bulliard, Xavier; Satoh, Ryu-Ichi; Yagi, Tadao; Ro, Takkyun; Im, Dongmo; Jung, Jungkyu; Lee, Myungwon; Lee, Tae-Yon; Han, Moon Gyu; Jin, Yong Wan; Lee, Sangyoon

    2015-01-01

    Complementary metal–oxide–semiconductor (CMOS) colour image sensors are representative examples of light-detection devices. To achieve extremely high resolutions, the pixel sizes of the CMOS image sensors must be reduced to less than a micron, which in turn significantly limits the number of photons that can be captured by each pixel using silicon (Si)-based technology (i.e., this reduction in pixel size results in a loss of sensitivity). Here, we demonstrate a novel and efficient method of increasing the sensitivity and resolution of the CMOS image sensors by superposing an organic photodiode (OPD) onto a CMOS circuit with Si photodiodes, which consequently doubles the light-input surface area of each pixel. To realise this concept, we developed organic semiconductor materials with absorption properties selective to green light and successfully fabricated highly efficient green-light-sensitive OPDs without colour filters. We found that such a top light-receiving OPD, which is selective to specific green wavelengths, demonstrates great potential when combined with a newly designed Si-based CMOS circuit containing only blue and red colour filters. To demonstrate the effectiveness of this state-of-the-art hybrid colour image sensor, we acquired a real full-colour image using a camera that contained the organic-on-Si hybrid CMOS colour image sensor. PMID:25578322

  3. CMOS foveal image sensor chip

    NASA Technical Reports Server (NTRS)

    Scott, Peter (Inventor); Sridhar, Ramalingam (Inventor); Bandera, Cesar (Inventor); Xia, Shu (Inventor)

    2002-01-01

    A foveal image sensor integrated circuit comprising a plurality of CMOS active pixel sensors arranged both within and about a central fovea region of the chip. The pixels in the central fovea region have a smaller size than the pixels arranged in peripheral rings about the central region. A new photocharge normalization scheme and associated circuitry normalizes the output signals from the different size pixels in the array. The pixels are assembled into a multi-resolution rectilinear foveal image sensor chip using a novel access scheme to reduce the number of analog RAM cells needed. Localized spatial resolution declines monotonically with offset from the imager's optical axis, analogous to biological foveal vision.

  4. Thermal luminescence spectroscopy chemical imaging sensor.

    PubMed

    Carrieri, Arthur H; Buican, Tudor N; Roese, Erik S; Sutter, James; Samuels, Alan C

    2012-10-01

    The authors present a pseudo-active chemical imaging sensor model embodying irradiative transient heating, temperature nonequilibrium thermal luminescence spectroscopy, differential hyperspectral imaging, and artificial neural network technologies integrated together. We elaborate on various optimizations, simulations, and animations of the integrated sensor design and apply it to the terrestrial chemical contamination problem, where the interstitial contaminant compounds of detection interest (analytes) comprise liquid chemical warfare agents, their various derivative condensed phase compounds, and other material of a life-threatening nature. The sensor must measure and process a dynamic pattern of absorptive-emissive middle infrared molecular signature spectra of subject analytes to perform its chemical imaging and standoff detection functions successfully.

  5. Space-based infrared sensors of space target imaging effect analysis

    NASA Astrophysics Data System (ADS)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    The target identification problem is one of the core problems of ballistic missile defense systems, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a space-based infrared sensor imaging model of a ballistic target treated as a point source above the atmosphere; it then simulates the infrared imaging of the atmospheric ballistic target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the imaging effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target.

  6. Development of a 750x750 pixels CMOS imager sensor for tracking applications

    NASA Astrophysics Data System (ADS)

    Larnaudie, Franck; Guardiola, Nicolas; Saint-Pé, Olivier; Vignon, Bruno; Tulet, Michel; Davancens, Robert; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Estribeau, Magali

    2017-11-01

    Solid-state optical sensors are now commonly used in space applications (navigation cameras, astronomy imagers, tracking sensors...). Although charge-coupled devices are still widely used, the CMOS image sensor (CIS), whose performance is continuously improving, is a strong challenger for Guidance, Navigation and Control (GNC) systems. This paper describes a 750x750 pixels CMOS image sensor that has been specially designed and developed for star tracker and tracking sensor applications. The detector, which features a smart architecture enabling very simple and powerful operation, is built using the AMIS 0.5μm CMOS technology. It contains 750x750 rectangular pixels with a 20μm pitch. The geometry of the pixel sensitive zone is optimized for applications based on centroiding measurements. The main feature of this device is the on-chip control and timing function that makes device operation easier by drastically reducing the number of clocks to be applied. This powerful function allows the user to operate the sensor with high flexibility: measurement of dark level from masked lines, direct access to the windows of interest… A temperature probe is also integrated within the CMOS chip, allowing a very precise measurement through the video stream. A complete electro-optical characterization of the sensor has been performed, evaluating the major parameters: dark current and its uniformity, read-out noise, conversion gain, Fixed Pattern Noise, Photo Response Non-Uniformity, quantum efficiency, Modulation Transfer Function, and intra-pixel scanning. The characterization tests are detailed in the paper. Co60 and proton irradiation tests have also been carried out on the image sensor and the results are presented. 
The specific features of the 750x750 image sensor such as low power CMOS design (3.3V, power consumption<100mW), natural windowing (that allows efficient and robust tracking algorithms), simple proximity electronics (because of the on-chip control and timing function) enabling a high flexibility architecture, make this imager a good candidate for high performance tracking applications.

  7. Simulating the Effects of an Extended Source on the Shack-Hartmann Wavefront Sensor Through Turbulence

    DTIC Science & Technology

    2011-03-01

    wavefront distortions in real time. Often, it is used to correct for optical fluctuations due to atmospheric turbulence and improve imaging system...propagation paths, the overall turbulence is relatively weak, with a Rytov number of only 0.045. The atmospheric parameters were then used to program a three...on an adaptive optics (AO) system, it enables further research on the effects of deep turbulence on AO systems and correlation based wavefront sensing

  8. Comparison of the performance of intraoral X-ray sensors using objective image quality assessment.

    PubMed

    Hellén-Halme, Kristina; Johansson, Curt; Nilsson, Mats

    2016-05-01

    The main aim of this study was to evaluate the performance of 10 individual sensors of the same make, using objective measures of key image quality parameters. A further aim was to compare 8 brands of sensors. Ten new sensors of 8 different models from 6 manufacturers (i.e., 80 sensors) were included in the study. All sensors were exposed in a standardized way using an X-ray tube voltage of 60 kVp and different exposure times. Sensor response, noise, low-contrast resolution, spatial resolution and uniformity were measured. Individual differences between sensors of the same brand were surprisingly large in some cases. There were clear differences in the characteristics of the different brands of sensors. The largest variations were found for individual sensor response for some of the brands studied. Also, noise level and low contrast resolution showed large variations between brands. Sensors, even of the same brand, vary significantly in their quality. It is thus valuable to establish action levels for the acceptance of newly delivered sensors and to use objective image quality control for commissioning purposes and periodic checks to ensure high performance of individual digital sensors. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Research of diagnosis sensors fault based on correlation analysis of the bridge structural health monitoring system

    NASA Astrophysics Data System (ADS)

    Hu, Shunren; Chen, Weimin; Liu, Lin; Gao, Xiaoxia

    2010-03-01

    A bridge structural health monitoring system is a typical multi-sensor measurement system, since multiple parameters of the bridge structure are collected from monitoring sites on river-spanning bridges. The bridge structure monitored by the multiple sensors is a single entity: when it is subjected to external action, different structural parameters respond differently, so the data acquired by the sensors should exhibit numerous correlation relations, whose complexity is determined by the complexity of the bridge structure. Traditionally, correlation among monitoring sites has been analyzed mainly from their physical locations; unfortunately, this method is too simple to describe the correlation in detail. This paper analyzes the correlation among bridge monitoring sites on the basis of bridge structural data, defines the correlation of bridge monitoring sites and describes its several forms, and then integrates correlation theory from data mining and signal systems to establish a correlation model that describes the correlation among the bridge monitoring sites quantitatively. Finally, the Chongqing Mashangxi Yangtze River bridge health monitoring system is taken as the research object for sensor fault diagnosis, and simulation results verify the effectiveness of the designed method and the theoretical discussion.
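    The underlying intuition can be sketched very simply: sensors on one structure should remain mutually correlated, so a sensor whose correlation with the others collapses is suspect. The code below is a simplified stand-in for the paper's correlation model; the function name, the use of Pearson correlation, and the threshold are all our assumptions.

```python
import numpy as np

def flag_suspect_sensors(readings, threshold=0.3):
    """Flag sensors whose mean |Pearson r| with all other sensors
    falls below `threshold`.

    `readings` is an (n_sensors, n_samples) array, one time series
    per row. Returns the indices of suspect sensors.
    """
    r = np.corrcoef(readings)        # pairwise correlation matrix
    np.fill_diagonal(r, np.nan)      # ignore self-correlation
    mean_abs = np.nanmean(np.abs(r), axis=1)
    return [i for i, m in enumerate(mean_abs) if m < threshold]
```

    A healthy structural response (e.g. two strain gauges tracking the same load) keeps |r| high, while a drifting or noisy channel decorrelates and is flagged.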

  10. Cross-calibration of the Terra MODIS, Landsat 7 ETM+ and EO-1 ALI sensors using near-simultaneous surface observation over the Railroad Valley Playa, Nevada, test site

    USGS Publications Warehouse

    Chander, G.; Angal, A.; Choi, T.; Meyer, D.J.; Xiong, X.; Teillet, P.M.

    2007-01-01

    A cross-calibration methodology has been developed using coincident image pairs from the Terra Moderate Resolution Imaging Spectroradiometer (MODIS), the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) and the Earth Observing EO-1 Advanced Land Imager (ALI) to verify the absolute radiometric calibration accuracy of these sensors with respect to each other. To quantify the effects due to different spectral responses, the Relative Spectral Responses (RSR) of these sensors were studied and compared by developing a set of "figures-of-merit." Seven cloud-free scenes collected over the Railroad Valley Playa, Nevada (RVPN), test site were used to conduct the cross-calibration study. This cross-calibration approach was based on image statistics from near-simultaneous observations made by different satellite sensors. Homogeneous regions of interest (ROI) were selected in the image pairs, and the mean target statistics were converted to absolute units of at-sensor reflectance. Using these reflectances, a set of cross-calibration equations were developed giving a relative gain and bias between the sensor pair.
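    The final step, relating paired at-sensor reflectances from two sensors by a relative gain and bias, amounts to a linear least-squares fit over the ROI statistics. A minimal sketch (the function name and the use of `np.polyfit` are ours, not the paper's):

```python
import numpy as np

def cross_calibration(ref_reflectance, test_reflectance):
    """Fit test ~ gain * ref + bias by least squares.

    Both inputs are 1-D arrays of mean ROI at-sensor reflectances
    from near-simultaneous observations. Returns (gain, bias), the
    relative calibration of the test sensor against the reference.
    """
    gain, bias = np.polyfit(ref_reflectance, test_reflectance, 1)
    return float(gain), float(bias)
```

    A gain near 1 and a bias near 0 indicate the two sensors agree radiometrically over the test site; deviations quantify the relative calibration difference after spectral-response effects have been accounted for.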

  11. Covariance descriptor fusion for target detection

    NASA Astrophysics Data System (ADS)

    Cukur, Huseyin; Binol, Hamidullah; Bal, Abdullah; Yavuz, Fatih

    2016-05-01

    Target detection is one of the most important topics for military and civilian applications. To address such detection tasks, hyperspectral imaging sensors provide useful image data containing both spatial and spectral information. Target detection in hyperspectral images involves various challenging scenarios, and the covariance descriptor offers many advantages in overcoming these challenges. In this paper, hyperspectral bands are clustered according to inter-band correlation. Target detection is then realized by fusing the covariance descriptor results computed over the band clusters. The proposed combination technique is denoted Covariance Descriptor Fusion (CDF). The efficiency of the CDF is evaluated by applying it to hyperspectral imagery to detect man-made objects. The obtained results show that the CDF presents better performance than the conventional covariance descriptor.

  12. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface.

    PubMed

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-06-01

    Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long sought-after goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging by using a spatio-temporal dispersive metasurface and an imaging reconstruction algorithm. The metasurface with spatio-temporal dispersive property ensures the feasibility of the single-shot and single-sensor imager for super- and high-resolution imaging, since it can convert efficiently the detailed spatial information of the probed object into one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the desired imaging resolution is related to the distance between the object and metasurface. When the object is placed in the vicinity of the metasurface, the super-resolution imaging can be realized. The proposed imaging methodology provides a unique means to perform real-time data acquisition, high-/super-resolution images without employing expensive hardware (e.g. mechanical scanner, antenna array, etc.). We expect that this methodology could make potential breakthroughs in the areas of microwave, terahertz, optical, and even ultrasound imaging.

  13. ROI-Orientated Sensor Correction Based on Virtual Steady Reimaging Model for Wide Swath High Resolution Optical Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Zhu, Y.; Jin, S.; Tian, Y.; Wang, M.

    2017-09-01

    To meet the requirement of high-accuracy and high-speed processing of wide-swath high-resolution optical satellite imagery under emergency situations, in both ground and on-board processing systems, this paper proposes a ROI-orientated sensor correction algorithm based on a virtual steady reimaging model. Firstly, the imaging time and spatial window of the ROI are determined by a dynamic search method. Then, the dynamic ROI sensor correction model based on the virtual steady reimaging model is constructed. Finally, the corrected image corresponding to the ROI is generated from the coordinate mapping relationship established by the dynamic sensor correction model for the corrected image and the rigorous imaging model for the original image. Two experimental results show that image registration between panchromatic and multispectral images can be well achieved and that image distortion caused by satellite jitter can also be corrected efficiently.

  14. Binary CMOS image sensor with a gate/body-tied MOSFET-type photodetector for high-speed operation

    NASA Astrophysics Data System (ADS)

    Choi, Byoung-Soo; Jo, Sung-Hyun; Bae, Myunghan; Kim, Sang-Hwan; Shin, Jang-Kyoo

    2016-05-01

    In this paper, a binary complementary metal oxide semiconductor (CMOS) image sensor with a gate/body-tied (GBT) metal oxide semiconductor field effect transistor (MOSFET)-type photodetector is presented. The sensitivity of the GBT MOSFET-type photodetector, which was fabricated using the standard CMOS 0.35-μm process, is higher than that of the p-n junction photodiode, because the output signal of the photodetector is amplified by the MOSFET. A binary image sensor becomes more efficient when using this photodetector: lower power consumption and higher operating speed are possible compared with conventional image sensors using multi-bit analog-to-digital converters (ADCs). The frame rate of the proposed image sensor is over 2000 frames per second, which is higher than those of conventional CMOS image sensors. The output signal of an active pixel sensor is applied to a comparator and compared with a reference level; this level determines the 1-bit output data of the binary process. To obtain a video signal, the 1-bit output data is stored in memory and read out by horizontal scanning. The proposed chip is composed of a GBT pixel array (144 × 100), a binary-process circuit, a vertical scanner, a horizontal scanner, and a readout circuit. The operation mode can be selected between binary mode and multi-bit mode.

  15. Maintained functionality of an implantable radiotelemetric blood pressure and heart rate sensor after magnetic resonance imaging in rats.

    PubMed

    Nölte, I; Gorbey, S; Boll, H; Figueiredo, G; Groden, C; Lemmer, B; Brockmann, M A

    2011-12-01

    Radiotelemetric sensors for in vivo assessment of blood pressure and heart rate are widely used in animal research. MRI with implanted sensors is regarded as contraindicated, since transmitter malfunction and injury to the animal may be caused; moreover, artefacts are expected to compromise image evaluation. In vitro, the function of a radiotelemetric sensor (TA11PA-C10, Data Sciences International) after exposure to MRI up to 9.4 T was assessed. The magnetic force of the electromagnetic field on the sensor as well as radiofrequency (RF)-induced sensor heating was analysed. Finally, MRI with an implanted sensor was performed in a rat. Imaging artefacts were analysed at 3.0 and 9.4 T ex vivo and in vivo. Transmitted 24 h blood pressure and heart rate were compared before and after MRI to verify the integrity of the telemetric sensor. The function of the sensor was not altered by MRI up to 9.4 T. The maximum force exerted on the sensor was 273 ± 50 mN. RF-induced heating was ruled out. Artefacts impeded the assessment of the abdomen and thorax in a dead rat, but not of the head and neck. MRI with implanted radiotelemetric sensors is feasible in principle. The tested sensor maintains functionality up to 9.4 T. Artefacts hampered abdominal and thoracic imaging in rats, while assessment of the head and neck is possible.

  16. Low-cost compact thermal imaging sensors for body temperature measurement

    NASA Astrophysics Data System (ADS)

    Han, Myung-Soo; Han, Seok Man; Kim, Hyo Jin; Shin, Jae Chul; Ahn, Mi Sook; Kim, Hyung Won; Han, Yong Hee

    2013-06-01

    This paper presents a 32x32 microbolometer thermal imaging sensor for human body temperature measurement. Wafer-level vacuum packaging technology yields a low-cost, compact imaging sensor chip. The microbolometer uses a V-W-O film as the sensing material, and the ROIC was designed in a 0.35-um CMOS process at UMC. Thermal images of a human face and a hand acquired with an f/1 lens demonstrate the sensor's potential for commercial body temperature measurement.

  17. Blind identification of full-field vibration modes from video measurements with phase-based video motion magnification

    NASA Astrophysics Data System (ADS)

    Yang, Yongchao; Dorn, Charles; Mancini, Tyler; Talken, Zachary; Kenyon, Garrett; Farrar, Charles; Mascareñas, David

    2017-02-01

    Experimental or operational modal analysis traditionally requires physically-attached wired or wireless sensors for vibration measurement of structures. This instrumentation can result in mass-loading on lightweight structures, and is costly and time-consuming to install and maintain on large civil structures, especially for long-term applications (e.g., structural health monitoring) that require significant maintenance for cabling (wired sensors) or periodic replacement of the energy supply (wireless sensors). Moreover, these sensors are typically placed at a limited number of discrete locations, providing low spatial sensing resolution that is hardly sufficient for modal-based damage localization, or model correlation and updating for larger-scale structures. Non-contact measurement methods such as scanning laser vibrometers provide high-resolution sensing capacity without the mass-loading effect; however, they make sequential measurements that require considerable acquisition time. As an alternative non-contact method, digital video cameras are relatively low-cost, agile, and provide high-spatial-resolution, simultaneous measurements. Combined with vision-based algorithms (e.g., image correlation, optical flow), video camera based measurements have been successfully used for vibration measurement and subsequent modal analysis, based on techniques such as digital image correlation (DIC) and point tracking. However, they typically require a speckle pattern or high-contrast markers to be placed on the surface of structures, which poses challenges when the measurement area is large or inaccessible.
This work explores advanced computer vision and video processing algorithms to develop a novel video measurement and vision-based operational (output-only) modal analysis method that alleviates the need for the structural surface preparation associated with existing vision-based methods and can be implemented in a relatively efficient and autonomous manner with little user supervision and calibration. First, a multi-scale image processing method is applied to the frames of the video of a vibrating structure to extract the local pixel phases that encode local structural vibration, establishing a full-field spatiotemporal motion matrix. Then a high-spatial-dimensional, yet low-modal-dimensional, over-complete model is used to represent the extracted full-field motion matrix by modal superposition, which is then physically interpreted and separated by a family of unsupervised learning models and techniques. Thus, the proposed method is able to blindly extract modal frequencies, damping ratios, and full-field (as many points as there are pixels in the video frame) mode shapes from line-of-sight video measurements of the structure. The method is validated by laboratory experiments on a bench-scale building structure and a cantilever beam. Its ability for output-only (video measurements) identification and visualization of a weakly-excited mode is demonstrated, and several issues with its implementation are discussed.
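
    The decomposition step can be caricatured with a synthetic example: build a full-field motion matrix as a superposition of two modes, then recover shapes and modal coordinates blindly. This sketch substitutes a plain SVD for the paper's family of unsupervised-learning models, and the mode shapes and frequencies are invented for illustration:

```python
import numpy as np

# Synthetic full-field motion matrix: rows are "pixels", columns are frames.
# Two modes in superposition (shapes and frequencies are illustrative).
t = np.linspace(0, 1, 200)
shape1 = np.sin(np.linspace(0, np.pi, 50))          # first mode shape
shape2 = np.sin(np.linspace(0, 2 * np.pi, 50))      # second mode shape
motion = np.outer(shape1, np.sin(2 * np.pi * 3 * t)) \
       + 0.3 * np.outer(shape2, np.sin(2 * np.pi * 11 * t))

# Blind separation: motion ≈ (mode shapes) @ (modal coordinates).
# A plain SVD suffices here because the synthetic modes are orthogonal;
# the paper's unsupervised-learning models handle the general case.
U, s, Vt = np.linalg.svd(motion, full_matrices=False)
mode_shapes = U[:, :2]     # full-field shapes, one value per pixel
modal_coords = Vt[:2, :]   # time histories; FFT peaks give the frequencies
```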

  18. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.12

    DTIC Science & Technology

    2015-09-03

    NPP) with the VIIRS sensor package as well as data from the Geostationary Ocean Color Imager (GOCI) sensor, aboard the Communication Ocean and...capability • Prepare the NRT Geostationary Ocean Color Imager (GOCI) data stream for integration into operations. • Improvements in sensor...Navy (DON) Environmental Data Records (EDRs) Expeditionary Warfare (EXW) Geostationary Ocean Color Imager (GOCI) Gulf of Mexico (GOM) Hierarchical

  19. Online prediction of organoleptic data for snack food using color images

    NASA Astrophysics Data System (ADS)

    Yu, Honglu; MacGregor, John F.

    2004-11-01

    In this paper, a study for the prediction of organoleptic properties of snack food in real time using RGB color images is presented. The so-called organoleptic properties, which are properties based on texture, taste and sight, are generally measured either by human sensory response or by mechanical devices. Neither of these two methods can be used for on-line feedback control in high-speed production. In this situation, a vision-based soft sensor is very attractive: by taking images of the products, the samples remain untouched and the product properties can be predicted in real time from image data. Four types of organoleptic properties are considered in this study: blister level, toast points, taste and peak break force. Wavelet transforms are applied to the color images, and the averaged absolute value of each filtered image is used as a texture feature variable. In order to handle the high correlation among the feature variables, Partial Least Squares (PLS) regression is used to regress the extracted feature variables against the four response variables.
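
    The texture-feature step can be sketched with a single-level Haar detail filter standing in for the full wavelet transform used in the paper; the test image below is synthetic. The resulting feature vector would then be regressed against the four response variables with PLS:

```python
import numpy as np

def haar_detail_energy(img):
    # Mean absolute value of horizontal and vertical first-level Haar
    # detail coefficients: one texture feature per filtered image
    # (assumes even image dimensions).
    img = img.astype(float)
    h = np.abs(img[:, 1::2] - img[:, ::2]).mean()   # horizontal detail
    v = np.abs(img[1::2, :] - img[::2, :]).mean()   # vertical detail
    return h, v

# A vertically striped test image: strong horizontal detail, no vertical.
img = np.tile([0, 1], (4, 4))                        # shape (4, 8)
h, v = haar_detail_energy(img)                       # → (1.0, 0.0)
```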

  20. Neural correlate of Internet use in patients undergoing psychological treatment for Internet addiction.

    PubMed

    Lai, Carlo; Altavilla, Daniela; Mazza, Marianna; Scappaticci, Silvia; Tambelli, Renata; Aceto, Paola; Luciani, Massimiliano; Corvino, Stefano; Martinelli, David; Alimonti, Flaminia; Tonioni, Federico

    2017-06-01

    The new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) proposed Internet Gaming Disorder for the diagnosis of Internet addiction (IA), considering the neurobiological evidence of craving. The aim was to test the neural correlates of the response to Internet cues in patients with IA. Sixteen males with an IA diagnosis (clinical group) and 14 healthy males (control group) were recruited for an experimental visual task composed of Internet images and emotional images. During the visual presentation of Internet cues, electroencephalographic data were recorded using Net Station 4.5.1 with a 256-channel HydroCel Geodesic Sensor Net. Event-related potential (ERP) components and standardized low-resolution electromagnetic tomography (sLORETA) were analysed. sLORETA analyses showed that patients from the clinical group presented higher primary somatosensory cortex activation and lower paralimbic, temporal and orbito-frontal activation in response to both Internet and emotional images compared to those of the control group. These results suggest that clinically recognized pathological use of the Internet could be linked to dissociative symptoms.

  1. Uncooled microbolometer sensors for unattended applications

    NASA Astrophysics Data System (ADS)

    Kohin, Margaret; Miller, James E.; Leary, Arthur R.; Backer, Brian S.; Swift, William; Aston, Peter

    2003-09-01

    BAE SYSTEMS has been developing and producing uncooled microbolometer sensors since 1995. Recently, uncooled sensors have been used on Pointer Unattended Aerial Vehicles and considered for several unattended sensor applications including DARPA Micro-Internetted Unattended Ground Sensors (MIUGS), Army Modular Acoustic Imaging Sensors (MAIS), and Redeployable Unattended Ground Sensors (R-UGS). This paper describes recent breakthrough uncooled sensor performance at BAE SYSTEMS and how this improved performance has been applied to a new Standard Camera Core (SCC) that is ideal for these unattended applications. Video imagery from a BAE SYSTEMS 640x480 imaging camera flown in a Pointer UAV is provided. Recent performance results are also provided.

  2. Measuring MEG closer to the brain: Performance of on-scalp sensor arrays

    PubMed Central

    Iivanainen, Joonas; Stenroos, Matti; Parkkonen, Lauri

    2017-01-01

    Optically-pumped magnetometers (OPMs) have recently reached sensitivity levels required for magnetoencephalography (MEG). OPMs do not need cryogenics and can thus be placed within millimetres from the scalp into an array that adapts to the individual head size and shape, thereby reducing the distance from cortical sources to the sensors. Here, we quantified the improvement in recording MEG with hypothetical on-scalp OPM arrays compared to a 306-channel state-of-the-art SQUID array (102 magnetometers and 204 planar gradiometers). We simulated OPM arrays that measured either normal (nOPM; 102 sensors), tangential (tOPM; 204 sensors), or all components (aOPM; 306 sensors) of the magnetic field. We built forward models based on magnetic resonance images of 10 adult heads; we employed a three-compartment boundary element model and distributed current dipoles evenly across the cortical mantle. Compared to the SQUID magnetometers, nOPM and tOPM yielded 7.5 and 5.3 times higher signal power, while the correlations between the field patterns of source dipoles were reduced by factors of 2.8 and 3.6, respectively. Values of the field-pattern correlations were similar across nOPM, tOPM and SQUID gradiometers. Volume currents reduced the signals of primary currents on average by 10%, 72% and 15% in nOPM, tOPM and SQUID magnetometers, respectively. The information capacities of the OPM arrays were clearly higher than that of the SQUID array. The dipole-localization accuracies of the arrays were similar while the minimum-norm-based point-spread functions were on average 2.4 and 2.5 times more spread for the SQUID array compared to nOPM and tOPM arrays, respectively. PMID:28007515

  3. Estimating pixel variances in the scenes of staring sensors

    DOEpatents

    Simonson, Katherine M [Cedar Crest, NM; Ma, Tian J [Albuquerque, NM

    2012-01-24

    A technique for detecting changes in a scene perceived by a staring sensor is disclosed. The technique includes acquiring a reference image frame and a current image frame of a scene with the staring sensor. A raw difference frame is generated based upon differences between the reference image frame and the current image frame. Pixel error estimates are generated for each pixel in the raw difference frame based at least in part upon spatial error estimates related to spatial intensity gradients in the scene. The pixel error estimates are used to mitigate effects of camera jitter in the scene between the current image frame and the reference image frame.
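
    A hedged sketch of the idea (the patent's actual estimator is more elaborate; the noise level and gradient-based error term below are illustrative):

```python
import numpy as np

def change_score(reference, current, sigma_noise=2.0):
    # Raw difference frame, normalized by per-pixel error estimates.
    # The spatial term grows with local intensity gradients, so pixels on
    # scene edges (where camera jitter produces large spurious differences)
    # are de-weighted relative to flat regions.
    ref = reference.astype(float)
    diff = current.astype(float) - ref
    gy, gx = np.gradient(ref)
    spatial_err = np.hypot(gx, gy)
    sigma = np.sqrt(sigma_noise**2 + spatial_err**2)
    return np.abs(diff) / sigma      # large values flag genuine change

ref = np.zeros((5, 5))
cur = ref.copy()
cur[2, 2] = 10.0                     # a real change in an otherwise flat scene
score = change_score(ref, cur)       # peaks at the changed pixel
```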

  4. Cloud-to-Ground Lightning Estimates Derived from SSMI Microwave Remote Sensing and NLDN

    NASA Technical Reports Server (NTRS)

    Winesett, Thomas; Magi, Brian; Cecil, Daniel

    2015-01-01

    Lightning observations are collected using ground-based and satellite-based sensors. The National Lightning Detection Network (NLDN) in the United States uses multiple ground sensors to triangulate the electromagnetic signals created when lightning strikes the Earth's surface. Satellite-based lightning observations have been made from 1998 to present using the Lightning Imaging Sensor (LIS) on the NASA Tropical Rainfall Measuring Mission (TRMM) satellite, and from 1995 to 2000 using the Optical Transient Detector (OTD) on the Microlab-1 satellite. Both LIS and OTD are staring imagers that detect lightning as momentary changes in an optical scene. Passive microwave remote sensing (85 and 37 GHz brightness temperatures) from the TRMM Microwave Imager (TMI) has also been used to quantify characteristics of thunderstorms related to lightning. Each lightning detection system has fundamental limitations. TRMM satellite coverage is limited to the tropics and subtropics between 38 deg N and 38 deg S, so lightning at the higher latitudes of the northern and southern hemispheres is not observed. The detection efficiency of NLDN sensors exceeds 95%, but the sensors are only located in the USA. Even if data from other ground-based lightning sensors (World Wide Lightning Location Network, the European Cooperation for Lightning Detection, and Canadian Lightning Detection Network) were combined with TRMM and NLDN, there would be enormous spatial gaps in present-day coverage of lightning. In addition, a globally-complete time history of observed lightning activity is currently not available either, with network coverage and detection efficiencies varying through the years. Previous research using the TRMM LIS and Microwave Imager (TMI) showed that there is a statistically significant correlation between lightning flash rates and passive microwave brightness temperatures. 
The physical basis for this correlation emerges because lightning in a thunderstorm occurs where ice is first present in the cloud and electric charge separation occurs. These ice particles efficiently scatter the microwave radiation at the 85 and 37 GHz frequencies, thus leading to large brightness temperature depressions. Lightning flash rate is related to the total amount of ice passing through the convective updraft regions of thunderstorms. Confirmation of this relationship using TRMM LIS and TMI data, however, remains constrained to TRMM observational limits of the tropics and subtropics. Satellites from the Defense Meteorological Satellite Program (DMSP) have global coverage and are equipped with passive microwave imagers that, like TMI, observe brightness temperatures at 85 and 37 GHz. Unlike the TRMM satellite, however, DMSP satellites do not have a lightning sensor, and the DMSP microwave data has never been used to derive global lightning. In this presentation, a relationship between DMSP Special Sensor Microwave Imager (SSMI) data and ground-based cloud-to-ground (CG) lightning data from NLDN is investigated to derive a spatially complete time history of CG lightning for the USA study area. This relationship is analogous to that established using TRMM LIS and TMI data. NLDN has the most spatially and temporally complete CG lightning data for the USA, and therefore provides the best opportunity to find geospatially coincident observations with SSMI sensors. The strongest thunderstorms generally have minimum 85 GHz polarization-corrected brightness temperatures (PCT) less than 150 K. Archived radar data was used to resolve the spatial extent of the individual storms. NLDN data for that storm spatial extent defined by radar data was used to calculate the CG flash rate for the storm. Similar to results using TRMM sensors, a linear model best explained the relationship between storm-specific CG flash rates and minimum 85 GHz PCT.
However, the results in this study apply only to CG lightning. To extend the results to weaker storms, the probability of CG lightning (instead of the flash rate) was calculated for storms having 85 GHz PCT greater than 150 K. NLDN data was used to determine if a CG strike occurred for a storm. This probability of CG lightning was plotted as a function of minimum 85 GHz PCT and minimum 37 GHz PCT. These probabilities were used in conjunction with the linear model to estimate the CG flash rate for weaker storms with minimum 85 GHz PCTs greater than 150 K. Results from the investigation of CG lightning and passive microwave radiation signals agree with the previous research investigating total lightning and brightness temperature. Future work will take the established relationships and apply them to the decades of available DMSP data for the USA to derive a map of CG lightning flash rates. Validation of this method and uncertainty analysis will be done by comparing the derived maps of CG lightning flash rates against existing NLDN maps of CG lightning flash rates.
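
    The two-branch estimator described above (linear model for strong storms, probability-weighted baseline for weaker ones) can be sketched as follows; the slope, intercept, and probability curve are placeholders, not the fitted values from the study:

```python
def cg_flash_rate(min_pct85_k, slope=-0.25, intercept=40.0):
    # Strong storms (minimum 85 GHz PCT below 150 K): linear model
    # relating the brightness-temperature depression to CG flash rate.
    if min_pct85_k < 150.0:
        return max(0.0, slope * min_pct85_k + intercept)
    # Weaker storms: scale a baseline rate by an illustrative probability
    # of any CG strike, which falls off as the storm minimum PCT warms.
    p_cg = max(0.0, min(1.0, (250.0 - min_pct85_k) / 100.0))
    return p_cg * max(0.0, slope * 150.0 + intercept)

cg_flash_rate(100.0)   # strong storm → 15.0 (placeholder units)
cg_flash_rate(250.0)   # warm storm  → 0.0
```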

  5. Wide area detection system: Conceptual design study. [using television and microelectronic technology

    NASA Technical Reports Server (NTRS)

    Hilbert, E. E.; Carl, C.; Goss, W.; Hansen, G. R.; Olsasky, M. J.; Johnston, A. R.

    1978-01-01

    An integrated sensor for traffic surveillance on mainline sections of urban freeways is described. Applicable imaging and processor technology is surveyed and the functional requirements for the sensors and the conceptual design of the breadboard sensors are given. Parameters measured by the sensors include lane density, speed, and volume. The freeway image is also used for incident diagnosis.

  6. Advanced scanners and imaging systems for earth observations. [conferences

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Assessments of present and future sensors and sensor related technology are reported along with a description of user needs and applications. Five areas are outlined: (1) electromechanical scanners, (2) self-scanned solid state sensors, (3) electron beam imagers, (4) sensor related technology, and (5) user applications. Recommendations, charts, system designs, technical approaches, and bibliographies are included for each area.

  7. Landsat and Thermal Infrared Imaging

    NASA Technical Reports Server (NTRS)

    Arvidson, Terry; Barsi, Julia; Jhabvala, Murzy; Reuter, Dennis

    2012-01-01

    The purpose of this chapter is to describe the collection of thermal images by Landsat sensors already on orbit and to introduce the new thermal sensor to be launched in 2013. The chapter describes the thematic mapper (TM) and enhanced thematic mapper plus (ETM+) sensors, the calibration of their thermal bands, and the design and prelaunch calibration of the new thermal infrared sensor (TIRS).

  8. High-Sensitivity Fiber-Optic Ultrasound Sensors for Medical Imaging Applications

    PubMed Central

    Wen, H.; Wiesler, D.G.; Tveten, A.; Danver, B.; Dandridge, A.

    2010-01-01

    This paper presents several designs of high-sensitivity, compact fiber-optic ultrasound sensors that may be used for medical imaging applications. These sensors translate ultrasonic pulses into strains in single-mode optical fibers, which are measured with fiber-based laser interferometers at high precision. The sensors are simpler and less expensive to make than piezoelectric sensors, and are not susceptible to electromagnetic interference. It is possible to make focal sensors with these designs, and several schemes are discussed. Because of the minimum bending radius of optical fibers, the designs are suitable for single element sensors rather than for arrays. PMID:9691368

  9. Reconstruction of an acoustic pressure field in a resonance tube by particle image velocimetry.

    PubMed

    Kuzuu, K; Hasegawa, S

    2015-11-01

    A technique for estimating an acoustic field in a resonance tube is suggested. The estimation of the acoustic field in a resonance tube is important for the development of the thermoacoustic engine, and can be conducted employing two sensors to measure pressure. While this measurement technique is known as the two-sensor method, care needs to be taken with the locations of the pressure sensors when conducting pressure measurements. In the present study, particle image velocimetry (PIV) is employed instead of a pressure measurement by a sensor, and two-dimensional velocity vector images are extracted as sequential data from only a one-time recording made by the PIV video camera. The spatial velocity amplitude is obtained from those images, and a pressure distribution is calculated from velocity amplitudes at two points by extending the equations derived for the two-sensor method. By means of this method, problems relating to the locations and calibrations of multiple pressure sensors are avoided. Furthermore, to verify the accuracy of the present method, experiments are conducted employing the conventional two-sensor method and laser Doppler velocimetry (LDV). Then, results of the proposed method are compared with those obtained with the two-sensor method and LDV.
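
    The classic two-sensor reconstruction that the paper extends can be written down directly: two complex amplitudes measured at known positions determine a lossless 1-D field everywhere. A sketch (the paper substitutes PIV velocity amplitudes for the two pressure sensors; the positions and wavenumber below are illustrative):

```python
import cmath, math

def two_sensor_field(p1, p2, x1, x2, k):
    # Interpolate a 1-D Helmholtz solution from two measured complex
    # amplitudes p1, p2 at positions x1, x2 (k = acoustic wavenumber).
    # Both basis terms solve the wave equation, and the interpolant
    # matches the data at x1 and x2, so it is exact for a lossless tube.
    def p(x):
        return (p1 * cmath.sin(k * (x2 - x))
                + p2 * cmath.sin(k * (x - x1))) / cmath.sin(k * (x2 - x1))
    return p

# Check against a known standing wave p(x) = cos(kx):
k, x1, x2 = 1.0, 0.3, 1.0
p = two_sensor_field(math.cos(k * x1), math.cos(k * x2), x1, x2, k)
# p(0.6) reproduces cos(0.6) to machine precision
```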

  10. Radiometric characterization of hyperspectral imagers using multispectral sensors

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-08-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.
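
    The band-averaging step collapses Hyperion's high-resolution spectrum onto a MODIS-equivalent band value by weighting with the MODIS relative spectral response (RSR). A sketch on a shared uniform wavelength grid (the RSR and radiance values are synthetic):

```python
import numpy as np

def band_average(hyperion_radiance, modis_rsr):
    # RSR-weighted mean of the high-spectral-resolution radiance,
    # assuming both arrays are sampled on the same uniform wavelength grid.
    w = np.asarray(modis_rsr, dtype=float)
    L = np.asarray(hyperion_radiance, dtype=float)
    return float(np.sum(w * L) / np.sum(w))

rsr = np.bartlett(11)            # synthetic triangular band response
radiance = np.full(11, 5.0)      # spectrally flat radiance
band_average(radiance, rsr)      # a flat spectrum averages to itself: 5.0
```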

  11. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.

  12. Multispectral image fusion for detecting land mines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, G.A.; Sengupta, S.K.; Aimonetti, W.D.

    1995-04-01

    This report details a system which fuses information contained in registered images from multiple sensors to reduce the effects of clutter and improve the ability to detect surface and buried land mines. The sensor suite currently consists of a camera that acquires images in six bands (400nm, 500nm, 600nm, 700nm, 800nm and 900nm). Past research has shown that it is extremely difficult to distinguish land mines from background clutter in images obtained from a single sensor. It is hypothesized, however, that information fused from a suite of various sensors is likely to provide better detection reliability, because the suite of sensors detects a variety of physical properties that are more separable in feature space. The materials surrounding the mines can include natural materials (soil, rocks, foliage, water, etc.) and some artifacts.

  13. Extracellular Bio-imaging of Acetylcholine-stimulated PC12 Cells Using a Calcium and Potassium Multi-ion Image Sensor.

    PubMed

    Matsuba, Sota; Kato, Ryo; Okumura, Koichi; Sawada, Kazuaki; Hattori, Toshiaki

    2018-01-01

    In biochemistry, Ca2+ and K+ play essential roles in controlling signal transduction. Much interest has been focused on ion imaging, which facilitates understanding of ion flux dynamics. In this paper, we report a calcium and potassium multi-ion image sensor and its application to living cells (PC12). The multi-ion sensor had two selective plasticized poly(vinyl chloride) membranes containing ionophores. Each region on the sensor responded only to the corresponding ion. The multi-ion sensor has many advantages, including not only label-free and real-time measurement but also simultaneous detection of Ca2+ and K+. Cultured PC12 cells treated with nerve growth factor were prepared, and practical observations of the cells were conducted with the sensor. After the PC12 cells were stimulated by acetylcholine, only the extracellular Ca2+ concentration increased, while there was no increase in the extracellular K+ concentration. Through these observations, we demonstrated that the sensor is helpful for analyzing cell events involving changing Ca2+ and/or K+ concentrations.

  14. Application of Sensor Fusion to Improve Uav Image Classification

    NASA Astrophysics Data System (ADS)

    Jabari, S.; Fathollahi, F.; Zhang, Y.

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increased accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high-resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on board in UAV missions and performing image fusion can help achieve higher-quality images and, accordingly, higher-accuracy classification results.
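
    As a concrete (if simplified) stand-in for the fusion step, a Brovey-style ratio fusion scales each resampled MS band by the pan-to-intensity ratio; the abstract does not specify this particular algorithm, so it is offered only as an illustration:

```python
import numpy as np

def brovey_fuse(pan, ms):
    # pan: (H, W) panchromatic image; ms: (bands, H, W) multispectral
    # bands already resampled to the pan grid. Each band is scaled by
    # the ratio of pan intensity to mean MS intensity, injecting the
    # pan camera's spatial detail while preserving the band ratios.
    intensity = ms.mean(axis=0)
    ratio = pan / np.where(intensity == 0, 1.0, intensity)
    return ms * ratio

pan = np.full((2, 2), 4.0)
ms = np.full((3, 2, 2), 2.0)
fused = brovey_fuse(pan, ms)     # every fused pixel becomes 4.0 here
```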

  15. On-Line Temperature Estimation for Noisy Thermal Sensors Using a Smoothing Filter-Based Kalman Predictor

    PubMed Central

    Li, Zhi; Wei, Henglu; Zhou, Wei; Duan, Zhemin

    2018-01-01

    Dynamic thermal management (DTM) mechanisms utilize embedded thermal sensors to collect fine-grained temperature information for monitoring the real-time thermal behavior of multi-core processors. However, embedded thermal sensors are very susceptible to a variety of sources of noise, including environmental uncertainty and process variation. This causes discrepancies between actual temperatures and those observed by on-chip thermal sensors, which seriously affect the efficiency of DTM. In this paper, a smoothing filter-based Kalman prediction technique is proposed to accurately estimate the temperatures from noisy sensor readings. For the multi-sensor estimation scenario, the spatial correlations among different sensor locations are exploited. On this basis, a multi-sensor synergistic calibration algorithm (known as MSSCA) is proposed to improve the simultaneous prediction accuracy of multiple sensors. Moreover, an infrared imaging-based temperature measurement technique is also proposed to capture the thermal traces of an advanced micro devices (AMD) quad-core processor in real time. The acquired real temperature data are used to evaluate our prediction performance. Simulation shows that the proposed synergistic calibration scheme can reduce the root-mean-square error (RMSE) by 1.2 °C and increase the signal-to-noise ratio (SNR) by 15.8 dB (with a very small average runtime overhead) compared with assuming the thermal sensor readings to be ideal. Additionally, the average false alarm rate (FAR) of the corrected sensor temperature readings can be reduced by 28.6%. These results clearly demonstrate that if our approach is used to perform temperature estimation, the response mechanisms of DTM can be triggered to adjust the voltages, frequencies, and cooling fan speeds at more appropriate times. PMID:29393862
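
    In the spirit of the smoothing-filter-based predictor, a scalar sketch: an exponential pre-smoother feeds a random-walk Kalman update. The paper's actual filter structure and parameters differ; q, r, and alpha below are illustrative:

```python
def kalman_smooth_temperature(readings, q=0.01, r=1.0, alpha=0.3):
    # Exponential smoothing suppresses sensor noise before the Kalman
    # update; the Kalman predictor then tracks the underlying temperature
    # with a random-walk process model (variance q) and measurement
    # noise variance r.
    est, p = readings[0], 1.0
    smoothed = readings[0]
    estimates = []
    for z in readings:
        smoothed = alpha * z + (1 - alpha) * smoothed  # pre-smoothing filter
        p = p + q                                      # predict step
        k = p / (p + r)                                # Kalman gain
        est = est + k * (smoothed - est)               # update step
        p = (1 - k) * p
        estimates.append(est)
    return estimates

# A constant 50 °C input is tracked exactly:
temps = kalman_smooth_temperature([50.0] * 20)
```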

  16. On-Line Temperature Estimation for Noisy Thermal Sensors Using a Smoothing Filter-Based Kalman Predictor.

    PubMed

    Li, Xin; Ou, Xingtao; Li, Zhi; Wei, Henglu; Zhou, Wei; Duan, Zhemin

    2018-02-02

    Dynamic thermal management (DTM) mechanisms utilize embedded thermal sensors to collect fine-grained temperature information for monitoring the real-time thermal behavior of multi-core processors. However, embedded thermal sensors are very susceptible to a variety of sources of noise, including environmental uncertainty and process variation. This causes discrepancies between actual temperatures and those observed by on-chip thermal sensors, which seriously affect the efficiency of DTM. In this paper, a smoothing filter-based Kalman prediction technique is proposed to accurately estimate the temperatures from noisy sensor readings. For the multi-sensor estimation scenario, the spatial correlations among different sensor locations are exploited. On this basis, a multi-sensor synergistic calibration algorithm (known as MSSCA) is proposed to improve the simultaneous prediction accuracy of multiple sensors. Moreover, an infrared imaging-based temperature measurement technique is also proposed to capture the thermal traces of an advanced micro devices (AMD) quad-core processor in real time. The acquired real temperature data are used to evaluate our prediction performance. Simulation shows that the proposed synergistic calibration scheme can reduce the root-mean-square error (RMSE) by 1.2 °C and increase the signal-to-noise ratio (SNR) by 15.8 dB (with a very small average runtime overhead) compared with assuming the thermal sensor readings to be ideal. Additionally, the average false alarm rate (FAR) of the corrected sensor temperature readings can be reduced by 28.6%. These results clearly demonstrate that if our approach is used to perform temperature estimation, the response mechanisms of DTM can be triggered to adjust the voltages, frequencies, and cooling fan speeds at more appropriate times.

  17. Absolute Radiometric Calibration of Narrow-Swath Imaging Sensors with Reference to Non-Coincident Wide-Swath Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Thome, Kurtis; Lockwood, Ronald

    2012-01-01

    An inter-calibration method is developed to provide absolute radiometric calibration of narrow-swath imaging sensors with reference to non-coincident wide-swath sensors. The method predicts at-sensor radiance using non-coincident imagery from the reference sensor and knowledge of the spectral reflectance of the test site. The imagery of the reference sensor is restricted to acquisitions that provide similar view and solar illumination geometry, to reduce uncertainties due to directional reflectance effects. The spectral reflectance of the test site is found with a simple iterative radiative transfer method using radiance values from a well-understood wide-swath sensor and spectral shape information based on historical ground-based measurements. At-sensor radiance is then calculated for the narrow-swath sensor using this spectral reflectance and atmospheric parameters that are also based on historical in situ measurements. Results of the inter-calibration method show agreement at the 2-5 percent level in most spectral regions with the vicarious calibration technique relying on coincident ground-based measurements, referred to as the reflectance-based approach. While the variability of the inter-calibration method based on non-coincident image pairs is significantly larger, the results are consistent with techniques relying on in situ measurements. The method is also insensitive to spectral differences between the sensors, since it transfers to surface spectral reflectance prior to predicting at-sensor radiance. The utility of this inter-calibration method is made clear by its flexibility to utilize image pairings with acquisition dates differing by more than 30 days, allowing frequent absolute calibration comparisons between wide- and narrow-swath sensors.
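    The core radiance prediction can be sketched for a Lambertian surface under a simple one-layer atmosphere. The band irradiance, transmittances, path radiance, and reflectance values below are illustrative stand-ins, not the paper's historical in situ parameters.

    ```python
    import math

    def at_sensor_radiance(reflectance, e0, sza_deg, t_down, t_up, l_path):
        """Predict band-averaged at-sensor radiance (W m^-2 sr^-1 um^-1)
        for a Lambertian surface: ground-leaving radiance attenuated by
        the upward transmittance, plus an additive path-radiance term."""
        mu = math.cos(math.radians(sza_deg))
        ground_leaving = reflectance * e0 * mu * t_down / math.pi
        return ground_leaving * t_up + l_path

    # Illustrative values for a red band over a bright desert test site.
    L = at_sensor_radiance(reflectance=0.35, e0=1550.0, sza_deg=30.0,
                           t_down=0.85, t_up=0.90, l_path=8.0)
    ```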

  18. Traffic Monitor

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Intelligent Vision Systems, Inc. (InVision) needed image acquisition technology that was reliable in bad weather for its TDS-200 Traffic Detection System. InVision researchers used information from NASA Tech Briefs and assistance from Johnson Space Center to finish the system. The NASA technology used was developed for Earth-observing imaging satellites: charge-coupled devices, in which silicon chips convert light directly into electronic or digital images. The TDS-200 consists of sensors mounted above traffic on poles or span wires, enabling two sensors to view an intersection; a "swing and sway" feature to compensate for movement of the sensors; a combination of electronic shutter and gain control; and sensor output to an image digital signal processor, still frame video and optionally live video.

  19. Displacement damage effects on CMOS APS image sensors induced by neutron irradiation from a nuclear reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zujun, E-mail: wangzujun@nint.ac.cn; Huang, Shaoyan; Liu, Minbo

    Experiments on displacement damage effects in CMOS APS image sensors induced by neutron irradiation from a nuclear reactor are presented. The CMOS APS image sensors are manufactured in a standard 0.35 μm CMOS technology. The flux of the neutron beams was about 1.33 × 10⁸ n/cm²·s. Three samples were exposed to 1 MeV neutron-equivalent fluences of 1 × 10¹¹, 5 × 10¹¹, and 1 × 10¹² n/cm², respectively. The mean dark signal (K_D), dark signal spikes, dark signal non-uniformity (DSNU), noise (V_N), saturation output signal voltage (V_S), and dynamic range (DR) versus neutron fluence are investigated. The degradation mechanisms of CMOS APS image sensors are analyzed. The mean dark signal increase due to neutron displacement damage appears to be proportional to the displacement damage dose. Dark images from the neutron-irradiated CMOS APS image sensors are presented to investigate the generation of dark signal spikes.
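    The reported proportionality between mean dark-signal increase and displacement damage dose amounts to a slope fit through the origin. Only the fluences below come from the abstract; the damage factor `k_dark` and the mV units are assumptions for illustration.

    ```python
    # Fluences match the abstract; the damage factor (dark-signal
    # increase per unit fluence) is an assumed illustrative value.
    fluences = [1e11, 5e11, 1e12]                   # 1 MeV-equivalent, n/cm^2
    k_dark = 2.0e-10                                # mV per (n/cm^2), assumed
    dark_increase = [k_dark * f for f in fluences]  # predicted dark-signal rise, mV

    def damage_factor(f, d):
        """Least-squares slope through the origin: K = sum(f*d) / sum(f*f)."""
        return sum(fi * di for fi, di in zip(f, d)) / sum(fi * fi for fi in f)
    ```

    Fitting measured dark-signal increases with `damage_factor` would recover the per-fluence damage coefficient; for the noiseless synthetic data above it returns `k_dark` exactly.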

  20. Commercial Sensory Survey Radiation Testing Progress Report

    NASA Technical Reports Server (NTRS)

    Becker, Heidi N.; Dolphic, Michael D.; Thorbourn, Dennis O.; Alexander, James W.; Salomon, Phil M.

    2008-01-01

    The NASA Electronic Parts and Packaging (NEPP) Program Sensor Technology Commercial Sensor Survey task is geared toward benefiting future NASA space missions with low-cost, short-duty-cycle, visible imaging needs. Such applications could include imaging for educational outreach purposes or short surveys of spacecraft, planetary, or lunar surfaces. Under the task, inexpensive commercial grade CMOS sensors were surveyed in fiscal year 2007 (FY07) and three sensors were selected for total ionizing dose (TID) and displacement damage dose (DDD) tolerance testing. The selected sensors had to meet selection criteria chosen to support small, low-mass cameras that produce good resolution color images. These criteria are discussed in detail in [1]. This document discusses the progress of radiation testing on the Micron and OmniVision sensors selected in FY07 for radiation tolerance testing.

  1. Integration of Multi-sensor Data for Desertification Monitoring

    NASA Astrophysics Data System (ADS)

    Lin, S.; Kim, J.

    2010-12-01

    Desert areas have been rapidly expanding globally due to climate change, unchecked human activities, and other causes. Continuing desertification has seriously affected regions in and near deserts all over the world. As sand dune activity has been recognised as an essential indicator of desertification (it is both a signature and a consequence of desertification), accurate monitoring of desert dune movement becomes crucial for understanding and modelling the progress of desertification. In order to determine the dunes' speed and direction of movement, and to understand the propagation occurring in the transition region between desert and soil-rich areas, a monitoring system applying multi-temporal and multi-sensor remotely sensed data is proposed and implemented. The remotely sensed data involved in the monitoring scheme include space-borne optical imagery, Synthetic Aperture Radar (SAR) data, multi- and hyper-spectral imagery, and terrestrial close-range imagery. In order to determine the movement of dunes, a reference terrain surface is required. To this end, a digital terrain model (DTM) covering the test site is first produced using high-resolution optical stereo satellite images. Subsequently, ERS-1/2 SAR imagery is employed as another resource for dune field observation. Through the interferometric SAR (InSAR) technique, combined with the image-based stereo DTM, surface displacements are obtained, from which the movement and speed of the dunes can be determined. To understand the effect of desertification-combating activities, the correlation between dune activity and land-cover change is also an important issue covered in the monitoring scheme. This task is accomplished by tracing soil and vegetation canopy variation through multi- and hyper-spectral image analysis using Hyperion and ALI imagery from the Earth Observing-1 (EO-1) mission.
As a result, the correlation between soil restoration, expansion of the vegetation canopy, and the cessation of dune activity can be clearly revealed. For very detailed measurements, a terrestrial system applying close-range photogrammetry will be set up at the test sites to acquire sequential images, which will be used to generate a 4D model of the dunes in the future. Finally, all the outputs from the multi-sensor data will be cross-verified and compiled to model the desertification process and its consequences. A desertification-combating activity performed by a Korea-China NGO alliance has been conducted in the Qubuqi desert in Nei Mongol, China. The method and system proposed above will be established and applied to monitor dune mobility in this area. The results are expected to be of great value as a first demonstration of remote sensing monitoring of desertification-combating activities.

  2. A Sensitive Measurement for Estimating Impressions of Image-Contents

    NASA Astrophysics Data System (ADS)

    Sato, Mie; Matouge, Shingo; Mori, Toshifumi; Suzuki, Noboru; Kasuga, Masao

    We have investigated Kansei Content, which conveys the maker's intention to the viewer's kansei (sensibility). The semantic differential (SD) method is an effective way to evaluate the subjective impression of image content. However, because the SD method is administered after subjects view the content, it is difficult to examine impressions of detailed scenes in real time. To measure the viewer's impression of image content in real time, we have developed a Taikan sensor. With the Taikan sensor, we investigate the relations among the image content, grip strength, and body temperature. We also explore the interface of the Taikan sensor to make it easy to use. In our experiment, we used a horror movie that strongly affects the subjects' emotions. Our results show that grip strength may increase when subjects view a tense scene, and that the Taikan sensor is easier to use without the circular base that is originally installed.

  3. Low noise WDR ROIC for InGaAs SWIR image sensor

    NASA Astrophysics Data System (ADS)

    Ni, Yang

    2017-11-01

    Hybridized image sensors are currently the only solution for image sensing beyond the spectral response of silicon devices. By hybridization, we can combine the best sensing material and photo-detector design with high-performance CMOS readout circuitry. In the infrared band, we typically face two configurations: high-background and low-background situations. The performance of high-background sensors, as is the case for mid-wave and long-wave infrared detectors, is conditioned mainly by the integration capacity in each pixel. In low-background situations, the detector's performance is limited mainly by the pixel's noise, which is conditioned by dark signal and readout noise. Under reflection-based imaging conditions, as in SWIR-band imaging, the pixel's dynamic range is also an important parameter. We are particularly interested in InGaAs-based SWIR image sensors.

  4. A CMOS image sensor with stacked photodiodes for lensless observation system of digital enzyme-linked immunosorbent assay

    NASA Astrophysics Data System (ADS)

    Takehara, Hironari; Miyazawa, Kazuya; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Kim, Soo Hyeon; Iino, Ryota; Noji, Hiroyuki; Ohta, Jun

    2014-01-01

    A CMOS image sensor with stacked photodiodes was fabricated using 0.18 µm mixed signal CMOS process technology. Two photodiodes were stacked at the same position of each pixel of the CMOS image sensor. The stacked photodiodes consist of shallow high-concentration N-type layer (N+), P-type well (PW), deep N-type well (DNW), and P-type substrate (P-sub). PW and P-sub were shorted to ground. By monitoring the voltage of N+ and DNW individually, we can observe two monochromatic colors simultaneously without using any color filters. The CMOS image sensor is suitable for fluorescence imaging, especially contact imaging such as a lensless observation system of digital enzyme-linked immunosorbent assay (ELISA). Since the fluorescence increases with time in digital ELISA, it is possible to observe fluorescence accurately by calculating the difference from the initial relation between the pixel values for both photodiodes.

  5. Low-voltage 96 dB snapshot CMOS image sensor with 4.5 nW power dissipation per pixel.

    PubMed

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

    Modern "smart" CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage "smart" image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 um CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.
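    The dynamic-range figure follows from the usual ratio of the saturation output swing to the noise floor. The 1.0 V swing and ~16 µV floor below are assumptions chosen to land near the reported 96 dB, not values taken from the paper.

    ```python
    import math

    def dynamic_range_db(v_sat, v_noise_floor):
        """Dynamic range in dB: 20*log10 of the ratio between the
        saturation output swing and the read-noise floor."""
        return 20.0 * math.log10(v_sat / v_noise_floor)

    # An assumed 1.0 V swing with a ~16 uV noise floor gives ~96 dB.
    dr = dynamic_range_db(1.0, 15.8e-6)
    ```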

  6. Low-Voltage 96 dB Snapshot CMOS Image Sensor with 4.5 nW Power Dissipation per Pixel

    PubMed Central

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

    Modern “smart” CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage “smart” image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 um CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel. PMID:23112588

  7. Electric Potential and Electric Field Imaging with Applications

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2016-01-01

    The technology and techniques for remote quantitative imaging of electrostatic potentials and electrostatic fields, in and around objects and in free space, are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for "illuminating" volumes to be inspected with EFI. The baseline sensor technology, the electric field sensor (e-sensor), its construction, optional electric field generation (quasistatic generator), and current e-sensor enhancements (ephemeral e-sensor) are discussed. Demonstrations for structural, electronic, human, and memory applications are shown. This new EFI capability is demonstrated to reveal the characterization of electric charge distribution, creating a new field of study that embraces areas of interest including electrostatic discharge mitigation, crime scene forensics, design and materials selection for advanced sensors, dielectric morphology of structures, inspection of containers, inspection for hidden objects, tether integrity, organic molecular memory, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  8. CMOS image sensor for detection of interferon gamma protein interaction as a point-of-care approach.

    PubMed

    Marimuthu, Mohana; Kandasamy, Karthikeyan; Ahn, Chang Geun; Sung, Gun Yong; Kim, Min-Gon; Kim, Sanghyo

    2011-09-01

    Complementary metal oxide semiconductor (CMOS)-based image sensors have received increased attention owing to the possibility of incorporating them into portable diagnostic devices. The present research examined the efficiency and sensitivity of a CMOS image sensor for the detection of antigen-antibody interactions involving interferon gamma protein without the aid of expensive instruments. The highest detection sensitivity, of about 1 fg/ml primary antibody, was achieved simply through a transmission mechanism. When photons are prevented from reaching the sensor surface, the digital output is reduced; the number of photons hitting the sensor surface is approximately proportional to the digital number. Nanoscale variation in substrate thickness after protein binding can thus be detected with high sensitivity by the CMOS image sensor. Therefore, this technique can easily be applied to smartphones or other clinical diagnostic devices for the detection of several biological entities, with high impact on the development of point-of-care applications.

  9. Effective Fingerprint Quality Estimation for Diverse Capture Sensors

    PubMed Central

    Xie, Shan Juan; Yoon, Sook; Shin, Jinwook; Park, Dong Sun

    2010-01-01

    Recognizing the quality of fingerprints in advance can be beneficial for improving the performance of fingerprint recognition systems. The representative features for assessing the quality of fingerprint images from different types of capture sensors are known to vary. In this paper, an effective quality estimation system that can be adapted to different types of capture sensors is designed by modifying and combining a set of features including orientation certainty, local orientation quality, and consistency. The proposed system extracts basic features and generates next-level features applicable to various types of capture sensors. The system then uses a Support Vector Machine (SVM) classifier to determine whether or not an image should be accepted as input to the recognition system. The experimental results show that the proposed method performs better than previous methods in terms of accuracy. Meanwhile, the proposed method is able to eliminate residue images from optical and capacitive sensors, and coarse images from thermal sensors. PMID:22163632

  10. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders — from Optical Triangulation to the Automotive Field

    PubMed Central

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-01-01

    With their significant features, the applications of complementary metal-oxide semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper, CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor, to enhance the performance of the triangulation-based range finders, was also developed. An extensive series of experiments was conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders in the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field. PMID:27879789
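    A simple triangulation scheme of the kind described reduces to d = f·b/x, where f is the focal length, b the baseline between the light source and the lens, and x the lateral displacement of the projected spot on the sensor. The lens, baseline, and pixel pitch below are assumed values for illustration, not the authors' hardware.

    ```python
    def triangulation_distance(focal_mm, baseline_mm, pixel_shift, pixel_pitch_mm):
        """Distance (mm) to the target from the lateral image shift of a
        projected laser spot: d = f * b / x, with x the spot displacement
        on the sensor in millimetres."""
        x = pixel_shift * pixel_pitch_mm
        return focal_mm * baseline_mm / x

    # Assumed geometry: 8 mm lens, 50 mm laser-to-lens baseline,
    # 5.6 um pixels; a 25-pixel shift places the target near 2.9 m.
    d_mm = triangulation_distance(8.0, 50.0, 25, 0.0056)
    ```

    Because x appears in the denominator, the distance resolution degrades quadratically with range, which is consistent with the finer resolution reported at the near end of each measurement range.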

  11. The Geoscience Spaceborne Imaging Spectroscopy Technical Committees Calibration and Validation Workshop

    NASA Technical Reports Server (NTRS)

    Ong, Cindy; Mueller, Andreas; Thome, Kurtis; Pierce, Leland E.; Malthus, Timothy

    2016-01-01

    Calibration is the process of quantitatively defining a system's responses to known, controlled signal inputs, and validation is the process of assessing, by independent means, the quality of the data products derived from those system outputs [1]. Similar to other Earth observation (EO) sensors, the calibration and validation of spaceborne imaging spectroscopy sensors is a fundamental underpinning activity. Calibration and validation determine the quality and integrity of the data provided by spaceborne imaging spectroscopy sensors and have enormous downstream impacts on the accuracy and reliability of products generated from these sensors. At least five imaging spectroscopy satellites are planned to be launched within the next five years, with the two most advanced scheduled to be launched in the next two years [2]. The launch of these sensors requires the establishment of suitable, standardized, and harmonized calibration and validation strategies to ensure that high-quality data are acquired and comparable between these sensor systems. Such activities are extremely important for the community of imaging spectroscopy users. Recognizing the need to focus on this underpinning topic, the Geoscience Spaceborne Imaging Spectroscopy (previously, the International Spaceborne Imaging Spectroscopy) Technical Committee launched a calibration and validation initiative at the 2013 International Geoscience and Remote Sensing Symposium (IGARSS) in Melbourne, Australia, and a post-conference activity of a vicarious calibration field trip at Lake Lefroy in Western Australia.

  12. CMOS image sensors as an efficient platform for glucose monitoring.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo; Choi, Cheol Soo

    2013-10-07

    Complementary metal oxide semiconductor (CMOS) image sensors have been used previously in the analysis of biological samples. In the present study, a CMOS image sensor was used to monitor the concentration of oxidized mouse plasma glucose (86-322 mg dL(-1)) based on photon count variation. Measurement of the concentration of oxidized glucose depended on changes in color intensity, which increased with increasing glucose concentration. The high color density at elevated glucose concentrations strongly reduced the number of photons passing through the polydimethylsiloxane (PDMS) chip, indicating that the photon count was governed by color intensity. Photons were detected by a photodiode in the CMOS image sensor and converted to digital numbers by an analog-to-digital converter (ADC). Additionally, UV-spectral analysis and time-dependent photon analysis proved the efficiency of the detection system. This simple, effective, and consistent method for glucose measurement shows that CMOS image sensors are efficient devices for monitoring glucose in point-of-care applications.

  13. High-resolution dynamic pressure sensor array based on piezo-phototronic effect tuned photoluminescence imaging.

    PubMed

    Peng, Mingzeng; Li, Zhou; Liu, Caihong; Zheng, Qiang; Shi, Xieqing; Song, Ming; Zhang, Yang; Du, Shiyu; Zhai, Junyi; Wang, Zhong Lin

    2015-03-24

    A high-resolution dynamic tactile/pressure display is indispensable to the comprehensive perception of force/mechanical stimulations such as electronic skin, biomechanical imaging/analysis, or personalized signatures. Here, we present a dynamic pressure sensor array based on pressure/strain tuned photoluminescence imaging without the need for electricity. Each sensor is a nanopillar that consists of InGaN/GaN multiple quantum wells. Its photoluminescence intensity can be modulated dramatically and linearly by small strain (0-0.15%) owing to the piezo-phototronic effect. The sensor array has a high pixel density of 6350 dpi and exceptional small standard deviation of photoluminescence. High-quality tactile/pressure sensing distribution can be real-time recorded by parallel photoluminescence imaging without any cross-talk. The sensor array can be inexpensively fabricated over large areas by semiconductor product lines. The proposed dynamic all-optical pressure imaging with excellent resolution, high sensitivity, good uniformity, and ultrafast response time offers a suitable way for smart sensing, micro/nano-opto-electromechanical systems.

  14. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    PubMed

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. Obtaining a space-efficient design over 64 × 64 pixels means that the standard processing electronics used off-chip cannot be implemented at the pixel level. Therefore, the analog signal processing at each pixel is a design tailored to LDBF signals, with balanced optimization of signal-to-noise ratio and silicon area. This custom-made sensor offers key advantages over conventional sensors: the analog signal processing at the pixel level carries out signal normalization; AC amplification combined with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; a low-resource implementation of the digital processor enables on-chip processing; and the data bottleneck that exists between the detector and the processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.
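    The conventional off-line (software) counterpart of this pixel-level LDBF processing is the DC-normalized first moment of the photocurrent power spectrum. The sketch below uses synthetic signals and a naive DFT; it illustrates the flow metric only and is not the chip's analog implementation.

    ```python
    import cmath, math

    def dft_power(x):
        """Naive DFT power spectrum |X[k]|^2 (O(n^2), fine for a sketch)."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) ** 2 for k in range(n)]

    def flow_index(signal, fs):
        """Perfusion estimate: first moment of the power spectrum over
        positive frequencies, normalized by the squared DC intensity."""
        n = len(signal)
        p = dft_power(signal)
        dc = sum(signal) / n
        return sum((k * fs / n) * p[k] for k in range(1, n // 2)) / (dc * dc)

    # Synthetic photocurrents: equal modulation depth, but the faster
    # flow puts its Doppler power at a higher frequency.
    n, fs = 64, 64.0
    slow = [1.0 + 0.2 * math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
    fast = [1.0 + 0.2 * math.sin(2 * math.pi * 16 * t / n) for t in range(n)]
    fi_slow, fi_fast = flow_index(slow, fs), flow_index(fast, fs)
    ```

    Because the moment is frequency-weighted, the higher-frequency signal yields a proportionally larger index, which is the property the per-pixel normalization and AC amplification on the chip are designed to preserve.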

  15. Highly curved image sensors: a practical approach for improved optical performance

    NASA Astrophysics Data System (ADS)

    Guenter, Brian; Joshi, Neel; Stoakley, Richard; Keefe, Andrew; Geary, Kevin; Freeman, Ryan; Hundley, Jake; Patterson, Pamela; Hammon, David; Herrera, Guillermo; Sherman, Elena; Nowak, Andrew; Schubert, Randall; Brewer, Peter; Yang, Louis; Mott, Russell; McKnight, Geoff

    2017-06-01

    The significant optical and size benefits of using a curved focal surface for imaging systems have been well studied yet never brought to market for lack of a high-quality, mass-producible, curved image sensor. In this work we demonstrate that commercial silicon CMOS image sensors can be thinned and formed into accurate, highly curved optical surfaces with undiminished functionality. Our key development is a pneumatic forming process that avoids rigid mechanical constraints and suppresses wrinkling instabilities. A combination of forming-mold design, pressure membrane elastic properties, and controlled friction forces enables us to gradually contact the die at the corners and smoothly press the sensor into a spherical shape. Allowing the die to slide into the concave target shape enables a threefold increase in the spherical curvature over prior approaches, whose mechanical constraints resist deformation and create a high-stress, stretch-dominated state. Our process creates a bridge between the high-precision, low-cost but planar CMOS process and ideal non-planar component shapes such as spherical imagers for improved optical systems. We demonstrate these curved sensors in prototype cameras with custom lenses, measuring exceptional resolution of 3220 line-widths per picture height at an aperture of f/1.2 and nearly 100% relative illumination across the field. Though we use a 1/2.3" format image sensor in this report, we also show this process is generally compatible with many state-of-the-art image sensor formats. By example, we report photogrammetry test data for an APS-C sized silicon die formed to a 30° subtended spherical angle. These gains in sharpness and relative illumination enable a new generation of ultra-high performance, manufacturable, digital imaging systems for scientific, industrial, and artistic use.

  16. Experimental Verification of Buffet Calculation Procedure Using Unsteady PSP

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta

    2016-01-01

    Typically a limited number of dynamic pressure sensors are employed to determine the unsteady aerodynamic forces on large, slender aerospace structures. The estimated forces are known to be very sensitive to the number of dynamic pressure sensors and the details of the integration scheme. This report describes a robust calculation procedure, based on frequency-specific correlation lengths, that is found to produce good estimates of fluctuating forces from a few dynamic pressure sensors. The validation test was conducted on a flat panel, placed on the floor of a wind tunnel, and subjected to vortex shedding from a rectangular bluff body. The panel was coated with fast-response Pressure Sensitive Paint (PSP), which allowed time-resolved measurements of unsteady pressure fluctuations on a dense grid of spatial points. The first part of the report describes the detailed procedure used to analyze the high-speed PSP camera images. The procedure includes steps to reduce contamination by electronic shot noise, correction for spatial non-uniformities and lamp brightness variation, and finally conversion of fluctuating light intensity to fluctuating pressure. The latter involved applying calibration constants from a few dynamic pressure sensors placed at selected points on the plate. Excellent agreement in the spectra, coherence, and phase calculated via PSP and via the dynamic pressure sensors validated the PSP processing steps. The second part of the report describes the buffet validation process, for which the first step was to use pressure histories from all PSP points to determine the "true" force fluctuations. In the next step only a selected number of pixels were chosen as "virtual sensors" and a correlation-length based buffet calculation procedure was applied to determine "modeled" force fluctuations.
By progressively decreasing the number of virtual sensors, it was observed that the present calculation procedure was able to closely estimate the "true" unsteady forces from as few as four sensors. It is believed that the present work provides the first validation of the buffet calculation procedure, which has been used for the development of many space vehicles.
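    A correlation-length based force estimate of this general kind can be sketched as a coherence-weighted double sum over sensors at one frequency. The sensor geometry, pressure PSD levels, and the exponential coherence model below are illustrative assumptions, not the report's calibrated procedure.

    ```python
    import math

    def force_psd(xs, areas, psd, corr_len):
        """PSD of the net force at one frequency, assembled from n
        single-point pressure PSDs; cross terms are damped by an assumed
        coherence that decays exponentially over the correlation length."""
        n = len(xs)
        total = 0.0
        for i in range(n):
            for j in range(n):
                coh = math.exp(-abs(xs[i] - xs[j]) / corr_len)
                total += areas[i] * areas[j] * coh * math.sqrt(psd[i] * psd[j])
        return total

    # Four equal patches with unit PSD, 1 m apart.
    xs, areas, psd = [0.0, 1.0, 2.0, 3.0], [1.0] * 4, [1.0] * 4
    g_coherent = force_psd(xs, areas, psd, corr_len=1e9)     # long L(f)
    g_incoherent = force_psd(xs, areas, psd, corr_len=1e-9)  # short L(f)
    ```

    The two limits bracket the answer: a long correlation length makes the pressures add in phase (amplitude sum squared), while a short one reduces the estimate to a power sum, which is why the frequency-specific choice of L(f) drives the accuracy of the integrated force.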

  17. Image Processing for Cameras with Fiber Bundle Image Relay

    DTIC Science & Technology

    length. Optical fiber bundles have been used to couple between this focal surface and planar image sensors. However, such fiber-coupled imaging systems...coupled to six discrete CMOS focal planes. We characterize the locally space-variant system impulse response at various stages: monocentric lens image...vignetting, and stitch together the image data from discrete sensors into a single panorama. We compare processed images from the prototype to those taken with

  18. Accuracy of Shack-Hartmann wavefront sensor using a coherent wound fibre image bundle

    NASA Astrophysics Data System (ADS)

    Zheng, Jessica R.; Goodwin, Michael; Lawrence, Jon

    2018-03-01

    Shack-Hartmann wavefront sensors using wound fibre image bundles are desired for multi-object adaptive optics systems, providing a large multiplex of sensors positioned by Starbugs. The use of a large wound fibre image bundle provides the flexibility to use more sub-apertures per wavefront sensor for ELTs. These compact wavefront sensors take advantage of large focal surfaces, such as that of the Giant Magellan Telescope. The focus of this paper is to study the effect of wound fibre image bundle structure defects on the centroid measurement accuracy of a Shack-Hartmann wavefront sensor. We use the first-moment centroid method to estimate the centroid of a focused Gaussian beam sampled by a simulated bundle. Spot-estimation accuracy with a wound fibre image bundle, and the impact of its structure on wavefront measurement accuracy statistics, are addressed. Our results show that when the measurement signal-to-noise ratio is high, the centroid measurement accuracy is dominated by the wound fibre image bundle structure, e.g. tile angle and gap spacing. For measurements with low signal-to-noise ratio, the accuracy is instead limited by the read noise of the detector rather than by the bundle's structure defects. We demonstrate this both in simulation and experimentally. We provide a statistical model of the centroid and wavefront error of a wound fibre image bundle, derived from experiment.
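    The first-moment centroid named in the abstract is a centre-of-gravity sum over the sampled intensity. The 2-D Gaussian spot and grid size below are illustrative, standing in for a spot sampled by the fibre bundle.

    ```python
    import math

    def centroid(image):
        """First-moment (centre-of-gravity) centroid (x, y) of a 2-D
        intensity array, indexed as image[y][x]."""
        total = sx = sy = 0.0
        for y, row in enumerate(image):
            for x, v in enumerate(row):
                total += v
                sx += x * v
                sy += y * v
        return sx / total, sy / total

    # Gaussian spot of unit sigma centred at (4.3, 2.7) on a 9x7 grid.
    spot = [[math.exp(-((x - 4.3) ** 2 + (y - 2.7) ** 2) / 2.0)
             for x in range(9)] for y in range(7)]
    cx, cy = centroid(spot)
    ```

    On a clean, well-sampled spot the estimator recovers the sub-pixel position closely; the paper's point is that bundle structure (tile angle, gap spacing) perturbs the sampled intensities and hence biases exactly this sum.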

  19. Analysis of Multipath Pixels in SAR Images

    NASA Astrophysics Data System (ADS)

    Zhao, J. W.; Wu, J. C.; Ding, X. L.; Zhang, L.; Hu, F. M.

    2016-06-01

    As the received radar signal is the sum of signal contributions overlaid in a single pixel regardless of the travel path, the multipath effect must be tackled seriously: multiple-bounce returns add to the direct scatter echoes and lead to ghost scatterers. Most existing solutions to the multipath problem attempt to recover the signal propagation path. To facilitate simulating signal propagation, many inputs must be supplied in advance, such as the sensor parameters, the geometry of the objects (shape, location, orientation, mutual position between adjacent buildings) and the physical parameters of the surface (roughness, correlation length, permittivity), which together determine the strength of the radar signal backscattered to the SAR sensor. However, it is not practical to obtain such highly detailed object models of an unfamiliar area by field survey, as this is laborious and time-consuming. In this paper, SAR imaging simulation based on RaySAR is first conducted to gain a basic understanding of multipath effects and to enable further comparison. Besides the pre-imaging simulation, the after-imaging product, i.e. the radar images, is also taken into consideration. Both Cosmo-SkyMed ascending and descending SAR images of the Lupu Bridge in Shanghai are used for the experiment. As a result, the reflectivity map and the signal distribution map of different bounce levels are simulated and validated against a 3D real model. Statistical indexes such as phase stability, mean amplitude, amplitude dispersion, coherence and mean-sigma ratio in cases of layover are analysed in combination with the RaySAR output.

  20. Quantitative evaluation of the accuracy and variance of individual pixels in a scientific CMOS (sCMOS) camera for computational imaging

    NASA Astrophysics Data System (ADS)

    Watanabe, Shigeo; Takahashi, Teruo; Bennett, Keith

    2017-02-01

    The "scientific" CMOS (sCMOS) camera architecture fundamentally differs from that of CCD and EMCCD cameras. In digital CCD and EMCCD cameras, conversion from charge to the digital output generally passes through a single electronic chain, and the read noise and the conversion factor from photoelectrons to digital outputs are highly uniform across pixels, although quantum efficiency may vary spatially. In CMOS cameras, the charge-to-voltage conversion is separate for each pixel, and each column has independent amplifiers and analog-to-digital converters, in addition to possible pixel-to-pixel variation in quantum efficiency. The "raw" output from the CMOS image sensor therefore includes pixel-to-pixel variability in read noise, electronic gain, offset and dark current. Scientific camera manufacturers digitally compensate the raw signal from the CMOS image sensor to provide usable images. Statistical noise in images, unless properly modeled, can introduce errors in methods such as fluctuation correlation spectroscopy or computational imaging, for example localization microscopy using maximum likelihood estimation. We measured the distributions and spatial maps of individual-pixel offset, dark current, read noise, linearity, photoresponse non-uniformity and variance for standard, off-the-shelf Hamamatsu ORCA-Flash4.0 V3 sCMOS cameras using highly uniform and controlled illumination conditions, from dark conditions through multiple low light levels between 20 and 1,000 photons/pixel per frame to higher light levels. We further show that using pixel variance for flat-field correction leads to errors in cameras with good factory calibration.
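
    Per-pixel offset and variance maps of the kind measured here are typically built from a stack of frames taken under constant illumination. A minimal sketch, with made-up numbers and an artificially noisy pixel for illustration:

```python
import numpy as np

def pixel_maps(dark_frames):
    """Per-pixel offset (temporal mean) and variance from a stack of frames.

    dark_frames: array of shape (n_frames, rows, cols) in digital numbers."""
    frames = np.asarray(dark_frames, dtype=float)
    offset_map = frames.mean(axis=0)           # per-pixel offset / dark level
    variance_map = frames.var(axis=0, ddof=1)  # per-pixel temporal noise variance
    return offset_map, variance_map

# Simulated 100-frame dark stack for a tiny 4x4 sensor:
rng = np.random.default_rng(0)
stack = rng.normal(100.0, 2.0, size=(100, 4, 4))
stack[:, 2, 3] += rng.normal(0.0, 10.0, size=100)  # one high-read-noise pixel
offset, var = pixel_maps(stack)
print(var[2, 3] > var[0, 0])  # the noisy pixel stands out in the variance map
```

    On a real sCMOS camera the same maps, taken at several light levels, separate read noise, gain and photoresponse non-uniformity pixel by pixel.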

  1. SpectraCAM SPM: a camera system with high dynamic range for scientific and medical applications

    NASA Astrophysics Data System (ADS)

    Bhaskaran, S.; Baiko, D.; Lungu, G.; Pilon, M.; VanGorden, S.

    2005-08-01

    A high-dynamic-range scientific camera system designed and manufactured by Thermo Electron for scientific and medical applications is presented. The newly developed CID820 image sensor with preamplifier-per-pixel technology is employed in this camera system. The 4-megapixel imaging sensor has a raw dynamic range of 82 dB. Each highly transparent pixel is based on a preamplifier-per-pixel architecture and contains two photogates for non-destructive readout (NDRO) of the photon-generated charge. Readout is achieved via parallel row processing with on-chip correlated double sampling (CDS). The imager is capable of true random pixel access with a maximum operating speed of 4 MHz. The camera controller consists of a custom camera signal processor (CSP) with an integrated 16-bit A/D converter and a PowerPC-based CPU running an embedded Linux operating system. The imager is cooled to -40 °C via a three-stage cooler to minimize dark current. The camera housing is sealed and designed to maintain the CID820 imager in the evacuated chamber for at least 5 years. Thermo Electron has also developed custom software and firmware to drive the SpectraCAM SPM camera. Included in this firmware package is the new Extreme DR™ algorithm, designed to extend the effective dynamic range of the camera by several orders of magnitude, up to 32-bit dynamic range. The RACID Exposure graphical user interface image analysis software runs on a standard PC connected to the camera via Gigabit Ethernet.

  2. Wavefront Derived Refraction and Full Eye Biometry in Pseudophakic Eyes

    PubMed Central

    Mao, Xinjie; Banta, James T.; Ke, Bilian; Jiang, Hong; He, Jichang; Liu, Che; Wang, Jianhua

    2016-01-01

    Purpose To assess wavefront derived refraction and full eye biometry including ciliary muscle dimension and full eye axial geometry in pseudophakic eyes using spectral domain OCT equipped with a Shack-Hartmann wavefront sensor. Methods Twenty-eight adult subjects (32 pseudophakic eyes) having recently undergone cataract surgery were enrolled in this study. A custom system combining two optical coherence tomography systems with a Shack-Hartmann wavefront sensor was constructed to image and monitor changes in whole eye biometry, the ciliary muscle and ocular aberration in the pseudophakic eye. A Badal optical channel and a visual target aligning with the wavefront sensor were incorporated into the system for measuring the wavefront-derived refraction. The imaging acquisition was performed twice. The coefficients of repeatability (CoR) and intraclass correlation coefficient (ICC) were calculated. Results Images were acquired and processed successfully in all patients. No significant difference was detected between repeated measurements of ciliary muscle dimension, full-eye biometry or defocus aberration. The CoR of full-eye biometry ranged from 0.36% to 3.04% and the ICC ranged from 0.981 to 0.999. The CoR for ciliary muscle dimensions ranged from 12.2% to 41.6% and the ICC ranged from 0.767 to 0.919. The defocus aberrations of the two measurements were 0.443 ± 0.534 D and 0.447 ± 0.586 D and the ICC was 0.951. Conclusions The combined system is capable of measuring full eye biometry and refraction with good repeatability. The system is suitable for future investigation of pseudoaccommodation in the pseudophakic eye. PMID:27010674

  3. Wavefront Derived Refraction and Full Eye Biometry in Pseudophakic Eyes.

    PubMed

    Mao, Xinjie; Banta, James T; Ke, Bilian; Jiang, Hong; He, Jichang; Liu, Che; Wang, Jianhua

    2016-01-01

    To assess wavefront derived refraction and full eye biometry including ciliary muscle dimension and full eye axial geometry in pseudophakic eyes using spectral domain OCT equipped with a Shack-Hartmann wavefront sensor. Twenty-eight adult subjects (32 pseudophakic eyes) having recently undergone cataract surgery were enrolled in this study. A custom system combining two optical coherence tomography systems with a Shack-Hartmann wavefront sensor was constructed to image and monitor changes in whole eye biometry, the ciliary muscle and ocular aberration in the pseudophakic eye. A Badal optical channel and a visual target aligning with the wavefront sensor were incorporated into the system for measuring the wavefront-derived refraction. The imaging acquisition was performed twice. The coefficients of repeatability (CoR) and intraclass correlation coefficient (ICC) were calculated. Images were acquired and processed successfully in all patients. No significant difference was detected between repeated measurements of ciliary muscle dimension, full-eye biometry or defocus aberration. The CoR of full-eye biometry ranged from 0.36% to 3.04% and the ICC ranged from 0.981 to 0.999. The CoR for ciliary muscle dimensions ranged from 12.2% to 41.6% and the ICC ranged from 0.767 to 0.919. The defocus aberrations of the two measurements were 0.443 ± 0.534 D and 0.447 ± 0.586 D and the ICC was 0.951. The combined system is capable of measuring full eye biometry and refraction with good repeatability. The system is suitable for future investigation of pseudoaccommodation in the pseudophakic eye.
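
    For reference, the two repeatability statistics reported above can be computed as below. This sketches the one-way random-effects ICC(1,1) and the common 1.96·√2·Sw coefficient of repeatability; the paper may use different variants, and all data here are made up:

```python
import math

def repeatability_stats(m1, m2):
    """One-way ICC(1,1) and coefficient of repeatability for paired repeats."""
    n, k = len(m1), 2
    subj_means = [(a + b) / k for a, b in zip(m1, m2)]
    grand = sum(subj_means) / n
    # One-way ANOVA mean squares (between subjects, within subjects):
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum((a - m) ** 2 + (b - m) ** 2
              for a, b, m in zip(m1, m2, subj_means)) / (n * (k - 1))
    icc = (msb - msw) / (msb + (k - 1) * msw)
    cor = 1.96 * math.sqrt(2.0) * math.sqrt(msw)  # ~2.77 x within-subject SD
    return icc, cor

# Two nearly identical repeated measurements -> ICC close to 1:
run1 = [23.1, 24.5, 22.8, 25.0, 23.9]
run2 = [23.2, 24.4, 22.9, 25.1, 23.8]
icc, cor = repeatability_stats(run1, run2)
print(round(icc, 3))  # 0.994
```

    High ICC with a small CoR, as in the paper's biometry results, indicates that between-subject differences dominate the measurement noise.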

  4. Design Considerations For Imaging Charge-Coupled Device (ICCD) Star Sensors

    NASA Astrophysics Data System (ADS)

    McAloon, K. J.

    1981-04-01

    A development program is currently underway to produce a precision star sensor using imaging charge coupled device (ICCD) technology. The effort is the critical component development phase for the Air Force Multi-Mission Attitude Determination and Autonomous Navigation System (MADAN). A number of unique considerations have evolved in designing an arcsecond accuracy sensor around an ICCD detector. Three tiers of performance criteria are involved: at the spacecraft attitude determination system level, at the star sensor level, and at the detector level. Optimum attitude determination system performance involves a tradeoff between Kalman filter iteration time and sensor ICCD integration time. The ICCD star sensor lends itself to the use of a new approach in the functional interface between the attitude determination system and the sensor. At the sensor level image data processing tradeoffs are important for optimum sensor performance. These tradeoffs involve the sensor optic configuration, the optical point spread function (PSF) size and shape, the PSF position locator, and the microprocessor locator algorithm. Performance modelling of the sensor mandates the use of computer simulation programs. Five key performance parameters at the ICCD detector level are defined. ICCD error characteristics have also been isolated to five key parameters.

  5. Radar E-O image fusion

    NASA Technical Reports Server (NTRS)

    Oneil, William F.

    1993-01-01

    The fusion of radar and electro-optic (E-O) sensor images presents unique challenges. The two sensors measure different properties of the real three-dimensional (3-D) world. Forming the sensor outputs into a common format does not mask these differences. In this paper, the conditions under which fusion of the two sensor signals is possible are explored. The program currently planned to investigate this problem is briefly discussed.

  6. Air, telescope, and instrument temperature effects on the Gemini Planet Imager’s image quality

    NASA Astrophysics Data System (ADS)

    Tallis, Melisa; Bailey, Vanessa P.; Macintosh, Bruce; Hayward, Thomas L.; Chilcote, Jeffrey K.; Ruffio, Jean-Baptiste; Poyneer, Lisa A.; Savransky, Dmitry; Wang, Jason J.; GPIES Team

    2018-01-01

    We present results from an analysis of air, telescope, and instrument temperature effects on the Gemini Planet Imager’s (GPI) image quality. GPI is a near-infrared, adaptive optics-fed, high-contrast imaging instrument at the Gemini South telescope, designed to directly image and characterize exoplanets and circumstellar disks. One key metric for instrument performance is “contrast,” which quantifies the sensitivity of an image in terms of the flux ratio of the noise floor vs. the primary star. Very high contrast signifies that GPI could succeed at imaging a dim, close companion around the primary star. We examine relationships between multiple temperature sensors placed on the instrument and telescope vs. image contrast. These results show that there is a strong correlation between image contrast and the presence of temperature differentials between the instrument and the temperature outside the dome. We discuss potential causes such as strong induced dome seeing or optical misalignment due to thermal gradients. We then assess the impact of the current temperature control and ventilation strategy and discuss potential modifications.

  7. A real-time photogrammetric algorithm for sensor and synthetic image fusion with application to aviation combined vision

    NASA Astrophysics Data System (ADS)

    Lebedev, M. A.; Stepaniants, D. G.; Komarov, D. V.; Vygolov, O. V.; Vizilter, Yu. V.; Zheltov, S. Yu.

    2014-08-01

    The paper addresses a promising visualization concept: combining sensor and synthetic images to enhance a pilot's situation awareness during aircraft landing. A real-time algorithm is proposed for fusing a sensor image, acquired by an onboard camera, with a synthetic 3D image of the external view generated in an onboard computer. The pixel correspondence between the sensor and synthetic images is obtained by exterior orientation of a "virtual" camera using runway points as a geospatial reference. The runway points are detected by the Projective Hough Transform, the idea of which is to project the edge map onto a horizontal plane in object space (the runway plane) and then to calculate intensity projections of edge pixels along different directions of the intensity gradient. Experiments on simulated images show that, on a base glide path, the algorithm provides image fusion with pixel accuracy even in the presence of significant navigation errors.

  8. Heterogeneous iris image hallucination using sparse representation on a learned heterogeneous patch dictionary

    NASA Astrophysics Data System (ADS)

    Li, Yung-Hui; Zheng, Bo-Ren; Ji, Dai-Yan; Tien, Chung-Hao; Liu, Po-Tsun

    2014-09-01

    Cross-sensor iris matching may seriously degrade recognition performance because of the sensor mismatch between iris images acquired at the enrollment and test stages. In this paper, we propose two novel patch-based heterogeneous dictionary learning methods to attack this problem. The first method applies recent sparse representation theory, while the second learns the correspondence relationship through PCA in a heterogeneous patch space. Both methods learn the basic atoms of iris textures across different image sensors and build connections between them. Once such connections are built, it is possible at the test stage to hallucinate (synthesize) iris images across different sensors. By matching training images with hallucinated images, the recognition rate can be successfully enhanced. The experimental results are satisfactory both visually and in terms of recognition rate: experimenting with an iris database of 3015 images, we show that the EER is decreased by 39.4% in relative terms by the proposed method.

  9. Estimation and analysis of interannual variations in tropical oceanic rainfall using data from SSM/I

    NASA Technical Reports Server (NTRS)

    Berg, Wesley

    1992-01-01

    Rainfall over tropical ocean regions, particularly in the tropical Pacific, is estimated using Special Sensor Microwave/Imager (SSM/I) data. Instantaneous rainfall estimates are derived from brightness temperature values obtained from the satellite data using the Hughes D-Matrix algorithm. Comparisons with other satellite techniques are made to validate the SSM/I results for the tropical Pacific. The correlation coefficients are relatively high for the three data sets investigated, especially for the annual case.

  10. SSM/I Rainfall Volume Correlated with Deepening Rate in Extratropical Cyclones

    NASA Technical Reports Server (NTRS)

    Petty, Grant W.; Miller, Douglas K.

    1994-01-01

    With the emergence of reasonably robust, physically based rain rate algorithms designed for the Special Sensor Microwave/Imager (SSM/I), a unique opportunity exists to directly observe a physical component which can contribute to or be a signature of cyclone deepening (latent heat release). The emphasis of the research in this paper is to seek systematic differences in rain rate observed by the SSM/I, using the algorithm of Petty in cases of explosive and nonexplosive cyclone deepening.

  11. Imaging optical sensor arrays.

    PubMed

    Walt, David R

    2002-10-01

    Imaging optical fibres have been etched to prepare microwell arrays. These microwells have been loaded with sensing materials such as bead-based sensors and living cells to create high-density sensor arrays. The extremely small sizes and volumes of the wells enable high sensitivity and high information content sensing capabilities.

  12. Efficient Solar Scene Wavefront Estimation with Reduced Systematic and RMS Errors: Summary

    NASA Astrophysics Data System (ADS)

    Anugu, N.; Garcia, P.

    2016-04-01

    Wavefront sensing for solar telescopes is commonly implemented with Shack-Hartmann sensors. Correlation algorithms are usually used to estimate the extended-scene Shack-Hartmann sub-aperture image shifts, or slopes. The image shift is computed by correlating a reference sub-aperture image with the target distorted sub-aperture image; the pixel position where the maximum correlation is located gives the image shift in integer pixel coordinates. Sub-pixel precision image shifts are computed by applying a peak-finding algorithm to the correlation peak (Poyneer 2003; Löfdahl 2010). However, the peak-finding results are usually biased towards integer pixels; these errors are called systematic bias errors (Sjödahl 1994) and are caused by the low pixel sampling of the images. Their amplitude depends on the type of correlation algorithm and the type of peak-finding algorithm used. To study the systematic errors in detail, solar sub-aperture synthetic images are constructed from a Swedish Solar Telescope solar granulation image. The performance of the cross-correlation algorithm in combination with different peak-finding algorithms is investigated. The studied peak-finding algorithms are: parabola (Poyneer 2003); quadratic polynomial (Löfdahl 2010); threshold centre of gravity (Bailey 2003); Gaussian (Nobach & Honkanen 2005); and pyramid (Bailey 2003). The systematic error study reveals that the pyramid fit is the most robust to pixel-locking effects. The RMS error analysis reveals that the threshold centre of gravity behaves better at low SNR, although its systematic errors are large. No single algorithm is best for both systematic and RMS error reduction. To overcome this problem, a new solution is proposed in which the image sampling is increased prior to the actual correlation matching. The method is realized in two steps to improve its computational efficiency. In the first step, the cross-correlation is implemented on the original image spatial resolution grid (1 pixel). In the second step, the cross-correlation is performed on a sub-pixel grid, limiting the field of search to 4 × 4 pixels centred at the initial position delivered by the first step. The sub-pixel-grid region-of-interest images are generated with bi-cubic interpolation. Correlation matching on a sub-pixel grid was previously reported in electronic speckle photography (Sjödahl 1994); the technique is applied here to solar wavefront sensing. A large dynamic range and better accuracy are achieved by combining original-pixel-grid correlation matching over a large field of view with sub-pixel interpolated-grid correlation matching within a small field of view. The results reveal that the proposed method outperforms all the peak-finding algorithms studied in the first approach: it reduces both the systematic error and the RMS error by a factor of 5 (i.e., 75% systematic error reduction) when 5-times improved image sampling is used, at the expense of twice the computational cost. With the 5-times improved image sampling, the wavefront accuracy is increased by a factor of 5. The proposed solution is strongly recommended for wavefront sensing in solar telescopes, particularly for measuring the large dynamic image shifts involved in open-loop adaptive optics. By choosing an appropriate increment of image sampling as a trade-off between computational speed and the required sub-pixel image shift accuracy, it can also be employed in closed-loop adaptive optics. The study is extended to three other classes of sub-aperture images (a point source; a laser guide star; a Galactic Centre extended scene). The results are planned to be submitted to the Optics Express journal.
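
    The parabolic peak-finding step discussed above (Poyneer 2003) fits a parabola through the three correlation samples straddling the integer maximum; a minimal 1-D sketch with illustrative data:

```python
def parabolic_subpixel(c_m1, c_0, c_p1):
    """1-D parabola fit through three correlation samples around the peak.

    c_m1, c_0, c_p1: correlation values at offsets -1, 0, +1 pixels.
    Returns the sub-pixel offset of the fitted maximum from the integer peak."""
    denom = c_m1 - 2.0 * c_0 + c_p1
    return 0.5 * (c_m1 - c_p1) / denom

# Samples of a parabola whose true peak lies at x = +0.2 from the centre sample:
peak = 0.2
c = [1.0 - (x - peak) ** 2 for x in (-1, 0, 1)]
print(round(parabolic_subpixel(*c), 6))  # 0.2
```

    For a truly parabolic peak the fit is exact; on real correlation surfaces the mismatch between the peak shape and the parabola is what produces the pixel-locking bias the paper analyses.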

  13. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications

    PubMed Central

    Park, Keunyeol; Song, Minkyu

    2018-01-01

    This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. To recognize the iris image, an image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. In this case, however, the frame rate decreases because of the time required for digital conversion of multi-bit data through the analog-to-digital converter (ADC) in the CIS. To reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate that obtains single-bit, edge-detected image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data for use in the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixels) is 2.84 mm² in a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V supply voltage and a maximum frame rate of 520 frames/s. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency. PMID:29495273
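
    The sensor performs its XOR edge detection in hardware; a software analogue of single-bit edge extraction, with illustrative names and data, might look like:

```python
def binary_edge_map(img, threshold):
    """Single-bit edge map: binarize each pixel, then XOR with its right
    neighbour so only transitions (edges) survive."""
    bits = [[1 if v >= threshold else 0 for v in row] for row in img]
    return [[row[x] ^ row[x + 1] for x in range(len(row) - 1)] for row in bits]

# A tiny frame with a vertical intensity edge between columns 1 and 2:
frame = [[10, 12, 200, 210],
         [11, 13, 205, 208]]
print(binary_edge_map(frame, 100))  # [[0, 1, 0], [0, 1, 0]]
```

    Replacing multi-bit ADC output with one bit per pixel is what lets the sensor trade intensity resolution for frame rate and power.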

  14. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications.

    PubMed

    Park, Keunyeol; Song, Minkyu; Kim, Soo Youn

    2018-02-24

    This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. To recognize the iris image, an image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. In this case, however, the frame rate decreases because of the time required for digital conversion of multi-bit data through the analog-to-digital converter (ADC) in the CIS. To reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate that obtains single-bit, edge-detected image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data for use in the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixels) is 2.84 mm² in a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V supply voltage and a maximum frame rate of 520 frames/s. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency.

  15. Statistical Analysis of the Random Telegraph Noise in a 1.1 μm Pixel, 8.3 MP CMOS Image Sensor Using On-Chip Time Constant Extraction Method.

    PubMed

    Chao, Calvin Yi-Ping; Tu, Honyih; Wu, Thomas Meng-Hsiu; Chou, Kuo-Yu; Yeh, Shang-Fu; Yin, Chin; Lee, Chih-Lin

    2017-11-23

    A study of the random telegraph noise (RTN) of a 1.1 μm pitch, 8.3 Mpixel CMOS image sensor (CIS) fabricated in a 45 nm backside-illumination (BSI) technology is presented in this paper. A noise decomposition scheme is used to pinpoint the noise source. The long tail of the random noise (RN) distribution is directly linked to the RTN from the pixel source follower (SF). The full 8.3 Mpixels are classified into four categories according to the observed RTN histogram peaks. A theoretical formula describing the RTN as a function of the time difference between the two phases of the correlated double sampling (CDS) is derived and validated by measured data. An on-chip time constant extraction method is developed and applied to the RTN analysis. The effects of readout circuit bandwidth on the settling ratios of the RTN histograms are investigated and successfully accounted for in a simulation using a RTN behavior model.
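
    Correlated double sampling itself is simply the subtraction of each pixel's reset sample from its signal sample, which cancels per-pixel fixed offsets and reset noise; a minimal sketch with made-up values (the paper's RTN time-constant analysis is not reproduced here):

```python
def correlated_double_sample(reset_levels, signal_levels):
    """Correlated double sampling (CDS): subtract each pixel's reset sample
    from its signal sample, cancelling the pixel's fixed offset."""
    return [s - r for s, r in zip(signal_levels, reset_levels)]

# Three pixels with different fixed offsets; the true signals are 50, 60, 70:
reset = [105.0, 98.0, 112.0]
signal = [155.0, 158.0, 182.0]
print(correlated_double_sample(reset, signal))  # [50.0, 60.0, 70.0]
```

    RTN survives CDS because the trap can switch state between the two samples, which is why the paper's formula depends on the time difference between the two CDS phases.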

  16. Quantitative detection of the colloidal gold immunochromatographic strip in HSV color space

    NASA Astrophysics Data System (ADS)

    Wu, Yuanshu; Gao, Yueming; Du, Min

    2014-09-01

    In this paper, a fast, reliable and accurate quantitative detection method for the colloidal gold immunochromatographic assay (GICA) strip is presented. An image acquisition device, mainly composed of an annular LED source, a zoom lens, and a 10-bit CMOS image sensor with 54.5 dB SNR, is designed for the detection. First, the test line is extracted from the strip window using the H-component peak points of the HSV space as cluster centres via Fuzzy C-Means (FCM) clustering. Then, a two-dimensional eigenvalue composed of the hue (H) and saturation (S) components of HSV space is proposed to improve the accuracy of the quantitative detection. Finally, an experiment with human chorionic gonadotropin (HCG) over the concentration range 0-500 mIU/mL is carried out. The results show that the linear correlation coefficient between this method and the optical density (OD) values measured by a fibre-optic sensor reaches 96.74%, and the linearity of the fitted concentration curve is greater than 95.00%.
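
    Extracting the H component on which the clustering operates is the standard RGB-to-HSV conversion; a minimal sketch using Python's stdlib `colorsys` (the FCM clustering step is omitted, and the sample pixels are made up):

```python
import colorsys

def hue_channel(rgb_image):
    """Hue (H) component, in [0, 1), for each 8-bit RGB pixel."""
    return [[colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
             for (r, g, b) in row] for row in rgb_image]

# A reddish test-line pixel next to a white background pixel:
row = [[(180, 40, 60), (250, 250, 250)]]
h = hue_channel(row)
print(round(h[0][0], 3))  # hue wraps near red, close to 1.0
```

    Working in hue rather than raw RGB makes the test-line colour nearly invariant to illumination intensity, which is why HSV is the natural space for this strip reader.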

  17. Network compensation for missing sensors

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Mulligan, Jeffrey B.

    1991-01-01

    A network learning translation-invariance algorithm to compute interpolation functions is presented. With one fixed receptive field, this algorithm can construct a linear transformation compensating for gain changes, sensor position jitter, and sensor loss when enough sensors remain to adequately sample the input images. However, when the images are undersampled and complete compensation is not possible, the algorithm needs to be modified. For moderate sensor losses, the algorithm works if the transformation weight adjustment is restricted to the weights of output units affected by the loss.

  18. Compressive Sensing Image Sensors-Hardware Implementation

    PubMed Central

    Dadkhah, Mohammadreza; Deen, M. Jamal; Shirani, Shahram

    2013-01-01

    The compressive sensing (CS) paradigm uses simultaneous sensing and compression to provide an efficient image acquisition technique. The main advantages of the CS method include high resolution imaging using low resolution sensor arrays and faster image acquisition. Since the imaging philosophy in CS imagers is different from conventional imaging systems, new physical structures have been developed for cameras that use the CS technique. In this paper, a review of different hardware implementations of CS encoding in optical and electrical domains is presented. Considering the recent advances in CMOS (complementary metal–oxide–semiconductor) technologies and the feasibility of performing on-chip signal processing, important practical issues in the implementation of CS in CMOS sensors are emphasized. In addition, the CS coding for video capture is discussed. PMID:23584123
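
    As one concrete example of CS recovery, orthogonal matching pursuit reconstructs a sparse scene from far fewer measurements than pixels. A small sketch, assuming a random Gaussian measurement matrix and a noiseless 2-sparse signal (all names and sizes are illustrative, and OMP is only one of many CS decoders):

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Orthogonal matching pursuit: recover a sparse x from y = Phi @ x."""
    residual = y.copy()
    support = []
    x = np.zeros(Phi.shape[1])
    for _ in range(sparsity):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Re-fit all selected coefficients by least squares.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(1)
Phi = rng.normal(size=(48, 64)) / np.sqrt(48)  # random measurement matrix
x_true = np.zeros(64)
x_true[[5, 40]] = [3.0, 2.5]                   # a 2-sparse "scene"
y = Phi @ x_true                               # 48 compressive measurements of 64 unknowns
x_hat = omp(Phi, y, sparsity=2)
print(np.allclose(x_hat, x_true, atol=1e-6))
```

    In hardware CS imagers the role of `Phi` is played by the physical modulation (e.g. a DMD pattern or in-pixel mixing), and the decoder runs off-chip.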

  19. Imaging through water turbulence with a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2016-09-01

    A plenoptic sensor can be used to improve the image formation process of a conventional camera. It maps the conventional image to an image array that represents the image's photon paths along different angular directions, and can therefore be used to resolve imaging problems involving severe distortion. In particular, for objects observed at moderate range (10 m to 200 m) through turbulent water, the image can be distorted beyond recognition, and correction algorithms need to be applied. In this paper, we show how to use a plenoptic sensor to recover an unknown object in the line of sight through significant water-turbulence distortion. In general, our approach can be applied to both atmospheric and water turbulence conditions.

  20. Dual light field and polarization imaging using CMOS diffractive image sensors.

    PubMed

    Jayasuriya, Suren; Sivaramakrishnan, Sriram; Chuang, Ellen; Guruaribam, Debashree; Wang, Albert; Molnar, Alyosha

    2015-05-15

    In this Letter we present, to the best of our knowledge, the first integrated CMOS image sensor that can simultaneously perform light field and polarization imaging without the use of external filters or additional optical elements. Previous work has shown how photodetectors with two stacks of integrated metal gratings above them (called angle sensitive pixels) diffract light in a Talbot pattern to capture four-dimensional light fields. We show, in addition to diffractive imaging, that these gratings polarize incoming light and characterize the response of these sensors to polarization and incidence angle. Finally, we show two applications of polarization imaging: imaging stress-induced birefringence and identifying specular reflections in scenes to improve light field algorithms for these scenes.

  1. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    NASA Astrophysics Data System (ADS)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters to reduce dark noise in high-resolution smartphone image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. The large number of photosites gives the image sensor very high sensitivity but also makes it prone to noise effects such as hot pixels. As in earlier research with older smartphone models, no appreciable temperature effects were observed in the overall average pixel values for images taken at ambient temperatures between 5 °C and 25 °C. In this research, hot pixels are defined as pixels with intensities above a specific threshold. The threshold is determined from the distribution of pixel values of a set of images with uniform statistical properties under median filters of increasing size. An image with uniform statistics was employed as a training set drawn from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant across multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the uniformity of the temperature effects masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the image. Hot pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixel counts were also reduced by decreasing image resolution. This research provides a methodology for characterising the dark-noise behaviour of high-resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
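
    The hot-pixel test described (a pixel exceeding its local median by more than the threshold, here 9 DN) can be sketched as follows; the border handling and the toy dark frame are illustrative:

```python
def hot_pixels(img, size=3, threshold=9):
    """Flag pixels whose value exceeds the local median by more than
    `threshold` digital numbers, using a size x size window clipped at borders."""
    h, w, r = len(img), len(img[0]), size // 2
    flagged = []
    for y in range(h):
        for x in range(w):
            win = [img[j][i]
                   for j in range(max(0, y - r), min(h, y + r + 1))
                   for i in range(max(0, x - r), min(w, x + r + 1))]
            win.sort()
            if img[y][x] - win[len(win) // 2] > threshold:
                flagged.append((y, x))
    return flagged

dark = [[2, 3, 2, 2],
        [3, 2, 60, 3],  # one hot pixel at (1, 2)
        [2, 3, 2, 2]]
print(hot_pixels(dark))  # [(1, 2)]
```

    Replacing flagged pixels by the same local median is then the removal step; the paper's finding is that a 7 × 7 window is the optimum size for these sensors.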

  2. Phase aided 3D imaging and modeling: dedicated systems and case studies

    NASA Astrophysics Data System (ADS)

    Yin, Yongkai; He, Dong; Liu, Zeyi; Liu, Xiaoli; Peng, Xiang

    2014-05-01

Dedicated prototype systems for 3D imaging and modeling (3DIM) are presented. These systems, developed in our laboratory over the past few years, are based on the principle of phase-aided active stereo. The reported 3D imaging prototypes range from a single 3D sensor to an optical measurement network composed of multiple 3D-sensor nodes. To support these 3D imaging systems, we briefly discuss the corresponding calibration techniques for both a single sensor and a multi-sensor optical measurement network, which allow good performance of the 3DIM prototype systems in terms of measurement accuracy and repeatability. Furthermore, two case studies, the generation of high-quality color models of movable cultural heritage and a photo booth based on body scanning, are presented to demonstrate our approach.

  3. UTOFIA: an underwater time-of-flight image acquisition system

    NASA Astrophysics Data System (ADS)

    Driewer, Adrian; Abrosimov, Igor; Alexander, Jonathan; Benger, Marc; O'Farrell, Marion; Haugholt, Karl Henrik; Softley, Chris; Thielemann, Jens T.; Thorstensen, Jostein; Yates, Chris

    2017-10-01

In this article the development of a newly designed time-of-flight (ToF) image sensor for underwater applications is described. The sensor is developed as part of the project UTOFIA (underwater time-of-flight image acquisition), funded by the EU within the Horizon 2020 framework. This project aims to develop a camera based on range gating that extends the visible range by a factor of 2 to 3 compared to conventional cameras and delivers real-time range information by means of a 3D video stream. The principle of underwater range gating as well as the concept of the image sensor are presented. Based on measurements on a test image sensor, the pixel structure best suited to the requirements was selected. An extensive underwater characterization demonstrates the capability of distance measurement in turbid environments.
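The range-gating principle behind such a ToF camera reduces to converting the delay between laser pulse and gate opening into a one-way range; a minimal sketch, where the water refractive index and the gate delay are assumed illustrative values, not UTOFIA parameters:

```python
# Speed of light in water and round-trip timing for a range-gated camera.
C_VACUUM = 299_792_458.0   # m/s
N_WATER = 1.33             # assumed refractive index of water
v = C_VACUUM / N_WATER

def gate_delay_to_range(delay_s: float) -> float:
    """One-way range to objects imaged by a gate opened `delay_s` after the pulse."""
    return v * delay_s / 2.0   # the pulse travels out and back

# Example: a gate delay of 44.3 ns selects objects near 5 m.
r = gate_delay_to_range(44.3e-9)
print(round(r, 2))
```

Sweeping the gate delay over successive frames is what yields the per-pixel depth map while rejecting backscatter from the water column in front of the gate.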

  4. Electric potential and electric field imaging

    NASA Astrophysics Data System (ADS)

    Generazio, E. R.

    2017-02-01

The technology and methods for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for "illuminating" volumes to be inspected with EFI. The baseline sensor technology (e-Sensor) and its construction, optional electric field generation (quasi-static generator), and current e-Sensor enhancements (ephemeral e-Sensor) are discussed. Demonstrations for structural, electronic, human, and memory applications are shown. This new EFI capability is demonstrated to characterize electric charge distributions, creating a new field of study embracing areas of interest including electrostatic discharge (ESD) mitigation, crime scene forensics, design and materials selection for advanced sensors, dielectric morphology of structures, tether integrity, organic molecular memory, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  5. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance, and multispectral military systems.
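The dynamic-range figures quoted above follow the usual 20·log10 intensity-ratio definition; a minimal sketch (the intensity ratios are derived from the dB values, not taken from the paper):

```python
import math

def dynamic_range_db(i_max: float, i_min: float) -> float:
    """Optical dynamic range in dB between brightest and dimmest resolvable targets."""
    return 20.0 * math.log10(i_max / i_min)

# An 82.06 dB dynamic range corresponds to an intensity ratio of roughly
# 12,700:1, versus roughly 370:1 for the 51.3 dB CMOS sensor alone.
ratio = 10 ** (82.06 / 20)
print(round(ratio))
```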

  6. Protection performance evaluation regarding imaging sensors hardened against laser dazzling

    NASA Astrophysics Data System (ADS)

    Ritt, Gunnar; Koerber, Michael; Forster, Daniel; Eberle, Bernd

    2015-05-01

Electro-optical imaging sensors are widely distributed and used for many different purposes, including civil security and military operations. However, laser irradiation can easily disturb their operational capability. Thus, an adequate protection mechanism for electro-optical sensors against dazzling and damaging is highly desirable. Several protection technologies exist, but none of them satisfies the operational requirements without constraints. In order to evaluate the performance of various laser protection measures, we present two different approaches, one based on triangle orientation discrimination and the other on structural similarity. For both approaches, image analysis algorithms are applied to images of a standard test scene with triangular test patterns, superimposed with dazzling laser light at various irradiance levels. The evaluation methods are applied to three different sensors: a standard complementary metal-oxide-semiconductor camera, a high dynamic range camera with a nonlinear response curve, and a sensor hardened against laser dazzling.
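Structural similarity, one of the two evaluation approaches named above, can be sketched with a single-window version of the SSIM index; this omits the per-window averaging of the full algorithm, and the dazzle spot below is synthetic:

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, data_range: float = 255.0) -> float:
    """Single-window SSIM (Wang et al. formula) computed over the whole image."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(1)
scene = rng.uniform(0, 255, size=(128, 128))

# Dazzle destroys structure: a saturating Gaussian spot overwrites the scene.
spot = 5000 * np.exp(-((np.indices(scene.shape) - 64) ** 2).sum(0) / 200.0)
dazzled = np.clip(scene + spot, 0, 255)

s_clean = global_ssim(scene, scene)
s_dazzled = global_ssim(scene, dazzled)
print(round(s_clean, 3), round(s_dazzled, 3))
```

An identical image pair scores 1.0; the saturated spot pulls the score down, which is how increasing laser irradiance maps to a falling protection-performance metric.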

  7. A micro-vibration generated method for testing the imaging quality on ground of space remote sensing

    NASA Astrophysics Data System (ADS)

    Gu, Yingying; Wang, Li; Wu, Qingwen

    2018-03-01

In this paper, a novel method is proposed that can simulate satellite platform micro-vibration and test, on the ground, the impact of satellite micro-vibration on the imaging quality of a space optical remote sensor. The method reproduces the in-orbit micro-vibration of the satellite platform in terms of vibrational degrees of freedom, spectrum, magnitude, and coupling path. Experimental results show that the relative error of acceleration control is within 7% at frequencies from 7 Hz to 40 Hz. Using this method, system-level testing of the micro-vibration impact on the imaging quality of a space optical remote sensor can be realized. The method will have important applications in testing the micro-vibration tolerance margin of optical remote sensors, verifying their vibration isolation and suppression performance, and exploring the principles of micro-vibration impact on imaging quality.

  8. A Digital Sensor Simulator of the Pushbroom Offner Hyperspectral Imaging Spectrometer

    PubMed Central

    Tao, Dongxing; Jia, Guorui; Yuan, Yan; Zhao, Huijie

    2014-01-01

Sensor simulators can be used to forecast the imaging quality of a new hyperspectral imaging spectrometer and to generate simulated data for the development and validation of data processing algorithms. This paper presents a novel digital sensor simulator for the pushbroom Offner hyperspectral imaging spectrometer, which is widely used in hyperspectral remote sensing. Based on the imaging process, the sensor simulator consists of a spatial response module, a spectral response module, and a radiometric response module. In order to enhance the simulation accuracy, spatial interpolation-resampling, implemented before the spatial degradation, is developed to balance the direction error against the extra aliasing effect. Instead of using the spectral response function (SRF), the dispersive imaging characteristics of the Offner convex-grating optical system are accurately modeled by its configuration parameters. The non-uniformity characteristics, such as keystone and smile effects, are simulated in the corresponding modules. In this work, the spatial, spectral and radiometric calibration processes are simulated to provide the modulation transfer function (MTF), SRF and radiometric calibration parameters of the sensor simulator. Some uncertainty factors (the stability and bandwidth of the monochromator for the spectral calibration, and the integrating-sphere uncertainty for the radiometric calibration) are considered in the simulation of the calibration process. With the calibration parameters, several experiments were designed to validate the spatial, spectral and radiometric response of the sensor simulator, respectively. The experimental results indicate that the sensor simulator is valid. PMID:25615727

  9. Fusion of spectral and panchromatic images using false color mapping and wavelet integrated approach

    NASA Astrophysics Data System (ADS)

    Zhao, Yongqiang; Pan, Quan; Zhang, Hongcai

    2006-01-01

With the development of sensor technology, new image sensors have been introduced that provide a greater range of information to users. However, owing to limits on received radiant power, there is always a trade-off between spatial and spectral resolution in the image captured by a specific sensor. Images with high spatial resolution can locate objects with high accuracy, whereas images with high spectral resolution can be used to identify materials. Many applications in remote sensing require fusing low-resolution imaging spectral images with panchromatic images to identify materials at high resolution in clutter. A fusion algorithm integrating pixel-based false color mapping with the wavelet transform is presented in this paper; the resulting images have a higher information content than each of the original images and retain sensor-specific image information. The simulation results show that this algorithm can enhance the visibility of certain details and preserve the differences between materials.
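The wavelet side of such a fusion scheme can be sketched with a hand-rolled one-level Haar transform (this is a generic substitution-style fusion, not the authors' algorithm, which also includes false color mapping): the fused image keeps the spectral image's approximation band and takes its detail bands from the panchromatic image.

```python
import numpy as np

def haar2(img):
    """One-level 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2
    d = (img[0::2, :] - img[1::2, :]) / 2
    ll = (a[:, 0::2] + a[:, 1::2]) / 2
    lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2
    hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    a = np.zeros((ll.shape[0], ll.shape[1] * 2))
    d = np.zeros_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.zeros((a.shape[0] * 2, a.shape[1]))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def fuse(spectral, pan):
    """Keep the spectral approximation, inject the panchromatic detail bands."""
    ll_s, *_ = haar2(spectral)
    _, lh_p, hl_p, hh_p = haar2(pan)
    return ihaar2(ll_s, lh_p, hl_p, hh_p)

rng = np.random.default_rng(2)
img = rng.normal(size=(16, 16))
print(np.allclose(fuse(img, img), img))  # self-fusion reconstructs the input
```

Fusing an image with itself is a round-trip through the transform, which is a convenient correctness check before applying the scheme to a real spectral/panchromatic pair.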

  10. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface

    PubMed Central

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-01-01

Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long sought-after goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging using a spatio-temporally dispersive metasurface and an image reconstruction algorithm. The spatio-temporal dispersion of the metasurface makes the single-shot, single-sensor imager feasible for super- and high-resolution imaging, since it efficiently converts the detailed spatial information of the probed object into a one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the achievable resolution depends on the distance between the object and the metasurface: when the object is placed in the vicinity of the metasurface, super-resolution imaging can be realized. The proposed imaging methodology provides a unique means to acquire data in real time and produce high-/super-resolution images without employing expensive hardware (e.g. mechanical scanners or antenna arrays). We expect that this methodology could enable breakthroughs in microwave, terahertz, optical, and even ultrasound imaging. PMID:27246668

  11. Development of CMOS Active Pixel Image Sensors for Low Cost Commercial Applications

    NASA Technical Reports Server (NTRS)

    Gee, R.; Kemeny, S.; Kim, Q.; Mendis, S.; Nakamura, J.; Nixon, R.; Ortiz, M.; Pain, B.; Staller, C.; Zhou, Z; hide

    1994-01-01

    JPL, under sponsorship from the NASA Office of Advanced Concepts and Technology, has been developing a second-generation solid-state image sensor technology. Charge-coupled devices (CCD) are a well-established first generation image sensor technology. For both commercial and NASA applications, CCDs have numerous shortcomings. In response, the active pixel sensor (APS) technology has been under research. The major advantages of APS technology are the ability to integrate on-chip timing, control, signal-processing and analog-to-digital converter functions, reduced sensitivity to radiation effects, low power operation, and random access readout.

  12. Can direct electron detectors outperform phosphor-CCD systems for TEM?

    NASA Astrophysics Data System (ADS)

    Moldovan, G.; Li, X.; Kirkland, A.

    2008-08-01

A new generation of imaging detectors is being considered for application in TEM, but which device architectures can provide the best images? Monte Carlo simulations of the electron-sensor interaction are used here to calculate the expected modulation transfer of monolithic active pixel sensors (MAPS), hybrid active pixel sensors (HAPS) and double-sided silicon strip detectors (DSSD), showing that ideal and nearly ideal transfer can be obtained using DSSD and MAPS sensors. These results strongly support replacing current phosphor-screen and charge-coupled-device imaging systems with such directly exposed, position-sensitive electron detectors.

  13. A Dual Conductance Sensor for Simultaneous Measurement of Void Fraction and Structure Velocity of Downward Two-Phase Flow in a Slightly Inclined Pipe

    PubMed Central

    Lee, Yeon-Gun; Won, Woo-Youn; Lee, Bo-An; Kim, Sin

    2017-01-01

In this study, a new and improved electrical conductance sensor is proposed for application not only to a horizontal pipe but also to an inclined one. The conductance sensor was designed with dual layers, each consisting of a three-electrode set, to obtain two instantaneous conductance signals in turn, so that the area-averaged void fraction and the structure velocity could be measured simultaneously. The optimum configuration of the electrodes was determined through numerical analysis, and calibration curves for stratified and annular flow were obtained through a series of static experiments. The fabricated conductance sensor was applied to a 45 mm inner-diameter U-shaped downward-inclined pipe with an inclination angle of 3° under adiabatic air-water flow conditions. In the tests, the superficial velocities ranged from 0.1 to 3.0 m/s for water and from 0.1 to 18 m/s for air. The mean void fraction and structure velocity obtained from the conductance sensor were validated against measurements by a wire-mesh sensor and by the cross-correlation technique applied to visualized images, respectively. The results of the flow regime classification and the corresponding time series of the void fraction at a variety of flow velocities are also discussed. PMID:28481308
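The structure-velocity measurement rests on cross-correlating the two layers' signals to find the transit delay of interfacial structures; a sketch with synthetic signals, where the sampling rate, layer spacing and lag are assumed values, not the paper's:

```python
import numpy as np

fs = 1000.0       # sampling rate, Hz (assumed)
spacing = 0.02    # axial distance between the two electrode layers, m (assumed)
rng = np.random.default_rng(3)

# The downstream layer sees the same interfacial structures after a delay.
upstream = rng.normal(size=4000)
true_lag = 25     # samples -> 25 ms transit time
downstream = np.roll(upstream, true_lag) + 0.1 * rng.normal(size=4000)

# Cross-correlate and locate the peak lag.
xcorr = np.correlate(downstream - downstream.mean(),
                     upstream - upstream.mean(), mode="full")
lag = int(np.argmax(xcorr)) - (len(upstream) - 1)

structure_velocity = spacing / (lag / fs)   # m/s
print(lag, structure_velocity)
```

With these assumed numbers the peak sits at 25 samples, giving 0.02 m / 0.025 s = 0.8 m/s; in the real sensor the same computation runs on the two conductance time series.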

  14. Microwave Imaging Sensor Using Compact Metamaterial UWB Antenna with a High Correlation Factor.

    PubMed

    Islam, Md Moinul; Islam, Mohammad Tariqul; Faruque, Mohammad Rashed Iqbal; Samsuzzaman, Md; Misran, Norbahiah; Arshad, Haslina

    2015-07-23

The design of a compact metamaterial ultra-wideband (UWB) antenna is proposed, targeting microwave imaging systems that detect unwanted cells in human tissue, as in breast cancer, heart failure and brain stroke detection. The proposed UWB antenna is made of four metamaterial unit cells, where each cell is an integration of a modified split-ring resonator (SRR), a capacitive loaded strip (CLS) and a wire, to attain a design layout that simultaneously exhibits both a negative magnetic permeability and a negative electrical permittivity. This design results in a negative refractive index that enables amplification of the radiated power of the reported antenna and, therefore, high antenna performance. A low-cost FR4 substrate is used to design and print the antenna, with the following characteristics: thickness of 1.6 mm, relative permeability of one, relative permittivity of 4.60 and loss tangent of 0.02. The overall antenna size is 19.36 mm × 27.72 mm × 1.6 mm, and the electrical dimensions are 0.20 λ × 0.28 λ × 0.016 λ at the 3.05 GHz lower frequency band. Voltage standing wave ratio (VSWR) measurements show that the antenna exhibits an impedance bandwidth from 3.05 GHz to more than 15 GHz for VSWR < 2, with an average gain of 4.38 dBi throughout the operating band. The simulations (both HFSS and computer simulation technology (CST)) and the measurements are in close agreement. A high correlation factor and the capability of detecting tumour simulants confirm that this reported UWB antenna can be used as an imaging sensor.

  15. Microwave Imaging Sensor Using Compact Metamaterial UWB Antenna with a High Correlation Factor

    PubMed Central

    Islam, Md. Moinul; Islam, Mohammad Tariqul; Faruque, Mohammad Rashed Iqbal; Samsuzzaman, Md.; Misran, Norbahiah; Arshad, Haslina

    2015-01-01

The design of a compact metamaterial ultra-wideband (UWB) antenna is proposed, targeting microwave imaging systems that detect unwanted cells in human tissue, as in breast cancer, heart failure and brain stroke detection. The proposed UWB antenna is made of four metamaterial unit cells, where each cell is an integration of a modified split-ring resonator (SRR), a capacitive loaded strip (CLS) and a wire, to attain a design layout that simultaneously exhibits both a negative magnetic permeability and a negative electrical permittivity. This design results in a negative refractive index that enables amplification of the radiated power of the reported antenna and, therefore, high antenna performance. A low-cost FR4 substrate is used to design and print the antenna, with the following characteristics: thickness of 1.6 mm, relative permeability of one, relative permittivity of 4.60 and loss tangent of 0.02. The overall antenna size is 19.36 mm × 27.72 mm × 1.6 mm, and the electrical dimensions are 0.20 λ × 0.28 λ × 0.016 λ at the 3.05 GHz lower frequency band. Voltage standing wave ratio (VSWR) measurements show that the antenna exhibits an impedance bandwidth from 3.05 GHz to more than 15 GHz for VSWR < 2, with an average gain of 4.38 dBi throughout the operating band. The simulations (both HFSS and computer simulation technology (CST)) and the measurements are in close agreement. A high correlation factor and the capability of detecting tumour simulants confirm that this reported UWB antenna can be used as an imaging sensor. PMID:28793461

  16. Image restoration using aberration taken by a Hartmann wavefront sensor on extended object, towards real-time deconvolution

    NASA Astrophysics Data System (ADS)

    Darudi, Ahmad; Bakhshi, Hadi; Asgari, Reza

    2015-05-01

In this paper we present the results of image restoration using data from a Hartmann sensor. The aberration is measured by a Hartmann sensor in which the object itself is used as the reference. The point spread function (PSF) is then simulated and used for image reconstruction with the Lucy-Richardson technique. A method for quantitatively evaluating the Lucy-Richardson deconvolution is also presented.

  17. Real-time digital signal processing for live electro-optic imaging.

    PubMed

    Sasagawa, Kiyotaka; Kanno, Atsushi; Tsuchiya, Masahiro

    2009-08-31

    We present an imaging system that enables real-time magnitude and phase detection of modulated signals and its application to a Live Electro-optic Imaging (LEI) system, which realizes instantaneous visualization of RF electric fields. The real-time acquisition of magnitude and phase images of a modulated optical signal at 5 kHz is demonstrated by imaging with a Si-based high-speed CMOS image sensor and real-time signal processing with a digital signal processor. In the LEI system, RF electric fields are probed with light via an electro-optic crystal plate and downconverted to an intermediate frequency by parallel optical heterodyning, which can be detected with the image sensor. The artifacts caused by the optics and the image sensor characteristics are corrected by image processing. As examples, we demonstrate real-time visualization of electric fields from RF circuits.
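Per-pixel magnitude and phase detection of a modulated signal, as performed here at the intermediate frequency, can be sketched with a four-sample (0°/90°/180°/270°) demodulation; the signal parameters below are invented:

```python
import math

def four_bucket(s0, s90, s180, s270):
    """Magnitude and phase of A*cos(phi + theta) sampled at theta = 0/90/180/270 deg."""
    i = s0 - s180    # 2*A*cos(phi); any DC offset cancels in the difference
    q = s270 - s90   # 2*A*sin(phi)
    return 0.5 * math.hypot(i, q), math.atan2(q, i)

# Synthetic modulated pixel: amplitude 1.5, phase 0.6 rad, DC offset 10.
A, phi, off = 1.5, 0.6, 10.0
samples = [A * math.cos(phi + math.radians(t)) + off for t in (0, 90, 180, 270)]
mag, phase = four_bucket(*samples)
print(round(mag, 6), round(phase, 6))
```

Running this arithmetic on every pixel of each frame quadruple is the kind of per-pixel workload that the digital signal processor in such a system handles in real time.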

  18. Methods and apparatuses for detection of radiation with semiconductor image sensors

    DOEpatents

    Cogliati, Joshua Joseph

    2018-04-10

    A semiconductor image sensor is repeatedly exposed to high-energy photons while a visible light obstructer is in place to block visible light from impinging on the sensor to generate a set of images from the exposures. A composite image is generated from the set of images with common noise substantially removed so the composite image includes image information corresponding to radiated pixels that absorbed at least some energy from the high-energy photons. The composite image is processed to determine a set of bright points in the composite image, each bright point being above a first threshold. The set of bright points is processed to identify lines with two or more bright points that include pixels therebetween that are above a second threshold and identify a presence of the high-energy particles responsive to a number of lines.
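The composite-image idea, removing noise common to all exposures so that only radiation hits survive, can be sketched as follows; the frame count, noise levels and threshold are illustrative, not the patent's values:

```python
import numpy as np

rng = np.random.default_rng(4)

# A stack of dark exposures: fixed-pattern noise common to all frames,
# plus a radiation hit that appears in a single frame only.
fixed_pattern = rng.normal(10, 2, size=(32, 32))
frames = [fixed_pattern + rng.normal(0, 1, size=(32, 32)) for _ in range(8)]
frames[3][12, 20] += 80.0   # simulated high-energy photon hit in one exposure

# Composite with common noise removed: per-frame deviation from the stack median.
stack = np.stack(frames)
composite = (stack - np.median(stack, axis=0)).max(axis=0)

first_threshold = 20.0      # illustrative "first threshold" for bright points
bright = np.argwhere(composite > first_threshold)
print(bright.tolist())
```

Only the irradiated pixel survives the median subtraction; the patent then goes further, linking bright points into lines to identify particle tracks.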

  19. Design of a multi-axis implantable MEMS sensor for intraosseous bone stress monitoring

    NASA Astrophysics Data System (ADS)

    Alfaro, Fernando; Weiss, Lee; Campbell, Phil; Miller, Mark; Fedder, Gary K.

    2009-08-01

The capability to assess the biomechanical properties of living bone is important for basic research as well as for the clinical management of skeletal trauma and disease. Even though radiodensitometric imaging is commonly used to infer bone quality, bone strength does not necessarily correlate well with these non-invasive measurements. This paper reports on the design, fabrication and initial testing of an implantable ultra-miniature multi-axis sensor for directly measuring bone stresses at a micro-scale. The device, which is fabricated with CMOS-MEMS processes, is intended to be permanently implanted within open fractures, embedded in bone grafts, or placed on implants at the interfaces between bone and prosthetics. The stress sensor comprises an array of piezoresistive pixels to detect a stress tensor at the interfacial area between the MEMS chip and bone, with a resolution of 100 Pa at 1 s averaging. The sensor system design and manufacture are also compatible with the integration of wireless RF telemetry, for power and data retrieval, all within a 3 mm × 3 mm × 0.3 mm footprint. The piezoresistive elements are integrated within a textured surface to enhance sensor integration with bone. Finite element analysis led to a sensor design for normal and shear stress detection. A wired sensor was fabricated in the Jazz 0.35 µm BiCMOS process and then embedded in mock bone material to characterize its response to tensile and bending loads up to 250 kPa.

  20. Seismic Structure of Perth Basin (Australia) and surroundings from Passive Seismic Deployments

    NASA Astrophysics Data System (ADS)

    Issa, N.; Saygin, E.; Lumley, D. E.; Hoskin, T. E.

    2016-12-01

We image the subsurface structure of the Perth Basin, Western Australia and surroundings by using ambient seismic noise data from 14 seismic stations recently deployed by the University of Western Australia (UWA) and other available permanent stations from the Geoscience Australia seismic network and the Australian Seismometers in Schools program. Each of the 14 UWA seismic stations comprises a broadband sensor and a high-fidelity 3-component 10 Hz geophone, recording in tandem at 250 Hz and 1000 Hz. The other stations used in this study are equipped with short-period and broadband sensors. In addition, one shallow borehole station is operated with eight 3-component geophones at depths between 2 and 44 m. The network is deployed to characterize natural seismicity in the basin and to identify any microseismic activity across the Darling Fault Zone (DFZ), which bounds the basin to the east. The DFZ stretches approximately 1000 km north-south in Western Australia and is one of the longest fault zones on Earth, yet has a limited number of detected earthquakes. We use seismic noise cross- and auto-correlation methods to map seismic velocity perturbations across the basin and the transition from the DFZ to the basin. Retrieved Green's functions are stable and show clearly dispersed waveforms. Travel times of the surface-wave Green's functions from noise cross-correlations are inverted within a two-step probabilistic framework to map the absolute shear-wave velocities as a function of depth. The single-station auto-correlations of the seismic noise yield the P-wave reflectivity under each station, marking the major discontinuities. The resulting images show the shear-velocity perturbations across the region. We also quantify the variation of ambient seismic noise at different depths in the near surface using the geophones in the shallow borehole array.
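The noise cross-correlation step, retrieving the inter-station travel time from long records of diffuse noise, can be sketched with synthetic data (the delay and noise levels below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
n, delay = 4000, 40   # record length in samples; assumed inter-station travel time

# Both stations record the same diffuse noise field; the far station sees it
# `delay` samples later, plus incoherent local noise.
near = rng.normal(size=n)
far = np.roll(near, delay) + 0.5 * rng.normal(size=n)

# The peak of the cross-correlation sits at the travel time: an empirical
# Green's function emerges from nothing but noise records.
xcorr = np.correlate(far, near, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)
print(lag)
```

Stacking such correlations over months of data, then inverting the surface-wave travel times between many station pairs, is what yields the tomographic velocity maps described above.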

  1. Forensic use of photo response non-uniformity of imaging sensors and a counter method.

    PubMed

    Dirik, Ahmet Emir; Karaküçük, Ahmet

    2014-01-13

Analogous to the use of bullet scratches in forensic science, the authenticity of a digital image can be verified through the noise characteristics of its imaging sensor. In particular, photo-response non-uniformity (PRNU) noise has been used for source camera identification (SCI). However, this technique can also be used maliciously to track or inculpate innocent people. To impede such tracking, PRNU noise should be suppressed significantly. Based on this motivation, we propose a counter-forensic method to deceive SCI. Experimental results show that it is possible to impede PRNU-based camera identification for various imaging sensors while preserving image quality.
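The PRNU-based SCI pipeline this paper attacks can be sketched as: estimate a multiplicative fingerprint by averaging the denoising residuals of many flat-field images, then correlate a query image's residual against it. The sketch uses a simple local-mean denoiser in place of the wavelet denoisers used in practice, and all noise levels are invented:

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(6)

# Multiplicative sensor fingerprint: each pixel's sensitivity deviates slightly.
prnu = 1.0 + 0.02 * rng.normal(size=(64, 64))

def shoot(scene):
    """Image formation: scene modulated by the PRNU pattern plus random noise."""
    return scene * prnu + rng.normal(0, 1.0, size=scene.shape)

def residual(img):
    """Noise residual: image minus a denoised (local-mean) version of itself."""
    return img - uniform_filter(img, size=5)

# Fingerprint estimate: average the residuals of many flat-field exposures.
flats = [shoot(np.full((64, 64), 100.0)) for _ in range(50)]
fingerprint = np.mean([residual(f) for f in flats], axis=0)

def ncc(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

query = shoot(np.full((64, 64), 120.0))                              # same camera
other = np.full((64, 64), 120.0) + rng.normal(0, 1.0, size=(64, 64))  # different camera
r_same = ncc(fingerprint, residual(query))
r_other = ncc(fingerprint, residual(other))
print(round(r_same, 3), round(r_other, 3))
```

The same-camera correlation is high while the foreign image correlates near zero; the paper's counter-forensic method aims to suppress exactly this same-camera peak.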

  2. MECS-VINE®: A New Proximal Sensor for Segmented Mapping of Vigor and Yield Parameters on Vineyard Rows

    PubMed Central

    Gatti, Matteo; Dosso, Paolo; Maurino, Marco; Merli, Maria Clara; Bernizzoni, Fabio; José Pirez, Facundo; Platè, Bonfiglio; Bertuzzi, Gian Carlo; Poni, Stefano

    2016-01-01

Ground-based proximal sensing of vineyard features is gaining interest due to its ability to serve even quite small plots, with the advantage of being conducted concurrently with normal vineyard practices (i.e., spraying, pruning or soil tilling) and with no dependence upon weather conditions, external services or law-imposed limitations. The purpose of the present work was to test the performance of the new terrestrial multi-sensor MECS-VINE® in terms of reliability and degree of correlation with several canopy growth and yield parameters in the grapevine. MECS-VINE®, conveniently positioned in front of the tractor, can provide simultaneous assessment of the growth features and microclimate of specific canopy sections on the two adjacent row sides. MECS-VINE® integrates a series of microclimate sensors (air relative humidity, air and surface temperature) with two (left and right) matrix-based optical RGB imaging sensors and a related processing algorithm. MECS-VINE® was run five times along the season in a mature cv. Barbera vineyard, and a Canopy Index (CI, a pure number varying from 0 to 1000), calculated through its built-in algorithm, was validated against canopy structure parameters (i.e., leaf layer number, fractions of canopy gaps and interior leaves) derived from point quadrat analysis. Results showed that CI was highly correlated with every canopy parameter at every date, although the closest relationships were found for CI vs. the fraction of canopy gaps (R2 = 0.97) and leaf layer number (R2 = 0.97) for data pooled over 24 test vines. While correlations against canopy light interception and total lateral leaf area were still unsatisfactory, a good correlation was found vs. cluster and berry weight (R2 = 0.76 and 0.71, respectively), suggesting good potential also for yield estimates.
Besides the satisfactory calibration provided, the main improvements of MECS-VINE® over other current equipment are: (i) MECS-VINE® delivers a segmented evaluation of the canopy in up to 15 different sectors, making it possible to differentiate canopy structure and density at specific, crucial canopy segments (i.e., the basal part where clusters are located) and (ii) the sensor is optimized to work at any time of day in any weather condition, without the need for a supplemental lighting system. PMID:27898049

  3. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, they use the same manufacturing platform as most microprocessors and memory chips, and they allow on-chip programming of frame size, exposure, and other parameters.

  4. Ultrasonic imaging of seismic physical models using a fringe visibility enhanced fiber-optic Fabry-Perot interferometric sensor.

    PubMed

    Zhang, Wenlu; Chen, Fengyi; Ma, Wenwen; Rong, Qiangzhou; Qiao, Xueguang; Wang, Ruohui

    2018-04-16

A fringe-visibility-enhanced fiber-optic Fabry-Perot interferometric ultrasonic sensor is proposed and experimentally demonstrated for seismic physical model imaging. The sensor consists of a graded-index multimode fiber collimator and a PTFE (polytetrafluoroethylene) diaphragm forming a Fabry-Perot interferometer. Owing to the increased spectral sideband slope of the sensor and the small Young's modulus of the PTFE diaphragm, a high response to both continuous and pulsed ultrasound, with a high SNR of 42.92 dB at 300 kHz, is achieved when the spectral sideband filter technique is used to interrogate the sensor. The reconstructed ultrasonic images clearly differentiate the shapes of the models with high resolution.

  5. A Negative Index Metamaterial-Inspired UWB Antenna with an Integration of Complementary SRR and CLS Unit Cells for Microwave Imaging Sensor Applications

    PubMed Central

    Islam, Mohammad Tariqul; Islam, Md. Moinul; Samsuzzaman, Md.; Faruque, Mohammad Rashed Iqbal; Misran, Norbahiah

    2015-01-01

This paper presents a negative index metamaterial incorporated UWB antenna with an integration of complementary SRR (split-ring resonator) and CLS (capacitive loaded strip) unit cells for microwave imaging sensor applications. This metamaterial UWB antenna sensor consists of four unit cells along one axis, where each unit cell incorporates a complementary SRR and CLS pair. This integration enables a design layout that allows simultaneously negative values of permittivity and permeability, resulting in a durable negative index that enhances the antenna sensor performance for microwave imaging applications. The proposed MTM antenna sensor was designed and fabricated on an FR4 substrate having a thickness of 1.6 mm and a dielectric constant of 4.6. The electrical dimensions of this antenna sensor are 0.20 λ × 0.29 λ at a lower frequency of 3.1 GHz. This antenna sensor achieves a 131.5% bandwidth (VSWR < 2), covering frequency bands from 3.1 GHz to more than 15 GHz with a maximum gain of 6.57 dBi. A high fidelity factor and gain, smooth surface-current distribution and nearly omni-directional radiation patterns with low cross-polarization confirm that the proposed negative index UWB antenna is a promising entrant in the field of microwave imaging sensors. PMID:26007721

  6. A Negative Index Metamaterial-Inspired UWB Antenna with an Integration of Complementary SRR and CLS Unit Cells for Microwave Imaging Sensor Applications.

    PubMed

    Islam, Mohammad Tariqul; Islam, Md Moinul; Samsuzzaman, Md; Faruque, Mohammad Rashed Iqbal; Misran, Norbahiah

    2015-05-20

    This paper presents a negative index metamaterial incorporated UWB antenna with an integration of complementary SRR (split-ring resonator) and CLS (capacitive loaded strip) unit cells for microwave imaging sensor applications. This metamaterial UWB antenna sensor consists of four unit cells along one axis, where each unit cell incorporates a complementary SRR and CLS pair. This integration enables a design layout that allows both a negative value of permittivity and a negative value of permeability simultaneously, resulting in a durable negative index that enhances the antenna sensor performance for microwave imaging sensor applications. The proposed MTM antenna sensor was designed and fabricated on an FR4 substrate having a thickness of 1.6 mm and a dielectric constant of 4.6. The electrical dimensions of this antenna sensor are 0.20 λ × 0.29 λ at the lower frequency of 3.1 GHz. This antenna sensor achieves a 131.5% bandwidth (VSWR < 2) covering the frequency bands from 3.1 GHz to more than 15 GHz with a maximum gain of 6.57 dBi. High fidelity factor and gain, smooth surface-current distribution and nearly omni-directional radiation patterns with low cross-polarization confirm that the proposed negative index UWB antenna is a promising entrant in the field of microwave imaging sensors.

  7. A digital sedimentator for measuring erythrocyte sedimentation rate using a linear image sensor

    NASA Astrophysics Data System (ADS)

    Yoshikoshi, Akio; Sakanishi, Akio; Toyama, Yoshiharu

    2004-11-01

    A digital apparatus was fabricated to accurately determine the erythrocyte sedimentation rate (ESR) using a linear image sensor. Currently, ESR is utilized for clinical diagnosis, and in the laboratory as one of the many rheological properties of blood measured through the settling of red blood cells (RBCs). In this work, we aimed to measure ESR automatically using a small sample volume and without moving parts. The linear image sensor was placed behind a microhematocrit tube containing 36 μl of RBC suspension on a holder plate; the holder plate was fixed on an optical bench together with a tungsten lamp and an opal glass diffuser placed in front. RBC suspensions were prepared in autologous plasma with hematocrit H from 25% to 44%. The intensity profiles of transmitted light in 36 μl of RBC suspension were detected using the linear image sensor and sent to a personal computer every minute. ESR was observed at the settling interface between the plasma and RBC suspension in a 1024-pixel profile (25 μm/pixel) along a microhematocrit tube of 25.6 mm total length for 1 h at a temperature of 37.0±0.1 °C. First, we determined the initial pixel position of the sample at the boundary with air. The boundary and the interface were defined by inflection points in the profile with 25 μm resolution. We obtained sedimentation curves determined by the RBC settling distance l(t) at time t, taken as the difference between the pixel locations of the boundary and the interface. The sedimentation curves were well fitted by an empirical equation [Puccini et al., Biorheol. 14, 43 (1977)] from which we calculated the maximum sedimentation velocity smax at the time tmax. We reached tmax within 30 min at any H, and smax was linearly related to the settling distance l(60) at 60 min after the start of sedimentation from 30% to 44% H, with a correlation coefficient r=0.993. Thus, we may estimate the conventional 1-h ESR from smax more quickly and accurately with less effort.
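
    The sedimentation-curve analysis in this record can be illustrated with a minimal sketch: estimating the maximum settling velocity smax and its time tmax numerically from a sampled settling curve l(t). The central-difference estimator and the sigmoid test curve below are illustrative assumptions, not the empirical equation fitted in the paper:

```python
import math

def max_sedimentation_velocity(times, distances):
    """Numerically estimate the maximum settling velocity s_max and the
    time t_max at which it occurs from a sampled sedimentation curve
    l(t), using central finite differences (a simple stand-in for
    fitting the paper's empirical equation)."""
    best_v, best_t = 0.0, times[0]
    for i in range(1, len(times) - 1):
        v = (distances[i + 1] - distances[i - 1]) / (times[i + 1] - times[i - 1])
        if v > best_v:
            best_v, best_t = v, times[i]
    return best_v, best_t

# Hypothetical sigmoid-shaped settling curve, sampled once per minute.
times = list(range(0, 61))
dist = [10.0 / (1.0 + math.exp(-(t - 20) / 5.0)) for t in times]
s_max, t_max = max_sedimentation_velocity(times, dist)
print(t_max)  # → 20 (steepest settling at the inflection point)
```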

  8. Automatic integration of data from dissimilar sensors

    NASA Astrophysics Data System (ADS)

    Citrin, W. I.; Proue, R. W.; Thomas, J. W.

    The present investigation is concerned with the automatic integration of radar and electronic support measures (ESM) sensor data, and with the development of a method for the automatic integration of identification friend or foe (IFF) and radar sensor data. On the basis of the two projects considered, significant advances have been made in the area of sensor data integration. It is pointed out that the log-likelihood approach to sensor data correlation is appropriate for both similar and dissimilar sensor data. Attention is given to the real-time integration of radar and ESM sensor data, and to a radar/ESM correlation simulation program.
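
    The log-likelihood approach to correlating reports from dissimilar sensors can be sketched as follows; the Gaussian measurement-error model, the names, and the numbers are illustrative assumptions, not details from the record:

```python
import math

def pairing_log_likelihood(z_radar, z_esm, sigma):
    """Log likelihood that a radar report and an ESM report originate
    from the same emitter, under an assumed independent Gaussian
    measurement-error model with per-axis standard deviation sigma."""
    d2 = sum((a - b) ** 2 for a, b in zip(z_radar, z_esm))
    k = len(z_radar)
    return -0.5 * d2 / sigma**2 - k * math.log(sigma * math.sqrt(2 * math.pi))

# Hypothetical reports: one radar track and two candidate ESM reports.
radar = (10.0, 5.0)
esm_reports = [(10.2, 5.1), (30.0, -4.0)]
scores = [pairing_log_likelihood(radar, z, sigma=1.0) for z in esm_reports]
print(scores.index(max(scores)))  # → 0 (the nearby report correlates best)
```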

  9. WE-AB-BRA-11: Improved Imaging of Permanent Prostate Brachytherapy Seed Implants by Combining an Endorectal X-Ray Sensor with a CT Scanner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, J; Matthews, K; Jia, G

    Purpose: To test feasibility of the use of a digital endorectal x-ray sensor for improved image resolution of permanent brachytherapy seed implants compared to conventional CT. Methods: Two phantoms simulating the male pelvic region were used to test the capabilities of a digital endorectal x-ray sensor for imaging permanent brachytherapy seed implants. Phantom 1 was constructed from acrylic plastic with cavities milled in the locations of the prostate and the rectum. The prostate cavity was filled with a Styrofoam plug implanted with 10 training seeds. Phantom 2 was constructed from tissue-equivalent gelatins and contained a prostate phantom implanted with 18 strands of training seeds. For both phantoms, an intraoral digital dental x-ray sensor was placed in the rectum within 2 cm of the seed implants. Scout scans were taken of the phantoms over a limited arc angle using a CT scanner (80 kV, 120–200 mA). The dental sensor was removed from the phantoms, and normal helical CT and scout (0 degree) scans using typical parameters for pelvic CT (120 kV, auto-mA) were collected. A shift-and-add tomosynthesis algorithm was developed to localize the seed planes normal to the detector face. Results: The endorectal sensor produced images with improved resolution compared to CT scans. Seed clusters and individual seed geometry were more discernible using the endorectal sensor. Seed 3D locations, including seeds that were not located in every projection image, were discernible using the shift-and-add algorithm. Conclusion: This work shows that digital endorectal x-ray sensors are a feasible method for improving imaging of permanent brachytherapy seed implants. Future work will consist of optimizing the tomosynthesis technique to produce higher resolution, lower dose images of 1) permanent brachytherapy seed implants for post-implant dosimetry and 2) fine anatomic details for imaging and managing prostatic disease compared to CT images.
    Funding: LSU Faculty Start-up Funding. Disclosure: XDR Radiography has loaned our research group the digital x-ray detector used in this work. CoI: None.
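
    Shift-and-add tomosynthesis, as used in this record, can be sketched in one dimension: each projection is shifted by the displacement its source position induces at the chosen depth, then the projections are averaged, so features in that plane add coherently. The geometry and shift values below are hypothetical:

```python
def shift_and_add(projections, shifts):
    """Reconstruct one tomosynthesis plane by shifting each projection
    by its depth-dependent displacement, then averaging. In-plane
    features reinforce; out-of-plane features blur across the sum."""
    n = len(projections[0])
    plane = [0.0] * n
    for proj, s in zip(projections, shifts):
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                plane[i] += proj[j]
    return [v / len(projections) for v in plane]

# Three 1-D projections of a single 'seed': its image moves one pixel
# per view because of the source motion (hypothetical geometry).
p0 = [0, 0, 1, 0, 0, 0]
p1 = [0, 0, 0, 1, 0, 0]
p2 = [0, 0, 0, 0, 1, 0]
plane = shift_and_add([p0, p1, p2], shifts=[0, 1, 2])
print(plane.index(max(plane)))  # → 2 (the seed is focused at pixel 2)
```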

  10. Perceptual approaches to finding features in data

    NASA Astrophysics Data System (ADS)

    Rogowitz, Bernice E.

    2013-03-01

    Electronic imaging applications hinge on the ability to discover features in data. For example, doctors examine diagnostic images for tumors, broken bones and changes in metabolic activity. Financial analysts explore visualizations of market data to find correlations, outliers and interaction effects. Seismologists look for signatures in geological data to tell them where to drill or where an earthquake may begin. These data are very diverse, including images, numbers, graphs, 3-D graphics, and text, and are growing exponentially, largely through the rise in automatic data collection technologies such as sensors and digital imaging. This paper explores important trends in the art and science of finding features in data, such as the tension between bottom-up and top-down processing, the semantics of features, and the integration of human- and algorithm-based approaches. This story is told from the perspective of the IS&T/SPIE Conference on Human Vision and Electronic Imaging (HVEI), which has fostered research at the intersection between human perception and the evolution of new technologies.

  11. Characterization of modulated time-of-flight range image sensors

    NASA Astrophysics Data System (ADS)

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.

    2009-01-01

    A number of full field image sensors have been developed that are capable of simultaneously measuring intensity and distance (range) for every pixel in a given scene using an indirect time-of-flight measurement technique. A light source is intensity modulated at a frequency between 10 and 100 MHz, and an image sensor is modulated at the same frequency, synchronously sampling light reflected from objects in the scene (homodyne detection). The time of flight is manifested as a phase shift in the illumination modulation envelope, which can be determined from the sampled data simultaneously for each pixel in the scene. This paper presents a method of characterizing the high frequency modulation response of these image sensors, using a picosecond laser pulser. The characterization results allow the optimal operating parameters, such as the modulation frequency, to be identified in order to maximize the range measurement precision for a given sensor. A number of potential sources of error exist when using these sensors, including deficiencies in the modulation waveform shape, duty cycle, or phase, resulting in contamination of the resultant range data. From the characterization data these parameters can be identified and compensated for by modifying the sensor hardware or through post processing of the acquired range measurements.
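
    The homodyne phase-shift measurement described above is commonly reduced to a four-sample estimator over one modulation period. A minimal sketch, assuming ideal sinusoidal correlation samples at 0°, 90°, 180° and 270° offsets (all names and values are hypothetical):

```python
import math

C = 299792458.0  # speed of light, m/s

def tof_range(a0, a1, a2, a3, f_mod):
    """Recover range from four samples of the correlation waveform taken
    at 0, 90, 180 and 270 degree phase offsets: the phase shift of the
    modulation envelope encodes the round-trip time of flight."""
    phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# Simulate an object at 5 m with 20 MHz modulation (amplitude and
# background offset of the correlation samples are arbitrary).
f = 20e6
true_range = 5.0
phi = 4 * math.pi * f * true_range / C
samples = [1.0 + 0.5 * math.cos(phi - k * math.pi / 2) for k in range(4)]
print(round(tof_range(*samples, f), 3))  # → 5.0
```

    Note that the recovered range is only unambiguous up to C/(2·f_mod), about 7.5 m at 20 MHz, which is one reason modulation frequency is a key operating parameter.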

  12. Fusing MODIS with Landsat 8 data to downscale weekly normalized difference vegetation index estimates for central Great Basin rangelands, USA

    USGS Publications Warehouse

    Boyte, Stephen; Wylie, Bruce K.; Rigge, Matthew B.; Dahal, Devendra

    2018-01-01

    Data fused from distinct but complementary satellite sensors mitigate tradeoffs that researchers make when selecting between spatial and temporal resolutions of remotely sensed data. We integrated data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite and the Operational Land Imager sensor aboard the Landsat 8 satellite into four regression-tree models and applied those data to a mapping application. This application produced downscaled maps that utilize the 30-m spatial resolution of Landsat in conjunction with daily acquisitions of MODIS normalized difference vegetation index (NDVI) that are composited and temporally smoothed. We produced four weekly, atmospherically corrected, and nearly cloud-free, downscaled 30-m synthetic MODIS NDVI predictions (maps) built from these models. Model results were strong, with R2 values ranging from 0.74 to 0.85. The correlation coefficients (r ≥ 0.89) were strong for all predictions when compared to corresponding original MODIS NDVI data. Downscaled products incorporated into independently developed sagebrush ecosystem models yielded mixed results. The visual quality of the downscaled 30-m synthetic MODIS NDVI predictions was remarkable when compared to the original 250-m MODIS NDVI. These 30-m maps improve knowledge of dynamic rangeland seasonal processes in the central Great Basin, United States, and provide land managers improved resource maps.

  13. An agreement coefficient for image comparison

    USGS Publications Warehouse

    Ji, Lei; Gallo, Kevin

    2006-01-01

    Combining datasets acquired from different sensor systems is necessary to construct a long time-series dataset for remotely sensed land-surface variables. Assessing the agreement of data derived from various sources is an important issue in understanding data continuity through the time-series. Some traditional measures, including the correlation coefficient, coefficient of determination, mean absolute error, and root mean square error, are not always optimal for evaluating data agreement. For this reason, we developed a new agreement coefficient for comparing two different images. The agreement coefficient has the following properties: it is non-dimensional, bounded, symmetric, and able to distinguish between systematic and unsystematic differences. The paper provides examples of agreement analyses for hypothetical data and actual remotely sensed data. The results demonstrate that the agreement coefficient does have these properties, and is therefore a useful tool for image comparison.
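
    A sketch of one published formulation of such an agreement coefficient, AC = 1 − SSD/SPOD, where SSD is the sum of squared differences and SPOD a "sum of potential differences" built from deviations about each image's mean; if this differs from the authors' exact definition, treat it as illustrative only:

```python
def agreement_coefficient(x, y):
    """Agreement coefficient between two equal-length pixel sequences:
    1 minus the sum of squared differences over a potential-difference
    normalizer. Non-dimensional, bounded above by 1, symmetric in x, y."""
    mx = sum(x) / len(x)
    my = sum(y) / len(y)
    ssd = sum((a - b) ** 2 for a, b in zip(x, y))
    spod = sum((abs(mx - my) + abs(a - mx)) * (abs(mx - my) + abs(b - my))
               for a, b in zip(x, y))
    return 1.0 - ssd / spod

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 3.0, 4.0, 5.0]  # x with a systematic offset of 1
print(agreement_coefficient(x, x))  # → 1.0 (perfect agreement)
print(agreement_coefficient(x, y) < 1.0)  # → True
```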

  14. First correlated measurements of the shape and scattering properties of cloud particles using the new Particle Habit Imaging and Polar Scattering (PHIPS) probe

    NASA Astrophysics Data System (ADS)

    Abdelmonem, A.; Schnaiter, M.; Amsler, P.; Hesse, E.; Meyer, J.; Leisner, T.

    2011-05-01

    Studying the radiative impact of cirrus clouds requires knowledge of the link between their microphysics and the single scattering properties of the cloud particles. Usually, this link is created by modeling the optical scattering properties from in situ measurements of ice crystal size distributions. The measured size distribution and the assumed particle shape might be erroneous in the case of non-spherical ice particles. We present here a novel optical sensor (the Particle Habit Imaging and Polar Scattering probe, PHIPS) designed to simultaneously measure the 3-D morphology and the corresponding optical and microphysical parameters of individual cloud particles. Clouds containing particles ranging in size from a few micrometers to about 800 μm diameter can be systematically characterized with an optical resolution power of 2 μm and a polar scattering resolution of 1° for forward scattering directions (from 1° to 10°) and 8° for side and backscattering directions (from 18° to 170°). The maximum acquisition rates for scattering phase functions and images are 262 kHz and 10 Hz, respectively. Some preliminary results collected in two ice cloud campaigns conducted in the AIDA cloud simulation chamber are presented. PHIPS operated reliably and produced size distributions and images comparable to those given by other certified cloud particle instruments. A 3-D model of a hexagonal ice plate is constructed and the corresponding scattering phase function is compared to that modeled using the Ray Tracing with Diffraction on Facets (RTDF) program. PHIPS is a candidate airborne optical sensor for studying the radiative impact of cirrus clouds and for correlating particle habit with scattering properties, which will serve as a reference for other single- or multi-instrument measurements.

  15. A CMOS high speed imaging system design based on FPGA

    NASA Astrophysics Data System (ADS)

    Tang, Hong; Wang, Huawei; Cao, Jianzhong; Qiao, Mingrui

    2015-10-01

    CMOS sensors have more advantages than traditional CCD sensors, and imaging systems based on CMOS have become a hot spot in research and development. In order to achieve real-time data acquisition and high-speed transmission, we design a high-speed CMOS imaging system based on an FPGA. The core control chip of the system is the XC6SL75T, and we take advantage of a CameraLink interface and the AM41V4 CMOS image sensor to transmit and acquire image data. The AM41V4 is a 4-megapixel, high-speed (500 frames per second) CMOS image sensor with a global shutter and a 4/3" optical format; it uses column-parallel A/D converters to digitize the images. The CameraLink interface adopts the DS90CR287, which converts 28 bits of LVCMOS/LVTTL data into four LVDS data streams. Light reflected from objects is captured by the CMOS detector, which converts it to electronic signals and sends them to the FPGA. The FPGA processes the data it receives and transmits them through the CameraLink interface, configured in full mode, to a host computer equipped with acquisition cards, where the images are stored, visualized, and processed. The structure and principle of the system are explained in this paper, together with its hardware and software design. The FPGA provides the drive clock for the CMOS sensor; the sensor data are converted to LVDS signals and then transmitted to the data acquisition cards. After simulation, the paper presents the row-transfer timing sequence of the CMOS sensor. The system achieves real-time image acquisition and external control.

  16. An airborne thematic thermal infrared and electro-optical imaging system

    NASA Astrophysics Data System (ADS)

    Sun, Xiuhong; Shu, Peter

    2011-08-01

    This paper describes an advanced Airborne Thematic Thermal InfraRed and Electro-Optical Imaging System (ATTIREOIS) and its potential applications. The ATTIREOIS sensor payload consists of two sets of advanced Focal Plane Arrays (FPAs) - a broadband Thermal InfraRed Sensor (TIRS) and a four (4) band Multispectral Electro-Optical Sensor (MEOS) to approximate Landsat ETM+ bands 1, 2, 3, 4, and 6, and LDCM bands 2, 3, 4, 5, and 10+11. The airborne TIRS is a 3-axis stabilized payload capable of providing 3D photogrammetric images with a 1,850-pixel swath width via pushbroom operation. MEOS has a total of 116 million simultaneous sensor counts capable of providing 3 cm spatial resolution multispectral orthophotos for continuous airborne mapping. ATTIREOIS is a complete standalone and easy-to-use portable imaging instrument for light aerial vehicle deployment. Its miniaturized backend data system operates all ATTIREOIS imaging sensor components, an INS/GPS, and an e-Gimbal™ Control Electronic Unit (ECU) with a data throughput of 300 Megabytes/sec. The backend provides advanced onboard processing, performing autonomous raw sensor imagery development, TIRS image track-recovery reconstruction, LWIR/VNIR multi-band co-registration, and photogrammetric image processing. With geometric optics and boresight calibrations, the ATTIREOIS data products are directly georeferenced with an accuracy of approximately one meter. A prototype ATTIREOIS has been configured, and its sample LWIR/EO image data will be presented. Potential applications of ATTIREOIS include: 1) Providing timely and cost-effective, precisely and directly georeferenced surface emissive and solar reflective LWIR/VNIR multispectral images via a private Google Earth Globe to enhance NASA's Earth science research capabilities; and 2) Underflying satellites to support satellite measurement calibration and validation observations.

  17. Celestial Object Imaging Model and Parameter Optimization for an Optical Navigation Sensor Based on the Well Capacity Adjusting Scheme.

    PubMed

    Wang, Hao; Jiang, Jie; Zhang, Guangjun

    2017-04-21

    The simultaneous extraction of optical navigation measurements from a target celestial body and star images is essential for autonomous optical navigation. Generally, a single optical navigation sensor cannot image both the target celestial body and stars with good exposure because the difference in their irradiance is generally large. Multi-sensor integration or complex image processing algorithms are commonly utilized to solve this problem. This study analyzes and demonstrates the feasibility of imaging both the target celestial body and stars, well exposed, within a single exposure through a single field of view (FOV) optical navigation sensor using the well capacity adjusting (WCA) scheme. First, the irradiance characteristics of the celestial body are analyzed. Then, the celestial body edge model and star spot imaging model are established when the WCA scheme is applied. Furthermore, the effect of exposure parameters on the accuracy of star centroiding and edge extraction is analyzed using the proposed model. Optimal exposure parameters are also derived by conducting Monte Carlo simulation to obtain the best performance of the navigation sensor. Finally, laboratory and night sky experiments are performed to validate the correctness of the proposed model and optimal exposure parameters.
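
    Star centroiding, whose accuracy this record analyzes, is commonly done with an intensity-weighted center of mass over a small pixel window. A minimal sketch (a generic estimator, not the paper's own accuracy model; the spot values are hypothetical):

```python
def star_centroid(window):
    """Sub-pixel star position (x, y) from an intensity-weighted center
    of mass over a small pixel window around a detected star spot."""
    total = sx = sy = 0.0
    for y, row in enumerate(window):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    return sx / total, sy / total

# A defocused star spot whose true center lies between pixel columns.
spot = [[0, 1, 2, 1],
        [1, 4, 8, 4],
        [1, 4, 8, 4],
        [0, 1, 2, 1]]
cx, cy = star_centroid(spot)
print(round(cx, 2), round(cy, 2))  # → 1.9 1.5
```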

  18. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array; the lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to produce a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  19. A novel optical gating method for laser gated imaging

    NASA Astrophysics Data System (ADS)

    Ginat, Ran; Schneider, Ron; Zohar, Eyal; Nesher, Ofer

    2013-06-01

    For the past 15 years, Elbit Systems has been developing time-resolved active laser-gated imaging (LGI) systems for various applications. Traditional LGI systems are based on highly sensitive gated sensors synchronized to pulsed laser sources. Elbit's proprietary multi-pulse-per-frame method, implemented in its LGI systems, significantly improves imaging quality. A significant characteristic of LGI is its ability to penetrate a disturbing medium, such as rain, haze and some fog types. Current LGI systems are based on image intensifier (II) sensors, which limit the system in spectral response, image quality, reliability and cost. A novel proprietary optical gating module was developed at Elbit, removing the LGI system's dependency on the II. The optical gating module is not bound to the radiance wavelength and is positioned between the system optics and the sensor. This optical gating method supports the use of conventional solid state sensors; by selecting the appropriate solid state sensor, the new LGI systems can operate at any desired wavelength. In this paper we present the new gating method's characteristics and performance, and its advantages over the II gating method. The use of gated imaging systems is described in a variety of applications, including results from the latest field experiments.

  20. Celestial Object Imaging Model and Parameter Optimization for an Optical Navigation Sensor Based on the Well Capacity Adjusting Scheme

    PubMed Central

    Wang, Hao; Jiang, Jie; Zhang, Guangjun

    2017-01-01

    The simultaneous extraction of optical navigation measurements from a target celestial body and star images is essential for autonomous optical navigation. Generally, a single optical navigation sensor cannot image both the target celestial body and stars with good exposure because the difference in their irradiance is generally large. Multi-sensor integration or complex image processing algorithms are commonly utilized to solve this problem. This study analyzes and demonstrates the feasibility of imaging both the target celestial body and stars, well exposed, within a single exposure through a single field of view (FOV) optical navigation sensor using the well capacity adjusting (WCA) scheme. First, the irradiance characteristics of the celestial body are analyzed. Then, the celestial body edge model and star spot imaging model are established when the WCA scheme is applied. Furthermore, the effect of exposure parameters on the accuracy of star centroiding and edge extraction is analyzed using the proposed model. Optimal exposure parameters are also derived by conducting Monte Carlo simulation to obtain the best performance of the navigation sensor. Finally, laboratory and night sky experiments are performed to validate the correctness of the proposed model and optimal exposure parameters. PMID:28430132
