Sample records for range image sensor

  1. Fast range estimation based on active range-gated imaging for coastal surveillance

    NASA Astrophysics Data System (ADS)

    Kong, Qingshan; Cao, Yinan; Wang, Xinwei; Tong, Youwan; Zhou, Yan; Liu, Yuliang

    2012-11-01

    Coastal surveillance is important because it supports search and rescue, detection of illegal immigration, harbor security, and similar tasks; accurate range estimation is critical for precisely locating a target. A range-gated laser imaging sensor is well suited to high-accuracy ranging, especially at night and under moonless conditions. Ordinarily, the gate delay time must be swept until the target is captured. The sensor has two operating modes: a passive imaging mode and a gate-viewing mode. First, in passive mode, the sensor simply captures scenes with the ICCD; once an object appears in the monitored area, a coarse range is obtained from the imaging geometry and projective transform. Then, in gate-viewing mode, using microsecond laser pulses and a matched sensor gate width, the target range is recovered from at least two consecutive images with trapezoid-shaped range-intensity profiles. This technique overcomes the depth-resolution limitation of 3D active imaging and enables super-resolution depth mapping with reduced image-data processing. The coarse range from the first step allows the delay time at which the target is detected to be set quickly. Combining the two steps, the distance between object and sensor is obtained rapidly.
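
    To make the two-step scheme concrete, here is a minimal Python sketch (not from the paper): the gate delay fixes the center of each sampled range slice, and the target's position inside two overlapping slices is interpolated from the intensity ratio. The linear interpolation across the trapezoidal profiles is an illustrative assumption.

    ```python
    C = 3.0e8  # speed of light, m/s

    def gate_range(delay_s):
        # Center of the range slice sampled at a given laser-to-gate delay.
        return C * delay_s / 2.0

    def two_image_range(i1, i2, delay1_s, delay2_s):
        # A target inside the overlap of two successive gate slices splits its
        # energy between the two images; its position is linearly interpolated
        # from the intensity ratio (illustrative profile model).
        r1, r2 = gate_range(delay1_s), gate_range(delay2_s)
        w = i2 / (i1 + i2)
        return r1 + w * (r2 - r1)

    # Example: equal intensities place the target midway between the slices.
    print(two_image_range(0.5, 0.5, 6.6e-7, 7.0e-7))  # ~102 m
    ```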

  2. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders — from Optical Triangulation to the Automotive Field

    PubMed Central

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-01-01

    With their significant features, the applications of complementary metal-oxide semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments was conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field. PMID:27879789
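
    The triangulation relation such range finders rely on is compact enough to sketch; the function and parameter values below are illustrative assumptions, not the paper's design.

    ```python
    def triangulation_range(baseline_m, focal_mm, pixel_pitch_um, spot_offset_px):
        # Pinhole model: a laser spot imaged at lateral offset x from the
        # optical axis corresponds to range Z = f * B / x.
        x_m = spot_offset_px * pixel_pitch_um * 1e-6
        return (focal_mm * 1e-3) * baseline_m / x_m

    # 10 cm baseline, 8 mm lens, 5 um pixels, spot 80 pixels off-axis -> 2 m.
    print(triangulation_range(0.10, 8.0, 5.0, 80))
    ```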

  3. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders - from Optical Triangulation to the Automotive Field.

    PubMed

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-03-13

    With their significant features, the applications of complementary metal-oxide semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments was conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field.

  4. High speed three-dimensional laser scanner with real time processing

    NASA Technical Reports Server (NTRS)

    Lavelle, Joseph P. (Inventor); Schuet, Stefan R. (Inventor)

    2008-01-01

    A laser scanner computes a range from a laser line to an imaging sensor. The laser line illuminates a detail within an area covered by the imaging sensor, the area having a first dimension and a second dimension. The detail has a dimension perpendicular to the area. A traverse moves a laser emitter, coupled to the imaging sensor, at a height above the area. The laser emitter is positioned at an offset along the scan direction with respect to the imaging sensor, and is oriented at a depression angle with respect to the area. The laser emitter projects the laser line along the second dimension of the area at the position where an image frame is acquired. The imaging sensor is sensitive to laser reflections from the detail produced by the laser line. The imaging sensor images the laser reflections from the detail to generate the image frame. A computer having a pipeline structure is connected to the imaging sensor for reception of the image frame, and computes the range to the detail using the height, depression angle and/or offset. The computer displays the range to the area and the detail thereon covered by the image frame.
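
    A simplified, hypothetical flat-reference model can illustrate how height, depression angle, and line shift combine to yield the range; the nadir-looking-sensor assumption and all values are ours, not the patent's exact geometry.

    ```python
    import math

    def detail_height(line_shift_m, depression_deg):
        # A detail of height z displaces the imaged laser line along the scan
        # direction by z / tan(theta), so z = shift * tan(theta).
        return line_shift_m * math.tan(math.radians(depression_deg))

    def range_to_detail(sensor_height_m, line_shift_m, depression_deg):
        # Range from a nadir-looking sensor to the top of the detail.
        return sensor_height_m - detail_height(line_shift_m, depression_deg)

    print(range_to_detail(1.50, 0.02, 45.0))  # 2 cm shift at 45 deg -> 1.48 m
    ```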

  5. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    NASA Astrophysics Data System (ADS)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still imposes a fundamental limit on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (e.g., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of √N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests were performed on the image sensor's innovative high-dynamic-range technology to determine its effects on range accuracy. As expected, experimental results show that the sensor provides a trade-off between dynamic range and range accuracy.
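
    A small simulation can illustrate the claimed averaging effect. The exponential intensity statistics of fully developed speckle and all constants below are textbook assumptions for illustration, not the thesis' data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 201)
    clean = np.exp(-x**2 / 0.02)  # idealized laser-spot profile

    def centroid(profile):
        return np.sum(x * profile) / np.sum(profile)

    # Averaging N uncorrelated speckled captures should shrink the
    # spot-centroid error (hence the triangulation range error) ~ 1/sqrt(N).
    for n in (1, 4, 16, 64):
        errs = [centroid((clean * rng.exponential(1.0, (n, x.size))).mean(axis=0))
                for _ in range(500)]
        print(f"N={n:3d}  centroid std = {np.std(errs):.5f}")
    ```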

  6. Stochastic performance modeling and evaluation of obstacle detectability with imaging range sensors

    NASA Technical Reports Server (NTRS)

    Matthies, Larry; Grandjean, Pierrick

    1993-01-01

    Statistical modeling and evaluation of the performance of obstacle detection systems for Unmanned Ground Vehicles (UGVs) is essential for the design, evaluation, and comparison of sensor systems. In this report, we address this issue for imaging range sensors by dividing the evaluation problem into two levels: quality of the range data itself and quality of the obstacle detection algorithms applied to the range data. We review existing models of the quality of range data from stereo vision and AM-CW LADAR, then use these to derive a new model for the quality of a simple obstacle detection algorithm. This model predicts the probability of detecting obstacles and the probability of false alarms, as a function of the size and distance of the obstacle, the resolution of the sensor, and the level of noise in the range data. We evaluate these models experimentally using range data from stereo image pairs of a gravel road with known obstacles at several distances. The results show that the approach is a promising tool for predicting and evaluating the performance of obstacle detection with imaging range sensors.
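
    A toy version of such a model, assuming Gaussian range noise and simple height thresholding (our simplification, not the report's exact derivation), shows how detection and false-alarm probabilities fall out of the noise level and obstacle size:

    ```python
    from math import erf, sqrt

    def q(z):
        # Standard normal tail probability P(X > z).
        return 0.5 * (1.0 - erf(z / sqrt(2.0)))

    def detection_model(obstacle_m, noise_m, threshold_m, n_pixels):
        # Per-pixel detection and false-alarm probabilities for height
        # thresholding of noisy range data, plus the chance that at least
        # one of the n_pixels covering the obstacle fires.
        pd_pix = q((threshold_m - obstacle_m) / noise_m)
        pfa_pix = q(threshold_m / noise_m)
        return 1.0 - (1.0 - pd_pix) ** n_pixels, pfa_pix

    print(detection_model(0.30, 0.10, 0.20, 12))  # 30 cm obstacle, 10 cm noise
    ```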

  7. UTOFIA: an underwater time-of-flight image acquisition system

    NASA Astrophysics Data System (ADS)

    Driewer, Adrian; Abrosimov, Igor; Alexander, Jonathan; Benger, Marc; O'Farrell, Marion; Haugholt, Karl Henrik; Softley, Chris; Thielemann, Jens T.; Thorstensen, Jostein; Yates, Chris

    2017-10-01

    In this article the development of a newly designed time-of-flight (ToF) image sensor for underwater applications is described. The sensor is developed as part of the project UTOFIA (underwater time-of-flight image acquisition), funded by the EU within the Horizon 2020 framework. The project aims to develop a camera based on range gating that extends the visible range by a factor of 2 to 3 compared to conventional cameras and delivers real-time range information by means of a 3D video stream. The principle of underwater range gating as well as the concept of the image sensor are presented. Based on measurements on a test image sensor, the pixel structure best suited to the requirements was selected. An extensive underwater characterization demonstrates the capability of distance measurement in turbid environments.

  8. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    NASA Technical Reports Server (NTRS)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.

  9. Characterization of modulated time-of-flight range image sensors

    NASA Astrophysics Data System (ADS)

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.

    2009-01-01

    A number of full field image sensors have been developed that are capable of simultaneously measuring intensity and distance (range) for every pixel in a given scene using an indirect time-of-flight measurement technique. A light source is intensity modulated at a frequency between 10-100 MHz, and an image sensor is modulated at the same frequency, synchronously sampling light reflected from objects in the scene (homodyne detection). The time of flight is manifested as a phase shift in the illumination modulation envelope, which can be determined from the sampled data simultaneously for each pixel in the scene. This paper presents a method of characterizing the high frequency modulation response of these image sensors, using a pico-second laser pulser. The characterization results allow the optimal operating parameters, such as the modulation frequency, to be identified in order to maximize the range measurement precision for a given sensor. A number of potential sources of error exist when using these sensors, including deficiencies in the modulation waveform shape, duty cycle, or phase, resulting in contamination of the resultant range data. From the characterization data these parameters can be identified and compensated for by modifying the sensor hardware or through post processing of the acquired range measurements.
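
    The indirect measurement described here reduces to the classic four-bucket homodyne demodulation, sketched below; the sample values are illustrative.

    ```python
    import numpy as np

    C = 3.0e8  # m/s

    def homodyne_range(a0, a1, a2, a3, f_mod_hz):
        # Four samples of the correlation waveform taken 90 degrees apart give
        # the phase of the illumination envelope; phase maps to range within
        # the ambiguity interval c / (2 f).
        phase = np.mod(np.arctan2(a3 - a1, a0 - a2), 2.0 * np.pi)
        return C * phase / (4.0 * np.pi * f_mod_hz)

    # A quarter-period phase shift at 30 MHz corresponds to 1.25 m:
    print(homodyne_range(0.0, -0.5, 0.0, 0.5, 30e6))
    ```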

  10. Range imaging pulsed laser sensor with two-dimensional scanning of transmitted beam and scanless receiver using high-aspect avalanche photodiode array for eye-safe wavelength

    NASA Astrophysics Data System (ADS)

    Tsuji, Hidenobu; Imaki, Masaharu; Kotake, Nobuki; Hirai, Akihito; Nakaji, Masaharu; Kameyama, Shumpei

    2017-03-01

    We demonstrate a range imaging pulsed laser sensor with two-dimensional scanning of a transmitted beam and a scanless receiver using a high-aspect avalanche photodiode (APD) array for the eye-safe wavelength. The system achieves a high frame rate and long-range imaging with a relatively simple sensor configuration. We developed a high-aspect APD array for the 1.5 μm wavelength, a receiver integrated circuit, and a range and intensity detector. By combining these devices, we realized range imaging of 160×120 pixels at a frame rate of 8 Hz at a distance of about 50 m.

  11. A novel method to increase LinLog CMOS sensors' performance in high dynamic range scenarios.

    PubMed

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor's maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method.
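
    A minimal sketch of a PID exposure loop driven by a saturation statistic, in the spirit of the paper's controller; the gains and set-point are assumptions, not the authors' tuned values.

    ```python
    class ExposurePID:
        # Illustrative adaptive-exposure controller: drive the fraction of
        # saturated pixels toward a target by scaling the exposure time.
        def __init__(self, kp=0.5, ki=0.1, kd=0.05, target_sat=0.02):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.target_sat = target_sat  # desired saturated-pixel fraction
            self.integral = 0.0
            self.prev_err = 0.0

        def next_exposure(self, saturated_fraction, exposure_us):
            err = self.target_sat - saturated_fraction
            self.integral += err
            deriv = err - self.prev_err
            self.prev_err = err
            u = self.kp * err + self.ki * self.integral + self.kd * deriv
            return max(1.0, exposure_us * (1.0 + u))

    pid = ExposurePID()
    print(pid.next_exposure(saturated_fraction=0.10, exposure_us=500.0))
    ```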

  12. Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications

    NASA Astrophysics Data System (ADS)

    Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David

    2017-10-01

    The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors depends strongly on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable of acquiring illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for an HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications with respect to both the display of images and image analysis techniques. Regarding display, we investigated the image intensity statistics over time; regarding image analysis, we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.

  13. The Multidimensional Integrated Intelligent Imaging project (MI-3)

    NASA Astrophysics Data System (ADS)

    Allinson, N.; Anaxagoras, T.; Aveyard, J.; Arvanitis, C.; Bates, R.; Blue, A.; Bohndiek, S.; Cabello, J.; Chen, L.; Chen, S.; Clark, A.; Clayton, C.; Cook, E.; Cossins, A.; Crooks, J.; El-Gomati, M.; Evans, P. M.; Faruqi, W.; French, M.; Gow, J.; Greenshaw, T.; Greig, T.; Guerrini, N.; Harris, E. J.; Henderson, R.; Holland, A.; Jeyasundra, G.; Karadaglic, D.; Konstantinidis, A.; Liang, H. X.; Maini, K. M. S.; McMullen, G.; Olivo, A.; O'Shea, V.; Osmond, J.; Ott, R. J.; Prydderch, M.; Qiang, L.; Riley, G.; Royle, G.; Segneri, G.; Speller, R.; Symonds-Tayler, J. R. N.; Triger, S.; Turchetta, R.; Venanzi, C.; Wells, K.; Zha, X.; Zin, H.

    2009-06-01

    MI-3 is a consortium of 11 universities and research laboratories whose mission is to develop complementary metal-oxide semiconductor (CMOS) active pixel sensors (APS) and to apply these sensors to a range of imaging challenges. A range of sensors has been developed: On-Pixel Intelligent CMOS (OPIC)—designed for in-pixel intelligence; FPN—designed to develop novel techniques for reducing fixed pattern noise; HDR—designed to develop novel techniques for increasing dynamic range; Vanilla/PEAPS—with digital and analogue modes and regions of interest, which has also been back-thinned; Large Area Sensor (LAS)—a novel, stitched LAS; and eLeNA—which develops a range of low noise pixels. Applications being developed include autoradiography, a gamma camera system, radiotherapy verification, tissue diffraction imaging, X-ray phase-contrast imaging, DNA sequencing and electron microscopy.

  14. Evaluation of Sun Glint Correction Algorithms for High-Spatial Resolution Hyperspectral Imagery

    DTIC Science & Technology

    2012-09-01

    ACRONYMS AND ABBREVIATIONS: AISA Airborne Imaging Spectrometer for Applications; AVIRIS Airborne Visible/Infrared Imaging Spectrometer; BIL Band... sensor bracket mount combining Airborne Imaging Spectrometer for Applications (AISA) Eagle and Hawk sensors into a single imaging system (SpecTIR 2011)... The AISA Eagle is a VNIR sensor with a wavelength range of approximately 400-970 nm and the AISA Hawk sensor is a SWIR sensor with a wavelength...

  15. Sensor assembly method using silicon interposer with trenches for three-dimensional binocular range sensors

    NASA Astrophysics Data System (ADS)

    Nakajima, Kazuhiro; Yamamoto, Yuji; Arima, Yutaka

    2018-04-01

    To easily assemble a three-dimensional binocular range sensor, we devised an alignment method for two image sensors using a silicon interposer with trenches. The trenches were formed using deep reactive ion etching (RIE) equipment. We produced a three-dimensional (3D) range sensor using the method and experimentally confirmed that sufficient alignment accuracy was realized. It was confirmed that the alignment accuracy of the two image sensors when using the proposed method is more than twice that of the alignment assembly method on a conventional board. In addition, as a result of evaluating the deterioration of the detection performance caused by the alignment accuracy, it was confirmed that the vertical deviation between the corresponding pixels in the two image sensors is substantially proportional to the decrease in detection performance. Therefore, we confirmed that the proposed method can realize more than twice the detection performance of the conventional method. Through these evaluations, the effectiveness of the 3D binocular range sensor aligned by the silicon interposer with the trenches was confirmed.

  16. High dynamic range CMOS (HDRC) imagers for safety systems

    NASA Astrophysics Data System (ADS)

    Strobel, Markus; Döttling, Dietmar

    2013-04-01

    The first part of this paper describes the high dynamic range CMOS (HDRC®) imager - a special type of CMOS image sensor with logarithmic response. The powerful property of a high dynamic range (HDR) image acquisition is detailed by mathematical definition and measurement of the optoelectronic conversion function (OECF) of two different HDRC imagers. Specific sensor parameters will be discussed including the pixel design for the global shutter readout. The second part will give an outline on the applications and requirements of cameras for industrial safety. Equipped with HDRC global shutter sensors SafetyEYE® is a high-performance stereo camera system for safe three-dimensional zone monitoring enabling new and more flexible solutions compared to existing safety guards.

  17. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of differing brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.

  18. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    NASA Astrophysics Data System (ADS)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large-format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the first-generation DMC was developed by Z/I Imaging. It was the first large-format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, the first digital aerial mapping camera using a single ultra-large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large-format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time, a large 391-megapixel CMOS sensor has been used as the panchromatic sensor, an industry record. CMOS technology brings a range of technical benefits: the dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights are presented and compared with other CCD-based aerial sensors.

  19. Swap intensified WDR CMOS module for I2/LWIR fusion

    NASA Astrophysics Data System (ADS)

    Ni, Yang; Noguier, Vincent

    2015-05-01

    The combination of a high-resolution visible/near-infrared low-light sensor and a moderate-resolution uncooled thermal sensor provides an efficient way to perform multi-task night vision. Tremendous progress has been made on uncooled thermal sensors (a-Si, VOx, etc.): it is now possible to build a miniature uncooled thermal camera module in a tiny 1 cm³ cube with <1 W power consumption. Silicon-based solid-state low-light CCD/CMOS sensors have also seen constant progress in readout noise, dark current, resolution and frame rate. In contrast to thermal sensing, which is intrinsically day-and-night operational, silicon-based solid-state sensors are not yet capable of the night-vision performance required by defense and critical surveillance applications. Readout noise and dark current are two major obstacles. The low dynamic range of silicon sensors in high-sensitivity mode is another important limiting factor, leading to recognition failure through local or global saturation and blooming. In this context, the image intensifier based solution remains attractive for the following reasons: 1) high gain and ultra-low dark current; 2) wide dynamic range; and 3) ultra-low power consumption. With the high electron gain and ultra-low dark current of an image intensifier, the only requirements on the silicon image pickup device are resolution, dynamic range and power consumption. In this paper, we present a SWAP intensified wide-dynamic-range CMOS module for night vision applications, especially for I2/LWIR fusion. The module is based on a dedicated CMOS image sensor using a solar-cell-mode photodiode logarithmic pixel design that covers a huge dynamic range (>140 dB) without saturation or blooming. The ultra-wide dynamic range image from this new-generation logarithmic sensor can be used directly, without any image processing, and provides instant light accommodation. The complete module is slightly bigger than a simple ANVIS-format I2 tube with <500 mW power consumption.
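
    The logarithmic solar-cell-mode response can be sketched with a simple diode model; the constants below are generic assumptions for illustration, not the module's measured parameters.

    ```python
    import math

    def log_pixel_v(photocurrent_a, i_dark_a=1e-15, n=1.0, v_t=0.026):
        # Solar-cell-mode photodiode: the open-circuit voltage grows with the
        # log of photocurrent, V = n * Vt * ln(1 + Iph / Idark).
        return n * v_t * math.log1p(photocurrent_a / i_dark_a)

    # Seven decades of photocurrent (140 dB of optical dynamic range) are
    # compressed into well under half a volt of output swing:
    for i in (1e-14, 1e-11, 1e-8, 1e-7):
        print(f"{i:.0e} A -> {log_pixel_v(i) * 1e3:6.1f} mV")
    ```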

  20. Geological terrain models

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.

    1981-01-01

    The initial phase of a program to determine the best interpretation strategy and sensor configuration for a radar remote sensing system for geologic applications is discussed. In this phase, terrain modeling and radar image simulation were used to perform parametric sensitivity studies. A relatively simple computer-generated terrain model is presented, and the data base, backscatter file, and transfer function for digital image simulation are described. Sets of images are presented that simulate the results obtained with an X-band radar from an altitude of 800 km and at three different terrain-illumination angles. The simulations include power maps, slant-range images, ground-range images, and ground-range images with statistical noise incorporated. It is concluded that digital image simulation and computer modeling provide cost-effective methods for evaluating terrain variations and sensor parameter changes, for predicting results, and for defining optimum sensor parameters.

  1. Atmospheric turbulence and sensor system effects on biometric algorithm performance

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Leonard, Kevin R.; Byrd, Kenneth A.; Potvin, Guy

    2015-05-01

    Biometric technologies composed of electro-optical/infrared (EO/IR) sensor systems and advanced matching algorithms are being used in various force protection/security and tactical surveillance applications. To date, most of these sensor systems have been widely used in controlled conditions with varying success (e.g., short range, uniform illumination, cooperative subjects). However the limiting conditions of such systems have yet to be fully studied for long range applications and degraded imaging environments. Biometric technologies used for long range applications will invariably suffer from the effects of atmospheric turbulence degradation. Atmospheric turbulence causes blur, distortion and intensity fluctuations that can severely degrade image quality of electro-optic and thermal imaging systems and, for the case of biometrics technology, translate to poor matching algorithm performance. In this paper, we evaluate the effects of atmospheric turbulence and sensor resolution on biometric matching algorithm performance. We use a subset of the Facial Recognition Technology (FERET) database and a commercial algorithm to analyze facial recognition performance on turbulence degraded facial images. The goal of this work is to understand the feasibility of long-range facial recognition in degraded imaging conditions, and the utility of camera parameter trade studies to enable the design of the next generation biometrics sensor systems.

  2. Establishing imaging sensor specifications for digital still cameras

    NASA Astrophysics Data System (ADS)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor or the newer Foveon buried-photodiode sensor. There is a strong tendency for consumers to consider only the number of megapixels in a camera and not the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper provides a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics and the sensor characteristics (including pixel size, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples are given for consumer, prosumer, and professional camera systems. Where possible, these results are compared to imaging systems currently on the market.
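
    Two of the figures of merit such an analysis rests on, dynamic range and shot-noise-limited SNR, follow directly from the sensor parameters named above; the numbers below are illustrative.

    ```python
    import math

    def dynamic_range_db(full_well_e, read_noise_e):
        # DR is the largest storable signal over the read-noise floor.
        return 20.0 * math.log10(full_well_e / read_noise_e)

    def snr_db(signal_e, read_noise_e, dark_e=0.0):
        # Shot noise (sqrt of signal), dark shot noise, and read noise add in
        # quadrature.
        return 20.0 * math.log10(
            signal_e / math.sqrt(signal_e + dark_e + read_noise_e ** 2))

    # e.g., a 20 ke- full well with 4 e- read noise gives ~74 dB DR:
    print(dynamic_range_db(20000, 4), snr_db(10000, 4))
    ```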

  3. A Novel Method to Increase LinLog CMOS Sensors’ Performance in High Dynamic Range Scenarios

    PubMed Central

    Martínez-Sánchez, Antonio; Fernández, Carlos; Navarro, Pedro J.; Iborra, Andrés

    2011-01-01

    Images from high dynamic range (HDR) scenes must be obtained with minimum loss of information. For this purpose it is necessary to take full advantage of the quantification levels provided by the CCD/CMOS image sensor. LinLog CMOS sensors satisfy the above demand by offering an adjustable response curve that combines linear and logarithmic responses. This paper presents a novel method to quickly adjust the parameters that control the response curve of a LinLog CMOS image sensor. We propose to use an Adaptive Proportional-Integral-Derivative controller to adjust the exposure time of the sensor, together with control algorithms based on the saturation level and the entropy of the images. With this method the sensor’s maximum dynamic range (120 dB) can be used to acquire good quality images from HDR scenes with fast, automatic adaptation to scene conditions. Adaptation to a new scene is rapid, with a sensor response adjustment of less than eight frames when working in real time video mode. At least 67% of the scene entropy can be retained with this method. PMID:22164083

  4. Characterization of the range effect in synthetic aperture radar images of concrete specimens for width estimation

    NASA Astrophysics Data System (ADS)

    Alzeyadi, Ahmed; Yu, Tzuyang

    2018-03-01

    Nondestructive evaluation (NDE) is an indispensable approach for the sustainability of critical civil infrastructure systems such as bridges and buildings. Recently, microwave/radar sensors have been widely used for assessing the condition of concrete structures. Among the imaging techniques available to microwave/radar sensors, synthetic aperture radar (SAR) imaging enables researchers to conduct surface and subsurface inspection of concrete structures in the range-cross-range representation of SAR images. The objective of this paper is to investigate the range effect for concrete specimens in SAR images at various ranges (15 cm, 50 cm, 75 cm, 100 cm, and 200 cm). One concrete panel specimen (water-to-cement ratio = 0.45) of 30 cm by 30 cm by 5 cm was manufactured and scanned by a 10 GHz SAR imaging radar sensor inside an anechoic chamber. Scatterers in SAR images representing two corners of the concrete panel were used to estimate the width of the panel. It was found that the range-dependent pattern of corner scatterers can be used to predict the width of concrete panels. Also, the maximum SAR amplitude decreases as the range increases. An empirical model was also proposed for width estimation of concrete panels.

  5. Low-voltage 96 dB snapshot CMOS image sensor with 4.5 nW power dissipation per pixel.

    PubMed

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

    Modern "smart" CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage "smart" image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 um CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.

  6. Low-Voltage 96 dB Snapshot CMOS Image Sensor with 4.5 nW Power Dissipation per Pixel

    PubMed Central

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

    Modern “smart” CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage “smart” image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 μm CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel. PMID:23112588

  7. Passive range estimation for rotorcraft low-altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, B.; Suorsa, R.; Hussien, B.

    1991-01-01

    The automation of rotorcraft low-altitude flight presents challenging problems in control, computer vision and image understanding. A critical element in this problem is the ability to detect and locate obstacles, using on-board sensors, and modify the nominal trajectory. This requirement is also necessary for the safe landing of an autonomous lander on Mars. This paper examines some of the issues in the location of objects using a sequence of images from a passive sensor, and describes a Kalman filter approach to estimate the range to obstacles. The Kalman filter is also used to track features in the images leading to a significant reduction of search effort in the feature extraction step of the algorithm. The method can compute range for both straight line and curvilinear motion of the sensor. A laboratory experiment was designed to acquire a sequence of images along with sensor motion parameters under conditions similar to helicopter flight. Range estimation results using this imagery are presented.
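
    A scalar Kalman filter captures the recursive estimation idea; the random-walk process model and all constants below are illustrative simplifications, not the paper's filter formulation.

    ```python
    import numpy as np

    def kalman_range(range_obs, obs_var, x0=50.0, p0=100.0, q=0.1):
        # Scalar Kalman filter: refine a range estimate from a sequence of
        # noisy per-frame observations (e.g., derived from tracked feature
        # motion and known sensor motion).
        x, p = x0, p0
        for z in range_obs:
            p += q                    # predict (random-walk process model)
            k = p / (p + obs_var)     # Kalman gain
            x += k * (z - x)          # measurement update
            p *= 1.0 - k
        return x, p

    obs = 40.0 + np.random.default_rng(1).normal(0.0, 2.0, 30)
    print(kalman_range(obs, obs_var=4.0))  # converges near the true 40 m
    ```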

  8. Multispectral Imaging in Cultural Heritage Conservation

    NASA Astrophysics Data System (ADS)

    Del Pozo, S.; Rodríguez-Gonzálvez, P.; Sánchez-Aparicio, L. J.; Muñoz-Nieto, A.; Hernández-López, D.; Felipe-García, B.; González-Aguilera, D.

    2017-08-01

    This paper summarizes the main contributions of the thesis entitled "Multispectral imaging for the analysis of materials and pathologies in civil engineering, constructions and natural spaces", awarded by CIPA-ICOMOS for its connection with the preservation of Cultural Heritage. The thesis is framed within close-range remote sensing approaches based on the fusion of sensors operating in the optical domain (visible to shortwave infrared spectrum). In the field of heritage preservation, multispectral imaging is a suitable technique due to its non-destructive nature and its versatility. It combines imaging and spectroscopy to analyse materials and land covers, and it admits a variety of geomatic sensors for this purpose. These sensors collect both spatial and spectral information for a given scenario and a specific spectral range, so that their smallest storage units record the spectral properties of the radiation reflected by the surface of interest. The main goal of this research work is to characterise different construction materials as well as the main pathologies of Cultural Heritage elements by combining active and passive sensors recording data in different ranges. Conclusions about the suitability of each type of sensor and spectral range are drawn for each particular case study and damage. It should be emphasised that results are not limited to images, since 3D intensity data from laser scanners can be integrated with 2D data from passive sensors, obtaining high-quality products thanks to the added value that metric information brings to multispectral images.

  9. Determining the 3-D structure and motion of objects using a scanning laser range sensor

    NASA Technical Reports Server (NTRS)

    Nandhakumar, N.; Smith, Philip W.

    1993-01-01

    In order for the EVAHR robot to autonomously track and grasp objects, its vision system must be able to determine the 3-D structure and motion of an object from a sequence of sensory images. This task is accomplished by the use of a laser radar range sensor which provides dense range maps of the scene. Unfortunately, currently available laser radar range cameras use a sequential scanning approach which complicates image analysis. Although many algorithms have been developed for recognizing objects from range images, none are suited for use with single-beam, scanning, time-of-flight sensors because all previous algorithms assume instantaneous acquisition of the entire image. This assumption is invalid since the EVAHR robot is equipped with a sequential scanning laser range sensor. If an object is moving while being imaged by the device, the apparent structure of the object can be significantly distorted due to the significant non-zero delay time between sampling each image pixel. If an estimate of the motion of the object can be determined, this distortion can be eliminated, but this leads to the motion-structure paradox: most existing algorithms for 3-D motion estimation use the structure of objects to parameterize their motions. The goal of this research is to design a rigid-body motion recovery technique which overcomes this limitation. The method being developed is an iterative, linear, feature-based approach which uses the non-zero image acquisition time constraint to accurately recover the motion parameters from the distorted structure of the 3-D range maps. Once the motion parameters are determined, the structural distortion in the range images is corrected.
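
    Once a motion estimate exists, the correction step can be sketched as a per-sample "deskew" that moves each range point back to its pose at the start of the scan; the planar (yaw-only) motion below is our illustrative simplification, not the paper's method.

    ```python
    import numpy as np

    def rot_z(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

    def deskew(points, sample_times, v, yaw_rate):
        # Given per-sample acquisition times and an estimated rigid motion
        # (translational velocity v, yaw rate), undo the scan distortion by
        # moving every sample back to its pose at t = 0.
        out = np.empty((len(points), 3))
        for i, (p, t) in enumerate(zip(points, sample_times)):
            out[i] = rot_z(-yaw_rate * t) @ (np.asarray(p, float) - v * t)
        return out

    pts = [[10.0, 0.0, 0.0], [10.0, 0.1, 0.0]]
    print(deskew(pts, [0.0, 0.033], v=np.array([1.0, 0.0, 0.0]), yaw_rate=0.1))
    ```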

  10. Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision.

    PubMed

    Kim, Min Young; Lee, Hyunkee; Cho, Hyungsuck

    2008-04-10

    One major research issue associated with 3D perception by robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor, an active trinocular vision and a passive stereo vision. Unlike in conventional active vision systems that use a large number of images with variations of projected patterns for dense range map acquisition or from conventional passive vision systems that work well on specific environments with sufficient feature information, a cooperative bidirectional sensor fusion method for this visual sensor system enables us to acquire a reliable dense range map using active and passive information simultaneously. The fusion algorithms are composed of two parts, one in which the passive stereo vision helps active vision and the other in which the active trinocular vision helps the passive one. The first part matches the laser patterns in stereo laser images with the help of intensity images; the second part utilizes an information fusion technique using the dynamic programming method in which image regions between laser patterns are matched pixel-by-pixel with help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms can work in real applications, the sensor system is implemented on a robotic system, and the proposed algorithms are applied. A series of experimental tests is performed for a variety of configurations of robot and environments. The performance of the sensor system is discussed in detail.

  11. Performance of PHOTONIS' low light level CMOS imaging sensor for long range observation

    NASA Astrophysics Data System (ADS)

    Bourree, Loig E.

    2014-05-01

    Identification of potential threats in low-light conditions through imaging is commonly achieved with closed-circuit television (CCTV) and surveillance cameras by combining the extended near-infrared (NIR) response (800-1000 nm wavelengths) of the imaging sensor with NIR LED or laser illuminators. Consequently, camera systems used for long-range observation often require high-power lasers in order to generate sufficient photons on targets to acquire detailed images at night. While such systems may adequately identify targets at long range, the NIR illumination needed to achieve this functionality can easily be detected and therefore may not be suitable for covert applications. In order to reduce dependency on supplemental illumination in low-light conditions, the frame rate of the imaging sensor may be reduced to increase the photon integration time and thus improve the signal-to-noise ratio of the image. However, this can hinder the camera's ability to image moving objects with high fidelity. To address these drawbacks, PHOTONIS has developed a CMOS imaging sensor (CIS) with a pixel architecture and geometry designed specifically for low-light-level imaging. By combining this CIS with field programmable gate array (FPGA)-based image processing electronics, PHOTONIS has achieved low-read-noise imaging with enhanced signal-to-noise ratio at quarter-moon illumination, all at standard video frame rates. The performance of this CIS is discussed herein and compared to other commercially available CMOS and CCD sensors for long-range observation applications.

  12. Fusion of radar and ultrasound sensors for concealed weapons detection

    NASA Astrophysics Data System (ADS)

    Felber, Franklin S.; Davis, Herbert T., III; Mallon, Charles E.; Wild, Norbert C.

    1996-06-01

    An integrated radar and ultrasound sensor, capable of remotely detecting and imaging concealed weapons, is being developed. A modified frequency-agile, mine-detection radar is intended to specify with high probability of detection at ranges of 1 to 10 m which individuals in a moving crowd may be concealing metallic or nonmetallic weapons. Within about 1 to 5 m, the active ultrasound sensor is intended to enable a user to identify a concealed weapon on a moving person with low false-detection rate, achieved through a real-time centimeter-resolution image of the weapon. The goal for sensor fusion is to have the radar acquire concealed weapons at long ranges and seamlessly hand over tracking data to the ultrasound sensor for high-resolution imaging on a video monitor. We have demonstrated centimeter-resolution ultrasound images of metallic and non-metallic weapons concealed on a human at ranges over 1 m. Processing of the ultrasound images includes filters for noise, frequency, brightness, and contrast. A frequency-agile radar has been developed by JAYCOR under the U.S. Army Advanced Mine Detection Radar Program. The signature of an armed person, detected by this radar, differs appreciably from that of the same person unarmed.

  13. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process.

    PubMed

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-12

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low-noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. Introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with a low-noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7", 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.
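
    The dual-gain merge can be sketched in a few lines; the gain ratio and saturation level below are assumed calibration values, not the paper's.

    ```python
    import numpy as np

    def merge_dual_gain(high_gain, low_gain, gain_ratio, sat_level):
        # Single-exposure merge: trust the high-gain sample where it is below
        # saturation, otherwise substitute the low-gain sample scaled by the
        # calibrated gain ratio, yielding one linearized extended-range signal.
        hg = np.asarray(high_gain, np.float64)
        lg = np.asarray(low_gain, np.float64)
        return np.where(hg < sat_level, hg, lg * gain_ratio)

    # 12-bit samples with a 16x ratio between the two gains:
    print(merge_dual_gain([800, 4095], [50, 3000], gain_ratio=16.0, sat_level=4000))
    ```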

  14. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process †

    PubMed Central

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-01

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low-noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. Introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with a low-noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7”, 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach. PMID:29329210

  15. Method and apparatus of high dynamic range image sensor with individual pixel reset

    NASA Technical Reports Server (NTRS)

    Yadid-Pecht, Orly (Inventor); Pain, Bedabrata (Inventor); Fossum, Eric R. (Inventor)

    2001-01-01

    A wide dynamic range image sensor provides individual pixel reset to vary the integration time of individual pixels. The integration time of each pixel is controlled by column and row reset control signals which activate a logical reset transistor only when both signals coincide for a given pixel.
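
    A sketch of how per-pixel integration times might be chosen from a prior frame; the discrete option set and the flux estimate are illustrative assumptions, not the patent's circuit-level mechanism.

    ```python
    import numpy as np

    def pick_integration_times(prev_frame_e, t_options_s, sat_e):
        # Per-pixel choice of integration time (the role of the coinciding
        # row/column reset signals): from a non-saturated previous frame
        # taken at the shortest time, estimate each pixel's flux and keep
        # the longest option that stays below saturation.
        flux = np.asarray(prev_frame_e, np.float64) / t_options_s[0]
        idx = np.zeros(flux.shape, dtype=int)
        for k, t in enumerate(t_options_s):   # t_options_s sorted ascending
            idx = np.where(flux * t < sat_e, k, idx)
        return idx  # per-pixel index into t_options_s

    print(pick_integration_times([[100, 9000]], [1e-3, 4e-3, 16e-3], sat_e=10000))
    ```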

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorum, O.H.; Hoover, A.; Jones, J.P.

    This paper addresses some issues in the development of sensor-based systems for mobile robot navigation which use range imaging sensors as the primary source of geometric information about the environment. In particular, we describe a model of scanning laser range cameras which takes into account the properties of the mechanical system responsible for image formation, and a calibration procedure which yields improved accuracy over previous models. In addition, we describe an algorithm which takes the limitations of these sensors into account in path planning and path execution. In particular, range imaging sensors are characterized by a limited field of view and a standoff distance -- a minimum distance nearer than which surfaces cannot be sensed. These limitations can be addressed by enriching the concept of configuration space to include information about what can be sensed from a given configuration, and using this information to guide path planning and path following.

  17. Automatic panoramic thermal integrated sensor

    NASA Astrophysics Data System (ADS)

    Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.

    2005-05-01

    Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperView™ high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defensive and offensive operations, as well as to serve as a sensor node in tactical Intelligence, Surveillance, Reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640×480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.

  18. Range and egomotion estimation from compound photodetector arrays with parallel optical axis using optical flow techniques.

    PubMed

    Chahl, J S

    2014-01-20

    This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit some complementary characteristics with respect to conventional perspective projection or angular projection imaging devices. Conventional imaging devices measure rotational egomotion directly by measuring the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices because the induced image motion depends on the unknown range of the viewed object. On the other hand, a known translational motion generates image velocities which can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. An approximate parallel projection can be used to measure translational egomotion in terms of the velocity of the image. On the other hand, a known rotational motion of the paraxial sensor array generates image velocities, which can be used to recover the 3D structure of the environment. Results of tests of an experimental array confirm these properties.
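
    The complementary behavior under parallel projection can be sketched directly; the small-angle model and values below are our simplification of the paper's geometry.

    ```python
    import numpy as np

    def translation_from_flow(flow_m_s):
        # Under approximate parallel projection, a translation maps directly
        # to metric image velocity, independent of object range: average the
        # per-sensor flows to estimate egomotion.
        return float(np.mean(flow_m_s))

    def ranges_from_rotation(flow_m_s, omega_rad_s):
        # Conversely, a known rotation omega induces image velocity that
        # grows with range (~ omega * Z for small angles), so Z = flow / omega.
        return np.asarray(flow_m_s, np.float64) / omega_rad_s

    print(ranges_from_rotation([0.5, 1.0, 2.5], omega_rad_s=0.25))  # 2, 4, 10 m
    ```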

  19. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics.

    PubMed

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao

    2016-11-25

    For many practical applications of image sensors, how to extend the depth-of-field (DoF) is an important research topic; if successfully implemented, it could be beneficial in various applications, from photography to biometrics. In this work, we want to examine the feasibility and practicability of a well-known "extended DoF" (EDoF) technique, or "wavefront coding," by building real-time long-range iris recognition and performing large-scale iris recognition. The key to the success of long-range iris recognition includes long DoF and image quality invariance toward various object distance, which is strict and harsh enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian people as the database, 400-mm focal length and F/6.3 optics over 3 m working distance, our results prove that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, which are based on 3328 iris images in total, the EDoF factor can achieve a result 3.71 times better than the original system without a loss of recognition accuracy.

  20. Hyperspectral CMOS imager

    NASA Astrophysics Data System (ADS)

    Jerram, P. A.; Fryer, M.; Pratlong, J.; Pike, A.; Walker, A.; Dierickx, B.; Dupont, B.; Defernez, A.

    2017-11-01

    CCDs have been used for many years for hyperspectral imaging missions and have been extremely successful. These include the Medium Resolution Imaging Spectrometer (MERIS) [1] on Envisat, the Compact High Resolution Imaging Spectrometer (CHRIS) on Proba, and the Ozone Monitoring Instrument operating in the UV spectral region. ESA are also planning a number of further missions that are likely to use CCD technology (Sentinel 3, 4 and 5). However, CMOS sensors have a number of advantages which mean that they will probably be used for hyperspectral applications in the longer term. There are two main advantages of CMOS sensors. First, a hyperspectral image consists of spectral lines with a large difference in intensity; in a frame-transfer CCD the faint spectral lines have to be transferred through the part of the imager illuminated by intense lines. This can lead to cross-talk, and whilst the problem can be reduced by the use of split frame transfer and faster line rates, CMOS sensors do not require a frame transfer and hence inherently do not suffer from it. Second, with a CMOS sensor the intense spectral lines can be read multiple times within a frame to give a significant increase in dynamic range. We describe the design and initial tests of a CMOS sensor for use in hyperspectral applications. This device has been designed to give as high a dynamic range as possible with minimum cross-talk. The sensor has been manufactured on high-resistivity epitaxial silicon wafers and is back-thinned but left relatively thick in order to obtain the maximum quantum efficiency across the entire spectral range.

  1. Characterization of photocathode dark current vs. temperature in image intensifier tube modules and intensified televisions

    NASA Astrophysics Data System (ADS)

    Bender, Edward J.; Wood, Michael V.; Hart, Steve; Heim, Gerald B.; Torgerson, John A.

    2004-10-01

    Image intensifiers (I2) have gained wide acceptance throughout the Army as the premier nighttime mobility sensor for the individual soldier, with over 200,000 fielded systems. There is increasing need, however, for such a sensor with a video output, so that it can be utilized in remote vehicle platforms and/or electronically fused with other sensors. The image-intensified television (I2TV), typically consisting of an image intensifier tube coupled via fiber optic to a solid-state imaging array, has been the primary solution to this need. I2TV platforms in vehicles, however, can generate high internal heat loads and must operate in high-temperature environments. Intensifier tube dark current, called "Equivalent Background Input" or "EBI," is not a significant factor at room temperature, but can seriously degrade image contrast and intra-scene dynamic range at such high temperatures. Cooling of the intensifier's photocathode is the only practical solution to this problem. The US Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate (NVESD) and Ball Aerospace have collaborated in the reported effort to more rigorously characterize intensifier EBI versus temperature. NVESD performed non-imaging EBI measurements of Generation 2 and 3 tube modules over a large range of ambient temperatures, while Ball performed an imaging evaluation of Generation 3 I2TVs over a similar temperature range. The findings and conclusions of this effort are presented.
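
    The abstract reports that EBI rises steeply with temperature. Photocathode dark current is commonly modeled as growing exponentially, doubling every few degrees; a minimal extrapolation sketch follows, where the 6 °C doubling interval is an illustrative assumption, not a value from the reported measurements.

        def ebi_at_temperature(ebi_ref, t_ref_c, t_c, doubling_deg_c=6.0):
            # Common exponential (doubling) model for photocathode dark
            # current; the doubling interval is an assumed placeholder.
            return ebi_ref * 2.0 ** ((t_c - t_ref_c) / doubling_deg_c)

        # Example: EBI measured at 20 degC, extrapolated to 50 degC -> 32x.
        print(ebi_at_temperature(1.0, 20.0, 50.0))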

  2. Testing and evaluation of tactical electro-optical sensors

    NASA Astrophysics Data System (ADS)

    Middlebrook, Christopher T.; Smith, John G.

    2002-07-01

    As integrated electro-optical sensor payloads (multi-sensors) comprised of infrared imagers, visible imagers, and lasers advance in performance, the tests and testing methods must also advance in order to fully evaluate them. Future operational requirements will require integrated sensor payloads to perform missions at longer ranges and with increased targeting accuracy. To meet these requirements, sensors will need advanced imaging algorithms, advanced tracking capability, high-powered lasers, and high-resolution imagers. To meet the U.S. Navy's testing requirements for such multi-sensors, the test and evaluation group in the Night Vision and Chemical Biological Warfare Department at NAVSEA Crane is developing automated testing methods and improved tests to evaluate imaging algorithms, and is procuring advanced testing hardware to measure high-resolution imagers and the line-of-sight stabilization of targeting systems. This paper describes the multi-sensor payloads tested, the testing methods used and under development, and the different types of testing hardware and specific payload tests being developed and used at NAVSEA Crane.

  3. A 4MP high-dynamic-range, low-noise CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Ma, Cheng; Liu, Yang; Li, Jing; Zhou, Quan; Chang, Yuchun; Wang, Xinyang

    2015-03-01

    In this paper we present a 4-Megapixel high-dynamic-range, low-dark-noise and low-dark-current CMOS image sensor, which is ideal for high-end scientific and surveillance applications. The pixel design is based on a 4-T PPD structure. During readout of the pixel array, signals are first amplified and then fed to a low-power column-parallel ADC array presented previously in [1]. Measurement results show that the sensor achieves a dynamic range of 96 dB and a dark noise of 1.47 e- at 24 fps. The dark current is 0.15 e-/pixel/s at -20 °C.
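
    The reported figures are mutually consistent under the usual definition of single-exposure dynamic range, DR = 20*log10(full well / dark noise); a quick check follows (the ~9.3e4 e- full well is inferred here, not stated in the abstract):

        import math

        def dynamic_range_db(full_well_e, read_noise_e):
            # Single-exposure dynamic range in dB.
            return 20.0 * math.log10(full_well_e / read_noise_e)

        # With 1.47 e- dark noise, 96 dB implies a full well of roughly
        # 1.47 * 10**(96/20) ~ 9.3e4 e- (back-of-envelope inference).
        print(dynamic_range_db(9.3e4, 1.47))   # ~96 dB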

  4. Photometric Calibration and Image Stitching for a Large Field of View Multi-Camera System

    PubMed Central

    Lu, Yu; Wang, Keyi; Fan, Gongshu

    2016-01-01

    A new compact large field of view (FOV) multi-camera system is introduced. The camera is based on seven tiny complementary metal-oxide-semiconductor sensor modules covering over a 160° × 160° FOV. Although image stitching has been studied extensively, sensor and lens differences have not been considered in previous multi-camera devices. In this study, we have calibrated the photometric characteristics of the multi-camera device. Lenses were not mounted on the sensors during radiometric response calibration, to eliminate the influence of the focusing effect on the uniform light from an integrating sphere. The linearity range of the radiometric response, non-linearity response characteristics, sensitivity, and dark current of the camera response function are presented. The R, G, and B channels have different responses for the same illuminance. Vignetting artifact patterns have been tested. The actual luminance of the object is retrieved from the sensor calibration results and is used to blend images, so that the panoramas reflect the objective luminance more faithfully. This compensates for the limitation of stitching methods that produce realistic-looking images only through smoothing. The dynamic range limitation can be resolved by using multiple cameras that cover a large field of view instead of a single image sensor with a wide-angle lens; the dynamic range is expanded 48-fold in this system. We can obtain seven images in one shot with this multi-camera system, at 13 frames per second. PMID:27077857

  5. Generalised optical differentiation wavefront sensor: a sensitive high dynamic range wavefront sensor.

    PubMed

    Haffert, S Y

    2016-08-22

    Current wavefront sensors for high resolution imaging have either a large dynamic range or a high sensitivity. A new kind of wavefront sensor is developed which can have both: the Generalised Optical Differentiation wavefront sensor. This new wavefront sensor is based on the principles of optical differentiation by amplitude filters. We have extended the theory behind linear optical differentiation and generalised it to nonlinear filters. We used numerical simulations and laboratory experiments to investigate the properties of the generalised wavefront sensor. With this we created a new filter that can decouple the dynamic range from the sensitivity. These properties make it suitable for adaptive optic systems where a large range of phase aberrations have to be measured with high precision.

  6. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics

    PubMed Central

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao

    2016-01-01

    For many practical applications of image sensors, extending the depth-of-field (DoF) is an important research topic; if successfully implemented, it could benefit applications ranging from photography to biometrics. In this work, we examine the feasibility and practicality of a well-known “extended DoF” (EDoF) technique, or “wavefront coding,” by building a real-time long-range iris recognition system and performing large-scale iris recognition. The keys to successful long-range iris recognition are a long DoF and image quality invariance across object distances, requirements strict enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian subjects as the database, 400-mm focal length, and F/6.3 optics over a 3 m working distance, our results show that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and eliminate the need for Wiener-based restoration. In our experiments, based on 3328 iris images in total, the EDoF factor reaches 3.71 times that of the original system without a loss of recognition accuracy. PMID:27897976

  7. Active Sensor for Microwave Tissue Imaging with Bias-Switched Arrays.

    PubMed

    Foroutan, Farzad; Nikolova, Natalia K

    2018-05-06

    A prototype of a bias-switched active sensor was developed and measured to establish the achievable dynamic range in a new generation of active arrays for microwave tissue imaging. The sensor integrates a printed slot antenna, a low-noise amplifier (LNA) and an active mixer in a single unit, which is sufficiently small to enable an inter-sensor separation as small as 12 mm. The sensor's input covers the bandwidth from 3 GHz to 7.5 GHz. Its output intermediate frequency (IF) is 30 MHz. The sensor is controlled by a simple bias-switching circuit, which switches the bias of the LNA and the mixer ON and OFF simultaneously. It was demonstrated experimentally that the dynamic range of the sensor, as determined by its ON and OFF states, is 109 dB and 118 dB at resolution bandwidths of 1 kHz and 100 Hz, respectively.

  8. Spaceborne imaging radar research in the 90's

    NASA Technical Reports Server (NTRS)

    Elachi, Charles

    1986-01-01

    The imaging radar experiments on SEASAT and on the space shuttle (SIR-A and SIR-B) have led to wide interest in the use of spaceborne imaging radars in Earth and planetary sciences. The radar sensors provide unique and complementary information to that acquired with visible and infrared imagers. This includes subsurface imaging in arid regions, all-weather observation of ocean surface dynamic phenomena, structural mapping, soil moisture mapping, stereo imaging and the resulting topographic mapping. However, experiments up to now have exploited only a very limited range of the generic capability of radar sensors. With planned sensor developments in the late 80's and early 90's, a quantum jump will be made in our ability to fully exploit the potential of these sensors. These developments include: multiparameter research sensors such as SIR-C and X-SAR, long-term and global monitoring sensors such as ERS-1, JERS-1, EOS, Radarsat, GLORI and the spaceborne sounder, planetary mapping sensors such as the Magellan and Cassini/Titan mappers, topographic three-dimensional imagers such as the scanning radar altimeter, and three-dimensional rain mapping. These sensors and their associated research are briefly described.

  9. High-speed uncooled MWIR hostile fire indication sensor

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Pantuso, F. P.; Jin, G.; Mazurenko, A.; Erdtmann, M.; Radhakrishnan, S.; Salerno, J.

    2011-06-01

    Hostile fire indication (HFI) systems require high-resolution sensor operation at extremely high speeds to capture hostile fire events, including rocket-propelled grenades, anti-aircraft artillery, heavy machine guns, anti-tank guided missiles and small arms. HFI must also be conducted in a waveband with a large available signal and low background clutter, in particular the mid-wavelength infrared (MWIR). The shortcoming of current HFI sensors in the MWIR is that the sensor bandwidth is not sufficient to achieve the required frame rate at high sensor resolution. Furthermore, current HFI sensors require cryogenic cooling that contributes to size, weight, and power (SWAP) in aircraft-mounted applications where these factors are at a premium. Based on its uncooled photomechanical infrared imaging technology, Agiltron has developed a low-SWAP, high-speed MWIR HFI sensor that breaks the bandwidth bottleneck typical of current infrared sensors. This is made possible by using a commercial-off-the-shelf, high-performance visible imager as the readout integrated circuit and physically separating this visible imager from the MWIR-optimized photomechanical sensor chip. With this approach, we have achieved high-resolution operation of our MWIR HFI sensor at 1000 fps, which is unprecedented for an uncooled infrared sensor. We have field tested our MWIR HFI sensor against all the hostile fire events mentioned above at several test ranges under a wide range of environmental conditions. The field testing results will be presented.

  10. EOID Evaluation and Automated Target Recognition

    DTIC Science & Technology

    2002-09-30

    Electro-Optic IDentification (EOID) sensors into shallow water littoral zone minehunting systems on towed, remotely operated, and autonomous platforms. These downlooking laser-based sensors operate at unparalleled standoff ranges in visible wavelengths to image and identify mine-like objects (MLOs) that have been detected through other sensing means such as magnetic induction and various modes of acoustic imaging. Our long term goal is to provide a robust automated target cueing and identification capability for use with these imaging sensors. It is also our goal to assist

  11. EOID Evaluation and Automated Target Recognition

    DTIC Science & Technology

    2001-09-30

    Electro-Optic IDentification (EOID) sensors into shallow water littoral zone minehunting systems on towed, remotely operated, and autonomous platforms. These downlooking laser-based sensors operate at unparalleled standoff ranges in visible wavelengths to image and identify mine-like objects that have been detected through other sensing means such as magnetic induction and various modes of acoustic imaging. Our long term goal is to provide a robust automated target cueing and identification capability for use with these imaging sensors. It is also our goal to assist the

  12. Adaptive time-sequential binary sensing for high dynamic range imaging

    NASA Astrophysics Data System (ADS)

    Hu, Chenhui; Lu, Yue M.

    2012-06-01

    We present a novel image sensor for high dynamic range imaging. The sensor performs an adaptive one-bit quantization at each pixel, with the pixel output switched from 0 to 1 only if the number of photons reaching that pixel is greater than or equal to a quantization threshold. With oracle knowledge of the incident light intensity, one can pick an optimal threshold (for that light intensity), and the Fisher information contained in the output sequence then follows closely that of an ideal unquantized sensor over a wide range of intensity values. This observation suggests the potential gains one may achieve by adaptively updating the quantization thresholds. As the main contribution of this work, we propose a time-sequential threshold-updating rule that asymptotically approaches the performance of the oracle scheme. With every threshold mapped to a number of ordered states, the dynamics of the proposed scheme can be modeled as a parametric Markov chain. We show that the frequencies of different thresholds converge to a steady-state distribution that is concentrated around the optimal choice. Moreover, numerical experiments show that the theoretical performance measures (Fisher information and Cramér-Rao bounds) can be achieved by a maximum likelihood estimator, which is guaranteed to find the globally optimal solution due to the concavity of the log-likelihood functions. Compared with conventional image sensors and the constant single-photon threshold strategy considered in previous work, the proposed scheme attains orders-of-magnitude improvement in sensor dynamic range.
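
    A toy caricature of the time-sequential idea (not the paper's exact update rule): each frame the pixel emits 1 if the Poisson photon count reaches the current threshold, and the threshold then steps up on a 1 and down on a 0 through a set of ordered states, so the empirical state distribution concentrates near the optimal threshold.

        import numpy as np

        rng = np.random.default_rng(0)

        def adaptive_binary_sense(intensity, thresholds, n_frames=2000):
            # One-bit pixel with a threshold that random-walks over ordered
            # states: up after a 1, down after a 0 (illustrative rule only).
            idx = 0
            visits = np.zeros(len(thresholds), dtype=int)
            for _ in range(n_frames):
                bit = rng.poisson(intensity) >= thresholds[idx]
                visits[idx] += 1
                idx = min(idx + 1, len(thresholds) - 1) if bit else max(idx - 1, 0)
            return visits / n_frames   # empirical steady-state distribution

        thresholds = np.array([1, 2, 4, 8, 16, 32])
        print(adaptive_binary_sense(intensity=6.0, thresholds=thresholds))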

  13. The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.

    2017-02-01

    Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits, including increasingly high pixel counts and shrinking pixel sizes; nevertheless, they are hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full-well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and in some cases, imager response time. The recently invented Coded Access Optical Sensor (CAOS) camera platform works in unison with current PDA technology to counter the fundamental limitations of PDA-based imagers while providing sufficiently high imaging spatial resolution and pixel counts. Engineering the CAOS camera platform with, for example, the Texas Instruments (TI) Digital Micromirror Device (DMD) ushers in a paradigm change in advanced imager design, particularly for extreme dynamic range applications.

  14. Real-time image processing of TOF range images using a reconfigurable processor system

    NASA Astrophysics Data System (ADS)

    Hussmann, S.; Knoll, F.; Edeler, T.

    2011-07-01

    During the last years, Time-of-Flight (TOF) sensors have had a significant impact on research fields in machine vision. In comparison to stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, with those of camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
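
    For reference, the standard 4-phase-shift demodulation that the arctangent serves: with four samples A0..A3 taken a quarter modulation period apart, the phase is atan2(A3 - A1, A0 - A2) and range follows from the modulation frequency. A minimal sketch; the sample-order convention varies between sensors.

        import math

        C = 299_792_458.0   # speed of light, m/s

        def tof_range(a0, a1, a2, a3, f_mod_hz):
            # phi = atan2(a3 - a1, a0 - a2) is one common convention;
            # check the sensor's documentation for the actual ordering.
            phi = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
            return C * phi / (4 * math.pi * f_mod_hz)

        # Samples consistent with a quarter-cycle phase shift at 20 MHz
        # (unambiguous range c/(2f) = 7.49 m): phi = pi/2 -> ~1.87 m.
        print(tof_range(100, 50, 100, 150, 20e6))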

  15. A CMOS image sensor with programmable pixel-level analog processing.

    PubMed

    Massari, Nicola; Gottardi, Massimo; Gonzo, Lorenzo; Stoppa, David; Simoni, Andrea

    2005-11-01

    A prototype of a 34 × 34 pixel image sensor implementing real-time analog image processing is presented. Edge detection, motion detection, image amplification, and dynamic-range boosting are executed at pixel level by means of a highly interconnected pixel architecture based on the absolute value of the difference among neighboring pixels. The analog operations are performed over a kernel of 3 × 3 pixels. The square pixel, consisting of 30 transistors, has a pitch of 35 µm with a fill factor of 20%. The chip was fabricated in a 0.35 µm CMOS technology, and its power consumption is 6 mW with a 3.3 V power supply. The device was fully characterized and achieves a dynamic range of 50 dB with a light power density of 150 nW/mm² and a frame rate of 30 frames/s. The measured fixed-pattern noise corresponds to 1.1% of the saturation level. The sensor's dynamic range can be extended up to 96 dB using the double-sampling technique.

  16. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    PubMed

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.

  17. Compact LWIR sensors using spatial interferometric technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bingham, Adam L.; Lucey, Paul G.; Knobbe, Edward T.

    2017-05-01

    Recent developments in reducing the cost and mass of hyperspectral sensors have enabled more widespread use for short-range compositional imaging applications. HSI in the long-wave infrared (LWIR) is of interest because it is sensitive to spectral phenomena not accessible at other wavelengths, and because of its inherent thermal imaging capability. At Spectrum Photonics we have pursued compact LWIR hyperspectral sensors using both microbolometer arrays and compact cryogenic detector cameras. Our microbolometer-based systems are principally aimed at short standoff applications; they currently weigh 10-15 lbs, measure approximately 20 × 20 × 10 cm, offer sensitivity in the 1-2 microflick range, and have imaging times on the order of 30 seconds. Our systems that employ cryogenic arrays are aimed at medium standoff ranges such as nadir-looking missions from UAVs. Recent work with cooled sensors has focused on Strained Layer Superlattice (SLS) technology, as these detector arrays are undergoing rapid improvements and have some advantages over HgCdTe detectors in terms of calibration stability. These sensors include full on-board processing and sensor stabilization, so they are somewhat larger than the microbolometer systems, but could be adapted to much more compact form factors. We will review our recent progress in both of these application areas.

  18. Simulation and ground testing with the Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Johnston, Albert S.; Bryan, Thomas C.; Book, Michael L.

    2005-01-01

    The Advanced Video Guidance Sensor (AVGS), an active sensor system that provides near-range 6-degree-of-freedom sensor data, has been developed as part of an automatic rendezvous and docking system for the Demonstration of Autonomous Rendezvous Technology (DART). The sensor determines the relative positions and attitudes between the active sensor and the passive target at ranges up to 300 meters. The AVGS uses laser diodes to illuminate retro-reflectors in the target, a solid-state imager to detect the light returned from the target, and image capture electronics and a digital signal processor to convert the video information into the relative positions and attitudes. The development of the sensor, through initial prototypes, final prototypes, and three flight units, has required a great deal of testing at every phase, and the different types of testing, their effectiveness, and their results, are presented in this paper, focusing on the testing of the flight units. Testing has improved the sensor's performance.

  19. Gyrocopter-Based Remote Sensing Platform

    NASA Astrophysics Data System (ADS)

    Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.

    2015-04-01

    In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications, utilizing a gyrocopter as a carrier platform, is described. The current sensor configuration consists of a high-resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover, a custom-developed thermal imaging system composed of a VIS-PAN camera and a LWIR camera is used for aerial recordings in the thermal infrared range. Furthermore, another custom-developed, highly flexible imaging system for high-resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities, and the precalculated requirements with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched into mosaics.

  20. Method of orthogonally splitting imaging pose measurement

    NASA Astrophysics Data System (ADS)

    Zhao, Na; Sun, Changku; Wang, Peng; Yang, Qian; Liu, Xintong

    2018-01-01

    To meet aviation's and machinery manufacturing's need for pose measurement with high precision, high speed and wide measurement range, and to resolve the contradiction between measurement range and resolution of a vision sensor, this paper proposes an orthogonally splitting imaging pose measurement method. This paper designs and realizes an orthogonally splitting imaging vision sensor and establishes a pose measurement system. The vision sensor consists of one imaging lens, a beam splitter prism, cylindrical lenses and dual linear CCDs. The dual linear CCDs each acquire one-dimensional image coordinate data of the target point, and the two data streams restore the two-dimensional image coordinates of the target point. According to the characteristics of the imaging system, this paper establishes a nonlinear distortion model to correct distortion. Based on cross-ratio invariance, a polynomial equation is established and solved by the least-squares fitting method. After completing distortion correction, this paper establishes the measurement mathematical model of the vision sensor and determines the intrinsic parameters for calibration. An array of feature points for calibration is built by placing a planar target at several different positions. An iterative optimization method is presented to solve the parameters of the model. The experimental results show that the field angle is 52°, the focal distance is 27.40 mm, the image resolution is 5185×5117 pixels, the displacement measurement error is less than 0.1 mm, and the rotation angle measurement error is less than 0.15°. The method of orthogonally splitting imaging pose measurement can satisfy pose measurement requirements of high precision, fast speed and wide measurement range.
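
    The abstract's distortion correction boils down to a least-squares polynomial fit from measured to ideal coordinates. The sketch below fits a generic 1-D cubic corrector on synthetic data; it stands in for, and is much simpler than, the paper's cross-ratio-based construction.

        import numpy as np

        ideal = np.linspace(-10, 10, 21)              # ideal 1-D coordinates (mm)
        measured = ideal + 0.002 * ideal**3 + 0.01    # synthetic distortion

        coeff = np.polyfit(measured, ideal, deg=3)    # least-squares corrector
        corrected = np.polyval(coeff, measured)

        print(np.max(np.abs(corrected - ideal)))      # residual after correction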

  1. Scintillator high-gain avalanche rushing photoconductor active-matrix flat panel imager: Zero-spatial frequency x-ray imaging properties of the solid-state SHARP sensor structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wronski, M.; Zhao, W.; Tanioka, K.

    Purpose: The authors are investigating the feasibility of a new type of solid-state x-ray imaging sensor with programmable avalanche gain: the scintillator high-gain avalanche rushing photoconductor active matrix flat panel imager (SHARP-AMFPI). The purpose of the present work is to investigate the inherent x-ray detection properties of SHARP and demonstrate its wide dynamic range through programmable gain. Methods: A distributed resistive layer (DRL) was developed to maintain stable avalanche gain operation in a solid-state HARP. The signal and noise properties of the HARP-DRL for optical photon detection were investigated as a function of avalanche gain, both theoretically and experimentally, and the results were compared with a HARP tube (with electron beam readout) used in previous investigations of the zero-spatial-frequency performance of SHARP. For this new investigation, a solid-state SHARP x-ray image sensor was formed by direct optical coupling of the HARP-DRL with a structured cesium iodide (CsI) scintillator. The x-ray sensitivity of this sensor was measured as a function of avalanche gain and the results were compared with the sensitivity of the HARP-DRL measured optically. The dynamic range of the HARP-DRL with variable avalanche gain was investigated for the entire exposure range encountered in radiography/fluoroscopy (R/F) applications. Results: The signal from the HARP-DRL as a function of electric field showed stable avalanche gain, and the noise associated with the avalanche process agrees well with theory and previous measurements from a HARP tube. This result indicates that when coupled with CsI for x-ray detection, the additional noise associated with avalanche gain in the HARP-DRL is negligible. The x-ray sensitivity measurements using the SHARP sensor produced an identical avalanche gain dependence on electric field as the optical measurements with the HARP-DRL. Adjusting the avalanche multiplication gain in the HARP-DRL enabled a very wide dynamic range which encompassed all clinically relevant medical x-ray exposures. Conclusions: This work demonstrates that the HARP-DRL sensor enables the practical implementation of a SHARP solid-state x-ray sensor capable of quantum-noise-limited operation throughout the entire range of clinically relevant x-ray exposures. This is an important step toward the realization of a SHARP-AMFPI x-ray flat-panel imager.

  2. Estimation of Image Sensor Fill Factor Using a Single Arbitrary Image

    PubMed Central

    Wen, Wei; Khatibi, Siamak

    2017-01-01

    Achieving a high fill factor is a bottleneck problem for capturing high-quality images. There are hardware and software solutions to this problem, but they assume the fill factor is known. However, the fill factor is kept as an industrial secret by most image sensor manufacturers due to its direct effect on the assessment of sensor quality. In this paper, we propose a method to estimate the fill factor of a camera sensor from an arbitrary single image. The virtual response function of the imaging process and the sensor irradiance are estimated from the generation of virtual images. Then the global intensity values of the virtual images are obtained, which are the result of fusing the virtual images into a single, high dynamic range radiance map. A non-linear function is inferred from the original and global intensity values of the virtual images. The fill factor is estimated by the conditional minimum of the inferred function. The method is verified using images from two datasets. The results show that our method estimates the fill factor correctly, with significant stability and accuracy from one single arbitrary image, as indicated by the low standard deviation of the estimated fill factors across the images for each camera. PMID:28335459

  3. ManPortable and UGV LIVAR: advances in sensor suite integration bring improvements to target observation and identification for the electronic battlefield

    NASA Astrophysics Data System (ADS)

    Lynam, Jeff R.

    2001-09-01

    A more highly integrated electro-optical sensor suite using Laser Illuminated Viewing and Ranging (LIVAR) techniques is being developed under the Army Advanced Concept Technology-II (ACT-II) program for enhanced man-portable target surveillance and identification. The ManPortable LIVAR system currently in development employs a wide array of sensor technologies that provide the foot-bound soldier and UGV significant advantages in lightweight, fieldable target location, ranging and imaging. The unit incorporates a wide field-of-view (5° × 3°) uncooled LWIR passive sensor for primary target location. Laser range finding and active illumination are performed with a triggered, flash-lamp-pumped, eyesafe micro-laser operating in the 1.5 micron region, used in conjunction with a range-gated, electron-bombarded CCD digital camera to image the target in a narrower 0.3° field of view. Target range is acquired using the integrated LRF, and a target position is calculated using data from other onboard devices providing GPS coordinates, tilt, bank and corrected magnetic azimuth. Range-gate timing and coordinated receiver optics focus control allow target imaging operations to be optimized. The onboard control electronics provide power-efficient system operation for extended field use from the internal, rechargeable battery packs. Image data storage, transmission, and processing capabilities are also being incorporated to provide the best all-around support for the electronic battlefield in this type of system. The paper will describe flash laser illumination technology, EBCCD camera technology with the flash laser detection system, and image resolution improvement through frame averaging.

  4. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    NASA Astrophysics Data System (ADS)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  5. Highly Concentrated Seed-Mediated Synthesis of Monodispersed Gold Nanorods (Postprint)

    DTIC Science & Technology

    2017-07-17

    imaging, therapeutics and sensors, to large area coatings, filters, and optical attenuators. Development of the latter technologies has been hindered by the lack of cost-effective...challenges the utilization of Au-NRs in a diverse array of technologies, ranging from therapeutics, imaging and sensors, to large area coatings, filters and

  6. Imaging through water turbulence with a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2016-09-01

    A plenoptic sensor can be used to improve the image formation process in a conventional camera. Through this process, the conventional image is mapped to an image array that represents the image's photon paths along different angular directions. It can therefore be used to resolve imaging problems where severe distortion occurs. In particular, for objects observed at moderate range (10 m to 200 m) through turbulent water, the image can be distorted beyond recognition, and correction algorithms need to be applied. In this paper, we show how to use a plenoptic sensor to recover an unknown object in the line of sight through significant water turbulence distortion. In general, our approach can be applied to both atmospheric turbulence and water turbulence conditions.

  7. Gradient-based interpolation method for division-of-focal-plane polarimeters.

    PubMed

    Gao, Shengkui; Gruev, Viktor

    2013-01-14

    Recent advancements in nanotechnology and nanofabrication have allowed for the emergence of the division-of-focal-plane (DoFP) polarization imaging sensors. These sensors capture polarization properties of the optical field at every imaging frame. However, the DoFP polarization imaging sensors suffer from large registration error as well as reduced spatial-resolution output. These drawbacks can be improved by applying proper image interpolation methods for the reconstruction of the polarization results. In this paper, we present a new gradient-based interpolation method for DoFP polarimeters. The performance of the proposed interpolation method is evaluated against several previously published interpolation methods by using visual examples and root mean square error (RMSE) comparison. We found that the proposed gradient-based interpolation method can achieve better visual results while maintaining a lower RMSE than other interpolation methods under various dynamic ranges of a scene ranging from dim to bright conditions.
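
    For context, once the four DoFP channels (0°, 45°, 90°, 135°) have been interpolated to a common grid, the polarization quantities follow directly. A minimal sketch, with the channel layout assumed and the interpolation step itself omitted:

        import numpy as np

        def stokes_from_channels(i0, i45, i90, i135):
            # Stokes parameters, degree and angle of linear polarization
            # from the four (already interpolated) polarizer channels.
            s0 = 0.5 * (i0 + i45 + i90 + i135)
            s1 = i0 - i90
            s2 = i45 - i135
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
            aop = 0.5 * np.arctan2(s2, s1)
            return s0, s1, s2, dolp, aop

        # Fully linearly polarized light at 0 deg: DoLP -> 1, AoP -> 0.
        print(stokes_from_channels(1.0, 0.5, 0.0, 0.5))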

  8. VisNAV 100: a robust, compact imaging sensor for enabling autonomous air-to-air refueling of aircraft and unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Katake, Anup; Choi, Heeyoul

    2010-01-01

    To enable autonomous air-to-air refueling of manned and unmanned vehicles, a robust, high-speed relative navigation sensor capable of providing high-accuracy 3DOF information in diverse operating conditions is required. To help address this problem, StarVision Technologies Inc. has been developing a compact, high-update-rate (100 Hz), wide field-of-view (90°) direction and range estimation imaging sensor called VisNAV 100. The sensor is fully autonomous, requiring no communication from the tanker aircraft, and contains high-reliability embedded avionics to provide range, azimuth and elevation (a 3-degree-of-freedom, 3DOF, solution) and closing speed relative to the tanker aircraft. The sensor provides 3DOF with an error of 1% in range and 0.1° in azimuth/elevation up to a range of 30 m, and a 1° error in direction for ranges up to 200 m, at 100 Hz update rates. In this paper we discuss the algorithms developed in-house to enable robust beacon pattern detection, outlier rejection and 3DOF estimation in adverse conditions, and present the results of several outdoor tests. Results from the long-range single-beacon detection tests are also discussed.

  9. Low noise WDR ROIC for InGaAs SWIR image sensor

    NASA Astrophysics Data System (ADS)

    Ni, Yang

    2017-11-01

    Hybridized image sensors are currently the only solution for image sensing beyond the spectral response of silicon devices. By hybridization, we can combine the best sensing material and photo-detector design with high-performance CMOS readout circuitry. In the infrared band, we typically face two configurations: high-background and low-background situations. The performance of high-background sensors is conditioned mainly by the integration capacity in each pixel, which is the case for mid-wave and long-wave infrared detectors. In low-background situations, the detector's performance is limited mainly by the pixel's noise performance, which is conditioned by dark signal and readout noise. Under reflection-based imaging conditions, the pixel's dynamic range is also an important parameter. This is the case for SWIR-band imaging. We are particularly interested in InGaAs-based SWIR image sensors.

  10. Phase aided 3D imaging and modeling: dedicated systems and case studies

    NASA Astrophysics Data System (ADS)

    Yin, Yongkai; He, Dong; Liu, Zeyi; Liu, Xiaoli; Peng, Xiang

    2014-05-01

    Dedicated prototype systems for 3D imaging and modeling (3DIM) are presented. The 3D imaging systems are based on the principle of phase-aided active stereo, and have been developed in our laboratory over the past few years. The reported 3D imaging prototypes range from a single 3D sensor to an optical measurement network composed of multiple 3D sensor nodes. For these 3D imaging systems, we briefly discuss the corresponding calibration techniques for both the single sensor and the multi-sensor optical measurement network, which give the 3DIM prototype systems good measurement accuracy and repeatability. Furthermore, two case studies, the generation of a high-quality color model of movable cultural heritage and a photo booth based on body scanning, are presented to demonstrate our approach.

  11. Sierra Madre Oriental in Coahuila, Mexico

    NASA Technical Reports Server (NTRS)

    2002-01-01

    This desolate landscape is part of the Sierra Madre Oriental mountain range, on the border between the Coahuila and Nuevo Leon states of Mexico. This image was acquired by Landsat 7's Enhanced Thematic Mapper Plus (ETM+) sensor on November 28, 1999. It is a false-color composite made using shortwave infrared, infrared, and green wavelengths, and has been sharpened using the sensor's panchromatic band. Image provided by the USGS EROS Data Center Satellite Systems Branch.

  12. High-resolution CCD imaging alternatives

    NASA Astrophysics Data System (ADS)

    Brown, D. L.; Acker, D. E.

    1992-08-01

    High resolution CCD color cameras have recently stimulated the interest of a large number of potential end-users for a wide range of practical applications. Real-time High Definition Television (HDTV) systems are now being used or considered for use in applications ranging from entertainment program origination through digital image storage to medical and scientific research. HDTV generation of electronic images offers significant cost and time savings over the use of film in such applications. Further, in still-image systems, electronic image capture is faster and more efficient than conventional image scanning: a CCD still camera can capture 3-dimensional objects into the computing environment directly, without having to shoot a picture on film, develop it, and then scan the image into a computer. 2. EXTENDING CCD TECHNOLOGY BEYOND BROADCAST: Most standard production CCD sensor chips are made for broadcast-compatible systems. One popular CCD, the basis for this discussion, offers an array of roughly 750 x 580 picture elements (pixels), a total of approximately 435,000 pixels (see Fig. 1). FOR-A has developed a technique to increase the number of available pixels for a given image compared to that produced by the standard CCD itself. Using an interlined CCD with an overall spatial structure several times larger than the photosensitive sensor areas, each of the CCD sensor elements is shifted in two dimensions in order to fill in the spatial gaps between adjacent sensors.

  13. Smart image sensors: an emerging key technology for advanced optical measurement and microsystems

    NASA Astrophysics Data System (ADS)

    Seitz, Peter

    1996-08-01

    Optical microsystems typically include photosensitive devices, analog preprocessing circuitry and digital signal processing electronics. Advances in semiconductor technology have made it possible today to integrate all photosensitive and electronic devices on one 'smart image sensor' or photo-ASIC (application-specific integrated circuit containing photosensitive elements). It is even possible to provide each 'smart pixel' with additional photoelectronic functionality without substantially compromising the fill factor. This technological capability is the basis for advanced cameras and optical microsystems showing novel on-chip functionality: single-chip cameras with on-chip analog-to-digital converters for less than $10 are advertised; image sensors have been developed with novel functionality such as real-time selectable pixel size and shape, the capability of performing arbitrary convolutions simultaneously with the exposure, as well as variable, programmable offset and sensitivity of the pixels, leading to image sensors with a dynamic range exceeding 150 dB. Smart image sensors have been demonstrated offering synchronous detection and demodulation capabilities in each pixel (lock-in CCD), and conventional image sensors have been combined with an on-chip digital processor for complete, single-chip image acquisition and processing systems. Technological problems of the monolithic integration of smart image sensors include offset non-uniformities, temperature variations of electronic properties, imperfect matching of circuit parameters, etc. These problems can often be overcome either by designing additional compensation circuitry or by providing digital correction routines. Where necessary for technological or economic reasons, smart image sensors can also be combined with, or realized as, hybrids, making use of commercially available electronic components. It is concluded that the possibilities offered by custom smart image sensors will influence the design and performance of future electronic imaging systems in many disciplines, ranging from optical metrology to machine vision on the factory floor and in robotics applications.

  14. Protection performance evaluation regarding imaging sensors hardened against laser dazzling

    NASA Astrophysics Data System (ADS)

    Ritt, Gunnar; Koerber, Michael; Forster, Daniel; Eberle, Bernd

    2015-05-01

    Electro-optical imaging sensors are widely distributed and used for many different purposes, including civil security and military operations. However, laser irradiation can easily disturb their operational capability. Thus, an adequate protection mechanism for electro-optical sensors against dazzling and damaging is highly desirable. Different protection technologies exist now, but none of them satisfies the operational requirements without any constraints. In order to evaluate the performance of various laser protection measures, we present two different approaches based on triangle orientation discrimination on the one hand and structural similarity on the other hand. For both approaches, image analysis algorithms are applied to images taken of a standard test scene with triangular test patterns which is superimposed by dazzling laser light of various irradiance levels. The evaluation methods are applied to three different sensors: a standard complementary metal oxide semiconductor camera, a high dynamic range camera with a nonlinear response curve, and a sensor hardened against laser dazzling.

  15. Fusion of spectral and panchromatic images using false color mapping and wavelet integrated approach

    NASA Astrophysics Data System (ADS)

    Zhao, Yongqiang; Pan, Quan; Zhang, Hongcai

    2006-01-01

    With the development of sensor technology, new image sensors have been introduced that provide a greater range of information to users. But owing to limited radiant power, there will always be a trade-off between spatial and spectral resolution in the images captured by a specific sensor. Images with high spatial resolution can locate objects with high accuracy, whereas images with high spectral resolution can be used to identify materials. Many applications in remote sensing require fusing low-resolution spectral images with panchromatic images to identify materials at high resolution in clutter. A pixel-based false color mapping and wavelet transform integrated fusion algorithm is presented in this paper; the resulting images have a higher information content than either of the original images and retain sensor-specific image information. The simulation results show that this algorithm can enhance the visibility of certain details and preserve the differences between materials.

  16. Multi-dimensional position sensor using range detectors

    DOEpatents

    Vann, Charles S.

    2000-01-01

    A small, non-contact optical sensor uses ranges and images to detect its position relative to an object in up to six degrees of freedom. The sensor has three light-emitting range detectors which illuminate a target and can be used to determine distance and two tilt angles. A camera located between the three range detectors senses the three remaining degrees of freedom: two translations and one rotation. Various range detectors can be used, with different light sources (e.g., lasers and LEDs), different collection options, and different detection schemes (e.g., diminishing return and time of flight). This sensor increases the capability and flexibility of computer-controlled machines; for example, it can instruct a robot how to adjust automatically to different positions and orientations of a part.
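
    The distance-plus-two-tilts measurement from three range detectors amounts to fitting a plane through the three illuminated points. A minimal sketch under an assumed detector layout (the patent does not specify one):

        import numpy as np

        def distance_and_tilts(ranges, xy_offsets):
            # Fit the plane z = a*x + b*y + c through the three measured
            # points; c is the stand-off distance, atan(a), atan(b) the tilts.
            xy = np.asarray(xy_offsets, dtype=float)
            A = np.column_stack([xy[:, 0], xy[:, 1], np.ones(3)])
            a, b, c = np.linalg.solve(A, np.asarray(ranges, dtype=float))
            return c, np.degrees(np.arctan(a)), np.degrees(np.arctan(b))

        # Illustrative layout; target plane tilted 5 deg about the y axis.
        offsets = [(0.05, 0.0), (-0.05, 0.05), (-0.05, -0.05)]
        ranges = [1.0 + np.tan(np.radians(5)) * x for x, _ in offsets]
        print(distance_and_tilts(ranges, offsets))   # ~ (1.0, 5.0, 0.0)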

  17. Close-range sensors for small unmanned bottom vehicles: update

    NASA Astrophysics Data System (ADS)

    Bernstein, Charles L.

    2000-07-01

    The Surf Zone Reconnaissance Project is developing sensors for small, autonomous, underwater bottom-crawling vehicles. The objective is to enable small crawling robots to autonomously detect and classify mines and obstacles on the ocean bottom at depths between 0 and 10 feet. We have identified a promising set of techniques that exploit the electromagnetic, shape, texture, image, and vibratory-modal features of these targets. During FY99 and FY00 we have worked toward refining these techniques. Signature data sets have been collected for a standard target set to facilitate the development of sensor fusion and target detection and classification algorithms. Specific behaviors, termed microbehaviors, are developed to utilize the robot's mobility to position and operate the sensors. A first-generation, close-range sensor suite composed of 5 sensors will be completed and tested on a crawling platform in FY00, and will be further refined and demonstrated in FY01 as part of the Mine Countermeasures 6.3 core program sponsored by the Office of Naval Research.

  18. Arrays of Nano Tunnel Junctions as Infrared Image Sensors

    NASA Technical Reports Server (NTRS)

    Son, Kyung-Ah; Moon, Jeong S.; Prokopuk, Nicholas

    2006-01-01

    Infrared image sensors based on high-density rectangular planar arrays of nano tunnel junctions have been proposed. These sensors would differ fundamentally from prior infrared sensors based, variously, on bolometry or conventional semiconductor photodetection. Infrared image sensors based on conventional semiconductor photodetection must typically be cooled to cryogenic temperatures to reduce noise to acceptably low levels. Some bolometer-type infrared sensors can be operated at room temperature, but they exhibit low detectivities and long response times, which limit their utility. The proposed infrared image sensors could be operated at room temperature without incurring excessive noise, and would exhibit high detectivities and short response times. Other advantages would include low power demand, high resolution, and tailorability of spectral response. Being neither bolometers nor conventional semiconductor photodetectors, the basic detector units as proposed would partly resemble rectennas. Nanometer-scale tunnel junctions would be created by the crossing of nanowires, with quantum-mechanical-barrier layers in the form of thin layers of electrically insulating material between them. A microscopic dipole antenna, sized and shaped to respond maximally in the infrared wavelength range that one seeks to detect, would be formed integrally with the nanowires at each junction. An incident signal in that wavelength range would be coupled into the antenna and, through the antenna, to the junction. At the junction, the flow of electrons between the crossing wires would be dominated by quantum-mechanical tunneling rather than thermionic emission. Relative to thermionic emission, quantum-mechanical tunneling is a fast process.

  19. Toward CMOS image sensor based glucose monitoring.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2012-09-07

    The complementary metal oxide semiconductor (CMOS) image sensor is a powerful tool for biosensing applications. In the present study, a CMOS image sensor was exploited to detect glucose levels through simple photon count variation, with high sensitivity. Various concentrations of glucose (100 mg/dL to 1000 mg/dL) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip, attenuated according to the color density, and hit the sensor surface. The photon count recognized by the CMOS image sensor therefore depends on the color density, and hence on the glucose concentration, and is converted into digital form. By correlating the digital results with glucose concentration, it is possible to measure a wide range of blood glucose levels with good linearity; this technique could therefore enable convenient point-of-care diagnosis.
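
    In practice such a sensor is used through a calibration curve: photon counts recorded for known concentrations, then inverted for unknown samples. A minimal sketch with invented numbers (the paper reports linearity but no calibration values):

        import numpy as np

        # Hypothetical calibration data: photon counts for known glucose
        # concentrations (mg/dL); all values are made up for illustration.
        conc = np.array([100, 250, 500, 750, 1000], dtype=float)
        counts = np.array([9100, 7900, 5950, 3900, 2050], dtype=float)

        slope, intercept = np.polyfit(counts, conc, deg=1)   # inverse calibration

        def glucose_mg_dl(photon_count):
            # Estimate concentration from a photon count via the linear fit.
            return slope * photon_count + intercept

        print(glucose_mg_dl(5000.0))   # interpolated unknown sample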

  20. An analog gamma correction scheme for high dynamic range CMOS logarithmic image sensors.

    PubMed

    Cao, Yuan; Pan, Xiaofang; Zhao, Xiaojin; Wu, Huisi

    2014-12-15

    In this paper, a novel analog gamma correction scheme for a logarithmic image sensor, designed to minimize the quantization noise of high-dynamic-range applications, is presented. The proposed implementation exploits a non-linear voltage-controlled-oscillator (VCO) based analog-to-digital converter (ADC) to perform the gamma correction during the analog-to-digital conversion. As a result, the quantization noise does not increase, while the same high dynamic range of the logarithmic image sensor is preserved. Moreover, by combining the gamma correction with the analog-to-digital conversion, the silicon area and overall power consumption can be greatly reduced. The proposed gamma correction scheme is validated by the reported simulation results and by experimental results measured from our designed test structure, fabricated in a 0.35 μm standard complementary-metal-oxide-semiconductor (CMOS) process.
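
    The core argument, that applying gamma before quantization avoids amplifying dark-level quantization steps, can be illustrated numerically. The sketch below compares quantize-then-gamma against gamma-then-quantize (the paper's ordering, here idealized with a plain 8-bit uniform quantizer rather than a VCO-based ADC):

        import numpy as np

        gamma = 1 / 2.2                              # illustrative exponent
        x = np.linspace(1e-3, 1.0, 100_000)          # normalized signal

        def quantize(v, bits=8):
            # Uniform quantization to 2**bits levels on [0, 1].
            return np.round(v * (2**bits - 1)) / (2**bits - 1)

        # Digital gamma: quantize first, then apply gamma; the steep gamma
        # slope near zero amplifies the dark-level quantization steps.
        err_digital = np.abs(quantize(x)**gamma - x**gamma)
        # Analog gamma: apply gamma first, then quantize; the error stays
        # within half an LSB everywhere.
        err_analog = np.abs(quantize(x**gamma) - x**gamma)

        print(err_digital.max(), err_analog.max())   # ~0.04 vs ~0.002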

  1. An Analog Gamma Correction Scheme for High Dynamic Range CMOS Logarithmic Image Sensors

    PubMed Central

    Cao, Yuan; Pan, Xiaofang; Zhao, Xiaojin; Wu, Huisi

    2014-01-01

    In this paper, a novel analog gamma correction scheme for a logarithmic image sensor, designed to minimize the quantization noise of high-dynamic-range applications, is presented. The proposed implementation exploits a non-linear voltage-controlled-oscillator (VCO) based analog-to-digital converter (ADC) to perform the gamma correction during the analog-to-digital conversion. As a result, the quantization noise does not increase, while the same high dynamic range of the logarithmic image sensor is preserved. Moreover, by combining the gamma correction with the analog-to-digital conversion, the silicon area and overall power consumption can be greatly reduced. The proposed gamma correction scheme is validated by the reported simulation results and by experimental results measured from our designed test structure, fabricated in a 0.35 μm standard complementary-metal-oxide-semiconductor (CMOS) process. PMID:25517692

  2. Color sensitivity of the multi-exposure HDR imaging process

    NASA Astrophysics Data System (ADS)

    Lenseigne, Boris; Jacobs, Valéry Ann; Withouck, Martijn; Hanselaer, Peter; Jonker, Pieter P.

    2013-04-01

    Multi-exposure high dynamic range (HDR) imaging builds HDR radiance maps by stitching together different views of the same scene captured with varying exposures. Practically, this process involves converting raw sensor data into low dynamic range (LDR) images, estimating the camera response curves, and using them to recover the irradiance for every pixel. During export, white balance settings and image stitching are applied, both of which influence the color balance of the final image. In this paper, we use a calibrated quasi-monochromatic light source, an integrating sphere, and a spectrograph to evaluate and compare the average spectral response of the image sensor. We finally draw some conclusions about the color consistency of HDR imaging and the additional steps necessary to use multi-exposure HDR imaging as a tool to measure physical quantities such as radiance and luminance.
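
    To make the recovery step concrete: given response-corrected (linear) LDR frames and their exposure times, the radiance map is a per-pixel weighted average of irradiance estimates, with a weight that discounts clipped and noisy pixels. A sketch of this standard merge, not the cited paper's exact pipeline:

        import numpy as np

        def merge_hdr(ldr_stack, exposures_s):
            # Weighted multi-exposure merge; frames assumed linear in [0, 1].
            num = np.zeros_like(ldr_stack[0], dtype=float)
            den = np.zeros_like(ldr_stack[0], dtype=float)
            for img, t in zip(ldr_stack, exposures_s):
                w = 1.0 - np.abs(2.0 * img - 1.0)   # triangle weight, peak at 0.5
                num += w * img / t                  # per-frame irradiance estimate
                den += w
            return num / np.maximum(den, 1e-9)

        # Synthetic scene with irradiances 2.0 and 40.0 (arbitrary units).
        frames = [np.clip(np.array([[0.02, 0.4]]) * t / 0.01, 0, 1)
                  for t in (0.01, 0.04, 0.16)]
        print(merge_hdr(frames, [0.01, 0.04, 0.16]))   # ~ [[ 2. 40.]]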

  3. An ultrasensitive method of real time pH monitoring with complementary metal oxide semiconductor image sensor.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2015-02-09

    CMOS sensors are becoming a powerful tool in the biological and chemical fields. In this work, we introduce a new approach to quantifying various pH solutions with a CMOS image sensor. The CMOS image sensor based pH measurement produces a high-accuracy analysis, making it a truly portable and user-friendly system. A pH-indicator-blended hydrogel matrix was fabricated as a thin film for accurate color development. A distinct red, green and blue (RGB) color change develops in the hydrogel film when various pH solutions (pH 1-14) are applied. A semi-quantitative pH estimate can be acquired by visual readout. Further, the CMOS image sensor captures the RGB color intensity of the film, and the hue value is converted into digital numbers with the aid of an analog-to-digital converter (ADC) to determine the pH range of a solution. A chromaticity diagram and Euclidean distances represent the RGB color space and the differentiation of pH ranges, respectively. This technique is applicable to sensing various toxic chemicals and chemical vapors in situ. Ultimately, the entire approach can be integrated into a smartphone and operated in a user-friendly manner.
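
    The hue-plus-Euclidean-distance readout can be sketched as a nearest-reference classifier in HSV space. The reference values below are invented placeholders, not the paper's calibration data:

        import colorsys
        import math

        # Hypothetical reference HSV tuples for a few pH levels, as would be
        # measured from the indicator film; values are illustrative only.
        reference = {1: (0.98, 0.8, 0.9), 7: (0.33, 0.6, 0.7), 14: (0.66, 0.7, 0.6)}

        def classify_ph(r, g, b):
            # Nearest reference pH by Euclidean distance in HSV space.
            hsv = colorsys.rgb_to_hsv(r, g, b)
            return min(reference, key=lambda ph: math.dist(hsv, reference[ph]))

        print(classify_ph(0.2, 0.7, 0.3))   # greenish pixel -> pH 7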

  4. Design and fabrication of vertically-integrated CMOS image sensors.

    PubMed

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors.

  5. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    PubMed Central

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  6. Laser range profiling for small target recognition

    NASA Astrophysics Data System (ADS)

    Steinvall, Ove; Tulldahl, Michael

    2016-05-01

    The detection and classification of small surface and airborne targets at long ranges is a growing need for naval security. Long-range identification, or identification of small targets at closer range, is limited for imaging systems by the demand for very high transverse sensor resolution. It is therefore motivated to look at 1D laser techniques for target identification. These include vibrometry and laser range profiling. Vibrometry can give good results but depends on certain vibrating parts of the target being in the field of view. Laser range profiling is attractive because the maximum range can be substantial, especially for a small laser beam width. A range profiler can also be used in a scanning mode to detect targets within a certain sector. The same laser can also be used for active imaging when the target comes closer and is angularly resolved. The present paper shows both experimental and simulated results for laser range profiling of small boats out to 6-7 km range and of a UAV mockup at close range (1.3 km). We obtained good results with the profiling system for both target detection and recognition. Comparison of experimental and simulated range waveforms based on CAD models of the targets supports the idea of having a profiling system as a first recognition sensor, thus narrowing the search space for automatic target recognition based on imaging at close range. The naval experiments took place in the Baltic Sea with many other active and passive EO sensors besides the profiling system. A discussion of data fusion between the laser profiling and imaging systems is given. The UAV experiments were made from the rooftop laboratory at FOI.
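
    The range waveforms compared above form, to first order, as the convolution of the transmitted pulse with the target's range-resolved reflectance. A minimal sketch with an invented two-facet target and pulse width:

    ```python
    import numpy as np

    c = 3.0e8                      # speed of light, m/s
    dt = 1.0e-9                    # 1 ns sampling -> 0.15 m range bins
    n = 200
    t = np.arange(n) * dt

    # Hypothetical target: two facets 3 m apart in range, the second weaker
    response = np.zeros(n)
    response[0] = 1.0
    response[int(2 * 3.0 / c / dt)] = 0.6          # 3 m -> 20 ns extra round trip

    # Transmitted pulse: Gaussian, ~5 ns FWHM, centred at 20 ns
    pulse = np.exp(-0.5 * ((t - 20e-9) / (5e-9 / 2.355)) ** 2)

    # Received range profile (1/R^2 and atmospheric losses ignored)
    profile = np.convolve(response, pulse)[:n]
    peak = int(np.argmax(profile))
    print("strongest facet at", (peak - 20) * dt * c / 2, "m relative range")
    ```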

  7. The analysis and rationale behind the upgrading of existing standard definition thermal imagers to high definition

    NASA Astrophysics Data System (ADS)

    Goss, Tristan M.

    2016-05-01

    With 640x512 pixel format IR detector arrays having been on the market for the past decade, Standard Definition (SD) thermal imaging sensors have been developed and deployed across the world. Now, with 1280x1024 pixel format IR detector arrays becoming readily available, designers of thermal imager systems face new challenges as pixel sizes shrink and the demand for, and applications of, High Definition (HD) thermal imaging sensors increase. In many instances, upgrading an existing under-sampled SD thermal imaging sensor into a more optimally sampled or oversampled HD thermal imaging sensor is a more cost-effective option, with reduced time to market, than designing and developing a completely new sensor. This paper presents the analysis and rationale behind the selection of the best-suited HD pixel format MWIR detector for the upgrade of an existing SD thermal imaging sensor to a higher-performing HD thermal imaging sensor. Several commercially available and "soon to be" commercially available HD small-pixel IR detector options are included in the analysis and considered for this upgrade. The impact the proposed detectors have on the sensor's overall sensitivity, noise, and resolution is analyzed, and the improved range performance is predicted. Furthermore, with reduced dark currents due to the smaller pixel sizes, the candidate HD MWIR detectors can be operated at higher temperatures than their SD predecessors. Therefore, as an additional constraint and design goal, the feasibility of achieving the upgraded performance without any increase in the size, weight, and power consumption of the thermal imager is discussed herein.

  8. Small SWAP 3D imaging flash ladar for small tactical unmanned air systems

    NASA Astrophysics Data System (ADS)

    Bird, Alan; Anderson, Scott A.; Wojcik, Michael; Budge, Scott E.

    2015-05-01

    The Space Dynamics Laboratory (SDL), working with Naval Research Laboratory (NRL) and industry leaders Advanced Scientific Concepts (ASC) and Hood Technology Corporation, has developed a small SWAP (size, weight, and power) 3D imaging flash ladar (LAser Detection And Ranging) sensor system concept design for small tactical unmanned air systems (STUAS). The design utilizes an ASC 3D flash ladar camera and laser in a Hood Technology gyro-stabilized gimbal system. The design is an autonomous, intelligent, geo-aware sensor system that supplies real-time 3D terrain and target images. Flash ladar and visible camera data are processed at the sensor using a custom digitizer/frame grabber with compression. Mounted in the aft housing are power, controls, processing computers, and GPS/INS. The onboard processor controls pointing and handles image data, detection algorithms and queuing. The small SWAP 3D imaging flash ladar sensor system generates georeferenced terrain and target images with a low probability of false return and <10 cm range accuracy through foliage in real-time. The 3D imaging flash ladar is designed for a STUAS with a complete system SWAP estimate of <9 kg, <0.2 m3 and <350 W power. The system is modeled using LadarSIM, a MATLAB® and Simulink®- based ladar system simulator designed and developed by the Center for Advanced Imaging Ladar (CAIL) at Utah State University. We will present the concept design and modeled performance predictions.

  9. High dynamic range coding imaging system

    NASA Astrophysics Data System (ADS)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme can help us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns, then use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images into an HDR image for display. We build an optical simulation model and generate simulation images to verify the novel system.

  10. Fusion of 3D laser scanner and depth images for obstacle recognition in mobile applications

    NASA Astrophysics Data System (ADS)

    Budzan, Sebastian; Kasprzyk, Jerzy

    2016-02-01

    The problem of obstacle detection and recognition or, more generally, scene mapping is one of the most investigated problems in computer vision, especially in mobile applications. In this paper, a fused optical system is proposed that combines depth information and color images gathered from the Microsoft Kinect sensor with 3D laser range scanner data for obstacle detection and ground estimation in real-time mobile systems. The algorithm consists of feature extraction in the laser range images, processing of the depth information from the Kinect sensor, fusion of the sensor information, and classification of the data into two separate categories: road and obstacle. Exemplary results are presented, and it is shown that the fusion of information gathered from different sources increases the effectiveness of obstacle detection in different scenarios and can be used successfully for road surface mapping.

  11. Small Imaging Depth LIDAR and DCNN-Based Localization for Automated Guided Vehicle †

    PubMed Central

    Ito, Seigo; Hiratsuka, Shigeyoshi; Ohta, Mitsuhiko; Matsubara, Hiroyuki; Ogawa, Masaru

    2018-01-01

    We present our third prototype sensor and a localization method for Automated Guided Vehicles (AGVs), for which small imaging LIght Detection and Ranging (LIDAR) and fusion-based localization are fundamentally important. Our small imaging LIDAR, named the Single-Photon Avalanche Diode (SPAD) LIDAR, uses a time-of-flight method and SPAD arrays. A SPAD is a highly sensitive photodetector capable of detecting at the single-photon level, and the SPAD LIDAR has two SPAD arrays on the same chip for detection of laser light and environmental light. Therefore, the SPAD LIDAR simultaneously outputs range image data and monocular image data with the same coordinate system and does not require external calibration among outputs. As AGVs travel both indoors and outdoors with vibration, this calibration-less structure is particularly useful for AGV applications. We also introduce a fusion-based localization method, named SPAD DCNN, which uses the SPAD LIDAR and employs a Deep Convolutional Neural Network (DCNN). SPAD DCNN can fuse the outputs of the SPAD LIDAR: range image data, monocular image data and peak intensity image data. The SPAD DCNN has two outputs: the regression result of the position of the SPAD LIDAR and the classification result of the existence of a target to be approached. Our third prototype sensor and the localization method are evaluated in an indoor environment by assuming various AGV trajectories. The results show that the sensor and localization method improve the localization accuracy. PMID:29320434
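
    For reference, each pixel of such a sensor derives range from photon arrival statistics via R = c·t/2. A minimal sketch with an assumed 100 ps timing bin (not the sensor's actual specification):

    ```python
    import numpy as np

    C = 2.998e8          # speed of light, m/s
    BIN_WIDTH = 100e-12  # assumed TDC bin width

    def range_from_histogram(histogram):
        """Range for one pixel from a time-of-flight photon histogram.

        The bin with the most counts marks the round-trip time of the
        laser pulse; range follows from R = c * t / 2.
        """
        peak_bin = int(np.argmax(histogram))
        return C * (peak_bin * BIN_WIDTH) / 2.0

    hist = np.zeros(1024)
    hist[400] = 50                        # return peaking in bin 400 -> 40 ns
    print(range_from_histogram(hist))     # ~6 m
    ```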

  12. Small Imaging Depth LIDAR and DCNN-Based Localization for Automated Guided Vehicle.

    PubMed

    Ito, Seigo; Hiratsuka, Shigeyoshi; Ohta, Mitsuhiko; Matsubara, Hiroyuki; Ogawa, Masaru

    2018-01-10

    We present our third prototype sensor and a localization method for Automated Guided Vehicles (AGVs), for which small imaging LIght Detection and Ranging (LIDAR) and fusion-based localization are fundamentally important. Our small imaging LIDAR, named the Single-Photon Avalanche Diode (SPAD) LIDAR, uses a time-of-flight method and SPAD arrays. A SPAD is a highly sensitive photodetector capable of detecting at the single-photon level, and the SPAD LIDAR has two SPAD arrays on the same chip for detection of laser light and environmental light. Therefore, the SPAD LIDAR simultaneously outputs range image data and monocular image data with the same coordinate system and does not require external calibration among outputs. As AGVs travel both indoors and outdoors with vibration, this calibration-less structure is particularly useful for AGV applications. We also introduce a fusion-based localization method, named SPAD DCNN, which uses the SPAD LIDAR and employs a Deep Convolutional Neural Network (DCNN). SPAD DCNN can fuse the outputs of the SPAD LIDAR: range image data, monocular image data and peak intensity image data. The SPAD DCNN has two outputs: the regression result of the position of the SPAD LIDAR and the classification result of the existence of a target to be approached. Our third prototype sensor and the localization method are evaluated in an indoor environment by assuming various AGV trajectories. The results show that the sensor and localization method improve the localization accuracy.

  13. Applications of spectral band adjustment factors (SBAF) for cross-calibration

    USGS Publications Warehouse

    Chander, Gyanesh

    2013-01-01

    To monitor land surface processes over a wide range of temporal and spatial scales, it is critical to have coordinated observations of the Earth's surface acquired from multiple spaceborne imaging sensors. However, an integrated global observation framework requires an understanding of how land surface processes are seen differently by various sensors. This is particularly true for sensors acquiring data in spectral bands whose relative spectral responses (RSRs) are not similar and thus may produce different results while observing the same target. The intrinsic offsets between two sensors caused by RSR mismatches can be compensated by using a spectral band adjustment factor (SBAF), which takes into account the spectral profile of the target and the RSR of the two sensors. The motivation of this work comes from the need to compensate the spectral response differences of multispectral sensors in order to provide a more accurate cross-calibration between the sensors. In this paper, radiometric cross-calibration of the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) and the Terra Moderate Resolution Imaging Spectroradiometer (MODIS) sensors was performed using near-simultaneous observations over the Libya 4 pseudoinvariant calibration site in the visible and near-infrared spectral range. The RSR differences of the analogous ETM+ and MODIS spectral bands provide the opportunity to explore, understand, quantify, and compensate for the measurement differences between these two sensors. The cross-calibration was initially performed by comparing the top-of-atmosphere (TOA) reflectances between the two sensors over their lifetimes. The average percent differences in the long-term trends ranged from -5% to +6%. The RSR-compensated ETM+ TOA reflectance (ETM+*) measurements were then found to agree with MODIS TOA reflectance to within 5% for all bands when Earth Observing-1 Hyperion hyperspectral data were used to produce the SBAFs. These differences were later reduced to within 1% for all bands (except band 2) by using Environmental Satellite Scanning Imaging Absorption Spectrometer for Atmospheric Cartography hyperspectral data to produce the SBAFs.
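
    Numerically, an SBAF is a ratio of RSR-weighted band-averaged reflectances computed from a hyperspectral profile of the target. A sketch under one common convention (the direction of the ratio varies between papers):

    ```python
    import numpy as np

    def band_reflectance(wavelength, target_refl, rsr):
        """Band-averaged reflectance weighted by a relative spectral response."""
        return np.trapz(target_refl * rsr, wavelength) / np.trapz(rsr, wavelength)

    def sbaf(wavelength, target_refl, rsr_reference, rsr_calibrated):
        """Factor that adjusts the calibrated sensor's band reflectance to the
        reference sensor's band, compensating the RSR mismatch."""
        return (band_reflectance(wavelength, target_refl, rsr_reference)
                / band_reflectance(wavelength, target_refl, rsr_calibrated))
    ```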

  14. Advanced Video Guidance Sensor and next-generation autonomous docking sensors

    NASA Astrophysics Data System (ADS)

    Granade, Stephen R.

    2004-09-01

    In recent decades, NASA's interest in spacecraft rendezvous and proximity operations has grown. Additional instrumentation is needed to improve manned docking operations' safety, as well as to enable telerobotic operation of spacecraft or completely autonomous rendezvous and docking. To address this need, Advanced Optical Systems, Inc., Orbital Sciences Corporation, and Marshall Space Flight Center have developed the Advanced Video Guidance Sensor (AVGS) under the auspices of the Demonstration of Autonomous Rendezvous Technology (DART) program. Given a cooperative target comprising several retro-reflectors, AVGS provides six-degree-of-freedom information at ranges of up to 300 meters for the DART target. It does so by imaging the target, then performing pattern recognition on the resulting image. Longer range operation is possible through different target geometries. Now that AVGS is being readied for its test flight in 2004, the question is: what next? Modifications can be made to AVGS, including different pattern recognition algorithms and changes to the retro-reflector targets, to make it more robust and accurate. AVGS could be coupled with other space-qualified sensors, such as a laser range-and-bearing finder, that would operate at longer ranges. Different target configurations, including the use of active targets, could result in significant miniaturization over the current AVGS package. We will discuss these and other possibilities for a next-generation docking sensor or sensor suite that involve AVGS.
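
    Recovering six-degree-of-freedom pose from the imaged retro-reflector pattern is a perspective-n-point problem. A sketch using OpenCV's generic solver, with invented target geometry, detections, and intrinsics (the actual AVGS algorithm is not described in detail here):

    ```python
    import numpy as np
    import cv2

    # Hypothetical retro-reflector positions in the target frame (metres)
    object_pts = np.array([[0.0, 0.0, 0.0],
                           [0.2, 0.0, 0.0],
                           [0.0, 0.2, 0.0],
                           [0.1, 0.1, 0.05]])

    # Hypothetical detected centroids in the image (pixels)
    image_pts = np.array([[512.0, 384.0],
                          [620.0, 380.0],
                          [515.0, 270.0],
                          [568.0, 325.0]])

    K = np.array([[1000.0, 0.0, 512.0],   # assumed pinhole intrinsics
                  [0.0, 1000.0, 384.0],
                  [0.0, 0.0, 1.0]])

    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
    print("range to target:", np.linalg.norm(tvec), "m")
    ```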

  15. Advanced Video Guidance Sensor and Next Generation Autonomous Docking Sensors

    NASA Technical Reports Server (NTRS)

    Granade, Stephen R.

    2004-01-01

    In recent decades, NASA's interest in spacecraft rendezvous and proximity operations has grown. Additional instrumentation is needed to improve manned docking operations' safety, as well as to enable telerobotic operation of spacecraft or completely autonomous rendezvous and docking. To address this need, Advanced Optical Systems, Inc., Orbital Sciences Corporation, and Marshall Space Flight Center have developed the Advanced Video Guidance Sensor (AVGS) under the auspices of the Demonstration of Autonomous Rendezvous Technology (DART) program. Given a cooperative target comprising several retro-reflectors, AVGS provides six-degree-of-freedom information at ranges of up to 300 meters for the DART target. It does so by imaging the target, then performing pattern recognition on the resulting image. Longer range operation is possible through different target geometries. Now that AVGS is being readied for its test flight in 2004, the question is: what next? Modifications can be made to AVGS, including different pattern recognition algorithms and changes to the retro-reflector targets, to make it more robust and accurate. AVGS could be coupled with other space-qualified sensors, such as a laser range-and-bearing finder, that would operate at longer ranges. Different target configurations, including the use of active targets, could result in significant miniaturization over the current AVGS package. We will discuss these and other possibilities for a next-generation docking sensor or sensor suite that involve AVGS.

  16. Commercial CMOS image sensors as X-ray imagers and particle beam monitors

    NASA Astrophysics Data System (ADS)

    Castoldi, A.; Guazzoni, C.; Maffessanti, S.; Montemurro, G. V.; Carraresi, L.

    2015-01-01

    CMOS image sensors are widely used in several applications such as mobile handsets, webcams, and digital cameras, among others. Furthermore, they are available across a wide range of resolutions with excellent spectral and chromatic responses. In order to fulfill the need for cheap systems as beam monitors and high-resolution image sensors for scientific applications, we exploited the possibility of using commercial CMOS image sensors as X-ray and proton detectors. Two different sensors have been mounted and tested. An Aptina MT9v034, featuring 752 × 480 pixels with a 6 μm × 6 μm pixel size, has been mounted and successfully tested as a bi-dimensional beam profile monitor, able to take pictures of the incoming proton bunches at the DeFEL beamline (1-6 MeV pulsed proton beam) of the LaBeC of INFN in Florence. The naked sensor is able to successfully detect the interactions of single protons. The sensor point-spread-function (PSF) has been qualified with 1 MeV protons and is equal to one pixel (6 μm) r.m.s. in both directions. A second sensor, an MT9M032, featuring 1472 × 1096 pixels with a 2.2 μm × 2.2 μm pixel size, has been mounted on a dedicated board as a high-resolution imager to be used in X-ray imaging experiments with table-top generators. To ease and simplify data transfer and image acquisition, the system is controlled by a dedicated microprocessor board (DM3730, 1 GHz SoC, ARM Cortex-A8) on which a modified Linux kernel has been implemented. The paper presents the architecture of the sensor systems and the results of the experimental measurements.
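
    Counting single-particle interactions in such frames reduces to dark subtraction, robust thresholding, and connected-component labelling. A sketch with an assumed 5-sigma threshold; since the quoted PSF is about one pixel r.m.s., each small cluster is taken as one hit:

    ```python
    import numpy as np
    from scipy import ndimage

    def count_particle_hits(frame, dark, k_sigma=5.0):
        """Count isolated particle hits in a single CMOS frame."""
        signal = frame.astype(np.float64) - dark
        # Robust noise estimate via the median absolute deviation
        noise = 1.4826 * np.median(np.abs(signal - np.median(signal)))
        mask = signal > k_sigma * noise
        _, n_hits = ndimage.label(mask)   # each connected cluster = one hit
        return n_hits
    ```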

  17. Multiplexed 3D FRET imaging in deep tissue of live embryos

    PubMed Central

    Zhao, Ming; Wan, Xiaoyang; Li, Yu; Zhou, Weibin; Peng, Leilei

    2015-01-01

    Current deep tissue microscopy techniques are mostly restricted to intensity mapping of fluorophores, which significantly limits their applications in investigating biochemical processes in vivo. We present a deep tissue multiplexed functional imaging method that probes multiple Förster resonant energy transfer (FRET) sensors in live embryos with high spatial resolution. The method simultaneously images fluorescence lifetimes in 3D with multiple excitation lasers. Through quantitative analysis of triple-channel intensity and lifetime images, we demonstrated that the Ca2+ and cAMP levels of live embryos expressing dual FRET sensors can be monitored simultaneously at microscopic resolution. The method is compatible with a broad range of FRET sensors currently available for probing various cellular biochemical functions. It opens the door to imaging complex cellular circuitries in whole live organisms. PMID:26387920

  18. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    NASA Astrophysics Data System (ADS)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested in developing inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products, and space technology, mainly because this radiation penetrates dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse other materials well. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low, and the scattering is also low compared to the NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cent) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based focal plane arrays (FPAs). The three cameras differ in the number of detectors, the scanning operation, and the detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively; both use direct detection and are limited to fixed imaging. The most recently designed sensor is a 16 × 16 GDD FPA with multiplexed readout. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The detection principle of this sensor is a frequency-modulated continuous-wave (FMCW) scheme in which each of the 16 GDD pixel lines is sampled simultaneously. Direct detection is also possible and can be performed through a user-friendly interface. This FPA sensor is built from 256 commercial GDD lamps (3 mm diameter International Light, Inc., Peabody, MA, model 527 Ne indicator lamps) as pixel detectors. All three sensors are fully supported by a software graphical user interface (GUI). They were tested and characterized with different kinds of optical systems for imaging applications, super-resolution, and calibration methods. The 16x16 sensor can employ a chirp-radar-like method to produce depth and reflectance information in the image. This enables 3D MMW imaging in real time at video frame rate. In this work we demonstrate different kinds of optical imaging systems, with 3D imaging capability at short range and at longer distances of at least 10-20 meters.
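
    The FMCW principle mentioned above maps a measured beat frequency to range as R = c·f_b·T/(2B). A minimal sketch with illustrative sweep parameters (not those of the actual system):

    ```python
    C = 3.0e8  # speed of light, m/s

    def fmcw_range(beat_hz, sweep_bw_hz, sweep_time_s):
        """Range from the beat frequency of an FMCW chirp system.

        Mixing the received signal with the transmitted chirp yields a
        beat frequency proportional to the round-trip delay, so
        R = c * f_b * T / (2 * B).
        """
        return C * beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

    # Illustrative: a 2 GHz sweep over 1 ms; a 133 kHz beat -> ~10 m
    print(fmcw_range(133e3, 2e9, 1e-3))
    ```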

  19. Quantum efficiency and dark current evaluation of a backside illuminated CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Vereecke, Bart; Cavaco, Celso; De Munck, Koen; Haspeslagh, Luc; Minoglou, Kyriaki; Moore, George; Sabuncuoglu, Deniz; Tack, Klaas; Wu, Bob; Osman, Haris

    2015-04-01

    We report on the development and characterization of monolithic backside-illuminated (BSI) imagers at imec. Different surface passivations, anti-reflective coatings (ARCs), and anneal conditions were implemented, and their effects on dark current (DC) and quantum efficiency (QE) are analyzed. Two different single-layer ARC materials were developed, for visible light and near-UV applications, respectively. A QE above 75% over the entire visible spectrum from 400 to 700 nm is measured. In the spectral range from 260 to 400 nm wavelength, QE values above 50% over the entire range are achieved. A new technique, high-pressure hydrogen annealing at 20 atm, was applied to the photodiodes, and an improvement in DC of 30% was observed for the BSI imager with HfO2 as the ARC as well as for the front-side imager. The entire BSI process was developed on 200 mm wafers and evaluated on test diode structures. The know-how is then transferred to real imager sensor arrays.

  20. High Dynamic Range Imaging at the Quantum Limit with Single Photon Avalanche Diode-Based Image Sensors †

    PubMed Central

    Mattioli Della Rocca, Francescopaolo

    2018-01-01

    This paper examines methods to best exploit the high dynamic range (HDR) of the single-photon avalanche diode (SPAD) in a high fill-factor HDR photon-counting pixel that is scalable to megapixel arrays. The proposed method combines multi-exposure HDR with in-pixel temporal oversampling. We present a silicon demonstration IC with a 96 × 40 array of 8.25 µm pitch, 66% fill-factor SPAD-based pixels achieving >100 dB dynamic range with 3 back-to-back exposures (short, mid, long). Each pixel sums 15 bit-planes or binary field images internally to constitute one frame, providing 3.75× data compression; hence the 1k frames per second (FPS) output off-chip represents 45,000 individual field images per second on chip. Two future projections of this work are described: scaling SPAD-based image sensors to HDR 1 MPixel formats and shrinking the pixel pitch to 1–3 µm. PMID:29641479

  1. Imaging intracellular pH in live cells with a genetically encoded red fluorescent protein sensor.

    PubMed

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-07-06

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at 440 and 585 nm that can be used for ratiometric imaging. The intensity ratio responds with an apparent pK(a) of 6.6 and a >10-fold dynamic range. Furthermore, pHRed has a pH-responsive fluorescence lifetime that changes by ~0.4 ns over physiological pH values and can be monitored with single-wavelength two-photon excitation. After characterizing the sensor, we tested pHRed's ability to monitor intracellular pH by imaging energy-dependent changes in cytosolic and mitochondrial pH.
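
    Ratiometric readout of such a sensor typically follows a Henderson-Hasselbalch-type calibration. A generic sketch; the calibration form and the R_min/R_max endpoint values are assumptions, not the authors' published equation:

    ```python
    import numpy as np

    def ph_from_ratio(R, R_min, R_max, pKa=6.6):
        """pH from a 585/440 nm excitation ratio, given calibration endpoints.

        R_min and R_max are the ratios of the fully protonated and fully
        deprotonated sensor, measured in calibration buffers; 6.6 is the
        apparent pKa reported for pHRed.
        """
        return pKa + np.log10((R - R_min) / (R_max - R))

    print(ph_from_ratio(1.0, 0.2, 2.5))   # example reading -> pH ~6.3
    ```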

  2. Surface chemistry and morphology in single particle optical imaging

    NASA Astrophysics Data System (ADS)

    Ekiz-Kanik, Fulya; Sevenler, Derin Deniz; Ünlü, Neşe Lortlar; Chiari, Marcella; Ünlü, M. Selim

    2017-05-01

    Biological nanoparticles such as viruses and exosomes are important biomarkers for a range of medical conditions, from infectious diseases to cancer. Biological sensors that detect whole viruses and exosomes with high specificity, yet without additional labeling, are promising because they reduce the complexity of sample preparation and may improve measurement quality by retaining information about the nanoscale physical structure of the bio-nanoparticle (BNP). Towards this end, a variety of BNP biosensor technologies have been developed, several of which are capable of enumerating the precise number of detected viruses or exosomes and analyzing the physical properties of each individual particle. Optical imaging techniques are promising candidates among the broad range of label-free nanoparticle detectors. These imaging BNP sensors detect the binding of single nanoparticles on a flat surface functionalized with a specific capture molecule or an array of multiplexed capture probes. The functionalization step confers all molecular specificity for the sensor's target but can introduce an unforeseen problem: a rough and inhomogeneous surface coating can be a source of noise, as these sensors detect small local changes in optical refractive index. In this paper, we review several optical technologies for label-free BNP detection with a focus on imaging systems. We compare surface-imaging methods including dark-field, surface plasmon resonance imaging, and interference reflectance imaging. We discuss the importance of ensuring consistently uniform and smooth surface coatings of capture molecules for these types of biosensors and finally summarize several methods that have been developed to address this challenge.

  3. Wavelength-Scanning SPR Imaging Sensors Based on an Acousto-Optic Tunable Filter and a White Light Laser

    PubMed Central

    Zeng, Youjun; Wang, Lei; Wu, Shu-Yuen; He, Jianan; Qu, Junle; Li, Xuejin; Ho, Ho-Pui; Gu, Dayong; Gao, Bruce Zhi; Shao, Yonghong

    2017-01-01

    A fast surface plasmon resonance (SPR) imaging biosensor system based on wavelength interrogation, using an acousto-optic tunable filter (AOTF) and a white light laser, is presented. The system combines the merits of the wide dynamic detection range and high sensitivity offered by the spectral approach with multiplexed high-throughput data collection and a two-dimensional (2D) biosensor array. The key feature is the use of the AOTF to realize a wavelength scan from a white laser source and thus achieve fast tracking of the SPR dip movement caused by target molecules binding to the sensor surface. Experimental results show that the system is capable of completing an SPR dip measurement within 0.35 s. To the best of our knowledge, this is the fastest time ever reported in the literature for imaging spectral interrogation. Based on a spectral window approximately 100 nm wide, a dynamic detection range of 4.63 × 10−2 refractive index units (RIU) and a resolution of 1.27 × 10−6 RIU achieved in a 2D-array sensor are reported here. The spectral SPR imaging sensor scheme has the capability of performing fast high-throughput detection of biomolecular interactions from 2D sensor arrays. The design has no mechanical moving parts, thus making the scheme completely solid-state. PMID:28067766
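
    Tracking the SPR dip at this speed essentially means locating the reflectance minimum of each scanned spectrum with sub-sample precision. A sketch using a local parabolic fit; the window size and fitting choice are ours, not the authors':

    ```python
    import numpy as np

    def spr_dip_wavelength(wavelengths, reflectance, window=7):
        """Sub-sample SPR dip position via a parabola fitted around the minimum."""
        i = int(np.argmin(reflectance))
        lo = max(0, i - window // 2)
        hi = min(len(wavelengths), i + window // 2 + 1)
        a, b, _ = np.polyfit(wavelengths[lo:hi], reflectance[lo:hi], 2)
        return -b / (2.0 * a)             # vertex of the fitted parabola
    ```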

  4. Characterisation of GaAs:Cr pixel sensors coupled to Timepix chips in view of synchrotron applications

    NASA Astrophysics Data System (ADS)

    Ponchut, C.; Cotte, M.; Lozinskaya, A.; Zarubin, A.; Tolbanov, O.; Tyazhev, A.

    2017-12-01

    In order to meet the needs of some ESRF beamlines for highly efficient 2D X-ray detectors in the 20-50 keV range, GaAs:Cr pixel sensors coupled to TIMEPIX readout chips were implemented in a MAXIPIX detector. The use of GaAs:Cr sensor material is intended to overcome the limitations of Si (low absorption) and of CdTe (fluorescence) in this energy range. The GaAs:Cr sensor assemblies were characterised with both laboratory X-ray sources and monochromatic synchrotron X-ray beams. The sensor response as a function of bias voltage was compared to a theoretical model, leading to an estimate of the μτ product of electrons in the GaAs:Cr sensor material of 1.6×10⁻⁴ cm²/V. The spatial homogeneity of X-ray images obtained with the sensors was measured in different irradiation conditions, showing a particular sensitivity to small variations in the incident beam spectrum. 2D-resolved elemental mapping of the sensor surface was carried out to investigate a possible relation between the noise pattern observed in X-ray images and local fluctuations in chemical composition. A scan of the sensor response at subpixel scale revealed that these irregularities can be correlated with a distortion of the effective pixel shapes.
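
    The bias-voltage model referred to above is commonly the single-carrier Hecht relation. A sketch that fits it to synthetic data generated with the μτ value quoted in the abstract; the sensor thickness and signal scale are assumed:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    L = 0.05  # assumed sensor thickness, cm (500 um)

    def hecht(V, mu_tau, s0):
        """Single-carrier Hecht equation: collected charge vs bias voltage."""
        x = mu_tau * V / L**2              # mean drift length / thickness
        return s0 * x * (1.0 - np.exp(-1.0 / x))

    bias = np.array([50., 100., 150., 200., 300., 400., 500.])
    rng = np.random.default_rng(1)
    signal = hecht(bias, 1.6e-4, 1.0) + rng.normal(0.0, 0.005, bias.size)

    (mu_tau, s0), _ = curve_fit(hecht, bias, signal, p0=(1e-4, 1.0))
    print(f"estimated mu*tau = {mu_tau:.1e} cm^2/V")   # ~1.6e-4
    ```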

  5. STARR: shortwave-targeted agile Raman robot for the detection and identification of emplaced explosives

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Gardner, Charles W.

    2014-05-01

    In order to combat the threat of emplaced explosives (land mines, etc.), ChemImage Sensor Systems (CISS) has developed a multi-sensor, robot-mounted system capable of identifying and confirming potential threats. The system, known as STARR (Shortwave-infrared Targeted Agile Raman Robot), utilizes shortwave infrared spectroscopy for the identification of potential threats, combined with a visible short-range standoff Raman hyperspectral imaging (HSI) system for material confirmation. The entire system is mounted on a Talon UGV (Unmanned Ground Vehicle), giving the sensor an increased area search rate and reducing the risk of injury to the operator. The Raman HSI system utilizes a fiber array spectral translator (FAST) for the acquisition of high-quality Raman chemical images, allowing for increased sensitivity and improved specificity. An overview of the design and operation of the system is presented, along with initial detection results from the fusion sensor.

  6. DNAzyme sensors for detection of metal ions in the environment and imaging them in living cells

    PubMed Central

    McGhee, Claire E.; Loh, Kang Yong

    2017-01-01

    The on-site and real-time detection of metal ions is important for environmental monitoring and for understanding the impact of metal ions on human health. However, developing sensors selective for a wide range of metal ions that can work in the complex matrices of untreated samples and cells presents significant challenges. To meet these challenges, DNAzymes, an emerging class of metal ion-dependent enzymes selective for almost any metal ion, have been functionalized with fluorophores, nanoparticles and other imaging agents and incorporated into sensors for the detection of metal ions in environmental samples and for imaging the metal ions in living cells. Herein, we highlight the recent developments of DNAzyme-based fluorescent, colorimetric, SERS, electrochemical and electrochemiluminescent sensors for metal ions for these applications. PMID:28458112

  7. Real-time motion artifacts compensation of ToF sensors data on GPU

    NASA Astrophysics Data System (ADS)

    Lefloch, Damien; Hoegg, Thomas; Kolb, Andreas

    2013-05-01

    Over the last decade, ToF sensors have attracted many computer vision and graphics researchers. Nevertheless, ToF devices suffer from severe motion artifacts in dynamic scenes as well as low-resolution depth data, which makes a valid correction all the more important. To counterbalance this effect, a pre-processing approach is introduced that greatly improves range image data for dynamic scenes. We first demonstrate the robustness of our approach using simulated data and then validate the method using real sensor range data. Our GPU-based processing pipeline enhances range-data reliability in real time.

  8. Near-IR Two-Photon Fluorescent Sensor for K(+) Imaging in Live Cells.

    PubMed

    Sui, Binglin; Yue, Xiling; Kim, Bosung; Belfield, Kevin D

    2015-08-19

    A new two-photon excited fluorescent K(+) sensor is reported. The sensor comprises three moieties, a highly selective K(+) chelator as the K(+) recognition unit, a boron-dipyrromethene (BODIPY) derivative modified with phenylethynyl groups as the fluorophore, and two polyethylene glycol chains to afford water solubility. The sensor displays very high selectivity (>52-fold) in detecting K(+) over other physiological metal cations. Upon binding K(+), the sensor switches from nonfluorescent to highly fluorescent, emitting red to near-IR (NIR) fluorescence. The sensor exhibited a good two-photon absorption cross section, 500 GM at 940 nm. Moreover, it is not sensitive to pH in the physiological pH range. Time-dependent cell imaging studies via both one- and two-photon fluorescence microscopy demonstrate that the sensor is suitable for dynamic K(+) sensing in living cells.

  9. Hyperspectral Imager for the Coastal Ocean: instrument description and first images.

    PubMed

    Lucke, Robert L; Corson, Michael; McGlothlin, Norman R; Butcher, Steve D; Wood, Daniel L; Korwan, Daniel R; Li, Rong R; Snyder, Willliam A; Davis, Curt O; Chen, Davidson T

    2011-04-10

    The Hyperspectral Imager for the Coastal Ocean (HICO) is the first spaceborne hyperspectral sensor designed specifically for the coastal ocean and estuarial, riverine, or other shallow-water areas. The HICO generates hyperspectral images, primarily over the 400-900 nm spectral range, with a ground sample distance of ≈90 m (at nadir) and a high signal-to-noise ratio. The HICO is now operating on the International Space Station (ISS). Its cross-track and along-track fields of view are 42 km (at nadir) and 192 km, respectively, for a total scene area of 8000 km(2). The HICO is an innovative prototype sensor that builds on extensive experience with airborne sensors and makes extensive use of commercial off-the-shelf components to build a space sensor at a small fraction of the usual cost and time. Here we describe the instrument's design and characterization and present early images from the ISS.

  10. Fixed Pattern Noise pixel-wise linear correction for crime scene imaging CMOS sensor

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Messinger, David W.; Dube, Roger R.; Ientilucci, Emmett J.

    2017-05-01

    Filtered multispectral imaging might be a useful method for crime scene documentation and evidence detection due to its abundant spectral information as well as its non-contact and non-destructive nature. A low-cost, portable multispectral crime scene imaging device would be highly useful and efficient. The second-generation crime scene imaging system uses a CMOS imaging sensor to capture the spatial scene and bandpass interference filters (IFs) to capture spectral information. Unfortunately, CMOS sensors suffer from severe spatial non-uniformity compared to CCD sensors, and the major cause is fixed pattern noise (FPN). IFs suffer from a "blue shift" effect and introduce spatially and spectrally correlated errors. Therefore, FPN correction is critical for enhancing crime scene image quality and is also helpful for spatial-spectral noise de-correlation. In this paper, a pixel-wise linear radiance-to-digital-count (DC) conversion model is constructed for the crime scene imaging CMOS sensor. The pixel-wise conversion gain Gi,j and dark signal non-uniformity (DSNU) Zi,j are calculated. The conversion gain is decomposed into four components: an FPN row component, an FPN column component, a defects component, and the effective photo-response signal component. The conversion gain is then corrected by averaging out the FPN column and row components and the defects component so that the sensor conversion gain is uniform. Based on the corrected conversion gain and the image incident radiance estimated by inverting the pixel-wise linear radiance-to-DC model, the corrected image spatial uniformity can be enhanced to 7 times that of the raw image, and the larger the image DC value within its dynamic range, the better the enhancement.
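
    A sketch of the pixel-wise linear model DC = G·L + Z and its inversion; the function names and the least-squares gain estimate from flat-field frames are ours, and the paper's decomposition into row, column, and defect components is more detailed:

    ```python
    import numpy as np

    def calibrate_pixelwise(flat_stack, radiances, dark_frame):
        """Per-pixel gain G and offset Z from flat fields at known radiances.

        Fits DC[i,j] = G[i,j] * L + Z[i,j] by least squares, taking Z
        from an averaged dark frame.
        """
        Z = dark_frame.astype(np.float64)
        num = sum(L * (f - Z) for f, L in zip(flat_stack, radiances))
        den = sum(L * L for L in radiances)
        return num / den, Z

    def correct_fpn(raw, G, Z):
        """Invert the per-pixel model, then re-apply a uniform mean gain."""
        radiance = (raw.astype(np.float64) - Z) / G
        return radiance * G.mean() + Z.mean()
    ```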

  11. The Juno Magnetic Field Investigation

    NASA Astrophysics Data System (ADS)

    Connerney, J. E. P.; Benn, M.; Bjarno, J. B.; Denver, T.; Espley, J.; Jorgensen, J. L.; Jorgensen, P. S.; Lawton, P.; Malinnikova, A.; Merayo, J. M.; Murphy, S.; Odom, J.; Oliversen, R.; Schnurr, R.; Sheppard, D.; Smith, E. J.

    2017-11-01

    The Juno Magnetic Field investigation (MAG) characterizes Jupiter's planetary magnetic field and magnetosphere, providing the first globally distributed and proximate measurements of the magnetic field of Jupiter. The magnetic field instrumentation consists of two independent magnetometer sensor suites, each consisting of a tri-axial Fluxgate Magnetometer (FGM) sensor and a pair of co-located imaging sensors mounted on an ultra-stable optical bench. The imaging system sensors are part of a subsystem that provides accurate attitude information (to ˜20 arcsec on a spinning spacecraft) near the point of measurement of the magnetic field. The two sensor suites are accommodated at 10 and 12 m from the body of the spacecraft on a 4 m long magnetometer boom affixed to the outer end of one of Juno's three solar array assemblies. The magnetometer sensors are controlled by independent and functionally identical electronics boards within the magnetometer electronics package mounted inside Juno's massive radiation shielded vault. The imaging sensors are controlled by a fully hardware redundant electronics package also mounted within the radiation vault. Each magnetometer sensor measures the vector magnetic field with 100 ppm absolute vector accuracy over a wide dynamic range (to 16 Gauss = 1.6 × 10⁶ nT per axis) with a resolution of ˜0.05 nT in the most sensitive dynamic range (±1600 nT per axis). Both magnetometers sample the magnetic field simultaneously at an intrinsic sample rate of 64 vector samples per second. The magnetic field instrumentation may be reconfigured in flight to meet unanticipated needs and is fully hardware redundant. The attitude determination system compares images with an on-board star catalog to provide attitude solutions (quaternions) at a rate of up to 4 solutions per second, and may be configured to acquire images of selected targets for science and engineering analysis. The system tracks and catalogs objects that pass through the imager field of view and also provides a continuous record of radiation exposure. A spacecraft magnetic control program was implemented to provide a magnetically clean environment for the magnetic sensors, and residual spacecraft fields and/or sensor offsets are monitored in flight taking advantage of Juno's spin (nominally 2 rpm) to separate environmental fields from those that rotate with the spacecraft.

  12. The Juno Magnetic Field Investigation

    NASA Technical Reports Server (NTRS)

    Connerney, J. E. P.; Benna, M.; Bjarno, J. B.; Denver, T.; Espley, J.; Jorgensen, J. L.; Jorgensen, P. S.; Lawton, P.; Malinnikova, A.; Merayo, J. M.

    2017-01-01

    The Juno Magnetic Field investigation (MAG) characterizes Jupiter's planetary magnetic field and magnetosphere, providing the first globally distributed and proximate measurements of the magnetic field of Jupiter. The magnetic field instrumentation consists of two independent magnetometer sensor suites, each consisting of a tri-axial Fluxgate Magnetometer (FGM) sensor and a pair of co-located imaging sensors mounted on an ultra-stable optical bench. The imaging system sensors are part of a subsystem that provides accurate attitude information (to approx. 20 arcsec on a spinning spacecraft) near the point of measurement of the magnetic field. The two sensor suites are accommodated at 10 and 12 m from the body of the spacecraft on a 4 m long magnetometer boom affixed to the outer end of one of Juno's three solar array assemblies. The magnetometer sensors are controlled by independent and functionally identical electronics boards within the magnetometer electronics package mounted inside Juno's massive radiation shielded vault. The imaging sensors are controlled by a fully hardware redundant electronics package also mounted within the radiation vault. Each magnetometer sensor measures the vector magnetic field with 100 ppm absolute vector accuracy over a wide dynamic range (to 16 Gauss = 1.6 x 10⁶ nT per axis) with a resolution of approx. 0.05 nT in the most sensitive dynamic range (+/-1600 nT per axis). Both magnetometers sample the magnetic field simultaneously at an intrinsic sample rate of 64 vector samples per second. The magnetic field instrumentation may be reconfigured in flight to meet unanticipated needs and is fully hardware redundant. The attitude determination system compares images with an on-board star catalog to provide attitude solutions (quaternions) at a rate of up to 4 solutions per second, and may be configured to acquire images of selected targets for science and engineering analysis. The system tracks and catalogs objects that pass through the imager field of view and also provides a continuous record of radiation exposure. A spacecraft magnetic control program was implemented to provide a magnetically clean environment for the magnetic sensors, and residual spacecraft fields and/or sensor offsets are monitored in flight taking advantage of Juno's spin (nominally 2 rpm) to separate environmental fields from those that rotate with the spacecraft.

  13. An HDR imaging method with DTDI technology for push-broom cameras

    NASA Astrophysics Data System (ADS)

    Sun, Wu; Han, Chengshan; Xue, Xucheng; Lv, Hengyi; Shi, Junxia; Hu, Changhong; Li, Xiangzhi; Fu, Yao; Jiang, Xiaonan; Huang, Liang; Han, Hongyin

    2018-03-01

    Conventionally, high dynamic-range (HDR) imaging is based on taking two or more pictures of the same scene with different exposures. However, due to the high-speed relative motion between the camera and the scene, this technique is hard to apply to push-broom remote sensing cameras. For the sake of HDR imaging in push-broom remote sensing applications, the present paper proposes an innovative method that can generate HDR images without redundant image sensors or optical components. Specifically, this paper adopts an area-array CMOS (complementary metal oxide semiconductor) sensor with digital-domain time-delay-integration (DTDI) technology for imaging, instead of adopting more than one row of image sensors, thereby taking more than one picture with different exposures. A new HDR image can then be achieved by fusing the two original images with a simple algorithm. In the experiment, the dynamic range (DR) of the image increases by 26.02 dB. The proposed method is proved to be effective and has potential in other imaging applications where there is relative motion between the camera and the scene.

  14. Fusing Laser Reflectance and Image Data for Terrain Classification for Small Autonomous Robots

    DTIC Science & Technology

    2014-12-01

    limit us to low power, lightweight sensors, and a maximum range of approximately 5 meters. Contrast these robot characteristics to typical terrain...classification work which uses large autonomous ground vehicles with sensors mounted high above the ground. Terrain classification for small autonomous...into predefined classes [10], [11]. However, wheeled vehicles offer the ability to use non-traditional sensors such as vibration sensors [12] and

  15. Sensors to Support the Soldier

    DTIC Science & Technology

    2005-02-01

    limited by the need to be man-portable. The Marine infantryman relies on mobility, aggressiveness, and training rather than elaborate equipment to...the absence of conventional GPS guidance, including man-portable inertial measurement units, and digital imaging sensors combined with image...cell phones and 802.11 networks shows that walls and floors are not impenetrable to wireless signals; it is a question of power, range, and frequency

  16. Hybrid graphene-copper UWB array sensor for brain tumor detection via scattering parameters in microwave detection system

    NASA Astrophysics Data System (ADS)

    Jamlos, Mohd Aminudin; Ismail, Abdul Hafiizh; Jamlos, Mohd Faizal; Narbudowicz, Adam

    2017-01-01

    A hybrid graphene-copper ultra-wideband array sensor applied to a microwave imaging technique is successfully used to detect and visualize a tumor inside the human brain. The sensor is made of graphene-coated film for the patch and copper for both the transmission line and the parasitic element. The hybrid sensor performs better than a fully copper sensor: it records a wider bandwidth of 2.0-10.1 GHz, compared with 2.5-10.1 GHz for the fully copper sensor, and a higher gain of 3.8-8.5 dB, versus 2.6-6.7 dB. Both sensors record excellent total efficiencies, averaging 97% and 94%, respectively. The sensor both transmits the interrogating signal and receives the backscattered signal from a stratified human head model for tumor detection. The difference between the scattering parameters recorded from the head model with and without a tumor present is the main data that is further processed by a confocal microwave imaging algorithm to generate an image. MATLAB software is utilized to analyze the S-parameter signals obtained from measurement. Tumor presence is indicated by lower S-parameter values compared to the higher values recorded in the tumor's absence.

  17. 2D-Visualization of metabolic activity with planar optical chemical sensors (optodes)

    NASA Astrophysics Data System (ADS)

    Meier, R. J.; Liebsch, G.

    2015-12-01

    Microbial life plays an outstandingly important role in many hydrologic compartments, such as the benthic community in sediments and biologically active microorganisms in the capillary fringe, in groundwater, or in soil. Oxygen, pH, and CO2 are key factors and indicators of microbial activity. They can be measured using optical chemical sensors. These sensors record the changing fluorescence properties of specific indicator dyes. The signals can be measured in a non-contact mode, even through transparent walls, which is important for many lab experiments. They can measure in closed (transparent) systems without sampling or intruding into the sample. They do not consume the analytes while measuring, are fully reversible, and are able to measure in non-stirred solutions. These sensors can be applied as high-precision fiber-optic sensors (for profiling), robust sensor spots, or planar sensors for 2D visualization (imaging). Imaging makes it possible to detect thousands of measurement spots at the same time and to generate 2D analyte maps over a region of interest. It allows for comparing different regions within one recorded image, visualizing spatial analyte gradients, or, more importantly, identifying hot spots of metabolic activity. We present ready-to-use portable imaging systems for the analytes oxygen, pH, and CO2. They consist of a detector unit, planar sensor foils, and software for easy data recording and evaluation. Sensor foils for various analytes and measurement ranges enable visualizing metabolic activity or analyte changes in the desired range. Dynamics of metabolic activity can be detected in one shot or over long time periods. We demonstrate the potential of this analytical technique by presenting experiments on benthic disturbance-recovery dynamics in sediments and on microbial degradation of organic material in the capillary fringe. We regard this technique as a new tool for further understanding how microbial and geochemical processes are linked in (not solely) hydrologic systems.

  18. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications, including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical imaging, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment in a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view; a sketch of this channel separation follows below. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types, including linear, area, silicon, and InGaAs. The dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. The benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches, including their passivity, spectral range, customization options, and scalable production.
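
    De-mosaicing such a filter array starts by splitting the raw frame into its spectral channels. A sketch for an assumed 2x2 RGB+NIR layout; real mosaics and any interpolation back to full resolution depend on the product:

    ```python
    import numpy as np

    def split_mosaic_2x2(raw):
        """Split a 2x2 filter mosaic into four quarter-resolution channels.

        Assumed layout per super-pixel:   R   G
                                          B   NIR
        """
        return {
            "R":   raw[0::2, 0::2],
            "G":   raw[0::2, 1::2],
            "B":   raw[1::2, 0::2],
            "NIR": raw[1::2, 1::2],
        }
    ```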

  19. Optimization of CMOS image sensor utilizing variable temporal multisampling partial transfer technique to achieve full-frame high dynamic range with superior low light and stop motion capability

    NASA Astrophysics Data System (ADS)

    Kabir, Salman; Smith, Craig; Armstrong, Frank; Barnard, Gerrit; Schneider, Alex; Guidash, Michael; Vogelsang, Thomas; Endsley, Jay

    2018-03-01

    Differential binary pixel technology is a threshold-based timing, readout, and image reconstruction method that utilizes a subframe partial charge transfer technique in a standard four-transistor (4T) pixel CMOS image sensor to achieve high dynamic range video with stop motion. This technology improves the low-light signal-to-noise ratio (SNR) by up to 21 dB. The method is verified in silicon using a Taiwan Semiconductor Manufacturing Company 65 nm, 1.1 μm pixel technology, 1-megapixel test chip array, and is compared with a traditional 4× oversampling technique using full charge transfer to show the low-light SNR superiority of the presented technology.

  20. First Experiences with Kinect v2 Sensor for Close Range 3d Modelling

    NASA Astrophysics Data System (ADS)

    Lachat, E.; Macher, H.; Mittet, M.-A.; Landes, T.; Grussenmeyer, P.

    2015-02-01

    RGB-D cameras, also known as range imaging cameras, are a recent generation of sensors. As they are suitable for measuring distances to objects at high frame rates, such sensors are increasingly used for 3D acquisition, and more generally for applications in robotics and computer vision. This kind of sensor became popular especially after the Kinect v1 (Microsoft) arrived on the market in November 2010. In July 2014, Microsoft released a new sensor, the Kinect for Windows v2, based on a different technology than the first device. However, because it was initially developed for video games, the quality assessment of this new device for 3D modelling represents a major axis of investigation. In this paper, first experiences with the Kinect v2 sensor are related, and its suitability for close-range 3D modelling is investigated. For this purpose, error sources in the output data as well as a calibration approach are presented.

  1. High-resolution panoramic images with megapixel MWIR FPA

    NASA Astrophysics Data System (ADS)

    Leboucher, Vincent; Aubry, Gilles

    2014-06-01

    Continuing its current strategy, HGH maintains a major effort in developing its most recent product family: infrared (IR) panoramic 360-degree surveillance sensors. Over the last two years, HGH optimized its prototype Mid-Wave IR (MWIR) panoramic sensor, IR Revolution 360 HD, which gave birth to the Spynel-S product. Various test campaigns proved its excellent image quality. Cyclope, the software associated with Spynel, benefitted from recent image processing improvements and new functionalities such as target geolocalization, long-range sensor slew-to-cue, and facilitated forensic analysis. Within the framework of the PANORAMIR project supported by the DGA (Délégation Générale de l'Armement), HGH designed a new extra-large-resolution sensor including a MWIR megapixel Focal Plane Array (FPA) detector (1280x1024 pixels). This new sensor is called Spynel-X. It provides 360-degree images of outstanding resolution (more than 100 Mpixels). The mechanical frame of Spynel (-S and -X) was designed in collaboration with an industrial design agency. Spynel received the "Observeur du Design 2013" label.

  2. An evaluation of three-dimensional sensors for the extravehicular activity helper/retriever

    NASA Technical Reports Server (NTRS)

    Magee, Michael

    1993-01-01

    The Extravehicular Activity Helper/Retriever (EVAHR) is a robotic device currently under development at the NASA Johnson Space Center that is designed to fetch objects or to assist in retrieving an astronaut who may have become inadvertently de-tethered. The EVAHR will be required to exhibit a high degree of intelligent autonomous operation and will base much of its reasoning upon information obtained from one or more three-dimensional sensors that it will carry and control. At the highest level of visual cognition and reasoning, the EVAHR will be required to detect objects, recognize them, and estimate their spatial orientation and location. The recognition phase and the estimation of spatial pose will depend on the ability of the vision system to reliably extract geometric features of the objects, such as whether the observed surface topologies are planar or curved, and the spatial relationships between the component surfaces. Accurate sensing of the operational environment and of the objects in it will therefore be critical to achieving these tasks. This report documents the qualitative and quantitative results of empirical studies of three sensors that are capable of providing three-dimensional information to the EVAHR but use completely different hardware approaches. The first of these devices is a phase-shift laser with an effective operating range (ambiguity interval) of approximately 15 meters. The second sensor is a laser triangulation system designed to operate at much closer range and to provide higher resolution images. The third sensor is a dual-camera stereo imaging system from which range images can also be obtained. The remainder of the report characterizes the strengths and weaknesses of each of these systems relative to the quality of the data extracted and how different object characteristics affect sensor operation.
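
    As a reminder of how the 15-meter ambiguity interval of a phase-shift laser arises, the sketch below computes the unambiguous range and the phase-derived range for an assumed modulation frequency (about 10 MHz yields a ~15 m interval); the numbers are illustrative, not the sensor's actual specification.

        import math

        C = 299_792_458.0  # speed of light, m/s

        def unambiguous_range(f_mod_hz):
            """Phase-shift lidar ambiguity interval: c / (2 * f_mod)."""
            return C / (2 * f_mod_hz)

        def range_from_phase(phase_rad, f_mod_hz):
            """Range within one ambiguity interval from the measured phase shift."""
            return (phase_rad / (2 * math.pi)) * unambiguous_range(f_mod_hz)

        print(unambiguous_range(10e6))  # ~15 m, matching the cited interval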

  3. Compact, self-contained enhanced-vision system (EVS) sensor simulator

    NASA Astrophysics Data System (ADS)

    Tiana, Carlo

    2007-04-01

    We describe the model SIM-100 PC-based simulator, for imaging sensors used, or planned for use, in Enhanced Vision System (EVS) applications. Typically housed in a small-form-factor PC, it can be easily integrated into existing out-the-window visual simulators for fixed-wing or rotorcraft, to add realistic sensor imagery to the simulator cockpit. Multiple bands of infrared (short-wave, midwave, extended-midwave and longwave) as well as active millimeter-wave RADAR systems can all be simulated in real time. Various aspects of physical and electronic image formation and processing in the sensor are accurately (and optionally) simulated, including sensor random and fixed pattern noise, dead pixels, blooming, B-C scope transformation (MMWR). The effects of various obscurants (fog, rain, etc.) on the sensor imagery are faithfully represented and can be selected by an operator remotely and in real-time. The images generated by the system are ideally suited for many applications, ranging from sensor development engineering tradeoffs (Field Of View, resolution, etc.), to pilot familiarization and operational training, and certification support. The realistic appearance of the simulated images goes well beyond that of currently deployed systems, and beyond that required by certification authorities; this level of realism will become necessary as operational experience with EVS systems grows.

  4. A Dynamic Range Enhanced Readout Technique with a Two-Step TDC for High Speed Linear CMOS Image Sensors.

    PubMed

    Gao, Zhiyuan; Yang, Congjie; Xu, Jiangtao; Nie, Kaiming

    2015-11-06

    This paper presents a dynamic range (DR) enhanced readout technique with a two-step time-to-digital converter (TDC) for high-speed linear CMOS image sensors. A multi-capacitor, self-regulated capacitive trans-impedance amplifier (CTIA) structure is employed to extend the dynamic range. The gain of the CTIA is automatically adjusted by asynchronously switching different capacitors onto the integration node according to the output voltage. A column-parallel ADC based on a two-step TDC is utilized to improve the conversion rate; the conversion is divided into a coarse phase and a fine phase. An error calibration scheme is also proposed to correct quantization errors caused by propagation delay skew within -Tclk to +Tclk. A linear CMOS image sensor pixel array was designed in a 0.13 μm CMOS process to verify this DR-enhanced high-speed readout technique. The post-simulation results indicate that the dynamic range of the readout circuit is 99.02 dB and that the ADC achieves 60.22 dB SNDR and 9.71-bit ENOB at a conversion rate of 2 MS/s after calibration, improvements of 14.04 dB and 2.4 bits over the SNDR and ENOB without calibration.
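
    To illustrate the coarse/fine principle of a two-step TDC, the sketch below quantizes a time interval with a coarse clock counter plus a fine interpolator for the residue; the resolutions are assumed for illustration only.

        def two_step_tdc(t_in, t_clk, t_fine):
            """Quantize t_in with a coarse counter (period t_clk) and a fine
            interpolator (LSB t_fine) that measures the residue; returns the
            combined digital code."""
            coarse = int(t_in // t_clk)        # coarse phase: count full clock periods
            residue = t_in - coarse * t_clk    # fine phase: interpolate the remainder
            fine = int(round(residue / t_fine))
            return coarse * int(round(t_clk / t_fine)) + fine

        # e.g. a 10 ns clock with a 156.25 ps interpolator -> 64 fine codes per clock
        print(two_step_tdc(123.4e-9, 10e-9, 156.25e-12))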

  5. Intelligent imaging systems for automotive applications

    NASA Astrophysics Data System (ADS)

    Thompson, Chris; Huang, Yingping; Fu, Shan

    2004-03-01

    In common with many other application areas, visual signals are becoming an increasingly important information source for many automotive applications. For several years, CCD cameras have been used as research tools for a range of automotive applications. Infrared cameras, RADAR, and LIDAR are other types of imaging sensors that have also been widely investigated for use in cars. This paper describes work in this field performed in C2VIP over the last decade, starting with Night Vision Systems and looking at various other Advanced Driver Assistance Systems. Emerging from this experience, we make the following observations, which are crucial for "intelligent" imaging systems: (1) careful arrangement of the sensor array; (2) dynamic self-calibration; (3) networking and processing; (4) fusion with other imaging sensors, both at the image level and the feature level, which provides much more flexibility and reliability in complex situations. We discuss how these problems can be addressed and what the outstanding issues are.

  6. Method of radiometric quality assessment of NIR images acquired with a custom sensor mounted on an unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Wierzbicki, Damian; Fryskowska, Anna; Kedzierski, Michal; Wojtkowska, Michalina; Delis, Paulina

    2018-01-01

    Unmanned aerial vehicles are suited to various photogrammetry and remote sensing missions. Such platforms are equipped with various optoelectronic sensors imaging in the visible and infrared spectral ranges, as well as thermal sensors. Nowadays, near-infrared (NIR) images acquired from low altitudes are often used for producing orthophoto maps for precision agriculture, among other applications. One major problem results from the use of low-cost, compact custom NIR cameras whose wide-angle lenses introduce vignetting; in numerous cases, such cameras acquire images of low radiometric quality depending on the lighting conditions. The paper presents a method of radiometric quality assessment of low-altitude NIR imagery data from a custom sensor. The method utilizes statistical analysis of the NIR images. The data used for the analyses were acquired from various altitudes in various weather and lighting conditions. An objective NIR imagery quality index was determined as a result of the research. The results obtained using this index enabled the classification of images into three categories: good, medium, and low radiometric quality. The classification makes it possible to determine the a priori error of the acquired images and to assess whether a rerun of the photogrammetric flight is necessary.

  7. Sensor Management for Tactical Surveillance Operations

    DTIC Science & Technology

    2007-11-01

    Excerpts from the report's sensor tables: active and passive sonar for submarine and torpedo detection and mine avoidance [range, bearing], range 1.8 km to 55 km, active or passive (AN/SLQ-501); direction-finding (DF) unit [bearing, classification], maximum range 1100 km, passive; daylight/night-vision cameras (video and still) record optical and infrared still images or motion video of events for near-real-time assessment or long-term analysis and archiving, with range limited by the image resolution.

  8. Micromachined Chip Scale Thermal Sensor for Thermal Imaging.

    PubMed

    Shekhawat, Gajendra S; Ramachandran, Srinivasan; Jiryaei Sharahi, Hossein; Sarkar, Souravi; Hujsak, Karl; Li, Yuan; Hagglund, Karl; Kim, Seonghwan; Aden, Gary; Chand, Ami; Dravid, Vinayak P

    2018-02-27

    The lateral resolution of scanning thermal microscopy (SThM) has hitherto never approached that of mainstream atomic force microscopy, mainly due to poor performance of the thermal sensor. Herein, we report a nanomechanical system-based thermal sensor (thermocouple) that enables high lateral resolution that is often required in nanoscale thermal characterization in a wide range of applications. This thermocouple-based probe technology delivers excellent lateral resolution (∼20 nm), extended high-temperature measurements >700 °C without cantilever bending, and thermal sensitivity (∼0.04 °C). The origin of significantly improved figures-of-merit lies in the probe design that consists of a hollow silicon tip integrated with a vertically oriented thermocouple sensor at the apex (low thermal mass) which interacts with the sample through a metallic nanowire (50 nm diameter), thereby achieving high lateral resolution. The efficacy of this approach to SThM is demonstrated by imaging embedded metallic nanostructures in silica core-shell, metal nanostructures coated with polymer films, and metal-polymer interconnect structures. The nanoscale pitch and extremely small thermal mass of the probe promise significant improvements over existing methods and wide range of applications in several fields including semiconductor industry, biomedical imaging, and data storage.

  9. Backside illuminated CMOS-TDI line scanner for space applications

    NASA Astrophysics Data System (ADS)

    Cohen, O.; Ben-Ari, N.; Nevo, I.; Shiloah, N.; Zohar, G.; Kahanov, E.; Brumer, M.; Gershon, G.; Ofer, O.

    2017-09-01

    A new multi-spectral line scanner CMOS image sensor is reported. The backside-illuminated (BSI) image sensor was designed for continuous-scanning Low Earth Orbit (LEO) space applications and includes custom high-quality CMOS active pixels, a Time Delayed Integration (TDI) mechanism that increases the SNR, a 2-phase exposure mechanism that increases the dynamic Modulation Transfer Function (MTF), very low power internal Analog-to-Digital Converters (ADCs) with a resolution of 12 bits per pixel, and an on-chip controller. The sensor has 4 independent pixel arrays, each arranged in 2600 TDI columns with a controllable TDI depth from 8 up to 64 TDI levels. A multispectral optical filter with a specific spectral response per array is assembled at the package level. In this paper we briefly describe the sensor design and present recent electrical and electro-optical measurements of the first prototypes, including high Quantum Efficiency (QE), high MTF, wide-range selectable Full Well Capacity (FWC), excellent linearity of approximately 1.3% in a signal range of 5-85% and approximately 1.75% in a signal range of 2-95% of the signal span, readout noise of approximately 95 electrons with 64 TDI levels, negligible dark current, and total power consumption of less than 1.5 W for the 4-band sensor at all operating conditions.
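
    As a back-of-the-envelope view of why TDI depth matters: the summed signal grows linearly with the number of TDI stages while uncorrelated noise adds in quadrature, so SNR improves roughly as the square root of the TDI depth. A minimal sketch with assumed per-stage signal and noise values, not the sensor's actual noise model:

        import math

        def tdi_snr(signal_e, noise_e_per_stage, stages):
            """SNR after summing `stages` TDI lines: signal adds linearly,
            uncorrelated per-stage noise adds in quadrature (illustrative model)."""
            return (signal_e * stages) / (noise_e_per_stage * math.sqrt(stages))

        for n in (8, 16, 32, 64):  # the sensor's selectable TDI depths
            print(n, round(tdi_snr(100.0, 20.0, n), 1))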

  10. Preliminary Design of a Lightning Optical Camera and ThundEr (LOCATE) Sensor

    NASA Technical Reports Server (NTRS)

    Phanord, Dieudonne D.; Koshak, William J.; Rybski, Paul M.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The preliminary design of an optical/acoustical instrument is described for making highly accurate real-time determinations of the location of cloud-to-ground (CG) lightning. The instrument, named the Lightning Optical Camera And ThundEr (LOCATE) sensor, will also image the clear and cloud-obscured lightning channels produced by CGs and cloud flashes, and will record the transient optical waveforms produced by these discharges. The LOCATE sensor will consist of a full (360-degree) field-of-view optical camera for obtaining the CG channel image and azimuth, a sensitive thunder microphone for obtaining the CG range, and a fast photodiode system for time-resolving the lightning optical waveform. The optical waveform data will be used to discriminate CGs from cloud flashes. Together, the optical azimuth and the thunder range are used to locate CGs, and it is anticipated that a network of LOCATE sensors would determine CG source locations to well within 100 meters. All of this would be accomplished at a relatively low cost compared to present RF lightning location technologies, though the range detection is limited and will be quantified in the future. The LOCATE sensor technology would have practical applications for electric power utility companies, government (e.g., NASA Kennedy Space Center lightning safety and warning), golf resort lightning safety, telecommunications, and other industries.
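
    A minimal sketch of the optical-plus-acoustic location principle described above: range follows from the flash-to-thunder delay at the speed of sound, and combining it with the camera azimuth gives a ground position. The speed-of-sound constant and the geometry helper are illustrative assumptions.

        import math

        V_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

        def cg_location(azimuth_deg, thunder_delay_s):
            """Locate a cloud-to-ground strike from the optical azimuth and the
            delay between the optical flash and the thunder arrival."""
            rng = V_SOUND * thunder_delay_s          # acoustic range, m
            az = math.radians(azimuth_deg)
            return rng * math.sin(az), rng * math.cos(az)  # (east, north) offsets, m

        print(cg_location(45.0, 3.0))  # ~1 km away to the northeast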

  11. Optical system design of CCD star sensor with large aperture and wide field of view

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Jiang, Lun; Li, Ying-chao; Liu, Zhuang

    2017-10-01

    The star sensor is one of the sensors used to determine the spatial attitude of a space vehicle. An optical system for a star sensor with large aperture and wide field of view was designed in this paper. The effective focal length of the optics is 16 mm, the F-number is 1.2, and the field of view of the optical system is 20°. The working spectrum is 500 to 800 nm. The lens system adopts a complicated Petzval-like structure and a special glass couple, and achieves high imaging quality over the whole spectral range. For each field-of-view point, the value of the modulation transfer function at 50 cycles/mm is higher than 0.3. On the detecting plane, the encircled energy in a circle of 14 μm diameter is up to 80% of the total energy. Over the whole field of view, the dispersion spot diameter in the imaging plane is no larger than 13 μm. The full-field distortion is less than 0.1%, which helps obtain the accurate location of the reference star from the picture taken by the star sensor. The lateral chromatic aberration is less than 2 μm over the whole spectral range.

  12. An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability.

    PubMed

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U

    2015-03-06

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and a 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle.

  13. An Ultra-Low Power CMOS Image Sensor with On-Chip Energy Harvesting and Power Management Capability

    PubMed Central

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U.

    2015-01-01

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and a 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle. PMID:25756863

  14. Frequency band justifications for passive sensors, 1 to 10 GHz. [for monitoring earth resources and the environment

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Remote sensor systems operating in the microwave region of the frequency spectrum provide information unobtainable with basic imaging techniques such as photography, television, or multispectral imaging. The frequency allocation requirements for passive microwave sensors used in the earth exploration satellite and space research services are presented for: (1) agriculture, forestry, and range resources; (2) land use survey and mapping; (3) water resources; (4) weather and climate; (5) environmental quality; and (6) marine resources, estuaries, and oceans. Because measurements are required simultaneously in multiple frequency bands to adequately determine the values of some phenomena, the relationships between frequency bands are discussed. The various measurement accuracies, dynamic ranges, resolutions, and frequency needs are examined. A band-by-band summary of requirements, unique aspects, and sharing analyses of the required frequency bands is included.

  15. A NIR-BODIPY derivative for sensing copper(II) in blood and mitochondrial imaging

    NASA Astrophysics Data System (ADS)

    He, Shao-Jun; Xie, Yu-Wen; Chen, Qiu-Yun

    2018-04-01

    In order to develop NIR BODIPY dyes as mitochondria-targeting imaging agents and metal sensors, a side-chain-modified BODIPY (BPN) was synthesized and spectroscopically characterized. BPN has NIR emission at 765 nm when excited at 704 nm. The emission at 765 nm responds differently to Cu2+ and Mn2+ ions: BPN coordinates with Cu2+ to form a [BPNCu]2+ complex with quenched emission, while Mn2+ induces aggregation of BPN with specific fluorescence enhancement. Moreover, BPN can be applied to monitor Cu2+ in live cells and to image mitochondria. Further, BPN was used as a sensor for the detection of Cu2+ ions in serum with a linear detection range of 0.45 μM to 36.30 μM. The results indicate that BPN is a good sensor for detecting Cu2+ in serum and for imaging mitochondria. This study suggests strategies for the future design of NIR sensors for the analysis of metal ions in blood.

  16. A NIR-BODIPY derivative for sensing copper(II) in blood and mitochondrial imaging.

    PubMed

    He, Shao-Jun; Xie, Yu-Wen; Chen, Qiu-Yun

    2018-04-15

    In order to develop NIR BODIPY dyes as mitochondria-targeting imaging agents and metal sensors, a side-chain-modified BODIPY (BPN) was synthesized and spectroscopically characterized. BPN has NIR emission at 765 nm when excited at 704 nm. The emission at 765 nm responds differently to Cu2+ and Mn2+ ions: BPN coordinates with Cu2+ to form a [BPNCu]2+ complex with quenched emission, while Mn2+ induces aggregation of BPN with specific fluorescence enhancement. Moreover, BPN can be applied to monitor Cu2+ in live cells and to image mitochondria. Further, BPN was used as a sensor for the detection of Cu2+ ions in serum with a linear detection range of 0.45 μM to 36.30 μM. The results indicate that BPN is a good sensor for detecting Cu2+ in serum and for imaging mitochondria. This study suggests strategies for the future design of NIR sensors for the analysis of metal ions in blood.

  17. Scanning Shack-Hartmann wavefront sensor

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl V.

    2004-09-01

    Criss-crossing of focal spots is the cause of the narrow dynamic range of Shack-Hartmann sensors; in practice, aberrations beyond about +/-3 diopters cannot be measured. A method has been proposed for ophthalmologic applications using a rarefied lenslet array through which the wavefront is projected while the global tilt is changed step by step. The data acquired at each step are accumulated and processed. In an experimental setup, a doubled dynamic range was achieved with four steps of wavefront tilting.
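
    For reference, a Shack-Hartmann sensor converts each focal-spot displacement into a local wavefront slope via the lenslet focal length; the minimal sketch below shows that relation and how a known global tilt offsets the measurable slope window. The values are illustrative, not the paper's setup parameters.

        def local_slope(spot_shift_m, lenslet_focal_m, global_tilt_rad=0.0):
            """Wavefront slope sensed by one lenslet: spot shift / focal length.
            Adding a known global tilt re-centers the dynamic range, so larger
            aberrations stay within the crosstalk-free spot window."""
            return spot_shift_m / lenslet_focal_m + global_tilt_rad

        print(local_slope(5e-6, 5e-3))          # 1 mrad local slope
        print(local_slope(5e-6, 5e-3, 2e-3))    # same spot shift under a 2 mrad tilt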

  18. Imaging sensor constellation for tomographic chemical cloud mapping.

    PubMed

    Cosofret, Bogdan R; Konno, Daisei; Faghfouri, Aram; Kindle, Harry S; Gittins, Christopher M; Finson, Michael L; Janov, Tracy E; Levreault, Mark J; Miyashiro, Rex K; Marinelli, William J

    2009-04-01

    A sensor constellation capable of determining the location and detailed concentration distribution of chemical warfare agent simulant clouds has been developed and demonstrated on government test ranges. The constellation is based on the use of standoff passive multispectral infrared imaging sensors to make column density measurements through the chemical cloud from two or more locations around its periphery. A computed tomography inversion method is employed to produce a 3D concentration profile of the cloud from the 2D line density measurements. We discuss the theoretical basis of the approach and present results of recent field experiments in which controlled releases of chemical warfare agent simulants were simultaneously viewed by three chemical imaging sensors. Systematic investigations of the algorithm using synthetic data indicate that, for complex functions, 3D reconstruction errors are less than 20% even in the case of a limited three-sensor measurement network. Field data results demonstrate the capability of the constellation to determine 3D concentration profiles that account for approximately 86% of the total known mass of material released.
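
    The computed-tomography inversion named above recovers a concentration field from line (column) density measurements. As a generic illustration of that inverse problem, the sketch below runs a few Kaczmarz (ART) sweeps on a toy system A x = b, where each row of A is one line-of-sight weighting; this is a standard textbook method, not necessarily the authors' exact algorithm.

        import numpy as np

        def art_reconstruct(A, b, sweeps=50):
            """Kaczmarz/ART: project the estimate onto each measurement hyperplane."""
            x = np.zeros(A.shape[1])
            for _ in range(sweeps):
                for a_i, b_i in zip(A, b):
                    x += (b_i - a_i @ x) / (a_i @ a_i) * a_i
            return np.clip(x, 0, None)  # concentrations are non-negative

        # Toy example: 3 voxels viewed along 3 overlapping lines of sight.
        A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]])
        x_true = np.array([0.0, 2.0, 1.0])
        print(art_reconstruct(A, A @ x_true))  # -> approximately [0, 2, 1]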

  19. Changing requirements and solutions for unattended ground sensors

    NASA Astrophysics Data System (ADS)

    Prado, Gervasio; Johnson, Robert

    2007-10-01

    Unattended Ground Sensors (UGS) were first used to monitor Viet Cong activity along the Ho Chi Minh Trail in the 1960s. In the 1980s, significant improvement in the capabilities of UGS became possible with the development of digital signal processors; this led to their use as fire control devices for smart munitions (for example, the Wide Area Mine) and later to monitor the movements of mobile missile launchers. In these applications, the targets of interest were large military vehicles with strong acoustic, seismic, and magnetic signatures. Currently, the requirements imposed by new terrorist threats and illegal border crossings have shifted the emphasis to the monitoring of light vehicles and foot traffic. These new requirements have changed the way UGS are used. To improve performance against targets with lower emissions, sensors are used in multi-modal arrangements. Non-imaging sensors (acoustic, seismic, magnetic, and passive infrared) are now being used principally as activity sensors to cue imagers and remote cameras. The availability of better imaging technology has made imagers the preferred source of "actionable intelligence". Infrared cameras are now based on uncooled detector arrays whose cost and power consumption have made their application in UGS practical. Visible light imagers are also more sensitive, extending their utility well beyond twilight. The imagers are equipped with sophisticated image processing capabilities (image enhancement, moving target detection and tracking, image compression). Various commercial satellite services now provide relatively inexpensive long-range communications, and the Internet provides fast worldwide access to the data.

  20. Imaging Flash Lidar for Autonomous Safe Landing and Spacecraft Proximity Operation

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Roback, Vincent E.; Brewster, Paul F.; Hines, Glenn D.; Bulyshev, Alexander E.

    2016-01-01

    3-D imaging flash lidar is recognized as a primary candidate sensor for safe precision landing on solar system bodies (Moon, Mars, Jupiter and Saturn moons, etc.), and for the autonomous rendezvous, proximity operations, and docking/capture necessary for asteroid sample return and redirect missions, spacecraft docking, satellite servicing, and space debris removal. During the final stages of landing, from about 1 km to 500 m above the ground, the flash lidar can generate three-dimensional images of the terrain to identify hazardous features such as craters, rocks, and steep slopes. The onboard flight computer can then use the 3-D terrain map to guide the vehicle to a safe location. As an automated rendezvous and docking sensor, the flash lidar can provide relative range, velocity, and bearing from an approaching spacecraft to another spacecraft or a space station from a distance of several kilometers. NASA Langley Research Center has developed and demonstrated a flash lidar sensor system capable of generating 16k-pixel range images with 7 cm precision, at a 20 Hz frame rate, from a maximum slant range of 1800 m from the target area. This paper describes the lidar instrument design and capabilities as demonstrated by closed-loop flight tests onboard a rocket-propelled free-flyer vehicle (Morpheus). A plan for continued advancement of the flash lidar technology is then explained; this proposed plan is aimed at the development of a common sensor that, with a modest design adjustment, can meet the needs of both landing and proximity operation and docking applications.
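
    As a quick illustration of the flash-lidar timing budget: range follows from the round-trip time of a laser pulse, R = c t / 2, so the quoted 7 cm precision corresponds to timing the return to roughly half a nanosecond. A minimal sketch:

        C = 299_792_458.0  # speed of light, m/s

        def range_from_tof(round_trip_s):
            """Flash lidar range from the round-trip pulse time: R = c * t / 2."""
            return C * round_trip_s / 2.0

        print(range_from_tof(12e-6))  # ~1800 m, the demonstrated slant range
        print(2 * 0.07 / C)           # ~0.47 ns of timing for 7 cm precision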

  1. Fiber optic sensors and systems at the Federal University of Rio de Janeiro

    NASA Astrophysics Data System (ADS)

    Werneck, Marcelo M.; dos Santos, Paulo A. M.; Ferreira, Aldo P.; Maggi, Luis E.; de Carvalho, Carlos R., Jr.; Ribeiro, R. M.

    1998-08-01

    As widely known, fiber optics (FO) are being used in a large variety of sensors and systems, particularly for their small dimensions, low cost, large bandwidth, and favorable dielectric properties. These properties have allowed us to develop sensors and systems for general applications and, particularly, for biomedical engineering. The intravascular pressure sensor was designed for small dimensions and high bandwidth. The system is based on a light-intensity modulation technique and uses a 2 mm diameter elastomer membrane as the sensor element and a pigtailed laser as the light source. The optical power output curve was linear for pressures within the range of 0 to 300 mmHg. The real-time optical biosensor uses the evanescent field technique for monitoring Escherichia coli growth in culture media; it monitors interactions between the analyte (bacteria) and the evanescent field of an optical fiber passing through it. The FO-based high voltage and current sensor is a measuring system designed for monitoring voltage and current in high-voltage transmission lines. The linearity of the system is better than 2% over both ranges of 0 to 25 kV and 0 to 1000 A. The optical flowmeter uses a cross-correlation technique that analyses two light beams crossing the flow separated by a fixed distance. The x-ray image sensor uses a scintillating FO array, one fiber for each image pixel, to form an image of the x-ray field. The systems described in this paper use general-purpose components, including optical fibers and optoelectronic devices, which are readily available and of low cost.

  2. Research progress in fiber optic sensors and systems at the Federal University of Rio de Janeiro

    NASA Astrophysics Data System (ADS)

    Werneck, Marcelo M.; Ferreira, Aldo P.; Maggi, Luis E.; De Carvalho, C. C.; Ribeiro, R. M.

    1999-02-01

    As widely known, fiber optics (FO) are being used in a large variety of sensors and systems, particularly for their small dimensions, low cost, large bandwidth, and favorable dielectric properties. These properties have allowed us to develop sensors and systems for general applications and, particularly, for biomedical engineering. The intravascular pressure sensor was designed for small dimensions and high bandwidth. The system is based on a light-intensity modulation technique and uses a 2 mm diameter elastomer membrane as the sensor element and a pigtailed laser as the light source. The optical power output curve was linear for pressures within the range of 0 to 300 mmHg. The real-time optical biosensor uses the evanescent field technique for monitoring Escherichia coli growth in culture media; it monitors interactions between the analyte and the evanescent field of an optical fiber passing through it. The FO-based high voltage and current sensor is a measuring system designed for monitoring voltage and current in high-voltage transmission lines. The linearity of the system is better than 2% over both ranges of 0 to 25 kV and 0 to 1000 A. The optical flowmeter uses a cross-correlation technique that analyzes two light beams crossing the flow separated by a fixed distance. The x-ray image sensor uses a scintillating FO array, one fiber for each image pixel, to form an image of the x-ray field. The systems described in this paper use general-purpose components, including optical fibers and optoelectronic devices, which are readily available and of low cost.

  3. Architecture and applications of a high resolution gated SPAD image sensor

    PubMed Central

    Burri, Samuel; Maruyama, Yuki; Michalet, Xavier; Regazzoni, Francesco; Bruschini, Claudio; Charbon, Edoardo

    2014-01-01

    We present the architecture and three applications of the highest-resolution image sensor based on single-photon avalanche diodes (SPADs) published to date. The sensor, fabricated in a high-voltage CMOS process, has a resolution of 512 × 128 pixels and a pitch of 24 μm. The fill factor of 5% can be increased to 30% with the use of microlenses. For precise control of the exposure and for time-resolved imaging, we use fast global gating signals to define exposure windows as small as 4 ns. The uniformity of the gate edge locations is ∼140 ps (FWHM) over the whole array, while in-pixel digital counting enables frame rates as high as 156 kfps. Currently, our camera is used as a highly sensitive sensor with high temporal resolution for applications ranging from fluorescence lifetime measurements to fluorescence correlation spectroscopy and the generation of true random numbers. PMID:25090572

  4. Active laser radar (lidar) for measurement of corresponding height and reflectance images

    NASA Astrophysics Data System (ADS)

    Froehlich, Christoph; Mettenleiter, M.; Haertl, F.

    1997-08-01

    For the survey and inspection of environmental objects, non-contact, robust, and precise imaging of height and depth is the basic sensor technology. For visual inspection, surface classification, and documentation purposes, however, additional information on the reflectance of the measured objects is necessary. High-speed acquisition of both geometric and visual information is achieved by means of an active laser radar supporting consistent 3D height and 2D reflectance images. The laser radar is an optical-wavelength system, comparable to devices built by ERIM, Odetics, and Perceptron, measuring both the range between the sensor and target surfaces and the reflectance of the target surface, which corresponds to the magnitude of the backscattered laser energy. In contrast to those range sensing devices, the laser radar under consideration is designed for high-speed, precise operation in both indoor and outdoor environments, emitting a minimum of near-IR laser energy. It integrates a laser range measurement system with a mechanical deflection system for 3D environmental measurements. This paper reports on design details of the laser radar for surface inspection tasks. It outlines the performance requirements and introduces the measurement principle. The hardware design is discussed, including the main modules such as the laser head, the high-frequency unit, the laser beam deflection system, and the digital signal processing unit. The signal processing unit consists of dedicated signal processors for real-time sensor data preprocessing as well as a sensor computer for high-level image analysis and feature extraction. The paper focuses on performance data of the system, including noise, drift over time, precision, and accuracy, and discusses the influences of ambient light, target surface material, and ambient temperature on range accuracy and range precision. Furthermore, experimental results from the inspection of buildings, monuments, and industrial environments are presented. The paper concludes by summarizing results achieved in industrial environments and gives a short outlook on future work.

  5. High dynamic range vision sensor for automotive applications

    NASA Astrophysics Data System (ADS)

    Grenet, Eric; Gyger, Steve; Heim, Pascal; Heitger, Friedrich; Kaess, Francois; Nussbaum, Pascal; Ruedi, Pierre-Francois

    2005-02-01

    A 128 × 128 pixel, 120 dB vision sensor that extracts, at the pixel level, the contrast magnitude and direction of local image features is used to implement a lane tracking system. The contrast representation (relative change of illumination) delivered by the sensor is independent of the illumination level. Together with the high dynamic range of the sensor, it ensures a very stable image feature representation even under high spatial and temporal inhomogeneities of the illumination. Image features are dispatched off chip according to their contrast magnitude, prioritizing features with high contrast. This drastically reduces the amount of data transmitted out of the chip, and hence the processing power required for subsequent processing stages. To compensate for the low fill factor (9%) of the sensor, micro-lenses were deposited, which increase the sensitivity by a factor of 5, corresponding to an equivalent of 2000 ASA. An algorithm exploiting the contrast representation output by the vision sensor has been developed to estimate the position of a vehicle relative to the road markings. The algorithm first detects the road markings based on the contrast direction map. Then, it performs quadratic fits on selected 3 × 3 pixel kernels to achieve sub-pixel accuracy in the estimation of the lane marking positions. The resulting precision of the estimated vehicle lateral position is 1 cm. The algorithm performs efficiently under a wide variety of environmental conditions, including night and rainy conditions.
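
    A common way to obtain sub-pixel positions from a quadratic fit, as the abstract describes, is to fit a parabola through a contrast peak and its two neighbours and take the vertex; the sketch below shows the 1D version of that refinement (the exact kernel fit used by the authors is not spelled out here).

        def subpixel_peak(i_left, i_center, i_right):
            """Vertex of the parabola through three samples around a peak.
            Returns the peak offset from the center pixel, in (-0.5, 0.5)."""
            denom = i_left - 2.0 * i_center + i_right
            if denom == 0.0:
                return 0.0
            return 0.5 * (i_left - i_right) / denom

        # Contrast profile across a lane marking, peak slightly right of center:
        print(subpixel_peak(10.0, 30.0, 14.0))  # ~ +0.056 pixel offset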

  6. SU-E-I-92: Accuracy Evaluation of Depth Data in Microsoft Kinect.

    PubMed

    Kozono, K; Aoki, M; Ono, M; Kamikawa, Y; Arimura, H; Toyofuku, F

    2012-06-01

    Microsoft Kinect has potential for use in real-time patient position monitoring in diagnostic radiology and radiotherapy. We evaluated the accuracy of depth image data and the device-to-device variation under various conditions simulating clinical applications in a hospital. The Kinect sensor consists of an infrared depth camera and an RGB camera. We developed a computer program using OpenNI and OpenCV for measuring quantitative distance data. The program displays the depth image obtained from the Kinect sensor on the screen, and the Cartesian coordinates of an arbitrary point selected by mouse-clicking can be measured. A rectangular box without luster (300 × 198 × 50 mm³) was used as the measurement object. The object was placed on the floor at distances ranging from 0 to 400 cm from the sensor in increments of 10 cm, and depth data were measured for 10 points on the planar surface of the box. The measured distance data were calibrated using the least-squares method. The device-to-device variations were evaluated using five Kinect sensors. There was an almost linear relationship between true and measured values. The Kinect sensor was unable to measure at distances of less than 50 cm from the sensor. It was found that distance data calibration was necessary for each sensor. The device-to-device variation error for five Kinect sensors was within 0.46% over the distance range from 50 cm to 2 m from the sensor. The maximum deviation of the distance data after calibration was 1.1 mm at distances from 50 to 150 cm. The overall average error of the five Kinect sensors was 0.18 mm over the distance range of 50 to 150 cm. The Kinect sensor has a distance accuracy of about 1 mm if each device is properly calibrated. This sensor will be usable for the positioning of patients in diagnostic radiology and radiotherapy.
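
    The per-device calibration step described above can be as simple as a least-squares linear fit of measured depth against true distance; a minimal sketch with made-up sample values:

        import numpy as np

        # Illustrative pairs of (measured, true) distances in cm for one sensor.
        measured = np.array([ 50.4, 100.9, 151.2, 201.8, 252.1])
        true     = np.array([ 50.0, 100.0, 150.0, 200.0, 250.0])

        # Fit true = a * measured + b in the least-squares sense.
        a, b = np.polyfit(measured, true, deg=1)

        def calibrate(depth_cm):
            """Apply the per-device linear calibration to a raw depth reading."""
            return a * depth_cm + b

        print(calibrate(151.2))  # ~150.0 after calibration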

  7. Comparison of JPL-AIRSAR and DLR E-SAR images from the MAC Europe 1991 campaign over test site Oberpfaffenhofen: Frequency and polarization dependent backscatter variations from agricultural fields

    NASA Technical Reports Server (NTRS)

    Schmullius, C.; Nithack, J.

    1992-01-01

    On July 12, the MAC Europe '91 (Multi-Sensor Airborne Campaign) took place over the Oberpfaffenhofen test site. The DLR Institute of Radio-Frequency Technology participated with its C-VV, X-VV, and X-HH Experimental Synthetic Aperture Radar (E-SAR). The high-resolution E-SAR images, with a pixel size between 1 and 2 m, and the polarimetric AIRSAR images were analyzed. Using both sensors in combination provides a unique opportunity to evaluate SAR images over a frequency range from P- to X-band and to investigate polarimetric information.

  8. Multi-image acquisition-based distance sensor using agile laser spot beam.

    PubMed

    Riza, Nabeel A; Amin, M Junaid

    2014-09-01

    We present a novel laser-based distance measurement technique that uses multiple-image-based spatial processing to enable distance measurements. Compared with the first-generation distance sensor using spatial processing, the modified sensor is no longer hindered by the classic Rayleigh axial resolution limit for the propagating laser beam at its minimum beam waist location. The proposed high-resolution distance sensor design uses an electronically controlled variable focus lens (ECVFL) in combination with an optical imaging device, such as a charge-coupled device (CCD), to produce and capture laser spot images on a target whose sizes differ from the minimal spot size possible at that target distance. By exploiting the unique relationship of the target-located spot sizes with the varying ECVFL focal length for each target distance, the proposed distance sensor can compute the target distance with a resolution better than the axial resolution given by the Rayleigh criterion. Using a 30 mW, 633 nm He-Ne laser coupled with an electromagnetically actuated liquid ECVFL, along with a 20 cm focal length bias lens, and using five spot images captured per target position by a CCD-based Nikon camera, a proof-of-concept distance sensor was successfully implemented in the laboratory over target ranges from 10 to 100 cm with a demonstrated sub-cm axial resolution, which is better than the axial Rayleigh resolution limit at these target distances. Applications for the proposed, potentially cost-effective distance sensor are diverse and include industrial inspection and measurement and 3D object shape mapping and imaging.

  9. Automatic Generation of Wide Dynamic Range Image without Pseudo-Edge Using Integration of Multi-Steps Exposure Images

    NASA Astrophysics Data System (ADS)

    Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi

    Recently, digital cameras have been advancing rapidly. However, the shot image still differs from the sight image perceived when the same scene is viewed with the naked eye. Images of wide-dynamic-range scenes show blown-out highlights and crushed blacks, problems that hardly arise in the sight image; they are a contributory cause of the difference between the shot image and the sight image. Blown-out highlights and crushed blacks are caused by the difference in dynamic range between the image sensor installed in a digital camera, such as a CCD or CMOS sensor, and the human visual system: the dynamic range of the shot image is narrower than that of the sight image. In order to solve this problem, we propose an automatic method to decide an effective exposure range from the superposition of edges. We integrate multi-step exposure images using this method. In addition, we try to erase pseudo-edges using a process that blends exposure values. As a result, we obtain a pseudo wide dynamic range image automatically.
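
    In the spirit of the integration step described above, the sketch below fuses multi-step exposures by converting each to relative radiance (dividing by exposure time) and averaging with weights that suppress blown-out and crushed pixels; the hat weighting and the normalization are our illustrative choices, not the authors' exact blending.

        import numpy as np

        def fuse_exposures(images, exposure_times):
            """Merge multi-step exposures (uint8 arrays) into one wide-DR image.
            Well-exposed pixels get high weight; near-0 and near-255 pixels are
            down-weighted to suppress crushed blacks and blown-out highlights."""
            acc = np.zeros(images[0].shape, dtype=np.float64)
            w_sum = np.zeros_like(acc)
            for img, t in zip(images, exposure_times):
                x = img.astype(np.float64)
                w = 1.0 - np.abs(x - 127.5) / 127.5  # hat weight, 0 at 0 and 255
                acc += w * (x / t)                    # relative radiance estimate
                w_sum += w
            radiance = acc / np.maximum(w_sum, 1e-9)
            return radiance / radiance.max()          # normalized wide-DR image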

  10. Wavefront image sensor chip

    PubMed Central

    Cui, Xiquan; Ren, Jian; Tearney, Guillermo J.; Yang, Changhuei

    2010-01-01

    We report the implementation of an image sensor chip, termed wavefront image sensor chip (WIS), that can measure both intensity/amplitude and phase front variations of a light wave separately and quantitatively. By monitoring the tightly confined transmitted light spots through a circular aperture grid in a high Fresnel number regime, we can measure both intensity and phase front variations with a high sampling density (11 µm) and high sensitivity (the sensitivity of normalized phase gradient measurement is 0.1 mrad under the typical working condition). By using WIS in a standard microscope, we can collect both bright-field (transmitted light intensity) and normalized phase gradient images. Our experiments further demonstrate that the normalized phase gradient images of polystyrene microspheres, unstained and stained starfish embryos, and strongly birefringent potato starch granules are improved versions of their corresponding differential interference contrast (DIC) microscope images in that they are artifact-free and quantitative. Besides phase microscopy, WIS can benefit machine recognition, object ranging, and texture assessment for a variety of applications. PMID:20721059

  11. Column-parallel correlated multiple sampling circuits for CMOS image sensors and their noise reduction effects.

    PubMed

    Suh, Sungho; Itoh, Shinya; Aoyama, Satoshi; Kawahito, Shoji

    2010-01-01

    For low-noise complementary metal-oxide-semiconductor (CMOS) image sensors, reducing pixel source follower noise is becoming very important. Column-parallel high-gain readout circuits are useful for low-noise CMOS image sensors. This paper presents column-parallel high-gain signal readout circuits, correlated multiple sampling (CMS) circuits, and their noise reduction effects. In the CMS, the gain of the noise cancelling is controlled by the number of samplings. It has an effect similar to that of an amplified CDS for thermal noise but is a little more effective for 1/f and RTS noises. Two types of CMS, with simple integration and with folding integration, are proposed. In the folding integration, the output signal swing is suppressed by a negative feedback loop using a comparator and a one-bit D-to-A converter. The CMS circuit using the folding integration technique makes it possible to realize a very low noise level while maintaining a wide dynamic range. The noise reduction effects of these circuits have been investigated with a noise analysis and an implementation in a 1-Mpixel pinned-photodiode CMOS image sensor. Using 16 samplings, a dynamic range of 59.4 dB and a noise level of 1.9 e− are obtained for the simple integration CMS, and 75 dB and 2.2 e−, respectively, for the folding integration CMS.
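
    The thermal-noise benefit of correlated multiple sampling can be seen in a few lines of simulation: averaging M uncorrelated samples cuts the rms read noise by the square root of M, which is why the noise-cancelling gain tracks the number of samplings. A minimal sketch with an illustrative noise level:

        import numpy as np

        rng = np.random.default_rng(0)

        def cms_noise(read_noise_e, m_samples, trials=100_000):
            """RMS noise after averaging m uncorrelated samples per signal level,
            as in correlated multiple sampling of a settled pixel output."""
            samples = rng.normal(0.0, read_noise_e, size=(trials, m_samples))
            return samples.mean(axis=1).std()

        print(cms_noise(8.0, 1))    # ~8 e- for a single sample
        print(cms_noise(8.0, 16))   # ~2 e-: reduced by sqrt(16) = 4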

  12. Application of Optical Imaging Techniques for Quantification of pH and O2 Dynamicsin Porous Media

    NASA Astrophysics Data System (ADS)

    Li, B.; Seliman, A. F.; Pales, A. R.; Liang, W.; Sams, A.; Darnault, C. J. G.; DeVol, T. A.

    2016-12-01

    Understanding the spatial and temporal distribution of physical and chemical parameters (e.g., pH, O2) is imperative to characterize the behavior of contaminants in a natural environment. The objectives of this research are to calibrate pH and O2 sensor foils, to develop a dual pH/O2 sensor foil, and to apply them in flow and transport experiments, in order to understand the physical and chemical parameters that control contaminant fate and transport in an unsaturated sandy porous medium. In addition, a demonstration of a sensor foil that quantifies aqueous uranium concentration will be presented. Optical imaging techniques will be applied in 2D tanks to investigate the influence of microbial exudates and plant roots on pH and O2 parameters and on radionuclide transport. As a non-invasive method, the optical imaging technique utilizes optical chemical sensor films and either a digital camera or a spectrometer to capture changes with high temporal and spatial resolution. Sensor foils are made for different parameters by applying dyes that generate a fluorescence proportional to the parameter of interest. Preliminary results suggest that this method can detect pH in the range of 4.5 to 7.5. The results from the uranium foil test with concentrations in the range of 2 to 8 ppm indicated that a higher concentration of uranium resulted in a greater color intensity.

  13. The eyes of LITENING

    NASA Astrophysics Data System (ADS)

    Moser, Eric K.

    2016-05-01

    LITENING is an airborne system-of-systems providing long-range imaging, targeting, situational awareness, target tracking, weapon guidance, and damage assessment, incorporating a laser designator and laser range finders, as well as non-thermal and thermal imaging systems, with multi-sensor boresight. Robust operation is at a premium, and subsystems are partitioned to modular, swappable line-replaceable-units (LRUs) and shop-replaceable-units (SRUs). This presentation will explore design concepts for sensing, data storage, and presentation of imagery associated with the LITENING targeting pod. The "eyes" of LITENING are the electro-optic sensors. Since the initial LITENING II introduction to the US market in the late 90s, as the program has evolved and matured, a series of spiral functional improvements and sensor upgrades have been incorporated. These include laser-illuminated imaging, and more recently, color sensing. While aircraft displays are outside of the LITENING system, updates to the available viewing modules have also driven change, and resulted in increasingly effective ways of utilizing the targeting system. One of the latest LITENING spiral upgrades adds a new capability to display and capture visible-band color imagery, using new sensors. This is an augmentation to the system's existing capabilities, which operate over a growing set of visible and invisible colors, infrared bands, and laser line wavelengths. A COTS visible-band camera solution using a CMOS sensor has been adapted to meet the particular needs associated with the airborne targeting use case.

  14. Optimizing Floating Guard Ring Designs for FASPAX N-in-P Silicon Sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Kyung-Wook; Bradford, Robert; Lipton, Ronald

    2016-10-06

    FASPAX (Fermi-Argonne Semiconducting Pixel Array X-ray detector) is being developed as a fast integrating area detector with wide dynamic range for time-resolved applications at the upgraded Advanced Photon Source (APS). A burst-mode detector with an intended 13 MHz image rate, FASPAX will also incorporate a novel integration circuit to achieve wide dynamic range, from single-photon sensitivity to 10⁵ x-rays/pixel/pulse. To achieve these ambitious goals, a novel silicon sensor design is required. This paper will detail the early design of the FASPAX sensor. Results from TCAD optimization studies and characterization of prototype sensors will be presented.

  15. VIS-NIR multispectral synchronous imaging pyrometer for high-temperature measurements.

    PubMed

    Fu, Tairan; Liu, Jiangfan; Tian, Jibin

    2017-06-01

    A visible-infrared multispectral synchronous imaging pyrometer was developed for simultaneous, multispectral, two-dimensional high-temperature measurements. The multispectral imaging pyrometer uses a prism-separation construction over the spectral range of 650-950 nm and multi-sensor fusion of three CCD sensors for high-temperature measurements. The pyrometer has 650-750 nm, 750-850 nm, and 850-950 nm channels, all with the same optical path. The wavelength choice for each channel is flexible; three center wavelengths (700 nm, 810 nm, and 920 nm) with a spectral full width at half maximum of 3 nm were used here. The three image sensors were precisely aligned to avoid spectral artifacts by micro-mechanical adjustments of the sensors relative to each other to position them within a quarter pixel of one another. The pyrometer was calibrated against a standard blackbody source, and the temperature measurement uncertainty was within 0.21 °C to 0.99 °C at temperatures of 600 °C to 1800 °C for the blackbody measurements. The pyrometer was then used to measure the leading-edge temperatures of a ceramic model exposed to a high-enthalpy plasma aerodynamic heating environment to verify the system's applicability. The measured temperature ranges were 701-991 °C, 701-1134 °C, and 701-834 °C at the heating transient, steady-state, and cooling transient times. A significant temperature gradient (170 °C/mm) was observed away from the leading edge facing the plasma jet during the steady-state heating time. Temperature non-uniformity on the surface occurs during the entire aerodynamic heating process; however, the temperature distribution becomes more uniform after the heater is shut down and the experimental model cools naturally. This result shows that the multispectral simultaneous image measurement mode provides a wider temperature range for one imaging measurement of high spatial temperature gradients in transient applications.

  16. Very-large-area CCD image sensors: concept and cost-effective research

    NASA Astrophysics Data System (ADS)

    Bogaart, E. W.; Peters, I. M.; Kleimann, A. C.; Manoury, E. J. P.; Klaassens, W.; de Laat, W. T. F. M.; Draijer, C.; Frost, R.; Bosiers, J. T.

    2009-01-01

    A new-generation full-frame 36 × 48 mm², 48-Mp CCD image sensor with vertical anti-blooming for professional digital still camera applications has been developed by means of the so-called building-block concept. The 48-Mp devices are formed by stitching 1k × 1k building blocks with 6.0 µm pixel pitch in a 6 × 8 (h × v) format. This concept allows us to design four large-area (48-Mp) and sixty-two basic (1-Mp) devices per 6-inch wafer. The basic image sensor is relatively small in order to obtain data from many devices: evaluation of basic parameters such as the image pixel and the on-chip amplifier provides statistical data using a limited number of wafers, whereas the large-area devices are evaluated for aspects typical of large-sensor operation and performance, such as charge transport efficiency. Combined with the use of multi-layer reticles, this makes the sensor development cost-effective for prototyping. Optimisation of the sensor design and technology has resulted in a pixel charge capacity of 58 ke− and significantly reduced readout noise (12 electrons at 25 MHz pixel rate, after CDS); hence, a dynamic range of 73 dB is obtained. Microlens and stack optimisation resulted in an excellent angular response that meets wide-angle photography demands.
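
    The 73 dB figure follows directly from the quoted full-well capacity and readout noise, since sensor dynamic range is 20 log10(full well / read noise); a one-line check:

        import math

        def dynamic_range_db(full_well_e, read_noise_e):
            """Sensor dynamic range: ratio of full-well charge to readout noise."""
            return 20 * math.log10(full_well_e / read_noise_e)

        print(dynamic_range_db(58_000, 12))  # ~73.7 dB, matching the quoted 73 dB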

  17. Active imaging system performance model for target acquisition

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Teaney, Brian; Nguyen, Quang; Jacobs, Eddie L.; Halford, Carl E.; Tofsted, David H.

    2007-04-01

    The U.S. Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate has developed a laser-range-gated (LRG) imaging system performance model for the detection, recognition, and identification of vehicle targets. The model is based on the established US Army RDECOM CERDEC NVESD sensor performance models of the human system response through an imaging system. The Java-based model, called NVLRG, accounts for the effects of active illumination, atmospheric attenuation, and turbulence relevant to LRG imagers, such as speckle and scintillation, and for the critical sensor and display components. This model can be used to assess the performance of recently proposed active SWIR systems through various trade studies. This paper describes the NVLRG model in detail, discusses the validation of recent model components, presents initial trade study results, and outlines plans to validate and calibrate the end-to-end model with field data through human perception testing.

  18. Blurred Star Image Processing for Star Sensors under Dynamic Conditions

    PubMed Central

    Zhang, Weina; Quan, Wei; Guo, Lei

    2012-01-01

    The precision of star point location is critical for identifying the star map and acquiring the aircraft attitude with star sensors. Under dynamic conditions, star images are not only corrupted by various noises but also blurred due to the angular rate of the star sensor. According to the different angular rates under dynamic conditions, a novel method is proposed in this article that includes a denoising method based on an adaptive wavelet threshold and a restoration method for large angular rates. The adaptive threshold is adopted to denoise the star image when the angular rate is in the dynamic range. Then, the mathematical model of motion blur is deduced so as to restore the star map blurred by a large angular rate. Simulation results validate the effectiveness of the proposed method, which is suitable for blurred star image processing and practical for attitude determination of satellites under dynamic conditions. PMID:22778666
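
    As a generic illustration of wavelet-threshold denoising of a star image (the adaptive threshold rule the authors derive is not reproduced here), the sketch below soft-thresholds the detail coefficients with the common universal threshold, using the PyWavelets package:

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_denoise(img, wavelet="db4", level=2):
            """Soft-threshold wavelet detail coefficients of a noisy star image.
            Threshold = sigma * sqrt(2 ln N) (universal rule), with sigma estimated
            from the finest diagonal subband via the median absolute deviation."""
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
            thresh = sigma * np.sqrt(2 * np.log(img.size))
            denoised = [coeffs[0]] + [
                tuple(pywt.threshold(d, thresh, mode="soft") for d in detail)
                for detail in coeffs[1:]
            ]
            return pywt.waverec2(denoised, wavelet)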

  19. Novel snapshot hyperspectral imager for fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Chandler, Lynn; Chandler, Andrea; Periasamy, Ammasi

    2018-02-01

    Hyperspectral imaging has emerged as a new technique for the identification and classification of biological tissue. Benefiting from recent developments in sensor technology, the new class of hyperspectral imagers can capture entire hypercubes in single-shot operation and shows great potential for real-time imaging in the biomedical sciences. This paper explores the use of a SnapShot imager in fluorescence imaging via microscope for the first time. Utilizing the latest imaging sensor, the SnapShot imager is compact and attachable via C-mount to any commercially available light microscope. Using this setup, fluorescence hypercubes of several cells were generated, containing both spatial and spectral information. The fluorescence images were acquired in single-shot operation over the full emission range from visible to near infrared (VIS-IR). The paper presents hypercube images obtained from example tissues (475-630 nm). This study demonstrates the potential for applications in cell biology and for real-time monitoring in biomedical settings.

  20. Multispectral photoacoustic tomography for detection of small tumors inside biological tissues

    NASA Astrophysics Data System (ADS)

    Hirasawa, Takeshi; Okawa, Shinpei; Tsujita, Kazuhiro; Kushibiki, Toshihiro; Fujita, Masanori; Urano, Yasuteru; Ishihara, Miya

    2018-02-01

    Visualization of small tumors inside biological tissue is important in cancer treatment because it enables accurate surgical resection and therapeutic effect monitoring. For sensitive detection of tumors, we have been developing a photoacoustic (PA) imaging technique to visualize tumor-specific contrast agents, and have already succeeded in imaging a subcutaneous tumor of a mouse using the contrast agents. To image tumors deeper inside biological tissues, an extension of imaging depth and an improvement in sensitivity were required. In this study, to extend imaging depth, we developed a PA tomography (PAT) system that can image an entire cross section of a mouse. To improve sensitivity, we examined the use of a P(VDF-TrFE) linear-array acoustic sensor that can detect PA signals over a wide frequency range. Because PA signals produced by low-absorbance optical absorbers shift to lower frequencies, we hypothesized that detecting low-frequency PA signals improves sensitivity to low-absorbance optical absorbers. We developed a PAT system with both a PZT linear-array acoustic sensor and the P(VDF-TrFE) sensor, and performed experiments using tissue-mimicking phantoms to evaluate the lower detection limit of absorbance. As a result, PAT images calculated from the low-frequency components of PA signals detected by the P(VDF-TrFE) sensor could visualize optical absorbers with lower absorbance.
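
    A minimal sketch of the low-frequency signal extraction that motivates the wide-band P(VDF-TrFE) sensor, assuming SciPy; the 40 MHz sampling rate and 2 MHz cutoff are illustrative assumptions, not the authors' parameters.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    FS = 40e6      # RF sampling rate of the acquisition (assumed)
    CUTOFF = 2e6   # low-pass corner: low-absorbance targets emit mostly below this (assumed)

    def lowpass_pa(rf):
        """Zero-phase low-pass filter along the time axis of an
        (n_channels, n_samples) array of photoacoustic signals."""
        b, a = butter(4, CUTOFF / (FS / 2), btype="low")
        return filtfilt(b, a, np.asarray(rf, dtype=float), axis=-1)
    ```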

  1. Image sensor with high dynamic range linear output

    NASA Technical Reports Server (NTRS)

    Yadid-Pecht, Orly (Inventor); Fossum, Eric R. (Inventor)

    2007-01-01

    Designs and operational methods to increase the dynamic range of image sensors, and of APS devices in particular, by achieving more than one integration time for each pixel. An APS system with more than one column-parallel signal chain for readout is described for maintaining a high frame rate during readout. Each active pixel is sampled multiple times during a single frame readout, resulting in multiple integration times. The operating methods can also be used to obtain multiple integration times per pixel with an APS design having a single column-parallel signal chain for readout. Furthermore, high-speed, high-resolution analog-to-digital conversion can be implemented.
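
    A minimal sketch of how two integration times per pixel can be merged into one linear high-dynamic-range value, in the spirit of the multiple-sampling scheme above; the saturation level and 16:1 exposure ratio are illustrative assumptions.

    ```python
    import numpy as np

    SAT = 4000     # ADC counts above which the long integration nears saturation (assumed)
    RATIO = 16.0   # T_long / T_short exposure ratio (assumed)

    def merge_hdr(long_sample, short_sample):
        """Use the long-integration sample where it is unsaturated; otherwise
        rescale the short-integration sample onto the same linear signal scale."""
        long_sample = np.asarray(long_sample, dtype=float)
        short_sample = np.asarray(short_sample, dtype=float)
        return np.where(long_sample < SAT, long_sample, short_sample * RATIO)
    ```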

  2. SU-F-J-206: Systematic Evaluation of the Minimum Detectable Shift Using a Range- Finding Camera

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Platt, M; Platt, M; Lamba, M

    2016-06-15

    Purpose: The robotic table used for patient alignment in proton therapy is calibrated only at commissioning under well-defined conditions, and table shifts may vary over time and with differing conditions. The purpose of this study is to systematically investigate minimum detectable shifts using a time-of-flight (TOF) range-finding camera for table position feedback. Methods: A TOF camera was used to acquire one hundred 424 × 512 range images from a flat surface before and after known shifts. Range was assigned by averaging central regions of the image across multiple images. Depth resolution was determined by evaluating the difference between the actual shift of the surface and the measured shift. Depth resolution was evaluated for number of images averaged, area of sensor over which depth was averaged, distance from camera to surface, central versus peripheral image regions, and angle of surface relative to camera. Results: For one to one thousand images with a shift of one millimeter the range in error was 0.852 ± 0.27 mm to 0.004 ± 0.01 mm (95% C.I.). For varying regions of the camera sensor the range in error was 0.02 ± 0.05 mm to 0.47 ± 0.04 mm. The following results are for 10-image averages. For areas ranging from one pixel to 9 × 9 pixels the range in error was 0.15 ± 0.09 to 0.29 ± 0.15 mm (1σ). For distances ranging from two to four meters the range in error was 0.15 ± 0.09 to 0.28 ± 0.15 mm. For an angle of incidence between thirty degrees and ninety degrees the average range in error was 0.11 ± 0.08 to 0.17 ± 0.09 mm. Conclusion: It is feasible to use a TOF camera for measuring shifts in flat surfaces under clinically relevant conditions with submillimeter precision.
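
    A minimal sketch of the frame- and region-averaging used to measure a table shift, assuming stacks of (n, 424, 512) range maps in millimetres; the ROI size and centring are illustrative assumptions.

    ```python
    import numpy as np

    def mean_range(frames, roi=9):
        """Average an roi x roi patch at the image centre over all frames.
        frames: (n_images, 424, 512) array of range values in mm."""
        _, h, w = frames.shape
        r0, c0 = h // 2 - roi // 2, w // 2 - roi // 2
        return float(frames[:, r0:r0 + roi, c0:c0 + roi].mean())

    def measured_shift(frames_before, frames_after, roi=9):
        """Shift estimate to compare against the known table displacement."""
        return mean_range(frames_after, roi) - mean_range(frames_before, roi)
    ```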

  3. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition.

    PubMed

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell'Orto, Vittorio; Savoini, Giovanni

    2017-05-24

    In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. As a consequence, ozone treatments of the housing areas could be performed to reduce the levels of atmospheric ammonia and bacterial load without risks, due to its toxicity, for either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. For this purpose, a first sensor was developed, but some limits related to the time necessary to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved by an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, carried out over three weeks. The first week was used to set up the sensor. Different templates to be used for the pattern recognition were studied, and different floor temperature shifts were investigated. At the end of these evaluations, a template of elliptical shape, with sizes of 135 × 63 pixels, was chosen. Furthermore, a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the following field tests. The results showed an improvement of the sensor detection accuracy, which reached sensitivity and specificity values of 95.1% and 98.7%. In addition, the time necessary to detect a hen, or classify a case, was reduced to two seconds. This result could allow the sensor to monitor a larger area of the housing system. Thus, the resulting monitoring system could allow the sanitary treatments to be performed without risks to either animals or humans.

  4. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition

    PubMed Central

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell’Orto, Vittorio; Savoini, Giovanni

    2017-01-01

    In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. As a consequence, ozone treatments of the housing areas could be performed to reduce the levels of atmospheric ammonia and bacterial load without risks, due to its toxicity, for either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. For this purpose, a first sensor was developed, but some limits related to the time necessary to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved by an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, carried out over three weeks. The first week was used to set up the sensor. Different templates to be used for the pattern recognition were studied, and different floor temperature shifts were investigated. At the end of these evaluations, a template of elliptical shape, with sizes of 135 × 63 pixels, was chosen. Furthermore, a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the following field tests. The results showed an improvement of the sensor detection accuracy, which reached sensitivity and specificity values of 95.1% and 98.7%. In addition, the time necessary to detect a hen, or classify a case, was reduced to two seconds. This result could allow the sensor to monitor a larger area of the housing system. Thus, the resulting monitoring system could allow the sanitary treatments to be performed without risks to either animals or humans. PMID:28538654
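
    A minimal sketch of the detection step, assuming OpenCV: a one-degree background threshold (as in the abstract) followed by matching of the 135 × 63 elliptical template; the matching score cut-off is an illustrative assumption.

    ```python
    import cv2
    import numpy as np

    def detect_hen(temp_frame, template, floor_temp_c, score_min=0.6):
        """temp_frame: per-pixel temperature map; template: 135 x 63 elliptical
        mask rendered as an 8-bit image. Returns True when the best template
        match on the thresholded foreground exceeds score_min."""
        foreground = ((temp_frame > floor_temp_c + 1.0) * 255).astype(np.uint8)
        scores = cv2.matchTemplate(foreground, template, cv2.TM_CCOEFF_NORMED)
        return float(scores.max()) >= score_min
    ```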

  5. Characterization study of an intensified complementary metal-oxide-semiconductor active pixel sensor.

    PubMed

    Griffiths, J A; Chen, D; Turchetta, R; Royle, G J

    2011-03-01

    An intensified CMOS active pixel sensor (APS) has been constructed for operation in low-light-level applications: a high-gain, fast-light-decay image intensifier has been coupled via a fiber optic stud to a prototype "VANILLA" APS, developed by the UK-based MI3 consortium. The sensor is capable of high frame rates and sparse readout. This paper presents a study of the performance parameters of the intensified VANILLA APS system over a range of image intensifier gain levels when uniformly illuminated with 520 nm green light. Mean-variance analysis shows the APS saturating around 3050 Digital Units (DU), with the maximum variance increasing with increasing image intensifier gain. The system's quantum efficiency varies in an exponential manner from 260 at an intensifier gain of 7.45 × 10³ to 1.6 at a gain of 3.93 × 10¹. The usable dynamic range of the system is 60 dB for intensifier gains below 1.8 × 10³, dropping to around 40 dB at high gains. The conclusion is that the system shows suitability for the desired application.

  6. Characterization study of an intensified complementary metal-oxide-semiconductor active pixel sensor

    NASA Astrophysics Data System (ADS)

    Griffiths, J. A.; Chen, D.; Turchetta, R.; Royle, G. J.

    2011-03-01

    An intensified CMOS active pixel sensor (APS) has been constructed for operation in low-light-level applications: a high-gain, fast-light-decay image intensifier has been coupled via a fiber optic stud to a prototype "VANILLA" APS, developed by the UK-based MI3 consortium. The sensor is capable of high frame rates and sparse readout. This paper presents a study of the performance parameters of the intensified VANILLA APS system over a range of image intensifier gain levels when uniformly illuminated with 520 nm green light. Mean-variance analysis shows the APS saturating around 3050 Digital Units (DU), with the maximum variance increasing with increasing image intensifier gain. The system's quantum efficiency varies in an exponential manner from 260 at an intensifier gain of 7.45 × 10³ to 1.6 at a gain of 3.93 × 10¹. The usable dynamic range of the system is 60 dB for intensifier gains below 1.8 × 10³, dropping to around 40 dB at high gains. The conclusion is that the system shows suitability for the desired application.
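
    A minimal sketch of the mean-variance (photon transfer) analysis named above, assuming stacks of repeated flat-field frames at each illumination level; deriving a conversion gain from the fitted slope is standard photon-transfer practice, not a claim about the authors' exact procedure.

    ```python
    import numpy as np

    def mean_variance_point(frames):
        """frames: (n, rows, cols) repeated flat-field exposures at one light
        level. Returns (mean signal, temporal variance) in digital units (DU)."""
        frames = np.asarray(frames, dtype=float)
        return float(frames.mean()), float(frames.var(axis=0).mean())

    def conversion_gain(means, variances):
        """Photon transfer: variance vs. mean is linear with slope 1/K,
        so the gain K (e-/DU) is the reciprocal of the fitted slope."""
        slope = np.polyfit(means, variances, 1)[0]
        return 1.0 / slope
    ```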

  7. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired, time-encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
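
    A minimal sketch of a global event-driven tone mapper in the spirit described above: each incoming gray-level event updates slowly relaxing log-domain bounds and is mapped to 8 bits on arrival; the update rule is an illustrative assumption, not the authors' operator.

    ```python
    import math

    class EventToneMapper:
        """Maps high-dynamic-range luminance events to 8-bit display values."""

        def __init__(self, decay=0.999):
            self.lo = None   # running lower bound (log domain)
            self.hi = None   # running upper bound (log domain)
            self.decay = decay

        def map_event(self, luminance):
            v = math.log(max(luminance, 1e-12))
            if self.lo is None:
                self.lo = self.hi = v
            # Track the bounds, relaxing them slowly so the map adapts to scene changes.
            self.lo = min(v, self.decay * self.lo + (1 - self.decay) * v)
            self.hi = max(v, self.decay * self.hi + (1 - self.decay) * v)
            span = max(self.hi - self.lo, 1e-9)
            return int(255 * (v - self.lo) / span)
    ```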

  8. Medipix2 based CdTe microprobe for dental imaging

    NASA Astrophysics Data System (ADS)

    Vykydal, Z.; Fauler, A.; Fiederle, M.; Jakubek, J.; Svestkova, M.; Zwerger, A.

    2011-12-01

    Medical imaging devices and techniques are required to provide high-resolution, low-dose images of samples or patients. Hybrid semiconductor single-photon-counting devices, together with suitable sensor materials and advanced techniques of image reconstruction, fulfil these requirements. In particular cases, such as the direct observation of dental implants, the size of the imaging device itself also plays a critical role. This work presents a comparison of 2D radiographs of a tooth provided by a standard commercial dental imaging system (Gendex 765DC X-ray tube with VisualiX scintillation detector) and two Medipix2 USB Lite detectors, one equipped with a Si sensor (300 μm thick) and one with a CdTe sensor (1 mm thick). The single-photon-counting capability of the Medipix2 device allows a virtually unlimited dynamic range and thus increases the contrast significantly. The dimensions of the whole USB Lite device are only 15 mm × 60 mm, of which 25% consists of the sensitive area. A detector of this compact size can be used directly inside the patient's mouth.

  9. Development of collision avoidance system for useful UAV applications using image sensors with laser transmitter

    NASA Astrophysics Data System (ADS)

    Cheong, M. K.; Bahiki, M. R.; Azrad, S.

    2016-10-01

    The main goal of this study is to demonstrate an approach to collision avoidance on a Quadrotor Unmanned Aerial Vehicle (QUAV) using image sensors with a colour-based tracking method. A pair of high-definition (HD) stereo cameras was chosen as the stereo vision sensor to obtain depth data from flat object surfaces. A laser transmitter was utilized to project a high-contrast tracking spot for depth calculation using common triangulation. A stereo vision algorithm was developed to acquire the distance from the tracked point to the QUAV, and a control algorithm was designed to manipulate the QUAV's response based on the calculated depth. Attitude and position controllers were designed using the non-linear model with the help of the Optitrack motion tracking system. A number of collision avoidance flight tests were carried out to validate the performance of the stereo vision and control algorithms. In the results, the UAV was able to hover with fairly good accuracy in both static and dynamic short-range collision avoidance. Collision avoidance performance was better with obstacles of dull surfaces than with shiny surfaces. The minimum collision avoidance distance achievable was 0.4 m. The approach is suitable for short-range collision avoidance.
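
    A minimal sketch of the depth calculation from the tracked laser spot, using the standard rectified-stereo triangulation relation depth = f·B/disparity; the focal length and baseline values are illustrative assumptions, not the paper's calibration.

    ```python
    F_PX = 700.0        # focal length in pixels (assumed, from camera calibration)
    BASELINE_M = 0.12   # stereo baseline in metres (assumed)

    def spot_depth(x_left, x_right):
        """x_left, x_right: column of the laser spot in each rectified image.
        Returns the distance to the surface in metres."""
        disparity = x_left - x_right
        if disparity <= 0:
            raise ValueError("spot must have positive disparity")
        return F_PX * BASELINE_M / disparity
    ```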

  10. Optical Demonstration of a Medical Imaging System with an EMCCD-Sensor Array for Use in a High Resolution Dynamic X-ray Imager

    PubMed Central

    Qu, Bin; Huang, Ying; Wang, Weiyuan; Sharma, Prateek; Kuhls-Gilcrist, Andrew T.; Cartwright, Alexander N.; Titus, Albert H.; Bednarek, Daniel R.; Rudin, Stephen

    2011-01-01

    Use of an extensible array of Electron Multiplying CCDs (EMCCDs) in medical x-ray imager applications was demonstrated for the first time. The large variable electronic gain (up to 2000) and small pixel size of EMCCDs provide effective suppression of readout noise relative to signal, as well as high resolution, enabling the development of an x-ray detector with far superior performance compared to conventional x-ray image intensifiers and flat panel detectors. We are developing arrays of EMCCDs to overcome their limited field of view (FOV). In this work we report on an array of two EMCCD sensors running simultaneously at a high frame rate and optically focused on a mammogram film showing calcified ducts. The work was conducted on an optical table with a pulsed LED bar used to provide uniform diffuse light onto the film to simulate x-ray projection images. The system can be selected to run at up to 17.5 frames per second, or at even higher frame rates with binning. Integration time for the sensors can be adjusted from 1 ms to 1000 ms. Twelve-bit correlated double sampling AD converters were used to digitize the images, which were acquired by a National Instruments dual-channel Camera Link PC board in real time. A user-friendly interface was programmed using LabVIEW to save and display 2K × 1K pixel matrix digital images. The demonstration tiles a 2 × 1 array to acquire increased-FOV stationary images taken at different gains, as well as fluoroscopic-like videos recorded by scanning the mammogram simultaneously with both sensors. The results show high-resolution, high-dynamic-range images stitched together with minimal adjustments needed. The EMCCD array design allows for expansion to an M × N array for an arbitrarily larger FOV while maintaining high resolution and large dynamic range. PMID:23505330

  11. High-resolution imaging of magnetic fields using scanning superconducting quantum interference device (SQUID) microscopy

    NASA Astrophysics Data System (ADS)

    Fong de Los Santos, Luis E.

    This work describes the development of a scanning superconducting quantum interference device (SQUID) microscope system with interchangeable sensor configurations for imaging magnetic fields of room-temperature (RT) samples with sub-millimeter resolution. The low-critical-temperature (Tc) niobium-based monolithic SQUID sensor is mounted in the tip of a sapphire rod and thermally anchored to the cryostat helium reservoir. A 25 μm sapphire window separates the vacuum space from the RT sample. A positioning mechanism allows adjusting the sample-to-sensor spacing from the top of the Dewar. I have achieved a sensor-to-sample spacing of 100 μm, which could be maintained for periods of up to 4 weeks. Different SQUID sensor configurations are necessary to achieve the best combination of spatial resolution and field sensitivity for a given magnetic source. For imaging thin sections of geological samples, I used a custom-designed monolithic low-Tc niobium bare SQUID sensor with an effective diameter of 80 μm, and achieved a field sensitivity of 1.5 pT/√Hz and a magnetic moment sensitivity of 5.4 × 10⁻¹⁸ Am²/√Hz at a sensor-to-sample spacing of 100 μm in the white noise region for frequencies above 100 Hz. Imaging action currents in cardiac tissue requires higher field sensitivity, which can only be achieved by compromising spatial resolution. I developed a monolithic low-Tc niobium multiloop SQUID sensor, with sensor sizes ranging from 250 μm to 1 mm, and achieved sensitivities of 480-180 fT/√Hz in the white noise region for frequencies above 100 Hz. For all sensor configurations, the spatial resolution was comparable to the effective diameter and limited by the sensor-to-sample spacing. Spatial registration allowed us to compare high-resolution images of magnetic fields associated with action currents and optical recordings of transmembrane potentials to study the bidomain nature of cardiac tissue, or to match petrography to magnetic field maps in thin sections of geological samples.

  12. LIRIS flight database and its use toward noncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Mongrard, O.; Ankersen, F.; Casiez, P.; Cavrois, B.; Donnard, A.; Vergnol, A.; Southivong, U.

    2018-06-01

    ESA's fifth and last Automated Transfer Vehicle, ATV Georges Lemaître, tested new rendezvous technology before docking with the International Space Station (ISS) in August 2014. The technology demonstration, called Laser Infrared Imaging Sensors (LIRIS), provides a previously unseen view of the ISS. During Georges Lemaître's rendezvous, the LIRIS sensors, composed of two infrared cameras, one visible camera, and a scanning LIDAR (Light Detection and Ranging), were turned on two and a half hours from docking, at a range of 3500 m from the Space Station. All sensors worked as expected, and a large amount of data was recorded and stored within ATV-5's cargo hold before being returned to Earth with the Soyuz flight 38S in September 2014. As part of the LIRIS postflight activities, the information gathered by all sensors was collected in a flight database together with the reference ATV trajectory and attitude estimated by the ATV's main navigation sensors. Although decoupled from the ATV main computer, the LIRIS data were carefully synchronized with ATV guidance, navigation, and control (GNC) data. Hence, the LIRIS database can be used to assess the performance of various image processing algorithms providing range and line-of-sight (LoS) navigation at long/medium range, as well as 6 degree-of-freedom (DoF) navigation at short range. The database also contains information on the overall ATV position with respect to Earth and the Sun direction within the ATV frame, such that the effect of the environment on the sensors can also be investigated. This paper introduces the structure of the LIRIS database and provides some examples of applications to increase the technology readiness level of noncooperative rendezvous.

  13. Event-based Sensing for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by the biological retina and operate in a significantly different way from traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way from traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, the low weight, low power, and high speed make them ideally suited to meeting the demanding challenges of space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground- and space-based SSA tasks.

  14. LSI-based amperometric sensor for bio-imaging and multi-point biosensing.

    PubMed

    Inoue, Kumi Y; Matsudaira, Masahki; Kubo, Reyushi; Nakano, Masanori; Yoshida, Shinya; Matsuzaki, Sakae; Suda, Atsushi; Kunikata, Ryota; Kimura, Tatsuo; Tsurumi, Ryota; Shioya, Toshihito; Ino, Kosuke; Shiku, Hitoshi; Satoh, Shiro; Esashi, Masayoshi; Matsue, Tomokazu

    2012-09-21

    We have developed an LSI-based amperometric sensor called "Bio-LSI" with 400 measurement points as a platform for electrochemical bio-imaging and multi-point biosensing. The system comprises a 10.4 mm × 10.4 mm CMOS sensor chip with 20 × 20 unit cells, an external circuit box, a control unit for data acquisition, and a DC power box. Each unit cell of the chip contains an operational amplifier with a switched-capacitor-type I-V converter for in-pixel signal amplification. We successfully realized a wide dynamic range from ±1 pA to ±100 nA with a well-organized circuit design and operating software. In particular, the in-pixel signal amplification and an original program to control the signal read-out contribute to the low detection limit and wide detection range of Bio-LSI. The spatial resolution is 250 μm and the temporal resolution is 18-125 ms/400 points, depending on the desired current detection range. The coefficient of variation of the current across the 400 points is within 5%. We also demonstrated real-time imaging of a biological molecule using Bio-LSI. The LSI coated with an Os-HRP film was successfully applied to monitoring changes in hydrogen peroxide concentration in a flow. The Os-HRP-coated LSI was spotted with glucose oxidase and used for bioelectrochemical imaging of the glucose oxidase (GOx)-catalyzed oxidation of glucose. Bio-LSI is a promising platform for a wide range of analytical fields, including diagnostics, environmental measurements and basic biochemistry.

  15. Piezo-based, high dynamic range, wide bandwidth steering system for optical applications

    NASA Astrophysics Data System (ADS)

    Karasikov, Nir; Peled, Gal; Yasinov, Roman; Feinstein, Alan

    2017-05-01

    Piezoelectric motors and actuators are characterized by direct drive, fast response, high positioning resolution and high mechanical power density. These properties are beneficial for optical devices such as gimbals, optical image stabilizers and mirror angular positioners. The range of applications includes sensor pointing systems, image stabilization, laser steering and more. This paper reports on the construction, properties and operation of three types of piezo-based building blocks for optical steering applications: a small gimbal and a two-axis OIS (Optical Image Stabilization) mechanism, both based on piezoelectric motors, and a flexure-assisted piezoelectric actuator for mirror angular positioning. The gimbal weighs less than 190 grams, has a wide angular span (solid angle of >2π) and allows for 80 micro-radian stabilization at a stabilization frequency of up to 25 Hz. The OIS is an X-Y, closed-loop platform with a lateral positioning resolution better than 1 μm, a stabilization frequency of up to 25 Hz and a travel of ±2 mm. It is used for laser steering or positioning of the image sensor, based on signals from a MEMS gyro sensor. The mirror positioner is based on three piezoelectric actuation axes for tip/tilt (each providing a 50 μm motion range), has a positioning resolution of 10 nm and is capable of a 1000 Hz response. A combination of the gimbal with the mirror positioner or the OIS stage is explored by simulations, indicating a <10 micro-radian stabilization capability under substantial perturbation. Simulations and experimental results are presented for a combined device facilitating both a wide steering-angle range and high bandwidth.

  16. Multiple-Event, Single-Photon Counting Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for registering the photon count. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be made arbitrarily short, this leads to very low dynamic range and makes the sensor useful only in very-low-flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially undermines any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register one million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array is capable of performing single-photon counting from ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  17. New Optical Sensing Materials for Application in Marine Research

    NASA Astrophysics Data System (ADS)

    Borisov, S.; Klimant, I.

    2012-04-01

    Optical chemosensors are versatile analytical tools which find application in numerous fields of science and technology. They have proved to be a promising alternative to electrochemical methods and are applied increasingly often in marine research. However, not all state-of-the-art optical chemosensors are suitable for these demanding applications, since they do not fully fulfil the requirements of high luminescence brightness and high chemical and photochemical stability, or their spectral properties are not adequate. Therefore, the development of new advanced sensing materials remains of utmost importance. Here we present a set of novel optical sensing materials recently developed at the Institute of Analytical Chemistry and Food Chemistry which are optimized for marine applications. In particular, we present new NIR indicators and sensors for oxygen and pH which feature high brightness and a low level of autofluorescence. The oxygen sensors rely on highly photostable metal complexes of benzoporphyrins and azabenzoporphyrins and enable several important applications such as simultaneous monitoring of oxygen and chlorophyll or ultra-fast oxygen monitoring (eddy correlation). We have also developed ultra-sensitive oxygen optodes which enable monitoring in the nM range and are primarily designed for the investigation of oxygen minimum zones. The dynamic range of our new NIR pH indicators based on aza-BODIPY dyes is optimized for the marine environment. A highly sensitive NIR luminescent phosphor (chromium(III)-doped yttrium aluminium borate) can be used for non-invasive temperature measurements. Notably, the oxygen, pH and temperature sensors are fully compatible with commercially available fiber-optic readers (Firesting from PyroScience). An optical CO2 sensor for marine applications employs novel diketopyrrolopyrrole indicators and enables ratiometric imaging using a CCD camera. Oxygen, pH and temperature sensors suitable for lifetime and ratiometric imaging of analyte distributions are also realized. To enable versatility of applications we have also obtained a range of nano- and microparticles suitable for intra- and extracellular imaging of the above analytes. Bright ratiometric two-photon-excitable probes were also developed. Magnetic microparticles are demonstrated to be very promising tools for imaging of oxygen, temperature and other parameters in biofilms, corals, etc., since they combine the sensing function with the possibility of external manipulation.

  18. Achieving thermography with a thermal security camera using uncooled amorphous silicon microbolometer image sensors

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Wei; Tesdahl, Curtis; Owens, Jim; Dorn, David

    2012-06-01

    Advancements in uncooled microbolometer technology over the last several years have opened up many commercial applications which had previously been cost prohibitive. Thermal technology is no longer limited to the military and government market segments. One type of thermal sensor with low NETD which is available in the commercial market segment is the uncooled amorphous silicon (α-Si) microbolometer image sensor. Typical thermal security cameras focus on providing the best image quality by auto-tonemapping (contrast enhancing) the image, which provides the best contrast depending on the temperature range of the scene. While this may provide enough information to detect objects and activities, there are further benefits to being able to estimate the actual object temperatures in a scene. This thermographic ability can provide functionality beyond typical security cameras by making it possible to monitor processes. Example applications of thermography [2] with a thermal camera include monitoring electrical circuits, industrial machinery, building thermal leaks, oil/gas pipelines, power substations, etc. [3][5] This paper discusses the methodology of estimating object temperatures by characterizing/calibrating the different components inside a thermal camera utilizing an uncooled amorphous silicon microbolometer image sensor. Plots of system performance across camera operating temperatures are shown.
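
    A minimal sketch of the counts-to-temperature estimation, assuming a linear radiometric response with an FPA-temperature drift term and an emissivity correction performed in the radiance (T⁴) domain; all coefficients are illustrative assumptions, not the paper's calibration.

    ```python
    import numpy as np

    # Assumed response-fit coefficients: degC/count, degC, degC per degC of FPA temperature.
    GAIN, OFFSET, DRIFT = 0.01, -20.0, 0.35

    def apparent_temp_c(counts, fpa_temp_c):
        """Linear response corrected for the camera (FPA) operating temperature."""
        return GAIN * np.asarray(counts, dtype=float) + OFFSET + DRIFT * fpa_temp_c

    def object_temp_c(counts, fpa_temp_c, emissivity=0.95, ambient_c=20.0):
        """First-order emissivity correction in the T^4 (radiance) domain."""
        t_app = apparent_temp_c(counts, fpa_temp_c) + 273.15
        t_amb = ambient_c + 273.15
        t_obj4 = (t_app**4 - (1.0 - emissivity) * t_amb**4) / emissivity
        return t_obj4**0.25 - 273.15
    ```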

  19. Automatic relative RPC image model bias compensation through hierarchical image matching for improving DEM quality

    NASA Astrophysics Data System (ADS)

    Noh, Myoung-Jong; Howat, Ian M.

    2018-02-01

    The quality and efficiency of automated Digital Elevation Model (DEM) extraction from stereoscopic satellite imagery is critically dependent on the accuracy of the sensor model used for co-locating pixels between stereo-pair images. In the absence of ground control or manual tie point selection, errors in the sensor models must be compensated for with increased matching search spaces, increasing both the computation time and the likelihood of spurious matches. Here we present an algorithm for automatically determining and compensating the relative bias in Rational Polynomial Coefficients (RPCs) between stereo pairs utilizing hierarchical, sub-pixel image matching in object space. We demonstrate the algorithm using a suite of image stereo pairs from multiple satellites over a range of stereo-photogrammetrically challenging polar terrains. Besides providing a validation of the effectiveness of the algorithm for improving DEM quality, experiments with prescribed sensor model errors yield insight into the dependence of DEM characteristics and quality on relative sensor model bias. This algorithm is included in the Surface Extraction through TIN-based Search-space Minimization (SETSM) DEM extraction software package, which is the primary software used for the U.S. National Science Foundation ArcticDEM and Reference Elevation Model of Antarctica (REMA) products.
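
    A minimal sketch of relative bias compensation modelled as a constant image-space offset estimated from matched points, in the spirit of the algorithm above; rpc_project is a hypothetical helper evaluating the rational-polynomial model, and the pure-shift bias model is an illustrative assumption.

    ```python
    import numpy as np

    def estimate_bias(ground_pts, image_pts, rpc_project):
        """ground_pts: (n, 3) lon/lat/height of matched object-space points;
        image_pts: (n, 2) measured line/sample coordinates in the second image;
        rpc_project: hypothetical callable mapping (lon, lat, h) -> (line, sample).
        Returns the mean offset aligning the RPC projection with the measurements."""
        projected = np.array([rpc_project(*p) for p in ground_pts])
        return (np.asarray(image_pts, dtype=float) - projected).mean(axis=0)

    def biased_projector(rpc_project, bias):
        """Wrap the projector so subsequent matching uses bias-corrected coordinates."""
        return lambda lon, lat, h: np.asarray(rpc_project(lon, lat, h)) + bias
    ```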

  20. Autonomous chemical and biological miniature wireless-sensor

    NASA Astrophysics Data System (ADS)

    Goldberg, Bar-Giora

    2005-05-01

    The presentation discusses a new concept and a paradigm shift in biological, chemical and explosive sensor system design and deployment: from large, heavy, centralized and expensive systems to distributed wireless sensor networks utilizing miniature platforms (nodes) that are lightweight, low cost and wirelessly connected. These new systems are made possible by the emergence and convergence of new innovative radio, imaging, networking and sensor technologies. Miniature integrated radio-sensor networks are a technology whose time has come. These network systems are based on large numbers of distributed low-cost, short-range wireless platforms that sense and process their environment and communicate data through a network to a command center. The recent emergence of chemical and explosive sensor technology based on silicon nanostructures, coupled with the fast evolution of low-cost CMOS imagers, low-power DSP engines and integrated radio chips, has created an opportunity to realize the vision of autonomous wireless networks. These threat detection networks will perform sophisticated analysis at the sensor node and convey alarm information up the command chain. Sensor networks of this type are expected to revolutionize the ability to detect and locate biological, chemical, or explosive threats. The ability to distribute large numbers of low-cost sensors over large areas enables these devices to be close to the targeted threats, and therefore improves detection efficiency and enables rapid counter-responses. These sensor networks will be used for homeland security, shipping container monitoring, and other applications such as laboratory medical analysis, drug discovery, automotive, environmental and/or in-vivo monitoring. Avaak's system concept is to image a chromatic biological, chemical and/or explosive sensor utilizing a digital imager, analyze the images and distribute alarm or image data wirelessly through the network. All the imaging, processing and communications take place within the miniature, low-cost distributed sensor platforms. This concept, however, presents a significant challenge due to the combination and convergence of required new technologies mentioned above. Passive biological and chemical sensors with very high sensitivity, which require no assaying, are in development using a technique to optically and chemically encode silicon wafers with tailored nanostructures. The silicon wafer is patterned with nano-structures designed to change colors and patterns when exposed to the target analytes (TICs, TIMs, VOC). A small video camera detects the color and pattern changes on the sensor. To determine if an alarm condition is present, an on-board DSP processor, using specialized image processing algorithms and statistical analysis, determines whether color gradient changes occurred on the sensor array. These sensors can detect several agents simultaneously. This system is currently under development by Avaak, with funding from DARPA through an SBIR grant.

  1. Small craft ID criteria (N50/V50) for short wave infrared sensors in maritime security

    NASA Astrophysics Data System (ADS)

    Krapels, Keith; Driggers, Ronald G.; Larson, Paul; Garcia, Jose; Walden, Barry; Agheera, Sameer; Deaver, Dawne; Hixson, Jonathan; Boettcher, Evelyn

    2008-04-01

    The need for Anti-Terrorism and Force Protection (AT/FP), for both shore and sea platform protection, has created a need for imager design and evaluation tools which can predict field performance against maritime asymmetric threats. In the design of tactical imaging systems for target acquisition, a discrimination criterion is required for successful sensor realization. It characterizes the difficulty of the task being performed by the observer and varies for different target sets. This criterion is used both in the assessment of existing infrared sensors and in the design of new conceptual sensors. In this experiment, we collected 8 small craft signatures (military and civilian) in the short wave infrared (SWIR) band during the day. These signatures were processed to determine the targets' characteristic dimension and contrast. They were also processed to bandlimit each signature's spatial information content (simulating longer range), and a perception experiment was performed to determine the task difficulty (N50 and V50). The results are presented in this paper and can be used for maritime security imaging sensor design and evaluation.

  2. Development of a handheld widefield hyperspectral imaging (HSI) sensor for standoff detection of explosive, chemical, and narcotic residues

    NASA Astrophysics Data System (ADS)

    Nelson, Matthew P.; Basta, Andrew; Patil, Raju; Klueva, Oksana; Treado, Patrick J.

    2013-05-01

    The utility of Hyperspectral Imaging (HSI) passive chemical detection employing wide-field, standoff imaging continues to be advanced in detection applications. With a drive for reduced SWaP (Size, Weight, and Power) and increased speed of detection and sensitivity, developing a handheld platform that is robust and user-friendly increases the detection capabilities of the end user. In addition, easy-to-use handheld detectors could improve the effectiveness of locating and identifying threats while reducing risks to the individual. ChemImage Sensor Systems (CISS) has developed the HSI Aperio™ sensor for real-time, wide-area surveillance and standoff detection of explosives, chemical threats, and narcotics for use in both government and commercial contexts. Employing liquid crystal tunable filter technology, the HSI system has an intuitive user interface that produces automated detections and real-time display of threats, with an end-user-created library of threat signatures that is easily updated to accommodate new hazardous materials. Unlike existing detection technologies that often require close proximity for sensing and so endanger operators and costly equipment, the handheld sensor allows the individual operator to detect threats from a safe distance. Uses of the sensor include locating production facilities of illegal drugs or IEDs by identification of materials on surfaces such as walls, floors and doors, deposits on production tools, and residue on individuals. In addition, the sensor can be used for longer-range standoff applications such as hasty checkpoint or vehicle inspection of residue materials on surfaces or bulk material identification. The CISS Aperio™ sensor has faster data collection, faster image processing, and increased detection capability compared to previous sensors.

  3. A CMOS-based large-area high-resolution imaging system for high-energy x-ray applications

    NASA Astrophysics Data System (ADS)

    Rodricks, Brian; Fowler, Boyd; Liu, Chiao; Lowes, John; Haeffner, Dean; Lienert, Ulrich; Almer, John

    2008-08-01

    CCDs have been the primary sensors in imaging systems for x-ray diffraction and imaging applications in recent years. CCDs have met the fundamental requirements of low noise, high sensitivity, high dynamic range and spatial resolution necessary for these scientific applications. State-of-the-art CMOS image sensor (CIS) technology has experienced dramatic improvements recently, and its performance now rivals or surpasses that of most CCDs. The advancement of CIS technology continues at an ever-accelerating pace, driven by the multi-billion-dollar consumer market. There are several advantages of CIS over traditional CCDs and other solid-state imaging devices, including low power, high-speed operation, system-on-chip integration and lower manufacturing costs. The combination of superior imaging performance and system advantages makes CIS a good candidate for high-sensitivity imaging system development. This paper describes a 1344 × 1212 CIS imaging system with a 19.5 μm pitch optimized for x-ray scattering studies at high energies. Fundamental metrics of linearity, dynamic range, spatial resolution, conversion gain and sensitivity are estimated. The Detective Quantum Efficiency (DQE) is also estimated. Representative x-ray diffraction images are presented, and diffraction images are compared against a CCD-based imaging system.

  4. Small image laser range finder for planetary rover

    NASA Technical Reports Server (NTRS)

    Wakabayashi, Yasufumi; Honda, Masahisa; Adachi, Tadashi; Iijima, Takahiko

    1994-01-01

    A variety of technical challenges need to be solved before planetary rover navigation can be part of future missions. The sensors that will perceive the terrain environment around the rover require critical development efforts. The image laser range finder (ILRF) discussed here is one of the candidate sensors because of its advantage in providing the range data required for navigation. The authors developed a new compact ILRF which is a quarter of the size of conventional ones. Instead of the conventional two-directional scanning system composed of nodding and polygon mirrors, the new ILRF is equipped with a direct polygon-mirror driving system, a new concept which successfully made the unit compact enough to accommodate the design requirements. The paper reports on the design concept and the preliminary technical specifications established in the current development phase.

  5. Centroid tracker and aimpoint selection

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, Ronda; Sujata, K. V.; Venkateswara Rao, B.

    1992-11-01

    Autonomous fire-and-forget weapons have gained importance for achieving an accurate first-pass kill by hitting the target at an appropriate aimpoint. The centroid of the image presented by a target in the field of view (FOV) of a sensor is generally accepted as the aimpoint for these weapons. Centroid trackers are applicable only when the target image is of significant size in the FOV of the sensor but does not overflow the FOV. As the range between the sensor and the target decreases, however, the image of the target grows and finally overflows the FOV at close ranges, and the centroid point on the target keeps changing, which is not desirable. Moreover, the centroid need not be the most desired or vulnerable point on the target. For hardened targets like tanks, proper aimpoint selection and guidance down to almost zero range are essential to achieve maximum kill probability. This paper presents a centroid tracker realization. As the centroid offers a stable tracking point, it can be used as a reference to select the proper aimpoint. The centroid and the desired aimpoint are simultaneously tracked to avoid jamming by flares and also to handle the problems arising from image overflow. Thresholding the gray-level image to a binary image is a crucial step in a centroid tracker. Different thresholding algorithms are discussed and a suitable algorithm is chosen. The real-time hardware implementation of the centroid tracker with a suitable thresholding technique is presented, including the interfacing to a multimode tracker for autonomous target tracking and aimpoint selection. The hardware uses very-high-speed arithmetic and programmable logic devices to meet the speed requirement and a microprocessor-based subsystem for system control. The tracker has been evaluated in a field environment.
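
    A minimal sketch of the core centroid computation on a thresholded image; the fixed threshold stands in for whichever thresholding algorithm the paper ultimately selects.

    ```python
    import numpy as np

    def centroid(gray, threshold=128):
        """Intensity centroid (row, col) of all pixels at or above threshold
        in a gray-level image."""
        binary = np.asarray(gray) >= threshold
        if not binary.any():
            raise ValueError("no target pixels above threshold")
        rows, cols = np.nonzero(binary)
        return float(rows.mean()), float(cols.mean())
    ```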

  6. Radiometric Quality Evaluation and Cross Validation of Data from the HJ-1A/B Satellites' CCD Sensors

    PubMed Central

    Zhang, Xin; Zhao, Xiang; Liu, Guodong; Kang, Qian; Wu, Donghai

    2013-01-01

    Data from multiple sensors are frequently used in Earth science to gain a more complete understanding of spatial information changes. Higher quality and mutual consistency are prerequisites when multiple sensors are used jointly. The HJ-1A/B satellites were successfully launched on 6 September 2008. There are four charge-coupled device (CCD) sensors with uniform spatial resolutions and spectral ranges onboard the HJ-1A/B satellites. Whether these data remain consistent is a major issue to resolve before they are used. This research aims to evaluate the data consistency and radiometric quality of the four CCDs. First, images of urban, desert, lake and ocean scenes are chosen as the objects of evaluation. Second, objective evaluation variables, such as mean, variance and angular second moment, are used to characterize image performance. Finally, a cross-validation method is used to assess the correlation of the data from the four HJ-1A/B CCDs with data gathered from the moderate resolution imaging spectro-radiometer (MODIS). The results show that the image quality of the HJ-1A/B CCDs is stable, and the digital number distribution of the CCD data is relatively low. In cross-validation with MODIS, the root mean square errors of bands 1, 2 and 3 range from 0.055 to 0.065, and for band 4 it is 0.101. The data from the HJ-1A/B CCDs show good consistency. PMID:23881127

  7. Radiometric quality evaluation and cross validation of data from the HJ-1A/B satellites' CCD sensors.

    PubMed

    Zhang, Xin; Zhao, Xiang; Liu, Guodong; Kang, Qian; Wu, Donghai

    2013-07-05

    Data from multiple sensors are frequently used in Earth science to gain a more complete understanding of spatial information changes. Higher quality and mutual consistency are prerequisites when multiple sensors are used jointly. The HJ-1A/B satellites were successfully launched on 6 September 2008. There are four charge-coupled device (CCD) sensors with uniform spatial resolutions and spectral ranges onboard the HJ-1A/B satellites. Whether these data remain consistent is a major issue to resolve before they are used. This research aims to evaluate the data consistency and radiometric quality of the four CCDs. First, images of urban, desert, lake and ocean scenes are chosen as the objects of evaluation. Second, objective evaluation variables, such as mean, variance and angular second moment, are used to characterize image performance. Finally, a cross-validation method is used to assess the correlation of the data from the four HJ-1A/B CCDs with data gathered from the moderate resolution imaging spectro-radiometer (MODIS). The results show that the image quality of the HJ-1A/B CCDs is stable, and the digital number distribution of the CCD data is relatively low. In cross-validation with MODIS, the root mean square errors of bands 1, 2 and 3 range from 0.055 to 0.065, and for band 4 it is 0.101. The data from the HJ-1A/B CCDs show good consistency.
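
    A minimal sketch of the objective evaluation variables named above (mean, variance, angular second moment), assuming scikit-image for the gray-level co-occurrence matrix; the distance/angle parameters are illustrative assumptions.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def quality_metrics(band):
        """band: 2-D uint8 digital-number image from one CCD band."""
        glcm = graycomatrix(band, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)
        return {
            "mean": float(band.mean()),
            "variance": float(band.var()),
            "angular_second_moment": float(graycoprops(glcm, "ASM")[0, 0]),
        }
    ```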

  8. Advanced Video Guidance Sensor (AVGS) Development Testing

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Johnston, Albert S.; Bryan, Thomas C.; Book, Michael L.

    2004-01-01

    NASA's Marshall Space Flight Center was the driving force behind the development of the Advanced Video Guidance Sensor, an active sensor system that provides near-range sensor data as part of an automatic rendezvous and docking system. The sensor determines the relative positions and attitudes between the active sensor and the passive target at ranges up to 300 meters. The AVGS uses laser diodes to illuminate retro-reflectors on the target, a solid-state camera to detect the return from the target, and image capture electronics and a digital signal processor to convert the video information into relative positions and attitudes. The AVGS will fly as part of the Demonstration of Autonomous Rendezvous Technologies (DART) in October 2004. This development effort has required a great deal of testing at every phase of development. The test efforts included optical characterization of performance with the intended target, thermal vacuum testing, performance tests in long-range vacuum facilities, EMI/EMC tests, and performance testing in dynamic situations. The sensor has been shown to track a target at ranges of up to 300 meters, in both vacuum and ambient conditions, to survive and operate during the thermal vacuum cycling specific to the DART mission, to handle EMI well, and to perform well in dynamic situations.

  9. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    NASA Astrophysics Data System (ADS)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in Photogrammetry and Computer Vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they compose an attractive 3D digitization approach; consequently, although range-based methods are generally very accurate, image-based methods are low-cost and can be easily used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open source tools and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Given the availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimal method to generate three-dimensional models. Much research has been conducted to identify suitable software and algorithms to achieve an accurate and complete model, but little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is to identify an appropriate combination of sensor and software to provide a complete model with the highest accuracy. To do this, different software packages used in previous studies were compared and the most popular ones in each category were selected (Arc 3D, Visual SfM, Sure, Agisoft). Four small objects with distinct geometric properties and particular complexities were chosen, and accurate models of them, serving as reliable ground truth, were created using an ATOS Compact Scan 2M 3D scanner. Images were taken using a Fujifilm Real 3D stereo camera, an Apple iPhone 5 and a Nikon D3200 professional camera, and three-dimensional models of the objects were obtained using each of the software packages. Finally, a comprehensive comparison of the detailed results on the data set showed that the best combination of software and sensor for generating three-dimensional models is directly related to the object shape as well as the expected accuracy of the final model. Generally, better quantitative and qualitative results were obtained with the Nikon D3200 professional camera, while the Fujifilm Real 3D stereo camera and Apple iPhone 5 were second and third, respectively, in this comparison. On the other hand, the three packages Visual SfM, Sure and Agisoft competed closely to achieve the most accurate and complete model of the objects, and the best package differed according to the geometric properties of the object.

  10. Design of the OMPS limb sensor correction algorithm

    NASA Astrophysics Data System (ADS)

    Jaross, Glen; McPeters, Richard; Seftor, Colin; Kowitt, Mark

    The Sensor Data Records (SDRs) for the Ozone Mapping and Profiler Suite (OMPS) on NPOESS (National Polar-orbiting Operational Environmental Satellite System) contain geolocated and calibrated radiances, and are similar to the Level 1 data of the NASA Earth Observing System and other programs. The SDR algorithms (one for each of the 3 OMPS focal planes) are the processes by which the Raw Data Records (RDRs) from the OMPS sensors are converted into records that contain all data necessary for ozone retrievals. Consequently, the algorithms must correct and calibrate Earth signals, geolocate the data, and identify and ingest collocated ancillary data. As with other limb sensors, ozone profile retrievals are relatively insensitive to calibration errors due to the use of altitude normalization and wavelength pairing. But the profile retrievals as they pertain to OMPS are not immune to sensor changes. In particular, the OMPS Limb sensor images an altitude range of >100 km and a spectral range of 290-1000 nm on its detector. Uncorrected sensor degradation and spectral registration drifts can lead to changes in the measured radiance profile, which in turn affect the ozone trend measurement. Since OMPS is intended for long-term monitoring, sensor calibration is a specific concern. The calibration is maintained via the ground data processing. This means that all sensor calibration data, including direct solar measurements, are brought down in the raw data and processed separately by the SDR algorithms. One of the sensor corrections performed by the algorithm is the correction for stray light. The imaging spectrometer and the unique focal plane design of OMPS make these corrections particularly challenging and important. Following an overview of the algorithm flow, we briefly describe the sensor stray light characterization and the correction approach used in the code.

  11. Plenoptic wavefront sensor with scattering pupil.

    PubMed

    Vdovin, Gleb; Soloviev, Oleg; Loktev, Mikhail

    2014-04-21

    We consider a wavefront sensor combining a scattering pupil with a plenoptic imager. Such a sensor utilizes the same reconstruction principle as the Hartmann-Shack sensor; however, it is free from the ambiguity of spot location caused by the periodic structure of the sensor matrix, and it allows for a wider range of measured aberrations. In our study, the sensor with a scattering pupil demonstrated a good match between the introduced and reconstructed aberrations, both in simulation and in experiment. The concept is expected to be applicable to optical metrology of strongly distorted wavefronts, especially for measurements through dirty, distorted, or scattering windows and pupils, such as cataract eyes.

  12. Ultra-high resolution coded wavefront sensor.

    PubMed

    Wang, Congli; Dun, Xiong; Fu, Qiang; Heidrich, Wolfgang

    2017-06-12

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  13. Simple Colorimetric Sensor for Trinitrotoluene Testing

    NASA Astrophysics Data System (ADS)

    Samanman, S.; Masoh, N.; Salah, Y.; Srisawat, S.; Wattanayon, R.; Wangsirikul, P.; Phumivanichakit, K.

    2017-02-01

    A simply operated colorimetric sensor for trinitrotoluene (TNT) determination, using a commercial scanner for image capture, was designed. The sensor is based on the chemical reaction between TNT and a sodium hydroxide reagent, which produces a color change within 96-well plates that is finally recorded with the scanner. The intensity of the color change increased with TNT concentration, and the concentration of TNT could easily be quantified by digital image analysis using the free ImageJ software. Under optimum conditions, the sensor provided a linear dynamic range between 0.20 and 1.00 mg mL-1 (r = 0.9921) with a limit of detection of 0.10 ± 0.01 mg mL-1. The relative standard deviation of the sensitivity over eight experiments was 3.8%. When applied to the analysis of TNT in two soil extract samples, the concentrations found ranged from non-detectable to 0.26 ± 0.04 mg mL-1. The obtained recovery values (93-95%) were acceptable for the soil samples tested.
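
    The digital-image quantification step can be illustrated with a short sketch. The example below (hypothetical intensity values, not the authors' data) fits a linear calibration of mean well intensity against TNT concentration and inverts it for an unknown well, assuming the mean color-change intensity has already been extracted from the scanned plate image, for example with ImageJ.

    ```python
    import numpy as np

    # Hypothetical calibration data: TNT standards (mg/mL) and the mean
    # color-change intensity extracted from each well of the scanned plate.
    conc = np.array([0.20, 0.40, 0.60, 0.80, 1.00])
    intensity = np.array([31.0, 58.5, 90.2, 118.7, 149.3])

    # Least-squares linear fit: intensity = slope * conc + offset.
    slope, offset = np.polyfit(conc, intensity, 1)
    r = np.corrcoef(conc, intensity)[0, 1]

    def tnt_concentration(mean_intensity):
        """Invert the calibration line for an unknown well."""
        return (mean_intensity - offset) / slope

    print(f"fit: I = {slope:.1f}*C + {offset:.1f}, r = {r:.4f}")
    print(f"well at I = 75 -> {tnt_concentration(75.0):.2f} mg/mL")
    ```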

  14. Millimeter wave imaging for concealed weapon detection and surveillance at up to 220 GHz

    NASA Astrophysics Data System (ADS)

    Stanko, S.; Nötel, D.; Huck, J.; Wirtz, S.; Klöppel, F.; Essen, H.

    2008-04-01

    Sensors used for security purposes have to cover the non-invasive control of people and of the direct surroundings of buildings and camps to detect weapons, explosives and chemical or biological threat material. These sensors have to cope with different environmental conditions. Ideally, the control of people should be done at longer distance as standoff detection. The work described in this paper concentrates on passive radiometric sensors at 0.1 and 0.2 THz which are able to detect non-metallic objects like ceramic knives. The identification of objects like mobile phones or PDAs will also be shown. Additionally, standoff surveillance is possible, which is of high importance with regard to suicide bombers. The presentation includes images at both mentioned frequencies, comparing their efficiency in terms of range and resolution. In addition, the sensor design concept, a Dicke-type 220 GHz radiometer using new LNAs, is presented along with the results and image enhancement methods.

  15. A protein-dye hybrid system as a narrow range tunable intracellular pH sensor.

    PubMed

    Anees, Palapuravan; Sudheesh, Karivachery V; Jayamurthy, Purushothaman; Chandrika, Arunkumar R; Omkumar, Ramakrishnapillai V; Ajayaghosh, Ayyappanpillai

    2016-11-18

    Accurate monitoring of pH variations inside cells is important for the early diagnosis of diseases such as cancer. Even though a variety of different pH sensors are available, construction of a custom-made sensor array for measuring minute variations in a narrow biological pH window, using easily available constituents, is a challenge. Here we report two-component hybrid sensors derived from a protein and organic dye nanoparticles whose sensitivity range can be tuned by choosing different ratios of the components, to monitor the minute pH variations in a given system. The dye interacts noncovalently with the protein at lower pH and covalently at higher pH, triggering two distinguishable fluorescent signals at 700 and 480 nm, respectively. The pH sensitivity region of the probe can be tuned for every unit of the pH window resulting in custom-made pH sensors. These narrow range tunable pH sensors have been used to monitor pH variations in HeLa cells using the fluorescence imaging technique.

  16. Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera

    NASA Astrophysics Data System (ADS)

    Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.

    2017-10-01

    Multispectral imaging is a widely used remote sensing technique whose applications range from agriculture to environmental monitoring, and from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near-infrared parts of the electromagnetic spectrum. Two versions are available, characterized by different sets of band-pass filters inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for geometric calibration and radiometric correction are presented in the paper.

  17. A versatile calibration procedure for portable coded aperture gamma cameras and RGB-D sensors

    NASA Astrophysics Data System (ADS)

    Paradiso, V.; Crivellaro, A.; Amgarou, K.; de Lanaute, N. Blanc; Fua, P.; Liénard, E.

    2018-04-01

    The present paper proposes a versatile procedure for the geometric calibration of coded aperture gamma cameras and RGB-D depth sensors, using only one radioactive point source and a simple experimental set-up. The calibration data are then used to accurately align radiation images retrieved by means of the γ-camera with the corresponding depth images computed with the RGB-D sensor. The system resulting from such a combination is thus able to automatically retrieve the distance of radioactive hotspots by means of a pixel-wise mapping between gamma and depth images. This procedure is of great interest for a wide number of applications, ranging from precise automatic estimation of the shape and distance of radioactive objects to Augmented Reality systems. Incidentally, the corresponding results validated the choice of a perspective design model for the coded aperture γ-camera.
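
    The pixel-wise mapping between depth and gamma images rests on standard pinhole projection with the extrinsics recovered by the calibration. The sketch below shows that mapping for a single depth pixel; the intrinsic matrices and the rotation/translation are placeholders standing in for the quantities the calibration procedure would deliver.

    ```python
    import numpy as np

    def map_depth_pixel_to_gamma(u, v, depth, K_depth, K_gamma, R, t):
        """Project a depth-sensor pixel (u, v) with range `depth` into the
        gamma-camera image plane.

        K_depth, K_gamma: 3x3 intrinsic matrices; R, t: rotation and
        translation from the depth frame to the gamma frame (all assumed
        to come from the calibration).
        """
        # Back-project the depth pixel to a 3-D point in the depth frame.
        xyz = depth * (np.linalg.inv(K_depth) @ np.array([u, v, 1.0]))
        # Transform into the gamma-camera frame and project.
        p = K_gamma @ (R @ xyz + t)
        return p[0] / p[2], p[1] / p[2]
    ```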

  18. Secure and Efficient Transmission of Hyperspectral Images for Geosciences Applications

    NASA Astrophysics Data System (ADS)

    Carpentieri, Bruno; Pizzolante, Raffaele

    2017-12-01

    Hyperspectral images are acquired through air-borne or space-borne special cameras (sensors) that collect information coming from the electromagnetic spectrum of the observed terrain. Hyperspectral remote sensing and hyperspectral images are used for a wide range of purposes: originally, they were developed for mining applications and for geology because of the capability of such images to correctly identify various types of underground minerals by analysing the reflected spectra, but their usage has spread to other application fields, such as ecology, military and surveillance, historical research and even archaeology. The large amount of data produced by hyperspectral sensors, together with the fact that these images are acquired at high cost by air-borne sensors and are generally transmitted to a base station, makes an efficient and secure transmission protocol necessary. In this paper, we propose a novel framework for the secure and efficient transmission of hyperspectral images that combines a reversible invisible watermarking scheme, used in conjunction with digital signature techniques, and a state-of-the-art predictive lossless compression algorithm.

  19. No scanning depth imaging system based on TOF

    NASA Astrophysics Data System (ADS)

    Sun, Rongchun; Piao, Yan; Wang, Yu; Liu, Shuo

    2016-03-01

    To quickly obtain a 3D model of real-world objects, multi-point ranging is very important. However, the traditional measuring method usually adopts the principle of point-by-point or line-by-line measurement, which is slow and inefficient. In this paper, a non-scanning depth imaging system based on TOF (time of flight) is proposed. The system is composed of a light source circuit, a special infrared image sensor module, an image data processor and controller, a data cache circuit, a communication circuit, and so on. According to the working principle of TOF measurement, an image sequence was collected by the high-speed CMOS sensor, the distance information was obtained by identifying the phase difference, and the amplitude image was also calculated. Experiments were conducted, and the results show that the system achieves non-scanning depth imaging with good performance.
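
    The phase-difference ranging described above is commonly implemented with a four-phase (four-bucket) demodulation scheme in continuous-wave TOF sensors. The sketch below assumes that scheme and a hypothetical 20 MHz modulation frequency; the paper does not state its actual demodulation parameters.

    ```python
    import numpy as np

    C = 299_792_458.0   # speed of light, m/s
    F_MOD = 20e6        # assumed modulation frequency, Hz (not from the paper)

    def tof_depth(a0, a1, a2, a3):
        """Four-phase CW-TOF demodulation.

        a0..a3 are per-pixel correlation samples taken at 0, 90, 180 and
        270 degrees of the modulation period (arrays of equal shape).
        """
        phase = np.arctan2(a3 - a1, a0 - a2)          # wrapped phase
        phase = np.mod(phase, 2 * np.pi)              # map to [0, 2*pi)
        depth = C * phase / (4 * np.pi * F_MOD)       # range within one cycle
        amplitude = 0.5 * np.hypot(a3 - a1, a0 - a2)  # amplitude image
        return depth, amplitude
    ```

    With these assumed parameters the unambiguous range is c / (2 F_MOD), about 7.5 m; longer ranges alias back into that interval.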

  20. Cheetah: A high frame rate, high resolution SWIR image camera

    NASA Astrophysics Data System (ADS)

    Neys, Joel; Bentell, Jonas; O'Grady, Matt; Vermeiren, Jan; Colin, Thierry; Hooylaerts, Peter; Grietens, Bob

    2008-10-01

    A high-resolution, high-frame-rate InGaAs-based image sensor and associated camera have been developed. The sensor and the camera are capable of recording and delivering more than 1700 full 640x512-pixel frames per second. The FPA utilizes a low-lag CTIA current integrator in each pixel, enabling integration times shorter than one microsecond. On-chip logic allows four different sub-windows to be read out simultaneously at even higher rates. The spectral sensitivity of the FPA lies in the SWIR range [0.9-1.7 μm] and can be further extended into the visible and NIR range. The Cheetah camera has up to 16 GB of on-board memory to store the acquired images and transfers the data over a Gigabit Ethernet connection to the PC. The camera is also equipped with a full Camera Link™ interface to stream the data directly to a frame grabber or dedicated image processing unit. The Cheetah camera is completely under software control.

  1. Hdr Imaging for Feature Detection on Detailed Architectural Scenes

    NASA Astrophysics Data System (ADS)

    Kontogianni, G.; Stathopoulou, E. K.; Georgopoulos, A.; Doulamis, A.

    2015-02-01

    3D reconstruction relies on accurate detection, extraction, description and matching of image features. This is even truer for complex architectural scenes, which require 3D models of high quality without any loss of detail in geometry or color. Illumination conditions influence the radiometric quality of images, as standard sensors cannot properly depict a wide range of intensities in the same scene. Indeed, overexposed or underexposed pixels cause irreplaceable information loss and degrade the digital representation. Images taken under extreme lighting environments may thus be prohibitive for feature detection/extraction and consequently for matching and 3D reconstruction. High Dynamic Range (HDR) images could be helpful for these operators because they broaden the limits of the illumination range that Standard or Low Dynamic Range (SDR/LDR) images can capture and in this way increase the amount of detail contained in the image. The experimental results of this study support this assumption, as they examine state-of-the-art feature detectors applied to both standard dynamic range and HDR images.
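
    As a concrete illustration of how an HDR image broadens the captured intensity range, the sketch below merges a bracketed exposure stack with a simplified Debevec-style weighted average, assuming a linear sensor response; the hat-shaped weight suppresses over- and underexposed pixels. This is a generic merge, not the specific HDR pipeline used in the study.

    ```python
    import numpy as np

    def merge_hdr(images, exposure_times):
        """Merge linear-response exposures into one radiance map.

        images: list of float arrays scaled to [0, 1];
        exposure_times: matching list of exposure times in seconds.
        """
        num = np.zeros_like(images[0])
        den = np.zeros_like(images[0])
        for img, t in zip(images, exposure_times):
            w = 1.0 - np.abs(2.0 * img - 1.0)  # hat weight: trust mid-tones
            num += w * img / t                 # exposure-normalized radiance
            den += w
        return num / np.maximum(den, 1e-6)
    ```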

  2. Plenoptic Imager for Automated Surface Navigation

    NASA Technical Reports Server (NTRS)

    Zollar, Byron; Milder, Andrew; Milder, Andrew; Mayo, Michael

    2010-01-01

    An electro-optical imaging device is capable of autonomously determining the range to objects in a scene without the use of active emitters or multiple apertures. The novel, automated, low-power imaging system is based on a plenoptic camera design that was constructed as a breadboard system. Nanohmics proved the feasibility of the concept by designing an optical system for a prototype plenoptic camera, developing simulated plenoptic images and range-calculation algorithms, constructing a breadboard prototype plenoptic camera, and processing images (including range calculations) from the prototype system. The breadboard demonstration included an optical subsystem comprising a main aperture lens, a mechanical structure that holds an array of micro lenses at the focal distance from the main lens, and a structure that mates a CMOS imaging sensor at the correct distance from the micro lenses. The demonstrator also featured embedded electronics for camera readout and a post-processor executing image-processing algorithms to provide ranging information.

  3. Radiometric Characterization Results for the IKONOS, Quickbird, and OrbView-3 Sensor

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Aaron, David; Thome, Kurtis

    2006-01-01

    Radiometric calibration of commercial imaging satellite products is required to ensure that science and application communities better understand commercial imaging satellite properties. Inaccurate radiometric calibrations can lead to erroneous decisions and invalid conclusions and can limit intercomparisons with other systems. To address this calibration need, the NASA Applied Sciences Directorate (ASD) at Stennis Space Center established a commercial satellite imaging radiometric calibration team consisting of three independent groups: NASA ASD, the University of Arizona Remote Sensing Group, and South Dakota State University. Each group independently determined the absolute radiometric calibration coefficients of available high-spatial-resolution commercial 4-band multispectral products, in the visible through near-infrared spectrum, from GeoEye™ (formerly Space Imaging®) IKONOS, DigitalGlobe® QuickBird, and GeoEye (formerly ORBIMAGE®) OrbView-3. Each team member employed some variant of the reflectance-based vicarious calibration approach, requiring ground-based measurements coincident with image acquisitions and radiative transfer calculations. Several study sites throughout the United States that covered a significant portion of each sensor's dynamic range were employed. Satellite at-sensor radiance values were compared to those estimated by each independent team member to evaluate the sensors' radiometric accuracy. The combined results of this evaluation provide the user community with an independent assessment of these sensors' absolute calibration values.

  4. Advanced radiometric and interferometric millimeter-wave scene simulations

    NASA Technical Reports Server (NTRS)

    Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.

    1993-01-01

    Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.

  5. IR sensitivity enhancement of CMOS Image Sensor with diffractive light trapping pixels.

    PubMed

    Yokogawa, Sozo; Oshiyama, Itaru; Ikeda, Harumi; Ebiko, Yoshiki; Hirano, Tomoyuki; Saito, Suguru; Oinoue, Takashi; Hagimoto, Yoshiya; Iwamoto, Hayato

    2017-06-19

    We report on the IR sensitivity enhancement of a back-illuminated CMOS image sensor (BI-CIS) with a 2-dimensional diffractive inverted pyramid array structure (IPA) on crystalline silicon (c-Si) and deep trench isolation (DTI). FDTD simulations of semi-infinitely thick c-Si with 2D IPAs on its surface show more than 30% improvement in light absorption at λ = 850 nm for pitches over 400 nm, and a maximum enhancement of 43% at that wavelength is confirmed for a 540 nm pitch. A prototype BI-CIS sample with a pixel size of 1.2 μm square containing 400 nm-pitch IPAs shows 80% sensitivity enhancement at λ = 850 nm compared to a reference sample with a flat surface. This is due to diffraction by the IPA and total reflection at the pixel boundary. NIR images taken by a demo camera equipped with a C-mount lens show 75% sensitivity enhancement in the λ = 700-1200 nm wavelength range with negligible spatial-resolution degradation. Light-trapping CIS pixel technology promises to improve NIR sensitivity and appears applicable to many different image sensor applications, including security cameras, personal authentication, and range-finding time-of-flight cameras with IR illumination.

  6. An Algorithm to Identify and Localize Suitable Dock Locations from 3-D LiDAR Scans

    DTIC Science & Technology

    2013-05-10

    Graves, Mitchell Robert

    This report presents an algorithm to identify and localize suitable dock locations from 3-D Light Detection and Ranging (LiDAR) scans. A LiDAR sensor collects range images from a rotating array of vertically aligned lasers. Keywords: algorithm, dock, locations, point clouds, LiDAR, identify.

  7. Video sensor with range measurement capability

    NASA Technical Reports Server (NTRS)

    Howard, Richard T. (Inventor); Briscoe, Jeri M. (Inventor); Corder, Eric L. (Inventor); Broderick, David J. (Inventor)

    2008-01-01

    A video sensor device is provided which incorporates a rangefinder function. The device includes a single video camera and a fixed laser spaced a predetermined distance from the camera that, when activated, produces a laser beam. A diffractive optic element divides the beam so that multiple light spots are produced on a target object. A processor calculates the range to the object based on the known spacing and the angles determined from the light spots in the video images produced by the camera.
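
    The range computation from the known camera-laser spacing follows the standard triangulation geometry. Below is a minimal single-spot version with illustrative camera parameters; the patented device processes multiple spots, but each one reduces to this relation.

    ```python
    import math

    def triangulate_range(spot_px, center_px, focal_mm, pixel_pitch_mm,
                          baseline_m):
        """Range from the image position of a laser spot.

        The laser is mounted parallel to the optical axis at a known
        baseline; the spot's offset from the image center encodes range.
        A spot exactly at the center would imply infinite range.
        """
        offset_mm = (spot_px - center_px) * pixel_pitch_mm
        angle = math.atan2(offset_mm, focal_mm)   # parallax angle
        return baseline_m / math.tan(angle)

    # Assumed values: 25 mm lens, 10 um pixels, 0.3 m camera-laser spacing.
    print(triangulate_range(spot_px=370, center_px=320, focal_mm=25.0,
                            pixel_pitch_mm=0.010, baseline_m=0.3))  # ~15 m
    ```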

  8. A LWIR hyperspectral imager using a Sagnac interferometer and cooled HgCdTe detector array

    NASA Astrophysics Data System (ADS)

    Lucey, Paul G.; Wood, Mark; Crites, Sarah T.; Akagi, Jason

    2012-06-01

    LWIR hyperspectral imaging has a wide range of civil and military applications through its ability to sense chemical compositions at standoff ranges. Most recent implementations of this technology use spectrographs employing varying degrees of cryogenic cooling to reduce the sensor self-emission that can severely limit sensitivity. We have taken an interferometric approach that promises to reduce the need for cooling while preserving high resolution. Reduced cooling has multiple benefits, including faster system readiness from a power-off state, lower mass, and potentially lower cost owing to lower system complexity. We coupled an uncooled Sagnac interferometer with a 256x320 mercury cadmium telluride array with an 11 micron cutoff to produce a spatial interferometric LWIR hyperspectral imaging system operating from 7.5 to 11 microns. The sensor was tested in ground-to-ground applications and from a small aircraft, producing spectral imagery including the detection of gas emission from high-vapor-pressure liquids.

  9. Thermal infrared panoramic imaging sensor

    NASA Astrophysics Data System (ADS)

    Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-05-01

    Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside a protected area ensures maximum protection and at the same time reduces the workload on personnel, increases the reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, the thermal signatures of different target types facilitate better classification, beyond the limits set by the camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution; in many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at the large camera apertures required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC), combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8-14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets. The work in progress supports the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together in a wide range of homeland security applications, as well as to serve the Army in tasks of improved situational awareness (SA) in defensive and offensive operations, and as a sensor node in tactical Intelligence, Surveillance, Reconnaissance (ISR). The novel ViperView™ high-resolution panoramic thermal imager is the heart of the APTIS system. It features an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS system include network communications, advanced power management, and wakeup capability. Recent developments include image processing, optical design being expanded into the visible spectral range, and wireless communications design. This paper describes the development status of the APTIS system.

  10. Target acquisition modeling over the exact optical path: extending the EOSTAR TDA with the TOD sensor performance model

    NASA Astrophysics Data System (ADS)

    Dijk, J.; Bijl, P.; Oppeneer, M.; ten Hove, R. J. M.; van Iersel, M.

    2017-10-01

    The Electro-Optical Signal Transmission and Ranging (EOSTAR) model is an image-based Tactical Decision Aid (TDA) for thermal imaging systems (MWIR/LWIR) developed for a sea environment with an extensive atmosphere model. The Triangle Orientation Discrimination (TOD) target acquisition model calculates the sensor and signal processing effects on a set of input triangle test pattern images, judges their orientation using humans or a Human Visual System (HVS) model, and derives the system image quality and operational field performance from the correctness of the responses. Combining the TOD model and EOSTAR basically provides the possibility to model Target Acquisition (TA) performance over the exact path from scene to observer. In this method, ship-representative TOD test patterns are placed at the position of the real target, the combined effects of the environment (atmosphere, background, etc.), sensor and signal processing on the image are then calculated using EOSTAR, and finally the results are judged by humans. The thresholds are converted into Detection-Recognition-Identification (DRI) ranges of the real target. Experiments show that combining the TOD model and the EOSTAR model is indeed possible. The resulting images look natural and provide insight into the possibilities of combining the two models. The TOD observation task can be done well by humans, and the measured TOD is consistent with analytical TOD predictions for the same camera as modeled in the ECOMOS project.

  11. Optimum parameters of image preprocessing method for Shack-Hartmann wavefront sensor in different SNR condition

    NASA Astrophysics Data System (ADS)

    Wei, Ping; Li, Xinyang; Luo, Xi; Li, Jianfeng

    2018-02-01

    The centroid method is commonly adopted to locate the spot in the sub-apertures of the Shack-Hartmann wavefront sensor (SH-WFS); image preprocessing is required before calculating the spot location because the centroid method is extremely sensitive to noise. In this paper, the SH-WFS image was simulated according to the characteristics of the noise, background and intensity distribution. Optimal parameters of the SH-WFS image preprocessing method are put forward for different signal-to-noise ratio (SNR) conditions, with the wavefront reconstruction error used as the evaluation index. Two preprocessing methods, thresholding and windowing combined with thresholding, were compared by studying their applicable SNR ranges and analyzing their stability.
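
    A minimal version of the thresholding-plus-centroid step discussed above, for a single sub-aperture window; the threshold value is exactly the parameter whose optimum, per the paper, depends on the SNR condition.

    ```python
    import numpy as np

    def spot_centroid(window, threshold):
        """Thresholded center-of-gravity spot location in one sub-aperture.

        The threshold is subtracted and negative values are zeroed (a
        common thresholding variant), since the plain centroid is highly
        sensitive to background noise.
        """
        img = np.clip(window.astype(float) - threshold, 0.0, None)
        total = img.sum()
        if total == 0.0:
            return None  # no detectable spot in this sub-aperture
        ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        return (xs * img).sum() / total, (ys * img).sum() / total
    ```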

  12. Real-time optically sectioned wide-field microscopy employing structured light illumination and a CMOS detector

    NASA Astrophysics Data System (ADS)

    Mitic, Jelena; Anhut, Tiemo; Serov, Alexandre; Lasser, Theo; Bourquin, Stephane

    2003-07-01

    Real-time optically sectioned microscopy is demonstrated using an AC-sensitive detection concept realized with a smart CMOS image sensor and structured light illumination by a continuously moving periodic pattern. We describe two different detection systems based on CMOS image sensors for the detection and on-chip processing of the sectioned images in real time. A region of interest is sampled at a high frame rate. The demodulated signal delivered by the detector corresponds to the depth-discriminated image of the sample. The measured FWHM of the axial response depends on the spatial frequency of the projected grid illumination and is in the μm range. The effect of using broadband incoherent illumination is discussed. The performance of these systems is demonstrated by imaging technical as well as biological samples.

  13. Design and testing of a dual-band enhanced vision system

    NASA Astrophysics Data System (ADS)

    Way, Scott P.; Kerr, Richard; Imamura, Joseph J.; Arnoldy, Dan; Zeylmaker, Dick; Zuro, Greg

    2003-09-01

    An effective enhanced vision system must operate over a broad spectral range in order to offer a pilot an optimized scene that includes runway background as well as airport lighting and aircraft operations. The large dynamic range of intensities of these images is best handled with separate imaging sensors. The EVS 2000 is a patented dual-band Infrared Enhanced Vision System (EVS) utilizing image fusion concepts. It has the ability to provide a single image from uncooled infrared imagers combined with SWIR, NIR or LLLTV sensors. The system is designed to provide commercial and corporate airline pilots with improved situational awareness at night and in degraded weather conditions but can also be used in a variety of applications where the fusion of dual band or multiband imagery is required. A prototype of this system was recently fabricated and flown on the Boeing Advanced Technology Demonstrator 737-900 aircraft. This paper will discuss the current EVS 2000 concept, show results taken from the Boeing Advanced Technology Demonstrator program, and discuss future plans for the fusion system.

  14. Origin of high photoconductive gain in fully transparent heterojunction nanocrystalline oxide image sensors and interconnects.

    PubMed

    Jeon, Sanghun; Song, Ihun; Lee, Sungsik; Ryu, Byungki; Ahn, Seung-Eon; Lee, Eunha; Kim, Young; Nathan, Arokia; Robertson, John; Chung, U-In

    2014-11-05

    A technique for invisible image capture using a photosensor array based on transparent conducting oxide semiconductor thin-film transistors and transparent interconnection technologies is presented. A transparent conducting layer is employed for the sensor electrodes as well as for interconnection in the array, providing about 80% transmittance at visible-light wavelengths. The phototransistor is a Hf-In-Zn-O/In-Zn-O heterostructure yielding a high quantum efficiency in the visible range.

  15. SpectraCAM SPM: a camera system with high dynamic range for scientific and medical applications

    NASA Astrophysics Data System (ADS)

    Bhaskaran, S.; Baiko, D.; Lungu, G.; Pilon, M.; VanGorden, S.

    2005-08-01

    A scientific camera system having high dynamic range, designed and manufactured by Thermo Electron for scientific and medical applications, is presented. The newly developed CID820 image sensor with preamplifier-per-pixel technology is employed in this camera system. The 4-megapixel imaging sensor has a raw dynamic range of 82 dB. Each high-transparency pixel is based on a preamplifier-per-pixel architecture and contains two photogates for non-destructive readout (NDRO) of the photon-generated charge. Readout is achieved via parallel row processing with on-chip correlated double sampling (CDS). The imager is capable of true random pixel access with a maximum operating speed of 4 MHz. The camera controller consists of a custom camera signal processor (CSP) with an integrated 16-bit A/D converter and a PowerPC-based CPU running an embedded Linux operating system. The imager is cooled to -40°C via a three-stage cooler to minimize dark current. The camera housing is sealed and is designed to maintain the CID820 imager in the evacuated chamber for at least 5 years. Thermo Electron has also developed custom software and firmware to drive the SpectraCAM SPM camera. Included in this firmware package is the new Extreme DR™ algorithm, designed to extend the effective dynamic range of the camera by several orders of magnitude, up to 32-bit dynamic range. The RACID Exposure graphical user interface image analysis software runs on a standard PC connected to the camera via Gigabit Ethernet.

  16. Comparison of a CCD and an APS for soft X-ray diffraction

    NASA Astrophysics Data System (ADS)

    Stewart, Graeme; Bates, R.; Blue, A.; Clark, A.; Dhesi, S. S.; Maneuski, D.; Marchal, J.; Steadman, P.; Tartoni, N.; Turchetta, R.

    2011-12-01

    We compare a new CMOS Active Pixel Sensor (APS) to a Princeton Instruments PIXIS-XO: 2048B Charge Coupled Device (CCD) with soft X-rays in a synchrotron beamline test at the Diamond Light Source (DLS). Although CCDs are well established in scientific imaging, APS are an innovative technology that offers advantages over CCDs, including faster readout, higher operating temperature, in-pixel electronics for advanced image processing and reduced manufacturing cost. The APS employed was the Vanilla sensor, designed by the MI3 collaboration and funded by an RCUK Basic Technology grant. This sensor has 520 x 520 square pixels, 25 μm on each side, and can operate at a full-frame readout of up to 20 Hz. The sensor had been back-thinned to the epitaxial layer; this was the first time a back-thinned APS had been demonstrated at a DLS beamline. In the synchrotron experiment, soft X-rays with an energy of approximately 708 eV were used to produce a diffraction pattern from a permalloy sample. The pattern was imaged at a range of integration times with both sensors. The CCD had to be operated at a temperature of -55°C, whereas the Vanilla was operated over a temperature range from 20°C to -10°C. We show that the APS detector can operate with frame rates up to two hundred times faster than the CCD without excessive degradation of image quality. The signal-to-noise ratio of the APS is shown to be the same as that of the CCD at identical integration times, and the response is shown to be linear, with no charge-blooming effects. The experiment allowed a direct comparison of back-thinned APS and CCDs in a real soft X-ray synchrotron experiment.

  17. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping.

    PubMed

    Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca

    2015-08-12

    Unmanned aerial vehicles (UAVs) combined with sensors of different spectral ranges are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds at very early phenological stages are similar both spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique for creating a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights.

  18. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping

    PubMed Central

    Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca

    2015-01-01

    Unmanned aerial vehicles (UAVs) combined with sensors of different spectral ranges are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds at very early phenological stages are similar both spectrally and in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique for creating a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data at altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights. PMID:26274960

  19. Miniaturized optical wavelength sensors

    NASA Astrophysics Data System (ADS)

    Kung, Helen Ling-Ning

    Recently, semiconductor processing technology has been applied to the miniaturization of optical wavelength sensors. Compact sensors enable new applications such as integrated diode-laser wavelength monitors and frequency lockers, portable chemical and biological detection, and portable and adaptive hyperspectral imaging arrays. Small sensing systems involve trade-offs between resolution, operating range, throughput, multiplexing and complexity. We have developed a new wavelength-sensing architecture that balances these parameters for applications involving hyperspectral imaging spectrometer arrays. In this thesis we discuss and demonstrate two new wavelength-sensing architectures whose single-pixel designs can easily be extended into spectrometer arrays. The first class of devices is based on sampling a standing wave: measuring the wavelength-dependent period of optical standing waves formed by the interference of forward and reflected waves at a mirror. We fabricated two different devices based on this principle. The first device is a wavelength monitor, which measures the wavelength and power of a monochromatic source. The second device is a spectrometer that can also act as a selective spectral coherence sensor. The spectrometer contains a large-displacement piston-motion MEMS mirror and a thin GaAs photodiode flip-chip bonded to a quartz substrate. The performance of this spectrometer is similar to that of a Michelson in resolution, operating range, throughput and multiplexing, but with the added advantages of fewer components and a one-dimensional architecture. The second class of devices is based on the Talbot self-imaging effect. The Talbot effect occurs when a periodic object is illuminated with a spatially coherent wave: periodically spaced self-images are formed behind the object, and the spacing of the self-images is proportional to the wavelength of the incident light. We discuss and demonstrate how this effect can be used for spectroscopy. In the conclusion we compare these two new miniaturized spectrometer architectures to existing miniaturized spectrometers. We believe that the combination of miniaturized wavelength sensors and smart processing should facilitate the development of real-time, adaptive and portable sensing systems.

  20. Advances in image compression and automatic target recognition; Proceedings of the Meeting, Orlando, FL, Mar. 30, 31, 1989

    NASA Technical Reports Server (NTRS)

    Tescher, Andrew G. (Editor)

    1989-01-01

    Various papers on image compression and automatic target recognition are presented. Individual topics addressed include: target cluster detection in cluttered SAR imagery, model-based target recognition using laser radar imagery, Smart Sensor front-end processor for feature extraction of images, object attitude estimation and tracking from a single video sensor, symmetry detection in human vision, analysis of high resolution aerial images for object detection, obscured object recognition for an ATR application, neural networks for adaptive shape tracking, statistical mechanics and pattern recognition, detection of cylinders in aerial range images, moving object tracking using local windows, new transform method for image data compression, quad-tree product vector quantization of images, predictive trellis encoding of imagery, reduced generalized chain code for contour description, compact architecture for a real-time vision system, use of human visibility functions in segmentation coding, color texture analysis and synthesis using Gibbs random fields.

  1. Precision Formation Keeping at L2 Using the Autonomous Formation Flying Sensor

    NASA Technical Reports Server (NTRS)

    McLoughlin, Terence H.; Campbell, Mark

    2004-01-01

    Recent advances in formation keeping for large numbers of spacecraft using the Autonomous Formation Flying sensor are presented. This sensor, currently under development at JPL, has been identified as a key component of future formation-flying spacecraft missions. The sensor provides accurate range and bearing measurements between pairs of spacecraft using GPS technology. Previous theoretical work by the authors focused on developing a decentralized scheduling algorithm to control the tasking of such a sensor between the relative range and bearing measurements to each node in the formation. The resulting algorithm has been modified to include switching constraints in the sensor. This paper also presents a testbed for real-time validation of a sixteen-node formation based on the Stellar Imager mission. Key aspects of the simulation include minimum-fuel maneuvers based on free-body dynamics and a three-body propagator for simulating the formation at L2.

  2. A robust color signal processing with wide dynamic range WRGB CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Kawada, Shun; Kuroda, Rihito; Sugawa, Shigetoshi

    2011-01-01

    We have developed a robust color reproduction methodology using a simple calculation with a new color matrix and the previously developed wide dynamic range WRGB lateral overflow integration capacitor (LOFIC) CMOS image sensor. The image sensor was fabricated in a 0.18 μm CMOS technology and has a 45-degree oblique pixel array, a 4.2 μm effective pixel pitch and W pixels. A W pixel was formed by replacing one of the two G pixels in the Bayer RGB color filter. The W pixel has high sensitivity throughout the visible waveband. An emerald-green and yellow (EGY) signal is generated from the difference between the W signal and the sum of the RGB signals. This EGY signal mainly contains emerald-green and yellow light. These colors are difficult to reproduce accurately with the conventional simple linear matrix because their wavelengths fall in the valleys of the spectral sensitivity characteristics of the RGB pixels. A new linear matrix based on the EGY-RGB signal set was developed. Using this simple matrix, highly accurate color processing with a large margin against sensitivity fluctuation and noise has been achieved.
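
    The EGY signal formation and the extended linear matrix have a compact structure, sketched below with a placeholder 3x4 matrix; the actual coefficients are sensor-specific and are not given in the abstract.

    ```python
    import numpy as np

    def reproduce_color(r, g, b, w, matrix):
        """Color correction from the EGY-RGB signal set.

        The emerald-green/yellow component is the part of the
        high-sensitivity W signal not accounted for by the RGB channels.
        """
        egy = w - (r + g + b)
        signals = np.stack([r, g, b, egy])            # shape (4, ...)
        return np.tensordot(matrix, signals, axes=1)  # corrected RGB

    # Placeholder 3x4 matrix; real values come from sensor calibration.
    M = np.array([[ 1.2, -0.1, -0.1, 0.3],
                  [-0.1,  1.1, -0.2, 0.4],
                  [-0.1, -0.2,  1.3, 0.1]])
    ```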

  3. Design of an integrated aerial image sensor

    NASA Astrophysics Data System (ADS)

    Xue, Jing; Spanos, Costas J.

    2005-05-01

    The subject of this paper is a novel integrated aerial image sensor (IAIS) system suitable for integration within the surface of an autonomous test wafer. The IAIS could be used as a lithography process monitor, affording a "wafer's eye view" of the process and therefore facilitating advanced process control and diagnostics without integrating (and dedicating) the sensor to the processing equipment. The IAIS is composed of an aperture mask and an array of photo-detectors. In order to retrieve nanometer-scale resolution of the aerial image with a practical photo-detector pixel size, we propose an aperture mask design involving a series of spatially phased "moving" aperture groups. We demonstrate a design example aimed at the 65 nm technology node through TEMPEST simulation. The optimized key design parameters include an aperture width in the range of 30 nm and an aperture thickness in the range of 70 nm, and offer a spatial resolution of about 5 nm, all with comfortable fabrication tolerances. Our preliminary simulation work indicates the possibility of the IAIS being applied to immersion lithography. A bench-top far-field experiment verifies that our approach of spatial-frequency down-shifting through the formation of large Moiré patterns is feasible.

  4. Sensor noise camera identification: countering counter-forensics

    NASA Astrophysics Data System (ADS)

    Goljan, Miroslav; Fridrich, Jessica; Chen, Mo

    2010-01-01

    In camera identification using sensor noise, the camera that took a given image can be determined with high certainty by establishing the presence of the camera's sensor fingerprint in the image. In this paper, we develop methods to reveal counter-forensic activities in which an attacker estimates the camera fingerprint from a set of images and pastes it onto an image from a different camera with the intent to introduce a false alarm and, in doing so, frame an innocent victim. We start by classifying different scenarios based on the sophistication of the attacker's activity and the means available to her and to the victim, who wishes to defend herself. The key observation is that at least some of the images that were used by the attacker to estimate the fake fingerprint will likely be available to the victim as well. We describe the so-called "triangle test" that helps the victim reveal the attacker's malicious activity with high certainty under a wide range of conditions. This test is then extended to the case when none of the images that the attacker used to create the fake fingerprint are available to the victim, but the victim has at least two forged images to analyze. We demonstrate the test's performance experimentally and investigate its limitations. The conclusion of this study is that planting a sensor fingerprint in an image without leaving a trace is significantly more difficult than previously thought.
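
    For readers unfamiliar with the underlying identification scheme that both the attack and the defense build on, the sketch below estimates a fingerprint as the average noise residual of several same-camera images and tests a query image by normalized correlation. A Gaussian blur stands in for the wavelet denoiser normally used; this is the generic pipeline, not the triangle test itself.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def noise_residual(img, sigma=1.5):
        """Residual = image minus its denoised version (PRNU carrier)."""
        return img - gaussian_filter(img, sigma)

    def estimate_fingerprint(images):
        """Fingerprint: mean residual over images from the same camera."""
        return np.mean([noise_residual(im) for im in images], axis=0)

    def correlation(a, b):
        """Normalized correlation used as the detection statistic."""
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

    # A query image is attributed to the camera when
    # correlation(noise_residual(query), fingerprint) exceeds a threshold.
    ```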

  5. An improved algorithm for de-striping of ocean colour monitor imageries aided by measured sensor characteristics

    NASA Astrophysics Data System (ADS)

    Dutt, Ashutosh; Mishra, Ashish; Goswami, D. R.; Kumar, A. S. Kiran

    2016-05-01

    Push-broom sensors in bands meant to study oceans generally suffer from residual non-uniformity even after radiometric correction. The in-orbit data from OCM-2 show pronounced striping in the lower bands. There have been many attempts and different approaches to solve the problem using the image data itself; the success of each algorithm rests on the quality of the uniform region identified. In this paper, an image-based destriping algorithm is presented, with constraints derived from the ground calibration exercise. The basis of the methodology is the determination of pixel-to-pixel non-uniformity from uniform segments identified and collected from a large number of images covering the dynamic range of the sensor. The results show the effectiveness of the algorithm over different targets. The performance is evaluated qualitatively by visual inspection and measured quantitatively by two parameters.
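
    The core pixel-to-pixel non-uniformity correction can be sketched as a flat-fielding step: estimate a per-detector relative gain from the collected uniform segments and divide it out. This is a minimal sketch of that idea only; the paper's algorithm additionally constrains the estimate with ground-calibration data.

    ```python
    import numpy as np

    def estimate_gains(uniform_rows):
        """Per-detector relative gains from rows imaging uniform scenes.

        uniform_rows: (N, detectors) array of samples gathered from many
        images so that the sensor's dynamic range is covered.
        """
        mean_per_detector = uniform_rows.mean(axis=0)
        return mean_per_detector / mean_per_detector.mean()

    def destripe(image, gains):
        """Divide out the detector-wise (column-wise) gain pattern."""
        return image / gains[np.newaxis, :]
    ```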

  6. A study on rational function model generation for TerraSAR-X imagery.

    PubMed

    Eftekhari, Akram; Saadatseresht, Mohammad; Motagh, Mahdi

    2013-09-09

    The Rational Function Model (RFM) has been widely used as an alternative to rigorous sensor models of high-resolution optical imagery in photogrammetry and remote sensing geometric processing. However, not much work has been done to evaluate the applicability of the RF model for Synthetic Aperture Radar (SAR) image processing. This paper investigates how to generate a Rational Polynomial Coefficient (RPC) for high-resolution TerraSAR-X imagery using an independent approach. The experimental results demonstrate that the RFM obtained using the independent approach fits the Range-Doppler physical sensor model with an accuracy of greater than 10−3 pixel. Because independent RPCs indicate absolute errors in geolocation, two methods can be used to improve the geometric accuracy of the RFM. In the first method, Ground Control Points (GCPs) are used to update SAR sensor orientation parameters, and the RPCs are calculated using the updated parameters. Our experiment demonstrates that by using three control points in the corners of the image, an accuracy of 0.69 pixels in range and 0.88 pixels in the azimuth direction is achieved. For the second method, we tested the use of an affine model for refining RPCs. In this case, by applying four GCPs in the corners of the image, the accuracy reached 0.75 pixels in range and 0.82 pixels in the azimuth direction.

  7. A Study on Rational Function Model Generation for TerraSAR-X Imagery

    PubMed Central

    Eftekhari, Akram; Saadatseresht, Mohammad; Motagh, Mahdi

    2013-01-01

    The Rational Function Model (RFM) has been widely used as an alternative to rigorous sensor models of high-resolution optical imagery in photogrammetry and remote sensing geometric processing. However, not much work has been done to evaluate the applicability of the RF model for Synthetic Aperture Radar (SAR) image processing. This paper investigates how to generate a Rational Polynomial Coefficient (RPC) for high-resolution TerraSAR-X imagery using an independent approach. The experimental results demonstrate that the RFM obtained using the independent approach fits the Range-Doppler physical sensor model with an accuracy of greater than 10−3 pixel. Because independent RPCs indicate absolute errors in geolocation, two methods can be used to improve the geometric accuracy of the RFM. In the first method, Ground Control Points (GCPs) are used to update SAR sensor orientation parameters, and the RPCs are calculated using the updated parameters. Our experiment demonstrates that by using three control points in the corners of the image, an accuracy of 0.69 pixels in range and 0.88 pixels in the azimuth direction is achieved. For the second method, we tested the use of an affine model for refining RPCs. In this case, by applying four GCPs in the corners of the image, the accuracy reached 0.75 pixels in range and 0.82 pixels in the azimuth direction. PMID:24021971
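
    For context, the RFM maps normalized ground coordinates to image coordinates through ratios of cubic polynomials. A minimal evaluator is sketched below, assuming the conventional 20-term RPC ordering; the coefficient vectors would come from the generated RPC file.

    ```python
    import numpy as np

    def rpc_terms(P, L, H):
        """The 20 cubic terms in conventional RPC order
        (P, L, H = normalized latitude, longitude, height)."""
        return np.array([1, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
                         P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3,
                         P*H*H, L*L*H, P*P*H, H**3])

    def rfm_project(num_row, den_row, num_col, den_col, P, L, H):
        """Image (row, col) from normalized ground coordinates."""
        t = rpc_terms(P, L, H)
        row = np.dot(num_row, t) / np.dot(den_row, t)
        col = np.dot(num_col, t) / np.dot(den_col, t)
        return row, col
    ```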

  8. Remote Sensing of Selected Water-Quality Indicators with the Hyperspectral Imager for the Coastal Ocean (HICO) Sensor

    EPA Science Inventory

    The Hyperspectral Imager for the Coastal Ocean (HICO) offers the coastal environmental monitoring community an unprecedented opportunity to observe changes in coastal and estuarine water quality across a range of spatial scales not feasible with traditional field-based monitoring...

  9. High dynamic spectroscopy using a digital micromirror device and periodic shadowing.

    PubMed

    Kristensson, Elias; Ehn, Andreas; Berrocal, Edouard

    2017-01-09

    We present an optical solution called DMD-PS to boost the dynamic range of 2D imaging spectroscopic measurements up to 22 bits by incorporating a digital micromirror device (DMD) prior to detection, in combination with the periodic shadowing (PS) approach. In contrast to high dynamic range (HDR) imaging, where the dynamic range is increased by recording several images at different exposure times, the current approach has the potential of improving the dynamic range from a single exposure and without saturation of the CCD sensor. In the procedure, the spectrum is imaged onto the DMD, which selectively reduces the reflection from the intense spectral lines, allowing the signal from the weaker lines to be increased by a factor of 28 via longer exposure times, higher camera gains or increased laser power. This manipulation of the spectrum can be based either on a priori knowledge of the spectrum or on first performing a calibration measurement to sense the intensity distribution. The resulting benefits in detection sensitivity come, however, at the cost of strong generation of interfering stray light. To solve this issue, the Periodic Shadowing technique, which is based on spatial light modulation, is also employed. In this proof-of-concept article we describe the full methodology of DMD-PS and demonstrate, using the calibration-based concept, an improvement in dynamic range by a factor of ~100 over conventional imaging spectroscopy. The dynamic range of the presented approach will directly benefit from future technological developments in DMDs and camera sensors.
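
    Conceptually, the spectrum manipulation and its inversion amount to programming a per-wavelength attenuation and dividing it back out after detection. The sketch below shows the calibration-based variant under that simplification, ignoring the stray-light term that Periodic Shadowing is introduced to remove.

    ```python
    import numpy as np

    def design_attenuation(calib_spectrum, target_max):
        """Per-bin DMD reflectivity so the brightest lines just reach
        target_max on the detector (from a calibration exposure)."""
        return np.minimum(1.0, target_max / np.maximum(calib_spectrum, 1e-9))

    def recover_spectrum(measured, attenuation):
        """Undo the programmed attenuation to restore relative intensities."""
        return measured / attenuation
    ```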

  10. High-resolution Imaging of pH in Alkaline Sediments and Water Based on a New Rapid Response Fluorescent Planar Optode

    NASA Astrophysics Data System (ADS)

    Han, Chao; Yao, Lei; Xu, Di; Xie, Xianchuan; Zhang, Chaosheng

    2016-05-01

    A new dual-lumophore optical sensor combined with a robust RGB referencing method was developed for two-dimensional (2D) pH imaging in alkaline sediments and water. The pH sensor film consisted of a proton-permeable polymer (PVC) in which two dyes with different pH sensitivities and emission colors were entrapped: (1) chloro phenyl imino propenyl aniline (CPIPA) and (2) the coumarin dye Macrolex® fluorescence yellow 10 GN (MFY-10 GN). Calibration experiments revealed the typical sigmoid function and temperature dependencies. The sensor featured high sensitivity and fast response over the alkaline working range from pH 7.5 to pH 10.5. Cross-sensitivity towards ionic strength (IS) was found to be negligible for freshwater when IS <0.1 M. The sensor had a spatial resolution of approximately 22 μm and a response time of <120 s when going from pH 7.0 to 9.0. The feasibility of the sensor was verified against pH microelectrode measurements. An example pH image obtained in natural freshwater sediment and water, associated with the photosynthesis of the Vallisneria spiral species, is also presented, suggesting that the sensor holds great promise for field applications.

  11. High-resolution Imaging of pH in Alkaline Sediments and Water Based on a New Rapid Response Fluorescent Planar Optode

    PubMed Central

    Han, Chao; Yao, Lei; Xu, Di; Xie, Xianchuan; Zhang, Chaosheng

    2016-01-01

    A new dual-lumophore optical sensor combined with a robust RGB referencing method was developed for two-dimensional (2D) pH imaging in alkaline sediments and water. The pH sensor film consisted of a proton-permeable polymer (PVC) in which two dyes with different pH sensitivities and emission colors were entrapped: (1) chloro phenyl imino propenyl aniline (CPIPA) and (2) the coumarin dye Macrolex® fluorescence yellow 10 GN (MFY-10 GN). Calibration experiments revealed the typical sigmoid function and temperature dependencies. The sensor featured high sensitivity and fast response over the alkaline working range from pH 7.5 to pH 10.5. Cross-sensitivity towards ionic strength (IS) was found to be negligible for freshwater when IS <0.1 M. The sensor had a spatial resolution of approximately 22 μm and a response time of <120 s when going from pH 7.0 to 9.0. The feasibility of the sensor was verified against pH microelectrode measurements. An example pH image obtained in natural freshwater sediment and water, associated with the photosynthesis of the Vallisneria spiral species, is also presented, suggesting that the sensor holds great promise for field applications. PMID:27199163

  12. Development of Ferrite-Based Temperature Sensors for Magnetic Resonance Imaging: A Study of Cu1-xZnxFe2O4

    NASA Astrophysics Data System (ADS)

    Alghamdi, N. A.; Hankiewicz, J. H.; Anderson, N. R.; Stupic, K. F.; Camley, R. E.; Przybylski, M.; Żukrowski, J.; Celinski, Z.

    2018-05-01

    We investigate the use of Cu1-xZnxFe2O4 ferrites (0.60

  13. Information theory analysis of sensor-array imaging systems for computer vision

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.; Self, M. O.

    1983-01-01

    Information theory is used to assess the performance of sensor-array imaging systems, with emphasis on the performance obtained with image-plane signal processing. By electronically controlling the spatial response of the imaging system, as suggested by the mechanism of human vision, it is possible to trade off edge enhancement for sensitivity, increase dynamic range, and reduce data transmission. Computational results show that: signal information density varies little with large variations in the statistical properties of random radiance fields; most information (generally about 85 to 95 percent) is contained in the signal intensity transitions rather than levels; and performance is optimized when the OTF of the imaging system is nearly limited to the sampling passband to minimize aliasing at the cost of blurring, and the SNR is very high to permit the retrieval of small spatial detail from the extensively blurred signal. Shading the lens aperture transmittance to increase depth of field and using a regular hexagonal sensor array instead of a square lattice to decrease sensitivity to edge orientation also improve the signal information density, by up to about 30 percent at high SNRs.

  14. A Range Ambiguity Suppression Processing Method for Spaceborne SAR with Up and Down Chirp Modulation.

    PubMed

    Wen, Xuejiao; Qiu, Xiaolan; Han, Bing; Ding, Chibiao; Lei, Bin; Chen, Qi

    2018-05-07

    Range ambiguity is one of the factors which affect SAR image quality. Alternately transmitting up and down chirp modulation pulses is one of the methods used to suppress range ambiguity. However, the defocused range-ambiguous signal can still exhibit stronger backscattering intensity than the mainlobe imaging area in some cases, which severely impacts visual quality and subsequent applications. In this paper, a novel hybrid range ambiguity suppression method for up and down chirp modulation is proposed. The method obtains an image of the ambiguity area and appropriately reduces the ambiguous signal power by applying pulse compression with the contrary modulation rate together with a CFAR detection method. The effectiveness and correctness of the approach are demonstrated by processing archive images acquired by the Chinese Gaofen-3 SAR sensor in full-polarization mode.
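
    The effect of the contrary modulation rate can be illustrated numerically: an echo compressed with the matched (same-rate) reference collapses to a sharp peak, while the same echo compressed with the opposite chirp rate stays defocused, which is what lets the ambiguous energy be detected and suppressed. The pulse parameters below are arbitrary round numbers, not Gaofen-3 values.

        import numpy as np

        fs = 100e6            # sampling rate (Hz), assumed
        T, B = 10e-6, 30e6    # pulse duration (s) and bandwidth (Hz), assumed
        t = np.arange(int(T * fs)) / fs
        k = B / T             # chirp rate

        up_chirp = np.exp(1j * np.pi * k * (t - T / 2) ** 2)
        down_chirp = np.conj(up_chirp)

        def compress(echo, ref):
            # Pulse compression = correlation with the reference chirp.
            return np.abs(np.convolve(echo, np.conj(ref[::-1]), mode="same"))

        matched = compress(up_chirp, up_chirp)       # intended pulse: sharp peak
        mismatched = compress(up_chirp, down_chirp)  # ambiguous pulse: stays spread
        print(f"matched peak/mean:    {matched.max() / matched.mean():.1f}")
        print(f"mismatched peak/mean: {mismatched.max() / mismatched.mean():.1f}")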

  15. Laser range profiling for small target recognition

    NASA Astrophysics Data System (ADS)

    Steinvall, Ove; Tulldahl, Michael

    2017-03-01

    Long-range identification (ID) of targets, or ID of small targets at closer range, is limited in imaging by the demand for very high transverse sensor resolution. This is, therefore, a motivation to look for one-dimensional laser techniques for target ID. These include laser vibrometry and laser range profiling. Laser vibrometry can give good results but is not always robust, as it is sensitive to certain vibrating parts on the target being in the field of view. Laser range profiling is attractive because the maximum range can be substantial, especially for a small laser beam width. A range profiler can also be used in a scanning mode to detect targets within a certain sector. The same laser can also be used for active imaging when the target comes closer and is angularly resolved. Our laser range profiler is based on a laser with a pulse width of 6 ns (full width half maximum). This paper shows both experimental and simulated results for laser range profiling of small boats out to a 6 to 7-km range and an unmanned aerial vehicle (UAV) mockup at close range (1.3 km). The naval experiments took place in the Baltic Sea using many other active and passive electro-optical sensors in addition to the profiling system. The UAV experiments showed the need for high range resolution; thus we used a photon counting system in addition to the more conventional profiler used in the naval experiments. This paper shows the influence of target pose and range resolution on the capability of classification. The typical resolution (in our case 0.7 m) obtainable with a conventional range-finder type of sensor can be used for classification of large targets with a depth structure over 5 to 10 m or more, but for smaller targets such as a UAV a high resolution (in our case 7.5 mm) is needed to reveal depth structures and surface shapes. This paper also shows the need for 3-D target information to build libraries for comparison of measured and simulated range profiles. At closer ranges, full 3-D images should be preferable.

  16. Two-photon fluorescent sensor for K+ imaging in live cells (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Sui, Binglin; Yue, Xiling; Kim, Bosung; Belfield, Kevin D.

    2016-03-01

    It is difficult to overstate the physiological importance of potassium for life, as its indispensable roles in a variety of biological processes are widely known. As a result, efficient methods for determining physiological levels of potassium are of paramount importance. Despite this, relatively few K+ fluorescence sensors have been reported, with only one being commercially available. A new two-photon excited fluorescent K+ sensor is reported. The sensor comprises three moieties: a highly selective K+ chelator as the K+ recognition unit, a boron-dipyrromethene (BODIPY) derivative modified with phenylethynyl groups as the fluorophore, and two polyethylene glycol chains to afford water solubility. The sensor displays very high selectivity (>52-fold) in detecting K+ over other physiological metal cations. Upon binding K+, the sensor switches from non-fluorescent to highly fluorescent, emitting red to near-IR (NIR) fluorescence. The sensor exhibited a good two-photon absorption cross section of 500 GM at 940 nm. Moreover, it is not sensitive to pH in the physiological pH range. Time-dependent cell imaging studies via both one- and two-photon fluorescence microscopy demonstrate that the sensor is suitable for dynamic K+ sensing in living cells.

  17. Small craft identification discrimination criteria N50 and V50 for visible and infrared sensors in maritime security

    NASA Astrophysics Data System (ADS)

    Krapels, Keith; Deaver, Dawne; Driggers, Ronald

    2006-09-01

    The new emphasis on Anti-Terrorism and Force Protection (AT/FP), for both shore and sea platform protection, has resulted in a need for infrared imager design and evaluation tools which demonstrate field performance against U.S. Navy AT/FP requirements. In the design of infrared imaging systems for target acquisition, a discrimination criterion is required for successful sensor realization. It characterizes the difficulty of the task being performed by the observer and varies for different target sets. This criterion is used both in the assessment of existing infrared sensors and in the design of new conceptual sensors. In this experiment, we collected 12 small craft signatures (military and civilian) in the visible band during the day and in the LWIR and MWIR spectra in both day and night environments. These signatures were processed to determine the targets' characteristic dimension and contrast. They were also processed to bandlimit the signatures' spatial information content (simulating longer range), and a perception experiment was performed to determine the task difficulty (N50 and V50). The results are presented in this paper and can be used for Navy and Coast Guard imaging infrared sensor design and evaluation.

  18. Midwave infrared and visible sensor performance modeling: small craft identification discrimination criteria for maritime security

    NASA Astrophysics Data System (ADS)

    Krapels, Keith; Driggers, Ronald G.; Deaver, Dawne; Moker, Steven K.; Palmer, John

    2007-10-01

    The new emphasis on Anti-Terrorism and Force Protection (AT/FP), for both shore and sea platform protection, has resulted in a need for infrared imager design and evaluation tools that demonstrate field performance against U.S. Navy AT/FP requirements. In the design of infrared imaging systems for target acquisition, a discrimination criterion is required for successful sensor realization. It characterizes the difficulty of the task being performed by the observer and varies for different target sets. This criterion is used both in the assessment of existing infrared sensors and in the design of new conceptual sensors. We collected 12 small craft signatures (military and civilian) in the visible band during the day and in the long-wave and midwave infrared spectra in both day and night environments. These signatures were processed to determine the targets' characteristic dimension and contrast. They were also processed to band limit the signatures' spatial information content (simulating longer range), and a perception experiment was performed to determine the task difficulty (N50 and V50). The results are presented and can be used for Navy and Coast Guard imaging infrared sensor design and evaluation.

  19. Nitrogen-rich functional groups carbon nanoparticles based fluorescent pH sensor with broad-range responding for environmental and live cells applications.

    PubMed

    Shi, Bingfang; Su, Yubin; Zhang, Liangliang; Liu, Rongjun; Huang, Mengjiao; Zhao, Shulin

    2016-08-15

    A fluorescent pH sensor based on nitrogen-rich functional-group carbon nanoparticles (N-CNs), with a broad-range response, was prepared by one-pot hydrothermal treatment of melamine and triethanolamine. The as-prepared N-CNs exhibited excellent photoluminescence properties with an absolute quantum yield (QY) of 11.0%. Furthermore, the N-CNs possessed a broad-range pH response. The linear pH response range was 3.0 to 12.0, which is much wider than that of previously reported fluorescent pH sensors. The possible mechanism for the pH-sensitive response of the N-CNs was ascribed to photoinduced electron transfer (PET). Cell toxicity experiments showed that the as-prepared N-CNs exhibited low cytotoxicity and excellent biocompatibility, with cell viabilities of more than 87%. The proposed N-CNs-based pH sensor was used for pH monitoring of environmental water samples and pH fluorescence imaging of live T24 cells. The N-CNs are promising as a convenient and general fluorescent pH sensor for environmental monitoring and bioimaging applications.

  20. Demonstration of the CDMA-mode CAOS smart camera.

    PubMed

    Riza, Nabeel A; Mazhar, Mohsin A

    2017-12-11

    Demonstrated is the code division multiple access (CDMA)-mode coded access optical sensor (CAOS) smart camera suited for bright target scenarios. Deploying a silicon CMOS sensor and a silicon point detector within a digital micro-mirror device (DMD)-based spatially isolating hybrid camera design, this smart imager first engages the DMD staring mode with a controlled high optical attenuation factor of 200 on the scene irradiance to provide a classic unsaturated CMOS sensor-based image for target intelligence gathering. Next, the image data provided by this CMOS sensor are used to acquire a more robust, un-attenuated true target image of a focused zone using the time-modulated CDMA-mode of the CAOS camera. Using four different bright-light test target scenes, successfully demonstrated is a proof-of-concept visible band CAOS smart camera operating in the CDMA-mode using Walsh-design CAOS pixel codes of up to 4096 bits in length with a maximum 10 kHz code bit rate, giving a 0.4096 s CAOS frame acquisition time. A 16-bit analog-to-digital converter (ADC) with time-domain correlation digital signal processing (DSP) generates the CDMA-mode images with a 3600 CAOS pixel count and a best spatial resolution of one square micro-mirror pixel, 13.68 μm on a side. The CDMA-mode of the CAOS smart camera is suited for applications where robust high dynamic range (DR) imaging is needed for un-attenuated, un-spoiled, spectrally diverse bright-light targets.
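
    The CDMA encode/decode arithmetic can be sketched as follows. This is a toy model (64 CAOS pixels rather than 3600, an arbitrary noise level), shown only to illustrate why orthogonal Walsh codes let a single point detector recover per-pixel irradiance by correlation; it is not the camera's actual signal chain.

        import numpy as np
        from scipy.linalg import hadamard

        n_pixels = 64                 # CAOS pixels; code length = 64 bits
        codes = hadamard(n_pixels)    # +/-1 Walsh/Hadamard codes, one row per pixel

        rng = np.random.default_rng(1)
        irradiance = rng.uniform(0.0, 1.0, n_pixels)   # unknown bright-target scene

        detector = codes.T @ irradiance                    # summed time-modulated signal
        detector += rng.normal(0.0, 0.01, detector.shape)  # point-detector noise

        recovered = (codes @ detector) / n_pixels          # correlation-based DSP decode
        print(np.max(np.abs(recovered - irradiance)))      # small reconstruction error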

  1. Consistency of L4 TM absolute calibration with respect to the L5 TM sensor based on near-simultaneous image acquisition

    NASA Astrophysics Data System (ADS)

    Chander, Gyanesh; Helder, Dennis L.; Malla, Rimy; Micijevic, Esad; Mettler, Cory J.

    2007-09-01

    The Landsat archive provides more than 35 years of uninterrupted multispectral remotely sensed data of Earth observations. Since 1972, Landsat missions have carried different types of sensors, from the Return Beam Vidicon (RBV) camera to the Enhanced Thematic Mapper Plus (ETM+). However, the Thematic Mapper (TM) sensors on Landsat 4 (L4) and Landsat 5 (L5), launched in 1982 and 1984 respectively, are the backbone of an extensive archive. Effective April 2, 2007, the radiometric calibration of L5 TM data processed and distributed by the U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) was updated to use an improved lifetime gain model, based on the instrument's detector response to pseudo-invariant desert site data and cross-calibration with the L7 ETM+. However, no modifications were ever made to the radiometric calibration procedure of the Landsat 4 (L4) TM data. The L4 TM radiometric calibration procedure has continued to use the Internal Calibrator (IC) based calibration algorithms and the post calibration dynamic ranges, as previously defined. To evaluate the "current" absolute accuracy of these two sensors, image pairs from the L5 TM and L4 TM sensors were compared. The number of coincident image pairs in the USGS EROS archive is limited, so the scene selection for the cross-calibration studies proved to be a challenge. Additionally, because of the lack of near-simultaneous images available over well-characterized and traditionally used calibration sites, alternate sites that have high reflectance, large dynamic range, high spatial uniformity, high sun elevation, and minimal cloud cover were investigated. The alternate sites were identified in Yuma, Iraq, Egypt, Libya, and Algeria. The cross-calibration approach involved comparing image statistics derived from large common areas observed eight days apart by the two sensors. This paper summarizes the average percent differences in reflectance estimates obtained between the two sensors. The work presented in this paper is a first step in understanding the current performance of L4 TM absolute calibration and potentially serves as a platform to revise and improve the radiometric calibration procedures implemented for the processing of L4 TM data.

  2. Consistency of L4 TM absolute calibration with respect to the L5 TM sensor based on near-simultaneous image acquisition

    USGS Publications Warehouse

    Chander, G.; Helder, D.L.; Malla, R.; Micijevic, E.; Mettler, C.J.

    2007-01-01

    The Landsat archive provides more than 35 years of uninterrupted multispectral remotely sensed data of Earth observations. Since 1972, Landsat missions have carried different types of sensors, from the Return Beam Vidicon (RBV) camera to the Enhanced Thematic Mapper Plus (ETM+). However, the Thematic Mapper (TM) sensors on Landsat 4 (L4) and Landsat 5 (L5), launched in 1982 and 1984 respectively, are the backbone of an extensive archive. Effective April 2, 2007, the radiometric calibration of L5 TM data processed and distributed by the U.S. Geological Survey (USGS) Center for Earth Resources Observation and Science (EROS) was updated to use an improved lifetime gain model, based on the instrument's detector response to pseudo-invariant desert site data and cross-calibration with the L7 ETM+. However, no modifications were ever made to the radiometric calibration procedure of the Landsat 4 (L4) TM data. The L4 TM radiometric calibration procedure has continued to use the Internal Calibrator (IC) based calibration algorithms and the post calibration dynamic ranges, as previously defined. To evaluate the "current" absolute accuracy of these two sensors, image pairs from the L5 TM and L4 TM sensors were compared. The number of coincident image pairs in the USGS EROS archive is limited, so the scene selection for the cross-calibration studies proved to be a challenge. Additionally, because of the lack of near-simultaneous images available over well-characterized and traditionally used calibration sites, alternate sites that have high reflectance, large dynamic range, high spatial uniformity, high sun elevation, and minimal cloud cover were investigated. The alternate sites were identified in Yuma, Iraq, Egypt, Libya, and Algeria. The cross-calibration approach involved comparing image statistics derived from large common areas observed eight days apart by the two sensors. This paper summarizes the average percent differences in reflectance estimates obtained between the two sensors. The work presented in this paper is a first step in understanding the current performance of L4 TM absolute calibration and potentially serves as a platform to revise and improve the radiometric calibration procedures implemented for the processing of L4 TM data.
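
    The comparison statistic itself is simple; a minimal sketch of the assumed form, with synthetic stand-in reflectance values for one band over a common area, is given below.

        import numpy as np

        def percent_difference(refl_l4, refl_l5):
            # Average percent difference of L4 TM relative to L5 TM
            # over a large common image area.
            m4, m5 = np.mean(refl_l4), np.mean(refl_l5)
            return 100.0 * (m4 - m5) / m5

        rng = np.random.default_rng(2)
        l5_area = rng.normal(0.35, 0.02, (200, 200))   # bright desert site, synthetic
        l4_area = l5_area * 1.04                       # hypothetical 4% calibration bias
        print(f"band percent difference: {percent_difference(l4_area, l5_area):+.1f}%")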

  3. Radiometric and geometric assessment of data from the RapidEye constellation of satellites

    USGS Publications Warehouse

    Chander, Gyanesh; Haque, Md. Obaidul; Sampath, Aparajithan; Brunn, A.; Trosset, G.; Hoffmann, D.; Roloff, S.; Thiele, M.; Anderson, C.

    2013-01-01

    To monitor land surface processes over a wide range of temporal and spatial scales, it is critical to have coordinated observations of the Earth's surface using imagery acquired from multiple spaceborne imaging sensors. The RapidEye (RE) satellite constellation acquires high-resolution satellite images covering the entire globe within a very short period of time by sensors identical in construction and cross-calibrated to each other. To evaluate the RE high-resolution Multi-spectral Imager (MSI) sensor capabilities, a cross-comparison between the RE constellation of sensors was performed first using image statistics based on large common areas observed over pseudo-invariant calibration sites (PICS) by the sensors and, second, by comparing the on-orbit radiometric calibration temporal trending over a large number of calibration sites. For any spectral band, the individual responses measured by the five satellites of the RE constellation were found to differ <2–3% from the average constellation response depending on the method used for evaluation. Geometric assessment was also performed to study the positional accuracy and relative band-to-band (B2B) alignment of the image data sets. The position accuracy was assessed by comparing the RE imagery against high-resolution aerial imagery, while the B2B characterization was performed by registering each band against every other band to ensure that the proper band alignment is provided for an image product. The B2B results indicate that the internal alignments of these five RE bands are in agreement, with bands typically registered to within 0.25 pixels of each other or better.

  4. Parallel Processing Systems for Passive Ranging During Helicopter Flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Bavavar; Suorsa, Raymond E.; Showman, Robert D. (Technical Monitor)

    1994-01-01

    The complexity of rotorcraft missions involving operations close to the ground results in high pilot workload. In order to allow a pilot time to perform mission-oriented tasks, sensor-aiding and automation of some of the guidance and control functions are highly desirable. Images from an electro-optical sensor provide a covert way of detecting objects in the flight path of a low-flying helicopter. Passive ranging consists of processing a sequence of images using techniques based on optical flow computation and recursive estimation. The passive ranging algorithm has to extract obstacle information from imagery at rates varying from five to thirty or more frames per second, depending on the helicopter speed. We have implemented and tested the passive ranging algorithm off-line using helicopter-collected images. However, the real-time data and computation requirements of the algorithm are beyond the capability of any off-the-shelf microprocessor or digital signal processor. This paper describes the computational requirements of the algorithm and uses parallel processing technology to meet them. Various issues in the selection of a parallel processing architecture are discussed, and four different computer architectures are evaluated regarding their suitability to process the algorithm in real time. Based on this evaluation, we conclude that real-time passive ranging is a realistic goal and can be achieved within a short time.
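
    The underlying geometry can be shown in a toy, recursion-free form (assumptions: pinhole camera, known pure lateral translation between frames, flow already measured); the actual algorithm combines optical flow with recursive estimation over many frames.

        import numpy as np

        focal_px = 800.0   # focal length in pixels (assumed)
        baseline = 0.5     # lateral camera translation between frames (m), assumed

        # Measured horizontal image flow (pixels/frame) for tracked features.
        flow_px = np.array([8.0, 4.0, 2.0, 1.0])

        # Depth from flow under pure lateral motion: u = f * Tx / Z  =>  Z = f * Tx / u
        depth_m = focal_px * baseline / flow_px
        print(depth_m)     # [ 50. 100. 200. 400.] metres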

  5. Adaptive scene-based correction algorithm for removal of residual fixed pattern noise in microgrid image data

    NASA Astrophysics Data System (ADS)

    Ratliff, Bradley M.; LeMaster, Daniel A.

    2012-06-01

    Pixel-to-pixel response nonuniformity is a common problem that affects nearly all focal plane array sensors. This results in a frame-to-frame fixed pattern noise (FPN) that causes an overall degradation in collected data. FPN is often compensated for through the use of blackbody calibration procedures; however, FPN is a particularly challenging problem because the detector responsivities drift relative to one another in time, requiring that the sensor be recalibrated periodically. The calibration process is obstructive to sensor operation and is therefore only performed at discrete intervals in time. Thus, any drift that occurs between calibrations (along with error in the calibration sources themselves) causes varying levels of residual calibration error to be present in the data at all times. Polarimetric microgrid sensors are particularly sensitive to FPN due to the spatial differencing involved in estimating the Stokes vector images. While many techniques exist in the literature to estimate FPN for conventional video sensors, few have been proposed to address the problem in microgrid imaging sensors. Here we present a scene-based nonuniformity correction technique for microgrid sensors that is able to reduce residual fixed pattern noise while preserving radiometry under a wide range of conditions. The algorithm requires a low number of temporal data samples to estimate the spatial nonuniformity and is computationally efficient. We demonstrate the algorithm's performance using real data from the AFRL PIRATE and University of Arizona LWIR microgrid sensors.
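
    A drastically simplified scene-based correction conveys the flavor of such algorithms (a generic temporal-statistics sketch, not the authors' method): if scene motion makes every detector see a similar temporal mean over a short frame window, per-pixel deviations from the global mean estimate the FPN, and subtracting them preserves the overall radiometric level.

        import numpy as np

        def scene_based_nuc(frames):
            # frames: (T, H, W) stack with inter-frame scene motion.
            per_pixel_mean = frames.mean(axis=0)             # temporal mean per detector
            offset = per_pixel_mean - per_pixel_mean.mean()  # deviation = FPN estimate
            return frames - offset                           # remove FPN, keep global level

        rng = np.random.default_rng(3)
        scene = rng.uniform(100.0, 200.0, (256, 64, 64))     # moving-scene stand-in
        fpn = rng.normal(0.0, 5.0, (64, 64))                 # residual fixed pattern noise
        corrected = scene_based_nuc(scene + fpn)
        print(np.std((scene + fpn)[0] - scene[0]),           # ~5 before correction
              np.std(corrected[0] - scene[0]))               # smaller residual after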

  6. x-y curvature wavefront sensor.

    PubMed

    Cagigal, Manuel P; Valle, Pedro J

    2015-04-15

    In this Letter, we propose a new curvature wavefront sensor based on the principles of optical differentiation. The theoretically modeled setup consists of a diffractive optical mask placed at the intermediate plane of a classical two-lens coherent optical processor. The resulting image is composed of a number of local derivatives of the entrance pupil function whose proper combination provides the wavefront curvature. In contrast to common radial curvature sensors, this one is able to provide the x and y wavefront curvature maps simultaneously. The sensor offers additional advantages, such as high spatial resolution, an adjustable dynamic range, and insensitivity to misalignment.

  7. Onboard TDI stage estimation and calibration using SNR analysis

    NASA Astrophysics Data System (ADS)

    Haghshenas, Javad

    2017-09-01

    Electro-optical design of a push-broom space camera for a Low Earth Orbit (LEO) remote sensing satellite is performed based on noise analysis of TDI sensors for very high GSDs and low-light-level missions. It is well demonstrated that the CCD TDI mode of operation provides increased photosensitivity relative to a linear CCD array without sacrificing spatial resolution. However, for satellite imaging, in order to exploit the advantages that the TDI mode of operation offers, attention should be given to the parameters which affect the image quality of TDI sensors, such as jitter, vibration, and noise. A predefined number of TDI stages may not properly satisfy the image quality requirement of the satellite camera. Furthermore, in order to use the whole dynamic range of the sensor, the imager must be capable of setting the TDI stages for every shot based on the affecting parameters. This paper deals with optimally estimating and setting the stages based on trade-offs among MTF, noise and SNR. On-board SNR estimation is simulated using atmosphere analysis based on the MODTRAN algorithm in PcModWin software. According to the noise models, we propose a formulation to estimate the TDI stages such that the system SNR requirement is satisfied. The MTF requirement must be satisfied in the same manner. A proper combination of both parameters guarantees full use of the dynamic range along with high SNR and image quality.
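
    A back-of-envelope version of the stage selection can be written down under a shot-noise-limited model (a sketch with assumed numbers, not the paper's MODTRAN-based formulation): signal grows linearly with the number of stages N while shot noise grows as sqrt(N), so the smallest N meeting the SNR specification is chosen, subject to the full-well (dynamic range) limit.

        import math

        def choose_tdi_stages(signal_e, read_noise_e, snr_required, full_well_e,
                              max_stages=128):
            # signal_e: electrons per pixel per single stage for the current scene.
            for n in range(1, max_stages + 1):
                if n * signal_e > 0.9 * full_well_e:   # keep margin below saturation
                    return max(1, n - 1)
                snr = n * signal_e / math.sqrt(n * signal_e + read_noise_e ** 2)
                if snr >= snr_required:
                    return n                           # smallest N meeting the SNR spec
            return max_stages

        # Example: dim scene, SNR requirement of 100 (all values assumed).
        print(choose_tdi_stages(signal_e=900.0, read_noise_e=50.0,
                                snr_required=100.0, full_well_e=100_000.0))  # -> 14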

  8. Using polynomials to simplify fixed pattern noise and photometric correction of logarithmic CMOS image sensors.

    PubMed

    Li, Jing; Mahmoodi, Alireza; Joseph, Dileepan

    2015-10-16

    An important class of complementary metal-oxide-semiconductor (CMOS) image sensors are those where pixel responses are monotonic nonlinear functions of light stimuli. This class includes various logarithmic architectures, which are easily capable of wide dynamic range imaging, at video rates, but which are vulnerable to image quality issues. To minimize fixed pattern noise (FPN) and maximize photometric accuracy, pixel responses must be calibrated and corrected due to mismatch and process variation during fabrication. Unlike literature approaches, which employ circuit-based models of varying complexity, this paper introduces a novel approach based on low-degree polynomials. Although each pixel may have a highly nonlinear response, an approximately-linear FPN calibration is possible by exploiting the monotonic nature of imaging. Moreover, FPN correction requires only arithmetic, and an optimal fixed-point implementation is readily derived, subject to a user-specified number of bits per pixel. Using a monotonic spline, involving cubic polynomials, photometric calibration is also possible without a circuit-based model, and fixed-point photometric correction requires only a look-up table. The approach is experimentally validated with a logarithmic CMOS image sensor and is compared to a leading approach from the literature. The novel approach proves effective and efficient.
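
    The degree-1 calibration idea can be sketched with synthetic data: each logarithmic pixel is nonlinear in light, but modelling its response as an approximately linear function of the array-mean response makes per-pixel gain/offset fits sufficient for FPN correction. All coefficients below are synthetic; this illustrates the principle, not the paper's implementation.

        import numpy as np

        rng = np.random.default_rng(4)
        n_pix, n_cal = 1000, 12

        stimuli = np.logspace(0, 4, n_cal)          # calibration light levels
        ideal = np.log(stimuli)                     # nominal logarithmic response

        a = rng.normal(0.0, 0.2, n_pix)             # per-pixel offset FPN
        b = rng.normal(1.0, 0.05, n_pix)            # per-pixel gain FPN
        responses = a[:, None] + b[:, None] * ideal[None, :]   # (pixels, levels)

        mean_resp = responses.mean(axis=0)          # reference = array-mean response
        coeffs = np.polyfit(mean_resp, responses.T, 1)   # degree-1 fit per pixel
        b_hat, a_hat = coeffs[0], coeffs[1]

        corrected = (responses - a_hat[:, None]) / b_hat[:, None]
        print(np.std(responses, axis=0).max(),      # FPN across pixels before
              np.std(corrected, axis=0).max())      # ~0 after correction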

  9. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    NASA Astrophysics Data System (ADS)

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half-maximum (FWHM) across the entire dynamic range, and a noise floor about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.

  10. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    PubMed Central

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2014-01-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application, thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel, provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors, incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, and noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications. PMID:25937684

  11. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications.

    PubMed

    Barber, W C; Wessel, J C; Nygard, E; Iwanczyk, J S

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application, thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel, provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors, incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, and noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.
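
    For intuition about the quoted count-rate behaviour, the standard paralyzable dead-time model relates input and output rates as OCR = ICR * exp(-ICR * tau); the resolving time tau below is an assumed figure for illustration, not a measured property of these ASICs.

        import numpy as np

        tau = 5e-9                               # assumed resolving time (s)
        icr = np.array([1e6, 1e7, 2e7, 5e7])     # input counts per second per mm^2
        ocr = icr * np.exp(-icr * tau)           # paralyzable dead-time model
        loss = 100.0 * (1.0 - ocr / icr)         # pile-up loss in percent
        for i, o, l in zip(icr, ocr, loss):
            print(f"ICR {i:.0e}/s -> OCR {o:.2e}/s (pile-up loss {l:.1f}%)")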

  12. Planoconcave optical microresonator sensors for photoacoustic imaging: pushing the limits of sensitivity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2016-03-01

    Most photoacoustic scanners use piezoelectric detectors but these have two key limitations. Firstly, they are optically opaque, inhibiting backward mode operation. Secondly, it is difficult to achieve adequate detection sensitivity with the small element sizes needed to provide near-omnidirectional response as required for tomographic imaging. Planar Fabry-Perot (FP) ultrasound sensing etalons can overcome both of these limitations and have proved extremely effective for superficial (<1 cm) imaging applications. To achieve small element sizes (<100 μm), the etalon is illuminated with a focused laser beam. However, this has the disadvantage that beam walk-off due to the divergence of the beam fundamentally limits the etalon finesse and thus sensitivity - in essence, the problem is one of insufficient optical confinement. To overcome this, novel planoconcave micro-resonator sensors have been fabricated using precision ink-jet printed polymer domes with curvatures matching that of the laser wavefront. By providing near-perfect beam confinement, we show that it is possible to approach the maximum theoretical limit for finesse (f) imposed by the etalon mirror reflectivities (e.g. f=400 for R=99.2% in contrast to a typical planar sensor value of f<50). This yields an order of magnitude increase in sensitivity over a planar FP sensor with the same acoustic bandwidth. Furthermore, by eliminating beam walk-off, viable sensors can be made with significantly greater thickness than planar FP sensors. This provides an additional sensitivity gain for deep tissue imaging applications such as breast imaging where detection bandwidths in the low MHz can be tolerated. For example, for a 250 μm thick planoconcave sensor with a -3 dB bandwidth of 5 MHz, the measured NEP was 4 Pa. This NEP is comparable to that provided by mm-scale piezoelectric detectors used for breast imaging applications but with more uniform frequency response characteristics and an order-of-magnitude smaller element size. Following previous proof-of-concept work, several important advances towards practical application have been made. A family of sensors with bandwidths ranging from 3 MHz to 20 MHz have been fabricated and characterised. A novel interrogation scheme based on rapid wavelength sweeping has been implemented in order to avoid previously encountered instability problems due to self-heating. Finally, a prototype microresonator based photoacoustic scanner has been developed and applied to the problem of deep-tissue (>1 cm) photoacoustic imaging in vivo. Imaging results for second generation microresonator sensors (with R = 99.5% and thickness up to ~800 μm) are compared to the best achievable with the planar FP sensors and piezoelectric receivers.
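
    The reflectivity-limited finesse quoted above follows from the standard Fabry-Perot relation F = pi * sqrt(R) / (1 - R); a few lines reproduce the f ~ 400 figure for R = 99.2%.

        import math

        def finesse(R):
            # Reflectivity-limited finesse of a Fabry-Perot etalon.
            return math.pi * math.sqrt(R) / (1.0 - R)

        for R in (0.95, 0.992, 0.995):
            print(f"R = {R:.3f}  ->  finesse ~ {finesse(R):.0f}")
        # R = 0.992 gives finesse ~ 391, i.e. the f ~ 400 regime quoted above.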

  13. A novel digital image sensor with row wise gain compensation for Hyper Spectral Imager (HySI) application

    NASA Astrophysics Data System (ADS)

    Lin, Shengmin; Lin, Chi-Pin; Wang, Weng-Lyang; Hsiao, Feng-Ke; Sikora, Robert

    2009-08-01

    A 256x512-element digital image sensor has been developed which has a large pixel size, slow scan rate and low power consumption for Hyper Spectral Imager (HySI) applications. The device is a mixed-mode system-on-chip (SoC) IC. It combines analog circuitry, digital circuitry and optical sensor circuitry on a single chip. The chip integrates a 256x512 active pixel sensor array, a programmable gain amplifier (PGA) for row-wise gain setting, an I2C interface, SRAM, a 12-bit analog-to-digital converter (ADC), a voltage regulator, low voltage differential signaling (LVDS) and a timing generator. The device provides 256 pixels of spatial resolution and 512 bands of spectral resolution ranging from 400 nm to 950 nm in wavelength. In row-wise gain readout mode, one can set a different gain on each row of the photodetector by storing the gain-setting data in the SRAM through the I2C interface. This unique row-wise gain setting can be used to compensate for the silicon spectral response non-uniformity. Due to this unique function, the device is well suited for hyperspectral imager applications. The HySI camera on board the Chandrayaan-1 satellite was successfully launched to the moon on Oct. 22, 2008. The device is currently mapping the moon and sending back excellent images of the lunar surface. The device design and the moon image data are presented in the paper.
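
    The row-wise compensation idea reduces to a per-row gain table that is inversely proportional to the silicon spectral response; the sketch below uses a hypothetical Gaussian response and an assumed PGA gain range purely for illustration.

        import numpy as np

        wavelengths = np.linspace(400.0, 950.0, 512)   # one detector row per band (nm)
        # Stand-in for the silicon spectral response (peaked, falling at the edges).
        response = np.exp(-0.5 * ((wavelengths - 650.0) / 180.0) ** 2)

        row_gain = response.max() / response           # per-row gain table (to SRAM)
        row_gain = np.clip(row_gain, 1.0, 8.0)         # assumed PGA gain range

        scene = np.ones(512)                           # spectrally flat test input
        raw = scene * response                         # uncompensated row signals
        compensated = raw * row_gain
        print(raw.min() / raw.max(), compensated.min() / compensated.max())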

  14. Field Calibration of Wind Direction Sensor to the True North and Its Application to the Daegwanryung Wind Turbine Test Sites

    PubMed Central

    Lee, Jeong Wan

    2008-01-01

    This paper proposes a field calibration technique for aligning a wind direction sensor to the true north. The proposed technique uses synchronized measurements of images captured by a camera and the output voltage of the wind direction sensor. The true wind direction was evaluated through image processing of the captured pictures of the sensor in the least-squares sense. The evaluated true value was then compared with the measured output voltage of the sensor. This technique solves the discordance problem of the wind direction sensor that arises when the meteorological mast is installed. For the proposed technique, some uncertainty analyses are presented and the calibration accuracy is discussed. Finally, the proposed technique was applied to the real meteorological mast at the Daegwanryung test site, and statistical analysis of the experimental results estimated the stable misalignment value and its uncertainty level. In a strict sense, it is confirmed that the error range of the misalignment from true north can be expected to decrease within the credibility level. PMID:27873957

  15. Pulse Based Time-of-Flight Range Sensing.

    PubMed

    Sarbolandi, Hamed; Plack, Markus; Kolb, Andreas

    2018-05-23

    Pulse-based Time-of-Flight (PB-ToF) cameras are an attractive alternative range imaging approach, compared to the widely commercialized Amplitude Modulated Continuous-Wave Time-of-Flight (AMCW-ToF) approach. This paper presents an in-depth evaluation of a PB-ToF camera prototype based on the Hamamatsu area sensor S11963-01CR. We evaluate different ToF-related effects, i.e., temperature drift, systematic error, depth inhomogeneity, multi-path effects, and motion artefacts. Furthermore, we evaluate the systematic error of the system in more detail, and introduce novel concepts to improve the quality of range measurements by modifying the mode of operation of the PB-ToF camera. Finally, we describe the means of measuring the gate response of the PB-ToF sensor and using this information for PB-ToF sensor simulation.
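
    For readers unfamiliar with pulse gating, one common PB-ToF scheme (a generic sketch, not necessarily the exact mode of the Hamamatsu sensor) integrates the returning pulse of width T_p in two consecutive gates; the charge ratio encodes the pulse delay and hence the range.

        C = 299_792_458.0   # speed of light (m/s)

        def pb_tof_range(q1, q2, pulse_width_s):
            # Range from two gated charges; valid while the echo straddles the gates.
            return 0.5 * C * pulse_width_s * q2 / (q1 + q2)

        # Example: 30 ns pulse, echo charge split 70/30 between the two gates.
        print(f"range = {pb_tof_range(q1=0.7, q2=0.3, pulse_width_s=30e-9):.2f} m")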

  16. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access and global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposures and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to one, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies, both because of their size and because of their higher speed.

  17. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    PubMed

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost insensitive response over a ±20° angular range. This technology enables shrinking of the pixel sizes down to near a micrometer.

  18. Wavefront Derived Refraction and Full Eye Biometry in Pseudophakic Eyes

    PubMed Central

    Mao, Xinjie; Banta, James T.; Ke, Bilian; Jiang, Hong; He, Jichang; Liu, Che; Wang, Jianhua

    2016-01-01

    Purpose To assess wavefront derived refraction and full eye biometry including ciliary muscle dimension and full eye axial geometry in pseudophakic eyes using spectral domain OCT equipped with a Shack-Hartmann wavefront sensor. Methods Twenty-eight adult subjects (32 pseudophakic eyes) having recently undergone cataract surgery were enrolled in this study. A custom system combining two optical coherence tomography systems with a Shack-Hartmann wavefront sensor was constructed to image and monitor changes in whole eye biometry, the ciliary muscle and ocular aberration in the pseudophakic eye. A Badal optical channel and a visual target aligning with the wavefront sensor were incorporated into the system for measuring the wavefront-derived refraction. The imaging acquisition was performed twice. The coefficients of repeatability (CoR) and intraclass correlation coefficient (ICC) were calculated. Results Images were acquired and processed successfully in all patients. No significant difference was detected between repeated measurements of ciliary muscle dimension, full-eye biometry or defocus aberration. The CoR of full-eye biometry ranged from 0.36% to 3.04% and the ICC ranged from 0.981 to 0.999. The CoR for ciliary muscle dimensions ranged from 12.2% to 41.6% and the ICC ranged from 0.767 to 0.919. The defocus aberrations of the two measurements were 0.443 ± 0.534 D and 0.447 ± 0.586 D and the ICC was 0.951. Conclusions The combined system is capable of measuring full eye biometry and refraction with good repeatability. The system is suitable for future investigation of pseudoaccommodation in the pseudophakic eye. PMID:27010674

  19. Wavefront Derived Refraction and Full Eye Biometry in Pseudophakic Eyes.

    PubMed

    Mao, Xinjie; Banta, James T; Ke, Bilian; Jiang, Hong; He, Jichang; Liu, Che; Wang, Jianhua

    2016-01-01

    To assess wavefront derived refraction and full eye biometry including ciliary muscle dimension and full eye axial geometry in pseudophakic eyes using spectral domain OCT equipped with a Shack-Hartmann wavefront sensor. Twenty-eight adult subjects (32 pseudophakic eyes) having recently undergone cataract surgery were enrolled in this study. A custom system combining two optical coherence tomography systems with a Shack-Hartmann wavefront sensor was constructed to image and monitor changes in whole eye biometry, the ciliary muscle and ocular aberration in the pseudophakic eye. A Badal optical channel and a visual target aligning with the wavefront sensor were incorporated into the system for measuring the wavefront-derived refraction. The imaging acquisition was performed twice. The coefficients of repeatability (CoR) and intraclass correlation coefficient (ICC) were calculated. Images were acquired and processed successfully in all patients. No significant difference was detected between repeated measurements of ciliary muscle dimension, full-eye biometry or defocus aberration. The CoR of full-eye biometry ranged from 0.36% to 3.04% and the ICC ranged from 0.981 to 0.999. The CoR for ciliary muscle dimensions ranged from 12.2% to 41.6% and the ICC ranged from 0.767 to 0.919. The defocus aberrations of the two measurements were 0.443 ± 0.534 D and 0.447 ± 0.586 D and the ICC was 0.951. The combined system is capable of measuring full eye biometry and refraction with good repeatability. The system is suitable for future investigation of pseudoaccommodation in the pseudophakic eye.

  20. Vision based obstacle detection and grouping for helicopter guidance

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Chatterji, Gano

    1993-01-01

    Electro-optical sensors can be used to compute range to objects in the flight path of a helicopter. The computation is based on the optical flow/motion at different points in the image. The motion algorithms provide a sparse set of ranges to discrete features in the image sequence as a function of azimuth and elevation. For obstacle avoidance guidance and display purposes, this discrete set of ranges, varying from a few hundred to several thousand points, needs to be grouped into sets which correspond to objects in the real world. This paper presents a new method for object segmentation based on clustering the sparse range information provided by motion algorithms together with the spatial relation provided by the static image. The range values are initially grouped into clusters based on depth. Subsequently, the clusters are modified by using the K-means algorithm in the inertial horizontal plane and the minimum spanning tree algorithm in the image plane. The object grouping allows interpolation within a group and enables the creation of dense range maps. Researchers in robotics have used densely scanned sequences of laser range images to build three-dimensional representations of the outside world. Thus, modeling techniques developed for dense range images can be extended to sparse range images. The paper presents object segmentation results for a sequence of flight images.
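
    The refinement stage of such a grouping can be sketched with a few k-means iterations over sparse (x, depth) feature points (a simplified stand-in for the paper's depth binning plus K-means/minimum-spanning-tree pipeline; all coordinates are synthetic).

        import numpy as np

        def kmeans(points, k, iters=20):
            # Deterministic init: spread initial centers across the point list.
            centers = points[:: max(1, len(points) // k)][:k].copy()
            for _ in range(iters):
                d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
                labels = np.argmin(d, axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = points[labels == j].mean(axis=0)
            return labels, centers

        rng = np.random.default_rng(5)
        # Two obstacles: sparse (x, depth) feature points near 40 m and 95 m.
        obstacle_a = rng.normal([-5.0, 40.0], [2.0, 3.0], (30, 2))
        obstacle_b = rng.normal([12.0, 95.0], [2.0, 4.0], (30, 2))
        points = np.vstack([obstacle_a, obstacle_b])

        labels, centers = kmeans(points, k=2)
        print(np.round(centers, 1))   # one center per detected object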

  1. A real-time 3D range image sensor based on a novel tip-tilt-piston micromirror and dual frequency phase shifting

    NASA Astrophysics Data System (ADS)

    Skotheim, Øystein; Schumann-Olsen, Henrik; Thorstensen, Jostein; Kim, Anna N.; Lacolle, Matthieu; Haugholt, Karl-Henrik; Bakke, Thor

    2015-03-01

    Structured light is a robust and accurate method for 3D range imaging in which one or more light patterns are projected onto the scene and observed with an off-axis camera. Commercial sensors typically utilize DMD- or LCD-based LED projectors, which produce good results but have a number of drawbacks, e.g. limited speed, limited depth of focus, high sensitivity to ambient light and somewhat low light efficiency. We present a 3D imaging system based on a laser light source and a novel tip-tilt-piston micro-mirror. Optical interference is utilized to create sinusoidal fringe patterns. The setup allows fast and easy control of both the frequency and the phase of the fringe patterns by altering the axes of the micro-mirror. For 3D reconstruction we have adapted a Dual Frequency Phase Shifting method which gives robust range measurements with sub-millimeter accuracy. The use of interference for generating sine patterns provides high light efficiency and good focusing properties. The use of a laser and a bandpass filter allows easy removal of ambient light. The fast response of the micro-mirror in combination with a high-speed camera and real-time processing on the GPU allows highly accurate 3D range image acquisition at video rates.
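
    The reconstruction step can be illustrated in its generic textbook form (assumed fringe frequencies, noise-free synthetic data; the paper's implementation runs this per pixel on the GPU): a four-step phase shift yields the wrapped phase of the fine pattern, and a coarse second frequency resolves the 2-pi fringe order.

        import numpy as np

        def wrapped_phase(i0, i1, i2, i3):
            # Four-step phase shift: I_k = A + B*cos(phi + k*pi/2), k = 0..3.
            return np.arctan2(i3 - i1, i0 - i2)

        def simulate(phi):
            return [1.0 + 0.5 * np.cos(phi + k * np.pi / 2.0) for k in range(4)]

        true_phase = np.linspace(0.0, 40.0, 1000)   # absolute phase, several fringes
        f_unit = 1.0 / 16.0                         # coarse (unit) pattern frequency

        phi_fine = wrapped_phase(*simulate(true_phase))             # fine, wrapped
        phi_coarse = wrapped_phase(*simulate(true_phase * f_unit))  # coarse, unambiguous

        # The coarse phase predicts the fringe order of the fine phase.
        order = np.round((phi_coarse / f_unit - phi_fine) / (2.0 * np.pi))
        unwrapped = phi_fine + 2.0 * np.pi * order
        print(np.max(np.abs(unwrapped - true_phase)))   # ~0 for noise-free data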

  2. Measurements of SWIR backgrounds using the swux unit of measure

    NASA Astrophysics Data System (ADS)

    Richards, A.; Hübner, M.; Vollmer, M.

    2018-04-01

    The SWIR waveband between 0.8 μm and 1.8 μm is increasingly exploited by imaging systems in a variety of different applications, including persistent imaging for security and surveillance of high-value assets, handheld tactical imagers, range-gated imaging systems and imaging LADAR for driverless vehicles. The vast majority of these applications utilize lattice-matched InGaAs detectors in their imaging sensors, and these sensors are rapidly falling in price, leading to their widening adoption. As these sensors are used in novel applications and locations, it is important that ambient SWIR backgrounds be understood and characterized for a variety of different field conditions, primarily for the purposes of system performance modeling of SNR and range metrics. SWIR irradiance backgrounds do not consistently track visible-light illumination. There is currently little of this type of information in the open literature, particularly measurements of SWIR backgrounds in urban areas, natural areas, or indoors. This paper presents field measurements made with an InGaAs detector calibrated in the swux unit of InGaAs-band-specific irradiance proposed by two of the authors in 2017. Simultaneous measurements of illuminance levels (in lux) at these sites are presented, as well as visible and InGaAs camera images of the scenery at some of these measurement sites. The swux and lux measurement hardware is described, along with the methods used to calibrate it. Finally, the swux levels during the partial and total phases of the total solar eclipse of 2017 are presented, along with curves fitted to the data from a theoretical model based on obscuration of the sun by the moon. The apparent differences between photometric and swux measurements are discussed.

  3. Flash LIDAR Systems for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Dissly, Richard; Weinberg, J.; Weimer, C.; Craig, R.; Earhart, P.; Miller, K.

    2009-01-01

    Ball Aerospace offers a mature, highly capable 3D flash-imaging LIDAR system for planetary exploration. Multi-mission applications include orbital, standoff and surface terrain mapping; long-distance and rapid close-in ranging; descent and surface navigation; and rendezvous and docking. Our flash LIDAR is an optical, time-of-flight, topographic imaging system, leveraging innovations in focal plane arrays, real-time readout integrated circuit processing, and compact, efficient pulsed laser sources. Due to its modular design, it can be easily tailored to satisfy a wide range of mission requirements. Flash LIDAR offers several distinct advantages over traditional scanning systems. The entire scene within the sensor's field of view is imaged with a single laser flash. This directly produces an image with each pixel already correlated in time, making the sensor resistant to the relative motion of a target subject. Additionally, images may be produced at rates much faster than are possible with a scanning system. And because the system captures a new complete image with each flash, optical glint and clutter are easily filtered and discarded. This allows for imaging under any lighting condition and makes the system virtually insensitive to stray light. Finally, because there are no moving parts, our flash LIDAR system is highly reliable and has a long life expectancy. As an industry leader in laser active sensor system development, Ball Aerospace has been working for more than four years to mature flash LIDAR systems for space applications, and is now under contract to provide the Vision Navigation System for NASA's Orion spacecraft. Our system uses heritage optics and electronics from our star tracker products, and space-qualified lasers similar to those used in our CALIPSO LIDAR, which has been in continuous operation since 2006, providing more than 1.3 billion laser pulses to date.

  4. High Dynamic Range Spectral Imaging Pipeline For Multispectral Filter Array Cameras.

    PubMed

    Lapray, Pierre-Jean; Thomas, Jean-Baptiste; Gouton, Pierre

    2017-06-03

    Spectral filter array imaging exhibits a strong similarity with color filter array imaging. This permits us to embed the technology in practical vision systems with little adaptation of existing solutions. In this communication, we define an imaging pipeline that permits high dynamic range (HDR) spectral imaging, extended from color filter arrays. We propose an implementation of this pipeline on a prototype sensor and evaluate the quality of our implementation results on real data with objective metrics and visual examples. We demonstrate that we reduce noise and, in particular, solve the problem of noise generated by the lack of energy balance. Data are provided to the community in an image database for further research.
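
    The record does not give the pipeline's exact fusion step, but a minimal sketch of the generic HDR merge it extends from color filter arrays could look as follows; the hat-shaped weighting and the floor/saturation thresholds are illustrative assumptions, not the authors' implementation:

    ```python
    import numpy as np

    def merge_hdr(exposures, times, floor=0.05, sat=0.95):
        """Merge linear images (values normalized to [0, 1]) taken at different
        exposure times into one HDR radiance map, down-weighting pixels that
        are nearly dark or nearly saturated."""
        num = np.zeros_like(exposures[0], dtype=float)
        den = np.zeros_like(exposures[0], dtype=float)
        for img, t in zip(exposures, times):
            img = np.asarray(img, dtype=float)
            # hat-shaped weight: peaks mid-range, zero outside [floor, sat]
            w = np.clip(np.minimum(img - floor, sat - img) / (0.5 * (sat - floor)), 0.0, 1.0)
            num += w * img / t   # divide by exposure time -> common radiance scale
            den += w
        return num / np.maximum(den, 1e-12)
    ```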

  5. Best-next-view algorithm for three-dimensional scene reconstruction using range images

    NASA Astrophysics Data System (ADS)

    Banta, J. E.; Zhien, Yu; Wang, X. Z.; Zhang, G.; Smith, M. T.; Abidi, Mongi A.

    1995-10-01

    The primary focus of the research detailed in this paper is to develop an intelligent sensing module capable of automatically determining the optimal next sensor position and orientation during scene reconstruction. To facilitate a solution to this problem, we have assembled a system for reconstructing a 3D model of an object or scene from a sequence of range images. Candidates for the best-next-view position are determined by detecting and measuring occlusions to the range camera's view in an image. Ultimately, the candidate that will reveal the greatest amount of unknown scene information is selected as the best-next-view position. Our algorithm uses ray tracing to determine how much new information a given sensor perspective will reveal. We have tested our algorithm successfully on several synthetic range data streams and found the system's results to be consistent with an intuitive human search. The models recovered by our system from range data compared well with the ideal models. Essentially, we have proven that range information of physical objects can be employed to automatically reconstruct a satisfactory dynamic 3D computer model at minimal computational expense. This has obvious implications in the contexts of robot navigation, manufacturing, and hazardous materials handling. The algorithm we developed requires no a priori information to find the best-next-view position.
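
    A minimal sketch of the scoring step described above, assuming an occupancy-grid scene representation and a simple ray-marching loop (the paper's actual ray tracer and occlusion measurement are more elaborate):

    ```python
    import numpy as np

    UNKNOWN, EMPTY, OCCUPIED = 0, 1, 2

    def score_view(grid, origin, directions, step=0.5, max_range=100.0):
        """Count the unknown voxels a candidate sensor pose would reveal by
        marching rays through an occupancy grid until they leave the volume
        or hit a known surface."""
        revealed = set()
        for d in directions:                          # unit vectors sampling the FOV
            for r in np.arange(step, max_range, step):
                idx = tuple(np.floor(origin + r * d).astype(int))
                if not all(0 <= i < n for i, n in zip(idx, grid.shape)):
                    break                             # ray left the volume
                if grid[idx] == OCCUPIED:
                    break                             # blocked by a known surface
                if grid[idx] == UNKNOWN:
                    revealed.add(idx)
        return len(revealed)

    # best next view = the candidate pose maximizing newly revealed unknown volume:
    # best = max(candidates, key=lambda c: score_view(grid, c.origin, c.directions))
    ```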

  6. Orbital Express Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Ricky; Heaton, Andy; Pinson, Robin; Carrington, Connie

    2008-01-01

    In May 2007 the first US fully autonomous rendezvous and capture was successfully performed by DARPA's Orbital Express (OE) mission. Since then, the Boeing ASTRO spacecraft and the Ball Aerospace NEXTSat have performed multiple rendezvous and docking maneuvers to demonstrate the technologies needed for satellite servicing. MSFC's Advanced Video Guidance Sensor (AVGS) is a primary near-field proximity operations sensor integrated into ASTRO's Autonomous Rendezvous and Capture Sensor System (ARCSS), which provides relative state knowledge to the ASTRO GN&C system. This paper provides an overview of the AVGS sensor flying on Orbital Express and a summary of the ground testing and on-orbit performance of the AVGS for OE. The AVGS is a laser-based system that is capable of providing range and bearing at midrange distances and full six degree-of-freedom (6DOF) knowledge at near field. The sensor fires lasers at two different frequencies to illuminate the Long Range Targets (LRTs) and the Short Range Targets (SRTs) on NEXTSat. Subtraction of one image from the other removes extraneous light sources and reflections from anything other than the corner cubes on the LRTs and SRTs. This feature has played a significant role for Orbital Express in poor lighting conditions. The very bright spots that remain in the subtracted image are processed by the target recognition algorithms and the inverse-perspective algorithms to provide 3DOF or 6DOF relative state information. Although Orbital Express has configured the ASTRO ARCSS system to use AVGS only at ranges of 120 m or less, some OE scenarios have provided opportunities for AVGS to acquire and track NEXTSat at greater distances. Orbital Express scenarios to date that have utilized AVGS include a berthing operation performed by the ASTRO robotic arm, sensor checkout maneuvers performed by the ASTRO robotic arm, 10-m unmated operations, 30-m unmated operations, and Scenario 3-1 anomaly recovery. The AVGS performed very well during the pre-unmated operations, effectively tracking beyond its 10-degree pitch and yaw limit specifications, and did not require I-LOAD adjustments before unmated operations. AVGS provided excellent performance in the 10-m unmated operations, effectively tracking and maintaining lock for the duration of the scenario and showing good agreement between the short and long range targets. During the 30-m unmated operations, the AVGS continuously tracked the SRT to 31.6 m, exceeding expectations, and continuously tracked the LRT from 8.8 m out to 31.6 m, with good agreement between the two target solutions. After this scenario was aborted at a 10-m separation during remate operations, the AVGS tracked the LRT out to 54.3 m, until the relative attitude between the vehicles became too large. The vehicles remained apart for eight days, at ranges from 1 km to 6 km. During the approach to remate in this recovery operation, the AVGS began tracking the LRT at 150 m, well beyond the OE planned limits for AVGS ranges, and functioned as the primary sensor for the autonomous rendezvous and docking.
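
    The image-subtraction step can be illustrated with a short sketch. Note that the real AVGS differences frames taken at two laser frequencies that interact differently with the filtered retroreflectors; this generic version simply differences two frames and thresholds the result:

    ```python
    import numpy as np

    def isolate_targets(frame_a, frame_b, threshold=50):
        """Difference two frames so that light common to both (background,
        stray reflections) cancels, leaving only the bright target returns;
        then threshold to get a mask of candidate spots."""
        diff = frame_a.astype(np.int32) - frame_b.astype(np.int32)
        return np.clip(diff, 0, None) > threshold
    ```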

  7. The fast and accurate 3D-face scanning technology based on laser triangle sensors

    NASA Astrophysics Data System (ADS)

    Wang, Jinjiang; Chang, Tianyu; Ge, Baozhen; Tian, Qingguo; Chen, Yang; Kong, Bin

    2013-08-01

    A laser triangulation scanning method and the structure of a 3D face measurement system are introduced. In the presented system, a line laser source is used as the optical probe so that an entire line is scanned at a time, and a CCD image sensor captures the image of the laser line modulated by the human face. The system parameters were obtained by calibration: the lens parameters of the imaging section were calibrated with a machine-vision method, and the triangulation structure parameters were calibrated using finely spaced parallel wires. The CCD imaging section and the line laser indicator are mounted on a linear motor carriage, which sweeps the laser line from the top of the head to the neck. Because the nose protrudes and the eyes are recessed, a single CCD image sensor cannot capture the complete image of the laser line; in this system, two CCD image sensors are therefore placed symmetrically on either side of the laser indicator, so the structure effectively contains two laser triangulation measurement units. Another novel design choice is the arrangement of three laser indicators to reduce scanning time, since it is difficult for a person to remain still for long. The 3D data are calculated after scanning, and further processing includes 3D coordinate refinement, mesh generation, and surface rendering. Experiments show that the system has a simple structure, high scanning speed, and good accuracy. The scanning range covers the whole head of an adult, with a typical resolution of 0.5 mm.
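
    A minimal sketch of the triangulation geometry underlying a single camera/laser pair, under an idealized pinhole model; the symbols and this specific configuration are assumptions for illustration, not the paper's calibration procedure:

    ```python
    import numpy as np

    def triangulate_depth(x_px, pixel_pitch, f, baseline, theta):
        """Depth of a laser-line point from its image offset.

        The camera sits at the origin looking along +z with focal length f (mm);
        the laser emitter sits at lateral offset `baseline` (mm), tilted by
        `theta` (rad) toward the optical axis.  The lit point lies at
        X = baseline - z*tan(theta) and projects to x = f*X/z, which solves to
            z = f * baseline / (x + f * tan(theta)).
        """
        x = x_px * pixel_pitch                 # sensor coordinate in mm
        return f * baseline / (x + f * np.tan(theta))
    ```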

  8. A screen-printed flexible flow sensor

    NASA Astrophysics Data System (ADS)

    Moschos, A.; Syrovy, T.; Syrova, L.; Kaltsas, G.

    2017-04-01

    A thermal flow sensor was printed on a flexible plastic substrate using exclusively screen-printing techniques. The presented device was implemented with custom-made screen-printed thermistors, which allows simple, cost-efficient production on a variety of flexible substrates while maintaining the typical advantages of thermal flow sensors. Evaluation was performed under both static (zero flow) and dynamic conditions using a combination of electrical measurements and IR imaging techniques in order to determine important characteristics, such as temperature response and output repeatability. The flow sensor was characterized utilizing the hot-wire and calorimetric principles of operation, and the preliminary results are promising: the sensor displayed adequate sensitivity over a relatively wide flow range.
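
    The hot-wire principle mentioned above is conventionally calibrated with King's law, E^2 = A + B * U^n. A short sketch of such a fit (the exponent n = 0.5 and the least-squares approach are textbook choices, not taken from the paper):

    ```python
    import numpy as np

    def fit_kings_law(velocities, voltages, n=0.5):
        """Fit E^2 = A + B * U^n to calibration data by linear least squares
        and return an inverse function mapping sensor voltage to velocity."""
        U = np.asarray(velocities, dtype=float)
        E = np.asarray(voltages, dtype=float)
        B, A = np.polyfit(U**n, E**2, 1)       # slope B, intercept A
        def velocity(e):
            return np.maximum((np.asarray(e, dtype=float)**2 - A) / B, 0.0) ** (1.0 / n)
        return A, B, velocity
    ```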

  9. Silicon Based Schottky Barrier Infrared Sensors For Power System And Industrial Applications

    NASA Astrophysics Data System (ADS)

    Elabd, Hammam; Kosonocky, Walter F.

    1984-03-01

    Schottky barrier infrared charge-coupled device sensors (IR-CCDs) have been developed. PtSi Schottky barrier detectors require cooling to liquid nitrogen temperature and cover the wavelength range between 1 and 6 μm. The PtSi IR-CCDs can be used in industrial thermography with NEΔT below 0.1°C. Pd2Si Schottky-barrier detectors require cooling to 145 K and cover the spectral range between 1 and 3.5 μm. Pd2Si IR-CCDs can be used in imaging high-temperature scenes with NEΔT around 100°C. Several high-density staring area and line imagers are available. Both interlaced and noninterlaced area imagers can be operated with variable and TV-compatible frame rates as well as various field-of-view angles. The advantages of silicon fabrication technology in terms of cost and high-density structures open the door to the design of special-purpose thermal camera systems for a number of power system and industrial applications.

  10. Validation of Special Sensor Ultraviolet Limb Imager (SSULI) Ionospheric Tomography using ALTAIR Incoherent Scatter Radar Measurements

    NASA Astrophysics Data System (ADS)

    Dymond, K.; Nicholas, A. C.; Budzien, S. A.; Stephan, A. W.; Coker, C.; Hei, M. A.; Groves, K. M.

    2015-12-01

    The Special Sensor Ultraviolet Limb Imager (SSULI) instruments are ultraviolet limb scanning sensors flying on the Defense Meteorological Satellite Program (DMSP) satellites. The SSULIs observe the 80-170 nanometer wavelength range covering emissions at 91 and 136 nm, which are produced by radiative recombination of the ionosphere. We invert these emissions tomographically using newly developed algorithms that include optical depth effects due to pure absorption and resonant scattering. We present the details of our approach including how the optimal altitude and along-track sampling were determined and the newly developed approach we are using for regularizing the SSULI tomographic inversions. Finally, we conclude with validations of the SSULI inversions against ALTAIR incoherent scatter radar measurements and demonstrate excellent agreement between the measurements.

  11. Surface Plasmon Resonance Imaging biosensor for cystatin determination based on the application of bromelain, ficin and chymopapain.

    PubMed

    Gorodkiewicz, Ewa; Breczko, Joanna; Sankiewicz, Anna

    2012-04-24

    A Surface Plasmon Resonance Imaging (SPRI) sensor based on bromelain, chymopapain or ficin has been developed for specific cystatin determination. Cystatin was captured from solution by immobilized bromelain, chymopapain or ficin through the formation of an enzyme-inhibitor complex on the biosensor surface. The influence of bromelain, chymopapain or ficin concentration, as well as the pH of the interaction, on the SPRI signal was investigated and optimized. The sensor's dynamic response range is 0-0.6 μg/ml and the detection limit is 0.1 μg/ml. To demonstrate the sensor's potential, cystatin was determined in blood plasma, urine and saliva, showing good agreement with data reported in the literature.

  12. Three-dimensional cascaded system analysis of a 50 µm pixel pitch wafer-scale CMOS active pixel sensor x-ray detector for digital breast tomosynthesis.

    PubMed

    Zhao, C; Vassiljev, N; Konstantinidis, A C; Speller, R D; Kanicki, J

    2017-03-07

    High-resolution, low-noise x-ray detectors based on the complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology have been developed and proposed for digital breast tomosynthesis (DBT). In this study, we evaluated the three-dimensional (3D) imaging performance of a 50 µm pixel pitch CMOS APS x-ray detector named DynAMITe (Dynamic Range Adjustable for Medical Imaging Technology). The two-dimensional (2D) angle-dependent modulation transfer function (MTF), normalized noise power spectrum (NNPS), and detective quantum efficiency (DQE) were experimentally characterized and modeled using the cascaded system analysis at oblique incident angles up to 30°. The cascaded system model was extended to the 3D spatial frequency space in combination with the filtered back-projection (FBP) reconstruction method to calculate the 3D and in-plane MTF, NNPS and DQE parameters. The results demonstrate that the beam obliquity blurs the 2D MTF and DQE in the high spatial frequency range. However, this effect can be eliminated after FBP image reconstruction. In addition, impacts of the image acquisition geometry and detector parameters were evaluated using the 3D cascaded system analysis for DBT. The result shows that a wider projection angle range (e.g. ±30°) improves the low spatial frequency (below 5 mm-1) performance of the CMOS APS detector. In addition, to maintain a high spatial resolution for DBT, a focal spot size of smaller than 0.3 mm should be used. Theoretical analysis suggests that a pixelated scintillator in combination with the 50 µm pixel pitch CMOS APS detector could further improve the 3D image resolution. Finally, the 3D imaging performance of the CMOS APS and an indirect amorphous silicon (a-Si:H) thin-film transistor (TFT) passive pixel sensor (PPS) detector was simulated and compared.

  13. Three-dimensional cascaded system analysis of a 50 µm pixel pitch wafer-scale CMOS active pixel sensor x-ray detector for digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Zhao, C.; Vassiljev, N.; Konstantinidis, A. C.; Speller, R. D.; Kanicki, J.

    2017-03-01

    High-resolution, low-noise x-ray detectors based on the complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology have been developed and proposed for digital breast tomosynthesis (DBT). In this study, we evaluated the three-dimensional (3D) imaging performance of a 50 µm pixel pitch CMOS APS x-ray detector named DynAMITe (Dynamic Range Adjustable for Medical Imaging Technology). The two-dimensional (2D) angle-dependent modulation transfer function (MTF), normalized noise power spectrum (NNPS), and detective quantum efficiency (DQE) were experimentally characterized and modeled using the cascaded system analysis at oblique incident angles up to 30°. The cascaded system model was extended to the 3D spatial frequency space in combination with the filtered back-projection (FBP) reconstruction method to calculate the 3D and in-plane MTF, NNPS and DQE parameters. The results demonstrate that the beam obliquity blurs the 2D MTF and DQE in the high spatial frequency range. However, this effect can be eliminated after FBP image reconstruction. In addition, impacts of the image acquisition geometry and detector parameters were evaluated using the 3D cascaded system analysis for DBT. The result shows that a wider projection angle range (e.g.  ±30°) improves the low spatial frequency (below 5 mm-1) performance of the CMOS APS detector. In addition, to maintain a high spatial resolution for DBT, a focal spot size of smaller than 0.3 mm should be used. Theoretical analysis suggests that a pixelated scintillator in combination with the 50 µm pixel pitch CMOS APS detector could further improve the 3D image resolution. Finally, the 3D imaging performance of the CMOS APS and an indirect amorphous silicon (a-Si:H) thin-film transistor (TFT) passive pixel sensor (PPS) detector was simulated and compared.
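
    For reference, the frequency-dependent DQE used in both of the records above follows the standard cascaded-system relation DQE(f) = MTF(f)^2 / (q · NNPS(f)), where q is the incident photon fluence of the flat-field measurement. A minimal sketch (function and argument names are illustrative):

    ```python
    import numpy as np

    def dqe(mtf, nnps, fluence):
        """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS = NPS / mean_signal^2
        and q the incident photon fluence of the flat-field measurement."""
        mtf = np.asarray(mtf, dtype=float)
        nnps = np.asarray(nnps, dtype=float)
        return mtf**2 / (fluence * nnps)
    ```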

  14. Three-dimensional photoacoustic imaging of vascular anatomy in small animals using an optical detection system

    NASA Astrophysics Data System (ADS)

    Zhang, Edward Z.; Laufer, Jan; Beard, Paul

    2007-02-01

    A 3D photoacoustic imaging instrument for characterising small animal models of human disease processes has been developed. The system comprises an OPO excitation source and a backward-mode planar ultrasound imaging head based upon a Fabry-Perot polymer film sensing interferometer (FPI). The mirrors of the latter are transparent between 590 and 1200 nm but highly reflective between 1500 and 1600 nm. This enables nanosecond excitation laser pulses in the former wavelength range, where biological tissues are relatively transparent, to be transmitted through the sensor head into the tissue. The resulting photoacoustic signals arrive at the sensor, where they modulate the optical thickness of the FPI and therefore its reflectivity. By scanning a CW focused interrogating laser beam at 1550 nm across the surface of the sensor, the spatial-temporal distribution of the photoacoustic signals can be mapped in 2D, enabling a 3D photoacoustic image to be reconstructed. To demonstrate the application of the system to imaging small animals such as mice, 3D images of the vascular anatomy of the mouse brain and the microvasculature in the skin around the abdomen were obtained non-invasively. It is considered that this system provides a practical alternative to photoacoustic scanners based upon piezoelectric detectors for high-resolution non-invasive small animal imaging.

  15. Ultrahigh sensitivity endoscopic camera using a new CMOS image sensor: providing with clear images under low illumination in addition to fluorescent images.

    PubMed

    Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio

    2014-11-01

    We developed a new ultrahigh-sensitivity CMOS camera using a specific sensor that has a wide range of spectral sensitivity characteristics. The objective of this study is to present our updated endoscopic technology, which successfully integrates two innovative functions: ultrasensitive imaging and advanced fluorescence viewing. Two different experiments were conducted. One evaluated the function of the ultrahigh-sensitivity camera; the other tested the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscope tip to the target was varied, and endoscopic images were taken at each setting for comparison. In the first experiment, the 3-CCD camera failed to display clear images under low illumination, and the target was hardly visible. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, the image quality given by the two cameras was quite similar. In the second experiment, as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescence-activated organs. The ultrahigh-sensitivity CMOS HD endoscopic camera is expected to provide clear images under low illumination, in addition to fluorescent images under high illumination, in the field of laparoscopic surgery.

  16. Modeling of 1.5 μm range gated imaging for small surface vessel identification

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Steinvall, Ove; Elmquist, Magnus; Karlsson, Kjell

    2010-10-01

    Within the framework of the NATO group (NATO SET-132/RTG-72) on imaging ladars, a test was performed to collect simultaneous multi-mode LADAR signatures of maritime objects entering and leaving San Diego Harbor. Besides ladars, passive sensors were also employed during the test, which took place in April 2009 at Point Loma and in the harbor of San Diego. This paper reports on 1.5 μm gated imaging of a number of small civilian surface vessels, with the aim of presenting human perception experimental results and comparisons with sensor performance models developed by US Army RDECOM CERDEC NVESD. We use controlled human perception tests to measure target identification performance and compare the experimental results with model predictions.

  17. Object recognition of ladar with support vector machine

    NASA Astrophysics Data System (ADS)

    Sun, Jian-Feng; Li, Qi; Wang, Qi

    2005-01-01

    Intensity, range and Doppler images can be obtained using laser radar. Laser radar can acquire much more information about an object than other sensing modalities, such as passive infrared imaging and synthetic aperture radar (SAR), so it is well suited as a sensor for object recognition. The traditional approach to laser radar object recognition is to extract target features, which can be influenced by noise. In this paper, a laser radar recognition method based on the Support Vector Machine (SVM) is introduced. The SVM became a focus of recognition research after neural networks and performs well on handwritten digit and face recognition. Two series of SVM experiments, designed for preprocessed and non-preprocessed samples, are performed on real laser radar images, and the experimental results are compared.
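
    A minimal sketch of the classification approach described, using scikit-learn; the RBF kernel and hyperparameters are illustrative assumptions (the paper predates this library):

    ```python
    import numpy as np
    from sklearn import svm
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    def train_ladar_svm(X, y):
        """Train an SVM on flattened ladar image chips X (n_samples, n_pixels)
        with object-class labels y, and report held-out accuracy."""
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = svm.SVC(kernel="rbf", C=10.0, gamma="scale")  # hyperparameters illustrative
        clf.fit(X_tr, y_tr)
        return clf, accuracy_score(y_te, clf.predict(X_te))
    ```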

  18. Human perception testing methodology for evaluating EO/IR imaging systems

    NASA Astrophysics Data System (ADS)

    Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.

    2018-04-01

    The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities, recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.

  19. Supervised autonomous rendezvous and docking system technology evaluation

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.

    1991-01-01

    Technology for manned space flight is mature and has an extensive history of the use of man-in-the-loop rendezvous and docking, but there is no history of automated rendezvous and docking. Sensors exist that can operate in the space environment. The Shuttle radar can be used for ranges down to 30 meters, Japan and France are developing laser rangers, and considerable work is going on in the U.S. However, there is a need to validate a flight qualified sensor for the range of 30 meters to contact. The number of targets and illumination patterns should be minimized to reduce operation constraints with one or more sensors integrated into a robust system for autonomous operation. To achieve system redundancy, it is worthwhile to follow a parallel development of qualifying and extending the range of the 0-12 meter MSFC sensor and to simultaneously qualify the 0-30(+) meter JPL laser ranging system as an additional sensor with overlapping capabilities. Such an approach offers a redundant sensor suite for autonomous rendezvous and docking. The development should include the optimization of integrated sensory systems, packaging, mission envelopes, and computer image processing to mimic brain perception and real-time response. The benefits of the Global Positioning System in providing real-time positioning data of high accuracy must be incorporated into the design. The use of GPS-derived attitude data should be investigated further and validated.

  20. Work step indication with grid-pattern projection for demented senior people.

    PubMed

    Uranishi, Yuki; Yamamoto, Goshiro; Asghar, Zeeshan; Pulli, Petri; Kato, Hirokazu; Oshiro, Osamu

    2013-01-01

    This paper proposes a work step indication method for supporting daily work using grid-pattern projection. To support an independent life for senior people with dementia, instructions should be visually easy to understand and uncomplicated. The proposed method uses a range image sensor and a camera in addition to a projector. The 3D geometry of the target scene is measured by the range image sensor, and the grid pattern is projected directly onto the scene. Direct projection of the work step is easier to associate with the target objects around the assisted person, and the grid pattern provides the spatial instruction. A prototype has been implemented and demonstrates that the proposed grid-pattern projection shows work steps clearly.

  1. Image-plane processing of visual information

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.

    1984-01-01

    Shannon's theory of information is used to optimize the optical design of sensor-array imaging systems which use neighborhood image-plane signal processing for enhancing edges and compressing dynamic range during image formation. The resultant edge-enhancement, or band-pass-filter, response is found to be very similar to that of human vision. Comparisons of traits in human vision with results from information theory suggest that: (1) Image-plane processing, like preprocessing in human vision, can improve visual information acquisition for pattern recognition when resolving power, sensitivity, and dynamic range are constrained. Improvements include reduced sensitivity to changes in light levels, reduced signal dynamic range, reduced data transmission and processing, and reduced aliasing and photosensor noise degradation. (2) Information content can be an appropriate figure of merit for optimizing the optical design of imaging systems when visual information is acquired for pattern recognition. The design trade-offs involve spatial response, sensitivity, and sampling interval.

  2. Error analysis on spinal motion measurement using skin mounted sensors.

    PubMed

    Yang, Zhengyi; Ma, Heather Ting; Wang, Deming; Lee, Raymond

    2008-01-01

    Measurement errors of skin-mounted sensors in measuring the forward bending movement of the lumbar spine are investigated. In this investigation, radiographic images capturing the position of the entire lumbar spine were acquired and used as a 'gold' standard. Seventeen young male volunteers (21 (SD 1) years old) agreed to participate in the study. Lightweight miniature sensors of the electromagnetic tracking system (Fastrak) were attached to the skin overlying the spinous processes of the lumbar spine. With the sensors attached, the subjects were requested to take lateral radiographs in two postures: neutral upright and full flexion. The ranges of motion of the lumbar spine were calculated from two sets of digitized data, the bony markers of the vertebral bodies and the sensors, and compared. The differences between the two sets of results were then analyzed. The relative movement between sensor and vertebrae was decomposed into sensor sliding and tilting, from which sliding error and tilting error were defined. The gross motion range of forward bending of the lumbar spine measured from the bony vertebral markers is 67.8 degrees (SD 10.6 degrees) and that from the sensors is 62.8 degrees (SD 12.8 degrees). The error and absolute error for the gross motion range were 5.0 degrees (SD 7.2 degrees) and 7.7 degrees (SD 3.9 degrees). The contributions of the sensors placed on S1 and L1 to the absolute error were 3.9 degrees (SD 2.9 degrees) and 4.4 degrees (SD 2.8 degrees), respectively.

  3. Intelligent Network-Centric Sensors Development Program

    DTIC Science & Technology

    2012-07-31

    Image sensor configurations: cone 360-degree LWIR PFx sensor (image configuration); MWIR (image configuration); cone 360-degree LWIR PFx sensor (video configuration); cone 360-degree SWIR. 2. Reasoning Process to Match Sensor Systems to Algorithms. The ontological ... effects of coherent imaging because of aberrations. Another reason is the specular nature of active imaging. Both contribute to the nonuniformity ...

  4. Concurrent initialization for Bearing-Only SLAM.

    PubMed

    Munguía, Rodrigo; Grau, Antoni

    2010-01-01

    Simultaneous Localization and Mapping (SLAM) is perhaps the most fundamental problem to solve in robotics in order to build truly autonomous mobile robots. The sensors used have a large impact on the SLAM algorithm. Early SLAM approaches focused on the use of range sensors such as sonar rings or lasers. However, cameras have become more and more widely used, because they yield a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Unlike range sensors, which provide range and angular information, a camera is a projective sensor that measures the bearing of image features; therefore depth information (range) cannot be obtained in a single step. This fact has prompted the emergence of a new family of SLAM algorithms: Bearing-Only SLAM methods, which mainly rely on special techniques for feature initialization in order to enable the use of bearing sensors (such as cameras) in SLAM systems. In this work a novel and robust method, called Concurrent Initialization, is presented, inspired by combining the complementary advantages of the Undelayed and Delayed methods that represent the most common approaches to the problem. The key is to use concurrently two kinds of feature representations for both the undelayed and delayed stages of the estimation. The simulation results show that the proposed method surpasses the performance of previous schemes.
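
    One of the feature representations commonly used in undelayed schemes of this kind is the inverse-depth parameterization; a minimal sketch with illustrative prior values (the exact pair of representations used by Concurrent Initialization is defined in the paper, not reproduced here):

    ```python
    import numpy as np

    def init_inverse_depth_feature(cam_pos, bearing, rho0=0.1, sigma_rho=0.5):
        """Undelayed initialization of a bearing-only feature: anchor position,
        unit bearing vector and inverse depth rho with a deliberately broad
        prior, so the feature contributes to the filter immediately."""
        return np.concatenate([cam_pos, bearing, [rho0]]), sigma_rho

    def feature_point(y):
        """Euclidean point implied by a 7-vector inverse-depth feature."""
        p0, m, rho = y[:3], y[3:6], y[6]
        return p0 + m / rho
    ```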

  5. Velocity filtering applied to optical flow calculations

    NASA Technical Reports Server (NTRS)

    Barniv, Yair

    1990-01-01

    Optical flow is a method by which a stream of two-dimensional images obtained from a forward-looking passive sensor is used to map the three-dimensional volume in front of a moving vehicle. Passive ranging via optical flow is applied here to the helicopter obstacle-avoidance problem. Velocity filtering is used as a field-based method to determine range to all pixels in the initial image. The theoretical understanding and performance analysis of velocity filtering as applied to optical flow is expanded and experimental results are presented.
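
    The flow-to-range relation underlying this approach can be sketched for the special case of pure forward translation; the velocity-filtering method itself searches over candidate ranges rather than inverting the flow directly, so this is only the geometric core:

    ```python
    import numpy as np

    def range_from_flow(flow_u, flow_v, xs, ys, foe, speed):
        """Per-pixel range for a sensor translating along its optical axis.

        For pure forward motion at `speed`, a point imaged at radial distance r
        from the focus of expansion (FOE) has flow magnitude |u| = r * speed / Z,
        so Z = r * speed / |u|.  Image coordinates are in normalized units.
        """
        r = np.hypot(xs - foe[0], ys - foe[1])
        mag = np.hypot(flow_u, flow_v)
        return speed * r / np.maximum(mag, 1e-9)
    ```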

  6. Filtered Rayleigh Scattering Measurements in a Buoyant Flowfield

    DTIC Science & Technology

    2007-03-01

    common filter used in FRS applications. Iodine is more attractive than mercury to use in a filter due to its broader range of blocking and transmission ... is a 4032x2688 pixel camera with a monochrome or colored CCD imaging sensor. The binning range of the camera is (HxV) 1x1 to 2x8. The manufacturer ... center position of the jet of the time-averaged image. The z center position is chosen so that it is the average z value bounding helium

  7. Polymer-carbon black composite sensors in an electronic nose for air-quality monitoring

    NASA Technical Reports Server (NTRS)

    Ryan, M. A.; Shevade, A. V.; Zhou, H.; Homer, M. L.

    2004-01-01

    An electronic nose that uses an array of 32 polymer-carbon black composite sensors has been developed, trained, and tested. By selecting a variety of chemical functionalities in the polymers used to make sensors, it is possible to construct an array capable of identifying and quantifying a broad range of target compounds, such as alcohols and aromatics, and distinguishing isomers and enantiomers (mirror-image isomers). A model of the interaction between target molecules and the polymer-carbon black composite sensors is under development to aid in selecting the array members and to enable identification of compounds with responses not stored in the analysis library.

  8. Optical and electrical characterization of a back-thinned CMOS active pixel sensor

    NASA Astrophysics Data System (ADS)

    Blue, Andrew; Clark, A.; Houston, S.; Laing, A.; Maneuski, D.; Prydderch, M.; Turchetta, R.; O'Shea, V.

    2009-06-01

    This work reports the first characterization of a back-thinned Vanilla, a 512×512 active pixel sensor (APS) with 25 μm square pixels. Characterization of the detectors was carried out through the analysis of photon transfer curves to yield measurements of full well capacity, noise levels, gain constants and linearity. Spectral characterization of the sensors was also performed in the visible and UV regions. A full comparison against non-back-thinned, front-illuminated Vanilla sensors is included. These measurements suggest that the Vanilla APS will be suitable for a wide range of applications, including particle physics and biomedical imaging.
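
    The photon-transfer-curve analysis mentioned above rests on the shot-noise relation variance = mean / K. A minimal sketch of extracting conversion gain and read noise from flat-field statistics (the straight-line fit is the textbook method, not necessarily the authors' exact procedure):

    ```python
    import numpy as np

    def ptc_gain(means, variances):
        """Conversion gain and read noise from photon transfer statistics.

        In the shot-noise-limited region, var(DN^2) = mean(DN) / K with K the
        gain in e-/DN, so a straight-line fit of variance against mean gives
        K = 1/slope; the intercept approximates the read-noise variance.
        """
        slope, intercept = np.polyfit(means, variances, 1)
        return 1.0 / slope, np.sqrt(max(intercept, 0.0))  # K (e-/DN), read noise (DN)
    ```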

  9. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  10. Nanophotonic Image Sensors

    PubMed Central

    Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R. S.

    2016-01-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulating techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed including image sensors with nanophotonic color filters and polarizers, metamaterial‐based THz image sensors, filter‐free nanowire image sensors and nanostructured‐based multispectral image sensors. This novel combination of cutting edge photonics research and well‐developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. PMID:27239941

  11. Non-Linearity in Wide Dynamic Range CMOS Image Sensors Utilizing a Partial Charge Transfer Technique.

    PubMed

    Shafie, Suhaidi; Kawahito, Shoji; Halin, Izhal Abdul; Hasan, Wan Zuha Wan

    2009-01-01

    The partial charge transfer technique can expand the dynamic range of a CMOS image sensor by synthesizing two types of signal, namely the long and short accumulation-time signals. However, the short accumulation-time signal obtained from the partial transfer operation suffers from non-linearity with respect to the incident light. In this paper, an analysis of the non-linearity in the partial charge transfer technique has been carried out, and the relationship between dynamic range and the non-linearity is studied. The results show that the non-linearity is caused by two factors: the current diffusion, which has an exponential relation with the potential barrier, and the initial condition of the photodiodes, whereby the error in the high-illumination region increases as the ratio of the long to the short accumulation time rises. Moreover, an increase in the saturation level of the photodiodes also increases the error in the high-illumination region.
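
    The synthesis of the two signals can be sketched as below for ideal linear pixels; the paper's point is precisely that the short accumulation-time signal from partial transfer deviates from this linear model, so a real pipeline must correct for the analyzed non-linearity:

    ```python
    import numpy as np

    def synthesize_wdr(long_sig, short_sig, t_long, t_short, sat_level):
        """Wide-dynamic-range synthesis for ideal linear pixels: use the long
        accumulation-time signal until it saturates, then switch to the short
        signal rescaled by the accumulation-time ratio."""
        return np.where(long_sig < sat_level,
                        long_sig,
                        short_sig * (t_long / t_short))
    ```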

  12. Image quality evaluation of eight complementary metal-oxide semiconductor intraoral digital X-ray sensors.

    PubMed

    Teich, Sorin; Al-Rawi, Wisam; Heima, Masahiro; Faddoul, Fady F; Goldzweig, Gil; Gutmacher, Zvi; Aizenbud, Dror

    2016-10-01

    To evaluate the image quality generated by eight commercially available intraoral sensors. Eighteen clinicians ranked the quality of a bitewing acquired from one subject using eight different intraoral sensors. Analytical methods used to evaluate clinical image quality included the Visual Grading Characteristics method, which helps to quantify subjective opinions to make them suitable for analysis. The Dexis sensor was ranked significantly better than Sirona and Carestream-Kodak sensors; and the image captured using the Carestream-Kodak sensor was ranked significantly worse than those captured using Dexis, Schick and Cyber Medical Imaging sensors. The Image Works sensor image was rated the lowest by all clinicians. Other comparisons resulted in non-significant results. None of the sensors was considered to generate images of significantly better quality than the other sensors tested. Further research should be directed towards determining the clinical significance of the differences in image quality reported in this study. © 2016 FDI World Dental Federation.

  13. A 7 ke-SD-FWC 1.2 e-RMS Temporal Random Noise 128×256 Time-Resolved CMOS Image Sensor With Two In-Pixel SDs for Biomedical Applications.

    PubMed

    Seo, Min-Woong; Kawahito, Shoji

    2017-12-01

    A lock-in pixel CMOS image sensor (CIS) embedded with two in-pixel storage diodes (SDs), offering a large full well capacity (FWC) for a wide signal detection range and low temporal random noise for high sensitivity, has been developed and is presented in this paper. For fast charge transfer from the photodiode to the SDs, a lateral electric field charge modulator (LEFM) is used in the developed lock-in pixel. As a result, the time-resolved CIS achieves a very large SD-FWC of approximately 7 ke-, low temporal random noise of 1.2 e- rms at 20 fps with true correlated double sampling operation, and a fast intrinsic response of less than 500 ps at 635 nm. The proposed imager has an effective pixel array of 128 × 256. The sensor chip is fabricated in a Dongbu HiTek 1P4M 0.11 μm CIS process.

  14. Bioinspired design of a polymer gel sensor for the realization of extracellular Ca2+ imaging

    NASA Astrophysics Data System (ADS)

    Ishiwari, Fumitaka; Hasebe, Hanako; Matsumura, Satoko; Hajjaj, Fatin; Horii-Hayashi, Noriko; Nishi, Mayumi; Someya, Takao; Fukushima, Takanori

    2016-04-01

    Although the role of extracellular Ca2+ draws increasing attention as a messenger in intercellular communications, there is currently no tool available for imaging Ca2+ dynamics in extracellular regions. Here we report the first solid-state fluorescent Ca2+ sensor that fulfills the essential requirements for realizing extracellular Ca2+ imaging. Inspired by natural extracellular Ca2+-sensing receptors, we designed a particular type of chemically-crosslinked polyacrylic acid gel, which can undergo single-chain aggregation in the presence of Ca2+. By attaching aggregation-induced emission luminogen to the polyacrylic acid as a pendant, the conformational state of the main chain at a given Ca2+ concentration is successfully translated into fluorescence property. The Ca2+ sensor has a millimolar-order apparent dissociation constant compatible with extracellular Ca2+ concentrations, and exhibits sufficient dynamic range and excellent selectivity in the presence of physiological concentrations of biologically relevant ions, thus enabling monitoring of submillimolar fluctuations of Ca2+ in flowing analytes containing millimolar Ca2+ concentrations.

  15. UGS video target detection and discrimination

    NASA Astrophysics Data System (ADS)

    Roberts, G. Marlon; Fitzgerald, James; McCormack, Michael; Steadman, Robert; Vitale, Joseph D.

    2007-04-01

    This project focuses on developing electro-optic algorithms which rank images by their likelihood of containing vehicles and people. These algorithms have been applied to images obtained from Textron's Terrain Commander 2 (TC2) Unattended Ground Sensor system. The TC2 is a multi-sensor surveillance system used in military applications. It combines infrared, acoustic, seismic, magnetic, and electro-optic sensors to detect nearby targets. When targets are detected by the seismic and acoustic sensors, the system is triggered and images are taken in the visible and infrared spectrum. The original Terrain Commander system occasionally captured and transmitted an excessive number of images, sometimes triggered by undesirable targets such as swaying trees. This wasted communications bandwidth, increased power consumption, and resulted in a large amount of end-user time being spent evaluating unimportant images. The algorithms discussed here help alleviate these problems. These algorithms are currently optimized for infra-red images, which give the best visibility in a wide range of environments, but could be adapted to visible imagery as well. It is important that the algorithms be robust, with minimal dependency on user input. They should be effective when tracking varying numbers of targets of different sizes and orientations, despite the low resolutions of the images used. Most importantly, the algorithms must be appropriate for implementation on a low-power processor in real time. This would enable us to maintain frame rates of 2 Hz for effective surveillance operations. Throughout our project we have implemented several algorithms, and used an appropriate methodology to quantitatively compare their performance. They are discussed in this paper.

  16. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storages. Particularly, for today's and the next generation of nuclear facilities, increased safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to reach doses in the MGy(SiO2) range, whereas the most tolerant commercial or prototype products based on solid-state image sensors withstand doses of up to a few kGy. The objective of this work is to present the radiation-hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits camera miniaturization and is not compatible with future integration in remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. We will present at the conference a comparative study of the radiation responses of these radiation-hardened pixels with respect to conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation-hard camera electronics will be exposed. Another important element of the camera is the optical system that transports the image from the scene to the image sensor. This arrangement of glass-based lenses is affected by radiation through two mechanisms: radiation-induced absorption and radiation-induced refractive index changes. The first limits the signal-to-noise ratio of the image, whereas the second directly affects the resolution of the camera. We will present at the conference a coupled simulation/experiment study of these effects for various commercial glasses, together with a vulnerability study of typical optical systems at MGy doses. The last very important part of the camera is the illumination system, which can be based on various technologies of emitting devices such as LEDs, SLEDs or lasers. The most promising solutions for high radiation doses will be presented at the conference. In addition to this hardening-by-component approach, the global radiation tolerance of the camera can be drastically improved by working at the system level, combining innovative approaches, e.g. for the optical and illumination systems. We will present at the conference the developed approach allowing the camera lifetime to be extended up to the MGy dose range. (authors)

  17. Hard-X-Ray/Soft-Gamma-Ray Imaging Sensor Assembly for Astronomy

    NASA Technical Reports Server (NTRS)

    Myers, Richard A.

    2008-01-01

    An improved sensor assembly has been developed for astronomical imaging at photon energies ranging from 1 to 100 keV. The assembly includes a thallium-doped cesium iodide scintillator divided into pixels and coupled to an array of high-gain avalanche photodiodes (APDs). Optionally, the array of APDs can be operated without the scintillator to detect photons at energies below 15 keV. The array of APDs is connected to compact electronic readout circuitry that includes, among other things, 64 independent channels for detection of photons in various energy ranges, up to a maximum energy of 100 keV, at a count rate up to 3 kHz. The readout signals are digitized and processed by imaging software that performs "on-the-fly" analysis. The sensor assembly has been integrated into an imaging spectrometer, along with a pair of coded apertures (Fresnel zone plates) that are used in conjunction with the pixel layout to implement a shadow-masking technique to obtain relatively high spatial resolution without having to use extremely small pixels. Angular resolutions of about 20 arc-seconds have been measured. Thus, for example, the imaging spectrometer can be used to (1) determine both the energy spectrum of a distant x-ray source and the angular deviation of the source from the nominal line of sight of an x-ray telescope in which the spectrometer is mounted or (2) study the spatial and temporal development of solar flares, repeating γ-ray bursters, and other phenomena that emit transient radiation in the hard-X-ray/soft-γ-ray region of the electromagnetic spectrum.

  18. Statistical characterization of speckle noise in coherent imaging systems

    NASA Astrophysics Data System (ADS)

    Yaroslavsky, Leonid; Shefler, A.

    2003-05-01

    Speckle noise imposes a fundamental limitation on image quality in coherent-radiation-based imaging and optical metrology systems. Speckle noise phenomena are associated with the tendency of objects to diffusely scatter irradiation and with the fact that, in recording the wave field, a number of signal distortions inevitably occur due to technical limitations inherent to hologram sensors. The statistical theory of speckle noise was developed with regard only to the limited resolving power of coherent imaging devices. It is valid only asymptotically, insofar as the central limit theorem of probability theory can be applied. In applications this assumption does not always hold. Moreover, in treating the speckle noise problem one should also consider other sources of hologram deterioration. In this paper, the statistical properties of speckle due to the limitation of hologram size, dynamic range and hologram signal quantization are studied by Monte-Carlo simulation for holograms recorded in the near and far diffraction zones. The simulation experiments show that, for limited resolving power of the imaging system, the widely accepted opinion that speckle contrast is equal to one holds only for a rather severe level of hologram size limitation. For moderate limitations, speckle contrast changes gradually from zero for no limitation to one for limitation to less than about 20% of the hologram size. The results obtained for the limitation of the hologram sensor's dynamic range and hologram signal quantization reveal that speckle noise due to these hologram signal distortions is not multiplicative and is directly associated with the severity of the limitation and quantization. On the basis of the simulation results, analytical models are suggested.

  19. Proton-counting radiography for proton therapy: a proof of principle using CMOS APS technology

    NASA Astrophysics Data System (ADS)

    Poludniowski, G.; Allinson, N. M.; Anaxagoras, T.; Esposito, M.; Green, S.; Manolopoulos, S.; Nieto-Camero, J.; Parker, D. J.; Price, T.; Evans, P. M.

    2014-06-01

    Despite the early recognition of the potential of proton imaging to assist proton therapy (Cormack 1963 J. Appl. Phys. 34 2722), the modality is still removed from clinical practice, with various approaches in development. For proton-counting radiography applications such as computed tomography (CT), the water-equivalent-path-length that each proton has travelled through an imaged object must be inferred. Typically, scintillator-based technology has been used in various energy/range telescope designs. Here we propose a very different alternative of using radiation-hard CMOS active pixel sensor technology. The ability of such a sensor to resolve the passage of individual protons in a therapy beam has not been previously shown. Here, such capability is demonstrated using a 36 MeV cyclotron beam (University of Birmingham Cyclotron, Birmingham, UK) and a 200 MeV clinical radiotherapy beam (iThemba LABS, Cape Town, SA). The feasibility of tracking individual protons through multiple CMOS layers is also demonstrated using a two-layer stack of sensors. The chief advantages of this solution are the spatial discrimination of events intrinsic to pixelated sensors, combined with the potential provision of information on both the range and residual energy of a proton. The challenges in developing a practical system are discussed.

  20. Proton-counting radiography for proton therapy: a proof of principle using CMOS APS technology

    PubMed Central

    Poludniowski, G; Allinson, N M; Anaxagoras, T; Esposito, M; Green, S; Manolopoulos, S; Nieto-Camero, J; Parker, D J; Price, T; Evans, P M

    2014-01-01

    Despite the early recognition of the potential of proton imaging to assist proton therapy, the modality is still removed from clinical practice, with various approaches in development. For proton-counting radiography applications such as Computed Tomography (CT), the Water-Equivalent-Path-Length (WEPL) that each proton has travelled through an imaged object must be inferred. Typically, scintillator-based technology has been used in various energy/range telescope designs. Here we propose a very different alternative of using radiation-hard CMOS Active Pixel Sensor (APS) technology. The ability of such a sensor to resolve the passage of individual protons in a therapy beam has not been previously shown. Here, such capability is demonstrated using a 36 MeV cyclotron beam (University of Birmingham Cyclotron, Birmingham, UK) and a 200 MeV clinical radiotherapy beam (iThemba LABS, Cape Town, SA). The feasibility of tracking individual protons through multiple CMOS layers is also demonstrated using a two-layer stack of sensors. The chief advantages of this solution are the spatial discrimination of events intrinsic to pixelated sensors, combined with the potential provision of information on both the range and residual energy of a proton. The challenges in developing a practical system are discussed. PMID:24785680

  1. High-resolution depth profiling using a range-gated CMOS SPAD quanta image sensor.

    PubMed

    Ren, Ximing; Connolly, Peter W R; Halimi, Abderrahim; Altmann, Yoann; McLaughlin, Stephen; Gyongy, Istvan; Henderson, Robert K; Buller, Gerald S

    2018-03-05

    A CMOS single-photon avalanche diode (SPAD) quanta image sensor is used to reconstruct depth and intensity profiles when operating in a range-gated mode used in conjunction with pulsed laser illumination. By designing the CMOS SPAD array to acquire photons within a pre-determined temporal gate, the need for timing circuitry was avoided and it was therefore possible to have an enhanced fill factor (61% in this case) and a frame rate (100,000 frames per second) that is more difficult to achieve in a SPAD array which uses time-correlated single-photon counting. When coupled with appropriate image reconstruction algorithms, millimeter resolution depth profiles were achieved by iterating through a sequence of temporal delay steps in synchronization with laser illumination pulses. For photon data with high signal-to-noise ratios, depth images with millimeter scale depth uncertainty can be estimated using a standard cross-correlation approach. To enhance the estimation of depth and intensity images in the sparse photon regime, we used a bespoke clustering-based image restoration strategy, taking into account the binomial statistics of the photon data and non-local spatial correlations within the scene. For sparse photon data with total exposure times of 75 ms or less, the bespoke algorithm can reconstruct depth images with millimeter scale depth uncertainty at a stand-off distance of approximately 2 meters. We demonstrate a new approach to single-photon depth and intensity profiling using different target scenes, taking full advantage of the high fill-factor, high frame rate and large array format of this range-gated CMOS SPAD array.
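
    The standard cross-correlation depth estimator mentioned above can be sketched as follows; array shapes, the IRF argument and the sampling assumptions are illustrative, and the authors' clustering-based restoration for sparse data is not shown:

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def depth_by_xcorr(counts, delays_ns, irf):
        """Per-pixel depth from range-gated photon counts.

        counts: (n_delays, H, W) photons per gate delay step; delays_ns: the
        gate delays; irf: instrumental response sampled at the same step, not
        longer than the delay sweep.  Each pixel profile is matched-filtered
        with the IRF and the peak bin is taken as the round-trip delay.
        """
        delays_ns = np.asarray(delays_ns, dtype=float)
        n, H, W = counts.shape
        prof = counts.reshape(n, -1).astype(float)
        corr = np.apply_along_axis(lambda p: np.correlate(p, irf, mode="same"), 0, prof)
        t_peak = delays_ns[np.argmax(corr, axis=0)].reshape(H, W)
        return 0.5 * C * t_peak * 1e-9  # range = c * t / 2, meters
    ```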

  2. Validation of Inertial and Optical Navigation Techniques for Space Applications with UAVS

    NASA Astrophysics Data System (ADS)

    Montaño, J.; Wis, M.; Pulido, J. A.; Latorre, A.; Molina, P.; Fernández, E.; Angelats, E.; Colomina, I.

    2015-09-01

    PERIGEO is an R&D project, funded by the INNPRONTA 2011-2014 programme of the Spanish CDTI, which investigates the use of UAV technologies and processes for the validation of space-oriented technologies. For this purpose, among different space missions and technologies, a set of activities for absolute and relative navigation is being carried out to address the attitude and position estimation problem using a temporal image sequence from a visible-spectrum camera and/or a Light Detection and Ranging (LIDAR) sensor. The process is covered entirely: sensor measurements and data acquisition (images, LIDAR ranges and angles), data pre-processing (calibration and co-registration of camera and LIDAR data), feature and landmark extraction from the images, and image/LIDAR-based state estimation. In addition to the image processing area, classical navigation based on inertial sensors is also included in the research. The reason for combining both approaches is to retain navigation capability in environments or missions where a radio beacon or reference signal such as GNSS is not available (for example, an atmospheric flight on Titan). The rationale behind the combination is that the two systems complement each other. The INS is capable of providing accurate position, velocity and full attitude estimates at high data rates, but it needs an absolute reference observation to compensate the errors that accumulate over time due to inertial sensor inaccuracies. Imaging observables, on the other hand, can provide absolute and relative position and attitude estimates, but they require the sensor head to point toward the ground (which may not be possible if the carrying platform is maneuvering) and they cannot deliver the rates of hundreds of Hz that an INS can. This mutual complementarity has been exploited in PERIGEO, where the two are combined into one system. The inertial navigation system implemented in PERIGEO is based on a classical loosely coupled INS/GNSS approach, very similar to the implementation of the INS/imaging navigation system mentioned above. The activities envisaged in PERIGEO cover algorithm development and validation and technology testing on UAVs under representative conditions. Past activities covered the design and development of the algorithms and systems. This paper presents the most recent activities and results in the area of image processing for robust estimation within PERIGEO, related to the definition of the hardware platforms (including sensors) and their integration in UAVs. Results from the tests performed during the flight campaigns in representative outdoor environments will also be presented (the tests will be performed by the time of the full paper submission) and analyzed, together with a roadmap for future developments.

  3. A research on radiation calibration of high dynamic range based on the dual channel CMOS

    NASA Astrophysics Data System (ADS)

    Ma, Kai; Shi, Zhan; Pan, Xiaodong; Wang, Yongsheng; Wang, Jianghua

    2017-10-01

    A dual-channel complementary metal-oxide semiconductor (CMOS) sensor can produce a high dynamic range (HDR) image by extending the gray-level range through the fusion of a high-gain channel image and a low-gain channel image from the same frame. The fusion uses the radiometric response coefficients of each pixel in the two channels of a frame to calculate the gray level of that pixel in the HDR image. Because the radiometric response coefficients play a crucial role in image fusion, an effective method is needed to acquire these parameters. This article investigates radiometric calibration for high dynamic range imaging based on a dual-channel CMOS sensor and designs an experiment to calibrate the radiometric response coefficients of the sensor used. Finally, the calibrated response parameters are applied to the dual-channel CMOS sensor, verifying the correctness and feasibility of the proposed method.
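
A minimal sketch of dual-channel fusion under assumed per-pixel response coefficients g_hi and g_lo (the paper's exact fusion rule and calibrated values are not given in the abstract):

```python
import numpy as np

def fuse_hdr(img_hi, img_lo, g_hi, g_lo, sat_level=4000):
    """Use the high-gain channel where unsaturated, otherwise fall back to
    the low-gain channel rescaled into the same radiance units via the
    calibrated response coefficients (DN per unit radiance)."""
    radiance_hi = img_hi / g_hi          # invert each channel's response
    radiance_lo = img_lo / g_lo
    use_hi = img_hi < sat_level          # high gain saturates first
    return np.where(use_hi, radiance_hi, radiance_lo)

# Example with synthetic 12-bit data and an assumed gain ratio of 8:
rng = np.random.default_rng(2)
scene = rng.uniform(0, 4e3, (4, 4))      # "true" radiance
hi = np.clip(scene * 8.0, 0, 4095)       # high-gain channel saturates
lo = np.clip(scene * 1.0, 0, 4095)
hdr = fuse_hdr(hi, lo, g_hi=8.0, g_lo=1.0)
print(np.allclose(hdr, scene))           # True: full range recovered
```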

  4. High-resolution room-temperature sample scanning superconducting quantum interference device microscope configurable for geological and biomagnetic applications

    NASA Astrophysics Data System (ADS)

    Fong, L. E.; Holzer, J. R.; McBride, K. K.; Lima, E. A.; Baudenbacher, F.; Radparvar, M.

    2005-05-01

    We have developed a scanning superconducting quantum interference device (SQUID) microscope system with interchangeable sensor configurations for imaging magnetic fields of room-temperature (RT) samples with submillimeter resolution. The low-critical-temperature (Tc) niobium-based monolithic SQUID sensors are mounted on the tip of a sapphire rod and thermally anchored to the helium reservoir. A 25 μm sapphire window separates the vacuum space from the RT sample. A positioning mechanism allows us to adjust the sample-to-sensor spacing from the top of the Dewar. We achieved a sensor-to-sample spacing of 100 μm, which could be maintained for periods of up to four weeks. Different SQUID sensor designs are necessary to achieve the best combination of spatial resolution and field sensitivity for a given source configuration. For imaging thin sections of geological samples, we used a custom-designed monolithic low-Tc niobium bare SQUID sensor with an effective diameter of 80 μm, and achieved a field sensitivity of 1.5 pT/Hz^1/2 and a magnetic moment sensitivity of 5.4×10^-18 Am^2/Hz^1/2 at a sensor-to-sample spacing of 100 μm in the white-noise region for frequencies above 100 Hz. Imaging action currents in cardiac tissue requires a higher field sensitivity, which can only be achieved by compromising spatial resolution. We developed monolithic low-Tc niobium multiloop SQUID sensors, with sensor sizes ranging from 250 μm to 1 mm, which achieved sensitivities of 480-180 fT/Hz^1/2, respectively, in the white-noise region for frequencies above 100 Hz. For all sensor configurations, the spatial resolution was comparable to the effective diameter and limited by the sensor-to-sample spacing. Spatial registration allowed us to compare high-resolution images of magnetic fields associated with action currents and optical recordings of transmembrane potentials to study the bidomain nature of cardiac tissue, or to match petrography to magnetic field maps in thin sections of geological samples.

  5. Using Polynomials to Simplify Fixed Pattern Noise and Photometric Correction of Logarithmic CMOS Image Sensors

    PubMed Central

    Li, Jing; Mahmoodi, Alireza; Joseph, Dileepan

    2015-01-01

    An important class of complementary metal-oxide-semiconductor (CMOS) image sensors are those where pixel responses are monotonic nonlinear functions of light stimuli. This class includes various logarithmic architectures, which are easily capable of wide dynamic range imaging, at video rates, but which are vulnerable to image quality issues. To minimize fixed pattern noise (FPN) and maximize photometric accuracy, pixel responses must be calibrated and corrected due to mismatch and process variation during fabrication. Unlike literature approaches, which employ circuit-based models of varying complexity, this paper introduces a novel approach based on low-degree polynomials. Although each pixel may have a highly nonlinear response, an approximately-linear FPN calibration is possible by exploiting the monotonic nature of imaging. Moreover, FPN correction requires only arithmetic, and an optimal fixed-point implementation is readily derived, subject to a user-specified number of bits per pixel. Using a monotonic spline, involving cubic polynomials, photometric calibration is also possible without a circuit-based model, and fixed-point photometric correction requires only a look-up table. The approach is experimentally validated with a logarithmic CMOS image sensor and is compared to a leading approach from the literature. The novel approach proves effective and efficient. PMID:26501287
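
A sketch of the idea in spirit, under stated assumptions (per-pixel polynomials fitted against the array-mean response from flat-field frames; the paper's exact calibration targets and fixed-point derivation are not reproduced):

```python
import numpy as np

def fit_fpn_polys(stack, degree=1):
    """stack: (n_levels, H, W) flat-field frames at several illumination
    levels. Fits, per pixel, a polynomial mapping that pixel's response
    onto the array-mean response, exploiting monotonicity."""
    n, H, W = stack.shape
    flat = stack.reshape(n, -1)
    target = flat.mean(axis=1)                 # reference response per level
    coeffs = np.empty((degree + 1, H * W))
    for p in range(H * W):
        coeffs[:, p] = np.polyfit(flat[:, p], target, degree)
    return coeffs.reshape(degree + 1, H, W)

def correct(frame, coeffs):
    """Apply the per-pixel polynomial (arithmetic only, as the paper notes)."""
    out = np.zeros(frame.shape, dtype=float)
    for d, c in enumerate(coeffs[::-1]):       # lowest-degree term first
        out += c * frame.astype(float) ** d
    return out

# Usage: coeffs = fit_fpn_polys(flat_stack, degree=1)
#        clean  = correct(raw_frame, coeffs)
```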

  6. Evaluation of electrical capacitance tomography sensor based on the coupling of fluid field and electrostatic field

    NASA Astrophysics Data System (ADS)

    Ye, Jiamin; Wang, Haigang; Yang, Wuqiang

    2016-07-01

    Electrical capacitance tomography (ECT) is based on capacitance measurements from electrode pairs mounted outside a pipe or vessel. The structure of ECT sensors is vital to image quality. In this paper, issues with the number of electrodes and the electrode covering ratio for complex liquid-solids flows in a rotating device are investigated based on a new coupling simulation model. The number of electrodes is increased from 4 to 32 while the electrode covering ratio is varied from 0.1 to 0.9. Using the coupling simulation method, true permittivity distributions and the corresponding capacitance data at 0, 0.5, 1, 2, 3, 5, and 8 s at a rotation speed of 96 rotations per minute (rpm) are collected. Linear back projection (LBP) and Landweber iteration algorithms are used for image reconstruction. The quality of the reconstructed images is evaluated by the correlation coefficient with respect to the true permittivity distributions obtained from the coupling simulation. The sensitivity of each sensor is analyzed and compared with the correlation coefficient. Capacitance data with signal-to-noise ratios (SNRs) of 45, 50, 55, and 60 dB are generated to evaluate the effect of data noise on the performance of ECT sensors. Furthermore, the SNRs of experimental data are analyzed for a stationary pipe with a fixed permittivity distribution. Based on the coupling simulation, 16-electrode ECT sensors are recommended to achieve good image quality.
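
For reference, both reconstruction algorithms named above have standard textbook forms; a minimal sketch (variable names illustrative) is:

```python
import numpy as np

def lbp(S, lam):
    """Linear back projection: S is the (n_meas x n_pixels) normalised
    sensitivity matrix, lam the normalised capacitance vector."""
    return (S.T @ lam) / (S.T @ np.ones(S.shape[0]))

def landweber(S, lam, alpha=None, n_iter=200):
    """Landweber iteration: gradient descent on ||S g - lam||^2, with the
    image clipped to the normalised permittivity range [0, 1]."""
    if alpha is None:
        alpha = 1.0 / np.linalg.norm(S, 2) ** 2   # safe step size
    g = lbp(S, lam)                               # LBP as the initial image
    for _ in range(n_iter):
        g = np.clip(g + alpha * S.T @ (lam - S @ g), 0.0, 1.0)
    return g
```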

  7. Computer vision barrel inspection

    NASA Astrophysics Data System (ADS)

    Wolfe, William J.; Gunderson, James; Walworth, Matthew E.

    1994-02-01

    One of the Department of Energy's (DOE) ongoing tasks is the storage and inspection of a large number of waste barrels containing a variety of hazardous substances. Martin Marietta is currently contracted to develop a robotic system -- the Intelligent Mobile Sensor System (IMSS) -- for the automatic monitoring and inspection of these barrels. The IMSS is a mobile robot with multiple sensors: video cameras, illuminators, laser ranging, and a barcode reader. We assisted Martin Marietta in this task, specifically in the development of image processing algorithms that recognize and classify the barrel labels. Our subsystem uses video images to detect and locate the barcode, so that the barcode reader can be pointed at the barcode.

  8. OmniBird: a miniature PTZ NIR sensor system for UCAV day/night autonomous operations

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Li, Hui

    2007-04-01

    Through SBIR funding from NAVAIR, we have successfully developed an innovative, miniaturized, and lightweight PTZ UCAV imager called OmniBird for UCAV taxiing. OmniBird fits within a small volume (6 cubic inches). Its zoom capability allows it to acquire focused images of targets ranging from 10 to 250 feet away, and its innovative panning mechanism gives the system a field of view of +/- 100 degrees within that limited space. The integrated optics, camera sensor, and mechanics keep the OmniBird optically aligned and shock-proof in harsh environments.

  9. Review of infrared technology in The Netherlands

    NASA Astrophysics Data System (ADS)

    de Jong, Arie N.

    1993-11-01

    The use of infrared sensors in the Netherlands is substantial. Users can be found in a variety of disciplines, military as well as civil. This need for IR sensors implies a long history of IR technology development, which has resulted in a large technological capability for realizing IR hardware: specialized measuring equipment, engineering development models, and prototype and production sensors for different applications. These applications range from small, local radiometry up to large space-borne imaging. Large-scale production of IR sensors has been realized for army vehicles, and IR sensors have now been introduced in all of the armed forces. Facilities have been built to test the performance of these sensors, and models have been developed to predict the performance of new sensors. A great effort has been spent on atmospheric research, leading to knowledge of the atmospheric and background limitations of IR sensors.

  10. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, it can easily be detected with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels in images captured with a thermal imaging camera on a UAV (unmanned aerial vehicle). The proposed algorithm applies statistical analysis to the thermal intensity (surface temperature) characteristics of each PV module, using the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One characteristic of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object; consequently, a global detection rule using the mean intensity of all panels is not applicable. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules automatically within each individual array, as sketched below. The performance of the proposed algorithm was tested on three sample images, verifying a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is better suited to highly sensitive fault detection than a global detection rule.
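
A minimal sketch of such a local rule, with an illustrative threshold of two standard deviations (the paper's actual thresholds are adjustable per array):

```python
import numpy as np

def detect_faulty_modules(module_means, k=2.0):
    """module_means: 1-D array of mean thermal intensities of the modules
    in ONE array. Because the rule is local to the array, distance-induced
    temperature offsets cancel out."""
    mu, sigma = module_means.mean(), module_means.std()
    return np.flatnonzero(module_means > mu + k * sigma)

# Example: module 3 of this array is noticeably hotter than its neighbours.
means = np.array([31.2, 31.0, 31.4, 36.8, 30.9, 31.1])
print(detect_faulty_modules(means))    # -> [3]
```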

  11. Optoelectronic Sensor System for Guidance in Docking

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Bryan, Thomas C.; Book, Michael L.; Jackson, John L.

    2004-01-01

    The Video Guidance Sensor (VGS) system is an optoelectronic sensor that provides automated guidance between two vehicles. In the original intended application, the two vehicles would be spacecraft docking together, but the basic principles of design and operation of the sensor are applicable to aircraft, robots, vehicles, or other objects that may be required to be aligned for docking, assembly, resupply, or precise separation. The system includes a sensor head containing a monochrome charge-coupled-device video camera and pulsed laser diodes mounted on the tracking vehicle, and passive reflective targets on the tracked vehicle. The lasers illuminate the targets, and the resulting video images of the targets are digitized. Then, from the positions of the digitized target images and known geometric relationships among the targets, the relative position and orientation of the vehicles are computed. As described thus far, the VGS system is based on the same principles as those of the system described in "Improved Video Sensor System for Guidance in Docking" (MFS-31150), NASA Tech Briefs, Vol. 21, No. 4 (April 1997), page 9a. However, the two systems differ in the details of design and operation. The VGS system is designed to operate with the target completely visible within a relative-azimuth range of +/- 10.5 deg and a relative-elevation range of +/- 8 deg. The VGS acquires and tracks the target within that field of view at any distance from 1.0 to 110 m and at any relative roll, pitch, and/or yaw angle within +/- 10 deg. The VGS produces sets of distance and relative-orientation data at a repetition rate of 5 Hz. The software of this system also accommodates the simultaneous operation of two sensors for redundancy.
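
The range computation from imaged target geometry can be illustrated with a simple pinhole-camera example; this is our illustration under an assumed focal length and reflector spacing, not the flight VGS algorithm:

```python
import numpy as np

F = 0.05          # focal length, m (assumed)
B = 0.30          # true spacing between two reflectors, m (assumed)

def range_and_bearing(u1, u2):
    """u1, u2: image-plane x-coordinates (m) of the two reflector spots.
    Similar triangles give range; the spot centroid gives bearing."""
    b = abs(u2 - u1)
    Z = F * B / b                               # range from imaged spacing
    azimuth = np.arctan(0.5 * (u1 + u2) / F)    # bearing of target centroid
    return Z, np.degrees(azimuth)

Z, az = range_and_bearing(0.00100, 0.00250)     # spots 1.5 mm apart on sensor
print(f"range = {Z:.1f} m, azimuth = {az:.2f} deg")   # -> 10.0 m, 2.00 deg
```

With more than two non-coplanar targets, the same geometric information yields full six-degree-of-freedom relative pose.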

  12. Sensor performance and weather effects modeling for intelligent transportation systems (ITS) applications

    NASA Astrophysics Data System (ADS)

    Everson, Jeffrey H.; Kopala, Edward W.; Lazofson, Laurence E.; Choe, Howard C.; Pomerleau, Dean A.

    1995-01-01

    Optical sensors are used for several ITS applications, including lateral control of vehicles, traffic sign recognition, car following, autonomous vehicle navigation, and obstacle detection. This paper treats the performance assessment of a sensor/image processor used as part of an on-board countermeasure system to prevent single-vehicle roadway-departure crashes. Sufficient image contrast between objects of interest and backgrounds is an essential factor influencing overall system performance. Contrast is determined by material properties affecting reflected/radiated intensities, as well as weather and visibility conditions. This paper discusses the modeling of these parameters and characterizes the contrast effects of reduced visibility. The analysis process first involves generation of inherent road/off-road contrasts, followed by application of weather effects as a contrast modification. The sensor is modeled as a charge-coupled device (CCD) with variable parameters. The results of the sensor/weather modeling are used to predict the performance of an in-vehicle warning system under various levels of adverse weather. Software employed in this effort was previously developed for the U.S. Air Force Wright Laboratory to determine target/background detection and recognition ranges for different sensor systems operating under various mission scenarios.
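
One standard way to model visibility-reduced contrast is Koschmieder's law; the paper's own model may differ, so treat this as a hedged sketch:

```python
import numpy as np

def apparent_contrast(C0, R, V):
    """Koschmieder's law: the apparent contrast of a target with inherent
    contrast C0 at range R (m) decays with the atmospheric extinction
    coefficient beta, tied to meteorological visibility V (m) by the usual
    2% contrast-threshold convention beta = 3.912 / V."""
    beta = 3.912 / V
    return C0 * np.exp(-beta * R)

# Road/off-road inherent contrast 0.5, viewed at 100 m in 400 m visibility:
print(f"{apparent_contrast(0.5, R=100.0, V=400.0):.3f}")   # ~0.188
```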

  13. Report on recent results of the PERCIVAL soft X-ray imager

    NASA Astrophysics Data System (ADS)

    Khromova, A.; Cautero, G.; Giuressi, D.; Menk, R.; Pinaroli, G.; Stebel, L.; Correa, J.; Marras, A.; Wunderer, C. B.; Lange, S.; Tennert, M.; Niemann, M.; Hirsemann, H.; Smoljanin, S.; Reza, S.; Graafsma, H.; Göttlicher, P.; Shevyakov, I.; Supra, J.; Xia, Q.; Zimmer, M.; Guerrini, N.; Marsh, B.; Sedgwick, I.; Nicholls, T.; Turchetta, R.; Pedersen, U.; Tartoni, N.; Hyun, H. J.; Kim, K. S.; Rah, S. Y.; Hoenk, M. E.; Jewell, A. D.; Jones, T. J.; Nikzad, S.

    2016-11-01

    The PERCIVAL (Pixelated Energy Resolving CMOS Imager, Versatile And Large) soft X-ray 2D imaging detector is based on stitched, wafer-scale sensors possessing a thick epi-layer, which together with back-thinning and back-side illumination yields elevated quantum efficiency in the photon energy range of 125-1000 eV. The main application fields of PERCIVAL are foreseen in photon science with FELs and synchrotron radiation. This requires a high dynamic range of up to 10^5 photons @ 250 eV, paired with single-photon sensitivity with high confidence, at moderate frame rates in the range of 10-120 Hz. These figures imply the availability of dynamic gain switching on a pixel-by-pixel basis and a highly parallel, low-noise analog and digital readout, which has been realized in the PERCIVAL sensor layout. Different aspects of the detector performance have been assessed using prototype sensors with different pixel and ADC types. This work reports on recent test results for the newest chip prototypes with the improved pixel and ADC architecture. For the target frame rates in the 10-120 Hz range, an average noise floor of 14 e- has been determined, indicating the ability to detect single photons with energies above 250 eV. Owing to the successfully implemented adaptive 3-stage multiple-gain switching, the integrated charge level exceeds 4×10^6 e-, or 57,000 X-ray photons at 250 eV, per frame at 120 Hz. For all gains the noise level remains below the Poisson limit, also in high-flux conditions. Additionally, a short overview of the upcoming 2-Mpixel (P2M) detector system (expected at the end of 2016) is given.

  14. Automatic registration of Iphone images to LASER point clouds of the urban structures using shape features

    NASA Astrophysics Data System (ADS)

    Sirmacek, B.; Lindenbergh, R. C.; Menenti, M.

    2013-10-01

    Fusion of 3D airborne laser (LIDAR) data and terrestrial optical imagery can be applied in 3D urban modeling and model updating. The most challenging aspect of the fusion procedure is registering the terrestrial optical images on the LIDAR point clouds. In this article, we propose an approach for registering these two kinds of data from different sensor sources. We use iPhone camera images, taken in front of the urban structure of interest by the application user, and high-resolution LIDAR point clouds acquired by an airborne laser sensor. After finding the photo capture position and orientation from the iPhone photograph metafile, we automatically select the area of interest in the point cloud and transform it into a range image whose grayscale intensity levels encode distance from the image acquisition position. We use local features to register the iPhone image to the generated range image; the registration process is based on local feature extraction and graph matching. Finally, the registration result is used for facade texture mapping on the 3D building surface mesh generated from the LIDAR point cloud. Our experimental results indicate the potential of the proposed algorithm framework for 3D urban map updating and enhancement purposes.
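
A simplified sketch of the point-cloud-to-range-image step, assuming a pinhole projection with illustrative focal length and image size (the paper's exact rendering procedure is not specified in the abstract):

```python
import numpy as np

def point_cloud_to_range_image(pts, R, t, f=1000.0, size=(480, 640)):
    """pts: (N, 3) LIDAR points; R, t: camera rotation/translation estimated
    from the photo metafile; f: focal length in pixels. Returns an 8-bit
    image whose gray level encodes distance (near = bright)."""
    H, W = size
    img = np.full((H, W), np.inf)
    cam = (R @ pts.T).T + t                   # points into the camera frame
    cam = cam[cam[:, 2] > 0]                  # keep points in front of camera
    u = (f * cam[:, 0] / cam[:, 2] + W / 2).astype(int)
    v = (f * cam[:, 1] / cam[:, 2] + H / 2).astype(int)
    d = np.linalg.norm(cam, axis=1)
    ok = (0 <= u) & (u < W) & (0 <= v) & (v < H)
    for ui, vi, di in zip(u[ok], v[ok], d[ok]):
        img[vi, ui] = min(img[vi, ui], di)    # keep nearest point (z-buffer)
    fin = np.isfinite(img)
    out = np.zeros((H, W), np.uint8)          # empty pixels stay black
    out[fin] = np.interp(img[fin], (img[fin].min(), img[fin].max()), (255, 0))
    return out
```

Feature descriptors can then be extracted from this synthetic image and matched against the iPhone photograph exactly as in image-to-image registration.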

  15. A magnetic/fluorometric bimodal sensor based on a carbon dots-MnO2 platform for glutathione detection

    NASA Astrophysics Data System (ADS)

    Xu, Yang; Chen, Xi; Chai, Ran; Xing, Chengfen; Li, Huanrong; Yin, Xue-Bo

    2016-07-01

    A novel magnetic/fluorometric bimodal sensor was built from carbon dots (CDs) and MnO2. The resulting sensor was sensitive to glutathione (GSH), leading to apparent enhancement of magnetic resonance (MR) and fluorescence signals along with visual changes. The bimodal detection strategy is based on the decomposition of the CDs-MnO2 through a redox reaction between GSH and MnO2. This process causes the transformation from non-MR-active MnO2 to MR-active Mn2+, and is accompanied by fluorescence restoration of CDs. Compared with a range of other CDs, the polyethylenimine (PEI) passivated CDs (denoted as pCDs) were suitable for detection due to their positive surface potential. Cross-validation between MR and fluorescence provided detailed information regarding the MnO2 reduction process, and revealed the three distinct stages of the redox process. Thus, the design of a CD-based sensor for the magnetic/fluorometric bimodal detection of GSH was emphasized for the first time. This platform showed a detection limit of 0.6 μM with a linear range of 1-200 μM in the fluorescence mode, while the MR mode exhibited a linear range of 5-200 μM and a GSH detection limit of 2.8 μM, with a visible change being observed rapidly at 1 μM in the MR images. Furthermore, the introduction of the MR mode allowed the biothiols to be easily identified. The integration of CD fluorescence with an MR response was demonstrated to be promising for providing detailed information and discriminating power, and therefore extends the application of CDs in sensing and imaging.

  16. Development of integrated semiconductor optical sensors for functional brain imaging

    NASA Astrophysics Data System (ADS)

    Lee, Thomas T.

    Optical imaging of neural activity is a widely accepted technique for imaging brain function in the field of neuroscience research, and has been used to study the cerebral cortex in vivo for over two decades. Maps of brain activity are obtained by monitoring intensity changes in back-scattered light, called Intrinsic Optical Signals (IOS), that correspond to fluctuations in blood oxygenation and volume associated with neural activity. Current imaging systems typically employ bench-top equipment including lamps and CCD cameras to study animals using visible light. Such systems require the use of anesthetized or immobilized subjects with craniotomies, which imposes limitations on the behavioral range and duration of studies. The ultimate goal of this work is to overcome these limitations by developing a single-chip semiconductor sensor using arrays of sources and detectors operating at near-infrared (NIR) wavelengths. A single-chip implementation, combined with wireless telemetry, will eliminate the need for immobilization or anesthesia of subjects and allow in vivo studies of free behavior. NIR light offers additional advantages because it experiences less absorption in animal tissue than visible light, which allows for imaging through superficial tissues. This, in turn, reduces or eliminates the need for traumatic surgery and enables long-term brain-mapping studies in freely-behaving animals. This dissertation concentrates on key engineering challenges of implementing the sensor. This work shows the feasibility of using a GaAs-based array of vertical-cavity surface emitting lasers (VCSELs) and PIN photodiodes for IOS imaging. I begin with in-vivo studies of IOS imaging through the skull in mice, and use these results along with computer simulations to establish minimum performance requirements for light sources and detectors. I also evaluate the performance of a current commercial VCSEL for IOS imaging, and conclude with a proposed prototype sensor.

  17. Nanophotonic Image Sensors.

    PubMed

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability, and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors, and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next-generation image sensors beyond Moore's Law expectations.

  18. Design of a temperature control system using incremental PID algorithm for a special homemade shortwave infrared spatial remote sensor based on FPGA

    NASA Astrophysics Data System (ADS)

    Xu, Zhipeng; Wei, Jun; Li, Jianwei; Zhou, Qianting

    2010-11-01

    An image spectrometer on a spatial remote sensing satellite requires a shortwave band from 2.1 μm to 3 μm, one of the most important bands in remote sensing. We designed the infrared sub-system of the image spectrometer using a homemade 640×1 InGaAs shortwave infrared sensor operating in a focal plane array (FPA) system, which requires high uniformity and a low level of dark current. The working temperature should be -15 +/- 0.2 degrees Celsius. This paper studies the noise model of the FPA system, investigates the relationship between temperature and dark-current noise, and adopts an incremental PID algorithm to generate the PWM wave that controls the temperature of the sensor. The FPGA design is composed of four modules, all coded in VHDL and implemented in an APA300 FPGA device. Experiments show that the intelligent temperature control system succeeds in controlling the temperature of the sensor.
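
For reference, the incremental (velocity-form) PID update computes a change in output rather than the output itself, which suits PWM duty-cycle control; the gains below are illustrative, not the paper's tuned values:

```python
# Incremental PID:
#   du(k) = Kp*(e(k)-e(k-1)) + Ki*e(k) + Kd*(e(k)-2e(k-1)+e(k-2))
class IncrementalPID:
    def __init__(self, kp, ki, kd, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = self.e2 = 0.0          # e(k-1), e(k-2)
        self.u = 0.0                     # current PWM duty cycle
        self.u_min, self.u_max = u_min, u_max

    def update(self, setpoint, measured):
        # Sign convention depends on whether the actuator heats or cools.
        e = setpoint - measured
        du = (self.kp * (e - self.e1)
              + self.ki * e
              + self.kd * (e - 2 * self.e1 + self.e2))
        self.u = min(max(self.u + du, self.u_min), self.u_max)  # clamp duty
        self.e2, self.e1 = self.e1, e
        return self.u                    # duty cycle driving the PWM wave

# e.g. hold the sensor at -15 C: pid = IncrementalPID(0.5, 0.05, 0.1)
#      duty = pid.update(-15.0, temperature_reading)
```

The incremental form maps naturally onto FPGA fixed-point arithmetic, since only differences and one accumulator register are needed.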

  19. Vector Acoustics, Vector Sensors, and 3D Underwater Imaging

    NASA Astrophysics Data System (ADS)

    Lindwall, D.

    2007-12-01

    Vector acoustic data has two more dimensions of information than pressure data and may allow for 3D underwater imaging with much less data than with hydrophone data. A vector acoustic sensor measures the particle motions due to passing sound waves and, in conjunction with a collocated hydrophone, the direction of travel of the sound waves. When using a controlled source with known source and sensor locations, the reflection points of the sound field can be determined with a simple trigonometric calculation, as sketched below. I demonstrate this concept with an experiment that used an accelerometer-based vector acoustic sensor in a water tank with a short-pulse source and passive scattering targets. The sensor consists of a three-axis accelerometer and a matched hydrophone. The sound source was a standard transducer driven by a short 7 kHz pulse. The sensor was suspended in a fixed location and the source was moved about the tank by a robotic arm to insonify the tank from many locations. Several floats were placed in the tank as acoustic targets at diagonal ranges of approximately one meter. The accelerometer data show the direct source wave as well as the target-scattered waves and reflections from the nearby water surface, tank bottom, and sides. Without resorting to the usual methods of seismic imaging, which in this case would be only two dimensional and would rely entirely on the use of a synthetic source aperture, the two targets, the tank walls, the tank bottom, and the water surface were imaged. A directional ambiguity inherent to vector sensors is removed by using the collocated hydrophone data. Although this experiment was in a very simple environment, it suggests that 3D seismic surveys may be achieved with vector sensors using the same logistics as a 2D survey that uses conventional hydrophones. This work was supported by the Office of Naval Research, program element 61153N.
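
Our reconstruction of that trigonometric calculation (not the author's code): the measured direction of arrival places the scatterer on a ray from the receiver, and the total path length fixes the distance along that ray.

```python
import numpy as np

def reflection_point(S, Q, d, travel_time, c=1482.0):
    """Source S, receiver Q, unit direction-of-arrival d at the receiver,
    total travel time; c = assumed sound speed in water (m/s). The
    scatterer P = Q + r*d satisfies |S-P| + |P-Q| = L = c*t, which gives
    r = (L^2 - |S-Q|^2) / (2*(L - d.(S-Q)))."""
    S, Q, d = map(np.asarray, (S, Q, d))
    L = c * travel_time
    sq = S - Q
    r = (L ** 2 - sq @ sq) / (2.0 * (L - d @ sq))
    return Q + r * d

# Check: target at (1, 0.5, 0), source at origin, sensor at (2, 0, 0).
P_true = np.array([1.0, 0.5, 0.0])
S, Q = np.zeros(3), np.array([2.0, 0.0, 0.0])
t = (np.linalg.norm(P_true - S) + np.linalg.norm(P_true - Q)) / 1482.0
d = (P_true - Q) / np.linalg.norm(P_true - Q)
print(reflection_point(S, Q, d, t))      # -> [1.  0.5 0. ]
```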

  20. Monitoring on-orbit calibration stability of the Terra MODIS and Landsat 7 ETM+ sensors using pseudo-invariant test sites

    USGS Publications Warehouse

    Chander, G.; Xiong, X.(J.); Choi, T.(J.); Angal, A.

    2010-01-01

    The ability to detect and quantify changes in the Earth's environment depends on sensors that can provide calibrated, consistent measurements of the Earth's surface features through time. A critical step in this process is to put image data from different sensors onto a common radiometric scale. This work focuses on monitoring the long-term on-orbit calibration stability of the Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) sensors using the Committee on Earth Observation Satellites (CEOS) reference standard pseudo-invariant test sites (Libya 4, Mauritania 1/2, Algeria 3, Libya 1, and Algeria 5). These sites have been frequently used as radiometric targets because of their temporally stable surface conditions. This study was performed using all cloud-free calibrated images from the Terra MODIS and the L7 ETM+ sensors acquired from launch to December 2008. Homogeneous regions of interest (ROI) were selected in the calibrated images, and the mean target statistics were derived from sensor measurements in terms of top-of-atmosphere (TOA) reflectance. For each band pair, a set of fitted coefficients (slope and offset) is provided to monitor the long-term stability over the very stable pseudo-invariant test sites. The average percent differences in intercept of the long-term trends obtained from the ETM+ TOA reflectance estimates relative to MODIS over all the CEOS reference standard test sites range from 2.5% to 15%. This gives an estimate of the collective differences due to the Relative Spectral Response (RSR) characteristics of each sensor, the bi-directional reflectance distribution function (BRDF), the spectral signature of the ground target, and atmospheric composition. The lifetime TOA reflectance trends from both sensors over 10 years are extremely stable, changing by no more than 0.4% per year in TOA reflectance over the CEOS reference standard test sites.
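
A minimal sketch of the trending step, with illustrative synthetic data (the study's actual regression details are not given in the abstract):

```python
import numpy as np

def trend_percent_per_year(days_since_launch, toa_reflectance):
    """Fit slope/offset to one sensor's time series of ROI-mean TOA
    reflectance over a site, and express stability as % change per year."""
    slope, offset = np.polyfit(days_since_launch, toa_reflectance, 1)
    return 100.0 * slope * 365.25 / offset

# e.g. a stable desert site drifting by 0.4%/yr would look like:
days = np.arange(0, 3650, 16)                  # one sample per 16-day cycle
refl = 0.55 * (1 + 0.004 * days / 365.25)
print(f"{trend_percent_per_year(days, refl):.2f} %/yr")   # -> 0.40
```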

  1. Near-infrared fluorescence goggle system with complementary metal–oxide–semiconductor imaging sensor and see-through display

    PubMed Central

    Liu, Yang; Njuguna, Raphael; Matthews, Thomas; Akers, Walter J.; Sudlow, Gail P.; Mondal, Suman; Tang, Rui

    2013-01-01

    Abstract. We have developed a near-infrared (NIR) fluorescence goggle system based on the complementary metal–oxide–semiconductor active pixel sensor imaging and see-through display technologies. The fluorescence goggle system is a compact wearable intraoperative fluorescence imaging and display system that can guide surgery in real time. The goggle is capable of detecting fluorescence of indocyanine green solution in the picomolar range. Aided by NIR quantum dots, we successfully used the fluorescence goggle to guide sentinel lymph node mapping in a rat model. We further demonstrated the feasibility of using the fluorescence goggle in guiding surgical resection of breast cancer metastases in the liver in conjunction with NIR fluorescent probes. These results illustrate the diverse potential use of the goggle system in surgical procedures. PMID:23728180

  2. Flight Results from the HST SM4 Relative Navigation Sensor System

    NASA Technical Reports Server (NTRS)

    Naasz, Bo; Eepoel, John Van; Queen, Steve; Southward, C. Michael; Hannah, Joel

    2010-01-01

    On May 11, 2009, Space Shuttle Atlantis roared off Launch Pad 39A en route to the Hubble Space Telescope (HST) to undertake its final servicing of HST, Servicing Mission 4. Onboard Atlantis was a small payload called the Relative Navigation Sensor experiment, which included three cameras of varying focal ranges and avionics to record images and estimate, in real time, the relative position and attitude (aka "pose") of the telescope during rendezvous and deployment. The avionics package, known as SpaceCube and developed at the Goddard Space Flight Center, performed image processing using field programmable gate arrays to accelerate this process and, in addition, executed two different pose algorithms in parallel: the Goddard Natural Feature Image Recognition algorithm and the ULTOR Passive Pose and Position Engine (P3E) algorithm.

  3. Few-photon color imaging using energy-dispersive superconducting transition-edge sensor spectrometry

    NASA Astrophysics Data System (ADS)

    Niwa, Kazuki; Numata, Takayuki; Hattori, Kaori; Fukuda, Daiji

    2017-04-01

    Highly sensitive spectral imaging is increasingly being demanded in bioanalysis research and industry to obtain the maximum information possible from molecules of different colors. We introduce an application of the superconducting transition-edge sensor (TES) technique to highly sensitive spectral imaging. A TES is an energy-dispersive photodetector that can distinguish the wavelength of each incident photon. Its effective spectral range is from the visible to the infrared (IR), up to 2800 nm, which is beyond the capabilities of other photodetectors. TES was employed in this study in a fiber-coupled optical scanning microscopy system, and a test sample of a three-color ink pattern was observed. A red-green-blue (RGB) image and a near-IR image were successfully obtained in the few-incident-photon regime, whereas only a black and white image could be obtained using a photomultiplier tube. Spectral data were also obtained from a selected focal area out of the entire image. The results of this study show that TES is feasible for use as an energy-dispersive photon-counting detector in spectral imaging applications.

  4. Few-photon color imaging using energy-dispersive superconducting transition-edge sensor spectrometry.

    PubMed

    Niwa, Kazuki; Numata, Takayuki; Hattori, Kaori; Fukuda, Daiji

    2017-04-04

    Highly sensitive spectral imaging is increasingly being demanded in bioanalysis research and industry to obtain the maximum information possible from molecules of different colors. We introduce an application of the superconducting transition-edge sensor (TES) technique to highly sensitive spectral imaging. A TES is an energy-dispersive photodetector that can distinguish the wavelength of each incident photon. Its effective spectral range is from the visible to the infrared (IR), up to 2800 nm, which is beyond the capabilities of other photodetectors. TES was employed in this study in a fiber-coupled optical scanning microscopy system, and a test sample of a three-color ink pattern was observed. A red-green-blue (RGB) image and a near-IR image were successfully obtained in the few-incident-photon regime, whereas only a black and white image could be obtained using a photomultiplier tube. Spectral data were also obtained from a selected focal area out of the entire image. The results of this study show that TES is feasible for use as an energy-dispersive photon-counting detector in spectral imaging applications.

  5. 3D Capturing Performances of Low-Cost Range Sensors for Mass-Market Applications

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Gonizzi, S.; Micoli, L.

    2016-06-01

    Since the advent of the first Kinect as a motion controller device for the Microsoft XBOX platform (November 2010), several similar active, low-cost range sensing devices have been introduced on the mass market for purposes including gesture-based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design, and proximity sensing for automotive applications. However, given their capability to generate a real-time stream of range images, these devices have also been used in some projects as general-purpose range devices, with performance that might be satisfactory for some applications. This paper describes the working principles of the various devices and analyzes them in terms of systematic and random errors to explore their applicability to standard 3D capturing problems. Five actual devices featuring three different technologies have been tested: i) Kinect V1 by Microsoft, Structure Sensor by Occipital, and Xtion PRO by ASUS, all based on different implementations of the Primesense sensor; ii) F200 by Intel/Creative, implementing the RealSense pattern-projection technology; iii) Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results first compares the devices and then identifies the range of applications for which they could actually work as a viable solution.

  6. Photoacoustic imaging with planoconcave optical microresonator sensors: feasibility studies based on phantom imaging

    NASA Astrophysics Data System (ADS)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2017-03-01

    The planar Fabry-Pérot (FP) sensor provides high-quality photoacoustic (PA) images, but beam walk-off limits sensitivity and thus penetration depth to ≈1 cm. Planoconcave microresonator sensors eliminate beam walk-off, enabling sensitivity to be increased by an order of magnitude whilst retaining the highly favourable frequency response and directional characteristics of the FP sensor. The first tomographic PA images obtained in a tissue-realistic phantom using the new sensors are described. These show that the microresonator sensors provide near-identical image quality to the planar FP sensor but with significantly greater penetration depth (e.g. 2-3 cm) due to their higher sensitivity. This offers the prospect of whole-body small animal imaging and clinical imaging to depths previously unattainable with the planar FP sensor.

  7. Shack-Hartmann wavefront sensor with large dynamic range by adaptive spot search method.

    PubMed

    Shinto, Hironobu; Saita, Yusuke; Nomura, Takanori

    2016-07-10

    A Shack-Hartmann wavefront sensor (SHWFS) that consists of a microlens array and an image sensor has been used to measure the wavefront aberrations of human eyes. However, a conventional SHWFS has a finite dynamic range that depends on the diameter of each microlens, and the dynamic range cannot easily be expanded without a decrease in spatial resolution. In this study, an adaptive spot search method to expand the dynamic range of an SHWFS is proposed. In the proposed method, spots are searched for with the help of their approximate displacements, measured with low spatial resolution and large dynamic range. With the proposed method, a wavefront can be correctly measured even if a spot moves beyond its detection area. The adaptive spot search method is realized by using a special microlens array that generates both spots and discriminable patterns. The proposed method enables expanding the dynamic range of an SHWFS with a single shot and a short processing time. The performance of the proposed method is compared with that of a conventional SHWFS by optical experiments. Furthermore, the dynamic range of the proposed method is quantitatively evaluated by numerical simulations.
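
For context, the conventional SHWFS computation that the spot-search method extends is simple; a sketch with illustrative names follows (the paper's adaptive search itself is not reproduced):

```python
import numpy as np

def centroid(sub_image):
    """Intensity-weighted centroid of one sub-aperture image (pixels)."""
    total = sub_image.sum()
    ys, xs = np.indices(sub_image.shape)
    return np.array([(xs * sub_image).sum(), (ys * sub_image).sum()]) / total

def local_slopes(spot_xy, ref_xy, focal_length):
    """spot_xy, ref_xy: (N, 2) measured and reference spot centroids in
    metres. The local wavefront slope (radians) is the spot displacement
    divided by the microlens focal length."""
    return (np.asarray(spot_xy) - np.asarray(ref_xy)) / focal_length
```

The finite dynamic range arises because this mapping is only unambiguous while each spot stays inside its own lenslet's sub-aperture, which is exactly the restriction the adaptive spot search removes.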

  8. Biomedical Imaging

    DTIC Science & Technology

    1994-04-01

    United States Army Aeromedical Research Laboratory, Fort Rucker, Alabama 36362-0577. Approved for public release; distribution unlimited. [The rest of this DTIC record is garbled report front matter; the recoverable abstract fragments read: "... times larger. Usually they are expensive with commercially available units starting at around $100,000. Triangulation sensors are capable of range ..."]

  9. Measuring noise equivalent irradiance of a digital short-wave infrared imaging system using a broadband source to simulate the night spectrum

    NASA Astrophysics Data System (ADS)

    Green, John R.; Robinson, Timothy

    2015-05-01

    There is a growing interest in developing helmet-mounted digital imaging systems (HMDIS) for integration into military aircraft cockpits. This interest stems from the multiple advantages of digital vs. analog imaging, such as image fusion from multiple sensors, data processing to enhance the image contrast, superposition of non-imaging data over the image, and sending images to a remote location for analysis. There are several properties an HMDIS must have in order to aid the pilot during night operations. In addition to the resolution, image refresh rate, dynamic range, and sensor uniformity over the entire Focal Plane Array (FPA), the imaging system must have the sensitivity to detect the limited night light available filtered through cockpit transparencies. Digital sensor sensitivity is generally measured monochromatically using a laser with a wavelength near the peak detector quantum efficiency, and is generally reported as either the Noise Equivalent Power (NEP) or Noise Equivalent Irradiance (NEI). This paper proposes a test system that measures NEI of Short-Wave Infrared (SWIR) digital imaging systems using a broadband source that simulates the night spectrum. This method has a few advantages over a monochromatic method: the test conditions provide a spectrum closer to that experienced by the end-user, and the resulting NEI may be compared directly to modeled night-glow irradiance calculations. This comparison may be used to assess the Technology Readiness Level of the imaging system for the application. The test system is being developed under a Cooperative Research and Development Agreement (CRADA) with the Air Force Research Laboratory.
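
A hedged sketch of one common NEI definition, expressing the noise-equivalent photon irradiance from measured camera parameters (the paper's broadband procedure may differ):

```python
def nei_photons_per_cm2_s(noise_electrons, qe, pixel_area_cm2, t_int_s):
    """NEI in photons / (cm^2 * s): the photon irradiance that produces a
    signal equal to the dark/read noise in one integration time."""
    return noise_electrons / (qe * pixel_area_cm2 * t_int_s)

# Example SWIR sensor: 50 e- read noise, QE 0.75, 15 um pitch, 1/60 s:
pitch_cm = 15e-4
print(f"{nei_photons_per_cm2_s(50, 0.75, pitch_cm**2, 1/60):.3e}")
```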

  10. High-speed Imaging of Global Surface Temperature Distributions on Hypersonic Ballistic-Range Projectiles

    NASA Technical Reports Server (NTRS)

    Wilder, Michael C.; Reda, Daniel C.

    2004-01-01

    The NASA-Ames ballistic range provides a unique capability for aerothermodynamic testing of configurations in hypersonic, real-gas, free-flight environments. The facility can closely simulate conditions at any point along practically any trajectory of interest experienced by a spacecraft entering an atmosphere. Sub-scale models of blunt atmospheric entry vehicles are accelerated by a two-stage light-gas gun to speeds as high as 20 times the speed of sound to fly ballistic trajectories through a 24 m long vacuum-rated test section. The test-section pressure (effective altitude), the launch velocity of the model (flight Mach number), and the test-section working gas (planetary atmosphere) are independently variable. The model travels at hypersonic speeds through a quiescent test gas, creating a strong bow-shock wave and real-gas effects that closely match conditions achieved during actual atmospheric entry. The challenge with ballistic range experiments is to obtain quantitative surface measurements from a model traveling at hypersonic speeds. The models are relatively small (less than 3.8 cm in diameter), which limits the spatial resolution possible with surface-mounted sensors. Furthermore, since the model is in flight, surface-mounted sensors require some form of on-board telemetry, which must survive the massive acceleration loads experienced during launch (up to 500,000 gravities). Finally, the model and any on-board instrumentation will be destroyed at the terminal wall of the range. For these reasons, optical measurement techniques are the most practical means of acquiring data. High-speed thermal imaging has been employed in the Ames ballistic range to measure global surface temperature distributions and to visualize the onset of transition to turbulent flow on the forward regions of hypersonic blunt bodies. Both visible-wavelength and infrared high-speed cameras are in use. The visible-wavelength cameras are intensified CCD imagers capable of integration times as short as 2 ns. The infrared camera uses an Indium Antimonide (InSb) sensor in the 3 to 5 micron band and is capable of integration times as short as 500 ns. The projectiles are imaged nearly head-on using expendable mirrors offset slightly from the flight path. The proposed paper will discuss the application of high-speed digital imaging systems in the NASA-Ames hypersonic ballistic range, and the challenges encountered when applying these systems. Example images of the thermal radiation from the blunt nose of projectiles flying at nearly 14 times the speed of sound will be given.

  11. Novel detection schemes of nuclear magnetic resonance and magnetic resonance imaging: applications from analytical chemistry to molecular sensors.

    PubMed

    Harel, Elad; Schröder, Leif; Xu, Shoujun

    2008-01-01

    Nuclear magnetic resonance (NMR) is a well-established analytical technique in chemistry. The ability to precisely control the nuclear spin interactions that give rise to the NMR phenomenon has led to revolutionary advances in fields as diverse as protein structure determination and medical diagnosis. Here, we discuss methods for increasing the sensitivity of magnetic resonance experiments, moving away from the paradigm of traditional NMR by separating the encoding and detection steps of the experiment. This added flexibility allows for diverse applications ranging from lab-on-a-chip flow imaging and biological sensors to optical detection of magnetic resonance imaging at low magnetic fields. We aim to compare and discuss various approaches for a host of problems in material science, biology, and physics that differ from the high-field methods routinely used in analytical chemistry and medical imaging.

  12. Sensor fusion of range and reflectance data for outdoor scene analysis

    NASA Technical Reports Server (NTRS)

    Kweon, In So; Hebert, Martial; Kanade, Takeo

    1988-01-01

    In recognizing objects in an outdoor scene, range and reflectance (or color) data provide complementary information. Results of experiments in recognizing outdoor scenes containing roads, trees, and cars are presented. The recognition program uses range and reflectance data obtained by a scanning laser range finder, as well as color data from a color TV camera. After segmentation of each image into primitive regions, models of objects are matched using various properties.

  13. Helmet-Mounted Displays: Sensation, Perception and Cognition Issues

    DTIC Science & Technology

    2009-01-01

    [Garbled DTIC search-result snippets; recoverable fragments:] "... system-of-systems." One integral system is a "head-borne vision enhancement" system (an HMD) that provides fused I2/IR sensor imagery (U.S. Army Natick ...). Using microwave, radar, I2, infrared (IR), and other technology-based imaging sensors, the "seeing" range of the human eye is extended into the ... Cited reference fragment: Helmetag, A., Halbig, C., Kubbat, W., and Schmidt, R. (1999); web site: http://www.metavr.com/technology/papers/syntheticvision.html

  14. Proximity Operations and Docking Sensor Development

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Bryan, Thomas C.; Brewster, Linda L.; Lee, James E.

    2009-01-01

    The Next Generation Advanced Video Guidance Sensor (NGAVGS) has been under development for the last three years as a long-range proximity operations and docking sensor for use in an Automated Rendezvous and Docking (AR&D) system. The first autonomous rendezvous and docking in the history of the U.S. Space Program was successfully accomplished by Orbital Express, using the Advanced Video Guidance Sensor (AVGS) as the primary docking sensor. That flight proved that the United States now has a mature and flight-proven sensor technology for supporting Crew Exploration Vehicle (CEV) and Commercial Orbital Transport Systems (COTS) Automated Rendezvous and Docking (AR&D). NASA video sensors have worked well in the past: the AVGS used on the Demonstration of Autonomous Rendezvous Technology (DART) mission operated successfully in spot mode out to 2 km, and the first-generation rendezvous and docking sensor, the Video Guidance Sensor (VGS), was developed and successfully flown on Space Shuttle flights in 1997 and 1998. Parts obsolescence issues prevent the construction of more AVGS units, and the next-generation sensor was updated to allow it to support the CEV and COTS programs. The flight-proven AR&D sensor has been redesigned to update parts and add additional capabilities for CEV and COTS with the development of the Next Generation AVGS at the Marshall Space Flight Center. The obsolete imager and processor are being replaced with new radiation-tolerant parts. In addition, new capabilities include greater sensor range, auto-ranging capability, and real-time video output. This paper presents some sensor hardware trades, the use of highly integrated laser components, and the needs of future vehicles that may rendezvous and dock with the International Space Station (ISS) and other Constellation vehicles. It also discusses approaches for upgrading AVGS to address parts obsolescence, and concepts for minimizing the sensor footprint, weight, and power requirements. In addition, the testing of the brassboard and prototype NGAVGS units will be discussed, along with the use of the NGAVGS as a proximity operations and docking sensor.

  15. Concept of electro-optical sensor module for sniper detection system

    NASA Astrophysics Data System (ADS)

    Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz

    2010-10-01

    The paper presents an initial concept of an electro-optical sensor unit for sniper detection purposes. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. As part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data- and sensor-fusion techniques. Additionally, it is expected to provide some pre-shot detection capabilities. Generally, acoustic (or radar) systems used for shot detection offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures or at least the presence of suspect objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters, such as focal plane array size and type, focal length, and aperture, were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. To provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image-acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot-recording sequences are presented.
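
The link between FPA pitch, focal length, and detection range mentioned above can be illustrated with back-of-envelope geometry; the pixels-on-target criterion and the numbers are assumptions, not the paper's values:

```python
def max_detection_range(target_size_m, focal_length_m, pixel_pitch_m,
                        pixels_required=2.0):
    """A target of size W subtends W*f/R on the focal plane; a
    Johnson-style criterion demands some minimum number of pixels across
    it, bounding the usable range."""
    return target_size_m * focal_length_m / (pixels_required * pixel_pitch_m)

# e.g. a 0.5 m target, 100 mm optics, 17 um pitch, 2 pixels for detection:
print(f"{max_detection_range(0.5, 0.100, 17e-6):.0f} m")   # ~1470 m
```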

  16. Multi-Sensor Mud Detection

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    Robust mud detection is a critical perception requirement for Unmanned Ground Vehicle (UGV) autonomous offroad navigation. A military UGV stuck in a mud body during a mission may have to be sacrificed or rescued, both of which are unattractive options. There are several characteristics of mud that may be detectable with appropriate UGV-mounted sensors. For example, mud only occurs on the ground surface, is cooler than surrounding dry soil during the daytime under nominal weather conditions, is generally darker than surrounding dry soil in visible imagery, and is highly polarized. However, none of these cues are definitive on their own. Dry soil also occurs on the ground surface; shadows, snow, ice, and water can also be cooler than surrounding dry soil; shadows are also darker than surrounding dry soil in visible imagery; and cars, water, and some vegetation are also highly polarized. Shadows, snow, ice, water, cars, and vegetation can all be disambiguated from mud by using a suite of sensors that span multiple bands in the electromagnetic spectrum. Because there are military operations when it is imperative for UGVs to operate without emitting strong, detectable electromagnetic signals, passive sensors are desirable. JPL has developed a daytime mud detection capability using multiple passive imaging sensors. Cues for mud from multiple passive imaging sensors are fused into a single mud detection image using a rule base, and the resultant mud detection is localized in a terrain map using range data generated from a stereo pair of color cameras.
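
An illustrative rule base in the spirit described (the actual JPL rules are not given in this abstract):

```python
import numpy as np

def fuse_mud_cues(on_ground, cooler_than_soil, darker_than_soil,
                  highly_polarized, is_vegetation, is_sky):
    """Each argument is an (H, W) bool cue image derived from one passive
    sensor (thermal, color stereo, polarization). Mud is declared only
    where the supporting cues agree and disqualifying classes do not."""
    candidate = (on_ground & cooler_than_soil
                 & darker_than_soil & highly_polarized)
    return candidate & ~is_vegetation & ~is_sky   # veto ambiguous classes
```

The resulting boolean mud image can then be projected into the terrain map using the stereo range data, as the abstract describes.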

  17. A monolithic 640 × 512 CMOS imager with high-NIR sensitivity

    NASA Astrophysics Data System (ADS)

    Lauxtermann, Stefan; Fisher, John; McDougal, Michael

    2014-06-01

    In this paper we present first results from a backside-illuminated CMOS image sensor fabricated on high-resistivity silicon. Compared to conventional CMOS imagers, a thicker photosensitive membrane can be depleted when using silicon with a low background doping concentration, while maintaining low dark current and good MTF performance. The benefits of such a fully depleted silicon sensor are high quantum efficiency over a wide spectral range and a fast photodetector response. Combining these characteristics with the circuit complexity and manufacturing maturity available from a modern mixed-signal CMOS technology leads to a new type of sensor with an unprecedented performance spectrum in a monolithic device. Our fully depleted, backside-illuminated CMOS sensor was designed to operate at integration times down to 100 ns and frame rates up to 1000 Hz. Noise in Integrate While Read (IWR) snapshot shutter operation under these conditions was simulated to be below 10 e- at room temperature. 2×2 binning with a 4× increase in sensitivity and a maximum frame rate of 4000 Hz is supported. For application in hyperspectral imaging systems, the full well capacity in each row can be individually programmed to 10 ke-, 60 ke-, or 500 ke-. On test structures we measured a room-temperature dark current of 360 pA/cm2 at a reverse bias of 3.3 V. A peak quantum efficiency of 80% was measured with a single-layer AR coating on the backside. Test images captured with the 50 μm thick VGA imager at frame rates between 30 Hz and 90 Hz show a strong response at NIR wavelengths.

  18. Airborne digital-image data for monitoring the Colorado River corridor below Glen Canyon Dam, Arizona, 2009 - Image-mosaic production and comparison with 2002 and 2005 image mosaics

    USGS Publications Warehouse

    Davis, Philip A.

    2012-01-01

    Airborne digital-image data were collected for the Arizona part of the Colorado River ecosystem below Glen Canyon Dam in 2009. These four-band image data are similar in wavelength band (blue, green, red, and near infrared) and spatial resolution (20 centimeters) to image collections of the river corridor in 2002 and 2005. These periodic image collections are used by the Grand Canyon Monitoring and Research Center (GCMRC) of the U.S. Geological Survey to monitor the effects of Glen Canyon Dam operations on the downstream ecosystem. The 2009 collection used the latest model of the Leica ADS40 airborne digital sensor (the SH52), which uses a single optic for all four bands and collects and stores band radiance in 12-bits, unlike the image sensors that GCMRC used in 2002 and 2005. This study examined the performance of the SH52 sensor, on the basis of the collected image data, and determined that the SH52 sensor provided superior data relative to the previously employed sensors (that is, an early ADS40 model and Zeiss Imaging's Digital Mapping Camera) in terms of band-image registration, dynamic range, saturation, linearity to ground reflectance, and noise level. The 2009 image data were provided as orthorectified segments of each flightline to constrain the size of the image files; each river segment was covered by 5 to 6 overlapping, linear flightlines. Most flightline images for each river segment had some surface-smear defects and some river segments had cloud shadows, but these two conditions did not generally coincide in the majority of the overlapping flightlines for a particular river segment. Therefore, the final image mosaic for the 450-kilometer (km)-long river corridor required careful selection and editing of numerous flightline segments (a total of 513 segments, each 3.2 km long) to minimize surface defects and cloud shadows. The final image mosaic has a total of only 3 km of surface defects. The final image mosaic for the western end of the corridor has areas of cloud shadow because of persistent inclement weather during data collection. This report presents visual comparisons of the 2002, 2005, and 2009 digital-image mosaics for various physical, biological, and cultural resources within the Colorado River ecosystem. All of the comparisons show the superior quality of the 2009 image data. In fact, the 2009 four-band image mosaic is perhaps the best image dataset that exists for the entire Arizona part of the Colorado River.

  19. Design considerations for a new, high resolution Micro-Angiographic Fluoroscope based on a CMOS sensor (MAF-CMOS).

    PubMed

    Loughran, Brendan; Swetadri Vasan, S N; Singh, Vivek; Ionita, Ciprian N; Jain, Amit; Bednarek, Daniel R; Titus, Albert; Rudin, Stephen

    2013-03-06

    The detectors that are used for endovascular image-guided interventions (EIGI), particularly for neurovascular interventions, do not provide clinicians with adequate visualization to ensure the best possible treatment outcomes. Developing an improved x-ray imaging detector requires the determination of estimated clinical x-ray entrance exposures to the detector. The range of exposures to the detector in clinical studies was found for the three modes of operation: fluoroscopic mode, high frame-rate digital angiographic mode (HD fluoroscopic mode), and DSA mode. Using these estimated detector exposure ranges and available CMOS detector technical specifications, design requirements were developed to pursue a quantum limited, high resolution, dynamic x-ray detector based on a CMOS sensor with 50 μm pixel size. For the proposed MAF-CMOS, the estimated charge collected within the full exposure range was found to be within the estimated full well capacity of the pixels. Expected instrumentation noise for the proposed detector was estimated to be 50-1,300 electrons. Adding a gain stage such as a light image intensifier would minimize the effect of the estimated instrumentation noise on total image noise but may not be necessary to ensure quantum limited detector operation at low exposure levels. A recursive temporal filter may decrease the effective total noise by 2 to 3 times, allowing for the improved signal to noise ratios at the lowest estimated exposures despite consequent loss in temporal resolution. This work can serve as a guide for further development of dynamic x-ray imaging prototypes or improvements for existing dynamic x-ray imaging systems.
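    The recursive temporal filter is described only in terms of its noise benefit; a minimal sketch of one common realization, a first-order recursive (IIR) running average, is given below. The filter weight alpha is a hypothetical parameter, not a value taken from the paper.

        import numpy as np

        def recursive_temporal_filter(frames, alpha=0.25):
            """First-order IIR temporal filter: out_t = alpha*frame_t + (1-alpha)*out_(t-1).

            For uncorrelated frame noise, the steady-state noise variance is reduced
            by a factor alpha / (2 - alpha), so the noise standard deviation drops by
            roughly 2-3x for alpha around 0.2-0.4, at the cost of temporal lag.
            """
            out = np.empty(np.shape(frames), dtype=np.float64)
            acc = np.asarray(frames[0], dtype=np.float64)
            out[0] = acc
            for t in range(1, len(frames)):
                acc = alpha * np.asarray(frames[t], dtype=np.float64) + (1.0 - alpha) * acc
                out[t] = acc
            return out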

  20. Calibration procedures for imaging spectrometers: improving data quality from satellite missions to UAV campaigns

    NASA Astrophysics Data System (ADS)

    Brachmann, Johannes F. S.; Baumgartner, Andreas; Lenhard, Karim

    2016-10-01

    The Calibration Home Base (CHB) at the Remote Sensing Technology Institute of the German Aerospace Center (DLR-IMF) is an optical laboratory designed for the calibration of imaging spectrometers for the VNIR/SWIR wavelength range. Radiometric, spectral and geometric characterization is realized in the CHB in a precise and highly automated fashion, which allows a wide range of time-consuming measurements to be performed efficiently. The implementation of ISO 9001 standards ensures a traceable quality of results. DLR-IMF will support the calibration and characterization campaign of the future German spaceborne hyperspectral imager EnMAP. In the context of this activity, a procedure for the correction of imaging artifacts, such as those due to stray light, is currently being developed by DLR-IMF. The goal is the correction of in-band stray light as well as ghost images down to a level of a few digital numbers over the whole wavelength range of 420-2450 nm. DLR-IMF owns a Norsk Elektro Optikk HySpex airborne imaging spectrometer system that has been thoroughly characterized. This system will be used to test stray light calibration procedures for EnMAP. Hyperspectral snapshot sensors offer the possibility to simultaneously acquire hyperspectral data in two dimensions. These rather new spectrometers have recently attracted much interest in the remote sensing community. Different designs are currently used for local area observation, such as from small unmanned aerial vehicles (sUAV). In this context, the CHB's measurement capabilities are currently being extended so that a standard measurement procedure for these new sensors can be implemented.

  1. Self-amplified CMOS image sensor using a current-mode readout circuit

    NASA Astrophysics Data System (ADS)

    Santos, Patrick M.; de Lima Monteiro, Davies W.; Pittet, Patrick

    2014-05-01

    The feature size of CMOS processes has decreased over the past few years, and problems such as reduced dynamic range have become more significant in voltage-mode pixels, even as integrating more functionality inside the pixel has become easier. This work contributes on both fronts: a wide signal excursion range through current-mode circuits, and added functionality through in-pixel signal amplification. The classic 3T pixel architecture was rebuilt with small modifications to integrate a transconductance amplifier that provides a current as an output. A matrix of these new pixels operates as one large transistor sourcing an amplified current for subsequent signal processing. This current is controlled by the intensity of the light received by the matrix, modulated pixel by pixel. The output current can be set by the biasing circuits over a very large range of signal levels. It can also be scaled with the matrix size, which gives a very high degree of freedom in signal level, subject to the current densities allowed inside the integrated circuit. In addition, the matrix can operate at very short integration times. Its applications are those in which fast image processing and high signal amplification are required and low resolution is not a major problem, such as UV image sensors. Simulation results are presented to support the operation, control, design, signal excursion levels and linearity of a pixel matrix conceived using this new sensor concept.

  2. A High Fidelity Approach to Data Simulation for Space Situational Awareness Missions

    NASA Astrophysics Data System (ADS)

    Hagerty, S.; Ellis, H., Jr.

    2016-09-01

    Space Situational Awareness (SSA) is vital to maintaining our Space Superiority. A high fidelity, time-based simulation tool, PROXOR™ (Proximity Operations and Rendering), supports SSA by generating realistic mission scenarios including sensor frame data with corresponding truth. This is a unique and critical tool for supporting mission architecture studies, new capability (algorithm) development, current/future capability performance analysis, and mission performance prediction. PROXOR™ provides a flexible architecture for sensor and resident space object (RSO) orbital motion and attitude control that simulates SSA, rendezvous and proximity operations scenarios. The major elements of interest are based on the ability to accurately simulate all aspects of the RSO model, viewing geometry, imaging optics, sensor detector, and environmental conditions. These capabilities enhance the realism of mission scenario models and generated mission image data. As an input, PROXOR™ uses a library of 3-D satellite models containing 10+ satellites, including low-earth orbit (e.g., DMSP) and geostationary (e.g., Intelsat) spacecraft, where the spacecraft surface properties are those of actual materials and include Phong and Maxwell-Beard bidirectional reflectance distribution function (BRDF) coefficients for accurate radiometric modeling. We calculate the inertial attitude, the changing solar and Earth illumination angles of the satellite, and the viewing angles from the sensor as we propagate the RSO in its orbit. The synthetic satellite image is rendered at high resolution and aggregated to the focal plane resolution resulting in accurate radiometry even when the RSO is a point source. The sensor model includes optical effects from the imaging system [point spread function (PSF) includes aberrations, obscurations, support structures, defocus], detector effects (CCD blooming, left/right bias, fixed pattern noise, image persistence, shot noise, read noise, and quantization noise), and environmental effects (radiation hits with selectable angular distributions and 4-layer atmospheric turbulence model for ground based sensors). We have developed an accurate flash Light Detection and Ranging (LIDAR) model that supports reconstruction of 3-dimensional information on the RSO. PROXOR™ contains many important imaging effects such as intra-frame smear, realized by oversampling the image in time and capturing target motion and jitter during the integration time.
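    PROXOR™ itself is proprietary, but the detector effects listed (shot noise, read noise, quantization noise, and full-well saturation) follow a standard simulation chain. A minimal sketch of that chain, with all parameter values purely illustrative rather than taken from the tool:

        import numpy as np

        def apply_detector_noise(electrons, read_noise_e=10.0, full_well_e=60000.0,
                                 adc_bits=12, gain_e_per_dn=15.0, rng=None):
            """Apply shot noise, full-well clipping, read noise, and quantization.

            `electrons` is the noiseless mean signal per pixel in photoelectrons.
            All parameter values are illustrative, not those of any real sensor.
            """
            rng = np.random.default_rng() if rng is None else rng
            signal = rng.poisson(electrons).astype(np.float64)     # photon shot noise
            signal = np.minimum(signal, full_well_e)               # full-well saturation
            signal += rng.normal(0.0, read_noise_e, signal.shape)  # read noise
            dn = np.round(signal / gain_e_per_dn)                  # ADC quantization
            return np.clip(dn, 0, 2**adc_bits - 1).astype(np.uint16)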

  3. Multi-Scale Fractal Analysis of Image Texture and Pattern

    NASA Technical Reports Server (NTRS)

    Emerson, Charles W.; Lam, Nina Siu-Ngan; Quattrochi, Dale A.

    1999-01-01

    Analyses of the fractal dimension of Normalized Difference Vegetation Index (NDVI) images of homogeneous land covers near Huntsville, Alabama revealed that the fractal dimension of an image of an agricultural land cover indicates greater complexity as pixel size increases, a forested land cover gradually grows smoother, and an urban image remains roughly self-similar over the range of pixel sizes analyzed (10 to 80 meters). A similar analysis of Landsat Thematic Mapper images of the East Humboldt Range in Nevada taken four months apart shows a more complex relation between pixel size and fractal dimension. The major visible difference between the spring and late summer NDVI images is the absence of high elevation snow cover in the summer image. This change significantly alters the relation between fractal dimension and pixel size. The slope of the fractal dimension-resolution relation provides indications of how image classification or feature identification will be affected by changes in sensor spatial resolution.
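    The abstract does not state the estimator used (fractal dimensions of NDVI surfaces are often computed with isarithm or triangular-prism methods); as a simple illustration of the underlying idea, a box-counting sketch for a binary image:

        import numpy as np

        def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
            """Estimate the fractal dimension of a binary image by box counting.

            Counts occupied s x s boxes for each box size s, then fits
            log N(s) = -D log s + c; the slope magnitude D is the estimate.
            """
            counts = []
            n = min(mask.shape)
            for s in sizes:
                m = mask[:n - n % s, :n - n % s]
                # Reduce each s x s block to "occupied or not".
                blocks = m.reshape(m.shape[0] // s, s, m.shape[1] // s, s)
                occupied = blocks.any(axis=(1, 3)).sum()
                counts.append(max(occupied, 1))
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope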

  5. Focal-Plane Sensing-Processing: A Power-Efficient Approach for the Implementation of Privacy-Aware Networked Visual Sensors

    PubMed Central

    Fernández-Berni, Jorge; Carmona-Galán, Ricardo; del Río, Rocío; Kleihorst, Richard; Philips, Wilfried; Rodríguez-Vázquez, Ángel

    2014-01-01

    The capture, processing and distribution of visual information is one of the major challenges for the paradigm of the Internet of Things. Privacy emerges as a fundamental barrier to overcome. The idea of networked image sensors pervasively collecting data generates social rejection in the face of sensitive information being tampered with by hackers or misused by legitimate users. Power consumption also constitutes a crucial aspect. Images contain a massive amount of data to be processed under strict timing requirements, demanding high-performance vision systems. In this paper, we describe a hardware-based strategy to concurrently address these two key issues. By conveying processing capabilities to the focal plane in addition to sensing, we can implement privacy protection measures just at the point where sensitive data are generated. Furthermore, such measures can be tailored for efficiently reducing the computational load of subsequent processing stages. As a proof of concept, a full-custom QVGA vision sensor chip is presented. It incorporates a mixed-signal focal-plane sensing-processing array providing programmable pixelation of multiple image regions in parallel. In addition to this functionality, the sensor exploits reconfigurability to implement other processing primitives, namely block-wise dynamic range adaptation, integral image computation and multi-resolution filtering. The proposed circuitry is also suitable to build a granular space, becoming the raw material for subsequent feature extraction and recognition of categorized objects. PMID:25195849

  6. Focal-plane sensing-processing: a power-efficient approach for the implementation of privacy-aware networked visual sensors.

    PubMed

    Fernández-Berni, Jorge; Carmona-Galán, Ricardo; del Río, Rocío; Kleihorst, Richard; Philips, Wilfried; Rodríguez-Vázquez, Ángel

    2014-08-19

    The capture, processing and distribution of visual information is one of the major challenges for the paradigm of the Internet of Things. Privacy emerges as a fundamental barrier to overcome. The idea of networked image sensors pervasively collecting data generates social rejection in the face of sensitive information being tampered with by hackers or misused by legitimate users. Power consumption also constitutes a crucial aspect. Images contain a massive amount of data to be processed under strict timing requirements, demanding high-performance vision systems. In this paper, we describe a hardware-based strategy to concurrently address these two key issues. By conveying processing capabilities to the focal plane in addition to sensing, we can implement privacy protection measures just at the point where sensitive data are generated. Furthermore, such measures can be tailored for efficiently reducing the computational load of subsequent processing stages. As a proof of concept, a full-custom QVGA vision sensor chip is presented. It incorporates a mixed-signal focal-plane sensing-processing array providing programmable pixelation of multiple image regions in parallel. In addition to this functionality, the sensor exploits reconfigurability to implement other processing primitives, namely block-wise dynamic range adaptation, integral image computation and multi-resolution filtering. The proposed circuitry is also suitable to build a granular space, becoming the raw material for subsequent feature extraction and recognition of categorized objects.
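    Of the processing primitives named, the integral image is the easiest to make concrete. On the chip it is computed by the mixed-signal focal-plane array; a minimal software equivalent of the same primitive:

        import numpy as np

        def integral_image(img):
            """Summed-area table: ii[r, c] = sum of img[:r+1, :c+1]."""
            return img.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

        def block_sum(ii, r0, c0, r1, c1):
            """Sum of img[r0:r1+1, c0:c1+1] in O(1) using the integral image."""
            total = ii[r1, c1]
            if r0 > 0:
                total -= ii[r0 - 1, c1]
            if c0 > 0:
                total -= ii[r1, c0 - 1]
            if r0 > 0 and c0 > 0:
                total += ii[r0 - 1, c0 - 1]
            return total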

  7. Thermal bioaerosol cloud tracking with Bayesian classification

    NASA Astrophysics Data System (ADS)

    Smith, Christian W.; Dupuis, Julia R.; Schundler, Elizabeth C.; Marinelli, William J.

    2017-05-01

    The development of a wide area, bioaerosol early warning capability employing existing uncooled thermal imaging systems used for persistent perimeter surveillance is discussed. The capability exploits thermal imagers with other available data streams including meteorological data and employs a recursive Bayesian classifier to detect, track, and classify observed thermal objects with attributes consistent with a bioaerosol plume. Target detection is achieved based on similarity to a phenomenological model which predicts the scene-dependent thermal signature of bioaerosol plumes. Change detection in thermal sensor data is combined with local meteorological data to locate targets with the appropriate thermal characteristics. Target motion is tracked utilizing a Kalman filter and a nearly-constant-velocity motion model for cloud state estimation. Track management is performed using a logic-based upkeep system, and data association is accomplished using a combinatorial optimization technique. Bioaerosol threat classification is determined using a recursive Bayesian classifier to quantify the threat probability of each tracked object. The classifier can accept additional inputs from visible imagers, acoustic sensors, and point biological sensors to improve classification confidence. This capability was successfully demonstrated for bioaerosol simulant releases during field testing at Dugway Proving Ground. Standoff detection at a range of 700 m was achieved for as little as 500 g of anthrax simulant. Developmental test results will be reviewed for a range of simulant releases, and future development and transition plans for the bioaerosol early warning platform will be discussed.
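    The cloud-state estimator is summarized rather than specified; a minimal sketch of one predict/update cycle of a Kalman filter with a nearly-constant-velocity motion model for a 2-D cloud centroid, with hypothetical noise levels q and r:

        import numpy as np

        def ncv_kalman_step(x, P, z, dt=1.0, q=0.5, r=2.0):
            """One predict/update cycle with a nearly-constant-velocity model.

            State x = [px, py, vx, vy]; z is the measured centroid [px, py].
            q and r are illustrative process/measurement noise levels.
            """
            F = np.array([[1, 0, dt, 0],
                          [0, 1, 0, dt],
                          [0, 0, 1, 0],
                          [0, 0, 0, 1]], dtype=float)
            H = np.array([[1, 0, 0, 0],
                          [0, 1, 0, 0]], dtype=float)
            Q = q * np.eye(4)
            R = r * np.eye(2)
            # Predict
            x = F @ x
            P = F @ P @ F.T + Q
            # Update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(4) - K @ H) @ P
            return x, P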

  8. Concurrent Initialization for Bearing-Only SLAM

    PubMed Central

    Munguía, Rodrigo; Grau, Antoni

    2010-01-01

    Simultaneous Localization and Mapping (SLAM) is perhaps the most fundamental problem to solve in robotics in order to build truly autonomous mobile robots. The sensors have a large impact on the algorithm used for SLAM. Early SLAM approaches focused on the use of range sensors such as sonar rings or lasers. However, cameras have become more and more widely used, because they yield a lot of information and are well adapted for embedded systems: they are light, cheap and power saving. Unlike range sensors, which provide range and angular information, a camera is a projective sensor which measures the bearing of image features; depth information (range) therefore cannot be obtained in a single step. This fact has prompted the emergence of a new family of SLAM algorithms: the Bearing-Only SLAM methods, which mainly rely on special techniques for feature initialization in order to enable the use of bearing sensors (such as cameras) in SLAM systems. In this work a novel and robust method, called Concurrent Initialization, is presented, inspired by combining the complementary advantages of the undelayed and delayed methods that represent the most common approaches to the problem. The key is to use concurrently two kinds of feature representations for both the undelayed and delayed stages of the estimation. The simulation results show that the proposed method surpasses the performance of previous schemes. PMID:22294884
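    The paper's concurrent representation is not reproduced here; as an illustration of the kind of feature parametrization involved, a sketch of the classic inverse-depth form used by undelayed methods, where the depth prior rho0 is a hypothetical value:

        import numpy as np

        def init_inverse_depth_feature(cam_pos, bearing_world, rho0=0.1):
            """Undelayed feature initialization in inverse-depth form.

            Feature = [x, y, z, azimuth, elevation, rho]: the camera position at
            first observation, the bearing angles, and an inverse-depth prior.
            rho0 = 0.1 (10 m nominal depth) is an illustrative choice.
            """
            bx, by, bz = bearing_world / np.linalg.norm(bearing_world)
            azimuth = np.arctan2(by, bx)
            elevation = np.arcsin(bz)
            return np.array([*cam_pos, azimuth, elevation, rho0])

        def inverse_depth_to_euclidean(f):
            """Recover the 3-D point once depth becomes observable."""
            x0 = f[:3]
            az, el, rho = f[3], f[4], f[5]
            m = np.array([np.cos(el) * np.cos(az),
                          np.cos(el) * np.sin(az),
                          np.sin(el)])
            return x0 + m / rho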

  9. Learning for Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Robotic ground vehicles for outdoor applications have achieved some remarkable successes, notably in autonomous highway following (Dickmanns, 1987), planetary exploration (1), and off-road navigation on Earth (1). Nevertheless, major challenges remain to enable reliable, high-speed, autonomous navigation in a wide variety of complex, off-road terrain. 3-D perception of terrain geometry with imaging range sensors is the mainstay of off-road driving systems. However, the stopping distance at high speed exceeds the effective lookahead distance of existing range sensors. Prospects for extending the range of 3-D sensors are strongly limited by sensor physics, eye safety of lasers, and related issues. Range sensor limitations also allow vehicles to enter large cul-de-sacs even at low speed, leading to long detours. Moreover, sensing only terrain geometry fails to reveal mechanical properties of terrain that are critical to assessing its traversability, such as potential for slippage, sinkage, and the degree of compliance of potential obstacles. Rovers in the Mars Exploration Rover (MER) mission have become stuck in sand dunes and experienced significant downhill slippage in the vicinity of large rock hazards. Earth-based off-road robots today have very limited ability to discriminate traversable vegetation from non-traversable vegetation or rough ground. It is impossible today to preprogram a system with knowledge of these properties for all types of terrain and weather conditions that might be encountered.

  10. Scene Context Dependency of Pattern Constancy of Time Series Imagery

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2008-01-01

    A fundamental element of future generic pattern recognition technology is the ability to extract similar patterns for the same scene despite wide-ranging extraneous variables, including lighting, turbidity, sensor exposure variations, and signal noise. In the process of demonstrating pattern constancy of this kind for retinex/visual servo (RVS) image enhancement processing, we found that the pattern constancy performance depended somewhat on scene content. Most notably, the scene topography and, in particular, the scale and extent of the topography in an image, affects the pattern constancy the most. This paper explores these effects in more depth and presents experimental data from several time series tests. These results further quantify the impact of topography on pattern constancy. Despite this residual inconstancy, the results of overall pattern constancy testing support the idea that RVS image processing can be a universal front-end for generic visual pattern recognition. While the effects on pattern constancy were significant, the RVS processing still achieves a high degree of pattern constancy over a wide spectrum of scene content diversity and wide-ranging extraneous variations in lighting, turbidity, and sensor exposure.
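    The RVS processing itself is not specified in this abstract; to make the retinex idea concrete, a single-scale retinex sketch (log of the image minus log of its Gaussian surround), with a hypothetical surround scale sigma:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def single_scale_retinex(img, sigma=40.0):
            """Single-scale retinex: R = log(I) - log(Gaussian surround of I).

            Suppresses slowly varying illumination so the output pattern is more
            nearly constant across lighting and exposure changes.
            """
            img = img.astype(np.float64) + 1.0          # avoid log(0)
            surround = gaussian_filter(img, sigma)
            r = np.log(img) - np.log(surround)
            # Stretch to [0, 1] for display.
            return (r - r.min()) / (r.max() - r.min() + 1e-12)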

  11. An efficient and secure partial image encryption for wireless multimedia sensor networks using discrete wavelet transform, chaotic maps and substitution box

    NASA Astrophysics Data System (ADS)

    Khan, Muazzam A.; Ahmad, Jawad; Javaid, Qaisar; Saqib, Nazar A.

    2017-03-01

    Wireless Sensor Networks (WSNs) are widely deployed for monitoring physical activity and/or environmental conditions. Data gathered from a WSN are transmitted via the network to a central location for further processing. Numerous applications of WSNs can be found in smart homes, intelligent buildings, health care, energy-efficient smart grids and industrial control systems. In recent years, computer scientists have focused on finding more applications of WSNs in multimedia technologies, i.e. audio, video and digital images. Due to the bulky nature of multimedia data, a WSN processes a large volume of multimedia data, which significantly increases computational complexity and hence reduces battery time. Given battery-life constraints, image compression combined with secure transmission over a wide-ranged sensor network is an emerging and challenging task in Wireless Multimedia Sensor Networks. Due to the open nature of the Internet, transmission of data must be secured through a process known as encryption. As a result, there has for decades been an intense demand for schemes that are energy efficient as well as highly secure. In this paper, a discrete wavelet-based partial image encryption scheme using a hashing algorithm, chaotic maps and Hussain's S-Box is reported. The plaintext image is compressed via the discrete wavelet transform, and the image is then shuffled column-wise and row-wise via the Piece-wise Linear Chaotic Map (PWLCM) and a Nonlinear Chaotic Algorithm, respectively. For higher security, the initial conditions for the PWLCM are made dependent on a hash function. The permuted image is bitwise XORed with a random matrix generated from the Intertwining Logistic map. To enhance the security further, the final ciphertext is obtained after substituting all elements with Hussain's substitution box. Experimental and statistical results confirm the strength of the anticipated scheme.
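    The PWLCM used for the shuffling step has a standard form; a minimal sketch that derives a permutation from its orbit is shown below. The seed x0 and control parameter p are illustrative stand-ins for the hash-derived, key-dependent values used in the actual scheme.

        import numpy as np

        def pwlcm(x, p):
            """Piece-wise linear chaotic map on (0, 1) with control parameter p in (0, 0.5)."""
            if x < p:
                return x / p
            if x <= 0.5:
                return (x - p) / (0.5 - p)
            return pwlcm(1.0 - x, p)  # symmetric on the upper half

        def chaotic_permutation(n, x0=0.371, p=0.287):
            """Derive a permutation of range(n) by ranking a PWLCM orbit."""
            xs = np.empty(n)
            x = x0
            for i in range(n):
                x = pwlcm(x, p)
                xs[i] = x
            return np.argsort(xs)

        # Shuffle the columns of an image with the derived permutation:
        # shuffled = img[:, chaotic_permutation(img.shape[1])]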

  12. Measuring Radiant Emissions from Entire Prescribed Fires with Ground, Airborne and Satellite Sensors RxCADRE 2012

    NASA Technical Reports Server (NTRS)

    Dickinson, Matthew B.; Hudak, Andrew T.; Zajkowski, Thomas; Loudermilk, E. Louise; Schroeder, Wilfrid; Ellison, Luke; Kremens, Robert L.; Holley, William; Martinez, Otto; Paxton, Alexander; hide

    2015-01-01

    Characterising radiation from wildland fires is an important focus of fire science because radiation relates directly to the combustion process and can be measured across a wide range of spatial extents and resolutions. As part of a more comprehensive set of measurements collected during the 2012 Prescribed Fire Combustion and Atmospheric Dynamics Research (RxCADRE) field campaign, we used ground, airborne and spaceborne sensors to measure fire radiative power (FRP) from whole fires, applying different methods to small (2 ha) and large (>100 ha) burn blocks. For small blocks (n = 6), FRP estimated from an obliquely oriented long-wave infrared (LWIR) camera mounted on a boom lift was compared with FRP derived from combined data from tower-mounted radiometers and remotely piloted aircraft systems (RPAS). For large burn blocks (n = 3), satellite FRP measurements from the Moderate-resolution Imaging Spectroradiometer (MODIS) and Visible Infrared Imaging Radiometer Suite (VIIRS) sensors were compared with near-coincident FRP measurements derived from a LWIR imaging system aboard a piloted aircraft. We describe the measurements and consider their strengths and weaknesses. Until quantitative sensors exist for small RPAS, their use in fire research will remain limited. For oblique, airborne and satellite sensors, further FRP measurement development is needed, along with greater replication of coincident measurements, which we show to be feasible.

  13. Effects of the source, surface, and sensor couplings and colorimetric of laser speckle pattern on the performance of optical imaging system

    NASA Astrophysics Data System (ADS)

    Darwiesh, M.; El-Sherif, Ashraf F.; El-Ghandour, Hatem; Aly, Hussein A.; Mokhtar, A. M.

    2011-03-01

    Optical imaging systems are widely used in applications including tracking for portable scanners; input pointing devices for laptop computers, cell phones, and cameras; fingerprint-identification scanners; optical navigation for target tracking; and optical computer mice. We present experimental work measuring and analyzing the laser speckle pattern (LSP) produced by different optical sources (various color LEDs, a 3 mW diode laser, and a 10 mW He-Ne laser) on different operating surfaces (Gabor hologram diffusers), and how these affect the performance of optical imaging systems in terms of speckle size and signal-to-noise ratio (the signal being the patches of speckles that carry information, the noise being the whole remaining part of the selected image). Theoretical and experimental studies of colorimetry for the optical sources are presented to relate the signal-to-noise ratio to the different diffusers for each light source. Color correction is applied to the color images captured by the optical imaging system to produce realistic color images: a gray scale containing most of the informative data in the image is selected by calculating accurate red-green-blue (RGB) color components, making use of the measured spectra of the light sources and the color matching functions of the International Telecommunication Union (ITU-R 709) for CRT phosphors (Trinitron, SONY model). The source/surface coupling is discussed, and we conclude that the performance of the optical imaging system for a given source varies from worst to best depending on the operating surface. The sensor/surface coupling is studied for the case of the He-Ne laser: the speckle size ranges from 4.59 to 4.62 μm, approximately the same for all produced diffusers (consistent with the fact that speckle size is independent of the illuminating surface), but the calculated signal-to-noise ratio takes different values, ranging from 0.71 to 0.92 for the different diffusers. This means that the surface texture affects the performance of the optical sensor, because all images were captured for all diffusers under the same conditions: the same source (He-Ne laser), the same experimental set-up distances, and the same sensor (CCD camera).

  14. Investigation of Terrain Analysis and Classification Methods for Ground Vehicles

    DTIC Science & Technology

    2012-08-27

    exteroceptive terrain classifier takes exteroceptive sensor data (here, color stereo images of the terrain) as its input and returns terrain class...Mishkin & Laubach, 2006), the rover cannot safely travel beyond the distance it can image with its cameras, which has been as little as 15 meters or...field of view roughly 44°×30°, capturing pairs of color images at 640×480 pixels each (Videre Design, 2001). Range data were extracted from the stereo

  15. Robotic Vehicle Communications Interoperability

    DTIC Science & Technology

    1988-08-01

    starter (cold start) X X Fire suppression X Fording control X Fuel control X Fuel tank selector X Garage toggle X Gear selector X X X X Hazard warning...optic Sensors Sensor switch Video Radar IR Thermal imaging system Image intensifier Laser ranger Video camera selector Forward Stereo Rear Sensor control...optic sensors Sensor switch Video Radar IR Thermal imaging system Image intensifier Laser ranger Video camera selector Forward Stereo Rear Sensor

  16. Using hyperspectral remote sensing for land cover classification

    NASA Astrophysics Data System (ADS)

    Zhang, Wendy W.; Sriharan, Shobha

    2005-01-01

    This project used a hyperspectral data set to classify land cover using remote sensing techniques. Many different earth-sensing satellites, with diverse sensors mounted on sophisticated platforms, are currently in earth orbit. These sensors are designed to cover a wide range of the electromagnetic spectrum and are generating enormous amounts of data that must be processed, stored, and made available to the user community. The Airborne Visible-Infrared Imaging Spectrometer (AVIRIS) collects data in 224 contiguous bands, each approximately 9.6 nm wide, between 0.40 and 2.45 µm. Hyperspectral sensors acquire images in many very narrow, contiguous spectral bands throughout the visible, near-IR, and thermal IR portions of the spectrum. The unsupervised image classification procedure automatically categorizes the pixels in an image into land cover classes or themes. Experiments on using hyperspectral remote sensing for land cover classification were conducted during the 2003 and 2004 NASA Summer Faculty Fellowship Program at Stennis Space Center. Research Systems Inc.'s (RSI) ENVI software package was used in this application framework. Emphasis was placed on: (1) spectrally oriented classification procedures for land cover mapping, particularly supervised surface classification using AVIRIS data; and (2) identifying data endmembers.
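    As an illustration of the unsupervised procedure described (the ENVI workflow typically uses ISODATA, which adds merge/split rules on top of this), a minimal k-means clustering of pixel spectra:

        import numpy as np

        def kmeans_classify(cube, k=8, iters=20, rng=None):
            """Cluster a hyperspectral cube (rows, cols, bands) into k spectral classes."""
            rng = np.random.default_rng() if rng is None else rng
            pixels = cube.reshape(-1, cube.shape[-1]).astype(np.float64)
            centers = pixels[rng.choice(len(pixels), k, replace=False)]
            for _ in range(iters):
                # Assign each pixel spectrum to the nearest class center.
                d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
                labels = d.argmin(axis=1)
                # Move each center to the mean of its assigned spectra.
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = pixels[labels == j].mean(axis=0)
            return labels.reshape(cube.shape[:2])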

  17. a Spatio-Spectral Camera for High Resolution Hyperspectral Imaging

    NASA Astrophysics Data System (ADS)

    Livens, S.; Pauly, K.; Baeck, P.; Blommaert, J.; Nuyts, D.; Zender, J.; Delauré, B.

    2017-08-01

    Imaging with a conventional frame camera from a moving remotely piloted aircraft system (RPAS) is by design very inefficient. Less than 1 % of the flying time is used for collecting light. This unused potential can be utilized by an innovative imaging concept, the spatio-spectral camera. The core of the camera is a frame sensor with a large number of hyperspectral filters arranged on the sensor in stepwise lines. It combines the advantages of frame cameras with those of pushbroom cameras. By acquiring images in rapid succession, such a camera can collect detailed hyperspectral information, while retaining the high spatial resolution offered by the sensor. We have developed two versions of a spatio-spectral camera and used them in a variety of conditions. In this paper, we present a summary of three missions with the in-house developed COSI prototype camera (600-900 nm) in the domains of precision agriculture (fungus infection monitoring in experimental wheat plots), horticulture (crop status monitoring to evaluate irrigation management in strawberry fields) and geology (meteorite detection on a grassland field). Additionally, we describe the characteristics of the 2nd generation, commercially available ButterflEYE camera offering extended spectral range (475-925 nm), and we discuss future work.

  18. Shortwave infrared 512 x 2 line sensor for earth resources applications

    NASA Astrophysics Data System (ADS)

    Tower, J. R.; Pellon, L. E.; McCarthy, B. M.; Elabd, H.; Moldovan, A. G.; Kosonocky, W. F.; Kalshoven, J. E., Jr.; Tom, D.

    1985-08-01

    As part of the NASA remote-sensing Multispectral Linear Array Program, an edge-buttable 512 x 2 IRCCD line image sensor with 30-micron Pd2Si Schottky-barrier detectors was developed for operation with passive cooling at 120 K in the 1.1-2.5 micron shortwave infrared band. On-chip CCD multiplexers provide one video output for each 512-detector band. The monolithic silicon line imager performance at a 4-ms optical integration time includes a signal-to-noise ratio of 241 for an irradiance of 7.2 microwatts/sq cm at 1.65 microns wavelength, a dynamic range of 5000, a modulation transfer function greater than 60 percent at the Nyquist frequency, and an 18-milliwatt imager chip total power dissipation. Blemish-free images with three percent nonuniformity under illumination and nonlinearity of 1.25 percent were obtained. A hybrid focal plane with five SWIR imagers was constructed, demonstrating the feasibility of arrays with only a two-detector loss at each joint.

  19. Expanding the Detection of Traversable Area with RealSense for the Visually Impaired

    PubMed Central

    Yang, Kailun; Wang, Kaiwei; Hu, Weijian; Bai, Jian

    2016-01-01

    The introduction of RGB-Depth (RGB-D) sensors into the visually impaired people (VIP)-assisting area has stirred great interest among researchers. However, the detection range of RGB-D sensors is limited by a narrow depth field angle and a sparse depth map at distance, which hampers broader and longer traversability awareness. This paper proposes an effective approach to expand the detection of traversable area based on an RGB-D sensor, the Intel RealSense R200, which is compatible with both indoor and outdoor environments. The depth image of the RealSense is enhanced with IR image large-scale matching and RGB image-guided filtering. Traversable area is obtained preliminarily with RANdom SAmple Consensus (RANSAC) segmentation and surface normal vector estimation. A seeded growing region algorithm, combining the depth image and RGB image, then greatly enlarges the preliminary traversable area. This is critical not only for avoiding close obstacles, but also for allowing superior path planning in navigation. The proposed approach has been tested on a score of indoor and outdoor scenarios. Moreover, the approach has been integrated into an assistance system, which consists of a wearable prototype and an audio interface. Furthermore, the presented approach has been proved useful and reliable by a field test with eight visually impaired volunteers. PMID:27879634
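    The preliminary RANSAC segmentation step can be sketched on an N×3 point cloud back-projected from the depth image; the iteration count and inlier threshold below are hypothetical:

        import numpy as np

        def ransac_ground_plane(points, iters=200, thresh=0.03, rng=None):
            """Fit a dominant plane to Nx3 points (meters) with RANSAC.

            Returns (unit normal n, offset d) with n . p + d = 0, plus an inlier mask.
            """
            rng = np.random.default_rng() if rng is None else rng
            best_mask, best_model = None, None
            for _ in range(iters):
                p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
                n = np.cross(p1 - p0, p2 - p0)
                norm = np.linalg.norm(n)
                if norm < 1e-9:            # degenerate (collinear) sample
                    continue
                n = n / norm
                d = -n @ p0
                mask = np.abs(points @ n + d) < thresh
                if best_mask is None or mask.sum() > best_mask.sum():
                    best_mask, best_model = mask, (n, d)
            return best_model, best_mask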

  20. The effect of split pixel HDR image sensor technology on MTF measurements

    NASA Astrophysics Data System (ADS)

    Deegan, Brian M.

    2014-03-01

    Split-pixel HDR sensor technology is particularly advantageous in automotive applications, because the images are captured simultaneously rather than sequentially, thereby reducing motion blur. However, split pixel technology introduces artifacts in MTF measurement. To achieve a HDR image, raw images are captured from both large and small sub-pixels, and combined to make the HDR output. In some cases, a large sub-pixel is used for long exposure captures, and a small sub-pixel for short exposures, to extend the dynamic range. The relative size of the photosensitive area of the pixel (fill factor) plays a very significant role in the output MTF measurement. Given an identical scene, the MTF will be significantly different, depending on whether you use the large or small sub-pixels i.e. a smaller fill factor (e.g. in the short exposure sub-pixel) will result in higher MTF scores, but significantly greater aliasing. Simulations of split-pixel sensors revealed that, when raw images from both sub-pixels are combined, there is a significant difference in rising edge (i.e. black-to-white transition) and falling edge (white-to-black) reproduction. Experimental results showed a difference of ~50% in measured MTF50 between the falling and rising edges of a slanted edge test chart.
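    The combination stage that produces these artifacts can be illustrated with the simplest merging rule: take the long-exposure (large) sub-pixel where it is unsaturated, otherwise the gain-corrected short-exposure (small) sub-pixel. The exposure ratio and saturation level below are hypothetical:

        import numpy as np

        def combine_split_pixel(long_raw, short_raw, exposure_ratio=16.0, sat=4000):
            """Merge long- and short-exposure sub-pixel images into one HDR frame.

            Because the two sub-pixels have different fill factors (hence different
            MTFs), edges taken from the short channel behave differently from edges
            taken from the long channel, which is one source of the rising/falling
            edge asymmetry discussed above.
            """
            long_f = long_raw.astype(np.float64)
            short_f = short_raw.astype(np.float64) * exposure_ratio  # match radiometry
            return np.where(long_raw < sat, long_f, short_f)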

  1. An ECT/ERT dual-modality sensor for oil-water two-phase flow measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Pitao; Wang, Huaxiang; Sun, Benyuan

    2014-04-11

    This paper presents a new sensor for an ECT/ERT dual-modality system which can simultaneously obtain the permittivity and conductivity of the materials in a pipeline. Quasi-static electromagnetic fields are produced by the inner electrode array of the electrical capacitance tomography (ECT) system. Simulation results show that permittivity and conductivity data can be obtained simultaneously from the same measurement electrode and that the fusion of the two kinds of data may improve the quality of the reconstructed images. For uniform oil-water mixtures, the performance of the designed dual-modality sensor in measuring various oil fractions has been tested on representative data, and the experimental results show that the designed sensor broadens the measurement range compared to a single modality.

  2. Earth Surface Monitoring with COSI-Corr, Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Leprince, S.; Ayoub, F.; Avouac, J.

    2009-12-01

    Co-registration of Optically Sensed Images and Correlation (COSI-Corr) is a software package developed at the California Institute of Technology (USA) for accurate geometrical processing of optical satellite and aerial imagery. Initially developed for the measurement of co-seismic ground deformation using optical imagery, COSI-Corr is now used for a wide range of applications in Earth Sciences, which take advantage of the software's capability to co-register, with very high accuracy, images taken from different sensors and acquired at different times. As long as a sensor is supported in COSI-Corr, all images between the supported sensors can be accurately orthorectified and co-registered. For example, it is possible to co-register a series of SPOT images, a series of aerial photographs, or a series of aerial photographs with a series of SPOT images. Currently supported sensors include the SPOT 1-5, Quickbird, Worldview 1 and Formosat 2 satellites, the ASTER instrument, and frame camera acquisitions from, e.g., aerial surveys or declassified satellite imagery. Potential applications include accurate change detection between multi-temporal and multi-spectral images, and the calibration of pushbroom cameras. In particular, COSI-Corr provides a powerful correlation tool, which allows for accurate estimation of surface displacement. The accuracy depends on many factors (e.g., cloud, snow, and vegetation cover, shadows, temporal changes in general, steadiness of the imaging platform, defects of the imaging system), but in practice the standard deviation of the measurements obtained from the correlation of multi-temporal images is typically around 1/20 to 1/10 of the pixel size. The software package also includes post-processing tools such as denoising, destriping, and stacking tools to facilitate data interpretation. Examples drawn from current research in, e.g., seismotectonics, glaciology, and geomorphology will be presented. COSI-Corr is developed in IDL (Interactive Data Language), integrated under the user-friendly interface ENVI (Environment for Visualizing Images), and is distributed free of charge for academic research purposes.
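    COSI-Corr's correlator operates in the frequency domain with subpixel refinement; as a simplified stand-in for the idea, an integer-pixel normalized cross-correlation search between two co-registered patches:

        import numpy as np

        def ncc_offset(ref, sec, max_shift=8):
            """Integer-pixel offset of patch `sec` relative to `ref` by NCC search.

            A brute-force stand-in for COSI-Corr's frequency-domain correlator;
            real use adds subpixel interpolation of the correlation peak.
            """
            best, best_shift = -np.inf, (0, 0)
            h, w = ref.shape
            m = max_shift
            r = ref[m:h - m, m:w - m].astype(np.float64)
            r = (r - r.mean()) / (r.std() + 1e-12)
            for dy in range(-m, m + 1):
                for dx in range(-m, m + 1):
                    s = sec[m + dy:h - m + dy, m + dx:w - m + dx].astype(np.float64)
                    s = (s - s.mean()) / (s.std() + 1e-12)
                    score = (r * s).mean()   # normalized cross-correlation
                    if score > best:
                        best, best_shift = score, (dy, dx)
            return best_shift, best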

  3. Validation of the Five-Phase Method for Simulating Complex Fenestration Systems with Radiance against Field Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geisler-Moroder, David; Lee, Eleanor S.; Ward, Gregory J.

    2016-08-29

    The Five-Phase Method (5-pm) for simulating complex fenestration systems with Radiance is validated against field measurements. The capability of the method to predict workplane illuminances, vertical sensor illuminances, and glare indices derived from captured and rendered high dynamic range (HDR) images is investigated. To be able to accurately represent the direct sun part of the daylight not only in sensor point simulations, but also in renderings of interior scenes, the 5-pm calculation procedure was extended. The validation shows that the 5-pm is superior to the Three-Phase Method for predicting horizontal and vertical illuminance sensor values as well as glare indices derived from rendered images. Even with input data from global and diffuse horizontal irradiance measurements only, daylight glare probability (DGP) values can be predicted within 10% error of measured values for most situations.

  4. Topview stereo: combining vehicle-mounted wide-angle cameras to a distance sensor array

    NASA Astrophysics Data System (ADS)

    Houben, Sebastian

    2015-03-01

    The variety of vehicle-mounted sensors required to fulfill a growing number of driver assistance tasks has become a substantial factor in automobile manufacturing cost. We present a stereo distance method exploiting the overlapping field of view of a multi-camera fisheye surround-view system, as used for near-range vehicle surveillance tasks, e.g. in parking maneuvers. Hence, we aim at creating a new input signal from sensors that are already installed. Particular properties of wide-angle cameras (e.g. spatially varying resolution) demand an adaptation of the image processing pipeline to several problems that do not arise in classical stereo vision performed with cameras carefully designed for this purpose. We introduce the algorithms for rectification, correspondence analysis, and regularization of the disparity image, discuss reasons for and avoidance of the caveats shown, and present first results on a prototype topview setup.

  5. Improved fiber-optic chemical sensor for penicillin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Healy, B.G.; Walt, D.R.

    An optical penicillin biosensor is described, based on the enzyme penicillinase. The sensor is fabricated by selective photodeposition of analyte-sensitive polymer matrices on optical imaging fibers. The penicillin-sensitive matrices are fabricated by immobilizing the enzyme as micrometer-sized particles in a polymer hydrogel with a covalently bound pH indicator. An array of penicillin-sensitive and pH-sensitive matrices is fabricated on the same fiber. This array allows the simultaneous, independent measurement of pH and penicillin. Independent measurement of the two analytes allows penicillin to be quantitated in the presence of a concurrent pH change. An analysis of enzyme kinetic parameters was conducted in order to model the penicillin response of the sensor at all pH values. This analysis accounts for the varying activity of the immobilized penicillinase at different pH values. The sensor detects penicillin in the range 0.25-10.0 mM in the pH range 6.2-7.5. The sensor was used to quantify the penicillin concentration produced during a Penicillium chrysogenum fermentation. 27 refs., 7 figs., 1 tab.

  6. Ranging through Gabor logons-a consistent, hierarchical approach.

    PubMed

    Chang, C; Chatterjee, S

    1993-01-01

    In this work, the correspondence problem in stereo vision is handled by matching two sets of dense feature vectors. Inspired by biological evidence, these feature vectors are generated by a correlation between a bank of Gabor sensors and the intensity image. The sensors consist of two-dimensional Gabor filters at various scales (spatial frequencies) and orientations, which bear close resemblance to the receptive field profiles of simple V1 cells in visual cortex. A hierarchical, stochastic relaxation method is then used to obtain the dense stereo disparities. Unlike traditional hierarchical methods for stereo, feature based hierarchical processing yields consistent disparities. To avoid false matchings due to static occlusion, a dual matching, based on the imaging geometry, is used.
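    The Gabor sensor bank described can be generated directly; a minimal sketch of a 2-D Gabor kernel (Gaussian envelope times an oriented sinusoid) and a small multi-scale, multi-orientation bank, with illustrative parameter choices:

        import numpy as np

        def gabor_kernel(sigma, wavelength, theta, size=None):
            """2-D Gabor filter: a Gaussian envelope times an oriented sinusoid."""
            size = int(6 * sigma) | 1 if size is None else size  # odd support
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)           # rotated coordinate
            envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
            return envelope * np.cos(2 * np.pi * xr / wavelength)

        def gabor_bank(scales=(2.0, 4.0, 8.0), n_orient=4):
            """Filters at several spatial frequencies and orientations, as in V1 models."""
            return [gabor_kernel(s, 2.5 * s, k * np.pi / n_orient)
                    for s in scales for k in range(n_orient)]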

  7. Digital Simulation Of Precise Sensor Degradations Including Non-Linearities And Shift Variance

    NASA Astrophysics Data System (ADS)

    Kornfeld, Gertrude H.

    1987-09-01

    Realistic atmospheric and Forward Looking Infrared Radiometer (FLIR) degradations were digitally simulated. Inputs to the routine are environmental observables and the FLIR specifications. It was possible to achieve realism in the thermal domain within acceptable computer time and random access memory (RAM) requirements because a shift-variant recursive convolution algorithm that describes thermal properties well was devised, and because each picture element (pixel) carries radiative temperature, a materials parameter, and range and altitude information. The computer generation steps start with the image synthesis of an undegraded scene. Atmospheric and sensor degradation follow. The final result is a realistic representation of an image seen on the display of a specific FLIR.

  8. Performance and Transient Behavior of Vertically Integrated Thin-film Silicon Sensors

    PubMed Central

    Wyrsch, Nicolas; Choong, Gregory; Miazza, Clément; Ballif, Christophe

    2008-01-01

    Vertical integration of amorphous hydrogenated silicon diodes on CMOS readout chips offers several advantages over standard CMOS imagers in terms of sensitivity, dynamic range and dark current, while at the same time introducing some undesired transient effects leading to image lag. The performance of such sensors is reported here, and their transient behaviour is analysed and compared to that of corresponding amorphous silicon test diodes deposited on glass. The measurements are further compared to simulations for a deeper investigation. The long time constant observed in dark- or photocurrent decay is found to be rather independent of the density of defects present in the intrinsic layer of the amorphous silicon diode. PMID:27873778

  9. Interrogation of a ring-resonator ultrasound sensor using a fiber Mach-Zehnder interferometer.

    PubMed

    Peternella, Fellipe Grillo; Ouyang, Boling; Horsten, Roland; Haverdings, Michael; Kat, Pim; Caro, Jacob

    2017-12-11

    We experimentally demonstrate an interrogation procedure of a ring-resonator ultrasound sensor using a fiber Mach-Zehnder interferometer (MZI). The sensor comprises a silicon ring resonator (RR) located on a silicon-oxide membrane, designed to have its lowest vibrational mode in the MHz range, which is the range of intravascular ultrasound (IVUS) imaging. Ultrasound incident on the membrane excites its vibrational mode and as a result induces a modulation of the resonance wavelength of the RR, which is a measure of the amplitude of the ultrasound waves. The interrogation procedure developed is based on the mathematical description of the interrogator operation presented in Appendix A, where we identify the amplitude of the angular deflection Φ0 on the circle arc periodically traced in the plane of the two orthogonal interrogator voltages as the principal sensor signal. Interrogation is demonstrated for two sensors with membrane vibrational modes at 1.3 and 0.77 MHz, by applying continuous-wave ultrasound over a wide pressure range. Ultrasound is detected at a pressure as low as 1.2 Pa. Two optical path differences (OPDs) of the MZI are used. Thus, different interference conditions of the optical signals are defined, leading to a higher apparent sensitivity for the larger OPD, which is accompanied by a weaker signal, however. Independent measurements using the modulation method yield a resonance modulation per unit of pressure of 21.4 fm/Pa (sensor #1) and 103.8 fm/Pa (sensor #2).

  10. An infrared/video fusion system for military robotics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, A.W.; Roberts, R.S.

    1997-08-05

    Sensory information is critical to the telerobotic operation of mobile robots. In particular, visual sensors are a key component of the sensor package on a robot engaged in urban military operations. Visual sensors provide the robot operator with a wealth of information for robot navigation and threat assessment. However, simple countermeasures such as darkness, smoke, or blinding by a laser can easily neutralize visual sensors. In order to provide a robust visual sensing system, an infrared sensor is required to augment the primary visual sensor. An infrared sensor can acquire useful imagery in conditions that incapacitate a visual sensor. A simple approach to incorporating an infrared sensor into the visual sensing system is to display two images to the operator: side-by-side visual and infrared images. However, dual images might overwhelm the operator with information and result in degraded robot performance. A better solution is to combine the visual and infrared images into a single image that maximizes scene information. Fusing visual and infrared images into a single image demands balancing the mixture of visual and infrared information. Humans are accustomed to viewing and interpreting visual images. They are not accustomed to viewing or interpreting infrared images. Hence, the infrared image must be used to enhance the visual image, not obfuscate it.
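    The abstract does not give the fusion rule; a minimal sketch of one common choice, keeping the visible image as the base and adding high-pass infrared detail, with a hypothetical blend weight:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def fuse_visual_ir(visible, infrared, sigma=5.0, ir_weight=0.5):
            """Enhance an 8-bit visible image with high-frequency IR detail.

            The visible image stays dominant (operators are used to interpreting
            it); only detail the IR sensor sees and the visible sensor misses
            is added on top.
            """
            vis = visible.astype(np.float64)
            ir = infrared.astype(np.float64)
            ir_detail = ir - gaussian_filter(ir, sigma)   # high-pass IR component
            fused = vis + ir_weight * ir_detail
            return np.clip(fused, 0, 255).astype(np.uint8)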

  11. Dataset of surface plasmon resonance based on photonic crystal fiber for chemical sensing applications.

    PubMed

    Khalek, Md Abdul; Chakma, Sujan; Paul, Bikash Kumar; Ahmed, Kawsar

    2018-08-01

    In this research work a circular-lattice Photonic Crystal Fiber (PCF) based Surface Plasmon Resonance (SPR) sensor has been proposed. The investigation was carried out using the finite element method (FEM) based, commercially available software package COMSOL Multiphysics version 4.2, and covers the optical spectrum from 0.48 µm to 1.10 µm. Using the wavelength interrogation method, the proposed model shows a maximum sensitivity of 9000 nm/RIU (Refractive Index Unit), and using the amplitude interrogation method it obtains a maximum sensitivity of 318 RIU^-1. Moreover, the maximum sensor resolution is 1.11×10^-5 RIU in the sensing range between 1.34 and 1.37. The suggested sensor model may have great impact in biological areas such as bio-imaging.

  12. Ultrasensitive, Biocompatible, Self-Calibrating, Multiparametric Temperature Sensors.

    PubMed

    Zhao, Haiguang; Vomiero, Alberto; Rosei, Federico

    2015-11-18

    Core-shell quantum dots serve as self-calibrating, ultrasensitive, multiparametric, near-infrared, and biocompatible temperature sensors. They allow temperature measurement with nanometer accuracy in the range 150-373 K, the broadest ever recorded for a nanothermometer, with sensitivities among the highest ever reported, which makes them essentially unique in the panorama of biocompatible nanothermometers with potential for in vivo biological thermal imaging and/or thermoablative therapy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. The coronagraphic Modal Wavefront Sensor: a hybrid focal-plane sensor for the high-contrast imaging of circumstellar environments

    NASA Astrophysics Data System (ADS)

    Wilby, M. J.; Keller, C. U.; Snik, F.; Korkiakoski, V.; Pietrow, A. G. M.

    2017-01-01

    The raw coronagraphic performance of current high-contrast imaging instruments is limited by the presence of a quasi-static speckle (QSS) background, resulting from instrumental Non-Common Path Errors (NCPEs). Rapid development of efficient speckle subtraction techniques in data reduction has enabled final contrasts of up to 10^-6 to be obtained, however it remains preferable to eliminate the underlying NCPEs at the source. In this work we introduce the coronagraphic Modal Wavefront Sensor (cMWS), a new wavefront sensor suitable for real-time NCPE correction. This combines the Apodizing Phase Plate (APP) coronagraph with a holographic modal wavefront sensor to provide simultaneous coronagraphic imaging and focal-plane wavefront sensing with the science point-spread function. We first characterise the baseline performance of the cMWS via idealised closed-loop simulations, showing that the sensor is able to successfully recover diffraction-limited coronagraph performance over an effective dynamic range of ±2.5 radians root-mean-square (rms) wavefront error within 2-10 iterations, with performance independent of the specific choice of mode basis. We then present the results of initial on-sky testing at the William Herschel Telescope, which demonstrate that the sensor is capable of NCPE sensing under realistic seeing conditions via the recovery of known static aberrations to an accuracy of 10 nm (0.1 radians) rms error in the presence of a dominant atmospheric speckle foreground. We also find that the sensor is capable of real-time measurement of broadband atmospheric wavefront variance (50% bandwidth, 158 nm rms wavefront error) at a cadence of 50 Hz over an uncorrected telescope sub-aperture. When combined with a suitable closed-loop adaptive optics system, the cMWS holds the potential to deliver an improvement of up to two orders of magnitude over the uncorrected QSS floor. Such a sensor would be eminently suitable for the direct imaging and spectroscopy of exoplanets with both existing and future instruments, including EPICS and METIS for the E-ELT.

  14. Fluorescence Intensity- and Lifetime-Based Glucose Sensing Using Glucose/Galactose-Binding Protein

    PubMed Central

    Pickup, John C.; Khan, Faaizah; Zhi, Zheng-Liang; Coulter, Jonathan; Birch, David J. S.

    2013-01-01

    We review progress in our laboratories toward developing in vivo glucose sensors for diabetes that are based on fluorescence labeling of glucose/galactose-binding protein. Measurement strategies have included both monitoring glucose-induced changes in fluorescence resonance energy transfer and labeling with the environmentally sensitive fluorophore badan. Measuring fluorescence lifetime rather than intensity has particular potential advantages for in vivo sensing. A prototype fiber-optic-based glucose sensor using this technology is being tested. Fluorescence is one of the major technologies for achieving a continuous and noninvasive glucose sensor for diabetes. In this article, a highly sensitive nanostructured sensor is developed to detect extremely small amounts of aqueous glucose by applying fluorescence resonance energy transfer (FRET). A one-pot method is applied to produce dextran-fluorescein isothiocyanate (FITC)-conjugated mesoporous silica nanoparticles (MSNs), which afterward interact with tetramethylrhodamine isothiocyanate (TRITC)-labeled concanavalin A (Con A) to form FRET nanoparticles (FITC-dextran-Con A-TRITC@MSNs). The nanostructured glucose sensor is then formed via the self-assembly of the FRET nanoparticles on a transparent, flexible, and biocompatible substrate, e.g., poly(dimethylsiloxane). Our results indicate the diameter of the MSNs is 60 ± 5 nm. The difference in the images before and after adding 20 μl of glucose (0.10 mmol/liter) on the FRET sensor can be detected in less than 2 min by confocal laser scanning microscopy. The correlation between the ratio of fluorescence intensity, I(donor)/I(acceptor), of the FRET sensor and the concentration of aqueous glucose in the range of 0.04-4 mmol/liter has been investigated; a linear relationship is found. Furthermore, the durability of the nanostructured FRET sensor was evaluated for 5 days. In addition, the recorded images can be converted to digital images by obtaining the pixels from the resulting matrix using Matlab image processing functions. We have also studied the in vitro cytotoxicity of the device. The nanostructured FRET sensor may provide an alternative method to help patients manage the disease continuously. PMID:23439161

  15. CMOS Image Sensors: Electronic Camera On A Chip

    NASA Technical Reports Server (NTRS)

    Fossum, E. R.

    1995-01-01

    Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On- chip analog to digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low cost uses.

  16. A non-disruptive technology for robust 3D tool tracking for ultrasound-guided interventions.

    PubMed

    Mung, Jay; Vignon, Francois; Jain, Ameet

    2011-01-01

    In the past decade ultrasound (US) has become the preferred modality for a number of interventional procedures, offering excellent soft tissue visualization. The main limitation, however, is poor visualization of surgical tools. A new method is proposed for robust 3D tracking and US image enhancement of surgical tools under US guidance. Small US sensors are mounted on existing surgical tools. As the imager emits acoustic energy, the electrical signal from the sensor is analyzed to reconstruct its 3D coordinates. These coordinates can then be used for 3D surgical navigation, similar to current-day tracking systems. A system with real-time 3D tool tracking and image enhancement was implemented on a commercial ultrasound scanner and 3D probe. Extensive water tank experiments with a tracked 0.2 mm sensor show robust performance in a wide range of imaging conditions and tool positions/orientations. The 3D tracking accuracy was 0.36 ± 0.16 mm throughout the imaging volume of 55° × 27° × 150 mm. Additionally, the tool was successfully tracked inside a beating heart phantom. This paper proposes an image enhancement and tool tracking technology with sub-mm accuracy for US-guided interventions. The technology is non-disruptive, both in terms of existing clinical workflow and commercial considerations, showing promise for large-scale clinical impact.
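
    The abstract does not spell out the reconstruction algorithm, but a plausible geometric core for recovering a sensor's 3D coordinates from pulse arrival times is range-based multilateration from known transmit-element positions. A minimal sketch, assuming one-way times of flight and a nominal tissue sound speed of 1540 m/s (both assumptions, not details from the paper):

    ```python
    import numpy as np

    C = 1540.0  # assumed speed of sound in soft tissue, m/s

    def locate(tx_positions, arrival_times):
        """Estimate a sensor's 3D position from one-way times of flight.

        tx_positions : (N, 3) known transmit-element positions, metres
        arrival_times: (N,)   pulse arrival times at the tool sensor, seconds
        Linearized range-based multilateration (needs N >= 4, non-coplanar).
        """
        p = np.asarray(tx_positions, float)
        r = C * np.asarray(arrival_times, float)   # ranges, metres
        A = 2.0 * (p[1:] - p[0])
        b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
             + r[0] ** 2 - r[1:] ** 2)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x

    # Quick self-check with a synthetic sensor location.
    tx = np.array([[0.0, 0, 0], [0.02, 0, 0], [0, 0.02, 0], [0.01, 0.01, 0.005]])
    truth = np.array([0.01, 0.005, 0.08])
    t = np.linalg.norm(tx - truth, axis=1) / C
    print(locate(tx, t))   # ~ [0.01, 0.005, 0.08]
    ```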

  17. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    NASA Astrophysics Data System (ADS)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies imagery can be captured at high data rates resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment which builds the foundation for a wide range of 3D mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3D mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case of the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS based trajectory with independently estimated positions in cases of prolonged GNSS signal outages, in order to increase the georeferencing accuracy up to the project requirements.

  18. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    NASA Astrophysics Data System (ADS)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretability Rating Scale). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.
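
    For readers unfamiliar with GIQE-based NIIRS prediction: the sketch below follows the version 4 form of the equation as published in the open literature, not the STAR implementation itself, and the example inputs are hypothetical.

    ```python
    import math

    def giqe4_niirs(gsd_inches, rer, overshoot_h, gain_g, snr):
        """Open-literature GIQE version 4 estimate of NIIRS.

        gsd_inches : geometric-mean ground sample distance, inches
        rer        : geometric-mean relative edge response
        overshoot_h: edge overshoot from MTF compensation
        gain_g     : noise gain from MTF compensation
        snr        : signal-to-noise ratio
        """
        if rer >= 0.9:
            a, b = 3.32, 1.559
        else:
            a, b = 3.16, 2.817
        return (10.251 - a * math.log10(gsd_inches) + b * math.log10(rer)
                - 0.656 * overshoot_h - 0.344 * gain_g / snr)

    # Example: ~10-inch GSD, good edge response, moderate noise -> NIIRS ~6.2.
    print(round(giqe4_niirs(gsd_inches=10.0, rer=0.95, overshoot_h=1.0,
                            gain_g=10.0, snr=50.0), 2))
    ```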

  19. Novel eye-safe line scanning 3D laser-radar

    NASA Astrophysics Data System (ADS)

    Eberle, B.; Kern, Tobias; Hammer, Marcus; Schwanke, Ullrich; Nowak, Heinrich

    2014-10-01

    Today, the civil market provides a number of different 3D sensors covering ranges up to 1 km. Typically these sensors are based on single-element detectors, which suffer from poor spatial resolution at larger distances. Tasks demanding reliable object classification at long ranges can be fulfilled only by sensors consisting of detector arrays, which ensure sufficient frame rates and high spatial resolution. Worldwide, there are many efforts to develop 3D detectors based on two-dimensional arrays. This paper presents first results on the performance of a recently developed 3D imaging laser radar sensor working in the short wave infrared (SWIR) at 1.5 μm. It consists of a novel Cadmium Mercury Telluride (CMT) linear array APD detector with 384 × 1 elements at a pitch of 25 μm, developed by AIM Infrarot Module GmbH. The APD elements are designed to work in the linear (non-Geiger) mode. Each pixel provides a time-of-flight measurement and, due to the linear detection mode, allows the detection of three successive echoes. The depth resolution is 15 cm and the maximum repetition rate is 4 kHz. We discuss various sensor concepts regarding possible applications and their dependence on system parameters such as field of view, frame rate, spatial resolution and range of operation.
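
    The quoted figures follow directly from the time-of-flight relation R = c·Δt/2: a 15 cm depth resolution corresponds to roughly 1 ns of timing resolution. A minimal sketch:

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def range_from_tof(delta_t_s):
        """Round-trip time of flight -> one-way range, metres."""
        return C * delta_t_s / 2.0

    def timing_for_depth_resolution(dr_m):
        """Timing resolution needed for a given depth resolution."""
        return 2.0 * dr_m / C

    print(range_from_tof(6.67e-6))            # ~1 km target: ~6.67 us round trip
    print(timing_for_depth_resolution(0.15))  # 15 cm -> ~1.0e-9 s
    ```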

  20. Model development and system performance optimization for staring infrared search and track (IRST) sensors

    NASA Astrophysics Data System (ADS)

    Olson, Craig; Theisen, Michael; Pace, Teresa; Halford, Carl; Driggers, Ronald

    2016-05-01

    The mission of an Infrared Search and Track (IRST) system is to detect and locate (sometimes called find and fix) enemy aircraft at significant ranges. Two extreme opposite examples of IRST applications are 1) long range offensive aircraft detection when electronic warfare equipment is jammed, compromised, or intentionally turned off, and 2) distributed aperture systems where enemy aircraft may be in the proximity of the host aircraft. Past IRST systems have been primarily long range offensive systems that were based on the LWIR second-generation thermal imager. The new IRST systems are primarily based on staring infrared focal planes and sensors. In the same manner that FLIR92 did not work well in the design of staring infrared cameras (NVTherm was developed to address staring infrared sensor performance), current modeling techniques do not adequately describe the performance of a staring IRST sensor. There are no standard military IRST models (per AFRL and NAVAIR), and each program appears to perform its own modeling. For this reason, L-3 has decided to develop a corporate model, working with AFRL and NAVAIR, for the analysis, design, and evaluation of IRST concepts, programs, and solutions. This paper provides some of the first analyses in the L-3 IRST model development program for the optimization of staring IRST sensors.

  1. Developing a Ruggedized User-Friendly UAS for Monitoring Volcanic Emissions

    NASA Astrophysics Data System (ADS)

    Wardell, L. J.; Elston, J. S.; Stachura, M.

    2017-12-01

    Using lessons learned from a history of airborne volcano measurements and a range of UAS R&D, a reliable and ruggedized UAS is being developed specifically for volcano monitoring and response. A key feature is the user interface (UI) that allows for a menu of automated flight plans that will account for terrain and sensor requirements. Due to variation in the response times of miniaturized airborne sensors, flight plan options are extended to account for sensor lag when needed. By automating such complicating variables into the UI, the amount of background and training needed for operation is further minimized. Payload options include simultaneous in situ gas and particle sensors combined with downward-looking imagers to provide a wide range of data products. The system is currently under development by Black Swift Technologies; the latest updates and test results will be presented. Specifications of the Superswift airframe include a 6,000 m flight ceiling, 2.4 kg payload capacity, and 2 hr endurance.

  2. Orbital navigation, docking and obstacle avoidance as a form of three dimensional model-based image understanding

    NASA Technical Reports Server (NTRS)

    Beyer, J.; Jacobus, C.; Mitchell, B.

    1987-01-01

    Range imagery from a laser scanner can be used to provide sufficient information for docking and obstacle avoidance procedures to be performed automatically. Three dimensional model-based computer vision algorithms in development can perform these tasks even with targets which may not be cooperative (that is, objects without special targets or markers to provide unambiguous location points). Roll, pitch and yaw of the vehicle can be taken into account as image scanning takes place, so that these can be corrected when the image is converted from egocentric to world coordinates. Other attributes of the sensor, such as the registered reflectance and texture channels, provide additional data sources for algorithm robustness. Temporal fusion of sensor images can take place in the world coordinate domain, allowing for the building of complex maps in three dimensional space.
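
    A minimal sketch of the egocentric-to-world conversion described above, assuming a Z-Y-X (yaw-pitch-roll) Euler convention and a known vehicle position; the convention and the values are illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    def rotation_rpy(roll, pitch, yaw):
        """World-from-body rotation for Z-Y-X (yaw-pitch-roll) Euler angles, radians."""
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx

    def egocentric_to_world(points_body, rpy, vehicle_position):
        """Map (N, 3) sensor-frame range returns into world coordinates."""
        R = rotation_rpy(*rpy)
        return points_body @ R.T + vehicle_position

    pts = np.array([[10.0, 0.0, 0.0]])                     # 10 m straight ahead
    print(egocentric_to_world(pts, (0.0, 0.0, np.pi / 2),  # vehicle yawed 90 deg
                              np.array([100.0, 50.0, 0.0])))  # -> [100, 60, 0]
    ```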

  3. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling

    PubMed Central

    Tang, Shengjun; Zhu, Qing; Chen, Wu; Darwish, Walid; Wu, Bo; Hu, Han; Chen, Min

    2016-01-01

    RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach to geometrically integrate the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from depth images. First, precise calibration of the RGB-D sensor is introduced. In addition to the calibration of the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the RGB camera and the IR camera is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined method for rejecting false feature matches is introduced by combining the depth information and the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. In order to eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity problem encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated with the publicly available benchmark datasets collected with Kinect. Then, the proposed method is examined by tests with two sets of datasets collected in both outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method. PMID:27690028

  4. Enhanced RGB-D Mapping Method for Detailed 3D Indoor and Outdoor Modeling.

    PubMed

    Tang, Shengjun; Zhu, Qing; Chen, Wu; Darwish, Walid; Wu, Bo; Hu, Han; Chen, Min

    2016-09-27

    RGB-D sensors (sensors with an RGB camera and a depth camera) are novel sensing systems that capture RGB images along with pixel-wise depth information. Although they are widely used in various applications, RGB-D sensors have significant drawbacks with respect to 3D dense mapping, including limited measurement ranges (e.g., within 3 m) and depth measurement errors that increase with distance from the sensor. In this paper, we present a novel approach to geometrically integrate the depth scene and the RGB scene to enlarge the measurement distance of RGB-D sensors and enrich the details of the model generated from depth images. First, precise calibration of the RGB-D sensor is introduced. In addition to the calibration of the internal and external parameters of both the IR camera and the RGB camera, the relative pose between the RGB camera and the IR camera is also calibrated. Second, to ensure the pose accuracy of the RGB images, a refined method for rejecting false feature matches is introduced by combining the depth information and the initial camera poses between frames of the RGB-D sensor. Then, a global optimization model is used to improve the accuracy of the camera poses, decreasing the inconsistencies between the depth frames in advance. In order to eliminate the geometric inconsistencies between the RGB scene and the depth scene, the scale ambiguity problem encountered during pose estimation with RGB image sequences is resolved by integrating the depth and visual information, and a robust rigid-transformation recovery method is developed to register the RGB scene to the depth scene. The benefit of the proposed joint optimization method is first evaluated with the publicly available benchmark datasets collected with Kinect. Then, the proposed method is examined by tests with two sets of datasets collected in both outdoor and indoor environments. The experimental results demonstrate the feasibility and robustness of the proposed method.
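
    The scale ambiguity of monocular RGB structure makes the depth-to-RGB registration a similarity problem (scale + rotation + translation). The authors' robust recovery method is not reproduced here; the sketch below shows the standard Umeyama-style closed-form estimate for matched point pairs, as one plausible core of such a step.

    ```python
    import numpy as np

    def similarity_transform(src, dst):
        """Umeyama-style closed-form estimate of scale s, rotation R and
        translation t minimising ||dst - (s * R @ src + t)||^2 over matched
        3D points, e.g. up-to-scale RGB structure vs. metric depth points."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        sc, dc = src - mu_s, dst - mu_d
        cov = dc.T @ sc / len(src)                    # cross-covariance
        U, D, Vt = np.linalg.svd(cov)
        S = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:  # guard against reflection
            S[2, 2] = -1.0
        R = U @ S @ Vt
        s = np.trace(np.diag(D) @ S) / (sc ** 2).sum() * len(src)
        t = mu_d - s * R @ mu_s
        return s, R, t

    # Self-check: recover a known similarity transform.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    a = 0.3
    R_true = np.array([[np.cos(a), -np.sin(a), 0],
                       [np.sin(a),  np.cos(a), 0],
                       [0.0,        0.0,       1]])
    Y = 2.5 * X @ R_true.T + np.array([1.0, -2.0, 0.5])
    s, R, t = similarity_transform(X, Y)
    print(round(s, 3), np.round(t, 3))   # ~2.5, ~[1, -2, 0.5]
    ```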

  5. A 75-ps Gated CMOS Image Sensor with Low Parasitic Light Sensitivity

    PubMed Central

    Zhang, Fan; Niu, Hanben

    2016-01-01

    In this study, a 40 × 48 pixel global shutter complementary metal-oxide-semiconductor (CMOS) image sensor with an adjustable shutter time as low as 75 ps was implemented using a 0.5-μm mixed-signal CMOS process. The implementation consisted of a continuous contact ring around each p+/n-well photodiode in the pixel array in order to apply sufficient light shielding. The parasitic light sensitivity of the in-pixel storage node was measured to be 1/8.5 × 10⁷ when illuminated by a 405-nm diode laser and 1/1.4 × 10⁴ when illuminated by a 650-nm diode laser. The pixel pitch was 24 μm, the size of the square p+/n-well photodiode in each pixel was 7 μm per side, the measured random readout noise was 217 e⁻ rms, and the measured dynamic range of the pixel of the designed chip was 5500:1. The type of gated CMOS image sensor (CIS) that is proposed here can be used in ultra-fast framing cameras to observe non-repeatable fast-evolving phenomena. PMID:27367699

  6. A 75-ps Gated CMOS Image Sensor with Low Parasitic Light Sensitivity.

    PubMed

    Zhang, Fan; Niu, Hanben

    2016-06-29

    In this study, a 40 × 48 pixel global shutter complementary metal-oxide-semiconductor (CMOS) image sensor with an adjustable shutter time as low as 75 ps was implemented using a 0.5-μm mixed-signal CMOS process. The implementation consisted of a continuous contact ring around each p+/n-well photodiode in the pixel array in order to apply sufficient light shielding. The parasitic light sensitivity of the in-pixel storage node was measured to be 1/8.5 × 10⁷ when illuminated by a 405-nm diode laser and 1/1.4 × 10⁴ when illuminated by a 650-nm diode laser. The pixel pitch was 24 μm, the size of the square p+/n-well photodiode in each pixel was 7 μm per side, the measured random readout noise was 217 e⁻ rms, and the measured dynamic range of the pixel of the designed chip was 5500:1. The type of gated CMOS image sensor (CIS) that is proposed here can be used in ultra-fast framing cameras to observe non-repeatable fast-evolving phenomena.

  7. Microscopic resolution broadband dielectric spectroscopy

    NASA Astrophysics Data System (ADS)

    Mukherjee, S.; Watson, P.; Prance, R. J.

    2011-08-01

    Results are presented for a non-contact measurement system capable of micron-level spatial resolution. It utilises the novel electric potential sensor (EPS) technology, invented at Sussex, to image the electric field above a simple composite dielectric material. EP sensors may be regarded as analogous to magnetometers and require no adjustments or offsets during either setup or use. The sample consists of a standard glass/epoxy FR4 circuit board, with linear defects machined into the surface by a PCB milling machine. The sample is excited with an a.c. signal over a range of frequencies from 10 kHz to 10 MHz, from the reverse side, by placing it on a conducting sheet connected to the source. The single sensor is raster scanned over the surface at a constant working distance, consistent with the spatial resolution, in order to build up an image of the electric field with respect to the reference potential. The results demonstrate that both the surface defects and the internal dielectric variations within the composite may be imaged in this way, with good contrast being observed between the glass mat and the epoxy resin.

  8. Optimizing Radiometric Fidelity to Enhance Aerial Image Change Detection Utilizing Digital Single Lens Reflex (DSLR) Cameras

    NASA Astrophysics Data System (ADS)

    Kerr, Andrew D.

    Determining optimal imaging settings and best practices related to the capture of aerial imagery using consumer-grade digital single lens reflex (DSLR) cameras should enable remote sensing scientists to generate consistent, high quality, and low cost image data sets. Radiometric optimization, image fidelity, and image capture consistency and repeatability were evaluated in the context of detailed image-based change detection. The impetus for this research is, in part, a dearth of relevant, contemporary literature on the utilization of consumer-grade DSLR cameras for remote sensing, and the best practices associated with their use. The main radiometric control settings on a DSLR camera, EV (Exposure Value), WB (White Balance), light metering, ISO, and aperture (f-stop), are variables that were altered and controlled over the course of several image capture missions. These variables were compared for their effects on dynamic range, intra-frame brightness variation, visual acuity, temporal consistency, and the detectability of simulated cracks placed in the images. This testing was conducted from a terrestrial, rather than airborne, collection platform, due to the large number of images per collection and the desire to minimize inter-image misregistration. The results point to a range of slightly underexposed exposure values as preferable for change detection fidelity and noise minimization. The makeup of the scene, the sensor, and the aerial platform influence the selection of the aperture and shutter speed, which, along with other variables, allow for estimation of the apparent image motion (AIM) blur in the resulting images. The importance of image edges in the application will in part dictate the lowest usable f-stop, and allow the user to select a more optimal shutter speed and ISO. The single most important camera capture variable is exposure bias (EV), with a full dynamic range, wide distribution of DN values, and high visual contrast and acuity occurring around -0.7 to -0.3 EV exposure bias. The ideal value for sensor gain was found to be ISO 100, with ISO 200 less desirable. This study offers researchers a better understanding of the effects of camera capture settings on RSI pairs and their influence on image-based change detection.
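
    The apparent-image-motion estimate mentioned above reduces, in its simplest form, to blur (in pixels) = ground speed × exposure time / GSD, with GSD = altitude × pixel pitch / focal length for a nadir frame camera. A sketch with hypothetical platform values:

    ```python
    def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
        """Nadir GSD for a frame camera: GSD = H * p / f (metres/pixel)."""
        return altitude_m * pixel_pitch_m / focal_length_m

    def motion_blur_pixels(ground_speed_mps, exposure_s, gsd_m):
        """Apparent image motion during the exposure, in pixels."""
        return ground_speed_mps * exposure_s / gsd_m

    # Hypothetical platform: 300 m altitude, 4.8 um pixels, 35 mm lens,
    # 30 m/s ground speed, 1/1000 s shutter.
    gsd = ground_sample_distance(300.0, 4.8e-6, 0.035)       # ~4.1 cm/pixel
    print(round(gsd, 3))
    print(round(motion_blur_pixels(30.0, 1 / 1000, gsd), 2))  # ~0.73 px of blur
    ```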

  9. Cross-calibration of MODIS with ETM+ and ALI sensors for long-term monitoring of land surface processes

    USGS Publications Warehouse

    Meyer, D.; Chander, G.

    2006-01-01

    Increasingly, data from multiple sensors are used to gain a more complete understanding of land surface processes at a variety of scales. Although higher-level products (e.g., vegetation cover, albedo, surface temperature) derived from different sensors can be validated independently, the degree to which these sensors and their products can be compared to one another is vastly improved if their relative spectroradiometric responses are known. Most often, sensors are directly calibrated to diffuse solar irradiation or vicariously to ground targets. However, space-based targets are not traceable to metrological standards, and vicarious calibrations are expensive and provide a poor sampling of a sensor's full dynamic range. Cross-calibration of two sensors can augment these methods if certain conditions can be met: (1) the spectral responses are similar, (2) the observations are reasonably concurrent (similar atmospheric & solar illumination conditions), (3) errors due to misregistrations of inhomogeneous surfaces can be minimized (including scale differences), and (4) the viewing geometry is similar (or, some reasonable knowledge of surface bi-directional reflectance distribution functions is available). This study explores the impacts of cross-calibrating sensors when such conditions are met to some degree but not perfectly. In order to constrain the range of conditions at some level, the analysis is limited to sensors where cross-calibration studies have been conducted (Enhanced Thematic Mapper Plus (ETM+) on Landsat-7 (L7), Advanced Land Imager (ALI) and Hyperion on Earth Observing-1 (EO-1)) and including systems having somewhat dissimilar geometry, spatial resolution & spectral response characteristics but are still part of the so-called "A.M. constellation" (Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Terra platform). Measures for spectral response differences and methods for cross-calibrating such sensors are provided in this study. These instruments are cross calibrated using the Railroad Valley playa in Nevada. Best-fit linear coefficients (slope and offset) are provided for ALI-to-MODIS and ETM+-to-MODIS cross calibrations, and root-mean-squared errors (RMSEs) and correlation coefficients are provided to quantify the uncertainty in these relationships. In theory, the linear fits and uncertainties can be used to compare radiance and reflectance products derived from each instrument.
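
    A minimal sketch of the cross-calibration fit itself: a best-fit slope and offset between near-coincident band values, with RMSE and correlation to quantify the uncertainty. The data values below are hypothetical stand-ins for matched Railroad Valley observations, not the study's numbers.

    ```python
    import numpy as np

    # Hypothetical matched TOA reflectances (one row per coincident observation,
    # ETM+ band vs. a spectrally similar MODIS band).
    etm   = np.array([0.18, 0.22, 0.25, 0.31, 0.35, 0.41])
    modis = np.array([0.175, 0.216, 0.246, 0.305, 0.346, 0.402])

    slope, offset = np.polyfit(etm, modis, 1)   # MODIS ~ slope * ETM+ + offset
    pred = slope * etm + offset
    rmse = np.sqrt(np.mean((modis - pred) ** 2))
    r = np.corrcoef(etm, modis)[0, 1]

    print(f"slope={slope:.4f} offset={offset:.4f} RMSE={rmse:.5f} r={r:.5f}")
    ```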

  10. The Effect of a Pre-Lens Aperture on the Temperature Range and Image Uniformity of Microbolometer Infrared Cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinwiddie, Ralph Barton; Parris, Larkin S.; Lindal, John M.

    This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens will reduce the radiance reaching the sensor, allowing the camera to image targets much hotter than typically allowable. These higher temperatures were accurately determined after developing a correction factor which was applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on the image uniformity is a form of anti-vignetting, meaning the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.

  11. High dynamic range pixel architecture for advanced diagnostic medical x-ray imaging applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Izadi, Mohammad Hadi; Karim, Karim S.

    2006-05-15

    The most widely used architecture in large-area amorphous silicon (a-Si) flat panel imagers is a passive pixel sensor (PPS), which consists of a detector and a readout switch. While the PPS has the advantage of being compact and amenable toward high-resolution imaging, small PPS output signals are swamped by external column charge amplifier and data line thermal noise, which reduce the minimum readable sensor input signal. In contrast to PPS circuits, on-pixel amplifiers in a-Si technology reduce readout noise to levels that can meet even the stringent requirements for low noise digital x-ray fluoroscopy (<1000 noise electrons). However, larger voltages at the pixel input cause the output of the amplified pixel to become nonlinear, thus reducing the dynamic range. We reported a hybrid amplified pixel architecture based on a combination of PPS and amplified pixel designs that, in addition to low noise performance, also resulted in large-signal linearity and consequently higher dynamic range [K. S. Karim et al., Proc. SPIE 5368, 657 (2004)]. The additional benefit in large-signal linearity, however, came at the cost of an additional pixel transistor. We present an amplified pixel design that achieves the goals of low noise performance and large-signal linearity without the need for an additional pixel transistor. Theoretical calculations and simulation results for noise indicate the applicability of the amplified a-Si pixel architecture for high dynamic range, medical x-ray imaging applications that require switching between low-exposure, real-time fluoroscopy and high-exposure radiography.

  12. Results of ACTIM: an EDA study on spectral laser imaging

    NASA Astrophysics Data System (ADS)

    Hamoir, Dominique; Hespel, Laurent; Déliot, Philippe; Boucher, Yannick; Steinvall, Ove; Ahlberg, Jörgen; Larsson, Hakan; Letalick, Dietmar; Lutzmann, Peter; Repasi, Endre; Ritt, Gunnar

    2011-11-01

    The European Defence Agency (EDA) launched the Active Imaging (ACTIM) study to investigate the potential of active imaging, especially that of spectral laser imaging. The work included a literature survey, the identification of promising military applications, system analyses, a roadmap and recommendations. Passive multi- and hyper-spectral imaging allows discrimination between materials, but the measured radiance at the sensor is difficult to relate to spectral reflectance due to its dependence on, e.g., solar angle, clouds and shadows. In contrast, active spectral imaging offers complete control of the illumination, thus eliminating these effects. In addition it allows observing details at long ranges, seeing through degraded atmospheric conditions, penetrating obscurants (foliage, camouflage...) or retrieving polarization information. When 3D, it is suited to producing numerical terrain models and to performing geometry-based identification. Hence fusing the knowledge of ladar and passive spectral imaging will result in new capabilities. We have identified three main application areas for active imaging, and for spectral active imaging in particular: (1) long range observation for identification, (2) mid-range mapping for reconnaissance, (3) shorter range perception for threat detection. We present the system analyses that have been performed to confirm the merits, limitations and requirements of spectral active imaging in these three prioritized applications.

  13. 77 FR 26787 - Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... INTERNATIONAL TRADE COMMISSION [Docket No. 2895] Certain CMOS Image Sensors and Products.... International Trade Commission has received a complaint entitled Certain CMOS Image Sensors and Products... importation, and the sale within the United States after importation of certain CMOS image sensors and...

  14. Single-Photon Detectors for Time-of-Flight Range Imaging

    NASA Astrophysics Data System (ADS)

    Stoppa, David; Simoni, Andrea

    We live in a three-dimensional (3D) world and, thanks to the stereoscopic vision provided by our two eyes in combination with the powerful neural network of the brain, we are able to perceive the distance of objects. Nevertheless, despite the huge market volume of digital cameras, solid-state image sensors can capture only a two-dimensional (2D) projection of the scene under observation, losing a variable of paramount importance, i.e., the scene depth. On the contrary, 3D vision tools could offer amazing possibilities of improvement in many areas thanks to the increased accuracy and reliability of the models representing the environment. Among the great variety of distance measuring techniques and detection systems available, this chapter will treat only the emerging niche of solid-state, scannerless systems based on the TOF principle and using detectors built from SPAD-based pixels. The chapter is organized into three main parts. First, TOF systems and measuring techniques will be described. In the second part, the most meaningful sensor architectures for scannerless TOF distance measurements will be analyzed, focusing on the circuit building blocks required by time-resolved image sensors. Finally, a performance summary is provided and a perspective on near-future developments of SPAD TOF sensors is given.
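
    In direct time-of-flight SPAD sensors of the kind surveyed here, each pixel typically accumulates photon-arrival timestamps into a histogram whose peak bin gives the round-trip delay and hence the range. A minimal sketch, assuming a 100 ps timing bin (an illustrative value, not one from the chapter):

    ```python
    import numpy as np

    C = 299_792_458.0
    BIN_S = 100e-12            # assumed TDC bin width: 100 ps -> 1.5 cm range bins

    def range_from_timestamps(timestamps_s, n_bins=1024):
        """Direct-TOF estimate: histogram photon arrivals, take the peak bin."""
        hist, edges = np.histogram(timestamps_s, bins=n_bins,
                                   range=(0.0, n_bins * BIN_S))
        peak = np.argmax(hist)
        t_peak = (edges[peak] + edges[peak + 1]) / 2.0
        return C * t_peak / 2.0

    # Synthetic pixel: uniform background counts plus a laser return at ~10 m.
    rng = np.random.default_rng(1)
    bg = rng.uniform(0.0, 1024 * BIN_S, size=2000)
    echo = rng.normal(2 * 10.0 / C, 50e-12, size=400)   # round trip for 10 m
    print(round(range_from_timestamps(np.concatenate([bg, echo])), 2))  # ~10.0
    ```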

  15. A Multi-Resolution Approach for an Automated Fusion of Different Low-Cost 3D Sensors

    PubMed Central

    Dupuis, Jan; Paulus, Stefan; Behmann, Jan; Plümer, Lutz; Kuhlmann, Heiner

    2014-01-01

    The 3D acquisition of object structures has become a common technique in many fields of work, e.g., industrial quality management, cultural heritage or crime scene documentation. The requirements on the measuring devices are varied, because spacious scenes have to be imaged along with a high level of detail for selected objects. Thus, the measuring systems used are expensive and require an experienced operator. With the rise of low-cost 3D imaging systems, their integration into the digital documentation process is possible. However, common low-cost sensors have the limitation of a trade-off between range and accuracy, providing either a low resolution of single objects or a limited imaging field. Therefore, the use of multiple sensors is desirable. We show the combined use of two low-cost sensors, the Microsoft Kinect and the David laserscanning system, to achieve low-resolution scans of the whole scene and a high level of detail for selected objects, respectively. Afterwards, the high-resolution David objects are automatically assigned to their corresponding Kinect objects by the use of surface feature histograms and SVM classification. The corresponding objects are fitted using an ICP implementation to produce a multi-resolution map. The applicability is shown for a fictional crime scene and the reconstruction of a ballistic trajectory. PMID:24763255

  16. LWIR hyperspectral imaging application and detection of chemical precursors

    NASA Astrophysics Data System (ADS)

    Lavoie, Hugo; Thériault, Jean-Marc; Bouffard, François; Puckrin, Eldon; Dubé, Denis

    2012-10-01

    Detection and identification of toxic industrial chemicals (TICs) represent a major challenge for protecting first responders and the public. In this context, passive Hyperspectral Imaging (HSI) is a promising technology for the standoff detection and identification of chemical vapors emanating from a distant location. To investigate this method, the Department of National Defence and Public Safety Canada have mandated Defence Research and Development Canada (DRDC) - Valcartier to develop and test Very Long Wave Infrared (VLWIR) HSI sensors for standoff detection. The initial effort was focused on the standoff detection and identification of TICs, surrogates and precursors. Sensors such as the Improved Compact ATmospheric Sounding Interferometer (iCATSI) and the Multi-option Differential Detection and Imaging Fourier Spectrometer (MoDDIFS) were developed for this application. This paper presents the sensor developments and preliminary results of standoff detection and identification of TICs and precursors. The iCATSI and MoDDIFS sensors are based on the optical differential Fourier-transform infrared (FTIR) radiometric technology and are able to detect, spectrally resolve and identify small leaks at ranges in excess of 1 km. Results from a series of trials in asymmetric threat type scenarios are reported. These results serve to establish the potential of passive standoff HSI detection of TICs, precursors and surrogates.

  17. Detection of chemical pollutants by passive LWIR hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Lavoie, Hugo; Thériault, Jean-Marc; Bouffard, François; Puckrin, Eldon; Dubé, Denis

    2012-09-01

    Toxic industrial chemicals (TICs) represent a major threat to public health and security. Their detection constitutes a real challenge for the security and first-responder communities. One promising detection method is based on the passive standoff identification of chemical vapors emanating from the laboratory under surveillance. To investigate this method, the Department of National Defence and Public Safety Canada have mandated Defence Research and Development Canada (DRDC) - Valcartier to develop and test passive Long Wave Infrared (LWIR) hyperspectral imaging (HSI) sensors for standoff detection. The initial effort was focused on the standoff detection and identification of TICs and precursors. Sensors such as the Multi-option Differential Detection and Imaging Fourier Spectrometer (MoDDIFS) and the Improved Compact ATmospheric Sounding Interferometer (iCATSI) were developed for this application. This paper describes the sensor developments and presents initial results of standoff detection and identification of TICs and precursors. The standoff sensors are based on the differential Fourier-transform infrared (FTIR) radiometric technology and are able to detect, spectrally resolve and identify small leak plumes at ranges in excess of 1 km. Results from a series of trials in asymmetric threat type scenarios will be presented. These results will serve to establish the potential of the method for standoff detection of TICs, precursors and surrogates.

  18. A multi-resolution approach for an automated fusion of different low-cost 3D sensors.

    PubMed

    Dupuis, Jan; Paulus, Stefan; Behmann, Jan; Plümer, Lutz; Kuhlmann, Heiner

    2014-04-24

    The 3D acquisition of object structures has become a common technique in many fields of work, e.g., industrial quality management, cultural heritage or crime scene documentation. The requirements on the measuring devices are varied, because spacious scenes have to be imaged along with a high level of detail for selected objects. Thus, the measuring systems used are expensive and require an experienced operator. With the rise of low-cost 3D imaging systems, their integration into the digital documentation process is possible. However, common low-cost sensors have the limitation of a trade-off between range and accuracy, providing either a low resolution of single objects or a limited imaging field. Therefore, the use of multiple sensors is desirable. We show the combined use of two low-cost sensors, the Microsoft Kinect and the David laserscanning system, to achieve low-resolution scans of the whole scene and a high level of detail for selected objects, respectively. Afterwards, the high-resolution David objects are automatically assigned to their corresponding Kinect objects by the use of surface feature histograms and SVM classification. The corresponding objects are fitted using an ICP implementation to produce a multi-resolution map. The applicability is shown for a fictional crime scene and the reconstruction of a ballistic trajectory.
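
    A minimal sketch of the point-to-point ICP fitting step named above (not the authors' exact implementation): nearest-neighbour correspondences from a KD-tree, followed by a closed-form SVD rigid update, iterated. As in the paper, it converges only from a reasonable initial alignment, here provided by the SVM-based object assignment.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def best_rigid(src, dst):
        """Least-squares rotation/translation mapping src onto dst (matched rows)."""
        mu_s, mu_d = src.mean(0), dst.mean(0)
        U, _, Vt = np.linalg.svd((dst - mu_d).T @ (src - mu_s))
        S = np.eye(3)
        S[2, 2] = np.sign(np.linalg.det(U @ Vt))   # guard against reflection
        R = U @ S @ Vt
        return R, mu_d - R @ mu_s

    def icp(src, dst, iters=30):
        """Point-to-point ICP: align a high-res scan (src) to a low-res scan (dst)."""
        tree = cKDTree(dst)
        cur = src.copy()
        for _ in range(iters):
            _, idx = tree.query(cur)               # nearest-neighbour matches
            R, t = best_rigid(cur, dst[idx])
            cur = cur @ R.T + t
        return cur

    # Toy self-check: a rotated/translated copy snaps back onto the original.
    rng = np.random.default_rng(2)
    Q = rng.normal(size=(200, 3))
    a = 0.2
    R0 = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0], [0, 0, 1]])
    P = Q @ R0.T + np.array([0.3, -0.1, 0.2])
    print(np.abs(icp(P, Q) - Q).max())             # small residual
    ```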

  19. Measuring Pulse Rate Variability using Long-Range, Non-Contact Imaging Photoplethysmography

    DTIC Science & Technology

    2016-08-20

    Previous demonstrations of imaging photoplethysmography have been limited to imager-to-subject distances of no more than 3 meters. In this study, video was recorded from 19 participants, while at rest, at a distance of 25 meters from the imaging sensor. The results demonstrate that pulse rates with less than one beat-per-minute error can be obtained at long imager-to-subject distances.

  20. Development of CCD imaging sensors for space applications, phase 1

    NASA Technical Reports Server (NTRS)

    Antcliffe, G. A.

    1975-01-01

    The results of an experimental investigation to develop a large area charge coupled device (CCD) imager for space photography applications are described. Details of the design and processing required to achieve 400 × 400 imagers are presented together with a discussion of the optical characterization techniques developed for this program. A discussion of several aspects of large CCD performance is given with detailed test reports. The areas covered include dark current, uniformity of optical response, square wave amplitude response, spectral responsivity and dynamic range.

  1. Noise-based body-wave seismic tomography in an active underground mine.

    NASA Astrophysics Data System (ADS)

    Olivier, G.; Brenguier, F.; Campillo, M.; Lynch, R.; Roux, P.

    2014-12-01

    Over the last decade, ambient noise tomography has become increasingly popular for imaging the earth's upper crust. The seismic noise recorded in the earth's crust is dominated by surface waves emanating from the interaction of the ocean with the solid earth. These surface waves are low frequency in nature (< 1 Hz) and not usable for imaging the smaller structures associated with mining or oil and gas applications. The seismic noise recorded at higher frequencies is typically from anthropogenic sources, which are short lived, spatially unstable and not well suited for constructing seismic Green's functions between sensors with conventional cross-correlation methods. To examine the use of ambient noise tomography for smaller scale applications, continuous data were recorded for 5 months in an active underground mine in Sweden located more than 1 km below the surface, with 18 high-frequency seismic sensors. A wide variety of broadband (10 - 3000 Hz) seismic noise sources are present in an active underground mine, including drilling, scraping, trucks, ore crushers and ventilation fans. Some of these sources generate favorable seismic noise, while others are peaked in frequency and not usable. In this presentation, I will show that the noise generated by mining activity can be useful if periods of seismic noise are carefully selected. Although noise sources are not temporally stable and not evenly distributed around the sensor array, good estimates of the seismic Green's functions between sensors can be retrieved for a broad frequency range (20 - 400 Hz) when a selective stacking scheme is used. For frequencies below 100 Hz, the reconstructed Green's functions show clear body-wave arrivals for almost all of the 153 sensor pairs. The arrival times of these body waves are picked and used to image the local velocity structure. The resulting 3-dimensional image shows a high velocity structure that overlaps with a known ore-body. The material properties of the ore-body differ from those of the host rock and are likely the cause of the observed high velocity structure. For frequencies above 200 Hz, the seismic waves are multiply scattered by the tunnels and excavations and are used to determine the scattering properties of the medium. The results of this study should be useful for future imaging and exploration projects in the mining and oil and gas industries.
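
    A minimal sketch of the selective-stacking idea: cross-correlate synchronous noise windows from a sensor pair, then stack only the windows most coherent with the plain mean. The selection rule here is an assumption for illustration; the presentation's exact criterion may differ.

    ```python
    import numpy as np

    def xcorr_windows(a, b, win, step):
        """Normalized cross-correlations of synchronous noise windows
        recorded at two sensors."""
        out = []
        for i in range(0, len(a) - win + 1, step):
            wa = a[i:i + win] - a[i:i + win].mean()
            wb = b[i:i + win] - b[i:i + win].mean()
            norm = np.linalg.norm(wa) * np.linalg.norm(wb)
            if norm > 0:
                out.append(np.correlate(wa, wb, mode="full") / norm)
        return np.array(out)

    def selective_stack(ccs, keep=0.5):
        """Stack only the windows most coherent with the plain mean stack."""
        ref = ccs.mean(axis=0)
        score = ccs @ ref                  # similarity of each window to the mean
        best = np.argsort(score)[::-1][:max(1, int(keep * len(ccs)))]
        return ccs[best].mean(axis=0)

    # Toy usage on synthetic records (real data would be band-passed 20-400 Hz).
    rng = np.random.default_rng(3)
    a, b = rng.normal(size=100_000), rng.normal(size=100_000)
    gf_estimate = selective_stack(xcorr_windows(a, b, win=2000, step=2000))
    print(gf_estimate.shape)               # (2*win - 1,) lag axis
    ```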

  2. Image quality measures to assess hyperspectral compression techniques

    NASA Astrophysics Data System (ADS)

    Lurie, Joan B.; Evans, Bruce W.; Ringer, Brian; Yeates, Mathew

    1994-12-01

    The term 'multispectral' is used to describe imagery with anywhere from three to about 20 bands of data. The images acquired by Landsat and similar earth sensing satellites, including the French SPOT platform, are typical examples of multispectral data sets. Applications range from crop observation and yield estimation, to forestry, to sensing of the environment. The wave bands typically range from the visible to thermal infrared and are fractions of a micron wide. They may or may not be contiguous. Thus each pixel will have several spectral intensities associated with it, but detailed spectra are not obtained. The term 'hyperspectral' is typically used for spectral data encompassing hundreds of samples of a spectrum. Hyperspectral electro-optical sensors typically operate in the visible and near infrared bands. Their characteristic property is the ability to resolve a large number (typically hundreds) of contiguous spectral bands, thus producing a detailed profile of the electromagnetic spectrum. Like multispectral sensors, recently developed hyperspectral sensors are often also imaging sensors, measuring spectra over a two-dimensional spatial array of picture elements, or pixels. The resulting data is thus inherently three-dimensional - an array of samples in which two dimensions correspond to spatial position and the third to wavelength. The data sets, commonly referred to as image cubes or datacubes (although technically they are often rectangular solids), are very rich in information but quickly become unwieldy in size, generating formidable torrents of data. Both spaceborne and airborne hyperspectral cameras exist and are in use today. The data is unique in its ability to provide high spatial and spectral resolution simultaneously, and shows great promise in both military and civilian applications. A data analysis system has been built at TRW under a series of Internal Research and Development projects. This development has been prompted by the business opportunities, by the series of instruments built here and by the availability of data from other instruments. The processing system has been used to process data produced by TRW sensors and other instruments. Figure 1 provides an overview of the TRW hyperspectral collection, data handling and exploitation capability. The Analysis and Exploitation functions deal with the digitized image cubes. The analysis system was designed to handle various types of data, but the emphasis was on the data acquired by the TRW instruments.
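
    An image cube is naturally handled as a three-dimensional array with two spatial axes and one spectral axis; per-pixel spectra and single-band images are plain slices. A sketch with assumed dimensions, which also shows how quickly the data volume grows:

    ```python
    import numpy as np

    rows, cols, bands = 512, 512, 224      # assumed cube dimensions
    cube = np.zeros((rows, cols, bands), dtype=np.uint16)

    spectrum = cube[100, 200, :]           # full spectrum of one pixel
    band_image = cube[:, :, 42]            # single-band image
    mean_spectrum = cube.reshape(-1, bands).mean(axis=0)

    print(spectrum.shape, band_image.shape, mean_spectrum.shape)
    print(cube.nbytes / 1e6, "MB")         # ~117 MB even at these modest sizes
    ```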

  3. Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor

    NASA Astrophysics Data System (ADS)

    Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui

    2018-05-01

    At present, both point-source and imaging polarization navigation devices can only output angle information, which means that the velocity information of the carrier cannot be extracted directly from the polarization field pattern. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. However, for ordinary optical flow, pixel-value differences shrink in weak light and the calculation accuracy is reduced. Polarization imaging technology can improve both the detection accuracy and the recognition probability of the target because it acquires additional multi-dimensional polarization information about target radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarization optical flow algorithm is proposed, and it is verified that the polarized optical flow algorithm adapts well to weak light and can extend the application range of polarization navigation sensors. This research lays the foundation for day-and-night, all-weather polarization navigation applications in the future.
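
    A minimal single-window Lucas-Kanade step illustrates the optical-flow core that the paper adapts; the inputs here could equally be degree-of-linear-polarization images rather than raw intensity (the paper's polarization-specific processing is not reproduced).

    ```python
    import numpy as np

    def lucas_kanade_patch(f0, f1):
        """Single-window Lucas-Kanade flow estimate between two image patches.

        f0, f1 : 2D arrays (e.g., consecutive degree-of-linear-polarization
        images of the same patch). Returns (vx, vy) in pixels/frame.
        """
        Iy, Ix = np.gradient(f0.astype(float))    # spatial gradients
        It = f1.astype(float) - f0.astype(float)  # temporal derivative
        A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
        b = -It.ravel()
        v, *_ = np.linalg.lstsq(A, b, rcond=None)
        return v  # [vx, vy]

    # Self-check: a smooth pattern shifted by one pixel along x.
    x, y = np.meshgrid(np.arange(32), np.arange(32))
    f0 = np.sin(0.3 * x) + np.cos(0.2 * y)
    f1 = np.sin(0.3 * (x - 1)) + np.cos(0.2 * y)  # content moved +1 px in x
    print(lucas_kanade_patch(f0, f1))             # ~ [1.0, 0.0]
    ```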

  4. Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions

    PubMed Central

    Barrett, Harrison H.; Dainty, Christopher; Lara, David

    2008-01-01

    Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255

  5. A Wide Dynamic Range Tapped Linear Array Image Sensor

    NASA Astrophysics Data System (ADS)

    Washkurak, William D.; Chamberlain, Savvas G.; Prince, N. Daryl

    1988-08-01

    Detectors for acousto-optic signal processing applications require fast transient response as well as wide dynamic range. There are two major choices of detectors: conductive or integration mode. Conductive mode detectors have an initial transient period before they reach their equilibrium state. The duration of this period is dependent on light level as well as detector capacitance. At low light levels a conductive mode detector is very slow; response time is typically on the order of milliseconds. Generally, to obtain fast transient response an integrating mode detector is preferred. With integrating mode detectors, the dynamic range is determined by the charge storage capability of the transport shift registers and the noise level of the image sensor. The conventional method used to improve dynamic range is to increase the shift register charge storage capability. To achieve a dynamic range of fifty thousand, assuming two hundred noise-equivalent electrons, a charge storage capability of ten million electrons would be required. In order to accommodate this amount of charge, unrealistic shift register widths would be required. Therefore, with an integrating mode detector it is difficult to achieve a dynamic range of over four orders of magnitude of input light intensity. Another alternative is to solve the problem at the photodetector and not the shift register. DALSA's wide dynamic range detector utilizes an optimized, ion-implant-doped, profiled MOSFET photodetector specifically designed for wide dynamic range. When this new detector operates at high speed and at low light levels, the photons are collected and stored in an integrating fashion. However, at bright light levels where transient periods are short, the detector switches into a conductive mode. The light intensity is logarithmically compressed into small charge packets, easily carried by the CCD shift register. As a result of the logarithmic conversion, dynamic ranges of over six orders of magnitude are obtained. To achieve the short integration times necessary in acousto-optic applications, the wide dynamic range detector has been implemented in a tapped array architecture with eight outputs and 256 photoelements. Operation of each output at 16 MHz yields detector integration times of 2 microseconds. Buried-channel, two-phase CCD shift register technology is utilized to minimize image sensor noise, improve video output rates and increase ease of operation.

  6. CMOS sensors for atmospheric imaging

    NASA Astrophysics Data System (ADS)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different to those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixels of up to 250 μm square, achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue readout chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors, and both of these sensors have been designed to be fully radiation hard, with high latch-up and single-event-upset tolerance, which is now silicon-proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements, but with thicker-than-standard epitaxial silicon to give improved IR response (note that this makes latch-up protection even more important). However, with narrow-band illumination, reflections from the front and back of the dielectric stack on top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow-band filters inside the MTG package to provide a complete imaging subsystem. Control of reflected light is also critical in obtaining the required optical performance, and this has driven the development of a black coating layer that can be applied between the active silicon regions.

  7. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Lagueux, P.; Farley, V.; Marcotte, F.; Chamberland, M.

    2009-09-01

    Hyperspectral ground mapping is being used in an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum help produce information of differing nature. The visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has been mostly used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have been typically used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self emission of the sensor itself. The Fourier-transform spectrometer technology has been extensively used in the infrared spectral range due to its high transmittance as well as throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on the Fourier-transform technology yielding high spectral resolution and enabling high accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320 × 256 pixels at spectral resolutions of up to 0.25 cm⁻¹. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology. The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in this paper.

  8. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Farley, V.; Lagueux, P.; Marcotte, F.; Chamberland, M.

    2009-05-01

    Hyperspectral ground mapping is being used in an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum help produce information of differing nature. The visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has been mostly used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have been typically used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self emission of the sensor itself. The Fourier-transform spectrometer technology has been extensively used in the infrared spectral range due to its high transmittance as well as throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on the Fourier-transform technology yielding high spectral resolution and enabling high accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320 × 256 pixels at spectral resolutions of up to 0.25 cm⁻¹. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology. The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in this paper.

  9. Smart CMOS sensor for wideband laser threat detection

    NASA Astrophysics Data System (ADS)

    Schwarze, Craig R.; Sonkusale, Sameer

    2015-09-01

    The proliferation of lasers has led to their widespread use in applications ranging from short range standoff chemical detection to long range lidar sensing and target designation, operating across the UV to LWIR spectrum. Recent advances in high energy lasers have renewed the development of laser weapon systems. The ability to measure and assess laser source information is important both to identify a potential threat and to determine safety and the nominal hazard zone (NHZ). Laser detection sensors are required that provide high dynamic range, wide spectral coverage, pulsed and continuous wave detection, and a large field of view. OPTRA, Inc. and Tufts have developed a custom ROIC smart pixel imaging sensor architecture and wavelength encoding optics for measurement of source wavelength, pulse length, pulse repetition frequency (PRF), irradiance, and angle of arrival. The smart architecture provides dual linear and logarithmic operating modes, giving more than eight orders of magnitude of signal dynamic range and nanosecond pulse measurement capability, and can be hybridized with the appropriate detector array to provide UV through LWIR laser sensing. Recent advances in sputtering techniques provide the capability for post-processing CMOS dies from the foundry and patterning PbS and PbSe photoconductors directly on the chip to create a single monolithic sensor array architecture for measuring sources operating from 0.26 - 5.0 microns, 1 mW/cm² - 2 kW/cm².

  10. Novel instrumentation of multispectral imaging technology for detecting tissue abnormity

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial and temporal dimensions to visualize tissue abnormity and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components, a band-pass selection device (such as a liquid crystal tunable filter or a diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently, a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the clinical requirements of medical diagnostic imaging. However, the challenging issue of how to integrate and place the micro filter mosaic chip at the target focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not been reported elsewhere. This paper presents the methods and results of integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for early stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector produces multiple images at different center wavelengths with a single shot, thereby eliminating the image registration procedure required by traditional multispectral imaging technologies.
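    The single-shot principle described above lends itself to a compact illustration. The sketch below (Python/NumPy; a hypothetical illustration, not the authors' code) shows how per-band images fall out of a periodic filter mosaic by plain sub-lattice sampling, which is why no registration step is needed between bands:

```python
import numpy as np

def split_mosaic_bands(raw, pattern=(2, 2)):
    """Split a raw frame from a sensor overlaid with a periodic
    narrow-band filter mosaic into per-band sub-images.

    raw     -- 2D array captured by the CMOS sensor in a single shot
    pattern -- mosaic super-pixel size (rows, cols); a 2x2 mosaic
               yields 4 spectral bands, analogous to a Bayer CFA
    """
    pr, pc = pattern
    bands = []
    for r in range(pr):
        for c in range(pc):
            # Each band sits on a regular sub-lattice of the mosaic, so
            # sub-sampling gives a coarser per-band image that is
            # inherently co-registered with all the other bands.
            bands.append(raw[r::pr, c::pc])
    return bands

# Example: a 4-band single-shot capture on a 480x640 sensor
raw = np.random.randint(0, 4096, (480, 640))
print([b.shape for b in split_mosaic_bands(raw)])  # four 240x320 images
```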

  11. Proof of principle study of the use of a CMOS active pixel sensor for proton radiography.

    PubMed

    Seco, Joao; Depauw, Nicolas

    2011-02-01

    Proof of principle study of the use of a CMOS active pixel sensor (APS) in producing proton radiographic images using the proton beam at the Massachusetts General Hospital (MGH). A CMOS APS, previously tested for use in x-ray radiation therapy applications, was used for proton beam radiographic imaging at the MGH. Two different setups were used as a proof of principle that CMOS can be used as a proton imaging device: (i) a pen with two metal screws to assess the spatial resolution of the CMOS and (ii) a phantom with lung tissue, bone tissue, and water to assess the tissue contrast of the CMOS. The sensor was then traversed by a double-scattered monoenergetic proton beam at 117 MeV, and the energy deposition inside the detector was recorded to assess its energy response. Conventional x-ray images with a similar setup at voltages of 70 kVp and proton images using commercial Gafchromic EBT2 and Kodak X-Omat V films were also taken for comparison purposes. Images were successfully acquired and compared to x-ray kVp and proton EBT2/X-Omat film images. The spatial resolution of the CMOS detector image is subjectively comparable to that of the EBT2 and Kodak X-Omat V film images obtained at the same object-detector distance. X-rays have apparently higher spatial resolution than the CMOS. However, further studies with different commercial films under proton beam irradiation demonstrate that the distance of the detector from the object determines the amount of proton scatter contributing to the proton image. Proton images obtained with films at different distances from the source indicate that proton scatter significantly affects the CMOS image quality. Proton radiographic images were successfully acquired at MGH using a CMOS active pixel sensor detector. The CMOS demonstrated spatial resolution subjectively comparable to films at the same object-detector distance. Further work will be done to establish the spatial and energy resolution of the CMOS detector for protons. The development and use of CMOS in proton radiography could allow in vivo proton range checks, patient setup QA, and real-time tumor tracking.

  12. Photon Counting Imaging with an Electron-Bombarded Pixel Image Sensor

    PubMed Central

    Hirvonen, Liisa M.; Suhling, Klaus

    2016-01-01

    Electron-bombarded pixel image sensors, where a single photoelectron is accelerated directly into a CCD or CMOS sensor, allow wide-field imaging at extremely low light levels as they are sensitive enough to detect single photons. This technology allows the detection of up to hundreds or thousands of photon events per frame, depending on the sensor size, and photon event centroiding can be employed to recover resolution lost in the detection process. Unlike photon events from electron-multiplying sensors, the photon events from electron-bombarded sensors have a narrow, acceleration-voltage-dependent pulse height distribution. Thus a gain voltage sweep during exposure in an electron-bombarded sensor could allow photon arrival time determination from the pulse height with sub-frame exposure time resolution. We give a brief overview of our work with electron-bombarded pixel image sensor technology and recent developments in this field for single photon counting imaging, and examples of some applications. PMID:27136556
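    Photon event centroiding, mentioned above as the way to recover resolution lost in the detection process, can be sketched in a few lines. The snippet below is a minimal illustration (Python with NumPy/SciPy; the threshold and helper names are ours, not from the paper):

```python
import numpy as np
from scipy import ndimage

def centroid_photon_events(frame, threshold):
    """Locate photon events in a frame from an electron-bombarded
    sensor and estimate each event position to sub-pixel precision
    with an intensity-weighted centroid."""
    labels, n = ndimage.label(frame > threshold)   # group pixels into events
    # center_of_mass weights each event footprint by its pixel values,
    # recovering resolution lost to charge spread in the sensor.
    return ndimage.center_of_mass(frame, labels, range(1, n + 1))

# Example: two synthetic events on a dark frame
frame = np.zeros((64, 64))
frame[10:13, 20:23] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]
frame[40, 40] = 6.0
print(centroid_photon_events(frame, threshold=0.5))  # ~(11, 21) and (40, 40)
```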

  13. Evaluation and comparison of the IRS-P6 and the Landsat sensors

    USGS Publications Warehouse

    Chander, G.; Coan, M.J.; Scaramuzza, P.L.

    2008-01-01

    The Indian Remote Sensing Satellite (IRS-P6), also called ResourceSat-1, was launched in a polar sun-synchronous orbit on October 17, 2003. It carries three sensors: the high-resolution Linear Imaging Self-Scanner (LISS-IV), the medium-resolution Linear Imaging Self-Scanner (LISS-III), and the Advanced Wide-Field Sensor (AWiFS). These three sensors provide images of different resolutions and coverage. To understand the absolute radiometric calibration accuracy of the IRS-P6 AWiFS and LISS-III sensors, image pairs from these sensors were compared to images from the Landsat-5 Thematic Mapper (TM) and Landsat-7 Enhanced TM Plus (ETM+) sensors. The approach involves calibration of surface observations based on image statistics from areas observed nearly simultaneously by the two sensors. This paper also evaluated the viability of data from these next-generation imagers for use in creating three National Land Cover Dataset (NLCD) products: land cover, percent tree canopy, and percent impervious surface. Individual products were consistent with previous studies but had slightly lower overall accuracies as compared to data from the Landsat sensors.

  14. Performance test and image correction of CMOS image sensor in radiation environment

    NASA Astrophysics Data System (ADS)

    Wang, Congzheng; Hu, Song; Gao, Chunming; Feng, Chang

    2016-09-01

    CMOS image sensors rival CCDs in areas that include strong radiation resistance and simple drive signals, so they are widely applied in high-energy radiation environments such as space optical imaging and video monitoring of nuclear power equipment. However, the silicon of CMOS image sensors suffers from the ionizing dose effect under high-energy rays, degrading sensor indicators such as signal-to-noise ratio (SNR), non-uniformity (NU) and bad points (BP). The radiation environment for the test experiments was generated by a 60Co γ-ray source. A camera module based on the CMV2000 image sensor from CMOSIS Inc. was chosen as the research object and irradiated at a dose rate of 20 krad/h. In the experiments, the output signals of the sensor's pixels were measured at different total doses. Data analysis showed that, as the irradiation dose accumulated, the SNR of the image sensor decreased, the NU increased, and the number of BPs grew. Correcting these indicators is necessary, as they are the main factors governing image quality. An image-processing algorithm combining a local threshold method for BPs with NU correction based on the non-local means (NLM) method was applied to the experimental data. The results showed that this correction can effectively suppress BPs, improve the SNR, and reduce the NU.
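    The paper gives only the outline of its correction algorithm (a local threshold for bad points combined with NLM-based non-uniformity correction). As a rough stand-in under those assumptions, the sketch below implements a local-threshold bad-pixel detector with median replacement and a simple flat-field gain normalization; the NLM stage itself is omitted for brevity:

```python
import numpy as np
from scipy import ndimage

def correct_bad_points(img, window=5, k=3.0):
    """Local-threshold bad-point detection: a pixel deviating from its
    neighborhood median by more than k robust standard deviations is
    flagged and replaced by that median."""
    med = ndimage.median_filter(img, size=window)
    mad = ndimage.median_filter(np.abs(img - med), size=window)
    bad = np.abs(img - med) > k * (1.4826 * mad + 1e-6)  # MAD -> sigma
    out = img.copy()
    out[bad] = med[bad]
    return out, bad

def correct_nonuniformity(img, flat):
    """One-point non-uniformity correction against a flat-field frame:
    per-pixel gains are equalized so a uniform scene reads uniformly."""
    gain = flat.mean() / np.clip(flat, 1e-6, None)
    return img * gain
```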

  15. Adaptive DOF for plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Oberdörster, Alexander; Lensch, Hendrik P. A.

    2013-03-01

    Plenoptic cameras promise to provide arbitrary re-focusing through a scene after the capture. In practice, however, the refocusing range is limited by the depth of field (DOF) of the plenoptic camera. For the focused plenoptic camera, this range is given by the range of object distances for which the microimages are in focus. We propose a technique of recording light fields with an adaptive depth of focus. Between multiple exposures, or multiple recordings of the light field, the distance between the microlens array (MLA) and the image sensor is adjusted. The depth and quality of focus are chosen by changing the number of exposures and the spacing of the MLA movements. In contrast to traditional cameras, extending the DOF does not necessarily lead to an all-in-focus image. Instead, the refocus range is extended. There is full creative control over the focus depth; images with shallow or selective focus can be generated.

  16. New amorphous-silicon image sensor for x-ray diagnostic medical imaging applications

    NASA Astrophysics Data System (ADS)

    Weisfield, Richard L.; Hartney, Mark A.; Street, Robert A.; Apte, Raj B.

    1998-07-01

    This paper introduces new high-resolution amorphous silicon (a-Si) image sensors specifically configured for demonstrating film-quality medical x-ray imaging capabilities. The devices utilize an x-ray phosphor screen coupled to an array of a-Si photodiodes for detecting visible light, and a-Si thin-film transistors (TFTs) for connecting the photodiodes to external readout electronics. We have developed imagers based on a pixel size of 127 micrometer x 127 micrometer with an approximately page-size imaging area of 244 mm x 195 mm and an array size of 1,536 data lines by 1,920 gate lines, for a total of 2.95 million pixels. More recently, we have developed a much larger imager based on the same pixel pattern, which covers an area of approximately 406 mm x 293 mm, with 2,304 data lines by 3,200 gate lines, for a total of nearly 7.4 million pixels. This is very likely the largest image sensor array and highest pixel count detector fabricated on a single substrate. Both imagers connect to a standard PC and are capable of taking an image in a few seconds. Through design rule optimization we have achieved a light-sensitive area of 57% and optimized quantum efficiency for x-ray phosphor output in the green part of the spectrum, yielding an average quantum efficiency between 500 and 600 nm of approximately 70%. At the same time, we have reduced extraneous leakage currents on these devices to a few fA per pixel, which allows very high dynamic range to be achieved. We have characterized leakage currents as a function of photodiode bias, time and temperature to demonstrate high stability over these large arrays. At the electronics level, we have adopted a new generation of low-noise, charge-sensitive amplifiers coupled to 12-bit A/D converters. Considerable attention was given to reducing electronic noise in order to achieve a large dynamic range (over 4,000:1) for medical imaging applications. Through a combination of low data-line capacitance, readout amplifier design, optimized timing, and noise cancellation techniques, we achieve 1,000e to 2,000e of noise for the page-size and large-size arrays, respectively. This allows for true 12-bit performance and quantum-limited images over a wide range of x-ray exposures. Various approaches to reducing line-correlated noise have been implemented and will be discussed. Images documenting the improved performance will be presented. Avenues for improvement are under development, including higher-resolution 97 micrometer pixel imagers, further improvements in detective quantum efficiency, and characterization of dynamic behavior.

  17. Optical control and diagnostics sensors for gas turbine machinery

    NASA Astrophysics Data System (ADS)

    Trolinger, James D.; Jenkins, Thomas P.; Heeg, Bauke

    2012-10-01

    There exists a vast range of optical techniques that have been under development for solving complex measurement problems related to gas-turbine machinery and phenomena. For instance, several optical techniques are ideally suited for studying fundamental combustion phenomena in laboratory environments. Yet other techniques hold significant promise for use as either on-line gas turbine control sensors, or as health monitoring diagnostics sensors. In this paper, we briefly summarize these and discuss, in more detail, some of the latter class of techniques, including phosphor thermometry, hyperspectral imaging and low coherence interferometry, which are particularly suited for control and diagnostics sensing on hot section components with ceramic thermal barrier coatings (TBCs).

  18. Advanced Sensors Boost Optical Communication, Imaging

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Brooklyn, New York-based Amplification Technologies Inc. (ATI) employed Phase I and II SBIR funding from NASA's Jet Propulsion Laboratory to advance the company's solid-state photomultiplier technology. Under the SBIR, ATI developed a small, energy-efficient, extremely high-gain sensor capable of detecting light down to single photons in the near-infrared wavelength range. The company has commercialized this technology in the form of its NIRDAPD photomultiplier, ideal for use in free-space optical communications, lidar and ladar, night vision goggles, and other light-sensing applications.

  19. Implantable Bladder Sensors: A Methodological Review

    PubMed Central

    Dakurah, Mathias Naangmenkpeong; Koo, Chiwan; Choi, Wonseok; Joung, Yeun-Ho

    2015-01-01

    The loss of urinary bladder control/sensation, also known as urinary incontinence (UI), is a common clinical problem in autistic children, diabetics, and the elderly. UI not only causes discomfort for patients but may also lead to kidney failure, infections, and even death. The increase of bladder urine volume/pressure above normal ranges without sensation in UI patients creates the need for bladder sensors. Currently, a catheter-based sensor is introduced directly through the urethra into the bladder to measure pressure variations. Unfortunately, this method is inaccurate because the measurement is affected by disturbances in catheter lines as well as delays in response time owing to the inertia of urine inside the bladder. Moreover, this technique can cause infection during prolonged use; hence, it is only suitable for short-term measurement. Development of discrete wireless implantable sensors to measure bladder volume/pressure would allow for long-term monitoring within the bladder, while maintaining the patient's quality of life. With the recent advances in microfabrication, the size of implantable bladder sensors has been significantly reduced. However, microfabricated sensors face hostility from the bladder environment and require surgical intervention for implantation inside the bladder. Here, we explore the various types of implantable bladder sensors and current efforts to solve issues like hermeticity, biocompatibility, drift, telemetry, power, and compatibility with popular imaging tools such as computed tomography and magnetic resonance imaging. We also discuss some possible improvements/emerging trends in the design of an implantable bladder sensor. PMID:26620894

  20. Nanohole-array-based device for 2D snapshot multispectral imaging

    PubMed Central

    Najiminaini, Mohamadreza; Vasefi, Fartash; Kaminska, Bozena; Carson, Jeffrey J. L.

    2013-01-01

    We present a two-dimensional (2D) snapshot multispectral imager that utilizes the optical transmission characteristics of nanohole arrays (NHAs) in a gold film to resolve a mixture of input colors into multiple spectral bands. The multispectral device consists of blocks of NHAs, wherein each NHA has a unique periodicity that results in transmission resonances and minima in the visible and near-infrared regions. The multispectral device was illuminated over a wide spectral range, and the transmission was spectrally unmixed using a least-squares estimation algorithm. A NHA-based multispectral imaging system was built and tested in both reflection and transmission modes. The NHA-based multispectral imager was capable of extracting 2D multispectral images representative of four independent bands within the spectral range of 662 nm to 832 nm for a variety of targets. The multispectral device can potentially be integrated into a variety of imaging sensor systems. PMID:24005065
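    The least-squares unmixing step mentioned above is straightforward to reproduce in outline. Below is a minimal sketch (Python/NumPy; the calibration matrix and dimensions are illustrative assumptions, not the authors' data):

```python
import numpy as np

def unmix_bands(A, y):
    """Least-squares spectral unmixing for the NHA imager.

    A -- (n_blocks, n_bands) calibration matrix: transmission of each
         NHA periodicity block within each target spectral band
    y -- (n_blocks,) transmissions measured through the blocks for one
         scene point under broadband illumination
    Returns the estimated intensity in each band."""
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x

# Example: 6 NHA periodicities resolving 4 bands (662-832 nm region)
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, (6, 4))
true_bands = np.array([0.8, 0.2, 0.5, 0.1])
print(np.round(unmix_bands(A, A @ true_bands), 3))  # recovers true_bands
```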

  1. Design of CMOS imaging system based on FPGA

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Chen, Xiaolai

    2017-10-01

    In order to meet the needs of engineering applications for a high-dynamic-range CMOS camera under rolling-shutter mode, a complete imaging system was designed based on the CMOS image sensor NSC1105. The paper adopts CMOS+ADC+FPGA+Camera Link as the processing architecture and introduces the design and implementation of the hardware system. The camera software system, which consists of a CMOS timing drive module, an image acquisition module and a transmission control module, is designed in Verilog and runs on a Xilinx FPGA. The ISim simulator of ISE 14.6 is used for signal simulation. The imaging experiments show that the system achieves a 1280x1024 pixel resolution, a frame rate of 25 fps and a dynamic range of more than 120 dB. The imaging quality of the system satisfies the design requirements.

  2. Analysis of Active Sensor Discrimination Requirements for Various Defense Missile Defense Scenarios Final Report 1999(99-ERD-080)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ledebuhr, A.G.; Ng, L.C.; Gaughan, R.J.

    2000-02-15

    During FY99, we explored and analyzed a combined passive/active sensor concept to support the advanced discrimination requirements of various missile defense scenarios. The idea is to combine multiple IR spectral channels with an imaging LIDAR (Light Detection and Ranging) behind a common optical system. The imaging LIDAR would itself consist of at least two channels: one at the fundamental laser wavelength (e.g., 1.064 μm for Nd:YAG) and one at the frequency-doubled wavelength (532 nm for Nd:YAG). This two-color laser output would, for example, allow the longer wavelength to serve as a direct-detection time-of-flight ranger and the shorter wavelength as an active imaging channel. The LIDAR can function as a high-resolution 2D spatial imager either passively or actively with laser illumination. Advances in laser design also offer three-color (frequency-tripled) systems, high rep-rate operation, and better pumping efficiencies that can provide longer-distance acquisition and ranging for enhanced discrimination phenomenology. New detector developments can enhance the performance and operation of both LIDAR channels. A real-time data fusion approach that combines multi-spectral IR phenomenology with LIDAR imagery can improve both discrimination and aim-point selection capability.

  3. Nanocrystalline ZnON; High mobility and low band gap semiconductor material for high performance switch transistor and image sensor application

    PubMed Central

    Lee, Eunha; Benayad, Anass; Shin, Taeho; Lee, HyungIk; Ko, Dong-Su; Kim, Tae Sang; Son, Kyoung Seok; Ryu, Myungkwan; Jeon, Sanghun; Park, Gyeong-Su

    2014-01-01

    Interest in oxide semiconductors stems from their benefits, primarily ease of processing, relatively high mobility (0.3-10 cm2/Vs), and wide bandgap. However, for practical future electronic devices the channel mobility should be increased beyond 50 cm2/Vs, and a wide bandgap is not suitable for photo/image sensor applications. The incorporation of nitrogen into the ZnO semiconductor can be tailored to increase channel mobility, enhance optical absorption across the whole visible range and form a uniform microstructure, satisfying the attributes essential for high-performance transistors and visible-light photosensors on large-area platforms. Here, we present the electronic, optical and microstructural properties of ZnON, a composite of Zn3N2 and ZnO. Well-optimized ZnON presents high mobility exceeding 100 cm2V-1s-1, a band gap of 1.3 eV and a nanocrystalline, multiphase structure. We found that the mobility, microstructure, electronic structure, band gap and trap properties of ZnON vary with the nitrogen concentration in ZnO. Accordingly, the performance of ZnON-based devices can be adjusted to meet the requirements of both switching devices and image sensors. These results demonstrate how the device and material attributes of ZnON can be optimized for new device strategies in display technology, and we expect ZnON to be applicable to a wide range of imaging/display devices. PMID:24824778

  4. Multi-spectral texture analysis for IED detection

    NASA Astrophysics Data System (ADS)

    Petersson, Henrik; Gustafsson, David

    2016-10-01

    The use of Improvised Explosive Devices (IEDs) has increased significantly around the world and is a globally widespread phenomenon. Although measures can be taken to anticipate and prevent the opponent's ability to deploy IEDs, detection of IEDs will always be a central activity. There is a wide range of useful sensors, but simple means, such as a pair of binoculars, can also be crucial to detect IEDs in time. Disturbed earth (disturbed soil), such as freshly dug areas, dumps of clay on top of smooth sand or depressions in the ground, can indicate a buried IED. This paper briefly describes how a field trial was set up to provide a realistic data set on a road section containing areas of soil disturbed by buried IEDs. The road section was imaged using a forward-looking land-based sensor platform consisting of visual imaging sensors together with long-, mid-, and short-wave infrared imaging sensors. The paper investigates the presence of discriminatory information in surface texture, comparing areas of disturbed against undisturbed soil. The investigation is conducted for each of the available wavelength bands. To extract features that describe texture, image processing tools such as 'Histogram of Oriented Gradients', 'Local Binary Patterns', 'Lacunarity', 'Gabor Filtering' and 'Co-Occurrence' are used. It is found that texture as characterized here may provide discriminatory information to detect disturbed soil, but the signatures we found are weak and cannot be used alone in, e.g., a detector system.

  5. Cross delay line sensor characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owens, Israel J; Remelius, Dennis K; Tiee, Joe J

    There exists a wealth of information in the scientific literature on the physical properties and device characterization procedures for complementary metal oxide semiconductor (CMOS), charge coupled device (CCD) and avalanche photodiode (APD) format detectors. Numerous papers and books have also treated photocathode operation in the context of photomultiplier tube (PMT) operation for either non-imaging applications or limited night vision capability. However, much less information has been reported in the literature about the characterization procedures and properties of photocathode detectors with novel cross delay line (XDL) anode structures. These allow one to detect single photons and create images by recording space and time coordinate (X, Y & T) information. In this paper, we report on the physical characteristics and performance of a cross delay line anode sensor with an enhanced near-infrared wavelength response photocathode and a high dynamic range micro channel plate (MCP) gain (>10^6) multiplier stage. Measurement procedures and results including the device dark event rate (DER), pulse height distribution, quantum and electronic device efficiency (QE & DQE) and spatial resolution per effective pixel region in a 25 mm sensor array are presented. The overall knowledge and information obtained from XDL sensor characterization allow us to optimize device performance and assess capability. These device performance properties and capabilities make XDL detectors ideal for remote sensing field applications that require single photon detection, imaging, sub-nanosecond timing response, high spatial resolution (tens of microns) and large effective image format.

  6. A novel technique using a force-sensing resistor for immobilization-device quality assurance: A feasibility study

    NASA Astrophysics Data System (ADS)

    Cho, Min-Seok; Kim, Tae-Ho; Kang, Seong-Hee; Kim, Dong-Su; Kim, Kyeong-Hyeon; Shin, Dong-Seok; Noh, Yu-Yun; Koo, Hyun-Jae; Cheon, Geum Seong; Suh, Tae Suk; Kim, Siyong

    2016-03-01

    Many studies have reported that a patient can move even when an immobilization device is used. The researchers have developed an immobilization-device quality-assurance (QA) system that evaluates the validity of immobilization devices. The QA system consists of force-sensing-resistor (FSR) sensor units, an electric circuit, a signal-conditioning device, and a control personal computer (PC) with in-house software. The QA system is designed to measure the force between an immobilization device and a patient's skin using the FSR sensor unit. This preliminary study aimed to evaluate the feasibility of using the QA system in radiation-exposure situations. While the FSR sensor unit was irradiated with a computed tomography (CT) beam and a treatment beam from a linear accelerator (LINAC), the stability of the output signal, the image artifact on the CT image, and the variation in the patient's dose were tested. The results of this study demonstrate that the system is promising in that it performed within the error range (signal variation under the CT beam < 0.30 kPa, root-mean-square error (RMSE) of the two CT images with and without the FSR sensor unit < 15 HU, signal variation under the treatment beam < 0.15 kPa, and dose difference between the presence and the absence of the FSR sensor unit < 0.02%). Based on these results, we will conduct volunteer tests to investigate the clinical feasibility of the QA system.

  7. Object acquisition and tracking for space-based surveillance

    NASA Astrophysics Data System (ADS)

    1991-11-01

    This report presents the results of research carried out by Space Computer Corporation under the U.S. government's Small Business Innovation Research (SBIR) Program. The work was sponsored by the Strategic Defense Initiative Organization and managed by the Office of Naval Research under Contracts N00014-87-C-0801 (Phase 1) and N00014-89-C-0015 (Phase 2). The basic purpose of this research was to develop and demonstrate a new approach to the detection of, and initiation of track on, moving targets using data from a passive infrared or visual sensor. This approach differs in very significant ways from the traditional approach of dividing the required processing into time dependent, object dependent, and data dependent processing stages. In that approach individual targets are first detected in individual image frames, and the detections are then assembled into tracks. That requires that the signal to noise ratio in each image frame be sufficient for fairly reliable target detection. In contrast, our approach bases detection of targets on multiple image frames, and, accordingly, requires a smaller signal to noise ratio. It is sometimes referred to as track before detect, and can lead to a significant reduction in total system cost. For example, it can allow greater detection range for a single sensor, or it can allow the use of smaller sensor optics. Both the traditional and track before detect approaches are applicable to systems using scanning sensors, as well as those which use staring sensors.
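    The core of the track-before-detect idea is that frames integrated along a correct motion hypothesis accumulate target energy coherently while noise averages out. The toy sketch below (Python/NumPy; an illustration of the general principle, not the contractor's algorithm) makes the point for a constant-velocity hypothesis:

```python
import numpy as np

def integrate_along_track(frames, velocity):
    """Sum frames after shifting each one back along a candidate
    constant-velocity track, so a target too weak to detect in any
    single frame builds up against the noise.

    frames   -- sequence of equally sized 2D arrays
    velocity -- (dy, dx) candidate per-frame target motion in pixels
    """
    acc = np.zeros_like(frames[0], dtype=float)
    for k, f in enumerate(frames):
        dy = int(round(k * velocity[0]))
        dx = int(round(k * velocity[1]))
        # np.roll wraps at the borders; a real system would crop instead
        acc += np.roll(f, (-dy, -dx), axis=(0, 1))
    # for white noise, a matched hypothesis gains ~sqrt(N) in SNR
    return acc / len(frames)
```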

  8. Object acquisition and tracking for space-based surveillance. Final report, Dec 88-May 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-11-27

    This report presents the results of research carried out by Space Computer Corporation under the U.S. government's Small Business Innovation Research (SBIR) Program. The work was sponsored by the Strategic Defense Initiative Organization and managed by the Office of Naval Research under Contracts N00014-87-C-0801 (Phase I) and N00014-89-C-0015 (Phase II). The basic purpose of this research was to develop and demonstrate a new approach to the detection of, and initiation of track on, moving targets using data from a passive infrared or visual sensor. This approach differs in very significant ways from the traditional approach of dividing the required processing into time-dependent, object-dependent, and data-dependent processing stages. In that approach individual targets are first detected in individual image frames, and the detections are then assembled into tracks. That requires that the signal to noise ratio in each image frame be sufficient for fairly reliable target detection. In contrast, our approach bases detection of targets on multiple image frames, and, accordingly, requires a smaller signal to noise ratio. It is sometimes referred to as track before detect, and can lead to a significant reduction in total system cost. For example, it can allow greater detection range for a single sensor, or it can allow the use of smaller sensor optics. Both the traditional and track before detect approaches are applicable to systems using scanning sensors, as well as those which use staring sensors.

  9. Smart sensors II; Proceedings of the Seminar, San Diego, CA, July 31, August 1, 1980

    NASA Astrophysics Data System (ADS)

    Barbe, D. F.

    1980-01-01

    Topics discussed include technology for smart sensors, smart sensors for tracking and surveillance, and techniques and algorithms for smart sensors. Papers are presented on the application of very large scale integrated circuits to smart sensors, imaging charge-coupled devices for deep-space surveillance, ultra-precise star tracking using charge coupled devices, and automatic target identification of blurred images with super-resolution features. Attention is also given to smart sensors for terminal homing, algorithms for estimating image position, and the computational efficiency of multiple image registration algorithms.

  10. Collaborative Point Paper on Border Surveillance Technology

    DTIC Science & Technology

    2007-06-01

    Systems PLC LORHIS (Long Range Hyperspectral Imaging System) can be configured for either manned or unmanned aircraft to automatically detect and... Airships, and/or Aerostats (RF, Electro-Optical, Infrared, Video) • Land-based Sensor Systems (Attended/Mobile and Unattended: e.g., CCD, Motion, Acoustic... electronic surveillance technologies for intrusion detection and warning. These ground-based systems are primarily short-range, up to around 500 meters

  11. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel.

    PubMed

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-11-01

    An implantable glucose sensor based on a CMOS image sensor and an optical-sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, the hydrogel, UV light-emitting diodes, and an optical filter on a flexible polyimide substrate. The feasibility of the glucose sensor was verified by both in vitro and in vivo experiments.

  12. The Quanta Image Sensor: Every Photon Counts

    PubMed Central

    Fossum, Eric R.; Ma, Jiaju; Masoodian, Saleh; Anzagira, Leo; Zizza, Rachel

    2016-01-01

    The Quanta Image Sensor (QIS) was conceived when contemplating shrinking pixel sizes and storage capacities, and the steady increase in digital processing power. In the single-bit QIS, the output of each field is a binary bit plane, where each bit represents the presence or absence of at least one photoelectron in a photodetector. A series of bit planes is generated through high-speed readout, and a kernel or “cubicle” of bits (x, y, t) is used to create a single output image pixel. The size of the cubicle can be adjusted post-acquisition to optimize image quality. The specialized sub-diffraction-limit photodetectors in the QIS are referred to as “jots” and a QIS may have a gigajot or more, read out at 1000 fps, for a data rate exceeding 1 Tb/s. Basically, we are trying to count photons as they arrive at the sensor. This paper reviews the QIS concept and its imaging characteristics. Recent progress towards realizing the QIS for commercial and scientific purposes is discussed. This includes implementation of a pump-gate jot device in a 65 nm CIS BSI process yielding read noise as low as 0.22 e− r.m.s. and conversion gain as high as 420 µV/e−, power efficient readout electronics, currently as low as 0.4 pJ/b in the same process, creating high dynamic range images from jot data, and understanding the imaging characteristics of single-bit and multi-bit QIS devices. The QIS represents a possible major paradigm shift in image capture. PMID:27517926
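    The cubicle readout described above reduces to a reshape-and-sum over the jot data. The sketch below (Python/NumPy; array sizes are illustrative assumptions) forms a multi-bit image from single-bit QIS bit planes:

```python
import numpy as np

def cubicle_image(bit_planes, kx=4, ky=4, kt=8):
    """Form a QIS output image by summing binary jot data over an
    (x, y, t) cubicle, per the single-bit QIS readout model.

    bit_planes -- (T, H, W) array of 0/1 values; each bit records whether
                  a jot saw at least one photoelectron in that field
    kx, ky, kt -- cubicle size, adjustable post-acquisition to trade
                  spatial/temporal resolution against gray-level depth
    """
    T, H, W = bit_planes.shape
    b = bit_planes[:T - T % kt, :H - H % ky, :W - W % kx]
    b = b.reshape(T // kt, kt, H // ky, ky, W // kx, kx)
    return b.sum(axis=(1, 3, 5))   # (T/kt, H/ky, W/kx) multi-bit frames

# Example: 64 bit planes from a 256x256 jot array
planes = (np.random.random((64, 256, 256)) < 0.2).astype(np.uint8)
img = cubicle_image(planes)
print(img.shape, img.max())   # up to kx*ky*kt = 128 gray levels
```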

  13. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    NASA Astrophysics Data System (ADS)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor's properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling - with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images - allows for adaptation to each sensor's geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate radiometric differences of various origins, making up for shortcomings of the preceding radiometric sensor calibration as well as of the BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image's histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points, with bilinear interpolation for corrections in between. The distribution of the radiometric fixes is adaptive to each image and generally increases with image size, enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in the HxMap software. It has been successfully applied to large sets of heterogeneous imagery, including the adjustment of original sensor images prior to quality control and further processing, as well as radiometric adjustment for ortho-image mosaic generation.
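    A drastically simplified version of such a global adjustment helps make the idea concrete. The sketch below (Python/NumPy; a toy model with one gain and one offset per image, far short of the paper's hierarchical, location-dependent model) solves for per-image corrections from radiometric tie points in overlaps:

```python
import numpy as np

def solve_gains_offsets(ties, n_images):
    """Global least-squares radiometric adjustment (toy version): find a
    gain a_i and offset b_i per image so tie-point observations agree,
    minimizing the sum over ties of (a_i*v_i + b_i - a_j*v_j - b_j)^2.

    ties -- list of (i, v_i, j, v_j): one ground point seen with value
            v_i in image i and v_j in image j
    A gauge constraint (a_0 = 1, b_0 = 0) pins the overall radiometry."""
    rows, rhs = [], []
    for i, vi, j, vj in ties:
        r = np.zeros(2 * n_images)
        r[2 * i], r[2 * i + 1] = vi, 1.0     # + (a_i * v_i + b_i)
        r[2 * j], r[2 * j + 1] = -vj, -1.0   # - (a_j * v_j + b_j)
        rows.append(r)
        rhs.append(0.0)
    g1 = np.zeros(2 * n_images); g1[0] = 1.0  # gauge: a_0 = 1
    g2 = np.zeros(2 * n_images); g2[1] = 1.0  # gauge: b_0 = 0
    rows += [g1, g2]
    rhs += [1.0, 0.0]
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x.reshape(n_images, 2)   # row i = (gain a_i, offset b_i)

# Example: three overlapping images tied together by three points
ties = [(0, 100.0, 1, 90.0), (1, 50.0, 2, 55.0), (0, 200.0, 2, 210.0)]
print(solve_gains_offsets(ties, 3))
```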

  14. Contrast computation methods for interferometric measurement of sensor modulation transfer function

    NASA Astrophysics Data System (ADS)

    Battula, Tharun; Georgiev, Todor; Gille, Jennifer; Goma, Sergio

    2018-01-01

    Accurate measurement of image-sensor frequency response over a wide range of spatial frequencies is very important for analyzing pixel array characteristics, such as modulation transfer function (MTF), crosstalk, and active pixel shape. Such analysis is especially significant in computational photography for the purposes of deconvolution, multi-image superresolution, and improved light-field capture. We use a lensless interferometric setup that produces high-quality fringes for measuring MTF over a wide range of frequencies (here, 37 to 434 line pairs per mm). We discuss the theoretical framework, involving Michelson and Fourier contrast measurement of the MTF, addressing phase alignment problems using a moiré pattern. We solidify the definition of Fourier contrast mathematically and compare it to Michelson contrast. Our interferometric measurement method shows high detail in the MTF, especially at high frequencies (above Nyquist frequency). We are able to estimate active pixel size and pixel pitch from measurements. We compare both simulation and experimental MTF results to a lens-free slanted-edge implementation using commercial software.
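    The two contrast definitions at the heart of the paper are easy to state concretely. The sketch below (Python/NumPy; our own minimal formulation, not the authors' exact estimator) computes Michelson contrast, (Imax - Imin)/(Imax + Imin), and a Fourier contrast taken as the dominant fringe-frequency amplitude relative to the DC term:

```python
import numpy as np

def michelson_contrast(profile):
    """Michelson contrast of a fringe profile: (Imax - Imin)/(Imax + Imin)."""
    return (profile.max() - profile.min()) / (profile.max() + profile.min())

def fourier_contrast(profile):
    """Amplitude of the strongest non-DC frequency relative to DC.
    For an ideal sinusoidal fringe this equals the Michelson value but
    is far less sensitive to noise in the extrema."""
    F = np.abs(np.fft.rfft(profile))
    return 2.0 * F[1:].max() / F[0]

# Example: a sinusoidal fringe with 40% modulation
x = np.linspace(0, 20 * np.pi, 2000)
fringe = 1.0 + 0.4 * np.sin(x)
print(michelson_contrast(fringe), fourier_contrast(fringe))  # both ~0.4
```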

  15. Backside illuminated CMOS-TDI line scan sensor for space applications

    NASA Astrophysics Data System (ADS)

    Cohen, Omer; Ofer, Oren; Abramovich, Gil; Ben-Ari, Nimrod; Gershon, Gal; Brumer, Maya; Shay, Adi; Shamay, Yaron

    2018-05-01

    A multi-spectral backside-illuminated Time Delayed Integration (TDI) radiation-hardened line scan sensor utilizing CMOS technology was designed for continuous-scanning Low Earth Orbit small satellite applications. The sensor comprises a single silicon chip with 4 independent arrays of pixels, where each array is arranged in 2600 columns with 64 TDI levels. A multispectral optical filter, whose spectral response per array is adjustable per system requirement, is assembled at the package level. A custom 4T pixel design provides the required readout speed, low noise, very low dark current, and high conversion gains. A 2-phase internally controlled exposure mechanism improves the sensor's dynamic MTF. The sensor's high level of integration includes on-chip 12-bit-per-pixel analog-to-digital converters, an on-chip controller, and CMOS-compatible voltage levels. Thus, the power consumption and the weight of the supporting electronics are reduced, and a simple electrical interface is provided. An adjustable gain provides a full well capacity ranging from 150,000 up to 500,000 electrons per column and an overall readout noise per column of less than 120 electrons. The imager supports line rates ranging from 50 to 10,000 lines/sec, with power consumption of less than 0.5 W per array. The sensor is thus characterized by a high pixel rate, a high dynamic range and very low power. To meet a latch-up-free requirement, RadHard architecture and design rules were utilized. In this paper, recent electrical and electro-optical measurements of the sensor's Flight Models are presented for the first time.

  16. A robust approach for a filter-based monocular simultaneous localization and mapping (SLAM) system.

    PubMed

    Munguía, Rodrigo; Castillo-Toledo, Bernardino; Grau, Antoni

    2013-07-03

    Simultaneous localization and mapping (SLAM) is an important problem to solve in robotics theory in order to build truly autonomous mobile robots. This work presents a novel method for implementing a SLAM system based on a single camera sensor. SLAM with a single camera, or monocular SLAM, is probably one of the most complex SLAM variants. In this case, a single camera, which is freely moving through its environment, represents the sole sensor input to the system. The sensors have a large impact on the algorithm used for SLAM. Cameras are used more frequently because they provide a lot of information and are well adapted for embedded systems: they are light, cheap and power-saving. Nevertheless, and unlike range sensors, which provide range and angular information, a camera is a projective sensor providing only angular measurements of image features. Therefore, depth information (range) cannot be obtained in a single step. In this case, special techniques for feature initialization are needed in order to enable the use of angular sensors (such as cameras) in SLAM systems. The main contribution of this work is to present a novel and robust scheme for incorporating and measuring visual features in filtering-based monocular SLAM systems. The proposed method is based on a two-step technique intended to exploit all the information available in angular measurements. Unlike previous schemes, the values of the parameters used by the initialization technique are derived directly from the sensor characteristics, thus simplifying the tuning of the system. The experimental results show that the proposed method surpasses the performance of previous schemes.
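    The paper's own two-step scheme is not spelled out in the abstract; for orientation, the sketch below shows the widely used inverse-depth parameterization that filter-based monocular SLAM initialization techniques commonly build on (Python/NumPy; the prior rho0 and the conventions are illustrative assumptions, not the paper's method):

```python
import numpy as np

def init_inverse_depth(cam_pos, bearing_world, rho0=0.1):
    """Parameterize a newly observed feature by the camera position at
    first sighting, its bearing (azimuth, elevation) and an inverse
    depth rho with a broad prior -- a single camera gives the angle to
    a feature but no range."""
    bx, by, bz = bearing_world / np.linalg.norm(bearing_world)
    azimuth = np.arctan2(bx, bz)
    elevation = np.arctan2(-by, np.hypot(bx, bz))
    # feature state: [x, y, z, azimuth, elevation, inverse depth]
    return np.array([*cam_pos, azimuth, elevation, rho0])

def feature_xyz(state):
    """Recover a Euclidean point from the inverse-depth parameterization."""
    x, y, z, az, el, rho = state
    m = np.array([np.cos(el) * np.sin(az),
                  -np.sin(el),
                  np.cos(el) * np.cos(az)])   # unit ray from the bearing
    return np.array([x, y, z]) + m / rho

# Example: feature first seen straight ahead; prior depth 1/rho0 = 10 m
state = init_inverse_depth(np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(feature_xyz(state))   # [0, 0, 10]
```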

  17. OAST Space Theme Workshop. Volume 3: Working group summary. 3: Sensors (E-3). A. Statement. B. Technology needs (form 1). C. Priority assessment (form 2). D. Additional assessment

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Developments required to support the space power, SETI, solar system exploration and global services programs are identified. Instrumentation and calibration sensors (rather than scientific ones) are needed for the space power system. Highly sophisticated receivers for narrowband detection of microwave signals, and sensors for automated stellar cataloging to provide a mapping data base, are needed for SETI. Various phases of solar system exploration require large-area solid-state imaging arrays from UV to IR; a long focal plane telescope; high-energy particle detectors; advanced spectrometers; a gravitometer; an atmospheric dust analyzer; sensors for penetrometers; and in-situ sensors for surface chemical analysis, life detection, spectroscopic and microscopic analyses of surface soils, and meteorological measurements. Active and passive multiapplication sensors, advanced multispectral scanners with improved resolution in the UV and IR ranges, and laser techniques for advanced probing and oceanographic characterization will enhance global services.

  18. Dual-polarized light-field imaging micro-system via a liquid-crystal microlens array for direct three-dimensional observation.

    PubMed

    Xin, Zhaowei; Wei, Dong; Xie, Xingwang; Chen, Mingce; Zhang, Xinyu; Liao, Jing; Wang, Haiwei; Xie, Changsheng

    2018-02-19

    Light-field imaging is a crucial and straightforward way of measuring and analyzing the surrounding light field. In this paper, a dual-polarized light-field imaging micro-system based on a twisted nematic liquid-crystal microlens array (TN-LCMLA) for direct three-dimensional (3D) observation is fabricated and demonstrated. The prototype camera was constructed by integrating a TN-LCMLA with a common CMOS sensor array. By switching the working state of the TN-LCMLA, two orthogonally polarized light-field images can be remapped through the imaging sensor. The imaging micro-system, in conjunction with the electro-optical microstructure, can perform polarization and light-field imaging simultaneously. Compared with conventional plenoptic cameras using a liquid-crystal microlens array, polarization-independent light-field images with high image quality can be obtained in any selected polarization state. We experimentally demonstrate characteristics including a relatively wide operating range in the manipulation of incident beams and multiple imaging modes, such as conventional two-dimensional imaging, light-field imaging, and polarization imaging. Considering the notable features of the TN-LCMLA, such as very low power consumption, the multiple imaging modes mentioned, and simple, low-cost manufacturing, the electrically driven imaging micro-system shows potential for directly observing 3D objects in typical scattering media.

  19. An electrically tunable plenoptic camera using a liquid crystal microlens array.

    PubMed

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  20. An electrically tunable plenoptic camera using a liquid crystal microlens array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Yu; School of Automation, Huazhong University of Science and Technology, Wuhan 430074; Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan 430074

    2015-05-15

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  1. The multifocus plenoptic camera

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Lumsdaine, Andrew

    2012-01-01

    The focused plenoptic camera is based on the Lippmann sensor: an array of microlenses focused on the pixels of a conventional image sensor. This device samples the radiance, or plenoptic function, as an array of cameras with large depth of field, focused at a certain plane in front of the microlenses. For the purpose of digital refocusing (one of the important applications) the depth of field needs to be large, but there are fundamental optical limitations to this. The solution to this problem is to use an array of interleaved microlenses of different focal lengths, focused at two or more different planes. In this way a focused image can be constructed at any depth of focus, and a really wide range of digital refocusing can be achieved. This paper presents our theory and the results of implementing such a camera. Real-world images demonstrate the extended capabilities, and limitations are discussed.

  2. An electrically tunable plenoptic camera using a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  3. Imaging Radar Applications in the Death Valley Region

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.

    1996-01-01

    Death Valley has had a long history as a testbed for remote sensing techniques (Gillespie, this conference). Along with visible-near infrared and thermal IR sensors, imaging radars have flown and orbited over the valley since the 1970's, yielding new insights into the geologic applications of that technology. More recently, radar interferometry has been used to derive digital topographic maps of the area, supplementing the USGS 7.5' digital quadrangles currently available for nearly the entire area. As with their shorter-wavelength brethren, imaging radars were tested early in their civilian history in Death Valley because it has a variety of surface types in a small area without the confounding effects of vegetation. One of the classic references of these early radar studies explained, in a semi-quantitative way, the response of an imaging radar to surface roughness near the radar wavelength, which typically ranges from about 1 cm to 1 m. This laid the groundwork for applications of airborne and spaceborne radars to geologic problems in arid regions. Radar's main advantages over other sensors stem from its active nature: supplying its own illumination makes it independent of solar illumination, and it can also control the imaging geometry more accurately. Finally, its long wavelength allows it to peer through clouds, eliminating some of the problems of optical sensors, especially in perennially cloudy and polar areas.

  4. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
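    The PSW idea can be illustrated end to end on synthetic data: in a DSERN sensor the photon counting histogram shows discrete single-photon peaks whose spacing measures the conversion gain and whose width measures the cumulative noise, so read noise in electrons is width over spacing. The sketch below is our own minimal illustration (Python with NumPy/SciPy), not the paper's estimator:

```python
import numpy as np
from scipy.signal import find_peaks

def read_noise_from_pch(samples, bins=300):
    """Estimate read noise [e- rms] from a photon counting histogram:
    peak spacing gives the conversion gain (DN per electron), and the
    residual spread around the quantized levels gives the noise."""
    hist, edges = np.histogram(samples, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    peaks, _ = find_peaks(hist, prominence=hist.max() * 0.05)
    gain = np.mean(np.diff(centers[peaks]))        # DN per electron
    residual = samples - gain * np.round(samples / gain)
    return residual.std() / gain                   # width / separation

# Synthetic DSERN pixel: Poisson light, gain 10 DN/e-, 2 DN read noise
rng = np.random.default_rng(1)
s = 10 * rng.poisson(1.5, 200_000) + rng.normal(0, 2, 200_000)
print(read_noise_from_pch(s))   # ~0.2 e- rms
```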

  5. Microwave Sensors for Breast Cancer Detection

    PubMed Central

    2018-01-01

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rates significantly. Microwave breast imaging has been reported as the most promising candidate to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensor. In fact, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve the image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed in the manuscript. PMID:29473867

  6. Microwave Sensors for Breast Cancer Detection.

    PubMed

    Wang, Lulu

    2018-02-23

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rates significantly. Microwave breast imaging has been reported as the most promising candidate to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensor. In fact, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve the image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed in the manuscript.

  7. Image Accumulation in Pixel Detector Gated by Late External Trigger Signal and its Application in Imaging Activation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakubek, J.; Cejnarova, A.; Platkevic, M.

    Single quantum counting pixel detectors of Medipix type are starting to be used in various radiographic applications. Compared to standard devices for digital imaging (such as CCDs or CMOS sensors) they present significant advantages: direct conversion of radiation to electric signal, energy sensitivity, noiseless image integration, unlimited dynamic range, absolute linearity. In this article we describe usage of the pixel device TimePix for image accumulation gated by a late trigger signal. Demonstration of the technique is given on imaging coincidence instrumental neutron activation analysis (Imaging CINAA). This method allows one to determine the concentration and distribution of a certain preselected element in an inspected sample.

  8. Toward a digital camera to rival the human eye

    NASA Astrophysics Data System (ADS)

    Skorka, Orit; Joseph, Dileepan

    2011-07-01

    All things considered, electronic imaging systems do not rival the human visual system despite notable progress over 40 years since the invention of the CCD. This work presents a method that allows design engineers to evaluate the performance gap between a digital camera and the human eye. The method identifies limiting factors of the electronic systems by benchmarking against the human system. It considers power consumption, visual field, spatial resolution, temporal resolution, and properties related to signal and noise power. A figure of merit is defined as the performance gap of the weakest parameter. Experimental work done with observers and cadavers is reviewed to assess the parameters of the human eye, and assessment techniques are also covered for digital cameras. The method is applied to 24 modern image sensors of various types, where an ideal lens is assumed to complete a digital camera. Results indicate that dynamic range and dark limit are the most limiting factors. The substantial functional gap, from 1.6 to 4.5 orders of magnitude, between the human eye and digital cameras may arise from architectural differences between the human retina, arranged in a multiple-layer structure, and image sensors, mostly fabricated in planar technologies. Functionality of image sensors may be significantly improved by exploiting technologies that allow vertical stacking of active tiers.
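    The benchmarking method lends itself to a compact numeric illustration. In the sketch below (Python; all parameter values are hypothetical placeholders, not the paper's measurements), each parameter's gap is expressed in orders of magnitude and the figure of merit is the gap of the weakest parameter:

```python
import math

def figure_of_merit(human, camera):
    """Gap per parameter in orders of magnitude (log10 of human/camera,
    larger = camera further behind); the figure of merit is the gap of
    the weakest parameter."""
    gaps = {k: math.log10(human[k] / camera[k]) for k in human}
    worst = max(gaps, key=gaps.get)
    return worst, gaps[worst], gaps

# Hypothetical values for illustration only (linear scale, bigger = better)
human  = {"dynamic_range": 1e8, "dark_limit_inv": 1e6, "spatial_res": 1e7}
camera = {"dynamic_range": 1e4, "dark_limit_inv": 1e3, "spatial_res": 1e7}
print(figure_of_merit(human, camera))  # weakest: dynamic_range, gap 4.0
```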

  9. Wavelength- or Polarization-Selective Thermal Infrared Detectors for Multi-Color or Polarimetric Imaging Using Plasmonics and Metamaterials

    PubMed Central

    Ogawa, Shinpei; Kimata, Masafumi

    2017-01-01

    Wavelength- or polarization-selective thermal infrared (IR) detectors are promising for various novel applications such as fire detection, gas analysis, multi-color imaging, multi-channel detectors, recognition of artificial objects in a natural environment, and facial recognition. However, these functions require additional filters or polarizers, which leads to high cost and technical difficulties related to integration of many different pixels in an array format. Plasmonic metamaterial absorbers (PMAs) can impart wavelength or polarization selectivity to conventional thermal IR detectors simply by controlling the surface geometry of the absorbers to produce surface plasmon resonances at designed wavelengths or polarizations. This enables integration of many different pixels in an array format without any filters or polarizers. We review our recent advances in wavelength- and polarization-selective thermal IR sensors using PMAs for multi-color or polarimetric imaging. The absorption mechanism defined by the surface structures is discussed for three types of PMAs—periodic crystals, metal-insulator-metal and mushroom-type PMAs—to demonstrate appropriate applications. Our wavelength- or polarization-selective uncooled IR sensors using various PMAs and multi-color image sensors are then described. Finally, high-performance mushroom-type PMAs are investigated. These advanced functional thermal IR detectors with wavelength or polarization selectivity will provide great benefits for a wide range of applications. PMID:28772855

  10. Wavelength- or Polarization-Selective Thermal Infrared Detectors for Multi-Color or Polarimetric Imaging Using Plasmonics and Metamaterials.

    PubMed

    Ogawa, Shinpei; Kimata, Masafumi

    2017-05-04

    Wavelength- or polarization-selective thermal infrared (IR) detectors are promising for various novel applications such as fire detection, gas analysis, multi-color imaging, multi-channel detectors, recognition of artificial objects in a natural environment, and facial recognition. However, these functions require additional filters or polarizers, which leads to high cost and technical difficulties related to integration of many different pixels in an array format. Plasmonic metamaterial absorbers (PMAs) can impart wavelength or polarization selectivity to conventional thermal IR detectors simply by controlling the surface geometry of the absorbers to produce surface plasmon resonances at designed wavelengths or polarizations. This enables integration of many different pixels in an array format without any filters or polarizers. We review our recent advances in wavelength- and polarization-selective thermal IR sensors using PMAs for multi-color or polarimetric imaging. The absorption mechanism defined by the surface structures is discussed for three types of PMAs-periodic crystals, metal-insulator-metal and mushroom-type PMAs-to demonstrate appropriate applications. Our wavelength- or polarization-selective uncooled IR sensors using various PMAs and multi-color image sensors are then described. Finally, high-performance mushroom-type PMAs are investigated. These advanced functional thermal IR detectors with wavelength or polarization selectivity will provide great benefits for a wide range of applications.

  11. Estimating plant area index for monitoring crop growth dynamics using Landsat-8 and RapidEye images

    NASA Astrophysics Data System (ADS)

    Shang, Jiali; Liu, Jiangui; Huffman, Ted; Qian, Budong; Pattey, Elizabeth; Wang, Jinfei; Zhao, Ting; Geng, Xiaoyuan; Kroetsch, David; Dong, Taifeng; Lantz, Nicholas

    2014-01-01

    This study investigates the use of two different optical sensors, the multispectral imager (MSI) onboard the RapidEye satellites and the operational land imager (OLI) onboard the Landsat-8 for mapping within-field variability of crop growth conditions and tracking the seasonal growth dynamics. The study was carried out in southern Ontario, Canada, during the 2013 growing season for three annual crops, corn, soybeans, and winter wheat. Plant area index (PAI) was measured at different growth stages using digital hemispherical photography at two corn fields, two winter wheat fields, and two soybean fields. Comparison between several conventional vegetation indices derived from concurrently acquired image data by the two sensors showed a good agreement. The two-band enhanced vegetation index (EVI2) and the normalized difference vegetation index (NDVI) were derived from the surface reflectance of the two sensors. The study showed that EVI2 was more resistant to saturation at high biomass range than NDVI. A linear relationship could be used for crop green effective PAI estimation from EVI2, with a coefficient of determination (R2) of 0.85 and root-mean-square error of 0.53. The estimated multitemporal product of green PAI was found to be able to capture the seasonal dynamics of the three crops.
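
    The vegetation indices named in the abstract have standard closed forms, so a short sketch can make the PAI retrieval concrete. NDVI and EVI2 below follow their usual definitions; the linear PAI coefficients are invented placeholders, since the abstract reports only the fit quality (R2 = 0.85, RMSE = 0.53), not the coefficients.

      import numpy as np

      def ndvi(nir, red):
          # Normalized difference vegetation index.
          return (nir - red) / (nir + red)

      def evi2(nir, red):
          # Two-band enhanced vegetation index (standard formulation).
          return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)

      # Hypothetical linear calibration PAI = a * EVI2 + b.
      a, b = 4.0, -0.3
      nir = np.array([0.35, 0.42, 0.50])   # example surface reflectances
      red = np.array([0.08, 0.06, 0.04])
      pai_estimate = a * evi2(nir, red) + b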

  12. Towards establishing compact imaging spectrometer standards

    USGS Publications Warehouse

    Slonecker, E. Terrence; Allen, David W.; Resmini, Ronald G.

    2016-01-01

    Remote sensing science is currently undergoing a tremendous expansion in the area of hyperspectral imaging (HSI) technology. Spurred largely by the explosive growth of Unmanned Aerial Vehicles (UAV), sometimes called Unmanned Aircraft Systems (UAS), or drones, HSI capabilities that once required access to one of only a handful of very specialized and expensive sensor systems are now miniaturized and widely available commercially. Small compact imaging spectrometers (CIS) now on the market offer a number of hyperspectral imaging capabilities in terms of spectral range and sampling. The potential uses of HSI/CIS on UAVs/UASs seem limitless. However, the rapid expansion of unmanned aircraft and small hyperspectral sensor capabilities has created a number of questions related to technological, legal, and operational capabilities. Lightweight sensor systems suitable for UAV platforms are being advertised in the trade literature at an ever-expanding rate with no standardization of system performance specifications or terms of reference. To address this issue, both the U.S. Geological Survey and the National Institute of Standards and Technology are developing draft standards to meet these needs. This paper presents the outline of a combined USGS/NIST cooperative strategy to develop and test a characterization methodology to meet the needs of a new and expanding UAV/CIS/HSI user community.

  13. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    PubMed

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    Vision navigation, which determines position and attitude via real-time processing of imaging sensor data, is attractive when a high-performance global positioning system (GPS) and inertial measurement unit (IMU) are unavailable. It is widely used in indoor navigation, deep space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel imaging sensor-aided vision navigation approach that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image against the GRID. The image matched to the real-time scene is then used to calculate the 3D navigation parameters of the multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in plane and 1.8 m in height during GPS outages of up to 5 min and 1500 m.
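
    The "linear index of a road segment" suggests a simple keyed lookup: images are binned by their position along a road so that a query touches only a few bins. The sketch below is one plausible reading of that idea; the key structure, bin spacing and image identifiers are all assumptions.

      # Hypothetical GRID storage: images keyed by (segment_id, chainage bin).
      BIN_M = 10.0  # assumed bin spacing along the road, in metres

      grid_index = {}  # (segment_id, bin) -> list of image ids

      def add_image(segment_id, chainage_m, image_id):
          key = (segment_id, int(chainage_m // BIN_M))
          grid_index.setdefault(key, []).append(image_id)

      def candidates(segment_id, chainage_m):
          # Return images in the query bin and its two neighbours,
          # so matching only ever considers a handful of candidates.
          b = int(chainage_m // BIN_M)
          out = []
          for k in (b - 1, b, b + 1):
              out.extend(grid_index.get((segment_id, k), []))
          return out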

  14. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database

    PubMed Central

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-01

    Vision navigation, which determines position and attitude via real-time processing of imaging sensor data, is attractive when a high-performance global positioning system (GPS) and inertial measurement unit (IMU) are unavailable. It is widely used in indoor navigation, deep space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel imaging sensor-aided vision navigation approach that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image against the GRID. The image matched to the real-time scene is then used to calculate the 3D navigation parameters of the multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in plane and 1.8 m in height during GPS outages of up to 5 min and 1500 m. PMID:26828496

  15. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    PubMed

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembly of cell groups on the CMOS sensor surface allows large-field (6.66 mm×5.32 mm, the entire active area of the CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on the CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time, high-content analysis of single cells over a large field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel

    PubMed Central

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-01-01

    A CMOS image sensor-based implantable glucose sensor based on an optical-sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments. PMID:25426316

  17. Beam imaging sensor and method for using same

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McAninch, Michael D.; Root, Jeffrey J.

    The present invention relates generally to the field of sensors for beam imaging and, in particular, to a new and useful beam imaging sensor for use in determining, for example, the power density distribution of a beam including, but not limited to, an electron beam or an ion beam. In one embodiment, the beam imaging sensor of the present invention comprises, among other items, a circumferential slit that is either circular, elliptical or polygonal in nature. In another embodiment, the beam imaging sensor of the present invention comprises, among other things, a discontinuous partially circumferential slit. Also disclosed is a method for using the various beam sensor embodiments of the present invention.

  18. Analysis on the Effect of Sensor Views in Image Reconstruction Produced by Optical Tomography System Using Charge-Coupled Device.

    PubMed

    Jamaludin, Juliza; Rahim, Ruzairi Abdul; Fazul Rahiman, Mohd Hafiz; Mohd Rohani, Jemmy

    2018-04-01

    Optical tomography (OPT) is a method for capturing a cross-sectional image based on data obtained by sensors distributed around the periphery of the analyzed system. The system is based on measuring the final light attenuation, or absorption of radiation, after crossing the measured objects. The number of sensor views affects the quality of image reconstruction: a high number of sensor views per projection gives high image quality. This research presents an application of a charge-coupled device linear sensor and a laser diode in an OPT system. Experiments in detecting solid and transparent objects in crystal-clear water were conducted. Two numbers of sensor views, 160 and 320, were evaluated for reconstructing the images. The image reconstruction algorithm used was the filtered linear back-projection algorithm. Comparison of the simulated and experimental image results shows that 320 views give a lower area error than 160 views, suggesting that a higher number of views yields higher-resolution image reconstruction.
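
    The back-projection step can be shown in a few lines. The sketch below is an unfiltered linear back projection over a binary sensitivity matrix (one row per sensor view); the filtering stage of the paper's algorithm and any thresholding are omitted, and the matrix layout is an assumption.

      import numpy as np

      def linear_back_projection(S, m):
          # S: (n_views, n_pixels) 0/1 sensitivity matrix, one row per ray;
          # m: (n_views,) attenuation measurements.
          img = S.T @ m                      # smear each measurement along its ray
          hits = S.sum(axis=0)               # number of rays crossing each pixel
          return img / np.maximum(hits, 1)   # normalize; more views, less smearing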

  19. Integration of piezo-capacitive and piezo-electric nanoweb based pressure sensors for imaging of static and dynamic pressure distribution.

    PubMed

    Jeong, Y J; Oh, T I; Woo, E J; Kim, K J

    2017-07-01

    Recently, highly flexible and soft pressure distribution imaging sensors have been in great demand for tactile sensing, gait analysis, ubiquitous life-care based on activity recognition, and therapeutics. In this study, we integrate piezo-capacitive and piezo-electric nanowebs with conductive fabric sheets for detecting static and dynamic pressure distributions over a large sensing area. Electrical impedance tomography (EIT) and electric source imaging are applied to reconstruct pressure distribution images from current-voltage data measured on the boundary of the hybrid fabric sensor. We evaluated the piezo-capacitive nanoweb sensor, the piezo-electric nanoweb sensor, and the hybrid fabric sensor. The results show the feasibility of static and dynamic pressure distribution imaging from the boundary measurements of the fabric sensors.

  20. Anatomically correct visualization of the human upper airway using a high-speed long range optical coherence tomography system with an integrated positioning sensor

    NASA Astrophysics Data System (ADS)

    Jing, Joseph C.; Chou, Lidek; Su, Erica; Wong, Brian J. F.; Chen, Zhongping

    2016-12-01

    The upper airway is a complex tissue structure that is prone to collapse. Current methods for studying airway obstruction are inadequate in safety, cost, or availability, such as CT or MRI, or only provide localized qualitative information, such as flexible endoscopy. Long range optical coherence tomography (OCT) has been used to visualize the human airway in vivo; however, the limited imaging range has prevented full delineation of the various shapes and sizes of the lumen. We present a new long range OCT system that integrates high-speed imaging with a real-time position tracker to allow the acquisition of an accurate 3D anatomical structure in vivo. The new system can achieve an imaging range of 30 mm at a frame rate of 200 Hz. The system is capable of generating a rapid and complete visualization and quantification of the airway, which can then be used in computational simulations to determine obstruction sites.

  1. MACS-Mar: a real-time remote sensing system for maritime security applications

    NASA Astrophysics Data System (ADS)

    Brauchle, Jörg; Bayer, Steven; Hein, Daniel; Berger, Ralf; Pless, Sebastian

    2018-04-01

    The modular aerial camera system (MACS) is a development platform for optical remote sensing concepts, algorithms and special environments. For real-time services for maritime security (EMSec joint project), a new multi-sensor configuration, MACS-Mar, was realized. It consists of four co-aligned sensor heads in the visible RGB, near infrared (NIR, 700-950 nm), hyperspectral (HS, 450-900 nm) and thermal infrared (TIR, 7.5-14 µm) spectral ranges, a mid-cost navigation system, a processing unit and two data links. On-board image projection, cropping of redundant data and compression enable the instant generation of direct-georeferenced high-resolution image mosaics, automatic object detection, vectorization and annotation of floating objects on the water surface. The results were transmitted over distances of up to 50 km in real time via narrow- and broadband data links and were visualized in a maritime situation awareness system. For the automatic onboard detection of floating objects, a segmentation and classification workflow based on RGB, IR and TIR information was developed and tested. In the experiment, the completeness of the object detection was 95% and the correctness 53%. Bright backwash of ships mostly led to an overestimation of the number of objects; further refinement using water homogeneity in the TIR, as implemented in the workflow, could not be carried out due to problems with the TIR sensor, otherwise distinctly better results could have been expected. The absolute positional accuracy of the projected real-time imagery was 2 m without postprocessing of images or navigation data; the relative measurement accuracy of distances is in the range of the image resolution, which is about 12 cm for RGB imagery in the EMSec experiment.
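
    Completeness and correctness here are the familiar recall and precision of detection. A small helper, with counts invented to be consistent with the reported 95%/53% figures:

      def completeness(tp, fn):   # recall: found objects / all true objects
          return tp / (tp + fn)

      def correctness(tp, fp):    # precision: true detections / all detections
          return tp / (tp + fp)

      # Invented counts matching the reported figures: 95 of 100 true objects
      # found, plus 85 false detections (e.g., from bright backwash).
      print(completeness(95, 5), correctness(95, 85))  # 0.95, ~0.53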

  2. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    PubMed

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmooths real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
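
    The heavier-than-Poisson tails the paper reports can be mimicked by a two-component Poisson mixture, where a small high-gain component fattens the tails relative to a single Poisson of similar mean. The weights and gains below are invented for illustration, not fitted model parameters from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def poisson_mixture(signal, weights=(0.95, 0.05), gains=(1.0, 3.0), n=100_000):
          # Draw each sample from one of the Poisson components.
          comp = rng.choice(len(weights), size=n, p=weights)
          lam = signal * np.asarray(gains)[comp]
          return rng.poisson(lam)

      x_mix = poisson_mixture(20.0)
      x_poi = rng.poisson(20.0, size=100_000)
      # The mixture's extreme quantile extends well beyond the pure Poisson's.
      print(np.quantile(x_mix, 0.999), np.quantile(x_poi, 0.999))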

  3. KSC-2010-4679

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  4. KSC-2010-4678

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  5. KSC-2010-4680

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  6. KSC-2010-4681

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  7. KSC-2010-4683

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  8. KSC-2010-4677

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is prepared for installation while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  9. KSC-2010-4682

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  10. Apparatus and method for a light direction sensor

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B. (Inventor)

    2011-01-01

    The present invention provides a light direction sensor for determining the direction of a light source. The system includes an image sensor, a spacer attached to the image sensor, and a pattern mask attached to the spacer. The pattern mask has a slit pattern such that light passing through it casts a diffraction pattern onto the image sensor. The method operates by receiving a beam of light onto a patterned mask, wherein the patterned mask has a plurality of slit segments, then diffracting the beam of light onto an image sensor and determining the direction of the light source.
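
    In the small-angle regime the idea reduces to geometry: the cast pattern shifts across the sensor by an amount proportional to the spacer height times the tangent of the incidence angle. A minimal centroid-based sketch, with the calibration offsets (the pattern centroid at normal incidence) assumed known:

      import numpy as np

      def light_direction(frame, pitch_um, spacer_um, x0, y0):
          # frame: 2D intensity image of the cast pattern; pitch_um: pixel
          # pitch; spacer_um: mask-to-sensor distance; (x0, y0): pattern
          # centroid at normal incidence (hypothetical calibration values).
          ys, xs = np.indices(frame.shape)
          total = frame.sum()
          cx = (xs * frame).sum() / total
          cy = (ys * frame).sum() / total
          ax = np.arctan2((cx - x0) * pitch_um, spacer_um)
          ay = np.arctan2((cy - y0) * pitch_um, spacer_um)
          return np.degrees(ax), np.degrees(ay)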

  11. Initial test of MITA/DIMM with an operational CBP system

    NASA Astrophysics Data System (ADS)

    Baldwin, Kevin; Hanna, Randall; Brown, Andrea; Brown, David; Moyer, Steven; Hixson, Jonathan G.

    2018-05-01

    The MITA (Motion Imagery Task Analyzer) project was conceived by CBP OA (Customs and Border Protection - Office of Acquisition) and executed by JHU/APL (Johns Hopkins University/Applied Physics Laboratory) and CERDEC NVESD MSD (Communications and Electronics Research Development Engineering Command Night Vision and Electronic Sensors Directorate Modeling and Simulation Division). The intent was to develop an efficient methodology whereby imaging system performance could be quickly and objectively characterized in a field setting. The initial design, development, and testing spanned a period of approximately 18 months with the initial project coming to a conclusion after testing of the MITA system in June 2017 with a fielded CBP system. The NVESD contribution to MITA was thermally heated target resolution boards deployed to support a range close to the sensor and, when possible, at range with the targets of interest. JHU/APL developed a laser DIMM (Differential Image Motion Monitor) system designed to measure the optical turbulence present along the line of sight of the imaging system during the time of image collection. The imagery collected of the target board was processed to calculate the in situ system resolution. This in situ imaging system resolution and the time-correlated turbulence measured by the DIMM system were used in NV-IPM (Night Vision Integrated Performance Model) to calculate the theoretical imaging system performance. Overall, this proves the MITA concept feasible. However, MITA is still in the initial phases of development and requires further verification and validation to ensure accuracy and reliability of both the instrument and the imaging system performance predictions.

  12. Dynamic Range and Sensitivity Requirements of Satellite Ocean Color Sensors: Learning from the Past

    NASA Technical Reports Server (NTRS)

    Hu, Chuanmin; Feng, Lian; Lee, Zhongping; Davis, Curtiss O.; Mannino, Antonio; McClain, Charles R.; Franz, Bryan A.

    2012-01-01

    Sensor design and mission planning for satellite ocean color measurements requires careful consideration of the signal dynamic range and sensitivity (specifically here signal-to-noise ratio or SNR) so that small changes of ocean properties (e.g., surface chlorophyll-a concentrations or Chl) can be quantified while most measurements are not saturated. Past and current sensors used different signal levels, formats, and conventions to specify these critical parameters, making it difficult to make cross-sensor comparisons or to establish standards for future sensor design. The goal of this study is to quantify these parameters under uniform conditions for widely used past and current sensors in order to provide a reference for the design of future ocean color radiometers. Using measurements from the Moderate Resolution Imaging Spectroradiometer onboard the Aqua satellite (MODISA) under various solar zenith angles (SZAs), typical (L(sub typical)) and maximum (L(sub max)) at-sensor radiances from the visible to the shortwave IR were determined. The L(sub typical) values at an SZA of 45 deg were used as constraints to calculate SNRs of 10 multiband sensors at the same L(sub typical) radiance input and 2 hyperspectral sensors at a similar radiance input. The calculations were based on clear-water scenes with an objective method of selecting pixels with minimal cross-pixel variations to assure target homogeneity. Among the widely used ocean color sensors that have routine global coverage, MODISA ocean bands (1 km) showed 2-4 times higher SNRs than the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) (1 km) and comparable SNRs to the Medium Resolution Imaging Spectrometer (MERIS)-RR (reduced resolution, 1.2 km), leading to different levels of precision in the retrieved Chl data product. MERIS-FR (full resolution, 300 m) showed SNRs lower than MODISA and MERIS-RR with the gain in spatial resolution. SNRs of all MODISA ocean bands and SeaWiFS bands (except the SeaWiFS near-IR bands) exceeded those from prelaunch sensor specifications after adjusting the input radiance to L(sub typical). The tabulated L(sub typical), L(sub max), and SNRs of the various multiband and hyperspectral sensors under the same or similar radiance input provide references to compare sensor performance in product precision and to help design future missions such as the Geostationary Coastal and Air Pollution Events (GEO-CAPE) mission and the Pre-Aerosol-Clouds-Ecosystems (PACE) mission currently being planned by the U.S. National Aeronautics and Space Administration (NASA).
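
    The clear-water SNR estimate described above can be sketched as window statistics: compute mean/std in small blocks, keep the most homogeneous blocks, and average their mean-to-std ratios. The block size, selection rule, and the square-root (shot-noise-limited) adjustment to a common radiance are all simplifying assumptions, not the paper's exact procedure.

      import numpy as np

      def scene_snr(radiance, win=5, n_keep=100):
          # Estimate SNR from a homogeneous clear-water scene.
          h, w = radiance.shape
          ratios = []
          for i in range(0, h - win, win):
              for j in range(0, w - win, win):
                  blk = radiance[i:i + win, j:j + win]
                  m, s = blk.mean(), blk.std()
                  if s > 0:
                      ratios.append((s / m, m / s))
          ratios.sort()                      # most homogeneous blocks first
          return np.mean([snr for _, snr in ratios[:n_keep]])

      def snr_at(snr_meas, L_meas, L_typical):
          # Shot-noise-limited assumption: SNR scales as sqrt(radiance).
          return snr_meas * np.sqrt(L_typical / L_meas)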

  13. Development of three-dimensional tracking system using astigmatic lens method for microscopes

    NASA Astrophysics Data System (ADS)

    Kibata, Hiroki; Ishii, Katsuhiro

    2017-07-01

    We have developed a three-dimensional tracking system for microscopes. Using the astigmatic lens method and a CMOS image sensor, we realize a rapid detection of a target position in a wide range. We demonstrate a target tracking using the developed system.
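
    In the astigmatic lens method, the focal spot is circular at best focus and elongates along orthogonal axes on either side of it, so an ellipticity measure of the spot maps (through calibration) to axial position. A minimal second-moment sketch, with a hypothetical linear calibration slope:

      import numpy as np

      def astigmatic_z(spot, gain_um=2.0):
          # spot: 2D intensity image of the focal spot; gain_um: invented
          # calibration slope (micrometres per unit ellipticity).
          ys, xs = np.indices(spot.shape)
          total = spot.sum()
          cx, cy = (xs * spot).sum() / total, (ys * spot).sum() / total
          sx2 = ((xs - cx) ** 2 * spot).sum() / total
          sy2 = ((ys - cy) ** 2 * spot).sum() / total
          ellipticity = (sx2 - sy2) / (sx2 + sy2)  # 0 at best focus
          return gain_um * ellipticity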

  14. An examination of polyvinylidene fluoride capacitive sensors as ultrasound transducer for imaging applications

    NASA Astrophysics Data System (ADS)

    Reyes-Ramírez, B.; García-Segundo, C.; García-Valenzuela, A.

    2014-05-01

    We investigate theoretically and experimentally the performance of low-noise capacitive sensors based on polyvinylidene fluoride (PVDF) piezoelectric films for sensing water-borne ultrasound signals for use in photoacoustic tomography. We derive a mechanical-to-electrical transfer function of a piezoelectric capacitor sensor of infinite lateral dimensions and arbitrary thickness, assuming a normally incident ultrasound wave. We then analyse the response to obliquely incident ultrasound waves on sensors of large but finite area and derive an expression for the angle dependence of the sensor's response. We also present different experimental measurements with home-made sensors and compare them with our theoretical model. We present measurements of the sensors' response to harmonic signals of variable frequency in the range from 0.5 to 50 MHz and of the angular-dependence factor at 6 MHz. Additionally, given the scope of interest in these kinds of sensors, we also tested the sensors' response to photoacoustic perturbations. These are generated by laser pulses impinging directly on the sensor and by ultrasound perturbations produced on neoprene by the same kind of laser pulses that then travel through water to the sensor.

  15. DynAMITe: a prototype large area CMOS APS for breast cancer diagnosis using x-ray diffraction measurements

    NASA Astrophysics Data System (ADS)

    Konstantinidis, A.; Anaxagoras, T.; Esposito, M.; Allinson, N.; Speller, R.

    2012-03-01

    X-ray diffraction studies are used to identify specific materials. Several laboratory-based x-ray diffraction studies were made for breast cancer diagnosis. Ideally a large area, low noise, linear and wide dynamic range digital x-ray detector is required to perform x-ray diffraction measurements. Recently, digital detectors based on Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor (APS) technology have been used in x-ray diffraction studies. Two APS detectors, namely Vanilla and Large Area Sensor (LAS), were developed by the Multidimensional Integrated Intelligent Imaging (MI-3) consortium to cover a range of scientific applications including x-ray diffraction. The MI-3 Plus consortium developed a novel large area APS, named Dynamically Adjustable Medical Imaging Technology (DynAMITe), to combine the key characteristics of Vanilla and LAS with a number of extra features. The active area (12.8 × 13.1 cm2) of DynAMITe offers the ability to perform angle dispersive x-ray diffraction (ADXRD). The current study demonstrates the feasibility of using DynAMITe for breast cancer diagnosis by identifying six breast-equivalent plastics. Further work will be done to optimize the system in order to perform ADXRD for identification of suspicious areas of breast tissue following a conventional mammogram taken with the same sensor.

  16. Next Generation Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Lee, Jimmy; Spencer, Susan; Bryan, Tom; Johnson, Jimmie; Robertson, Bryan

    2008-01-01

    The first autonomous rendezvous and docking in the history of the U.S. Space Program was successfully accomplished by Orbital Express, using the Advanced Video Guidance Sensor (AVGS) as the primary docking sensor. The United States now has a mature and flight-proven sensor technology for supporting Crew Exploration Vehicles (CEV) and Commercial Orbital Transportation Services (COTS) Automated Rendezvous and Docking (AR&D). AVGS has a proven pedigree, based on extensive ground testing and flight demonstrations. The AVGS on the Demonstration of Autonomous Rendezvous Technology (DART) mission operated successfully in "spot mode" out to 2 km. The first-generation rendezvous and docking sensor, the Video Guidance Sensor (VGS), was developed and successfully flown on Space Shuttle flights in 1997 and 1998. Parts obsolescence issues prevent the construction of more AVGS units, and the next-generation sensor must be updated to support the CEV and COTS programs. The flight-proven AR&D sensor is being redesigned to update parts and add additional capabilities for CEV and COTS with the development of the Next Generation AVGS (NGAVGS) at the Marshall Space Flight Center. The obsolete imager and processor are being replaced with new radiation-tolerant parts. In addition, new capabilities might include greater sensor range, auto ranging, and real-time video output. This paper presents an approach to sensor hardware trades, use of highly integrated laser components, and addresses the needs of future vehicles that may rendezvous and dock with the International Space Station (ISS) and other Constellation vehicles. It also discusses approaches for upgrading AVGS to address parts obsolescence, and concepts for minimizing the sensor footprint, weight, and power requirements. In addition, parts selection and test plans for the NGAVGS are addressed to provide a highly reliable flight-qualified sensor. Expanded capabilities through innovative use of existing capabilities are also discussed.

  17. Study the performance of star sensor influenced by space radiation damage of image sensor

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Yudong; Wen, Lin; Guo, Qi; Zhang, Xingyao

    2018-03-01

    The star sensor is an essential component of a spacecraft attitude control system. Space radiation can cause star sensor performance degradation, abnormal operation, and reduced attitude measurement accuracy and reliability. Many studies have been dedicated to the radiation effect on Charge-Coupled Device (CCD) image sensors, but fewer studies focus on the radiation effect on the star sensor. The innovation of this paper is to study the radiation effects from the device level to the system level. The influence of the degradation of the radiation-sensitive parameters of the CCD image sensor on the performance parameters of the star sensor is studied. The correlation among the proton radiation effect, the non-uniformity noise of the CCD image sensor, and the performance parameters of the star sensor is analyzed. This paper establishes a foundation for the study of error prediction and correction techniques for on-orbit star sensor attitude measurement, and provides a theoretical basis for the design of high-performance star sensors.

  18. Penrose high-dynamic-range imaging

    NASA Astrophysics Data System (ADS)

    Li, Jia; Bai, Chenyan; Lin, Zhouchen; Yu, Jian

    2016-05-01

    High-dynamic-range (HDR) imaging is becoming increasingly popular and widespread. The most common multishot HDR approach, based on multiple low-dynamic-range images captured with different exposures, has difficulties in handling camera and object movements. The spatially varying exposures (SVE) technology provides a solution to overcome this limitation by obtaining multiple exposures of the scene in only one shot but suffers from a loss in spatial resolution of the captured image. While aperiodic assignment of exposures has been shown to be advantageous during reconstruction in alleviating resolution loss, almost all the existing imaging sensors use the square pixel layout, which is a periodic tiling of square pixels. We propose the Penrose pixel layout, using pixels in aperiodic rhombus Penrose tiling, for HDR imaging. With the SVE technology, Penrose pixel layout has both exposure and pixel aperiodicities. To investigate its performance, we have to reconstruct HDR images in square pixel layout from Penrose raw images with SVE. Since the two pixel layouts are different, the traditional HDR reconstruction methods are not applicable. We develop a reconstruction method for Penrose pixel layout using a Gaussian mixture model for regularization. Both quantitative and qualitative results show the superiority of Penrose pixel layout over square pixel layout.

  19. Polymer-Free Optode Nanosensors for Dynamic, Reversible, and Ratiometric Sodium Imaging in the Physiological Range

    PubMed Central

    Ruckh, Timothy T.; Mehta, Ankeeta A.; Dubach, J. Matthew; Clark, Heather A.

    2013-01-01

    This work introduces a polymer-free optode nanosensor for ratiometric sodium imaging. Transmembrane ion dynamics are often captured by electrophysiology and calcium imaging, but sodium dyes suffer from short excitation wavelengths and poor selectivity. Optodes, optical sensors composed of a polymer matrix with embedded sensing chemistry, have been translated into nanosensors that selectively image ion concentrations. Polymer-free nanosensors were fabricated by emulsification and were stable by diameter and sensitivity for at least one week. Ratiometric fluorescent measurements demonstrated that the nanosensors are selective for sodium over potassium by ~1.4 orders of magnitude, have a dynamic range centered at 20 mM, and are fully reversible. The ratiometric signal changes by 70% between 10 and 100 mM sodium, showing that they are sensitive to changes in sodium concentration. These nanosensors will provide a new tool for sensitive and quantitative ion imaging. PMID:24284431

  20. Development and testing of the EVS 2000 enhanced vision system

    NASA Astrophysics Data System (ADS)

    Way, Scott P.; Kerr, Richard; Imamura, Joe J.; Arnoldy, Dan; Zeylmaker, Richard; Zuro, Greg

    2003-09-01

    An effective enhanced vision system must operate over a broad spectral range in order to offer a pilot an optimized scene that includes runway background as well as airport lighting and aircraft operations. The large dynamic range of intensities of these images is best handled with separate imaging sensors. The EVS 2000 is a patented dual-band Infrared Enhanced Vision System (EVS) utilizing image fusion concepts to provide a single image from uncooled infrared imagers in both the LWIR and SWIR. The system is designed to provide commercial and corporate airline pilots with improved situational awareness at night and in degraded weather conditions. A prototype of this system was recently fabricated and flown on the Boeing Advanced Technology Demonstrator 737-900 aircraft. This paper will discuss the current EVS 2000 concept, show results taken from the Boeing Advanced Technology Demonstrator program, and discuss future plans for EVS systems.

  1. Improved detection and false alarm rejection for chemical vapors using passive hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Marinelli, William J.; Miyashiro, Rex; Gittins, Christopher M.; Konno, Daisei; Chang, Shing; Farr, Matt; Perkins, Brad

    2013-05-01

    Two AIRIS sensors were tested at Dugway Proving Grounds against chemical agent vapor simulants. The primary objectives of the test were to: 1) assess the performance of algorithm improvements designed to reduce false alarm rates, with special emphasis on solar effects, and 2) evaluate target detection performance at 5 km. The tests included 66 total releases comprising alternating 120 kg glacial acetic acid (GAA) and 60 kg triethyl phosphate (TEP) events. The AIRIS sensors had common algorithms, detection thresholds, and sensor parameters. The sensors used the target set defined for the Joint Service Lightweight Chemical Agent Detector (JSLSCAD), with TEP substituted for GA and GAA substituted for VX. They were exercised at two sites located 3 km and 5 km from the release point. Data from the tests will be presented showing that: 1) excellent detection capability was obtained at both ranges, with significantly shorter alarm times at 5 km; 2) inter-sensor comparison revealed very comparable performance; 3) false alarm rates < 1 incident per 10 hours of running time were achieved over 143 hours of sensor operations; and 4) algorithm improvements eliminated both solar and cloud false alarms. The algorithms enabling the improved false alarm rejection will be discussed. The sensor technology has recently been extended to address the detection of liquid and solid chemical agents and toxic industrial chemicals on surfaces. The phenomenology and applicability of passive infrared hyperspectral imaging to this problem will be discussed and demonstrated.

  2. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    PubMed Central

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
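
    The peak separation and width (PSW) idea can be illustrated compactly: in a deep sub-electron read noise sensor the photon-number peaks are well separated, so each sample can be snapped to its nearest peak and the residual spread read off as the noise. The conversion gain is assumed known here, and the nearest-integer assignment is only a valid shortcut when the read noise is well below one electron.

      import numpy as np

      def read_noise_psw(samples_dn, gain_dn_per_e):
          # Snap each sample to the nearest photon-number peak and take the
          # within-peak spread as the read noise (in electrons rms); the peak
          # separation is one electron by construction.
          e = samples_dn / gain_dn_per_e
          return np.std(e - np.rint(e))

      # Synthetic DSERN data: 0.25 e- read noise, 10 DN per electron.
      rng = np.random.default_rng(1)
      photons = rng.poisson(1.5, 50_000)
      dn = 10.0 * (photons + rng.normal(0.0, 0.25, photons.size))
      print(read_noise_psw(dn, 10.0))   # close to 0.25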

  3. Characterization of a smartphone camera's response to ultraviolet A radiation.

    PubMed

    Igoe, Damien; Parisi, Alfio; Carter, Brad

    2013-01-01

    As part of a wider study into the use of smartphones as solar ultraviolet radiation monitors, this article characterizes the ultraviolet A (UVA; 320-400 nm) response of a consumer complementary metal oxide semiconductor (CMOS)-based smartphone image sensor in a controlled laboratory environment. The CMOS image sensor in the camera possesses inherent sensitivity to UVA, and despite the attenuation due to the lens and the neutral density and wavelength-specific bandpass filters, the measured UVA irradiances relative to the incident irradiances range from 0.0065% at 380 nm to 0.0051% at 340 nm. In addition, the sensor demonstrates a predictable response to low-intensity discrete UVA stimuli that can be modelled using the ratio of recorded digital values to the incident UVA irradiance for a given automatic exposure time, resulting in measurement errors that are typically less than 5%. Our results support the idea that smartphones can be used for scientific monitoring of UVA radiation. © 2012 Wiley Periodicals, Inc. Photochemistry and Photobiology © 2012 The American Society of Photobiology.
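
    The response model stated in the abstract (digital value proportional to incident irradiance times exposure time) inverts directly. A toy calibration/measurement pair, with all numbers invented:

      def response_ratio(dn, irradiance, exposure_s):
          # Calibration: digital value per unit UVA irradiance per second.
          return dn / (irradiance * exposure_s)

      def estimate_irradiance(dn, k, exposure_s):
          # Invert the model for a new frame with a known exposure time.
          return dn / (k * exposure_s)

      # Calibrate against a known 10 uW/cm^2 stimulus, then measure.
      k = response_ratio(dn=420.0, irradiance=10.0, exposure_s=0.5)
      print(estimate_irradiance(dn=630.0, k=k, exposure_s=0.5))  # ~15 uW/cm^2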

  4. LandingNav: a precision autonomous landing sensor for robotic platforms on planetary bodies

    NASA Astrophysics Data System (ADS)

    Katake, Anup; Bruccoleri, Christian; Singla, Puneet; Junkins, John L.

    2010-01-01

    Increased interest in the exploration of extraterrestrial planetary bodies calls for an increase in the number of spacecraft landing on remote planetary surfaces. Currently, imaging and radar based surveys are used to determine regions of interest and a safe landing zone. The purpose of this paper is to introduce LandingNav, a sensor system solution for autonomous landing on planetary bodies that enables landing on unknown terrain. LandingNav is based on a novel multiple field of view imaging system that leverages the integration of different state-of-the-art technologies for feature detection, tracking, and 3D dense stereo map creation. In this paper we present the test flight results of the LandingNav system prototype. Sources of error due to hardware limitations and processing algorithms were identified and are discussed. This paper also shows that addressing the issues identified during the post-flight test data analysis will reduce the error to 1-2%, thus providing a high-precision 3D range map sensor system.

  5. Smoothing-Based Relative Navigation and Coded Aperture Imaging

    NASA Technical Reports Server (NTRS)

    Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher

    2017-01-01

    This project will develop efficient smoothing software for incremental estimation of the relative poses and velocities between multiple small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. This algorithm will be portable between different satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in the case that one or more satellites in the formation become inoperable. It will obtain a solution that approaches an exact solution, as opposed to one with the linearization approximations typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.
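
    The contrast with filtering is easiest to see in a toy problem: a smoother solves one least-squares problem over all relative measurements at once, rather than folding them in sequentially through a linearized update. A 1-D sketch under Gaussian noise assumptions, with the first spacecraft anchored to fix the gauge:

      import numpy as np

      def smooth_relative_positions(n, measurements):
          # measurements: list of (i, j, z, sigma) meaning x_j - x_i = z
          # observed with noise std sigma; returns the least-squares (MAP
          # under Gaussian noise) estimate of all n positions jointly.
          rows = [np.eye(n)[0]]              # anchor x_0 = 0
          rhs = [0.0]
          for i, j, z, sigma in measurements:
              row = np.zeros(n)
              row[j], row[i] = 1.0 / sigma, -1.0 / sigma
              rows.append(row)
              rhs.append(z / sigma)
          x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
          return x

      # Three spacecraft, all pairwise measurements used simultaneously.
      meas = [(0, 1, 10.2, 0.1), (1, 2, 5.1, 0.1), (0, 2, 15.0, 0.2)]
      print(smooth_relative_positions(3, meas))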

  6. Test and evaluation of Japanese GPR-EMI dual sensor systems at Benkovac test site in Croatia

    NASA Astrophysics Data System (ADS)

    Ishikawa, J.; Furuta, K.; Pavković, Nikola

    2007-04-01

    This paper presents an experimental design and the evaluation result of a trial that was carried out from 1 February to 9 March 2006 using real PMA-1A and PMA-2 landmines at the Benkovac test site in Croatia. The objective of the Croatia-Japan joint trial is to evaluate dual sensor systems, which use both ground penetrating radar (GPR) and electromagnetic inductive (EMI) sensors. A comparative trial was also carried out by Croatian deminers using an existing EMI sensor, i.e., a metal detector (MD). The trial aims at evaluating differences in performance between dual sensors and MDs, especially in terms of discrimination of landmines from metal fragments and extension of detectable range in the depth direction. Devices evaluated here are 4 prototypes of anti-personnel landmine detection systems developed under a project of the Japan Science and Technology Agency (JST), the supervising authority of which is the Ministry of Education, Culture, Sports, Science and Technology (MEXT). The prototypes provide operators with subsurface images, and the final decision as to whether a shadow in the image is a real landmine is left to the operator. This is similar to the way that medical doctors find cancer by reading CT images. Since operators' pre-knowledge of locations of buried targets significantly influences the test result, three test lanes, which have 3 different kinds of soils, have been designed to be suitable for blind tests. The result showed that the dual sensor systems have a potential to discriminate landmines from metal fragments and that the probability of detection for small targets in mineralized soils can be improved by using GPR.

  7. The AOLI low-order non-linear curvature wavefront sensor: laboratory and on-sky results

    NASA Astrophysics Data System (ADS)

    Crass, Jonathan; King, David; MacKay, Craig

    2014-08-01

    Many adaptive optics (AO) systems in use today require the use of bright reference objects to determine the effects of atmospheric distortions. Typically these systems use Shack-Hartmann Wavefront sensors (SHWFS) to distribute incoming light from a reference object between a large number of sub-apertures. Guyon et al. evaluated the sensitivity of several different wavefront sensing techniques and proposed the non-linear Curvature Wavefront Sensor (nlCWFS) offering improved sensitivity across a range of orders of distortion. On large ground-based telescopes this can provide nearly 100% sky coverage using natural guide stars. We present work being undertaken on the nlCWFS development for the Adaptive Optics Lucky Imager (AOLI) project. The wavefront sensor is being developed as part of a low-order adaptive optics system for use in a dedicated instrument providing an AO corrected beam to a Lucky Imaging based science detector. The nlCWFS provides a total of four reference images on two photon-counting EMCCDs for use in the wavefront reconstruction process. We present results from both laboratory work using a calibration system and the first on-sky data obtained with the nlCWFS at the 4.2 metre William Herschel Telescope, La Palma. In addition, we describe the updated optical design of the wavefront sensor, strategies for minimising intrinsic effects and methods to maximise sensitivity using photon-counting detectors. We discuss on-going work to develop the high speed reconstruction algorithm required for the nlCWFS technique. This includes strategies to implement the technique on graphics processing units (GPUs) and to minimise computing overheads to obtain a prior for a rapid convergence of the wavefront reconstruction. Finally we evaluate the sensitivity of the wavefront sensor based upon both data and low-photon count strategies.

  8. Fluorescent materials for pH sensing and imaging based on novel 1,4-diketopyrrolo-[3,4-c]pyrrole dyes

    PubMed

    Aigner, Daniel; Ungerböck, Birgit; Mayr, Torsten; Saf, Robert; Klimant, Ingo; Borisov, Sergey M

    2013-09-28

    New optical pH-sensors relying on 1,4-diketopyrrolo-[3,4-c]pyrroles (DPPs) as fluorescent pH-indicators are presented. Different polymer hydrogels are useful as immobilization matrices, achieving excellent sensitivity and good brightness in the resulting sensor. The operational pH can be tuned over a wide range (pH 5-12) by selecting the fine structure of the indicator and the matrix. A ratiometric sensor in the form of nanoparticles is also presented. It is suitable for RGB camera readout, and its practical applicability for fluorescence imaging in microfluidic systems is demonstrated. The indicators are synthesized starting from the commercially available DPP pigments by a straightforward concept employing chlorosulfonation and subsequent reaction with amines. Their sensitivity derives from two distinct mechanisms. At high pH (>9), they exhibit a remarkable alteration of both absorption and fluorescence spectra due to deprotonation of the lactam nitrogen atoms. If a phenolic group is introduced, highly effective fluorescence quenching at near-neutral pH occurs due to photoinduced electron transfer (PET) involving the phenolate form.

  9. Commercialization of Australian advanced infrared technology

    NASA Astrophysics Data System (ADS)

    Redpath, John; Brown, Allen; Woods, William F.

    1995-09-01

    For several decades, the main thrust of infrared technology development in Australia has been in two main sensor technologies: uncooled silicon chip printed bolometric sensors pioneered by DSTO's Kevin Liddiard, and precision-engineered high-quality Cadmium Mercury Telluride developed at DSTO under the guidance of Dr. Richard Hartley. In late 1993 a low-cost infrared imaging device was developed at DSTO as a sensor for guided missiles. The combination of these three innovations made up a unique package that enabled Australian industry to break through the barriers to commercializing infrared technology. The privately owned company R.J. Optronics Pty Ltd undertook the process of re-engineering a selection of these DSTO developments to be applicable to a wide range of infrared products. The first project was a novel infrared imager based on a Palmer scan (translated circle) mechanism. This device applies a spinning wedge and a single detector and uses a video processor to convert the image into a standard rectangular format. Originally developed as an imaging seeker for a stand-off weapon, it produces such high-quality images at such low cost that it is now also being adapted for a wide variety of other military and commercial applications. A technique for electronically stabilizing it has been developed which uses the inertial signals from co-mounted sensors to compensate for platform motions. This enables it to meet the requirements of aircraft, marine vessel and masthead sight applications without the use of gimbals. After tests on a three-axis motion table, several system configurations have now been successfully operated on a number of lightweight platforms, including a Cessna 172 and the Australian-made Seabird Seeker aircraft.

  10. The lucky image-motion prediction for simple scene observation based soft-sensor technology

    NASA Astrophysics Data System (ADS)

    Li, Yan; Su, Yun; Hu, Bin

    2015-08-01

    High resolution is important for Earth remote sensors, but vibration of the sensor platform is a major factor restricting high-resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes to apply soft-sensor technology to image-motion prediction and focuses on algorithm optimization for imaging image-motion prediction. Simulation results indicate that the improved lucky image-motion stabilization algorithm, which combines a back-propagation neural network (BP NN) and a support vector machine (SVM), is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on the soft-sensor technology is below 5%, and the mathematical prediction model trains quickly enough for real-time image stabilization in aerial photography.
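
    A toy version of such a combined BP NN/SVM soft sensor, using scikit-learn and a synthetic motion signal, might look like the sketch below; the paper's actual features and model-combination rule are not specified, so the window length and the simple averaging of the two predictors are assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

# Synthetic "image motion" series: predict the next displacement from the
# last k samples.
rng = np.random.default_rng(0)
motion = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.05 * rng.standard_normal(2000)
k = 8
X = np.lib.stride_tricks.sliding_window_view(motion[:-1], k)
y = motion[k:]

bp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
svm = SVR(kernel="rbf", C=10.0).fit(X, y)
pred = 0.5 * (bp.predict(X[-5:]) + svm.predict(X[-5:]))  # combined soft-sensor output
print("mean absolute prediction error:", np.mean(np.abs(pred - y[-5:])))
```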

  11. Fusion: ultra-high-speed and IR image sensors

    NASA Astrophysics Data System (ADS)

    Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.

    2015-08-01

    Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collision, plasma, spark discharge, an air bag in a car accident and a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure the high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor: each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels, and the image signals stored in each pixel are read out after the image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps 1). However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside-illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency 2). The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the greater freedom of wiring on the front side 3). The BSI structure has the further advantage that additional layers, such as scintillators, can be attached to the backside with little difficulty. This paper proposes the development of an ultra-high-speed IR image sensor that combines advanced nanotechnologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, and discusses the issues involved in this integration.
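
    The in-situ storage principle can be sketched as a per-pixel circular buffer that is overwritten every frame until a trigger stops the capture; the sizes and readout scheme below are illustrative, not the sensor's actual design:

```python
import numpy as np

class InSituStoragePixelArray:
    """Toy model of burst image capture with per-pixel in-situ memory.

    Each pixel owns n_mem storage elements written cyclically every frame;
    on a trigger, writing stops and the stored burst is read out in order.
    """
    def __init__(self, shape=(32, 32), n_mem=128):
        self.mem = np.zeros(shape + (n_mem,))
        self.n_mem = n_mem
        self.head = 0

    def expose(self, frame):
        self.mem[..., self.head] = frame          # all pixels store simultaneously
        self.head = (self.head + 1) % self.n_mem  # overwrite the oldest element

    def read_out(self):
        order = (np.arange(self.n_mem) + self.head) % self.n_mem
        return self.mem[..., order]               # oldest-to-newest burst
```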

  12. Cadastral Audit and Assessments Using Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Cunningham, K.; Walker, G.; Stahlke, E.; Wilson, R.

    2011-09-01

    Ground surveys and remote sensing are integral to establishing fair and equitable property valuations necessary for real property taxation. The International Association of Assessing Officers (IAAO) has embraced aerial and street-view imaging as part of its standards related to property tax assessments and audits. New technologies, including unmanned aerial systems (UAS) paired with imaging sensors, will become more common as local governments work to ensure their cadastre and tax rolls are both accurate and complete. Trends in mapping technology have seen an evolution in platforms from large, expensive manned aircraft to very small, inexpensive UAS. Traditional methods of photogrammetry have also given way to new equipment and sensors: digital cameras, infrared imagers, light detection and ranging (LiDAR) laser scanners, and now synthetic aperture radar (SAR). At the University of Alaska Fairbanks (UAF), we work extensively with unmanned aerial systems equipped with each of these newer sensors. UAF has significant experience flying unmanned systems in the US National Airspace, having begun in 1969 with scientific rockets and expanded to unmanned aircraft in 2003. Ongoing field experience allows UAF to partner effectively with outside organizations to test and develop leading-edge research in UAS and remote sensing. This presentation will discuss our research related to various sensors and payloads for mapping. We will also share our experience with UAS and optical systems for creating some of the first cadastral surveys in rural Alaska.

  13. Development of liquid-environment frequency modulation atomic force microscope with low noise deflection sensor for cantilevers of various dimensions

    NASA Astrophysics Data System (ADS)

    Fukuma, Takeshi; Jarvis, Suzanne P.

    2006-04-01

    We have developed a liquid-environment frequency modulation atomic force microscope (FM-AFM) with a low noise deflection sensor for a wide range of cantilevers with different dimensions. A simple yet accurate equation describing the theoretical limit of the optical beam deflection method in air and liquid is presented. Based on the equation, we have designed a low noise deflection sensor. Replaceable microscope objective lenses are utilized for providing a high magnification optical view (resolution: <3 μm) as well as for focusing a laser beam (laser spot size: ~10 μm). Even for a broad range of cantilevers with lengths from 35 to 125 μm, the sensor provides deflection noise densities of less than 11 fm/√Hz in air and 16 fm/√Hz in water. In particular, a cantilever with a length of 50 μm gives the minimum deflection noise density of 5.7 fm/√Hz in air and 7.3 fm/√Hz in water. True atomic resolution of the developed FM-AFM is demonstrated by imaging mica in water.
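
    The abstract cites the theoretical-limit equation without reproducing it. In generic form — hedged, since the paper's exact expression includes geometry and efficiency factors omitted here — the shot-noise-limited deflection noise density is the shot-noise current density divided by the sensor's deflection sensitivity:

```latex
% Generic shot-noise limit of an optical-beam-deflection sensor:
% noise current density divided by deflection sensitivity.
\[
  n_{z}^{\mathrm{shot}}
  = \frac{i_{\mathrm{shot}}}{S_z}
  = \frac{\sqrt{2 e I_{\mathrm{dc}}}}{\partial I_{\mathrm{diff}} / \partial z}
\]
% e: electron charge; I_dc: total photocurrent on the position-sensitive
% photodiode; dI_diff/dz: change of differential photocurrent per unit
% cantilever deflection. A larger spot displacement per unit deflection
% raises S_z and therefore lowers n_z.
```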

  14. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (Visible/Infrared Sensor Trades, Analyses, and Simulations), combines classical image processing techniques with detailed sensor models to produce static and time-dependent simulations of a variety of sensor systems, including imaging, tracking, and point-target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring two-dimensional array sensors, which can be used for either imaging or point-source detection.
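
    The core of such a static simulation chain can be sketched in a few lines: blur the input scene with the optics PSF, then apply photon (shot) noise and read noise at the detector. The parameter names and values below are illustrative, not those used by VISTAS:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_staring_sensor(scene, psf_sigma=1.5, full_well=30000,
                            read_noise_e=25, seed=0):
    """Minimal static sensor simulation: optics blur + photon and read noise.

    scene: expected photoelectrons per pixel per integration (float array).
    """
    rng = np.random.default_rng(seed)
    blurred = gaussian_filter(scene, psf_sigma)               # optics PSF
    electrons = rng.poisson(np.clip(blurred, 0, full_well))   # shot noise
    electrons = electrons + rng.normal(0, read_noise_e, scene.shape)
    return np.clip(electrons, 0, full_well)                   # clipped signal
```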

  15. Room-temperature bonding of epitaxial layer to carbon-cluster ion-implanted silicon wafers for CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Koga, Yoshihiro; Kadono, Takeshi; Shigematsu, Satoshi; Hirose, Ryo; Onaka-Masada, Ayumi; Okuyama, Ryousuke; Okuda, Hidehiko; Kurita, Kazunari

    2018-06-01

    We propose a fabrication process for silicon wafers that combines carbon-cluster ion implantation and room-temperature bonding for advanced CMOS image sensors. The carbon-cluster ions are made of carbon and hydrogen, which can passivate process-induced defects. We demonstrated that this combination process can be used to form an epitaxial layer on a carbon-cluster ion-implanted Czochralski (CZ)-grown silicon substrate at a high dose of 1 × 10¹⁶ atoms/cm². This implantation condition transforms the top-surface region of the CZ-grown silicon substrate into a thin amorphous layer, on which an epitaxial layer cannot normally be grown. The combination process, however, can form an epitaxial layer on the amorphous surface layer of the implanted substrate. The bonded wafer has strong gettering capability both in the wafer-bonding region and at the projected range of the carbon-cluster implant. Furthermore, the wafer inhibits oxygen out-diffusion from the CZ-grown silicon substrate into the epitaxial layer during device fabrication. Therefore, we believe that this bonded wafer is effective in decreasing the dark current and white-spot defect density of advanced CMOS image sensors.

  16. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.

    PubMed

    Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-12-10

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by the high manufacturing cost and low resolution associated with image sensors based on flip-chip hybridization. One way to enable monolithic integration is to replace expensive, small-scale III-V-based detector chips with narrow-bandgap thin films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QDs) with a tunable absorption peak. A photodiode with a 150 nm-thick absorber in an inverted architecture shows a dark current of 10⁻⁶ A/cm² at -2 V reverse bias and an EQE above 20% at 1440 nm wavelength. Optical modeling indicates that a top-illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling an order-of-magnitude cost reduction for infrared sensors.
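
    For orientation, the reported dark current and EQE imply standard figures of merit via textbook formulas: the responsivity R = EQE·qλ/(hc) and, assuming dark-current shot noise dominates, the specific detectivity D* = R/√(2qJ_d). The calculation below is a back-of-envelope illustration using the paper's headline numbers, not a computation from the paper itself:

```python
import numpy as np

h, c, e = 6.626e-34, 3.0e8, 1.602e-19   # Planck, light speed, electron charge
wavelength = 1440e-9                     # reported absorption peak (m)
eqe = 0.20                               # reported external quantum efficiency
j_dark = 1e-6 * 1e4                      # reported 1e-6 A/cm^2 -> A/m^2

responsivity = eqe * e * wavelength / (h * c)         # A/W, ~0.23 A/W
# Specific detectivity, dark-current shot noise assumed dominant:
d_star = responsivity / np.sqrt(2 * e * j_dark)       # m * Hz^0.5 / W
print(f"R  = {responsivity:.3f} A/W")
print(f"D* = {d_star * 100:.2e} cm*Hz^0.5/W (Jones)")  # SI -> Jones is x100
```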

  17. Visual Image Sensor Organ Replacement: Implementation

    NASA Technical Reports Server (NTRS)

    Maluf, A. David (Inventor)

    2011-01-01

    Method and system for enhancing or extending the visual representation of a selected region of a visual image, where the visual representation is interfered with or distorted, by supplementing the visual signal with at least one audio signal having one or more parameters that represent one or more visual image parameters, such as vertical and/or horizontal location of the region; region brightness; dominant wavelength range of the region; change in a parameter value that characterizes the visual image with respect to a reference parameter value; and time rate of change in a parameter value that characterizes the visual image. Region dimensions can be changed to emphasize the change with time of a visual image parameter.
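
    One possible instance of such a visual-to-audio mapping — the specific choices below (brightness to pitch, horizontal position to stereo pan) are illustrative assumptions, since the patent covers a family of parameter mappings — is:

```python
import numpy as np

def region_to_tone(region, x_center, img_width, sr=44100, dur=0.25,
                   f_lo=220.0, f_hi=880.0):
    """Map a region's brightness and horizontal position to an audio cue.

    region: 2-D luminance array in [0, 1]; x_center: region centre column.
    Brightness sets pitch between f_lo and f_hi; horizontal position sets
    the stereo pan. Returns a stereo buffer of shape (N, 2).
    """
    brightness = float(np.mean(region))            # mean luminance, 0..1
    freq = f_lo + brightness * (f_hi - f_lo)       # brightness -> pitch
    t = np.linspace(0, dur, int(sr * dur), endpoint=False)
    tone = np.sin(2 * np.pi * freq * t)
    pan = x_center / img_width                     # 0 = hard left, 1 = right
    return np.stack([(1 - pan) * tone, pan * tone], axis=1)
```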

  18. TimepixCam: a fast optical imager with time-stamping

    NASA Astrophysics Data System (ADS)

    Fisher-Levine, M.; Nomerotski, A.

    2016-03-01

    We describe a novel fast optical imager, TimepixCam, based on an optimized silicon pixel sensor with a thin entrance window, read out by a Timepix ASIC. TimepixCam is able to record and time-stamp light flashes in excess of 1,000 photons with high quantum efficiency in the 400-1000 nm wavelength range and with 20 ns timing resolution, corresponding to an effective rate of 50 Megaframes per second. The camera was used for imaging ions impinging on a microchannel plate followed by a phosphor screen. Possible applications include spatial and velocity-map imaging of ions in time-of-flight mass spectrometry, coincidence imaging of ions and electrons, and other time-resolved types of imaging spectroscopy.

  19. Wavelength interrogation of fiber Bragg grating sensors using tapered hollow Bragg waveguides.

    PubMed

    Potts, C; Allen, T W; Azar, A; Melnyk, A; Dennison, C R; DeCorby, R G

    2014-10-15

    We describe an integrated system for wavelength interrogation, which uses tapered hollow Bragg waveguides coupled to an image sensor. Spectral shifts are extracted from the wavelength dependence of the light radiated at mode cutoff. Wavelength shifts as small as ~10 pm were resolved by employing a simple peak detection algorithm. Si/SiO₂-based cladding mirrors enable a potential operational range of several hundred nanometers in the 1550 nm wavelength region for a taper length of ~1 mm. Interrogation of a strain-tuned grating was accomplished using a broadband amplified spontaneous emission (ASE) source, and the potential for single-chip interrogation of multiplexed sensor arrays is demonstrated.
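
    A minimal version of such a peak detection step — locating the mode-cutoff radiation peak along the taper with sub-pixel precision and converting it to wavelength via a hypothetical linear calibration — might look like:

```python
import numpy as np

def cutoff_wavelength(profile, px_to_nm=0.35, lambda0=1540.0):
    """Locate the mode-cutoff radiation peak along the taper axis and map
    its sub-pixel position to wavelength.

    profile: 1-D intensity vs. pixel along the taper.
    px_to_nm, lambda0: hypothetical linear calibration constants; the real
    mapping follows from the taper geometry and cladding-mirror design.
    """
    i0 = int(np.argmax(profile))                      # coarse peak location
    lo, hi = max(i0 - 5, 0), min(i0 + 6, len(profile))
    w = profile[lo:hi] - profile[lo:hi].min()         # baseline-subtracted window
    centroid = np.sum(np.arange(lo, hi) * w) / (np.sum(w) + 1e-12)
    return lambda0 + centroid * px_to_nm              # sub-pixel -> wavelength
```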

  20. Characterization of TimepixCam, a fast imager for the time-stamping of optical photons

    NASA Astrophysics Data System (ADS)

    Nomerotski, Andrei; Chakaberia, I.; Fisher-Levine, M.; Janoska, Z.; Takacs, P.; Tsang, T.

    2017-01-01

    We describe the characterization of TimepixCam, a novel camera used to time-stamp optical photons. The camera employs a specialized silicon sensor with a thin entrance window, read out by a Timepix ASIC. TimepixCam is able to record and time-stamp light flashes exceeding 1,000 photons with 15 ns time resolution. Specially produced photodiodes were used to evaluate the quantum efficiency, which was determined to be higher than 90% in the wavelength range of 430-900 nm. The quantum efficiency, sensitivity and ion detection efficiency were compared for a variety of sensors with different surface treatments. Sensors with the thinnest window, 50 nm, had the best performance.
