Sample records for array image sensor

  1. Apparatus and method for imaging metallic objects using an array of giant magnetoresistive sensors

    DOEpatents

    Chaiken, Alison

    2000-01-01

    A portable, low-power, metallic object detector and method for providing an image of a detected metallic object. In one embodiment, the present portable low-power metallic object detector comprises an array of giant magnetoresistive (GMR) sensors. The array of GMR sensors is adapted for detecting the presence of and compiling image data of a metallic object. In the embodiment, the array of GMR sensors is arranged in a checkerboard configuration such that the axes of sensitivity of alternate GMR sensors are orthogonally oriented. An electronics portion is coupled to the array of GMR sensors. The electronics portion is adapted to receive and process the image data of the metallic object compiled by the array of GMR sensors. The embodiment also includes a display unit coupled to the electronics portion. The display unit is adapted to display a graphical representation of the metallic object detected by the array of GMR sensors. In so doing, a graphical representation of the detected metallic object is provided.

  2. A mobile ferromagnetic shape detection sensor using a Hall sensor array and magnetic imaging.

    PubMed

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a mobile Hall sensor array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the mobile Hall sensor array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array. The magnetic imaging of the magnetic flux distribution is performed by a signal processing unit before the real-time images are displayed on a netbook. Signal processing application software is developed for 1-D Hall sensor array signal acquisition and processing to construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as square, round, and triangular pieces are determined through magnetic images based on the voltage signal and magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are also shown to demonstrate the functionality of the mobile Hall sensor array system for actual shape detection. The results prove that the mobile Hall sensor array system is able to perform magnetic imaging to identify various ferromagnetic materials.
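
    A minimal sketch of how such a scan could be assembled in software, assuming a hypothetical read_hall_line() acquisition routine that returns one line of Hall voltages each time the array advances; it only illustrates stacking 1-D readings into a 2-D matrix and removing the quiescent level, not the authors' actual processing chain.

```python
import numpy as np

def read_hall_line(n_sensors: int) -> np.ndarray:
    """Hypothetical stand-in for one 1-D Hall sensor array reading (volts)."""
    return np.random.normal(2.5, 0.01, n_sensors)  # placeholder data

def scan_magnetic_image(n_lines: int, n_sensors: int) -> np.ndarray:
    """Stack successive 1-D readings into a 2-D matrix as the array is moved."""
    image = np.empty((n_lines, n_sensors))
    for row in range(n_lines):
        image[row, :] = read_hall_line(n_sensors)
    # Subtract the quiescent (no-specimen) level so flux-leakage perturbations stand out.
    return image - np.median(image)

magnetic_image = scan_magnetic_image(n_lines=64, n_sensors=16)
```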

  3. A Mobile Ferromagnetic Shape Detection Sensor Using a Hall Sensor Array and Magnetic Imaging

    PubMed Central

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a Mobile Hall Sensor Array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the Mobile Hall Sensor Array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array. The magnetic imaging of the magnetic flux distribution is performed by a signal processing unit before the real-time images are displayed on a netbook. Signal processing application software is developed for 1-D Hall sensor array signal acquisition and processing to construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as square, round, and triangular pieces are determined through magnetic images based on the voltage signal and magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are also shown to demonstrate the functionality of the Mobile Hall Sensor Array system for actual shape detection. The results prove that the Mobile Hall Sensor Array system is able to perform magnetic imaging to identify various ferromagnetic materials. PMID:22346653

  4. A programmable computational image sensor for high-speed vision

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Shi, Cong; Long, Xitian; Wu, Nanjian

    2013-08-01

    In this paper we present a programmable computational image sensor for high-speed vision. This computational image sensor contains four main blocks: an image pixel array, a massively parallel processing element (PE) array, a row processor (RP) array, and a RISC core. The pixel-parallel PE array is responsible for transferring, storing, and processing raw image data in a SIMD fashion with its own programming language. The RPs form a one-dimensional array of simplified RISC cores that can carry out complex arithmetic and logic operations. The PE array and RP array can complete a large amount of computation in few instruction cycles and therefore satisfy low- and mid-level high-speed image processing requirements. The RISC core controls the operation of the whole system and executes some high-level image processing algorithms. We utilize a simplified AHB bus as the system bus to connect the major components. A programming language and corresponding tool chain for this computational image sensor have also been developed.
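
    To make the division of labour concrete, here is a rough NumPy emulation of the three-stage dataflow described in the abstract (pixel-parallel PE stage, row-processor stage, scalar RISC stage). The actual chip uses its own instruction set and programming language; the operations chosen below (threshold, row count, arg-max) are only illustrative.

```python
import numpy as np

def pe_stage(raw: np.ndarray) -> np.ndarray:
    """Pixel-parallel (SIMD-like) stage: an elementwise operation on every pixel,
    here a simple global-mean threshold."""
    return (raw > raw.mean()).astype(np.uint8)

def rp_stage(binary: np.ndarray) -> np.ndarray:
    """Row-processor stage: one result per image row, here a count of active pixels."""
    return binary.sum(axis=1)

def risc_stage(row_counts: np.ndarray) -> int:
    """Scalar control stage: a high-level decision from the reduced data,
    here the index of the row with the strongest response."""
    return int(row_counts.argmax())

frame = np.random.randint(0, 256, (128, 128))
print(risc_stage(rp_stage(pe_stage(frame))))
```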

  5. Imaging optical sensor arrays.

    PubMed

    Walt, David R

    2002-10-01

    Imaging optical fibres have been etched to prepare microwell arrays. These microwells have been loaded with sensing materials such as bead-based sensors and living cells to create high-density sensor arrays. The extremely small sizes and volumes of the wells enable high sensitivity and high information content sensing capabilities.

  6. Geiger-Mode Avalanche Photodiode Arrays Integrated to All-Digital CMOS Circuits

    DTIC Science & Technology

    2016-01-20

    Excerpt fragments (figure captions and text): "4×4 GMAPD array wire bonded to CMOS timing circuits" (Figure 7); "Low-fill-factor APD design used in lidar sensors" (Figure 8); the APD doping … epitaxial growth, and the pixels are isolated by mesa etch; 128×32 lidar image sensors were built by bump bonding the APD arrays to a CMOS timing …; … passive image sensor with this large a format based on hybridization of a GMAPD array to a CMOS readout; Fig. 14 shows one of the first images taken.

  7. Vision communications based on LED array and imaging sensor

    NASA Astrophysics Data System (ADS)

    Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    In this paper, we propose a brand new communication concept, called "vision communication", based on an LED array and an image sensor. The system consists of an LED array as the transmitter and a digital device that includes an image sensor, such as a CCD or CMOS sensor, as the receiver. To transmit data, the proposed communication scheme simultaneously uses digital image processing and optical wireless communication techniques. A cognitive communication scheme is therefore possible with the help of recognition techniques used in vision systems. To increase the data rate, our scheme can use an LED array consisting of several multi-spectral LEDs. Because each LED in the array can emit a multi-spectral optical signal such as visible, infrared, and ultraviolet light, an increase in data rate is possible, similar to the WDM and MIMO techniques used in traditional optical and wireless communications. In addition, this multi-spectral capability also makes it possible to avoid optical noise in the communication environment. In our vision communication scheme, the data packet is composed of Sync. data and information data. The Sync. data is used to detect the transmitter area and calibrate the distorted image snapshots obtained by the image sensor. By matching the optical rate of the LED array to the frame rate (frames per second) of the image sensor, we can decode the information data included in each image snapshot based on image processing and optical wireless communication techniques. Through experiments on a practical test-bed system, we confirm the feasibility of the proposed vision communications based on an LED array and an image sensor.
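
    A minimal decoding sketch under simplifying assumptions: the transmitter region has already been located and rectified using the Sync. data, each LED maps to a known pixel block, and one frame is captured per symbol period (optical rate matched to the frame rate). The block coordinates and threshold are hypothetical parameters, not values from the paper.

```python
import numpy as np

def decode_frame(frame: np.ndarray, led_blocks, threshold: float) -> list:
    """Return one bit per LED by thresholding the mean intensity of its pixel block."""
    return [int(frame[r0:r1, c0:c1].mean() > threshold)
            for (r0, r1, c0, c1) in led_blocks]

def decode_sequence(frames, led_blocks, threshold=128.0):
    """Decode the information data carried by a sequence of image snapshots."""
    return [decode_frame(f, led_blocks, threshold) for f in frames]

# Example: two LEDs occupying 10x10 pixel blocks in the rectified transmitter image.
led_blocks = [(0, 10, 0, 10), (0, 10, 20, 30)]
```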

  8. Reduced signal crosstalk multi neurotransmitter image sensor by microhole array structure

    NASA Astrophysics Data System (ADS)

    Ogaeri, Yuta; Lee, You-Na; Mitsudome, Masato; Iwata, Tatsuya; Takahashi, Kazuhiro; Sawada, Kazuaki

    2018-06-01

    A microhole array structure combined with an enzyme immobilization method using magnetic beads can enhance the target discernment capability of a multi-neurotransmitter image sensor. Here we report the fabrication of the sensor with the array structure and the evaluation of its H+-diffusion-preventing capability. The structure, formed in SU-8 photoresist, has holes with a size of 24.5 × 31.6 µm². Sensors were prepared with array structures of three different heights: 0, 15, and 60 µm. With the 60-µm-high structure, the output voltage measured at an H+-sensitive null pixel located 75 µm from the acetylcholinesterase (AChE)-immobilized pixel, which is the starting point of H+ diffusion, is reduced by 48%. The suppressed H+ migration is shown in a two-dimensional (2D) image in real time. The sensor parameters, such as the height of the array structure and the measuring time, are optimized experimentally. The sensor is expected to effectively distinguish various neurotransmitters in biological samples.

  9. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  10. High-resolution dynamic pressure sensor array based on piezo-phototronic effect tuned photoluminescence imaging.

    PubMed

    Peng, Mingzeng; Li, Zhou; Liu, Caihong; Zheng, Qiang; Shi, Xieqing; Song, Ming; Zhang, Yang; Du, Shiyu; Zhai, Junyi; Wang, Zhong Lin

    2015-03-24

    A high-resolution dynamic tactile/pressure display is indispensable to the comprehensive perception of force/mechanical stimulation in applications such as electronic skin, biomechanical imaging/analysis, and personalized signatures. Here, we present a dynamic pressure sensor array based on pressure/strain-tuned photoluminescence imaging without the need for electricity. Each sensor is a nanopillar that consists of InGaN/GaN multiple quantum wells. Its photoluminescence intensity can be modulated dramatically and linearly by small strain (0-0.15%) owing to the piezo-phototronic effect. The sensor array has a high pixel density of 6350 dpi and an exceptionally small standard deviation of photoluminescence. High-quality tactile/pressure sensing distributions can be recorded in real time by parallel photoluminescence imaging without any cross-talk. The sensor array can be inexpensively fabricated over large areas by semiconductor product lines. The proposed dynamic all-optical pressure imaging with excellent resolution, high sensitivity, good uniformity, and ultrafast response time offers a suitable approach for smart sensing and micro/nano-opto-electromechanical systems.
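
    A hedged sketch of how the reported linear PL-strain relation could be used to form a strain (and hence pressure) map: per-pixel relative PL change divided by a calibrated sensitivity. The sensitivity k and the unstrained reference image are calibration inputs assumed for this sketch, not values taken from the paper.

```python
import numpy as np

def strain_map(pl_image: np.ndarray, pl_reference: np.ndarray, k: float) -> np.ndarray:
    """Per-pixel strain estimate from the relative photoluminescence change,
    assuming a linear response with calibrated sensitivity k (per unit strain)."""
    return (pl_image / pl_reference - 1.0) / k

# Example: with k = 10 per unit strain, a 1% PL change maps to 0.1% strain,
# within the 0-0.15% range quoted above.
```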

  11. Microwave Sensors for Breast Cancer Detection

    PubMed Central

    2018-01-01

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve 5-year survival rates significantly. Microwave breast imaging has been reported as having the greatest potential to become an alternative or complementary tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array, and the size of the sensor. In fact, the microwave sensor array and sensor play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed in the manuscript. PMID:29473867

  12. Microwave Sensors for Breast Cancer Detection.

    PubMed

    Wang, Lulu

    2018-02-23

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve 5-year survival rates significantly. Microwave breast imaging has been reported as having the greatest potential to become an alternative or complementary tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array, and the size of the sensor. In fact, the microwave sensor array and sensor play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, and current challenges and future work are also discussed in the manuscript.

  13. Solid state image sensing arrays

    NASA Technical Reports Server (NTRS)

    Sadasiv, G.

    1972-01-01

    The fabrication of a photodiode-transistor image sensor array in silicon and tests on individual elements of the array are described, along with a design for a scanning system for an image sensor array. The spectral response of p-n junctions was used as a technique for studying the optical absorption edge in silicon. Heterojunction structures of Sb2S3-Si were fabricated, and a system for measuring C-V curves on MOS structures was built.

  14. Human eye-inspired soft optoelectronic device using high-density MoS2-graphene curved image sensor array.

    PubMed

    Choi, Changsoon; Choi, Moon Kee; Liu, Siyi; Kim, Min Sung; Park, Ok Kyu; Im, Changkyun; Kim, Jaemin; Qin, Xiaoliang; Lee, Gil Ju; Cho, Kyoung Won; Kim, Myungbin; Joh, Eehyung; Lee, Jongha; Son, Donghee; Kwon, Seung-Hae; Jeon, Noo Li; Song, Young Min; Lu, Nanshu; Kim, Dae-Hyeong

    2017-11-21

    Soft bioelectronic devices provide new opportunities for next-generation implantable devices owing to their soft mechanical nature, which leads to minimal tissue damage and immune response. However, a soft form of implantable optoelectronic device for optical sensing and retinal stimulation has not been developed yet because of the bulkiness and rigidity of conventional imaging modules and their constituent materials. Here, we describe a high-density, hemispherically curved image sensor (CurvIS) array that leverages the atomically thin MoS2-graphene heterostructure and strain-releasing device designs. The hemispherically curved image sensor array exhibits infrared blindness and successfully acquires pixelated optical signals. We corroborate the validity of the proposed soft materials and ultrathin device designs through theoretical modeling and finite element analysis. Then, we propose the ultrathin hemispherically curved image sensor array as a promising imaging element in the soft retinal implant. The CurvIS array is applied as a human eye-inspired soft implantable optoelectronic device that can detect optical signals and apply programmed electrical stimulation to optic nerves with minimal mechanical side effects to the retina.

  15. Mapping Capacitive Coupling Among Pixels in a Sensor Array

    NASA Technical Reports Server (NTRS)

    Seshadri, Suresh; Cole, David M.; Smith, Roger M.

    2010-01-01

    An improved method of mapping the capacitive contribution to cross-talk among pixels in an imaging array of sensors (typically, an imaging photodetector array) has been devised for use in calibrating and/or characterizing such an array. The method involves a sequence of resets of subarrays of pixels to specified voltages and measurement of the voltage responses of neighboring non-reset pixels.
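
    An illustrative reading of the described procedure: a subarray is reset by a known voltage step while neighbouring pixels are left untouched, and the fractional voltage change induced on the non-reset pixels estimates the capacitive coupling. The array names and the simple ratio used here are assumptions for the sketch, not the calibrated procedure itself.

```python
import numpy as np

def coupling_map(before: np.ndarray, after: np.ndarray,
                 reset_mask: np.ndarray, reset_step_v: float) -> np.ndarray:
    """Fractional coupling per non-reset pixel: induced voltage change divided by
    the applied reset step. Reset pixels are marked NaN so only neighbours remain."""
    response = (after.astype(np.float64) - before) / float(reset_step_v)
    response[reset_mask] = np.nan
    return response
```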

  16. Toroidal sensor arrays for real-time photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Bychkov, Anton S.; Cherepetskaya, Elena B.; Karabutov, Alexander A.; Makarov, Vladimir A.

    2017-07-01

    This article addresses theoretical and numerical investigation of image formation in photoacoustic (PA) imaging with complex-shaped concave sensor arrays. The spatial resolution and the size of sensitivity region of PA and laser ultrasonic (LU) imaging systems are assessed using sensitivity maps and spatial resolution maps in the image plane. This paper also discusses the relationship between the size of high-sensitivity regions and the spatial resolution of real-time imaging systems utilizing toroidal arrays. It is shown that the use of arrays with toroidal geometry significantly improves the diagnostic capabilities of PA and LU imaging to investigate biological objects, rocks, and composite materials.

  17. CMOS imager for pointing and tracking applications

    NASA Technical Reports Server (NTRS)

    Sun, Chao (Inventor); Pain, Bedabrata (Inventor); Yang, Guang (Inventor); Heynssens, Julie B. (Inventor)

    2006-01-01

    Systems and techniques to realize pointing and tracking applications with CMOS imaging devices. In general, in one implementation, the technique includes: sampling multiple rows and multiple columns of an active pixel sensor array into a memory array (e.g., an on-chip memory array), and reading out the multiple rows and multiple columns sampled in the memory array to provide image data with reduced motion artifact. Various operation modes may be provided, including TDS, CDS, CQS, a tracking mode to read out multiple windows, and/or a mode employing a sample-first-read-later readout scheme. The tracking mode can take advantage of a diagonal switch array. The diagonal switch array, the active pixel sensor array and the memory array can be integrated onto a single imager chip with a controller. This imager device can be part of a larger imaging system for both space-based applications and terrestrial applications.

  18. High-Resolution Spin-on-Patterning of Perovskite Thin Films for a Multiplexed Image Sensor Array.

    PubMed

    Lee, Woongchan; Lee, Jongha; Yun, Huiwon; Kim, Joonsoo; Park, Jinhong; Choi, Changsoon; Kim, Dong Chan; Seo, Hyunseon; Lee, Hakyong; Yu, Ji Woong; Lee, Won Bo; Kim, Dae-Hyeong

    2017-10-01

    Inorganic-organic hybrid perovskite thin films have attracted significant attention as an alternative to silicon in photon-absorbing devices, mainly because of their superb optoelectronic properties. However, high-definition patterning of perovskite thin films, which is important for fabrication of the image sensor array, is hardly accomplished owing to their extreme instability in common photolithographic solvents. Here, a novel patterning process for perovskite thin films is described: the high-resolution spin-on-patterning (SoP) process. This fast and facile process is compatible with a variety of spin-coated perovskite materials and perovskite deposition techniques. The SoP process is successfully applied to develop a high-performance, ultrathin, and deformable perovskite-on-silicon multiplexed image sensor array, paving the way toward next-generation image sensor arrays.

  19. CMOS Imaging of Pin-Printed Xerogel-Based Luminescent Sensor Microarrays.

    PubMed

    Yao, Lei; Yung, Ka Yi; Khan, Rifat; Chodavarapu, Vamsy P; Bright, Frank V

    2010-12-01

    We present the design and implementation of a luminescence-based miniaturized multisensor system using pin-printed xerogel materials that act as host media for chemical recognition elements. We developed a CMOS imager integrated circuit (IC) to image the luminescence response of the xerogel-based sensor array. The imager IC uses a 26 × 20 (520 elements) array of active pixel sensors, and each active pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. The imager includes a correlated double sampling circuit and a pixel address/digital control circuit; the image data are read out as a coded serial signal. The sensor system uses a light-emitting diode (LED) to excite the target-analyte-responsive luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 4 × 4 (16 elements) array of oxygen (O2) sensors. Each group of 4 sensor elements in the array (arranged in a row) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a strategic mix of two oxygen-sensitive luminophores ([Ru(dpp)3]2+ and [Ru(bpy)3]2+) in each pin-printed xerogel sensor element. The CMOS imager consumes an average power of 8 mW operating at a 1 kHz sampling frequency driven at 5 V. The developed prototype system demonstrates a low-cost and miniaturized luminescence multisensor system.
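
    A hedged sketch of the intensity read-out for one sensor element. The abstract does not state the calibration model; a Stern-Volmer relation, commonly used for [Ru(dpp)3]2+-type luminophores, is assumed here, with per-element constants obtained from calibration.

```python
def o2_from_intensity(i_measured: float, i_zero_o2: float, k_sv: float) -> float:
    """Assumed Stern-Volmer model: I0/I = 1 + Ksv*[O2]  =>  [O2] = (I0/I - 1)/Ksv."""
    return (i_zero_o2 / i_measured - 1.0) / k_sv

# Each element carries its own (i_zero_o2, k_sv) pair because the elements use
# different luminophore mixes; per-element estimates can then be combined (e.g. averaged).
```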

  20. Microscopy imaging device with advanced imaging properties

    DOEpatents

    Ghosh, Kunal; Burns, Laurie; El Gamal, Abbas; Schnitzer, Mark J.; Cocker, Eric; Ho, Tatt Wei

    2015-11-24

    Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm² and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.

  1. Microscopy imaging device with advanced imaging properties

    DOEpatents

    Ghosh, Kunal; Burns, Laurie; El Gamal, Abbas; Schnitzer, Mark J.; Cocker, Eric; Ho, Tatt Wei

    2016-10-25

    Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm² and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.

  2. Microscopy imaging device with advanced imaging properties

    DOEpatents

    Ghosh, Kunal; Burns, Laurie; El Gamal, Abbas; Schnitzer, Mark J.; Cocker, Eric; Ho, Tatt Wei

    2016-11-22

    Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm² and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.

  3. Microscopy imaging device with advanced imaging properties

    DOEpatents

    Ghosh, Kunal; Burns, Laurie; El Gamal, Abbas; Schnitzer, Mark J.; Cocker, Eric; Ho, Tatt Wei

    2017-04-25

    Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm² and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.

  4. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. The first has the camera image focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The second plenoptic form has the lenslet array relaying the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to produce a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  5. Integrated sensor with frame memory and programmable resolution for light adaptive imaging

    NASA Technical Reports Server (NTRS)

    Zhou, Zhimin (Inventor); Fossum, Eric R. (Inventor); Pain, Bedabrata (Inventor)

    2004-01-01

    An image sensor operable to vary the output spatial resolution according to a received light level while maintaining a desired signal-to-noise ratio. Signals from neighboring pixels in a pixel patch with an adjustable size are added to increase both the image brightness and signal-to-noise ratio. One embodiment comprises a sensor array for receiving input signals, a frame memory array for temporarily storing a full frame, and an array of self-calibration column integrators for uniform column-parallel signal summation. The column integrators are capable of substantially canceling fixed pattern noise.
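
    A small sketch of the adaptive-binning idea behind such a sensor: pick the pixel-patch size from the light level so that a target signal-to-noise ratio is reached (assuming shot-noise-limited operation, where summing an n x n patch scales SNR by n), then sum neighbouring pixels. The function names and the shot-noise assumption are illustrative, not the patented circuit implementation.

```python
import numpy as np

def choose_patch_size(mean_signal_electrons: float, target_snr: float, max_patch: int = 8) -> int:
    """Smallest n such that an n x n sum reaches the target SNR (shot-noise model)."""
    for n in range(1, max_patch + 1):
        if n * np.sqrt(mean_signal_electrons) >= target_snr:
            return n
    return max_patch

def bin_image(img: np.ndarray, n: int) -> np.ndarray:
    """Sum non-overlapping n x n patches, trimming edges that do not divide evenly."""
    h, w = (img.shape[0] // n) * n, (img.shape[1] // n) * n
    return img[:h, :w].reshape(h // n, n, w // n, n).sum(axis=(1, 3))

frame = np.random.poisson(50, (480, 640)).astype(np.float64)
binned = bin_image(frame, choose_patch_size(frame.mean(), target_snr=30.0))
```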

  6. Sparsely-Bonded CMOS Hybrid Imager

    NASA Technical Reports Server (NTRS)

    Sun, Chao (Inventor); Jones, Todd J. (Inventor); Nikzad, Shouleh (Inventor); Newton, Kenneth W. (Inventor); Cunningham, Thomas J. (Inventor); Hancock, Bruce R. (Inventor); Dickie, Matthew R. (Inventor); Hoenk, Michael E. (Inventor); Wrigley, Christopher J. (Inventor); Pain, Bedabrata (Inventor)

    2015-01-01

    A method and device for imaging or detecting electromagnetic radiation is provided. A device structure includes a first chip interconnected with a second chip. The first chip includes a detector array, wherein the detector array comprises a plurality of light sensors and one or more transistors. The second chip includes a Read Out Integrated Circuit (ROIC) that reads out, via the transistors, a signal produced by the light sensors. A number of interconnects between the ROIC and the detector array can be less than one per light sensor or pixel.

  7. Contact CMOS imaging of gaseous oxygen sensor array

    PubMed Central

    Daivasagaya, Daisy S.; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C.; Chodavarapu, Vamsy P.; Bright, Frank V.

    2014-01-01

    We describe a compact luminescent gaseous oxygen (O2) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter and placed on a low power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O2-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline) ruthenium(II) ([Ru(dpp)3]2+) encapsulated within sol–gel derived xerogel thin films. The polymeric optical filter is made with polydimethylsiloxane (PDMS) that is mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals on to the imager pixels. The molded PDMS membrane is then attached with the PDMS color filter. The xerogel sensor arrays are contact printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. Correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data is read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors. PMID:24493909

  8. Contact CMOS imaging of gaseous oxygen sensor array.

    PubMed

    Daivasagaya, Daisy S; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-10-01

    We describe a compact luminescent gaseous oxygen (O2) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter and placed on a low power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O2-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline) ruthenium(II) ([Ru(dpp)3]2+) encapsulated within sol-gel derived xerogel thin films. The polymeric optical filter is made with polydimethylsiloxane (PDMS) that is mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals on to the imager pixels. The molded PDMS membrane is then attached with the PDMS color filter. The xerogel sensor arrays are contact printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. Correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data is read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors.

  9. Range imaging pulsed laser sensor with two-dimensional scanning of transmitted beam and scanless receiver using high-aspect avalanche photodiode array for eye-safe wavelength

    NASA Astrophysics Data System (ADS)

    Tsuji, Hidenobu; Imaki, Masaharu; Kotake, Nobuki; Hirai, Akihito; Nakaji, Masaharu; Kameyama, Shumpei

    2017-03-01

    We demonstrate a range imaging pulsed laser sensor with two-dimensional scanning of a transmitted beam and a scanless receiver using a high-aspect avalanche photodiode (APD) array for the eye-safe wavelength. The system achieves a high frame rate and long-range imaging with a relatively simple sensor configuration. We developed a high-aspect APD array for the wavelength of 1.5 μm, a receiver integrated circuit, and a range and intensity detector. By combining these devices, we realized 160×120 pixels range imaging with a frame rate of 8 Hz at a distance of about 50 m.
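
    The pulsed time-of-flight conversion implied by such a sensor is range = c·t/2, applied per pixel to the measured round-trip delays; the sketch below is a generic illustration, not the sensor's actual signal chain.

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def range_image(round_trip_delay_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip pulse delays (seconds) to range (metres)."""
    return 0.5 * C * round_trip_delay_s

# Example: a round-trip delay of about 333 ns corresponds to roughly 50 m,
# the distance quoted above, over a 160 x 120 pixel delay map.
delays = np.full((120, 160), 333e-9)
ranges = range_image(delays)
```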

  10. Wave analysis of a plenoptic system and its applications

    NASA Astrophysics Data System (ADS)

    Shroff, Sapna A.; Berkner, Kathrin

    2013-03-01

    Traditional imaging systems directly image a 2D object plane on to the sensor. Plenoptic imaging systems contain a lenslet array at the conventional image plane and a sensor at the back focal plane of the lenslet array. In this configuration the data captured at the sensor is not a direct image of the object. Each lenslet effectively images the aperture of the main imaging lens at the sensor. Therefore the sensor data retains angular light-field information which can be used for a posteriori digital computation of multi-angle images and axially refocused images. If a filter array, containing spectral filters or neutral density or polarization filters, is placed at the pupil aperture of the main imaging lens, then each lenslet images the filters on to the sensor. This enables the digital separation of multiple filter modalities giving single snapshot, multi-modal images. Due to the diversity of potential applications of plenoptic systems, their investigation is increasing. As the application space moves towards microscopes and other complex systems, and as pixel sizes become smaller, the consideration of diffraction effects in these systems becomes increasingly important. We discuss a plenoptic system and its wave propagation analysis for both coherent and incoherent imaging. We simulate a system response using our analysis and discuss various applications of the system response pertaining to plenoptic system design, implementation and calibration.

  11. High-density Schottky barrier IRCCD sensors for remote sensing applications

    NASA Astrophysics Data System (ADS)

    Elabd, H.; Tower, J. R.; McCarthy, B. M.

    1983-01-01

    It is pointed out that the ambitious goals envisaged for the next generation of space-borne sensors challenge the state-of-the-art in solid-state imaging technology. Studies are being conducted with the aim to provide focal plane array technology suitable for use in future Multispectral Linear Array (MLA) earth resource instruments. An important new technology for IR-image sensors involves the use of monolithic Schottky barrier infrared charge-coupled device arrays. This technology is suitable for earth sensing applications in which moderate quantum efficiency and intermediate operating temperatures are required. This IR sensor can be fabricated by using standard integrated circuit (IC) processing techniques, and it is possible to employ commercial IC grade silicon. For this reason, it is feasible to construct Schottky barrier area and line arrays with large numbers of elements and high-density designs. A Pd2Si Schottky barrier sensor for multispectral imaging in the 1 to 3.5 micron band is under development.

  12. Wide-field microscopy using microcamera arrays

    NASA Astrophysics Data System (ADS)

    Marks, Daniel L.; Youn, Seo Ho; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-02-01

    A microcamera is a relay lens paired with image sensors. Microcameras are grouped into arrays to relay overlapping views of a single large surface to the sensors to form a continuous synthetic image. The imaged surface may be curved or irregular, as each camera may independently be dynamically focused to a different depth. Microcamera arrays are akin to microprocessors in supercomputers in that both join individual processors by an optoelectronic routing fabric to increase capacity and performance. A microcamera may image ten or more megapixels and be grouped into an array of several hundred, as has already been demonstrated by the DARPA AWARE Wide-Field program with multiscale gigapixel photography. We adapt gigapixel microcamera array architectures to wide-field microscopy of irregularly shaped surfaces to image areas of more than 1000 square millimeters at resolutions of 3 microns or better in a single snapshot. The system includes a novel relay design, a sensor electronics package, and an FPGA-based networking fabric. Biomedical applications of this include screening for skin lesions, wide-field and resolution-agile microsurgical imaging, and microscopic cytometry of millions of cells performed in situ.

  13. Miniature infrared hyperspectral imaging sensor for airborne applications

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-05-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both MWIR and LWIR versions, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and will be explained in this paper. The micro-optics consist of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper will present our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough to serve as a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.

  14. Optical design of microlens array for CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Zhang, Rongzhu; Lai, Liping

    2016-10-01

    The optical crosstalk between pixel units can degrade the image quality of a CMOS image sensor. At the same time, the duty ratio (fill factor) of a CMOS sensor is low because of its pixel structure. These two factors cause the low detection sensitivity of CMOS sensors. In order to reduce the optical crosstalk and improve the fill factor of a CMOS image sensor, a microlens array has been designed and integrated with the CMOS sensor. The initial parameters of the microlens array were calculated according to the structure of the CMOS sensor. The parameters were then optimized using ZEMAX, and microlens arrays with different substrate thicknesses were compared. The results show that optical crosstalk is minimized, and the best imaging quality obtained, when the distance between the microlens array and the CMOS sensor is about 19.3 μm; when incident light passes through the microlens array and this gap, the minimum spot obtained in the active area is around 0.347 μm. In addition, for light incidence angles of 0° to 22°, the microlens array has an obvious inhibitory effect on the optical crosstalk, and the anti-crosstalk distance between the microlens array and the CMOS sensor ranges from 0 μm to 162 μm.

  15. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    NASA Technical Reports Server (NTRS)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.

  16. Range and egomotion estimation from compound photodetector arrays with parallel optical axis using optical flow techniques.

    PubMed

    Chahl, J S

    2014-01-20

    This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit some complementary characteristics with respect to conventional perspective projection or angular projection imaging devices. Conventional imaging devices measure rotational egomotion directly by measuring the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices because the induced image motion depends on the unknown range of the viewed object. On the other hand, a known translational motion generates image velocities which can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. An approximate parallel projection can be used to measure translational egomotion in terms of the velocity of the image. On the other hand, a known rotational motion of the paraxial sensor array generates image velocities, which can be used to recover the 3D structure of the environment. Results of tests of an experimental array confirm these properties.
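
    A worked illustration of the duality described above, under a small-angle, narrow-field approximation and with the flow expressed in metric units across the array: with parallel projection, translational egomotion maps directly to image velocity, while a known rotation induces flow proportional to object range. These two helper functions are only a sketch of those relations, not the paper's estimator.

```python
def translation_from_flow(parallel_flow_m_per_s: float) -> float:
    """Parallel projection: the lateral image velocity (metres/s across the array)
    directly equals the translational egomotion component."""
    return parallel_flow_m_per_s

def range_from_rotational_flow(parallel_flow_m_per_s: float, omega_rad_per_s: float) -> float:
    """With a known rotation rate omega, flow is approximately omega * range,
    so range is approximately flow / omega."""
    return parallel_flow_m_per_s / omega_rad_per_s

# Example: 0.2 m/s of flow during a known 0.1 rad/s rotation implies a range of about 2 m.
```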

  17. Autonomous Sensors for Large Scale Data Collection

    NASA Astrophysics Data System (ADS)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" that remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. By incorporating recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification, this system achieves cost savings in manufacturing, deployment, and lifetime operating costs. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can be folded into the data collection and analysis architecture easily, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India, for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments; the limited size, weight, and power (SWaP) and the challenging thermal environment demand development of a new generation of instruments, and the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the ground to complement the CubeSat data.

  18. A time-resolved image sensor for tubeless streak cameras

    NASA Astrophysics Data System (ADS)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, the device requires high voltage and a bulky system owing to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel was designed and implemented using 0.11-µm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 µm.

  19. Color filter array pattern identification using variance of color difference image

    NASA Astrophysics Data System (ADS)

    Shin, Hyun Jun; Jeon, Jong Ju; Eom, Il Kyu

    2017-07-01

    A color filter array is placed on the image sensor of a digital camera to acquire color images. Each pixel uses only one color, since the image sensor can measure only one color per pixel. Therefore, empty pixels are filled using an interpolation process called demosaicing. The original and the interpolated pixels have different statistical characteristics. If the image is modified by manipulation or forgery, the color filter array pattern is altered. This pattern change can be a clue for image forgery detection. However, most forgery detection algorithms have the disadvantage of assuming the color filter array pattern. We present a method for identifying the color filter array pattern. Initially, the local mean is eliminated to remove the background effect. Subsequently, the color difference block is constructed to emphasize the difference between the original pixels and the interpolated pixels. The variance of the color difference image is proposed as a means of estimating the color filter array configuration. The experimental results show that the proposed method is effective in identifying the color filter array pattern. Compared with conventional methods, our method provides superior performance.
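
    A hedged sketch of the general idea rather than the paper's exact algorithm: interpolated pixels are smoother than originally sensed ones, so the variance of a local-mean-removed green channel, evaluated on each candidate green lattice, can indicate which diagonal of the Bayer pattern held the sensed green samples. Disambiguating the red/blue positions would proceed analogously on color-difference planes; the local-mean filter and decision rule below are assumptions.

```python
import numpy as np

def remove_local_mean(img: np.ndarray) -> np.ndarray:
    """Subtract a 4-neighbour local mean to suppress the image background."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    local_mean = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0
    return img - local_mean

def lattice_variance(residual: np.ndarray, parity: int) -> float:
    """Variance of the residual restricted to pixels with (row + col) % 2 == parity."""
    rows, cols = np.indices(residual.shape)
    return float(np.var(residual[(rows + cols) % 2 == parity]))

def identify_green_parity(green_channel: np.ndarray) -> int:
    """Return the (row + col) parity assumed to hold sensed green samples:
    the lattice with the larger residual variance (interpolated sites are smoother)."""
    residual = remove_local_mean(green_channel)
    return 0 if lattice_variance(residual, 0) > lattice_variance(residual, 1) else 1
```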

  20. The Focal Plane Assembly for the Athena X-Ray Integral Field Unit Instrument

    NASA Technical Reports Server (NTRS)

    Jackson, B. D.; Van Weers, H.; van der Kuur, J.; den Hartog, R.; Akamatsu, H.; Argan, A.; Bandler, S. R.; Barbera, M.; Barret, D.; Bruijn, M. P.

    2016-01-01

    This paper summarizes a preliminary design concept for the focal plane assembly of the X-ray Integral Field Unit on the Athena spacecraft, an imaging microcalorimeter that will enable high spectral resolution imaging and point-source spectroscopy. The instrument's sensor array will be a 3840-pixel transition edge sensor (TES) microcalorimeter array, with a frequency domain multiplexed SQUID readout system allowing this large-format sensor array to be operated within the thermal constraints of the instrument's cryogenic system. A second TES detector will be operated in close proximity to the sensor array to detect cosmic rays and secondary particles passing through the sensor array for off-line coincidence detection to identify and reject events caused by the in-orbit high-energy particle background. The detectors, operating at 55 mK, or less, will be thermally isolated from the instrument cryostat's 2 K stage, while shielding and filtering within the FPA will allow the instrument's sensitive sensor array to be operated in the expected environment during both on-ground testing and in-flight operation, including stray light from the cryostat environment, low-energy photons entering through the X-ray aperture, low-frequency magnetic fields, and high-frequency electric fields.

  1. CMOS Imaging of Temperature Effects on Pin-Printed Xerogel Sensor Microarrays.

    PubMed

    Yao, Lei; Yung, Ka Yi; Chodavarapu, Vamsy P; Bright, Frank V

    2011-04-01

    In this paper, we study the effect of temperature on the operation and performance of xerogel-based sensor microarrays coupled to a complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC) that images the photoluminescence response from the sensor microarray. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors, and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. A correlated double sampling circuit and a pixel address/digital control/signal integration circuit are also implemented on-chip. The CMOS imager data are read out as a serial coded signal. The sensor system uses a light-emitting diode to excite target-analyte-responsive organometallic luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 3 × 3 (9 elements) array of oxygen (O2) sensors. Each group of three sensor elements in the array (arranged in a column) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a mix of two O2-sensitive luminophores in each pin-printed xerogel sensor element. The CMOS imager is designed to be low noise and consumes a static power of 320.4 μW and an average dynamic power of 624.6 μW when operating at a 100-Hz sampling frequency and a 1.8-V dc power supply.

  2. Color filter array design based on a human visual model

    NASA Astrophysics Data System (ADS)

    Parmar, Manu; Reeves, Stanley J.

    2004-05-01

    To reduce cost and complexity associated with registering multiple color sensors, most consumer digital color cameras employ a single sensor. A mosaic of color filters is overlaid on a sensor array such that only one color channel is sampled per pixel location. The missing color values must be reconstructed from available data before the image is displayed. The quality of the reconstructed image depends fundamentally on the array pattern and the reconstruction technique. We present a design method for color filter array patterns that use red, green, and blue color channels in an RGB array. A model of the human visual response for luminance and opponent chrominance channels is used to characterize the perceptual error between a fully sampled and a reconstructed sparsely-sampled image. Demosaicking is accomplished using Wiener reconstruction. To ensure that the error criterion reflects perceptual effects, reconstruction is done in a perceptually uniform color space. A sequential backward selection algorithm is used to optimize the error criterion to obtain the sampling arrangement. Two different types of array patterns are designed: non-periodic and periodic arrays. The resulting array patterns outperform commonly used color filter arrays in terms of the error criterion.
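
    A generic sequential backward selection loop, shown only to make the optimization step concrete; the perceptual error criterion, candidate encoding, and stopping rule used by the authors are not reproduced here, so error_fn stands in for their HVS-weighted reconstruction error.

```python
def sequential_backward_selection(candidates, error_fn, target_size):
    """Greedily remove, at each step, the candidate whose removal increases the
    supplied error criterion the least, until target_size candidates remain."""
    selected = list(candidates)
    while len(selected) > target_size:
        best_err, best_idx = None, None
        for i in range(len(selected)):
            trial = selected[:i] + selected[i + 1:]
            err = error_fn(trial)
            if best_err is None or err < best_err:
                best_err, best_idx = err, i
        selected.pop(best_idx)
    return selected
```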

  3. Circuit design for the retina-like image sensor based on space-variant lens array

    NASA Astrophysics Data System (ADS)

    Gao, Hongxun; Hao, Qun; Jin, Xuefeng; Cao, Jie; Liu, Yue; Song, Yong; Fan, Fan

    2013-12-01

    The retina-like image sensor is based on the non-uniformity of the human eye and log-polar coordinate theory. It has the advantages of high-quality data compression and elimination of redundant information. However, retina-like image sensors based on CMOS fabrication have drawbacks such as high cost, low sensitivity, low signal output efficiency, and inconvenient updating. Therefore, this paper proposes a retina-like image sensor based on a space-variant lens array, focusing on the circuit design that supports the whole system. The circuit includes the following parts: (1) a photo-detector array with a lens array to convert optical signals to electrical signals; (2) a strobe circuit for time-gating of the pixels and parallel paths for high-speed transmission of the data; (3) in every path, a high-precision digital potentiometer for I-V conversion, ratio normalization, and sensitivity adjustment, a programmable gain amplifier for automatic gain control (AGC), and an A/D converter for A/D conversion; (4) an LCD on which the digital data are displayed and a DDR2 SDRAM in which they are stored temporarily; (5) a USB port to transfer the data to a PC; and (6) an FPGA that controls the whole system. This circuit has advantages such as lower cost, larger pixels, convenient updating, and higher signal output efficiency. Experiments have proved that the grayscale output of every pixel closely matches the target and that a non-uniform image of the target is achieved in real time. The circuit can provide adequate technical support to retina-like image sensors based on a space-variant lens array.

  4. The Design of Optical Sensor for the Pinhole/Occulter Facility

    NASA Technical Reports Server (NTRS)

    Greene, Michael E.

    1990-01-01

    Three optical sight sensor systems were designed, built, and tested. Two optical line-of-sight sensor systems are capable of measuring the absolute pointing angle to the Sun. The system is for use with the Pinhole/Occulter Facility (P/OF), a solar hard X-ray experiment to be flown from the Space Shuttle or Space Station. The first sensor consists of a pinhole camera with two pairs of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the solar image produced by the pinhole, track-and-hold circuitry for data reduction, an analog-to-digital converter, and a microcomputer. The deflection of the image center is calculated from these data using an approximation for the solar image. The second system consists of a pinhole camera with a pair of perpendicularly mounted linear photodiode arrays, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image is calculated by knowing the position of each pixel of the photodiode array and counting pixels until the threshold is surpassed. The third optical sensor system is capable of measuring the internal vibration of the P/OF between the mask and base. It consists of a white light source, a mirror, and a pair of perpendicularly mounted linear photodiode arrays to detect the intensity distribution of the solar image produced by the mirror, amplification circuitry, threshold detection circuitry, and a microcomputer board. The deflection of the image, and hence the vibration of the structure, is calculated by knowing the position of each pixel of the photodiode array and counting pixels until the threshold is surpassed.
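
    The threshold-crossing deflection measurement described for the second and third systems reduces to counting pixels along the linear photodiode array until the signal first exceeds the threshold, then converting that pixel index to a displacement. The sketch below assumes a known pixel pitch and reference index; both are hypothetical parameters.

```python
def first_pixel_above_threshold(samples, threshold):
    """Index of the first photodiode whose signal exceeds the threshold, or None."""
    for i, value in enumerate(samples):
        if value > threshold:
            return i
    return None

def deflection_from_index(index, reference_index, pixel_pitch_m):
    """Convert the crossing pixel index to a linear deflection of the image edge."""
    return (index - reference_index) * pixel_pitch_m

# Example: crossing at pixel 212 against a reference of 200, with a 25 µm pitch,
# gives a 300 µm image deflection.
print(deflection_from_index(212, 200, 25e-6))
```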

  5. Moving-Article X-Ray Imaging System and Method for 3-D Image Generation

    NASA Technical Reports Server (NTRS)

    Fernandez, Kenneth R. (Inventor)

    2012-01-01

    An x-ray imaging system and method for a moving article are provided for an article moved along a linear direction of travel while the article is exposed to non-overlapping x-ray beams. A plurality of parallel linear sensor arrays are disposed in the x-ray beams after they pass through the article. More specifically, a first half of the plurality are disposed in a first of the x-ray beams while a second half of the plurality are disposed in a second of the x-ray beams. Each of the parallel linear sensor arrays is oriented perpendicular to the linear direction of travel. Each of the parallel linear sensor arrays in the first half is matched to a corresponding one of the parallel linear sensor arrays in the second half in terms of an angular position in the first of the x-ray beams and the second of the x-ray beams, respectively.

  6. Using a plenoptic camera to measure distortions in wavefronts affected by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Eslami, Mohammed; Wu, Chensheng; Rzasa, John; Davis, Christopher C.

    2012-10-01

    Ideally, as planar wavefronts travel through an imaging system, all rays, or vectors pointing in the direction of the propagation of energy, are parallel, and thus the wavefront is focused to a particular point. If the wavefront arrives at an imaging system with energy vectors that point in different directions, each part of the wavefront will be focused at a slightly different point on the sensor plane, resulting in a distorted image. The Hartmann test, which involves the insertion of a series of pinholes between the imaging system and the sensor plane, was developed to sample the wavefront at different locations and measure the distortion angles at different points in the wavefront. An adaptive optic system, such as a deformable mirror, is then used to correct for these distortions and allow the planar wavefront to focus at the desired point on the sensor plane, thereby correcting the distorted image. The apertures of a pinhole array limit the amount of light that reaches the sensor plane. Replacing the pinholes with a microlens array focuses each bundle of rays and brightens the image. Microlens arrays are making their way into newer imaging technologies, such as "light field" or "plenoptic" cameras. In these cameras, the microlens array is used to recover the ray information of the incoming light, with post-processing techniques used to focus on objects at different depths. The goal of this paper is to demonstrate the use of these plenoptic cameras to recover the distortions in wavefronts. CODE-V simulations that take advantage of the microlens array within the plenoptic camera show that it can provide more information than a Shack-Hartmann sensor. Using the microlens array to retrieve the ray information and then backstepping through the imaging system provides information about distortions in the arriving wavefront.
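
    As a concrete illustration of the wavefront-slope measurement described above, the sketch below estimates the local tilt over each lenslet sub-aperture as the spot-centroid displacement divided by the lenslet focal length, which is the basic Shack-Hartmann relation the abstract refers to. The array geometry, pixel size, focal length, and the random test frame are hypothetical and not taken from the paper.

    ```python
    import numpy as np

    def spot_centroid(subimage):
        """Intensity-weighted centroid (row, col) of one lenslet sub-image."""
        total = subimage.sum()
        rows, cols = np.indices(subimage.shape)
        return (rows * subimage).sum() / total, (cols * subimage).sum() / total

    def local_slopes(frame, n_lenslets, pitch_px, pixel_pitch, focal_length):
        """Estimate local wavefront slopes (rad) for an n_lenslets x n_lenslets array.

        frame        -- 2-D intensity image recorded behind the lenslet array
        pitch_px     -- lenslet pitch in pixels
        pixel_pitch  -- physical pixel size (m)
        focal_length -- lenslet focal length (m)
        """
        slopes = np.zeros((n_lenslets, n_lenslets, 2))
        ref = (pitch_px - 1) / 2.0          # undistorted spot sits at the sub-image centre
        for i in range(n_lenslets):
            for j in range(n_lenslets):
                sub = frame[i*pitch_px:(i+1)*pitch_px, j*pitch_px:(j+1)*pitch_px]
                cy, cx = spot_centroid(sub)
                # slope ~ physical spot displacement / focal length
                slopes[i, j, 0] = (cy - ref) * pixel_pitch / focal_length
                slopes[i, j, 1] = (cx - ref) * pixel_pitch / focal_length
        return slopes

    # Hypothetical example: 8x8 lenslets, 16-pixel pitch, 5 um pixels, 3 mm focal length
    frame = np.random.rand(128, 128)
    print(local_slopes(frame, 8, 16, 5e-6, 3e-3).shape)   # (8, 8, 2)
    ```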

  7. Relating transverse ray error and light fields in plenoptic camera images

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim; Tyo, J. Scott

    2013-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. The camera image is focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The resultant image is an array of circular exit pupil images, each corresponding to the overlying lenslet. The position of the lenslet encodes the spatial information of the scene, whereas the sensor pixels encode the angular information for light incident on the lenslet. The 4D light field is therefore described by the 2D spatial information and 2D angular information captured by the plenoptic camera. In aberration theory, the transverse ray error relates the pupil coordinates of a given ray to its deviation from the ideal image point in the image plane and is consequently a 4D function as well. We demonstrate a technique for modifying the traditional transverse ray error equations to recover the 4D light field of a general scene. In the case of a well corrected optical system, this light field is easily related to the depth of various objects in the scene. Finally, the effects of sampling with both the lenslet array and the camera sensor on the 4D light field data are analyzed to illustrate the limitations of such systems.

  8. Configuration-controlled Au nanocluster arrays on inverse micelle nano-patterns: versatile platforms for SERS and SPR sensors

    NASA Astrophysics Data System (ADS)

    Jang, Yoon Hee; Chung, Kyungwha; Quan, Li Na; Špačková, Barbora; Šípová, Hana; Moon, Seyoung; Cho, Won Joon; Shin, Hae-Young; Jang, Yu Jin; Lee, Ji-Eun; Kochuveedu, Saji Thomas; Yoon, Min Ji; Kim, Jihyeon; Yoon, Seokhyun; Kim, Jin Kon; Kim, Donghyun; Homola, Jiří; Kim, Dong Ha

    2013-11-01

    Nanopatterned 2-dimensional Au nanocluster arrays with controlled configuration are fabricated onto reconstructed nanoporous poly(styrene-block-vinylpyridine) inverse micelle monolayer films. Near-field coupling of localized surface plasmons is studied and compared for disordered and ordered core-centered Au NC arrays. Differences in evolution of the absorption band and field enhancement upon Au nanoparticle adsorption are shown. The experimental results are found to be in good agreement with theoretical studies based on the finite-difference time-domain method and rigorous coupled-wave analysis. The realized Au nanopatterns are exploited as substrates for surface-enhanced Raman scattering and integrated into Kretschmann-type SPR sensors, based on which unprecedented SPR-coupling-type sensors are demonstrated.

  9. Image science team

    NASA Technical Reports Server (NTRS)

    Ando, K.

    1982-01-01

    A substantial technology base of solid state pushbroom sensors exists and is in the process of further evolution at both GSFC and JPL. Technologies being developed relate to short wave infrared (SWIR) detector arrays; HgCdTe hybrid detector arrays; InSb linear and area arrays; passive coolers; spectral beam splitters; the deposition of spectral filters on detector arrays; and the functional design of the shuttle/space platform imaging spectrometer (SIS) system. Spatial and spectral characteristics of field, aircraft and space multispectral sensors are summarized. The status, field of view, and resolution of foreign land observing systems are included.

  10. Optical Demonstration of a Medical Imaging System with an EMCCD-Sensor Array for Use in a High Resolution Dynamic X-ray Imager

    PubMed Central

    Qu, Bin; Huang, Ying; Wang, Weiyuan; Sharma, Prateek; Kuhls-Gilcrist, Andrew T.; Cartwright, Alexander N.; Titus, Albert H.; Bednarek, Daniel R.; Rudin, Stephen

    2011-01-01

    Use of an extensible array of Electron Multiplying CCDs (EMCCDs) in medical x-ray imager applications was demonstrated for the first time. The large variable electronic gain (up to 2000) and small pixel size of EMCCDs provide effective suppression of readout noise compared to signal, as well as high resolution, enabling the development of an x-ray detector with far superior performance compared to conventional x-ray image intensifiers and flat panel detectors. We are developing arrays of EMCCDs to overcome their limited field of view (FOV). In this work we report on an array of two EMCCD sensors running simultaneously at a high frame rate and optically focused on a mammogram film showing calcified ducts. The work was conducted on an optical table with a pulsed LED bar used to provide uniform diffuse light onto the film to simulate x-ray projection images. The system can run at up to 17.5 frames per second, or at even higher frame rates with binning. Integration time for the sensors can be adjusted from 1 ms to 1000 ms. Twelve-bit correlated double sampling A/D converters were used to digitize the images, which were acquired by a National Instruments dual-channel Camera Link PC board in real time. A user-friendly interface was programmed using LabVIEW to save and display 2K × 1K pixel matrix digital images. The demonstration tiles a 2 × 1 array to acquire increased-FOV stationary images taken at different gains, and fluoroscopic-like videos recorded by scanning the mammogram simultaneously with both sensors. The results show high-resolution, high-dynamic-range images stitched together with minimal adjustments needed. The EMCCD array design allows for expansion to an M×N array for an arbitrarily larger FOV, while maintaining high resolution and large dynamic range.

  11. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  12. Active pixel sensors with substantially planarized color filtering elements

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R. (Inventor); Kemeny, Sabrina E. (Inventor)

    1999-01-01

    A semiconductor imaging system preferably having an active pixel sensor array compatible with a CMOS fabrication process. Color-filtering elements such as polymer filters and wavelength-converting phosphors can be integrated with the image sensor.

  13. Depth map generation using a single image sensor with phase masks.

    PubMed

    Jang, Jinbeum; Park, Sangwoo; Jo, Jieun; Paik, Joonki

    2016-06-13

    Conventional stereo matching systems generate a depth map using two or more digital imaging sensors, whose high cost and bulky size make them difficult to use in small camera systems. In order to solve this problem, this paper presents a stereo matching system using a single image sensor with phase masks for phase-difference auto-focusing. A novel pattern of phase mask array is proposed to simultaneously acquire two pairs of stereo images. Furthermore, a noise-invariant depth map is generated from the raw-format sensor output. The proposed method consists of four steps to compute the depth map: (i) acquisition of stereo images using the proposed mask array, (ii) variational segmentation using merging criteria to simplify the input image, (iii) disparity map generation using hierarchical block matching for disparity measurement (sketched below), and (iv) image matting to fill holes and generate the dense depth map. The proposed system can be used in small digital cameras without additional lenses or sensors.
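
    As a rough sketch of the disparity step (iii) above, the example below slides each block of a left image across a search range in the right image and keeps the offset with the smallest sum of absolute differences. It is a plain, single-level block matcher on synthetic data, not the authors' hierarchical implementation.

    ```python
    import numpy as np

    def block_match_disparity(left, right, block=8, max_disp=16):
        """Coarse disparity map via sum-of-absolute-differences block matching."""
        h, w = left.shape
        disp = np.zeros((h // block, w // block), dtype=int)
        for by in range(h // block):
            for bx in range(w // block):
                y, x = by * block, bx * block
                ref = left[y:y+block, x:x+block]
                best, best_d = np.inf, 0
                for d in range(min(max_disp, x) + 1):   # candidate window shifted left by d
                    cand = right[y:y+block, x-d:x-d+block]
                    cost = np.abs(ref.astype(float) - cand).sum()
                    if cost < best:
                        best, best_d = cost, d
                disp[by, bx] = best_d
        return disp

    # Hypothetical stereo pair: the right view is the left view shifted by 4 pixels
    left = np.random.rand(64, 64)
    right = np.roll(left, -4, axis=1)
    print(block_match_disparity(left, right))           # mostly 4, i.e. the true disparity
    ```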

  14. Imaging through turbulence using a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-09-01

    Atmospheric turbulence can significantly affect imaging through paths near the ground. Atmospheric turbulence is generally treated as a time-varying inhomogeneity of the refractive index of the air, which disrupts the propagation of optical signals from the object to the viewer. Under conditions of deep or strong turbulence, the object is hard to recognize through direct imaging, and conventional imaging methods cannot handle these problems efficiently: the time required for lucky imaging can increase significantly, and image processing approaches require much more complex and iterative de-blurring algorithms. We propose an alternative approach using a plenoptic sensor to resample and analyze the image distortions. The plenoptic sensor uses a shared objective lens and a microlens array to form a mini Keplerian telescope array. Therefore, the image obtained by a conventional method is separated into an array of images that contain multiple copies of the object's image with less correlated turbulence disturbances. A high-dimensional lucky imaging algorithm can then be performed on the video collected by the plenoptic sensor. The corresponding algorithm selects the most stable pixels from the various image cells and reconstructs the object's image as if only a weak turbulence effect were present. Then, by comparing the reconstructed image with the recorded images in each MLA cell, the difference can be regarded as the turbulence effect. As a result, the retrieval of the object's image and the extraction of the turbulence effect can be performed simultaneously.
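
    The "select the most stable pixels" step can be sketched as follows: for every pixel location in a stack of frames, keep the sample from the frame whose value lies closest to that pixel's temporal median, a crude stand-in for choosing the least turbulence-disturbed observation. The data are random placeholders, and the real high-dimensional lucky-imaging algorithm is not reproduced.

    ```python
    import numpy as np

    def lucky_pixels(stack):
        """Per-pixel lucky selection from a (frames, H, W) video stack."""
        median = np.median(stack, axis=0)                 # (H, W) temporal reference
        deviation = np.abs(stack - median)                # (frames, H, W)
        best_frame = np.argmin(deviation, axis=0)         # index of the steadiest sample
        h_idx, w_idx = np.indices(median.shape)
        return stack[best_frame, h_idx, w_idx]

    # Hypothetical 30-frame stack of a 64x64 scene corrupted by random distortions
    stack = np.random.rand(30, 64, 64)
    print(lucky_pixels(stack).shape)                      # (64, 64)
    ```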

  15. Imaging spectroscopy using embedded diffractive optical arrays

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Hinnrichs, Bradford

    2017-09-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera based on diffractive optic arrays. This approach to hyperspectral imaging has been demonstrated in all three infrared bands: SWIR, MWIR, and LWIR. The hyperspectral optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of this infrared hyperspectral sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough for a payload on a small satellite, mini-UAV, or commercial quadcopter, or for man-portable use. We also present an application in which this spectral imaging technology is used to quantify the mass and volume flow rates of hydrocarbon gases. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. The detector array is divided into sub-images covered by each lenslet. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the number of different spectral images collected simultaneously in each frame of the camera. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame. This system spans the SWIR and MWIR bands with a single optical array and focal plane array.
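
    The relation quoted above between focal-plane size, lenslet count, and per-band resolution (a 2 x 2 array on a 512 x 512 FPA giving four 256 x 256 spectral images; a 4 x 4 array on a 1024 x 1024 FPA giving sixteen 256 x 256 images) amounts to dividing the focal plane evenly among the lenslets, as the small check below illustrates.

    ```python
    # Each lenslet in an N x N array images onto its own tile of the focal plane,
    # so an F x F focal plane yields N*N spectral sub-images of (F // N) pixels on a side.
    def sub_image_layout(fpa_pixels, n_lenslets):
        bands = n_lenslets * n_lenslets
        resolution = fpa_pixels // n_lenslets
        return bands, resolution

    print(sub_image_layout(512, 2))    # (4, 256)  -> matches the 2 x 2 system above
    print(sub_image_layout(1024, 4))   # (16, 256) -> matches the 4 x 4 system above
    ```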

  16. A Label-Free Fluorescent Array Sensor Utilizing Liposome Encapsulating Calcein for Discriminating Target Proteins by Principal Component Analysis

    PubMed Central

    Imamura, Ryota; Murata, Naoki; Shimanouchi, Toshinori; Yamashita, Kaoru; Fukuzawa, Masayuki; Noda, Minoru

    2017-01-01

    A new fluorescent arrayed biosensor has been developed to discriminate the species and concentrations of target proteins by using several different phospholipid liposome species encapsulating fluorescent molecules, utilizing differences in the permeation of the fluorescent molecules through the membrane induced by liposome-target protein interactions. This approach offers a fundamentally new label-free fluorescent sensor, in contrast to the common technique of fluorescent array sensors that require labeling. We have confirmed a high fluorescence emission intensity, related to the concentration-dependent characteristics of the fluorescent molecules, when they leak from inside the liposomes through the perturbed lipid membrane. After taking an array image of the fluorescence emission from the sensor using a CMOS imager, the output intensities of the fluorescence were analyzed by the principal component analysis (PCA) statistical method. It is found from the PCA plots that different protein species at several concentrations were successfully discriminated by using the different lipid membranes, with a high cumulative contribution ratio. We also confirmed that the accuracy of discrimination by the array sensor with a single shot is higher than that of a single sensor with multiple shots.

  17. A Label-Free Fluorescent Array Sensor Utilizing Liposome Encapsulating Calcein for Discriminating Target Proteins by Principal Component Analysis.

    PubMed

    Imamura, Ryota; Murata, Naoki; Shimanouchi, Toshinori; Yamashita, Kaoru; Fukuzawa, Masayuki; Noda, Minoru

    2017-07-15

    A new fluorescent arrayed biosensor has been developed to discriminate the species and concentrations of target proteins by using several different phospholipid liposome species encapsulating fluorescent molecules, utilizing differences in the permeation of the fluorescent molecules through the membrane induced by liposome-target protein interactions. This approach offers a fundamentally new label-free fluorescent sensor, in contrast to the common technique of fluorescent array sensors that require labeling. We have confirmed a high fluorescence emission intensity, related to the concentration-dependent characteristics of the fluorescent molecules, when they leak from inside the liposomes through the perturbed lipid membrane. After taking an array image of the fluorescence emission from the sensor using a CMOS imager, the output intensities of the fluorescence were analyzed by the principal component analysis (PCA) statistical method. It is found from the PCA plots that different protein species at several concentrations were successfully discriminated by using the different lipid membranes, with a high cumulative contribution ratio. We also confirmed that the accuracy of discrimination by the array sensor with a single shot is higher than that of a single sensor with multiple shots.
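
    A minimal sketch of the PCA step described in the two records above: the fluorescence intensities from the liposome array are arranged as a samples-by-sensors matrix and projected onto their first principal components, whose score plot is what separates the protein species. The data are synthetic placeholders and scikit-learn is used purely for illustration; the authors' actual processing chain is not reproduced.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical intensities: 12 measurements (4 proteins x 3 concentrations)
    # from an array of 6 different liposome membrane compositions.
    rng = np.random.default_rng(0)
    intensities = rng.random((12, 6))

    pca = PCA(n_components=2)
    scores = pca.fit_transform(intensities)          # (12, 2) score-plot coordinates

    print(scores)
    print("cumulative contribution ratio:", pca.explained_variance_ratio_.cumsum())
    ```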

  18. Comparative Chemometric Analysis for Classification of Acids and Bases via a Colorimetric Sensor Array.

    PubMed

    Kangas, Michael J; Burks, Raychelle M; Atwater, Jordyn; Lukowicz, Rachel M; Garver, Billy; Holmes, Andrea E

    2018-02-01

    With the increasing availability of digital imaging devices, colorimetric sensor arrays are rapidly becoming a simple yet effective tool for the identification and quantification of various analytes. Colorimetric arrays utilize colorimetric data from many colorimetric sensors, and the multidimensional nature of the resulting data necessitates the use of chemometric analysis. Herein, an 8-sensor colorimetric array was used to analyze selected acidic and basic samples (0.5-10 M) to determine which chemometric methods are best suited for classification and quantification of analytes within clusters. PCA, HCA, and LDA were used to visualize the data set. All three methods showed well-separated clusters for each of the acid or base analytes and moderate separation between analyte concentrations, indicating that the sensor array can be used to identify and quantify samples. Furthermore, PCA could be used to determine which sensors showed the most effective analyte identification. LDA, KNN, and HQI were used for identification of the analyte and its concentration. HQI and KNN correctly identified the analytes in all cases, while LDA identified 95 of 96 analytes correctly. Additional studies demonstrated that controlling for solvent and image effects was unnecessary for all chemometric methods utilized in this study.
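
    As an illustration of the identification step, the sketch below trains KNN and LDA classifiers on colorimetric array responses (random placeholders standing in for the 8-sensor color readings) and reports their classification accuracy, mirroring the comparison of methods in the abstract. The data set, feature layout, and train/test split are hypothetical.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Hypothetical data set: 96 samples x 24 features (8 sensors x RGB), 8 analyte classes
    rng = np.random.default_rng(1)
    X = rng.random((96, 24))
    y = np.repeat(np.arange(8), 12)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=1)

    for name, model in [("KNN", KNeighborsClassifier(n_neighbors=3)),
                        ("LDA", LinearDiscriminantAnalysis())]:
        model.fit(X_tr, y_tr)
        print(name, "accuracy:", model.score(X_te, y_te))
    ```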

  19. Imaging system design for improved information capacity

    NASA Technical Reports Server (NTRS)

    Fales, C. L.; Huck, F. O.; Samms, R. W.

    1984-01-01

    Shannon's theory of information for communication channels is used to assess the performance of line-scan and sensor-array imaging systems and to optimize the design trade-offs involving sensitivity, spatial response, and sampling intervals. Formulations and computational evaluations account for spatial responses typical of line-scan and sensor-array mechanisms, lens diffraction and transmittance shading, defocus blur, and square and hexagonal sampling lattices.

  20. USGS aerial resolution targets.

    USGS Publications Warehouse

    Salamonowicz, P.H.

    1982-01-01

    It is necessary to measure the achievable resolution of any airborne sensor that is to be used for metric purposes. Laboratory calibration facilities may be inadequate or inappropriate for determining the resolution of non-photographic sensors such as optical-mechanical scanners, television imaging tubes, and linear arrays. However, large target arrays imaged in the field can be used in testing such systems. The USGS has constructed an array of resolution targets in order to permit field testing of a variety of airborne sensing systems. The target array permits any interested organization with an airborne sensing system to accurately determine the operational resolution of its system. -from Author

  1. Infrared hyperspectral imaging miniaturized for UAV applications

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-02-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both MWIR and LWIR versions, small enough to serve as a payload on miniature unmanned aerial vehicles. The optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to the infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics are made up of an area array of diffractive optical elements, where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a mini-UAV or commercial quadcopter. We also present an example in which this technology is used to quantify the volume and mass flow rates of a hydrocarbon gas leak. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each frame.

  2. Log polar image sensor in CMOS technology

    NASA Astrophysics Data System (ADS)

    Scheffer, Danny; Dierickx, Bart; Pardo, Fernando; Vlummens, Jan; Meynants, Guy; Hermans, Lou

    1996-08-01

    We report on the design, design issues, fabrication, and performance of a log-polar CMOS image sensor. The sensor was developed for use in a videophone system for deaf and hearing-impaired people who are not able to communicate through a 'normal' telephone. The system allows 15 detailed images per second to be transmitted over existing telephone lines, a frame rate sufficient for conversation by means of sign language or lip reading. The pixel array of the sensor consists of 76 concentric circles with (up to) 128 pixels per circle, 8013 pixels in total. The interior pixels have a pitch of 14 micrometers, increasing to 250 micrometers at the border. The 8013-pixel image is mapped (via a log-polar transformation) into an X-Y addressable 76 by 128 array.
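
    The log-polar sampling such a sensor performs optically can be mimicked in software: each (ring, angle) cell of the 76 by 128 output corresponds to an image location whose radius grows exponentially with the ring index, which is why the interior pixel pitch is much finer than at the border. The sketch below resamples an ordinary image onto a 76 x 128 log-polar grid; the inner radius and the test image are hypothetical, and the code illustrates only the transformation, not the chip.

    ```python
    import numpy as np

    def log_polar_resample(image, n_rings=76, n_angles=128, r_min=5.0):
        """Resample a square image onto an (n_rings, n_angles) log-polar grid."""
        h, w = image.shape
        cy, cx = h / 2.0, w / 2.0
        r_max = min(cy, cx)
        # ring radii grow exponentially from r_min out to r_max
        radii = r_min * (r_max / r_min) ** (np.arange(n_rings) / (n_rings - 1))
        angles = 2 * np.pi * np.arange(n_angles) / n_angles
        out = np.zeros((n_rings, n_angles), dtype=image.dtype)
        for i, r in enumerate(radii):
            ys = np.clip((cy + r * np.sin(angles)).astype(int), 0, h - 1)
            xs = np.clip((cx + r * np.cos(angles)).astype(int), 0, w - 1)
            out[i] = image[ys, xs]
        return out

    # Hypothetical 512 x 512 input frame
    print(log_polar_resample(np.random.rand(512, 512)).shape)   # (76, 128)
    ```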

  3. Radiation tolerant compact image sensor using CdTe photodiode and field emitter array (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Masuzawa, Tomoaki; Neo, Yoichiro; Mimura, Hidenori; Okamoto, Tamotsu; Nagao, Masayoshi; Akiyoshi, Masafumi; Sato, Nobuhiro; Takagi, Ikuji; Tsuji, Hiroshi; Gotoh, Yasuhito

    2016-10-01

    A growing demand for incident detection has been recognized since the Great East Japan Earthquake and the subsequent accidents at the Fukushima nuclear power plant in 2011. Radiation-tolerant image sensors are powerful tools for collecting crucial information in the initial stages of such incidents. However, semiconductor-based image sensors such as CMOS and CCD devices have limited tolerance to radiation exposure, and the image sensors used in nuclear facilities are conventional vacuum tubes using thermal cathodes, which have large size and high power consumption. In this study, we propose a compact image sensor composed of a CdTe-based photodiode and a matrix-driven Spindt-type electron beam source called a field emitter array (FEA). The basic principle of FEA-based image sensors is similar to conventional Vidicon-type camera tubes, but the electron source is replaced by an FEA instead of a thermal cathode. The use of a field emitter as an electron source should enable significant size reduction while maintaining high radiation tolerance. Current research on radiation-tolerant FEAs and the development of CdTe-based photoconductive films will be presented.

  4. Smart CMOS image sensor for lightning detection and imaging.

    PubMed

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potential of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed within the framework of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing, allowing efficient localization of a faint lightning pulse on the entire large-format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
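
    The detection principle described above, an in-pixel frame-to-frame difference compared against an adjustable threshold, can be sketched off-chip as follows. The frames, threshold value, and pulse amplitude are hypothetical, and the on-chip digital localization logic is not modelled.

    ```python
    import numpy as np

    def detect_lightning(prev_frame, curr_frame, threshold):
        """Flag pixels whose frame-to-frame increase exceeds an adjustable threshold."""
        diff = curr_frame.astype(int) - prev_frame.astype(int)
        events = diff > threshold                       # boolean event map
        coords = np.argwhere(events)                    # (row, col) of candidate pulses
        return events, coords

    # Hypothetical 256 x 256 frames sampled at 1 kHz, with one faint transient pulse
    prev_frame = np.full((256, 256), 100, dtype=np.uint16)
    curr_frame = prev_frame.copy()
    curr_frame[120, 87] += 40                           # simulated lightning pulse
    events, coords = detect_lightning(prev_frame, curr_frame, threshold=25)
    print(coords)                                       # [[120  87]]
    ```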

  5. Transparent Fingerprint Sensor System for Large Flat Panel Display.

    PubMed

    Seo, Wonkuk; Pi, Jae-Eun; Cho, Sung Haeung; Kang, Seung-Youl; Ahn, Seong-Deok; Hwang, Chi-Sun; Jeon, Ho-Sik; Kim, Jong-Uk; Lee, Myunghee

    2018-01-19

    In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and an associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array and each pixel size is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To get the fingerprint image data from the sensor array, the ROIC senses a capacitance, which is formed through the cover glass material between a human finger and an electrode of each pixel of the sensor array. Three methods are reviewed for estimating the self-capacitance. The measurement results demonstrate that the transparent fingerprint sensor system is able to differentiate a human finger's ridges and valleys through the fingerprint sensor array.

  6. Transparent Fingerprint Sensor System for Large Flat Panel Display

    PubMed Central

    Seo, Wonkuk; Pi, Jae-Eun; Cho, Sung Haeung; Kang, Seung-Youl; Ahn, Seong-Deok; Hwang, Chi-Sun; Jeon, Ho-Sik; Kim, Jong-Uk

    2018-01-01

    In this paper, we introduce a transparent fingerprint sensing system using a thin film transistor (TFT) sensor panel, based on a self-capacitive sensing scheme. An amorphous indium gallium zinc oxide (a-IGZO) TFT sensor array and an associated custom Read-Out IC (ROIC) are implemented for the system. The sensor panel has a 200 × 200 pixel array and each pixel size is as small as 50 μm × 50 μm. The ROIC uses only eight analog front-end (AFE) amplifier stages along with a successive approximation analog-to-digital converter (SAR ADC). To get the fingerprint image data from the sensor array, the ROIC senses a capacitance, which is formed through the cover glass material between a human finger and an electrode of each pixel of the sensor array. Three methods are reviewed for estimating the self-capacitance. The measurement results demonstrate that the transparent fingerprint sensor system is able to differentiate a human finger’s ridges and valleys through the fingerprint sensor array. PMID:29351218

  7. Polarimetric Imaging System for Automatic Target Detection and Recognition

    DTIC Science & Technology

    2000-03-01

    technique shown in Figure 4(b) can also be used to integrate polarizer arrays with other types of imaging sensors, such as LWIR cameras and uncooled...vertical stripe pattern in this φ image is caused by nonuniformities in the particular polarizer array used. 2. CIRCULAR POLARIZATION IMAGING USING

  8. Microfabricated optically pumped magnetometer arrays for biomedical imaging

    NASA Astrophysics Data System (ADS)

    Perry, A. R.; Sheng, D.; Krzyzewski, S. P.; Geller, S.; Knappe, S.

    2017-02-01

    Optically-pumped magnetometers have demonstrated magnetic field measurements as precise as the best superconducting quantum interference device magnetometers. Our group develops miniature alkali atom-based magnetic sensors using microfabrication technology. Our sensors do not require cryogenic cooling, and can be positioned very close to the sample, making these sensors an attractive option for development in the medical community. We will present our latest chip-scale optically-pumped gradiometer developed for array applications to image magnetic fields from the brain noninvasively. These developments should lead to improved spatial resolution, and potentially sensitive measurements in unshielded environments.

  9. Model of an optical system's influence on sensitivity of microbolometric focal plane array

    NASA Astrophysics Data System (ADS)

    Gogler, Sławomir; Bieszczad, Grzegorz; Zarzycka, Alicja; Szymańska, Magdalena; Sosnowski, Tomasz

    2012-10-01

    Thermal imagers and the infrared array sensors used in them are subject to a calibration procedure and evaluation of their voltage sensitivity to incident radiation during the manufacturing process. The calibration procedure is especially important in so-called radiometric cameras, where accurate radiometric quantities, given in physical units, are of concern. Even though non-radiometric cameras are not expected to meet such elevated standards, it is still important that the image faithfully represents temperature variations across the scene. The detectors used in a thermal camera are illuminated by infrared radiation transmitted through a specialized optical system, and each optical system used influences the irradiation distribution across the sensor array. In this article, a model describing the irradiation distribution across a sensor array working with the optical system used in the calibration set-up is proposed. The method takes into account the optical and geometrical properties of the array set-up. By means of Monte Carlo simulation, a large number of rays was traced to the sensor plane, which allowed the irradiation distribution across the image plane to be determined for different aperture-limiting configurations. The simulated results were compared with the proposed analytical expression. The presented radiometric model allows fast and accurate non-uniformity correction to be carried out.
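
    A toy version of the Monte Carlo step described above: rays are drawn between random points on a circular aperture and random points on the sensor plane, weighted by the usual cos(a)·cos(s)/r² geometric factor, and binned per pixel to approximate the irradiation roll-off toward the edges of the array. The aperture radius, distance, and array geometry are hypothetical, the optics are ignored, and the code illustrates only the simulation idea, not the paper's radiometric model.

    ```python
    import numpy as np

    def mc_irradiation(n_rays=500_000, n_pix=16, pitch=100e-6,
                       aperture_radius=2e-3, distance=4e-3, seed=0):
        """Monte Carlo estimate of the relative irradiation across an n_pix x n_pix array."""
        rng = np.random.default_rng(seed)
        half = n_pix * pitch / 2.0
        # random points on the aperture disk (at height z = distance)
        r = aperture_radius * np.sqrt(rng.random(n_rays))
        phi = 2 * np.pi * rng.random(n_rays)
        ax, ay = r * np.cos(phi), r * np.sin(phi)
        # random points on the sensor plane (z = 0)
        sx = rng.uniform(-half, half, n_rays)
        sy = rng.uniform(-half, half, n_rays)
        dist2 = (ax - sx) ** 2 + (ay - sy) ** 2 + distance ** 2
        weight = distance ** 2 / dist2 ** 2              # cos(a)*cos(s)/r^2 up to constants
        ix = ((sx + half) / pitch).astype(int).clip(0, n_pix - 1)
        iy = ((sy + half) / pitch).astype(int).clip(0, n_pix - 1)
        image = np.zeros((n_pix, n_pix))
        np.add.at(image, (iy, ix), weight)
        return image / image.max()                       # normalised irradiation map

    irr = mc_irradiation()
    print(irr[8, 8], irr[0, 0])                          # centre pixel vs. corner pixel
    ```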

  10. Alignment of sensor arrays in optical instruments using a geometric approach.

    PubMed

    Sawyer, Travis W

    2018-02-01

    Alignment of sensor arrays in optical instruments is critical to maximize the instrument's performance. While many commercial systems use standardized mounting threads for alignment, custom systems require specialized equipment and alignment procedures. These alignment procedures can be time-consuming, dependent on operator experience, and have low repeatability. Furthermore, each alignment solution must be considered on a case-by-case basis, leading to additional time and resource cost. Here I present a method to align a sensor array using geometric analysis. By imaging a grid pattern of dots, I show that it is possible to calculate the misalignment for a sensor in five degrees of freedom simultaneously. I first test the approach by simulating different cases of misalignment using Zemax before applying the method to experimentally acquired data of sensor misalignment for an echelle spectrograph. The results show that the algorithm effectively quantifies misalignment in five degrees of freedom for an F/5 imaging system, accurate to within ±0.87 deg in rotation and ±0.86 μm in translation. Furthermore, the results suggest that the method can also be applied to non-imaging systems with a small penalty to precision. This general approach can potentially improve the alignment of sensor arrays in custom instruments by offering an accurate, quantitative approach to calculating misalignment in five degrees of freedom simultaneously.

  11. Mapping Electrical Crosstalk in Pixelated Sensor Arrays

    NASA Technical Reports Server (NTRS)

    Seshadri, Suresh (Inventor); Cole, David (Inventor); Smith, Roger M. (Inventor); Hancock, Bruce R. (Inventor)

    2017-01-01

    The effects of inter-pixel capacitance in a pixelated array may be measured by first resetting all pixels in the array to a first voltage and reading out a first image, followed by resetting only a subset of pixels in the array to a second voltage and reading out a second image; the difference between the first and second images provides information about the inter-pixel capacitance. Other embodiments are described and claimed.

  12. Mapping Electrical Crosstalk in Pixelated Sensor Arrays

    NASA Technical Reports Server (NTRS)

    Smith, Roger M (Inventor); Hancock, Bruce R. (Inventor); Cole, David (Inventor); Seshadri, Suresh (Inventor)

    2013-01-01

    The effects of inter-pixel capacitance in a pixelated array may be measured by first resetting all pixels in the array to a first voltage and reading out a first image, followed by resetting only a subset of pixels in the array to a second voltage and reading out a second image; the difference between the first and second images provides information about the inter-pixel capacitance. Other embodiments are described and claimed.
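
    The measurement procedure in the two patent records above can be mimicked numerically: all pixels are reset to a first voltage and read out, only a sparse subset is then reset to a second voltage and read out again, and the difference image shows how much of each voltage step couples into the untouched neighbours. The 2% nearest-neighbour coupling used below is an arbitrary illustrative value, not a measured one.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    # Hypothetical inter-pixel capacitance kernel: 2% of a pixel's voltage step
    # couples into each of its four nearest neighbours.
    ipc_kernel = np.array([[0.0, 0.02, 0.0],
                           [0.02, 0.92, 0.02],
                           [0.0, 0.02, 0.0]])

    n = 32
    v1, v2 = 0.0, 1.0
    first = np.full((n, n), v1)                       # all pixels reset to the first voltage

    subset = np.zeros((n, n))                         # sparse subset reset to the second voltage
    subset[4::8, 4::8] = v2 - v1
    second = first + convolve(subset, ipc_kernel, mode="constant")

    difference = second - first                       # isolated responses with IPC skirts
    print(difference[3:6, 3:6])                       # coupling visible around a reset pixel
    ```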

  13. Differential temperature stress measurement employing array sensor with local offset

    NASA Technical Reports Server (NTRS)

    Lesniak, Jon R. (Inventor)

    1993-01-01

    The instrument has a focal plane array of infrared sensors of the integrating type, such as a multiplexed device in which a charge proportional to the total number of photons the sensor is exposed to between read-out cycles is built up on a capacitor. The infrared sensors of the array are manufactured as part of an overall array which is part of a micro-electronic device. The sensor achieves greater sensitivity by applying a local offset to the output of each sensor before it is converted into a digital word. The offset applied to each sensor will typically be the sensor's average value, so that the digital signal periodically read from each sensor of the array corresponds to the portion of the signal that is varying in time. With proper synchronization between the cyclical loading of the test object and the frame rate of the infrared array, the output of the A/D-converted signal will correspond to the stress-field-induced temperature variations. A digital lock-in operation may be performed on the output of each sensor in the array. This results in a test instrument which can rapidly form a precise image of the thermoelastic stresses in an object.
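
    The digital lock-in operation mentioned above can be sketched as follows: the offset-subtracted time series from a sensor is multiplied by sine and cosine references at the loading frequency and averaged, yielding the amplitude and phase of the stress-induced temperature oscillation. The frame rate, loading frequency, signal amplitude, and noise level are hypothetical.

    ```python
    import numpy as np

    def digital_lock_in(signal, frame_rate, load_freq):
        """Recover amplitude and phase of a periodic component by digital lock-in."""
        t = np.arange(signal.size) / frame_rate
        sig = signal - signal.mean()                     # local offset removal
        i = 2.0 * np.mean(sig * np.cos(2 * np.pi * load_freq * t))   # in-phase component
        q = 2.0 * np.mean(sig * np.sin(2 * np.pi * load_freq * t))   # quadrature component
        return np.hypot(i, q), np.arctan2(q, i)

    # Hypothetical pixel time series: 10 Hz thermoelastic oscillation sampled at 200 frames/s
    frame_rate, load_freq = 200.0, 10.0
    t = np.arange(2000) / frame_rate
    pixel = 5.0 + 0.03 * np.cos(2 * np.pi * load_freq * t) + 0.01 * np.random.randn(t.size)
    amp, phase = digital_lock_in(pixel, frame_rate, load_freq)
    print(amp)                                           # close to the 0.03 oscillation amplitude
    ```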

  14. The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.

    2017-02-01

    Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits, including increasingly high pixel counts and shrinking pixel sizes; nevertheless, they are also hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and, in some cases, imager response time. The recently invented Coded Access Optical Sensor (CAOS) camera platform works in unison with current Photo-Detector Array (PDA) technology to counter fundamental limitations of PDA-based imagers while providing sufficiently high imaging spatial resolution and pixel counts. Engineering the CAOS camera platform using, for example, the Texas Instruments (TI) Digital Micromirror Device (DMD) ushers in a paradigm change in advanced imager design, particularly for extreme dynamic range applications.

  15. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications.

    PubMed

    Tokuda, Takashi; Noda, Toshihiko; Sasagawa, Kiyotaka; Ohta, Jun

    2010-12-29

    In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors' architecture on the basis of the type of electric measurement or imaging functionalities.

  16. A 128×96 Pixel Stack-Type Color Image Sensor: Stack of Individual Blue-, Green-, and Red-Sensitive Organic Photoconductive Films Integrated with a ZnO Thin Film Transistor Readout Circuit

    NASA Astrophysics Data System (ADS)

    Seo, Hokuto; Aihara, Satoshi; Watabe, Toshihisa; Ohtake, Hiroshi; Sakai, Toshikatsu; Kubota, Misao; Egami, Norifumi; Hiramatsu, Takahiro; Matsuda, Tokiyoshi; Furuta, Mamoru; Hirao, Takashi

    2011-02-01

    A color image was produced by a vertically stacked image sensor with blue (B)-, green (G)-, and red (R)-sensitive organic photoconductive films, each having a thin-film transistor (TFT) array that uses a zinc oxide (ZnO) channel to read out the signal generated in each organic film. The number of the pixels of the fabricated image sensor is 128×96 for each color, and the pixel size is 100×100 µm². The current on/off ratio of the ZnO TFT is over 10⁶, and the B-, G-, and R-sensitive organic photoconductive films show excellent wavelength selectivity. The stacked image sensor can produce a color image at 10 frames per second with a resolution corresponding to the pixel number. This result clearly shows that color separation is achieved without using any conventional color separation optical system such as a color filter array or a prism.

  17. Pixel electronic noise as a function of position in an active matrix flat panel imaging array

    NASA Astrophysics Data System (ADS)

    Yazdandoost, Mohammad Y.; Wu, Dali; Karim, Karim S.

    2010-04-01

    We present an analysis of output referred pixel electronic noise as a function of position in the active matrix array for both active and passive pixel architectures. Three different noise sources for Active Pixel Sensor (APS) arrays are considered: readout period noise, reset period noise and leakage current noise of the reset TFT during readout. For the state-of-the-art Passive Pixel Sensor (PPS) array, the readout noise of the TFT switch is considered. Measured noise results are obtained by modeling the array connections with RC ladders on a small in-house fabricated prototype. The results indicate that the pixels in the rows located in the middle part of the array have less random electronic noise at the output of the off-panel charge amplifier compared to the ones in rows at the two edges of the array. These results can help optimize for clearer images as well as help define the region-of-interest with the best signal-to-noise ratio in an active matrix digital flat panel imaging array.

  18. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    NASA Astrophysics Data System (ADS)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested in developing inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products, and space technology, mainly because this radiation has high penetration and good navigability through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse other materials. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low and the scattering is also low compared to NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based focal plane arrays (FPAs). These three cameras differ in the number of detectors, the scanning operation, and the detection method. The first and second generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively; both use direct detection and are limited to fixed imaging. The most recently designed sensor is a multiplexed 16 × 16 GDD FPA. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The detection principle of this sensor is frequency-modulated continuous-wave (FMCW) operation, with each of the 16 GDD pixel lines sampled simultaneously. Direct detection is also possible and can be done with a friendly user interface. This FPA sensor is built from 256 commercial 3 mm diameter GDD lamps (International Light, Inc., Peabody, MA, model 527 Ne indicator lamps) as pixel detectors. All three sensors are fully supported by a software graphical user interface (GUI). They were tested and characterized through different kinds of optical systems for imaging applications, super resolution, and calibration methods. The 16 × 16 sensor employs a chirp-radar-like method to produce depth and reflectance information in the image, enabling 3-D MMW imaging in real time at video frame rate. In this work we demonstrate different kinds of optical imaging systems with 3-D imaging capability at short range and at longer distances of at least 10-20 meters.
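
    For the FMCW detection mode mentioned above, target range follows from the beat frequency between the transmitted and received chirps through the standard relation R = c·f_beat·T/(2B), where T is the chirp duration and B the swept bandwidth. The sketch below applies this textbook formula to hypothetical chirp parameters; it is not drawn from the paper's hardware.

    ```python
    # Standard FMCW range relation: R = c * f_beat * T_chirp / (2 * B)
    C = 3.0e8          # speed of light, m/s

    def fmcw_range(f_beat_hz, chirp_duration_s, bandwidth_hz):
        """Target range from the measured beat frequency of a linear FMCW chirp."""
        return C * f_beat_hz * chirp_duration_s / (2.0 * bandwidth_hz)

    # Hypothetical MMW chirp: 2 GHz sweep over 1 ms, 133 kHz measured beat frequency
    print(fmcw_range(1.33e5, 1e-3, 2e9))   # ~10 m, of the order of the ranges quoted above
    ```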

  19. A Dual-Mode Large-Arrayed CMOS ISFET Sensor for Accurate and High-Throughput pH Sensing in Biomedical Diagnosis.

    PubMed

    Huang, Xiwei; Yu, Hao; Liu, Xu; Jiang, Yu; Yan, Mei; Wu, Dongping

    2015-09-01

    Existing ISFET-based DNA sequencing detects hydrogen ions released during the polymerization of DNA strands on microbeads, which are scattered into a microwell array above the ISFET sensor with an unknown distribution. However, false pH detection occurs at empty microwells due to crosstalk from neighboring microbeads. In this paper, a dual-mode CMOS ISFET sensor is proposed to achieve accurate pH detection for DNA sequencing. Dual-mode sensing, with optical and chemical modes, is realized by integrating a CMOS image sensor (CIS) with an ISFET pH sensor, and is fabricated in a standard 0.18-μm CIS process. With accurate determination of microbead physical locations at CIS pixels by contact imaging, the dual-mode sensor can correlate the local pH for one DNA slice with one location-determined microbead, which improves pH detection accuracy. Moreover, toward high-throughput DNA sequencing, a correlated-double-sampling readout that supports a large array for both modes is deployed to reduce pixel-to-pixel nonuniformity such as threshold voltage mismatch. The proposed CMOS dual-mode sensor is experimentally shown to produce a well-correlated pH map and optical image for microbeads, with a pH sensitivity of 26.2 mV/pH, a fixed pattern noise (FPN) reduction from 4% to 0.3%, and a readout speed of 1200 frames/s. A dual-mode CMOS ISFET sensor with suppressed FPN for accurate large-arrayed pH sensing is thus proposed and demonstrated with state-of-the-art measured results toward accurate and high-throughput DNA sequencing. The developed dual-mode CMOS ISFET sensor has great potential for future personal genome diagnostics with high accuracy and low cost.

  20. Sensor modeling and demonstration of a multi-object spectrometer for performance-driven sensing

    NASA Astrophysics Data System (ADS)

    Kerekes, John P.; Presnar, Michael D.; Fourspring, Kenneth D.; Ninkov, Zoran; Pogorzala, David R.; Raisanen, Alan D.; Rice, Andrew C.; Vasquez, Juan R.; Patel, Jeffrey P.; MacIntyre, Robert T.; Brown, Scott D.

    2009-05-01

    A novel multi-object spectrometer (MOS) is being explored for use as an adaptive performance-driven sensor that tracks moving targets. Developed originally for astronomical applications, the instrument utilizes an array of micromirrors to reflect light to a panchromatic imaging array. When an object of interest is detected the individual micromirrors imaging the object are tilted to reflect the light to a spectrometer to collect a full spectrum. This paper will present example sensor performance from empirical data collected in laboratory experiments, as well as our approach in designing optical and radiometric models of the MOS channels and the micromirror array. Simulation of moving vehicles in a highfidelity, hyperspectral scene is used to generate a dynamic video input for the adaptive sensor. Performance-driven algorithms for feature-aided target tracking and modality selection exploit multiple electromagnetic observables to track moving vehicle targets.

  1. Information theory analysis of sensor-array imaging systems for computer vision

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.; Self, M. O.

    1983-01-01

    Information theory is used to assess the performance of sensor-array imaging systems, with emphasis on the performance obtained with image-plane signal processing. By electronically controlling the spatial response of the imaging system, as suggested by the mechanism of human vision, it is possible to trade off edge enhancement for sensitivity, increase dynamic range, and reduce data transmission. Computational results show that: signal information density varies little with large variations in the statistical properties of random radiance fields; most information (generally about 85 to 95 percent) is contained in the signal intensity transitions rather than the levels; and performance is optimized when the OTF of the imaging system is nearly limited to the sampling passband, to minimize aliasing at the cost of blurring, and the SNR is very high, to permit the retrieval of small spatial detail from the extensively blurred signal. Shading the lens aperture transmittance to increase depth of field and using a regular hexagonal sensor array instead of a square lattice to decrease sensitivity to edge orientation also improve the signal information density by up to about 30 percent at high SNRs.
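
    The signal information density figures discussed above are Shannon-style quantities; the basic relation involved is that one sample of a Gaussian-noise channel carries at most (1/2)·log2(1 + SNR) bits, which the toy calculation below evaluates for a few signal-to-noise ratios. This is only the textbook capacity formula, not a reproduction of the paper's full treatment of sampling, blur, and aliasing.

    ```python
    import numpy as np

    def bits_per_sample(snr):
        """Shannon capacity of one Gaussian channel use, in bits."""
        return 0.5 * np.log2(1.0 + snr)

    for snr in (10, 100, 1000):          # hypothetical per-pixel SNR values
        print(f"SNR {snr:5d}: {bits_per_sample(snr):.2f} bits/sample")
    ```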

  2. Fixed mount wavefront sensor

    DOEpatents

    Neal, Daniel R.

    2000-01-01

    A rigid mount and method of mounting for a wavefront sensor. A wavefront dissector, such as a lenslet array, is rigidly mounted at a fixed distance relative to an imager, such as a CCD camera, without need for a relay imaging lens therebetween.

  3. Dual-mode lensless imaging device for digital enzyme linked immunosorbent assay

    NASA Astrophysics Data System (ADS)

    Sasagawa, Kiyotaka; Kim, Soo Heyon; Miyazawa, Kazuya; Takehara, Hironari; Noda, Toshihiko; Tokuda, Takashi; Iino, Ryota; Noji, Hiroyuki; Ohta, Jun

    2014-03-01

    Digital enzyme-linked immunosorbent assay (ELISA) is an ultra-sensitive technology for detecting biomarkers, viruses, and other targets. As in the conventional ELISA technique, a target molecule is bound to an enzyme-labeled antibody by an antigen-antibody reaction. In this technology, a femtoliter droplet chamber array is used for the reaction chambers. Due to the small chamber volume, the concentration of fluorescent product generated by a single enzyme can be sufficient for detection by fluorescence microscopy. In this work, we demonstrate a miniaturized lensless imaging device for digital ELISA using a custom image sensor. The pixel array of the sensor is coated with a 20 μm-thick yellow filter to eliminate the excitation light at 470 nm and covered by a fiber optic plate (FOP) to protect the sensor without resolution degradation. The droplet chamber array, formed on a 50 μm-thick glass plate, is placed directly on the FOP. In digital ELISA, microbeads coated with antibody are loaded into the droplet chamber array, and the ratio of fluorescent to non-fluorescent chambers containing microbeads is observed. In fluorescence imaging, the spatial resolution is degraded by spreading through the glass plate because the fluorescence is emitted omnidirectionally. This degradation is compensated by image processing, and a resolution of ~35 μm was achieved. In bright-field imaging, the projected images of the beads under collimated illumination are observed. By varying the incident angle and compositing the images, the microbeads were successfully imaged.

  4. Design of an ultrasonic micro-array for near field sensing during retinal microsurgery.

    PubMed

    Clarke, Clyde; Etienne-Cummings, Ralph

    2006-01-01

    A method for obtaining the optimal and specific sensor parameters for a tool-tip mountable ultrasonic transducer micro-array is presented. The ultrasonic transducer array sensor parameters, such as frequency of operation, element size, inter-element spacing, number of elements and transducer geometry, are obtained using a quadratic programming method to obtain a maximum directivity while being constrained to a total array size of 4 mm² and the required resolution for retinal imaging. The technique is used to design a uniformly spaced N×N transducer array that is capable of resolving structures in the retina that are as small as 2 μm from a distance of 100 μm. The resultant 37×37 array of 16 μm transducers with 26 μm spacing will be realized as a Capacitive Micromachined Ultrasonic Transducer (CMUT) array and used for imaging and robotic guidance during retinal microsurgery.

  5. Using the sun analog sensor (SAS) data to investigate solar array yoke motion on the GOES-8 and -9 spacecraft

    NASA Astrophysics Data System (ADS)

    Phenneger, Milton; Knack, Jennifer L.

    1996-10-01

    The GOES-8 and -9 Sun analog sensor (SAS) flight data are analyzed to evaluate the attitude motion environment of payloads mounted on the solar array. The work was performed in part to extend analysis in progress to support the solar x-ray imager to be flown on GOES-M. The SAS is a two-axis sensor mounted on the x-ray sensor pointing (XRP) module to measure the east/west error angle between the Sun and the solar array normal and to provide a north/south error angle for automatic solar pointing of the x-ray sensor by the XRP. The goal was to search for evidence of solar array vibrational modes in the 2 Hz and 0.5 Hz range and to test the predicted amplitudes. The results show that the solar array rotates at the rate of the mean Sun with unexpected oscillation periods of 5.6 minutes, 90 minutes, and 1440 minutes originating from the two 16.1 gear drive train stages between the solar array drive stepper motor and the solar array yoke. The higher frequency oscillations are detected as random noise at the 1/16 Hz telemetry sampling rate of the SAS. This supports the preflight predictions for the high-frequency modes but provides no detailed measurement of the frequency, as expected for this data period. In addition, the data indicate that the solar array is responding unexpectedly to GOES imager instrument blackbody calibration events.

  6. System and method for optical fiber based image acquisition suitable for use in turbine engines

    DOEpatents

    Baleine, Erwan; A V, Varun; Zombo, Paul J.; Varghese, Zubin

    2017-05-16

    A system and a method for image acquisition suitable for use in a turbine engine are disclosed. Light received from a field of view in an object plane is projected onto an image plane through an optical modulation device and is transferred through an image conduit to a sensor array. The sensor array generates a set of sampled image signals in a sensing basis based on light received from the image conduit. Finally, the sampled image signals are transformed from the sensing basis to a representation basis and a set of estimated image signals are generated therefrom. The estimated image signals are used for reconstructing an image and/or a motion-video of a region of interest within a turbine engine.

  7. Compact LWIR sensors using spatial interferometric technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bingham, Adam L.; Lucey, Paul G.; Knobbe, Edward T.

    2017-05-01

    Recent developments in reducing the cost and mass of hyperspectral sensors have enabled more widespread use for short range compositional imaging applications. HSI in the long wave infrared (LWIR) is of interest because it is sensitive to spectral phenomena not accessible at other wavelengths, and because of its inherent thermal imaging capability. At Spectrum Photonics we have pursued compact LWIR hyperspectral sensors using both microbolometer arrays and compact cryogenic detector cameras. Our microbolometer-based systems are principally aimed at short standoff applications; they currently weigh 10-15 lbs, measure approximately 20x20x10 cm, offer sensitivity in the 1-2 microflick range, and have imaging times on the order of 30 seconds. Our systems that employ cryogenic arrays are aimed at medium standoff ranges such as nadir-looking missions from UAVs. Recent work with cooled sensors has focused on Strained Layer Superlattice (SLS) technology, as these detector arrays are undergoing rapid improvements and have some advantages compared to HgCdTe detectors in terms of calibration stability. These sensors include full on-board processing and sensor stabilization, so they are somewhat larger than the microbolometer systems, but could be adapted to much more compact form factors. We will review our recent progress in both these application areas.

  8. Active Sensor for Microwave Tissue Imaging with Bias-Switched Arrays.

    PubMed

    Foroutan, Farzad; Nikolova, Natalia K

    2018-05-06

    A prototype of a bias-switched active sensor was developed and measured to establish the achievable dynamic range in a new generation of active arrays for microwave tissue imaging. The sensor integrates a printed slot antenna, a low-noise amplifier (LNA) and an active mixer in a single unit, which is sufficiently small to enable inter-sensor separation distance as small as 12 mm. The sensor’s input covers the bandwidth from 3 GHz to 7.5 GHz. Its output intermediate frequency (IF) is 30 MHz. The sensor is controlled by a simple bias-switching circuit, which switches ON and OFF the bias of the LNA and the mixer simultaneously. It was demonstrated experimentally that the dynamic range of the sensor, as determined by its ON and OFF states, is 109 dB and 118 dB at resolution bandwidths of 1 kHz and 100 Hz, respectively.

  9. Polymer-carbon black composite sensors in an electronic nose for air-quality monitoring

    NASA Technical Reports Server (NTRS)

    Ryan, M. A.; Shevade, A. V.; Zhou, H.; Homer, M. L.

    2004-01-01

    An electronic nose that uses an array of 32 polymer-carbon black composite sensors has been developed, trained, and tested. By selecting a variety of chemical functionalities in the polymers used to make sensors, it is possible to construct an array capable of identifying and quantifying a broad range of target compounds, such as alcohols and aromatics, and distinguishing isomers and enantiomers (mirror-image isomers). A model of the interaction between target molecules and the polymer-carbon black composite sensors is under development to aid in selecting the array members and to enable identification of compounds with responses not stored in the analysis library.

  10. High-Speed Binary-Output Image Sensor

    NASA Technical Reports Server (NTRS)

    Fossum, Eric; Panicacci, Roger A.; Kemeny, Sabrina E.; Jones, Peter D.

    1996-01-01

    Photodetector outputs digitized by circuitry on same integrated-circuit chip. Developmental special-purpose binary-output image sensor designed to capture up to 1,000 images per second, with resolution greater than 10^6 pixels per image. Lower-resolution but higher-frame-rate prototype of sensor contains 128 x 128 array of photodiodes on complementary metal oxide/semiconductor (CMOS) integrated-circuit chip. In application for which it is being developed, sensor used to examine helicopter oil to determine whether amount of metal and sand in oil sufficient to warrant replacement.

  11. Geiger-Mode Avalanche Photodiode Arrays Integrated to All-Digital CMOS Circuits.

    PubMed

    Aull, Brian

    2016-04-08

    This article reviews MIT Lincoln Laboratory's work over the past 20 years to develop photon-sensitive image sensors based on arrays of silicon Geiger-mode avalanche photodiodes. Integration of these detectors with all-digital CMOS readout circuits enables exquisitely sensitive solid-state imagers for lidar, wavefront sensing, and passive imaging.

  12. The analysis and rationale behind the upgrading of existing standard definition thermal imagers to high definition

    NASA Astrophysics Data System (ADS)

    Goss, Tristan M.

    2016-05-01

    With 640x512 pixel format IR detector arrays having been on the market for the past decade, Standard Definition (SD) thermal imaging sensors have been developed and deployed across the world. Now, with 1280x1024 pixel format IR detector arrays becoming readily available, designers of thermal imager systems face new challenges as pixel sizes shrink and the demand and applications for High Definition (HD) thermal imaging sensors increase. In many instances, upgrading an existing under-sampled SD thermal imaging sensor into a more optimally sampled or oversampled HD thermal imaging sensor is more cost effective and faster to market than designing and developing a completely new sensor. This paper presents the analysis and rationale behind the selection of the best suited HD pixel format MWIR detector for the upgrade of an existing SD thermal imaging sensor to a higher performing HD thermal imaging sensor. Several commercially available and "soon to be" commercially available HD small pixel IR detector options are included in the analysis and considered for this upgrade. The impact the proposed detectors have on the sensor's overall sensitivity, noise and resolution is analyzed, and the improved range performance is predicted. Furthermore, with reduced dark currents due to the smaller pixel sizes, the candidate HD MWIR detectors can be operated at higher temperatures than their SD predecessors. Therefore, as an additional constraint and design goal, the feasibility of achieving the upgraded performance without any increase in the size, weight and power consumption of the thermal imager is discussed herein.

  13. Performance Analysis for Lateral-Line-Inspired Sensor Arrays

    DTIC Science & Technology

    2011-06-01

    Excerpted fragments: the lateral line is found to affect numerous aspects of behavior, including maneuvering in complex fluid environments, schooling, prey tracking, and environment mapping. Figure captions describe maps of the cost function for a reflected vortex model with increasing array length but constant sensor spacing, where the x in each image denotes the true location of the vortex and the black lines correspond to level sets.

  14. Chemistry integrated circuit: chemical system on a complementary metal oxide semiconductor integrated circuit.

    PubMed

    Nakazato, Kazuo

    2014-03-28

    By integrating chemical reactions on a large-scale integration (LSI) chip, new types of device can be created. For biomedical applications, monolithically integrated sensor arrays for potentiometric, amperometric and impedimetric sensing of biomolecules have been developed. The potentiometric sensor array detects pH and redox reaction as a statistical distribution of fluctuations in time and space. For the amperometric sensor array, a microelectrode structure for measuring multiple currents at high speed has been proposed. The impedimetric sensor array is designed to measure impedance up to 10 MHz. The multimodal sensor array will enable synthetic analysis and make it possible to standardize biosensor chips. Another approach is to create new functional devices by integrating molecular systems with LSI chips, for example image sensors that incorporate biological materials with a sensor array. The quantum yield of the photoelectric conversion of photosynthesis is 100%, which is extremely difficult to achieve by artificial means. In a recently developed process, a molecular wire is plugged directly into a biological photosynthetic system to efficiently conduct electrons to a gold electrode. A single photon can be detected at room temperature using such a system combined with a molecular single-electron transistor.

  15. Development of a conformable electronic skin based on silver nanowires and PDMS

    NASA Astrophysics Data System (ADS)

    Wang, Haopeng

    2017-06-01

    This paper presents the design and testing of a flexible and stretchable pressure sensor array that can cover a 3D surface to measure contact pressure. The sensor array is laminated into a thin film 1 mm in thickness and can easily be stretched without losing its functionality. The fabricated sensor array contains 8×8 sensing elements, each of which can measure pressures up to 180 kPa. An improved sandwich structure is used to build the sensor array. The upper and lower layers are PDMS thin films embedded with conductor strips formed by PDMS-based silver nanowire (AgNW) networks covered with a nano-scale thin metal film. The middle layer is a porous PDMS film embedded with circular conductive rubber. The sensor array can detect contact pressure at stretch levels of up to 30%. In this paper, the performance of the pressure sensor array is systematically studied. With the corresponding scanning power-supply circuit and data acquisition system, it is demonstrated that the system can successfully capture the tactile images induced by objects of different shapes. Such a sensor system could be applied on complex surfaces in robots or medical devices for contact pressure detection and feedback.
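
    As a rough illustration of the row-by-row acquisition such a scanning power-supply circuit performs, the sketch below builds one tactile frame from an 8x8 array; read_element is a hypothetical driver stub, since the paper does not describe its interface at this level of detail:

        import numpy as np

        ROWS, COLS = 8, 8        # sensing elements in the fabricated array
        P_MAX = 180e3            # full-scale pressure per element, Pa (from the paper)

        def read_element(row, col):
            # Hypothetical driver stub: select one row of the scanning power-supply
            # circuit, sample one column, and return a normalized reading in [0, 1].
            return np.random.rand()

        def scan_frame():
            # Build one tactile image by visiting every row/column intersection.
            frame = np.zeros((ROWS, COLS))
            for r in range(ROWS):
                for c in range(COLS):
                    frame[r, c] = read_element(r, c) * P_MAX   # reading mapped to Pa
            return frame

        tactile_image = scan_frame()
        print(tactile_image.shape, tactile_image.max() <= P_MAX)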

  16. CMOS Active-Pixel Image Sensor With Simple Floating Gates

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.; Nakamura, Junichi; Kemeny, Sabrina E.

    1996-01-01

    Experimental complementary metal-oxide/semiconductor (CMOS) active-pixel image sensor integrated circuit features simple floating-gate structure, with metal-oxide/semiconductor field-effect transistor (MOSFET) as active circuit element in each pixel. Provides flexibility of readout modes, no kTC noise, and relatively simple structure suitable for high-density arrays. Features desirable for "smart sensor" applications.

  17. Development of Ultra-Low Noise, High Performance III-V Quantum Well Infrared Photodetectors (QWIPs) for Focal Plane Array Staring Image Sensor Systems

    DTIC Science & Technology

    1993-11-01

    Excerpted fragments: experimental studies of dark current, photocurrent, noise figures, optical absorption, spectral responsivity and detectivity for different types of QWIPs are reported; ... where k is the Boltzmann constant and T is the temperature. The noise in the QWIPs is mainly due to the random fluctuations of thermally excited carriers.

  18. High performance thermal imaging for the 21st century

    NASA Astrophysics Data System (ADS)

    Clarke, David J.; Knowles, Peter

    2003-01-01

    In recent years IR detector technology has developed from early short linear arrays. Such devices require high performance signal processing electronics to meet today's thermal imaging requirements for military and para-military applications. This paper describes BAE SYSTEMS Avionics Group's Sensor Integrated Modular Architecture thermal imager, which has been developed alongside the group's Eagle 640×512 arrays to provide high performance imaging capability. The electronics architecture also supports High Definition TV format 2D arrays for future growth capability.

  19. Broadband image sensor array based on graphene-CMOS integration

    NASA Astrophysics Data System (ADS)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into the next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  20. Dual-mode photosensitive arrays based on the integration of liquid crystal microlenses and CMOS sensors for obtaining the intensity images and wavefronts of objects.

    PubMed

    Tong, Qing; Lei, Yu; Xin, Zhaowei; Zhang, Xinyu; Sang, Hongshi; Xie, Changsheng

    2016-02-08

    In this paper, we present a dual-mode photosensitive array (DMPA) constructed by hybrid integration of an electrically driven liquid crystal microlens array (LCMLA) and a CMOS sensor array, which can be used to measure both the conventional intensity images and the corresponding wavefronts of objects. We utilize liquid crystal materials to shape the microlens array with an electrically tunable focal length. By switching the voltage signal on and off, the wavefronts and the intensity images can be acquired sequentially through the DMPA. We use white light to obtain the object's wavefronts to avoid losing important wavefront information. We separate the white light wavefronts with a large number of spectral components and then experimentally compare them with single spectral wavefronts of typical red, green and blue lasers, respectively. Then we mix the red, green and blue wavefronts into a composite wavefront containing more optical information about the object.

  1. A Multi-Modality CMOS Sensor Array for Cell-Based Assay and Drug Screening.

    PubMed

    Chi, Taiyun; Park, Jong Seok; Butts, Jessica C; Hookway, Tracy A; Su, Amy; Zhu, Chengjie; Styczynski, Mark P; McDevitt, Todd C; Wang, Hua

    2015-12-01

    In this paper, we present a fully integrated multi-modality CMOS cellular sensor array with four sensing modalities to characterize different cell physiological responses, including extracellular voltage recording, cellular impedance mapping, optical detection with shadow imaging and bioluminescence sensing, and thermal monitoring. The sensor array consists of nine parallel pixel groups and nine corresponding signal conditioning blocks. Each pixel group comprises one temperature sensor and 16 tri-modality sensor pixels, while each tri-modality sensor pixel can be independently configured for extracellular voltage recording, cellular impedance measurement (voltage excitation/current sensing), and optical detection. This sensor array supports multi-modality cellular sensing at the pixel level, which enables holistic cell characterization and joint-modality physiological monitoring on the same cellular sample with a pixel resolution of 80 μm × 100 μm. Comprehensive biological experiments with different living cell samples demonstrate the functionality and benefit of the proposed multi-modality sensing in cell-based assay and drug screening.

  2. A survey of current solid state star tracker technology

    NASA Astrophysics Data System (ADS)

    Armstrong, R. W.; Staley, D. A.

    1985-12-01

    This paper is a survey of the current state of the art in design of star trackers for spacecraft attitude determination systems. Specific areas discussed are sensor technology, including the current state-of-the-art solid state sensors and techniques of mounting and cooling the sensor, analog image preprocessing electronics performance, and digital processing hardware and software. Three examples of area array solid state star tracker development are presented - ASTROS, developed by the Jet Propulsion Laboratory, the Retroreflector Field Tracker (RFT) by Ball Aerospace, and TRW's MADAN. Finally, a discussion of solid state line arrays explores the possibilities for one-dimensional imagers which offer simplified scan control electronics.

  3. A 4MP high-dynamic-range, low-noise CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Ma, Cheng; Liu, Yang; Li, Jing; Zhou, Quan; Chang, Yuchun; Wang, Xinyang

    2015-03-01

    In this paper we present a 4 Megapixel high dynamic range, low dark noise and low dark current CMOS image sensor, which is ideal for high-end scientific and surveillance applications. The pixel design is based on a 4-T PPD structure. During the readout of the pixel array, signals are first amplified and then fed to a low-power column-parallel ADC array presented previously in [1]. Measurement results show that the sensor achieves a dynamic range of 96 dB and a dark noise of 1.47 e- at a speed of 24 fps. The dark current is 0.15 e-/pixel/s at -20 °C.
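
    The quoted figures are related by the usual definition of dynamic range, DR(dB) = 20 log10(full-well capacity / dark noise); a minimal sketch, in which the full-well value is inferred from the quoted numbers rather than stated in the paper:

        import math

        read_noise_e = 1.47     # quoted dark (read) noise, electrons rms
        dr_db = 96.0            # quoted dynamic range, dB

        # DR(dB) = 20 * log10(full-well / read noise), so the implied full-well is:
        full_well_e = read_noise_e * 10 ** (dr_db / 20)
        print(f"implied full-well capacity: {full_well_e:.0f} e-")   # about 93,000 e-

        # Sanity check in the other direction
        print(f"dynamic range: {20 * math.log10(full_well_e / read_noise_e):.1f} dB")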

  4. High-speed particle tracking in microscopy using SPAD image sensors

    NASA Astrophysics Data System (ADS)

    Gyongy, Istvan; Davies, Amy; Miguelez Crespo, Allende; Green, Andrew; Dutton, Neale A. W.; Duncan, Rory R.; Rickman, Colin; Henderson, Robert K.; Dalgarno, Paul A.

    2018-02-01

    Single photon avalanche diodes (SPADs) are used in a wide range of applications, from fluorescence lifetime imaging microscopy (FLIM) to time-of-flight (ToF) 3D imaging. SPAD arrays are becoming increasingly established, combining the unique properties of SPADs with widefield camera configurations. Traditionally, the photosensitive area (fill factor) of SPAD arrays has been limited by the in-pixel digital electronics. However, recent designs have demonstrated that by replacing the complex digital pixel logic with simple binary pixels and external frame summation, the fill factor can be increased considerably. A significant advantage of such binary SPAD arrays is the high frame rates offered by the sensors (>100kFPS), which opens up new possibilities for capturing ultra-fast temporal dynamics in, for example, life science cellular imaging. In this work we consider the use of novel binary SPAD arrays in high-speed particle tracking in microscopy. We demonstrate the tracking of fluorescent microspheres undergoing Brownian motion, and in intra-cellular vesicle dynamics, at high frame rates. We thereby show how binary SPAD arrays can offer an important advance in live cell imaging in such fields as intercellular communication, cell trafficking and cell signaling.
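
    A minimal sketch (not the authors' pipeline) of the two steps the abstract describes for binary SPAD arrays, namely external summation of 1-bit frames followed by localization of a fluorescent particle, here by a simple intensity-weighted centroid on synthetic data:

        import numpy as np

        def sum_binary_frames(frames):
            """frames: (n_frames, H, W) array of 0/1 photon detections."""
            return frames.sum(axis=0).astype(float)

        def centroid(image):
            """Intensity-weighted centroid (row, col) of the summed image."""
            rows, cols = np.indices(image.shape)
            total = image.sum()
            return (rows * image).sum() / total, (cols * image).sum() / total

        # Synthetic example: a dim particle near (12, 20) on a 32 x 32 binary sensor.
        rng = np.random.default_rng(0)
        H = W = 32
        n_frames = 500                          # external frame-summation depth
        prob = 0.01 * np.ones((H, W))           # background detection probability
        prob[10:15, 18:23] += 0.15              # particle raises the local count rate
        frames = rng.random((n_frames, H, W)) < prob

        summed = sum_binary_frames(frames)
        print("estimated particle position:", centroid(summed))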

  5. Wavelength-Scanning SPR Imaging Sensors Based on an Acousto-Optic Tunable Filter and a White Light Laser

    PubMed Central

    Zeng, Youjun; Wang, Lei; Wu, Shu-Yuen; He, Jianan; Qu, Junle; Li, Xuejin; Ho, Ho-Pui; Gu, Dayong; Gao, Bruce Zhi; Shao, Yonghong

    2017-01-01

    A fast surface plasmon resonance (SPR) imaging biosensor system based on wavelength interrogation using an acousto-optic tunable filter (AOTF) and a white light laser is presented. The system combines the merits of a wide-dynamic detection range and high sensitivity offered by the spectral approach with multiplexed high-throughput data collection and a two-dimensional (2D) biosensor array. The key feature is the use of AOTF to realize wavelength scan from a white laser source and thus to achieve fast tracking of the SPR dip movement caused by target molecules binding to the sensor surface. Experimental results show that the system is capable of completing a SPR dip measurement within 0.35 s. To the best of our knowledge, this is the fastest time ever reported in the literature for imaging spectral interrogation. Based on a spectral window with a width of approximately 100 nm, a dynamic detection range and resolution of 4.63 × 10−2 refractive index unit (RIU) and 1.27 × 10−6 RIU achieved in a 2D-array sensor is reported here. The spectral SPR imaging sensor scheme has the capability of performing fast high-throughput detection of biomolecular interactions from 2D sensor arrays. The design has no mechanical moving parts, thus making the scheme completely solid-state. PMID:28067766

  6. Solid-State Multi-Sensor Array System for Real Time Imaging of Magnetic Fields and Ferrous Objects

    NASA Astrophysics Data System (ADS)

    Benitez, D.; Gaydecki, P.; Quek, S.; Torres, V.

    2008-02-01

    In this paper the development of a solid-state sensor based system for real-time imaging of magnetic fields and ferrous objects is described. The system comprises 1089 magneto-inductive solid-state sensors arranged in a 2D array matrix of 33×33 rows and columns, equally spaced to cover an approximate area of 300 by 300 mm. The sensor array is located within a large current-carrying coil. Data are sampled from the sensors by several DSP controlling units and finally streamed to a host computer via a USB 2.0 interface, and the image is generated and displayed at a rate of 20 frames per minute. The development of the instrumentation has been complemented by extensive numerical modeling of field distribution patterns using boundary element methods. The system was originally intended for deployment in the non-destructive evaluation (NDE) of reinforced concrete. Nevertheless, the system is not only capable of producing real-time, live video images of a metal target embedded within an opaque medium, it also allows the real-time visualization and determination of the magnetic field distribution emitted by either permanent magnets or current-carrying geometries. Although this system was initially developed for the NDE arena, it could also have potential applications in many other fields, including medicine, security, manufacturing, quality assurance and design involving magnetic fields.

  7. Combined imaging and chemical sensing using a single optical imaging fiber.

    PubMed

    Bronk, K S; Michael, K L; Pantano, P; Walt, D R

    1995-09-01

    Despite many innovations and developments in the field of fiber-optic chemical sensors, optical fibers have not been employed to both view a sample and concurrently detect an analyte of interest. While chemical sensors employing a single optical fiber or a noncoherent fiberoptic bundle have been applied to a wide variety of analytical determinations, they cannot be used for imaging. Similarly, coherent imaging fibers have been employed only for their originally intended purpose, image transmission. We herein report a new technique for viewing a sample and measuring surface chemical concentrations that employs a coherent imaging fiber. The method is based on the deposition of a thin, analyte-sensitive polymer layer on the distal surface of a 350-microns-diameter imaging fiber. We present results from a pH sensor array and an acetylcholine biosensor array, each of which contains approximately 6000 optical sensors. The acetylcholine biosensor has a detection limit of 35 microM and a fast (< 1 s) response time. In association with an epifluorescence microscope and a charge-coupled device, these modified imaging fibers can display visual information of a remote sample with 4-microns spatial resolution, allowing for alternating acquisition of both chemical analysis and visual histology.

  8. The application of Fresnel zone plate based projection in optofluidic microscopy.

    PubMed

    Wu, Jigang; Cui, Xiquan; Lee, Lap Man; Yang, Changhuei

    2008-09-29

    Optofluidic microscopy (OFM) is a novel technique for low-cost, high-resolution on-chip microscopy imaging. In this paper we report the use of the Fresnel zone plate (FZP) based projection in OFM as a cost-effective and compact means for projecting the transmission through an OFM's aperture array onto a sensor grid. We demonstrate this approach by employing a FZP (diameter = 255 microm, focal length = 800 microm) that has been patterned onto a glass slide to project the transmission from an array of apertures (diameter = 1 microm, separation = 10 microm) onto a CMOS sensor. We are able to resolve the contributions from 44 apertures on the sensor under the illumination from a HeNe laser (wavelength = 633 nm). The imaging quality of the FZP determines the effective field-of-view (related to the number of resolvable transmissions from apertures) but not the image resolution of such an OFM system--a key distinction from conventional microscope systems. We demonstrate the capability of the integrated system by flowing the protist Euglena gracilis across the aperture array microfluidically and performing OFM imaging of the samples.
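
    A minimal sketch using the standard paraxial Fresnel zone plate relation r_n^2 = n*lambda*f with the diameter, focal length and HeNe wavelength quoted above; the innermost-zone radius and zone count are inferred here for illustration and are not stated in the record:

        import math

        lam = 633e-9          # HeNe illumination wavelength, m
        f = 800e-6            # FZP focal length from the record, m
        diameter = 255e-6     # FZP diameter from the record, m

        r_outer = diameter / 2
        r1 = math.sqrt(lam * f)                    # radius of the innermost zone
        n_zones = r_outer ** 2 / (lam * f)         # zones contained in the aperture

        print(f"innermost zone radius: {r1 * 1e6:.1f} um")         # about 22.5 um
        print(f"zones inside the 255 um aperture: {n_zones:.0f}")  # about 32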

  9. A high-speed trapezoid image sensor design for continuous traffic monitoring at signalized intersection approaches.

    DOT National Transportation Integrated Search

    2014-10-01

    The goal of this project is to monitor traffic flow continuously with an innovative camera system composed of a custom designed image sensor integrated circuit (IC) containing a trapezoid pixel array and a camera system that is capable of intelligent...

  10. Color Restoration of RGBN Multispectral Filter Array Sensor Images Based on Spectral Decomposition.

    PubMed

    Park, Chulhee; Kang, Moon Gi

    2016-05-18

    A multispectral filter array (MSFA) image sensor with red, green, blue and near-infrared (NIR) filters is useful for various imaging applications with the advantages that it obtains color information and NIR information simultaneously. Because the MSFA image sensor needs to acquire invisible band information, it is necessary to remove the IR cut-off filter (IRCF). However, without the IRCF, the color of the image is desaturated by the interference of the additional NIR component in each RGB color channel. To overcome color degradation, a signal processing approach is required to restore natural color by removing the unwanted NIR contribution to the RGB color channels while the additional NIR information remains in the N channel. Thus, in this paper, we propose a color restoration method for an imaging system based on the MSFA image sensor with RGBN filters. To remove the unnecessary NIR component in each RGB color channel, spectral estimation and spectral decomposition are performed based on the spectral characteristics of the MSFA sensor. The proposed color restoration method estimates the spectral intensity in the NIR band and recovers hue and color saturation by decomposing the visible band component and the NIR band component in each RGB color channel. The experimental results show that the proposed method effectively restores natural color and minimizes angular errors.
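
    A minimal sketch of the underlying idea, not the paper's full spectral estimation and decomposition pipeline: each demosaiced RGB channel is modeled as a visible component plus a channel-dependent fraction of the NIR signal measured by the N channel, and that fraction is subtracted; the mixing coefficients below are hypothetical:

        import numpy as np

        def restore_rgb(rgbn, nir_mix=(0.35, 0.30, 0.25)):
            """rgbn: (H, W, 4) demosaiced float image with channels R, G, B, N."""
            rgb = rgbn[..., :3]
            nir = rgbn[..., 3:4]
            mix = np.asarray(nir_mix).reshape(1, 1, 3)      # NIR leakage into R, G, B
            return np.clip(rgb - mix * nir, 0.0, None)      # visible-only estimate

        # Toy usage
        rgbn = np.random.rand(4, 4, 4)
        print(restore_rgb(rgbn).shape)      # (4, 4, 3)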

  11. Color Restoration of RGBN Multispectral Filter Array Sensor Images Based on Spectral Decomposition

    PubMed Central

    Park, Chulhee; Kang, Moon Gi

    2016-01-01

    A multispectral filter array (MSFA) image sensor with red, green, blue and near-infrared (NIR) filters is useful for various imaging applications with the advantages that it obtains color information and NIR information simultaneously. Because the MSFA image sensor needs to acquire invisible band information, it is necessary to remove the IR cut-off filter (IRCF). However, without the IRCF, the color of the image is desaturated by the interference of the additional NIR component in each RGB color channel. To overcome color degradation, a signal processing approach is required to restore natural color by removing the unwanted NIR contribution to the RGB color channels while the additional NIR information remains in the N channel. Thus, in this paper, we propose a color restoration method for an imaging system based on the MSFA image sensor with RGBN filters. To remove the unnecessary NIR component in each RGB color channel, spectral estimation and spectral decomposition are performed based on the spectral characteristics of the MSFA sensor. The proposed color restoration method estimates the spectral intensity in the NIR band and recovers hue and color saturation by decomposing the visible band component and the NIR band component in each RGB color channel. The experimental results show that the proposed method effectively restores natural color and minimizes angular errors. PMID:27213381

  12. Integrated infrared and visible image sensors

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R. (Inventor); Pain, Bedabrata (Inventor)

    2000-01-01

    Semiconductor imaging devices integrating an array of visible detectors and another array of infrared detectors into a single module to simultaneously detect both the visible and infrared radiation of an input image. The visible detectors and the infrared detectors may be formed either on two separate substrates or on the same substrate by interleaving visible and infrared detectors.

  13. Multiple-Event, Single-Photon Counting Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for registering the photon count. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be too short, this leads to very low dynamic range and makes the sensor useful only for very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency will substantially ruin any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register more than one million photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array of the invention is capable of performing single-photon counting from ultra-low light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  14. High-Speed Monitoring of Multiple Grid-Connected Photovoltaic Array Configurations and Supplementary Weather Station.

    PubMed

    Boyd, Matthew T

    2017-06-01

    Three grid-connected monocrystalline silicon photovoltaic arrays have been instrumented with research-grade sensors on the Gaithersburg, MD campus of the National Institute of Standards and Technology (NIST). These arrays range from 73 kW to 271 kW and have different tilts, orientations, and configurations. Irradiance, temperature, wind, and electrical measurements at the arrays are recorded, and images are taken of the arrays to monitor shading and capture any anomalies. A weather station has also been constructed that includes research-grade instrumentation to measure all standard meteorological quantities plus additional solar irradiance spectral bands, full spectrum curves, and directional components using multiple irradiance sensor technologies. Reference photovoltaic (PV) modules are also monitored to provide comprehensive baseline measurements for the PV arrays. Images of the whole sky are captured, along with images of the instrumentation and reference modules to document any obstructions or anomalies. Nearly all measurements at the arrays and weather station are sampled and saved every 1 s, with monitoring having started on Aug. 1, 2014. This report describes the instrumentation approach used to monitor the performance of these photovoltaic systems, measure the meteorological quantities, and acquire the images for use in PV performance and weather monitoring and computer model validation.

  15. High-Speed Monitoring of Multiple Grid-Connected Photovoltaic Array Configurations and Supplementary Weather Station

    PubMed Central

    Boyd, Matthew T.

    2017-01-01

    Three grid-connected monocrystalline silicon photovoltaic arrays have been instrumented with research-grade sensors on the Gaithersburg, MD campus of the National Institute of Standards and Technology (NIST). These arrays range from 73 kW to 271 kW and have different tilts, orientations, and configurations. Irradiance, temperature, wind, and electrical measurements at the arrays are recorded, and images are taken of the arrays to monitor shading and capture any anomalies. A weather station has also been constructed that includes research-grade instrumentation to measure all standard meteorological quantities plus additional solar irradiance spectral bands, full spectrum curves, and directional components using multiple irradiance sensor technologies. Reference photovoltaic (PV) modules are also monitored to provide comprehensive baseline measurements for the PV arrays. Images of the whole sky are captured, along with images of the instrumentation and reference modules to document any obstructions or anomalies. Nearly all measurements at the arrays and weather station are sampled and saved every 1 s, with monitoring having started on Aug. 1, 2014. This report describes the instrumentation approach used to monitor the performance of these photovoltaic systems, measure the meteorological quantities, and acquire the images for use in PV performance and weather monitoring and computer model validation. PMID:28670044

  16. Implementation issues of the nearfield equivalent source imaging microphone array

    NASA Astrophysics Data System (ADS)

    Bai, Mingsian R.; Lin, Jia-Hong; Tseng, Chih-Wen

    2011-01-01

    This paper revisits a nearfield microphone array technique termed nearfield equivalent source imaging (NESI) proposed previously. In particular, various issues concerning the implementation of the NESI algorithm are examined. The NESI can be implemented in both the time domain and the frequency domain. Acoustical variables including sound pressure, particle velocity, active intensity and sound power are calculated by using multichannel inverse filters. Issues concerning sensor deployment are also investigated for the nearfield array. The uniform array outperformed a random array previously optimized for far-field imaging, which contradicts the conventional wisdom in far-field arrays. For applications in which only a patch array with scarce sensors is available, a virtual microphone approach is employed to ameliorate edge effects using extrapolation and to improve imaging resolution using interpolation. To enhance the processing efficiency of the time-domain NESI, an eigensystem realization algorithm (ERA) is developed. Several filtering methods are compared in terms of computational complexity. Significant saving on computations can be achieved using ERA and the frequency-domain NESI, as compared to the traditional method. The NESI technique was also experimentally validated using practical sources including a 125 cc scooter and a wooden box model with a loudspeaker fitted inside. The NESI technique proved effective in identifying broadband and non-stationary sources produced by the sources.
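
    A minimal frequency-domain sketch of an equivalent-source inversion with Tikhonov regularization, used here as a generic stand-in for the paper's multichannel inverse filtering rather than its exact algorithm; the geometry, frequency and regularization constant are illustrative assumptions:

        import numpy as np

        def greens_matrix(mics, sources, k):
            """Free-field monopole Green's functions, shape (n_mics, n_sources)."""
            d = np.linalg.norm(mics[:, None, :] - sources[None, :, :], axis=-1)
            return np.exp(-1j * k * d) / (4 * np.pi * d)

        def solve_sources(pressures, G, reg=1e-3):
            """Regularized least squares: q = (G^H G + reg*I)^-1 G^H p."""
            GhG = G.conj().T @ G
            rhs = G.conj().T @ pressures
            return np.linalg.solve(GhG + reg * np.eye(G.shape[1]), rhs)

        # Toy usage: 16 microphones 5 cm above a 3 x 3 grid of candidate source points.
        freq, c0 = 1000.0, 343.0
        k = 2 * np.pi * freq / c0
        mics = np.stack(np.meshgrid(np.linspace(0, .3, 4), np.linspace(0, .3, 4)), -1)
        mics = np.concatenate([mics.reshape(-1, 2), np.full((16, 1), 0.05)], axis=1)
        srcs = np.stack(np.meshgrid(np.linspace(0, .3, 3), np.linspace(0, .3, 3)), -1)
        srcs = np.concatenate([srcs.reshape(-1, 2), np.zeros((9, 1))], axis=1)

        G = greens_matrix(mics, srcs, k)
        p = G[:, 4]                          # synthetic field from the centre source
        q = solve_sources(p, G)
        print("strongest reconstructed source index:", np.argmax(np.abs(q)))  # expect 4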

  17. Hybrid graphene-copper UWB array sensor for brain tumor detection via scattering parameters in microwave detection system

    NASA Astrophysics Data System (ADS)

    Jamlos, Mohd Aminudin; Ismail, Abdul Hafiizh; Jamlos, Mohd Faizal; Narbudowicz, Adam

    2017-01-01

    A hybrid graphene-copper ultra-wideband array sensor applied to a microwave imaging technique is successfully used to detect and visualize a tumor inside the human brain. The sensor is made of graphene-coated film for the patch and copper for both the transmission line and the parasitic element. The hybrid sensor performs better than a fully copper sensor: it records a wider bandwidth of 2.0-10.1 GHz compared with the fully copper sensor's 2.5-10.1 GHz, and a higher gain of 3.8-8.5 dB against the fully copper sensor's 2.6-6.7 dB. Both sensors record excellent total efficiency, averaging 97 and 94%, respectively. The sensor both transmits the interrogating signal and receives the backscattered signal from a stratified human head model when detecting the tumor. The difference between the scattering parameters recorded from the head model with and without the tumor is used as the main data to be further processed by a confocal microwave imaging algorithm to generate an image. MATLAB software is utilized to analyze the S-parameter signals obtained from measurement. Tumor presence is indicated by lower S-parameter values compared to the higher values recorded in its absence.
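
    A minimal delay-and-sum sketch of confocal reconstruction from differential (tumor-present minus tumor-absent) time-domain signals; this is a common textbook formulation and not necessarily the exact algorithm of the paper, and the antenna geometry, propagation speed and pulse shape are illustrative assumptions:

        import numpy as np

        def confocal_image(signals, t, antennas, grid, v):
            """
            signals : (n_ant, n_t) differential received signals
            t       : (n_t,) time axis, s
            antennas: (n_ant, 2) antenna positions, m
            grid    : (n_pix, 2) candidate scatterer positions, m
            v       : propagation speed in the medium, m/s
            """
            image = np.zeros(grid.shape[0])
            for i, r in enumerate(grid):
                acc = 0.0
                for a, ant in enumerate(antennas):
                    delay = 2 * np.linalg.norm(r - ant) / v     # round-trip delay
                    acc += np.interp(delay, t, signals[a])      # sample signal at delay
                image[i] = acc ** 2                             # coherent energy
            return image

        def pulse(tau):
            return np.exp(-(tau / 50e-12) ** 2)                 # Gaussian echo envelope

        # Toy usage: one point scatterer, 8 antennas on a 10 cm circle.
        v = 2.0e8                                     # assumed speed in head tissue, m/s
        t = np.arange(0, 4e-9, 5e-12)
        ang = np.linspace(0, 2 * np.pi, 8, endpoint=False)
        antennas = 0.1 * np.stack([np.cos(ang), np.sin(ang)], axis=1)
        scatterer = np.array([0.02, 0.01])
        signals = np.stack([pulse(t - 2 * np.linalg.norm(scatterer - a) / v)
                            for a in antennas])
        xs = np.linspace(-0.05, 0.05, 21)
        grid = np.stack(np.meshgrid(xs, xs), -1).reshape(-1, 2)
        img = confocal_image(signals, t, antennas, grid, v)
        print("image peak at:", grid[np.argmax(img)])           # near (0.02, 0.01)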

  18. Multichip imager with improved optical performance near the butt region

    NASA Technical Reports Server (NTRS)

    Kinnard, Kenneth P. (Inventor); Strong, Jr., Richard T. (Inventor); Goldfarb, Samuel (Inventor); Tower, John R. (Inventor)

    1991-01-01

    A compound imager consists of two or more individual chips, each with at least one line array of sensors thereupon. Each chip has a glass support plate attached to the side from which light reaches the line arrays. The chips are butted together end-to-end to make large line arrays of sensors. Because of imperfections in cutting, the butted surfaces define a gap. Light entering the region of the gap is either lost or falls on an individual imager other than the one for which it is intended. This results in vignetting and/or crosstalk near the butted region. The gap is filled with an epoxy resin or other similar material which, when hardened, has an index of refraction near that of the glass support plate.

  19. In-situ device integration of large-area patterned organic nanowire arrays for high-performance optical sensors

    PubMed Central

    Wu, Yiming; Zhang, Xiujuan; Pan, Huanhuan; Deng, Wei; Zhang, Xiaohong; Zhang, Xiwei; Jie, Jiansheng

    2013-01-01

    Single-crystalline organic nanowires (NWs) are important building blocks for future low-cost and efficient nano-optoelectronic devices due to their extraordinary properties. However, it remains a critical challenge to achieve large-scale organic NW array assembly and device integration. Herein, we demonstrate a feasible one-step method for large-area patterned growth of cross-aligned single-crystalline organic NW arrays and their in-situ device integration for optical image sensors. The integrated image sensor circuitry contained a 10 × 10 pixel array in an area of 1.3 × 1.3 mm2, showing high spatial resolution, excellent stability and reproducibility. More importantly, 100% of the pixels successfully operated at a high response speed and relatively small pixel-to-pixel variation. The high yield and high spatial resolution of the operational pixels, along with the high integration level of the device, clearly demonstrate the great potential of the one-step organic NW array growth and device construction approach for large-scale optoelectronic device integration. PMID:24287887

  20. Design and performance of 4 x 5120-element visible and 2 x 2560-element shortwave infrared multispectral focal planes

    NASA Astrophysics Data System (ADS)

    Tower, J. R.; Cope, A. D.; Pellon, L. E.; McCarthy, B. M.; Strong, R. T.

    1986-06-01

    Two solid-state sensors for use in remote sensing instruments operating in the pushbroom mode are examined. The design and characteristics of the visible/near-infrared (VIS/NIR) device and the short-wavelength infrared (SWIR) device are described. The VIS/NIR is a CCD imager with four parallel sensor lines, each 1024 pixels long; the chip design and filter system of the VIS/NIR are studied. The performance of the VIS/NIR sensor with mask and its system performance are measured. The SWIR is a dual-band line imager consisting of palladium silicide Schottky-barrier detectors coupled to CCD multiplexers; the performance of the device is analyzed. The substrate materials and layout designs used to assemble the 4 x 5120-element VIS/NIR array and the 2 x 2560-element SWIR array are discussed, and the planarity of the butted arrays is verified using a profilometer. The optical and electrical characteristics, and the placement and butting accuracy of the arrays are evaluated. It is noted that the arrays met or exceeded their expected performance.

  1. Low Power Camera-on-a-Chip Using CMOS Active Pixel Sensor Technology

    NASA Technical Reports Server (NTRS)

    Fossum, E. R.

    1995-01-01

    A second generation image sensor technology has been developed at the NASA Jet Propulsion Laboratory as a result of the continuing need to miniaturize space science imaging instruments. Implemented using standard CMOS, the active pixel sensor (APS) technology permits the integration of the detector array with on-chip timing, control and signal chain electronics, including analog-to-digital conversion.

  2. Dual-polarized light-field imaging micro-system via a liquid-crystal microlens array for direct three-dimensional observation.

    PubMed

    Xin, Zhaowei; Wei, Dong; Xie, Xingwang; Chen, Mingce; Zhang, Xinyu; Liao, Jing; Wang, Haiwei; Xie, Changsheng

    2018-02-19

    Light-field imaging is a crucial and straightforward way of measuring and analyzing the surrounding light field. In this paper, a dual-polarized light-field imaging micro-system based on a twisted nematic liquid-crystal microlens array (TN-LCMLA) for direct three-dimensional (3D) observation is fabricated and demonstrated. The prototype camera has been constructed by integrating a TN-LCMLA with a common CMOS sensor array. By switching the working state of the TN-LCMLA, two orthogonally polarized light-field images can be remapped through the functional imaging sensors. The imaging micro-system in conjunction with the electric-optical microstructure can be used to perform polarization and light-field imaging simultaneously. Compared with conventional plenoptic cameras using a liquid-crystal microlens array, polarization-independent light-field images with high image quality can be obtained in any selected polarization state. We experimentally demonstrate characteristics including a relatively wide operating range in the manipulation of incident beams and multiple imaging modes, such as conventional two-dimensional imaging, light-field imaging, and polarization imaging. Considering the obvious features of the TN-LCMLA, such as very low power consumption, the multiple imaging modes mentioned, and simple, low-cost manufacturing, the imaging micro-system integrated with this kind of electrically driven liquid-crystal microstructure presents the potential capability of directly observing a 3D object in typical scattering media.

  3. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4 band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
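
    A minimal sketch of the de-mosaicing step described above, assuming a 2x2 repeating filter mosaic (for example RGB + NIR); each band is extracted from its mosaic positions and upsampled back to full resolution by nearest-neighbour replication, which is simpler than a production pipeline:

        import numpy as np

        def demosaic_2x2(raw):
            """raw: (H, W) sensor frame with a 2x2 repeating mosaic, H and W even.
            Returns a (4, H, W) stack, one full-resolution image per band."""
            h, w = raw.shape
            bands = []
            for dy, dx in [(0, 0), (0, 1), (1, 0), (1, 1)]:
                sub = raw[dy::2, dx::2].astype(float)                  # band samples
                up = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)   # nearest-neighbour
                bands.append(up[:h, :w])
            return np.stack(bands)

        # Toy usage
        raw = np.arange(36).reshape(6, 6)
        print(demosaic_2x2(raw).shape)      # (4, 6, 6)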

  4. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications

    PubMed Central

    Tokuda, Takashi; Noda, Toshihiko; Sasagawa, Kiyotaka; Ohta, Jun

    2010-01-01

    In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors’ architecture on the basis of the type of electric measurement or imaging functionalities. PMID:28879978

  5. A sensitive optical micro-machined ultrasound sensor (OMUS) based on a silicon photonic ring resonator on an acoustical membrane.

    PubMed

    Leinders, S M; Westerveld, W J; Pozo, J; van Neer, P L M J; Snyder, B; O'Brien, P; Urbach, H P; de Jong, N; Verweij, M D

    2015-09-22

    With the increasing use of ultrasonography, especially in medical imaging, novel fabrication techniques together with novel sensor designs are needed to meet the requirements of future applications like three-dimensional intracardiac and intravascular imaging. These applications require arrays of many small elements to selectively record the sound waves coming from a certain direction. Here we present proof of concept of an optical micro-machined ultrasound sensor (OMUS) fabricated with a semi-industrial CMOS fabrication line. The sensor is based on integrated photonics, which allows for elements with a small spatial footprint. We demonstrate that the first prototype is already capable of detecting pressures of 0.4 Pa, which matches the performance of state-of-the-art piezoelectric transducers while having a 65 times smaller spatial footprint. The sensor is compatible with MRI due to the lack of electrical wiring. Another important benefit of the use of integrated photonics is the easy interrogation of an array of elements. Hence, in future designs only two optical fibers are needed to interrogate an entire array, which minimizes the number of connections of smart catheters. The demonstrated OMUS has potential applications in medical ultrasound imaging and non-destructive testing as well as in flow sensing.

  6. Biomimetic machine vision system.

    PubMed

    Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael

    2005-01-01

    Real-time digital imaging has proven prohibitive in machine vision systems built around low-power single processors unless the scope of vision or the resolution of captured images is compromised. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. Development of a single sensor is accomplished, representing a single facet of the fly's eye. This new sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. This system "preprocesses" incoming image data, resulting in minimal data processing to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating resolution issues found in digital vision systems. In this paper, we will discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We will also discuss the process of developing an analog-based sensor that mimics the characteristics of interest in the biological vision system. This paper will conclude with a discussion of how an array of these sensors can be applied toward solving real-world machine vision issues.

  7. Characteristics of Monolithically Integrated InGaAs Active Pixel Imager Array

    NASA Technical Reports Server (NTRS)

    Kim, Q.; Cunningham, T. J.; Pain, B.; Lange, M. J.; Olsen, G. H.

    2000-01-01

    Switching and amplifying characteristics of a newly developed monolithic InGaAs Active Pixel Imager Array are presented. The sensor array is fabricated from InGaAs material epitaxially deposited on an InP substrate. It consists of an InGaAs photodiode connected to InP depletion-mode junction field effect transistors (JFETs) for low leakage, low power, and fast control of circuit signal amplifying, buffering, selection, and reset. This monolithically integrated active pixel sensor configuration eliminates the need for hybridization with a silicon multiplexer. In addition, the configuration allows the sensor to be front illuminated, making it sensitive to visible as well as near infrared signal radiation. Adapting the existing 1.55 micrometer fiber optical communication technology, this integration will provide an ideal optoelectronic platform for dual band (Visible/IR) applications near room temperature, for use in atmospheric gas sensing in space, and for target identification on Earth. In this paper, two different types of small 4 x 1 test arrays will be described. The effectiveness of the switching and amplifying circuits will be discussed in terms of leakage, operating frequency, and temperature, in preparation for the second phase demonstration of integrated, two-dimensional monolithic InGaAs active pixel sensor arrays for applications in transportable shipboard surveillance, night vision, and emission spectroscopy.

  8. Separating light absorption layer from channel in ZnO vertical nanorod arrays based photodetectors for high-performance image sensors

    NASA Astrophysics Data System (ADS)

    Ma, Yang; Wu, Congjun; Xu, Zhihao; Wang, Fei; Wang, Min

    2018-05-01

    Photoconductor arrays with both high responsivity and large ON/OFF ratios are of great importance for the application of image sensors. Herein, a ZnO vertical nanorod array based photoconductor with a light absorption layer separated from the device channel has been designed, in which the photo-generated carriers along the axial ZnO nanorods drift to the external electrodes through nanorod-nanorod junctions in the dense layer at the bottom. This design allows the photocurrent to be enhanced with unchanged dark current, by increasing the ratio between the ZnO nanorod length and the thickness of the dense layer, so that both high responsivity and large ON/OFF ratios are achieved. As a result, the as-fabricated devices possess a high responsivity of 1.3 × 10^5 A/W, a high ON/OFF ratio of 790, a high detectivity of 1.3 × 10^13 Jones, and a low detectable light intensity of 1 μW/cm2. More importantly, the developed approach enables the integration of ZnO vertical nanorod array based photodetectors as image sensors with uniform device-to-device performance.

  9. Ultrasonic fingerprint sensor using a piezoelectric micromachined ultrasonic transducer array integrated with complementary metal oxide semiconductor electronics

    NASA Astrophysics Data System (ADS)

    Lu, Y.; Tang, H.; Fung, S.; Wang, Q.; Tsai, J. M.; Daneman, M.; Boser, B. E.; Horsley, D. A.

    2015-06-01

    This paper presents an ultrasonic fingerprint sensor based on a 24 × 8 array of 22 MHz piezoelectric micromachined ultrasonic transducers (PMUTs) with 100 μm pitch, fully integrated with 180 nm complementary metal oxide semiconductor (CMOS) circuitry through eutectic wafer bonding. Each PMUT is directly bonded to a dedicated CMOS receive amplifier, minimizing electrical parasitics and eliminating the need for through-silicon vias. The array frequency response and vibration mode-shape were characterized using laser Doppler vibrometry and verified via finite element method simulation. The array's acoustic output was measured using a hydrophone to be ~14 kPa with a 28 V input, in reasonable agreement with prediction from analytical calculation. Pulse-echo imaging of a 1D steel grating is demonstrated using electronic scanning of a 20 × 8 sub-array, resulting in 300 mV maximum received amplitude and 5:1 contrast ratio. Because the small size of this array limits the maximum image size, mechanical scanning was used to image a 2D polydimethylsiloxane fingerprint phantom (10 mm × 8 mm) at a 1.2 mm distance from the array.

  10. Ultrasonic imaging of material flaws exploiting multipath information

    NASA Astrophysics Data System (ADS)

    Shen, Xizhong; Zhang, Yimin D.; Demirli, Ramazan; Amin, Moeness G.

    2011-05-01

    In this paper, we consider ultrasonic imaging for the visualization of flaws in a material. Ultrasonic imaging is a powerful nondestructive testing (NDT) tool which assesses material conditions via the detection, localization, and classification of flaws inside a structure. Multipath exploitations provide extended virtual array apertures and, in turn, enhance imaging capability beyond the limitation of traditional multisensor approaches. We utilize reflections of ultrasonic signals which occur when encountering different media and interior discontinuities. The waveforms observed at the physical as well as virtual sensors yield additional measurements corresponding to different aspect angles. Exploitation of multipath information addresses unique issues observed in ultrasonic imaging. (1) Utilization of physical and virtual sensors significantly extends the array aperture for image enhancement. (2) Multipath signals extend the angle of view of the narrow beamwidth of the ultrasound transducers, allowing improved visibility and array design flexibility. (3) Ultrasonic signals experience difficulty in penetrating a flaw, thus the aspect angle of the observation is limited unless access to other sides is available. The significant extension of the aperture makes it possible to yield flaw observation from multiple aspect angles. We show that data fusion of physical and virtual sensor data significantly improves the detection and localization performance. The effectiveness of the proposed multipath exploitation approach is demonstrated through experimental studies.

  11. Retinal fundus imaging with a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Thurin, Brice; Bloch, Edward; Nousias, Sotiris; Ourselin, Sebastien; Keane, Pearse; Bergeles, Christos

    2018-02-01

    Vitreoretinal surgery is moving towards 3D visualization of the surgical field. This requires an acquisition system capable of recording such 3D information. We propose a proof-of-concept imaging system based on a light-field camera in which an array of micro-lenses is placed in front of a conventional sensor. With a single snapshot, a stack of images focused at different depths is produced on the fly, which provides enhanced depth perception for the surgeon. Difficulty in depth localization of features and frequent focus changes during surgery make current vitreoretinal heads-up surgical imaging systems cumbersome to use. To improve depth perception and eliminate the need to manually refocus on the instruments during surgery, we designed and implemented a proof-of-concept ophthalmoscope equipped with a commercial light-field camera. The sensor of our camera is composed of an array of micro-lenses which projects an array of overlapping micro-images. We show that with a single light-field snapshot we can digitally refocus between the retina and a tool located in front of the retina, or display an extended depth-of-field image where everything is in focus. The design and system performance of the plenoptic fundus camera are detailed. We conclude by showing in vivo data recorded with our device.
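
    A minimal shift-and-add refocusing sketch over sub-aperture views, which is the generic light-field operation behind the on-the-fly focal stack described above; the view offsets, refocus parameters and use of integer-pixel shifts are simplifying assumptions of this illustration, not the commercial camera's processing:

        import numpy as np

        def refocus(views, uv, alpha):
            """
            views : (n_views, H, W) sub-aperture images
            uv    : (n_views, 2) sub-aperture offsets in pixels
            alpha : refocus parameter; each view is shifted by (1 - 1/alpha) * uv
            """
            out = np.zeros_like(views[0], dtype=float)
            for img, (u, v) in zip(views, uv):
                dy = int(round((1 - 1 / alpha) * v))
                dx = int(round((1 - 1 / alpha) * u))
                out += np.roll(np.roll(img.astype(float), dy, axis=0), dx, axis=1)
            return out / len(views)

        # Toy usage: 9 views on a 3 x 3 sub-aperture grid, refocused at three depths.
        views = np.random.rand(9, 64, 64)
        uv = np.array([(u, v) for v in (-1, 0, 1) for u in (-1, 0, 1)], dtype=float)
        focal_stack = [refocus(views, uv, a) for a in (0.8, 1.0, 1.25)]
        print(len(focal_stack), focal_stack[0].shape)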

  12. Imaging dipole flow sources using an artificial lateral-line system made of biomimetic hair flow sensors

    PubMed Central

    Dagamseh, Ahmad; Wiegerink, Remco; Lammerink, Theo; Krijnen, Gijs

    2013-01-01

    In Nature, fish have the ability to localize prey, school, navigate, etc., using the lateral-line organ. Artificial hair flow sensors arranged in a linear array (inspired by the lateral-line system (LSS) in fish) have been applied to measure airflow patterns at the sensor positions. Here, we take advantage of both biomimetic artificial hair-based flow sensors arranged as an LSS and beamforming techniques to demonstrate dipole-source localization in air. Modelling and measurement results show the artificial lateral line's ability to image the position of dipole sources accurately, with an estimation error of less than 0.14 times the array length. This opens up possibilities for flow-based, near-field environment mapping that can be beneficial to, for example, biologists and robot guidance applications. PMID:23594816

  13. Imaging Science Panel. Multispectral Imaging Science Working Group joint meeting with Information Science Panel: Introduction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The state-of-the-art of multispectral sensing is reviewed and recommendations for future research and development are proposed. Specifically, two generic sensor concepts were discussed. One is the multispectral pushbroom sensor utilizing linear array technology, which operates in six spectral bands including two in the SWIR region and incorporates capabilities for stereo and crosstrack pointing. The second concept is the imaging spectrometer (IS), which incorporates a dispersive element and area arrays to provide both spectral and spatial information simultaneously. Other key technology areas included very large scale integration and the computer-aided design of these devices.

  14. Experimental single-chip color HDTV image acquisition system with 8M-pixel CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Shimamoto, Hiroshi; Yamashita, Takayuki; Funatsu, Ryohei; Mitani, Kohji; Nojiri, Yuji

    2006-02-01

    We have developed an experimental single-chip color HDTV image acquisition system using an 8M-pixel CMOS image sensor. The sensor has 3840 × 2160 effective pixels and is progressively scanned at 60 frames per second. We describe the color filter array and interpolation method used to improve image quality with a high-pixel-count single-chip sensor. We also describe an experimental image acquisition system we used to measure spatial frequency characteristics in the horizontal direction. The results indicate good prospects for achieving a high-quality single-chip HDTV camera that reduces pseudo signals and maintains high spatial frequency characteristics within the frequency band for HDTV.

  15. Origin of high photoconductive gain in fully transparent heterojunction nanocrystalline oxide image sensors and interconnects.

    PubMed

    Jeon, Sanghun; Song, Ihun; Lee, Sungsik; Ryu, Byungki; Ahn, Seung-Eon; Lee, Eunha; Kim, Young; Nathan, Arokia; Robertson, John; Chung, U-In

    2014-11-05

    A technique for invisible image capture using a photosensor array based on transparent conducting oxide semiconductor thin-film transistors and transparent interconnection technologies is presented. A transparent conducting layer is employed for the sensor electrodes as well as interconnection in the array, providing about 80% transmittance at visible-light wavelengths. The phototransistor is a Hf-In-Zn-O/In-Zn-O heterostructure yielding high quantum efficiency in the visible range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Development of Ultra-Low Noise, High Performance III-V Quantum Well Infrared Photodetectors (QWIPs) for Focal Plane Array Staring Image Sensor Systems

    DTIC Science & Technology

    1993-08-01

    Development of Ultra-Low Noise, High Performance III-V Quantum Well Infrared Photodetectors (QWIPs) for Focal Plane Array Staring Image Sensor Systems...using a 2-D square mesh grating coupler to achieve maximum responsivity for an InGaAs SBTM QWIP, and (iv) performed noise characterization on four...different types of III-V QWIPs and identified their noise sources. Detailed results and accomplishments are discussed in this report.

  17. An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability.

    PubMed

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U

    2015-03-06

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with a 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle.
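
    For context, the sketch below shows a generic perturb-and-observe maximum power point tracking loop of the kind an on-chip PMS might approximate; it is not the chip's actual controller, and read_voltage, read_current, and set_load are hypothetical interfaces introduced only for illustration.

        # Hedged sketch of one perturb-and-observe MPPT step (hypothetical interfaces).
        def mppt_step(read_voltage, read_current, set_load, load, step, prev_power):
            v, i = read_voltage(), read_current()
            power = v * i
            # If the last perturbation reduced harvested power, reverse direction;
            # otherwise keep perturbing the load the same way.
            if power < prev_power:
                step = -step
            load += step
            set_load(load)
            return load, step, power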

  18. An Ultra-Low Power CMOS Image Sensor with On-Chip Energy Harvesting and Power Management Capability

    PubMed Central

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U.

    2015-01-01

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with a 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle. PMID:25756863

  19. Performance quantification of a millimeter-wavelength imaging system based on inexpensive glow-discharge-detector focal-plane array.

    PubMed

    Shilemay, Moshe; Rozban, Daniel; Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S; Yadid-Pecht, Orly; Abramovich, Amir

    2013-03-01

    Inexpensive millimeter-wavelength (MMW) optical digital imaging raises the challenge of evaluating imaging performance and image quality because of the large electromagnetic wavelengths and pixel sensor sizes, which are 2 to 3 orders of magnitude larger than those of ordinary thermal or visual imaging systems, and also because of the noisiness of the inexpensive glow discharge detectors that compose the focal-plane array. This study quantifies the performance of this MMW imaging system. Its point-spread function and modulation transfer function were investigated. The experimental results and the analysis indicate that the image quality of this MMW imaging system is limited mostly by the noise, and the blur is dominated by the pixel sensor size. Therefore, the MMW image might be improved by oversampling, given that noise reduction is achieved. A demonstration of MMW image improvement through oversampling is presented.

  20. High Dynamic Range Spectral Imaging Pipeline For Multispectral Filter Array Cameras.

    PubMed

    Lapray, Pierre-Jean; Thomas, Jean-Baptiste; Gouton, Pierre

    2017-06-03

    Spectral filter array imaging exhibits a strong similarity to color filter array imaging. This permits us to embed this technology in practical vision systems with little adaptation of existing solutions. In this communication, we define an imaging pipeline that permits high dynamic range (HDR) spectral imaging, extended from color filter arrays. We propose an implementation of this pipeline on a prototype sensor and evaluate the quality of our implementation on real data with objective metrics and visual examples. We demonstrate that we reduce noise and, in particular, solve the problem of noise generated by the lack of energy balance. Data are provided to the community in an image database for further research.
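
    A minimal sketch of the HDR merging stage such a pipeline relies on, not the authors' code: two exposures of the same mosaiced frame are combined in the linear domain with a weighting that down-weights dark and saturated pixels. The exposure times, saturation threshold, and triangular weight are assumptions.

        # Weighted HDR merge of two linear raw exposures (values in [0, 1]).
        import numpy as np

        def merge_hdr(short, long_, t_short, t_long, sat=0.95):
            def w(x):                      # favor mid-range pixels, suppress dark/saturated ones
                return np.clip(1.0 - np.abs(2.0 * x - 1.0), 1e-3, None) * (x < sat)
            ws, wl = w(short), w(long_)
            radiance = (ws * short / t_short + wl * long_ / t_long) / (ws + wl + 1e-12)
            return radiance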

  1. Design and application of a small size SAFT imaging system for concrete structure

    NASA Astrophysics Data System (ADS)

    Shao, Zhixue; Shi, Lihua; Shao, Zhe; Cai, Jian

    2011-07-01

    A method of ultrasonic imaging detection is presented for quick non-destructive testing (NDT) of concrete structures using the synthetic aperture focusing technique (SAFT). A low-cost ultrasonic sensor array consisting of 12 commercially available low-frequency ultrasonic transducers is designed and manufactured. A channel compensation method is proposed to improve the consistency of the different transducers. The controlling devices for array scanning as well as the virtual instrument for SAFT imaging are designed. In the coarse scan mode, with a scan step of 50 mm, the system can quickly give an image display of a cross section of 600 mm (L) × 300 mm (D) from one measurement. In the refined scan mode, the system can reduce the scan step and give an image display of the same cross section by moving the sensor array several times. Experiments on a staircase specimen, a concrete slab with an embedded target, and a building floor with an underground pipeline all verify the efficiency of the proposed method.
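
    A rough sketch of the SAFT principle follows; it is not the authors' instrument code, and the scan positions, sound speed, and A-scan layout are illustrative assumptions. Each image point sums the recorded A-scans at their geometric round-trip delays, so echoes from a real reflector add coherently.

        # Delay-and-sum synthetic aperture focusing over a mechanically scanned transducer.
        import numpy as np

        def saft(ascans, scan_x, t, c, image_x, image_z):
            """ascans: (n_positions, n_samples); scan_x: transducer x positions (m);
            t: sample times (s); c: wave speed (m/s); image_x/image_z: pixel grids (m)."""
            img = np.zeros((len(image_z), len(image_x)))
            for iz, z in enumerate(image_z):
                for ix, x in enumerate(image_x):
                    for k, xs in enumerate(scan_x):
                        tof = 2.0 * np.hypot(x - xs, z) / c          # pulse-echo round trip
                        img[iz, ix] += np.interp(tof, t, ascans[k])  # coherent summation
            return np.abs(img)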

  2. Implementation of a Virtual Microphone Array to Obtain High Resolution Acoustic Images

    PubMed Central

    Izquierdo, Alberto; Suárez, Luis; Suárez, David

    2017-01-01

    Using arrays with digital MEMS (Micro-Electro-Mechanical System) microphones and FPGA-based (Field Programmable Gate Array) acquisition/processing systems allows building systems with hundreds of sensors at a reduced cost. The problem arises when systems with thousands of sensors are needed. This work analyzes the implementation and performance of a virtual array with 6400 (80 × 80) MEMS microphones. This virtual array is implemented by changing the position of a physical array of 64 (8 × 8) microphones in a grid with 10 × 10 positions, using a 2D positioning system. This virtual array obtains a spatial aperture of 1 × 1 m². Based on the SODAR (SOund Detection And Ranging) principle, the measured beampattern and the focusing capacity of the virtual array have been analyzed, since the beamforming algorithms must assume spherical waves, due to the large dimensions of the array in comparison with the distance between the target (a mannequin) and the array. Finally, the acoustic images of the mannequin, obtained for different frequency and range values, show high angular resolutions and the possibility to identify different parts of the body of the mannequin. PMID:29295485

  3. Evaluation of realistic layouts for next generation on-scalp MEG: spatial information density maps.

    PubMed

    Riaz, Bushra; Pfeiffer, Christoph; Schneiderman, Justin F

    2017-08-01

    While commercial magnetoencephalography (MEG) systems are the functional neuroimaging state-of-the-art in terms of spatio-temporal resolution, MEG sensors have not changed significantly since the 1990s. Interest in newer sensors that operate at less extreme temperatures, e.g., high critical temperature (high-Tc) SQUIDs, optically-pumped magnetometers, etc., is growing because they enable significant reductions in head-to-sensor standoff (on-scalp MEG). Various metrics quantify the advantages of on-scalp MEG, but a single straightforward one is lacking. Previous works have furthermore been limited to arbitrary and/or unrealistic sensor layouts. We introduce spatial information density (SID) maps for quantitative and qualitative evaluations of sensor arrays. SID-maps present the spatial distribution of information a sensor array extracts from a source space while accounting for relevant source and sensor parameters. We use it in a systematic comparison of three practical on-scalp MEG sensor array layouts (based on high-Tc SQUIDs) and the standard Elekta Neuromag TRIUX magnetometer array. Results strengthen the case for on-scalp and specifically high-Tc SQUID-based MEG while providing a path for the practical design of future MEG systems. SID-maps are furthermore general to arbitrary magnetic sensor technologies and source spaces and can thus be used for design and evaluation of sensor arrays for magnetocardiography, magnetic particle imaging, etc.

  4. High-Sensitivity Fiber-Optic Ultrasound Sensors for Medical Imaging Applications

    PubMed Central

    Wen, H.; Wiesler, D.G.; Tveten, A.; Danver, B.; Dandridge, A.

    2010-01-01

    This paper presents several designs of high-sensitivity, compact fiber-optic ultrasound sensors that may be used for medical imaging applications. These sensors translate ultrasonic pulses into strains in single-mode optical fibers, which are measured with fiber-based laser interferometers at high precision. The sensors are simpler and less expensive to make than piezoelectric sensors, and are not susceptible to electromagnetic interference. It is possible to make focal sensors with these designs, and several schemes are discussed. Because of the minimum bending radius of optical fibers, the designs are suitable for single element sensors rather than for arrays. PMID:9691368

  5. NeuroSeek dual-color image processing infrared focal plane array

    NASA Astrophysics Data System (ADS)

    McCarley, Paul L.; Massie, Mark A.; Baxter, Christopher R.; Huynh, Buu L.

    1998-09-01

    Several technologies have been developed in recent years to advance the state of the art of IR sensor systems, including affordable dual-color focal planes, on-focal-plane-array biologically inspired image and signal processing techniques, and spectral sensing techniques. Pacific Advanced Technology (PAT) and the Air Force Research Lab Munitions Directorate have developed a system which incorporates the best of these capabilities into a single device. The 'NeuroSeek' device integrates these technologies into an IR focal plane array (FPA) which combines multicolor midwave-IR/longwave-IR radiometric response with on-focal-plane 'smart' neuromorphic analog image processing. The readout and processing very-large-scale-integration integrated circuit chip developed under this effort will be hybridized to a dual-color detector array to produce the NeuroSeek FPA, which will have the capability to fuse multiple pixel-based sensor inputs directly on the focal plane. Great advantages are afforded by the application of massively parallel processing algorithms to image data in the analog domain; the high speed and low power consumption of this device mimic operations performed in the human retina.

  6. CMOS foveal image sensor chip

    NASA Technical Reports Server (NTRS)

    Scott, Peter (Inventor); Sridhar, Ramalingam (Inventor); Bandera, Cesar (Inventor); Xia, Shu (Inventor)

    2002-01-01

    A foveal image sensor integrated circuit comprising a plurality of CMOS active pixel sensors arranged both within and about a central fovea region of the chip. The pixels in the central fovea region have a smaller size than the pixels arranged in peripheral rings about the central region. A new photocharge normalization scheme and associated circuitry normalizes the output signals from the different size pixels in the array. The pixels are assembled into a multi-resolution rectilinear foveal image sensor chip using a novel access scheme to reduce the number of analog RAM cells needed. Localized spatial resolution declines monotonically with offset from the imager's optical axis, analogous to biological foveal vision.

  7. ACOUSTICAL IMAGING AND MECHANICAL PROPERTIES OF SOFT ROCK AND MARINE SEDIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman E. Scott, Jr., Ph.D.; Younane Abousleiman, Ph.D.; Musharraf Zaman, Ph.D., P.E.

    2002-11-18

    During the sixth quarter of this research project the research team developed a method and the experimental procedures for acquiring the data needed for ultrasonic tomography of rock core samples under triaxial stress conditions as outlined in Task 10. Traditional triaxial compression experiments, where compressional and shear wave velocities are measured, provide little or no information about the internal spatial distribution of mechanical damage within the sample. The velocities measured between platen-to-platen or sensor-to-sensor reflect an averaging of all the velocities occurring along that particular raypath across the boundaries of the rock. The research team is attempting to develop and refine a laboratory equivalent of seismic tomography for use on rock samples deformed under triaxial stress conditions. Seismic tomography, utilized for example in crosswell tomography, allows imaging of the velocities within a discrete zone within the rock. Ultrasonic or acoustic tomography is essentially the extension of that field technology applied to rock samples deforming in the laboratory at high pressures. This report outlines the technical steps and procedures for developing this technology for use on weak, soft chalk samples. Laboratory tests indicate that the chalk samples exhibit major changes in compressional and shear wave velocities during compaction. Since chalk is the rock type responsible for the severe subsidence and compaction in the North Sea, it was selected for the first efforts at tomographic imaging of soft rocks. Field evidence from the North Sea suggests that compaction, which has resulted in over 30 feet of subsidence to date, is heterogeneously distributed within the reservoir. The research team will attempt to image this very process in chalk samples. The initial tomographic studies (Scott et al., 1994a,b; 1998) were accomplished on well cemented, competent rocks such as Berea sandstone. The extension of the technology to weaker samples is more difficult but potentially much more rewarding. The chalk, since it is a weak material, also attenuates wave propagation more than other rock types. Three different types of sensors were considered (and tested) for the tomographic imaging project: 600 kHz PZT, 1 MHz PZT, and PVDF film sensors. 600 kHz PZT crystals were selected because they generated a sufficiently high amplitude pulse to propagate across the damaged chalk. A number of different configurations were considered for placement of the acoustic arrays. It was decided after preliminary testing that the optimum arrangement of the acoustic sensors was to place three arrays of sensors, with each array containing twenty sensors, around the sample. There would be two horizontal arrays to tomographically image two circular cross-sectional planes through the rock core sample. A third array would be vertically oriented to provide a vertical cross-sectional view of the sample. A total of 260 acoustic raypaths would be shot and acquired in the horizontal acoustic array to create each horizontal tomographic image. The sensors can be used as both acoustic sources and acoustic receivers, with raypaths acquired from each of the 10 pulsers to each of the 10 receivers.

  8. Highly Concentrated Seed-Mediated Synthesis of Monodispersed Gold Nanorods (Postprint)

    DTIC Science & Technology

    2017-07-17

    ...challenges the utilization of Au-NRs in a diverse array of technologies, ranging from therapeutics, imaging and sensors, to large area coatings, filters, and optical attenuators. Development of the latter technologies has been hindered by the lack of cost-effective...

  9. Superresolution with the focused plenoptic camera

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Chunev, Georgi; Lumsdaine, Andrew

    2011-03-01

    Digital images from a CCD or CMOS sensor with a color filter array must undergo a demosaicing process to combine the separate color samples into a single color image. This interpolation process can interfere with the subsequent superresolution process. Plenoptic superresolution, which relies on precise sub-pixel sampling across captured microimages, is particularly sensitive to such resampling of the raw data. In this paper we present an approach for superresolving plenoptic images that takes place at the time of demosaicing the raw color image data. Our approach exploits the interleaving provided by typical color filter arrays (e.g., Bayer filter) to further refine plenoptic sub-pixel sampling. Our rendering algorithm treats the color channels in a plenoptic image separately, which improves final superresolution by a factor of two. With appropriate plenoptic capture we show the theoretical possibility for rendering final images at full sensor resolution.

  10. MTF evaluation of white pixel sensors

    NASA Astrophysics Data System (ADS)

    Lindner, Albrecht; Atanassov, Kalin; Luo, Jiafu; Goma, Sergio

    2015-01-01

    We present a methodology to compare image sensors with traditional Bayer RGB layouts to sensors with alternative layouts containing white pixels. We focused on the sensors' resolving powers, which we measured in the form of a modulation transfer function for variations in both luma and chroma channels. We present the design of the test chart, the acquisition of images, the image analysis, and an interpretation of results. We demonstrate the approach on the example of two sensors that differ only in their color filter arrays. We confirmed that the sensor with white pixels and the corresponding demosaicing result in a higher resolving power in the luma channel, but a lower resolving power in the chroma channels when compared to the traditional Bayer sensor.
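
    The sketch below shows one simple way such a modulation transfer function could be estimated from captured sinusoidal chart patches; the paper's actual chart design, channel definitions, and analysis are not reproduced here, and the patch extraction and frequency labels are assumptions.

        # Estimate per-frequency modulation transfer from sinusoidal test-chart patches.
        import numpy as np

        def modulation(patch):
            """Michelson modulation of a captured sinusoidal patch (linear values)."""
            lo, hi = np.percentile(patch, [5, 95])   # robust min/max against noise
            return (hi - lo) / (hi + lo + 1e-12)

        def mtf(patches, reference_modulation):
            """patches: dict {cycles_per_pixel: 2-D patch}; returns frequency -> normalized response."""
            return {f: modulation(p) / reference_modulation for f, p in sorted(patches.items())}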

  11. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    PubMed

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembly of cell groups on the CMOS sensor surface allows large-field (6.66 mm × 5.32 mm, the entire active area of the CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on the CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time and high-content analysis of single cells in a large-field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Multispectral photoacoustic tomography for detection of small tumors inside biological tissues

    NASA Astrophysics Data System (ADS)

    Hirasawa, Takeshi; Okawa, Shinpei; Tsujita, Kazuhiro; Kushibiki, Toshihiro; Fujita, Masanori; Urano, Yasuteru; Ishihara, Miya

    2018-02-01

    Visualization of small tumors inside biological tissue is important in cancer treatment because it promotes accurate surgical resection and enables monitoring of therapeutic effect. For sensitive detection of tumors, we have been developing a photoacoustic (PA) imaging technique to visualize tumor-specific contrast agents, and have already succeeded in imaging a subcutaneous tumor in a mouse using these contrast agents. To image tumors inside biological tissues, extension of imaging depth and improvement of sensitivity were required. In this study, to extend imaging depth, we developed a PA tomography (PAT) system that can image an entire cross section of a mouse. To improve sensitivity, we considered the use of a P(VDF-TrFE) linear array acoustic sensor that can detect PA signals over a wide range of frequencies. Because PA signals produced by low-absorbance optical absorbers shift to low frequencies, we hypothesized that detecting low-frequency PA signals improves sensitivity to low-absorbance optical absorbers. We developed a PAT system with both a PZT linear array acoustic sensor and the P(VDF-TrFE) sensor, and performed experiments using tissue-mimicking phantoms to evaluate the lower detection limits of absorbance. As a result, PAT images calculated from the low-frequency components of PA signals detected by the P(VDF-TrFE) sensor could visualize optical absorbers with lower absorbance.

  13. Concept of electro-optical sensor module for sniper detection system

    NASA Astrophysics Data System (ADS)

    Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz

    2010-10-01

    The paper presents an initial concept of an electro-optical sensor unit for sniper detection purposes. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is a multi-sensor sniper and shot detection system. Being part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capability. Acoustic (or radar) systems used for shot detection generally offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures or at least the presence of suspicious objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters such as focal plane array size and type, focal length, and aperture were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier to comprehend for a human operator. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.

  14. Ultrasonic fingerprint sensor using a piezoelectric micromachined ultrasonic transducer array integrated with complementary metal oxide semiconductor electronics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Y.; Fung, S.; Wang, Q.

    2015-06-29

    This paper presents an ultrasonic fingerprint sensor based on a 24 × 8 array of 22 MHz piezoelectric micromachined ultrasonic transducers (PMUTs) with 100 μm pitch, fully integrated with 180 nm complementary metal oxide semiconductor (CMOS) circuitry through eutectic wafer bonding. Each PMUT is directly bonded to a dedicated CMOS receive amplifier, minimizing electrical parasitics and eliminating the need for through-silicon vias. The array frequency response and vibration mode-shape were characterized using laser Doppler vibrometry and verified via finite element method simulation. The array's acoustic output was measured using a hydrophone to be ∼14 kPa with a 28 V input, in reasonable agreement with prediction from analytical calculation. Pulse-echo imaging of a 1D steel grating is demonstrated using electronic scanning of a 20 × 8 sub-array, resulting in 300 mV maximum received amplitude and 5:1 contrast ratio. Because the small size of this array limits the maximum image size, mechanical scanning was used to image a 2D polydimethylsiloxane fingerprint phantom (10 mm × 8 mm) at a 1.2 mm distance from the array.

  15. Piezoelectric micromachined ultrasonic transducers for fingerprint sensing

    NASA Astrophysics Data System (ADS)

    Lu, Yipeng

    Fingerprint identification is the most prevalent biometric technology due to its uniqueness, universality and convenience. Over the past two decades, a variety of physical mechanisms have been exploited to capture an electronic image of a human fingerprint. Among these, capacitive fingerprint sensors are the ones most widely used in consumer electronics because they are fabricated using conventional complementary metal oxide semiconductor (CMOS) integrated circuit technology. However, capacitive fingerprint sensors are extremely sensitive to finger contamination and moisture. This thesis will introduce an ultrasonic fingerprint sensor using a PMUT array, which offers a potential solution to this problem. In addition, it has the potential to increase security, as it allows images to be collected at various depths beneath the epidermis, providing images of the sub-surface dermis layer and blood vessels. Firstly, PMUT sensitivity is maximized by optimizing the layer stack and electrode design, and the coupling coefficient is doubled via series transduction. Moreover, a broadband PMUT with 97% fractional bandwidth is achieved by utilizing a thinner structure excited at two adjacent mechanical vibration modes with overlapping bandwidth. In addition, we proposed waveguide PMUTs, which function to direct acoustic waves, confine acoustic energy, and provide mechanical protection for the PMUT array. Furthermore, PMUT arrays were fabricated with different processes to form the membrane, including front-side etching with a patterned sacrificial layer, front-side etching with additional anchor, cavity SOI wafers and eutectic bonding. Additionally, eutectic bonding allows the PMUT to be integrated with CMOS circuits. PMUTs were characterized in the mechanical, electrical and acoustic domains. Using transmit beamforming, a narrow acoustic beam was achieved, and high-resolution (sub-100 µm) and short-range (~1 mm) pulse-echo ultrasonic imaging was demonstrated using a steel phantom. Finally, a novel ultrasonic fingerprint sensor was demonstrated using a 24 × 8 array of 22 MHz PMUTs with 100 µm pitch, fully integrated with 180 nm CMOS circuitry through eutectic wafer bonding. Each PMUT is directly bonded to a dedicated CMOS receive amplifier, minimizing electrical parasitics and eliminating the need for through-silicon vias. Pulse-echo imaging of a 1D steel grating is demonstrated using electronic scanning of a 20 × 8 sub-array, resulting in 300 mV maximum received amplitude and 5:1 contrast ratio. Because the small size of this array limits the maximum image size, mechanical scanning was used to image a 2D PDMS fingerprint phantom (10 mm × 8 mm) at a 1.2 mm distance from the array.

  16. An electrically tunable plenoptic camera using a liquid crystal microlens array.

    PubMed

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  17. An electrically tunable plenoptic camera using a liquid crystal microlens array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lei, Yu; School of Automation, Huazhong University of Science and Technology, Wuhan 430074; Wuhan National Laboratory for Optoelectronics, Huazhong University of Science and Technology, Wuhan 430074

    2015-05-15

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  18. The multifocus plenoptic camera

    NASA Astrophysics Data System (ADS)

    Georgiev, Todor; Lumsdaine, Andrew

    2012-01-01

    The focused plenoptic camera is based on the Lippmann sensor: an array of microlenses focused on the pixels of a conventional image sensor. This device samples the radiance, or plenoptic function, as an array of cameras with large depth of field, focused at a certain plane in front of the microlenses. For the purpose of digital refocusing (which is one of the important applications) the depth of field needs to be large, but there are fundamental optical limitations to this. The solution to the above problem is to use an array of interleaved microlenses of different focal lengths, focused at two or more different planes. In this way a focused image can be constructed at any depth of focus, and a very wide range of digital refocusing can be achieved. This paper presents our theory and the results of implementing such a camera. Real-world images demonstrate the extended capabilities, and limitations are discussed.

  19. An electrically tunable plenoptic camera using a liquid crystal microlens array

    NASA Astrophysics Data System (ADS)

    Lei, Yu; Tong, Qing; Zhang, Xinyu; Sang, Hongshi; Ji, An; Xie, Changsheng

    2015-05-01

    Plenoptic cameras generally employ a microlens array positioned between the main lens and the image sensor to capture the three-dimensional target radiation in the visible range. Because the focal length of common refractive or diffractive microlenses is fixed, the depth of field (DOF) is limited so as to restrict their imaging capability. In this paper, we propose a new plenoptic camera using a liquid crystal microlens array (LCMLA) with electrically tunable focal length. The developed LCMLA is fabricated by traditional photolithography and standard microelectronic techniques, and then, its focusing performance is experimentally presented. The fabricated LCMLA is directly integrated with an image sensor to construct a prototyped LCMLA-based plenoptic camera for acquiring raw radiation of targets. Our experiments demonstrate that the focused region of the LCMLA-based plenoptic camera can be shifted efficiently through electrically tuning the LCMLA used, which is equivalent to the extension of the DOF.

  20. Spectral X-Ray Diffraction using a 6 Megapixel Photon Counting Array Detector.

    PubMed

    Muir, Ryan D; Pogranichniy, Nicholas R; Muir, J Lewis; Sullivan, Shane Z; Battaile, Kevin P; Mulichak, Anne M; Toth, Scott J; Keefe, Lisa J; Simpson, Garth J

    2015-03-12

    Pixel-array detectors allow single-photon counting to be performed on a massively parallel scale, with several million counting circuits and detectors in the array. Because the number of photoelectrons produced at the detector surface depends on the photon energy, these detectors offer the possibility of spectral imaging. In this work, a statistical model of the instrument response is used to calibrate the detector on a per-pixel basis. In turn, the calibrated sensor was used to perform separation of dual-energy diffraction measurements into two monochromatic images. Targeted applications include multi-wavelength diffraction to aid in protein structure determination and X-ray diffraction imaging.
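
    As a hedged illustration (not the instrument's calibration code): once each measurement setting's mean response to the two photon energies is known, here represented by an assumed 2 × 2 matrix R, a dual-energy exposure can be unmixed into two monochromatic images with a per-pixel least-squares solve.

        # Separate a dual-energy measurement into two monochromatic images.
        import numpy as np

        def unmix_dual_energy(counts_a, counts_b, R):
            """counts_a/counts_b: images from two threshold (or exposure) settings;
            R: (2, 2) calibrated response of each setting to energies E1 and E2 (assumed known)."""
            y = np.stack([counts_a.ravel(), counts_b.ravel()])   # (2, n_pixels)
            x = np.linalg.lstsq(R, y, rcond=None)[0]             # photons at E1 and E2 per pixel
            return x[0].reshape(counts_a.shape), x[1].reshape(counts_a.shape)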

  1. Spectral x-ray diffraction using a 6 megapixel photon counting array detector

    NASA Astrophysics Data System (ADS)

    Muir, Ryan D.; Pogranichniy, Nicholas R.; Muir, J. Lewis; Sullivan, Shane Z.; Battaile, Kevin P.; Mulichak, Anne M.; Toth, Scott J.; Keefe, Lisa J.; Simpson, Garth J.

    2015-03-01

    Pixel-array detectors allow single-photon counting to be performed on a massively parallel scale, with several million counting circuits and detectors in the array. Because the number of photoelectrons produced at the detector surface depends on the photon energy, these detectors offer the possibility of spectral imaging. In this work, a statistical model of the instrument response is used to calibrate the detector on a per-pixel basis. In turn, the calibrated sensor was used to perform separation of dual-energy diffraction measurements into two monochromatic images. Targeted applications include multi-wavelength diffraction to aid in protein structure determination and X-ray diffraction imaging.

  2. Cell adhesion and guidance by micropost-array chemical sensors

    NASA Astrophysics Data System (ADS)

    Pantano, Paul; Quah, Soo-Kim; Danowski, Kristine L.

    2002-06-01

    An array of ~50,000 individual polymeric micropost sensors was patterned across a glass coverslip by a photoimprint lithographic technique. Individual micropost sensors were ~3 micrometers tall and ~8 micrometers wide. The O2-sensitive micropost array sensors (MPASs) comprised a ruthenium complex encapsulated in a gas-permeable photopolymerizable siloxane. The pH-sensitive MPASs comprised a fluorescein conjugate encapsulated in a photocrosslinkable poly(vinyl alcohol)-based polymer. PO2 and pH were quantitated by acquiring MPAS luminescence images with an epifluorescence microscope/charge coupled device imaging system. O2-sensitive MPASs displayed linear Stern-Volmer quenching behavior with a maximum Io/I of ~8.6. pH-sensitive MPASs displayed sigmoidal calibration curves with a pKa of ~5.8. The adhesion of undifferentiated rat pheochromocytoma (PC12) cells across these two polymeric surface types was investigated. The greatest PC12 cell proliferation and adhesion occurred across the poly(vinyl alcohol)-based micropost arrays relative to planar poly(vinyl alcohol)-based surfaces and both patterned and planar siloxane surfaces. An additional advantage of the patterned MPAS layers relative to planar sensing layers was the ability to direct the growth of biological cells. Preliminary data are presented in which nerve growth factor-differentiated PC12 cells grew neurite-like processes that extended along paths defined by the micropost architecture.
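
    A worked toy example of the Stern-Volmer relation used for the O2-sensitive microposts, I0/I = 1 + Ksv·pO2; the quenching constant and measured ratio below are assumed placeholder numbers, not values reported in the record.

        # Invert a Stern-Volmer calibration to estimate oxygen partial pressure.
        Ksv = 0.038            # assumed quenching constant, 1/Torr (placeholder)
        I0_over_I = 4.2        # assumed measured luminescence ratio for one micropost
        pO2 = (I0_over_I - 1.0) / Ksv
        print(f"estimated pO2: {pO2:.1f} Torr")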

  3. Detection and recognition of simple spatial forms

    NASA Technical Reports Server (NTRS)

    Watson, A. B.

    1983-01-01

    A model of human visual sensitivity to spatial patterns is constructed. The model predicts the visibility and discriminability of arbitrary two-dimensional monochrome images. The image is analyzed by a large array of linear feature sensors, which differ in spatial frequency, phase, orientation, and position in the visual field. All sensors have one octave frequency bandwidths, and increase in size linearly with eccentricity. Sensor responses are processed by an ideal Bayesian classifier, subject to uncertainty. The performance of the model is compared to that of the human observer in detecting and discriminating some simple images.

  4. High-resolution CCD imaging alternatives

    NASA Astrophysics Data System (ADS)

    Brown, D. L.; Acker, D. E.

    1992-08-01

    High resolution CCD color cameras have recently stimulated the interest of a large number of potential end-users for a wide range of practical applications. Real-time High Definition Television (HDTV) systems are now being used or considered for use in applications ranging from entertainment program origination through digital image storage to medical and scientific research. HDTV generation of electronic images offers significant cost and time-saving advantages over the use of film in such applications. Further, in still image systems electronic image capture is faster and more efficient than conventional image scanners. The CCD still camera can capture 3-dimensional objects into the computing environment directly, without having to shoot a picture on film, develop it, and then scan the image into a computer. 2. EXTENDING CCD TECHNOLOGY BEYOND BROADCAST Most standard production CCD sensor chips are made for broadcast-compatible systems. One popular CCD, and the basis for this discussion, offers arrays of roughly 750 x 580 picture elements (pixels), or a total array of approximately 435,000 pixels (see Fig. 1). FOR-A has developed a technique to increase the number of available pixels for a given image compared to that produced by the standard CCD itself. Using an inter-lined CCD with an overall spatial structure several times larger than the photo-sensitive sensor areas, each of the CCD sensors is shifted in two dimensions in order to fill in spatial gaps between adjacent sensors.

  5. Optical Inspection In Hostile Industrial Environments: Single-Sensor VS. Imaging Methods

    NASA Astrophysics Data System (ADS)

    Cielo, P.; Dufour, M.; Sokalski, A.

    1988-11-01

    On-line and unsupervised industrial inspection for quality control and process monitoring is increasingly required in the modern automated factory. Optical techniques are particularly well suited to industrial inspection in hostile environments because of their noncontact nature, fast response time and imaging capabilities. Optical sensors can be used for remote inspection of high temperature products or otherwise inaccessible parts, provided they are in a line-of-sight relation with the sensor. Moreover, optical sensors are much easier to adapt to a variety of part shapes, position or orientation and conveyor speeds as compared to contact-based sensors. This is an important requirement in a flexible automation environment. A number of choices are possible in the design of optical inspection systems. General-purpose two-dimensional (2-D) or three-dimensional (3-D) imaging techniques have advanced very rapidly in the last years thanks to a substantial research effort as well as to the availability of increasingly powerful and affordable hardware and software. Imaging can be realized using 2-D arrays or simpler one-dimensional (1-D) line-array detectors. Alternatively, dedicated single-spot sensors require a smaller amount of data processing and often lead to robust sensors which are particularly appropriate to on-line operation in hostile industrial environments. Many specialists now feel that dedicated sensors or clusters of sensors are often more effective for specific industrial automation and control tasks, at least in the short run. This paper will discuss optomechanical and electro-optical choices with reference to the design of a number of on-line inspection sensors which have been recently developed at our institute. Case studies will include real-time surface roughness evaluation on polymer cables extruded at high speed, surface characterization of hot-rolled or galvanized-steel sheets, temperature evaluation and pinhole detection in aluminum foil, multi-wavelength polymer sheet thickness gauging and thermographic imaging, 3-D lumber profiling, line-array inspection of textiles and glassware, as well as on-line optical inspection for the control of automated arc welding. In each case the design choices between single or multiple-element detectors, mechanical vs. electronic scanning, laser vs. incoherent illumination, etc. will be discussed in terms of industrial constraints such as speed requirements, protection against the environment or reliability of the sensor output.

  6. Electrowetting liquid lens array on curved substrates for wide field of view image sensor

    NASA Astrophysics Data System (ADS)

    Bang, Yousung; Lee, Muyoung; Won, Yong Hyub

    2016-03-01

    In this research, an electrowetting liquid lens array on curved substrates is developed for a wide field of view image sensor. In conventional image sensing systems, such a lens array is usually in a solid state. However, in this state, the lens array, which is similar to insect-like compound eyes in nature, has several limitations such as degradation of image quality and a narrow field of view, because it cannot adjust the focal length of the lenses. To implement a more capable system, a curved array of lenses based on the electrowetting effect, which can adjust the focal length of each lens, is developed in this paper. The fabrication of the curved lens array is conducted in several steps, including chamber fabrication, electrode and dielectric layer deposition, liquid injection, and encapsulation. As constituent materials, IZO-coated convex glass, UV epoxy (NOA 68), DI water, and dodecane are used. The number of lenses on the fabricated panel is 23 by 23 and each lens has a 1 mm aperture with a 1.6 mm pitch between adjacent lenses. When a voltage is applied to the device, each lens is observed to change from a concave to a convex state. From the unique optical characteristics of a curved array of liquid lenses, such as controllable focal length and wide field of view, we expect it to have potential applications in various fields such as medical diagnostics, surveillance systems, and light field photography.

  7. Camera array based light field microscopy

    PubMed Central

    Lin, Xing; Wu, Jiamin; Zheng, Guoan; Dai, Qionghai

    2015-01-01

    This paper proposes a novel approach for high-resolution light field microscopy imaging by using a camera array. In this approach, we apply a two-stage relay system for expanding the aperture plane of the microscope into the size of an imaging lens array, and utilize a sensor array for acquiring the different sub-aperture images formed by the corresponding imaging lenses. By combining the rectified and synchronized images from 5 × 5 viewpoints with our prototype system, we successfully recovered color light field videos for various fast-moving microscopic specimens with a spatial resolution of 0.79 megapixels at 30 frames per second, corresponding to an unprecedented data throughput of 562.5 MB/s for light field microscopy. We also demonstrated the use of the reported platform for different applications, including post-capture refocusing, phase reconstruction, 3D imaging, and optical metrology. PMID:26417490

  8. Acoustic emission linear pulse holography

    DOEpatents

    Collins, H.D.; Busse, L.J.; Lemon, D.K.

    1983-10-25

    This device relates to the concept of and means for performing Acoustic Emission Linear Pulse Holography, which combines the advantages of linear holographic imaging and Acoustic Emission into a single non-destructive inspection system. This unique system produces a chronological, linear holographic image of a flaw by utilizing the acoustic energy emitted during crack growth. The innovation is the concept of utilizing the crack-generated acoustic emission energy to generate a chronological series of images of a growing crack by applying linear, pulse holographic processing to the acoustic emission data. The process is implemented by placing on a structure an array of piezoelectric sensors (typically 16 or 32 of them) near the defect location. A reference sensor is placed between the defect and the array.

  9. A data base of ASAS digital imagery. [Advanced Solid-state Array Spectroradiometer

    NASA Technical Reports Server (NTRS)

    Irons, James R.; Meeson, Blanche W.; Dabney, Philip W.; Kovalick, William M.; Graham, David W.; Hahn, Daniel S.

    1992-01-01

    The Advanced Solid-State Array Spectroradiometer (ASAS) is an airborne, off-nadir tilting, imaging spectroradiometer that acquires digital image data for 29 spectral bands in the visible and near-infrared. The sensor is used principally for studies of the bidirectional distribution of solar radiation scattered by terrestrial surfaces. ASAS has acquired data for a number of terrestrial ecosystem field experiments and investigators have received over 170 radiometrically corrected, multiangle, digital image data sets. A database of ASAS digital imagery has been established in the Pilot Land Data System (PLDS) at the NASA/Goddard Space Flight Center to provide access to these data by the scientific community. ASAS, its processed data, and the PLDS are described, together with recent improvements to the sensor system.

  10. Color camera computed tomography imaging spectrometer for improved spatial-spectral image accuracy

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Bearman, Gregory H. (Inventor); Johnson, William R. (Inventor)

    2011-01-01

    Computed tomography imaging spectrometers ("CTIS"s) having color focal plane array detectors are provided. The color FPA detector may comprise a digital color camera including a digital image sensor, such as a Foveon X3® digital image sensor or a Bayer color filter mosaic. In another embodiment, the CTIS includes a pattern imposed either directly on the object scene being imaged or at the field stop aperture. The use of a color FPA detector and the pattern improves the accuracy of the captured spatial and spectral information.

  11. International Symposium on Applications of Ferroelectrics

    DTIC Science & Technology

    1993-02-01

    ...in the neighborhood of the Curie point. A high dielectric constant is also useful in many imaging applications...The technology of producing monolithic IR detectors using...a linear array of sensors. Each detector (pixel), or group of them, can thus produce a signal...Work on new infrared (IR) sensors is at present...recorded. The signal beam was expanded to 15 mm to carry images and was then...a certain input image (or a partial one) is illuminated only with the...

  12. Wavelet compression techniques for hyperspectral data

    NASA Technical Reports Server (NTRS)

    Evans, Bruce; Ringer, Brian; Yeates, Mathew

    1994-01-01

    Hyperspectral sensors are electro-optic sensors which typically operate in visible and near infrared bands. Their characteristic property is the ability to resolve a relatively large number (i.e., tens to hundreds) of contiguous spectral bands to produce a detailed profile of the electromagnetic spectrum. In contrast, multispectral sensors measure relatively few non-contiguous spectral bands. Like multispectral sensors, hyperspectral sensors are often also imaging sensors, measuring spectra over an array of spatial resolution cells. The data produced may thus be viewed as a three dimensional array of samples in which two dimensions correspond to spatial position and the third to wavelength. Because they multiply the already large storage/transmission bandwidth requirements of conventional digital images, hyperspectral sensors generate formidable torrents of data. Their fine spectral resolution typically results in high redundancy in the spectral dimension, so that hyperspectral data sets are excellent candidates for compression. Although there have been a number of studies of compression algorithms for multispectral data, we are not aware of any published results for hyperspectral data. Three algorithms for hyperspectral data compression are compared. They were selected as representatives of three major approaches for extending conventional lossy image compression techniques to hyperspectral data. The simplest approach treats the data as an ensemble of images and compresses each image independently, ignoring the correlation between spectral bands. The second approach transforms the data to decorrelate the spectral bands, and then compresses the transformed data as a set of independent images. The third approach directly generalizes two-dimensional transform coding by applying a three-dimensional transform as part of the usual transform-quantize-entropy code procedure. The algorithms studied all use the discrete wavelet transform. In the first two cases, a wavelet transform coder was used for the two-dimensional compression. The third case used a three dimensional extension of this same algorithm.
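
    A sketch of the first and third approaches described above, under the assumption that the hyperspectral cube is a NumPy array of shape (bands, rows, cols) and that the PyWavelets package is available; soft thresholding here stands in for the quantize and entropy-code stages of a real coder.

        # Band-wise 2-D versus full 3-D discrete wavelet compression of a hyperspectral cube.
        import numpy as np
        import pywt

        def compress_bandwise(cube, wavelet="db4", level=3, thresh=0.05):
            """Approach 1: 2-D wavelet transform of each spectral band independently."""
            out = []
            for band in cube:
                coeffs = pywt.wavedec2(band, wavelet, level=level)
                coeffs = [coeffs[0]] + [tuple(pywt.threshold(d, thresh, mode="soft") for d in c)
                                        for c in coeffs[1:]]
                out.append(pywt.waverec2(coeffs, wavelet))
            return np.stack(out)

        def compress_3d(cube, wavelet="db4", level=2, thresh=0.05):
            """Approach 3: a single 3-D transform exploiting spectral as well as spatial redundancy."""
            coeffs = pywt.wavedecn(cube, wavelet, level=level)
            coeffs = [coeffs[0]] + [{k: pywt.threshold(v, thresh, mode="soft") for k, v in d.items()}
                                    for d in coeffs[1:]]
            return pywt.waverecn(coeffs, wavelet)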

  13. Arrays of Nano Tunnel Junctions as Infrared Image Sensors

    NASA Technical Reports Server (NTRS)

    Son, Kyung-Ah; Moon, Jeong S.; Prokopuk, Nicholas

    2006-01-01

    Infrared image sensors based on high-density rectangular planar arrays of nano tunnel junctions have been proposed. These sensors would differ fundamentally from prior infrared sensors based, variously, on bolometry or conventional semiconductor photodetection. Infrared image sensors based on conventional semiconductor photodetection must typically be cooled to cryogenic temperatures to reduce noise to acceptably low levels. Some bolometer-type infrared sensors can be operated at room temperature, but they exhibit low detectivities and long response times, which limit their utility. The proposed infrared image sensors could be operated at room temperature without incurring excessive noise, and would exhibit high detectivities and short response times. Other advantages would include low power demand, high resolution, and tailorability of spectral response. Being neither bolometers nor conventional semiconductor photodetectors, the basic detector units as proposed would partly resemble rectennas. Nanometer-scale tunnel junctions would be created by the crossing of nanowires with quantum-mechanical-barrier layers in the form of thin layers of electrically insulating material between them (see figure). A microscopic dipole antenna sized and shaped to respond maximally in the infrared wavelength range that one seeks to detect would be formed integrally with the nanowires at each junction. An incident signal in that wavelength range would become coupled into the antenna and, through the antenna, to the junction. At the junction, the flow of electrons between the crossing wires would be dominated by quantum-mechanical tunneling rather than thermionic emission. Relative to thermionic emission, quantum-mechanical tunneling is a fast process.

  14. Handheld colorimeter for determination of heavy metal concentrations

    NASA Astrophysics Data System (ADS)

    López Ruiz, N.; Ariza, M.; Martínez Olmos, A.; Vukovic, J.; Palma, A. J.; Capitan-Vallvey, L. F.

    2011-08-01

    A portable instrument that measures heavy metal concentration using a colorimetric sensor array is presented. The use of eight sensing membranes, placed on a plastic support, makes it possible to obtain the hue component of the HSV colour space for each membrane in order to determine the concentration of metals present in a solution. The developed microcontroller-based system captures, in an ambient light environment, an image of the sensor array using an integrated micro-camera and shows the picture on a touch micro-LCD screen which acts as the user interface. After image processing of the regions of interest selected by the user, colour and concentration information are displayed on the screen.
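
    A minimal sketch, not the device firmware: extracting the HSV hue of one membrane's region of interest and converting it to a concentration through an assumed calibration curve; the linear fit coefficients below are placeholders, not values from the record.

        # Hue extraction and placeholder hue-to-concentration calibration.
        import colorsys
        import numpy as np

        def roi_hue(rgb_roi):
            """rgb_roi: (h, w, 3) array of 0-255 values for one membrane's region of interest."""
            r, g, b = (rgb_roi.reshape(-1, 3).mean(axis=0) / 255.0)
            return colorsys.rgb_to_hsv(r, g, b)[0] * 360.0          # hue in degrees

        def concentration_from_hue(hue_deg, slope=-0.8, intercept=250.0):
            """Placeholder inverse calibration: metal concentration (e.g. ppm) from hue."""
            return max((hue_deg - intercept) / slope, 0.0)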

  15. Development of Ultra-Low Noise, High Performance III-V Quantum Well Infrared Photodetectors (QWIPs) for Focal Plane Array Staring Image Sensor Systems

    DTIC Science & Technology

    1994-02-06

    Ultra-Low Noise, High Performance III-V Quantum Well Infrared Photodetectors (QWIPs) for Focal Plane Array Staring Image Sensor Systems. In a QWIP, the noise is increased by the square root of the gain g, and the detectivity D* is reduced by this same factor. As shown in Fig. 3.18, the optimum...

  16. Redundancy Analysis of Capacitance Data of a Coplanar Electrode Array for Fast and Stable Imaging Processing

    PubMed Central

    Wen, Yintang; Zhang, Zhenda; Zhang, Yuyan; Sun, Dongtao

    2017-01-01

    A coplanar electrode array sensor is established for imaging in composite-material adhesive-layer defect detection. The sensor is based on the capacitive edge effect, which makes the capacitance data considerably weak and susceptible to environmental noise. The inverse problem of coplanar-array electrical capacitance tomography (C-ECT) is ill-conditioned, so a small error in the capacitance data can seriously affect the quality of the reconstructed images. In order to achieve a stable image reconstruction process, a redundancy analysis method for capacitance data is proposed. The proposed method is based on contribution rate and anti-interference capability. According to the redundancy analysis, the capacitance data are divided into valid and invalid data. When the image is reconstructed from the valid data only, the sensitivity matrix needs to be changed accordingly. In order to evaluate the effectiveness of the sensitivity map, singular value decomposition (SVD) is used. Finally, the two-dimensional (2D) and three-dimensional (3D) images are reconstructed by the Tikhonov regularization method. Comparison with images reconstructed from the raw capacitance data shows that the stability of the image reconstruction process is improved while the quality of the reconstructed images is not degraded. As a result, a large amount of invalid data need not be collected, and the data acquisition time can also be reduced. PMID:29295537
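
    A minimal sketch of the linearized reconstruction step is given below. It is not the authors' code: the sensitivity matrix, the defect distribution, and the "validity" test are stand-ins, but it shows how Tikhonov regularization, row deletion for invalid measurements, and an SVD check of the reduced sensitivity matrix fit together.

      import numpy as np

      rng = np.random.default_rng(1)
      n_meas, n_pix = 66, 400            # e.g. a 12-electrode pair count and a 20x20 pixel grid
      S = rng.standard_normal((n_meas, n_pix))              # stand-in sensitivity matrix
      g_true = np.zeros(n_pix); g_true[150:170] = 1.0       # toy defect distribution
      c = S @ g_true + 0.01 * rng.standard_normal(n_meas)   # noisy "capacitance" data

      def tikhonov(S, c, lam=1e-2):
          """Solve min ||S g - c||^2 + lam ||g||^2 via the regularized normal equations."""
          return np.linalg.solve(S.T @ S + lam * np.eye(S.shape[1]), S.T @ c)

      # Keep only "valid" measurements; this crude magnitude test merely stands in for the
      # contribution-rate / anti-interference analysis described above.
      valid = np.abs(c) > np.quantile(np.abs(c), 0.2)

      # SVD of the reduced sensitivity matrix, as used above to judge the sensitivity map.
      sv = np.linalg.svd(S[valid], compute_uv=False)
      print(f"kept {valid.sum()}/{n_meas} measurements, condition number ~ {sv[0] / sv[-1]:.1e}")

      err_all = np.linalg.norm(tikhonov(S, c) - g_true)
      err_valid = np.linalg.norm(tikhonov(S[valid], c[valid]) - g_true)
      print(f"reconstruction error: all data {err_all:.2f}, valid data only {err_valid:.2f}")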

  17. Multifrequency Ultra-High Resolution Miniature Scanning Microscope Using Microchannel And Solid-State Sensor Technologies And Method For Scanning Samples

    NASA Technical Reports Server (NTRS)

    Wang, Yu (Inventor)

    2006-01-01

    A miniature, ultra-high resolution, color scanning microscope using microchannel and solid-state technology that does not require focus adjustment. One embodiment includes a source of collimated radiant energy for illuminating a sample; a plurality of narrow-angle filters comprising a microchannel structure that permits the passage of only unscattered radiant energy through the microchannels, with some portion of the radiant energy entering the microchannels from the sample; a solid-state sensor array attached to the microchannel structure, each microchannel being aligned with an element of the solid-state sensor array so that the portion of the radiant energy entering a microchannel parallel to its walls travels to the sensor element and generates an electrical signal from which an image is reconstructed by an external device; and a moving element for moving the microchannel structure relative to the sample. Also disclosed is a method for scanning samples whereby the sensor array elements trace parallel paths that are arbitrarily close to the parallel paths traced by other elements of the array.

  18. Compressive Sensing Image Sensors-Hardware Implementation

    PubMed Central

    Dadkhah, Mohammadreza; Deen, M. Jamal; Shirani, Shahram

    2013-01-01

    The compressive sensing (CS) paradigm uses simultaneous sensing and compression to provide an efficient image acquisition technique. The main advantages of the CS method include high resolution imaging using low resolution sensor arrays and faster image acquisition. Since the imaging philosophy in CS imagers is different from conventional imaging systems, new physical structures have been developed for cameras that use the CS technique. In this paper, a review of different hardware implementations of CS encoding in optical and electrical domains is presented. Considering the recent advances in CMOS (complementary metal–oxide–semiconductor) technologies and the feasibility of performing on-chip signal processing, important practical issues in the implementation of CS in CMOS sensors are emphasized. In addition, the CS coding for video capture is discussed. PMID:23584123
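
    The underlying signal-recovery principle (independent of any particular CMOS implementation) can be sketched in a few lines: a sparse signal is measured with a random matrix far shorter than the signal and recovered by iterative soft-thresholding (ISTA) for the l1-regularized least-squares problem. All sizes and parameters below are illustrative.

      import numpy as np

      rng = np.random.default_rng(2)
      n, m, k = 256, 64, 8                              # signal length, measurements, nonzeros
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
      Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random measurement matrix
      y = Phi @ x_true                                  # compressive measurements

      def ista(Phi, y, lam=0.02, n_iter=500):
          """Iterative soft-thresholding for min 0.5||Phi x - y||^2 + lam ||x||_1."""
          L = np.linalg.norm(Phi, 2) ** 2               # Lipschitz constant of the gradient
          x = np.zeros(Phi.shape[1])
          for _ in range(n_iter):
              g = x - (Phi.T @ (Phi @ x - y)) / L       # gradient step
              x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)   # soft threshold
          return x

      x_hat = ista(Phi, y)
      print(f"relative error = {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.3f}")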

  19. Imaging through water turbulence with a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2016-09-01

    A plenoptic sensor can be used to improve the image formation process in a conventional camera. Through this process, the conventional image is mapped to an image array that represents the image's photon paths along different angular directions. It can therefore be used to resolve imaging problems where severe distortion occurs. In particular, for objects observed at moderate range (10 m to 200 m) through turbulent water, the image can be distorted beyond recognition and correction algorithms need to be applied. In this paper, we show how to use a plenoptic sensor to recover an unknown object in the line of sight through significant water turbulence distortion. In general, our approach can be applied to both atmospheric turbulence and water turbulence conditions.

  20. SPIDER: Next Generation Chip Scale Imaging Sensor Update

    NASA Astrophysics Data System (ADS)

    Duncan, A.; Kendrick, R.; Ogden, C.; Wuchenich, D.; Thurman, S.; Su, T.; Lai, W.; Chun, J.; Li, S.; Liu, G.; Yoo, S. J. B.

    2016-09-01

    The Lockheed Martin Advanced Technology Center (LM ATC) and the University of California at Davis (UC Davis) are developing an electro-optical (EO) imaging sensor called SPIDER (Segmented Planar Imaging Detector for Electro-optical Reconnaissance) that seeks to provide a 10x to 100x size, weight, and power (SWaP) reduction alternative to the traditional bulky optical telescope and focal-plane detector array. The substantial reductions in SWaP would reduce cost and/or provide higher resolution by enabling a larger-aperture imager in a constrained volume. Our SPIDER imager replaces the traditional optical telescope and digital focal plane detector array with a densely packed interferometer array based on emerging photonic integrated circuit (PIC) technologies that samples the object being imaged in the Fourier domain (i.e., spatial frequency domain), and then reconstructs an image. Our approach replaces the large optics and structures required by a conventional telescope with PICs that are accommodated by standard lithographic fabrication techniques (e.g., complementary metal-oxide-semiconductor (CMOS) fabrication). The standard EO payload integration and test process that involves precision alignment and test of optical components to form a diffraction limited telescope is, therefore, replaced by in-process integration and test as part of the PIC fabrication, which substantially reduces associated schedule and cost. This paper provides an overview of performance data on the second-generation PIC for SPIDER developed under the Defense Advanced Research Projects Agency (DARPA)'s SPIDER Zoom research funding. We also update the design description of the SPIDER Zoom imaging sensor and the second-generation PIC (high- and low resolution versions).

  1. Backside illuminated CMOS-TDI line scanner for space applications

    NASA Astrophysics Data System (ADS)

    Cohen, O.; Ben-Ari, N.; Nevo, I.; Shiloah, N.; Zohar, G.; Kahanov, E.; Brumer, M.; Gershon, G.; Ofer, O.

    2017-09-01

    A new multi-spectral line scanner CMOS image sensor is reported. The backside illuminated (BSI) image sensor was designed for continuous scanning Low Earth Orbit (LEO) space applications and includes custom high-quality CMOS active pixels, a Time Delayed Integration (TDI) mechanism that increases the SNR, a 2-phase exposure mechanism that increases the dynamic Modulation Transfer Function (MTF), very low power internal Analog to Digital Converters (ADC) with a resolution of 12 bits per pixel, and an on-chip controller. The sensor has 4 independent arrays of pixels where each array is arranged in 2600 TDI columns with controllable TDI depth from 8 up to 64 TDI levels. A multispectral optical filter with a specific spectral response per array is assembled at the package level. In this paper we briefly describe the sensor design and present recent electrical and electro-optical measurements of the first prototypes, including high Quantum Efficiency (QE), high MTF, wide-range selectable Full Well Capacity (FWC), excellent linearity of approximately 1.3% in a signal range of 5-85% and approximately 1.75% in a signal range of 2-95% of the signal span, readout noise of approximately 95 electrons with 64 TDI levels, negligible dark current, and power consumption of less than 1.5 W total for the 4-band sensor under all operating conditions.

  2. Sweetwater, Texas Large N Experiment

    NASA Astrophysics Data System (ADS)

    Sumy, D. F.; Woodward, R.; Barklage, M.; Hollis, D.; Spriggs, N.; Gridley, J. M.; Parker, T.

    2015-12-01

    From 7 March to 30 April 2014, NodalSeismic, Nanometrics, and IRIS PASSCAL conducted a collaborative, spatially-dense seismic survey with several thousand nodal short-period geophones complemented by a backbone array of broadband sensors near Sweetwater, Texas. This pilot project demonstrated the efficacy of industry and academic partnerships and leveraged a larger, commercial 3D survey to collect passive source seismic recordings to image the subsurface. This innovative deployment of a large-N mixed-mode array allows industry to explore array geometries and investigate the value of broadband recordings, while affording academics a dense wavefield imaging capability and an operational model for high volume instrument deployment. The broadband array consists of 25 continuously-recording stations from IRIS PASSCAL and Nanometrics, with an array design that maximized recording of horizontal-traveling seismic energy for surface wave analysis over the primary target area with sufficient offset for imaging objectives at depth. In addition, 2639 FairfieldNodal Zland nodes from NodalSeismic were deployed in three sub-arrays: the outlier, backbone, and active source arrays. The backbone array consisted of 292 nodes that covered the entire survey area, while the outlier array consisted of 25 continuously-recording nodes distributed at a ~3 km distance away from the survey perimeter. Both the backbone and outlier arrays provide valuable constraints for the passive source portion of the analysis. This project serves as a learning platform to develop best practices in the support of large-N arrays with joint industry and academic expertise. Here we investigate lessons learned from a facility perspective, and present examples of data from the various sensors and array geometries. We will explore first-order results from local and teleseismic earthquakes, and show visualizations of the data across the array. Data are archived at the IRIS DMC under station codes XB and 1B.

  3. A Detailed Look at the Performance Characteristics of the Lightning Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Zhang, Daile; Cummins, Kenneth L.; Bitzer, Phillip; Koshak, William J.

    2018-01-01

    The Lightning Imaging Sensor (LIS) on board the Tropical Rainfall Measuring Mission (TRMM) effectively reached its end of life on April 15, 2015 after 17+ years of observation. Given the wealth of information in the archived LIS lightning data, and growing use of optical observations of lightning from space throughout the world, it is still of importance to better understand LIS calibration and performance characteristics. In this work, we continue our efforts to quantify the optical characteristics of the LIS pixel array, and to further characterize the detection efficiency and location accuracy of LIS. The LIS pixel array was partitioned into four quadrants, each having its own signal amplifier and digital conversion hardware. In addition, the sensor optics resulted in a decreasing sensitivity with increasing displacement from the center of the array. These engineering limitations resulted in differences in the optical emissions detected across the pixel array. Our work to date has shown a 20% increase in the count of the lightning events detected in one of the LIS quadrants, because of a lower detection threshold. In this study, we will discuss our work in progress on these limitations, and their potential impact on the group- and flash-level parameters.

  4. Microlens array processor with programmable weight mask and direct optical input

    NASA Astrophysics Data System (ADS)

    Schmid, Volker R.; Lueder, Ernst H.; Bader, Gerhard; Maier, Gert; Siegordner, Jochen

    1999-03-01

    We present an optical feature extraction system with a microlens array processor. The system is suitable for online implementation of a variety of transforms such as the Walsh transform and DCT. Operating with incoherent light, our processor accepts direct optical input. Employing a sandwich-like architecture, we obtain a very compact design of the optical system. The key elements of the microlens array processor are a square array of 15 × 15 spherical microlenses on an acrylic substrate and a spatial light modulator as a transmissive mask. The light distribution behind the mask is imaged onto the pixels of a customized a-Si image sensor with adjustable gain. We obtain one output sample for each microlens image and its corresponding weight mask area as the summation of the transmitted intensity within one sensor pixel. The resulting architecture is very compact and robust like a conventional camera lens while incorporating a high degree of parallelism. We successfully demonstrate a Walsh transform into the spatial frequency domain as well as the implementation of a discrete cosine transform with digitized gray values. We provide results showing the transformation performance for both synthetic image patterns and images of natural texture samples. The extracted frequency features are suitable for neural classification of the input image. Other transforms and correlations can be implemented in real-time allowing adaptive optical signal processing.

  5. Blur spot limitations in distal endoscope sensors

    NASA Astrophysics Data System (ADS)

    Yaron, Avi; Shechterman, Mark; Horesh, Nadav

    2006-02-01

    In years past, the picture quality of electronic video systems was limited by the image sensor. In the present, the resolution of miniature image sensors, as in medical endoscopy, is typically superior to the resolution of the optical system. This "excess resolution" is utilized by Visionsense to create stereoscopic vision. Visionsense has developed a single chip stereoscopic camera that multiplexes the horizontal dimension of the image sensor into two (left and right) images, compensates the blur phenomena, and provides additional depth resolution without sacrificing planar resolution. The camera is based on a dual-pupil imaging objective and an image sensor coated by an array of microlenses (a plenoptic camera). The camera has the advantage of being compact, providing simultaneous acquisition of left and right images, and offering resolution comparable to a dual chip stereoscopic camera with low to medium resolution imaging lenses. A stereoscopic vision system provides an improved 3-dimensional perspective of intra-operative sites that is crucial for advanced minimally invasive surgery and contributes to surgeon performance. An additional advantage of single chip stereo sensors is improvement of tolerance to electronic signal noise.

  6. Multispectral Filter Arrays: Recent Advances and Practical Implementation

    PubMed Central

    Lapray, Pierre-Jean; Wang, Xingbo; Thomas, Jean-Baptiste; Gouton, Pierre

    2014-01-01

    Thanks to technical progress in interference filter design based on different technologies, we can now successfully implement the concept of multispectral filter array-based sensors. This article provides the relevant state-of-the-art for multispectral imaging systems and presents the characteristics of the elements of our multispectral sensor as a case study. The spectral characteristics are based on two different spatial arrangements that distribute eight different bandpass filters in the visible and near-infrared area of the spectrum. We demonstrate that the system is viable and evaluate its performance through sensor spectral simulation. PMID:25407904

  7. Artificial Roughness Encoding with a Bio-inspired MEMS- based Tactile Sensor Array

    PubMed Central

    Oddo, Calogero Maria; Beccai, Lucia; Felder, Martin; Giovacchini, Francesco; Carrozza, Maria Chiara

    2009-01-01

    A compliant 2×2 tactile sensor array was developed and investigated for roughness encoding. State-of-the-art cross-shaped 3D MEMS sensors were integrated with polymeric packaging, providing in total 16 elements sensitive to external mechanical stimuli in an area of about 20 mm2, similar to the SA1 innervation density in humans. Experimental analysis of the bio-inspired tactile sensor array was performed by using ridged surfaces, with spatial periods from 2.6 mm to 4.1 mm, which were indented with a regulated 1 N normal force and stroked at constant sliding velocities from 15 mm/s to 48 mm/s. A repeatable and expected frequency shift of the sensor outputs depending on the applied stimulus and on its scanning velocity was observed between 3.66 Hz and 18.46 Hz with an overall maximum error of 1.7%. The tactile sensor could also perform contact imaging during static stimulus indentation. The experiments demonstrated the suitability of this approach for the design of a roughness encoding tactile sensor for an artificial fingerpad. PMID:22412304
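
    The reported frequency range is consistent with the simple kinematic relation f = v / spatial period, as the short check below shows.

      # For a ridged surface stroked at constant velocity, the fundamental frequency
      # seen by each taxel is expected at f = v / spatial_period.
      spatial_periods_mm = (2.6, 4.1)      # ridge spatial periods used above
      velocities_mm_s = (15.0, 48.0)       # sliding velocities used above

      for p in spatial_periods_mm:
          for v in velocities_mm_s:
              print(f"period {p} mm, velocity {v} mm/s -> f = {v / p:.2f} Hz")
      # The extreme cases give 15/4.1 = 3.66 Hz and 48/2.6 = 18.46 Hz, matching the
      # observed range of the sensor outputs.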

  8. Joint estimation of high resolution images and depth maps from light field cameras

    NASA Astrophysics Data System (ADS)

    Ohashi, Kazuki; Takahashi, Keita; Fujii, Toshiaki

    2014-03-01

    Light field cameras are attracting much attention as tools for acquiring 3D information of a scene through a single camera. The main drawback of typical lenselet-based light field cameras is the limited resolution. This limitation comes from the structure where a microlens array is inserted between the sensor and the main lens. The microlens array projects the 4D light field onto a single 2D image sensor at the sacrifice of the resolution; the angular resolution and the positional resolution trade off against each other under the fixed resolution of the image sensor. This fundamental trade-off remains after the raw light field image is converted to a set of sub-aperture images. The purpose of our study is to estimate a higher resolution image from low resolution sub-aperture images using a framework of super-resolution reconstruction. In this reconstruction, these sub-aperture images should be registered as accurately as possible. This registration is equivalent to depth estimation. Therefore, we propose a method where super-resolution and depth refinement are performed alternately. Most of the process of our method is implemented by image processing operations. We present several experimental results using a Lytro camera, where we increased the resolution of a sub-aperture image by a factor of three horizontally and vertically. Our method can produce clearer images compared to the original sub-aperture images and the case without depth refinement.

  9. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    PubMed

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

    We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. A 20-nm half-pitch URA fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory is presented that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
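
    The measurement/decoding chain can be sketched as follows. This is not the CXRO device: a true URA permits decoding by a simple correlation with a matched pattern, whereas the toy below uses a random binary mask and regularized FFT deconvolution of the circular correlation record, which illustrates the same scan-then-digitally-reconstruct idea.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 128
      scene = np.zeros(n); scene[40] = 1.0; scene[70:75] = 0.5   # 1D toy scene
      mask = (rng.random(n) < 0.5).astype(float)                 # binary aperture pattern

      # Photodiode record: one intensity sample per lateral shift of the mask.
      record = np.array([np.dot(np.roll(mask, s), scene) for s in range(n)])

      # Digital reconstruction: regularized FFT inversion of the circular correlation.
      M = np.fft.fft(mask)
      R = np.fft.fft(record)
      scene_hat = np.real(np.fft.ifft(R * M / (np.abs(M) ** 2 + 1e-3)))
      print("strongest recovered feature at index", int(np.argmax(scene_hat)))   # the impulse at 40 should dominate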

  10. Indium antimonide large-format detector arrays

    NASA Astrophysics Data System (ADS)

    Davis, Mike; Greiner, Mark

    2011-06-01

    Large format infrared imaging sensors are required to achieve simultaneously high-resolution and wide-field-of-view image data. Infrared sensors are generally required to be cooled from room temperature to cryogenic temperatures in less than 10 min thousands of times during their lifetime. The challenge is to remove mechanical stress, which is due to different materials with different coefficients of expansion, over a very wide temperature range and, at the same time, provide high-sensitivity and high-resolution image data. These challenges are met by developing a hybrid where the indium antimonide detector elements (pixels) are unconnected islands that essentially float on a silicon substrate and form a near-perfect match to the silicon read-out circuit. Since the pixels are unconnected and isolated from each other, the array is reticulated. This paper shows that the front-side-illuminated, reticulated-element indium antimonide focal planes developed at L-3 Cincinnati Electronics are robust, approach the background-limited sensitivity limit, and provide the resolution expected of the reticulated pixel array.

  11. Optomechanical System Development of the AWARE Gigapixel Scale Camera

    NASA Astrophysics Data System (ADS)

    Son, Hui S.

    Electronic focal plane arrays (FPA) such as CMOS and CCD sensors have dramatically improved to the point that digital cameras have essentially phased out film (except in very niche applications such as hobby photography and cinema). However, the traditional method of mating a single lens assembly to a single detector plane, as required for film cameras, is still the dominant design used in cameras today. The use of electronic sensors and their ability to capture digital signals that can be processed and manipulated post acquisition offers much more freedom of design at system levels and opens up many interesting possibilities for the next generation of computational imaging systems. The AWARE gigapixel scale camera is one such computational imaging system. By utilizing a multiscale optical design, in which a large aperture objective lens is mated with an array of smaller, well corrected relay lenses, we are able to build an optically simple system that is capable of capturing gigapixel scale images via post acquisition stitching of the individual pictures from the array. Properly shaping the array of digital cameras allows us to form an effectively continuous focal surface using off the shelf (OTS) flat sensor technology. This dissertation details developments and physical implementations of the AWARE system architecture. It illustrates the optomechanical design principles and system integration strategies we have developed through the course of the project by summarizing the results of the two design phases for AWARE: AWARE-2 and AWARE-10. These systems represent significant advancements in the pursuit of scalable, commercially viable snapshot gigapixel imaging systems and should serve as a foundation for future development of such systems.

  12. Chromatic Modulator for High Resolution CCD or APS Devices

    NASA Technical Reports Server (NTRS)

    Hartley, Frank T. (Inventor); Hull, Anthony B. (Inventor)

    2003-01-01

    A system for providing high-resolution color separation in electronic imaging. Comb drives controllably oscillate a red-green-blue (RGB) color strip filter system (or otherwise) over an electronic imaging system such as a charge-coupled device (CCD) or active pixel sensor (APS). The color filter is modulated over the imaging array at a rate three or more times the frame rate of the imaging array. In so doing, the underlying active imaging elements are then able to detect separate color-separated images, which are then combined to provide a color-accurate frame which is then recorded as the representation of the recorded image. High pixel resolution is maintained. Registration is obtained between the color strip filter and the underlying imaging array through the use of electrostatic comb drives in conjunction with a spring suspension system.

  13. A generic FPGA-based detector readout and real-time image processing board

    NASA Astrophysics Data System (ADS)

    Sarpotdar, Mayuresh; Mathew, Joice; Safonova, Margarita; Murthy, Jayant

    2016-07-01

    For space-based astronomical observations, it is important to have a mechanism to capture the digital output from the standard detector for further on-board analysis and storage. We have developed a generic (application-wise) field-programmable gate array (FPGA) board to interface with an image sensor, a method to generate the clocks required to read the image data from the sensor, and a real-time, on-chip image processor system which can be used for various image processing tasks. The FPGA board is applied as the image processor board in the Lunar Ultraviolet Cosmic Imager (LUCI) and a star sensor (StarSense) - instruments developed by our group. In this paper, we discuss the various design considerations for this board and its applications in future balloon flights and possible space flights.

  14. Measuring MEG closer to the brain: Performance of on-scalp sensor arrays

    PubMed Central

    Iivanainen, Joonas; Stenroos, Matti; Parkkonen, Lauri

    2017-01-01

    Optically-pumped magnetometers (OPMs) have recently reached sensitivity levels required for magnetoencephalography (MEG). OPMs do not need cryogenics and can thus be placed within millimetres from the scalp into an array that adapts to the individual head size and shape, thereby reducing the distance from cortical sources to the sensors. Here, we quantified the improvement in recording MEG with hypothetical on-scalp OPM arrays compared to a 306-channel state-of-the-art SQUID array (102 magnetometers and 204 planar gradiometers). We simulated OPM arrays that measured either normal (nOPM; 102 sensors), tangential (tOPM; 204 sensors), or all components (aOPM; 306 sensors) of the magnetic field. We built forward models based on magnetic resonance images of 10 adult heads; we employed a three-compartment boundary element model and distributed current dipoles evenly across the cortical mantle. Compared to the SQUID magnetometers, nOPM and tOPM yielded 7.5 and 5.3 times higher signal power, while the correlations between the field patterns of source dipoles were reduced by factors of 2.8 and 3.6, respectively. Values of the field-pattern correlations were similar across nOPM, tOPM and SQUID gradiometers. Volume currents reduced the signals of primary currents on average by 10%, 72% and 15% in nOPM, tOPM and SQUID magnetometers, respectively. The information capacities of the OPM arrays were clearly higher than that of the SQUID array. The dipole-localization accuracies of the arrays were similar while the minimum-norm-based point-spread functions were on average 2.4 and 2.5 times more spread for the SQUID array compared to nOPM and tOPM arrays, respectively. PMID:28007515

  15. Impact imaging of aircraft composite structure based on a model-independent spatial-wavenumber filter.

    PubMed

    Qiu, Lei; Liu, Bin; Yuan, Shenfang; Su, Zhongqing

    2016-01-01

    The spatial-wavenumber filtering technique is an effective approach to distinguish the propagation direction and wave mode of Lamb waves in the spatial-wavenumber domain. It has therefore been increasingly studied for damage evaluation in recent years. For on-line impact monitoring in practical applications, however, the main problem is how to realize spatial-wavenumber filtering of the impact signal when high-spatial-resolution wavenumber measurements are not available or an accurate wavenumber curve cannot be modeled. In this paper, a new model-independent spatial-wavenumber filter based impact imaging method is proposed. In this method, a 2D cross-shaped array constructed from two linear piezoelectric (PZT) sensor arrays is used to acquire the impact signal on-line. The continuous complex Shannon wavelet transform is adopted to extract the frequency narrowband signals from the frequency wideband impact response signals of the PZT sensors. A model-independent spatial-wavenumber filter is designed based on the spatial-wavenumber filtering technique. Based on the designed filter, a wavenumber searching and best match mechanism is proposed to implement the spatial-wavenumber filtering of the frequency narrowband signals without modeling, which can be used to obtain a wavenumber-time image of the impact relative to a linear PZT sensor array. By using the two wavenumber-time images of the 2D cross-shaped array, the impact direction can be estimated without a blind angle. The impact distance relative to the 2D cross-shaped array can be calculated by using the difference of time-of-flight between the frequency narrowband signals of two different central frequencies and the corresponding group velocities. The validations performed on a carbon fiber composite laminate plate and an aircraft composite oil tank show a good impact localization accuracy of the model-independent spatial-wavenumber filter based impact imaging method. Copyright © 2015 Elsevier B.V. All rights reserved.
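
    The wavenumber-time imaging step can be illustrated with a toy linear array. The sketch below is not the authors' method: the wavelet-based narrowband extraction is bypassed by simulating a narrowband packet directly, and a spatial FFT across the array at each time sample produces a wavenumber-time image whose ridge sits at the wavenumber of the incoming wave.

      import numpy as np

      fs = 1.0e6                      # sample rate, Hz
      t = np.arange(0, 2e-3, 1 / fs)  # 2 ms record
      n_sens, pitch = 16, 0.01        # 16 PZT sensors, 10 mm pitch
      xs = np.arange(n_sens) * pitch
      f0, c = 50e3, 1500.0            # narrowband centre frequency (Hz) and phase velocity (m/s)
      k0 = 2 * np.pi * f0 / c         # true wavenumber, rad/m

      def packet(t, delay):
          """Synthetic narrowband packet arriving at one sensor with the given delay."""
          env = np.exp(-((t - delay) / 1.5e-4) ** 2)
          return env * np.cos(2 * np.pi * f0 * (t - delay))

      signals = np.stack([packet(t, 5e-4 + x / c) for x in xs])   # shape (n_sens, n_time)

      # Wavenumber-time image: spatial FFT across the array for every time sample.
      n_k = 256
      kt_image = np.abs(np.fft.fft(signals, n=n_k, axis=0))       # (n_k, n_time)
      k_axis = 2 * np.pi * np.fft.fftfreq(n_k, d=pitch)           # rad/m

      # The ridge of the image at the packet's arrival time gives the wavenumber estimate.
      i_t = np.argmax(np.max(kt_image, axis=0))
      i_k = np.argmax(kt_image[:, i_t])
      print(f"estimated |k| = {abs(k_axis[i_k]):.1f} rad/m, true k = {k0:.1f} rad/m")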

  16. Performance Evaluation of a Biometric System Based on Acoustic Images

    PubMed Central

    Izquierdo-Fuente, Alberto; del Val, Lara; Jiménez, María I.; Villacorta, Juan J.

    2011-01-01

    An acoustic electronic scanning array for acquiring images from a person using a biometric application is developed. Based on pulse-echo techniques, multifrequency acoustic images are obtained for a set of positions of a person (front, front with arms outstretched, back and side). Two Uniform Linear Arrays (ULA) with 15 λ/2-equispaced sensors have been employed, using different spatial apertures in order to reduce sidelobe levels. Working frequencies have been designed on the basis of the main lobe width, the grating lobe levels and the frequency responses of people and sensors. For a case-study with 10 people, the acoustic profiles, formed by all images acquired, are evaluated and compared in a mean square error sense. Finally, system performance, using False Match Rate (FMR)/False Non-Match Rate (FNMR) parameters and the Receiver Operating Characteristic (ROC) curve, is evaluated. On the basis of the obtained results, this system could be used for biometric applications. PMID:22163708

  17. Autonomous collection of dynamically-cued multi-sensor imagery

    NASA Astrophysics Data System (ADS)

    Daniel, Brian; Wilson, Michael L.; Edelberg, Jason; Jensen, Mark; Johnson, Troy; Anderson, Scott

    2011-05-01

    The availability of imagery simultaneously collected from sensors of disparate modalities enhances an image analyst's situational awareness and expands the overall detection capability to a larger array of target classes. Dynamic cooperation between sensors is increasingly important for the collection of coincident data from multiple sensors either on the same or on different platforms suitable for UAV deployment. Of particular interest is autonomous collaboration between wide area survey detection, high-resolution inspection, and RF sensors that span large segments of the electromagnetic spectrum. The Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL) is building sensors with such networked communications capability and is conducting field tests to demonstrate the feasibility of collaborative sensor data collection and exploitation. Example survey / detection sensors include: NuSAR (NRL Unmanned SAR), a UAV compatible synthetic aperture radar system; microHSI, an NRL developed lightweight hyper-spectral imager; RASAR (Real-time Autonomous SAR), a lightweight podded synthetic aperture radar; and N-WAPSS-16 (Nighttime Wide-Area Persistent Surveillance Sensor-16Mpix), a MWIR large array gimbaled system. From these sensors, detected target cues are automatically sent to the NRL/SDL developed EyePod, a high-resolution, narrow FOV EO/IR sensor, for target inspection. In addition to this cooperative data collection, EyePod's real-time, autonomous target tracking capabilities will be demonstrated. Preliminary results and target analysis will be presented.

  18. CMOS image sensor with lateral electric field modulation pixels for fluorescence lifetime imaging with sub-nanosecond time response

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Seo, Min-Woong; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2016-04-01

    This paper presents the design and implementation of a time-resolved CMOS image sensor with a high-speed lateral electric field modulation (LEFM) gating structure for time domain fluorescence lifetime measurement. Time-windowed signal charge can be transferred from a pinned photodiode (PPD) to a pinned storage diode (PSD) by turning on a pair of transfer gates, which are situated beside the channel. Unwanted signal charge can be drained from the PPD to the drain by turning on another pair of gates. The pixel array contains 512 (V) × 310 (H) pixels with 5.6 × 5.6 µm2 pixel size. The imager chip was fabricated using 0.11 µm CMOS image sensor process technology. The prototype sensor has a time response of 150 ps at 374 nm. The fill factor of the pixels is 5.6%. The usefulness of the prototype sensor is demonstrated for fluorescence lifetime imaging through simulation and measurement results.
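
    As a small, generic illustration of time-gated lifetime estimation (not this sensor's calibration), two gated intensities on a single-exponential decay give the rapid-lifetime-determination estimate tau = dt / ln(I1/I2); the lifetime, gate width, and delay below are assumed values.

      import numpy as np

      tau_true = 4.0e-9          # 4 ns fluorescence lifetime (assumed)
      dt = 2.0e-9                # separation between the two gates (assumed)
      gate = 2.0e-9              # gate width (assumed)

      def gated_intensity(t_open, width, tau):
          """Integral of exp(-t/tau) over one gate starting at t_open."""
          return tau * (np.exp(-t_open / tau) - np.exp(-(t_open + width) / tau))

      I1 = gated_intensity(0.0, gate, tau_true)
      I2 = gated_intensity(dt, gate, tau_true)
      tau_est = dt / np.log(I1 / I2)
      print(f"estimated lifetime = {tau_est * 1e9:.2f} ns")   # recovers 4.00 ns exactly here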

  19. Miniaturized optical wavelength sensors

    NASA Astrophysics Data System (ADS)

    Kung, Helen Ling-Ning

    Recently, semiconductor processing technology has been applied to the miniaturization of optical wavelength sensors. Compact sensors enable new applications such as integrated diode-laser wavelength monitors and frequency lockers, portable chemical and biological detection, and portable and adaptive hyperspectral imaging arrays. Small sensing systems have trade-offs between resolution, operating range, throughput, multiplexing and complexity. We have developed a new wavelength sensing architecture that balances these parameters for applications involving hyperspectral imaging spectrometer arrays. In this thesis we discuss and demonstrate two new wavelength-sensing architectures whose single-pixel designs can easily be extended into spectrometer arrays. The first class of devices is based on sampling a standing wave. These devices are based on measuring the wavelength-dependent period of optical standing waves formed by the interference of forward and reflected waves at a mirror. We fabricated two different devices based on this principle. The first device is a wavelength monitor, which measures the wavelength and power of a monochromatic source. The second device is a spectrometer that can also act as a selective spectral coherence sensor. The spectrometer contains a large displacement piston-motion MEMS mirror and a thin GaAs photodiode flip-chip bonded to a quartz substrate. The performance of this spectrometer is similar to that of a Michelson in resolution, operating range, throughput and multiplexing but with the added advantages of fewer components and one-dimensional architecture. The second class of devices is based on the Talbot self-imaging effect. The Talbot effect occurs when a periodic object is illuminated with a spatially coherent wave. Periodically spaced self-images are formed behind the object. The spacing of the self-images is proportional to the wavelength of the incident light. We discuss and demonstrate how this effect can be used for spectroscopy. In the conclusion we compare these two new miniaturized spectrometer architectures to existing miniaturized spectrometers. We believe that the combination of miniaturized wavelength sensors and smart processing should facilitate the development of real-time, adaptive, and portable sensing systems.
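
    The standing-wave principle can be illustrated numerically: the intensity in front of the mirror has spatial period lambda/2, so a Fourier transform of samples taken along the mirror normal recovers the wavelength. The wavelength and sampling step below are assumed example values, not the thesis devices' parameters.

      import numpy as np

      wavelength = 850e-9                      # m, example source wavelength
      z = np.arange(0, 40e-6, 50e-9)           # sampling positions in front of the mirror
      intensity = 1.0 - np.cos(4 * np.pi * z / wavelength)   # standing-wave fringe pattern

      spectrum = np.abs(np.fft.rfft(intensity - intensity.mean()))
      freqs = np.fft.rfftfreq(z.size, d=z[1] - z[0])          # spatial frequencies, cycles/m
      peak = np.argmax(spectrum)
      estimated_wavelength = 2.0 / freqs[peak]                # period = lambda/2, so lambda = 2/frequency
      print(f"estimated wavelength = {estimated_wavelength * 1e9:.0f} nm")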

  20. Learning receptor positions from imperfectly known motions

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Mulligan, Jeffrey B.

    1990-01-01

    An algorithm is described for learning image interpolation functions for sensor arrays whose sensor positions are somewhat disordered. The learning is based on failures of translation invariance, so it does not require knowledge of the images being presented to the visual system. Previously reported implementations of the method assumed the visual system to have precise knowledge of the translations. It is demonstrated that translation estimates computed from the imperfectly interpolated images can have enough accuracy to allow the learning process to converge to a correct interpolation.

  1. Information-Efficient Spectral Imaging Sensor With Tdi

    DOEpatents

    Rienstra, Jeffrey L.; Gentry, Stephen M.; Sweatt, William C.

    2004-01-13

    A programmable optical filter for use in multispectral and hyperspectral imaging employing variable gain time delay and integrate arrays. A telescope focuses an image of a scene onto at least one TDI array that is covered by a multispectral filter that passes separate bandwidths of light onto the rows in the TDI array. The variable gain feature of the TDI array allows individual rows of pixels to be attenuated individually. The attenuations are functions of the magnitudes of the positive and negative components of a spectral basis vector. The spectral basis vector is constructed so that its positive elements emphasize the presence of a target and its negative elements emphasize the presence of the constituents of the background of the imaged scene. This system provides for a very efficient determination of the presence of the target, as opposed to the very data intensive data manipulations that are required in conventional hyperspectral imaging systems.
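
    The idea behind the variable row gains can be sketched independently of the hardware: splitting a spectral basis vector into its positive and negative parts, applying them as gains to two TDI accumulations, and differencing reproduces the ordinary matched-filter dot product. The basis vector and band data below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)
      n_bands, n_cols = 8, 32
      w = np.array([0.9, 0.4, -0.2, -0.6, 0.7, -0.3, 0.1, -0.5])   # spectral basis vector (hypothetical)
      band_images = rng.random((n_bands, n_cols))                   # one scan line per spectral band

      # Hardware analogue: each TDI row integrates its band scaled by a non-negative gain.
      pos_gain = np.clip(w, 0, None)            # gains for the "positive" accumulation
      neg_gain = np.clip(-w, 0, None)           # gains for the "negative" accumulation
      score_tdi = pos_gain @ band_images - neg_gain @ band_images

      # Reference: straightforward dot product of the basis vector with each pixel spectrum.
      score_direct = w @ band_images
      print("max difference:", np.max(np.abs(score_tdi - score_direct)))   # ~0 up to rounding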

  2. NASA Tech Briefs, April 2006

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The topics covered include: 1) Replaceable Sensor System for Bioreactor Monitoring; 2) Unitary Shaft-Angle and Shaft-Speed Sensor Assemblies; 3) Arrays of Nano Tunnel Junctions as Infrared Image Sensors; 4) Catalytic-Metal/PdO(sub x)/SiC Schottky-Diode Gas Sensors; 5) Compact, Precise Inertial Rotation Sensors for Spacecraft; 6) Universal Controller for Spacecraft Mechanisms; 7) The Flostation - an Immersive Cyberspace System; 8) Algorithm for Aligning an Array of Receiving Radio Antennas; 9) Single-Chip T/R Module for 1.2 GHz; 10) Quantum Entanglement Molecular Absorption Spectrum Simulator; 11) FuzzObserver; 12) Internet Distribution of Spacecraft Telemetry Data; 13) Semi-Automated Identification of Rocks in Images; 14) Pattern-Recognition Algorithm for Locking Laser Frequency; 15) Designing Cure Cycles for Matrix/Fiber Composite Parts; 16) Controlling Herds of Cooperative Robots; 17) Modification of a Limbed Robot to Favor Climbing; 18) Vacuum-Assisted, Constant-Force Exercise Device; 19) Production of Tuber-Inducing Factor; 20) Quantum-Dot Laser for Wavelengths of 1.8 to 2.3 micron; 21) Tunable Filter Made From Three Coupled WGM Resonators; and 22) Dynamic Pupil Masking for Phasing Telescope Mirror Segments.

  3. Wide-area SWIR arrays and active illuminators

    NASA Astrophysics Data System (ADS)

    MacDougal, Michael; Hood, Andrew; Geske, Jon; Wang, Chad; Renner, Daniel; Follman, David; Heu, Paula

    2012-01-01

    We describe the factors that go into the component choices for a short wavelength infrared (SWIR) imager, which include the SWIR sensor, the lens, and the illuminator. We describe the factors for reducing dark current and show that we can achieve well below 1.5 nA/cm2 for 15 μm devices at 7°C. We have mated our InGaAs detector arrays to 640x512 readout integrated circuits (ROICs) to make focal plane arrays (FPAs). In addition, we have fabricated high definition 1920x1080 FPAs for wide field of view imaging. The resulting FPAs are capable of imaging photon fluxes with wavelengths between 1 and 1.6 microns at low light levels. The dark current associated with these FPAs is extremely low, exhibiting a mean dark current density of 0.26 nA/cm2 at 0°C. FLIR has also developed a high definition, 1920x1080, 15 um pitch SWIR sensor. In addition, FLIR has developed laser arrays that provide flat illumination in scenes that are normally light-starved. The illuminators have 40% wall-plug efficiency and provide low-speckle illumination, yielding artifact-free imagery compared with conventional laser illuminators.

  4. Optical fibres in pre-detector signal processing

    NASA Astrophysics Data System (ADS)

    Flinn, A. R.

    The basic form of conventional electro-optic sensors is described. The main drawback of these sensors is their inability to deal with the background radiation which usually accompanies the signal. This 'clutter' limits the sensor's performance long before other noise such as 'shot' noise. Pre-detector signal processing using the complex amplitude of the light is introduced as a means to discriminate between the signal and 'clutter'. Further improvements to predetector signal processors can be made by the inclusion of optical fibres allowing radiation to be used with greater efficiency and enabling certain signal processing tasks to be carried out with an ease unequalled by any other method. The theory of optical waveguides and their application in sensors, interferometers, and signal processors is reviewed. Geometrical aspects of the formation of linear and circular interference fringes are described along with temporal and spatial coherence theory and their relationship to Michelson's visibility function. The requirements for efficient coupling of a source into singlemode and multimode fibres are given. We describe interference experiments between beams of light emitted from a few metres of two or more, singlemode or multimode, optical fibres. Fresnel's equation is used to obtain expressions for Fresnel and Fraunhofer diffraction patterns which enable electro-optic (E-O) sensors to be analysed by Fourier optics. Image formation is considered when the aperture plane of an E-O sensor is illuminated with partially coherent light. This allows sensors to be designed using optical transfer functions which are sensitive to the spatial coherence of the illuminating light. Spatial coherence sensors which use gratings as aperture plane reticles are discussed. By using fibre arrays, spatial coherence processing enables E-O sensors to discriminate between a spatially coherent source and an incoherent background. The sensors enable the position and wavelength of the source to be determined. Experiments are described which use optical fibre arrays as masks for correlation with spatial distributions of light in image planes of E-O sensors. Correlations between laser light from different points in a scene are investigated by interfering the light emitted from an array of fibres, placed in the image plane of a sensor, with each other. Temporal signal processing experiments show that the visibility of interference fringes gives information about path differences in a scene or through an optical system. Most E-O sensors employ wavelength filtering of the detected radiation to improve their discrimination and this is shown to be less selective than temporal coherence filtering which is sensitive to spectral bandwidth. Experiments using fibre interferometers to discriminate between red and blue laser light by their bandwidths are described. In most cases the path difference need only be a few tens of centimetres. We consider spatial and temporal coherence in fibres. We show that high visibility interference fringes can be produced by red and blue laser light transmitted through over 100 metres of singlemode or multimode fibre. The effect of detector size, relative to speckle size, is considered for fringes produced by multimode fibres. The effect of dispersion on the coherence of the light emitted from fibres is considered in terms of correlation and interference between modes. We describe experiments using a spatial light modulator called SIGHT-MOD. The device is used in various systems as a fibre optic switch and as a programmable aperture plane reticle. The contrast of the device is measured using red and green, HeNe, sources. Fourier transform images of patterns on the SIGHT-MOD are obtained and used to demonstrate the geometrical manipulation of images using 2D fibre arrays. Correlation of Fourier transform images of the SIGHT-MOD with 2D fibre arrays is demonstrated.

  5. Compressive spectral testbed imaging system based on thin-film color-patterned filter arrays.

    PubMed

    Rueda, Hoover; Arguello, Henry; Arce, Gonzalo R

    2016-11-20

    Compressive spectral imaging systems can reliably capture multispectral data using far fewer measurements than traditional scanning techniques. In this paper, a thin-film patterned filter array-based compressive spectral imager is demonstrated, including its optical design and implementation. The use of a patterned filter array entails a single-step three-dimensional spatial-spectral coding on the input data cube, which provides higher flexibility on the selection of voxels being multiplexed on the sensor. The patterned filter array is designed and fabricated with micrometer pitch size thin films, referred to as pixelated filters, with three different wavelengths. The performance of the system is evaluated in terms of references measured by a commercially available spectrometer and the visual quality of the reconstructed images. Different distributions of the pixelated filters, including random and optimized structures, are explored.
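
    The single-step spatial-spectral coding can be modelled compactly: each detector pixel records the data-cube voxel selected by the pixelated filter at that location. The sketch below (with a random three-wavelength mosaic and a toy cube, not the testbed's calibrated pattern) shows only this forward coding; reconstructing the cube from the coded frame is the compressive inverse problem addressed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      rows, cols, n_lambda = 64, 64, 3
      cube = rng.random((rows, cols, n_lambda))            # toy spectral data cube

      # Patterned filter array: one of the three pixelated filters at every pixel.
      pattern = rng.integers(0, n_lambda, size=(rows, cols))

      # Forward model: the sensor measurement is the spectrally selected (coded) cube.
      mask = np.eye(n_lambda)[pattern]                     # (rows, cols, n_lambda) one-hot code
      measurement = np.sum(mask * cube, axis=-1)           # what the monochrome sensor reads

      # Sanity check: the coded frame equals the voxel picked by the pattern at each pixel.
      assert np.allclose(measurement,
                         np.take_along_axis(cube, pattern[..., None], axis=-1)[..., 0])
      print("coded frame shape:", measurement.shape)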

  6. Simulated and Real Sheet-of-Light 3D Object Scanning Using a-Si:H Thin Film PSD Arrays.

    PubMed

    Contreras, Javier; Tornero, Josep; Ferreira, Isabel; Martins, Rodrigo; Gomes, Luis; Fortunato, Elvira

    2015-11-30

    A MATLAB/SIMULINK software simulation model (structure and component blocks) has been constructed in order to view and analyze the potential of the PSD (Position Sensitive Detector) array concept technology before it is further expanded or developed. This simulation allows changing most of its parameters, such as the number of elements in the PSD array, the direction of vision, the viewing/scanning angle, the object rotation, translation, sample/scan/simulation time, etc. In addition, results show for the first time the possibility of scanning an object in 3D when using an a-Si:H thin film 128 PSD array sensor and hardware/software system. Moreover, this sensor technology is able to perform these scans and render 3D objects at high speeds and high resolutions when using a sheet-of-light laser within a triangulation platform. As shown by the simulation, a substantial enhancement in 3D object profile image quality and realism can be achieved by increasing the number of elements of the PSD array sensor as well as by achieving an optimal position response from the sensor since clearly the definition of the 3D object profile depends on the correct and accurate position response of each detector as well as on the size of the PSD array.

  7. A Planar Two-Dimensional Superconducting Bolometer Array for the Green Bank Telescope

    NASA Technical Reports Server (NTRS)

    Benford, Dominic; Staguhn, Johannes G.; Chervenak, James A.; Chen, Tina C.; Moseley, S. Harvey; Wollack, Edward J.; Devlin, Mark J.; Dicker, Simon R.; Supanich, Mark

    2004-01-01

    In order to provide high sensitivity rapid imaging at 3.3mm (90GHz) for the Green Bank Telescope - the world's largest steerable aperture - a camera is being built by the University of Pennsylvania, NASA/GSFC, and NRAO. The heart of this camera is an 8x8 close-packed, Nyquist-sampled detector array. We have designed and are fabricating a functional superconducting bolometer array system using a monolithic planar architecture. Read out by SQUID multiplexers, the superconducting transition edge sensors will provide fast, linear, sensitive response for high performance imaging. This will provide the first ever superconducting bolometer array on a facility instrument.

  8. Superconducting Detector Arrays for Astrophysics

    NASA Technical Reports Server (NTRS)

    Chervenak, James

    2008-01-01

    The next generation of astrophysics instruments will feature an order of magnitude more photon sensors or sensors that have an order of magnitude greater sensitivity. Since detector noise scales with temperature, a number of candidate technologies have been developed that use the intrinsic advantages of detector systems that operate below 1 Kelvin. Many of these systems employ the superconducting phenomena that occur in metals at these temperatures to build ultrasensitive detectors and low-noise, low-power readout architectures. I will present one such system in use today to meet the needs of the astrophysics community at millimeter and x-ray wavelengths. Our group at NASA in collaboration with Princeton, NIST, Boulder and a number of other groups is building large format arrays of superconducting transition edge sensors (TES) read out with multiplexed superconducting quantum interference devices (SQUID). I will present the high sensitivity we have achieved in multiplexed x-ray sensors with the TES technology and describe the construction of a 1000-sensor TES/SQUID array for microwave measurements. With our collaboration's deployment of a kilopixel TES array for 2 mm radiation at the Atacama Cosmology Telescope in November 2007, we have obtained first images of the lensed Cosmic Microwave Background at fine angular scales.

  9. Comparisons between wave directional spectra from SAR and pressure sensor arrays

    NASA Technical Reports Server (NTRS)

    Pawka, S. S.; Inman, D. L.; Hsiao, S. V.; Shemdin, O. H.

    1980-01-01

    Simultaneous directional wave measurements were made at Torrey Pines Beach, California, by a synthetic aperture radar (SAR) and a linear array of pressure sensors. The measurements were conducted during the West Coast Experiment in March 1977. Quantitative comparisons of the normalized directional spectra from the two systems were made for wave periods of 6.9-17.0 s. The comparison results were variable but generally showed good agreement of the primary mode of the normalized directional energy. An attempt was made to quantify the physical criteria for good wave imaging in the SAR. A frequency band analysis of wave parameters such as band energy, slope, and orbital velocity did not show good correlation with the directional comparisons. It is noted that absolute values of the wave height spectrum cannot be derived from the SAR images yet and, consequently, no comparisons of absolute energy levels with corresponding array measurements were intended.

  10. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    PubMed

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distributions of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with a read-out circuit. Apyrase and acetylcholinesterase (AChE), as selective elements, are used to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their blocking capability is demonstrated. The results are used to design the spacing among enzyme-immobilized pixels and the null H+ sensor to minimize the undesired signal overlap due to H+ diffusion. Using this bio-image sensor, we can obtain H+ diffusion-independent imaging of concentration gradients of ATP and ACh in real-time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. With the proposed bio-image sensor the possibility exists for customizable monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or generating enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Film cameras or digital sensors? The challenge ahead for aerial imaging

    USGS Publications Warehouse

    Light, D.L.

    1996-01-01

    Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at 11 µm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid state charge coupled device linear and area arrays can yield quality resolution (7 to 12 µm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and put into a suitable homogeneous scene that is acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems show that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system for the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
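
    The quoted 432 million pixel figure is consistent with scanning a standard 23 cm by 23 cm aerial frame at the stated 11 micrometre spot size; the frame size in the quick check below is an assumption, not something stated in the abstract.

      # Quick consistency check of the pixel-count figure, assuming a 23 cm x 23 cm frame.
      frame_mm = 230.0          # assumed frame width/height, mm
      spot_um = 11.0            # scanning spot size quoted above, micrometres

      pixels_per_side = frame_mm * 1000.0 / spot_um
      total_pixels = pixels_per_side ** 2
      print(f"{total_pixels / 1e6:.0f} million pixels")   # about 437 million, close to the 432 million quoted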

  12. The Spaceborne Imaging Radar program: SIR-C - The next step toward EOS

    NASA Technical Reports Server (NTRS)

    Evans, Diane; Elachi, Charles; Cimino, Jobea

    1987-01-01

    The NASA Shuttle Imaging Radar SIR-C experiments will investigate earth surface and environment phenomena to deepen understanding of terra firma, biosphere, hydrosphere, cryosphere, and atmosphere components of the earth system, capitalizing on the observational capabilities of orbiting multiparameter radar sensors alone or in combination with other sensors. The SIR-C sensor encompasses an antenna array, an exciter, receivers, a data-handling network, and the ground SAR processor. It will be possible to steer the antenna beam electronically, so that the radar look angle can be varied.

  13. Tomographic Imaging on Distributed Unattended Ground Sensor Arrays

    DTIC Science & Technology

    2002-05-14

    communication, the recently released Bluetooth standard warrants investigation into its usefulness on ground sensors. Although not as powerful or as fast...

  14. Layer by layer: complex analysis with OCT technology

    NASA Astrophysics Data System (ADS)

    Florin, Christian

    2017-03-01

    Standard visualisation systems capture two-dimensional images and require more or less fast image processing systems. The ASP array (active sensor pixel array) opens a new approach to imaging: each pixel is provided with its own lens and its own signal pre-processing. The OCT technology works in real time with high accuracy. In ASP array systems, data acquisition and signal processing functionality is integrated at the pixel level. For the extraction of interferometric features, the time-of-flight (TOF) principle is used. The ASP architecture allows the optical signal to be demodulated within each pixel at rates up to 100 kHz and the amplitude and phase to be reconstructed. The dynamic range of image capture with the ASP array is higher by two orders of magnitude than that of conventional image sensors. The OCT technology allows topographic imaging in real time with extremely high geometric spatial resolution. The optical path length is generated by an axial movement of the reference mirror. The amplitude-modulated optical signal has a carrier frequency proportional to the scan rate and contains the depth information; each maximum of the signal envelope corresponds to a reflection (or scattering) within the sample. The ASP array produces 300 x 300 axial interferograms simultaneously, which adjoin each other on all sides. In contrast to standard OCT systems, the signal demodulation used to detect the envelope is not limited by the frame rate of the ASP array. When an optical signal arrives at a pixel of the ASP array, an electrical signal is generated; the background is faded out to avoid saturation of pixels by high light intensity. The sampled signal is continuously multiplied by a reference signal of the same frequency in two paths whose phases are shifted by 90 degrees from each other, and the products are averaged. The outputs of the two paths are routed to the PC, where the envelope amplitude and the phase are used to compute a three-dimensional tomographic image. Specially designed ASP arrays with a very high image rate are available for 3D measurement. If ASP arrays are coupled with the OCT method, layer thicknesses can be determined without contact, sealing seams can be inspected, and geometrical shapes can be measured. From a stack of hundreds of single OCT images, images of interest can be selected and fed to the computer for analysis.
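
    The per-pixel demodulation described above is standard I/Q (lock-in) detection: multiply the sampled signal by two references 90 degrees apart, low-pass the products, and combine them into amplitude and phase. The sketch below applies that step to a synthetic interferogram; the sample rate, carrier frequency, and envelope are assumed values, not the ASP hardware's.

      import numpy as np

      fs = 400e3                         # sample rate, Hz (assumed)
      f_carrier = 20e3                   # carrier set by the reference-mirror scan rate (assumed)
      t = np.arange(0, 5e-3, 1 / fs)

      # Synthetic interferogram: a Gaussian envelope (one reflecting layer) on the carrier.
      envelope_true = np.exp(-((t - 2.5e-3) / 5e-4) ** 2)
      phase_true = 0.7
      signal = envelope_true * np.cos(2 * np.pi * f_carrier * t + phase_true)

      def lowpass(x, n_avg=41):
          """Simple moving-average low-pass filter standing in for the per-pixel integrator."""
          kernel = np.ones(n_avg) / n_avg
          return np.convolve(x, kernel, mode='same')

      i_channel = lowpass(signal * np.cos(2 * np.pi * f_carrier * t))
      q_channel = lowpass(signal * np.sin(2 * np.pi * f_carrier * t))
      amplitude = 2.0 * np.hypot(i_channel, q_channel)     # recovered envelope
      phase = np.arctan2(-q_channel, i_channel)            # recovered carrier phase

      peak = np.argmax(amplitude)
      print(f"envelope peak at t = {t[peak] * 1e3:.2f} ms, phase there = {phase[peak]:.2f} rad")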

  15. A DUAL-BAND MILLIMETER-WAVE KINETIC INDUCTANCE CAMERA FOR THE IRAM 30 m TELESCOPE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monfardini, A.; Benoit, A.; Bideaud, A.

    The Neel IRAM KIDs Array (NIKA) is a fully integrated measurement system based on kinetic inductance detectors (KIDs) currently being developed for millimeter-wave astronomy. The instrument includes dual-band optics allowing simultaneous imaging at 150 GHz and 220 GHz. The imaging sensors consist of two spatially separated arrays of KIDs. The first array, mounted on the 150 GHz branch, is composed of 144 lumped-element KIDs. The second array (220 GHz) consists of 256 antenna-coupled KIDs. Each of the arrays is sensitive to a single polarization; the band splitting is achieved by using a grid polarizer. The optics and sensors are mounted in a custom dilution cryostat with an operating temperature of approximately 70 mK. Electronic readout is realized using frequency multiplexing and a transmission-line geometry consisting of a coaxial cable connected in series with the sensor array and a low-noise 4 K amplifier. The dual-band NIKA was successfully tested in October 2010 at the Institute for Millimetric Radio Astronomy (IRAM) 30 m telescope at Pico Veleta, Spain, performing in line with laboratory predictions. The optical NEP was calculated to be around 2 x 10^-16 W Hz^-1/2 (at 1 Hz) under a background loading of approximately 4 pW per pixel. This improvement in comparison with a preliminary run (2009) verifies that NIKA is approaching the target sensitivity for photon-noise-limited ground-based detectors. Taking advantage of the larger arrays and increased sensitivity, a number of scientifically relevant faint and extended objects were imaged, including the Galactic Center SgrB2 (FIR1), the radio galaxy Cygnus A, and the Seyfert galaxy NGC 1068. These targets were all observed simultaneously in the 150 GHz and 220 GHz atmospheric windows.

  16. Thermal microphotonic sensor and sensor array

    DOEpatents

    Watts, Michael R [Albuquerque, NM; Shaw, Michael J [Tijeras, NM; Nielson, Gregory N [Albuquerque, NM; Lentine, Anthony L [Albuquerque, NM

    2010-02-23

    A thermal microphotonic sensor is disclosed for detecting infrared radiation using heat generated by the infrared radiation to shift the resonant frequency of an optical resonator (e.g. a ring resonator) to which the heat is coupled. The shift in the resonant frequency can be determined from light in an optical waveguide which is evanescently coupled to the optical resonator. An infrared absorber can be provided on the optical waveguide either as a coating or as a plate to aid in absorption of the infrared radiation. In some cases, a vertical resonant cavity can be formed about the infrared absorber to further increase the absorption of the infrared radiation. The sensor can be formed as a single device, or as an array for imaging the infrared radiation.
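
    As a back-of-the-envelope illustration of the readout principle in this record, the measured shift of the ring's resonant wavelength can be converted to absorbed infrared power through the thermo-optic effect and the thermal conductance to the surroundings. The sketch below uses invented, order-of-magnitude parameter values; none are taken from the patent.

```python
# Minimal sketch: infer absorbed IR power from a ring-resonator resonance shift.
# All parameter values are illustrative assumptions, not from the patent.
wavelength_nm = 1550.0    # nominal resonance wavelength
dn_dT         = 1.8e-4    # thermo-optic coefficient of the waveguide material [1/K]
group_index   = 4.0       # group index of the ring waveguide
thermal_G     = 1.0e-6    # thermal conductance to the substrate [W/K]

def absorbed_power_from_shift(shift_pm: float) -> float:
    """Convert a resonance shift (picometers) to absorbed IR power (watts)."""
    # dlambda/lambda ~= (dn/dT * dT) / n_g  =>  dT = dlambda * n_g / (lambda * dn/dT)
    delta_T = (shift_pm * 1e-3) * group_index / (wavelength_nm * dn_dT)
    return thermal_G * delta_T            # steady state: P = G * dT

print(absorbed_power_from_shift(5.0))     # e.g. a 5 pm shift -> absorbed power in W
```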

  17. Colorimetric Recognition of Aldehydes and Ketones.

    PubMed

    Li, Zheng; Fang, Ming; LaGasse, Maria K; Askim, Jon R; Suslick, Kenneth S

    2017-08-07

    A colorimetric sensor array has been designed for the identification of and discrimination among aldehydes and ketones in the vapor phase. Due to rapid chemical reactions between the solid-state sensor elements and gaseous analytes, distinct color difference patterns were produced and digitally imaged for chemometric analysis. The sensor array was developed from classical spot tests using aniline and phenylhydrazine dyes that enable molecular recognition of a wide variety of aliphatic or aromatic aldehydes and ketones, as demonstrated by hierarchical cluster, principal component, and support vector machine analyses. The aldehyde/ketone-specific sensors were further employed for differentiation among and identification of ten liquor samples (whiskies, brandy, vodka) and ethanol controls, demonstrating potential applications in the beverage industry.
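
    The chemometric workflow summarized above starts from color-difference patterns; a minimal numerical sketch is to subtract before/after RGB images of the spot array, flatten each exposure into one difference vector, and project the vectors onto principal components. The array size and data below are random placeholders, not the published dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: 12 exposures of a 6x6 indicator-spot array, with RGB values
# imaged before and after exposure to an analyte.
n_exposures, n_spots = 12, 36
before = rng.uniform(0, 255, size=(n_exposures, n_spots, 3))
after = before + rng.normal(0, 20, size=(n_exposures, n_spots, 3))

# Color-difference pattern: per-spot RGB change, flattened to one vector per exposure.
diff = (after - before).reshape(n_exposures, -1)        # shape (12, 108)

# Principal component analysis via SVD of the mean-centered difference vectors.
centered = diff - diff.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                                          # exposures projected onto the PCs
explained = S**2 / np.sum(S**2)
print("first two PCs explain", round(float(explained[:2].sum()), 3), "of the variance")
```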

  18. Tracking and imaging humans on heterogeneous infrared sensor arrays for law enforcement applications

    NASA Astrophysics Data System (ADS)

    Feller, Steven D.; Zheng, Y.; Cull, Evan; Brady, David J.

    2002-08-01

    We present a plan for the integration of geometric constraints in the source, sensor and analysis levels of sensor networks. The goal of geometric analysis is to reduce the dimensionality and complexity of distributed sensor data analysis so as to achieve real-time recognition and response to significant events. Application scenarios include biometric tracking of individuals, counting and analysis of individuals in groups of humans and distributed sentient environments. We are particularly interested in using this approach to provide networks of low cost point detectors, such as infrared motion detectors, with complex imaging capabilities. By extending the capabilities of simple sensors, we expect to reduce the cost of perimeter and site security applications.

  19. Performance evaluation of a conformal thermal monitoring sheet (TMS) sensor array for measurement of surface temperature distributions during superficial hyperthermia treatments

    PubMed Central

    Arunachalam, K.; Maccarini, P.; Juang, T.; Gaeta, C.; Stauffer, P. R.

    2009-01-01

    Purpose: This paper presents a novel conformal thermal monitoring sheet (TMS) sensor array with differential thermal sensitivity for measuring temperature distributions over large surface areas. Performance of the sensor array is evaluated in terms of thermal accuracy, mechanical stability and conformity to contoured surfaces, probe self-heating under irradiation from microwave and ultrasound hyperthermia sources, and electromagnetic field perturbation. Materials and Methods: A prototype TMS with a 4×4 array of fiber-optic sensors embedded between two flexible and thermally conducting polyimide films was developed as an alternative to the standard 1-2 mm diameter plastic catheter-based probes used in clinical hyperthermia. Computed tomography images and bending tests were performed to evaluate the conformability and mechanical stability, respectively. Irradiation and thermal barrier tests were conducted, and the thermal response of the prototype was compared with round cross-section clinical probes. Results: Bending and conformity tests demonstrated higher flexibility, dimensional stability and close conformity to the human torso. Minimal perturbation of microwave fields and low probe self-heating were observed when irradiated with 915 MHz microwave and 3.4 MHz ultrasound sources. The transient and steady-state thermal responses of the TMS array were superior to those of the clinical probes. Conclusions: A conformal TMS sensor array with improved thermal sensitivity and dimensional stability was investigated for real-time skin temperature monitoring. This fixed-geometry, body-conforming array of thermal sensors allows fast and accurate characterization of two-dimensional temperature distributions over large surface areas. The prototype TMS demonstrates significant advantages over clinical probes for characterizing skin temperature distributions during hyperthermia treatments of superficial tissue disease. PMID:18465416

  20. High Dynamic Range Imaging at the Quantum Limit with Single Photon Avalanche Diode-Based Image Sensors †

    PubMed Central

    Mattioli Della Rocca, Francescopaolo

    2018-01-01

    This paper examines methods to best exploit the High Dynamic Range (HDR) of the single photon avalanche diode (SPAD) in a high fill-factor HDR photon-counting pixel that is scalable to megapixel arrays. The proposed method combines multi-exposure HDR with in-pixel temporal oversampling. We present a silicon demonstration IC with a 96 × 40 array of 8.25 µm pitch, 66% fill-factor SPAD-based pixels achieving >100 dB dynamic range with three back-to-back exposures (short, mid, long). Each pixel sums 15 bit-planes or binary field images internally to constitute one frame, providing 3.75× data compression; hence the 1k frames per second (FPS) output off-chip represents 45,000 individual field images per second on chip. Two future projections of this work are described: scaling SPAD-based image sensors to HDR 1 MPixel formats and shrinking the pixel pitch to 1–3 µm. PMID:29641479
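
    One common way to merge short, mid, and long exposures into a single HDR estimate is to convert each photon count to a rate, discard saturated samples, and average with exposure-time weights. The sketch below is a simplified illustration with invented exposure times and counts; the chip's actual merging logic may differ.

```python
import numpy as np

# Invented exposure times (s) and per-pixel photon counts for three back-to-back
# exposures (short, mid, long); max_count models pixel saturation.
exposures = np.array([1e-5, 1e-4, 1e-3])
counts = np.array([
    [3.0, 28.0, 255.0],   # bright pixel: the long exposure saturates
    [0.0,  2.0,  21.0],   # dim pixel: the short exposure is mostly noise
])
max_count = 255.0

def merge_hdr(counts, exposures, max_count):
    """Exposure-weighted merge: rate = counts / t, ignoring saturated samples."""
    rates = counts / exposures                 # photons per second
    valid = counts < max_count                 # drop saturated exposures
    weights = valid * exposures                # trust longer (less noisy) exposures more
    return (rates * weights).sum(axis=1) / weights.sum(axis=1)

print(merge_hdr(counts, exposures, max_count))  # HDR photon-flux estimate per pixel
```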

  1. Shortwave infrared 512 x 2 line sensor for earth resources applications

    NASA Astrophysics Data System (ADS)

    Tower, J. R.; Pellon, L. E.; McCarthy, B. M.; Elabd, H.; Moldovan, A. G.; Kosonocky, W. F.; Kalshoven, J. E., Jr.; Tom, D.

    1985-08-01

    As part of the NASA remote-sensing Multispectral Linear Array Program, an edge-buttable 512 x 2 IRCCD line image sensor with 30-micron Pd2Si Schottky-barrier detectors was developed for operation with passive cooling at 120 K in the 1.1-2.5 micron shortwave infrared band. On-chip CCD multiplexers provide one video output for each 512-detector band. The monolithic silicon line imager performance at a 4-ms optical integration time includes a signal-to-noise ratio of 241 for an irradiance of 7.2 microwatts/sq cm at 1.65 microns wavelength, a dynamic range of 5000, a modulation transfer function greater than 60 percent at the Nyquist frequency, and an 18-milliwatt imager chip total power dissipation. Blemish-free images with three percent nonuniformity under illumination and a nonlinearity of 1.25 percent are obtained. A five-imager SWIR hybrid focal plane was constructed, demonstrating the feasibility of arrays with only a two-detector loss at each joint.

  2. Design of multi-mode compatible image acquisition system for HD area array CCD

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Sui, Xiubao

    2014-11-01

    In line with the current trends in video surveillance toward digitization and high definition, a multimode-compatible image acquisition system for an HD area array CCD is designed. The hardware and software designs of the color video capture system for the HD area array CCD KAI-02150 from Truesense Imaging are analyzed, and the structure parameters of the HD area array CCD and the color video acquisition principle of the system are introduced. The CCD control sequence and the timing logic of the whole capture system are then realized. The noise in the video signal (kTC noise and 1/f noise) is filtered using the Correlated Double Sampling (CDS) technique to enhance the signal-to-noise ratio of the system. Compatible hardware and software designs for two other image sensors of the same series, the KAI-04050 and KAI-08050, are put forward; these two HD image sensors provide approximately four million and eight million effective pixels, respectively. A Field Programmable Gate Array (FPGA) is adopted as the key controller of the system to perform a top-down modular design, which implements the hardware design in software and improves development efficiency. Finally, the required timing is simulated with the Quartus II 12.1 development platform together with VHDL. The simulation results indicate that the driving circuit is characterized by a simple framework, low power consumption, and strong anti-interference ability, meeting the current demands for miniaturization and high definition.
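
    Correlated double sampling, as used above to suppress kTC and 1/f noise, subtracts each pixel's sampled reset level from its sampled signal level so that the random reset offset cancels. A minimal sketch with synthetic data (not tied to the KAI-02150 timing):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic column of 8 pixels: a random reset (kTC) offset plus the true signal.
true_signal = np.array([10.0, 50.0, 200.0, 80.0, 5.0, 120.0, 60.0, 30.0])
reset_offset = rng.normal(500.0, 25.0, size=8)             # random kTC offset per pixel
read_noise = lambda: rng.normal(0.0, 1.0, size=8)

reset_sample = reset_offset + read_noise()                 # sample 1: just after reset
signal_sample = reset_offset + true_signal + read_noise()  # sample 2: after charge transfer

cds_output = signal_sample - reset_sample                  # the correlated offset cancels
print(np.round(cds_output, 1))                             # close to true_signal
```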

  3. A Combined Laser-Communication and Imager for Microspacecraft (ACLAIM)

    NASA Technical Reports Server (NTRS)

    Hemmati, H.; Lesh, J.

    1998-01-01

    ACLAIM is a multi-function instrument consisting of a laser communication terminal and an imaging camera that share a common telescope. A single APS- (Active Pixel Sensor) based focal-plane-array is used to perform both the acquisition and tracking (for laser communication) and science imaging functions.

  4. Hardware-based image processing for high-speed inspection of grains

    USDA-ARS?s Scientific Manuscript database

    A high-speed, low-cost, image-based sorting device was developed to detect and separate grains with slight color differences and small defects. The device directly combines a complementary metal–oxide–semiconductor (CMOS) color image sensor with a field-programmable gate array (FPGA) which...

  5. New characterization techniques for LSST sensors

    DOE PAGES

    Nomerotski, A.

    2015-06-18

    Fully depleted, thick CCDs with extended infra-red response have become the sensor of choice for modern sky surveys. The charge transport effects in the silicon and associated astrometric distortions could make mapping between the sky coordinates and sensor coordinates non-trivial, and limit the ultimate precision achievable with these sensors. Two new characterization techniques for the CCDs, which both could probe these issues, are discussed: x-ray flat fielding and imaging of pinhole arrays.

  6. High-resolution depth profiling using a range-gated CMOS SPAD quanta image sensor.

    PubMed

    Ren, Ximing; Connolly, Peter W R; Halimi, Abderrahim; Altmann, Yoann; McLaughlin, Stephen; Gyongy, Istvan; Henderson, Robert K; Buller, Gerald S

    2018-03-05

    A CMOS single-photon avalanche diode (SPAD) quanta image sensor is used to reconstruct depth and intensity profiles when operating in a range-gated mode used in conjunction with pulsed laser illumination. By designing the CMOS SPAD array to acquire photons within a pre-determined temporal gate, the need for timing circuitry was avoided and it was therefore possible to have an enhanced fill factor (61% in this case) and a frame rate (100,000 frames per second) that is more difficult to achieve in a SPAD array which uses time-correlated single-photon counting. When coupled with appropriate image reconstruction algorithms, millimeter resolution depth profiles were achieved by iterating through a sequence of temporal delay steps in synchronization with laser illumination pulses. For photon data with high signal-to-noise ratios, depth images with millimeter scale depth uncertainty can be estimated using a standard cross-correlation approach. To enhance the estimation of depth and intensity images in the sparse photon regime, we used a bespoke clustering-based image restoration strategy, taking into account the binomial statistics of the photon data and non-local spatial correlations within the scene. For sparse photon data with total exposure times of 75 ms or less, the bespoke algorithm can reconstruct depth images with millimeter scale depth uncertainty at a stand-off distance of approximately 2 meters. We demonstrate a new approach to single-photon depth and intensity profiling using different target scenes, taking full advantage of the high fill-factor, high frame rate and large array format of this range-gated CMOS SPAD array.
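
    The "standard cross-correlation approach" mentioned above can be illustrated with a toy example: slide the known laser pulse shape across the photon-count histogram collected over the sequence of gate delays and take the delay that maximizes the correlation. The gate step, pulse width, and counts below are invented for illustration.

```python
import numpy as np

# Invented range-gated acquisition: photon counts per temporal delay step.
gate_step_ps = 50.0
n_steps = 120
delays_ps = np.arange(n_steps) * gate_step_ps

# Instrument response: a Gaussian pulse about 200 ps wide; target return near step 64.
pulse = np.exp(-0.5 * (np.arange(-10, 11) * gate_step_ps / 200.0) ** 2)
counts = np.random.default_rng(2).poisson(1.0, n_steps).astype(float)  # background counts
counts[64 - 10:64 + 11] += 40.0 * pulse                                # add signal photons

# Cross-correlate the counts with the pulse shape and locate the peak delay.
corr = np.correlate(counts - counts.mean(), pulse, mode="same")
t_peak_ps = delays_ps[np.argmax(corr)]

c = 3.0e8                                    # speed of light [m/s]
depth_m = 0.5 * (t_peak_ps * 1e-12) * c      # two-way time of flight -> range
print(round(depth_m, 3), "m")
```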

  7. Development of a c-scan photoacoutsic imaging probe for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Valluru, Keerthi S.; Chinni, Bhargava K.; Rao, Navalgund A.; Bhatt, Shweta; Dogra, Vikram S.

    2011-03-01

    Prostate cancer is the second leading cause of cancer death in American men after lung cancer. Current screening procedures include the Digital Rectal Exam (DRE) and the Prostate-Specific Antigen (PSA) test, along with Transrectal Ultrasound (TRUS). All suffer from low sensitivity and specificity in detecting prostate cancer in its early stages, so there is a pressing need for a new imaging modality. We are developing a prototype transrectal photoacoustic imaging probe to detect prostate malignancies in vivo that promises high sensitivity and specificity. To generate photoacoustic (PA) signals, the probe utilizes a high-energy 1064 nm laser that delivers light pulses onto the prostate at 10 Hz with 10 ns duration through a fiber-optic cable. The designed system generates focused C-scan planar images using acoustic lens technology. A 5 MHz custom-fabricated ultrasound sensor array located in the image plane acquires the focused PA signals, eliminating the need for any synthetic aperture focusing. The lens and sensor array design was optimized toward this objective. For fast acquisition times, a custom-built 16-channel simultaneous backend electronics PCB has been developed, consisting of a low-noise variable-gain amplifier and a 16-channel ADC. Because 2-D ultrasound arrays were unavailable, the current implementation first acquires several B-scan (depth-resolved) data sets by scanning a 1-D array, which are then processed to reconstruct either 3-D volumetric images or several C-scan planar images. Experimental results on excised tissue using an in vitro prototype of this technology are presented to demonstrate the system capability in terms of resolution and sensitivity.

  8. Multi-energy x-ray imaging and sensing for diagnostic and control of the burning plasma.

    PubMed

    Stutman, D; Tritz, K; Finkenthal, M

    2012-10-01

    New diagnostic and sensor designs are needed for future burning plasma (BP) fusion experiments, having good space and time resolution and capable of prolonged operation in the harsh BP environment. We evaluate the potential of multi-energy x-ray imaging with filtered detector arrays for BP diagnostic and control. Experimental studies show that this simple and robust technique enables measuring with good accuracy, speed, and spatial resolution the T(e) profile, impurity content, and MHD activity in a tokamak. Applied to the BP this diagnostic could also serve for non-magnetic sensing of the plasma position, centroid, ELM, and RWM instability. BP compatible x-ray sensors are proposed using "optical array" or "bi-cell" detectors.

  9. CMOS minimal array

    NASA Astrophysics Data System (ADS)

    Janesick, James; Cheng, John; Bishop, Jeanne; Andrews, James T.; Tower, John; Walker, Jeff; Grygon, Mark; Elliot, Tom

    2006-08-01

    A high-performance prototype CMOS imager is introduced. Test data are reviewed for different array formats that utilize 3T photodiode, 5T pinned-photodiode and 6T photogate CMOS pixel architectures. The imager allows several readout modes including progressive scan, snap and windowed operation. The new imager is built on different silicon substrates, including very high resistivity epitaxial wafers for deep-depletion operation. Data products contained in this paper focus on the sensor's read noise, charge capacity, charge transfer efficiency, thermal dark current, RTS dark spikes, QE, pixel cross-talk and on-chip analog circuitry performance.

  10. Development and test of an active pixel sensor detector for heliospheric imager on solar orbiter and solar probe plus

    NASA Astrophysics Data System (ADS)

    Korendyke, Clarence M.; Vourlidas, Angelos; Plunkett, Simon P.; Howard, Russell A.; Wang, Dennis; Marshall, Cheryl J.; Waczynski, Augustyn; Janesick, James J.; Elliott, Thomas; Tun, Samuel; Tower, John; Grygon, Mark; Keller, David; Clifford, Gregory E.

    2013-10-01

    The Naval Research Laboratory is developing next-generation CMOS imaging arrays for the Solar Orbiter and Solar Probe Plus missions. The device development is nearly complete, with flight device delivery scheduled for summer 2013. The 4Kx4K mosaic array with 10-micron pixels is well suited to the panoramic imaging required for the Solar Orbiter mission. The devices are robust (<100 krad) and exhibit minimal performance degradation with respect to radiation. The device design and performance are described.

  11. Image sensor system with bio-inspired efficient coding and adaptation.

    PubMed

    Okuno, Hirotsugu; Yagi, Tetsuya

    2012-08-01

    We designed and implemented an image sensor system equipped with three bio-inspired coding and adaptation strategies: logarithmic transform, local average subtraction, and feedback gain control. The system comprises a field-programmable gate array (FPGA), a resistive network, and active pixel sensors (APS), whose light intensity-voltage characteristics are controllable. The system employs multiple time-varying reset voltage signals for APS in order to realize multiple logarithmic intensity-voltage characteristics, which are controlled so that the entropy of the output image is maximized. The system also employs local average subtraction and gain control in order to obtain images with an appropriate contrast. The local average is calculated by the resistive network instantaneously. The designed system was successfully used to obtain appropriate images of objects that were subjected to large changes in illumination.
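
    A software analogue of the three coding strategies above can be sketched on a synthetic image: compress with a logarithm, subtract a local average (the role played by the resistive network), and apply a global gain so the output fills a fixed range. The array size, kernel, and gain law are illustrative choices, not the FPGA implementation.

```python
import numpy as np

rng = np.random.default_rng(3)
intensity = rng.uniform(1.0, 1e4, size=(64, 64))   # synthetic scene with wide dynamic range

# 1. Logarithmic transform compresses the intensity range.
log_img = np.log(intensity)

# 2. Local average subtraction: subtract a 5x5 neighbourhood mean (resistive-network analogue).
pad = np.pad(log_img, 2, mode="edge")
local_avg = np.zeros_like(log_img)
for i in range(log_img.shape[0]):
    for j in range(log_img.shape[1]):
        local_avg[i, j] = pad[i:i + 5, j:j + 5].mean()
contrast = log_img - local_avg

# 3. Feedback gain control: scale so the output occupies a fixed display range.
gain = 1.0 / (np.std(contrast) + 1e-12)
output = np.clip(0.5 + 0.2 * gain * contrast, 0.0, 1.0)
print(float(output.min()), float(output.max()))
```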

  12. Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2015-05-01

    We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-π, π) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensor can be continuous up to a multiple of 2π. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to occur and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its objective lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper, we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.

  13. Determining the phase and amplitude distortion of a wavefront using a plenoptic sensor

    NASA Astrophysics Data System (ADS)

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C.

    2015-05-01

    We have designed a plenoptic sensor to retrieve phase and amplitude changes resulting from a laser beam's propagation through atmospheric turbulence. Compared with the commonly restricted domain of (-π, π) in phase reconstruction by interferometers, the reconstructed phase obtained by the plenoptic sensor can be continuous up to a multiple of 2π. When compared with conventional Shack-Hartmann sensors, ambiguities caused by interference or low intensity, such as branch points and branch cuts, are less likely to occur and can be adaptively avoided by our reconstruction algorithm. In the design of our plenoptic sensor, we modified the fundamental structure of a light field camera into a mini Keplerian telescope array by accurately cascading the back focal plane of its objective lens with a microlens array's front focal plane and matching the numerical aperture of both components. Unlike light field cameras designed for incoherent imaging purposes, our plenoptic sensor operates on the complex amplitude of the incident beam and distributes it into a matrix of images that are simpler and less subject to interference than a global image of the beam. Then, with the proposed reconstruction algorithms, the plenoptic sensor is able to reconstruct the wavefront and a phase screen at an appropriate depth in the field that causes the equivalent distortion on the beam. The reconstructed results can be used to guide adaptive optics systems in directing beam propagation through atmospheric turbulence. In this paper, we will show the theoretical analysis and experimental results obtained with the plenoptic sensor and its reconstruction algorithms.

  14. Hard-X-Ray/Soft-Gamma-Ray Imaging Sensor Assembly for Astronomy

    NASA Technical Reports Server (NTRS)

    Myers, Richard A.

    2008-01-01

    An improved sensor assembly has been developed for astronomical imaging at photon energies ranging from 1 to 100 keV. The assembly includes a thallium-doped cesium iodide scintillator divided into pixels and coupled to an array of high-gain avalanche photodiodes (APDs). Optionally, the array of APDs can be operated without the scintillator to detect photons at energies below 15 keV. The array of APDs is connected to compact electronic readout circuitry that includes, among other things, 64 independent channels for detection of photons in various energy ranges, up to a maximum energy of 100 keV, at a count rate up to 3 kHz. The readout signals are digitized and processed by imaging software that performs "on-the-fly" analysis. The sensor assembly has been integrated into an imaging spectrometer, along with a pair of coded apertures (Fresnel zone plates) that are used in conjunction with the pixel layout to implement a shadow-masking technique to obtain relatively high spatial resolution without having to use extremely small pixels. Angular resolutions of about 20 arc-seconds have been measured. Thus, for example, the imaging spectrometer can be used to (1) determine both the energy spectrum of a distant x-ray source and the angular deviation of the source from the nominal line of sight of an x-ray telescope in which the spectrometer is mounted or (2) study the spatial and temporal development of solar flares, repeating gamma-ray bursters, and other phenomena that emit transient radiation in the hard-x-ray/soft-gamma-ray region of the electromagnetic spectrum.

  15. Design, optimization and evaluation of a "smart" pixel sensor array for low-dose digital radiography

    NASA Astrophysics Data System (ADS)

    Wang, Kai; Liu, Xinghui; Ou, Hai; Chen, Jun

    2016-04-01

    Amorphous silicon (a-Si:H) thin-film transistors (TFTs) have been widely used to build flat-panel X-ray detectors for digital radiography (DR). As the demand for low-dose X-ray imaging grows, a detector with high signal-to-noise-ratio (SNR) pixel architecture emerges. "Smart" pixel is intended to use a dual-gate photosensitive TFT for sensing, storage, and switch. It differs from a conventional passive pixel sensor (PPS) and active pixel sensor (APS) in that all these three functions are combined into one device instead of three separate units in a pixel. Thus, it is expected to have high fill factor and high spatial resolution. In addition, it utilizes the amplification effect of the dual-gate photosensitive TFT to form a one-transistor APS that leads to a potentially high SNR. This paper addresses the design, optimization and evaluation of the smart pixel sensor and array for low-dose DR. We will design and optimize the smart pixel from the scintillator to TFT levels and validate it through optical and electrical simulation and experiments of a 4x4 sensor array.

  16. Time-Domain Fluorescence Lifetime Imaging Techniques Suitable for Solid-State Imaging Sensor Arrays

    PubMed Central

    Li, David Day-Uei; Ameer-Beg, Simon; Arlt, Jochen; Tyndall, David; Walker, Richard; Matthews, Daniel R.; Visitkul, Viput; Richardson, Justin; Henderson, Robert K.

    2012-01-01

    We have successfully demonstrated video-rate CMOS single-photon avalanche diode (SPAD)-based cameras for fluorescence lifetime imaging microscopy (FLIM) by applying innovative FLIM algorithms. We also review and compare several time-domain techniques and solid-state FLIM systems, and adapt the proposed algorithms for massive CMOS SPAD-based arrays and hardware implementations. The theoretical error equations are derived and their performances are demonstrated on the data obtained from 0.13 μm CMOS SPAD arrays and the multiple-decay data obtained from scanning PMT systems. In vivo two photon fluorescence lifetime imaging data of FITC-albumin labeled vasculature of a P22 rat carcinosarcoma (BD9 rat window chamber) are used to test how different algorithms perform on bi-decay data. The proposed techniques are capable of producing lifetime images with enough contrast. PMID:22778606
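
    One of the simplest time-domain estimators suited to such hardware is rapid lifetime determination (RLD) from two equal-width time gates, tau = dt / ln(D1/D2) for a single-exponential decay. The sketch below is only an illustration of that estimator; the gate width and counts are invented, and it is just one of the several algorithms the paper compares.

```python
import numpy as np

def rld_lifetime(counts_gate1, counts_gate2, gate_width_ns):
    """Rapid lifetime determination: tau = dt / ln(D1 / D2) for a mono-exponential decay."""
    return gate_width_ns / np.log(counts_gate1 / counts_gate2)

# Illustrative per-pixel photon counts in two adjacent 2 ns gates (a 2x2 "image").
g1 = np.array([[400.0, 900.0], [1200.0, 300.0]])
g2 = np.array([[190.0, 350.0], [610.0, 100.0]])
print(np.round(rld_lifetime(g1, g2, gate_width_ns=2.0), 2))  # lifetime image in ns
```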

  17. A LWIR hyperspectral imager using a Sagnac interferometer and cooled HgCdTe detector array

    NASA Astrophysics Data System (ADS)

    Lucey, Paul G.; Wood, Mark; Crites, Sarah T.; Akagi, Jason

    2012-06-01

    LWIR hyperspectral imaging has a wide range of civil and military applications owing to its ability to sense chemical compositions at standoff ranges. Most recent implementations of this technology use spectrographs employing varying degrees of cryogenic cooling to reduce the sensor self-emission that can severely limit sensitivity. We have taken an interferometric approach that promises to reduce the need for cooling while preserving high resolution. Reduced cooling has multiple benefits, including faster system readiness from a power-off state, lower mass, and potentially lower cost owing to lower system complexity. We coupled an uncooled Sagnac interferometer with a 256x320 mercury cadmium telluride array with an 11 micron cutoff to produce a spatial interferometric LWIR hyperspectral imaging system operating from 7.5 to 11 microns. The sensor was tested in ground-to-ground applications and from a small aircraft, producing spectral imagery including detection of gas emission from high-vapor-pressure liquids.
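
    In a spatial Fourier-transform spectrometer of this kind, the spectrum for each scene pixel is recovered by Fourier transforming the interferogram recorded across the detector columns. A toy reconstruction with an invented optical-path-difference axis and a single emission line:

```python
import numpy as np

# Invented interferogram: OPD sampled across 320 detector columns, one spectral line.
n_cols = 320
opd_cm = np.linspace(-0.02, 0.02, n_cols)              # optical path difference [cm]
wavenumber = 1000.0                                     # a 10-micron line, in cm^-1
interferogram = 1.0 + np.cos(2 * np.pi * wavenumber * opd_cm)

# Fourier transform the mean-subtracted interferogram to recover the spectrum.
spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
freqs = np.fft.rfftfreq(n_cols, d=opd_cm[1] - opd_cm[0])   # spectral axis in cm^-1
print("recovered line near", round(float(freqs[np.argmax(spectrum)]), 1), "cm^-1")
```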

  18. Thin polymer etalon arrays for high-resolution photoacoustic imaging

    PubMed Central

    Hou, Yang; Huang, Sheng-Wen; Ashkenazi, Shai; Witte, Russell; O’Donnell, Matthew

    2009-01-01

    Thin polymer etalons are demonstrated as high-frequency ultrasound sensors for three-dimensional (3-D) high-resolution photoacoustic imaging. The etalon, a Fabry-Perot optical resonator, consists of a thin polymer slab sandwiched between two gold layers. It is probed with a scanning continuous-wave (CW) laser for ultrasound array detection. Detection bandwidth of a 20-μm-diam array element exceeds 50 MHz, and the ultrasound sensitivity is comparable to polyvinylidene fluoride (PVDF) equivalents of similar size. In a typical photoacoustic imaging setup, a pulsed laser beam illuminates the imaging target, where optical energy is absorbed and acoustic waves are generated through the thermoelastic effect. An ultrasound detection array is formed by scanning the probing laser beam on the etalon surface in either a 1-D or a 2-D configuration, which produces 2-D or 3-D images, respectively. Axial and lateral resolutions have been demonstrated to be better than 20 μm. Detailed characterizations of the optical and acoustical properties of the etalon, as well as photoacoustic imaging results, suggest that thin polymer etalon arrays can be used as ultrasound detectors for 3-D high-resolution photoacoustic imaging applications. PMID:19123679

  19. Planar and finger-shaped optical tactile sensors for robotic applications

    NASA Technical Reports Server (NTRS)

    Begej, Stefan

    1988-01-01

    Progress is described regarding the development of optical tactile sensors specifically designed for application to dexterous robotics. These sensors operate on optical principles involving the frustration of total internal reflection at a waveguide/elastomer interface and produce a grey-scale tactile image that represents the normal (vertical) forces of contact. The first tactile sensor discussed is a compact, 32 x 32 planar sensor array intended for mounting on a parallel-jaw gripper. Optical fibers were employed to convey the tactile image to a CCD camera and microprocessor-based image analysis system. The second sensor had the shape and size of a human fingertip and was designed for a dexterous robotic hand. It contained 256 sensing sites (taxels) distributed in a dual-density pattern that included a tactile fovea near the tip measuring 13 x 13 mm and containing 169 taxels. The design and construction details of these tactile sensors are presented, in addition to photographs of tactile imprints.

  20. Active-Pixel Image Sensor With Analog-To-Digital Converters

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.; Mendis, Sunetra K.; Pain, Bedabrata; Nixon, Robert H.

    1995-01-01

    Proposed single-chip integrated-circuit image sensor contains 128 x 128 array of active pixel sensors at 50-micrometer pitch. Output terminals of all pixels in each given column connected to analog-to-digital (A/D) converter located at bottom of column. Pixels scanned in semiparallel fashion, one row at a time; during time allocated to scanning row, outputs of all active pixel sensors in row fed to respective A/D converters. Design of chip based on complementary metal oxide semiconductor (CMOS) technology, and individual circuit elements fabricated according to 2-micrometer CMOS design rules. Active pixel sensors designed to operate at video rate of 30 frames/second, even at low light levels. A/D scheme based on first-order Sigma-Delta modulation.
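
    A first-order Sigma-Delta modulator of the kind referenced above can be modelled behaviourally in a few lines: the input is compared against the 1-bit feedback, the error is integrated, and the output bitstream is averaged (decimated) to recover the value. This is only a sketch of the modulation principle, not the chip's circuit.

```python
import numpy as np

def sigma_delta_first_order(x: float, n_samples: int = 1024) -> float:
    """1-bit first-order Sigma-Delta modulation of a constant input x in [0, 1]."""
    integrator, bits = 0.0, []
    for _ in range(n_samples):
        bit = 1.0 if integrator >= 0.0 else 0.0   # 1-bit quantizer
        integrator += x - bit                     # integrate the error (feedback loop)
        bits.append(bit)
    return float(np.mean(bits))                   # decimation: average the bitstream

for v in (0.125, 0.5, 0.73):
    print(v, "->", round(sigma_delta_first_order(v), 4))
```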

  1. Uncooled long-wave infrared hyperspectral imaging

    NASA Technical Reports Server (NTRS)

    Lucey, Paul G. (Inventor)

    2006-01-01

    A long-wave infrared hyperspectral sensor device employs a combination of an interferometer with an uncooled microbolometer array camera to produce hyperspectral images without the use of bulky, power-hungry motorized components, making it suitable for UAV vehicles, small mobile platforms, or in extraterrestrial environments. The sensor device can provide signal-to-noise ratios near 200 for ambient temperature scenes with 33 wavenumber resolution at a frame rate of 50 Hz, with higher results indicated by ongoing component improvements.

  2. SEM contour based metrology for microlens process studies in CMOS image sensor technologies

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Ostrovsky, Alain; Le-Gratiet, Bertrand; Berthier, Ludovic; Bidault, Laurent; Ducoté, Julien; Jamin-Mornet, Clémence; Mortini, Etienne; Besacier, Maxime

    2018-03-01

    From the first digital cameras which appeared during the 70s to cameras of current smartphones, image sensors have undergone significant technological development in the last decades. The development of CMOS image sensor technologies in the 90s has been the main driver of the recent progresses. The main component of an image sensor is the pixel. A pixel contains a photodiode connected to transistors but only the photodiode area is light sensitive. This results in a significant loss of efficiency. To solve this issue, microlenses are used to focus the incident light on the photodiode. A microlens array is made out of a transparent material and has a spherical cap shape. To obtain this spherical shape, a lithography process is performed to generate resist blocks which are then annealed above their glass transition temperature (reflow). Even if the dimensions to consider are higher than in advanced IC nodes, microlenses are sensitive to process variability during lithography and reflow. A good control of the microlens dimensions is key to optimize the process and thus the performance of the final product. The purpose of this paper is to apply SEM contour metrology [1, 2, 3, 4] to microlenses in order to develop a relevant monitoring methodology and to propose new metrics to engineers to evaluate their process or optimize the design of the microlens arrays.

  3. Realization of integral 3-dimensional image using fabricated tunable liquid lens array

    NASA Astrophysics Data System (ADS)

    Lee, Muyoung; Kim, Junoh; Kim, Cheol Joong; Lee, Jin Su; Won, Yong Hyub

    2015-03-01

    Electrowetting has been widely studied for various optical applications such as optical switches, sensors, prisms, and displays. In this study, a vari-focal liquid lens array based on the electrowetting principle is developed to construct integral 3-dimensional imaging. The electrowetting principle, in which an applied voltage changes the surface tension balance, has several advantages for active optical devices, such as fast response time, low electrical consumption, and no mechanical moving parts. Two immiscible liquids, water and oil, are used to form the lens. By applying a voltage to the water, the focal length of the lens can be tuned as the contact angle of the water changes. The fabricated electrowetting vari-focal liquid lens array consists of 1 mm diameter spherical lenses with a 1.6 mm spacing between lenses. The panel contains 23x23 lenses, and the focal length of the lens array is simultaneously tuned from -125 to 110 diopters depending on the applied voltage. The fabricated lens array is applied to integral 3-dimensional imaging: a 3D object is reconstructed by the liquid lens array from 23x23 elemental images generated with 3D Max tools when the lens array is tuned to the convex state. Based on this vari-focal liquid lens array integral imaging system, we expect that depth-enhanced integral imaging can be realized in the near future.

  4. Motion camera based on a custom vision sensor and an FPGA architecture

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel

    1998-09-01

    A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC which is used for high-level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation and the FPGA architecture used in the motion camera system.
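
    The time-of-travel computation described above can be sketched in software: for each address-event, look up the timestamp of the most recent event at the neighbouring pixel and convert the time difference into a velocity. The pixel pitch and event stream below are invented, and only one direction of travel is handled for brevity.

```python
# Sketch of velocity-from-events ("time of travel" between adjacent pixels).
# Each event is (timestamp_us, x, y); pitch_um is an assumed pixel pitch.
pitch_um = 10.0
events = [(100, 4, 7), (340, 5, 7), (590, 6, 7), (870, 7, 7)]   # an edge moving in +x

last_time = {}           # (x, y) -> timestamp of the most recent event at that pixel
velocities = []
for t, x, y in events:
    left = (x - 1, y)
    if left in last_time:                       # neighbour fired earlier: edge moved one pixel
        dt_us = t - last_time[left]
        velocities.append(pitch_um / dt_us)     # um per microsecond, i.e. m/s
    last_time[(x, y)] = t

print([round(v, 4) for v in velocities])        # estimated edge speeds in m/s
```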

  5. Microchamber arrays with an integrated long luminescence lifetime pH sensor.

    PubMed

    Poehler, Elisabeth; Pfeiffer, Simon A; Herm, Marc; Gaebler, Michael; Busse, Benedikt; Nagl, Stefan

    2016-04-01

    A pH probe with a microsecond luminescence lifetime was obtained via covalent coupling of 6-carboxynaphthofluorescein (CNF) moieties to ruthenium-tris-(1,10-phenanthroline)(2+). The probe was covalently attached to amino-modified poly-(2-hydroxyethyl)methacrylate (pHEMA) and showed a pH-dependent FRET with luminescence lifetimes of 681 to 1260 ns and a working range from ca. pH 6.5 to 9.0 with a pKa of 7.79 ± 0.14. The pH sensor matrix was integrated via spin coating as ca. 1- to 2-μm-thick layer into "CytoCapture" cell culture dishes of 6 mm in diameter. These contained a microcavity array of square-shaped regions of 40 μm length and width and 15 μm depth that was homogeneously coated with the pH sensor matrix. The sensor layer showed fast response times in both directions. A microscopic setup was developed that enabled imaging of the pH inside the microchamber arrays over many hours. As a proof of principle, we monitored the pH of Escherichia coli cell cultures grown in the microchamber arrays. The integrated sensor matrix allowed pH monitoring spatially resolved in every microchamber, and the differences in cell growth between individual chambers could be resolved and quantified.

  6. Uncooled Terahertz real-time imaging 2D arrays developed at LETI: present status and perspectives

    NASA Astrophysics Data System (ADS)

    Simoens, François; Meilhan, Jérôme; Dussopt, Laurent; Nicolas, Jean-Alain; Monnier, Nicolas; Sicard, Gilles; Siligaris, Alexandre; Hiberty, Bruno

    2017-05-01

    As in other imaging sensor markets, whatever the technology, the commercial spread of terahertz (THz) cameras has to fulfil simultaneously the criteria of high sensitivity and low cost and SWAP (size, weight and power). Monolithic silicon-based 2D sensors integrated in uncooled THz real-time cameras are good candidates to meet these requirements. Over the past decade, LETI has been studying and developing such arrays with two complementary technological approaches, i.e. antenna-coupled silicon bolometers and CMOS Field Effect Transistors (FETs), both compatible with standard silicon microelectronics processes. LETI has leveraged its know-how in thermal infrared bolometer sensors to develop a proprietary architecture for THz sensing. High technological maturity has been achieved, as illustrated by the demonstration of fast scanning of a large field of view and the recent launch of a commercial camera. In the FET-based THz field, recent work has focused on innovative CMOS read-out integrated circuit designs. The studied architectures take advantage of the large pixel pitch to enhance flexibility and sensitivity: an embedded, in-pixel configurable signal processing chain dramatically reduces the noise. Video sequences at 100 frames per second using our 31x31-pixel 2D Focal Plane Arrays (FPAs) have been achieved. The authors describe the present status of these developments, and perspectives on performance evolution are discussed. Several experimental imaging tests are also presented in order to illustrate the capability of these arrays to address industrial applications such as non-destructive testing (NDT), security and quality control of food.

  7. Development of integrated semiconductor optical sensors for functional brain imaging

    NASA Astrophysics Data System (ADS)

    Lee, Thomas T.

    Optical imaging of neural activity is a widely accepted technique for imaging brain function in the field of neuroscience research, and has been used to study the cerebral cortex in vivo for over two decades. Maps of brain activity are obtained by monitoring intensity changes in back-scattered light, called Intrinsic Optical Signals (IOS), that correspond to fluctuations in blood oxygenation and volume associated with neural activity. Current imaging systems typically employ bench-top equipment including lamps and CCD cameras to study animals using visible light. Such systems require the use of anesthetized or immobilized subjects with craniotomies, which imposes limitations on the behavioral range and duration of studies. The ultimate goal of this work is to overcome these limitations by developing a single-chip semiconductor sensor using arrays of sources and detectors operating at near-infrared (NIR) wavelengths. A single-chip implementation, combined with wireless telemetry, will eliminate the need for immobilization or anesthesia of subjects and allow in vivo studies of free behavior. NIR light offers additional advantages because it experiences less absorption in animal tissue than visible light, which allows for imaging through superficial tissues. This, in turn, reduces or eliminates the need for traumatic surgery and enables long-term brain-mapping studies in freely-behaving animals. This dissertation concentrates on key engineering challenges of implementing the sensor. This work shows the feasibility of using a GaAs-based array of vertical-cavity surface emitting lasers (VCSELs) and PIN photodiodes for IOS imaging. I begin with in-vivo studies of IOS imaging through the skull in mice, and use these results along with computer simulations to establish minimum performance requirements for light sources and detectors. I also evaluate the performance of a current commercial VCSEL for IOS imaging, and conclude with a proposed prototype sensor.

  8. STARR: shortwave-targeted agile Raman robot for the detection and identification of emplaced explosives

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Gardner, Charles W.

    2014-05-01

    In order to combat the threat of emplaced explosives (land mines, etc.), ChemImage Sensor Systems (CISS) has developed a robot-mounted, multi-sensor system capable of identification and confirmation of potential threats. The system, known as STARR (Shortwave-infrared Targeted Agile Raman Robot), utilizes shortwave infrared spectroscopy for the identification of potential threats, combined with a visible short-range standoff Raman hyperspectral imaging (HSI) system for material confirmation. The entire system is mounted on a Talon UGV (Unmanned Ground Vehicle), giving the sensor an increased area search rate and reducing the risk of injury to the operator. The Raman HSI system utilizes a fiber array spectral translator (FAST) for the acquisition of high-quality Raman chemical images, allowing for increased sensitivity and improved specificity. An overview of the design and operation of the system will be presented, along with initial detection results from the fusion sensor.

  9. Rolling Shutter Effect aberration compensation in Digital Holographic Microscopy

    NASA Astrophysics Data System (ADS)

    Monaldi, Andrea C.; Romero, Gladis G.; Cabrera, Carlos M.; Blanc, Adriana V.; Alanís, Elvio E.

    2016-05-01

    Due to the sequential-readout nature of most CMOS sensors, each row of the sensor array is exposed at a different time, resulting in the so-called rolling shutter effect, which induces geometric distortion in the image if the video camera or the object moves during image acquisition. In digital hologram recording in particular, while the sensor progressively captures each row of the hologram, the interferometric fringes can oscillate due to external vibrations and/or noise even when the object under study remains motionless. The sensor records each hologram row at a different instant of these disturbances. As a final effect, the phase information is corrupted, degrading the quality of the reconstructed holograms. We present a fast and simple method for compensating this effect based on image-processing tools. The method is exemplified with holograms of static microscopic biological objects. The results encourage adopting CMOS sensors over CCDs in Digital Holographic Microscopy because of their better resolution and lower cost.

  10. Digital sun sensor multi-spot operation.

    PubMed

    Rufino, Giancarlo; Grassi, Michele

    2012-11-28

    The operation and test of a multi-spot digital sun sensor for precise sun-line determination is described. The image-forming system consists of an opaque mask with multiple pinhole apertures producing multiple, simultaneous, spot-like images of the sun on the focal plane. The sun-line precision can be improved by averaging multiple simultaneous measurements. Nevertheless, operating the sensor over a wide field of view requires acquiring and processing images in which the number of sun spots and the related intensity levels vary widely. To this end, a reliable and robust image acquisition procedure based on a variable shutter time has been adopted, as well as a calibration function that also exploits knowledge of the sun-spot array size. The main focus of the present paper is the experimental validation of the wide-field-of-view operation of the sensor by using a sensor prototype and a laboratory test facility. Results demonstrate that high measurement precision can be maintained also for large off-boresight angles.
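
    The precision gain from averaging the simultaneous sun spots can be sketched as follows: compute each spot's centroid, convert it to a sun-line estimate using the known offset of its pinhole, and average the estimates. The focal distance, aperture grid, and noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

focal_mm = 10.0                                  # assumed mask-to-focal-plane distance
true_angles = np.array([0.05, -0.02])            # true sun line, radians (about x and y)

# A 3x3 grid of pinhole apertures; each projects its own sun spot on the focal plane.
aperture_xy = np.array([(i, j) for i in (-2.0, 0.0, 2.0) for j in (-2.0, 0.0, 2.0)])  # mm
ideal_spots = aperture_xy + focal_mm * np.tan(true_angles)          # ideal spot centroids [mm]
measured = ideal_spots + rng.normal(0.0, 0.01, ideal_spots.shape)   # add centroiding noise

# Each spot yields an independent sun-line estimate; averaging reduces the noise.
per_spot = np.arctan((measured - aperture_xy) / focal_mm)
estimate = per_spot.mean(axis=0)
print(np.degrees(estimate), "deg vs true", np.degrees(true_angles), "deg")
```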

  11. Highly sensitive and area-efficient CMOS image sensor using a PMOSFET-type photodetector with a built-in transfer gate

    NASA Astrophysics Data System (ADS)

    Seo, Sang-Ho; Kim, Kyoung-Do; Kong, Jae-Sung; Shin, Jang-Kyoo; Choi, Pyung

    2007-02-01

    In this paper, a new CMOS image sensor is presented, which uses a PMOSFET-type photodetector with a transfer gate and offers high and variable sensitivity. The proposed CMOS image sensor has been fabricated using a 0.35 μm 2-poly 4-metal standard CMOS technology and is composed of a 256 × 256 array of 7.05 × 7.10 μm pixels. The unit pixel has the configuration of a pseudo 3-transistor active pixel sensor (APS) with the PMOSFET-type photodetector with a transfer gate, which provides the function of a conventional 4-transistor APS. The generated photocurrent is controlled by the transfer gate of the PMOSFET-type photodetector. The maximum responsivity of the photodetector is larger than 1.0 × 10^3 A/W without any optical lens. The fabricated 256 × 256 CMOS image sensor exhibits a good response to low-level illumination as low as 5 lux.

  12. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface.

    PubMed

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-06-01

    Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long-sought goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging by using a spatio-temporal dispersive metasurface and an imaging reconstruction algorithm. The metasurface with spatio-temporal dispersive properties ensures the feasibility of the single-shot and single-sensor imager for super- and high-resolution imaging, since it can efficiently convert the detailed spatial information of the probed object into a one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the achievable imaging resolution is related to the distance between the object and the metasurface. When the object is placed in the vicinity of the metasurface, super-resolution imaging can be realized. The proposed imaging methodology provides a unique means to perform real-time data acquisition and obtain high-/super-resolution images without employing expensive hardware (e.g., mechanical scanners or antenna arrays). We expect that this methodology could lead to breakthroughs in the areas of microwave, terahertz, optical, and even ultrasound imaging.
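
    Conceptually, the single-sensor scheme reduces to a linear inverse problem: the frequency-dependent signal y recorded at the fixed sensor equals a measurement matrix H (fixed by the dispersive metasurface) times the object's spatial distribution x, and an image is recovered by regularized inversion. The sketch below uses a random matrix as a stand-in for the metasurface response and simple ridge regression rather than the paper's feature-enhanced algorithm.

```python
import numpy as np

rng = np.random.default_rng(6)

n_pixels, n_freqs = 64, 96                  # an 8x8 scene and 96 frequency samples (illustrative)
H = rng.normal(size=(n_freqs, n_pixels))    # stand-in for the metasurface transfer matrix

x_true = np.zeros(n_pixels)
x_true[[10, 27, 45]] = [1.0, 0.6, 0.8]      # a simple sparse object
y = H @ x_true + rng.normal(0.0, 0.01, n_freqs)   # single-sensor frequency-swept data

# Regularized (ridge) reconstruction: x = (H^T H + lam I)^-1 H^T y.
lam = 0.05
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_pixels), H.T @ y)
print("brightest recovered pixels:", sorted(np.argsort(x_hat)[-3:].tolist()))
```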

  13. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Astrophysics Data System (ADS)

    Watts, Louis A.

    1993-06-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the WW2 era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPA) are now available with sufficient sensitivity for both high spatial and narrow bandwidth spectral resolution imaging over large fields of view. Such devices combined with emerging opto-electronic developments in integrated FPA data processing techniques can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the Mid-wave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPA's will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  14. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Technical Reports Server (NTRS)

    Watts, Louis A.

    1993-01-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the WW2 era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPA) are now available with sufficient sensitivity for both high spatial and narrow bandwidth spectral resolution imaging over large fields of view. Such devices combined with emerging opto-electronic developments in integrated FPA data processing techniques can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the Mid-wave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPA's will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  15. New amorphous-silicon image sensor for x-ray diagnostic medical imaging applications

    NASA Astrophysics Data System (ADS)

    Weisfield, Richard L.; Hartney, Mark A.; Street, Robert A.; Apte, Raj B.

    1998-07-01

    This paper introduces new high-resolution amorphous silicon (a-Si) image sensors specifically configured for demonstrating film-quality medical x-ray imaging capabilities. The devices utilize an x-ray phosphor screen coupled to an array of a-Si photodiodes for detecting visible light, and a-Si thin-film transistors (TFTs) for connecting the photodiodes to external readout electronics. We have developed imagers based on a pixel size of 127 micrometers x 127 micrometers with an approximately page-size imaging area of 244 mm x 195 mm, and an array size of 1,536 data lines by 1,920 gate lines, for a total of 2.95 million pixels. More recently, we have developed a much larger imager based on the same pixel pattern, which covers an area of approximately 406 mm x 293 mm, with 2,304 data lines by 3,200 gate lines, for a total of nearly 7.4 million pixels. This is very likely the largest image sensor array and highest-pixel-count detector fabricated on a single substrate. Both imagers connect to a standard PC and are capable of taking an image in a few seconds. Through design rule optimization we have achieved a light-sensitive area of 57% and optimized quantum efficiency for x-ray phosphor output in the green part of the spectrum, yielding an average quantum efficiency between 500 and 600 nm of approximately 70%. At the same time, we have managed to reduce extraneous leakage currents on these devices to a few fA per pixel, which allows for very high dynamic range to be achieved. We have characterized leakage currents as a function of photodiode bias, time and temperature to demonstrate high stability over these large arrays. At the electronics level, we have adopted a new generation of low-noise, charge-sensitive amplifiers coupled to 12-bit A/D converters. Considerable attention was given to reducing electronic noise in order to demonstrate a large dynamic range (over 4,000:1) for medical imaging applications. Through a combination of low data-line capacitance, readout amplifier design, optimized timing, and noise cancellation techniques, we achieve 1,000 e- to 2,000 e- of noise for the page-size and large-size arrays, respectively. This allows for true 12-bit performance and quantum-limited images over a wide range of x-ray exposures. Various approaches to reducing line-correlated noise have been implemented and will be discussed. Images documenting the improved performance will be presented. Avenues for improvement are under development, including higher-resolution 97-micrometer-pixel imagers, further improvements in detective quantum efficiency, and characterization of dynamic behavior.

  16. Small Imaging Depth LIDAR and DCNN-Based Localization for Automated Guided Vehicle †

    PubMed Central

    Ito, Seigo; Hiratsuka, Shigeyoshi; Ohta, Mitsuhiko; Matsubara, Hiroyuki; Ogawa, Masaru

    2018-01-01

    We present our third prototype sensor and a localization method for Automated Guided Vehicles (AGVs), for which small imaging LIght Detection and Ranging (LIDAR) and fusion-based localization are fundamentally important. Our small imaging LIDAR, named the Single-Photon Avalanche Diode (SPAD) LIDAR, uses a time-of-flight method and SPAD arrays. A SPAD is a highly sensitive photodetector capable of detecting at the single-photon level, and the SPAD LIDAR has two SPAD arrays on the same chip for detection of laser light and environmental light. Therefore, the SPAD LIDAR simultaneously outputs range image data and monocular image data with the same coordinate system and does not require external calibration among outputs. As AGVs travel both indoors and outdoors with vibration, this calibration-less structure is particularly useful for AGV applications. We also introduce a fusion-based localization method, named SPAD DCNN, which uses the SPAD LIDAR and employs a Deep Convolutional Neural Network (DCNN). SPAD DCNN can fuse the outputs of the SPAD LIDAR: range image data, monocular image data and peak intensity image data. The SPAD DCNN has two outputs: the regression result of the position of the SPAD LIDAR and the classification result of the existence of a target to be approached. Our third prototype sensor and the localization method are evaluated in an indoor environment by assuming various AGV trajectories. The results show that the sensor and localization method improve the localization accuracy. PMID:29320434

  17. Small Imaging Depth LIDAR and DCNN-Based Localization for Automated Guided Vehicle.

    PubMed

    Ito, Seigo; Hiratsuka, Shigeyoshi; Ohta, Mitsuhiko; Matsubara, Hiroyuki; Ogawa, Masaru

    2018-01-10

    We present our third prototype sensor and a localization method for Automated Guided Vehicles (AGVs), for which small imaging LIght Detection and Ranging (LIDAR) and fusion-based localization are fundamentally important. Our small imaging LIDAR, named the Single-Photon Avalanche Diode (SPAD) LIDAR, uses a time-of-flight method and SPAD arrays. A SPAD is a highly sensitive photodetector capable of detecting at the single-photon level, and the SPAD LIDAR has two SPAD arrays on the same chip for detection of laser light and environmental light. Therefore, the SPAD LIDAR simultaneously outputs range image data and monocular image data with the same coordinate system and does not require external calibration among outputs. As AGVs travel both indoors and outdoors with vibration, this calibration-less structure is particularly useful for AGV applications. We also introduce a fusion-based localization method, named SPAD DCNN, which uses the SPAD LIDAR and employs a Deep Convolutional Neural Network (DCNN). SPAD DCNN can fuse the outputs of the SPAD LIDAR: range image data, monocular image data and peak intensity image data. The SPAD DCNN has two outputs: the regression result of the position of the SPAD LIDAR and the classification result of the existence of a target to be approached. Our third prototype sensor and the localization method are evaluated in an indoor environment by assuming various AGV trajectories. The results show that the sensor and localization method improve the localization accuracy.

  18. Synthesis of a fiber-optic magnetostrictive sensor (FOMS) pixel for RF magnetic field imaging

    NASA Astrophysics Data System (ADS)

    Rengarajan, Suraj

    The principal objective of this dissertation was to synthesize a sensor element with properties specifically optimized for integration into arrays capable of imaging RF magnetic fields. The dissertation problem was motivated by applications in nondestructive eddy current testing, smart skins, etc., requiring sensor elements that non-invasively detect millimeter-scale variations over several square meters, in low-level magnetic fields varying at frequencies in the 100 kHz-1 GHz range. The poor spatial and temporal resolution of FOMS elements available prior to this dissertation research precluded their use in non-invasive large-area mapping applications. Prior research had focused on large, discrete devices for detecting extremely low-level magnetic fields varying at a few kHz. These devices are incompatible with array integration and imaging applications. The dissertation research sought to overcome the limitations of current technology by utilizing three new approaches: synthesizing magnetostrictive thin films and optimizing their properties for sensor applications, integrating small sensor elements into an array-compatible fiber-optic interferometer, and devising an RF mixing approach to measure high-frequency magnetic fields using the integrated sensor element. Multilayer thin films were used to optimize the magnetic properties of the magnetostrictive elements. Alternating soft (Ni80Fe20) and hard (Co50Fe50) magnetic alloy layers were selected for the multilayer, and the layer thicknesses were varied to obtain films with a combination of large magnetization, high-frequency permeability, and large magnetostrictivity. X-ray data and measurement of the variations in the magnetization, resistivity, and magnetostriction with layer thicknesses indicated that an interfacial layer was responsible for enhancing the sensing performance of the multilayers. A FOMS pixel was patterned directly onto the sensing arm of a fiber-optic interferometer by sputtering a multilayer film with favorable sensor properties. After calibrating the interferometer response with a piezo, the mechanical and magnetic responses of the FOMS element were evaluated for various test fields. High-frequency magnetic fields were detected using a local oscillator field to downconvert the RF signal fields to the lower mechanical resonant frequency of the element. A field sensitivity of 0.3 Oe per cm of sensor element length was demonstrated at 1 MHz. A coherent magnetization rotation model was developed to predict the magnetostrictive response of the element and identify approaches for optimizing its performance. This model predicts that an optimized element could resolve ~1 mm variations in fields varying at frequencies >10 MHz with a sensitivity of ~10^-3 Oe/mm. The results demonstrate the potential utility of integrating this device as a FOMS pixel in RF magnetic field imaging arrays.
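
    The heterodyne detection scheme described above (mixing the RF signal field with a local-oscillator field so that the difference frequency falls near the element's lower mechanical resonance) can be illustrated with a minimal numerical sketch. The sample rate, local-oscillator frequency, and the 10 kHz stand-in for the mechanical resonance are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

fs = 50e6                               # sample rate (Hz), illustrative
t = np.arange(0, 2e-3, 1 / fs)          # 2 ms record

f_rf = 1.00e6                           # RF signal field (the 1 MHz demonstration case)
f_lo = 0.99e6                           # local-oscillator field frequency (assumed)

h_total = np.cos(2 * np.pi * f_rf * t) + np.cos(2 * np.pi * f_lo * t)

# The magnetostrictive strain responds (to lowest order) to the square of the
# total field, so the product term contains the difference frequency f_rf - f_lo,
# which is chosen to land near the element's mechanical resonance.
drive = h_total ** 2

spectrum = np.abs(np.fft.rfft(drive - drive.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
low_band = freqs < 50e3                              # look only below 50 kHz
peak = freqs[low_band][np.argmax(spectrum[low_band])]
print(f"Low-frequency drive component at {peak/1e3:.1f} kHz "
      f"(expected {(f_rf - f_lo)/1e3:.1f} kHz)")
```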

  19. Ferroelectric thin-film active sensors for structural health monitoring

    NASA Astrophysics Data System (ADS)

    Lin, Bin; Giurgiutiu, Victor; Yuan, Zheng; Liu, Jian; Chen, Chonglin; Jiang, Jiechao; Bhalla, Amar S.; Guo, Ruyan

    2007-04-01

    Piezoelectric wafer active sensors (PWAS) have proven to be a valuable tool in structural health monitoring. Piezoelectric wafer active sensors are able to send and receive guided Lamb/Rayleigh waves that scan the structure and detect the presence of incipient cracks and structural damage. In-situ thin-film active sensor deposition can eliminate the bonding layer, improving durability and reducing the acoustic impedance mismatch. Ferroelectric thin films have been shown to have piezoelectric properties that are close to those of single-crystal ferroelectrics, but the fabrication of ferroelectric thin films on structural materials (steel, aluminum, titanium, etc.) has not yet been attempted. In this work, an in-situ fabrication method for piezoelectric thin-film active sensor arrays was developed using a nanotechnology approach. The specification for the piezoelectric thin-film active sensor arrays was based on an electro-mechanical-acoustical model. Ferroelectric BaTiO3 (BTO) thin films were successfully deposited on Ni tapes by pulsed laser deposition under the optimal synthesis conditions. Microstructural studies by X-ray diffractometry and transmission electron microscopy reveal that the as-grown BTO thin films have nanopillar structures with an average size of approximately 80 nm in diameter and good interface structures with no inter-diffusion or reaction. The dielectric and ferroelectric property measurements show that the BTO films have a relatively large dielectric constant, a small dielectric loss, and an extremely large piezoelectric response with a symmetric hysteresis loop. The research objective is to develop the fabrication and optimum design of thin-film active sensor arrays for structural health monitoring applications. The short wavelengths of the micro phased arrays will permit phased-array imaging of smaller parts and smaller damage than is currently possible with existing technology.

  20. Evaluation of an innovative color sensor for space application

    NASA Astrophysics Data System (ADS)

    Cessa, Virginie; Beauvivre, Stéphane; Pittet, Jacques; Dougnac, Virgile; Fasano, M.

    2017-11-01

    We present in this paper an evaluation of an innovative image sensor that provides color information without the need for organic filters. The sensor is a CMOS array with more than 4 million pixels which filters the incident photons into R, G, and B channels, delivering the full resolution in color. Such a sensor, combining high performance with low power consumption, is of high interest for future space missions. The paper presents the characteristics of the detector as well as the first results of environmental testing.

  1. High-speed sorting of grains by color and surface texture

    USDA-ARS's Scientific Manuscript database

    A high-speed, low-cost, image-based sorting device was developed to detect and separate grains with different colors/textures. The device directly combines a complementary metal–oxide–semiconductor (CMOS) color image sensor with a field-programmable gate array (FPGA) that was programmed to execute ...

  2. Detection of electromagnetic radiation using micromechanical multiple quantum wells structures

    DOEpatents

    Datskos, Panagiotis G [Knoxville, TN]; Rajic, Slobodan [Knoxville, TN]; Datskou, Irene [Knoxville, TN]

    2007-07-17

    An apparatus and method for detecting electromagnetic radiation employs a deflectable micromechanical apparatus incorporating multiple-quantum-well structures. When photons strike the quantum-well structure, physical stresses are created within the sensor, similar to a "bimetallic effect." The stresses cause the sensor to bend. The extent of deflection of the sensor can be measured through any of a variety of conventional means to provide a measurement of the photons striking the sensor. A large number of such sensors can be arranged in a two-dimensional array to provide imaging capability.

  3. Progress of the Swedish-Australian research collaboration on uncooled smart IR sensors

    NASA Astrophysics Data System (ADS)

    Liddiard, Kevin C.; Ringh, Ulf; Jansson, Christer; Reinhold, Olaf

    1998-10-01

    Progress is reported on the development of uncooled microbolometer IR focal plane detector arrays (IRFPDA) under a research collaboration between the Swedish Defence Research Establishment (FOA), and the Defence Science and Technology Organization (DSTO), Australia. The paper describes current focal plane detector arrays designed by Electro-optic Sensor Design (EOSD) for readout circuits developed by FOA. The readouts are fabricated in 0.8 micrometer CMOS, and have a novel signal conditioning and 16 bit parallel ADC design. The arrays are post-processed at DSTO on wafers supplied by FOA. During the past year array processing has been carried out at a new microengineering facility at DSTO, Salisbury, South Australia. A number of small format 16 X 16 arrays have been delivered to FOA for evaluation, and imaging has been demonstrated with these arrays. A 320 X 240 readout with 320 parallel 16 bit ADCs has been developed and IRFPDAs for this readout have been fabricated and are currently being evaluated.

  4. Correlation plenoptic imaging

    NASA Astrophysics Data System (ADS)

    Pepe, Francesco V.; Di Lena, Francesco; Garuccio, Augusto; D'Angelo, Milena

    2017-06-01

    Plenoptic Imaging (PI) is a novel optical technique for achieving three-dimensional imaging in a single shot. In conventional PI, a microlens array is inserted in the native image plane and the sensor array is moved behind the microlenses. On the one hand, the microlenses act as imaging pixels to reproduce the image of the scene; on the other hand, each microlens reproduces on the sensor array an image of the camera lens, thus providing the angular information associated with each imaging pixel. The recorded propagation direction is exploited, in post-processing, to computationally retrace the geometrical light path, thus enabling the refocusing of different planes within the scene, the extension of the depth of field of the acquired image, as well as the 3D reconstruction of the scene. However, a trade-off between spatial and angular resolution is built in the standard plenoptic imaging process. We demonstrate that the second-order spatio-temporal correlation properties of light can be exploited to overcome this fundamental limitation. Using two correlated beams, from either a chaotic or an entangled photon source, we can perform imaging in one arm and simultaneously obtain the angular information in the other arm. In fact, we show that the second-order correlation function possesses plenoptic imaging properties (i.e., it encodes both spatial and angular information), and is thus characterized by a key re-focusing and 3D imaging capability. From a fundamental standpoint, the plenoptic application is the first situation where the counterintuitive properties of correlated systems are effectively used to beat intrinsic limits of standard imaging systems. From a practical standpoint, our protocol can dramatically enhance the potentials of PI, paving the way towards its promising applications.
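
    A toy numerical sketch of second-order correlation imaging in the spirit described above: one arm records a spatially resolved chaotic speckle pattern, the other arm records only an integrated ('bucket') signal after the object, and the object appears in the correlation of the two. The speckle statistics, frame count and binary test object are illustrative assumptions and do not model the plenoptic geometry or the entangled-photon case.

```python
import numpy as np

rng = np.random.default_rng(0)

n, frames = 48, 8000                    # detector resolution, number of frame pairs

# Binary transmission object placed in the "bucket" arm: a small rectangle.
obj = np.zeros((n, n))
obj[14:34, 20:28] = 1.0

sum_is = np.zeros((n, n))               # accumulates I(x) * S
sum_i = np.zeros((n, n))                # accumulates I(x)
sum_s = 0.0                             # accumulates S

for _ in range(frames):
    speckle = rng.exponential(1.0, size=(n, n))   # chaotic (thermal) intensity pattern
    bucket = float(np.sum(speckle * obj))         # spatially integrated signal after object
    sum_is += speckle * bucket
    sum_i += speckle
    sum_s += bucket

# Second-order correlation of fluctuations: <I(x) S> - <I(x)> <S>.
image = sum_is / frames - (sum_i / frames) * (sum_s / frames)
print(f"mean correlation inside object:  {image[obj > 0].mean():.2f}   (about 1 for this thermal model)")
print(f"mean correlation outside object: {image[obj == 0].mean():.2f}   (about 0)")
```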

  5. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array.

    PubMed

    Yan, Gang; Zhou, Li

    2018-02-21

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from an image-processing perspective. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to consider the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimal criterion using minimum Shannon entropy is used to find the image with the identified AE source locations and occurrence time that mostly approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method.
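
    The minimum-Shannon-entropy criterion used to pick the best-focused migrated image can be sketched as follows. The candidate images here are synthetic stand-ins for f-k-migrated frames at two trial occurrence times; the histogram binning is an assumption for illustration, not the authors' exact formulation.

```python
import numpy as np

def shannon_entropy(img, bins=64):
    """Shannon entropy of an image's normalized intensity histogram."""
    hist, _ = np.histogram(np.abs(img), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)

# Synthetic stand-ins for migrated images at two trial occurrence times: a
# "focused" image with energy concentrated at the source location, and a
# "defocused" one with the same total added energy smeared over the plate.
focused = rng.normal(0, 0.05, (128, 128))
focused[60:68, 60:68] += 5.0

defocused = rng.normal(0, 0.05, (128, 128))
defocused += 5.0 * 64 / (128 * 128)        # same added energy, spread uniformly

for name, img in [("focused", focused), ("defocused", defocused)]:
    print(f"{name:10s} entropy = {shannon_entropy(img):.2f} bits")
# The minimum-entropy candidate corresponds to the best estimate of the AE
# source location and occurrence time.
```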

  6. An Optimal Image-Based Method for Identification of Acoustic Emission (AE) Sources in Plate-Like Structures Using a Lead Zirconium Titanate (PZT) Sensor Array

    PubMed Central

    Zhou, Li

    2018-01-01

    This paper proposes an innovative method for identifying the locations of multiple simultaneous acoustic emission (AE) events in plate-like structures from an image-processing perspective. By using a linear lead zirconium titanate (PZT) sensor array to record the AE wave signals, a reverse-time frequency-wavenumber (f-k) migration is employed to produce images displaying the locations of AE sources by back-propagating the AE waves. Lamb wave theory is included in the f-k migration to consider the dispersive property of the AE waves. Since the exact occurrence time of the AE events is usually unknown when recording the AE wave signals, a heuristic artificial bee colony (ABC) algorithm combined with an optimal criterion using minimum Shannon entropy is used to find the image with the identified AE source locations and occurrence time that mostly approximate the actual ones. Experimental studies on an aluminum plate with AE events simulated by PZT actuators are performed to validate the applicability and effectiveness of the proposed optimal image-based AE source identification method. PMID:29466310

  7. Single-snapshot 2D color measurement by plenoptic imaging system

    NASA Astrophysics Data System (ADS)

    Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana

    2014-03-01

    Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high color fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision of this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of a display and show that it achieves color accuracy of ΔE<0.01.
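
    The ΔE figure quoted above is a CIE color-difference metric. A minimal sketch of the classic CIE76 form (Euclidean distance in CIELAB) is given below with made-up Lab values; the paper does not state which ΔE formula it uses, so the choice of CIE76 and the numbers are assumptions.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical reference vs. measured colors of one display patch (L*, a*, b*).
reference = (52.100, 10.300, -4.200)
measured  = (52.104, 10.303, -4.198)

print(f"dE76 = {delta_e_cie76(reference, measured):.4f}")
# ~0.005 here, i.e. on the order of the reported color accuracy.
```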

  8. Room temperature infrared imaging sensors based on highly purified semiconducting carbon nanotubes.

    PubMed

    Liu, Yang; Wei, Nan; Zhao, Qingliang; Zhang, Dehui; Wang, Sheng; Peng, Lian-Mao

    2015-04-21

    High performance infrared (IR) imaging systems usually require expensive cooling systems, which are highly undesirable. Here we report the fabrication and performance characteristics of room temperature carbon nanotube (CNT) IR imaging sensors. The CNT IR imaging sensor is based on aligned semiconducting CNT films with 99% purity, and each pixel or device of the imaging sensor consists of aligned strips of CNT asymmetrically contacted by Sc and Pd. We found that the performance of the device is dependent on the CNT channel length. While short channel devices provide a large photocurrent and a rapid response of about 110 μs, long channel length devices exhibit a low dark current and a high signal-to-noise ratio which are critical for obtaining high detectivity. In total, 36 CNT IR imagers are constructed on a single chip, each consisting of a 3 × 3 pixel array. The demonstrated advantages of constructing a high performance IR system using purified semiconducting CNT aligned films include, among other things, fast response, excellent stability and uniformity, ideal linear photocurrent response, high imaging polarization sensitivity and low power consumption.

  9. Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1992-01-01

    Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensor calibration. (No individual items are abstracted in this volume)

  10. Graphical user interface for a dual-module EMCCD x-ray detector array

    NASA Astrophysics Data System (ADS)

    Wang, Weiyuan; Ionita, Ciprian; Kuhls-Gilcrist, Andrew; Huang, Ying; Qu, Bin; Gupta, Sandesh K.; Bednarek, Daniel R.; Rudin, Stephen

    2011-03-01

    A new Graphical User Interface (GUI) was developed using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) for a high-resolution, high-sensitivity Solid State X-ray Image Intensifier (SSXII), which is a new x-ray detector for radiographic and fluoroscopic imaging, consisting of an array of Electron-Multiplying CCDs (EMCCDs) each having a variable on-chip electron-multiplication gain of up to 2000x to reduce the effect of readout noise. To enlarge the field-of-view (FOV), each EMCCD sensor is coupled to an x-ray phosphor through a fiberoptic taper. Two EMCCD camera modules are used in our prototype to form a computer-controlled array; however, larger arrays are under development. The new GUI provides patient registration, EMCCD module control, image acquisition, and patient image review. Images from the array are stitched into a 2kx1k pixel image that can be acquired and saved at a rate of 17 Hz (faster with pixel binning). When reviewing the patient's data, the operator can select images from the patient's directory tree listed by the GUI and cycle through the images using a slider bar. Commonly used camera parameters including exposure time, trigger mode, and individual EMCCD gain can be easily adjusted using the GUI. The GUI is designed to accommodate expansion of the EMCCD array to even larger FOVs with more modules. The high-resolution, high-sensitivity EMCCD modular-array SSXII imager with the new user-friendly GUI should enable angiographers and interventionalists to visualize smaller vessels and endovascular devices, helping them to make more accurate diagnoses and to perform more precise image-guided interventions.

  11. Albion: the UK 3rd generation high-performance thermal imaging programme

    NASA Astrophysics Data System (ADS)

    McEwen, R. K.; Lupton, M.; Lawrence, M.; Knowles, P.; Wilson, M.; Dennis, P. N. J.; Gordon, N. T.; Lees, D. J.; Parsons, J. F.

    2007-04-01

    The first generation of high performance thermal imaging sensors in the UK was based on two axis opto-mechanical scanning systems and small (4-16 element) arrays of the SPRITE detector, developed during the 1970s. Almost two decades later, a 2nd Generation system, STAIRS C, was introduced, based on single axis scanning and a long linear array of approximately 3000 elements. The UK has now begun the industrialisation of 3rd Generation High Performance Thermal Imaging under a programme known as "Albion". Three new high performance cadmium mercury telluride arrays are being manufactured. The CMT material is grown by MOVPE on low cost substrates and bump bonded to the silicon read out circuit (ROIC). To maintain low production costs, all three detectors are designed to fit with existing standard Integrated Detector Cooling Assemblies (IDCAs). The two largest focal planes are conventional devices operating in the MWIR and LWIR spectral bands. A smaller format LWIR device is also described which has a smart ROIC, enabling much longer stare times than are feasible with conventional pixel circuits, thus achieving very high sensitivity. A new reference surface technology for thermal imaging sensors is described, based on Negative Luminescence (NL), which offers several advantages over conventional Peltier references, improving the quality of the Non-Uniformity Correction (NUC) algorithms.

  12. Tactile surface classification for limbed robots using a pressure sensitive robot skin.

    PubMed

    Shill, Jacob J; Collins, Emmanuel G; Coyle, Eric; Clark, Jonathan

    2015-02-02

    This paper describes an approach to terrain identification based on pressure images generated through direct surface contact using a robot skin constructed around a high-resolution pressure sensing array. Terrain signatures for classification are formulated from the magnitude frequency responses of the pressure images. The initial experimental results for statically obtained images show that the approach yields classification accuracies [Formula: see text]. The methodology is extended to accommodate the dynamic pressure images anticipated when a robot is walking or running. Experiments with a one-legged hopping robot yield similar identification accuracies [Formula: see text]. In addition, the accuracies are independent of changing robot dynamics (i.e., when using different leg gaits). The paper further shows that the high-resolution capabilities of the sensor enable similarly textured surfaces to be distinguished. A correcting filter is developed to accommodate failures or faults that inevitably occur within the sensing array with continued use. Experimental results show that using the correcting filter can extend the effective operational lifespan of a high-resolution sensing array by over 6x in the presence of sensor damage. The results presented suggest this methodology can be extended to autonomous field robots, providing a robot with crucial information about the environment that can be used to aid stable and efficient mobility over rough and varying terrains.
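
    A minimal sketch of the signature idea described above: form a feature vector from the magnitude frequency response of a pressure image and classify it against stored class templates. The radial binning, nearest-centroid classifier and synthetic 'coarse'/'fine' textures are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def terrain_signature(pressure_img, n_bins=16):
    """Radially binned magnitude spectrum of a pressure image."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(pressure_img)))
    h, w = spec.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    sig = np.array([spec[(r >= bins[i]) & (r < bins[i + 1])].mean()
                    for i in range(n_bins)])
    return sig / np.linalg.norm(sig)

def classify(signature, centroids):
    """Nearest-centroid classification of a terrain signature."""
    return min(centroids, key=lambda k: np.linalg.norm(signature - centroids[k]))

# Synthetic example: 'coarse' terrain has lower spatial frequencies than 'fine'.
rng = np.random.default_rng(2)
coarse = rng.normal(size=(8, 8)).repeat(4, axis=0).repeat(4, axis=1)   # blocky 32x32
fine = rng.normal(size=(32, 32))                                       # fine-grained 32x32

centroids = {"coarse": terrain_signature(coarse), "fine": terrain_signature(fine)}
probe = rng.normal(size=(8, 8)).repeat(4, axis=0).repeat(4, axis=1)    # another coarse patch
print("Probe classified as:", classify(terrain_signature(probe), centroids))
```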

  13. Passive, Highly-Sensitive, Room-Temperature Magnetic Field Sensors and Arrays for Detection and Imaging of Hidden Threats in Urban Environments

    DTIC Science & Technology

    2012-07-01

    Only fragments of the report text were captured: they indicate that sensor units made from different types of magnetoelectric (ME) laminates have different electrical properties, including the DC resistance of a sensor (Rdc).

  14. Robust snow avalanche detection using machine learning on infrasonic array data

    NASA Astrophysics Data System (ADS)

    Thüring, Thomas; Schoch, Marcel; van Herwijnen, Alec; Schweizer, Jürg

    2014-05-01

    Snow avalanches may threaten people and infrastructure in mountain areas. Automated detection of avalanche activity would be highly desirable, in particular during times of poor visibility, to improve hazard assessment, but also to monitor the effectiveness of avalanche control by explosives. In the past, a variety of remote sensing techniques and instruments for the automated detection of avalanche activity have been reported, which are based on radio waves (radar), seismic signals (geophone), optical signals (imaging sensor) or infrasonic signals (microphone). Optical imagery makes it possible to assess avalanche activity with very high spatial resolution, but it is strongly weather dependent. Radar and geophone-based detection typically provide robust avalanche detection for all weather conditions, but are very limited in the size of the monitoring area. On the other hand, due to the long propagation distance of infrasound through air, the monitoring area of infrasonic sensors can cover a large territory using a single sensor (or an array). In addition, they are far more cost-effective than radars or optical imaging systems. Unfortunately, the reliability of infrasonic sensor systems has so far been rather low due to the strong variation of ambient noise (e.g. wind) causing a high false alarm rate. We analyzed the data collected by a low-cost infrasonic array system consisting of four sensors for the automated detection of avalanche activity at Lavin in the eastern Swiss Alps. A comparably large array aperture (~350 m) allows highly accurate time-delay estimation for signals that arrive at the sensors at different times, enabling precise source localization. An array of four sensors is sufficient for the time-resolved source localization of signals in full 3D space, which is an excellent means of identifying true avalanche activity. Robust avalanche detection is then achieved by using machine learning methods such as support vector machines. The system is initially trained by using characteristic data features from known avalanche and non-avalanche events. Data features are obtained from output signals of the source localization algorithm or from Fourier or time domain processing and support the learning phase of the system. A significantly improved detection rate as well as a reduction of the false alarm rate was achieved compared to previous approaches.
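
    A minimal sketch of the time-delay estimation step that the large-aperture argument relies on: cross-correlating the signals recorded at two infrasound sensors to estimate the inter-sensor delay. The sampling rate, synthetic waveform and noise level are illustrative assumptions, not parameters of the Lavin installation.

```python
import numpy as np

fs = 100.0                            # sample rate (Hz), illustrative
t = np.arange(0, 60, 1 / fs)          # one minute of data

rng = np.random.default_rng(3)
# Synthetic avalanche-like infrasound transient (a few Hz) plus wind-like noise.
transient = np.exp(-((t - 30) / 2.0) ** 2) * np.sin(2 * np.pi * 3.0 * t)
true_delay_s = 0.85                   # propagation delay between the two sensors

sensor_a = transient + 0.3 * rng.normal(size=t.size)
delayed = np.interp(t - true_delay_s, t, transient, left=0.0, right=0.0)
sensor_b = delayed + 0.3 * rng.normal(size=t.size)

# Full cross-correlation; the lag of its maximum estimates the delay.
xcorr = np.correlate(sensor_b, sensor_a, mode="full")
lags = np.arange(-t.size + 1, t.size) / fs
print(f"Estimated delay: {lags[np.argmax(xcorr)]:.2f} s (true {true_delay_s} s)")
```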

  15. Wavelength- or Polarization-Selective Thermal Infrared Detectors for Multi-Color or Polarimetric Imaging Using Plasmonics and Metamaterials

    PubMed Central

    Ogawa, Shinpei; Kimata, Masafumi

    2017-01-01

    Wavelength- or polarization-selective thermal infrared (IR) detectors are promising for various novel applications such as fire detection, gas analysis, multi-color imaging, multi-channel detectors, recognition of artificial objects in a natural environment, and facial recognition. However, these functions require additional filters or polarizers, which leads to high cost and technical difficulties related to integration of many different pixels in an array format. Plasmonic metamaterial absorbers (PMAs) can impart wavelength or polarization selectivity to conventional thermal IR detectors simply by controlling the surface geometry of the absorbers to produce surface plasmon resonances at designed wavelengths or polarizations. This enables integration of many different pixels in an array format without any filters or polarizers. We review our recent advances in wavelength- and polarization-selective thermal IR sensors using PMAs for multi-color or polarimetric imaging. The absorption mechanism defined by the surface structures is discussed for three types of PMAs—periodic crystals, metal-insulator-metal and mushroom-type PMAs—to demonstrate appropriate applications. Our wavelength- or polarization-selective uncooled IR sensors using various PMAs and multi-color image sensors are then described. Finally, high-performance mushroom-type PMAs are investigated. These advanced functional thermal IR detectors with wavelength or polarization selectivity will provide great benefits for a wide range of applications. PMID:28772855

  16. Wavelength- or Polarization-Selective Thermal Infrared Detectors for Multi-Color or Polarimetric Imaging Using Plasmonics and Metamaterials.

    PubMed

    Ogawa, Shinpei; Kimata, Masafumi

    2017-05-04

    Wavelength- or polarization-selective thermal infrared (IR) detectors are promising for various novel applications such as fire detection, gas analysis, multi-color imaging, multi-channel detectors, recognition of artificial objects in a natural environment, and facial recognition. However, these functions require additional filters or polarizers, which leads to high cost and technical difficulties related to integration of many different pixels in an array format. Plasmonic metamaterial absorbers (PMAs) can impart wavelength or polarization selectivity to conventional thermal IR detectors simply by controlling the surface geometry of the absorbers to produce surface plasmon resonances at designed wavelengths or polarizations. This enables integration of many different pixels in an array format without any filters or polarizers. We review our recent advances in wavelength- and polarization-selective thermal IR sensors using PMAs for multi-color or polarimetric imaging. The absorption mechanism defined by the surface structures is discussed for three types of PMAs-periodic crystals, metal-insulator-metal and mushroom-type PMAs-to demonstrate appropriate applications. Our wavelength- or polarization-selective uncooled IR sensors using various PMAs and multi-color image sensors are then described. Finally, high-performance mushroom-type PMAs are investigated. These advanced functional thermal IR detectors with wavelength or polarization selectivity will provide great benefits for a wide range of applications.

  17. Intelligent imaging systems for automotive applications

    NASA Astrophysics Data System (ADS)

    Thompson, Chris; Huang, Yingping; Fu, Shan

    2004-03-01

    In common with many other application areas, visual signals are becoming an increasingly important information source for many automotive applications. For several years CCD cameras have been used as research tools for a range of automotive applications. Infrared cameras, RADAR and LIDAR are other types of imaging sensors that have also been widely investigated for use in cars. This paper will describe work in this field performed in C2VIP over the last decade - starting with Night Vision Systems and looking at various other Advanced Driver Assistance Systems. Emerging from this experience, we make the following observations which are crucial for "intelligent" imaging systems: 1. Careful arrangement of sensor array. 2. Dynamic-Self-Calibration. 3. Networking and processing. 4. Fusion with other imaging sensors, both at the image level and the feature level, provides much more flexibility and reliability in complex situations. We will discuss how these problems can be addressed and what are the outstanding issues.

  18. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    PubMed

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. Obtaining a space-efficient design over 64 × 64 pixels means that the standard processing electronics used off-chip cannot be implemented at each pixel. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom-made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.
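
    For context, the conventional laser Doppler flux estimate that such a sensor ultimately feeds is the first moment of the AC photocurrent power spectrum, normalized by the squared DC level; the sketch below computes it on synthetic photocurrents. The bandwidth, waveforms and sampling rate are illustrative assumptions, and the on-chip analog processing is not modeled.

```python
import numpy as np

def ldbf_flux(photocurrent, fs, f_lo=20.0, f_hi=20e3):
    """Conventional laser Doppler flux: first moment of the AC power
    spectrum over [f_lo, f_hi], normalized by the squared DC level."""
    dc = photocurrent.mean()
    ac = photocurrent - dc
    spectrum = np.abs(np.fft.rfft(ac)) ** 2
    freqs = np.fft.rfftfreq(photocurrent.size, 1 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sum(freqs[band] * spectrum[band]) / dc ** 2

fs = 50e3                                  # sample rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(4)

# Synthetic photocurrents: higher flow -> stronger, higher-frequency Doppler content.
low_flow  = 1.0 + 0.01 * np.sin(2 * np.pi * 200 * t) + 0.002 * rng.normal(size=t.size)
high_flow = 1.0 + 0.03 * np.sin(2 * np.pi * 2000 * t) + 0.002 * rng.normal(size=t.size)

print(f"flux (low flow):  {ldbf_flux(low_flow, fs):.3e}")
print(f"flux (high flow): {ldbf_flux(high_flow, fs):.3e}")
```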

  19. Self-adaptive calibration for staring infrared sensors

    NASA Astrophysics Data System (ADS)

    Kendall, William B.; Stocker, Alan D.

    1993-10-01

    This paper presents a new, self-adaptive technique for the correction of non-uniformities (fixed-pattern noise) in high-density infrared focal-plane detector arrays. We have developed a new approach to non-uniformity correction in which we use multiple image frames of the scene itself, and take advantage of the aim-point wander caused by jitter, residual tracking errors, or deliberately induced motion. Such wander causes each detector in the array to view multiple scene elements, and each scene element to be viewed by multiple detectors. It is therefore possible to formulate (and solve) a set of simultaneous equations from which correction parameters can be computed for the detectors. We have tested our approach with actual images collected by the ARPA-sponsored MUSIC infrared sensor. For these tests we employed a 60-frame (0.75-second) sequence of terrain images for which an out-of-date calibration was deliberately used. The sensor was aimed at a point on the ground via an operator-assisted tracking system having a maximum aim-point wander on the order of ten pixels. With these data, we were able to improve the calibration accuracy by a factor of approximately 100.
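
    The idea of solving a set of simultaneous equations from multiple jittered views can be sketched in one dimension: each detector adds a fixed offset, the scene wanders by a known shift each frame, and alternating re-estimation of the scene and the offsets recovers the fixed pattern up to a global constant. The pure-offset model, known integer shifts and alternating solver are simplifying assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

n_det, n_frames, max_shift = 200, 60, 20
scene = np.cumsum(rng.normal(size=n_det + max_shift))   # smooth 1-D "terrain"
true_offset = rng.normal(0.0, 2.0, n_det)                # fixed-pattern offsets
shifts = rng.integers(0, max_shift, n_frames)            # aim-point wander (assumed known)

frames = np.stack([scene[s:s + n_det] + true_offset for s in shifts])

offset_est = np.zeros(n_det)
for _ in range(50):
    # Scene estimate: average the offset-corrected observations in scene coordinates.
    acc = np.zeros(n_det + max_shift)
    cnt = np.zeros(n_det + max_shift)
    for k, s in enumerate(shifts):
        acc[s:s + n_det] += frames[k] - offset_est
        cnt[s:s + n_det] += 1
    scene_est = acc / np.maximum(cnt, 1)
    # Offset estimate: average residual between each observation and the scene estimate.
    resid = frames - np.stack([scene_est[s:s + n_det] for s in shifts])
    offset_est = resid.mean(axis=0)
    offset_est -= offset_est.mean()          # remove the unobservable global constant

err = np.std(offset_est - (true_offset - true_offset.mean()))
print(f"Residual fixed-pattern error: {err:.3f} (uncorrected spread {true_offset.std():.3f})")
```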

  20. Mutual capacitance of liquid conductors in deformable tactile sensing arrays

    NASA Astrophysics Data System (ADS)

    Li, Bin; Fontecchio, Adam K.; Visell, Yon

    2016-01-01

    Advances in highly deformable electronics are needed in order to enable emerging categories of soft computing devices ranging from wearable electronics, to medical devices, and soft robotic components. The combination of highly elastic substrates with intrinsically stretchable conductors holds the promise of enabling electronic sensors that can conform to curved objects, reconfigurable displays, or soft biological tissues, including the skin. Here, we contribute sensing principles for tactile (mechanical image) sensors based on very low modulus polymer substrates with embedded liquid metal microfluidic arrays. The sensors are fabricated using a single-step casting method that utilizes fine nylon filaments to produce arrays of cylindrical channels on two layers. The liquid metal (gallium indium alloy) conductors that fill these channels readily adopt the shape of the embedding membrane, yielding levels of deformability greater than 400%, due to the use of soft polymer substrates. We modeled the sensor performance using electrostatic theory and continuum mechanics, yielding excellent agreement with experiments. Using a matrix-addressed capacitance measurement technique, we are able to resolve strain distributions with millimeter resolution over areas of several square centimeters.

  1. Imaging system design and image interpolation based on CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Li, Yu-feng; Liang, Fei; Guo, Rui

    2009-11-01

    An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), CPLD (EPM7128AE) and DSP (TMS320VC5509A). The CPLD implements the logic and timing control for the system. The SRAM stores the image data, and the DSP controls the image acquisition system through the SCCB (OmniVision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed. The imaging part and the high-speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use an edge-oriented adaptive interpolation algorithm for the edge pixels and a bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method achieves high processing speed, decreases the computational complexity, and effectively preserves the image edges.
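
    A minimal sketch of the interpolation policy described above, restricted to filling in the green channel of an RGGB Bayer mosaic: a simple gradient test flags edge pixels, which are interpolated along the edge direction, while non-edge pixels use a plain bilinear average. The gradient threshold, the edge test and the toy input (the full scene values are used as the raw mosaic) are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def interpolate_green(bayer, threshold=20.0):
    """Fill in green at the red/blue sites of an RGGB Bayer mosaic."""
    h, w = bayer.shape
    green = bayer.astype(float).copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if (y % 2) != (x % 2):            # RGGB: green sites have differing parities
                continue                       # already green, nothing to interpolate
            up, down = float(bayer[y - 1, x]), float(bayer[y + 1, x])
            left, right = float(bayer[y, x - 1]), float(bayer[y, x + 1])
            grad_h, grad_v = abs(left - right), abs(up - down)
            if max(grad_h, grad_v) > threshold:      # edge pixel: interpolate along the edge
                green[y, x] = (up + down) / 2 if grad_h > grad_v else (left + right) / 2
            else:                                    # smooth pixel: bilinear average
                green[y, x] = (up + down + left + right) / 4
    return green

# Toy scene with a sharp vertical edge (dark left half, bright right half).
scene = np.tile(np.r_[np.full(4, 30.0), np.full(4, 200.0)], (8, 1))
g = interpolate_green(scene)
print(g[4, 4])   # 200.0: edge-directed result; plain bilinear would give 157.5
```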

  2. Backside illuminated CMOS-TDI line scan sensor for space applications

    NASA Astrophysics Data System (ADS)

    Cohen, Omer; Ofer, Oren; Abramovich, Gil; Ben-Ari, Nimrod; Gershon, Gal; Brumer, Maya; Shay, Adi; Shamay, Yaron

    2018-05-01

    A multi-spectral backside illuminated Time Delayed Integration Radiation Hardened line scan sensor utilizing CMOS technology was designed for continuous scanning Low Earth Orbit small satellite applications. The sensor comprises a single silicon chip with 4 independent arrays of pixels where each array is arranged in 2600 columns with 64 TDI levels. A multispectral optical filter whose spectral responses per array are adjustable per system requirement is assembled at the package level. A custom 4T pixel design provides the required readout speed, low noise, very low dark current, and high conversion gains. A 2-phase internally controlled exposure mechanism improves the sensor's dynamic MTF. The sensor's high level of integration includes on-chip 12 bit per pixel analog to digital converters, an on-chip controller, and CMOS compatible voltage levels. Thus, the power consumption and the weight of the supporting electronics are reduced, and a simple electrical interface is provided. An adjustable gain provides a Full Well Capacity ranging from 150,000 electrons up to 500,000 electrons per column and an overall readout noise per column of less than 120 electrons. The imager supports line rates ranging from 50 to 10,000 lines/sec, with power consumption of less than 0.5 W per array. Thus, the sensor is characterized by a high pixel rate, a high dynamic range and very low power. To meet the latch-up-free requirement, RadHard architecture and design rules were utilized. In this paper recent electrical and electro-optical measurements of the sensor's Flight Models will be presented for the first time.

  3. Maui Aeromagnetic Survey

    DOE Data Explorer

    Akerley, John

    2010-04-17

    Map, image, and data files, and a summary report of a high-resolution aeromagnetic survey of southern Maui, Hawai'i completed by EDCON-PRJ, Inc. for Ormat Nevada Inc. using a helicopter and a towed sensor array.

  4. Spatio-spectral color filter array design for optimal image recovery.

    PubMed

    Hirakawa, Keigo; Wolfe, Patrick J

    2008-10-01

    In digital imaging applications, data are typically obtained via a spatial subsampling procedure implemented as a color filter array-a physical construction whereby only a single color value is measured at each pixel location. Owing to the growing ubiquity of color imaging and display devices, much recent work has focused on the implications of such arrays for subsequent digital processing, including in particular the canonical demosaicking task of reconstructing a full color image from spatially subsampled and incomplete color data acquired under a particular choice of array pattern. In contrast to the majority of the demosaicking literature, we consider here the problem of color filter array design and its implications for spatial reconstruction quality. We pose this problem formally as one of simultaneously maximizing the spectral radii of luminance and chrominance channels subject to perfect reconstruction, and-after proving sub-optimality of a wide class of existing array patterns-provide a constructive method for its solution that yields robust, new panchromatic designs implementable as subtractive colors. Empirical evaluations on multiple color image test sets support our theoretical results, and indicate the potential of these patterns to increase spatial resolution for fixed sensor size, and to contribute to improved reconstruction fidelity as well as significantly reduced hardware complexity.

  5. Phonon-mediated superconducting transition-edge sensor X-ray detectors for use in astronomy

    NASA Astrophysics Data System (ADS)

    Leman, Steven W.; Martinez-Galarce, Dennis S.; Brink, Paul L.; Cabrera, Blas; Castle, Joseph P.; Morse, Kathleen; Stern, Robert A.; Tomada, Astrid

    2004-09-01

    Superconducting Transition-Edge Sensors (TESs) are generating a great deal of interest in the areas of x-ray astrophysics and space science, particularly to develop them as large-array, imaging x-ray spectrometers. We are developing a novel concept based on position-sensitive macro-pixels that place TESs on the backside of a silicon or germanium absorber. Each absorbed x-ray will be resolved in position (X/δX and Y/δY ~ 100) and energy (E/δE ~ 1000) via four distributed TES readouts. In the future, combining such macro-pixels with advances in multiplexing could lead to 30 by 30 arrays of close-packed macro-pixels, equivalent to imaging instruments of 10 megapixels or more. We report on our progress to date and discuss its application to a plausible solar satellite mission and plans for future development.

  6. High-speed line-scan camera with digital time delay integration

    NASA Astrophysics Data System (ADS)

    Bodenstorfer, Ernst; Fürtler, Johannes; Brodersen, Jörg; Mayer, Konrad J.; Eckel, Christian; Gravogl, Klaus; Nachtnebel, Herbert

    2007-02-01

    In high-speed image acquisition and processing systems, the speed of operation is often limited by the amount of available light due to the short exposure times. Therefore, high-speed applications often use line-scan cameras based on charge-coupled device (CCD) sensors with time delayed integration (TDI). Synchronous shift and accumulation of photoelectric charges on the CCD chip - according to the objects' movement - result in a longer effective exposure time without introducing additional motion blur. This paper presents a high-speed color line-scan camera based on a commercial complementary metal oxide semiconductor (CMOS) area image sensor with a Bayer filter matrix and a field programmable gate array (FPGA). The camera implements a digital equivalent to the TDI effect exploited with CCD cameras. The proposed design benefits from the high frame rates of CMOS sensors and from the possibility of arbitrarily addressing the rows of the sensor's pixel array. For digital TDI, just a small number of rows are read out from the area sensor; these are then shifted and accumulated according to the movement of the inspected objects. This paper gives a detailed description of the digital TDI algorithm implemented on the FPGA. Relevant aspects for the practical application are discussed and key features of the camera are listed.
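
    A minimal sketch of the shift-and-accumulate idea behind digital TDI: each object line is re-imaged onto successive sensor rows on successive frames, and its partial sum is shifted along with it so that every output line is the sum of n_tdi exposures. The stage count, the motion of exactly one row per frame and the noise model are illustrative assumptions, not the camera's FPGA implementation.

```python
import numpy as np

rng = np.random.default_rng(6)

n_tdi, width, n_frames = 16, 256, 200        # TDI stages, pixels/line, frames (assumed)
read_noise = 10.0
object_lines = rng.uniform(20.0, 80.0, (n_frames, width))   # scene moving one row per frame

accumulators = np.zeros((n_tdi, width))      # one partial sum per TDI stage
output = []

for frame in range(n_frames):
    # Sensor row r sees object line (frame - r) during this frame; add its
    # noisy readout to the partial sum travelling with that object line.
    for r in range(n_tdi):
        j = frame - r
        if 0 <= j < n_frames:
            accumulators[r] += object_lines[j] + rng.normal(0.0, read_noise, width)
    # The last stage now holds n_tdi accumulated exposures of one object line.
    output.append(accumulators[-1].copy())
    # Shift the partial sums by one stage to follow the object's motion.
    accumulators = np.roll(accumulators, 1, axis=0)
    accumulators[0] = 0.0                    # a new object line starts accumulating

tdi_image = np.array(output[n_tdi - 1:])     # drop the fill-up transient
print(f"Each output line sums {n_tdi} exposures -> ~{np.sqrt(n_tdi):.0f}x SNR gain over a single read")
```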

  7. I-ImaS: intelligent imaging sensors

    NASA Astrophysics Data System (ADS)

    Griffiths, J.; Royle, G.; Esbrand, C.; Hall, G.; Turchetta, R.; Speller, R.

    2010-08-01

    Conventional x-radiography uniformly irradiates the relevant region of the patient. Across that region, however, there is likely to be significant variation in both the thickness and pathological composition of the tissues present, which means that the x-ray exposure conditions selected, and consequently the image quality achieved, are a compromise. The I-ImaS concept eliminates this compromise by intelligently scanning the patient to identify the important diagnostic features, which are then used to adaptively control the x-ray exposure conditions at each point in the patient. In this way optimal image quality is achieved throughout the region of interest whilst maintaining or reducing the dose. An I-ImaS system has been built under an EU Framework 6 project and has undergone pre-clinical testing. The system is based upon two rows of sensors controlled via an FPGA-based DAQ board. Each row consists of a 160 mm × 1 mm linear array of ten scintillator-coated 3T CMOS APS devices with 32 μm pixels and a readable array of 520 × 40 pixels. The first sensor row scans the patient using a fraction of the total radiation dose to produce a preview image, which is then interrogated to identify the optimal exposure conditions at each point in the image. A signal is then sent to control a beam filter mechanism to appropriately moderate x-ray beam intensity at the patient as the second row of sensors follows behind. Tests performed on breast tissue sections found that the contrast-to-noise ratio in over 70% of the images was increased by an average of 15% at an average dose reduction of 9%. The same technology is currently also being applied to baggage scanning for airport security.

  8. Single-shot and single-sensor high/super-resolution microwave imaging based on metasurface

    PubMed Central

    Wang, Libo; Li, Lianlin; Li, Yunbo; Zhang, Hao Chi; Cui, Tie Jun

    2016-01-01

    Real-time high-resolution (including super-resolution) imaging with low-cost hardware is a long sought-after goal in various imaging applications. Here, we propose broadband single-shot and single-sensor high-/super-resolution imaging by using a spatio-temporal dispersive metasurface and an imaging reconstruction algorithm. The metasurface with spatio-temporal dispersive property ensures the feasibility of the single-shot and single-sensor imager for super- and high-resolution imaging, since it can efficiently convert the detailed spatial information of the probed object into a one-dimensional time- or frequency-dependent signal acquired by a single sensor fixed in the far-field region. The imaging quality can be improved by applying a feature-enhanced reconstruction algorithm in post-processing, and the desired imaging resolution is related to the distance between the object and metasurface. When the object is placed in the vicinity of the metasurface, super-resolution imaging can be realized. The proposed imaging methodology provides a unique means of performing real-time data acquisition and high-/super-resolution imaging without employing expensive hardware (e.g., a mechanical scanner, antenna array, etc.). We expect that this methodology could make potential breakthroughs in the areas of microwave, terahertz, optical, and even ultrasound imaging. PMID:27246668
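
    A minimal sketch of the measurement-and-reconstruction idea: the frequency-dependent response of the metasurface maps the scene onto a one-dimensional signal recorded by a single fixed sensor, modeled here as y = H x, and the image is recovered by regularized least squares. The random measurement matrix, noise level and Tikhonov regularizer are stand-ins, not the authors' metasurface model or their feature-enhanced reconstruction algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)

side = 16
n = side * side            # number of scene pixels
m = 300                    # number of frequency samples recorded by the single sensor

# Stand-in for the metasurface response: each frequency sample weights the
# scene pixels with a different (here random) pattern.
H = rng.normal(size=(m, n))

# Test object: a bright "L" shape on a dark background.
x_true = np.zeros((side, side))
x_true[3:12, 4] = 1.0
x_true[11, 4:10] = 1.0
x_true = x_true.ravel()

y = H @ x_true + 0.01 * rng.normal(size=m)       # single-shot, single-sensor data

# Tikhonov-regularized least-squares reconstruction of the scene.
lam = 0.1
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"Relative reconstruction error: {err:.1%}")
```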

  9. Binary CMOS image sensor with a gate/body-tied MOSFET-type photodetector for high-speed operation

    NASA Astrophysics Data System (ADS)

    Choi, Byoung-Soo; Jo, Sung-Hyun; Bae, Myunghan; Kim, Sang-Hwan; Shin, Jang-Kyoo

    2016-05-01

    In this paper, a binary complementary metal oxide semiconductor (CMOS) image sensor with a gate/body-tied (GBT) metal oxide semiconductor field effect transistor (MOSFET)-type photodetector is presented. The sensitivity of the GBT MOSFET-type photodetector, which was fabricated using the standard CMOS 0.35-μm process, is higher than the sensitivity of the p-n junction photodiode, because the output signal of the photodetector is amplified by the MOSFET. A binary image sensor becomes more efficient when using this photodetector. Lower power consumption and higher operating speeds are possible compared to conventional image sensors using multi-bit analog-to-digital converters (ADCs). The frame rate of the proposed image sensor is over 2000 frames per second, which is higher than that of conventional CMOS image sensors. The output signal of an active pixel sensor is applied to a comparator and compared with a reference level. The 1-bit output data of the binary process is determined by this level. To obtain a video signal, the 1-bit output data is stored in the memory and is read out by horizontal scanning. The proposed chip is composed of a GBT pixel array (144 × 100), a binary-process circuit, a vertical scanner, a horizontal scanner, and a readout circuit. The operation mode can be selected between binary mode and multi-bit mode.

  10. Measurement of charge transfer potential barrier in pinned photodiode CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Chen, Cao; Bing, Zhang; Junfeng, Wang; Longsheng, Wu

    2016-05-01

    The charge transfer potential barrier (CTPB) formed beneath the transfer gate causes a noticeable image lag issue in pinned photodiode (PPD) CMOS image sensors (CIS), and is difficult to measure straightforwardly since it is embedded inside the device. From an understanding of the CTPB formation mechanism, we report an alternative method to feasibly measure the CTPB height by performing a linear extrapolation coupled with a horizontal left-shift on the sensor photoresponse curve under steady-state illumination. The theoretical basis of the proposed method is studied in detail. Application of the measurements on a prototype PPD-CIS chip with an array of 160 × 160 pixels is demonstrated. The method is intended to provide guidance for the optimization of lag-free, high-speed sensors based on PPD devices. Project supported by the National Defense Pre-Research Foundation of China (No. 51311050301095).

  11. Architecture and applications of a high resolution gated SPAD image sensor

    PubMed Central

    Burri, Samuel; Maruyama, Yuki; Michalet, Xavier; Regazzoni, Francesco; Bruschini, Claudio; Charbon, Edoardo

    2014-01-01

    We present the architecture and three applications of the highest-resolution image sensor based on single-photon avalanche diodes (SPADs) published to date. The sensor, fabricated in a high-voltage CMOS process, has a resolution of 512 × 128 pixels and a pitch of 24 μm. The fill-factor of 5% can be increased to 30% with the use of microlenses. For precise control of the exposure and for time-resolved imaging, we use fast global gating signals to define exposure windows as small as 4 ns. The uniformity of the gate edge location is ∼140 ps (FWHM) over the whole array, while in-pixel digital counting enables frame rates as high as 156 kfps. Currently, our camera is used as a highly sensitive sensor with high temporal resolution, for applications ranging from fluorescence lifetime measurements to fluorescence correlation spectroscopy and generation of true random numbers. PMID:25090572

  12. High-density fiber-optic DNA random microsphere array.

    PubMed

    Ferguson, J A; Steemers, F J; Walt, D R

    2000-11-15

    A high-density fiber-optic DNA microarray sensor was developed to monitor multiple DNA sequences in parallel. Microarrays were prepared by randomly distributing DNA probe-functionalized 3.1-microm-diameter microspheres in an array of wells etched in a 500-microm-diameter optical imaging fiber. Registration of the microspheres was performed using an optical encoding scheme and a custom-built imaging system. Hybridization was visualized using fluorescent-labeled DNA targets with a detection limit of 10 fM. Hybridization times of seconds are required for nanomolar target concentrations, and analysis is performed in minutes.

  13. Diffraction mode terahertz tomography

    DOEpatents

    Ferguson, Bradley; Wang, Shaohong; Zhang, Xi-Cheng

    2006-10-31

    A method of obtaining a series of images of a three-dimensional object. The method includes the steps of transmitting pulsed terahertz (THz) radiation through the entire object from a plurality of angles, optically detecting changes in the transmitted THz radiation using pulsed laser radiation, and constructing a plurality of imaged slices of the three-dimensional object using the detected changes in the transmitted THz radiation. The THz radiation is transmitted through the object as a two-dimensional array of parallel rays. The optical detection is an array of detectors such as a CCD sensor.

  14. Multidirectional seismo-acoustic wavefield of strombolian explosions at Yasur, Vanuatu using a broadband seismo-acoustic network, infrasound arrays, and infrasonic sensors on tethered balloons

    NASA Astrophysics Data System (ADS)

    Matoza, R. S.; Jolly, A. D.; Fee, D.; Johnson, R.; Kilgour, G.; Christenson, B. W.; Garaebiti, E.; Iezzi, A. M.; Austin, A.; Kennedy, B.; Fitzgerald, R.; Key, N.

    2016-12-01

    Seismo-acoustic wavefields at volcanoes contain rich information on shallow magma transport and subaerial eruption processes. Acoustic wavefields from eruptions are predicted to be directional, but sampling this wavefield directivity is challenging because infrasound sensors are usually deployed on the ground surface. We attempt to overcome this observational limitation using a novel deployment of infrasound sensors on tethered balloons in tandem with a suite of dense ground-based seismo-acoustic, geochemical, and eruption imaging instrumentation. We present preliminary results from a field experiment at Yasur Volcano, Vanuatu from July 26th to August 4th 2016. Our observations include data from a temporary network of 11 broadband seismometers, 6 single infrasonic microphones, 7 small-aperture 3-element infrasound arrays, 2 infrasound sensor packages on tethered balloons, an FTIR, a FLIR, 2 scanning Flyspecs, and various visual imaging data. An introduction to the dataset and preliminary analysis of the 3D seismo-acoustic wavefield and source process will be presented. This unprecedented dataset should provide a unique window into processes operating in the shallow magma plumbing system and their relation to subaerial eruption dynamics.

  15. Pyroelectric IR sensor arrays for fall detection in the older population

    NASA Astrophysics Data System (ADS)

    Sixsmith, A.; Johnson, N.; Whatmore, R.

    2005-09-01

    Uncooled pyroelectric sensor arrays have been studied over many years for their uses in thermal imaging applications. These arrays only detect changes in IR flux, so systems based upon them are very good at detecting movements of people in a scene without sensing the background, if they are used in staring mode. Relatively low element-count arrays (16 x 16) can be used for a variety of people-sensing applications, including people counting (for safety applications), queue monitoring, etc. With appropriate signal processing such systems can also be used for the detection of particular events such as a person falling over. There is a considerable need for automatic fall detection amongst older people, but there are important limitations to some of the current and emerging technologies available for this. Simple sensors, such as 1- or 2-element pyroelectric infra-red sensors, provide crude data that is difficult to interpret; devices worn on the person, such as wrist communicators and motion detectors, have potential but are reliant on the person being able and willing to wear the device; video cameras may be seen as intrusive and require considerable human resources to monitor activity, while machine interpretation of camera images is complex and may be difficult in this application area. The use of a pyroelectric thermal array sensor was seen to have a number of potential benefits. The sensor is wall-mounted and does not require the user to wear a device. It enables detailed analysis of a subject's motion to be achieved locally, within the detector, using only a modest processor. This is possible due to the relative ease with which data from the sensor can be interpreted compared with the data generated by alternative sensors such as video devices. In addition to the cost-effectiveness of this solution, it was felt that the lack of detail in the low-level data, together with the elimination of the need to transmit data outside the detector, would help to avert feelings of intrusiveness on the part of the end-user. The main benefits of this type of technology would be for older people who spend time alone in unsupervised environments. This would include people living alone in ordinary housing or in sheltered accommodation (apartment complexes for older people with a local warden) and non-communal areas in residential/nursing home environments (e.g. bedrooms and ensuite bathrooms and toilets). This paper will review the development of the array, the pyroelectric ceramic material upon which it is based and the system capabilities. It will present results from the Framework 5 SIMBAD project, which used the system to monitor the movements of elderly people over a considerable period of time.

  16. Analysis of Camera Arrays Applicable to the Internet of Things.

    PubMed

    Yang, Jiachen; Xu, Ru; Lv, Zhihan; Song, Houbing

    2016-03-22

    The Internet of Things is built on various sensors and networks. Sensors for stereo capture are essential for acquiring information and have been applied in different fields. In this paper, we focus on camera modeling and analysis, which is very important for stereo display and helps with viewing. We model two kinds of cameras, a parallel and a converged one, and analyze the difference between them in vertical and horizontal parallax. Even though different kinds of camera arrays are used in various applications and analyzed in research work, there has been little discussion comparing them. Therefore, we make a detailed analysis of their performance over different shooting distances. From our analysis, we find that the threshold of shooting distance for converged cameras is 7 m. In addition, we design a camera array that can be used both as a parallel camera array and as a converged camera array, and take images and videos with it to verify the threshold.
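
    As a minimal illustration of the parallel-camera case analyzed above (an assumed textbook relation, not the authors' model), the sketch below computes the horizontal disparity of a point at a given shooting distance; the baseline, focal length and pixel pitch are made-up example values.

        # Horizontal disparity (in pixels) of a point at depth Z for a parallel
        # stereo pair; baseline, focal length and pixel pitch are illustrative
        # assumptions, not values from the paper.
        def disparity_px(depth_m, baseline_m=0.065, focal_mm=12.0, pixel_um=4.5):
            focal_px = (focal_mm * 1e-3) / (pixel_um * 1e-6)   # focal length in pixels
            return focal_px * baseline_m / depth_m

        for z in (2.0, 7.0, 20.0):                             # shooting distances in metres
            print(f"Z = {z:5.1f} m -> disparity = {disparity_px(z):6.1f} px")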

  17. Spatial optical crosstalk in CMOS image sensors integrated with plasmonic color filters.

    PubMed

    Yu, Yan; Chen, Qin; Wen, Long; Hu, Xin; Zhang, Hui-Fang

    2015-08-24

    The imaging resolution of complementary metal oxide semiconductor (CMOS) image sensors (CIS) keeps increasing, approaching 7k × 4k. As a result, the pixel size shrinks down to sub-2 μm, which greatly increases the spatial optical crosstalk. Recently, plasmonic color filters were proposed as an alternative to conventional colorant-pigmented ones. However, there is little work on their size effect and the spatial optical crosstalk in a model of a CIS. By numerical simulation, we investigate the size effect of nanocross-array plasmonic color filters and analyze the spatial optical crosstalk of each pixel in a Bayer array of a CIS with a pixel size of 1 μm. It is found that the small pixel size deteriorates the filtering performance of nanocross color filters and induces substantial spatial color crosstalk. By integrating the plasmonic filters in a low metal layer of a standard CMOS process, the crosstalk is reduced significantly, becoming comparable to that of pigmented filters in a state-of-the-art backside-illumination CIS.

  18. The silicon vidicon: Integration, storage and slow scan capability - Experimental observation of a secondary mode of operation.

    NASA Technical Reports Server (NTRS)

    Ando, K. J.

    1971-01-01

    Description of the performance of the silicon diode array vidicon - an imaging sensor which possesses wide spectral response, high quantum efficiency, and linear response. These characteristics, in addition to its inherent ruggedness, simplicity, long-term stability and operating life, make this device potentially of great usefulness for ground-based and spaceborne planetary and stellar imaging applications. However, integration and charge storage for periods greater than approximately five seconds are not possible at room temperature because of diode saturation from dark current buildup. Since dark current can be reduced by cooling, measurements were made in the range from -65 to 25 C. Results are presented on the extension of integration, storage, and slow-scan capabilities achievable by cooling. Integration times in excess of 20 minutes were achieved at the lowest temperatures. The measured results are compared with results obtained with other types of sensors, and the advantages of the silicon diode array vidicon for imaging applications are discussed.

  19. Large-area, flexible imaging arrays constructed by light-charge organic memories

    PubMed Central

    Zhang, Lei; Wu, Ti; Guo, Yunlong; Zhao, Yan; Sun, Xiangnan; Wen, Yugeng; Yu, Gui; Liu, Yunqi

    2013-01-01

    Existing organic imaging circuits, which offer the attractive benefits of light weight, low cost and flexibility, are exclusively based on phototransistor or photodiode arrays. One shortcoming of these photo-sensors is that the light signal must remain invariant throughout the whole pixel-addressing and reading process. As a feasible solution, we synthesized a new charge-storage molecule and embedded it into a device, which we call a light-charge organic memory (LCOM). In an LCOM, the functionalities of a photo-sensor and a non-volatile memory are integrated. Thanks to deliberate engineering of the electronic structure and the self-organization process at the interface, 92% of the stored charges, which are linearly controlled by the quantity of light, are retained after 20,000 s. The stored charges can also be non-destructively read and erased by a simple voltage program. These results pave the way to large-area, flexible imaging circuits and demonstrate a bright future for small molecular materials in non-volatile memory. PMID: 23326636

  20. Synchromodal optical in vivo imaging employing microlens array optics: a complete framework

    NASA Astrophysics Data System (ADS)

    Peter, Joerg

    2013-03-01

    A complete mathematical framework for preclinical optical imaging (OI) support comprising bioluminescence imaging (BLI), fluorescence surface imaging (FSI) and fluorescence optical tomography (FOT) is presented in which optical data is acquired by means of a microlens array (MLA) based light detector (MLA-D). The MLA-D has been developed to enable unique OI, especially in synchromodal operation with secondary imaging modalities (SIM) such as positron emission tomography (PET) or magnetic resonance imaging (MRI). An MLA-D consists of a (large-area) photon sensor array, a matched MLA for field-of-view definition, and a septum mask of specific geometry made of anodized aluminum that is positioned between the sensor and the MLA to suppress light cross-talk and to shield the sensor's radiofrequency interference signal (essential when used inside an MRI system). The software framework, while freely parameterizable for any MLA-D, is tailored towards an OI prototype system for preclinical SIM application comprising a multitude of cylindrically assembled, gantry-mounted, simultaneously operating MLA-Ds. Besides the MLA-D specificity, the framework incorporates excitation and illumination light-source declarations of large-field and point geometry to facilitate multispectral FSI and FOT as well as three-dimensional object recognition. When used in synchromodal operation, reconstructed tomographic SIM volume data can be used for co-modal image fusion and also as a prior for estimating the imaged object's 3D surface by means of gradient vector flow. Superimposed planar (without object prior) or surface-aligned inverse mapping can be performed to estimate and to fuse the emission light map with the boundary of the imaged object. Triangulation and subsequent optical reconstruction (FOT) or constrained flow estimation (BLI), both including the possibility of SIM priors, can be performed to estimate the internal three-dimensional emission light distribution. The framework is governed by a number of variables controlling convergence and computational speed. Utilization and performance are illustrated on experimentally acquired data employing the OI prototype system in stand-alone operation, and when integrated into an unmodified preclinical PET system performing synchromodal BLI-PET in vivo imaging.

  1. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 colinear tunable acousto-optic filter, a 61 inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near-IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 inch spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time, and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data and final equipment performance limits are given.

  2. High-resolution panoramic images with megapixel MWIR FPA

    NASA Astrophysics Data System (ADS)

    Leboucher, Vincent; Aubry, Gilles

    2014-06-01

    In the continuity of its current strategy, HGH has maintained a sustained effort in developing its most recent product family: the infrared (IR) panoramic 360-degree surveillance sensors. During the last two years, HGH optimized its prototype Middle Wave IR (MWIR) panoramic sensor IR Revolution 360 HD, which gave birth to the Spynel-S product. Various test campaigns proved its excellent image quality. Cyclope, the software associated with Spynel, benefitted from recent image processing improvements and new functionalities such as target geolocalization, long-range sensor slew-to-cue and facilitated forensics analysis. Within the framework of the PANORAMIR project supported by the DGA (Délégation Générale de l'Armement), HGH designed a new extra-large-resolution sensor including an MWIR megapixel Focal Plane Array (FPA) detector (1280×1024 pixels). This new sensor is called Spynel-X. It provides outstanding-resolution 360-degree images (with more than 100 Mpixels). The mechanical frame of Spynel (-S and -X) was designed with the collaboration of an industrial design agency. Spynel received the "Observeur du Design 2013" label.

  3. Multi-Band Miniaturized Patch Antennas for a Compact, Shielded Microwave Breast Imaging Array.

    PubMed

    Aguilar, Suzette M; Al-Joumayly, Mudar A; Burfeindt, Matthew J; Behdad, Nader; Hagness, Susan C

    2013-12-18

    We present a comprehensive study of a class of multi-band miniaturized patch antennas designed for use in a 3D enclosed sensor array for microwave breast imaging. Miniaturization and multi-band operation are achieved by loading the antenna with non-radiating slots at strategic locations along the patch. This results in symmetric radiation patterns and similar radiation characteristics at all frequencies of operation. Prototypes were fabricated and tested in a biocompatible immersion medium. Excellent agreement was obtained between simulations and measurements. The trade-off between miniaturization and radiation efficiency within this class of patch antennas is explored via a numerical analysis of the effects of the location and number of slots, as well as the thickness and permittivity of the dielectric substrate, on the resonant frequencies and gain. Additionally, we compare 3D quantitative microwave breast imaging performance achieved with two different enclosed arrays of slot-loaded miniaturized patch antennas. Simulated array measurements were obtained for a 3D anatomically realistic numerical breast phantom. The reconstructed breast images generated from miniaturized patch array data suggest that, for the realistic noise power levels assumed in this study, the variations in gain observed across this class of multi-band patch antennas do not significantly impact the overall image quality. We conclude that these miniaturized antennas are promising candidates as compact array elements for shielded, multi-frequency microwave breast imaging systems.

  4. Kilopixel Pop-Up Bolometer Arrays for the Atacama Cosmology Telescope

    NASA Technical Reports Server (NTRS)

    Chervenak, J. A.; Wollack, E.; Henry, R.; Moseley, S. H.; Niemack, M.; Staggs, S.; Page, L.; Doriese, R.; Hilton, G. c.; Irwin, K. D.

    2007-01-01

    The recently deployed Atacama Cosmology Telescope (ACT) anticipates first light on its kilopixel array of close-packed transition-edge-sensor bolometers in November of 2007. The instrument will represent a full implementation of the next-generation, large format arrays for millimeter wave astronomy that use superconducting electronics and detectors. Achieving the practical construction of such an array is a significant step toward producing advanced detector arrays for future SOFIA instruments. We review the design considerations for the detector array produced for the ACT instrument. The first light imager consists of 32 separately instrumented 32-channel pop-up bolometer arrays (to create a 32x32 filled array of mm-wave sensors). Each array is instrumented with a 32-channel bias resistor array, Nyquist filter array, and time-division SQUID multiplexer. Each component needed to be produced in relatively large quantities with suitable uniformity to meet tolerances for array operation. An optical design was chosen to maximize absorption at the focal plane while mitigating reflections and stray light. The pop-up geometry (previously implemented with semiconducting detectors and readout on the SHARC II and HAWC instruments) enabled straightforward interface of the superconducting bias and readout circuit with the 2D array of superconducting bolometers. The array construction program balanced fabrication challenges with assembly challenges to deliver the instrument in a timely fashion. We present some of the results of the array build and characterization of its performance.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, Bradley G; Suszcynsky, David M; Hamlin, Timothy E

    Los Alamos National Laboratory (LANL) owns and operates an array of Very-Low Frequency (VLF) sensors that measure the Radio-Frequency (RF) waveforms emitted by Cloud-to-Ground (CG) and InCloud (IC) lightning. This array, the Los Alamos Sferic Array (LASA), has approximately 15 sensors concentrated in the Great Plains and Florida, which detect electric field changes in a bandwidth from 200 Hz to 500 kHz (Smith et al., 2002). Recently, LANL has begun development of a new dual-band RF sensor array that includes the Very-High Frequency (VHF) band as well as the VLF. Whereas VLF lightning emissions can be used to deduce physical parameters such as lightning type and peak current, VHF emissions can be used to perform precise 3D mapping of individual radiation sources, which can number in the thousands for a typical CG flash. These new dual-band sensors will be used to monitor lightning activity in hurricanes in an effort to better predict intensification cycles. Although the new LANL dual-band array is not yet operational, we have begun initial work utilizing both VLF and VHF lightning data to monitor hurricane evolution. In this paper, we present the temporal evolution of Rita's landfall using VLF and VHF lightning data, and also WSR-88D radar. At landfall, Rita's northern eyewall experienced strong updrafts and significant lightning activity that appear to mark a transition between oceanic hurricane dynamics and continental thunderstorm dynamics. In section 2, we give a brief overview of Hurricane Rita, including its development as a hurricane and its lightning history. In the following section, we present WSR-88D data of Rita's landfall, including reflectivity images and temporal variation. In section 4, we present both VHF and VLF lightning data, overplotted on radar reflectivity images. Finally, we discuss our observations, including a comparison to previous studies and a brief conclusion.

  6. Graphical User Interface for a Dual-Module EMCCD X-ray Detector Array.

    PubMed

    Wang, Weiyuan; Ionita, Ciprian; Kuhls-Gilcrist, Andrew; Huang, Ying; Qu, Bin; Gupta, Sandesh K; Bednarek, Daniel R; Rudin, Stephen

    2011-03-16

    A new Graphical User Interface (GUI) was developed using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) for a high-resolution, high-sensitivity Solid State X-ray Image Intensifier (SSXII), which is a new x-ray detector for radiographic and fluoroscopic imaging, consisting of an array of Electron-Multiplying CCDs (EMCCDs) each having a variable on-chip electron-multiplication gain of up to 2000× to reduce the effect of readout noise. To enlarge the field-of-view (FOV), each EMCCD sensor is coupled to an x-ray phosphor through a fiberoptic taper. Two EMCCD camera modules are used in our prototype to form a computer-controlled array; however, larger arrays are under development. The new GUI provides patient registration, EMCCD module control, image acquisition, and patient image review. Images from the array are stitched into a 2k×1k pixel image that can be acquired and saved at a rate of 17 Hz (faster with pixel binning). When reviewing the patient's data, the operator can select images from the patient's directory tree listed by the GUI and cycle through the images using a slider bar. Commonly used camera parameters including exposure time, trigger mode, and individual EMCCD gain can be easily adjusted using the GUI. The GUI is designed to accommodate expansion of the EMCCD array to even larger FOVs with more modules. The high-resolution, high-sensitivity EMCCD modular-array SSXII imager with the new user-friendly GUI should enable angiographers and interventionalists to visualize smaller vessels and endovascular devices, helping them to make more accurate diagnoses and to perform more precise image-guided interventions.

  7. Stereo Cloud Height and Wind Determination Using Measurements from a Single Focal Plane

    NASA Astrophysics Data System (ADS)

    Demajistre, R.; Kelly, M. A.

    2014-12-01

    We present here a method for extracting cloud heights and winds from an aircraft or orbital platform using measurements from a single focal plane, exploiting the motion of the platform to provide multiple views of the cloud tops. To illustrate this method we use data acquired during aircraft flight tests of a set of simple stereo imagers that are well suited to this purpose. Each of these imagers has three linear arrays on the focal plane, one looking forward, one looking aft, and one looking down. Push-broom images from each of these arrays are constructed, and then a spatial correlation analysis is used to deduce the delays and displacements required for wind and cloud height determination. We will present the algorithms necessary for the retrievals, as well as the methods used to determine the uncertainties of the derived cloud heights and winds. We will apply the retrievals and uncertainty determination to a number of image sets acquired by the airborne sensors. We then generalize these results to potential space based observations made by similar types of sensors.
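
    A minimal sketch of the underlying fore/nadir parallax geometry (an assumed simplification, not the authors' retrieval): the time delay between a cloud feature appearing in the forward-looking and nadir-looking push-broom images fixes its height below the platform, if the cloud is assumed stationary (the full retrieval also solves for wind). All numbers are illustrative.

        import math

        # Cloud-top height from the fore-to-nadir time delay of a feature seen by
        # two linear arrays on a moving platform.  Stationary-cloud assumption;
        # all parameter values are made-up examples.
        def cloud_top_height(delay_s, platform_alt_m, speed_mps, fore_angle_deg):
            # Along-track distance travelled during the delay equals
            # (H - h) * tan(theta), so h = H - v * dt / tan(theta).
            return platform_alt_m - speed_mps * delay_s / math.tan(math.radians(fore_angle_deg))

        print(cloud_top_height(delay_s=30.0, platform_alt_m=20000.0,
                               speed_mps=200.0, fore_angle_deg=26.0))   # ~7.7 km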

  8. Co-Prime Frequency and Aperture Design for HF Surveillance, Wideband Radar Imaging, and Nonstationary Array Processing

    DTIC Science & Technology

    2018-03-10

    can be generated using only two sensors in the physical array. In case of redundancy in the difference coarray, there is more than one antenna pair that...estimation results based on the MUSIC algorithm using multi-frequency co-prime arrays. Both proportional and nonproportional source spectra cases are...be made in this case as well. However, two differences can be noticed by comparing the RMSE plots in Figs. 11 and 13. First, the RMSE takes on lower

  9. All-optical endoscopic probe for high resolution 3D photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Ansari, R.; Zhang, E.; Desjardins, A. E.; Beard, P. C.

    2017-03-01

    A novel all-optical forward-viewing photoacoustic probe using a flexible coherent fibre-optic bundle and a Fabry-Perot (FP) ultrasound sensor has been developed. The fibre bundle, along with the FP sensor at its distal end, synthesizes a high-density 2D array of wideband ultrasound detectors. Photoacoustic waves arriving at the sensor are spatially mapped by optically scanning the proximal end face of the bundle in 2D with a CW wavelength-tunable interrogation laser. 3D images are formed from the detected signals using a time-reversal image reconstruction algorithm. The system has been characterized in terms of its PSF, noise-equivalent pressure and field of view. Finally, the high-resolution 3D imaging capability has been demonstrated using arbitrarily shaped phantoms and a duck embryo.

  10. LC-lens array with light field algorithm for 3D biomedical applications

    NASA Astrophysics Data System (ADS)

    Huang, Yi-Pai; Hsieh, Po-Yuan; Hassanfiroozi, Amir; Martinez, Manuel; Javidi, Bahram; Chu, Chao-Yu; Hsuan, Yun; Chu, Wen-Chun

    2016-03-01

    In this paper, a liquid crystal lens (LC-lens) array was utilized in 3D bio-medical applications including a 3D endoscope and a light-field microscope. Compared with the conventional plastic lens arrays usually placed in 3D endoscope or light-field microscope systems to record image disparity, our LC-lens array offers the flexibility of electrically changing its focal length. By using the LC-lens array, the working distance and image quality of the 3D endoscope and microscope can be enhanced. Furthermore, 2D/3D switching can be achieved by turning the electrical power on the LC-lens array off or on. In the 3D endoscope case, a hexagonal micro LC-lens array with 350 μm diameter was placed at the front end of a 1 mm diameter endoscope. With an electric field applied to the LC-lens array, the 3D specimen is recorded as if from seven micro-cameras with different disparities. We can calculate the 3D reconstruction of the specimen from those micro images. On the other hand, if we turn off the electric field on the LC-lens array, a conventional high-resolution 2D endoscope image is recorded. In the light-field microscope case, the LC-lens array was placed in front of the CMOS sensor. The main purpose of the LC-lens array is to extend the refocusing distance of the light-field microscope, which is usually very narrow in a focused light-field microscope system, by montaging many light-field images sequentially focused at different depths. By adjusting the focal length of the LC-lens array from 2.4 mm to 2.9 mm, the refocusing distance was extended from 1 mm to 11.3 mm. Moreover, we can use an LC wedge to electrically shift the optical axis and increase the resolution of the light field.

  11. Smartphone-Based VOC Sensor Using Colorimetric Polydiacetylenes.

    PubMed

    Park, Dong-Hoon; Heo, Jung-Moo; Jeong, Woomin; Yoo, Young Hyuk; Park, Bum Jun; Kim, Jong-Man

    2018-02-07

    Owing to a unique colorimetric (typically blue-to-red) feature upon environmental stimulation, polydiacetylenes (PDAs) have been actively employed in chemosensor systems. We developed a highly accurate and simple volatile organic compound (VOC) sensor system that can be operated using a conventional smartphone. The procedure begins with forming an array of four different PDAs on conventional paper using inkjet printing of four corresponding diacetylenes followed by photopolymerization. A database of color changes (i.e., red and hue values) is then constructed on the basis of different solvatochromic responses of the 4 PDAs to 11 organic solvents. Exposure of the PDA array to an unknown solvent promotes color changes, which are imaged using a smartphone camera and analyzed using the app. A comparison of the color changes to the database promoted by the 11 solvents enables the smartphone app to identify the unknown solvent with 100% accuracy. Additionally, it was demonstrated that the PDA array sensor was sufficiently sensitive to accurately detect the 11 VOC gases.
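
    The identification step described above amounts to matching the measured colour changes of the four PDA spots against the pre-built database. A minimal sketch of one plausible matching rule (nearest neighbour on per-spot red/hue changes) is given below; the database entries and feature values are fabricated, and the authors' app may use a different metric.

        import math

        # Nearest-neighbour solvent identification from the (delta_red, delta_hue)
        # responses of a 4-PDA array.  Database values are made-up examples.
        DATABASE = {
            "ethanol": [(0.42, 0.10), (0.05, 0.02), (0.30, 0.08), (0.12, 0.03)],
            "acetone": [(0.10, 0.03), (0.55, 0.15), (0.08, 0.01), (0.20, 0.06)],
            "hexane":  [(0.02, 0.01), (0.04, 0.01), (0.60, 0.18), (0.05, 0.02)],
        }

        def identify(measured):
            def dist(ref):
                return math.sqrt(sum((a - c) ** 2 + (b - d) ** 2
                                     for (a, b), (c, d) in zip(measured, ref)))
            return min(DATABASE, key=lambda name: dist(DATABASE[name]))

        print(identify([(0.40, 0.09), (0.06, 0.02), (0.28, 0.07), (0.13, 0.03)]))  # ethanol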

  12. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array.

    PubMed

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J; Urbas, Augustine

    2016-10-10

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed "algorithmic spectrometry". We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imagery using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the spectrum of blackbody radiation and a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme.
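
    A minimal sketch of the weighted-superposition idea (an assumed least-squares formulation; the filter responses below are random stand-ins, not the fabricated filters): the outputs of N broadband plasmonic filters are combined with pseudo-inverse weights to recover the intensity at the target wavelengths.

        import numpy as np

        # Spectral reconstruction as a weighted superposition of filter outputs.
        rng = np.random.default_rng(0)
        n_filters, n_wavelengths = 8, 8
        T = rng.uniform(0.1, 1.0, size=(n_filters, n_wavelengths))   # assumed filter responses
        true_spectrum = rng.uniform(0.0, 1.0, size=n_wavelengths)    # spectrum to recover

        y = T @ true_spectrum              # measured outputs of the plasmonic filter array
        W = np.linalg.pinv(T)              # superposition weights (least squares)
        estimate = W @ y                   # reconstructed spectrum

        print(np.max(np.abs(estimate - true_spectrum)))   # ~0 for a well-conditioned T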

  13. Pixel parallel localized driver design for a 128 x 256 pixel array 3D 1Gfps image sensor

    NASA Astrophysics Data System (ADS)

    Zhang, C.; Dao, V. T. S.; Etoh, T. G.; Charbon, E.

    2017-02-01

    In this paper, a 3D 1Gfps BSI image sensor is proposed, where 128 × 256 pixels are located in the top-tier chip and a 32 × 32 localized driver array in the bottom-tier chip. Pixels are designed with Multiple Collection Gates (MCG), which collect photons selectively with different collection gates being active at intervals of 1 ns to achieve 1Gfps. For the drivers, a global PLL is designed, which consists of a ring oscillator with 6-stage current-starved differential inverters, achieving a wide frequency tuning range from 40 MHz to 360 MHz (20 ps rms jitter). The drivers are replicas of the ring oscillator that operates within the PLL. Together with level shifters and XNOR gates, continuous 3.3 V pulses are generated with the desired pulse width, which is 1/12 of the PLL clock period. The driver array is activated by a START signal, which propagates through a highly balanced clock tree, to activate all the pixels at the same time with virtually negligible skew.

  14. Efficient Smart CMOS Camera Based on FPGAs Oriented to Embedded Image Processing

    PubMed Central

    Bravo, Ignacio; Baliñas, Javier; Gardel, Alfredo; Lázaro, José L.; Espinosa, Felipe; García, Jorge

    2011-01-01

    This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families make it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal Power PC (PPC) microprocessor. In turn, this contains a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID: 22163739

  15. The Acoustic Lens Design and in Vivo Use of a Multifunctional Catheter Combining Intracardiac Ultrasound Imaging and Electrophysiology Sensing

    PubMed Central

    Stephens, Douglas N.; Cannata, Jonathan; Liu, Ruibin; Zhao, Jian Zhong; Shung, K. Kirk; Nguyen, Hien; Chia, Raymond; Dentinger, Aaron; Wildes, Douglas; Thomenius, Kai E.; Mahajan, Aman; Shivkumar, Kalyanam; Kim, Kang; O’Donnell, Matthew; Sahn, David

    2009-01-01

    A multifunctional 9F intracardiac imaging and electrophysiology mapping catheter was developed and tested to help guide diagnostic and therapeutic intracardiac electrophysiology (EP) procedures. The catheter tip includes a 7.25-MHz, 64-element, side-looking phased array for high resolution sector scanning. Multiple electrophysiology mapping sensors were mounted as ring electrodes near the array for electrocardiographic synchronization of ultrasound images. The catheter array elevation beam performance in particular was investigated. An acoustic lens for the distal tip array designed with a round cross section can produce an acceptable elevation beam shape; however, the velocity of sound in the lens material should be approximately 155 m/s slower than in tissue for the best beam shape and wide bandwidth performance. To help establish the catheter’s unique ability for integration with electrophysiology interventional procedures, it was used in vivo in a porcine animal model, and demonstrated both useful intracardiac echocardiographic visualization and simultaneous 3-D positional information using integrated electroanatomical mapping techniques. The catheter also performed well in high frame rate imaging, color flow imaging, and strain rate imaging of atrial and ventricular structures. PMID:18407850

  16. Simulation of sampling effects in FPAs

    NASA Astrophysics Data System (ADS)

    Cook, Thomas H.; Hall, Charles S.; Smith, Frederick G.; Rogne, Timothy J.

    1991-09-01

    The use of multiplexers and large focal plane arrays in advanced thermal imaging systems has drawn renewed attention to sampling and aliasing issues in imaging applications. As evidenced by discussions in a recent workshop, there is no clear consensus among experts whether aliasing in sensor designs can be readily tolerated, or must be avoided at all cost. Further, there is no straightforward, analytical method that can answer the question, particularly when considering image interpreters as different as humans and autonomous target recognizers (ATR). However, the means exist for investigating sampling and aliasing issues through computer simulation. The U.S. Army Tank-Automotive Command (TACOM) Thermal Image Model (TTIM) provides realistic sensor imagery that can be evaluated by both human observers and ATRs. This paper briefly describes the history and current status of TTIM, explains the simulation of FPA sampling effects, presents validation results of the FPA sensor model, and demonstrates the utility of TTIM for investigating sampling effects in imagery.

  17. High-Throughput and Label-Free Single Nanoparticle Sizing Based on Time-Resolved On-Chip Microscopy

    DTIC Science & Technology

    2015-02-17

    12,13 soot,6,14 ice crystals in clouds,15 and engineered nanomaterials,16 among others. While there exist various nanoparticle detection and sizing...the sample of interest is placed on an optoelectronic sensor-array with typically less than 0.5 mm gap (z2) between the sample and sensor planes such...that, under unit magnification, the entire sensor active area serves as the imaging FOV, easily reaching >2030 mm2 with state-of-the-art CMOS

  18. The low-order wavefront control system for the PICTURE-C mission: preliminary testbed results from the Shack-Hartmann sensor

    NASA Astrophysics Data System (ADS)

    Howe, Glenn A.; Mendillo, Christopher B.; Hewawasam, Kuravi; Martel, Jason; Finn, Susanna C.; Cook, Timothy A.; Chakrabarti, Supriya

    2017-09-01

    The Planetary Imaging Concept Testbed Using a Recoverable Experiment - Coronagraph (PICTURE-C) mission will directly image debris disks and exozodiacal dust around three nearby stars from a high-altitude balloon using a vector vortex coronagraph. We present experimental results of the PICTURE-C low-order wavefront control (LOWFC) system utilizing a Shack-Hartmann (SH) sensor in an instrument testbed. The SH sensor drives both the alignment of the telescope secondary mirror using a 6-axis Hexapod and a surface parallel array deformable mirror to remove residual low-order aberrations. The sensor design and actuator calibration methods are discussed and the preliminary LOWFC closed-loop performance is shown to stabilize a reference wavefront to an RMS error of 0.30 +/- 0.29 nm.
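
    For context, a minimal sketch of the standard Shack-Hartmann measurement (a generic formulation, not the PICTURE-C flight code): the local wavefront slope over each lenslet is proportional to the displacement of its focal spot from a reference centroid. The cell size, lenslet focal length and pixel pitch below are placeholder assumptions.

        import numpy as np

        def spot_centroid(sub_image):
            """Intensity-weighted centroid (y, x) of one lenslet sub-image."""
            total = sub_image.sum()
            ys, xs = np.indices(sub_image.shape)
            return (ys * sub_image).sum() / total, (xs * sub_image).sum() / total

        def slopes(frame, ref_centroids, cell=16, focal_mm=5.0, pixel_um=7.4):
            """Wavefront slopes (rad) for a grid of cell x cell lenslet sub-apertures."""
            scale = (pixel_um * 1e-3) / focal_mm          # pixel displacement -> angle
            out = []
            for (i, j), (ry, rx) in ref_centroids.items():
                sub = frame[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
                cy, cx = spot_centroid(sub)
                out.append(((cy - ry) * scale, (cx - rx) * scale))
            return out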

  19. Robust optical sensors for safety critical automotive applications

    NASA Astrophysics Data System (ADS)

    De Locht, Cliff; De Knibber, Sven; Maddalena, Sam

    2008-02-01

    Optical sensors for the automotive industry need to be robust, high-performing and low cost. This paper focuses on the impact of automotive requirements on optical sensor design and packaging. The main strategies to lower optical sensor entry barriers in the automotive market are: sensor calibration and tuning performed by the sensor manufacturer, on-chip sensor test modes to guarantee functional integrity during operation, and suitable package technology. In conclusion, optical sensor applications in automotive are growing. Optical sensor robustness has matured to the level of safety-critical applications such as Electrical Power Assisted Steering (EPAS) and Drive-by-Wire, addressed by systems based on optical linear arrays, and Automated Cruise Control (ACC), Lane Change Assist and Driver Classification/Smart Airbag Deployment, addressed by systems based on camera imagers.

  20. Broadband Terahertz Computed Tomography Using a 5k-pixel Real-time THz Camera

    NASA Astrophysics Data System (ADS)

    Trichopoulos, Georgios C.; Sertel, Kubilay

    2015-07-01

    We present a novel THz computed tomography system that enables fast 3-dimensional imaging and spectroscopy in the 0.6-1.2 THz band. The system is based on a new real-time broadband THz camera that enables rapid acquisition of multiple cross-sectional images required in computed tomography. Tomographic reconstruction is achieved using digital images from the densely-packed large-format (80×64) focal plane array sensor located behind a hyper-hemispherical silicon lens. Each pixel of the sensor array consists of an 85 μm × 92 μm lithographically fabricated wideband dual-slot antenna, monolithically integrated with an ultra-fast diode tuned to operate in the 0.6-1.2 THz regime. Concurrently, optimum impedance matching was implemented for maximum pixel sensitivity, enabling 5 frames-per-second image acquisition speed. As such, the THz computed tomography system generates diffraction-limited resolution cross-section images as well as the three-dimensional models of various opaque and partially transparent objects. As an example, an over-the-counter vitamin supplement pill is imaged and its material composition is reconstructed. The new THz camera enables, for the first time, a practical application of THz computed tomography for non-destructive evaluation and biomedical imaging.

  1. Micro-Hall devices for magnetic, electric and photo-detection

    NASA Astrophysics Data System (ADS)

    Gilbertson, A.; Sadeghi, H.; Panchal, V.; Kazakova, O.; Lambert, C. J.; Solin, S. A.; Cohen, L. F.

    Multifunctional mesoscopic sensors capable of detecting local magnetic (B), electric (E), and optical fields can greatly facilitate image capture in nano-arrays that address a multitude of disciplines. The use of micro-Hall devices as B-field sensors and, more recently, as E-field sensors is well established. Here we report the real-space voltage response of InSb/AlInSb micro-Hall devices to not only local E- and B-fields but also to photo-excitation using scanning probe microscopy. We show that the ultrafast generation of localised photocarriers results in conductance perturbations analogous to those produced by local E-fields. Our experimental results are in good agreement with tight-binding transport calculations in the diffusive regime. At room temperature, samples exhibit a magnetic sensitivity of >500 nT/√Hz, an optical noise equivalent power of >20 pW/√Hz (λ = 635 nm) comparable to commercial photoconductive detectors, and a charge sensitivity of >0.04 e/√Hz comparable to that of single electron transistors. Work done while on sabbatical from Washington University. Co-founder of PixelEXX, a start-up whose focus is imaging nano-arrays.

  2. Design of area array CCD image acquisition and display system based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Lei; Zhang, Ning; Li, Tianting; Pan, Yue; Dai, Yuming

    2014-09-01

    With the development of science and technology, CCD (Charge-Coupled Device) sensors have been widely applied in various fields and play an important role in modern sensing systems; therefore, researching a real-time image acquisition and display design based on a CCD device has great significance. This paper introduces an image data acquisition and display system for an area array CCD based on an FPGA. Several key technical challenges and problems of the system have also been analyzed and solutions put forward. The FPGA works as the core processing unit in the system and controls the overall timing sequence. The ICX285AL area array CCD image sensor produced by SONY Corporation has been used in the system. The FPGA drives the area array CCD, and an analog front end (AFE) then processes the CCD image signal, including amplification, filtering, noise elimination, correlated double sampling (CDS), etc. An AD9945 produced by ADI Corporation is used to convert the analog signal to a digital signal. A Camera Link high-speed data transmission circuit was developed, the PC-side image acquisition software was completed, and real-time display of images was realized. Practical testing indicates that the system is stable and reliable in image acquisition and control, and its performance meets the actual project requirements.

  3. Surface plasmon resonance imaging system with Mach-Zehnder phase-shift interferometry for DNA micro-array hybridization

    NASA Astrophysics Data System (ADS)

    Hsiu, Feng-Ming; Chen, Shean-Jen; Tsai, Chien-Hung; Tsou, Chia-Yuan; Su, Y.-D.; Lin, G.-Y.; Huang, K.-T.; Chyou, Jin-Jung; Ku, Wei-Chih; Chiu, S.-K.; Tzeng, C.-M.

    2002-09-01

    A surface plasmon resonance (SPR) imaging system is presented as a novel technique, based on modified Mach-Zehnder phase-shifting interferometry (PSI), for biomolecular interaction analysis (BIA); it measures the spatial phase variation of resonantly reflected light during biomolecular interactions. With this technique, micro-array SPR biosensors with over a thousand probe DNA spots can be read simultaneously. Owing to the feasible and swift measurements, the micro-array SPR biosensors can be extensively applied to studies of nonspecific protein adsorption, membrane/protein interactions, and DNA hybridization. The detection sensitivity of the SPR PSI imaging system is improved to about 1 pg/mm2 for each spot over conventional SPR imaging systems. The SPR PSI imaging system and its SPR sensors have been successfully used to observe a slight index change caused by argon gas flowing through nitrogen, in real time, with high sensitivity, and at high-throughput screening rates.
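
    A minimal sketch of a standard four-step phase-shifting calculation (the generic PSI relation, assumed here rather than taken from the paper): four interferograms recorded with reference-arm shifts of 0, π/2, π and 3π/2 give the wrapped phase map directly.

        import numpy as np

        def four_step_phase(i1, i2, i3, i4):
            """Wrapped phase (rad) from four interferograms shifted by 0, pi/2, pi, 3pi/2."""
            return np.arctan2(i4 - i2, i1 - i3)

        # Synthetic self-check with a known phase ramp.
        phi = np.linspace(0.0, 2.0, 5)[None, :] * np.ones((3, 1))
        frames = [1.0 + 0.5 * np.cos(phi + d) for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
        print(np.allclose(four_step_phase(*frames), phi))   # True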

  4. A Comparison of Lightning Flashes as Observed by the Lightning Imaging Sensor and the North Alabama Lightning Mapping Array

    NASA Technical Reports Server (NTRS)

    Bateman, M. G.; Mach, D. M.; McCaul, M. G.; Bailey, J. C.; Christian, H. J.

    2008-01-01

    The Lightning Imaging Sensor (LIS) aboard the TRMM satellite has been collecting optical lightning data since November 1997. A Lightning Mapping Array (LMA) that senses VHF impulses from lightning was installed in North Alabama in the fall of 2001. A dataset has been compiled to compare data from both instruments for all times when the LIS was passing over the domain of our LMA. We have algorithms for both instruments to group pixels or point sources into lightning flashes. This study presents the comparison statistics of the flash data output (flash duration, size, and amplitude) from both algorithms. We will present the results of this comparison study and show "point-level" data to explain the differences. As we head closer to realizing a Global Lightning Mapper (GLM) on GOES-R, better understanding and ground truth of each of these instruments and their respective flash algorithms are needed.

  5. Supplemental blue LED lighting array to improve the signal quality in hyperspectral imaging of plants.

    PubMed

    Mahlein, Anne-Katrin; Hammersley, Simon; Oerke, Erich-Christian; Dehne, Heinz-Wilhelm; Goldbach, Heiner; Grieve, Bruce

    2015-06-01

    Hyperspectral imaging systems used in plant science or agriculture often have suboptimal signal-to-noise ratio in the blue region (400-500 nm) of the electromagnetic spectrum. Typically there are two principal reasons for this effect, the low sensitivity of the imaging sensor and the low amount of light available from the illuminating source. In plant science, the blue region contains relevant information about the physiology and the health status of a plant. We report on the improvement in sensitivity of a hyperspectral imaging system in the blue region of the spectrum by using supplemental illumination provided by an array of high brightness light emitting diodes (LEDs) with an emission peak at 470 nm.

  6. Novel eye-safe line scanning 3D laser-radar

    NASA Astrophysics Data System (ADS)

    Eberle, B.; Kern, Tobias; Hammer, Marcus; Schwanke, Ullrich; Nowak, Heinrich

    2014-10-01

    Today, the civil market provides quite a number of different 3D sensors covering ranges up to 1 km. Typically these sensors are based on single-element detectors, which suffer from limited spatial resolution at larger distances. Tasks demanding reliable object classification at long ranges can be fulfilled only by sensors consisting of detector arrays, which ensure sufficient frame rates and high spatial resolution. Worldwide there are many efforts in developing 3D detectors based on two-dimensional arrays. This paper presents first results on the performance of a recently developed 3D imaging laser radar sensor, working in the short-wave infrared (SWIR) at 1.5 μm. It consists of a novel Cadmium Mercury Telluride (CMT) linear array APD detector with 384x1 elements at a pitch of 25 μm, developed by AIM Infrarot Module GmbH. The APD elements are designed to work in the linear (non-Geiger) mode. Each pixel provides a time-of-flight measurement and, due to the linear detection mode, allows the detection of three successive echoes. The resolution in depth is 15 cm; the maximum repetition rate is 4 kHz. We discuss various sensor concepts regarding possible applications and their dependence on system parameters such as field of view, frame rate, spatial resolution and range of operation.

  7. Thermal noise variance of a receive radiofrequency coil as a respiratory motion sensor.

    PubMed

    Andreychenko, A; Raaijmakers, A J E; Sbrizzi, A; Crijns, S P M; Lagendijk, J J W; Luijten, P R; van den Berg, C A T

    2017-01-01

    Development of a passive respiratory motion sensor based on the noise variance of the receive coil array. Respiratory motion alters the body resistance. The noise variance of an RF coil depends on the body resistance and, thus, is also modulated by respiration. For the noise variance monitoring, noise samples were acquired without and with MR signal excitation on clinical 1.5/3 T MR scanners. The performance of the noise sensor was compared with the respiratory bellows and with the diaphragm displacement visible on MR images. Several breathing patterns were tested. The noise variance demonstrated a periodic temporal modulation that was synchronized with the respiratory bellows signal. The modulation depth of the noise variance resulting from respiration varied between the channels of the array and depended on the channel's location with respect to the body. The noise sensor combined with MR acquisition was able to detect the respiratory motion for every k-space read-out line. Within clinical MR systems, respiratory motion can be detected from the noise in the receive array. Unlike the bellows, the noise sensor does not require careful positioning, and it needs no additional hardware or dedicated MR acquisition. Magn Reson Med 77:221-228, 2017. © 2016 Wiley Periodicals, Inc.
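
    A minimal sketch of one way such a noise-based respiratory trace could be formed (an assumed processing chain, not the authors' implementation): the variance of the noise-only samples of a receive channel is computed for every read-out and lightly smoothed; breathing appears as a slow modulation of this trace. The synthetic data below are fabricated.

        import numpy as np

        def respiratory_trace(noise, smooth=5):
            """Per-read-out noise variance of one coil channel, lightly smoothed."""
            var = np.var(noise, axis=1)                   # one variance per read-out line
            kernel = np.ones(smooth) / smooth
            return np.convolve(var, kernel, mode="same")

        # Synthetic example: complex white noise whose power is modulated at ~0.25 Hz.
        t = np.arange(2000) * 0.005                       # one read-out every 5 ms
        sigma = 1.0 + 0.1 * np.sin(2 * np.pi * 0.25 * t)
        noise = sigma[:, None] * (np.random.randn(2000, 256) + 1j * np.random.randn(2000, 256))
        trace = respiratory_trace(noise)                  # slow oscillation follows sigma**2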

  8. Noise reduction techniques for Bayer-matrix images

    NASA Astrophysics Data System (ADS)

    Kalevo, Ossi; Rantanen, Henry

    2002-04-01

    In this paper, some arrangements for applying Noise Reduction (NR) techniques to images captured by a single-sensor digital camera are studied. Usually, the NR filter processes full three-color-component image data. This requires that the raw Bayer-matrix image data, available from the image sensor, is first interpolated using a Color Filter Array Interpolation (CFAI) method. Another choice is that the raw Bayer-matrix image data is processed directly. The advantages and disadvantages of both processing orders, before (pre-) CFAI and after (post-) CFAI, are studied with linear, multi-stage median, multistage median hybrid and median-rational filters. The comparison is based on the quality of the output image, the processing power requirements and the amount of memory needed. A solution that improves the preservation of details when NR filtering is applied before the CFAI is also proposed.
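
    To make the two processing orders concrete, the sketch below contrasts filtering the raw mosaic before interpolation with filtering after it, using a plain bilinear interpolation of the green plane of an RGGB mosaic and a median filter as stand-ins; the paper's filters and CFAI method are more elaborate, and real pre-CFAI filtering operates on same-colour neighbours rather than the mixed mosaic shown here.

        import numpy as np
        from scipy.ndimage import convolve, median_filter

        def interpolate_green(bayer):
            """Bilinear interpolation of the green samples of an RGGB mosaic."""
            g_mask = np.zeros(bayer.shape, dtype=bool)
            g_mask[0::2, 1::2] = True                     # G on even rows, odd columns
            g_mask[1::2, 0::2] = True                     # G on odd rows, even columns
            g = np.where(g_mask, bayer, 0.0)
            kernel = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
            weights = convolve(g_mask.astype(float), kernel, mode="mirror")
            return convolve(g, kernel, mode="mirror") / weights

        raw = np.random.rand(64, 64)                            # stand-in for raw Bayer data
        pre_cfai  = interpolate_green(median_filter(raw, 3))    # NR before CFAI (simplified)
        post_cfai = median_filter(interpolate_green(raw), 3)    # NR after CFAI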

  9. Real-time imaging of microparticles and living cells with CMOS nanocapacitor arrays

    NASA Astrophysics Data System (ADS)

    Laborde, C.; Pittino, F.; Verhoeven, H. A.; Lemay, S. G.; Selmi, L.; Jongsma, M. A.; Widdershoven, F. P.

    2015-09-01

    Platforms that offer massively parallel, label-free biosensing can, in principle, be created by combining all-electrical detection with low-cost integrated circuits. Examples include field-effect transistor arrays, which are used for mapping neuronal signals and sequencing DNA. Despite these successes, however, bioelectronics has so far failed to deliver a broadly applicable biosensing platform. This is due, in part, to the fact that d.c. or low-frequency signals cannot be used to probe beyond the electrical double layer formed by screening salt ions, which means that under physiological conditions the sensing of a target analyte located even a short distance from the sensor (∼1 nm) is severely hampered. Here, we show that high-frequency impedance spectroscopy can be used to detect and image microparticles and living cells under physiological salt conditions. Our assay employs a large-scale, high-density array of nanoelectrodes integrated with CMOS electronics on a single chip and the sensor response depends on the electrical properties of the analyte, allowing impedance-based fingerprinting. With our platform, we image the dynamic attachment and micromotion of BEAS, THP1 and MCF7 cancer cell lines in real time at submicrometre resolution in growth medium, demonstrating the potential of the platform for label/tracer-free high-throughput screening of anti-tumour drug candidates.

  10. Guaranteeing Failsafe Operation of Extended-Scene Shack-Hartmann Wavefront Sensor Algorithm

    NASA Technical Reports Server (NTRS)

    Sidick, Erikin

    2009-01-01

    A Shack-Hartmann sensor (SHS) is an optical instrument consisting of a lenslet array and a camera. It is widely used for wavefront sensing in optical testing and astronomical adaptive optics. The camera is placed at the focal point of the lenslet array and points at a star or any other point source. The image captured is an array of spot images. When the wavefront error at the lenslet array changes, the position of each spot measurably shifts from its original position. Determining the shifts of the spot images from their reference points shows the extent of the wavefront error. An adaptive cross-correlation (ACC) algorithm has been developed to use scenes as well as point sources for wavefront error detection. Qualifying an extended scene image is often not an easy task due to changing conditions in scene content, illumination level, background, Poisson noise, read-out noise, dark current, sampling format, and field of view. The proposed new technique based on the ACC algorithm analyzes the effects of these conditions on the performance of the ACC algorithm and determines the viability of an extended scene image. If it is viable, then it can be used for error correction; if it is not, the image fails and will not be further processed. By potentially testing for a wide variety of conditions, the algorithm's accuracy can be virtually guaranteed. In a typical application, the ACC algorithm finds image shifts of more than 500 Shack-Hartmann camera sub-images relative to a reference sub-image or cell when performing one wavefront sensing iteration. In the proposed new technique, a pair of test and reference cells is selected from the same frame, preferably from two well-separated locations. The test cell is shifted by an integer number of pixels, say, for example, from m = -5 to 5 along the x-direction by choosing a different area on the same sub-image, and the shifts are estimated using the ACC algorithm. The same is done in the y-direction. If the resulting shift estimate errors are less than a pre-determined threshold (e.g., 0.03 pixel), the image is accepted. Otherwise, it is rejected.
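
    A minimal sketch of the self-test idea described above (a plain FFT cross-correlation with an integer-pixel estimator stands in for the ACC algorithm, so a 0.5-pixel tolerance replaces the 0.03-pixel threshold a sub-pixel estimator would use): a test cell is shifted by known amounts and the image is accepted only if the estimator recovers those shifts.

        import numpy as np

        def estimate_shift(ref, test):
            """Integer (dy, dx) shift of `test` relative to `ref` via cross-correlation."""
            xcorr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(test))).real
            dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
            wrap = lambda d, n: d - n if d > n // 2 else d
            return wrap(dy, ref.shape[0]), wrap(dx, ref.shape[1])

        def scene_is_viable(frame, cell=32, max_err_px=0.5):
            """Accept the frame only if known x-shifts of a test cell are recovered."""
            ref = frame[:cell, 8:8 + cell]
            for true_dx in range(-5, 6):
                test = frame[:cell, 8 + true_dx:8 + true_dx + cell]
                _, dx = estimate_shift(ref, test)
                if abs(dx - true_dx) > max_err_px:
                    return False
            return True

        print(scene_is_viable(np.random.rand(64, 64)))    # textured scene -> True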

  11. Material condition assessment with eddy current sensors

    NASA Technical Reports Server (NTRS)

    Goldfine, Neil J. (Inventor); Washabaugh, Andrew P. (Inventor); Sheiretov, Yanko K. (Inventor); Schlicker, Darrell E. (Inventor); Lyons, Robert J. (Inventor); Windoloski, Mark D. (Inventor); Craven, Christopher A. (Inventor); Tsukernik, Vladimir B. (Inventor); Grundy, David C. (Inventor)

    2010-01-01

    Eddy current sensors and sensor arrays are used for process quality and material condition assessment of conducting materials. In an embodiment, changes in spatially registered high resolution images taken before and after cold work processing reflect the quality of the process, such as intensity and coverage. These images also permit the suppression or removal of local outlier variations. Anisotropy in a material property, such as magnetic permeability or electrical conductivity, can be intentionally introduced and used to assess material condition resulting from an operation, such as a cold work or heat treatment. The anisotropy is determined by sensors that provide directional property measurements. The sensor directionality arises from constructs that use a linear conducting drive segment to impose the magnetic field in a test material. Maintaining the orientation of this drive segment, and associated sense elements, relative to a material edge provides enhanced sensitivity for crack detection at edges.

  12. Smart-phone based computational microscopy using multi-frame contact imaging on a fiber-optic array.

    PubMed

    Navruz, Isa; Coskun, Ahmet F; Wong, Justin; Mohammad, Saqib; Tseng, Derek; Nagi, Richie; Phillips, Stephen; Ozcan, Aydogan

    2013-10-21

    We demonstrate a cellphone based contact microscopy platform, termed Contact Scope, which can image highly dense or connected samples in transmission mode. Weighing approximately 76 grams, this portable and compact microscope is installed on the existing camera unit of a cellphone using an opto-mechanical add-on, where planar samples of interest are placed in contact with the top facet of a tapered fiber-optic array. This glass-based tapered fiber array has ~9 fold higher density of fiber optic cables on its top facet compared to the bottom one and is illuminated by an incoherent light source, e.g., a simple light-emitting-diode (LED). The transmitted light pattern through the object is then sampled by this array of fiber optic cables, delivering a transmission image of the sample onto the other side of the taper, with ~3× magnification in each direction. This magnified image of the object, located at the bottom facet of the fiber array, is then projected onto the CMOS image sensor of the cellphone using two lenses. While keeping the sample and the cellphone camera at a fixed position, the fiber-optic array is then manually rotated with discrete angular increments of e.g., 1-2 degrees. At each angular position of the fiber-optic array, contact images are captured using the cellphone camera, creating a sequence of transmission images for the same sample. These multi-frame images are digitally fused together based on a shift-and-add algorithm through a custom-developed Android application running on the smart-phone, providing the final microscopic image of the sample, visualized through the screen of the phone. This final computation step improves the resolution and also removes spatial artefacts that arise due to non-uniform sampling of the transmission intensity at the fiber optic array surface. We validated the performance of this cellphone based Contact Scope by imaging resolution test charts and blood smears.
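
    A minimal sketch of a shift-and-add fusion step (an assumed simplified form; the Android application also performs the registration and artefact removal described above): frames are upsampled onto a common grid, displaced by their known integer shifts, and averaged. The frame data and shifts below are placeholders.

        import numpy as np

        def shift_and_add(frames, shifts, upsample=3):
            """Fuse frames given their (dy, dx) shifts in upsampled-pixel units."""
            h, w = frames[0].shape
            acc = np.zeros((h * upsample, w * upsample))
            for frame, (dy, dx) in zip(frames, shifts):
                up = np.kron(frame, np.ones((upsample, upsample)))   # naive upsampling
                acc += np.roll(np.roll(up, dy, axis=0), dx, axis=1)  # undo the known shift
            return acc / len(frames)

        frames = [np.random.rand(40, 40) for _ in range(4)]          # placeholder frames
        shifts = [(0, 0), (1, 2), (-2, 1), (2, -1)]                  # placeholder shifts
        fused = shift_and_add(frames, shifts)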

  13. Smart-phone based computational microscopy using multi-frame contact imaging on a fiber-optic array

    PubMed Central

    Navruz, Isa; Coskun, Ahmet F.; Wong, Justin; Mohammad, Saqib; Tseng, Derek; Nagi, Richie; Phillips, Stephen; Ozcan, Aydogan

    2013-01-01

    We demonstrate a cellphone based contact microscopy platform, termed Contact Scope, which can image highly dense or connected samples in transmission mode. Weighing approximately 76 grams, this portable and compact microscope is installed on the existing camera unit of a cellphone using an opto-mechanical add-on, where planar samples of interest are placed in contact with the top facet of a tapered fiber-optic array. This glass-based tapered fiber array has ∼9 fold higher density of fiber optic cables on its top facet compared to the bottom one and is illuminated by an incoherent light source, e.g., a simple light-emitting-diode (LED). The transmitted light pattern through the object is then sampled by this array of fiber optic cables, delivering a transmission image of the sample onto the other side of the taper, with ∼3× magnification in each direction. This magnified image of the object, located at the bottom facet of the fiber array, is then projected onto the CMOS image sensor of the cellphone using two lenses. While keeping the sample and the cellphone camera at a fixed position, the fiber-optic array is then manually rotated with discrete angular increments of e.g., 1-2 degrees. At each angular position of the fiber-optic array, contact images are captured using the cellphone camera, creating a sequence of transmission images for the same sample. These multi-frame images are digitally fused together based on a shift-and-add algorithm through a custom-developed Android application running on the smart-phone, providing the final microscopic image of the sample, visualized through the screen of the phone. This final computation step improves the resolution and also gets rid of spatial artefacts that arise due to non-uniform sampling of the transmission intensity at the fiber optic array surface. We validated the performance of this cellphone based Contact Scope by imaging resolution test charts and blood smears. PMID:23939637

  14. Magnetic resonance imaging-compatible tactile sensing device based on a piezoelectric array.

    PubMed

    Hamed, Abbi; Masamune, Ken; Tse, Zion Tsz Ho; Lamperth, Michael; Dohi, Takeyoshi

    2012-07-01

    Minimally invasive surgery is a widely used medical technique, one of the drawbacks of which is the loss of direct sense of touch during the operation. Palpation is the use of fingertips to explore and make fast assessments of tissue morphology. Although technologies have been developed to equip minimally invasive surgery tools with haptic feedback capabilities, the majority focus on tissue stiffness profiling and tool-tissue interaction force measurement. For greatly increased diagnostic capability, a magnetic resonance imaging-compatible tactile sensor design is proposed, which allows minimally invasive surgery to be performed under image guidance, combining the strong soft-tissue imaging capability of magnetic resonance imaging with intuitive palpation. The sensing unit is based on a piezoelectric sensor methodology, which conforms to the stringent mechanical and electrical design requirements imposed by the magnetic resonance environment. The sensor mechanical design and the device integration into a 0.2 Tesla open magnetic resonance imaging scanner are described, together with the device's magnetic resonance compatibility testing. Its design limitations and potential future improvements are also discussed. A tactile sensing unit based on a piezoelectric sensor principle is proposed, which is designed for magnetic resonance imaging guided interventions.

  15. SELF CALIBRATED STMR ARRAY FOR MATERIAL CHARACTERIZATION AND SHM OF ORTHOTROPIC PLATE-LIKE STRUCTURES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnuvardhan, J.; Muralidharan, Ajith; Balasubramaniam, Krishnan

    A full ring STMR array patch has previously been used for Structural Health Monitoring (SHM) of anisotropic materials, where elastic moduli corresponding to the virgin sample were used in the calculations. In the present work, in-situ SHM has been successfully demonstrated using a novel compact sensor patch (double ring single quadrant small footprint STMR array) through simultaneous reconstruction of the elastic moduli, material symmetry, orientation of principal planes and defect imaging. The direct received signals were used to measure Lamb wave velocities, which were used in a slowness-based reconstruction algorithm using a genetic algorithm to reconstruct the elastic moduli, material symmetry and orientation of principal planes. The measured signals along with the reconstructed elastic moduli were used in the phased addition algorithm for imaging the damage present on the structure. To show the applicability of the method, simulations were carried out with the double ring single quadrant STMR array configuration to image defects, and the results are compared with the images obtained using simulation data of the full ring STMR array configuration. The experimental validation has been carried out using a 3.15 mm quasi-isotropic graphite-epoxy composite. The double ring single quadrant STMR array has advantages over the full ring STMR array as it can carry out in-situ SHM with a limited footprint on the structure.
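
    The phased addition algorithm referred to above is, in essence, a delay-and-sum reconstruction: each transmit/receive pair's signal is sampled at the delay implied by the transmitter-pixel-receiver path length and the wave velocity, and the samples are summed per pixel. The sketch below assumes a single isotropic group velocity (the paper uses the anisotropic velocities reconstructed by the genetic algorithm); all names are illustrative:

    ```python
    import numpy as np

    def delay_and_sum(signals, fs, pair_coords, pixels, velocity):
        """Phased-addition image from pitch-catch array data.

        signals     : (n_pairs, n_samples) time traces, one per transmit/receive pair
        fs          : sampling frequency in Hz
        pair_coords : (n_pairs, 2, 2) transmitter and receiver coordinates per pair (m)
        pixels      : (n_pix, 2) image pixel coordinates (m)
        velocity    : assumed Lamb-wave group velocity (m/s)
        """
        n_pairs, n_samples = signals.shape
        image = np.zeros(len(pixels))
        for p, pix in enumerate(pixels):
            for k in range(n_pairs):
                tx, rx = pair_coords[k]
                travel = (np.linalg.norm(pix - tx) + np.linalg.norm(pix - rx)) / velocity
                idx = int(round(travel * fs))
                if idx < n_samples:
                    image[p] += signals[k, idx]
        # a full implementation would sum the envelope of the analytic signal;
        # taking the magnitude of the raw sum is a crude stand-in
        return np.abs(image)
    ```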

  16. Forensics for flatbed scanners

    NASA Astrophysics Data System (ADS)

    Gloe, Thomas; Franz, Elke; Winkler, Antje

    2007-02-01

    Within this article, we investigate possibilities for identifying the origin of images acquired with flatbed scanners. A current method for the identification of digital cameras takes advantage of image sensor noise, specifically the spatial noise. Since flatbed scanners and digital cameras use similar technologies, the utilization of image sensor noise for identifying the origin of scanned images seems to be possible. To characterize flatbed scanner noise, we considered array reference patterns and sensor line reference patterns. However, there are particularities of flatbed scanners which we expect to influence the identification. This was confirmed by extensive tests: Identification was possible to a certain degree, but less reliable than digital camera identification. In additional tests, we simulated the influence of flatfielding and downscaling, as examples of such scanner particularities, on digital camera identification. One can conclude from the results achieved so far that identifying flatbed scanners is possible. However, since the analyzed methods are not able to determine the image origin in all cases, further investigations are necessary.
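
    A common way to realize reference-pattern identification of this kind is to average the noise residuals of many images from the same device and correlate a query image's residual against that reference; for a scanner, the residual can additionally be averaged along the scan direction to obtain a line reference pattern. The denoising filter and similarity measure below are simplifications chosen for illustration, not the paper's exact procedure:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def noise_residual(img, sigma=1.0):
        # spatial-noise estimate: image minus a smoothed version of itself
        return img - gaussian_filter(img, sigma)

    def array_reference_pattern(images):
        # average residuals over many acquisitions from the same device
        return np.mean([noise_residual(im) for im in images], axis=0)

    def line_reference_pattern(images):
        # a scanner reuses one sensor line: also average along the scan direction
        return array_reference_pattern(images).mean(axis=0)

    def correlation(a, b):
        # normalized correlation between a query residual and a reference pattern
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    ```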

  17. Towards Silicon-Based Longwave Integrated Optoelectronics (LIO)

    DTIC Science & Technology

    2008-01-21

    circuitry. The photonics can use, for example, microbolometers and III-V photodetectors as well as III-V interband cascade and quantum cascade lasers ... chips using inputs from several sensors. (4) imaging: focal-plane-array imager with integral readout, infrared-to-visible image converter chip, (5) ... photodetectors, type II interband cascades and QCLs. I would integrate the cascades in LIO using a technique similar to that developed by John Bowers'

  18. MSTI-3 sensor package optical design

    NASA Astrophysics Data System (ADS)

    Horton, Richard F.; Baker, William G.; Griggs, Michael; Nguyen, Van; Baker, H. Vernon

    1995-06-01

    The MSTI-3 sensor package is a three band imaging telescope for military and dual use sensing missions. The MSTI-3 mission is one of the Air Force Phillips Laboratory's Pegasus launched space missions, the third in a series of state-of-the-art lightweight sensors on low cost satellites. The satellite is planned for launch into a 425 km orbit in late 1995. The MSTI-3 satellite is configured with a down looking two axis gimbal and gimbal mirror. The gimbal mirror is an approximately 13 cm by 29 cm mirror which allows a field of regard approximately 100 degrees by 180 degrees. The optical train uses several novel optical features to allow for compactness and light weight. A 105 mm Ritchey-Chretien Cassegrain imaging system with a CaF2 dome astigmatism corrector is followed by a CaF2 beamsplitter cube assembly at the system's first focus. The dichroic beamsplitter cube assembly separates the light into a visible channel and two IR channels with approximately 2.5 to 3.3 micron (SWIR) and 3.5 to 4.5 micron (MWIR) wavelength bands. The two IR imaging channels each consist of a unity-power re-imaging lens cluster, a cooled seven position filter wheel, a cooled Lyot stop and an Amber 256 × 256 InSb array camera. The visible channel uses a unity-power re-imaging system prior to a linear variable filter with a Sony CCD array, which allows for a multispectral imaging capability in the 0.5 to 0.8 micron region. The telescope field of view is 1.4 degrees square.

  19. Calibrating the orientation between a microlens array and a sensor based on projective geometry

    NASA Astrophysics Data System (ADS)

    Su, Lijuan; Yan, Qiangqiang; Cao, Jun; Yuan, Yan

    2016-07-01

    We demonstrate a method for calibrating a microlens array (MLA) with a sensor component by building a plenoptic camera with a conventional prime lens. This calibration method includes a geometric model, a setup to adjust the distance (L) between the prime lens and the MLA, a calibration procedure for determining the subimage centers, and an optimization algorithm. The geometric model introduces nine unknown parameters regarding the centers of the microlenses and their images, whereas the distance adjustment setup provides an initial guess for the distance L. The simulation results verify the effectiveness and accuracy of the proposed method. The experimental results demonstrate that the calibration process can be performed with a commercial prime lens and that the proposed method can be used to quantitatively evaluate whether an MLA and a sensor are assembled properly for plenoptic systems.
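
    One straightforward way to obtain initial estimates of the subimage centers used in such a calibration is to threshold a uniformly illuminated (white) plenoptic image, label the bright spot behind each microlens, and take the spot centroids; the snippet below sketches only that step, not the nine-parameter geometric model or the optimization, and the names are illustrative:

    ```python
    import numpy as np
    from scipy import ndimage

    def subimage_centers(white_image, rel_threshold=0.5):
        """Centroids of microlens subimages in a white calibration image."""
        mask = white_image > rel_threshold * white_image.max()
        labels, n = ndimage.label(mask)
        centers = ndimage.center_of_mass(white_image, labels, range(1, n + 1))
        return np.asarray(centers)  # (n_microlenses, 2) in (row, col) order
    ```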

  20. Giga-pixel lensfree holographic microscopy and tomography using color image sensors.

    PubMed

    Isikman, Serhan O; Greenbaum, Alon; Luo, Wei; Coskun, Ahmet F; Ozcan, Aydogan

    2012-01-01

    We report Giga-pixel lensfree holographic microscopy and tomography using color sensor-arrays such as CMOS imagers that exhibit Bayer color filter patterns. Without physically removing these color filters coated on the sensor chip, we synthesize pixel super-resolved lensfree holograms, which are then reconstructed to achieve ~350 nm lateral resolution, corresponding to a numerical aperture of ~0.8, across a field-of-view of ~20.5 mm². This constitutes a digital image with ~0.7 Billion effective pixels in both amplitude and phase channels (i.e., ~1.4 Giga-pixels total). Furthermore, by changing the illumination angle (e.g., ± 50°) and scanning a partially-coherent light source across two orthogonal axes, super-resolved images of the same specimen from different viewing angles are created, which are then digitally combined to synthesize tomographic images of the object. Using this dual-axis lensfree tomographic imager running on a color sensor-chip, we achieve a 3D spatial resolution of ~0.35 µm × 0.35 µm × ~2 µm, in x, y and z, respectively, creating an effective voxel size of ~0.03 µm³ across a sample volume of ~5 mm³, which is equivalent to >150 Billion voxels. We demonstrate the proof-of-concept of this lensfree optical tomographic microscopy platform on a color CMOS image sensor by creating tomograms of micro-particles as well as a wild-type C. elegans nematode.
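
    The ">150 Billion voxels" figure follows directly from the stated effective voxel size and imaged volume; a quick arithmetic check:

    ```python
    # verify the voxel count from the figures quoted in the abstract
    effective_voxel_um3 = 0.03                    # ~0.03 um^3 per effective voxel (as stated)
    sample_volume_mm3 = 5.0                       # ~5 mm^3 imaged volume (as stated)
    sample_volume_um3 = sample_volume_mm3 * 1e9   # 1 mm^3 = 1e9 um^3
    print(sample_volume_um3 / effective_voxel_um3)  # ~1.7e11 voxels, i.e. >150 billion
    ```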

  1. Third-generation imaging sensor system concepts

    NASA Astrophysics Data System (ADS)

    Reago, Donald A.; Horn, Stuart B.; Campbell, James, Jr.; Vollmerhausen, Richard H.

    1999-07-01

    Second generation forward looking infrared sensors, based on either parallel scanning, long wave (8-12 μm) time delay and integration HgCdTe detectors or mid wave (3-5 μm), medium format staring (640 × 480 pixels) InSb detectors, are being fielded. The science and technology community is now turning its attention toward the definition of a future third generation of FLIR sensors, based on emerging research and development efforts. Modeled third generation sensor performance demonstrates a significant improvement over second generation, resulting in enhanced lethality and survivability on the future battlefield. In this paper we present the current thinking on what third generation sensor systems will be and the resulting requirements for third generation focal plane array detectors. Three classes of sensors have been identified. The high performance sensor will contain a megapixel or larger array with at least two colors. Higher operating temperatures will also be the goal here so that power and weight can be reduced. A high performance uncooled sensor is also envisioned that will perform somewhere between first and second generation cooled detectors, but at significantly lower cost, weight, and power. The final third generation sensor is a very low cost micro sensor. This sensor can open up a whole new IR market because of its small size, weight, and cost. Future unattended throwaway sensors, micro UAVs, and helmet mounted IR cameras will be the result of this new class.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stutman, D.; Tritz, K.; Finkenthal, M.

    New diagnostic and sensor designs are needed for future burning plasma (BP) fusion experiments, having good space and time resolution and capable of prolonged operation in the harsh BP environment. We evaluate the potential of multi-energy x-ray imaging with filtered detector arrays for BP diagnostic and control. Experimental studies show that this simple and robust technique enables measuring with good accuracy, speed, and spatial resolution the T_e profile, impurity content, and MHD activity in a tokamak. Applied to the BP, this diagnostic could also serve for non-magnetic sensing of the plasma position, centroid, ELM, and RWM instability. BP-compatible x-ray sensors are proposed using 'optical array' or 'bi-cell' detectors.

  3. Changing requirements and solutions for unattended ground sensors

    NASA Astrophysics Data System (ADS)

    Prado, Gervasio; Johnson, Robert

    2007-10-01

    Unattended Ground Sensors (UGS) were first used to monitor Viet Cong activity along the Ho Chi Minh Trail in the 1960s. In the 1980s, significant improvement in the capabilities of UGS became possible with the development of digital signal processors; this led to their use as fire control devices for smart munitions (for example: the Wide Area Mine) and later to monitor the movements of mobile missile launchers. In these applications, the targets of interest were large military vehicles with strong acoustic, seismic and magnetic signatures. Currently, the requirements imposed by new terrorist threats and illegal border crossings have changed the emphasis to the monitoring of light vehicles and foot traffic. These new requirements have changed the way UGS are used. To improve performance against targets with lower emissions, sensors are used in multi-modal arrangements. Non-imaging sensors (acoustic, seismic, magnetic and passive infrared) are now being used principally as activity sensors to cue imagers and remote cameras. The availability of better imaging technology has made imagers the preferred source of "actionable intelligence". Infrared cameras are now based on uncooled detector arrays whose cost and power consumption have made their use in UGS practical. Visible light imagers are also more sensitive, extending their utility well beyond twilight. The imagers are equipped with sophisticated image processing capabilities (image enhancement, moving target detection and tracking, image compression). Various commercial satellite services now provide relatively inexpensive long-range communications and the Internet provides fast worldwide access to the data.

  4. Overview of CMOS process and design options for image sensor dedicated to space applications

    NASA Astrophysics Data System (ADS)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of huge volume markets (mobile phones, digital cameras...) CMOS technologies for image sensors have improved significantly. New process flows have appeared in order to optimize some parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub arrays designed with space applications as the target. These three technologies are standard, improved, and sensor-optimized CMOS processes in the 0.35 μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain and noise. Other measurements such as Modulation Transfer Function (MTF) and crosstalk are depicted in [1]. A comparison between the results has been made, and three categories of CMOS process for image sensors have been listed. Radiation tolerance has also been studied for the improved CMOS process, with the aim of hardening the imager by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing dose radiation tolerance when specific design techniques are applied.

  5. The Solid State Image Sensor's Contribution To The Development Of Silicon Technology

    NASA Astrophysics Data System (ADS)

    Weckler, Gene P.

    1985-12-01

    Until recently, a solid-state image sensor with full television resolution was a dream. However, the dream of a solid-state image sensor has been a driving force in the development of silicon technology for more than twenty-five years. There are probably many in the mainstream of semiconductor technology who would argue with this; however, the solid-state image sensor was conceived years before the invention of the semiconductor RAM or the microprocessor (i.e., even before the invention of the integrated circuit). No other potential application envisioned at that time required such complexity. How could anyone have ever hoped in 1960 to make a semiconductor chip containing half a million picture elements, capable of resolving eight to twelve bits of information, and each capable of readout rates in the tens of mega-pixels per second? As early as 1960 arrays of p-n junctions were being investigated as the optical targets in vidicon tubes, replacing the photoconductive targets. It took silicon technology several years to catch up with these dreamers.

  6. Compensated Row-Column Ultrasound Imaging System Using Fisher Tippett Multilayered Conditional Random Field Model

    PubMed Central

    Ben Daya, Ibrahim; Chen, Albert I. H.; Shafiee, Mohammad Javad; Wong, Alexander; Yeow, John T. W.

    2015-01-01

    3-D ultrasound imaging offers unique opportunities in the field of non-destructive testing that cannot be easily found in A-mode and B-mode images. To acquire a 3-D ultrasound image without a mechanically moving transducer, a 2-D array can be used. The row-column technique is preferred over a fully addressed 2-D array as it requires a significantly lower number of interconnections. Recent advances in 3-D row-column ultrasound imaging systems were largely focused on sensor design. However, these imaging systems face three intrinsic challenges that cannot be addressed by improving sensor design alone: speckle noise, sparsity of data in the imaged volume, and the spatially dependent point spread function of the imaging system. In this paper, we propose a compensated row-column ultrasound image reconstruction system using a Fisher-Tippett multilayered conditional random field model. Tests carried out on both simulated and real row-column ultrasound images show the effectiveness of our proposed system as opposed to other published systems. Visual assessment of the results shows our proposed system's potential at preserving detail and reducing speckle. Quantitative analysis shows that our proposed system outperforms previously published systems when evaluated with metrics such as Peak Signal to Noise Ratio, Coefficient of Correlation, and Effective Number of Looks. These results show the potential of our proposed system as an effective tool for enhancing 3-D row-column imaging. PMID:26658577
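
    Two of the evaluation metrics named above have standard definitions, sketched below (peak signal-to-noise ratio against a reference image, and effective number of looks over a homogeneous region); the paper's exact evaluation protocol may differ in detail:

    ```python
    import numpy as np

    def psnr(reference, test):
        """Peak signal-to-noise ratio in dB, assuming a common intensity scale."""
        mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
        peak = float(reference.max())
        return 10.0 * np.log10(peak ** 2 / mse)

    def enl(region):
        """Effective number of looks over a homogeneous image region;
        higher values indicate stronger speckle suppression."""
        region = region.astype(float)
        return region.mean() ** 2 / region.var()
    ```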

  7. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array

    PubMed Central

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J.; Urbas, Augustine

    2016-01-01

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed “algorithmic spectrometry”. We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imaging using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the blackbody radiation spectrum and images of a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme. PMID:27721506

  8. Hybrid UV Imager Containing Face-Up AlGaN/GaN Photodiodes

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu; Pain, Bedabrata

    2005-01-01

    A proposed hybrid ultraviolet (UV) image sensor would comprise a planar membrane array of face-up AlGaN/GaN photodiodes integrated with a complementary metal oxide/semiconductor (CMOS) readout-circuit chip. Each pixel in the hybrid image sensor would contain a UV photodiode on the AlGaN/GaN membrane, metal oxide/semiconductor field-effect transistor (MOSFET) readout circuitry on the CMOS chip underneath the photodiode, and a metal via connection between the photodiode and the readout circuitry (see figure). The proposed sensor design would offer all the advantages of comparable prior CMOS active-pixel sensors and AlGaN UV detectors while overcoming some of the limitations of prior (AlGaN/sapphire)/CMOS hybrid image sensors that have been designed and fabricated according to the methodology of flip-chip integration. AlGaN is a nearly ideal UV-detector material because its bandgap is wide and adjustable and it offers the potential to attain extremely low dark current. Integration of AlGaN with CMOS is necessary because at present there are no practical means of realizing readout circuitry in the AlGaN/GaN material system, whereas the means of realizing readout circuitry in CMOS are well established. In one variant of the flip-chip approach to integration, an AlGaN chip on a sapphire substrate is inverted (flipped) and then bump-bonded to a CMOS readout circuit chip; this variant results in poor quantum efficiency. In another variant of the flip-chip approach, an AlGaN chip on a crystalline AlN substrate would be bonded to a CMOS readout circuit chip; this variant is expected to result in narrow spectral response, which would be undesirable in many applications. Two other major disadvantages of flip-chip integration are large pixel size (a consequence of the need to devote sufficient area to each bump bond) and severe restriction on the photodetector structure. The membrane array of AlGaN/GaN photodiodes and the CMOS readout circuit for the proposed image sensor would be fabricated separately.

  9. Directional ocean wave measurements in a coastal setting using a focused array imaging radar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frasier, S.J.; Liu, Y.; Moller, D.

    1995-03-01

    A unique focused array imaging Doppler radar was used to measure directional spectra of ocean surface waves in a nearshore experiment performed on the North Carolina Outer Banks. Radar images of the ocean surface's Doppler velocity were used to generate two-dimensional spectra of the radial component of the ocean surface velocity field. These are compared to simultaneous in-situ measurements made by a nearby array of submerged pressure sensors. Analysis of the resulting two-dimensional spectra includes comparisons of dominant wavelengths, wave directions, and wave energy accounting for relative differences in water depth at the measurement locations. Limited estimates of the two-dimensional surface displacement spectrum are derived from the radar data. The radar measurements are analogous to those of interferometric synthetic aperture radars (INSAR), and the equivalent INSAR parameters are shown. The agreement between the remote and in-situ measurements suggests that an imaging Doppler radar is effective for these wave measurements at near grazing incidence angles.

  10. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for a fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by a typical CCD/CMOS sensor. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm produces sharp images while reducing ringing and crisping artifacts over a wider frequency range. Experimental results show the effectiveness of the method, both subjectively and numerically, by comparison with other techniques found in the literature.
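
    The restoration step can be pictured as frequency-domain inverse filtering with a defocus (circle-of-confusion) point spread function whose radius comes from the edge analysis on the Bayer green channel. The paper's inverse filter is a refinement designed to limit ringing; the plain Wiener-style sketch below only conveys the structure, and the disk PSF model, the integer pixel radius, and the noise-to-signal constant are assumptions:

    ```python
    import numpy as np

    def disk_psf(radius, size):
        """Uniform circle-of-confusion PSF with the given integer pixel radius."""
        y, x = np.mgrid[:size, :size] - size // 2
        psf = (x ** 2 + y ** 2 <= radius ** 2).astype(float)
        return psf / psf.sum()

    def wiener_restore(blurred, radius, nsr=1e-2):
        """Wiener-style inverse filtering with an estimated defocus radius (pixels)."""
        rows, cols = blurred.shape
        h = np.zeros((rows, cols))
        p = disk_psf(radius, 2 * radius + 1)
        r0, c0 = rows // 2 - radius, cols // 2 - radius
        h[r0:r0 + p.shape[0], c0:c0 + p.shape[1]] = p      # PSF centered in the frame
        H = np.fft.fft2(np.fft.ifftshift(h))               # zero-phase transfer function
        G = np.fft.fft2(blurred.astype(float))
        F = np.conj(H) / (np.abs(H) ** 2 + nsr) * G        # regularized inverse filter
        return np.real(np.fft.ifft2(F))
    ```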

  11. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    NASA Astrophysics Data System (ADS)

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half-maximum (FWHM) across the entire dynamic range, and a noise floor about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.
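
    The per-pixel multi-threshold counting described here can be modeled very simply: each detected pulse height is compared against the bin thresholds (four in this case) and the matching counter is incremented, with events below the lowest threshold treated as falling under the noise floor. This is a behavioral sketch only, not the ASIC logic, and the threshold values are illustrative:

    ```python
    import numpy as np

    def bin_counts(pulse_heights_kev, thresholds_kev=(25, 45, 65, 85)):
        """Counts per energy bin for one pixel.

        Events below the lowest threshold are ignored; bin i counts events
        between threshold i and threshold i+1, the last bin being open-ended.
        """
        edges = np.asarray(thresholds_kev, dtype=float)
        counts = np.zeros(len(edges), dtype=int)
        for e in np.atleast_1d(pulse_heights_kev):
            idx = np.searchsorted(edges, e, side='right') - 1
            if idx >= 0:
                counts[idx] += 1
        return counts

    # e.g. bin_counts([30, 50, 120, 10]) -> array([1, 1, 0, 1])
    ```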

  12. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    PubMed Central

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2014-01-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications. PMID:25937684

  13. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications.

    PubMed

    Barber, W C; Wessel, J C; Nygard, E; Iwanczyk, J S

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.

  14. An optical wavefront sensor based on a double layer microlens array.

    PubMed

    Lin, Vinna; Wei, Hsiang-Chun; Hsieh, Hsin-Ta; Su, Guo-Dung John

    2011-01-01

    In order to determine light aberrations, Shack-Hartmann optical wavefront sensors make use of microlens arrays (MLA) to divide the incident light into small parts and focus them onto image planes. In this paper, we present the design and fabrication of long focal length MLAs with various shapes and arrangements based on a double layer structure for optical wavefront sensing applications. A longer focal length MLA provides higher sensitivity in determining the average slope across each microlens under a given wavefront, and the spatial resolution of a wavefront sensor increases with the number of microlenses across the detector. In order to extend the focal length, we used polydimethylsiloxane (PDMS) above the MLA on a glass substrate. Because of the small refractive index difference at the PDMS/MLA (UV-resin) interface, the incident light is refracted less and is focused at a greater distance. Other specific focal lengths could also be realized by modifying the refractive index difference without changing the MLA size. Thus, the wavefront sensor could be improved with better sensitivity and higher spatial resolution.

  15. Multispectral interference filter arrays with compensation of angular dependence or extended spectral range.

    PubMed

    Frey, Laurent; Masarotto, Lilian; Armand, Marilyn; Charles, Marie-Lyne; Lartigue, Olivier

    2015-05-04

    Thin film Fabry-Perot filter arrays with high selectivity can be realized with a single patterning step, generating a spatial modulation of the effective refractive index in the optical cavity. In this paper, we investigate the ability of this technology to address two applications in the field of image sensors. First, the spectral tuning may be used to compensate the blue-shift of the filters in oblique incidence, provided the filter array is located in an image plane of an optical system with higher field of view than aperture angle. The technique is analyzed for various types of filters and experimental evidence is shown with copper-dielectric infrared filters. Then, we propose a design of a multispectral filter array with an extended spectral range spanning the visible and near-infrared range, using a single set of materials and realizable on a single substrate.

  16. Uncooled infrared focal plane array imaging in China

    NASA Astrophysics Data System (ADS)

    Lei, Shuyu

    2015-06-01

    This article reviews the development of uncooled infrared focal plane array (UIFPA) imaging in China in the past decade. Sensors based on both optical and electrical read-out mechanisms have been developed, but the latter dominate the market. In resistive bolometers, VOx and amorphous silicon are still the two major thermal-sensing materials. The specifications of the IRFPAs made by different manufacturers were collected and compared. Currently more than five Chinese companies and institutions design and fabricate uncooled infrared focal plane arrays. Some devices have a sensitivity as high as 30 mK; the largest array for commercial products is 640×512 and the smallest pixel size is 17 μm. Emphasis is placed on the pixel MEMS design, ROIC design, fabrication, and packaging of the IRFPAs manufactured by GWIC, especially on designs for high sensitivity, low noise, better uniformity and linearity, better stabilization over the whole working temperature range, full-digital design, etc.

  17. Colorimetric Sensor Array for White Wine Tasting.

    PubMed

    Chung, Soo; Park, Tu San; Park, Soo Hyun; Kim, Joon Yong; Park, Seongmin; Son, Daesik; Bae, Young Min; Cho, Seong In

    2015-07-24

    A colorimetric sensor array was developed to characterize and quantify the taste of white wines. A charge-coupled device (CCD) camera captured images of the sensor array from 23 different white wine samples, and the change in the R, G, B color components from the control were analyzed by principal component analysis. Additionally, high performance liquid chromatography (HPLC) was used to analyze the chemical components of each wine sample responsible for its taste. A two-dimensional score plot was created with 23 data points. It revealed clusters created from the same type of grape, and trends of sweetness, sourness, and astringency were mapped. An artificial neural network model was developed to predict the degree of sweetness, sourness, and astringency of the white wines. The coefficients of determination (R²) for the HPLC results and the sweetness, sourness, and astringency were 0.96, 0.95, and 0.83, respectively. This research could provide a simple and low-cost but sensitive taste prediction system, and, by aiding consumer selection, could have a positive effect on the wine industry.
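
    The first stage of such an analysis pipeline (per-spot color changes relative to a control, followed by principal component analysis) can be sketched in a few lines; the spot layout and the neural-network regression step are omitted, and all names are illustrative:

    ```python
    import numpy as np

    def pca_scores(delta_rgb, n_components=2):
        """Principal-component scores of sensor-array color changes.

        delta_rgb : (n_samples, n_features) matrix, each row the concatenated
                    (R, G, B) changes of every array spot relative to the control.
        """
        X = delta_rgb - delta_rgb.mean(axis=0)
        # SVD-based PCA: rows of Vt are the principal directions
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        return X @ Vt[:n_components].T   # (n_samples, n_components) score-plot coordinates
    ```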

  18. Colorimetric Sensor Array for White Wine Tasting

    PubMed Central

    Chung, Soo; Park, Tu San; Park, Soo Hyun; Kim, Joon Yong; Park, Seongmin; Son, Daesik; Bae, Young Min; Cho, Seong In

    2015-01-01

    A colorimetric sensor array was developed to characterize and quantify the taste of white wines. A charge-coupled device (CCD) camera captured images of the sensor array from 23 different white wine samples, and the change in the R, G, B color components from the control were analyzed by principal component analysis. Additionally, high performance liquid chromatography (HPLC) was used to analyze the chemical components of each wine sample responsible for its taste. A two-dimensional score plot was created with 23 data points. It revealed clusters created from the same type of grape, and trends of sweetness, sourness, and astringency were mapped. An artificial neural network model was developed to predict the degree of sweetness, sourness, and astringency of the white wines. The coefficients of determination (R²) for the HPLC results and the sweetness, sourness, and astringency were 0.96, 0.95, and 0.83, respectively. This research could provide a simple and low-cost but sensitive taste prediction system, and, by aiding consumer selection, could have a positive effect on the wine industry. PMID:26213946

  19. Development of X-Ray Microcalorimeter Imaging Spectrometers for the X-Ray Surveyor Mission Concept

    NASA Technical Reports Server (NTRS)

    Bandler, Simon R.; Adams, Joseph S.; Chervenak, James A.; Datesman, Aaron M.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Betncourt-Martinez, Gabriele; Miniussi, Antoine R.

    2016-01-01

    Four astrophysics missions are currently being studied by NASA as candidate large missions to be chosen in the 2020 astrophysics decadal survey [1]. One of these missions is the X-Ray Surveyor (XRS), and possible configurations of this mission are currently under study by a science and technology definition team (STDT). One of the key instruments under study is an X-ray microcalorimeter, and the requirements for such an instrument are currently under discussion. In this paper we review some different detector options that exist for this instrument, and discuss what array formats might be possible. We have developed one design option that utilizes either transition-edge sensors (TES) or magnetically coupled calorimeters (MCC) in pixel array sizes approaching 100 kilo-pixels. To reduce the number of sensors read out to a plausible scale, we have assumed detector geometries in which a thermal sensor such as a TES or MCC can read out a sub-array of 20-25 individual pixels. In this paper we describe the development status of these detectors, and also discuss the different options that exist for reading out the very large number of pixels.

  20. Sensitivity encoded silicon photomultiplier--a new sensor for high-resolution PET-MRI.

    PubMed

    Schulz, Volkmar; Berker, Yannick; Berneking, Arne; Omidvari, Negar; Kiessling, Fabian; Gola, Alberto; Piemonte, Claudio

    2013-07-21

    Detectors for simultaneous positron emission tomography and magnetic resonance imaging in particular with sub-mm spatial resolution are commonly composed of scintillator crystal arrays, read out via arrays of solid-state sensors, such as avalanche photo diodes (APDs) or silicon photomultipliers (SiPMs). Usually a light guide between the crystals and the sensor is used to enable the identification of crystals which are smaller than the sensor elements. However, this complicates crystal identification at the gaps and edges of the sensor arrays. A solution is to use as many sensors as crystals with a direct coupling, which unfortunately increases the complexity and power consumption of the readout electronics. Since 1997, position-sensitive APDs have been successfully used to identify sub-mm crystals. Unfortunately, these devices show a limitation in their time resolution and a degradation of spatial resolution when placed in higher magnetic fields. To overcome these limitations, this paper presents a new sensor concept that extends conventional SiPMs by adding position information via the spatial encoding of the channel sensitivity. The concept allows a direct coupling of high-resolution crystal arrays to the sensor with a reduced number of readout channels. The theory of sensitivity encoding is detailed and linked to compressed sensing to compute unique sparse solutions. Two devices have been designed using one- and two-dimensional linear sensitivity encoding with eight and four readout channels, respectively. Flood histograms of both devices show the capability to precisely identify all 4 × 4 LYSO crystals with dimensions of 0.93 × 0.93 × 10 mm³. For these crystals, the energy and time resolution (MV ± SD) of the devices with one (two)-dimensional encoding have been measured to be 12.3 · (1 ± 0.047)% (13.7 · (1 ± 0.047)%) around 511 keV with a paired coincidence time resolution (full width at half maximum) of 462 · (1 ± 0.054) ps (452 · (1 ± 0.078) ps).
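
    The crystal-identification idea can be illustrated with a toy linear model: if the relative response of each readout channel to an event in each crystal (the sensitivity encoding) is known, a single event can be matched to the crystal whose signature best correlates with the measured channel amplitudes. This matched-signature lookup is only a stand-in for the compressed-sensing formulation the paper actually uses, and all names and the toy encoding are assumptions:

    ```python
    import numpy as np

    def identify_crystal(channel_signals, sensitivity):
        """Match one sensitivity-encoded event to the most likely crystal.

        sensitivity     : (n_channels, n_crystals) expected relative channel
                          response for an event in each crystal (calibration)
        channel_signals : (n_channels,) measured channel amplitudes for one event
        """
        y = channel_signals / np.linalg.norm(channel_signals)
        S = sensitivity / np.linalg.norm(sensitivity, axis=0, keepdims=True)
        return int(np.argmax(S.T @ y))   # crystal whose signature best matches

    # toy example: 4 channels whose sensitivities vary smoothly over 16 crystals
    pos = np.linspace(0.0, 1.0, 16)
    S = np.vstack([1 - pos, pos, np.full(16, 0.5), 0.5 + 0.5 * np.cos(np.pi * pos)])
    print(identify_crystal(S[:, 9] * 120.0, S))   # -> 9
    ```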

  1. Sensitivity encoded silicon photomultiplier—a new sensor for high-resolution PET-MRI

    NASA Astrophysics Data System (ADS)

    Schulz, Volkmar; Berker, Yannick; Berneking, Arne; Omidvari, Negar; Kiessling, Fabian; Gola, Alberto; Piemonte, Claudio

    2013-07-01

    Detectors for simultaneous positron emission tomography and magnetic resonance imaging in particular with sub-mm spatial resolution are commonly composed of scintillator crystal arrays, read out via arrays of solid-state sensors, such as avalanche photo diodes (APDs) or silicon photomultipliers (SiPMs). Usually a light guide between the crystals and the sensor is used to enable the identification of crystals which are smaller than the sensor elements. However, this complicates crystal identification at the gaps and edges of the sensor arrays. A solution is to use as many sensors as crystals with a direct coupling, which unfortunately increases the complexity and power consumption of the readout electronics. Since 1997, position-sensitive APDs have been successfully used to identify sub-mm crystals. Unfortunately, these devices show a limitation in their time resolution and a degradation of spatial resolution when placed in higher magnetic fields. To overcome these limitations, this paper presents a new sensor concept that extends conventional SiPMs by adding position information via the spatial encoding of the channel sensitivity. The concept allows a direct coupling of high-resolution crystal arrays to the sensor with a reduced number of readout channels. The theory of sensitivity encoding is detailed and linked to compressed sensing to compute unique sparse solutions. Two devices have been designed using one- and two-dimensional linear sensitivity encoding with eight and four readout channels, respectively. Flood histograms of both devices show the capability to precisely identify all 4 × 4 LYSO crystals with dimensions of 0.93 × 0.93 × 10 mm³. For these crystals, the energy and time resolution (MV ± SD) of the devices with one (two)-dimensional encoding have been measured to be 12.3 · (1 ± 0.047)% (13.7 · (1 ± 0.047)%) around 511 keV with a paired coincidence time resolution (full width at half maximum) of 462 · (1 ± 0.054) ps (452 · (1 ± 0.078) ps).

  2. Hemispherical Field-of-View Above-Water Surface Imager for Submarines

    NASA Technical Reports Server (NTRS)

    Hemmati, Hamid; Kovalik, Joseph M.; Farr, William H.; Dannecker, John D.

    2012-01-01

    A document discusses solutions to the problem of submarines having to rise above water to detect airplanes in the general vicinity. Two solutions are provided, in which a sensor is located either just under the water surface or at a depth of a few to tens of meters below the surface. The first option is a Fish Eye Lens (FEL) digital-camera combination, situated just under the water surface, that has a near-full-hemisphere (360° azimuth and 90° elevation) field of view for detecting objects on the water surface. This sensor can provide a three-dimensional picture of the airspace both in the marine and in the land environment. The FEL is coupled to a camera and can continuously look at the entire sky above it. The camera can have an Active Pixel Sensor (APS) focal plane array that allows logic circuitry to be built directly in the sensor. The logic circuitry allows data processing to occur on the sensor head without the need for any other external electronics. In the second option, a single-photon sensitive (photon counting) detector-array is used at depth, without the need for any optics in front of it, since at this location, optical signals are scattered and arrive at a wide (tens of degrees) range of angles. Beam scattering through clouds and seawater effectively negates optical imaging at depths below a few meters under cloudy or turbulent conditions. Under those conditions, maximum collection efficiency can be achieved by using a non-imaging photon-counting detector behind narrowband filters. In either case, signals from these sensors may be fused and correlated or decorrelated with other sensor data to get an accurate picture of the object(s) above the submarine. These devices can complement traditional submarine periscopes that have a limited field of view in the elevation direction. Also, these techniques circumvent the need for exposing the entire submarine or its periscopes to the outside environment.

  3. Improved fiber-optic chemical sensor for penicillin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Healy, B.G.; Walt, D.R.

    An optical penicillin biosensor is described, based on the enzyme penicillinase. The sensor is fabricated by selective photodeposition of analyte-sensitive polymer matrices on optical imaging fibers. The penicillin-sensitive matrices are fabricated by immobilizing the enzyme as micrometer-sized particles in a polymer hydrogel with a covalently bound pH indicator. An array of penicillin-sensitive and pH-sensitive matrices is fabricated on the same fiber. This array allows for the simultaneous, independent measurement of pH and penicillin. Independent measurement of the two analytes allows penicillin to be quantitated in the presence of a concurrent pH change. An analysis was conducted of enzyme kinetic parameters in order to model the penicillin response of the sensor at all pH values. This analysis accounts for the varying activity of the immobilized penicillinase at different pH values. The sensor detects penicillin in the range 0.25-10.0 mM in the pH range 6.2-7.5. The sensor was used to quantify the penicillin concentration produced during a Penicillium chrysogenum fermentation. 27 refs., 7 figs., 1 tab.

  4. A 12-bit high-speed column-parallel two-step single-slope analog-to-digital converter (ADC) for CMOS image sensors.

    PubMed

    Lyu, Tao; Yao, Suying; Nie, Kaiming; Xu, Jiangtao

    2014-11-17

    A 12-bit high-speed column-parallel two-step single-slope (SS) analog-to-digital converter (ADC) for CMOS image sensors is proposed. The proposed ADC employs a single ramp voltage and multiple reference voltages, and the conversion is divided into coarse phase and fine phase to improve the conversion rate. An error calibration scheme is proposed to correct errors caused by offsets among the reference voltages. The digital-to-analog converter (DAC) used for the ramp generator is based on the split-capacitor array with an attenuation capacitor. Analysis of the DAC's linearity performance versus capacitor mismatch and parasitic capacitance is presented. A prototype 1024 × 32 Time Delay Integration (TDI) CMOS image sensor with the proposed ADC architecture has been fabricated in a standard 0.18 μm CMOS process. The proposed ADC has an average power consumption of 128 μW and a conversion rate 6 times higher than that of the conventional SS ADC. A high-quality image, captured at the line rate of 15.5 k lines/s, shows that the proposed ADC is suitable for high-speed CMOS image sensors.
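
    The coarse/fine split can be pictured with a small behavioral model: the coarse phase locates the reference-voltage segment containing the sample, and the fine phase counts ramp steps only within that segment, so the total number of ramp steps is far smaller than a full-range single slope. The model below ignores the offset-calibration scheme and all circuit non-idealities, and the 4 + 8 bit split is illustrative:

    ```python
    def two_step_ss_adc(vin, vref=1.0, coarse_bits=4, fine_bits=8):
        """Behavioral model of a two-step single-slope conversion.

        Coarse phase: find the reference segment containing vin.
        Fine phase: ramp only across that segment's width.
        """
        n_coarse = 1 << coarse_bits
        n_fine = 1 << fine_bits
        segment = vref / n_coarse

        coarse = min(int(vin / segment), n_coarse - 1)              # segment index
        residue = vin - coarse * segment
        fine = min(int(residue / (segment / n_fine)), n_fine - 1)   # ramp count in segment

        return (coarse << fine_bits) | fine                         # 12-bit output code

    # e.g. two_step_ss_adc(0.5) -> 2048 for a 1.0 V full scale
    ```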

  5. Thermoelectric infrared imaging sensors for automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

    This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are basically fabricated with a conventional IC process and micromachining technologies and have a low cost potential. Among these sensors, the sensor having 2,304 elements provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system incorporated in the Nissan ASV-2 is a blind spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body. The system can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. This system consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. This performance is suitable for consumer electronics as well as automotive applications.

  6. Convolutional Sparse Coding for RGB+NIR Imaging.

    PubMed

    Hu, Xuemei; Heide, Felix; Dai, Qionghai; Wetzstein, Gordon

    2018-04-01

    Emerging sensor designs increasingly rely on novel color filter arrays (CFAs) to sample the incident spectrum in unconventional ways. In particular, capturing a near-infrared (NIR) channel along with conventional RGB color is an exciting new imaging modality. RGB+NIR sensing has broad applications in computational photography, such as low-light denoising; in computer vision, such as facial recognition and tracking; and it paves the way toward low-cost single-sensor RGB and depth imaging using structured illumination. However, cost-effective commercial CFAs suffer from severe spectral cross talk. This cross talk represents a major challenge in high-quality RGB+NIR imaging, rendering existing spatially multiplexed sensor designs impractical. In this work, we introduce a new approach to RGB+NIR image reconstruction using learned convolutional sparse priors. We demonstrate high-quality color and NIR imaging for challenging scenes, even including high-frequency structured NIR illumination. The effectiveness of the proposed method is validated on a large data set of experimental captures and on simulated benchmark results, which demonstrate that this work achieves unprecedented reconstruction quality.

  7. Nanohole-array-based device for 2D snapshot multispectral imaging

    PubMed Central

    Najiminaini, Mohamadreza; Vasefi, Fartash; Kaminska, Bozena; Carson, Jeffrey J. L.

    2013-01-01

    We present a two-dimensional (2D) snapshot multispectral imager that utilizes the optical transmission characteristics of nanohole arrays (NHAs) in a gold film to resolve a mixture of input colors into multiple spectral bands. The multispectral device consists of blocks of NHAs, wherein each NHA has a unique periodicity that results in transmission resonances and minima in the visible and near-infrared regions. The multispectral device was illuminated over a wide spectral range, and the transmission was spectrally unmixed using a least-squares estimation algorithm. A NHA-based multispectral imaging system was built and tested in both reflection and transmission modes. The NHA-based multispectral imager was capable of extracting 2D multispectral images representative of four independent bands within the spectral range of 662 nm to 832 nm for a variety of targets. The multispectral device can potentially be integrated into a variety of imaging sensor systems. PMID:24005065
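
    The least-squares unmixing step can be written compactly: with the calibrated transmission of each NHA block in each target band collected into a mixing matrix, the per-pixel band intensities follow from an ordinary least-squares solve. A minimal sketch with illustrative names, not the authors' code:

    ```python
    import numpy as np

    def unmix(block_intensities, mixing_matrix):
        """Estimate band intensities from NHA-block transmission measurements.

        block_intensities : (n_blocks, n_pixels) intensity measured behind each
                            NHA block (one block per hole-array periodicity)
        mixing_matrix     : (n_blocks, n_bands) relative transmission of each
                            block in each target spectral band (calibration data)
        Returns a (n_bands, n_pixels) array of estimated band images.
        """
        bands, *_ = np.linalg.lstsq(mixing_matrix, block_intensities, rcond=None)
        return np.clip(bands, 0.0, None)   # negative intensities are non-physical
    ```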

  8. Monolithic microwave integrated circuits for sensors, radar, and communications systems; Proceedings of the Meeting, Orlando, FL, Apr. 2-4, 1991

    NASA Technical Reports Server (NTRS)

    Leonard, Regis F. (Editor); Bhasin, Kul B. (Editor)

    1991-01-01

    Consideration is given to MMICs for airborne phased arrays, monolithic GaAs integrated circuit millimeter wave imaging sensors, accurate design of multiport low-noise MMICs up to 20 GHz, an ultralinear low-noise amplifier technology for space communications, variable-gain MMIC module for space applications, a high-efficiency dual-band power amplifier for radar applications, a high-density circuit approach for low-cost MMIC circuits, coplanar SIMMWIC circuits, recent advances in monolithic phased arrays, and system-level integrated circuit development for phased-array antenna applications. Consideration is also given to performance enhancement in future communications satellites with MMIC technology insertion, application of Ka-band MMIC technology for an Orbiter/ACTS communications experiment, a space-based millimeter wave debris tracking radar, low-noise high-yield octave-band feedback amplifiers to 20 GHz, quasi-optical MESFET VCOs, and a high-dynamic-range mixer using novel balun structure.

  9. Honeywell's Compact, Wide-angle Uv-visible Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Pledger, D.; Billing-Ross, J.

    1993-01-01

    Honeywell is currently developing the Earth Reference Attitude Determination System (ERADS). ERADS determines attitude by imaging the entire Earth's limb and a ring of the adjacent star field in the 2800-3000 Å band of the ultraviolet. This is achieved through the use of a highly nonconventional optical system, an intensifier tube, and a mega-element CCD array. The optics image a 30 degree region in the center of the field, and an outer region typically from 128 to 148 degrees, which can be adjusted up to 180 degrees. Because of the design employed, the illumination at the outer edge of the field is only some 15 percent below that at the center, in contrast to the drastic rolloffs encountered in conventional wide-angle sensors. The outer diameter of the sensor is only 3 in; the volume and weight of the entire system, including processor, are 1000 cc and 6 kg, respectively.

  10. Selection of optimal spectral sensitivity functions for color filter arrays.

    PubMed

    Parmar, Manu; Reeves, Stanley J

    2010-12-01

    A color image meant for human consumption can be appropriately displayed only if at least three distinct color channels are present. Typical digital cameras acquire three-color images with only one sensor. A color filter array (CFA) is placed on the sensor such that only one color is sampled at a particular spatial location. This sparsely sampled signal is then reconstructed to form a color image with information about all three colors at each location. In this paper, we show that the wavelength sensitivity functions of the CFA color filters affect both the color reproduction ability and the spatial reconstruction quality of recovered images. We present a method to select perceptually optimal color filter sensitivity functions based upon a unified spatial-chromatic sampling framework. A cost function independent of particular scenes is defined that expresses the error between a scene viewed by the human visual system and the reconstructed image that represents the scene. A constrained minimization of the cost function is used to obtain optimal values of color-filter sensitivity functions for several periodic CFAs. The sensitivity functions are shown to perform better than typical RGB and CMY color filters in terms of both the s-CIELAB ∆E error metric and a qualitative assessment.
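
    As a rough illustration of the constrained minimization described above (the cost function below is a stand-in, not the paper's perceptual, scene-independent error metric; the bounds simply model physically realizable filter transmittances):

      import numpy as np
      from scipy.optimize import minimize

      n_samples = 31  # wavelength samples, e.g. 400-700 nm in 10 nm steps

      def cost(sens_flat):
          """Placeholder cost: fidelity to a hypothetical target basis plus a
          smoothness penalty (stand-in for the paper's perceptual error)."""
          sens = sens_flat.reshape(3, n_samples)
          target = np.ones((3, n_samples)) / n_samples  # hypothetical target
          fidelity = np.sum((sens - target) ** 2)
          roughness = np.sum(np.diff(sens, axis=1) ** 2)
          return fidelity + 0.1 * roughness

      x0 = np.random.default_rng(0).uniform(0.0, 1.0, 3 * n_samples)
      res = minimize(cost, x0, method="L-BFGS-B",
                     bounds=[(0.0, 1.0)] * (3 * n_samples))
      optimal_sensitivities = res.x.reshape(3, n_samples)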

  11. Multi-sensor Array for High Altitude Balloon Missions to the Stratosphere

    NASA Astrophysics Data System (ADS)

    Davis, Tim; McClurg, Bryce; Sohl, John

    2008-10-01

    We have designed and built a microprocessor controlled and expandable multi-sensor array for data collection on near space missions. Weber State University has started a high altitude research balloon program called HARBOR. This array has been designed to data log a base set of measurements for every flight and has room for six guest instruments. The base measurements are absolute pressure, on-board temperature, 3-axis accelerometer for attitude measurement, and 2-axis compensated magnetic compass. The system also contains a real time clock and circuitry for logging data directly to a USB memory stick. In typical operation the measurements will be cycled through in sequence and saved to the memory stick along with the clock's time stamp. The microprocessor can be reprogrammed to adapt to guest experiments with either analog or digital interfacing. This system will fly with every mission and will provide backup data collection for other instrumentation for which the primary task is measuring atmospheric pressure and temperature. The attitude data will be used to determine the orientation of the onboard camera systems to aid in identifying features in the images. This will make these images easier to use for any future GIS (geographic information system) remote sensing missions.

  12. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (photovoltaic) cell has a higher temperature than adjacent normal cells, it can be detected easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time-consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm applies statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, using the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One characteristic of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules automatically within each individual array. The performance of the proposed algorithm was tested on three sample images, which verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
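
    A minimal sketch of such a local (per-array) detection rule, under assumed thresholds rather than the paper's exact parameters:

      import numpy as np

      def find_defective_panels(panel_mean_intensity, k=2.0):
          """Flag modules whose mean thermal intensity deviates from the local
          (per-array) statistics by more than k standard deviations."""
          local_mean = panel_mean_intensity.mean()
          local_std = panel_mean_intensity.std()
          return panel_mean_intensity > local_mean + k * local_std

      # Synthetic example: one hot (defective) module in a 3 x 5 array.
      panels = np.full((3, 5), 30.0) + np.random.default_rng(1).normal(0, 0.5, (3, 5))
      panels[1, 2] = 38.0
      print(find_defective_panels(panels))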

  13. Lensless transport-of-intensity phase microscopy and tomography with a color LED matrix

    NASA Astrophysics Data System (ADS)

    Zuo, Chao; Sun, Jiasong; Zhang, Jialin; Hu, Yan; Chen, Qian

    2015-07-01

    We demonstrate lens-less quantitative phase microscopy and diffraction tomography based on a compact on-chip platform, using only a CMOS image sensor and a programmable color LED array. Based on multi-wavelength transport-of-intensity phase retrieval and multi-angle illumination diffraction tomography, this platform offers high-quality, depth-resolved images with a lateral resolution of ~3.7 μm and an axial resolution of ~5 μm over a wide imaging FOV of 24 mm². The resolution and FOV can be further improved straightforwardly by using a larger image sensor with smaller pixels. This compact, low-cost, robust, portable platform with decent imaging performance may offer a cost-effective tool for telemedicine needs, or for reducing health care costs for point-of-care diagnostics in resource-limited environments.
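
    For context, transport-of-intensity phase retrieval rests on the transport-of-intensity equation, which under the paraxial approximation relates the axial intensity derivative to the phase (standard form, quoted here rather than taken from the paper):

      -k \, \frac{\partial I(x,y,z)}{\partial z} \;=\; \nabla_{\perp} \cdot \left[ I(x,y,z) \, \nabla_{\perp} \phi(x,y,z) \right], \qquad k = \frac{2\pi}{\lambda}

    Recording the intensity at several wavelengths (or, equivalently, defocus distances) estimates the left-hand side, from which the phase φ is recovered; the multi-angle LED illumination then supplies the projections used for diffraction tomography.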

  14. Pitch variable liquid lens array using electrowetting

    NASA Astrophysics Data System (ADS)

    Kim, YooKwang; Lee, Jin Su; Kim, Junoh; Won, Yong Hyub

    2017-02-01

    Micro lens arrays are now used in many fields, such as fiber coupling, laser collimation, imaging and sensing systems, and beam homogenizers. One important consideration when using a micro lens array is the choice of its pitch. In imaging systems such as integral imaging or light-field cameras in particular, the pitch of the micro lens array defines the system properties and can therefore limit the system's flexibility. Liquid lens arrays and droplet control by electrowetting have each been studied previously; this paper reports the result of combining them: a liquid lens array whose pitch can be varied by electrowetting. Since a lens array is a repeated structure, realizing a small portion of the array is sufficient to demonstrate its properties. The array is composed of nine (3 by 3) liquid droplets on a flat surface. On the substrate, 11 line electrodes are patterned along the vertical and horizontal directions, respectively. The line electrodes are 300 um wide with a 200 um interval. Each droplet is positioned to cover three electrode lines in both the vertical and horizontal directions, leaving one remaining electrode line on each outermost side in both directions. In the original state, voltage is applied to the inner electrodes. When the outermost electrodes are energized, the eight outermost droplets move outward, thereby increasing the pitch of the lens array. The original pitch of 1.5 mm increased to 2.5 mm after the set of energized electrodes was changed.
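
    The droplet actuation behind this design follows the standard Young-Lippmann electrowetting relation (quoted for context; the device-specific parameters are not given in the abstract):

      \cos\theta(V) \;=\; \cos\theta_0 \;+\; \frac{\varepsilon_0 \varepsilon_r}{2\,\gamma\, d}\, V^2

    where θ0 is the contact angle at zero bias, εr and d are the dielectric constant and thickness of the insulating layer over the electrodes, γ is the liquid-ambient interfacial tension, and V is the applied voltage. Energizing the outermost electrode lines lowers the local contact angle there and draws the outer droplets outward, which is what increases the array pitch.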

  15. AMTV headway sensor and safety design

    NASA Technical Reports Server (NTRS)

    Johnston, A. R.; Nelson, M.; Cassell, P.; Herridge, J. T.

    1980-01-01

    A headway sensing system for an automated mixed traffic vehicle (AMTV) employing an array of optical proximity sensor elements is described, and its performance is presented in terms of object detection profiles. The problem of sensing in turns is explored experimentally and requirements for future turn sensors are discussed. A recommended headway sensor configuration, employing multiple source elements in the focal plane of one lens operating together with a similar detector unit, is described. Alternative concepts including laser radar, ultrasonic sensing, imaging techniques, and radar are compared to the present proximity sensor approach. Design concepts for an AMTV body which will minimize the probability of injury to pedestrians or passengers in the event of a collision are presented.

  16. Development of a ground signal processor for digital synthetic array radar data

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

    A modified APQ-102 sidelooking array radar (SLAR) in a B-57 aircraft test bed is used, with other optical and infrared sensors, in remote sensing of Earth surface features for various users at NASA Johnson Space Center. The video from the radar is normally recorded on photographic film and subsequently processed photographically into high resolution radar images. Using a high speed sampling (digitizing) system, the two receiver channels of cross-and co-polarized video are recorded on wideband magnetic tape along with radar and platform parameters. These data are subsequently reformatted and processed into digital synthetic aperture radar images with the image data available on magnetic tape for subsequent analysis by investigators. The system design and results obtained are described.

  17. Compact SPAD-Based Pixel Architectures for Time-Resolved Image Sensors

    PubMed Central

    Perenzoni, Matteo; Pancheri, Lucio; Stoppa, David

    2016-01-01

    This paper reviews the state of the art of single-photon avalanche diode (SPAD) image sensors for time-resolved imaging. The focus of the paper is on pixel architectures featuring small pixel size (<25 μm) and high fill factor (>20%) as a key enabling technology for the successful implementation of high spatial resolution SPAD-based image sensors. A summary of the main CMOS SPAD implementations, their characteristics and integration challenges, is provided from the perspective of targeting large pixel arrays, where one of the key drivers is the spatial uniformity. The main analog techniques aimed at time-gated photon counting and photon timestamping suitable for compact and low-power pixels are critically discussed. The main features of these solutions are the adoption of analog counting techniques and time-to-analog conversion, in NMOS-only pixels. Reliable quantum-limited single-photon counting, self-referenced analog-to-digital conversion, time gating down to 0.75 ns and timestamping with 368 ps jitter are achieved. PMID:27223284

  18. Bundle Block Adjustment of Airborne Three-Line Array Imagery Based on Rotation Angles

    PubMed Central

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-01-01

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models. PMID:24811075

  19. Bundle block adjustment of airborne three-line array imagery based on rotation angles.

    PubMed

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-05-07

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models.
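
    A brief sketch of how three rotation angles parameterize the exterior-orientation rotation matrix (the omega-phi-kappa ordering below is a common photogrammetric convention and an assumption here, not necessarily the paper's exact parameterization):

      import numpy as np

      def rotation_matrix(omega, phi, kappa):
          """Rotation matrix from three exterior-orientation angles (radians)."""
          Rx = np.array([[1, 0, 0],
                         [0, np.cos(omega), -np.sin(omega)],
                         [0, np.sin(omega),  np.cos(omega)]])
          Ry = np.array([[ np.cos(phi), 0, np.sin(phi)],
                         [0, 1, 0],
                         [-np.sin(phi), 0, np.cos(phi)]])
          Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                         [np.sin(kappa),  np.cos(kappa), 0],
                         [0, 0, 1]])
          return Rx @ Ry @ Rz

      # Working with the angles directly (rather than a quaternion) lets small
      # systematic POS errors be modeled as additive corrections to each angle,
      # which is the modelling choice described in the abstract.
      print(rotation_matrix(np.radians(0.5), np.radians(-0.2), np.radians(30.0)))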

  20. Mathematical models and photogrammetric exploitation of image sensing

    NASA Astrophysics Data System (ADS)

    Puatanachokchai, Chokchai

    Mathematical models of image sensing are generally categorized into physical/geometrical sensor models and replacement sensor models. While the former is determined from image sensing geometry, the latter is based on knowledge of the physical/geometric sensor models and on using such models for its implementation. The main thrust of this research is in replacement sensor models which have three important characteristics: (1) Highly accurate ground-to-image functions; (2) Rigorous error propagation that is essentially of the same accuracy as the physical model; and, (3) Adjustability, or the ability to upgrade the replacement sensor model parameters when additional control information becomes available after the replacement sensor model has replaced the physical model. In this research, such replacement sensor models are considered as True Replacement Models or TRMs. TRMs provide a significant advantage of universality, particularly for image exploitation functions. There have been several writings about replacement sensor models, and except for the so called RSM (Replacement Sensor Model as a product described in the Manual of Photogrammetry), almost all of them pay very little or no attention to errors and their propagation. This is because, it is suspected, the few physical sensor parameters are usually replaced by many more parameters, thus presenting a potential error estimation difficulty. The third characteristic, adjustability, is perhaps the most demanding. It provides an equivalent flexibility to that of triangulation using the physical model. Primary contributions of this thesis include not only "the eigen-approach", a novel means of replacing the original sensor parameter covariance matrices at the time of estimating the TRM, but also the implementation of the hybrid approach that combines the eigen-approach with the added parameters approach used in the RSM. Using either the eigen-approach or the hybrid approach, rigorous error propagation can be performed during image exploitation. Further, adjustability can be performed when additional control information becomes available after the TRM has been implemented. The TRM is shown to apply to imagery from sensors having different geometries, including an aerial frame camera, a spaceborne linear array sensor, an airborne pushbroom sensor, and an airborne whiskbroom sensor. TRM results show essentially negligible differences as compared to those from rigorous physical sensor models, both for geopositioning from single and overlapping images. Simulated as well as real image data are used to address all three characteristics of the TRM.
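
    The rigorous error propagation referred to above is, in its general first-order form (a standard relation, stated here for context rather than quoted from the thesis),

      \Sigma_{\text{image}} \;=\; J \, \Sigma_{\text{params}} \, J^{\mathsf{T}}

    where J is the Jacobian of the ground-to-image function with respect to the sensor parameters. The eigen-approach and hybrid approach described above concern how the covariance of the few physical parameters is carried over to the much larger TRM parameter set, so that this propagation remains essentially as rigorous as with the physical model.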

  1. Mission-Oriented Sensor Arrays and UAVs - a Case Study on Environmental Monitoring

    NASA Astrophysics Data System (ADS)

    Figueira, N. M.; Freire, I. L.; Trindade, O.; Simões, E.

    2015-08-01

    This paper presents a new concept of UAV mission design in geomatics, applied to the generation of thematic maps for a multitude of civilian and military applications. We discuss the architecture of Mission-Oriented Sensor Arrays (MOSA), proposed in Figueira et al. (2013), aimed at splitting and decoupling the mission-oriented part of the system (non-safety-critical hardware and software) from the aircraft control systems (safety-critical). As a case study, we present an environmental monitoring application for the automatic generation of thematic maps to track gunshot activity in conservation areas. The MOSA modeled for this application integrates information from a thermal camera and an on-the-ground microphone array. The use of microphone array technology is of particular interest in this paper. These arrays allow estimation of the direction-of-arrival (DOA) of the incoming sound waves. Information about events of interest is obtained by fusing the data provided by the microphone array, captured by the UAV, with information from the thermal image processing. Preliminary results show the feasibility of the on-the-ground sound processing array and of the simulation of the main processing module, to be embedded in a UAV in future work. The main contributions of this paper are the proposed MOSA system, including its concepts, models and architecture.
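
    A minimal sketch of direction-of-arrival estimation for such a ground microphone array, using simple delay-and-sum steering over candidate azimuths (the array geometry, sample rate, and 1-degree search grid are placeholder assumptions, not values from the paper):

      import numpy as np

      SPEED_OF_SOUND = 343.0  # m/s

      def estimate_doa(signals, mic_positions, fs=16000):
          """Return the azimuth (degrees) maximizing delay-and-sum output power.

          signals:       (n_mics, n_samples) synchronized microphone recordings
          mic_positions: (n_mics, 2) microphone x, y coordinates in metres
          """
          n_mics = signals.shape[0]
          best_angle, best_power = 0.0, -np.inf
          for angle in np.arange(0.0, 360.0, 1.0):
              direction = np.array([np.cos(np.radians(angle)),
                                    np.sin(np.radians(angle))])
              # Expected inter-microphone delays for a plane wave from 'angle'.
              delays = mic_positions @ direction / SPEED_OF_SOUND
              shifts = np.round((delays - delays.min()) * fs).astype(int)
              aligned = [np.roll(signals[m], -shifts[m]) for m in range(n_mics)]
              power = np.sum(np.sum(aligned, axis=0) ** 2)
              if power > best_power:
                  best_angle, best_power = angle, power
          return best_angle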

  2. Analysis of the Advantages and Limitations of Stationary Imaging Fourier Transform Spectrometer. Revised

    NASA Technical Reports Server (NTRS)

    Beecken, Brian P.; Kleinman, Randall R.

    2004-01-01

    New developments in infrared sensor technology have potentially made possible a new space-based system which can measure far-infrared radiation at lower costs (mass, power and expense). The Stationary Imaging Fourier Transform Spectrometer (SIFTS) proposed by NASA Langley Research Center, makes use of new detector array technology. A mathematical model which simulates resolution and spectral range relationships has been developed for analyzing the utility of such a radically new approach to spectroscopy. Calculations with this forward model emulate the effects of a detector array on the ability to retrieve accurate spectral features. Initial computations indicate significant attenuation at high wavenumbers.

  3. An airborne thematic thermal infrared and electro-optical imaging system

    NASA Astrophysics Data System (ADS)

    Sun, Xiuhong; Shu, Peter

    2011-08-01

    This paper describes an advanced Airborne Thematic Thermal InfraRed and Electro-Optical Imaging System (ATTIREOIS) and its potential applications. The ATTIREOIS sensor payload consists of two sets of advanced Focal Plane Arrays (FPAs) - a broadband Thermal InfraRed Sensor (TIRS) and a four (4) band Multispectral Electro-Optical Sensor (MEOS) to approximate Landsat ETM+ bands 1,2,3,4, and 6, and LDCM bands 2,3,4,5, and 10+11. The airborne TIRS is a 3-axis stabilized payload capable of providing 3D photogrammetric images with a 1,850-pixel swath width via pushbroom operation. MEOS has a total of 116 million simultaneous sensor counts capable of providing 3 cm spatial resolution multispectral orthophotos for continuous airborne mapping. ATTIREOIS is a complete standalone and easy-to-use portable imaging instrument for light aerial vehicle deployment. Its miniaturized backend data system operates all ATTIREOIS imaging sensor components, an INS/GPS, and an e-Gimbal™ Control Electronic Unit (ECU) with a data throughput of 300 Megabytes/sec. The backend provides advanced onboard processing, performing autonomous raw sensor imagery development, TIRS image track-recovery reconstruction, LWIR/VNIR multi-band co-registration, and photogrammetric image processing. With geometric optics and boresight calibrations, the ATTIREOIS data products are directly georeferenced with an accuracy of approximately one meter. A prototype ATTIREOIS has been configured. Its sample LWIR/EO image data will be presented. Potential applications of ATTIREOIS include: 1) providing timely and cost-effective, precisely and directly georeferenced surface emissive and solar reflective LWIR/VNIR multispectral images via a private Google Earth Globe to enhance NASA's Earth science research capabilities; and 2) underflying satellites to support satellite measurement calibration and validation observations.

  4. Performance of a novel SQUID-based superconducting imaging-surface magnetoencephalography system

    NASA Astrophysics Data System (ADS)

    Kraus, R. H.; Volegov, P.; Maharajh, K.; Espy, M. A.; Matlashov, A. N.; Flynn, E. R.

    2002-03-01

    Performance for a recently completed whole-head magnetoencephalography system using a superconducting imaging surface (SIS) surrounding an array of 150 SQUID magnetometers is reported. The helmet-like SIS is hemispherical in shape with a brim. Conceptually, the SIS images nearby sources onto the SQUIDs while shielding sensors from distant “noise” sources. A finite element method (FEM) description using the as-built geometry was developed to describe the SIS effect on source fields by imposing B⊥(surface) = 0. Sensors consist of 8×8 mm² SQUID magnetometers with 0.84 nT/Φ0 sensitivity and <3 fT/√Hz noise. A series of phantom experiments to verify system efficacy have been completed. Simple dry-wire phantoms were used to eliminate model dependence from our results. Phantom coils were distributed throughout the volume encompassed by the array with a variety of orientations. Each phantom coil was precisely machined and located to better than 25 μm and 10 mRad accuracy. Excellent agreement between model-calculated and measured magnetic field distributions of all phantom coil positions and orientations was found. Good agreement was found between modeled and measured shielding of the SQUIDs from sources external to the array, showing significant frequency-independent shielding. Phantom localization precision was better than 0.5 mm at all locations, with a mean of better than 0.3 mm.

  5. GaAs QWIP Array Containing More Than a Million Pixels

    NASA Technical Reports Server (NTRS)

    Jhabvala, Murzy; Choi, K. K.; Gunapala, Sarath

    2005-01-01

    A 1,024 x 1,024-pixel array of quantum-well infrared photodetectors (QWIPs) has been built on a 1.8 x 1.8-cm GaAs chip. In tests, the array was found to perform well in detecting images at wavelengths from 8 to 9 μm in operation at temperatures between 60 and 70 K. The largest-format prior QWIP array that performed successfully in tests contained 512 x 640 pixels. There is a continuing development effort directed toward satisfying actual and anticipated demands to increase numbers of pixels and pixel sizes in order to increase the imaging resolution of infrared photodetector arrays. Formats of 1,024 x 1,024 pixels and even larger have been achieved in the InSb and HgCdTe material systems, but photodetector arrays in these material systems are very expensive and are manufactured by fewer than half a dozen large companies. In contrast, GaAs-photodetector-array technology is very mature, and photodetectors in the GaAs material system can be readily manufactured by a wide range of industrial technologists, universities, and government laboratories. There is much similarity between processing in the GaAs industry and processing in the pervasive silicon industry. With respect to yield and cost, the performance of GaAs technology substantially exceeds that of InSb and HgCdTe technologies. In addition, GaAs detectors can be designed to respond to any portion of the wavelength range from 3 to about 16 micrometers - a feature that is very desirable for infrared imaging. GaAs QWIP arrays, like the present one, have potential for use as imaging sensors in infrared measuring instruments, infrared medical imaging systems, and infrared cameras.

  6. Analysis of Multiplexed Nanosensor Arrays Based on Near-Infrared Fluorescent Single-Walled Carbon Nanotubes.

    PubMed

    Dong, Juyao; Salem, Daniel P; Sun, Jessica H; Strano, Michael S

    2018-04-24

    The high-throughput, label-free detection of biomolecules remains an important challenge in analytical chemistry with the potential of nanosensors to significantly increase the ability to multiplex such assays. In this work, we develop an optical sensor array, printable from a single-walled carbon nanotube/chitosan ink and functionalized to enable a divalent ion-based proximity quenching mechanism for transducing binding between a capture protein or an antibody with the target analyte. Arrays of 5 × 6, 200 μm near-infrared (nIR) spots at a density of ≈300 spots/cm² are conjugated with immunoglobulin-binding proteins (proteins A, G, and L) for the detection of human IgG, mouse IgM, rat IgG2a, and human IgD. Binding kinetics are measured in a parallel, multiplexed fashion from each sensor spot using a custom laser scanning imaging configuration with an nIR photomultiplier tube detector. These arrays are used to examine cross-reactivity, competitive and nonspecific binding of analyte mixtures. We find that protein G and protein L functionalized sensors report selective responses to mouse IgM on the latter, as anticipated. Optically addressable platforms such as the one examined in this work have potential to significantly advance the real-time, multiplexed biomolecular detection of complex mixtures.

  7. Multifunctional Catheters Combining Intracardiac Ultrasound Imaging and Electrophysiology Sensing

    PubMed Central

    Stephens, Douglas N.; Cannata, Jonathan; Liu, Ruibin; Zhao, Jian Zhong; Shung, K. Kirk; Nguyen, Hien; Chia, Raymond; Dentinger, Aaron; Wildes, Douglas; Thomenius, Kai E.; Mahajan, Aman; Shivkumar, Kalyanam; Kim, Kang; O’Donnell, Matthew; Nikoozadeh, Amin; Oralkan, Omer; Khuri-Yakub, Pierre T.; Sahn, David J.

    2015-01-01

    A family of 3 multifunctional intracardiac imaging and electrophysiology (EP) mapping catheters has been in development to help guide diagnostic and therapeutic intracardiac EP procedures. The catheter tip on the first device includes a 7.5 MHz, 64-element, side-looking phased array for high resolution sector scanning. The second device is a forward-looking catheter with a 24-element 14 MHz phased array. Both of these catheters operate on a commercial imaging system with standard software. Multiple EP mapping sensors were mounted as ring electrodes near the arrays for electrocardiographic synchronization of ultrasound images and used for unique integration with EP mapping technologies. To help establish the catheters’ ability for integration with EP interventional procedures, tests were performed in vivo in a porcine animal model to demonstrate both useful intracardiac echocardiographic (ICE) visualization and simultaneous 3-D positional information using integrated electroanatomical mapping techniques. The catheters also performed well in high frame rate imaging, color flow imaging, and strain rate imaging of atrial and ventricular structures. The companion paper of this work discusses the catheter design of the side-looking catheter with special attention to acoustic lens design. The third device in development is a 10 MHz forward-looking ring array that is to be mounted at the distal tip of a 9F catheter to permit use of the available catheter lumen for adjunctive therapy tools. PMID:18986948

  8. Multifunctional catheters combining intracardiac ultrasound imaging and electrophysiology sensing.

    PubMed

    Stephens, D N; Cannata, J; Liu, Ruibin; Zhao, Jian Zhong; Shung, K K; Nguyen, Hien; Chia, R; Dentinger, A; Wildes, D; Thomenius, K E; Mahajan, A; Shivkumar, K; Kim, Kang; O'Donnell, M; Nikoozadeh, A; Oralkan, O; Khuri-Yakub, P T; Sahn, D J

    2008-07-01

    A family of 3 multifunctional intracardiac imaging and electrophysiology (EP) mapping catheters has been in development to help guide diagnostic and therapeutic intracardiac EP procedures. The catheter tip on the first device includes a 7.5 MHz, 64-element, side-looking phased array for high resolution sector scanning. The second device is a forward-looking catheter with a 24-element 14 MHz phased array. Both of these catheters operate on a commercial imaging system with standard software. Multiple EP mapping sensors were mounted as ring electrodes near the arrays for electrocardiographic synchronization of ultrasound images and used for unique integration with EP mapping technologies. To help establish the catheters' ability for integration with EP interventional procedures, tests were performed in vivo in a porcine animal model to demonstrate both useful intracardiac echocardiographic (ICE) visualization and simultaneous 3-D positional information using integrated electroanatomical mapping techniques. The catheters also performed well in high frame rate imaging, color flow imaging, and strain rate imaging of atrial and ventricular structures. The companion paper of this work discusses the catheter design of the side-looking catheter with special attention to acoustic lens design. The third device in development is a 10 MHz forward-looking ring array that is to be mounted at the distal tip of a 9F catheter to permit use of the available catheter lumen for adjunctive therapy tools.

  9. Stability Measurements for Alignment of the NIF Neutron Imaging System Pinhole Array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fittinghoff, D N; Bower, D E; Drury, O B

    2011-03-29

    The alignment system for the National Ignition Facility's neutron imaging system has been commissioned and measurements of the relative stability of the 90-315 DIM, the front and the back of the neutron imaging pinhole array and an exploding pusher target have been made using the 90-135 and the 90-258 opposite port alignment systems. Additionally, a laser beam shot from the neutron-imaging Annex and reflected from a mirror at the back of the pinhole array was used to monitor the pointing of the pinhole. Over a twelve-hour period, the relative stability of these parts was found to be within approximately ±18 μm rms, even when using manual methods for tracking the position of the objects. For highly visible features, use of basic particle tracking techniques found that the front of the pinhole array was stable relative to the 90-135 opposite port alignment camera to within ±3.4 μm rms. Reregistration of the opposite port alignment systems themselves using the target alignment sensor, however, was found to change the expected position of target chamber center by up to 194 μm.

  10. Solid-state image sensor with focal-plane digital photon-counting pixel array

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R. (Inventor); Pain, Bedabrata (Inventor)

    1995-01-01

    A photosensitive layer such as a-Si for a UV/visible wavelength band is provided for low light level imaging with at least a separate CMOS amplifier directly connected to each PIN photodetector diode to provide a focal-plane array of NxN pixels, and preferably a separate photon-counting CMOS circuit directly connected to each CMOS amplifier, although one row of counters may be time shared for reading out the photon flux rate of each diode in the array, together with a buffer memory for storing all rows of the NxN image frame before transfer to suitable storage. All CMOS circuitry is preferably fabricated in the same silicon layer as the PIN photodetector diode for a monolithic structure, but when the wavelength band of interest requires photosensitive material different from silicon, the focal-plane array may be fabricated separately on a different semiconductor layer bump-bonded or otherwise bonded for a virtually monolithic structure with one free terminal of each diode directly connected to the input terminal of its CMOS amplifier and digital counter for integration of the photon flux rate at each photodetector of the array.

  11. A Compressed Sensing Based Method for Reducing the Sampling Time of A High Resolution Pressure Sensor Array System

    PubMed Central

    Sun, Chenglu; Li, Wei; Chen, Wei

    2017-01-01

    To extract the pressure distribution image and respiratory waveform unobtrusively and comfortably, we proposed a smart mat which utilizes a flexible pressure sensor array, printed electrodes and a novel soft seven-layer structure to monitor this physiological information. However, obtaining a high-resolution pressure distribution and a more accurate respiratory waveform requires more time to acquire the signals of all the pressure sensors embedded in the smart mat. In order to reduce the sampling time while keeping the same resolution and accuracy, a novel method based on compressed sensing (CS) theory was proposed. By utilizing the CS-based method, the sampling time can be reduced by 40% while acquiring only about one-third of the original sampling points. Several experiments were then carried out to validate the performance of the CS-based method. While fewer than one-third of the original sampling points were measured, the correlation coefficient between the reconstructed respiratory waveform and the original waveform reached 0.9078, and the accuracy of the respiratory rate (RR) extracted from the reconstructed respiratory waveform reached 95.54%. The experimental results demonstrate that the novel method fits the high-resolution smart mat system and is a viable option for reducing the sampling time of the pressure sensor array. PMID:28796188
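
    A rough sketch of the compressed-sensing idea of measuring only a subset of the sensors and reconstructing the full frame (the DCT sparsifying basis, random subsampling pattern, and OMP solver are generic choices, not necessarily those used in the paper):

      import numpy as np
      from scipy.fft import idct
      from sklearn.linear_model import OrthogonalMatchingPursuit

      n = 256       # full number of sensor samples per frame
      m = n // 3    # acquire roughly one third of them
      rng = np.random.default_rng(0)

      # Synthetic "pressure" signal that is sparse in the DCT domain.
      coeffs = np.zeros(n); coeffs[[3, 17, 40]] = [5.0, -2.0, 1.5]
      x_true = idct(coeffs, norm="ortho")

      rows = rng.choice(n, size=m, replace=False)
      Phi = np.eye(n)[rows]                          # subsampling measurement matrix
      Psi = idct(np.eye(n), axis=0, norm="ortho")    # DCT synthesis basis
      y = Phi @ x_true                               # the partial measurements

      # Sparse recovery of the DCT coefficients, then reconstruct the full frame.
      omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10, fit_intercept=False)
      omp.fit(Phi @ Psi, y)
      x_rec = Psi @ omp.coef_
      print("reconstruction error:", np.linalg.norm(x_rec - x_true))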

  12. Optimization of CMOS image sensor utilizing variable temporal multisampling partial transfer technique to achieve full-frame high dynamic range with superior low light and stop motion capability

    NASA Astrophysics Data System (ADS)

    Kabir, Salman; Smith, Craig; Armstrong, Frank; Barnard, Gerrit; Schneider, Alex; Guidash, Michael; Vogelsang, Thomas; Endsley, Jay

    2018-03-01

    Differential binary pixel technology is a threshold-based timing, readout, and image reconstruction method that utilizes the subframe partial charge transfer technique in a standard four-transistor (4T) pixel CMOS image sensor to achieve high dynamic range video with stop motion. This technology improves low light signal-to-noise ratio (SNR) by up to 21 dB. The method is verified in silicon using Taiwan Semiconductor Manufacturing Company's 65 nm, 1.1 μm pixel technology in a 1 megapixel test chip array, and is compared with a traditional 4× oversampling technique using full charge transfer to show the low-light SNR superiority of the presented technology.

  13. A coherent through-wall MIMO phased array imaging radar based on time-duplexed switching

    NASA Astrophysics Data System (ADS)

    Chen, Qingchao; Chetty, Kevin; Brennan, Paul; Lok, Lai Bun; Ritchie, Matthiew; Woodbridge, Karl

    2017-05-01

    Through-the-Wall (TW) radar sensors are gaining increasing interest for security, surveillance and search and rescue applications. Additionally, the integration of Multiple-Input, Multiple-Output (MIMO) techniques with phased array radar is allowing higher performance at lower cost. In this paper we present a 4-by-4 TW MIMO phased array imaging radar operating at 2.4 GHz with 200 MHz bandwidth. To achieve high imaging resolution in a cost-effective manner, the 4 Tx and 4 Rx elements are used to synthesize a uniform linear array (ULA) of 16 virtual elements. Furthermore, the transmitter is based on a single-channel 4-element time-multiplexed switched array. In transmission, the radar utilizes frequency modulated continuous wave (FMCW) waveforms that undergo de-ramping on receive to allow digitization at relatively low sampling rates, which then simplifies the imaging process. This architecture has been designed for the short-range TW scenarios envisaged, and permits sufficient time to switch between antenna elements. The paper first outlines the system characteristics before describing the key signal processing and imaging algorithms which are based on traditional Fast Fourier Transform (FFT) processing. These techniques are implemented in LabVIEW software. Finally, we report results from an experimental campaign that investigated the imaging capabilities of the system and demonstrated the detection of personnel targets. Moreover, we show that multiple targets within a room with greater than approximately 1 meter separation can be distinguished from one another.
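
    A simplified sketch of the de-ramp-and-FFT range processing mentioned above, for a single channel and a single point target (only the 200 MHz bandwidth is taken from the abstract; the sweep time, ADC rate, and target range are illustrative assumptions):

      import numpy as np

      C = 3e8
      BANDWIDTH = 200e6   # Hz, sweep bandwidth (from the abstract)
      T_SWEEP = 1e-3      # s, sweep duration (assumed)
      FS = 1e6            # Hz, ADC rate after de-ramping (assumed)

      slope = BANDWIDTH / T_SWEEP
      t = np.arange(0, T_SWEEP, 1 / FS)

      # De-ramped (beat) signal for a point target at 6 m behind a wall.
      target_range = 6.0
      beat_freq = 2 * slope * target_range / C
      beat = np.exp(2j * np.pi * beat_freq * t)

      # Range profile: FFT of the windowed beat signal; each bin maps to a range.
      spectrum = np.fft.fft(beat * np.hanning(t.size))
      ranges = np.fft.fftfreq(t.size, 1 / FS) * C / (2 * slope)
      print("estimated range: %.2f m" % ranges[np.argmax(np.abs(spectrum[: t.size // 2]))])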

  14. Characterization of photocathode dark current vs. temperature in image intensifier tube modules and intensified televisions

    NASA Astrophysics Data System (ADS)

    Bender, Edward J.; Wood, Michael V.; Hart, Steve; Heim, Gerald B.; Torgerson, John A.

    2004-10-01

    Image intensifiers (I2) have gained wide acceptance throughout the Army as the premier nighttime mobility sensor for the individual soldier, with over 200,000 fielded systems. There is increasing need, however, for such a sensor with a video output, so that it can be utilized in remote vehicle platforms, and/or can be electronically fused with other sensors. The image-intensified television (I2TV), typically consisting of an image intensifier tube coupled via fiber optic to a solid-state imaging array, has been the primary solution to this need. I2TV platforms in vehicles, however, can generate high internal heat loads and must operate in high-temperature environments. Intensifier tube dark current, called "Equivalent Background Input" or "EBI", is not a significant factor at room temperature, but can seriously degrade image contrast and intra-scene dynamic range at such high temperatures. Cooling of the intensifier's photocathode is the only practical solution to this problem. The US Army RDECOM CERDEC Night Vision & Electronic Sensors Directorate (NVESD) and Ball Aerospace have collaborated in the reported effort to more rigorously characterize intensifier EBI versus temperature. NVESD performed non-imaging EBI measurements of Generation 2 and 3 tube modules over a large range of ambient temperature, while Ball performed an imaging evaluation of Generation 3 I2TVs over a similar temperature range. The findings and conclusions of this effort are presented.

  15. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    PubMed

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by separate denoising processing. This strategy generates many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
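
    A minimal sketch of the PCA shrinkage idea applied to patches drawn from the raw CFA data (the patch grouping and the Wiener-style shrinkage rule below are illustrative assumptions, not the paper's exact algorithm):

      import numpy as np

      def pca_denoise_patches(patches, noise_var):
          """Shrink patch coefficients in the local PCA basis.

          patches:   (n_patches, patch_dim) block of training patches extracted
                     around the pixel being denoised in the raw CFA image.
          noise_var: estimated sensor-noise variance.
          """
          mean = patches.mean(axis=0)
          centered = patches - mean
          cov = centered.T @ centered / len(patches)
          eigvals, eigvecs = np.linalg.eigh(cov)
          coeffs = centered @ eigvecs
          # Wiener-like shrinkage of each principal component.
          signal_var = np.maximum(eigvals - noise_var, 0.0)
          gain = signal_var / np.maximum(eigvals, 1e-12)
          return (coeffs * gain) @ eigvecs.T + mean

      # Example on synthetic noisy patches.
      rng = np.random.default_rng(0)
      clean = np.outer(rng.normal(size=500), rng.normal(size=36))
      noisy = clean + rng.normal(scale=0.5, size=clean.shape)
      denoised = pca_denoise_patches(noisy, noise_var=0.25)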

  16. Robotic vehicle uses acoustic sensors for voice detection and diagnostics

    NASA Astrophysics Data System (ADS)

    Young, Stuart H.; Scanlon, Michael V.

    2000-07-01

    An acoustic sensor array that cues an imaging system on a small tele-operated robotic vehicle was used to detect human voice and activity inside a building. The advantage of acoustic sensors is that they are a non-line-of-sight (NLOS) sensing technology that can augment traditional LOS sensors such as visible and IR cameras. Acoustic energy emitted from a target, such as a person, weapon, or radio, will travel through walls and smoke, around corners, and down corridors, whereas these obstructions would cripple an imaging detection system. The hardware developed and tested used an array of eight microphones to detect the loudest direction and automatically steer a camera's pan/tilt toward the noise centroid. This type of system has applicability for counter-sniper applications, building clearing, and search/rescue. Data presented are time-frequency representations showing voice detected within rooms and down hallways at various ranges. Another benefit of acoustics is that it provides the tele-operator some situational awareness clues via low-bandwidth transmission of raw audio data, which the operator can interpret with either headphones or through time-frequency analysis. This data can be useful for recognizing familiar sounds that might indicate the presence of personnel, such as talking, equipment, or movement noise. The same array also detects the sounds of the robot it is mounted on, and can be useful for engine diagnostics and troubleshooting, or for monitoring self-noise emanations for stealthy travel. Data presented characterize vehicle self-noise over various surfaces such as tile, carpet, pavement, sidewalk, and grass. Vehicle diagnostic sounds indicate a slipping clutch and repeated unexpected application of the emergency braking mechanism.

  17. Submillimeter video imaging with a superconducting bolometer array

    NASA Astrophysics Data System (ADS)

    Becker, Daniel Thomas

    Millimeter wavelength radiation holds promise for detection of security threats at a distance, including suicide bombers and maritime threats in poor weather. The high sensitivity of superconducting Transition Edge Sensor (TES) bolometers makes them ideal for passive imaging of thermal signals at millimeter and submillimeter wavelengths. I have built a 350 GHz video-rate imaging system using an array of feedhorn-coupled TES bolometers. The system operates at standoff distances of 16 m to 28 m with a measured spatial resolution of 1.4 cm (at 17 m). It currently contains one 251-detector sub-array, and can be expanded to contain four sub-arrays for a total of 1004 detectors. The system has been used to take video images that reveal the presence of weapons concealed beneath a shirt in an indoor setting. This dissertation describes the design, implementation and characterization of this system. It presents an overview of the challenges associated with standoff passive imaging and how these problems can be overcome through the use of large-format TES bolometer arrays. I describe the design of the system and cover the results of detector and optical characterization. I explain the procedure used to generate video images using the system, and present a noise analysis of those images. This analysis indicates that the Noise Equivalent Temperature Difference (NETD) of the video images is currently limited by artifacts of the scanning process. More sophisticated image processing algorithms can eliminate these artifacts and reduce the NETD to 100 mK, which is the target value for the most demanding passive imaging scenarios. I finish with an overview of future directions for this system.

  18. Military microwaves '84; Proceedings of the Conference, London, England, October 24-26, 1984

    NASA Astrophysics Data System (ADS)

    The present conference on microwave frequency electronic warfare and military sensor equipment developments considers radar warning receivers, optical frequency spread spectrum systems, mobile digital communications troposcatter effects, wideband bulk encryption, long range air defense radars (such as the AR320, W-2000 and Martello), multistatic radars, and multimode airborne and interceptor radars. IR system and subsystem component topics encompass thermal imaging and active IR countermeasures, class 1 modules, and diamond coatings, while additional radar-related topics include radar clutter in airborne maritime reconnaissance systems, microstrip antennas with dual polarization capability, the synthesis of shaped beam antenna patterns, planar phased arrays, radar signal processing, radar cross section measurement techniques, and radar imaging and pattern analysis. Attention is also given to optical control and signal processing, mm-wave control technology and EW systems, W-band operations, planar mm-wave arrays, mm-wave monolithic solid state components, mm-wave sensor technology, GaAs monolithic ICs, and dielectric resonator and wideband tunable oscillators.

  19. Single lens 3D-camera with extended depth-of-field

    NASA Astrophysics Data System (ADS)

    Perwaß, Christian; Wietzke, Lennart

    2012-03-01

    Placing a micro lens array in front of an image sensor transforms a normal camera into a single lens 3D camera, which also allows the user to change the focus and the point of view after a picture has been taken. While the concept of such plenoptic cameras has been known since 1908, only recently have the increased computing power of low-cost hardware and advances in micro lens array production made the application of plenoptic cameras feasible. This text presents a detailed analysis of plenoptic cameras as well as introducing a new type of plenoptic camera with an extended depth of field and a maximal effective resolution of up to a quarter of the sensor resolution.

  20. Superconducting Digital Multiplexers for Sensor Arrays

    NASA Technical Reports Server (NTRS)

    Kadin, Alan M.; Brock, Darren K.; Gupta, Deepnarayan

    2004-01-01

    Arrays of cryogenic microbolometers and other cryogenic detectors are being developed for infrared imaging. If the signal from each sensor is amplified, multiplexed, and digitized using superconducting electronics, then this data can be efficiently read out to ambient temperature with a minimum of noise and thermal load. HYPRES is developing an integrated system based on SQUID amplifiers, a high-resolution analog-to-digital converter (ADC) based on RSFQ (rapid single flux quantum) logic, and a clocked RSFQ multiplexer. The ADC and SQUIDs have already been demonstrated for other projects, so this paper will focus on new results of a digital multiplexer. Several test circuits have been fabricated using Nb Josephson technology and are about to be tested at T = 4.2 K, with a more complete prototype in preparation.

  1. Performance of PHOTONIS' low light level CMOS imaging sensor for long range observation

    NASA Astrophysics Data System (ADS)

    Bourree, Loig E.

    2014-05-01

    Identification of potential threats in low-light conditions through imaging is commonly achieved through closed-circuit television (CCTV) and surveillance cameras by combining the extended near infrared (NIR) response (800-10000nm wavelengths) of the imaging sensor with NIR LED or laser illuminators. Consequently, camera systems typically used for purposes of long-range observation often require high-power lasers in order to generate sufficient photons on targets to acquire detailed images at night. While these systems may adequately identify targets at long-range, the NIR illumination needed to achieve such functionality can easily be detected and therefore may not be suitable for covert applications. In order to reduce dependency on supplemental illumination in low-light conditions, the frame rate of the imaging sensors may be reduced to increase the photon integration time and thus improve the signal to noise ratio of the image. However, this may hinder the camera's ability to image moving objects with high fidelity. In order to address these particular drawbacks, PHOTONIS has developed a CMOS imaging sensor (CIS) with a pixel architecture and geometry designed specifically to overcome these issues in low-light level imaging. By combining this CIS with field programmable gate array (FPGA)-based image processing electronics, PHOTONIS has achieved low-read noise imaging with enhanced signal-to-noise ratio at quarter moon illumination, all at standard video frame rates. The performance of this CIS is discussed herein and compared to other commercially available CMOS and CCD for long-range observation applications.

  2. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  3. Remote environmental sensor array system

    NASA Astrophysics Data System (ADS)

    Hall, Geoffrey G.

    This thesis examines the creation of an environmental monitoring system for inhospitable environments, named the Remote Environmental Sensor Array System, or RESA System for short. This thesis covers the development of RESA from its inception, through the design and modeling of the hardware and software required to make it functional. Finally, the actual manufacture and laboratory testing of the finished RESA product is discussed and documented. The RESA System is designed as a cost-effective way to bring sensors and video systems to the underwater environment. It contains a water quality probe with sensors such as dissolved oxygen, pH, temperature, specific conductivity, oxidation-reduction potential and chlorophyll a. In addition, an omni-directional hydrophone is included to detect underwater acoustic signals. It has a colour, high-definition and a low-light, black-and-white camera system, which in turn are coupled to a laser scaling system. Both high-intensity discharge and halogen lighting systems are included to illuminate the video images. The video and laser scaling systems are manoeuvred using pan and tilt units controlled from an underwater computer box. Finally, a sediment profile imager is included to enable profile images of sediment layers to be acquired. A control and manipulation system to control the instruments and move the data across networks is integrated into the underwater system, while a power distribution node provides the correct voltages to power the instruments. Laboratory testing was completed to ensure that the different instruments associated with the RESA performed as designed. This included physical testing of the motorized instruments, calibration of the instruments, benchmark performance testing and system failure exercises.

  4. Polymer-based sensor array for phytochemical detection

    NASA Astrophysics Data System (ADS)

    Weerakoon, Kanchana A.; Hiremath, Nitilaksha; Chin, Bryan A.

    2012-05-01

    Monitoring for the appearance of volatile organic compounds emitted by plants at the time of first insect attack can be used to detect the early stages of insect infestation. This paper reports a chemical sensor array consisting of polymer-based chemiresistor sensors that can detect insect infestation effectively. The array consists of sensors with microelectronically fabricated interdigitated electrodes and twelve different types of electroactive polymer layers. The sensor array is cheap, easy to fabricate, and can be used easily in agricultural fields. The polymer array was found to be sensitive to a variety of volatile organic compounds emitted by plants, including γ-terpinene, α-pinene, p-cymene, farnesene, limonene and cis-hexenyl acetate. The sensor array was able not only to detect but also to distinguish between these compounds. The twelve sensors each produced a resistance change for each of the analytes detected, and together these responses produced a unique fingerprint, enabling the compounds to be distinguished from one another.

  5. Spatiotemporal and geometric optimization of sensor arrays for detecting analytes fluids

    DOEpatents

    Lewis, Nathan S.; Freund, Michael S.; Briglin, Shawn M.; Tokumaru, Phil; Martin, Charles R.; Mitchell, David T.

    2006-10-17

    Sensor arrays and sensor array systems for detecting analytes in fluids. Sensors configured to generate a response upon introduction of a fluid containing one or more analytes can be located on one or more surfaces relative to one or more fluid channels in an array. Fluid channels can take the form of pores or holes in a substrate material. Fluid channels can be formed between one or more substrate plates. Sensors can be fabricated with substantially optimized sensor volumes to generate a response having a substantially maximized signal to noise ratio upon introduction of a fluid containing one or more target analytes. Methods of fabricating and using such sensor arrays and systems are also disclosed.

  6. Spatiotemporal and geometric optimization of sensor arrays for detecting analytes in fluids

    DOEpatents

    Lewis, Nathan S [La Canada, CA; Freund, Michael S [Winnipeg, CA; Briglin, Shawn S [Chittenango, NY; Tokumaru, Phillip [Moorpark, CA; Martin, Charles R [Gainesville, FL; Mitchell, David [Newtown, PA

    2009-09-29

    Sensor arrays and sensor array systems for detecting analytes in fluids. Sensors configured to generate a response upon introduction of a fluid containing one or more analytes can be located on one or more surfaces relative to one or more fluid channels in an array. Fluid channels can take the form of pores or holes in a substrate material. Fluid channels can be formed between one or more substrate plates. Sensors can be fabricated with substantially optimized sensor volumes to generate a response having a substantially maximized signal to noise ratio upon introduction of a fluid containing one or more target analytes. Methods of fabricating and using such sensor arrays and systems are also disclosed.

  7. Multi-Sensor Fusion of Infrared and Electro-Optic Signals for High Resolution Night Images

    PubMed Central

    Huang, Xiaopeng; Netravali, Ravi; Man, Hong; Lawrence, Victor

    2012-01-01

    Electro-optic (EO) image sensors exhibit the properties of high resolution and low noise level at daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperature. Therefore, we propose a novel framework of IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps us distinguish objects at night. Our framework superimposes/blends the edges of the EO image onto the corresponding transformed IR image to improve its resolution. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which has the modulation transfer function (MTF) of a uniform detector array and the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images, and demonstrate that blended IR images have better quality than the superimposed images. Additionally, based on the same steps, simulation results show a blended IR image of better quality when only the original IR image is available. PMID:23112602

  8. Multi-sensor fusion of infrared and electro-optic signals for high resolution night images.

    PubMed

    Huang, Xiaopeng; Netravali, Ravi; Man, Hong; Lawrence, Victor

    2012-01-01

    Electro-optic (EO) image sensors exhibit high resolution and low noise in daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperatures. Therefore, we propose a novel framework of IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps us distinguish objects at night. Our framework superimposes/blends the edges of the EO image onto the corresponding transformed IR image to improve its resolution. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which has the modulation transfer function (MTF) of a uniform detector array and the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images, and demonstrate that the blended IR images have better quality than the superimposed images. Additionally, based on the same steps, simulation results show a blended IR image of better quality when only the original IR image is available.

  9. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration effort designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS) designed to monitor the active control system of the segments is described, a best-fit surface is defined, and the image or wavefront quality of the assembled array of reflecting panels is assessed.

  10. Active pixel sensor array with multiresolution readout

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R. (Inventor); Kemeny, Sabrina E. (Inventor); Pain, Bedabrata (Inventor)

    1999-01-01

    An imaging device formed as a monolithic complementary metal oxide semiconductor integrated circuit in an industry standard complementary metal oxide semiconductor process, the integrated circuit including a focal plane array of pixel cells, each one of the cells including a photogate overlying the substrate for accumulating photo-generated charge in an underlying portion of the substrate and a charge coupled device section formed on the substrate adjacent the photogate having a sensing node and at least one charge coupled device stage for transferring charge from the underlying portion of the substrate to the sensing node. There is also a readout circuit, part of which can be disposed at the bottom of each column of cells and be common to all the cells in the column. The imaging device can also include an electronic shutter formed on the substrate adjacent the photogate, and/or a storage section to allow for simultaneous integration. In addition, the imaging device can include a multiresolution imaging circuit to provide images of varying resolution. The multiresolution circuit could also be employed in an array where the photosensitive portion of each pixel cell is a photodiode. This latter embodiment could further be modified to facilitate low light imaging.

  11. Continued Development of Meandering Winding Magnetometer (MWM (Register Trademark)) Eddy Current Sensors for the Health Monitoring, Modeling and Damage Detection of Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Russell, Richard; Wincheski, Russell; Jablonski, David; Washabaugh, Andy; Sheiretov, Yanko; Martin, Christopher; Goldfine, Neil

    2011-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are used in essentially all NASA spacecraft, launch vehicles and payloads to contain high-pressure fluids for propulsion, life support systems and science experiments. Failure of any COPV either in flight or during ground processing would result in catastrophic damage to the spacecraft or payload, and could lead to loss of life. Therefore, NASA continues to investigate new methods to non-destructively inspect (NDE) COPVs for structural anomalies and to provide a means for in-situ structural health monitoring (SHM) during operational service. Partnering with JENTEK Sensors, engineers at NASA Kennedy Space Center have successfully conducted a proof-of-concept study to develop Meandering Winding Magnetometer (MWM) eddy current sensors designed to make direct measurements of the stresses of the internal layers of a carbon fiber composite wrapped COPV. During this study three different MWM sensors were tested at three orientations to demonstrate the ability of the technology to measure stresses at various fiber orientations and depths. These results showed good correlation with actual surface strain gage measurements. MWM-Array technology for scanning COPVs can reliably be used to image and detect mechanical damage. To validate this conclusion, several COPVs were scanned to obtain a baseline, and then each COPV was impacted at varying energy levels and rescanned. The baseline-subtracted images were used to demonstrate damage detection. These scans were performed with two different MWM-Arrays with different geometries for near-surface and deeper-penetration imaging, at multiple frequencies and in multiple orientations of the linear MWM drive. This presentation will include a review of micromechanical models that relate measured sensor responses to composite material constituent properties, validated by the proof-of-concept study, as the basis for SHM and NDE data analysis, as well as potential improvements including design changes to miniaturize the sensors and make them durable in the vacuum of space.

  12. Real time in vivo imaging and measurement of serine protease activity in the mouse hippocampus using a dedicated complementary metal-oxide semiconductor imaging device.

    PubMed

    Ng, David C; Tamura, Hideki; Tokuda, Takashi; Yamamoto, Akio; Matsuo, Masamichi; Nunoshita, Masahiro; Ishikawa, Yasuyuki; Shiosaka, Sadao; Ohta, Jun

    2006-09-30

    The aim of the present study is to demonstrate the application of complementary metal-oxide semiconductor (CMOS) imaging technology for studying the mouse brain. By using a dedicated CMOS image sensor, we have successfully imaged and measured brain serine protease activity in vivo, in real-time, and for an extended period of time. We have developed a biofluorescence imaging device by packaging the CMOS image sensor which enabled on-chip imaging configuration. In this configuration, no optics are required whereby an excitation filter is applied onto the sensor to replace the filter cube block found in conventional fluorescence microscopes. The fully packaged device measures 350 microm thick x 2.7 mm wide, consists of an array of 176 x 144 pixels, and is small enough for measurement inside a single hemisphere of the mouse brain, while still providing sufficient imaging resolution. In the experiment, intraperitoneally injected kainic acid induced upregulation of serine protease activity in the brain. These events were captured in real time by imaging and measuring the fluorescence from a fluorogenic substrate that detected this activity. The entire device, which weighs less than 1% of the body weight of the mouse, holds promise for studying freely moving animals.

  13. Advanced Concurrent-Multiband, Multibeam, Aperture-Synthesis with Intelligent Processing for Urban Operation Sensing

    DTIC Science & Technology

    2012-04-09

    signatures (RSS), in particular, despeckling, superresolution and convergence rate, for a variety of admissible imaging array sensor...attain the superresolution performances in the resulting SSP estimates (3.4), we propose the VA inspired approach [13], [14] to specify the POCS

  14. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    PubMed Central

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-01-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  16. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    NASA Astrophysics Data System (ADS)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-11-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  17. Gamma ray camera

    DOEpatents

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  18. Gamma ray camera

    DOEpatents

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer, comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and the n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  19. Imaging through strong turbulence with a light field approach.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2016-05-30

    Under strong turbulence conditions, an object's images can be severely distorted and become unrecognizable throughout the observing time. Conventional image restoring algorithms do not perform effectively in these circumstances due to the loss of good references on the object. We propose the use of a plenoptic sensor as a light field camera to map a conventional camera image onto a cell image array in the image's sub-angular spaces. Accordingly, each cell image on the plenoptic sensor is equivalent to the image acquired by a sub-aperture of the imaging lens. The wavefront distortion over the lens aperture can be analyzed by comparing cell images in the plenoptic sensor. By using a modified "Laplacian" metric, we can identify a good cell image in a plenoptic image sequence. The good cell image corresponds with the time and sub-aperture area on the imaging lens where the wavefront distortion becomes relatively and momentarily "flat". As a result, it reveals features of the object that would be severely distorted on normal cameras. In this paper, we introduce the underlying physics principles and mechanisms of our approach and experimentally demonstrate its effectiveness under strong turbulence conditions. In application, our approach can be used to provide a good reference for conventional image restoring approaches under strong turbulence conditions. This approach can also be used as an independent device to perform object recognition tasks through severe turbulence distortions.
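    The cell-selection idea can be sketched as follows, assuming the plenoptic frames have already been split into per-lenslet cell images. The sum-modified-Laplacian used here is a common sharpness measure standing in for the paper's modified "Laplacian" metric, and the array layout is hypothetical.

    ```python
    import numpy as np

    def modified_laplacian(img):
        """Sum-modified-Laplacian sharpness score for one cell image."""
        img = img.astype(float)
        lx = np.abs(2 * img[1:-1, :] - img[:-2, :] - img[2:, :])
        ly = np.abs(2 * img[:, 1:-1] - img[:, :-2] - img[:, 2:])
        return lx.sum() + ly.sum()

    def best_cell(cell_sequence):
        """Return the (frame, row, col) index of the sharpest cell image.

        cell_sequence: array of shape (n_frames, n_rows, n_cols, h, w), one small
        image per lenslet per frame (hypothetical layout for this sketch).
        """
        scores = np.array([[[modified_laplacian(cell) for cell in row]
                            for row in frame] for frame in cell_sequence])
        return np.unravel_index(np.argmax(scores), scores.shape)
    ```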

  20. Distortion effects in a switch array UWB radar for time-lapse imaging of human heartbeats

    NASA Astrophysics Data System (ADS)

    Brovoll, Sverre; Berger, Tor; Aardal, Øyvind; Lande, Tor S.; Hamran, Svein-Erik

    2014-05-01

    Cardiovascular diseases (CVD) are a major cause of deaths all over the world. Microwave radar can be an alternative sensor for heart diagnostics and monitoring in modern healthcare that aids early detection of CVD symptoms. In this paper measurements from a switch array radar system are presented. This UWB system operates below 3 GHz and does time-lapse imaging of the beating heart inside the human body. The array consists of eight fat dipole elements. With a switch system, every possible sequence of transmit/receive element pairs can be selected to build a radar image from the recordings. To make the radar waves penetrate the human tissue, the antenna array is placed in contact with the body. Removal of the direct signal leakage through the antennas and body surface is done by high-pass (HP) filtering of the data prior to image processing. To analyze the results, measurements of moving spheres in air and simulations are carried out. We see that removal of the direct signal introduces amplitude distortion in the images. In addition, the effect of small target motion between the collection times of data from the individual elements is analyzed. With a low pulse repetition frequency (PRF) this motion will distort the image. By using data from real measurements of heart motion in simulations, we analyze how the PRF and the antenna geometry influence these distortions.
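    A minimal sketch of the leakage-removal step is shown below: each range bin is high-pass filtered along slow time so that the static antenna leakage and body-surface return are suppressed before imaging. The Butterworth filter, cutoff frequency, and data layout are illustrative assumptions, not the paper's exact filter.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def remove_direct_leakage(data, prf_hz, cutoff_hz=0.5, order=4):
        """High-pass filter every fast-time range bin along slow time to suppress
        the static direct leakage and body-surface return.

        data: array of shape (n_pulses_slow_time, n_samples_fast_time)
        prf_hz, cutoff_hz, order: illustrative values, not the paper's settings.
        """
        b, a = butter(order, cutoff_hz / (prf_hz / 2.0), btype="highpass")
        return filtfilt(b, a, data, axis=0)
    ```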

  1. Investigation of polarization-selective InGaAs sensor with elliptical two-dimensional holes array structure

    NASA Astrophysics Data System (ADS)

    Wang, Wenbo; Fu, Dong; Hu, Xiaobin; Xu, Yun; Song, Guofeng; Wei, Xin

    2016-10-01

    Polarimetric imaging at infrared wavelengths has attracted more and more attention for broad applications in meteorological observation, medicine, remote sensing and many other fields. Metal metamaterial structures are used in nanophotonics in order to localize and enhance the incident electromagnetic field. Here we develop an elliptical gold Two-Dimensional Holes Array (2DHA) in which photons can be manipulated by surface plasmon resonance, and the ellipse introduces asymmetry to realize a polarization-selective function. Strong polarization dependence is observed in the simulated transmission spectra. To further understand the coupling mechanism between the gold hole array and InP, different parameters of the 2DHA are analyzed. It is shown that the polarization axis is perpendicular to the major axis of the ellipse, and the degree of polarization is determined by the aspect ratio of the ellipse. Furthermore, the resonance frequency of the 2DHA shows a linear dependence on the array period, and the bandwidth of the transmission spectra is closely related to the duty cycle of the ellipse in each period. This result will establish a basis for the development of innovative polarization-selective infrared sensors.

  2. Three-dimensional imaging through turbid media based on polarization-difference liquid-crystal microlens array

    NASA Astrophysics Data System (ADS)

    Xin, Zhaowei; Wei, Dong; Li, Dapeng; Xie, Xingwang; Chen, Mingce; Zhang, Xinyu; Wang, Haiwei; Xie, Changsheng

    2018-02-01

    In this paper, a polarization-difference liquid-crystal microlens array (PD-LCMLA) for three-dimensional imaging applications through turbid media is fabricated and demonstrated. This device is composed of a twisted nematic liquid-crystal cell (TNLCC), a polarizer and a liquid-crystal microlens array. The polarizer is sandwiched between the TNLCC and the LCMLA to help the polarization-difference system acquire orthogonally polarized raw images. The prototype camera for polarization-difference imaging has been constructed by integrating the PD-LCMLA with an image sensor. The orthogonally polarized light-field images are recorded by switching the working state of the TNLCC. Here, by using a special microstructure in conjunction with the polarization-difference algorithm, we demonstrate that three-dimensional information in the scattering media can be retrieved from the polarization-difference imaging system with an electrically tunable PD-LCMLA. We further investigate the system's potential functions based on the flexible microstructure. The microstructure provides a wide operating range in the manipulation of incident beams and also enables multiple operation modes for imaging applications, such as conventional planar imaging, a polarization imaging mode, and a polarization-difference imaging mode. Since the PD-LCMLA demonstrates very low power consumption, multiple imaging modes and simple manufacturing, this kind of device has the potential to be used in many other optical and electro-optical systems.

  3. Room temperature 1040fps, 1 megapixel photon-counting image sensor with 1.1um pixel pitch

    NASA Astrophysics Data System (ADS)

    Masoodian, S.; Ma, J.; Starkey, D.; Wang, T. J.; Yamashita, Y.; Fossum, E. R.

    2017-05-01

    A 1Mjot single-bit quanta image sensor (QIS) implemented in a stacked backside-illuminated (BSI) process is presented. This is the first work to report a megapixel photon-counting CMOS-type image sensor to the best of our knowledge. A QIS with 1.1μm pitch tapered-pump-gate jots is implemented with cluster-parallel readout, where each cluster of jots is associated with its own dedicated readout electronics stacked under the cluster. Power dissipation is reduced with this cluster readout because of the reduced column bus parasitic capacitance, which is important for the development of 1Gjot arrays. The QIS functions at 1040fps with binary readout and dissipates only 17.6mW, including I/O pads. The readout signal chain uses a fully differential charge-transfer amplifier (CTA) gain stage before a 1b-ADC to achieve an energy/bit FOM of 16.1pJ/b and 6.9pJ/b for the whole sensor and gain stage+ADC, respectively. Analog outputs with on-chip gain are implemented for pixel characterization purposes.

  4. Defense Small Business Innovation Research Program (SBIR). Volume 1. Army Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    Office: MICOM HUNTSVILLE, AL 35805 Contract #: DAAH01-92-C-R150 Phone: (205) 876-7502 PI: D. BRETT BEASLEY Title: INFRARED LASER DIODE BASED INFRARED ...TECHNIQUES WILL BE INVESTIGATED TO DESIGN A FORM FIT GIMBAL-MOUNTED 94 GHZ/INFRARED FOCAL PLANE ARRAY DUAL-MODE MISSILE SEEKER SENSOR BASED ON LOW...RESOLUTION AT 94 GHZ AND A 128X128 ARRAY IR IMAGE PROCESSING FOR AUTONOMOUS TARGET RECOGNITION AND AIMPOINT SELECTION. THE 94 GHZ AND INFRARED ELECTRONICS

  5. A compact LWIR hyperspectral system employing a microbolometer array and a variable gap Fabry-Perot interferometer employed as a Fourier transform spectrometer

    NASA Astrophysics Data System (ADS)

    Lucey, Paul G.; Hinrichs, John L.; Akagi, Jason

    2012-06-01

    A prototype long wave infrared Fourier transform spectral imaging system using a wedged Fabry-Perot interferometer and a microbolometer array was designed and built. The instrument can be used at both short (cm) and long standoff ranges (infinity focus). Signal-to-noise ratios are in the several-hundred range for 30 C targets. The sensor is compact, fitting in a volume of about 12 x 12 x 4 inches.

  6. Design and calibration of a six-axis MEMS sensor array for use in scoliosis correction surgery

    NASA Astrophysics Data System (ADS)

    Benfield, David; Yue, Shichao; Lou, Edmond; Moussa, Walied A.

    2014-08-01

    A six-axis sensor array has been developed to quantify the 3D force and moment loads applied in scoliosis correction surgery. Initially this device was developed to be applied during scoliosis correction surgery and augmented onto existing surgical instrumentation; however, use as a general load sensor is also feasible. The development has included the design, microfabrication, deployment and calibration of a sensor array. The sensor array consists of four membrane devices, each containing piezoresistive sensing elements, generating a total of 16 differential voltage outputs. The calibration procedure has made use of a custom-built load application frame, which allows quantified forces and moments to be applied and compared to the outputs from the sensor array. Linear or non-linear calibration equations are generated to convert the voltage outputs from the sensor array back into 3D force and moment information for display or analysis.
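    For the linear case, the calibration step amounts to fitting a matrix that maps the 16 differential voltages to the six load components, as in the sketch below. The function names, the offset term, and the least-squares formulation are illustrative assumptions rather than the authors' exact procedure.

    ```python
    import numpy as np

    def fit_linear_calibration(V, L):
        """Least-squares fit of a 6x16 calibration matrix C and offset c0 so that
        loads ~= C @ voltages + c0.

        V: (n_samples, 16) differential voltage readings from the sensor array
        L: (n_samples, 6) forces/moments applied by the load application frame
        """
        A = np.hstack([V, np.ones((V.shape[0], 1))])    # append a column for offsets
        coeffs, *_ = np.linalg.lstsq(A, L, rcond=None)  # shape (17, 6)
        C, c0 = coeffs[:-1].T, coeffs[-1]
        return C, c0

    def voltages_to_loads(v, C, c0):
        """Convert one 16-element voltage reading into the 6 force/moment components."""
        return C @ v + c0
    ```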

  7. Plenoptic mapping for imaging and retrieval of the complex field amplitude of a laser beam.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2016-12-26

    The plenoptic sensor has been developed to sample complicated beam distortions produced by turbulence in the low atmosphere (deep turbulence or strong turbulence) with high density data samples. In contrast with the conventional Shack-Hartmann wavefront sensor, which utilizes all the pixels under each lenslet of a micro-lens array (MLA) to obtain one data sample indicating sub-aperture phase gradient and photon intensity, the plenoptic sensor uses each illuminated pixel (with significant pixel value) under each MLA lenslet as a data point for local phase gradient and intensity. To characterize the working principle of a plenoptic sensor, we propose the concept of plenoptic mapping and its inverse mapping to describe the imaging and reconstruction processes, respectively. As a result, we show that plenoptic mapping is an efficient method to image and reconstruct the complex field amplitude of an incident beam with just one image. With a proof-of-concept experiment, we show that adaptive optics (AO) phase correction can be instantaneously achieved without going through a phase reconstruction process under the concept of plenoptic mapping. The plenoptic mapping technology has high potential for applications in imaging, free space optical (FSO) communication and directed energy (DE), where atmospheric turbulence distortion needs to be compensated.

  8. Wireless Sensor Array Network DoA Estimation from Compressed Array Data via Joint Sparse Representation.

    PubMed

    Yu, Kai; Yin, Ming; Luo, Ji-An; Wang, Yingguan; Bao, Ming; Hu, Yu-Hen; Wang, Zhi

    2016-05-23

    A compressive sensing joint sparse representation direction of arrival estimation (CSJSR-DoA) approach is proposed for wireless sensor array networks (WSAN). By exploiting the joint spatial and spectral correlations of acoustic sensor array data, the CSJSR-DoA approach provides reliable DoA estimation using randomly-sampled acoustic sensor data. Since random sampling is performed at remote sensor arrays, less data need to be transmitted over lossy wireless channels to the fusion center (FC), and the expensive source coding operation at sensor nodes can be avoided. To investigate the spatial sparsity, an upper bound of the coherence of incoming sensor signals is derived assuming a linear sensor array configuration. This bound provides a theoretical constraint on the angular separation of acoustic sources to ensure the spatial sparsity of the received acoustic sensor array signals. The Cramér-Rao bound of the CSJSR-DoA estimator that quantifies the theoretical DoA estimation performance is also derived. The potential performance of the CSJSR-DoA approach is validated using both simulations and field experiments on a prototype WSAN platform. Compared to existing compressive sensing-based DoA estimation methods, the CSJSR-DoA approach shows significant performance improvement.

  9. A novel weighted-direction color interpolation

    NASA Astrophysics Data System (ADS)

    Tao, Jin-you; Yang, Jianfeng; Xue, Bin; Liang, Xiaofen; Qi, Yong-hong; Wang, Feng

    2013-08-01

    A digital camera captures images by covering the sensor surface with a color filter array (CFA), so only one color sample is obtained at each pixel location. Demosaicking is the process of estimating the missing color components of each pixel to obtain a full-resolution image. In this paper, a new algorithm based on edge adaptation and different weighting factors is proposed. Our method can effectively suppress undesirable artifacts. Experimental results based on Kodak images show that the proposed algorithm obtains higher quality images compared to other methods in numerical and visual terms.
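    The weighted-direction idea can be illustrated for the green channel at a single red/blue site: the horizontal and vertical neighbor averages are weighted by the inverse of the local gradient in each direction, so interpolation follows edges rather than crossing them. This is a simplified stand-in for the paper's weights, and the function and indexing convention are assumptions.

    ```python
    def green_at_pixel(cfa, i, j):
        """Edge-adaptive estimate of the missing green value at a red/blue CFA site.

        The horizontal and vertical neighbor averages are weighted by the inverse of
        the local gradient in each direction, so the smoother direction dominates.
        cfa: 2-D mosaiced image; (i, j) must not lie on the image border.
        """
        left, right = float(cfa[i, j - 1]), float(cfa[i, j + 1])
        up, down = float(cfa[i - 1, j]), float(cfa[i + 1, j])
        w_h = 1.0 / (abs(left - right) + 1e-6)   # horizontal weight (low gradient -> high weight)
        w_v = 1.0 / (abs(up - down) + 1e-6)      # vertical weight
        return (w_h * (left + right) / 2.0 + w_v * (up + down) / 2.0) / (w_h + w_v)
    ```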

  10. Development and testing of bio-inspired microelectromechanical pressure sensor arrays for increased situational awareness for marine vehicles

    NASA Astrophysics Data System (ADS)

    Dusek, J.; Kottapalli, A. G. P.; Woo, M. E.; Asadnia, M.; Miao, J.; Lang, J. H.; Triantafyllou, M. S.

    2013-01-01

    The lateral line found on most species of fish is a sensory organ without analog in humans. Using sensory feedback from the lateral line, fish are able to track prey, school, avoid obstacles, and detect vortical flow structures. Composed of both a superficial component, and a component contained within canals beneath the fish’s skin, the lateral line acts in a similar fashion to an array of differential pressure sensors. In an effort to enhance the situational and environmental awareness of marine vehicles, lateral-line-inspired pressure sensor arrays were developed to mimic the enhanced sensory capabilities observed in fish. Three flexible and waterproof pressure sensor arrays were fabricated for use as a surface-mounted ‘smart skin’ on marine vehicles. Two of the sensor arrays were based around the use of commercially available piezoresistive sensor dies, with innovative packaging schemes to allow for flexibility and underwater operation. The sensor arrays employed liquid crystal polymer and flexible printed circuit board substrates with metallic circuits and silicone encapsulation. The third sensor array employed a novel nanocomposite material set that allowed for the fabrication of a completely flexible sensor array. All three sensors were surface mounted on the curved hull of an autonomous kayak vehicle, and tested in both pool and reservoir environments. Results demonstrated that all three sensors were operational while deployed on the autonomous vehicle, and provided an accurate means for monitoring the vehicle dynamics.

  11. Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion

    NASA Astrophysics Data System (ADS)

    Witten, B.; Shragge, J. C.

    2016-12-01

    The recent increased focus on small-scale seismicity, Mw < 4, has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal and hydraulic fracturing for oil and gas recovery and for geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a sparse number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.

  12. On-sky performance evaluation and calibration of a polarization-sensitive focal plane array

    NASA Astrophysics Data System (ADS)

    Vorobiev, Dmitry; Ninkov, Zoran; Brock, Neal; West, Ray

    2016-07-01

    The advent of pixelated micropolarizer arrays (MPAs) has facilitated the development of polarization-sensitive focal plane arrays (FPAs) based on charge-coupled devices (CCDs) and active pixel sensors (APSs), which are otherwise only able to measure the intensity of light. Polarization sensors based on MPAs are extremely compact, light-weight, mechanically robust devices with no moving parts, capable of measuring the degree and angle of polarization of light in a single snapshot. Furthermore, micropolarizer arrays based on wire grid polarizers (so-called micro-grid polarizers) offer extremely broadband performance across the optical and infrared regimes. These devices have potential for a wide array of commercial and research applications, where measurements of polarization can provide critical information but where conventional polarimeters cannot be practically implemented. To date, the most successful commercial applications of these devices are 4D Technology's PhaseCam laser interferometers and PolarCam imaging polarimeters. Recently, MPA-based polarimeters have been identified as a potential solution for space-based telescopes, where the small size, snapshot capability and low power consumption offered by these devices are extremely desirable. In this work, we investigated the performance of MPA-based polarimeters designed for astronomical polarimetry using the Rochester Institute of Technology Polarization Imaging Camera (RITPIC). We deployed RITPIC on the 0.9 meter SMARTS telescope at the Cerro Tololo Inter-American Observatory and observed a variety of astronomical objects (calibration stars, variable stars, reflection nebulae and planetary nebulae). We use our observations to develop calibration procedures that are unique to these devices and provide an estimate of the polarimetric precision that is achievable.
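    For reference, the idealized snapshot reduction for a 0°/45°/90°/135° micropolarizer superpixel is shown below; it omits the instrument-specific calibration terms (pixel gains, polarizer efficiencies, cross-talk) that the paper's procedures are designed to determine.

    ```python
    import numpy as np

    def stokes_from_superpixel(i0, i45, i90, i135):
        """Linear Stokes parameters, DoLP and AoLP from one 0/45/90/135 superpixel
        (idealized polarizers, no instrumental calibration applied)."""
        s0 = 0.5 * (i0 + i45 + i90 + i135)          # total intensity
        s1 = i0 - i90                               # 0/90 difference
        s2 = i45 - i135                             # 45/135 difference
        dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
        aolp = 0.5 * np.arctan2(s2, s1)             # radians
        return s0, s1, s2, dolp, aolp
    ```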

  13. Optimizing Floating Guard Ring Designs for FASPAX N-in-P Silicon Sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Kyung-Wook; Bradford, Robert; Lipton, Ronald

    2016-10-06

    FASPAX (Fermi-Argonne Semiconducting Pixel Array X-ray detector) is being developed as a fast integrating area detector with wide dynamic range for time-resolved applications at the upgraded Advanced Photon Source (APS). A burst-mode detector with an intended 13 MHz image rate, FASPAX will also incorporate a novel integration circuit to achieve wide dynamic range, from single-photon sensitivity to 10^5 x-rays/pixel/pulse. To achieve these ambitious goals, a novel silicon sensor design is required. This paper will detail the early design of the FASPAX sensor. Results from TCAD optimization studies and characterization of prototype sensors will be presented.

  14. The Transition-Edge-Sensor Array for the Micro-X Sounding Rocket

    NASA Technical Reports Server (NTRS)

    Eckart, M. E.; Adams, J. S.; Bailey, C. N.; Bandler, S. R.; Busch, Sarah Elizabeth; Chervenak J. A.; Finkbeiner, F. M.; Kelley, R. L.; Kilbourne, C. A.; Porst, J. P.

    2012-01-01

    The Micro-X sounding rocket program will fly a 128-element array of transition-edge-sensor microcalorimeters to enable high-resolution X-ray imaging spectroscopy of the Puppis-A supernova remnant. To match the angular resolution of the optics while maximizing the field-of-view and retaining a high energy resolution (< 4 eV at 1 keV), we have designed the pixels using 600 x 600 sq. micron Au/Bi absorbers, which overhang 140 x 140 sq. micron Mo/Au sensors. The data-rate capabilities of the rocket telemetry system require the pulse decay to be approximately 2 ms to allow a significant portion of the data to be telemetered during flight. Here we report experimental results from the flight array, including measurements of energy resolution, uniformity, and absorber thermalization. In addition, we present studies of test devices that have a variety of absorber contact geometries, as well as a variety of membrane-perforation schemes designed to slow the pulse decay time to match the telemetry requirements. Finally, we describe the reduction in pixel-to-pixel crosstalk afforded by an angle-evaporated Cu backside heatsinking layer, which provides Cu coverage on the four sidewalls of the silicon wells beneath each pixel.

  15. Close-in detection system for the Mine Hunter/Killer program

    NASA Astrophysics Data System (ADS)

    Bishop, Steven S.; Campana, Stephen B.; Lang, David A.; Wiggins, Carl M.

    2000-08-01

    The Close-in Detection (CID) System is the vehicle-mounted multisensor landmine detection system for the Army CECOM Night Vision Electronic Sensors Directorate (NVESD) Mine Hunter/Killer (MH/K) Program. The CID System is being developed by BAE Systems in San Diego, CA. TRW Systems and Information Technology Group in Arlington, VA and a team of specialists from ERIM, E-OIR, SNL, and APL/JHU support NVESD in the development, analysis and testing of the CID and associated signal and data processing. The CID System includes two down-looking sensor arrays: a ground-penetrating radar (GPR) array, and a set of Electro-Magnetic Induction (EMI) coils for metal detection. These arrays span a 3-meter wide swath in front of a high mobility, multipurpose wheeled vehicle. The system also includes a forward-looking IR imaging system mounted on the roof of the vehicle and covering a swath of the road ahead of the vehicle. Signals from each sensor are processed separately to detect and localize objects of interest. Features of candidate objects are integrated in a processor that uses them to discriminate between anti-tank mines and clutter. Mine locations are passed to the neutralization subsystem of MH/K. This paper reviews the design of the sensors and signal processing of the CID system and gives examples and analysis of recent test results at the NVESD mine lanes. The strengths and weaknesses of each sensor are discussed, and the application of multisensor fusion is illustrated.

  16. Artificial tactile sensing in minimally invasive surgery - a new technical approach.

    PubMed

    Schostek, Sebastian; Ho, Chi-Nghia; Kalanovic, Daniel; Schurr, Marc O

    2006-01-01

    The loss of tactile sensation is a commonly known drawback of minimally invasive surgery (MIS). Since the advent of MIS, research activities in providing tactile information to the surgeon are still ongoing, in order to improve patient safety and to extend the indications for MIS. We have designed a tactile sensor system comprising a tactile laparoscopic grasper for surgical palpation. For this purpose, we developed a novel tactile sensor technology which allows the manufacturing of an integrated sensor array within an acceptable price range. The array was integrated into the jaws of a 10 mm laparoscopic grasper. The tactile data are transferred wirelessly via Bluetooth and are presented visually to the surgeon. The goal was to be able to obtain information about the shape and consistency of tissue structures by gently compressing the tissue between the jaws of the tactile instrument and thus to be able to recognize and assess anatomical or pathological structures, even if they are hidden in the tissue. With a prototype of the tactile sensor system we have conducted bench tests as well as in-vitro and in-vivo experiments. The system proved feasible in an experimental environment, was easy to use, and the novel tactile sensor array was applicable for both palpation and grasping manoeuvres with forces of up to 60 N. The tactile data turned out to be a useful supplement to the minimal amount of haptic feedback that is provided by current endoscopic instruments and the endoscopic image under certain conditions.

  17. CdZnTe Image Detectors for Hard-X-Ray Telescopes

    NASA Technical Reports Server (NTRS)

    Chen, C. M. Hubert; Cook, Walter R.; Harrison, Fiona A.; Lin, Jiao Y. Y.; Mao, Peter H.; Schindler, Stephen M.

    2005-01-01

    Arrays of CdZnTe photodetectors and associated electronic circuitry have been built and tested in a continuing effort to develop focal-plane image sensor systems for hard-x-ray telescopes. Each array contains 24 by 44 pixels at a pitch of 498 µm. The detector designs are optimized to obtain low power demand with high spectral resolution in the photon-energy range of 5 to 100 keV. More precisely, each detector array is a hybrid of a CdZnTe photodetector array and an application-specific integrated circuit (ASIC) containing an array of amplifiers in the same pixel pattern as that of the detectors. The array is fabricated on a single crystal of CdZnTe having dimensions of 23.6 by 12.9 by 2 mm. The detector-array cathode is a monolithic platinum contact. On the anode plane, the contact metal is patterned into the aforementioned pixel array, surrounded by a guard ring that is 1 mm wide on three sides and is 0.1 mm wide on the fourth side so that two such detector arrays can be placed side-by-side to form a roughly square sensor area with minimal dead area between them. Figure 1 shows two anode patterns. One pattern features larger pixel anode contacts, with a 30-µm gap between them. The other pattern features smaller pixel anode contacts plus a contact for a shaping electrode in the form of a grid that separates all the pixels. In operation, the grid is held at a potential intermediate between the cathode and anode potentials to steer electric charges toward the anode in order to reduce the loss of charges in the inter-anode gaps. The CdZnTe photodetector array is mechanically and electrically connected to the ASIC (see Figure 2), either by use of indium bump bonds or by use of conductive epoxy bumps on the CdZnTe array joined to gold bumps on the ASIC. Hence, the output of each pixel detector is fed to its own amplifier chain.

  18. Using Bayesian Inference Framework towards Identifying Gas Species and Concentration from High Temperature Resistive Sensor Array Data

    DOE PAGES

    Liu, Yixin; Zhou, Kai; Lei, Yu

    2015-01-01

    High temperature gas sensors have been highly demanded for combustion process optimization and toxic emissions control, but they usually suffer from poor selectivity. In order to solve this selectivity issue and identify unknown reducing gas species (CO, CH4, and C3H8) and concentrations, a high temperature resistive sensor array data set was built in this study based on 5 reported sensors. As each sensor showed specific responses towards different types of reducing gas at certain concentrations, calibration curves were fitted to provide a benchmark sensor array response database. A Bayesian inference framework was then utilized to process the sensor array data and build a sample selection program to simultaneously identify gas species and concentration, by formulating a proper likelihood between the measured sensor array response pattern of an unknown gas and each sampled sensor array response pattern in the benchmark database. This algorithm shows good robustness and can accurately identify gas species and predict gas concentration with a small error of less than 10% based on a limited amount of experimental data. These features indicate that the Bayesian probabilistic approach is a simple and efficient way to process sensor array data, which can significantly reduce the required computational overhead and training data.
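    A minimal sketch of the matching step is given below: each benchmark pattern sampled from the calibration curves is scored by a Gaussian likelihood against the measured 5-sensor response, and the highest-posterior (gas, concentration) sample is returned. The noise model, the flat prior, and all variable names are assumptions for illustration, not the paper's exact formulation.

    ```python
    import numpy as np

    def identify_gas(measured, benchmark):
        """Return the most probable (gas, concentration) for a measured response.

        measured:  (n_sensors,) response pattern of the unknown gas
        benchmark: dict mapping (gas_name, concentration) -> (n_sensors,) pattern
                   sampled from the fitted calibration curves
        Assumes i.i.d. Gaussian measurement noise and a flat prior over samples.
        """
        sigma = 0.05 * np.abs(measured).mean() + 1e-9          # placeholder noise scale
        keys = list(benchmark)
        log_post = np.array([-0.5 * np.sum((measured - benchmark[k]) ** 2) / sigma ** 2
                             for k in keys])                   # log-likelihood, flat prior
        probs = np.exp(log_post - log_post.max())
        probs /= probs.sum()
        best = keys[int(np.argmax(probs))]
        return best, float(probs.max())
    ```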

  19. An Optical Wavefront Sensor Based on a Double Layer Microlens Array

    PubMed Central

    Lin, Vinna; Wei, Hsiang-Chun; Hsieh, Hsin-Ta; Su, Guo-Dung John

    2011-01-01

    In order to determine light aberrations, Shack-Hartmann optical wavefront sensors make use of microlens arrays (MLAs) to divide the incident light into small parts and focus them onto image planes. In this paper, we present the design and fabrication of a long-focal-length MLA with various shapes and arrangements based on a double-layer structure for optical wavefront sensing applications. A longer-focal-length MLA provides higher sensitivity in determining the average slope across each microlens for a given wavefront, and the spatial resolution of a wavefront sensor increases with the number of microlenses across a detector. In order to extend the focal length, we applied polydimethylsiloxane (PDMS) above the MLA on a glass substrate. Because of the small refractive index difference at the PDMS/MLA (UV-resin) interface, the incident light is refracted less and focused at a greater distance. Other specific focal lengths could also be realized by modifying the refractive index difference without changing the MLA size. Thus, the wavefront sensor could be improved with better sensitivity and higher spatial resolution. PMID:22346643
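    The focal-length argument can be made concrete with the single refracting surface relation f' = n_lens * R / (n_lens - n_incident): the smaller the index step at the curved PDMS/UV-resin interface, the longer the focus. The sketch below uses illustrative radius and index values that are not taken from the paper.

    ```python
    def single_surface_focal_length(radius_of_curvature, n_incident, n_lens):
        """Image-side focal length of a single convex refracting surface for
        collimated light entering from the covering medium into the lens material:
        f' = n_lens * R / (n_lens - n_incident)."""
        return n_lens * radius_of_curvature / (n_lens - n_incident)

    # Illustrative indices and radius only (not taken from the paper):
    f_pdms_covered = single_surface_focal_length(50e-6, n_incident=1.41, n_lens=1.52)
    f_air_covered = single_surface_focal_length(50e-6, n_incident=1.00, n_lens=1.52)
    # The smaller PDMS/UV-resin index difference yields the longer focal length.
    ```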

  20. Mutual capacitance of liquid conductors in deformable tactile sensing arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Bin; Fontecchio, Adam K.; Visell, Yon

    2016-01-04

    Advances in highly deformable electronics are needed in order to enable emerging categories of soft computing devices ranging from wearable electronics, to medical devices, and soft robotic components. The combination of highly elastic substrates with intrinsically stretchable conductors holds the promise of enabling electronic sensors that can conform to curved objects, reconfigurable displays, or soft biological tissues, including the skin. Here, we contribute sensing principles for tactile (mechanical image) sensors based on very low modulus polymer substrates with embedded liquid metal microfluidic arrays. The sensors are fabricated using a single-step casting method that utilizes fine nylon filaments to produce arrays of cylindrical channels on two layers. The liquid metal (gallium indium alloy) conductors that fill these channels readily adopt the shape of the embedding membrane, yielding levels of deformability greater than 400%, due to the use of soft polymer substrates. We modeled the sensor performance using electrostatic theory and continuum mechanics, yielding excellent agreement with experiments. Using a matrix-addressed capacitance measurement technique, we are able to resolve strain distributions with millimeter resolution over areas of several square centimeters.

  1. Monte Carlo Techniques for Calculations of Charge Deposition and Displacement Damage from Protons in Visible and Infrared Sensor Arrays

    NASA Technical Reports Server (NTRS)

    Marshall, Paul; Reed, Robert; Fodness, Bryan; Jordan, Tom; Pickel, Jim; Xapsos, Michael; Burke, Ed

    2004-01-01

    This slide presentation examines motivation for Monte Carlo methods, charge deposition in sensor arrays, displacement damage calculations, and future work. The discussion of charge deposition sensor arrays includes Si active pixel sensor APS arrays and LWIR HgCdTe FPAs. The discussion of displacement damage calculations includes nonionizing energy loss (NIEL), HgCdTe NIEL calculation results including variance, and implications for damage in HgCdTe detector arrays.

  2. An automated mapping satellite system ( Mapsat).

    USGS Publications Warehouse

    Colvocoresses, A.P.

    1982-01-01

    The favorable environment of space permits a satellite to orbit the Earth with very high stability as long as no local perturbing forces are involved. Solid-state linear-array sensors have no moving parts and create no perturbing force on the satellite. Digital data from highly stabilized stereo linear arrays are amenable to simplified processing to produce both planimetric imagery and elevation data. A satellite imaging system, called Mapsat, including this concept has been proposed to produce data from which automated mapping in near real time can be accomplished. Image maps as large as 1:50 000 scale with contours as close as a 20-m interval may be produced from Mapsat data. -from Author

  3. Quantitative analysis and temperature-induced variations of moiré pattern in fiber-coupled imaging sensors.

    PubMed

    Karbasi, Salman; Arianpour, Ashkan; Motamedi, Nojan; Mellette, William M; Ford, Joseph E

    2015-06-10

    Imaging fiber bundles can map the curved image surface formed by some high-performance lenses onto flat focal plane detectors. The relative alignment between the focal plane array pixels and the quasi-periodic fiber-bundle cores can impose an undesirable space variant moiré pattern, but this effect may be greatly reduced by flat-field calibration, provided that the local responsivity is known. Here we demonstrate a stable metric for spatial analysis of the moiré pattern strength, and use it to quantify the effect of relative sensor and fiber-bundle pitch, and that of the Bayer color filter. We measure the thermal dependence of the moiré pattern, and the achievable improvement by flat-field calibration at different operating temperatures. We show that a flat-field calibration image at a desired operating temperature can be generated using linear interpolation between white images at several fixed temperatures, comparing the final image quality with an experimentally acquired image at the same temperature.
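    A minimal sketch of the temperature-interpolated flat-field idea is shown below: white images stored at a few fixed temperatures are interpolated per pixel to the operating temperature, and the result is used as the responsivity map for flat-field correction. Array names, shapes, and the simple gain-only correction are assumptions for illustration.

    ```python
    import numpy as np

    def flat_field_at_temperature(flat_stack, temps, target_temp):
        """Per-pixel linear interpolation of stored white images to a target temperature.

        flat_stack: (n_temps, H, W) white images acquired at temperatures `temps`
        """
        temps = np.asarray(temps, dtype=float)
        idx = np.clip(np.searchsorted(temps, target_temp), 1, len(temps) - 1)
        t0, t1 = temps[idx - 1], temps[idx]
        w = (target_temp - t0) / (t1 - t0)
        return (1 - w) * flat_stack[idx - 1] + w * flat_stack[idx]

    def flat_field_correct(raw, flat):
        """Standard flat-field correction: divide by the normalized responsivity map."""
        gain = flat / flat.mean()
        return raw / np.maximum(gain, 1e-6)
    ```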

  4. A CMOS One-chip Wireless Camera with Digital Image Transmission Function for Capsule Endoscopes

    NASA Astrophysics Data System (ADS)

    Itoh, Shinya; Kawahito, Shoji; Terakawa, Susumu

    This paper presents the design and implementation of a one-chip camera device for capsule endoscopes. This experimental chip integrates the functional circuits required for capsule endoscopes together with a digital image transmission function. The integrated functional blocks include an image array, a timing generator, a clock generator, a voltage regulator, a 10b cyclic A/D converter, and a BPSK modulator. It can be operated autonomously with 3 pins (VDD, GND, and DATAOUT). A prototype image sensor chip which has 320x240 effective pixels was fabricated using a 0.25μm CMOS image sensor process and autonomous imaging was demonstrated. The chip size is 4.84mm x 4.34mm. With a 2.0 V power supply, the analog part consumes 950μW and the total power consumption at 2 frames per second (fps) is 2.6mW. Error-free image transmission over a distance of 48cm at 2.5Mbps, corresponding to 2fps, has been achieved with inductive coupling.

  5. Modular Analytical Multicomponent Analysis in Gas Sensor Arrays

    PubMed Central

    Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor

    2006-01-01

    A multi-sensor system is a chemical sensor system which quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for the identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation. It is a modular software package for the acquisition of data from different sensors. A signal evaluation algorithm referred to as the matrix method was used specifically for the software tool. This matrix method computes the gas concentrations from the signals of a sensor array. The software tool was used for the simulation of an array of five sensors to determine the gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the present simulated sensor array indicate that the software tool is capable of the following: (a) identify a gas independently of its concentration; (b) estimate the concentration of the gas, even if the system was not previously exposed to this concentration; (c) tell when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database one can create, generate and manage scenarios and source files for the simulation. With the gas sensor database and the simulation software, an on-line Web-based version was developed with which the user can configure and simulate sensor arrays on-line.
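    One plausible reading of the matrix method, assuming the sensor responses are linear (or have been linearized) in the gas concentrations, is a least-squares inversion of a sensitivity matrix, as sketched below. The numerical values of the matrix are invented for illustration; the actual tool derives its coefficients from calibration data.

    ```python
    import numpy as np

    # Hypothetical sensitivity matrix S: response of five cross-sensitive sensors (rows)
    # to unit concentrations of CH4, NH3, H2, CO and C2H5OH (columns).
    S = np.array([
        [0.80, 0.10, 0.05, 0.02, 0.03],
        [0.05, 0.70, 0.10, 0.05, 0.10],
        [0.10, 0.05, 0.75, 0.05, 0.05],
        [0.02, 0.10, 0.05, 0.80, 0.03],
        [0.03, 0.05, 0.05, 0.08, 0.79],
    ])

    def concentrations_from_signals(signals, sensitivity=S):
        """Recover gas concentrations from an array response, assuming the linearized
        model signals ~= sensitivity @ concentrations."""
        c, *_ = np.linalg.lstsq(sensitivity, signals, rcond=None)
        return c
    ```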

  6. Fiber Optic Geophysics Sensor Array

    NASA Astrophysics Data System (ADS)

    Grochowski, Lucjan

    1989-01-01

    The distributed optical sensor arrays are analysed in view of the specific needs of 3-D seismic exploration methods. The advantages and disadvantages of arrays based on intensity-modulated and phase-modulated sensors are compared. In these systems, all-fiber-optic structures and their compatibility with digital geophysical formats are discussed. It is shown that arrays based on TDM systems with intensity-modulated sensors are economically and technically best matched to geophysical systems supported by a large number of sensors.

  7. Scanning Shack-Hartmann wavefront sensor

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl V.

    2004-09-01

    Criss-crossing of focal images is the cause of the narrow dynamic range of Shack-Hartmann sensors. In practice, an aberration range wider than +/-3 diopters cannot be measured. A method has been proposed for ophthalmologic applications using a rarefied lenslet array through which a wavefront is projected while the global tilt is changed step by step. The data acquired in each step are accumulated and processed. In an experimental setup, a doubled dynamic range was achieved with four steps of wavefront tilting.

  8. Design of a Unique Azimuth Monitoring Device.

    DTIC Science & Technology

    1980-11-10

    which will direct two beams (in close proximity to the receiving photomultipliers or image detectors if these are the receiving sensors) to the target...receiving sensors and data recording equipment are also at the transmitting/receiving site. It is hoped that for...facility in Bedford, Massachusetts, close to a tiltmeter array site. Pillars will be constructed to accept the observing equipment and the targets with a

  9. An Algorithm to Identify and Localize Suitable Dock Locations from 3-D LiDAR Scans

    DTIC Science & Technology

    2013-05-10

    Graves, Mitchell Robert. ...Ranging (LiDAR) scans. A LiDAR sensor is a sensor that collects range images from a rotating array of vertically aligned lasers. Our solution leverages...Keywords: Algorithm, Dock, Locations, Point Clouds, LiDAR, Identify

  10. Smart focal-plane technology for micro-instruments and micro-rovers

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.

    1993-01-01

    It is inevitable that micro-instruments and micro-rovers for space exploration will contain one or more focal-plane arrays for imaging, spectroscopy, or navigation. In this paper, we explore the state-of-the-art in focal-plane technology for visible sensors. Also discussed is present research activity in advanced focal-plane technology with particular emphasis on the development of smart sensors. The paper concludes with a discussion of possible future directions for the advancement of the technology.

  11. Laser speckle strain and deformation sensor using linear array image cross-correlation method for specifically arranged triple-beam triple-camera configuration

    NASA Technical Reports Server (NTRS)

    Sarrafzadeh-Khoee, Adel K. (Inventor)

    2000-01-01

    The invention provides a triple-beam, triple-sensor method for a laser speckle strain/deformation measurement system. The triple-beam/triple-camera configuration, combined with sequential timing of laser beam shutters, is capable of providing indications of surface strain and structure deformations. The strain and deformation quantities, the four variables of surface strain, in-plane displacement, out-of-plane displacement and tilt, are determined in closed-form solutions.

  12. Nanowire sensor, sensor array, and method for making the same

    NASA Technical Reports Server (NTRS)

    Homer, Margie (Inventor); Fleurial, Jean-Pierre (Inventor); Bugga, Ratnakumar (Inventor); Vasquez, Richard (Inventor); Yun, Minhee (Inventor); Myung, Nosang (Inventor); Choi, Daniel (Inventor); Goddard, William (Inventor); Ryan, Margaret (Inventor); Yen, Shiao-Pin (Inventor)

    2012-01-01

    The present invention relates to a nanowire sensor and method for forming the same. More specifically, the nanowire sensor comprises at least one nanowire formed on a substrate, with a sensor receptor disposed on a surface of the nanowire, thereby forming a receptor-coated nanowire. The nanowire sensor can be arranged as a sensor sub-unit comprising a plurality of homogeneously receptor-coated nanowires. A plurality of sensor subunits can be formed to collectively comprise a nanowire sensor array. Each sensor subunit in the nanowire sensor array can be formed to sense a different stimulus, allowing a user to sense a plurality of stimuli. Additionally, each sensor subunit can be formed to sense the same stimuli through different aspects of the stimulus. The sensor array is fabricated through a variety of techniques, such as by creating nanopores on a substrate and electrodepositing nanowires within the nanopores.

  13. Flight Results from the HST SM4 Relative Navigation Sensor System

    NASA Technical Reports Server (NTRS)

    Naasz, Bo; Eepoel, John Van; Queen, Steve; Southward, C. Michael; Hannah, Joel

    2010-01-01

    On May 11, 2009, Space Shuttle Atlantis roared off of Launch Pad 39A en route to the Hubble Space Telescope (HST) to undertake its final servicing of HST, Servicing Mission 4. Onboard Atlantis was a small payload called the Relative Navigation Sensor experiment, which included three cameras of varying focal ranges and avionics to record images and estimate, in real time, the relative position and attitude (aka "pose") of the telescope during rendezvous and deployment. The avionics package, known as SpaceCube and developed at the Goddard Space Flight Center, performed image processing using field programmable gate arrays to accelerate this process, and in addition executed two different pose algorithms in parallel: the Goddard Natural Feature Image Recognition algorithm and the ULTOR Passive Pose and Position Engine (P3E) algorithm.

  14. Periodicity analysis on cat-eye reflected beam profiles of optical detectors

    NASA Astrophysics Data System (ADS)

    Gong, Mali; He, Sifeng

    2017-05-01

    The cat-eye-effect reflected beam profiles of most optical detectors exhibit a certain periodicity, which is caused by the array arrangement of sensors at their optical focal planes. To our knowledge, this is the first work to find and prove that the reflected beam profile breaks into several periodic spots at a reflected propagation distance corresponding to half the imaging distance of a CCD camera. Furthermore, the spatial period of these spots is approximately constant and independent of the CCD camera's imaging distance; it is related only to the focal length and pixel size of the CCD sensor. Thus, the imaging distance and intrinsic parameters of the optical detector can be obtained by analysing its cat-eye reflected beam profiles. This conclusion can be applied to non-cooperative cat-eye target recognition.

  15. Design of a Low-Light-Level Image Sensor with On-Chip Sigma-Delta Analog-to- Digital Conversion

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Pain, Bedabrata; Nixon, Robert H.; Fossum, Eric R.

    1993-01-01

    The design and projected performance of a low-light-level active-pixel-sensor (APS) chip with semi-parallel analog-to-digital (A/D) conversion is presented. The individual elements have been fabricated and tested using MOSIS* 2 micrometer CMOS technology, although the integrated system has not yet been fabricated. The imager consists of a 128 x 128 array of active pixels at a 50 micrometer pitch. Each column of pixels shares a 10-bit A/D converter based on first-order oversampled sigma-delta (Sigma-Delta) modulation. The 10-bit outputs of each converter are multiplexed and read out through a single set of outputs. A semi-parallel architecture is chosen to achieve 30 frames/second operation even at low light levels. The sensor is designed for less than 12 e^- rms noise performance.
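
    A first-order oversampled sigma-delta converter of the kind used per column can be summarised in a few lines: the modulator integrates the difference between the input and the fed-back bit, and the decimator simply averages the resulting bit stream. The sketch below is a minimal behavioural model; the oversampling ratio and full-scale normalisation are assumptions for illustration only.

      def sigma_delta_convert(x, n_samples=1024):
          # x is the pixel signal as a fraction of full scale, in [0, 1).
          integrator, ones = 0.0, 0
          for _ in range(n_samples):
              bit = 1 if integrator >= 0.0 else 0    # 1-bit quantizer
              integrator += x - bit                  # integrate the quantization error
              ones += bit
          return round(ones / n_samples * 1023)      # decimation: average the bit stream

      print(sigma_delta_convert(0.5))                # prints a code near mid-scale (~512)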

  16. Ionizing radiation effects on CMOS imagers manufactured in deep submicron process

    NASA Astrophysics Data System (ADS)

    Goiffon, Vincent; Magnan, Pierre; Bernard, Frédéric; Rolland, Guy; Saint-Pé, Olivier; Huger, Nicolas; Corbière, Franck

    2008-02-01

    We present here a study on both CMOS sensors and elementary structures (photodiodes and in-pixel MOSFETs) manufactured in a deep submicron process dedicated to imaging. We designed a test chip made of one 128×128-3T-pixel array with 10 μm pitch and more than 120 isolated test structures including photodiodes and MOSFETs with various implants and different sizes. All these devices were exposed to ionizing radiation up to 100 krad and their responses were correlated to identify the CMOS sensor weaknesses. Characterizations in darkness and under illumination demonstrated that dark current increase is the major sensor degradation. Shallow trench isolation was identified to be responsible for this degradation as it increases the number of generation centers in photodiode depletion regions. Consequences on hardness assurance and hardening-by-design are discussed.

  17. High-bandwidth acoustic detection system (HBADS) for stripmap synthetic aperture acoustic imaging of canonical ground targets using airborne sound and a 16 element receiving array

    NASA Astrophysics Data System (ADS)

    Bishop, Steven S.; Moore, Timothy R.; Gugino, Peter; Smith, Brett; Kirkwood, Kathryn P.; Korman, Murray S.

    2018-04-01

    High Bandwidth Acoustic Detection System (HBADS) is an emerging active acoustic sensor technology undergoing study by the US Army's Night Vision and Electronic Sensors Directorate. Mounted on a commercial all-terrain type vehicle, it uses a single source pulse chirp while moving and a new array (two rows each containing eight microphones) mounted horizontally and oriented in a side scan mode. Experiments are performed with this synthetic aperture air acoustic (SAA) array to image canonical ground targets in clutter or foliage. A commercial audio speaker transmits a linear FM chirp having an effective frequency range of 2 kHz to 15 kHz. The system includes an inertial navigation system using two differential GPS antennas, an inertial measurement unit and a wheel encoder. A web camera is mounted midway between the two horizontal microphone arrays, and a meteorological unit acquires ambient temperature, pressure and humidity information. A data acquisition system is central to the system's operation, which is controlled by a laptop computer. Recent experiments include imaging canonical targets located on the ground in a grassy field and similar targets camouflaged by natural vegetation along the side of a road. A recent modification involves implementing SAA stripmap mode interferometry for computing the reflectance of targets placed along the ground. Typical stripmap SAA parameters are chirp pulse = 10 or 40 ms, slant range resolution c/(2*BW) = 0.013 m, microphone diameter D = 0.022 m, azimuthal resolution (D/2) ≈ 0.01 m, air sound speed c ≍ 340 m/s and maximum vehicle speed ≍ 2 m/s.
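
    As a consistency check on the quoted numbers: the chirp spans 2-15 kHz, so BW = 13 kHz and c/(2·BW) = 340/(2 × 13,000) ≈ 0.013 m, while D/2 = 0.022/2 = 0.011 m, matching the slant-range and azimuthal resolutions stated above.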

  18. Development and testing of a magnetic position sensor system for automotive and avionics applications

    NASA Astrophysics Data System (ADS)

    Jacobs, Bryan C.; Nelson, Carl V.

    2001-08-01

    A magnetic sensor system has been developed to measure the 3-D location and orientation of a rigid body relative to an array of magnetic dipole transmitters. A generalized solution to the measurement problem has been formulated, allowing the transmitter and receiver parameters (position, orientation, number, etc.) to be optimized for various applications. Additionally, the method of images has been used to mitigate the impact of metallic materials in close proximity to the sensor. The resulting system allows precise tracking of high-speed motion in confined metal environments. The sensor system was recently configured and tested as an abdomen displacement sensor for an automobile crash-test dummy. The test results indicate a positional accuracy of approximately 1 mm rms during 20 m/s motions. The dynamic test results also confirmed earlier covariance model predictions, which were used to optimize the sensor geometry. A covariance analysis was performed to evaluate the applicability of this magnetic position system for tracking a pilot's head motion inside an aircraft cockpit. Realistic design parameters indicate that a robust tracking system, consisting of lightweight pickup coils mounted on a pilot's helmet, and an array of transmitter coils distributed throughout a cockpit, is feasible. Recent test and covariance results are presented.

  19. IR CMOS: near infrared enhanced digital imaging (Presentation Recording)

    NASA Astrophysics Data System (ADS)

    Pralle, Martin U.; Carey, James E.; Joy, Thomas; Vineis, Chris J.; Palsule, Chintamani

    2015-08-01

    SiOnyx has demonstrated imaging at light levels below 1 mLux (moonless starlight) at video frame rates with a 720P CMOS image sensor in a compact, low-latency camera. Low-light imaging is enabled by the combination of enhanced quantum efficiency in the near infrared and state-of-the-art low-noise image sensor design. The quantum efficiency enhancements are achieved by applying Black Silicon, SiOnyx's proprietary ultrafast-laser semiconductor processing technology. In the near infrared, silicon's native indirect bandgap results in low absorption coefficients and long absorption lengths. The Black Silicon nanostructured layer fundamentally disrupts this paradigm by enhancing the absorption of light within a thin pixel layer, making 5 microns of silicon equivalent to over 300 microns of standard silicon. This results in a demonstrated 10-fold improvement in near-infrared sensitivity over incumbent imaging technology while maintaining complete compatibility with standard CMOS image sensor process flows. Applications include surveillance, night vision, and 1064 nm laser see-spot. Imaging performance metrics will be discussed. Demonstrated performance characteristics: pixel size 5.6 and 10 um; array size 720P/1.3 Mpix; frame rate 60 Hz; read noise 2 electrons/pixel; spectral sensitivity 400 to 1200 nm (with 10x QE at 1064 nm); daytime imaging in color (Bayer pattern); nighttime imaging under moonless starlight conditions; 1064 nm laser imaging in daytime out to 2 km.

  20. An HDR imaging method with DTDI technology for push-broom cameras

    NASA Astrophysics Data System (ADS)

    Sun, Wu; Han, Chengshan; Xue, Xucheng; Lv, Hengyi; Shi, Junxia; Hu, Changhong; Li, Xiangzhi; Fu, Yao; Jiang, Xiaonan; Huang, Liang; Han, Hongyin

    2018-03-01

    Conventionally, high-dynamic-range (HDR) imaging is based on taking two or more pictures of the same scene with different exposures. However, due to the high-speed relative motion between the camera and the scene, this technique is hard to apply to push-broom remote sensing cameras. For HDR imaging in push-broom remote sensing applications, the present paper proposes an innovative method which can generate HDR images without redundant image sensors or optical components. Specifically, this paper adopts an area-array CMOS (complementary metal oxide semiconductor) sensor with digital-domain time-delay-integration (DTDI) technology for imaging, instead of adopting more than one row of image sensors, thereby taking more than one picture with different exposures. A new HDR image is then obtained by fusing the two original images with a simple algorithm. In the experiment, the dynamic range (DR) of the image increases by 26.02 dB. The proposed method is shown to be effective and has potential in other imaging applications where there is relative motion between the camera and the scene.
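
    The fusion step can be pictured as follows: the short-exposure frame is scaled to the long-exposure radiometric scale and substituted wherever the long exposure saturates. The sketch below shows this basic idea only; the saturation threshold, exposure ratio, and replace-saturated-pixels rule are illustrative assumptions rather than the authors' exact algorithm.

      import numpy as np

      def fuse_hdr(long_exp, short_exp, exposure_ratio=16.0, sat_level=4000):
          # long_exp and short_exp are same-size arrays of raw digital numbers.
          long_dn = long_exp.astype(np.float64)
          short_dn = short_exp.astype(np.float64) * exposure_ratio   # common radiometric scale
          saturated = long_dn >= sat_level
          return np.where(saturated, short_dn, long_dn)              # fused HDR estimate

      # For a ratio of 16 the added range is 20*log10(16) ~= 24 dB, the same order
      # as the 26.02 dB improvement reported above.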

  1. Method and apparatus for optical encoding with compressible imaging

    NASA Technical Reports Server (NTRS)

    Leviton, Douglas B. (Inventor)

    2006-01-01

    The present invention presents an optical encoder with increased conversion rates. Improvement in the conversion rate is a result of combining changes in the pattern recognition encoder's scale pattern with an image sensor readout technique which takes full advantage of those changes, and lends itself to operation by modern, high-speed, ultra-compact microprocessors and digital signal processors (DSP) or field programmable gate array (FPGA) logic elements which can process encoder scale images at the highest speeds. Through these improvements, all three components of conversion time (reciprocal conversion rate)--namely exposure time, image readout time, and image processing time--are minimized.

  2. 3-D readout-electronics packaging for high-bandwidth massively paralleled imager

    DOEpatents

    Kwiatkowski, Kris; Lyke, James

    2007-12-18

    Dense, massively parallel signal processing electronics are co-packaged behind associated sensor pixels. Microchips containing a linear or bilinear arrangement of photo-sensors, together with associated complex electronics, are integrated into a simple 3-D structure (a "mirror cube"). An array of photo-sensitive cells is disposed on a stacked CMOS chip's surface at a 45-degree angle from light-reflecting mirror surfaces formed on a neighboring CMOS chip surface. Image processing electronics are held within the stacked CMOS chip layers. Electrical connections couple each of said stacked CMOS chip layers and a distribution grid, the connections distributing power and signals to components associated with each stacked CMOS chip layer.

  3. A 7 ke-SD-FWC 1.2 e-RMS Temporal Random Noise 128×256 Time-Resolved CMOS Image Sensor With Two In-Pixel SDs for Biomedical Applications.

    PubMed

    Seo, Min-Woong; Kawahito, Shoji

    2017-12-01

    A lock-in pixel CMOS image sensor (CIS) with a large full-well capacity (FWC) for a wide signal detection range and low temporal random noise for high sensitivity, embedded with two in-pixel storage diodes (SDs), has been developed and is presented in this paper. For fast charge transfer from the photodiode to the SDs, a lateral electric field charge modulator (LEFM) is used in the developed lock-in pixel. As a result, the time-resolved CIS achieves a very large SD-FWC of approximately 7 ke-, low temporal random noise of 1.2 e- rms at 20 fps with true correlated double sampling operation, and a fast intrinsic response of less than 500 ps at 635 nm. The proposed imager has an effective pixel array of 128 × 256. The sensor chip is fabricated in a Dongbu HiTek 1P4M 0.11 μm CIS process.

  4. Submillimetre wave imaging and security: imaging performance and prediction

    NASA Astrophysics Data System (ADS)

    Appleby, R.; Ferguson, S.

    2016-10-01

    Within the European Commission Seventh Framework Programme (FP7), CONSORTIS (Concealed Object Stand-Off Real-Time Imaging for Security) has designed and is fabricating a stand-off system operating at sub-millimetre wave frequencies for the detection of objects concealed on people. This system scans people as they walk by the sensor. This paper presents the top level system design which brings together both passive and active sensors to provide good performance. The passive system operates in two bands between 100 and 600GHz and is based on a cryogen free cooled focal plane array sensor whilst the active system is a solid-state 340GHz radar. A modified version of OpenFX was used for modelling the passive system. This model was recently modified to include realistic location-specific skin temperature and to accept animated characters wearing up to three layers of clothing that move dynamically, such as those typically found in cinematography. Targets under clothing have been modelled and the performance simulated. The strengths and weaknesses of this modelling approach are discussed.

  5. Earth resources shuttle imaging radar. [systems analysis and design analysis of pulse radar for earth resources information system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A report is presented on a preliminary design of a Synthetic Array Radar (SAR) intended for experimental use with the space shuttle program. The radar is called the Earth Resources Shuttle Imaging Radar (ERSIR). Its primary purpose is to determine the usefulness of SAR in monitoring and managing earth resources. The design of the ERSIR, along with tradeoffs made during its evolution, is discussed. The ERSIR consists of a flight sensor for collecting the raw radar data and a ground sensor used both for reducing these radar data to images and for extracting earth resources information from the data. The flight sensor consists of two high-powered, coherent pulse radars, one that operates at L-band and the other at X-band. Radar data recorded on tape can either be transmitted via a digital data link to a ground terminal, or the tape can be delivered to the ground station after the shuttle lands. A description of data processing equipment and display devices is given.

  6. SnO2/Pt Thin Film Laser Ablated Gas Sensor Array

    PubMed Central

    Shahrokh Abadi, Mohammad Hadi; Hamidon, Mohd Nizar; Shaari, Abdul Halim; Abdullah, Norhafizah; Wagiran, Rahman

    2011-01-01

    A gas sensor array was developed in a 10 × 10 mm2 space using Screen Printing and Pulse Laser Ablation Deposition (PLAD) techniques. The heater, electrode, and an insulator interlayer were printed using the screen printing method on an alumina substrate, while tin oxide and platinum films, as sensing and catalyst layers respectively, were deposited on the electrode at room temperature using the PLAD method. To ablate the SnO2 and Pt targets, depositions were performed using a 1,064 nm Nd-YAG laser with a power of 0.7 J/s, at deposition times of 2, 5 and 10 min, in an atmosphere containing 0.04 mbar (4 kPa) of O2. A range of spectroscopic, diffraction and real-space imaging techniques (SEM, EDX, XRD, and AFM) were used to characterize the surface morphology, structure, and composition of the films. Measurements on the array show that sensitivity to some solvents and wood smoke can be achieved with short response and recovery times. PMID:22164041

  7. Standoff chemical D and Id with extended LWIR hyperspectral imaging spectroradiometer

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Lavoie, Hugo; Bouffard, François; Thériault, Jean-Marc; Vallieres, Christian; Roy, Claude; Dubé, Denis

    2013-05-01

    Standoff detection and identification (D and Id) of unknown volatile chemicals, such as chemical pollutants and the consequences of industrial incidents, is increasingly desired by first responders and for environmental monitoring. On-site gas detection sensors are commercially available, and several of them can even detect more than one chemical species; however, only a few have the capability of detecting a wide variety of gases at long and safe distances. The ABB Hyperspectral Imaging Spectroradiometer (MR-i), configured for gas detection, detects and identifies a wide variety of chemical species, including toxic industrial chemicals (TICs) and surrogates, several kilometers away from the sensor. This configuration is called iCATSI, for improved Compact Atmospheric Sounding Interferometer. iCATSI is a standoff passive system. The modularity of the MR-i platform allows optimization of the detection configuration with a 256 x 256 focal plane array imager or a line-scanning imager, both covering the long-wave IR atmospheric window up to 14 μm. Its extended LWIR cut-off enables the detection of more chemicals and provides a higher probability of detection than conventional LWIR sensors.

  8. Producing CCD imaging sensor with flashed backside metal film

    NASA Technical Reports Server (NTRS)

    Janesick, James R. (Inventor)

    1988-01-01

    A backside illuminated CCD imaging sensor for reading out image charges from wells of the array of pixels is significantly improved for blue, UV, far UV and low energy x-ray wavelengths (1-5000 Å) by so overthinning the backside as to place the depletion edge at the surface and depositing a thin transparent metal film of about 10 Å on a native-quality oxide film of less than about 30 Å grown on the thinned backside. The metal is selected to have a higher work function than that of the semiconductor to so bend the energy bands (at the interface of the semiconductor material and the oxide film) as to eliminate wells that would otherwise trap minority carriers. A bias voltage may be applied to extend the frontside depletion edge to the interface of the semiconductor material with the oxide film in the event there is not sufficient thinning. This metal film (flash gate), which improves and stabilizes the quantum efficiency of a CCD imaging sensor, will also improve the QE of any p-n junction photodetector.

  9. CCD imaging sensor with flashed backside metal film

    NASA Technical Reports Server (NTRS)

    Janesick, James R. (Inventor)

    1991-01-01

    A backside illuminated CCD imaging sensor for reading out image charges from wells of the array of pixels is significantly improved for blue, UV, far UV and low energy x-ray wavelengths (1-5000 Å) by so overthinning the backside as to place the depletion edge at the surface and depositing a thin transparent metal film of about 10 Å on a native-quality oxide film of less than about 30 Å grown on the thinned backside. The metal is selected to have a higher work function than that of the semiconductor to so bend the energy bands (at the interface of the semiconductor material and the oxide film) as to eliminate wells that would otherwise trap minority carriers. A bias voltage may be applied to extend the frontside depletion edge to the interface of the semiconductor material with the oxide film in the event there is not sufficient thinning. This metal film (flash gate), which improves and stabilizes the quantum efficiency of a CCD imaging sensor, will also improve the QE of any p-n junction photodetector.

  10. A high sensitivity 20Mfps CMOS image sensor with readout speed of 1Tpixel/sec for visualization of ultra-high speed phenomena

    NASA Astrophysics Data System (ADS)

    Kuroda, R.; Sugawa, S.

    2017-02-01

    Ultra-high-speed (UHS) CMOS image sensors with on-chip analog memories placed on the periphery of the pixel array, developed for the visualization of UHS phenomena, are overviewed in this paper. The developed UHS CMOS image sensors consist of 400 (H) × 256 (V) pixels and 128 memories/pixel, and a readout speed of 1 Tpixel/s is obtained, leading to 10 Mfps full-resolution video capture of 128 consecutive frames and 20 Mfps half-resolution video capture of 256 consecutive frames. The first development model was employed in a high-speed video camera and put into practical use in 2012. Through the development of dedicated process technologies, photosensitivity improvement and power consumption reduction were achieved simultaneously, and the improved version has been used since 2015 in a commercial high-speed video camera that offers 10 Mfps with ISO 16,000 photosensitivity. Due to the improved photosensitivity, clear images can be captured and analyzed even under low-light conditions, such as under a microscope, as well as in the capture of UHS light-emission phenomena.
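
    As a consistency check, 400 × 256 pixels per frame at 10 Mframe/s gives 400 × 256 × 10^7 ≈ 1.02 × 10^12 pixel/s, in line with the quoted 1 Tpixel/s readout speed.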

  11. Low power gas sensor array on flexible acetate substrate

    NASA Astrophysics Data System (ADS)

    Benedict, Samatha; Basu, Palash Kumar; Bhat, Navakanta

    2017-07-01

    In this paper, we present a novel approach to fabricating a low-cost and low-power gas sensor array on flexible acetate sheets for sensing CO, SO2, H2 and NO2 gases. The array has four sensor elements, each with an integrated microheater which can be individually controlled, enabling the monitoring of four gases. The thermal properties of the microheater, characterized by IR imaging, are presented. The microheater, with an active area of 15 µm × 5 µm, reaches a temperature of 300 °C while consuming 2 mW of power, the lowest reported on flexible substrates. A sensing electrode is patterned on top of the microheater, and a nanogap (100 nm) is created by an electromigration process. This nanogap is bridged by four sensing materials doped with platinum, deposited using a solution dispensing technique. The sensing material characterization is performed using energy dispersive x-ray analysis. The sensing characteristics of ZnO for CO, V2O5 for SO2, SnO2 for H2 and WO3 for NO2 gases are studied at different microheater voltages. The sensing characteristics of ZnO at different bending angles are also studied, showing that the microheater and the sensing material remain intact, without any breakage, up to a bending angle of 20°. The ZnO CO sensor shows a sensitivity of 146.2% at 1 ppm with good selectivity.

  12. Aberration analysis and calculation in system of Gaussian beam illuminates lenslet array

    NASA Astrophysics Data System (ADS)

    Zhao, Zhu; Hui, Mei; Zhou, Ping; Su, Tianquan; Feng, Yun; Zhao, Yuejin

    2014-09-01

    A low-order aberration was found when a focused Gaussian beam was imaged on a Kodak KAI-16000 image detector, which is integrated with a lenslet array. The effect of the focused Gaussian beam and a numerical simulation of the aberration are presented in this paper. First, we set up a model of the optical imaging system based on a previous experiment: a focused Gaussian beam passed through a pinhole and was received by the Kodak KAI-16000 image detector, whose lenslet-array microlenses are focused exactly on the sensor surface. Then, we illustrate the characteristics of the focused Gaussian beam and the effect on the aberration of the relative spatial position between the Gaussian beam waist and the front spherical surface of the microlenses. Finally, we analyze the main contribution to the low-order aberration and calculate the spherical aberration caused by the lenslet array according to the results of the above two steps. Our theoretical calculations show that the numerical simulation is in good agreement with the experimental result. Our results prove that spherical aberration is the main contribution, making up about 93.44% of the 48 nm error demonstrated in the previous experiment. The spherical aberration is inversely proportional to the divergence distance between the microlens and the waist, and directly proportional to the Gaussian beam waist radius.

  13. Optimising the multiplex factor of the frequency domain multiplexed readout of the TES-based microcalorimeter imaging array for the X-IFU instrument on the Athena x-ray observatory

    NASA Astrophysics Data System (ADS)

    van der Kuur, J.; Gottardi, L. G.; Akamatsu, H.; van Leeuwen, B. J.; den Hartog, R.; Haas, D.; Kiviranta, M.; Jackson, B. J.

    2016-07-01

    Athena is a space-based X-ray observatory intended for exploration of the hot and energetic universe. One of the science instruments on Athena will be the X-ray Integral Field Unit (X-IFU), a cryogenic X-ray spectrometer based on a large cryogenic imaging array of Transition Edge Sensor (TES) microcalorimeters operating at a temperature of 100 mK. The imaging array consists of 3800 pixels providing 2.5 eV spectral resolution and covers a field of view with a diameter of 5 arcminutes. Multiplexed readout of the cryogenic microcalorimeter array is essential to comply with the cooling power and complexity constraints on a spacecraft. Frequency domain multiplexing has been under development for the readout of TES-based detectors for this purpose, not only for the X-IFU detector arrays but also for TES-based bolometer arrays for the SAFARI instrument of the Japanese SPICA observatory. This paper discusses the design considerations applicable to optimising the multiplex factor within the boundary conditions set by the spacecraft. More specifically, the interplay between science requirements such as pixel dynamic range, pixel speed, and cross talk, and spacecraft requirements such as the power dissipation budget, available bandwidth, and electromagnetic compatibility will be discussed.

  14. Iterative current mode per pixel ADC for 3D SoftChip implementation in CMOS

    NASA Astrophysics Data System (ADS)

    Lachowicz, Stefan W.; Rassau, Alexander; Lee, Seung-Minh; Eshraghian, Kamran; Lee, Mike M.

    2003-04-01

    Mobile multimedia communication has rapidly become a significant area of research and development, constantly challenging boundaries on a variety of technological fronts. The processing requirements for the capture, conversion, compression, decompression, enhancement, display, etc. of increasingly higher-quality multimedia content place heavy demands even on current ULSI (ultra large scale integration) systems, particularly for mobile applications where area and power are primary considerations. The ADC presented in this paper is designed for a vertically integrated (3D) system comprising two distinct layers bonded together using Indium bump technology. The top layer is a CMOS imaging array containing analogue-to-digital converters and a buffer memory. The bottom layer takes the form of a configurable array processor (CAP), a highly parallel array of soft programmable processors capable of carrying out complex processing tasks directly on data stored in the top plane. This paper presents an ADC scheme for the image capture plane. The analogue photocurrent or sampled voltage is transferred to the ADC via a column or a column/row bus. In the proposed system, an array of analogue-to-digital converters is distributed so that a one-bit cell is associated with one sensor. The analogue-to-digital converters are algorithmic current-mode converters. Eight such cells are cascaded to form an 8-bit converter. Additionally, each photo-sensor is equipped with a current memory cell, and multiple conversions are performed with scaled values of the photocurrent for colour processing.
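
    The conversion principle of the algorithmic (cyclic) converter is easy to state in software terms: each cycle doubles the residue, compares it with the reference, and emits one bit, MSB first. The behavioural sketch below assumes a normalised full-scale reference and omits all current-mode circuit details.

      def algorithmic_adc(i_photo, i_ref=1.0, n_bits=8):
          # i_photo is the (memorized) photocurrent, i_ref the full-scale reference.
          code, residue = 0, i_photo
          for _ in range(n_bits):
              residue *= 2                     # double the residue (e.g. a 1:2 current mirror)
              bit = 1 if residue >= i_ref else 0
              if bit:
                  residue -= i_ref             # subtract the reference when above it
              code = (code << 1) | bit         # bits come out MSB first
          return code

      print(algorithmic_adc(0.3))              # 76, i.e. 76/256 ~= 0.297 of full scale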

  15. Solid-state flat panel imager with avalanche amorphous selenium

    NASA Astrophysics Data System (ADS)

    Scheuermann, James R.; Howansky, Adrian; Goldan, Amir H.; Tousignant, Olivier; Levéille, Sébastien; Tanioka, K.; Zhao, Wei

    2016-03-01

    Active matrix flat panel imagers (AMFPI) have become the dominant detector technology for digital radiography and fluoroscopy. For low dose imaging, electronic noise from the amorphous silicon thin film transistor (TFT) array degrades imaging performance. We have fabricated the first prototype solid-state AMFPI using a uniform layer of avalanche amorphous selenium (a-Se) photoconductor to amplify the signal to eliminate the effect of electronic noise. We have previously developed a large area solid-state avalanche a-Se sensor structure referred to as High Gain Avalanche Rushing Photoconductor (HARP) capable of achieving gains of 75. In this work we successfully deposited this HARP structure onto a 24 x 30 cm2 TFT array with a pixel pitch of 85 μm. An electric field (ESe) up to 105 Vμm-1 was applied across the a-Se layer without breakdown. Using the HARP layer as a direct detector, an X-ray avalanche gain of 15 +/- 3 was achieved at ESe = 105 Vμm-1. In indirect mode with a 150 μm thick structured CsI scintillator, an optical gain of 76 +/- 5 was measured at ESe = 105 Vμm-1. Image quality at low dose increases with the avalanche gain until the electronic noise is overcome at a constant exposure level of 0.76 mR. We demonstrate the success of a solid-state HARP X-ray imager as well as the largest active area HARP sensor to date.

  16. An Electronic-Nose Sensor Node Based on a Polymer-Coated Surface Acoustic Wave Array for Wireless Sensor Network Applications

    PubMed Central

    Tang, Kea-Tiong; Li, Cheng-Han; Chiu, Shih-Wen

    2011-01-01

    This study developed an electronic-nose sensor node based on a polymer-coated surface acoustic wave (SAW) sensor array. The sensor node comprised an SAW sensor array, a frequency readout circuit, and an Octopus II wireless module. The sensor array was fabricated on a large K2 128° YX LiNbO3 sensing substrate. On the surface of this substrate, an interdigital transducer (IDT) was produced with a Cr/Au film as its metallic structure. A mixed-mode frequency readout application specific integrated circuit (ASIC) was fabricated using a TSMC 0.18 μm process. The ASIC output was connected to a wireless module to transmit sensor data to a base station for data storage and analysis. This sensor node is applicable for wireless sensor network (WSN) applications. PMID:22163865

  17. An electronic-nose sensor node based on a polymer-coated surface acoustic wave array for wireless sensor network applications.

    PubMed

    Tang, Kea-Tiong; Li, Cheng-Han; Chiu, Shih-Wen

    2011-01-01

    This study developed an electronic-nose sensor node based on a polymer-coated surface acoustic wave (SAW) sensor array. The sensor node comprised an SAW sensor array, a frequency readout circuit, and an Octopus II wireless module. The sensor array was fabricated on a large K(2) 128° YX LiNbO3 sensing substrate. On the surface of this substrate, an interdigital transducer (IDT) was produced with a Cr/Au film as its metallic structure. A mixed-mode frequency readout application specific integrated circuit (ASIC) was fabricated using a TSMC 0.18 μm process. The ASIC output was connected to a wireless module to transmit sensor data to a base station for data storage and analysis. This sensor node is applicable for wireless sensor network (WSN) applications.

  18. A smart sensor architecture based on emergent computation in an array of outer-totalistic cells

    NASA Astrophysics Data System (ADS)

    Dogaru, Radu; Dogaru, Ioana; Glesner, Manfred

    2005-06-01

    A novel smart-sensor architecture is proposed that is capable of segmenting and recognizing characters in a monochrome image. It provides a list of ASCII codes representing the characters recognized in the monochrome visual field, and can operate as an aid for the blind or in industrial applications. A bio-inspired cellular model with simple linear neurons was found to be the best at performing the nontrivial task of cropping isolated compact objects such as handwritten digits or characters. By attaching a simple outer-totalistic cell to each pixel sensor, emergent computation in the resulting cellular-automata lattice provides a straightforward and compact solution to the otherwise computationally intensive problem of character segmentation. A simple and robust recognition algorithm is built into a compact sequential controller accessing the array of cells, so that the integrated device can directly provide a list of codes of the recognized characters. Preliminary simulation tests indicate good performance and robustness to various distortions of the visual field.
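
    An outer-totalistic cell updates from its own state and the sum of its eight neighbours only. The sketch below shows one such update step on a binary image; the particular birth/survival rule is an illustrative assumption (a dilation-like rule), not the segmentation rule used in the paper.

      import numpy as np

      def ot_ca_step(grid, born=(4, 5, 6, 7, 8), survive=(2, 3, 4, 5, 6, 7, 8)):
          # grid is a 2-D array of 0/1 pixel states.
          padded = np.pad(grid, 1)
          nbrs = sum(np.roll(np.roll(padded, dy, 0), dx, 1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))[1:-1, 1:-1]      # sum of the 8 neighbours
          birth = (grid == 0) & np.isin(nbrs, born)
          keep = (grid == 1) & np.isin(nbrs, survive)
          return (birth | keep).astype(grid.dtype)

    Iterating a rule of this kind makes isolated compact blobs grow to fill and mark their own regions, which is the sort of emergent behaviour that such an architecture can exploit for cropping characters.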

  19. Ultra-Sensitive Strain Sensor Based on Flexible Poly(vinylidene fluoride) Piezoelectric Film

    NASA Astrophysics Data System (ADS)

    Lu, Kai; Huang, Wen; Guo, Junxiong; Gong, Tianxun; Wei, Xiongbang; Lu, Bing-Wei; Liu, Si-Yi; Yu, Bin

    2018-03-01

    A flexible 4 × 4 sensor array with 16 micro-scale capacitive units has been demonstrated based on flexible piezoelectric poly(vinylidene fluoride) (PVDF) film. The piezoelectricity and surface morphology of the PVDF were examined by optical imaging and piezoresponse force microscopy (PFM). The PFM shows phase contrast, indicating a clear interface between the PVDF and the electrode. The electro-mechanical properties show that the sensor exhibits excellent output response and an ultra-high signal-to-noise ratio. The output voltage and the applied pressure possess a linear relationship with a slope of 12 mV/kPa. The hold-and-release output characteristics recover in less than 2.5 μs, demonstrating outstanding electro-mechanical response. Additionally, signal interference between adjacent array elements has been investigated via theoretical simulation. The results show the interference reduces with decreasing pressure at a rate of 0.028 mV/kPa, is highly scalable with electrode size, and becomes insignificant for pressure levels under 178 kPa.

  20. Ultra-Sensitive Strain Sensor Based on Flexible Poly(vinylidene fluoride) Piezoelectric Film.

    PubMed

    Lu, Kai; Huang, Wen; Guo, Junxiong; Gong, Tianxun; Wei, Xiongbang; Lu, Bing-Wei; Liu, Si-Yi; Yu, Bin

    2018-03-14

    A flexible 4 × 4 sensor array with 16 micro-scale capacitive units has been demonstrated based on flexible piezoelectric poly(vinylidene fluoride) (PVDF) film. The piezoelectricity and surface morphology of the PVDF were examined by optical imaging and piezoresponse force microscopy (PFM). The PFM shows phase contrast, indicating a clear interface between the PVDF and the electrode. The electro-mechanical properties show that the sensor exhibits excellent output response and an ultra-high signal-to-noise ratio. The output voltage and the applied pressure possess a linear relationship with a slope of 12 mV/kPa. The hold-and-release output characteristics recover in less than 2.5 μs, demonstrating outstanding electro-mechanical response. Additionally, signal interference between adjacent array elements has been investigated via theoretical simulation. The results show the interference reduces with decreasing pressure at a rate of 0.028 mV/kPa, is highly scalable with electrode size, and becomes insignificant for pressure levels under 178 kPa.

  1. Freely suspended nanocomposite membranes as highly sensitive sensors

    NASA Astrophysics Data System (ADS)

    Jiang, Chaoyang; Markutsya, Sergiy; Pikus, Yuri; Tsukruk, Vladimir V.

    2004-10-01

    Highly sensitive sensor arrays are in high demand for prospective applications in remote sensing and imaging. Measuring microscopic deflections of compliant micromembranes and cantilevers is developing into one of the most versatile approaches for thermal, acoustic and chemical sensing. Here, we report on an innovative fabrication of compliant nanocomposite membranes with nanoscale thickness showing extraordinary sensitivity and dynamic range, which makes them candidates for a new generation of membrane-based sensor arrays. These nanomembranes with a thickness of 25-70 nm, which can be freely suspended over large (hundred micrometres) openings are fabricated with molecular precision by time-efficient, spin-assisted layer-by-layer assembly. They are designed as multilayered molecular composites made of a combination of polymeric monolayers and a metal nanoparticle intralayer. We demonstrate that these nanocomposite membranes possess unparalleled sensitivity and a unique autorecovering ability. The membrane nanostructure that is responsible for these outstanding properties combines multilayered polymer/nanoparticle organization, high polymer-chain orientation, and a pre-stretched state.

  2. Performance of a novel wafer scale CMOS active pixel sensor for bio-medical imaging.

    PubMed

    Esposito, M; Anaxagoras, T; Konstantinidis, A C; Zheng, Y; Speller, R D; Evans, P M; Allinson, N M; Wells, K

    2014-07-07

    Recently, CMOS active pixel sensors (APSs) have become a valuable alternative to amorphous silicon and selenium flat panel imagers (FPIs) in bio-medical imaging applications. CMOS APSs can now be scaled up to the standard 20 cm diameter wafer size by means of a reticle stitching block process. However, despite wafer-scale CMOS APSs being monolithic, sources of non-uniformity of response and regional variations can persist, representing a significant challenge for wafer-scale sensor response. Non-uniformity of stitched sensors can arise from a number of factors related to the manufacturing process, including variation of amplification, variation between readout components, wafer defects and process variations across the wafer. This paper reports on an investigation into the spatial non-uniformity and regional variations of a wafer-scale stitched CMOS APS. For the first time, a per-pixel analysis of the electro-optical performance of a wafer CMOS APS is presented, to address inhomogeneity issues arising from the stitching techniques used to manufacture wafer-scale sensors. A complete model of the signal generation in the pixel array has been provided and proved capable of accounting for noise and gain variations across the pixel array. This novel analysis leads to readout noise and conversion gain being evaluated at pixel level, at stitching block level and in regions of interest, resulting in a coefficient of variation ⩽1.9%. The uniformity of the image quality performance has been further investigated in a typical x-ray application, i.e. mammography, showing a CNR uniformity among the highest when compared with mammography detectors commonly used in clinical practice. Finally, in order to compare the detection capability of this novel APS with the technology currently used (i.e. FPIs), a theoretical evaluation of the detective quantum efficiency (DQE) at zero frequency has been performed, resulting in a higher DQE for this detector compared to FPIs. Optical characterization, x-ray contrast measurements and theoretical DQE evaluation suggest that a trade-off can be found between the need for a large imaging area and the requirement of uniform imaging performance, making the DynAMITe large-area CMOS APS suitable for a range of bio-medical applications.

  3. Improved chemical identification from sensor arrays using intelligent algorithms

    NASA Astrophysics Data System (ADS)

    Roppel, Thaddeus A.; Wilson, Denise M.

    2001-02-01

    Intelligent signal processing algorithms are shown to improve identification rates significantly in chemical sensor arrays. This paper focuses on the use of independently derived sensor status information to modify the processing of sensor array data by using a fast, easily-implemented "best-match" approach to filling in missing sensor data. Most fault conditions of interest (e.g., stuck high, stuck low, sudden jumps, excess noise, etc.) can be detected relatively simply by adjunct data processing, or by on-board circuitry. The objective then is to devise, implement, and test methods for using this information to improve the identification rates in the presence of faulted sensors. In one typical example studied, utilizing separately derived, a-priori knowledge about the health of the sensors in the array improved the chemical identification rate by an artificial neural network from below 10 percent correct to over 99 percent correct. While this study focuses experimentally on chemical sensor arrays, the results are readily extensible to other types of sensor platforms.
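
    The best-match fill-in can be sketched in a few lines: given a boolean fault mask from the adjunct fault-detection logic, find the stored reference response that most closely matches the healthy channels and copy its values into the faulted channels before classification. The reference library and the Euclidean match metric below are assumptions for illustration.

      import numpy as np

      def fill_faulted(reading, faulted, reference_patterns):
          # reading: (n_sensors,) raw array response
          # faulted: boolean mask of channels flagged bad by the adjunct logic
          # reference_patterns: (n_patterns, n_sensors) stored reference responses
          healthy = ~faulted
          dists = np.linalg.norm(reference_patterns[:, healthy] - reading[healthy], axis=1)
          best = reference_patterns[np.argmin(dists)]       # best match on healthy channels
          repaired = reading.copy()
          repaired[faulted] = best[faulted]                  # substitute the missing values
          return repaired                                    # then classify as usual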

  4. Transition-edge sensor imaging arrays for astrophysics applications

    NASA Astrophysics Data System (ADS)

    Burney, Jennifer Anne

    Many interesting objects in our universe currently elude observation in the optical band: they are too faint or they vary rapidly and thus any structure in their radiation is lost over the period of an exposure. Conventional photon detectors cannot simultaneously provide energy resolution and time-stamping of individual photons at fast rates. Superconducting detectors have recently made the possibility of simultaneous photon counting, imaging, and energy resolution a reality. Our research group has pioneered the use of one such detector, the Transition-Edge Sensor (TES). TES physics is simple and elegant. A thin superconducting film, biased at its critical temperature, can act as a particle detector: an incident particle deposits energy and drives the film into its superconducting-normal transition. By inductively coupling the detector to a SQUID amplifier circuit, this resistance change can be read out as a current pulse, and its energy deduced by integrating over the pulse. TESs can be used to accurately time-stamp (to 0.1 μs) and energy-resolve (0.15 eV at 1.6 eV) near-IR/visible/near-UV photons at rates of 30 kHz. The first astronomical observations using fiber-coupled detectors were made at the Stanford Student Observatory 0.6 m telescope in 1999. Further observations of the Crab Pulsar from the 107" telescope at the University of Texas McDonald Observatory showed rapid phase variations over the near-IR/visible/near-UV band. These preliminary observations provided a glimpse into a new realm of observations of pulsars, binary systems, and accreting black holes promised by TES arrays. This thesis describes the development, characterization, and preliminary use of the first camera system based on Transition-Edge Sensors. While single-device operation is relatively well-understood, the operation of a full imaging array poses significant challenges. This thesis addresses all aspects related to the creation and characterization of this cryogenic imaging instrument. I discuss experiments probing a host of cryostat constraints and design innovations to surmount them; simulations and experiments to characterize and filter infrared radiation; theoretical and experimental exploration of detector and array noise, cross-talk, and position-dependence; challenges of low-temperature and readout electronics; acquisition and analysis of data; and first light.

  5. WO3 thin film based multiple sensor array for electronic nose application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramgir, Niranjan S., E-mail: niranjanpr@yahoo.com, E-mail: deepakcct1991@gmail.com; Goyal, C. P.; Datta, N.

    2015-06-24

    A multiple-sensor array comprising 16 × 2 sensing elements was realized using RF-sputtered WO3 thin films. The sensor films were modified with a thin layer of sensitizers, namely Au, Ni, Cu, Al, Pd, Ti and Pt. The resulting sensor array was tested for its response towards different gases, namely H2S, NH3, NO and C2H5OH. The sensor response values measured from the response curves indicate that the sensor array generates a unique signature pattern (bar chart) for the gases. The sensor response values can be used to obtain both qualitative and quantitative information about the gas.

  6. Development of Thermal Infrared Sensor to Supplement Operational Land Imager

    NASA Technical Reports Server (NTRS)

    Shu, Peter; Waczynski, Augustyn; Kan, Emily; Wen, Yiting; Rosenberry, Robert

    2012-01-01

    The thermal infrared sensor (TIRS) is a quantum well infrared photodetector (QWIP)-based instrument intended to supplement the Operational Land Imager (OLI) for the Landsat Data Continuity Mission (LDCM). The TIRS instrument is a far-infrared imager operating in pushbroom mode with two IR channels: 10.8 and 12 μm. The focal plane will contain three 640 × 512 QWIP arrays mounted onto a silicon substrate. The readout integrated circuit (ROIC) addresses each pixel on the QWIP arrays and reads out the pixel value (signal). The ROIC is controlled by the focal plane electronics (FPE) by means of clock signals and bias voltage values. How the FPE is designed to control and interact with the TIRS focal plane assembly (FPA) is the basis for this work, and the technology developed under the FPE is for the TIRS FPA. The FPE must interact with the FPA to command and control it, extract analog signals from it, and then convert the analog signals to digital format and send them via a serial link (USB) to a computer. The FPE accomplishes these functions by converting electrical power from generic power supplies to the bias power required by the FPA. The FPE also generates digital clocking signals and shifts the typical transistor-transistor logic (TTL) levels to the ±5 V required by the FPA. The FPE also uses an application-specific integrated circuit (ASIC) named System Image, Digitizing, Enhancing, Controlling, And Retrieving (SIDECAR) from Teledyne Corp. to generate the clocking patterns commanded by the user. The uniqueness of the FPE for TIRS lies in the fact that the TIRS FPA has three QWIP detector arrays, and all three detector arrays must operate in synchronization. This avoids data skew while observing the Earth from orbit. The observing scenario may be customized by uploading new control software to the SIDECAR.

  7. A low-frequency near-field interferometric-TOA 3-D Lightning Mapping Array

    NASA Astrophysics Data System (ADS)

    Lyu, Fanchao; Cummer, Steven A.; Solanki, Rahulkumar; Weinert, Joel; McTague, Lindsay; Katko, Alex; Barrett, John; Zigoneanu, Lucian; Xie, Yangbo; Wang, Wenqi

    2014-11-01

    We report on the development of an easily deployable LF near-field interferometric-time of arrival (TOA) 3-D Lightning Mapping Array applied to imaging of entire lightning flashes. An interferometric cross-correlation technique is applied in our system to compute windowed two-sensor time differences with submicrosecond time resolution before TOA is used for source location. Compared to previously reported LF lightning location systems, our system captures many more LF sources. This is due mainly to the improved mapping of continuous lightning processes by using this type of hybrid interferometry/TOA processing method. We show with five station measurements that the array detects and maps different lightning processes, such as stepped and dart leaders, during both in-cloud and cloud-to-ground flashes. Lightning images mapped by our LF system are remarkably similar to those created by VHF mapping systems, which may suggest some special links between LF and VHF emission during lightning processes.
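
    The hybrid processing chain starts from windowed two-sensor time differences. A common way to obtain them, sketched below under illustrative assumptions (window choice, sample rate, and parabolic peak refinement), is to cross-correlate the two windowed waveforms and interpolate around the correlation peak for sub-sample timing.

      import numpy as np

      def window_time_difference(a, b, fs):
          # a, b: equal-length windowed waveforms from two sensors; fs: sample rate in Hz.
          a = a - a.mean()
          b = b - b.mean()
          xc = np.correlate(a, b, mode="full")
          k = np.argmax(xc)
          if 0 < k < len(xc) - 1:                    # parabolic refinement of the peak
              y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
              k = k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
          lag = k - (len(b) - 1)                     # zero lag sits at index len(b) - 1
          return lag / fs                            # arrival-time delay of a relative to b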

  8. Digital solar edge tracker for the Halogen Occultation Experiment

    NASA Technical Reports Server (NTRS)

    Mauldin, L. E., III; Moore, A. S.; Stump, C. W.; Mayo, L. S.

    1987-01-01

    The optical and electronic design of the Halogen Occultation Experiment (Haloe) elevation sun sensor is described. The Haloe instrument is a gas-correlation radiometer now being developed at NASA Langley for the Upper Atmosphere Research Satellite. The system uses a Galilean telescope to form a solar image on a linear silicon photodiode array. The array is a self-scanned monolithic CCD. The addresses of both solar edges imaged on the array are used by the control/pointing system to scan the Haloe science instantaneous field of view (IFOV) across the vertical solar diameter during instrument calibration and then to maintain the science IFOV 4 arcmin below the top edge during the science data occultation event. Vertical resolution of 16 arcsec and a radiometric dynamic range of 100 are achieved at the 700-nm operating wavelength. The design provides for loss of individual photodiode elements without loss of angular tracking capability.

  9. Learning from concurrent Lightning Imaging Sensor and Lightning Mapping Array observations in preparation for the MTG-LI mission

    NASA Astrophysics Data System (ADS)

    Defer, Eric; Bovalo, Christophe; Coquillat, Sylvain; Pinty, Jean-Pierre; Farges, Thomas; Krehbiel, Paul; Rison, William

    2016-04-01

    The upcoming decade will see the deployment and the operation of French, European and American space-based missions dedicated to the detection and the characterization of the lightning activity on Earth. For instance the Tool for the Analysis of Radiation from lightNIng and Sprites (TARANIS) mission, with an expected launch in 2018, is a CNES mission dedicated to the study of impulsive energy transfers between the atmosphere of the Earth and the space environment. It will carry a package of Micro Cameras and Photometers (MCP) to detect and locate lightning flashes and triggered Transient Luminous Events (TLEs). At the European level, the Meteosat Third Generation Imager (MTG-I) satellites will carry in 2019 the Lightning Imager (LI) aimed at detecting and locating the lightning activity over almost the full disk of Earth as usually observed with Meteosat geostationary infrared/visible imagers. The American community plans to operate a similar instrument on the GOES-R mission for an effective operation in early 2016. In addition NASA will install in 2016 on the International Space Station the spare version of the Lightning Imaging Sensor (LIS) that has proved its capability to optically detect the tropical lightning activity from the Tropical Rainfall Measuring Mission (TRMM) spacecraft. We will present concurrent observations recorded by the optical space-borne Lightning Imaging Sensor (LIS) and the ground-based Very High Frequency (VHF) Lightning Mapping Array (LMA) for different types of lightning flashes. The properties of the cloud environment will also be considered in the analysis thanks to coincident observations of the different TRMM cloud sensors. The characteristics of the optical signal will be discussed according to the nature of the parent flash components and the cloud properties. This study should provide some insights not only on the expected optical signal that will be recorded by LI, but also on the definition of the validation strategy of LI, and on the synergetic use of LI and ground-based VHF mappers like the SAETTA LMA network in Corsica for operational and research activities. Acknowledgements: this study is part of the SOLID-PREVALS project and is supported by CNES-TOSCA.

  10. A 65k pixel, 150k frames-per-second camera with global gating and micro-lenses suitable for fluorescence lifetime imaging

    NASA Astrophysics Data System (ADS)

    Burri, Samuel; Powolny, François; Bruschini, Claudio E.; Michalet, Xavier; Regazzoni, Francesco; Charbon, Edoardo

    2014-05-01

    This paper presents our work on a 65k pixel single-photon avalanche diode (SPAD) based imaging sensor realized in a 0.35μm standard CMOS process. At a resolution of 512 by 128 pixels the sensor is read out in 6.4μs to deliver over 150k monochrome frames per second. The individual pixel has a size of 24μm2 and contains the SPAD with a 12T quenching and gating circuitry along with a memory element. The gating signals are distributed across the chip through a balanced tree to minimize the signal skew between the pixels. The array of pixels is row-addressable and data is sent out of the chip on 128 lines in parallel at a frequency of 80MHz. The system is controlled by an FPGA which generates the gating and readout signals and can be used for arbitrary real-time computation on the frames from the sensor. The communication protocol between the camera and a conventional PC is USB2. The active area of the chip is 5% and can be significantly improved with the application of a micro-lens array. A micro-lens array, for use with collimated light, has been designed and its performance is reviewed in the paper. Among other high-speed phenomena the gating circuitry capable of generating illumination periods shorter than 5ns can be used for Fluorescence Lifetime Imaging (FLIM). In order to measure the lifetime of fluorophores excited by a picosecond laser, the sensor's illumination period is synchronized with the excitation laser pulses. A histogram of the photon arrival times relative to the excitation is then constructed by counting the photons arriving during the sensitive time for several positions of the illumination window. The histogram for each pixel is transferred afterwards to a computer where software routines extract the lifetime at each location with an accuracy better than 100ps. We show results for fluorescence lifetime measurements using different fluorophores with lifetimes ranging from 150ps to 5ns.
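
    The lifetime-extraction step described above reduces, per pixel, to turning the gate-scanned photon counts into a decay histogram and estimating the lifetime from it. The sketch below uses a simple log-linear fit to the decay tail; the bin width and fit window are assumptions, and the camera's actual software routines may differ.

      import numpy as np

      def lifetime_from_histogram(counts, bin_width_ns):
          # counts: photons per time bin (gate position) after the excitation pulse.
          start = int(np.argmax(counts))              # fit only the decaying tail
          t = np.arange(len(counts)) * bin_width_ns
          c, tt = counts[start:], t[start:]
          mask = c > 0
          slope, _ = np.polyfit(tt[mask], np.log(c[mask]), 1)
          return -1.0 / slope                          # lifetime tau in ns

      # An ideal 2 ns decay sampled in 0.1 ns bins recovers tau ~= 2 ns:
      counts = 1000 * np.exp(-np.arange(100) * 0.1 / 2.0)
      print(lifetime_from_histogram(counts, 0.1))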

  11. An ECT/ERT dual-modality sensor for oil-water two-phase flow measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Pitao; Wang, Huaxiang; Sun, Benyuan

    2014-04-11

    This paper presents a new sensor for an ECT/ERT dual-modality system which can simultaneously obtain the permittivity and conductivity of the materials in the pipeline. Quasi-static electromagnetic fields are produced by the inner electrode-array sensor of the electrical capacitance tomography (ECT) system. The simulation results show that permittivity and conductivity data can be obtained simultaneously from the same measurement electrode, and that fusing the two kinds of data may improve the quality of the reconstructed images. For uniform oil-water mixtures, the performance of the designed dual-modality sensor in measuring various oil fractions has been tested on representative data, and the experimental results show that the designed sensor broadens the measurement range compared to a single modality.

  12. Short-wavelength infrared imaging using low dark current InGaAs detector arrays and vertical-cavity surface-emitting laser illuminators

    NASA Astrophysics Data System (ADS)

    Macdougal, Michael; Geske, Jon; Wang, Chad; Follman, David

    2011-06-01

    We describe the factors that go into the component choices for a short-wavelength IR (SWIR) imager, which include the SWIR sensor, the lens, and the illuminator. We review the factors for reducing dark current and show that we can achieve well below 1.5 nA/cm² for 15 μm devices at 7 °C. In addition, we have mated our InGaAs detector arrays to 640×512 readout integrated circuits to make focal plane arrays (FPAs). The resulting FPAs are capable of imaging photon fluxes with wavelengths between 1 and 1.6 μm at low light levels. The dark current associated with these FPAs is extremely low, exhibiting a mean dark current density of 0.26 nA/cm² at 0 °C. Noise due to the readout can be reduced from 95 to 57 electrons by using off-chip correlated double sampling. In addition, Aerius has developed laser arrays that provide flat illumination in scenes that are normally light-starved. The illuminators have 40% wall-plug efficiency, provide low-speckle illumination, and yield artifact-free imagery compared to conventional laser illuminators.
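
    The off-chip correlated double sampling mentioned above can be sketched in a few lines. The example below assumes hypothetical reset and signal frames from the readout circuit (not the actual Aerius processing chain); subtracting the reset-level sample from the integrated signal sample cancels reset noise and fixed per-pixel offsets.

      import numpy as np

      def correlated_double_sampling(reset_frame, signal_frame):
          """Off-chip CDS: subtract the reset-level sample from the integrated signal sample.

          Both inputs are 2-D arrays of raw ADC counts (e.g. 512 x 640);
          the difference cancels reset noise and fixed per-pixel offsets.
          """
          return signal_frame.astype(np.int32) - reset_frame.astype(np.int32)

      # Hypothetical 512 x 640 frames of raw ADC counts
      rng = np.random.default_rng(0)
      reset = rng.integers(1000, 1100, size=(512, 640), dtype=np.uint16)
      signal = reset + rng.integers(0, 4000, size=(512, 640), dtype=np.uint16)
      cds_frame = correlated_double_sampling(reset, signal)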

  13. Neural Network Substorm Identification: Enabling TREx Sensor Web Modes

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Spanswick, E.; Arnason, K. M.; Donovan, E.; Liang, J.; Ahmad, S.; Jackel, B. J.

    2017-12-01

    Transition Region Explorer (TREx) is a ground-based sensor web of optical and radio instruments that is presently being deployed across central Canada. The project consists of an array of co-located blue-line, full-colour, and near-infrared all-sky imagers, imaging riometers, proton aurora spectrographs, and GNSS systems. A key goal of the TREx project is to create the world's first (artificially) intelligent sensor web for remote sensing of space weather. The sensor web will autonomously control and coordinate instrument operations in real time. To accomplish this, we will use real-time in-line analytics of TREx and other data to dynamically switch between operational modes. An operating mode could be, for example, having a blue-line imager gather data at a cadence one or two orders of magnitude higher than its `baseline' mode. The software decision to increase the imaging cadence would be made in response to an anticipated increase in auroral activity or other programmatic requirements. Our first test of TREx's sensor web technologies is to develop the capacity to autonomously alter the TREx operating mode prior to a substorm expansion phase onset. In this paper, we present our neural network analysis of historical optical and riometer data and our ability to predict an optical onset. We explore preliminary insights into using a neural network to pick out trends and features that it deems similar among substorms.
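
    As a schematic of the kind of in-line analytics involved, the sketch below trains a small neural network classifier to flag epochs where an onset appears imminent, using combined optical and riometer features. The feature definitions, labels and data are hypothetical placeholders, not the TREx pipeline or its trained network.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      # Hypothetical training set: each row = [auroral brightness, brightness trend,
      # riometer absorption (dB), absorption trend]; label 1 = onset within N minutes.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(2000, 4))
      y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=2000) > 0.8).astype(int)

      clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
      clf.fit(X, y)

      # In a sensor-web loop, a positive prediction would trigger the high-cadence mode.
      latest_features = np.array([[1.2, 0.9, 0.4, 0.2]])
      if clf.predict(latest_features)[0] == 1:
          print("switch blue-line imager to high-cadence mode")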

  14. Adaptive scene-based correction algorithm for removal of residual fixed pattern noise in microgrid image data

    NASA Astrophysics Data System (ADS)

    Ratliff, Bradley M.; LeMaster, Daniel A.

    2012-06-01

    Pixel-to-pixel response nonuniformity is a common problem that affects nearly all focal plane array sensors. It results in a fixed pattern noise (FPN) that persists from frame to frame and causes an overall degradation of the collected data. FPN is often compensated for through the use of blackbody calibration procedures; however, FPN is a particularly challenging problem because the detector responsivities drift relative to one another in time, requiring that the sensor be recalibrated periodically. The calibration process is obstructive to sensor operation and is therefore only performed at discrete intervals in time. Thus, any drift that occurs between calibrations (along with error in the calibration sources themselves) causes varying levels of residual calibration error to be present in the data at all times. Polarimetric microgrid sensors are particularly sensitive to FPN due to the spatial differencing involved in estimating the Stokes vector images. While many techniques exist in the literature to estimate FPN for conventional video sensors, few have been proposed to address the problem in microgrid imaging sensors. Here we present a scene-based nonuniformity correction technique for microgrid sensors that is able to reduce residual fixed pattern noise while preserving radiometry under a wide range of conditions. The algorithm requires a low number of temporal data samples to estimate the spatial nonuniformity and is computationally efficient. We demonstrate the algorithm's performance using real data from the AFRL PIRATE and University of Arizona LWIR microgrid sensors.
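
    A generic scene-based offset correction can be sketched as follows (a simple temporal-statistics approach given for illustration only; it is not the algorithm of this paper and ignores the microgrid-specific treatment of the polarization channels): per-pixel offsets are estimated from the deviation of each pixel's temporal mean from a locally smoothed version, then subtracted from subsequent frames.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def estimate_offsets(frames, window=5):
          """Estimate per-pixel offset FPN from a short stack of frames (T, H, W).

          Assumes scene content averages out over time, so the difference between a
          pixel's temporal mean and the spatially smoothed temporal mean is dominated by FPN.
          """
          temporal_mean = frames.mean(axis=0)
          smooth = uniform_filter(temporal_mean, size=window)
          return temporal_mean - smooth            # per-pixel offset estimate

      def correct(frame, offsets):
          return frame - offsets

      # Hypothetical use: a few dozen frames suffice to form the estimate
      stack = np.random.normal(loc=100.0, scale=5.0, size=(32, 256, 256))
      offsets = estimate_offsets(stack)
      clean = correct(stack[-1], offsets)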

  15. Design of a temperature control system using incremental PID algorithm for a special homemade shortwave infrared spatial remote sensor based on FPGA

    NASA Astrophysics Data System (ADS)

    Xu, Zhipeng; Wei, Jun; Li, Jianwei; Zhou, Qianting

    2010-11-01

    An imaging spectrometer on a spatial remote sensing satellite requires the shortwave band from 2.1 μm to 3 μm, one of the most important bands in remote sensing. We designed the infrared sub-system of the imaging spectrometer using a homemade 640x1 InGaAs shortwave infrared sensor operating as a focal plane array (FPA), which requires high uniformity and a low level of dark current. The working temperature must be held at -15 ± 0.2 °C. This paper studies the noise model of the FPA system, investigates the relationship between temperature and dark current noise, and adopts an incremental PID algorithm to generate a PWM waveform that controls the temperature of the sensor. The FPGA design is composed of four modules, all coded in VHDL and implemented on an APA300 FPGA device. Experiments show that the temperature control system succeeds in holding the sensor at its target temperature.
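
    A minimal sketch of the incremental (velocity-form) PID update driving a PWM duty cycle is shown below. The gains, setpoint and 0-1 duty range are hypothetical illustration values, not the parameters of the APA300 implementation, and the update is written in Python rather than VHDL for readability.

      class IncrementalPID:
          """Velocity-form PID: each step outputs a change in duty cycle, not an absolute value."""

          def __init__(self, kp, ki, kd, duty=0.5):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.duty = duty                   # current PWM duty cycle (0..1)
              self.e1 = 0.0                      # error at step k-1
              self.e2 = 0.0                      # error at step k-2

          def update(self, setpoint, measurement):
              e = setpoint - measurement
              # Incremental PID: delta_u = Kp*(e - e1) + Ki*e + Kd*(e - 2*e1 + e2)
              delta = (self.kp * (e - self.e1)
                       + self.ki * e
                       + self.kd * (e - 2 * self.e1 + self.e2))
              self.e2, self.e1 = self.e1, e
              self.duty = min(1.0, max(0.0, self.duty + delta))
              return self.duty

      # Hypothetical loop step: hold the FPA at -15 deg C given the measured sensor temperature
      pid = IncrementalPID(kp=0.05, ki=0.01, kd=0.02)
      duty = pid.update(setpoint=-15.0, measurement=-12.3)   # new PWM duty for the cooler driver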

  16. Rapid and highly integrated FPGA-based Shack-Hartmann wavefront sensor for adaptive optics system

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Pin; Chang, Chia-Yuan; Chen, Shean-Jen

    2018-02-01

    In this study, a field programmable gate array (FPGA)-based Shack-Hartmann wavefront sensor (SHWS) programmed in LabVIEW is developed that can be highly integrated into customized applications, such as an adaptive optics system (AOS), for real-time wavefront measurement. A Camera Link frame grabber with an embedded FPGA is adopted to speed up the sensor's response to wavefront variation, taking advantage of its high data transmission bandwidth. Instead of waiting for a full frame to be captured by the FPGA, the Shack-Hartmann algorithm is implemented in parallel processing blocks so that the image data transmission is synchronized with the wavefront reconstruction. In addition, a mechanism to control the deformable mirror is designed in the same FPGA, and the Shack-Hartmann sensor speed is verified by controlling the frequency of the deformable mirror's dynamic surface deformation. Currently, this FPGA-based SHWS design achieves a 266 Hz cyclic speed, limited by the camera frame rate, while leaving 40% of the logic slices available for additional design flexibility.
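
    The per-subaperture computation that such an FPGA pipeline parallelizes is the centroid (centre-of-gravity) and slope calculation. Below is a hedged NumPy sketch of that step; the subaperture size, reference spot positions, focal length and pixel pitch are placeholders, and the code is not the LabVIEW/FPGA implementation of the paper.

      import numpy as np

      def subaperture_slopes(image, sub_size, ref_centroids, focal_len, pixel_pitch):
          """Compute x/y wavefront slopes for each Shack-Hartmann subaperture.

          image:         2-D spot image from the camera
          sub_size:      side length of a square subaperture in pixels
          ref_centroids: (ny, nx, 2) reference spot positions for a flat wavefront (row, col)
          focal_len:     lenslet focal length, same length unit as pixel_pitch
          """
          ny, nx = image.shape[0] // sub_size, image.shape[1] // sub_size
          slopes = np.zeros((ny, nx, 2))
          ys, xs = np.mgrid[0:sub_size, 0:sub_size]
          for j in range(ny):
              for i in range(nx):
                  sub = image[j*sub_size:(j+1)*sub_size, i*sub_size:(i+1)*sub_size].astype(float)
                  total = sub.sum()
                  if total <= 0:
                      continue                       # dark subaperture: leave slope at zero
                  cy = (sub * ys).sum() / total
                  cx = (sub * xs).sum() / total
                  dy = cy - ref_centroids[j, i, 0]
                  dx = cx - ref_centroids[j, i, 1]
                  # Spot displacement -> local wavefront slope (small-angle approximation)
                  slopes[j, i] = (dy * pixel_pitch / focal_len, dx * pixel_pitch / focal_len)
          return slopes

      # Hypothetical usage with a 128x128 frame split into 16x16-pixel subapertures
      frame = np.random.poisson(5.0, size=(128, 128)).astype(float)
      refs = np.full((8, 8, 2), (16 - 1) / 2.0)       # reference spots at subaperture centres
      slopes = subaperture_slopes(frame, 16, refs, focal_len=5e-3, pixel_pitch=10e-6)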

  17. Accurate positioning based on acoustic and optical sensors

    NASA Astrophysics Data System (ADS)

    Cai, Kerong; Deng, Jiahao; Guo, Hualing

    2009-11-01

    An unattended laser target designator (ULTD) was designed to partly take the place of conventional LTDs for accurate positioning and laser marking. After analyzing the precision, accuracy and error sources of the acoustic sensor array, the requirements of the laser generator, and the image analysis and tracking technology, the major system modules were determined. The target's class, velocity and position can be measured by the sensors, and a coded laser beam is then emitted intelligently to mark the optimal position at the optimal time. The conclusion shows that the ULTD can not only avoid security threats, be deployed on a large scale, and accomplish battle damage assessment (BDA), but is also suited to information-based warfare.
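
    As a generic illustration of acoustic-array positioning (not the ULTD's actual algorithm), the sketch below estimates a 2-D source position from time-difference-of-arrival (TDOA) measurements across a small hypothetical microphone array using nonlinear least squares.

      import numpy as np
      from scipy.optimize import least_squares

      C_SOUND = 343.0                                   # m/s, speed of sound in air

      # Hypothetical microphone positions (x, y) in metres
      mics = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

      def tdoa_residuals(pos, tdoas, mics):
          """Residuals between measured and predicted TDOAs (relative to microphone 0)."""
          d = np.linalg.norm(mics - pos, axis=1)
          predicted = (d[1:] - d[0]) / C_SOUND
          return predicted - tdoas

      # Simulate a source and the TDOAs it would produce, then solve for its position
      true_pos = np.array([3.0, 4.0])
      dists = np.linalg.norm(mics - true_pos, axis=1)
      tdoas = (dists[1:] - dists[0]) / C_SOUND

      fit = least_squares(tdoa_residuals, x0=np.array([1.0, 1.0]), args=(tdoas, mics))
      print("estimated source position:", fit.x)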

  18. FPGA-based real time processing of the Plenoptic Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information simultaneously from different points of view, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al). This paper concentrates on the processing required for pupil-plane phase recovery and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows the system to be tailored to the requirements while maintaining flexibility, speed and cost figures.

  19. A novel digital image sensor with row wise gain compensation for Hyper Spectral Imager (HySI) application

    NASA Astrophysics Data System (ADS)

    Lin, Shengmin; Lin, Chi-Pin; Wang, Weng-Lyang; Hsiao, Feng-Ke; Sikora, Robert

    2009-08-01

    A 256x512 element digital image sensor has been developed which has a large pixel size, slow scan rate and low power consumption for Hyper Spectral Imager (HySI) applications. The device is a mixed-mode, system-on-chip (SOC) IC. It combines analog circuitry, digital circuitry and optical sensor circuitry in a single chip. The chip integrates a 256x512 active pixel sensor array, a programmable gain amplifier (PGA) for row-wise gain setting, an I2C interface, SRAM, a 12-bit analog-to-digital converter (ADC), a voltage regulator, low voltage differential signaling (LVDS) and a timing generator. The device can be used for 256 pixels of spatial resolution and 512 bands of spectral resolution ranging from 400 nm to 950 nm in wavelength. In row-wise gain readout mode, one can set a different gain on each row of the photodetector by storing the gain setting data in the SRAM through the I2C interface. This unique row-wise gain setting can be used to compensate for the non-uniform spectral response of silicon. Due to this unique function, the device is suitable for hyperspectral imager applications. The HySI camera located on board the Chandrayaan-1 satellite was successfully launched to the Moon on Oct. 22, 2008. The device is currently mapping the Moon and sending back excellent images of the lunar surface. The device design and the Moon image data will be presented in the paper.
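
    The row-wise gain compensation can be emulated numerically as in the sketch below (in the real device the gain is applied by the on-chip PGA before the 12-bit ADC, with the gain table stored in SRAM over I2C). The frame orientation, responsivity curve and gain values are hypothetical, assuming each detector row corresponds to one spectral band.

      import numpy as np

      # Hypothetical raw frame with rows as spectral bands and columns as spatial pixels.
      rng = np.random.default_rng(2)
      raw = rng.integers(0, 4096, size=(512, 256)).astype(float)

      # Per-row gains chosen to equalize a (hypothetical) silicon responsivity curve
      responsivity = np.linspace(0.4, 1.0, 512)        # relative response per band
      row_gains = 1.0 / responsivity                   # in the real chip: SRAM gain table via I2C

      compensated = raw * row_gains[:, np.newaxis]     # flatten the spectral response row-wise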

  20. Surface chemistry and morphology in single particle optical imaging

    NASA Astrophysics Data System (ADS)

    Ekiz-Kanik, Fulya; Sevenler, Derin Deniz; Ünlü, Neşe Lortlar; Chiari, Marcella; Ünlü, M. Selim

    2017-05-01

    Biological nanoparticles such as viruses and exosomes are important biomarkers for a range of medical conditions, from infectious diseases to cancer. Biological sensors that detect whole viruses and exosomes with high specificity, yet without additional labeling, are promising because they reduce the complexity of sample preparation and may improve measurement quality by retaining information about the nanoscale physical structure of the bio-nanoparticle (BNP). Towards this end, a variety of BNP biosensor technologies have been developed, several of which are capable of enumerating the precise number of detected viruses or exosomes and analyzing the physical properties of each individual particle. Optical imaging techniques are promising candidates among a broad range of label-free nanoparticle detectors. These imaging BNP sensors detect the binding of single nanoparticles on a flat surface functionalized with a specific capture molecule or an array of multiplexed capture probes. The functionalization step confers all of the sensor's molecular specificity but can introduce an unforeseen problem: a rough and inhomogeneous surface coating can be a source of noise, as these sensors detect small local changes in optical refractive index. In this paper, we review several optical technologies for label-free BNP detection with a focus on imaging systems. We compare surface-imaging methods including dark-field imaging, surface plasmon resonance imaging and interference reflectance imaging. We discuss the importance of ensuring consistently uniform and smooth surface coatings of capture molecules for these types of biosensors and finally summarize several methods that have been developed towards addressing this challenge.
