Sample records for image sensor technology

  1. CMOS Image Sensors: Electronic Camera On A Chip

    NASA Technical Reports Server (NTRS)

    Fossum, E. R.

    1995-01-01

    Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On-chip analog-to-digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low-cost applications.

  2. The lucky image-motion prediction for simple scene observation based soft-sensor technology

    NASA Astrophysics Data System (ADS)

    Li, Yan; Su, Yun; Hu, Bin

    2015-08-01

    High resolution is important for Earth remote sensors, but vibration of the remote-sensor platform is a major factor restricting high-resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes using soft-sensor technology for image-motion prediction and focuses on algorithm optimization for image-motion prediction in imaging. Simulation results indicate that the improved lucky image-motion stabilization algorithm, combining a back-propagation neural network (BP NN) and a support vector machine (SVM), is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on the soft-sensor technology is below 5%, and the training and computing speed of the mathematical prediction model is fast enough for real-time image stabilization in aerial photography.
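
    The hybrid prediction idea described above can be illustrated with a toy sketch; the blending weight, the two stand-in predictions, and the numbers below are illustrative assumptions, not the paper's actual models.

```python
# Very simplified sketch of the hybrid predictor idea: two learned models
# (stand-ins here for the BP neural network and the SVM) each predict the
# next image-motion sample, their outputs are blended, and the relative
# error criterion from the abstract (< 5%) is checked. The blending
# weight and all numbers are illustrative assumptions.

def blended_prediction(bp_pred, svm_pred, w=0.6):
    """Weighted blend of the two model outputs (w is assumed)."""
    return w * bp_pred + (1.0 - w) * svm_pred

def relative_error(pred, truth):
    return abs(pred - truth) / abs(truth)

truth = 10.0                        # actual image motion, pixels
pred = blended_prediction(10.3, 9.8)
meets_criterion = relative_error(pred, truth) < 0.05
```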

  3. Development of CMOS Active Pixel Image Sensors for Low Cost Commercial Applications

    NASA Technical Reports Server (NTRS)

    Gee, R.; Kemeny, S.; Kim, Q.; Mendis, S.; Nakamura, J.; Nixon, R.; Ortiz, M.; Pain, B.; Staller, C.; Zhou, Z.

    1994-01-01

    JPL, under sponsorship from the NASA Office of Advanced Concepts and Technology, has been developing a second-generation solid-state image sensor technology. Charge-coupled devices (CCD) are a well-established first generation image sensor technology. For both commercial and NASA applications, CCDs have numerous shortcomings. In response, the active pixel sensor (APS) technology has been under research. The major advantages of APS technology are the ability to integrate on-chip timing, control, signal-processing and analog-to-digital converter functions, reduced sensitivity to radiation effects, low power operation, and random access readout.

  4. Smart image sensors: an emerging key technology for advanced optical measurement and microsystems

    NASA Astrophysics Data System (ADS)

    Seitz, Peter

    1996-08-01

    Optical microsystems typically include photosensitive devices, analog preprocessing circuitry and digital signal processing electronics. The advances in semiconductor technology have made it possible today to integrate all photosensitive and electronic devices on one 'smart image sensor' or photo-ASIC (application-specific integrated circuit containing photosensitive elements). It is even possible to provide each 'smart pixel' with additional photoelectronic functionality, without compromising the fill factor substantially. This technological capability is the basis for advanced cameras and optical microsystems showing novel on-chip functionality: single-chip cameras with on-chip analog-to-digital converters for less than $10 are advertised; image sensors have been developed with novel functionality such as real-time selectable pixel size and shape, the capability of performing arbitrary convolutions simultaneously with the exposure, as well as variable, programmable offset and sensitivity of the pixels, leading to image sensors with a dynamic range exceeding 150 dB. Smart image sensors have been demonstrated offering synchronous detection and demodulation capabilities in each pixel (lock-in CCD), and conventional image sensors are combined with an on-chip digital processor for complete, single-chip image acquisition and processing systems. Technological problems of the monolithic integration of smart image sensors include offset non-uniformities, temperature variations of electronic properties, imperfect matching of circuit parameters, etc. These problems can often be overcome either by designing additional compensation circuitry or by providing digital correction routines. Where necessary for technological or economic reasons, smart image sensors can also be combined with or realized as hybrids, making use of commercially available electronic components. It is concluded that the possibilities offered by custom smart image sensors will influence the design and the performance of future electronic imaging systems in many disciplines, ranging from optical metrology to machine vision on the factory floor and in robotics applications.

  5. Technical guidance for the development of a solid state image sensor for human low vision image warping

    NASA Technical Reports Server (NTRS)

    Van der Spiegel, Jan

    1994-01-01

    This report surveys different technologies and approaches to realizing sensors for image warping. The goal is to study the feasibility, technical aspects, and limitations of making an electronic camera with special geometries that implement certain transformations for image warping. This work was inspired by the research done by Dr. Juday at NASA Johnson Space Center on image warping. The study has looked into different solid-state technologies for fabricating image sensors. It is found that among the available technologies, CMOS is preferred over CCD technology. CMOS provides more flexibility to design different functions into the sensor, is more widely available, and is a lower-cost solution. By using an architecture with row and column decoders, one has the added flexibility of addressing the pixels at random or reading out only part of the image.

  6. Photon Counting Imaging with an Electron-Bombarded Pixel Image Sensor

    PubMed Central

    Hirvonen, Liisa M.; Suhling, Klaus

    2016-01-01

    Electron-bombarded pixel image sensors, where a single photoelectron is accelerated directly into a CCD or CMOS sensor, allow wide-field imaging at extremely low light levels as they are sensitive enough to detect single photons. This technology allows the detection of up to hundreds or thousands of photon events per frame, depending on the sensor size, and photon event centroiding can be employed to recover resolution lost in the detection process. Unlike photon events from electron-multiplying sensors, the photon events from electron-bombarded sensors have a narrow, acceleration-voltage-dependent pulse height distribution. Thus, a gain voltage sweep during exposure in an electron-bombarded sensor could allow photon arrival time determination from the pulse height with sub-frame exposure time resolution. We give a brief overview of our work with electron-bombarded pixel image sensor technology and recent developments in this field for single-photon counting imaging, and give examples of some applications. PMID:27136556
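
    The photon event centroiding mentioned above can be sketched as follows; the 3x3 patch size and pixel values are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of photon event centroiding: a single photon event
# spreads its charge over a few pixels, and the intensity-weighted
# centroid of the patch recovers the photon position to sub-pixel
# precision.

def centroid(event):
    """Intensity-weighted centroid (x, y) of a 2-D pixel patch."""
    total = sum(sum(row) for row in event)
    if total == 0:
        raise ValueError("empty event")
    y = sum(i * sum(row) for i, row in enumerate(event)) / total
    x = sum(j * v for row in event for j, v in enumerate(row)) / total
    return x, y

# A 3x3 patch with charge spread slightly toward the upper-left pixel.
patch = [
    [10, 20,  5],
    [20, 80, 10],
    [ 5, 10,  5],
]
cx, cy = centroid(patch)  # both slightly less than 1.0 (the center)
```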

  7. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  8. Low Power Camera-on-a-Chip Using CMOS Active Pixel Sensor Technology

    NASA Technical Reports Server (NTRS)

    Fossum, E. R.

    1995-01-01

    A second generation image sensor technology has been developed at the NASA Jet Propulsion Laboratory as a result of the continuing need to miniaturize space science imaging instruments. Implemented using standard CMOS, the active pixel sensor (APS) technology permits the integration of the detector array with on-chip timing, control and signal chain electronics, including analog-to-digital conversion.

  9. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    NASA Astrophysics Data System (ADS)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the Leica DMC III, the first large-format aerial mapping camera using CMOS sensor technology. This paper describes the motivation for changing from CCD to CMOS sensor technology in the development of this new aerial mapping camera. In 2002 the first-generation DMC was developed by Z/I Imaging; it was the first large-format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, the first digital aerial mapping camera to use a single ultra-large CCD sensor, avoiding the stitching of smaller CCDs. The DMC III is now the third generation of large-format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II, using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B, and NIR. For the first time, a 391-megapixel CMOS sensor has been used as the panchromatic sensor, an industry record. CMOS technology brings a range of technical benefits: the dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights are presented and compared with other CCD-based aerial sensors.

  10. Wide area detection system: Conceptual design study. [using television and microelectronic technology]

    NASA Technical Reports Server (NTRS)

    Hilbert, E. E.; Carl, C.; Goss, W.; Hansen, G. R.; Olsasky, M. J.; Johnston, A. R.

    1978-01-01

    An integrated sensor for traffic surveillance on mainline sections of urban freeways is described. Applicable imaging and processor technology is surveyed and the functional requirements for the sensors and the conceptual design of the breadboard sensors are given. Parameters measured by the sensors include lane density, speed, and volume. The freeway image is also used for incident diagnosis.

  11. Advanced scanners and imaging systems for earth observations. [conferences]

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Assessments of present and future sensors and sensor related technology are reported along with a description of user needs and applications. Five areas are outlined: (1) electromechanical scanners, (2) self-scanned solid state sensors, (3) electron beam imagers, (4) sensor related technology, and (5) user applications. Recommendations, charts, system designs, technical approaches, and bibliographies are included for each area.

  12. Wireless image-data transmission from an implanted image sensor through a living mouse brain by intra body communication

    NASA Astrophysics Data System (ADS)

    Hayami, Hajime; Takehara, Hiroaki; Nagata, Kengo; Haruta, Makito; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Ohta, Jun

    2016-04-01

    Intra-body communication technology allows the fabrication of more compact implantable biomedical sensors than RF wireless technology. In this paper, we report the fabrication of an implantable image sensor of 625 µm width and 830 µm length and demonstrate wireless image-data transmission through the brain tissue of a living mouse. The sensor was designed to transmit output signals of pixel values by pulse width modulation (PWM). The PWM signals from the sensor, transmitted through the brain tissue, were detected by a receiver electrode. Wireless data transmission of a two-dimensional image was successfully demonstrated in a living mouse brain. The technique reported here is expected to provide useful methods of data transmission using micro-sized implantable biomedical sensors.
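
    The PWM readout described above can be sketched in a few lines; the clock period and bit depth below are illustrative assumptions, not the sensor's actual parameters.

```python
# Hedged sketch of pulse-width-modulation (PWM) pixel readout: each pixel
# value maps to a pulse whose width is proportional to the value, and the
# receiver recovers the value from the measured pulse width. The time
# resolution (CLOCK_NS) and 8-bit depth are assumptions for illustration.

CLOCK_NS = 100      # assumed receiver time resolution, nanoseconds
MAX_VAL = 255       # assumed 8-bit pixel depth

def encode_pwm(pixel):
    """Pulse width in ns for a pixel value."""
    return pixel * CLOCK_NS

def decode_pwm(width_ns):
    """Recover the pixel value from a measured pulse width."""
    return min(MAX_VAL, round(width_ns / CLOCK_NS))

row = [0, 17, 128, 255]
widths = [encode_pwm(p) for p in row]
recovered = [decode_pwm(w) for w in widths]  # round-trips losslessly here
```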

  13. Camera-on-a-Chip

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Jet Propulsion Laboratory's research on a second-generation, solid-state image sensor technology has resulted in the Complementary Metal-Oxide Semiconductor (CMOS) Active Pixel Sensor, establishing an alternative to the Charge-Coupled Device (CCD). Photobit Corporation, the leading supplier of CMOS image sensors, has commercialized two products based on this technology: the PB-100 and PB-300. These devices are cameras on a chip, combining all camera functions. CMOS "active-pixel" digital image sensors offer several advantages over CCDs, a technology used in video and still-camera applications for 30 years. The CMOS sensors draw less energy, use the same manufacturing platform as most microprocessors and memory chips, and allow on-chip programming of frame size, exposure, and other parameters.

  14. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  15. Passive IR polarization sensors: a new technology for mine detection

    NASA Astrophysics Data System (ADS)

    Barbour, Blair A.; Jones, Michael W.; Barnes, Howard B.; Lewis, Charles P.

    1998-09-01

    The problem of mine and minefield detection continues to pose a significant challenge to sensor systems. Although the various sensor technologies (infrared, ground-penetrating radar, etc.) may excel in certain situations, no single sensor technology can adequately detect mines under all conditions (time of day, weather, buried or surface-laid, etc.). A truly robust mine detection system will likely require the fusion of data from multiple sensor technologies. The performance of these systems, however, will ultimately depend on the performance of the individual sensors. Infrared (IR) polarimetry is a new and innovative sensor technology that adds substantial capabilities to the detection of mines. IR polarimetry improves on basic IR imaging by providing improved spatial resolution of the target, an inherent ability to suppress clutter, and the capability for zero-ΔT imaging. Nichols Research Corporation (Nichols) is currently evaluating the effectiveness of IR polarization for mine detection. This study is partially funded by the U.S. Army Night Vision & Electronic Sensors Directorate (NVESD). The goal of the study is to demonstrate, through phenomenology studies and limited field trials, that IR polarization outperforms conventional IR imaging in the mine detection arena.
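
    A common way IR polarimetry suppresses clutter is via the degree of linear polarization (DoLP), computed from intensities measured through a polarizer at several angles; the sketch below uses the standard Stokes-parameter formulas, with made-up intensity values.

```python
import math

# Illustrative sketch of why polarimetry can separate a smooth man-made
# surface from rough natural clutter even at zero delta-T: intensities
# measured through a polarizer at 0, 45, and 90 degrees give the linear
# Stokes parameters, and the degree of linear polarization (DoLP)
# highlights partially polarized emission. Intensity values are made up.

def dolp(i0, i45, i90):
    s0 = i0 + i90          # total intensity
    s1 = i0 - i90          # horizontal/vertical preference
    s2 = 2.0 * i45 - s0    # diagonal preference
    return math.sqrt(s1 * s1 + s2 * s2) / s0

smooth_target = dolp(55.0, 52.0, 45.0)   # partially polarized emission
rough_clutter = dolp(50.0, 50.0, 50.0)   # unpolarized: DoLP is zero
```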

  16. Highly Concentrated Seed-Mediated Synthesis of Monodispersed Gold Nanorods (Postprint)

    DTIC Science & Technology

    2017-07-17

    … imaging, therapeutics and sensors, to large area coatings, filters, and optical attenuators. Development of the latter technologies has been hindered by the lack of cost-effective … challenges the utilization of Au-NRs in a diverse array of technologies, ranging from therapeutics, imaging and sensors, to large area coatings, filters and …

  17. Integrated imaging sensor systems with CMOS active pixel sensor technology

    NASA Technical Reports Server (NTRS)

    Yang, G.; Cunningham, T.; Ortiz, M.; Heynssens, J.; Sun, C.; Hancock, B.; Seshadri, S.; Wrigley, C.; McCarty, K.; Pain, B.

    2002-01-01

    This paper discusses common approaches to CMOS APS technology, as well as specific results on the five-wire programmable digital camera-on-a-chip developed at JPL. The paper also reports recent research in the design, operation, and performance of APS imagers for several imager applications.

  18. Ionizing doses and displacement damage testing of COTS CMOS imagers

    NASA Astrophysics Data System (ADS)

    Bernard, Frédéric; Petit, Sophie; Courtade, Sophie

    2017-11-01

    CMOS sensors are beginning to be a credible alternative to CCD sensors in some space missions. However, the technology evolution of CMOS sensors is much faster than that of CCDs, so continuous technology evaluation is needed for CMOS imagers. Many commercial COTS (components off the shelf) CMOS sensors use organic filters, micro-lenses, and non-rad-hard technologies. An evaluation of the possibilities offered by such technologies is worthwhile before any custom development, and it can be obtained by testing commercial COTS imagers. This article presents the evolution of the electro-optical performance of off-the-shelf CMOS imagers after ionizing dose tests up to 50 krad(Si) and displacement damage tests (up to 10^11 p/cm^2 at 50 MeV). Dark current level and non-uniformity evolutions are compared and discussed. Relative spectral response measurements and their evolution with irradiation are also presented and discussed. Tests were performed on CNES detection benches.

  19. Electric potential and electric field imaging

    NASA Astrophysics Data System (ADS)

    Generazio, E. R.

    2017-02-01

    The technology and methods for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for "illuminating" volumes to be inspected with EFI. The baseline sensor technology (e-Sensor) and its construction, optional electric field generation (quasi-static generator), and current e-Sensor enhancements (ephemeral e-Sensor) are discussed. Demonstrations for structural, electronic, human, and memory applications are shown. This new EFI capability is demonstrated to reveal characterization of electric charge distribution, creating a new field of study embracing areas of interest including electrostatic discharge (ESD) mitigation, crime scene forensics, design and materials selection for advanced sensors, dielectric morphology of structures, tether integrity, organic molecular memory, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  20. Electric Potential and Electric Field Imaging with Applications

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2016-01-01

    The technology and techniques for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for "illuminating" volumes to be inspected with EFI. The baseline sensor technology, electric field sensor (e-sensor), and its construction, optional electric field generation (quasistatic generator), and current e-sensor enhancements (ephemeral e-sensor) are discussed. Demonstrations for structural, electronic, human, and memory applications are shown. This new EFI capability is demonstrated to reveal characterization of electric charge distribution, creating a new field of study that embraces areas of interest including electrostatic discharge mitigation, crime scene forensics, design and materials selection for advanced sensors, dielectric morphology of structures, inspection of containers, inspection for hidden objects, tether integrity, organic molecular memory, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  21. Image sensor pixel with on-chip high extinction ratio polarizer based on 65-nm standard CMOS technology.

    PubMed

    Sasagawa, Kiyotaka; Shishido, Sanshiro; Ando, Keisuke; Matsuoka, Hitoshi; Noda, Toshihiko; Tokuda, Takashi; Kakiuchi, Kiyomi; Ohta, Jun

    2013-05-06

    In this study, we demonstrate a polarization-sensitive pixel for a complementary metal-oxide-semiconductor (CMOS) image sensor based on 65-nm standard CMOS technology. With such a deep-submicron CMOS process, it is possible to pattern metal features smaller than the wavelengths of visible light using a metal wire layer. We designed and fabricated a metal wire grid polarizer on a 20 × 20 μm² pixel for an image sensor. An extinction ratio of 19.7 dB was observed at a wavelength of 750 nm.
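
    The extinction ratio quoted above follows the usual decibel definition; the sketch below shows the conversion, with the intensity ratio chosen to reproduce roughly 19.7 dB for illustration.

```python
import math

# The extinction ratio of a polarizer in dB is 10*log10 of the ratio of
# transmitted intensity for light polarized along the pass axis to that
# for light polarized across it. A 19.7 dB figure corresponds to an
# intensity ratio of roughly 93:1; the values below are illustrative.

def extinction_ratio_db(i_pass, i_block):
    return 10.0 * math.log10(i_pass / i_block)

ratio_db = extinction_ratio_db(93.3, 1.0)  # approximately 19.7 dB
```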

  22. The progress of sub-pixel imaging methods

    NASA Astrophysics Data System (ADS)

    Wang, Hu; Wen, Desheng

    2014-02-01

    This paper reviews the principles and characteristics of sub-pixel imaging technology, its current state of development in China and abroad, and the latest research developments. Sub-pixel imaging combines the high resolution of an optical remote sensor with flexible operating modes and miniaturization with no moving parts, making the imaging system well suited to space remote sensing, with very extensive application prospects. It is quite possibly a research direction for future space optical remote sensing technology.

  23. VTT's Fabry-Perot interferometer technologies for hyperspectral imaging and mobile sensing applications

    NASA Astrophysics Data System (ADS)

    Rissanen, Anna; Guo, Bin; Saari, Heikki; Näsilä, Antti; Mannila, Rami; Akujärvi, Altti; Ojanen, Harri

    2017-02-01

    VTT's Fabry-Perot interferometer (FPI) technology enables the creation of small and cost-efficient microspectrometers and hyperspectral imagers. These robust and lightweight sensors are currently finding their way into a variety of novel applications, including emerging medical products, automotive sensors, space instruments, and mobile sensing devices. This presentation gives an overview of our core FPI technologies and current advances in novel sensing applications, including recent mobile technology demonstrators of a hyperspectral iPhone and a mobile phone CO2 sensor, which aim to advance mobile spectroscopic sensing.

  24. Atmospheric turbulence and sensor system effects on biometric algorithm performance

    NASA Astrophysics Data System (ADS)

    Espinola, Richard L.; Leonard, Kevin R.; Byrd, Kenneth A.; Potvin, Guy

    2015-05-01

    Biometric technologies composed of electro-optical/infrared (EO/IR) sensor systems and advanced matching algorithms are being used in various force protection/security and tactical surveillance applications. To date, most of these sensor systems have been widely used in controlled conditions with varying success (e.g., short range, uniform illumination, cooperative subjects). However, the limiting conditions of such systems have yet to be fully studied for long range applications and degraded imaging environments. Biometric technologies used for long range applications will invariably suffer from the effects of atmospheric turbulence degradation. Atmospheric turbulence causes blur, distortion and intensity fluctuations that can severely degrade image quality of electro-optic and thermal imaging systems and, for the case of biometrics technology, translate to poor matching algorithm performance. In this paper, we evaluate the effects of atmospheric turbulence and sensor resolution on biometric matching algorithm performance. We use a subset of the Facial Recognition Technology (FERET) database and a commercial algorithm to analyze facial recognition performance on turbulence degraded facial images. The goal of this work is to understand the feasibility of long-range facial recognition in degraded imaging conditions, and the utility of camera parameter trade studies to enable the design of the next generation biometrics sensor systems.

  25. Image acquisition system using on sensor compressed sampling technique

    NASA Astrophysics Data System (ADS)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
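
    The measurement step of on-sensor compressed sampling can be sketched as below; the image size, measurement count, and random binary masks are illustrative assumptions, and the off-chip sparse reconstruction is omitted.

```python
import random

# Hedged sketch of compressed sensing (CS) at the sensor level: instead
# of reading out every pixel, the sensor outputs M random-mask sums of
# the N-pixel image (y = Phi @ x with a random binary Phi), cutting the
# raw data rate by N/M. Reconstruction would use a sparse solver
# off-chip; only the measurement step is sketched here.

random.seed(0)          # deterministic masks for this illustration
N = 64                  # pixels in the flattened image (assumed)
M = 16                  # compressed measurements, M << N (assumed)

image = [random.randint(0, 255) for _ in range(N)]
phi = [[random.randint(0, 1) for _ in range(N)] for _ in range(M)]

# Each measurement is the sum of the pixels selected by one random mask.
measurements = [sum(p * x for p, x in zip(row, image)) for row in phi]
```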

  26. Traffic Monitor

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Intelligent Vision Systems, Inc. (InVision) needed image acquisition technology that was reliable in bad weather for its TDS-200 Traffic Detection System. InVision researchers used information from NASA Tech Briefs and assistance from Johnson Space Center to finish the system. The NASA technology used was developed for Earth-observing imaging satellites: charge coupled devices, in which silicon chips convert light directly into electronic or digital images. The TDS-200 consists of sensors mounted above traffic on poles or span wires, enabling two sensors to view an intersection; a "swing and sway" feature to compensate for movement of the sensors; a combination of electronic shutter and gain control; and sensor output to an image digital signal processor, still frame video and optionally live video.

  27. High-density Schottky barrier IRCCD sensors for remote sensing applications

    NASA Astrophysics Data System (ADS)

    Elabd, H.; Tower, J. R.; McCarthy, B. M.

    1983-01-01

    It is pointed out that the ambitious goals envisaged for the next generation of space-borne sensors challenge the state-of-the-art in solid-state imaging technology. Studies are being conducted with the aim to provide focal plane array technology suitable for use in future Multispectral Linear Array (MLA) earth resource instruments. An important new technology for IR-image sensors involves the use of monolithic Schottky barrier infrared charge-coupled device arrays. This technology is suitable for earth sensing applications in which moderate quantum efficiency and intermediate operating temperatures are required. This IR sensor can be fabricated by using standard integrated circuit (IC) processing techniques, and it is possible to employ commercial IC grade silicon. For this reason, it is feasible to construct Schottky barrier area and line arrays with large numbers of elements and high-density designs. A Pd2Si Schottky barrier sensor for multispectral imaging in the 1 to 3.5 micron band is under development.

  28. Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion

    NASA Astrophysics Data System (ADS)

    Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei

    2018-06-01

    Infrared and visible light image fusion has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion technologies require registration before fusion because they use two cameras, yet the performance of registration technology still leaves room for improvement. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: using a beam-splitter prism, the coaxial light entering through a single lens is projected onto an infrared charge-coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied along with the signal acquisition and fusion process. A simulation experiment covering the entire chain of the optical system, signal acquisition, and signal fusion is constructed based on an imaging effect model, and a quality evaluation index is adopted to analyze the simulation results. The experimental results demonstrate that the proposed sensor device is effective and feasible.

  29. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    PubMed

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.
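
    The pH-imaging mode described above can be sketched with the Nernstian response of the sensing surface (about 59 mV per pH unit at room temperature); the potential values, reference point, and grid size below are illustrative assumptions.

```python
# Hedged sketch of the LAPS chemical-imaging idea: the measurement site
# is selected by where the light spot falls, and the local surface
# potential, which shifts by roughly 59 mV per pH unit at room
# temperature (the Nernstian slope), is converted to a pH value at each
# illuminated spot. All numbers here are illustrative.

NERNST_MV_PER_PH = 59.0

def ph_image(potential_map_mv, reference_mv, reference_ph):
    """Convert a grid of measured surface potentials (mV) to a pH map."""
    return [[reference_ph + (v - reference_mv) / NERNST_MV_PER_PH
             for v in row] for row in potential_map_mv]

potentials = [[0.0, 59.0], [118.0, -59.0]]   # mV relative to reference
phmap = ph_image(potentials, 0.0, 7.0)       # pH 7 at the reference
```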

  30. Smart sensors II; Proceedings of the Seminar, San Diego, CA, July 31, August 1, 1980

    NASA Astrophysics Data System (ADS)

    Barbe, D. F.

    1980-01-01

    Topics discussed include technology for smart sensors, smart sensors for tracking and surveillance, and techniques and algorithms for smart sensors. Papers are presented on the application of very large scale integrated circuits to smart sensors, imaging charge-coupled devices for deep-space surveillance, ultra-precise star tracking using charge coupled devices, and automatic target identification of blurred images with super-resolution features. Attention is also given to smart sensors for terminal homing, algorithms for estimating image position, and the computational efficiency of multiple image registration algorithms.

  31. Uncooled LWIR imaging: applications and market analysis

    NASA Astrophysics Data System (ADS)

    Takasawa, Satomi

    2015-05-01

    The evolution of infrared (IR) imaging sensor technology for the defense market has played an important role in developing the commercial market, as dual use of the technology has expanded. In particular, technologies for both pixel-pitch reduction and vacuum packaging have evolved drastically in the area of uncooled long-wave IR (LWIR; 8-14 μm wavelength region) imaging sensors, increasing the opportunity to create new applications. From a macroscopic point of view, the uncooled LWIR imaging market is divided into two areas: a high-end market, which requires uncooled LWIR imaging sensors with sensitivity as close as possible to that of cooled sensors, and a low-end market driven by miniaturization and price reduction. In the latter case especially, approaches toward the consumer market have recently appeared, such as applying uncooled LWIR imaging sensors to night vision for automobiles and smart phones. The appearance of such commodities will surely change existing business models. Further technological innovation is necessary to create a consumer market, and there will be room for other companies supplying components and materials, such as lens materials and getter materials, to enter the consumer market.

  12. Nanophotonic Image Sensors

    PubMed Central

    Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R. S.

    2016-01-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative way to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements in nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. PMID:27239941

  13. Further applications for mosaic pixel FPA technology

    NASA Astrophysics Data System (ADS)

    Liddiard, Kevin C.

    2011-06-01

    In previous papers to this SPIE forum the development of novel technology for next generation PIR security sensors has been described. This technology combines the mosaic pixel FPA concept with low cost optics and purpose-designed readout electronics to provide a higher performance and affordable alternative to current PIR sensor technology, including an imaging capability. Progressive development has resulted in increased performance and transition from conventional microbolometer fabrication to manufacture on 8 or 12 inch CMOS/MEMS fabrication lines. A number of spin-off applications have been identified. In this paper two specific applications are highlighted: high performance imaging IRFPA design and forest fire detection. The former involves optional design for small pixel high performance imaging. The latter involves cheap expendable sensors which can detect approaching fire fronts and send alarms with positional data via mobile phone or satellite link. We also introduce to this SPIE forum the application of microbolometer IR sensor technology to IoT, the Internet of Things.

  14. Proceedings of the Augmented VIsual Display (AVID) Research Workshop

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K. (Editor); Sweet, Barbara T. (Editor)

    1993-01-01

    The papers, abstracts, and presentations were presented at a three-day workshop focused on sensor modeling and simulation, and on image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.

  15. Fusion: ultra-high-speed and IR image sensors

    NASA Astrophysics Data System (ADS)

    Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.

    2015-08-01

    Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collision, plasma, spark discharge, an air bag in a car accident and a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure the high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor. Each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels. Image signals stored in each pixel are read out after an image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps 1). However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside-illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency 2). The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the greater wiring freedom on the front side 3). The BSI structure has another advantage: it presents fewer difficulties in attaching an additional layer, such as a scintillator, on the backside. This paper proposes the development of an ultra-high-speed IR image sensor combining advanced nanotechnologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, with a discussion of integration issues.
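
    The per-pixel burst storage described above can be sketched as a ring buffer that is frozen at trigger time and read out afterwards. The class and parameters below are an illustrative model, not the sensor's actual analog circuit:

```python
# Toy model of an in-situ storage pixel: every pixel keeps its last N
# samples in a ring buffer of on-pixel memory elements, so a full burst
# is already stored when capture stops. Names here are illustrative.

class InSituStoragePixel:
    def __init__(self, depth):
        self.depth = depth          # number of on-pixel memory elements
        self.memory = [0] * depth   # analog storage, modeled as numbers
        self.index = 0              # next memory element to overwrite

    def sample(self, signal):
        """Record one frame's signal; the oldest sample is overwritten."""
        self.memory[self.index] = signal
        self.index = (self.index + 1) % self.depth

    def read_out(self):
        """After capture stops, return samples in chronological order."""
        return self.memory[self.index:] + self.memory[:self.index]

pixel = InSituStoragePixel(depth=4)
for frame_signal in [10, 20, 30, 40, 50, 60]:  # six frames, depth four
    pixel.sample(frame_signal)
print(pixel.read_out())  # → [30, 40, 50, 60], the last four frames
```

    Because every pixel records in parallel, the achievable frame rate is limited by the pixel-internal transfer, not by reading data off-chip; readout happens only once, after the event.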

  16. Research-grade CMOS image sensors for remote sensing applications

    NASA Astrophysics Data System (ADS)

    Saint-Pe, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Martin-Gonthier, Philippe; Corbiere, Franck; Belliot, Pierre; Estribeau, Magali

    2004-11-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA and ESA). Throughout the 90s, and thanks to their steadily improving performance, CIS have been used successfully for more and more demanding space applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this paper will present the existing and foreseen ways to reach high-level electro-optic performance for CIS. The developments and performances of CIS prototypes built using an imaging CMOS process will be presented in the corresponding section.

  17. The Solid State Image Sensor's Contribution To The Development Of Silicon Technology

    NASA Astrophysics Data System (ADS)

    Weckler, Gene P.

    1985-12-01

    Until recently, a solid-state image sensor with full television resolution was a dream. However, the dream of a solid state image sensor has been a driving force in the development of silicon technology for more than twenty-five years. There are probably many in the mainstream of semiconductor technology who would argue with this; however, the solid state image sensor was conceived years before the invention of the semiconductor RAM or the microprocessor (i.e., even before the invention of the integrated circuit). No other potential application envisioned at that time required such complexity. How could anyone have hoped in 1960 to make a semiconductor chip containing half-a-million picture elements, each capable of resolving eight to twelve bits of information and of readout rates in the tens of mega-pixels per second? As early as 1960, arrays of p-n junctions were being investigated as the optical targets in vidicon tubes, replacing the photoconductive targets. It took silicon technology several years to catch up with these dreamers.

  18. Nanophotonic Image Sensors.

    PubMed

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative way to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements in nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. EDITORIAL: Molecular Imaging Technology

    NASA Astrophysics Data System (ADS)

    Asai, Keisuke; Okamoto, Koji

    2006-06-01

    'Molecular Imaging Technology' focuses on image-based techniques using nanoscale molecules as sensor probes to measure spatial variations of various species (molecular oxygen, singlet oxygen, carbon dioxide, nitric monoxide, etc) and physical properties (pressure, temperature, skin friction, velocity, mechanical stress, etc). This special feature, starting on page 1237, contains selected papers from The International Workshop on Molecular Imaging for Interdisciplinary Research, sponsored by the Ministry of Education, Culture, Sports, Science and Technology (MEXT) in Japan, which was held at the Sendai Mediatheque, Sendai, Japan, on 8-9 November 2004. The workshop was held as a sequel to the MOSAIC International Workshop held in Tokyo in 2003, to summarize the outcome of the 'MOSAIC Project', a five-year interdisciplinary project supported by the Techno-Infrastructure Program under the Special Coordination Fund for the Promotion of Science and Technology, to develop molecular sensor technology for aero-thermodynamic research. The workshop focused on molecular imaging technology and its applications to interdisciplinary research areas. More than 110 people attended, from research fields as varied as aerospace engineering, automotive engineering, radiotechnology, fluid dynamics, bio-science/engineering and medical engineering. The purpose of the workshop was to stimulate intermixing of these interdisciplinary fields for the further development of molecular sensor and imaging technology. It is our pleasure to publish the seven papers selected from our workshop as a special feature in Measurement Science and Technology. We will be happy if this issue inspires people to explore the future direction of molecular imaging technology for interdisciplinary research.

  20. Autonomous chemical and biological miniature wireless-sensor

    NASA Astrophysics Data System (ADS)

    Goldberg, Bar-Giora

    2005-05-01

    The presentation discusses a new concept and a paradigm shift in biological, chemical and explosive sensor system design and deployment: from large, heavy, centralized and expensive systems to distributed wireless sensor networks utilizing miniature platforms (nodes) that are lightweight, low cost and wirelessly connected. These new systems are possible due to the emergence and convergence of new innovative radio, imaging, networking and sensor technologies. Miniature integrated radio-sensor networks are a technology whose time has come. These network systems are based on large numbers of distributed low cost and short-range wireless platforms that sense and process their environment and communicate data through a network to a command center. The recent emergence of chemical and explosive sensor technology based on silicon nanostructures, coupled with the fast evolution of low-cost CMOS imagers, low power DSP engines and integrated radio chips, has created an opportunity to realize the vision of autonomous wireless networks. These threat detection networks will perform sophisticated analysis at the sensor node and convey alarm information up the command chain. Sensor networks of this type are expected to revolutionize the ability to detect and locate biological, chemical, or explosive threats. The ability to distribute large numbers of low-cost sensors over large areas enables these devices to be close to the targeted threats, and therefore improves detection efficiency and enables rapid counter responses. These sensor networks will be used for homeland security, shipping container monitoring, and other applications such as laboratory medical analysis, drug discovery, automotive, environmental and/or in-vivo monitoring. Avaak's system concept is to image a chromatic biological, chemical and/or explosive sensor with a digital imager, analyze the images and distribute alarm or image data wirelessly through the network. All the imaging, processing and communications take place within the miniature, low cost distributed sensor platforms. This concept, however, presents a significant challenge due to the combination and convergence of required new technologies, as mentioned above. Passive biological and chemical sensors with very high sensitivity, which require no assaying, are in development using a technique to optically and chemically encode silicon wafers with tailored nanostructures. The silicon wafer is patterned with nano-structures designed to change colors and patterns when exposed to the target analytes (TICs, TIMs, VOC). A small video camera detects the color and pattern changes on the sensor. To determine if an alarm condition is present, an on-board DSP processor, using specialized image processing algorithms and statistical analysis, determines whether color gradient changes occurred on the sensor array. These sensors can detect several agents simultaneously. This system is currently under development by Avaak, with funding from DARPA through an SBIR grant.
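
    The color-gradient alarm test described above can be sketched as a per-patch comparison against a baseline image. The threshold, patch layout and function names below are invented for illustration, not Avaak's actual algorithm:

```python
# Sketch of colorimetric change detection for a chromatic sensor array:
# compare each sensor patch's mean RGB against a clean-air baseline and
# raise an alarm when the color shift exceeds a threshold.

def mean_rgb(patch):
    """Average (R, G, B) over a list of pixel tuples."""
    n = len(patch)
    return tuple(sum(p[c] for p in patch) / n for c in range(3))

def color_shift(baseline_patch, current_patch):
    """Euclidean distance between mean colors of two patch images."""
    b, c = mean_rgb(baseline_patch), mean_rgb(current_patch)
    return sum((bi - ci) ** 2 for bi, ci in zip(b, c)) ** 0.5

def alarm(baseline, current, threshold=30.0):
    """Return indices of sensor patches whose color changed noticeably."""
    return [i for i, (b, c) in enumerate(zip(baseline, current))
            if color_shift(b, c) > threshold]

baseline = [[(120, 120, 120)] * 4, [(120, 120, 120)] * 4]
exposed  = [[(120, 120, 120)] * 4, [(80, 160, 120)] * 4]  # patch 1 shifted
print(alarm(baseline, exposed))  # → [1]
```

    Averaging over each patch before thresholding is what makes this cheap enough for an on-node DSP: per-frame work is a handful of sums rather than full-image processing.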

  1. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayali, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  2. Advances in miniature spectrometer and sensor development

    NASA Astrophysics Data System (ADS)

    Malinen, Jouko; Rissanen, Anna; Saari, Heikki; Karioja, Pentti; Karppinen, Mikko; Aalto, Timo; Tukkiniemi, Kari

    2014-05-01

    Miniaturization and cost reduction of spectrometer and sensor technologies has great potential to open up new application areas and business opportunities for analytical technology in hand-held, mobile and on-line applications. Advances in microfabrication have resulted in high-performance MEMS and MOEMS devices for spectrometer applications. Many other enabling technologies are useful for miniature analytical solutions, such as silicon photonics, nanoimprint lithography (NIL), system-on-chip and system-on-package techniques for the integration of electronics and photonics, 3D printing, powerful embedded computing platforms, networked solutions, as well as advances in chemometric modeling. This paper summarizes recent work on spectrometer and sensor miniaturization at VTT Technical Research Centre of Finland. Fabry-Perot interferometer (FPI) tunable filter technology has been developed in two technical versions: piezo-actuated FPIs have been applied to miniature hyperspectral imaging needs in lightweight UAV and nanosatellite applications, chemical imaging, and medical applications, while microfabricated MOEMS FPIs have been developed as cost-effective sensor platforms for visible, NIR and IR applications. Further examples of sensor miniaturization are discussed, including a system-on-package sensor head for a mid-IR gas analyzer, roll-to-roll printed Surface Enhanced Raman Scattering (SERS) technology, and a UV-imprinted waveguide sensor for formaldehyde detection.
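
    A tunable FPI passes wavelengths at which the mirror gap equals an integer number of half-wavelengths, so scanning the gap scans the passband. A minimal sketch of the idealized Airy transmission (normal incidence, lossless mirrors; all values illustrative):

```python
import math

# Idealized Airy transmission of a Fabry-Perot filter: T peaks where the
# round-trip phase is a multiple of 2*pi, i.e. gap d = m * lambda / 2
# (normal incidence, unit refractive index, lossless mirrors of
# reflectance R). Tuning d with a piezo or MOEMS actuator tunes the peak.

def fpi_transmission(wavelength_nm, gap_nm, reflectance):
    phase = 2 * math.pi * (2 * gap_nm) / wavelength_nm  # round-trip phase
    finesse_coeff = 4 * reflectance / (1 - reflectance) ** 2
    return 1.0 / (1.0 + finesse_coeff * math.sin(phase / 2) ** 2)

# A 500 nm gap transmits 1000 nm (order m=1) and 500 nm (m=2) fully,
# and strongly rejects wavelengths between the orders.
print(fpi_transmission(1000.0, 500.0, 0.9))  # ~1.0 at resonance
print(fpi_transmission(800.0, 500.0, 0.9))   # far below 1 off resonance
```

    Higher mirror reflectance narrows the passband (higher finesse) at the cost of tighter fabrication tolerances, which is the central trade-off in the MOEMS versions.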

  3. Evaluation of physical properties of different digital intraoral sensors.

    PubMed

    Al-Rawi, Wisam; Teich, Sorin

    2013-09-01

    Digital technologies provide clinically acceptable results comparable to traditional films while having other advantages such as the ability to store and manipulate images, immediate evaluation of the image diagnostic quality, possible reduction in patient radiation exposure, and so on. The purpose of this paper is to present the results of the evaluation of the physical design of eight CMOS digital intraoral sensors. Sensors tested included: XDR (Cyber Medical Imaging, Los Angeles, CA, USA), RVG 6100 (Carestream Dental LLC, Atlanta, GA, USA), Platinum (DEXIS LLC., Hatfield, PA, USA), CDR Elite (Schick Technologies, Long Island City, NY, USA), ProSensor (Planmeca, Helsinki, Finland), EVA (ImageWorks, Elmsford, NY, USA), XIOS Plus (Sirona, Bensheim, Germany), and GXS-700 (Gendex Dental Systems, Hatfield, PA, USA). The sensors were evaluated for cable configuration, connectivity interface, presence of back-scattering radiation shield, plate thickness, active sensor area, and comparing the active imaging area to the outside casing and to conventional radiographic films. There were variations among the physical design of different sensors. For most parameters tested, a lack of standardization exists in the industry. The results of this study revealed that these details are not always available through the material provided by the manufacturers and are often not advertised. For all sensor sizes, active imaging area was smaller compared with conventional films. There was no sensor in the group that had the best physical design. Data presented in this paper establishes a benchmark for comparing the physical design of digital intraoral sensors.

  4. Design and fabrication of vertically-integrated CMOS image sensors.

    PubMed

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors.
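
    The logarithmic active pixels mentioned above are what extend the prototype's dynamic range: output grows with the logarithm of photocurrent instead of clipping at a full-well limit. A toy response model (all constants invented, not the prototype's measured values):

```python
import math

# Toy comparison of linear vs logarithmic pixel response. A linear pixel
# clips once the signal reaches full scale; a logarithmic pixel outputs
# roughly V ~ Vt * ln(I / I0) and keeps responding over many decades.

def linear_pixel(photocurrent, full_scale=1.0):
    return min(photocurrent / full_scale, 1.0)   # hard clip at full scale

def log_pixel(photocurrent, i_dark=1e-6, vt=0.026):
    return vt * math.log(photocurrent / i_dark)  # no hard clipping

intensities = [1e-5, 1e-3, 1e-1, 10.0]           # four decades apart
print([round(linear_pixel(i), 3) for i in intensities])  # saturates early
print([round(log_pixel(i), 3) for i in intensities])     # stays distinct
```

    The same compression raises the dark limit, which is why the paper reports both dynamic range and dark limit when comparing the prototype against conventional sensors.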

  5. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    NASA Astrophysics Data System (ADS)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still places a fundamental limit on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (e.g., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of √N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests are performed on the image sensor's innovative high dynamic range technology to determine its effects on range accuracy. As expected, experimental results have shown that the sensor provides a trade-off between dynamic range and range accuracy.
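
    Averaging N spatially uncorrelated speckle realizations reduces the random error of a spot-centroid estimate by roughly √N under an independent-noise assumption, which can be checked with a quick Monte Carlo sketch (the noise model and scales below are illustrative assumptions, not the paper's data):

```python
import random

# Monte Carlo sketch of speckle averaging: each of N spatially
# uncorrelated spot profiles yields a centroid estimate corrupted by
# independent speckle noise; averaging the N estimates shrinks the RMS
# error by about sqrt(N).

def centroid_rms_error(n_profiles, noise=1.0, trials=2000, seed=7):
    rng = random.Random(seed)
    sq_errors = []
    for _ in range(trials):
        # mean of N independent speckle-perturbed centroid estimates
        est = sum(rng.gauss(0.0, noise)
                  for _ in range(n_profiles)) / n_profiles
        sq_errors.append(est * est)
    return (sum(sq_errors) / trials) ** 0.5  # RMS error of the average

rms_1 = centroid_rms_error(1)
rms_16 = centroid_rms_error(16)
print(round(rms_1 / rms_16, 1))  # close to sqrt(16) = 4
```

    The sketch also shows the cost the abstract mentions: the N profiles must be captured sequentially at different translator positions, so acquisition time grows linearly with N.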

  6. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    PubMed Central

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  7. Research-grade CMOS image sensors for demanding space applications

    NASA Astrophysics Data System (ADS)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2004-06-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). Throughout the 90s, and thanks to their steadily improving performance, CIS have been used successfully for more and more demanding applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optic performance for CIS. The developments of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  8. Research-grade CMOS image sensors for demanding space applications

    NASA Astrophysics Data System (ADS)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2017-11-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). Throughout the 90s, and thanks to their steadily improving performance, CIS have been used successfully for more and more demanding applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optic performance for CIS. The developments of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  9. Autonomous vision networking: miniature wireless sensor networks with imaging technology

    NASA Astrophysics Data System (ADS)

    Messinger, Gioia; Goldberg, Giora

    2006-09-01

    The recent emergence of integrated PicoRadio technology and the rise of low power, low cost, System-On-Chip (SOC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), have created a unique opportunity to achieve the goal of deploying large-scale, low cost, intelligent, ultra-low power distributed wireless sensor networks for the visualization of the environment. Of all sensors, vision is the most desired, but its applications in distributed sensor networks have been elusive so far. Not any more. The practicality and viability of ultra-low power vision networking has been proven and its applications are countless: from security and chemical analysis to industrial monitoring, asset tracking and visual recognition, vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous: specifically PicoRadios, CMOS imagers, imaging DSP, networking, and overall wireless sensor network (WSN) system concepts. The paradigm shift from large, centralized and expensive sensor platforms to small, low cost, distributed sensor networks is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic and magnetic, and plans to deploy it for use in military and commercial applications. In comparison to other sensors, imagers produce large data files that require pre-processing and a certain level of compression before they are transmitted to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes. These changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor. Image processing at the sensor node level may also be required for applications in security, asset management and process control. Due to the data bandwidth requirements posed on the network by video sensors, new networking protocols or video extensions to existing standards (e.g. Zigbee) are required. To this end, Avaak has designed and implemented an ultra-low power networking protocol designed to carry large volumes of data through the network. The low power wireless sensor nodes that will be discussed include a chemical sensor integrated with a CMOS digital camera, a controller, a DSP processor and a radio communication transceiver, which enables relaying of an alarm or image message to a central station. In addition to communications, identification is very desirable; hence location awareness will later be incorporated into the system in the form of Time-Of-Arrival triangulation, via wideband signaling. While the wireless imaging kernel already exists, specific applications for surveillance and chemical detection are under development by Avaak, as part of a co-funded program from ONR and DARPA. Avaak is also designing vision networks for commercial applications, some of which are undergoing initial field tests.

  10. Flexible phosphor sensors: a digital supplement or option to rigid sensors.

    PubMed

    Glazer, Howard S

    2014-01-01

    An increasing number of dental practices are upgrading from film radiography to digital radiography, for reasons that include faster image processing, easier image access, better patient education, enhanced data storage, and improved office productivity. Most practices that have converted to digital technology use rigid, or direct, sensors. Another digital option is flexible phosphor sensors, also called indirect sensors or phosphor storage plates (PSPs). Flexible phosphor sensors can be advantageous for use with certain patients who may be averse to direct sensors, and they can deliver a larger image area. Additionally, sensor cost for replacement PSPs is considerably lower than for hard sensors. As such, flexible phosphor sensors appear to be a viable supplement or option to direct sensors.

  11. Target Detection over the Diurnal Cycle Using a Multispectral Infrared Sensor.

    PubMed

    Zhao, Huijie; Ji, Zheng; Li, Na; Gu, Jianrong; Li, Yansong

    2016-12-29

    When detecting a target over the diurnal cycle, a conventional infrared thermal sensor might lose the target due to thermal crossover, which can happen at any time of day when the infrared image contrast between target and background becomes indistinguishable due to temperature variation. In this paper, the benefits of using a multispectral infrared sensor over the diurnal cycle are shown. Firstly, a brief theoretical analysis is presented of how thermal crossover affects a conventional thermal sensor, the conditions under which thermal crossover occurs, and why mid-infrared (3~5 μm) multispectral technology is effective. Secondly, we describe how the prototype design and multispectral technology are employed to help solve the thermal crossover detection problem. Thirdly, several targets are set up outdoors and imaged in a field experiment over a 24-h period. The experimental results show that the multispectral infrared imaging system can enhance the contrast of the detected images and effectively overcome the failure of the conventional infrared sensor during the diurnal cycle, which is of great significance for infrared surveillance applications.
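
    The argument can be illustrated numerically: at thermal crossover the broadband radiances of target and background match, yet differing spectral emissivities leave contrast in individual sub-bands. The radiance values below are invented for illustration:

```python
# Sketch of why a multispectral MWIR sensor survives thermal crossover:
# the broadband (summed) radiances of target and background are equal,
# but per-band differences remain.

def band_contrast(target_bands, background_bands):
    """Weber-style contrast |T - B| / B in each spectral band."""
    return [abs(t - b) / b for t, b in zip(target_bands, background_bands)]

# Per-band radiance (arbitrary units) in three MWIR sub-bands.
target     = [1.0, 2.0, 3.0]  # sums to 6.0
background = [2.0, 2.0, 2.0]  # also sums to 6.0 -> broadband crossover

broadband = abs(sum(target) - sum(background)) / sum(background)
print(broadband)                          # 0.0: a broadband sensor is blind
print(band_contrast(target, background))  # [0.5, 0.0, 0.5]: bands still see it
```

    A detector can then fuse the per-band contrasts (e.g. take the maximum over bands) so that the target stays visible whenever any sub-band retains contrast.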

  12. Target Detection over the Diurnal Cycle Using a Multispectral Infrared Sensor

    PubMed Central

    Zhao, Huijie; Ji, Zheng; Li, Na; Gu, Jianrong; Li, Yansong

    2016-01-01

    When detecting a target over the diurnal cycle, a conventional infrared thermal sensor can lose the target during thermal crossover, which may occur at any time of day when temperature variation renders the infrared contrast between target and background in a scene indistinguishable. This paper demonstrates the benefits of using a multispectral infrared sensor over the diurnal cycle. First, a brief theoretical analysis is presented of how thermal crossover degrades a conventional thermal sensor, the conditions under which crossover occurs, and why mid-infrared (3~5 μm) multispectral technology is effective. Second, the prototype design is described, showing how multispectral technology helps solve the thermal crossover detection problem. Third, several targets were set up outdoors and imaged in a field experiment over a 24-h period. The experimental results show that the multispectral infrared imaging system enhances the contrast of the detected images and effectively overcomes the failure of conventional infrared sensors during the diurnal cycle, which is of great significance for infrared surveillance applications. PMID:28036073

  13. BCB Bonding Technology of Back-Side Illuminated CMOS Device

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Jiang, G. Q.; Jia, S. X.; Shi, Y. M.

    2018-03-01

    The back-side illuminated CMOS (BSI) sensor is a key device in spaceborne hyperspectral imaging technology. Compared with traditional devices, BSI sensors simplify the path of incident light and flatten the spectral response, meeting the requirements of quantitative hyperspectral imaging applications. Wafer bonding is the basic technology and a key process in the fabrication of BSI sensors. Six-inch bonding of a CMOS wafer to a glass wafer was achieved, exploiting the low bonding temperature and high stability of BCB. The influence of BCB thickness on bonding strength was studied. By adjusting the bonding conditions, wafer bonds with high strength, high stability, and no bubbles were obtained.

  14. Electric Potential and Electric Field Imaging with Dynamic Applications: 2017 Research Award Innovation

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2017-01-01

    The technology and methods for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used to illuminate volumes to be inspected with EFI. The baseline sensor technology (e-Sensor) and its construction, optional electric field generation (quasi-static generator), and current e-Sensor enhancements (ephemeral e-Sensor) are discussed. Critical design elements of current linear and real-time two-dimensional (2D) measurement systems are highlighted, and the development of a three-dimensional (3D) EFI system is presented. Demonstrations for structural, electronic, human, and memory applications are shown. Recent work demonstrates that phonons may be used to create and annihilate electric dipoles within structures. Phonon-induced dipoles are ephemeral, and their polarization, strength, and location may be quantitatively characterized by EFI, providing a new subsurface phonon-EFI imaging technology. Initial results from real-time imaging of combustion and ion flow, and their measurement complications, are discussed. These new EFI capabilities are demonstrated to characterize electric charge distribution, creating a new field of study embracing areas of interest including electrostatic discharge (ESD) mitigation, crime scene forensics, design and materials selection for advanced sensors, combustion science, on-orbit space potential, container inspection, remote characterization of electronic circuits and level of activation, dielectric morphology of structures, tether integrity, organic molecular memory, atmospheric science, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  15. Advanced electro-optical imaging techniques. [conference papers on sensor technology applicable to Large Space Telescope program

    NASA Technical Reports Server (NTRS)

    Sobieski, S. (Editor); Wampler, E. J. (Editor)

    1973-01-01

    The papers presented at the symposium deal with the present state of sensors that may be applicable to the Large Space Telescope (LST) program. Several aspects of sensors are covered, including the properties of photocathodes and operational imaging camera tubes.

  16. Overview of CMOS process and design options for image sensor dedicated to space applications

    NASA Astrophysics Data System (ADS)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of high-volume markets (mobile phones, digital cameras...), CMOS technologies for image sensors have improved significantly. New process flows have appeared that optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with space applications as the target. These three technologies are standard, improved, and sensor-optimized CMOS processes in the 0.35-μm generation. Measurements are focused on quantum efficiency, dark current, conversion gain, and noise. Other measurements, such as the Modulation Transfer Function (MTF) and crosstalk, are reported in [1]. The results have been compared, and three categories of CMOS process for image sensors are identified. Radiation tolerance has also been studied for the improved CMOS process, with the aim of hardening the imager by design. Results at 4, 15, 25, and 50 krad demonstrate good ionizing-dose radiation tolerance when specific design techniques are applied.
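
    Conversion gain, one of the parameters measured above, is commonly extracted with the standard mean-variance (photon transfer) method: for a shot-noise-limited pixel the signal variance grows linearly with the mean, and the slope is the conversion gain. A minimal sketch of that general method follows; the function names and example numbers are illustrative, not the paper's data.

```python
def conversion_gain_dn_per_e(mean_dn, var_dn, read_noise_var_dn=0.0):
    """Photon-transfer estimate: for shot-noise-limited signal,
    var_DN = K * mean_DN (after subtracting read-noise variance),
    so K = var/mean in DN per electron."""
    return (var_dn - read_noise_var_dn) / mean_dn

def conversion_gain_e_per_dn(mean_dn, var_dn, read_noise_var_dn=0.0):
    """Reciprocal form, in electrons per DN."""
    return mean_dn / (var_dn - read_noise_var_dn)
```

    For example, a flat-field mean of 1000 DN with a shot-noise variance of 50 DN² implies 0.05 DN/e-, i.e. 20 e-/DN.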

  17. The application of remote sensing techniques: Technical and methodological issues

    NASA Technical Reports Server (NTRS)

    Polcyn, F. C.; Wagner, T. W.

    1974-01-01

    Capabilities and limitations of modern imaging electromagnetic sensor systems are outlined, and the products of such systems are compared with those of the traditional aerial photographic system. Focus is given to the interface between the rapidly developing remote sensing technology and the information needs of operational agencies, and communication gaps are shown to retard early adoption of the technology by these agencies. An assessment is made of the current status of imaging remote sensors and their potential for the future. Public sources of remote sensor data and several cost comparisons are included.

  18. Low-cost compact thermal imaging sensors for body temperature measurement

    NASA Astrophysics Data System (ADS)

    Han, Myung-Soo; Han, Seok Man; Kim, Hyo Jin; Shin, Jae Chul; Ahn, Mi Sook; Kim, Hyung Won; Han, Yong Hee

    2013-06-01

    This paper presents a 32×32 microbolometer thermal imaging sensor for human body temperature measurement. Wafer-level vacuum packaging technology yields a low-cost, compact imaging sensor chip. The microbolometer uses a V-W-O film as the sensing material, and the ROIC was designed in a 0.35-μm UMC CMOS process. Thermal images of a human face and a hand acquired with an f/1 lens demonstrate the sensor's potential for commercial body-temperature measurement.

  19. The challenge of sCMOS image sensor technology to EMCCD

    NASA Astrophysics Data System (ADS)

    Chang, Weijing; Dai, Fang; Na, Qiyue

    2018-02-01

    In the field of low-illumination image sensors, the noise of the latest scientific-grade CMOS (sCMOS) image sensors approaches that of EMCCDs, and the industry considers sCMOS a potential competitor to, or even replacement for, EMCCD. We therefore selected several typical sCMOS and EMCCD image sensors and cameras and compared their performance parameters. The results show that the signal-to-noise ratio of sCMOS is close to that of EMCCD, and its other parameters are superior. However, signal-to-noise ratio is critical for low-illumination imaging, and the actual imaging results of sCMOS are not ideal. EMCCD remains the first choice for high-performance applications.
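
    The signal-to-noise comparison above can be made concrete with a standard shot-noise camera model: an EMCCD suppresses read noise by its multiplication gain G but pays an excess-noise factor F ≈ √2, while sCMOS has no excess noise but a finite read noise. A minimal sketch, with all parameter values illustrative rather than taken from the paper:

```python
import math

def snr_scmos(photons, qe=0.8, dark_e=0.2, read_e=1.6):
    """SNR of an sCMOS pixel: shot + dark + read noise, no excess noise."""
    signal = qe * photons
    return signal / math.sqrt(signal + dark_e + read_e**2)

def snr_emccd(photons, qe=0.9, dark_e=0.01, read_e=50.0, gain=300.0,
              excess=math.sqrt(2)):
    """SNR of an EMCCD pixel: read noise is divided by the multiplication
    gain, while the excess-noise factor inflates shot and dark noise."""
    signal = qe * photons
    return signal / math.sqrt(excess**2 * (signal + dark_e) + (read_e / gain)**2)
```

    Under this model the EMCCD wins at photon-starved signal levels, while sCMOS overtakes it once shot noise dominates, consistent with the abstract's conclusion that EMCCD remains preferable at the lowest light levels.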

  20. HPT: A High Spatial Resolution Multispectral Sensor for Microsatellite Remote Sensing

    PubMed Central

    Takahashi, Yukihiro; Sakamoto, Yuji; Kuwahara, Toshinori

    2018-01-01

    Although nano/microsatellites have great potential as remote sensing platforms, the spatial and spectral resolutions of an optical payload instrument are limited. In this study, a high spatial resolution multispectral sensor, the High-Precision Telescope (HPT), was developed for the RISING-2 microsatellite. The HPT has four image sensors: three in the visible region of the spectrum used for the composition of true color images, and a fourth in the near-infrared region, which employs liquid crystal tunable filter (LCTF) technology for wavelength scanning. Band-to-band image registration methods have also been developed for the HPT and implemented in the image processing procedure. The processed images were compared with other satellite images, and proven to be useful in various remote sensing applications. Thus, LCTF technology can be considered an innovative tool that is suitable for future multi/hyperspectral remote sensing by nano/microsatellites. PMID:29463022

  1. A data-management system using sensor technology and wireless devices for port security

    NASA Astrophysics Data System (ADS)

    Saldaña, Manuel; Rivera, Javier; Oyola, Jose; Manian, Vidya

    2014-05-01

    Sensor technologies such as infrared sensors, hyperspectral imaging, and video camera surveillance have proven viable in port security. Drawing from sources such as infrared sensor data, digital camera images, and processed hyperspectral images, this article explores the implementation of a real-time data delivery system. In an effort to improve the manner in which anomaly detection data is delivered to interested parties in port security, this system explores how a client-server architecture can provide protected access to data, reports, and device status. Sensor data and hyperspectral image data are kept in a monitored directory, where the system links them to existing users in the database. Since this system renders processed hyperspectral images that are dynamically added to the server, which often occupy a large amount of space, the resolution of these images is trimmed down to around 1024×768 pixels. Changes in any image, or data modifications originating from any sensor, trigger a message to all users associated with that source. These messages are sent to the corresponding users through automatic email generation and through push notifications using Google Cloud Messaging for Android. Moreover, this paper presents the complete architecture for data reception from the sensors, processing, and storage, and discusses how users of this system, such as port security personnel, can benefit from the service by receiving secure real-time notifications when their designated sensors detect anomalies and by having remote access to results from processed hyperspectral imagery relevant to their assigned posts.
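
    The monitored-directory trigger described in the abstract can be sketched with a simple polling scan. The function names and the notification hook below are hypothetical placeholders, not the authors' implementation (which delivered messages via email and Google Cloud Messaging):

```python
import os

def scan_dir(path):
    """Snapshot mapping each file in the directory to its last-modified time."""
    return {name: os.stat(os.path.join(path, name)).st_mtime
            for name in os.listdir(path)}

def detect_changes(before, after):
    """Files that are new or modified between two snapshots."""
    return sorted(name for name, mtime in after.items()
                  if name not in before or before[name] < mtime)

def notify(users, filename):
    """Hypothetical delivery hook; a real system would send email / a push."""
    return ["notify %s: new sensor data %s" % (u, filename) for u in users]
```

    Comparing successive snapshots is enough to associate each new sensor file with its subscribed users before handing off to the actual delivery channel.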

  2. 3-D surface scan of biological samples with a push-broom imaging spectrometer

    USDA-ARS's Scientific Manuscript database

    The food industry is always on the lookout for sensing technologies for rapid and nondestructive inspection of food products. Hyperspectral imaging technology integrates both imaging and spectroscopy into unique imaging sensors. Its application for food safety and quality inspection has made signifi...

  3. SWIR hyperspectral imaging detector for surface residues

    NASA Astrophysics Data System (ADS)

    Nelson, Matthew P.; Mangold, Paul; Gomer, Nathaniel; Klueva, Oksana; Treado, Patrick

    2013-05-01

    ChemImage has developed a SWIR hyperspectral imaging (HSI) sensor that uses hyperspectral imaging for wide-area surveillance and standoff detection of surface residues. Existing detection technologies often require close proximity for sensing or detection, endangering operators and costly equipment. Furthermore, most existing sensors do not support autonomous, real-time, mobile-platform-based detection of threats. The SWIR HSI sensor provides real-time standoff detection of surface residues, with wide-area surveillance and HSI capability enabled by liquid crystal tunable filter technology. Easy-to-use detection software with a simple, intuitive user interface produces automated alarms and real-time display of threat presence and type. The system has the potential to be used for the detection of a variety of threats, including chemicals and illicit drug substances, and allows for easy updates in the field for detection of new hazardous materials. SWIR HSI technology could be used by law enforcement for standoff screening of suspicious locations and vehicles in pursuit of illegal labs, or by combat engineers to support route-clearance applications, ultimately saving the lives of soldiers and civilians. In this paper, results from a SWIR HSI sensor are discussed, including detection of various materials in bulk form as well as residue amounts on vehicles, people, and other surfaces.

  4. Study the performance of star sensor influenced by space radiation damage of image sensor

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Yudong; Wen, Lin; Guo, Qi; Zhang, Xingyao

    2018-03-01

    The star sensor is an essential component of a spacecraft attitude control system. Space radiation can cause star sensor performance degradation, abnormal operation, and reduced attitude measurement accuracy and reliability. Many studies have been dedicated to radiation effects on charge-coupled device (CCD) image sensors, but fewer focus on the radiation effects on the star sensor itself. The innovation of this paper is to study radiation effects from the device level up to the system level. The influence of the degradation of radiation-sensitive CCD parameters on the performance parameters of the star sensor is studied. The correlation among proton radiation effects, the non-uniformity noise of the CCD image sensor, and the performance parameters of the star sensor is analyzed. This paper establishes a foundation for the study of error prediction and correction technology for on-orbit star sensor attitude measurement, and provides a theoretical basis for the design of high-performance star sensors.

  5. CMOS Image Sensors for High Speed Applications.

    PubMed

    El-Desouki, Munir; Deen, M Jamal; Fang, Qiyin; Liu, Louis; Tse, Frances; Armstrong, David

    2009-01-01

    Recent advances in deep submicron CMOS technologies and improved pixel designs have enabled CMOS-based imagers to surpass charge-coupled devices (CCD) imaging technology for mainstream applications. The parallel outputs that CMOS imagers can offer, in addition to complete camera-on-a-chip solutions due to being fabricated in standard CMOS technologies, result in compelling advantages in speed and system throughput. Since there is a practical limit on the minimum pixel size (4∼5 μm) due to limitations in the optics, CMOS technology scaling can allow for an increased number of transistors to be integrated into the pixel to improve both detection and signal processing. Such smart pixels truly show the potential of CMOS technology for imaging applications allowing CMOS imagers to achieve the image quality and global shuttering performance necessary to meet the demands of ultrahigh-speed applications. In this paper, a review of CMOS-based high-speed imager design is presented and the various implementations that target ultrahigh-speed imaging are described. This work also discusses the design, layout and simulation results of an ultrahigh acquisition rate CMOS active-pixel sensor imager that can take 8 frames at a rate of more than a billion frames per second (fps).

  6. Panoramic thermal imaging: challenges and tradeoffs

    NASA Astrophysics Data System (ADS)

    Aburmad, Shimon

    2014-06-01

    Over the past decade, we have witnessed a growing demand for electro-optical systems that can provide continuous 360° coverage. Applications such as perimeter security, autonomous vehicles, and military warning systems are among the most common applications for panoramic imaging. There are several different technological approaches to panoramic imaging. Solutions based on rotating elements do not provide continuous coverage, as there is a time lag between updates. Continuous panoramic solutions either use "stitched" images from multiple adjacent sensors, or sophisticated optical designs that warp a panoramic view onto a single sensor. When dealing with panoramic imaging in the visible spectrum, high-volume production and advances in semiconductor technology have enabled the use of CMOS/CCD image sensors with huge pixel counts, small pixel dimensions, and low device cost. However, in the infrared spectrum, the growth of detector pixel counts, pixel size reduction, and cost reduction is taking place at a slower rate due to the complexity of the technology and limitations imposed by the laws of physics. In this work, we will explore the challenges involved in achieving 360° panoramic thermal imaging, and will analyze aspects such as spatial resolution, FOV, data complexity, FPA utilization, system complexity, coverage, and cost of the different solutions. We will provide illustrations, calculations, and tradeoffs between three solutions evaluated by Opgal: a unique 360° lens design using an LWIR XGA detector, stitching of three adjacent LWIR sensors equipped with low-distortion 120° lenses, and a fisheye lens with an HFOV of 180° and an XGA sensor.

  7. Real-time Imaging Technology for the Return to the Moon

    NASA Technical Reports Server (NTRS)

    Epp, Chirold

    2008-01-01

    This viewgraph presentation reviews real-time Autonomous Landing Hazard Avoidance Technology (ALHAT) for the return to the Moon. The topics include: 1) ALHAT Background; 2) Safe and Precise Landing; 3) ALHAT Mission Phases; 4) Terminal Descent Phase; 5) Lighting; 6) Lander Tolerance; 7) HDA Sensor Performance; and 8) HDA Terrain Sensors.

  8. Optimization of CMOS image sensor utilizing variable temporal multisampling partial transfer technique to achieve full-frame high dynamic range with superior low light and stop motion capability

    NASA Astrophysics Data System (ADS)

    Kabir, Salman; Smith, Craig; Armstrong, Frank; Barnard, Gerrit; Schneider, Alex; Guidash, Michael; Vogelsang, Thomas; Endsley, Jay

    2018-03-01

    Differential binary pixel technology is a threshold-based timing, readout, and image reconstruction method that utilizes the subframe partial charge transfer technique in a standard four-transistor (4T) pixel CMOS image sensor to achieve a high dynamic range video with stop motion. This technology improves low light signal-to-noise ratio (SNR) by up to 21 dB. The method is verified in silicon using a Taiwan Semiconductor Manufacturing Company's 65 nm 1.1 μm pixel technology 1 megapixel test chip array and is compared with a traditional 4 × oversampling technique using full charge transfer to show low light SNR superiority of the presented technology.
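
    The subframe multisampling idea above can be illustrated with a generic multi-exposure HDR reconstruction: pick the longest non-saturated subframe sample and rescale it to a common exposure. This is a hedged, generic sketch of threshold-based multisampling, not the paper's proprietary partial-charge-transfer readout:

```python
def reconstruct_hdr(samples, full_well_dn=4095):
    """samples: (exposure_ratio, raw_dn) pairs ordered longest exposure first.
    Returns the first non-saturated value rescaled to unit exposure,
    extending dynamic range beyond a single exposure's full well."""
    for ratio, raw in samples:
        if raw < full_well_dn:       # first sample that did not clip
            return raw / ratio       # rescale to the common (unit) exposure
    ratio, raw = samples[-1]         # everything saturated: best effort
    return raw / ratio
```

    Long subframes preserve low-light detail (high SNR), while short subframes keep bright regions from clipping; the per-pixel selection stitches both into one frame, which is the general principle behind such HDR schemes.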

  9. Degradation of CMOS image sensors in deep-submicron technology due to γ-irradiation

    NASA Astrophysics Data System (ADS)

    Rao, Padmakumar R.; Wang, Xinyang; Theuwissen, Albert J. P.

    2008-09-01

    In this work, radiation-induced damage mechanisms in deep-submicron technology are resolved using finger gated-diodes (FGDs) as a radiation-sensitive tool. These simple yet efficient structures resolve radiation-induced damage in advanced CMOS processes. The degradation of CMOS image sensors in deep-submicron technology due to γ-ray irradiation is studied by developing a model for the spectral response of the sensor and by examining dark-signal degradation as a function of STI (shallow-trench isolation) parameters. Threshold shifts at the gate-oxide/silicon interface and minority-carrier lifetime variations in the silicon bulk are found to be minimal, while the top-layer material properties and the photodiode Si-SiO2 interface quality are degraded by γ-ray irradiation. The results further suggest that p-well passivated structures are indispensable for radiation-hard designs, and that the high electric fields in submicron technologies pose a threat to high-quality imaging in harsh environments.

  10. Thermal luminescence spectroscopy chemical imaging sensor.

    PubMed

    Carrieri, Arthur H; Buican, Tudor N; Roese, Erik S; Sutter, James; Samuels, Alan C

    2012-10-01

    The authors present a pseudo-active chemical imaging sensor model embodying irradiative transient heating, temperature nonequilibrium thermal luminescence spectroscopy, differential hyperspectral imaging, and artificial neural network technologies integrated together. We elaborate on various optimizations, simulations, and animations of the integrated sensor design and apply it to the terrestrial chemical contamination problem, where the interstitial contaminant compounds of detection interest (analytes) comprise liquid chemical warfare agents, their various derivative condensed phase compounds, and other material of a life-threatening nature. The sensor must measure and process a dynamic pattern of absorptive-emissive middle infrared molecular signature spectra of subject analytes to perform its chemical imaging and standoff detection functions successfully.

  11. New sensor technologies in quality evaluation of Chinese materia medica: 2010-2015.

    PubMed

    Miao, Xiaosu; Cui, Qingyu; Wu, Honghui; Qiao, Yanjiang; Zheng, Yanfei; Wu, Zhisheng

    2017-03-01

    New sensor technologies play an important role in quality evaluation of Chinese materia medica (CMM) and include near-infrared spectroscopy, chemical imaging, electronic nose and electronic tongue. This review on quality evaluation of CMM and the application of the new sensors in this assessment is based on studies from 2010 to 2015, with prospects and opportunities for future research.

  12. A time-resolved image sensor for tubeless streak cameras

    NASA Astrophysics Data System (ADS)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tubeless streak cameras. Although the conventional streak camera has high time resolution, it requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The sensor comprises DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, creates and delivers a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel was designed and implemented in a 0.11-μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  13. Images Revealing More Than a Thousand Words

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A unique sensor developed by ProVision Technologies, a NASA Commercial Space Center housed by the Institute for Technology Development, produces hyperspectral images with cutting-edge applications in food safety, skin health, forensics, and anti-terrorism activities. While hyperspectral imaging technology continues to make advances with ProVision Technologies, it has also been transferred to the commercial sector through a spinoff company, Photon Industries, Inc.

  14. Deployable Laboratory Applications of Nano- and Bio-Technology (Applications de nanotechnologie et biotechnologie destinees a un laboratoire deployable)

    DTIC Science & Technology

    2014-10-01

    applications of present nano-/bio-technology include advanced health and fitness monitoring, high-resolution imaging, new environmental sensor platforms...other areas where nano-/bio-technology development is needed: • Sensors: diagnostic and detection kits (gene-chips, protein-chips, lab-on-chips, etc...studies on chemo-bio nano-sensors; ultra-sensitive biochips ("lab-on-a-chip" and "cells-on-chips" devices) have been prepared for routine medical

  15. Technologies for Positioning and Placement of Underwater Structures

    DTIC Science & Technology

    2000-03-01

    for imaging the bottom immediately before placement of the structure. c. Use passive sensors (such as tiltmeters, inclinometers, and gyrocompasses)... [contents fragments: Acoustic Sensors; Multibeam and Side-Scan Sonar Transducers; Video Camera; Passive Sensors]

  16. REMOTE SENSING AND GIS FOR WETLANDS

    EPA Science Inventory

    In identifying and characterizing wetland and adjacent features, the use of remote sensor and Geographic Information Systems (GIS) technologies has been valuable. Remote sensors such as photographs and computer-sensor generated images can illustrate conditions of hydrology, exten...

  17. Projection technologies for imaging sensor calibration, characterization, and HWIL testing at AEDC

    NASA Astrophysics Data System (ADS)

    Lowry, H. S.; Breeden, M. F.; Crider, D. H.; Steely, S. L.; Nicholson, R. A.; Labello, J. M.

    2010-04-01

    The characterization, calibration, and mission simulation testing of imaging sensors require continual involvement in the development and evaluation of radiometric projection technologies. Arnold Engineering Development Center (AEDC) uses these technologies to perform hardware-in-the-loop (HWIL) testing with high-fidelity complex scene projection technologies that involve sophisticated radiometric source calibration systems to validate sensor mission performance. Testing with the National Institute of Standards and Technology (NIST) Ballistic Missile Defense Organization (BMDO) transfer radiometer (BXR) and Missile Defense Agency (MDA) transfer radiometer (MDXR) offers improved radiometric and temporal fidelity in this cold-background environment. The development of hardware and test methodologies to accommodate wide field of view (WFOV), polarimetric, and multi/hyperspectral imaging systems is being pursued to support a variety of program needs such as space situational awareness (SSA). Test techniques for the acquisition of data needed for scene generation models (solar/lunar exclusion, radiation effects, etc.) are also needed and are being sought. The extension of HWIL testing to the 7V Chamber requires the upgrade of the current satellite emulation scene generation system. This paper provides an overview of pertinent technologies being investigated and implemented at AEDC.

  18. A REAL-TIME COAL CONTENT/ORE GRADE (C2OC) SENSOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rand Swanson

    2005-04-01

    This is the final report of a three-year DOE-funded project titled "A real-time coal content/ore grade (C2OG) sensor". The sensor, which is based on hyperspectral imaging technology, was designed to give a machine-vision assay of ore or coal. Sensors were designed and built at Resonon, Inc., and then deployed at the Stillwater Mining Company core room in south-central Montana for analyzing platinum/palladium ore, and at the Montana Tech Spectroscopy Lab for analyzing coal and other materials. The Stillwater sensor imaged 91' of core and analyzed this data for surface sulfides, which are considered to be pathfinder minerals for platinum/palladium at this mine. Our results indicate that the sensor could deliver a relative ore grade provided tool markings and iron oxidation were kept to a minimum. Coal, talc, and titanium sponge samples were also imaged and analyzed for content and grade, with promising results. This research has led directly to a DOE SBIR Phase II award for Resonon to develop a down-hole imaging spectrometer based on the same imaging technology used in the Stillwater core room C2OG sensor. The Stillwater Mining Company has estimated that this type of imaging system could lead to a 10% reduction in waste rock from their mine and provide a $650,000 benefit per year. The proposed system may also lead to an additional 10% of ore tonnage, which would provide a total economic benefit of more than $3.1 million per year. If this benefit could be realized on other metal ores for which the proposed technology is suitable, the possible economic benefit to U.S. mines is over $70 million per year. In addition to these currently lost economic benefits, there are also major energy losses from mining waste rock and environmental impacts from mining, processing, and disposing of waste rock.

  19. Novel EO/IR sensor technologies

    NASA Astrophysics Data System (ADS)

    Lewis, Keith

    2011-10-01

    The requirements for advanced EO/IR sensor technologies are discussed in the context of evolving military operations, with significant emphasis on the development of new sensing technologies to meet the challenges posed by asymmetric threats. The Electro-Magnetic Remote Sensing (EMRS DTC) was established in 2003 to provide a centre of excellence in sensor research and development, supporting new capabilities in key military areas such as precision attack, battlespace manoeuvre and information superiority. In the area of advanced electro-optic technology, the DTC has supported work on discriminative imaging, advanced detectors, laser components/technologies, and novel optical techniques. This paper provides a summary of some of the EO/IR technologies explored by the DTC.

  20. Optical technologies for space sensor

    NASA Astrophysics Data System (ADS)

    Wang, Hu; Liu, Jie; Xue, Yaoke; Liu, Yang; Liu, Meiying; Wang, Lingguang; Yang, Shaodong; Lin, Shangmin; Chen, Su; Luo, Jianjun

    2015-10-01

    Space sensors are used in navigation. The sun, the earth, the moon, and other planets serve as frames of reference for obtaining stellar position coordinates, which are then used to control the attitude of a spacecraft. As the "eyes" of a space sensor, the optical system images infinitely distant stars and other celestial bodies; it directly affects the measurement accuracy of the space sensor and indirectly affects the data update rate. Star sensor technology is the pilot for space sensors, and all-day star sensor technology is attracting increasing attention: by measuring the stars day and night, the spacecraft's attitude in the inertial coordinate system can be provided. Facing requirements for ultra-high precision, large field of view, wide spectral range, long life, and high reliability, integrated multi-functional optical sensors will be a future trend in space technology. In the meantime, research on optical technologies for space sensors drives the development of ultra-precision optical fabrication and precision optical alignment and test technology, and promotes the development of long-life optical materials and their applications. We have achieved absolute distortion better than ±1 μm and a space lifetime of at least 15 years for space-sensor optical systems.

  1. Design of polarization imaging system based on CIS and FPGA

    NASA Astrophysics Data System (ADS)

    Zeng, Yan-an; Liu, Li-gang; Yang, Kun-tao; Chang, Da-ding

    2008-02-01

    As polarization is an important characteristic of light, polarization image detection is a new technology combining polarimetry and image processing. In contrast to traditional intensity-based imaging, polarization imaging can acquire important information that traditional imaging cannot, and it will be widely used in both civilian and military fields. Because it can solve problems that traditional imaging cannot, it has been researched widely around the world. This paper first introduces the physical theory of polarization imaging, then describes image acquisition and polarization image processing based on a CIS (CMOS image sensor) and an FPGA. The polarization imaging system comprises hardware and software. The hardware includes the CMOS image sensor drive module, VGA display module, SRAM access module, and an FPGA-based real-time image data acquisition system; the circuit diagram and PCB were designed. On the software side, the Stokes vector and polarization angle computations are analyzed, and the floating-point multiplications in the Stokes computation are optimized into shift and addition operations. Experimental results show that the system can acquire and display image data from the CMOS image sensor in real time.
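    The Stokes computation described above can be sketched as follows. This is a minimal NumPy illustration of the standard four-analyzer-angle formulation; the function name and the integer shift-for-divide trick (standing in for the paper's FPGA fixed-point optimization) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def stokes_linear(i0, i45, i90, i135):
    """Linear Stokes parameters from intensities at four analyzer angles.

    For integer pixel data the /2 in S0 becomes a right shift, the kind of
    shift/add substitution an FPGA pipeline uses in place of multiplies.
    """
    s0 = (i0 + i45 + i90 + i135) >> 1   # total intensity: (sum)/2 as a shift
    s1 = i0 - i90                       # horizontal vs. vertical preference
    s2 = i45 - i135                     # diagonal preference
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1)  # degree of linear polarization
    aop = 0.5 * np.arctan2(s2, s1)                     # angle of polarization (rad)
    return s0, s1, s2, dolp, aop
```

    For fully polarized light aligned with the 0° analyzer, the sketch yields a degree of linear polarization of 1 and an angle of 0.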

  2. Performance evaluation and modeling of a conformal filter (CF) based real-time standoff hazardous material detection sensor

    NASA Astrophysics Data System (ADS)

    Nelson, Matthew P.; Tazik, Shawna K.; Bangalore, Arjun S.; Treado, Patrick J.; Klem, Ethan; Temple, Dorota

    2017-05-01

    Hyperspectral imaging (HSI) systems can provide detection and identification of a variety of targets in the presence of complex backgrounds. However, current generation sensors are typically large, costly to field, do not usually operate in real time, and have limited sensitivity and specificity. Despite these shortcomings, HSI-based intelligence has proven to be a valuable tool, resulting in increased demand for this type of technology. By moving the next generation of HSI technology into a more adaptive configuration and a smaller, more cost-effective form factor, HSI technologies can help maintain a competitive advantage for the U.S. armed forces as well as local, state and federal law enforcement agencies. Operating near the physical limits of HSI system capability is both necessary and challenging, and is enabled by rigorous modeling of detection performance. Specific performance envelopes we consistently strive to improve include operating under low signal-to-background conditions, at ever higher frame rates, and under less-than-ideal motion control scenarios. An adaptable, low-cost, small-footprint, standoff sensor architecture we have been maturing uses conformal liquid crystal tunable filters (LCTFs). These Conformal Filters (CFs) are electro-optically tunable, multivariate HSI spectrometers that, when combined with Dual Polarization (DP) optics, produce optimized spectral passbands on demand, which can readily be reconfigured to discriminate targets from complex backgrounds in real time. With DARPA support, ChemImage Sensor Systems (CISS™), in collaboration with Research Triangle Institute (RTI) International, is developing a novel, real-time, adaptable, compressive-sensing short-wave infrared (SWIR) hyperspectral imaging technology called the Reconfigurable Conformal Imaging Sensor (RCIS) based on DP-CF technology. RCIS will address many shortcomings of current generation systems and offer improvements in operational agility and detection performance, while addressing sensor weight, form factor and cost needs. This paper discusses recent test and performance modeling results of an RCIS breadboard apparatus.

  3. A survey of current solid state star tracker technology

    NASA Astrophysics Data System (ADS)

    Armstrong, R. W.; Staley, D. A.

    1985-12-01

    This paper is a survey of the current state of the art in design of star trackers for spacecraft attitude determination systems. Specific areas discussed are sensor technology, including the current state-of-the-art solid state sensors and techniques of mounting and cooling the sensor, analog image preprocessing electronics performance, and digital processing hardware and software. Three examples of area array solid state star tracker development are presented - ASTROS, developed by the Jet Propulsion Laboratory, the Retroreflector Field Tracker (RFT) by Ball Aerospace, and TRW's MADAN. Finally, a discussion of solid state line arrays explores the possibilities for one-dimensional imagers which offer simplified scan control electronics.

  4. Development of CMOS Active Pixel Image Sensors for Low Cost Commercial Applications

    NASA Technical Reports Server (NTRS)

    Fossum, E.; Gee, R.; Kemeny, S.; Kim, Q.; Mendis, S.; Nakamura, J.; Nixon, R.; Ortiz, M.; Pain, B.; Zhou, Z.; hide

    1994-01-01

    This paper describes ongoing research and development of CMOS active pixel image sensors for low cost commercial applications. A number of sensor designs have been fabricated and tested in both p-well and n-well technologies. Major elements in the development of the sensor include on-chip analog signal processing circuits for the reduction of fixed pattern noise, on-chip timing and control circuits and on-chip analog-to-digital conversion (ADC). Recent results and continuing efforts in these areas will be presented.

  5. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    PubMed

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have a nearly angle-insensitive response over a ±20° range. This technology enables shrinking of pixel sizes down to about one micrometer.

  6. NASA's Technology Transfer Program for the Early Detection of Breast Cancer

    NASA Technical Reports Server (NTRS)

    Schmidt, Gregory; Frey, Mary Anne; Vernikos, Joan; Winfield, Daniel; Dalton, Bonnie P. (Technical Monitor)

    1996-01-01

    The National Aeronautics and Space Administration (NASA) has led the development of advanced imaging sensors and image processing technologies for space science and Earth science missions. NASA considers the transfer and commercialization of such technologies a fundamental mission of the agency. Over the last two years, efforts have been focused on the application of aerospace imaging and computing to the field of diagnostic imaging, specifically to breast cancer imaging. These technology transfer efforts offer significant promise in helping in the national public health priority of the early detection of breast cancer.

  7. Image processing techniques and applications to the Earth Resources Technology Satellite program

    NASA Technical Reports Server (NTRS)

    Polge, R. J.; Bhagavan, B. K.; Callas, L.

    1973-01-01

    The Earth Resources Technology Satellite system is studied, with emphasis on sensors, data processing requirements, and image data compression using the Fast Fourier and Hadamard transforms. The ERTS-A system and the fundamentals of remote sensing are discussed. Three user applications (forestry, crops, and rangelands) are selected and their spectral signatures are described. It is shown that additional sensors are needed for rangeland management. An on-board information processing system is recommended to reduce the amount of data transmitted.

  8. Electric Potential and Electric Field Imaging with Dynamic Applications & Extensions

    NASA Technical Reports Server (NTRS)

    Generazio, Ed

    2017-01-01

    The technology and methods for remote quantitative imaging of electrostatic potentials and electrostatic fields in and around objects and in free space are presented. Electric field imaging (EFI) technology may be applied to characterize intrinsic or existing electric potentials and electric fields, or an externally generated electrostatic field may be used for volumes to be inspected with EFI. The baseline sensor technology (e-Sensor) and its construction, optional electric field generation (quasi-static generator), and current e-Sensor enhancements (ephemeral e-Sensor) are discussed. Critical design elements of current linear and real-time two-dimensional (2D) measurement systems are highlighted, and the development of a three-dimensional (3D) EFI system is presented. Demonstrations for structural, electronic, human, and memory applications are shown. Recent work demonstrates that phonons may be used to create and annihilate electric dipoles within structures. Phonon-induced dipoles are ephemeral, and their polarization, strength, and location may be quantitatively characterized by EFI, providing a new subsurface phonon-EFI imaging technology. Results from real-time imaging of combustion and ion flow, and their measurement complications, are discussed. Extensions to environmental, space, and subterranean applications are presented, and initial results for quantitatively characterizing material properties are shown. A wearable EFI system has been developed using fundamental EFI concepts. These new EFI capabilities are demonstrated to characterize electric charge distributions, creating a new field of study embracing areas of interest including electrostatic discharge (ESD) mitigation, manufacturing quality control, crime scene forensics, design and materials selection for advanced sensors, combustion science, on-orbit space potential, container inspection, remote characterization of electronic circuits and level of activation, dielectric morphology of structures, tether integrity, organic molecular memory, atmospheric science, weather prediction, earthquake prediction, and medical diagnostic and treatment efficacy applications such as cardiac polarization wave propagation and electromyography imaging.

  9. Determination of technical readiness for an atmospheric carbon imaging spectrometer

    NASA Astrophysics Data System (ADS)

    Mobilia, Joseph; Kumer, John B.; Palmer, Alice; Sawyer, Kevin; Mao, Yalan; Katz, Noah; Mix, Jack; Nast, Ted; Clark, Charles S.; Vanbezooijen, Roel; Magoncelli, Antonio; Baraze, Ronald A.; Chenette, David L.

    2013-09-01

    The geoCARB sensor uses a 4-channel push-broom slit-scan infrared imaging grating spectrometer to measure the absorption spectra of sunlight reflected from the ground in narrow wavelength regions. The instrument is designed for flight at geostationary orbit to provide mapping of greenhouse gases over continental scales, several times per day, with a spatial resolution of a few kilometers. The sensor provides multiple daily maps of column-averaged mixing ratios of CO2, CH4, and CO over the regions of interest, which enables flux determination at unprecedented time and space scales and accuracy. The geoCARB sensor development is based on our experience in the successful implementation of advanced space-deployed optical instruments for remote sensing. A few recent examples include the Atmospheric Imaging Assembly (AIA) and Helioseismic and Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO), the Space Based Infrared System (SBIRS GEO-1) and the Interface Region Imaging Spectrograph (IRIS), along with sensors under development: the Near Infrared Camera (NIRCam) for the James Webb Space Telescope (JWST), and the Geostationary Lightning Mapper (GLM) and Solar Ultraviolet Imager (SUVI) for the GOES-R series. The Tropospheric Infrared Mapping Spectrometer (TIMS), developed in part through the NASA Instrument Incubator Program (IIP), provides an important part of the strong technological foundation for geoCARB. The paper discusses subsystem heritage and technology readiness levels for these subsystems. The system-level flight technology readiness and the methods used to determine this level are presented, along with plans to enhance the level.

  10. Advanced processing for high-bandwidth sensor systems

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.

    2000-11-01

    Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.

  11. High responsivity CMOS imager pixel implemented in SOI technology

    NASA Technical Reports Server (NTRS)

    Zheng, X.; Wrigley, C.; Yang, G.; Pain, B.

    2000-01-01

    The availability of mature sub-micron CMOS technology and the advent of the low-noise active pixel sensor (APS) concept enabled the development of low-power, miniature, single-chip CMOS digital imagers during the 1990s.

  12. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    NASA Astrophysics Data System (ADS)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic imaging devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle imaging systems and computer peripherals for document capture. A one-chip imaging system, in which the image sensor has a full digital interface, brings image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of mosaic pixels or wide stripes, makes images more realistic and colorful; one can say that color filters make life more colorful. A color filter transmits only light in the wavelength band matching the filter's own transmittance and blocks all other colors. The color filter process coats and patterns green, red and blue (or cyan, magenta and yellow) mosaic resists onto matched pixels in the image sensing array. From the signal captured at each pixel, the scene image can then be reconstructed. The wide use of digital electronic cameras and multimedia applications today makes color filters increasingly important; although challenging, the color filter process is well worth developing. We provide the best service in terms of short cycle time, excellent color quality, and high, stable yield. The key issues of the advanced color process that must be solved and implemented are planarization and microlens technology. Many key points of color filter process technology that must be considered are also described in this paper.

  13. Emerging electro-optical technologies for defense applications

    NASA Astrophysics Data System (ADS)

    Venkateswarlu, Ronda; Ser, W.; Er, Meng H.; Chan, Philip

    1999-11-01

    Technological breakthroughs in the field of imaging and non-imaging sensors and the related signal processors have helped military users achieve 'force multiplication'. Present-day 'smart-weapon systems' are being converted to 'brilliant-weapon systems' to bridge the gap until the most potent new 'fourth generation systems', based on nanotechnology, come on line. Recent military tactics have evolved to take advantage of ever-improving technologies to improve quality and performance over time. The drive behind these technologies is to achieve first-pass mission success against the target with negligible collateral damage, protecting property and the lives of non-combatants. These technologies revolve around obtaining target information, detection, designation, guidance, aim-point selection, and mission accomplishment. Their effectiveness was amply demonstrated during recent wars. This paper brings out the emerging trends in visible/IR/radar smart sensors and the related signal processing technologies that lead to brilliant guided weapon systems. Its purpose is to give readers an overview of futuristic systems, and it also addresses various system configurations, including sensor fusion.

  14. Commercial Sensor Survey Radiation Testing Progress Report

    NASA Technical Reports Server (NTRS)

    Becker, Heidi N.; Dolphic, Michael D.; Thorbourn, Dennis O.; Alexander, James W.; Salomon, Phil M.

    2008-01-01

    The NASA Electronic Parts and Packaging (NEPP) Program Sensor Technology Commercial Sensor Survey task is geared toward benefiting future NASA space missions with low-cost, short-duty-cycle, visible imaging needs. Such applications could include imaging for educational outreach purposes or short surveys of spacecraft, planetary, or lunar surfaces. Under the task, inexpensive commercial grade CMOS sensors were surveyed in fiscal year 2007 (FY07) and three sensors were selected for total ionizing dose (TID) and displacement damage dose (DDD) tolerance testing. The selected sensors had to meet selection criteria chosen to support small, low-mass cameras that produce good resolution color images. These criteria are discussed in detail in [1]. This document discusses the progress of radiation testing on the Micron and OmniVision sensors selected in FY07 for radiation tolerance testing.

  15. Compressive Sensing Image Sensors-Hardware Implementation

    PubMed Central

    Dadkhah, Mohammadreza; Deen, M. Jamal; Shirani, Shahram

    2013-01-01

    The compressive sensing (CS) paradigm uses simultaneous sensing and compression to provide an efficient image acquisition technique. The main advantages of the CS method include high resolution imaging using low resolution sensor arrays and faster image acquisition. Since the imaging philosophy in CS imagers is different from conventional imaging systems, new physical structures have been developed for cameras that use the CS technique. In this paper, a review of different hardware implementations of CS encoding in optical and electrical domains is presented. Considering the recent advances in CMOS (complementary metal–oxide–semiconductor) technologies and the feasibility of performing on-chip signal processing, important practical issues in the implementation of CS in CMOS sensors are emphasized. In addition, the CS coding for video capture is discussed. PMID:23584123
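    As a concrete (hypothetical) illustration of the encoding step such imagers implement in hardware, the sketch below simulates pseudo-random ±1 masking and summation in NumPy, as a DMD-based or in-pixel modulator would realize it. The function name and mask choice are illustrative assumptions; reconstruction would additionally require a sparse solver (e.g., basis pursuit or OMP), which is not shown.

```python
import numpy as np

def cs_encode(image, n_measurements, seed=0):
    """Simulate a CS encoder: each measurement is the inner product of the
    scene with a pseudo-random +/-1 mask, yielding M << N samples."""
    rng = np.random.default_rng(seed)
    x = image.ravel().astype(float)
    # one +/-1 mask per measurement; the rows form the sensing matrix Phi
    phi = rng.choice([-1.0, 1.0], size=(n_measurements, x.size))
    return phi @ x, phi
```

    Encoding a 16×16 scene into 64 measurements gives a 4:1 compression at acquisition time, and the encoder is linear in the scene, which is what the sparse-recovery theory relies on.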

  16. Image Registration of High-Resolution Uav Data: the New Hypare Algorithm

    NASA Astrophysics Data System (ADS)

    Bahr, T.; Jin, X.; Lasica, R.; Giessel, D.

    2013-08-01

    Unmanned aerial vehicles play an important role in present-day civilian and military intelligence. Equipped with a variety of sensors, such as SAR imaging modes and E/O and IR sensor technology, their agility makes them suitable for many applications. Hence the necessity arises to use fusion technologies and to develop them continuously. Here an exact image-to-image registration is essential: it serves as the basis for important image processing operations such as georeferencing, change detection, and data fusion. We therefore developed the Hybrid Powered Auto-Registration Engine (HyPARE). HyPARE combines all available spatial reference information with a number of image registration approaches to improve the accuracy, performance, and automation of tie point generation and image registration. We demonstrate this approach by the registration of 39 still images from a high-resolution image stream, acquired with an Aeryon Photo3S™ camera on an Aeryon Scout micro-UAV™.
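    HyPARE itself is proprietary, but the core registration idea it builds on can be illustrated with a minimal phase-correlation sketch (NumPy assumed) that recovers the integer translation between two overlapping frames; subpixel refinement, rotation, and tie-point selection are beyond this toy example.

```python
import numpy as np

def phase_correlation(a, b):
    """Estimate the integer (row, col) translation of image b relative to a
    via the normalized cross-power spectrum."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    cross /= np.abs(cross) + 1e-12          # keep only phase information
    corr = np.fft.ifft2(cross).real         # impulse at the displacement
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shifts = list(peak)
    for i, n in enumerate(a.shape):         # wrap shifts past n/2 to negative
        if shifts[i] > n // 2:
            shifts[i] -= n
    return tuple(int(s) for s in shifts)
```

    For a pure cyclic shift the correlation surface is an exact impulse, so the translation is recovered exactly; real frame pairs add noise and perspective change, which is why engines like HyPARE fuse several approaches.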

  17. CMOS image sensor-based immunodetection by refractive-index change.

    PubMed

    Devadhasan, Jasmine P; Kim, Sanghyo

    2012-01-01

    A complementary metal oxide semiconductor (CMOS) image sensor is an intriguing technology for the development of novel biosensors. Indeed, the mechanism by which a CMOS image sensor detects the antigen-antibody (Ag-Ab) interaction at the nanoscale has so far been ambiguous, and more extensive research into it is necessary to achieve point-of-care diagnostic devices. This research demonstrates CMOS image sensor-based analysis of Ag-Ab interactions of cardiovascular disease markers, such as C-reactive protein (CRP) and troponin I, on indium nanoparticle (InNP) substrates by simple photon-count variation. The developed sensor can detect proteins even at fg/mL concentrations under ordinary room light. Possible mechanisms, such as dielectric constant and refractive-index changes, have been studied and proposed. A dramatic change in the refractive index after protein adsorption on an InNP substrate was observed to be the predominant factor involved in CMOS image sensor-based immunoassay.

  18. Achieving thermography with a thermal security camera using uncooled amorphous silicon microbolometer image sensors

    NASA Astrophysics Data System (ADS)

    Wang, Yu-Wei; Tesdahl, Curtis; Owens, Jim; Dorn, David

    2012-06-01

    Advancements in uncooled microbolometer technology over the last several years have opened up many commercial applications which had previously been cost prohibitive. Thermal technology is no longer limited to the military and government market segments. One type of thermal sensor with low NETD which is available in the commercial market segment is the uncooled amorphous silicon (α-Si) microbolometer image sensor. Typical thermal security cameras focus on providing the best image quality by auto tonemapping (contrast enhancing) the image, which provides the best contrast depending on the temperature range of the scene. While this may provide enough information to detect objects and activities, there are further benefits to being able to estimate the actual object temperatures in a scene. This thermographic ability can provide functionality beyond typical security cameras by enabling process monitoring. Example applications of thermography [2] with a thermal camera include monitoring electrical circuits, industrial machinery, building thermal leaks, oil/gas pipelines, power substations, etc. [3][5] This paper discusses the methodology of estimating object temperatures by characterizing/calibrating different components inside a thermal camera utilizing an uncooled amorphous silicon microbolometer image sensor. Plots of system performance across camera operating temperatures are shown.
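    The calibration idea can be sketched as a two-point blackbody calibration mapping raw sensor counts to scene temperature. This is a deliberately simplified, hypothetical model: it assumes counts vary linearly with scene temperature over a narrow range, whereas the paper's methodology also characterizes the camera's internal components across its own operating temperature.

```python
def two_point_calibration(counts_cold, t_cold, counts_hot, t_hot):
    """Build a counts -> scene-temperature function from two blackbody
    references (a linear simplification; real radiometric cameras model
    Planck radiance and camera self-heating as well)."""
    gain = (t_hot - t_cold) / (counts_hot - counts_cold)  # degrees per count

    def to_temperature(counts):
        return t_cold + gain * (counts - counts_cold)

    return to_temperature
```

    With references at, say, 20 °C and 100 °C, any reading between the two reference count levels interpolates linearly to a temperature estimate.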

  19. Review of infrared technology in The Netherlands

    NASA Astrophysics Data System (ADS)

    de Jong, Arie N.

    1993-11-01

    The use of infrared sensors in the Netherlands is substantial, with users in a variety of disciplines, military as well as civil. This need for IR sensors has driven a long history of IR technology development, resulting in a broad technological capability for realizing IR hardware: specialized measuring equipment, engineering development models, and prototype and production sensors for different applications. These applications range from small-size, local radiometry up to large space-borne imaging. Large-scale production of IR sensors has been realized for army vehicles, and IR sensors have now been introduced in all of the armed forces. Facilities have been built to test the performance of these sensors, and models have been developed to predict the performance of new sensors. A great effort has been spent on atmospheric research, leading to knowledge of the atmospheric and background limitations of IR sensors.

  20. Identification of Air Force Emerging Technologies and Militarily Significant Emerging Technologies.

    DTIC Science & Technology

    1985-08-31

    taking an integrated approach to avionics and EU, the various sensors and receivers on the aircraft can time-share the use of common signal processors...functions mentioned above has required, in addition to a separate sensor or antenna, a totally independent electronics suite. Many of the advanced...Classification A3. IMAGING SENSOR AUTOPROCESSOR The Air Force has contracted with Rockwell International and Honeywell in this work. Rockwell’s work is

  1. Imaging Science Panel. Multispectral Imaging Science Working Group joint meeting with Information Science Panel: Introduction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The state of the art of multispectral sensing is reviewed and recommendations for future research and development are proposed. Specifically, two generic sensor concepts were discussed. One is the multispectral pushbroom sensor utilizing linear array technology, which operates in six spectral bands, including two in the SWIR region, and incorporates capabilities for stereo and cross-track pointing. The second concept is the imaging spectrometer (IS), which incorporates a dispersive element and area arrays to provide both spectral and spatial information simultaneously. Other key technology areas included very large scale integration and the computer-aided design of these devices.

  2. OFSETH: smart medical textile for continuous monitoring of respiratory motions under magnetic resonance imaging.

    PubMed

    De Jonckheere, J; Narbonneau, F; Jeanne, M; Kinet, D; Witt, J; Krebber, K; Paquet, B; Depre, A; Logier, R

    2009-01-01

    The potential impact of optical fiber sensors embedded into medical textiles for the continuous monitoring of the patient during Magnetic Resonance Imaging is presented. We report on two purely optical sensing technologies for respiratory movement monitoring: a macro-bending sensor and a Bragg grating sensor, designed to measure the elongation due to abdominal and thoracic motions during breathing. We demonstrate that the two sensors can successfully sense textile elongation between 0% and 3%, while maintaining the stretching properties of the textile substrates for good patient comfort.

  3. CMOS image sensor with lateral electric field modulation pixels for fluorescence lifetime imaging with sub-nanosecond time response

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Seo, Min-Woong; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2016-04-01

    This paper presents the design and implementation of a time-resolved CMOS image sensor with a high-speed lateral electric field modulation (LEFM) gating structure for time domain fluorescence lifetime measurement. Time-windowed signal charge can be transferred from a pinned photodiode (PPD) to a pinned storage diode (PSD) by turning on a pair of transfer gates, which are situated beside the channel. Unwanted signal charge can be drained from the PPD to the drain by turning on another pair of gates. The pixel array contains 512 (V) × 310 (H) pixels with 5.6 × 5.6 µm2 pixel size. The imager chip was fabricated using 0.11 µm CMOS image sensor process technology. The prototype sensor has a time response of 150 ps at 374 nm. The fill factor of the pixels is 5.6%. The usefulness of the prototype sensor is demonstrated for fluorescence lifetime imaging through simulation and measurement results.
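    The time-gated measurement this sensor enables is commonly paired with rapid lifetime determination (RLD), where the lifetime follows from the ratio of counts in two back-to-back gates. The sketch below is a generic NumPy illustration of that textbook method under the assumption of a single-exponential decay; it is not the authors' processing chain.

```python
import numpy as np

def rld_lifetime(decay, dt, gate):
    """Rapid lifetime determination from two equal gates of a sampled
    exponential decay: tau = T / ln(D1 / D2), where D1 and D2 are the
    integrated counts in back-to-back windows of width T = gate."""
    n = int(round(gate / dt))          # samples per gate
    d1 = decay[:n].sum() * dt          # counts in window [0, T)
    d2 = decay[n:2 * n].sum() * dt     # counts in window [T, 2T)
    return gate / np.log(d1 / d2)
```

    For a clean exponential the two-gate ratio is exactly exp(T/tau), so the estimator recovers the lifetime; shot noise and multi-exponential decays are what drive more elaborate fitting in practice.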

  4. SSME leak detection feasibility investigation by utilization of infrared sensor technology

    NASA Technical Reports Server (NTRS)

    Shohadaee, Ahmad A.; Crawford, Roger A.

    1990-01-01

    This investigation examined the potential of using state-of-the-art infrared (IR) thermal imaging systems combined with computers, digital image processing, and expert systems for Space Shuttle Main Engine (SSME) propellant path leak detection as an early warning system of imminent engine failure. A low-cost laboratory experiment was devised and an experimental approach established. The system was installed, checked out, and data were successfully acquired, demonstrating proof of concept. The conclusion from this investigation is that both numerical and experimental results indicate that leak detection using infrared sensor technology is feasible for a rocket engine health monitoring system.

  5. Development of a handheld widefield hyperspectral imaging (HSI) sensor for standoff detection of explosive, chemical, and narcotic residues

    NASA Astrophysics Data System (ADS)

    Nelson, Matthew P.; Basta, Andrew; Patil, Raju; Klueva, Oksana; Treado, Patrick J.

    2013-05-01

    The utility of hyperspectral imaging (HSI) passive chemical detection employing wide-field, standoff imaging continues to be advanced in detection applications. With a drive for reduced SWaP (Size, Weight, and Power) and increased speed of detection and sensitivity, developing a handheld platform that is robust and user-friendly increases the detection capabilities of the end user. In addition, easy-to-use handheld detectors could improve the effectiveness of locating and identifying threats while reducing risks to the individual. ChemImage Sensor Systems (CISS) has developed the HSI Aperio™ sensor for real-time, wide-area surveillance and standoff detection of explosives, chemical threats, and narcotics for use in both government and commercial contexts. Employing liquid crystal tunable filter technology, the HSI system has an intuitive user interface that produces automated detections and real-time display of threats, with an end-user-created library of threat signatures that is easily updated to cover new hazardous materials. Unlike existing detection technologies that often require close proximity for sensing and so endanger operators and costly equipment, the handheld sensor allows the individual operator to detect threats from a safe distance. Uses of the sensor include locating production facilities of illegal drugs or IEDs by identification of materials on surfaces such as walls, floors, and doors, deposits on production tools, and residue on individuals. In addition, the sensor can be used for longer-range standoff applications such as hasty checkpoint or vehicle inspection of residue materials on surfaces, or bulk material identification. The CISS Aperio™ sensor has faster data collection, faster image processing, and increased detection capability compared to previous sensors.

  6. Hybrid imaging: a quantum leap in scientific imaging

    NASA Astrophysics Data System (ADS)

    Atlas, Gene; Wadsworth, Mark V.

    2004-01-01

    ImagerLabs has advanced its patented next-generation imaging technology, the Hybrid Imaging Technology (HIT), which offers scientific-quality performance. The key to HIT is the merging of CCD and CMOS technologies through hybridization rather than process integration. HIT offers the exceptional QE, fill factor, broad spectral response, and very low noise of the CCD, and provides the very high-speed readout, low power, high linearity, and high integration capability of CMOS sensors. In this work, we present the benefits of and the latest advances in the performance of this exciting technology.

  7. A CMOS image sensor with stacked photodiodes for lensless observation system of digital enzyme-linked immunosorbent assay

    NASA Astrophysics Data System (ADS)

    Takehara, Hironari; Miyazawa, Kazuya; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Kim, Soo Hyeon; Iino, Ryota; Noji, Hiroyuki; Ohta, Jun

    2014-01-01

    A CMOS image sensor with stacked photodiodes was fabricated using 0.18 µm mixed-signal CMOS process technology. Two photodiodes were stacked at the same position in each pixel of the CMOS image sensor. The stacked photodiodes consist of a shallow high-concentration N-type layer (N+), a P-type well (PW), a deep N-type well (DNW), and the P-type substrate (P-sub). PW and P-sub were shorted to ground. By monitoring the voltages of N+ and DNW individually, two monochromatic colors can be observed simultaneously without any color filters. The CMOS image sensor is suitable for fluorescence imaging, especially contact imaging such as a lensless observation system for digital enzyme-linked immunosorbent assay (ELISA). Since the fluorescence increases with time in digital ELISA, fluorescence can be observed accurately by calculating the difference from the initial relation between the pixel values of the two photodiodes.
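    The two-color readout described above can be sketched as a linear unmixing problem: if the spectral responsivities of the shallow and deep junctions are known, the two monochromatic intensities follow from a 2x2 linear system. The responsivity values below are illustrative assumptions, not data from the paper.

```python
# Two-color separation from stacked photodiodes: a minimal sketch.
# The responsivity matrix is illustrative, not measured device data.

def unmix_two_colors(s_shallow, s_deep,
                     r=((0.8, 0.2),    # shallow junction response to (blue, red)
                        (0.1, 0.6))):  # deep junction response to (blue, red)
    """Solve r @ [blue, red] = [s_shallow, s_deep] for the two intensities."""
    (a, b), (c, d) = r
    det = a * d - b * c
    if abs(det) < 1e-12:
        raise ValueError("responsivity matrix is singular")
    blue = (d * s_shallow - b * s_deep) / det
    red = (-c * s_shallow + a * s_deep) / det
    return blue, red

# Forward-model check: intensities blue=10, red=20 produce the two signals,
# and unmixing recovers them.
blue, red = unmix_two_colors(0.8 * 10 + 0.2 * 20, 0.1 * 10 + 0.6 * 20)
```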

  8. Image science team

    NASA Technical Reports Server (NTRS)

    Ando, K.

    1982-01-01

    A substantial technology base of solid state pushbroom sensors exists and is evolving further at both GSFC and JPL. Technologies being developed relate to short wave infrared (SWIR) detector arrays; HgCdTe hybrid detector arrays; InSb linear and area arrays; passive coolers; spectral beam splitters; the deposition of spectral filters on detector arrays; and the functional design of the shuttle/space platform imaging spectrometer (SIS) system. Spatial and spectral characteristics of field, aircraft, and space multispectral sensors are summarized. The status, field of view, and resolution of foreign land observing systems are included.

  9. An update on TED gunshot detection system development status

    NASA Astrophysics Data System (ADS)

    Tidhar, Gil A.; Aphek, Ori; Gurovich, Martin

    2009-05-01

    In recent years the TED system has been under development, beginning with new SWIR sensor technology, optics, and real-time sensing, and continuing with the complete system architecture: a soldier-mounted optical gunshot detection system with high precision and imaging capability. For the first time, the modules and the concept of operation of the system will be explained, with emphasis on new sensor-to-shooter capabilities. Actual field trial results will be shown.

  10. Synthetic Foveal Imaging Technology

    NASA Technical Reports Server (NTRS)

    Nikzad, Shouleh (Inventor); Monacos, Steve P. (Inventor); Hoenk, Michael E. (Inventor)

    2013-01-01

    Apparatuses and methods are disclosed that create a synthetic fovea in order to identify and highlight interesting portions of an image for further processing and rapid response. Synthetic foveal imaging implements a parallel processing architecture that uses reprogrammable logic to implement embedded, distributed, real-time foveal image processing from different sensor types while simultaneously allowing for lossless storage and retrieval of raw image data. Real-time, distributed, adaptive processing of multi-tap image sensors with coordinated processing hardware used for each output tap is enabled. In mosaic focal planes, a parallel-processing network can be implemented that treats the mosaic focal plane as a single ensemble rather than a set of isolated sensors. Various applications are enabled for imaging and robotic vision where processing and responding to enormous amounts of data quickly and efficiently is important.

  11. ManPortable and UGV LIVAR: advances in sensor suite integration bring improvements to target observation and identification for the electronic battlefield

    NASA Astrophysics Data System (ADS)

    Lynam, Jeff R.

    2001-09-01

    A more highly integrated electro-optical sensor suite using Laser Illuminated Viewing and Ranging (LIVAR) techniques is being developed under the Army Advanced Concept Technology-II (ACT-II) program for enhanced manportable target surveillance and identification. The ManPortable LIVAR system currently in development employs a wide array of sensor technologies that provide the foot-bound soldier and UGV significant advantages in lightweight, fieldable target location, ranging, and imaging systems. The unit incorporates a wide field-of-view (5° x 3°) uncooled LWIR passive sensor for primary target location. Laser range finding and active illumination are performed with a triggered, flash-lamp-pumped, eyesafe micro-laser operating in the 1.5 micron region, used in conjunction with a range-gated, electron-bombarded CCD digital camera to image the target objective in a narrower 0.3° field of view. Target range is determined with the integrated LRF, and target position is calculated using data from other onboard devices providing GPS coordinates, tilt, bank, and corrected magnetic azimuth. Range-gate timing and coordinated receiver-optics focus control allow target imaging operations to be optimized. The onboard control electronics provide power-efficient system operation for extended field use from the internal rechargeable battery packs. Image data storage, transmission, and processing capabilities are also being incorporated to provide the best all-around support for the electronic battlefield in this type of system. The paper will describe flash laser illumination technology, EBCCD camera technology with the flash laser detection system, and image resolution improvement through frame averaging.

  12. Commercialization of Australian advanced infrared technology

    NASA Astrophysics Data System (ADS)

    Redpath, John; Brown, Allen; Woods, William F.

    1995-09-01

    For several decades, the main thrust of infrared technology development in Australia has been in two sensor technologies: uncooled silicon-chip printed bolometric sensors pioneered by DSTO's Kevin Liddiard, and precision-engineered high-quality Cadmium Mercury Telluride developed at DSTO under the guidance of Dr. Richard Hartley. In late 1993 a low-cost infrared imaging device was developed at DSTO as a sensor for guided missiles. The combination of these three innovations made up a unique package that enabled Australian industry to break through the barriers to commercializing infrared technology. The privately owned company R.J. Optronics Pty Ltd undertook the process of re-engineering a selection of these DSTO developments for a wide range of infrared products. The first project was a novel infrared imager based on a Palmer-scan (translated circle) mechanism. This device uses a spinning wedge and a single detector, with a video processor that converts the image into a standard rectangular format. Originally developed as an imaging seeker for a stand-off weapon, it produces such high-quality images at such low cost that it is now also being adapted for a wide variety of other military and commercial applications. A technique for electronically stabilizing it has been developed which uses inertial signals from co-mounted sensors to compensate for platform motion, enabling it to meet the requirements of aircraft, marine vessel, and masthead sight applications without the use of gimbals. After tests on a three-axis motion table, several system configurations have been successfully operated on a number of lightweight platforms, including a Cessna 172 and the Australian-made Seabird Seeker aircraft.

  13. SEM contour based metrology for microlens process studies in CMOS image sensor technologies

    NASA Astrophysics Data System (ADS)

    Lakcher, Amine; Ostrovsky, Alain; Le-Gratiet, Bertrand; Berthier, Ludovic; Bidault, Laurent; Ducoté, Julien; Jamin-Mornet, Clémence; Mortini, Etienne; Besacier, Maxime

    2018-03-01

    From the first digital cameras, which appeared during the 70s, to the cameras of current smartphones, image sensors have undergone significant technological development in recent decades. The development of CMOS image sensor technologies in the 90s has been the main driver of this progress. The main component of an image sensor is the pixel. A pixel contains a photodiode connected to transistors, but only the photodiode area is light-sensitive, which results in a significant loss of efficiency. To solve this issue, microlenses are used to focus the incident light on the photodiode. A microlens array is made of a transparent material, and each microlens has a spherical-cap shape. To obtain this spherical shape, a lithography process generates resist blocks which are then annealed above their glass transition temperature (reflow). Even though the dimensions involved are larger than in advanced IC nodes, microlenses are sensitive to process variability during lithography and reflow. Good control of the microlens dimensions is key to optimizing the process and thus the performance of the final product. The purpose of this paper is to apply SEM contour metrology [1, 2, 3, 4] to microlenses in order to develop a relevant monitoring methodology and to propose new metrics with which engineers can evaluate their process or optimize the design of the microlens arrays.
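    Because the reflowed microlens is a spherical cap, the dimensions recoverable from SEM contours (base diameter and cap height) determine its radius of curvature and, under the thin-lens approximation, its focal length. A minimal sketch; the refractive index n = 1.56 is an assumed typical value for reflowed resist, not a figure from the paper.

```python
# Spherical-cap microlens geometry: a sketch under the paraxial thin-lens
# approximation. n = 1.56 is an assumed typical resist index.

def microlens_params(diameter_um, height_um, n=1.56):
    """Return (radius of curvature, focal length) of a spherical-cap lens."""
    a = diameter_um / 2.0                        # base radius of the cap
    R = (height_um**2 + a**2) / (2 * height_um)  # sphere radius from the sagitta
    f = R / (n - 1.0)                            # plano-convex thin lens
    return R, f

# Example: a 3.0 um wide, 0.75 um tall microlens
R, f = microlens_params(diameter_um=3.0, height_um=0.75)
```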

  14. Log polar image sensor in CMOS technology

    NASA Astrophysics Data System (ADS)

    Scheffer, Danny; Dierickx, Bart; Pardo, Fernando; Vlummens, Jan; Meynants, Guy; Hermans, Lou

    1996-08-01

    We report on the design, design issues, fabrication, and performance of a log-polar CMOS image sensor. The sensor is developed for use in a videophone system for deaf and hearing-impaired people, who cannot communicate through a 'normal' telephone. The system allows 15 detailed images per second to be transmitted over existing telephone lines, a frame rate sufficient for conversations by means of sign language or lip reading. The pixel array of the sensor consists of 76 concentric circles with (up to) 128 pixels per circle, 8013 pixels in total. The innermost pixels have a pitch of 14 micrometers, increasing to 250 micrometers at the border. The 8013-pixel image is mapped (log-polar transformation) into an X-Y addressable 76 by 128 array.
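    The log-polar mapping itself can be sketched as follows: each Cartesian position maps to a ring index that grows logarithmically with radius and a sector index proportional to angle. The ring and sector counts match the sensor described above; the fovea radius r0 and the radial growth factor k are illustrative choices, not the actual design values.

```python
import math

# Log-polar pixel addressing: a minimal sketch. Ring/sector counts match
# the sensor described (76 rings, up to 128 sectors); r0 and k are
# illustrative, not taken from the actual design.

def log_polar_address(x, y, r0=7.0, k=1.05, n_rings=76, n_sectors=128):
    """Map Cartesian coordinates (same units as r0) to (ring, sector)."""
    r = math.hypot(x, y)
    if r < r0:
        return None  # inside the fovea, addressed separately
    ring = int(math.log(r / r0) / math.log(k))
    if ring >= n_rings:
        return None  # beyond the outermost ring
    theta = math.atan2(y, x) % (2 * math.pi)
    sector = int(theta / (2 * math.pi) * n_sectors)
    return ring, sector

addr = log_polar_address(10.0, 0.0)  # on the positive x-axis -> sector 0
```

Because ring spacing grows geometrically, resolution is highest at the center and falls off toward the periphery, which is what keeps the total pixel count (and hence the transmitted bit rate) low.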

  15. Protection performance evaluation regarding imaging sensors hardened against laser dazzling

    NASA Astrophysics Data System (ADS)

    Ritt, Gunnar; Koerber, Michael; Forster, Daniel; Eberle, Bernd

    2015-05-01

    Electro-optical imaging sensors are widely distributed and used for many different purposes, including civil security and military operations. However, laser irradiation can easily disturb their operational capability, so an adequate protection mechanism for electro-optical sensors against dazzling and damaging is highly desirable. Various protection technologies exist, but none of them satisfies the operational requirements without constraints. In order to evaluate the performance of various laser protection measures, we present two approaches, one based on triangle orientation discrimination and the other on structural similarity. In both, image analysis algorithms are applied to images of a standard test scene with triangular test patterns, superimposed with dazzling laser light at various irradiance levels. The evaluation methods are applied to three different sensors: a standard complementary metal-oxide-semiconductor camera, a high-dynamic-range camera with a nonlinear response curve, and a sensor hardened against laser dazzling.

  16. Fusion of spectral and panchromatic images using false color mapping and wavelet integrated approach

    NASA Astrophysics Data System (ADS)

    Zhao, Yongqiang; Pan, Quan; Zhang, Hongcai

    2006-01-01

    With the development of sensor technology, new image sensors have been introduced that provide a greater range of information to users. But because of radiated-power limitations, there will always be a trade-off between spatial and spectral resolution in the images captured by a given sensor. Images with high spatial resolution can locate objects with high accuracy, whereas images with high spectral resolution can be used to identify materials. Many applications in remote sensing require fusing low-resolution imaging spectrometer images with panchromatic images to identify materials at high resolution in clutter. A pixel-based fusion algorithm integrating false color mapping and the wavelet transform is presented in this paper; the resulting images have a higher information content than either of the original images and retain sensor-specific image information. Simulation results show that this algorithm can enhance the visibility of certain details and preserve the differences between materials.
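    A wavelet-integrated fusion step of the kind described can be sketched with a single-level 2D Haar transform: the spectral band keeps its low-frequency (approximation) coefficients, while the spatial detail coefficients are taken from the panchromatic image. This is a toy illustration of the principle, not the authors' exact algorithm.

```python
# Single-level 2D Haar fusion: a toy sketch of wavelet-based pan-sharpening.
# The spectral band keeps its approximation; spatial detail is injected
# from the panchromatic image. Not the paper's exact algorithm.

def haar2(img):
    """One-level 2D Haar transform of a 2Nx2N list-of-lists image."""
    n = len(img) // 2
    LL = [[0.0] * n for _ in range(n)]; LH = [[0.0] * n for _ in range(n)]
    HL = [[0.0] * n for _ in range(n)]; HH = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            a, b = img[2 * i][2 * j], img[2 * i][2 * j + 1]
            c, d = img[2 * i + 1][2 * j], img[2 * i + 1][2 * j + 1]
            LL[i][j] = (a + b + c + d) / 4; LH[i][j] = (a - b + c - d) / 4
            HL[i][j] = (a + b - c - d) / 4; HH[i][j] = (a - b - c + d) / 4
    return LL, LH, HL, HH

def ihaar2(LL, LH, HL, HH):
    """Exact inverse of haar2."""
    n = len(LL)
    img = [[0.0] * (2 * n) for _ in range(2 * n)]
    for i in range(n):
        for j in range(n):
            ll, lh, hl, hh = LL[i][j], LH[i][j], HL[i][j], HH[i][j]
            img[2 * i][2 * j]         = ll + lh + hl + hh
            img[2 * i][2 * j + 1]     = ll - lh + hl - hh
            img[2 * i + 1][2 * j]     = ll + lh - hl - hh
            img[2 * i + 1][2 * j + 1] = ll - lh - hl + hh
    return img

def fuse(band, pan):
    """Spectral band approximation + panchromatic detail -> fused band."""
    LLb, _, _, _ = haar2(band)
    _, LHp, HLp, HHp = haar2(pan)
    return ihaar2(LLb, LHp, HLp, HHp)

# A flat spectral band gains the edge present in the pan image.
fused = fuse([[1.0, 1.0], [1.0, 1.0]], [[0.0, 2.0], [0.0, 2.0]])
```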

  17. International Symposium on Applications of Ferroelectrics

    DTIC Science & Technology

    1993-02-01

    …The technology of producing monolithic IR detectors using… neighborhood of the Curie point. A high dielectric constant is also useful in many imaging applications… a linear array of sensors. Each detector (pixel), or group of them, can thus produce a signal… Work on new infrared (IR) sensors is at present… recorded. The signal beam was expanded to 15 mm to carry images and was then… certain input image (or a partial one) is illuminated only with the…

  18. Image quality testing of assembled IR camera modules

    NASA Astrophysics Data System (ADS)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors and readout electronics are increasingly becoming a mass-market product. At the same time, steady improvements in sensor resolution in the higher-priced markets raise the requirements on the imaging performance of objectives and on the alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or end-of-line testing. In this paper we present recent development work in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information, in contrast to more traditional test methods such as minimum resolvable temperature difference (MRTD), which give only a subjective overall result. Measurable parameters include image quality via the modulation transfer function (MTF), broadband or with various bandpass filters, on- and off-axis, as well as optical parameters such as effective focal length (EFL) and distortion. If the camera module allows refocusing the optics, additional parameters such as best focus plane, image plane tilt, auto-focus quality, and chief ray angle can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment during mechanical assembly of optics to high-resolution sensors. Other important points discussed are the flexibility of the technology to test IR modules with different form factors and electrical interfaces and, not least, its suitability for fully automated measurements in mass production.

  19. Smart focal-plane technology for micro-instruments and micro-rovers

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.

    1993-01-01

    It is inevitable that micro-instruments and micro-rovers for space exploration will contain one or more focal-plane arrays for imaging, spectroscopy, or navigation. In this paper, we explore the state-of-the-art in focal-plane technology for visible sensors. Also discussed is present research activity in advanced focal-plane technology with particular emphasis on the development of smart sensors. The paper concludes with a discussion of possible future directions for the advancement of the technology.

  20. The effect of split pixel HDR image sensor technology on MTF measurements

    NASA Astrophysics Data System (ADS)

    Deegan, Brian M.

    2014-03-01

    Split-pixel HDR sensor technology is particularly advantageous in automotive applications, because the images are captured simultaneously rather than sequentially, thereby reducing motion blur. However, split-pixel technology introduces artifacts into MTF measurement. To achieve an HDR image, raw images are captured from both large and small sub-pixels and combined to make the HDR output. In some cases, a large sub-pixel is used for long-exposure captures and a small sub-pixel for short exposures, to extend the dynamic range. The relative size of the photosensitive area of the pixel (fill factor) plays a very significant role in the measured MTF. Given an identical scene, the MTF will differ significantly depending on whether the large or small sub-pixels are used; a smaller fill factor (e.g. in the short-exposure sub-pixel) will result in higher MTF scores but significantly greater aliasing. Simulations of split-pixel sensors revealed that, when raw images from both sub-pixels are combined, there is a significant difference in rising-edge (black-to-white transition) and falling-edge (white-to-black) reproduction. Experimental results showed a difference of ~50% in measured MTF50 between the falling and rising edges of a slanted-edge test chart.
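    The fill-factor effect can be illustrated with the ideal-aperture model: a square photosensitive aperture of width w contributes |sinc(f w)| to the system MTF, so a narrower aperture passes more contrast at a given spatial frequency, at the price of more aliasing. The pixel pitch and aperture widths below are illustrative assumptions, not the sensor's actual geometry.

```python
import math

# Pixel-aperture MTF: a sketch of why a smaller fill factor measures
# higher MTF. Pitch and aperture widths are illustrative values.

def aperture_mtf(f_cyc_per_um, width_um):
    """MTF of an ideal square sampling aperture: |sinc(f * w)|."""
    x = math.pi * f_cyc_per_um * width_um
    return 1.0 if x == 0 else abs(math.sin(x) / x)

pitch = 3.0                  # um, assumed pixel pitch
f_nyq = 1.0 / (2 * pitch)    # Nyquist frequency of the sampling grid
mtf_large = aperture_mtf(f_nyq, 2.7)  # large sub-pixel, ~90% linear fill
mtf_small = aperture_mtf(f_nyq, 1.0)  # small sub-pixel, ~33% linear fill
```

At Nyquist the small aperture retains noticeably more contrast than the large one, which is exactly the bias the abstract describes when the two sub-pixel images are measured separately.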

  1. OSUS sensor integration in Army experiments

    NASA Astrophysics Data System (ADS)

    Ganger, Robert; Nowicki, Mark; Kovach, Jesse; Gregory, Timothy; Liss, Brian

    2016-05-01

    Live sensor data were obtained from an Open Standard for Unattended Sensors (OSUS, formerly Terra Harvest)-based system provided by the Army Research Lab (ARL) and fed into the Communications-Electronics Research, Development and Engineering Center (CERDEC)-sponsored Actionable Intelligence Technology Enabled Capabilities Demonstration (AI-TECD) Micro Cloud during the E15 demonstration event at Fort Dix, New Jersey in July 2015. These data enabled other technologies, such as Sensor Assignment to Mission (SAM), the Sensor Data Server (SDS), and the AI-TECD Sensor Dashboard, providing rich sensor data (including images) for use by the Company Intel Support Team (CoIST) analyst. This paper describes how the OSUS data was integrated and used in the E15 event to support CoIST operations.

  2. Color image fusion for concealed weapon detection

    NASA Astrophysics Data System (ADS)

    Toet, Alexander

    2003-09-01

    Recent advances in passive and active imaging sensor technology offer the potential to detect weapons that are concealed underneath a person's clothing or carried along in bags. Although the concealed weapons can sometimes easily be detected, it can be difficult to perceive their context, due to the non-literal nature of these images. Especially for dynamic crowd surveillance purposes it may be impossible to rapidly assess with certainty which individual in the crowd is carrying the observed weapon. Sensor fusion is an enabling technology that may be used to solve this problem. Through fusion, the signal of the sensor that depicts the weapon can be displayed in the context provided by a sensor of a different modality. We propose an image fusion scheme in which non-literal imagery can be fused with standard color images such that the result clearly displays the observed weapons in the context of the original color image. The procedure is such that the relevant contrast details from the non-literal image are transferred to the color image without altering the original color distribution of this image. The result is a natural-looking color image that fluently combines all details from both input sources. When an observer performing a dynamic crowd surveillance task detects a weapon in the scene, he will also be able to quickly determine which person in the crowd is actually carrying it (e.g. "the man with the red T-shirt and blue jeans"). The method is illustrated by the fusion of thermal 8-12 μm imagery with standard RGB color images.
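    One simple way to inject non-literal contrast detail without altering the color distribution, in the spirit of the scheme above though not the authors' exact method, is to modulate only the luminance of each color pixel and keep the RGB ratios (and hence chromaticity) fixed. The gain value below is an illustrative assumption.

```python
# Contrast-detail transfer that preserves chromaticity: a sketch, not the
# paper's exact fusion scheme. Zero-mean high-pass detail from the thermal
# channel scales luminance; RGB ratios (hue/saturation) stay unchanged.

def fuse_pixel(rgb, thermal_detail, gain=0.5):
    """Scale an RGB pixel's luminance by high-pass thermal detail."""
    r, g, b = rgb
    lum = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma weights
    if lum == 0:
        return rgb  # nothing to modulate on a black pixel
    scale = max(0.0, 1.0 + gain * thermal_detail)
    clip = lambda v: min(255.0, max(0.0, v))
    return (clip(r * scale), clip(g * scale), clip(b * scale))

# A positive thermal detail brightens the pixel; the 4:2:1 color ratio
# of this example pixel is preserved.
fused = fuse_pixel((100.0, 50.0, 25.0), thermal_detail=0.4)
```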

  3. A non-disruptive technology for robust 3D tool tracking for ultrasound-guided interventions.

    PubMed

    Mung, Jay; Vignon, Francois; Jain, Ameet

    2011-01-01

    In the past decade ultrasound (US) has become the preferred modality for a number of interventional procedures, offering excellent soft tissue visualization. Its main limitation, however, is poor visualization of surgical tools. A new method is proposed for robust 3D tracking and US image enhancement of surgical tools under US guidance. Small US sensors are mounted on existing surgical tools. As the imager emits acoustic energy, the electrical signal from the sensor is analyzed to reconstruct its 3D coordinates. These coordinates can then be used for 3D surgical navigation, similar to current-day tracking systems. A system with real-time 3D tool tracking and image enhancement was implemented on a commercial ultrasound scanner and 3D probe. Extensive water tank experiments with a tracked 0.2 mm sensor show robust performance in a wide range of imaging conditions and tool positions/orientations. The 3D tracking accuracy was 0.36 +/- 0.16 mm throughout the imaging volume of 55 degrees x 27 degrees x 150 mm. Additionally, the tool was successfully tracked inside a beating heart phantom. This paper proposes an image enhancement and tool tracking technology with sub-mm accuracy for US-guided interventions. The technology is non-disruptive, both in terms of existing clinical workflow and commercial considerations, showing promise for large-scale clinical impact.
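    The tracking principle can be sketched in 2D: the tool-mounted sensor receives every transmit beam, the beam producing the strongest received signal indicates the angle, and the one-way time of flight along that beam gives the range. The beam angles, amplitudes, and arrival times below are illustrative values, not the system's calibrated parameters.

```python
import math

# 2D sketch of sensor localization from transmit beams. Beam set and
# timings are illustrative; the real system reconstructs 3D coordinates.

SPEED_OF_SOUND = 1540.0  # m/s, nominal value for soft tissue

def locate_sensor(beam_angles_deg, amplitudes, arrival_times_s):
    """Return (x, z) of the sensor relative to the probe face."""
    i = max(range(len(amplitudes)), key=amplitudes.__getitem__)  # loudest beam
    rng = SPEED_OF_SOUND * arrival_times_s[i]  # one-way propagation distance
    theta = math.radians(beam_angles_deg[i])
    return rng * math.sin(theta), rng * math.cos(theta)

# Sensor sitting on the 0-degree beam, 77 mm deep (t = 50 us one way)
x, z = locate_sensor([-10.0, 0.0, 10.0], [0.2, 1.0, 0.3],
                     [5e-5, 5e-5, 5e-5])
```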

  4. Recent Design Development in Molecular Imaging for Breast Cancer Detection Using Nanometer CMOS Based Sensors.

    PubMed

    Nguyen, Dung C; Ma, Dongsheng Brian; Roveda, Janet M W

    2012-01-01

    As one of the key clinical imaging methods, computed X-ray tomography can be further improved using new nanometer CMOS sensors. This will enhance the current technique's ability in terms of cancer detection size, position, and detection accuracy on anatomical structures. This paper reviews designs of SOI-based CMOS sensors and their architectural design in mammography systems. Based on existing experimental results, SOI technology can provide a low-noise (SNR around 87.8 dB), high-gain (30 V/V) CMOS imager. It is also expected that, together with fast data acquisition designs, the new type of imager may play an important role in near-future high-dimensional imaging in addition to today's 2D imagers.

  5. Crosstalk quantification, analysis, and trends in CMOS image sensors.

    PubMed

    Blockstein, Lior; Yadid-Pecht, Orly

    2010-08-20

    Pixel crosstalk (CTK) consists of three components, optical CTK (OCTK), electrical CTK (ECTK), and spectral CTK (SCTK). The CTK has been classified into two groups: pixel-architecture dependent and pixel-architecture independent. The pixel-architecture-dependent CTK (PADC) consists of the sum of two CTK components, i.e., the OCTK and the ECTK. This work presents a short summary of a large variety of methods for PADC reduction. Following that, this work suggests a clear quantifiable definition of PADC. Three complementary metal-oxide-semiconductor (CMOS) image sensors based on different technologies were empirically measured, using a unique scanning technology, the S-cube. The PADC is analyzed, and technology trends are shown.

  6. Small SWAP 3D imaging flash ladar for small tactical unmanned air systems

    NASA Astrophysics Data System (ADS)

    Bird, Alan; Anderson, Scott A.; Wojcik, Michael; Budge, Scott E.

    2015-05-01

    The Space Dynamics Laboratory (SDL), working with the Naval Research Laboratory (NRL) and industry leaders Advanced Scientific Concepts (ASC) and Hood Technology Corporation, has developed a small SWAP (size, weight, and power) 3D imaging flash ladar (LAser Detection And Ranging) sensor system concept design for small tactical unmanned air systems (STUAS). The design utilizes an ASC 3D flash ladar camera and laser in a Hood Technology gyro-stabilized gimbal system. The design is an autonomous, intelligent, geo-aware sensor system that supplies real-time 3D terrain and target images. Flash ladar and visible camera data are processed at the sensor using a custom digitizer/frame grabber with compression. Mounted in the aft housing are power, controls, processing computers, and GPS/INS. The onboard processor controls pointing and handles image data, detection algorithms, and queuing. The small SWAP 3D imaging flash ladar sensor system generates georeferenced terrain and target images with a low probability of false return and <10 cm range accuracy through foliage in real time. The 3D imaging flash ladar is designed for a STUAS with a complete system SWAP estimate of <9 kg, <0.2 m3, and <350 W power. The system is modeled using LadarSIM, a MATLAB® and Simulink®-based ladar system simulator designed and developed by the Center for Advanced Imaging Ladar (CAIL) at Utah State University. We will present the concept design and modeled performance predictions.

  7. Smart CMOS image sensor for lightning detection and imaging.

    PubMed

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potential of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed within the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing, allowing efficient localization of a faint lightning pulse on the entire large-format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
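    The in-pixel frame-to-frame comparison can be modeled in software as follows: a pixel fires when its value increases by more than an adjustable threshold between consecutive frames. The threshold and frame values below are illustrative.

```python
# Frame-to-frame difference detection: a minimal software model of the
# in-pixel comparison described above. Threshold value is illustrative.

def detect_lightning(prev_frame, curr_frame, threshold=20):
    """Return (row, col) of pixels whose increase exceeds the threshold."""
    hits = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if q - p > threshold:  # only a sudden brightening counts
                hits.append((r, c))
    return hits

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 90, 10], [10, 10, 12]]
events = detect_lightning(prev, curr)  # only the bright transient fires
```

On the chip this comparison happens in every pixel in parallel, which is what makes the 1 kHz localization rate over the whole array feasible; the sequential loop here is only for illustration.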

  8. Compact LWIR sensors using spatial interferometric technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Bingham, Adam L.; Lucey, Paul G.; Knobbe, Edward T.

    2017-05-01

    Recent developments in reducing the cost and mass of hyperspectral sensors have enabled more widespread use for short-range compositional imaging applications. HSI in the long wave infrared (LWIR) is of interest because it is sensitive to spectral phenomena not accessible at other wavelengths, and because of its inherent thermal imaging capability. At Spectrum Photonics we have pursued compact LWIR hyperspectral sensors using both microbolometer arrays and compact cryogenic detector cameras. Our microbolometer-based systems are principally aimed at short standoff applications; they currently weigh 10-15 lbs, measure approximately 20x20x10 cm, offer sensitivity in the 1-2 microflick range, and have imaging times on the order of 30 seconds. Our systems that employ cryogenic arrays are aimed at medium standoff ranges, such as nadir-looking missions from UAVs. Recent work with cooled sensors has focused on Strained Layer Superlattice (SLS) technology, as these detector arrays are improving rapidly and have some advantages over HgCdTe detectors in terms of calibration stability. These sensors include full on-board processing and sensor stabilization, so they are somewhat larger than the microbolometer systems, but could be adapted to much more compact form factors. We will review our recent progress in both application areas.

  9. A novel, optical, on-line bacteria sensor for monitoring drinking water quality

    PubMed Central

    Højris, Bo; Christensen, Sarah Christine Boesgaard; Albrechtsen, Hans-Jørgen; Smith, Christian; Dahlqvist, Mathis

    2016-01-01

    Today, microbial drinking water quality is monitored through either time-consuming laboratory methods or indirect on-line measurements. Results are thus either delayed or insufficient to support proactive action. A novel, optical, on-line bacteria sensor with a 10-minute time resolution has been developed. The sensor is based on 3D image recognition, and the obtained pictures are analyzed with algorithms considering 59 quantified image parameters. The sensor counts individual suspended particles and classifies them as either bacteria or abiotic particles. The technology is capable of distinguishing and quantifying bacteria and particles in pure and mixed suspensions, and the quantification correlates with total bacterial counts. Several field applications have demonstrated that the technology can monitor changes in the concentration of bacteria, and is thus well suited for rapid detection of critical conditions such as pollution events in drinking water. PMID:27040142

  10. A novel, optical, on-line bacteria sensor for monitoring drinking water quality.

    PubMed

    Højris, Bo; Christensen, Sarah Christine Boesgaard; Albrechtsen, Hans-Jørgen; Smith, Christian; Dahlqvist, Mathis

    2016-04-04

    Today, microbial drinking water quality is monitored through either time-consuming laboratory methods or indirect on-line measurements. Results are thus either delayed or insufficient to support proactive action. A novel, optical, on-line bacteria sensor with a 10-minute time resolution has been developed. The sensor is based on 3D image recognition, and the obtained pictures are analyzed with algorithms considering 59 quantified image parameters. The sensor counts individual suspended particles and classifies them as either bacteria or abiotic particles. The technology is capable of distinguishing and quantifying bacteria and particles in pure and mixed suspensions, and the quantification correlates with total bacterial counts. Several field applications have demonstrated that the technology can monitor changes in the concentration of bacteria, and is thus well suited for rapid detection of critical conditions such as pollution events in drinking water.

  11. IR CMOS: near infrared enhanced digital imaging (Presentation Recording)

    NASA Astrophysics Data System (ADS)

    Pralle, Martin U.; Carey, James E.; Joy, Thomas; Vineis, Chris J.; Palsule, Chintamani

    2015-08-01

    SiOnyx has demonstrated imaging at light levels below 1 mLux (moonless starlight) at video frame rates with a 720P CMOS image sensor in a compact, low-latency camera. Low-light imaging is enabled by the combination of enhanced quantum efficiency in the near infrared and state-of-the-art low-noise image sensor design. The quantum efficiency enhancements are achieved by applying Black Silicon, SiOnyx's proprietary ultrafast-laser semiconductor processing technology. In the near infrared, silicon's indirect bandgap results in low absorption coefficients and long absorption lengths. The Black Silicon nanostructured layer fundamentally disrupts this paradigm by enhancing the absorption of light within a thin pixel layer, making 5 microns of silicon equivalent to over 300 microns of standard silicon. This yields a demonstrated 10-fold improvement in near-infrared sensitivity over incumbent imaging technology while maintaining complete compatibility with standard CMOS image sensor process flows. Applications include surveillance, night vision, and 1064 nm laser see-spot. Imaging performance metrics will be discussed. Demonstrated performance characteristics: pixel size 5.6 and 10 µm; array size 720P/1.3 Mpix; frame rate 60 Hz; read noise 2 e-/pixel; spectral sensitivity 400 to 1200 nm (with 10x QE at 1064 nm); daytime imaging in color (Bayer pattern); nighttime imaging under moonless starlight conditions; 1064 nm laser imaging in daytime out to 2 km.
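    The "5 microns equivalent to over 300 microns" claim is Beer-Lambert arithmetic: matching the single-pass absorbed fraction of a 300 µm standard silicon layer within 5 µm requires a 60x higher effective absorption coefficient. The absorption coefficient used below is an illustrative order-of-magnitude value for NIR silicon, not a measured figure.

```python
import math

# Beer-Lambert sketch of the absorption-equivalence claim. alpha_std is
# an illustrative order-of-magnitude value, not measured data.

def absorbed_fraction(alpha_per_um, thickness_um):
    """Fraction of light absorbed in a single pass (Beer-Lambert law)."""
    return 1.0 - math.exp(-alpha_per_um * thickness_um)

alpha_std = 0.001  # 1/um (~1/mm), assumed for standard silicon in the NIR
f_std_300um = absorbed_fraction(alpha_std, 300.0)

# Enhancement needed for a 5 um layer to match 300 um of standard silicon:
# equal exponents alpha_eff * 5 = alpha_std * 300, i.e. a 60x boost.
enhancement = 300.0 / 5.0
f_enh_5um = absorbed_fraction(alpha_std * enhancement, 5.0)
```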

  12. The Multispectral Imaging Science Working Group. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    The status and technology requirements for using multispectral sensor imagery in geographic, hydrologic, and geologic applications are examined. Critical issues in image and information science are identified.

  13. The influence of the in situ camera calibration for direct georeferencing of aerial imagery

    NASA Astrophysics Data System (ADS)

    Mitishita, E.; Barrios, R.; Centeno, J.

    2014-11-01

    The direct determination of exterior orientation parameters (EOPs) of aerial images via GNSS/INS technologies is an essential prerequisite in photogrammetric mapping nowadays. Although direct sensor orientation technologies provide a high degree of automation, the accuracy of the results depends on the quality of a group of parameters that accurately models the conditions of the system at the moment of image acquisition. One sub-group of parameters (lever-arm offsets and boresight misalignments) models the position and orientation of the sensors with respect to the IMU body frame, since not all sensors can occupy the same position and orientation on the airborne platform. Another sub-group of parameters models the internal characteristics of the sensor (IOPs). A system calibration procedure has been recommended by worldwide studies to obtain accurate parameters (mounting and sensor characteristics) for applications of direct sensor orientation. Commonly, mounting and sensor characteristics are not stable; they can vary under different flight conditions. The system calibration requires a geometric arrangement of the flight and/or control points to decouple correlated parameters, which is not available in a conventional photogrammetric flight. Considering this difficulty, this study investigates the feasibility of in situ camera calibration to improve the accuracy of the direct georeferencing of aerial images. The camera calibration uses a minimal image block, extracted from the conventional photogrammetric flight, and a control point arrangement. A digital Vexcel UltraCam XP camera connected to a POS AV system was used to acquire two photogrammetric image blocks. The blocks have different flight directions and opposite flight lines. In situ calibration procedures to compute different sets of IOPs are performed, and their results are analyzed and used in photogrammetric experiments. The IOPs from the in situ camera calibration significantly improve the accuracy of the direct georeferencing. The results of the experiments are shown and discussed.
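    The way the mounting parameters enter direct georeferencing can be sketched numerically: the camera perspective centre is the GNSS/INS position shifted by the rotated lever arm, and the camera attitude is the body attitude composed with the boresight correction. A minimal NumPy sketch (all angles, offsets, and coordinates are hypothetical, and the x-y-z rotation convention is one common choice, not necessarily the paper's):

    ```python
    import numpy as np

    def rot_xyz(omega, phi, kappa):
        """Rotation matrix from omega/phi/kappa angles in radians (x-y-z order)."""
        cx, sx = np.cos(omega), np.sin(omega)
        cy, sy = np.cos(phi), np.sin(phi)
        cz, sz = np.cos(kappa), np.sin(kappa)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    # GNSS/INS solution for one exposure (hypothetical values)
    gnss_pos = np.array([500000.0, 7180000.0, 1200.0])   # mapping-frame coords, m
    R_body = rot_xyz(np.radians(1.2), np.radians(-0.8), np.radians(45.0))

    # System-calibration mounting parameters (hypothetical values)
    lever_arm = np.array([0.15, -0.05, 1.10])            # IMU -> camera, body frame, m
    R_boresight = rot_xyz(*np.radians([0.05, -0.03, 0.10]))

    # Direct georeferencing: camera EOPs from the navigation solution
    cam_pos = gnss_pos + R_body @ lever_arm              # perspective centre
    R_cam = R_body @ R_boresight                         # camera attitude
    ```

    Any error in the lever arm or boresight angles propagates directly into these EOPs, which is why the in situ calibration of the correlated IOPs matters for the final ground accuracy.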

  14. Studies of prototype DEPFET sensors for the Wide Field Imager of Athena

    NASA Astrophysics Data System (ADS)

    Treberspurg, Wolfgang; Andritschke, Robert; Bähr, Alexander; Behrens, Annika; Hauser, Günter; Lechner, Peter; Meidinger, Norbert; Müller-Seidlitz, Johannes; Treis, Johannes

    2017-08-01

    The Wide Field Imager (WFI) of ESA's next X-ray observatory Athena will combine a high count-rate capability with a large field of view, both with state-of-the-art spectroscopic performance. To meet these demands, dedicated DEPFET active pixel detectors have been developed and operated. Due to the intrinsic amplification of detected signals, they are well suited to achieving high-speed, low-noise performance. Different fabrication technologies and transistor geometries were implemented in a dedicated prototype production in the course of the development of the DEPFET sensors. The main differences between the sensors concern the shape of the transistor gate (layout) and the thickness of the gate oxide (technology). To facilitate the fabrication and testing of the resulting variety of sensors, the presented studies were carried out with 64×64 pixel detectors. Each detector comprises a control ASIC (Switcher-A), a readout ASIC (VERITAS-2), and the sensor. In this paper we give an overview of the evaluation of the different prototype sensors. The most important results, which have been decisive for identifying the optimal fabrication technology and transistor layout for subsequent sensor productions, are summarized. It is shown that these developments result in excellent performance of spectroscopic X-ray DEPFETs, with typical noise values below 2.5 e⁻ ENC at 2.5 μs per row.

  15. Extremely High-Frequency Holographic Radar Imaging of Personnel and Mail

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMakin, Douglas L.; Sheen, David M.; Griffin, Jeffrey W.

    2006-08-01

    The awareness of terrorists covertly transporting chemical warfare (CW) and biological warfare (BW) agents into government, military, and civilian facilities to harm the occupants has increased dramatically since the attacks of 9/11. Government and civilian security personnel need innovative surveillance technology that can rapidly detect these lethal agents, even when they are hidden in sealed containers and concealed either under clothing or in hand-carried items such as mailed packages or handbags. Sensor technologies that detect BW and CW agents in mail or in sealed containers carried under clothing are under development. One promising sensor technology presently under development to defeat these threats is active millimeter-wave holographic radar imaging, which can readily image concealed items behind paper, cardboard, and clothing. Feasibility imaging studies at frequencies greater than 40 GHz have been conducted to determine whether simulated biological or chemical agents concealed in mail packages or under clothing could be detected using this extremely high-frequency imaging technique. The results of this imaging study are presented in this paper.

  16. Displacement damage effects on CMOS APS image sensors induced by neutron irradiation from a nuclear reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zujun, E-mail: wangzujun@nint.ac.cn; Huang, Shaoyan; Liu, Minbo

    Experiments on displacement damage effects in CMOS APS image sensors induced by neutron irradiation from a nuclear reactor are presented. The CMOS APS image sensors are manufactured in a standard 0.35 μm CMOS technology. The flux of the neutron beam was about 1.33 × 10⁸ n/cm²·s. Three samples were exposed to 1 MeV neutron-equivalent fluences of 1 × 10¹¹, 5 × 10¹¹, and 1 × 10¹² n/cm², respectively. The mean dark signal (K_D), dark signal spike, dark signal non-uniformity (DSNU), noise (V_N), saturation output signal voltage (V_S), and dynamic range (DR) versus neutron fluence are investigated, and the degradation mechanisms of the CMOS APS image sensors are analyzed. The mean dark signal increase due to neutron displacement damage appears to be proportional to the displacement damage dose. Dark images from the CMOS APS image sensors irradiated by neutrons are presented to investigate the generation of dark signal spikes.

  17. Optimizing Distributed Sensor Placement for Border Patrol Interdiction Using Microsoft Excel

    DTIC Science & Technology

    2007-04-01

    … weather conditions, and they can be evaded by using techniques which minimize heat signatures … use of lasers and other technologies day or night (26:8) … technologies which can be used for border security. Maier [2004] developed a seismic intrusion sensor technology which uses fiber optic cables, lasers, and … is used as the base map for the network, a program originally developed by Keyhole and acquired by Google Inc. It provides satellite images of …

  18. Toward one Giga frames per second--evolution of in situ storage image sensors.

    PubMed

    Etoh, Takeharu G; Son, Dao V T; Yamada, Tetsuo; Charbon, Edoardo

    2013-04-08

    The ISIS is an ultra-fast image sensor with in-pixel storage. The evolution of the ISIS, past and near-future, is reviewed and forecast. Because the storage area must be covered with a light shield, the conventional frontside-illuminated ISIS has a limited fill factor. To achieve higher sensitivity, a backside-illuminated (BSI) ISIS was developed. To avoid direct intrusion of light and migration of signal electrons into the storage area on the frontside, a cross-sectional sensor structure with thick pnpn layers was developed and named the "Tetratified structure". By folding and looping the in-pixel storage CCDs, an image signal accumulation sensor, the ISAS, is proposed. The ISAS adds a new function, in-pixel signal accumulation, to the ultra-high-speed imaging. To achieve a much higher frame rate, a multi-collection-gate (MCG) BSI image sensor architecture is proposed, in which the photoreceptive area forms a honeycomb-like shape. The performance of a hexagonal CCD-type MCG BSI sensor is examined by simulations; the highest frame rate is theoretically more than 1 Gfps. For the near future, a stacked hybrid CCD/CMOS MCG image sensor seems most promising. The associated problems are discussed; a fine TSV process is the key technology to realize the structure.

  19. Configuration and Management of Wireless Sensor Networks

    DTIC Science & Technology

    2005-12-01

    … monitor network status. B. CONCLUSIONS AND FUTURE WORK: WSNs are an exciting and useful technology which will be used in various areas in the … int h = getSize().height; Image resizedImage = null; ImageFilter replicate = new ReplicateScaleFilter(w, h); ImageProducer prod = new …

  20. Surface chemistry and morphology in single particle optical imaging

    NASA Astrophysics Data System (ADS)

    Ekiz-Kanik, Fulya; Sevenler, Derin Deniz; Ünlü, Neşe Lortlar; Chiari, Marcella; Ünlü, M. Selim

    2017-05-01

    Biological nanoparticles such as viruses and exosomes are important biomarkers for a range of medical conditions, from infectious diseases to cancer. Biological sensors that detect whole viruses and exosomes with high specificity, yet without additional labeling, are promising because they reduce the complexity of sample preparation and may improve measurement quality by retaining information about the nanoscale physical structure of the bio-nanoparticle (BNP). Towards this end, a variety of BNP biosensor technologies have been developed, several of which are capable of enumerating the precise number of detected viruses or exosomes and analyzing the physical properties of each individual particle. Optical imaging techniques are promising candidates among a broad range of label-free nanoparticle detectors. These imaging BNP sensors detect the binding of single nanoparticles on a flat surface functionalized with a specific capture molecule or an array of multiplexed capture probes. The functionalization step confers all of the sensor's molecular specificity but can introduce an unforeseen problem: a rough and inhomogeneous surface coating can be a source of noise, as these sensors detect small local changes in optical refractive index. In this paper, we review several optical technologies for label-free BNP detection with a focus on imaging systems. We compare surface-imaging methods including dark-field imaging, surface plasmon resonance imaging, and interference reflectance imaging. We discuss the importance of ensuring consistently uniform and smooth surface coatings of capture molecules for these types of biosensors and finally summarize several methods that have been developed towards addressing this challenge.

  1. Enhanced technologies for unattended ground sensor systems

    NASA Astrophysics Data System (ADS)

    Hartup, David C.

    2010-04-01

    Progress in several technical areas is being leveraged to advantage in Unattended Ground Sensor (UGS) systems. This paper discusses advanced technologies that are appropriate for use in UGS systems. While some technologies provide evolutionary improvements, other technologies result in revolutionary performance advancements for UGS systems. Some specific technologies discussed include wireless cameras and viewers, commercial PDA-based system programmers and monitors, new materials and techniques for packaging improvements, low power cueing sensor radios, advanced long-haul terrestrial and SATCOM radios, and networked communications. Other technologies covered include advanced target detection algorithms, high pixel count cameras for license plate and facial recognition, small cameras that provide large stand-off distances, video transmissions of target activity instead of still images, sensor fusion algorithms, and control center hardware. The impact of each technology on the overall UGS system architecture is discussed, along with the advantages provided to UGS system users. Areas of analysis include required camera parameters as a function of stand-off distance for license plate and facial recognition applications, power consumption for wireless cameras and viewers, sensor fusion communication requirements, and requirements to practically implement video transmission through UGS systems. Examples of devices that have already been fielded using technology from several of these areas are given.

  2. Design and implementation of non-linear image processing functions for CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Musa, Purnawarman; Sudiro, Sunny A.; Wibowo, Eri P.; Harmanto, Suryadi; Paindavoine, Michel

    2012-11-01

    Today, solid-state image sensors are used in many applications, such as mobile phones, video surveillance systems, embedded medical imaging, and industrial vision systems. These image sensors require the integration, in or near the focal plane, of complex image processing algorithms. Such devices must meet constraints related to the quality of acquired images, the speed and performance of embedded processing, and low power consumption. To achieve these objectives, low-level analog processing allows the useful information in the scene to be extracted directly. For example, an edge detection step followed by local maxima extraction facilitates high-level processing such as object pattern recognition in a visual scene. Our goal was to design an intelligent image sensor prototype achieving high-speed image acquisition and non-linear image processing (such as local minima and maxima calculations). For this purpose, we present in this article the design and test of a 64×64 pixel image sensor built in a standard 0.35 μm CMOS technology and including non-linear image processing. The architecture of our sensor, named nLiRIC (non-Linear Rapid Image Capture), is based on the implementation of an analog Minima/Maxima Unit (MMU). This MMU calculates the minimum and maximum values (non-linear functions), in real time, in a 2×2 pixel neighbourhood. Each MMU needs 52 transistors, and the size of one pixel is 40×40 μm. The total area of the 64×64 pixel array is 12.5 mm². Our tests have shown the validity of the main functions of our new image sensor, such as fast image acquisition (10K frames per second) and minima/maxima calculation in less than one millisecond.
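    The per-neighbourhood minima/maxima operation described above is easy to mimic in software. A minimal NumPy sketch of the digital equivalent (the image values are hypothetical; the actual MMU is an analog circuit, not this code):

    ```python
    import numpy as np

    def minmax_2x2(img: np.ndarray):
        """Per-window min and max over every 2x2 neighbourhood of the image,
        the digital analogue of the sensor's analog Minima/Maxima Unit."""
        # Stack the four pixels of every 2x2 window starting at (r, c)
        windows = np.stack([img[:-1, :-1], img[:-1, 1:],
                            img[1:, :-1], img[1:, 1:]])
        return windows.min(axis=0), windows.max(axis=0)

    img = np.array([[3, 7, 1],
                    [5, 2, 8],
                    [4, 6, 0]])
    mn, mx = minmax_2x2(img)   # mn, mx are each (H-1) x (W-1)
    ```

    Each output pixel summarizes one overlapping 2×2 window, so a H×W frame yields (H-1)×(W-1) minima and maxima, matching the sliding-neighbourhood behaviour the abstract describes.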

  3. Organic-on-silicon complementary metal-oxide-semiconductor colour image sensors.

    PubMed

    Lim, Seon-Jeong; Leem, Dong-Seok; Park, Kyung-Bae; Kim, Kyu-Sik; Sul, Sangchul; Na, Kyoungwon; Lee, Gae Hwang; Heo, Chul-Joon; Lee, Kwang-Hee; Bulliard, Xavier; Satoh, Ryu-Ichi; Yagi, Tadao; Ro, Takkyun; Im, Dongmo; Jung, Jungkyu; Lee, Myungwon; Lee, Tae-Yon; Han, Moon Gyu; Jin, Yong Wan; Lee, Sangyoon

    2015-01-12

    Complementary metal-oxide-semiconductor (CMOS) colour image sensors are representative examples of light-detection devices. To achieve extremely high resolutions, the pixel sizes of the CMOS image sensors must be reduced to less than a micron, which in turn significantly limits the number of photons that can be captured by each pixel using silicon (Si)-based technology (i.e., this reduction in pixel size results in a loss of sensitivity). Here, we demonstrate a novel and efficient method of increasing the sensitivity and resolution of the CMOS image sensors by superposing an organic photodiode (OPD) onto a CMOS circuit with Si photodiodes, which consequently doubles the light-input surface area of each pixel. To realise this concept, we developed organic semiconductor materials with absorption properties selective to green light and successfully fabricated highly efficient green-light-sensitive OPDs without colour filters. We found that such a top light-receiving OPD, which is selective to specific green wavelengths, demonstrates great potential when combined with a newly designed Si-based CMOS circuit containing only blue and red colour filters. To demonstrate the effectiveness of this state-of-the-art hybrid colour image sensor, we acquired a real full-colour image using a camera that contained the organic-on-Si hybrid CMOS colour image sensor.

  4. Organic-on-silicon complementary metal–oxide–semiconductor colour image sensors

    PubMed Central

    Lim, Seon-Jeong; Leem, Dong-Seok; Park, Kyung-Bae; Kim, Kyu-Sik; Sul, Sangchul; Na, Kyoungwon; Lee, Gae Hwang; Heo, Chul-Joon; Lee, Kwang-Hee; Bulliard, Xavier; Satoh, Ryu-Ichi; Yagi, Tadao; Ro, Takkyun; Im, Dongmo; Jung, Jungkyu; Lee, Myungwon; Lee, Tae-Yon; Han, Moon Gyu; Jin, Yong Wan; Lee, Sangyoon

    2015-01-01

    Complementary metal–oxide–semiconductor (CMOS) colour image sensors are representative examples of light-detection devices. To achieve extremely high resolutions, the pixel sizes of the CMOS image sensors must be reduced to less than a micron, which in turn significantly limits the number of photons that can be captured by each pixel using silicon (Si)-based technology (i.e., this reduction in pixel size results in a loss of sensitivity). Here, we demonstrate a novel and efficient method of increasing the sensitivity and resolution of the CMOS image sensors by superposing an organic photodiode (OPD) onto a CMOS circuit with Si photodiodes, which consequently doubles the light-input surface area of each pixel. To realise this concept, we developed organic semiconductor materials with absorption properties selective to green light and successfully fabricated highly efficient green-light-sensitive OPDs without colour filters. We found that such a top light-receiving OPD, which is selective to specific green wavelengths, demonstrates great potential when combined with a newly designed Si-based CMOS circuit containing only blue and red colour filters. To demonstrate the effectiveness of this state-of-the-art hybrid colour image sensor, we acquired a real full-colour image using a camera that contained the organic-on-Si hybrid CMOS colour image sensor. PMID:25578322

  5. [Flat-panel detector technology -State-of-the-art and future prospects-].

    PubMed

    Yamazaki, Tatsuya

    2002-01-01

    A flat-panel detector (FPD) is a long-awaited technology for bringing digital X-ray imaging into the radiological department. This paper describes the state of the art and future prospects of FPD technology. The state of the art was reviewed taking the CXDI series as an example. Several FPD-based systems have been introduced into the Japanese market since CXDI-11 opened it in November 1998. With CXDI-C2 for control, CXDI-22 for table positioning, and CXDI-31 for portable use, the CXDI series fulfills the requirement of a fully digitalized radiography room. The FPD in the CXDI series comprises a scintillator (Gd₂O₂S:Tb³⁺) as the primary sensor, in which the X-rays are captured, and an amorphous silicon detector (LANMIT) as the secondary sensor, in which the fluorescent light is detected. Since the scintillator is identical to that of screen-film systems, it can be considered proven, durable, and chemically stable, and it is expected to produce the same image quality as screen-film systems. CXDI-31, a portable FPD-based system, was developed targeting thinner dimensions, light weight, durability, and high spatial resolution. Thoroughly redesigning the mechanical structure and reducing the power consumption of the readout IC realized the thinner dimensions; introducing portable notebook-PC technologies successfully combined light weight with durability; and improving the sensor process and redesigning the layout gave the sensor high resolution without compromising the signal-to-noise ratio. Future prospects were reviewed in terms of both technology and applications. Sensitivity, spatial resolution, frame rate, and portability were identified as the upcoming technology targets. Increasing gain and reducing noise will realize higher sensitivity, especially by adopting photoconductor materials such as PbI₂ or HgI₂ as the primary sensor; pixelized amplifiers will also achieve higher sensitivity. A layered sensor, designed so that the TFT layer and the sensitive layer are constructed separately, will decrease the pixel pitch below 100 μm. The FPD has been applied in radiography, mammography, and angiography. It will expand into low-dose fluoroscopy, replacing X-ray image intensifiers, and into cone-beam computed tomography. What the FPD brought was mainly a more efficient workflow for the X-ray technologist; diagnostic efficiency and patient benefit must be improved further by combining FPD technology with computer-aided diagnosis, tele-radiography, or other IT-based technologies. Such prospects may come true in the near future.

  6. Diffractive optics technology and the NASA Geostationary Earth Observatory (GEO)

    NASA Technical Reports Server (NTRS)

    Morris, G. Michael; Michaels, Robert L.; Faklis, Dean

    1992-01-01

    Diffractive (or binary) optics offers unique capabilities for the development of large-aperture, high-performance, light-weight optical systems. The Geostationary Earth Observatory (GEO) will consist of a variety of instruments to monitor the environmental conditions of the earth and its atmosphere. The aim of this investigation is to analyze the design of the GEO instrument that is being proposed and to identify the areas in which diffractive (or binary) optics technology can make a significant impact in GEO sensor design. Several potential applications where diffractive optics may indeed serve as a key technology for improving the performance and reducing the weight and cost of the GEO sensors have been identified. Applications include the use of diffractive/refractive hybrid lenses for aft-optic imagers, diffractive telescopes for narrowband imaging, subwavelength structured surfaces for anti-reflection and polarization control, and aberration compensation for reflective imaging systems and grating spectrometers.

  7. An HDR imaging method with DTDI technology for push-broom cameras

    NASA Astrophysics Data System (ADS)

    Sun, Wu; Han, Chengshan; Xue, Xucheng; Lv, Hengyi; Shi, Junxia; Hu, Changhong; Li, Xiangzhi; Fu, Yao; Jiang, Xiaonan; Huang, Liang; Han, Hongyin

    2018-03-01

    Conventionally, high-dynamic-range (HDR) imaging is based on taking two or more pictures of the same scene with different exposures. However, due to the high-speed relative motion between the camera and the scene, this technique is hard to apply to push-broom remote sensing cameras. For HDR imaging in push-broom remote sensing applications, the present paper proposes an innovative method which can generate HDR images without redundant image sensors or optical components. Specifically, this paper adopts an area-array CMOS (complementary metal-oxide-semiconductor) sensor with digital-domain time-delay-integration (DTDI) technology for imaging, instead of adopting more than one row of image sensors, and thereby takes more than one picture with different exposures. A new HDR image can then be achieved by fusing the two original images with a simple algorithm. In the experiment conducted, the dynamic range (DR) of the image increased by 26.02 dB. The proposed method is proved to be effective and has potential in other imaging applications where there is relative motion between the camera and the scene.
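    The paper does not give its fusion algorithm, but the general idea of merging a short and a long exposure of the same scene can be sketched as follows. A minimal NumPy example, assuming a 12-bit sensor and a hypothetical 4:1 exposure ratio (not the paper's actual parameters): the long exposure is preferred where it is unsaturated, after normalizing by the exposure ratio.

    ```python
    import numpy as np

    def fuse_hdr(short: np.ndarray, long: np.ndarray, ratio: float,
                 sat_level: float = 4095.0) -> np.ndarray:
        """Fuse two exposures of one scene into a single HDR radiance map.
        'ratio' is long/short exposure time; unsaturated long-exposure pixels
        are preferred because they carry less noise per unit signal."""
        long_ok = long < 0.95 * sat_level        # where the long exposure is valid
        return np.where(long_ok, long / ratio, short)

    rng = np.random.default_rng(0)
    scene = rng.uniform(0, 16000, size=(4, 4))   # true radiance, beyond 12-bit range
    short = np.clip(scene * 0.25, 0, 4095)       # short exposure
    long = np.clip(scene, 0, 4095)               # 4x longer exposure, saturates
    hdr = fuse_hdr(short, long, ratio=4.0)       # recovers the full scene range
    ```

    With an exposure ratio R, this kind of two-exposure fusion extends the dynamic range by roughly 20·log₁₀(R) dB on top of the single-exposure range, which is how a fixed-bit-depth sensor can report a DR gain such as the 26.02 dB measured in the paper.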

  8. Microfabricated optically pumped magnetometer arrays for biomedical imaging

    NASA Astrophysics Data System (ADS)

    Perry, A. R.; Sheng, D.; Krzyzewski, S. P.; Geller, S.; Knappe, S.

    2017-02-01

    Optically-pumped magnetometers have demonstrated magnetic field measurements as precise as the best superconducting quantum interference device magnetometers. Our group develops miniature alkali atom-based magnetic sensors using microfabrication technology. Our sensors do not require cryogenic cooling, and can be positioned very close to the sample, making these sensors an attractive option for development in the medical community. We will present our latest chip-scale optically-pumped gradiometer developed for array applications to image magnetic fields from the brain noninvasively. These developments should lead to improved spatial resolution, and potentially sensitive measurements in unshielded environments.

  9. Towards establishing compact imaging spectrometer standards

    USGS Publications Warehouse

    Slonecker, E. Terrence; Allen, David W.; Resmini, Ronald G.

    2016-01-01

    Remote sensing science is currently undergoing a tremendous expansion in the area of hyperspectral imaging (HSI) technology. Spurred largely by the explosive growth of Unmanned Aerial Vehicles (UAV), sometimes called Unmanned Aircraft Systems (UAS), or drones, HSI capabilities that once required access to one of only a handful of very specialized and expensive sensor systems are now miniaturized and widely available commercially. Small compact imaging spectrometers (CIS) now on the market offer a number of hyperspectral imaging capabilities in terms of spectral range and sampling. The potential uses of HSI/CIS on UAVs/UASs seem limitless. However, the rapid expansion of unmanned aircraft and small hyperspectral sensor capabilities has created a number of questions related to technological, legal, and operational capabilities. Lightweight sensor systems suitable for UAV platforms are being advertised in the trade literature at an ever-expanding rate with no standardization of system performance specifications or terms of reference. To address this issue, both the U.S. Geological Survey and the National Institute of Standards and Technology are developing draft standards. This paper presents the outline of a combined USGS/NIST cooperative strategy to develop and test a characterization methodology to meet the needs of a new and expanding UAV/CIS/HSI user community.

  10. Advanced Sensors Boost Optical Communication, Imaging

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Brooklyn, New York-based Amplification Technologies Inc. (ATI) employed Phase I and II SBIR funding from NASA's Jet Propulsion Laboratory to advance the company's solid-state photomultiplier technology. Under the SBIR, ATI developed a small, energy-efficient, extremely high-gain sensor capable of detecting light down to single photons in the near-infrared wavelength range. The company has commercialized this technology in the form of its NIRDAPD photomultiplier, ideal for use in free-space optical communications, lidar and ladar, night vision goggles, and other light-sensing applications.

  11. 4K x 2K pixel color video pickup system

    NASA Astrophysics Data System (ADS)

    Sugawara, Masayuki; Mitani, Kohji; Shimamoto, Hiroshi; Fujita, Yoshihiro; Yuyama, Ichiro; Itakura, Keijirou

    1998-12-01

    This paper describes the development of an experimental super-high-definition color video camera system. During the past several years there has been much interest in super-high-definition images as the next-generation image medium. One of the difficulties in implementing a super-high-definition motion imaging system is constructing the image-capturing section (the camera). Even state-of-the-art semiconductor technology cannot realize an image sensor with enough pixels and a sufficient output data rate for super-high-definition images. The present study is an attempt to fill this gap. The authors solve the problem with a new imaging method in which four HDTV sensors are attached to a new color-separation optics so that their pixel sample pattern forms a checkerboard pattern. A series of imaging experiments demonstrates that this technique is an effective approach to capturing super-high-definition moving images in the present situation, where no image sensors exist for such images.
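    The reconstruction step implied by multi-sensor sub-pixel sampling can be illustrated with a simplified digital model. This sketch assumes the four sensors sample four interleaved half-pixel offsets of the same scene (a simplification; the actual camera uses color-separation optics and a diagonal checkerboard arrangement):

    ```python
    import numpy as np

    def interleave4(s00, s01, s10, s11):
        """Interleave four co-registered sensor images, each sampled on a
        half-pixel-offset grid, into one image with twice the linear resolution."""
        h, w = s00.shape
        out = np.empty((2 * h, 2 * w), dtype=s00.dtype)
        out[0::2, 0::2] = s00   # sensor sampling the (0, 0) offsets
        out[0::2, 1::2] = s01   # (0, 1/2) offsets
        out[1::2, 0::2] = s10   # (1/2, 0) offsets
        out[1::2, 1::2] = s11   # (1/2, 1/2) offsets
        return out

    # Four small "HDTV" tiles combine into one frame of 4x the pixel count
    quads = [np.full((2, 2), k) for k in range(4)]
    frame = interleave4(*quads)   # shape (4, 4)
    ```

    Four 1920×1080 sensors combined this way would yield a 3840×2160 sample grid, which is how the system reaches its 4K × 2K target without a single 8-megapixel sensor.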

  12. Highly sensitive and area-efficient CMOS image sensor using a PMOSFET-type photodetector with a built-in transfer gate

    NASA Astrophysics Data System (ADS)

    Seo, Sang-Ho; Kim, Kyoung-Do; Kong, Jae-Sung; Shin, Jang-Kyoo; Choi, Pyung

    2007-02-01

    In this paper, a new CMOS image sensor is presented which uses a PMOSFET-type photodetector with a transfer gate and has a high and variable sensitivity. The proposed CMOS image sensor has been fabricated using a 0.35 μm 2-poly 4-metal standard CMOS technology and is composed of a 256 × 256 array of 7.05 × 7.10 μm pixels. The unit pixel has the configuration of a pseudo 3-transistor active pixel sensor (APS) with the PMOSFET-type photodetector with a transfer gate, which provides the function of a conventional 4-transistor APS. The generated photocurrent is controlled by the transfer gate of the PMOSFET-type photodetector. The maximum responsivity of the photodetector is larger than 1.0 × 10³ A/W without any optical lens. The fabricated 256 × 256 CMOS image sensor exhibits a good response to illumination levels as low as 5 lux.

  13. MOSES: a modular sensor electronics system for space science and commercial applications

    NASA Astrophysics Data System (ADS)

    Michaelis, Harald; Behnke, Thomas; Tschentscher, Matthias; Mottola, Stefano; Neukum, Gerhard

    1999-10-01

    The camera group of the DLR Institute of Space Sensor Technology and Planetary Exploration develops imaging instruments for scientific and space applications. One example is the ROLIS imaging system of the ESA scientific space mission 'Rosetta', which consists of a descent/down-looking imager and a close-up imager. Both are parts of the Rosetta Lander payload and will operate in the extreme environment of a cometary nucleus. The Rosetta Lander Imaging System (ROLIS) introduces a new concept for the sensor electronics, referred to as MOSES (Modular Sensor Electronics System). MOSES is a 3D-miniaturized CCD sensor electronics system based on single modules. Each of the modules has some flexibility and enables simple adaptation to specific application requirements. MOSES is mainly designed for space applications where high performance and high reliability are required; the concept, however, can also be used in other scientific or commercial applications. This paper describes the concept of MOSES, its characteristics, performance, and applications.

  14. Design Considerations For Imaging Charge-Coupled Device (ICCD) Star Sensors

    NASA Astrophysics Data System (ADS)

    McAloon, K. J.

    1981-04-01

    A development program is currently underway to produce a precision star sensor using imaging charge coupled device (ICCD) technology. The effort is the critical component development phase for the Air Force Multi-Mission Attitude Determination and Autonomous Navigation System (MADAN). A number of unique considerations have evolved in designing an arcsecond accuracy sensor around an ICCD detector. Three tiers of performance criteria are involved: at the spacecraft attitude determination system level, at the star sensor level, and at the detector level. Optimum attitude determination system performance involves a tradeoff between Kalman filter iteration time and sensor ICCD integration time. The ICCD star sensor lends itself to the use of a new approach in the functional interface between the attitude determination system and the sensor. At the sensor level image data processing tradeoffs are important for optimum sensor performance. These tradeoffs involve the sensor optic configuration, the optical point spread function (PSF) size and shape, the PSF position locator, and the microprocessor locator algorithm. Performance modelling of the sensor mandates the use of computer simulation programs. Five key performance parameters at the ICCD detector level are defined. ICCD error characteristics have also been isolated to five key parameters.

  15. Driving into the future: how imaging technology is shaping the future of cars

    NASA Astrophysics Data System (ADS)

    Zhang, Buyue

    2015-03-01

    Fueled by the development of advanced driver assistance systems (ADAS), autonomous vehicles, and the proliferation of cameras and sensors, the automotive sector is becoming a rich new domain for innovations in imaging technology. This paper presents an overview of ADAS, the important imaging and computer vision problems to solve for automotive applications, and examples of how some of these problems are solved, through which we highlight the challenges and opportunities in the automotive imaging space.

  16. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    PubMed Central

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na; Li, Shu-xia

    2014-01-01

    For meeting the forecasting target of key technology indicators in the flotation process, a BP neural network soft-sensor model based on feature extraction from flotation froth images and optimized by the shuffled cuckoo search algorithm is proposed. Based on digital image processing techniques, the color features in HSI color space, the visual texture features based on the gray-level co-occurrence matrix, and the shape characteristics based on geometric theory are extracted from the flotation froth images, respectively, as the input variables of the proposed soft-sensor model. The isometric mapping (Isomap) method is then used to reduce the input dimension, the network size, and the learning time of the BP neural network. Finally, a shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization and prediction accuracy. PMID:25133210
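    Of the three feature groups above, the gray-level co-occurrence matrix (GLCM) is the most algorithmic. A minimal sketch of GLCM texture features for one pixel displacement is shown below; the statistics (contrast, energy, homogeneity) are the classic Haralick-style definitions, not the paper's exact feature set.

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one displacement (dx, dy)."""
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1  # count level pairs
    return m / m.sum()

def glcm_features(p):
    """Classic texture statistics of a normalized co-occurrence matrix."""
    i, j = np.indices(p.shape)
    return {
        "contrast": float(((i - j) ** 2 * p).sum()),
        "energy": float((p ** 2).sum()),
        "homogeneity": float((p / (1.0 + np.abs(i - j))).sum()),
    }

# A tiny 4-level quantized "froth" patch:
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
print(round(glcm_features(glcm(img, levels=4))["contrast"], 4))  # -> 0.5833
```

    In practice several displacements and angles are averaged, and the resulting feature vector (together with color and shape features) would feed the dimensionality-reduction and BP network stages described above.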

  17. Autonomous Sensors for Large Scale Data Collection

    NASA Astrophysics Data System (ADS)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" that remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. It incorporates recent optical manufacturing developments, modern network awareness, and machine learning techniques for intelligent self-monitoring and data classification. The system achieves savings in manufacturing, deployment, and lifetime operating costs. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can easily be folded into the data collection and analysis architecture, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India for the Indian Government. The Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus is very challenging even for small instruments: the tight size, weight, and power (SWaP) constraints and the demanding thermal environment require a new generation of instruments, and the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. The instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]; other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the ground to complement the CubeSat data.

  18. Modular multiple sensors information management for computer-integrated surgery.

    PubMed

    Vaccarella, Alberto; Enquobahrie, Andinet; Ferrigno, Giancarlo; Momi, Elena De

    2012-09-01

    In the past 20 years, technological advancements have modified the concept of modern operating rooms (ORs) with the introduction of computer-integrated surgery (CIS) systems, which promise to enhance the outcomes, safety and standardization of surgical procedures. With CIS, different types of sensors (mainly position-sensing devices, force sensors and intra-operative imaging devices) are widely used. Recently, the need for combined use of different sensors has raised issues related to synchronization and spatial consistency of data from different sources of information. In this study, we propose a centralized, multi-sensor management software architecture for a distributed CIS system, which addresses sensor information consistency in both space and time. The software was developed as a data server module in a client-server architecture, using two open-source software libraries: the Image-Guided Surgery Toolkit (IGSTK) and OpenCV. The ROBOCAST project (FP7 ICT 215190), which aims at integrating robotic and navigation devices and technologies in order to improve the outcome of surgical intervention, was used as the benchmark. An experimental protocol was designed to prove the feasibility of a centralized module for data acquisition and to test the application latency when dealing with optical and electromagnetic tracking systems and ultrasound (US) imaging devices. Our results show that a centralized approach is suitable for minimizing synchronization errors; latency in the client-server communication was estimated to be 2 ms (median value) for tracking systems and 40 ms (median value) for US images. The proposed centralized approach proved adequate for neurosurgery requirements. The latency introduced by the architecture does not affect tracking system performance in terms of frame rate, and limits the US image frame rate to 25 fps, which is acceptable for providing visual feedback to the surgeon in the OR. Copyright © 2012 John Wiley & Sons, Ltd.

  19. Human perception testing methodology for evaluating EO/IR imaging systems

    NASA Astrophysics Data System (ADS)

    Graybeal, John J.; Monfort, Samuel S.; Du Bosq, Todd W.; Familoni, Babajide O.

    2018-04-01

    The U.S. Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD) Perception Lab is tasked with supporting the development of sensor systems for the U.S. Army by evaluating human performance of emerging technologies. Typical research questions involve detection, recognition and identification as a function of range, blur, noise, spectral band, image processing techniques, image characteristics, and human factors. NVESD's Perception Lab provides an essential bridge between the physics of the imaging systems and the performance of the human operator. In addition to quantifying sensor performance, perception test results can also be used to generate models of human performance and to drive future sensor requirements. The Perception Lab seeks to develop and employ scientifically valid and efficient perception testing procedures within the practical constraints of Army research, including rapid development timelines for critical technologies, unique guidelines for ethical testing of Army personnel, and limited resources. The purpose of this paper is to describe NVESD Perception Lab capabilities, recent methodological improvements designed to align our methodology more closely with scientific best practice, and to discuss goals for future improvements and expanded capabilities. Specifically, we discuss modifying our methodology to improve training, to account for human fatigue, to improve assessments of human performance, and to increase experimental design consultation provided by research psychologists. Ultimately, this paper outlines a template for assessing human perception and overall system performance related to EO/IR imaging systems.

  20. Day, night and all-weather security surveillance automation synergy from combining two powerful technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morellas, Vassilios; Johnson, Andrew; Johnston, Chris

    2006-07-01

    Thermal imaging is a proven real-world technology that brings confidence to daytime, night-time and all-weather security surveillance. Automatic image-processing intrusion detection algorithms are likewise a proven technology for system surveillance security solutions. Together, day/night/all-weather video imagery sensors and automated intrusion detection software create the power to protect early against crime, providing real-time global homeland protection, rather than simply monitoring and recording activities for post-event analysis. These solutions, whether providing automatic security surveillance at airports (to automatically detect unauthorized aircraft takeoff and landing activities) or at high-risk private, public or government facilities (to automatically detect unauthorized people or vehicle intrusions), are on the move to give end users the power to protect people, capital equipment and intellectual property against acts of vandalism and terrorism. As with any technology, infrared sensors and automatic image intrusion detection systems for global homeland security protection have clear strengths and limitations compared with more common day and night vision technologies or traditional manual man-in-the-loop intrusion detection security systems. This paper addresses these strengths and limitations. False Alarm Rate (FAR) and False Positive Rate (FPR) are examples of key customer system acceptability metrics, and Noise Equivalent Temperature Difference (NETD) and Minimum Resolvable Temperature are examples of sensor-level performance acceptability metrics.
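    The system-level acceptability metrics named above reduce to simple ratios over counted events. The sketch below shows one common set of definitions; the exact normalization (per event, per hour, per sensor) varies between programs, so these formulas are illustrative rather than the paper's definitions.

```python
def detection_metrics(tp, fp, fn, tn, hours):
    """Illustrative acceptance metrics for an automated intrusion detector:
    false-positive rate over evaluated non-events, probability of
    detection, and false alarms normalized per hour of surveillance.
    tp/fp/fn/tn are true/false positive/negative event counts."""
    fpr = fp / (fp + tn)          # fraction of non-intrusions flagged
    pd = tp / (tp + fn)           # fraction of real intrusions caught
    far_per_hour = fp / hours     # operator-facing nuisance-alarm load
    return fpr, pd, far_per_hour

print(detection_metrics(tp=90, fp=5, fn=10, tn=95, hours=50))
# -> (0.05, 0.9, 0.1)
```

    Sensor-level metrics such as NETD are measured physically (smallest blackbody temperature difference producing unity signal-to-noise) and cannot be computed from event counts alone.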

  1. Active-Pixel Image Sensor With Analog-To-Digital Converters

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.; Mendis, Sunetra K.; Pain, Bedabrata; Nixon, Robert H.

    1995-01-01

    Proposed single-chip integrated-circuit image sensor contains 128 x 128 array of active pixel sensors at 50-micrometer pitch. Output terminals of all pixels in each given column connected to analog-to-digital (A/D) converter located at bottom of column. Pixels scanned in semiparallel fashion, one row at time; during time allocated to scanning row, outputs of all active pixel sensors in row fed to respective A/D converters. Design of chip based on complementary metal oxide semiconductor (CMOS) technology, and individual circuit elements fabricated according to 2-micrometer CMOS design rules. Active pixel sensors designed to operate at video rate of 30 frames/second, even at low light levels. A/D scheme based on first-order Sigma-Delta modulation.
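    The column-parallel A/D scheme named above is first-order Sigma-Delta modulation. A minimal behavioral model is sketched below, assuming a unipolar input range and a 1-bit feedback DAC (the threshold choice and signal range are illustrative, not taken from the chip design): averaging (decimating) the output bitstream recovers the pixel level.

```python
def sigma_delta(samples, vref=1.0):
    """First-order sigma-delta modulator: the integrator accumulates the
    error between the input and the 1-bit feedback DAC; the comparator
    decisions form the output bitstream."""
    integ, bit, bits = 0.0, 0, []
    for x in samples:
        integ += x - bit * vref              # accumulate quantization error
        bit = 1 if integ >= vref / 2 else 0  # 1-bit quantizer
        bits.append(bit)
    return bits

# A constant input at 25% of full scale, oversampled 64x:
bits = sigma_delta([0.25] * 64)
print(sum(bits) / len(bits))  # -> 0.25
```

    The appeal for a column-parallel imager is that each converter is just an integrator, a comparator, and a 1-bit DAC, so one fits in a 50-micrometer column pitch; resolution is traded for oversampling time.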

  2. Communications for unattended sensor networks

    NASA Astrophysics Data System (ADS)

    Nemeroff, Jay L.; Angelini, Paul; Orpilla, Mont; Garcia, Luis; DiPierro, Stefano

    2004-07-01

    The future model of the US Army's Future Combat Systems (FCS) and the Future Force reflects a combat force that utilizes lighter armor protection than the current standard. Survival on the future battlefield will be increased by the use of advanced situational awareness provided by unattended tactical and urban sensors that detect, identify, and track enemy targets and threats. Successful implementation of these critical sensor fields requires the development of advanced sensors, sensor and data-fusion processors, and a specialized communications network. To ensure warfighter and asset survivability, the communications must be capable of near real-time dissemination of the sensor data using robust, secure, stealthy, and jam resistant links so that the proper and decisive action can be taken. Communications will be provided to a wide-array of mission-specific sensors that are capable of processing data from acoustic, magnetic, seismic, and/or Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. Other, more powerful, sensor node configurations will be capable of fusing sensor data and intelligently collect and process data images from infrared or visual imaging cameras. The radio waveform and networking protocols being developed under the Soldier Level Integrated Communications Environment (SLICE) Soldier Radio Waveform (SRW) and the Networked Sensors for the Future Force Advanced Technology Demonstration are part of an effort to develop a common waveform family which will operate across multiple tactical domains including dismounted soldiers, ground sensor, munitions, missiles and robotics. These waveform technologies will ultimately be transitioned to the JTRS library, specifically the Cluster 5 requirement.

  3. A 45 nm Stacked CMOS Image Sensor Process Technology for Submicron Pixel.

    PubMed

    Takahashi, Seiji; Huang, Yi-Min; Sze, Jhy-Jyi; Wu, Tung-Ting; Guo, Fu-Sheng; Hsu, Wei-Cheng; Tseng, Tung-Hsiung; Liao, King; Kuo, Chin-Chia; Chen, Tzu-Hsiang; Chiang, Wei-Chieh; Chuang, Chun-Hao; Chou, Keng-Yu; Chung, Chi-Hsien; Chou, Kuo-Yu; Tseng, Chien-Hsien; Wang, Chuan-Joung; Yaung, Dun-Nien

    2017-12-05

    A submicron pixel's light and dark performance were studied by experiment and simulation. An advanced node technology incorporated with a stacked CMOS image sensor (CIS) is promising in that it may enhance performance. In this work, we demonstrated a low dark current of 3.2 e−/s at 60 °C, an ultra-low read noise of 0.90 e− rms, a high full well capacity (FWC) of 4100 e−, and blooming of 0.5% in 0.9 μm pixels with a pixel supply voltage of 2.8 V. In addition, the simulation study result of 0.8 μm pixels is discussed.

  4. Validation of Inertial and Optical Navigation Techniques for Space Applications with UAVS

    NASA Astrophysics Data System (ADS)

    Montaño, J.; Wis, M.; Pulido, J. A.; Latorre, A.; Molina, P.; Fernández, E.; Angelats, E.; Colomina, I.

    2015-09-01

    PERIGEO is an R&D project, funded by the INNPRONTA 2011-2014 programme of the Spanish CDTI, which investigates the use of UAV technologies and processes for the validation of space-oriented technologies. For this purpose, among different space missions and technologies, a set of activities for absolute and relative navigation is being carried out to address the attitude- and position-estimation problem from a temporal image sequence acquired by a visible-spectrum camera and/or a Light Detection and Ranging (LIDAR) sensor. The process is covered entirely: sensor measurement and data acquisition (images, LiDAR ranges and angles), data pre-processing (calibration and co-registration of camera and LIDAR data), feature and landmark extraction from the images, and image/LiDAR-based state estimation. In addition to the image-processing area, classical navigation based on inertial sensors is also included in the research. The reason for combining both approaches is to retain navigation capability in environments or missions where a radio beacon or reference signal such as GNSS is not available (for example, an atmospheric flight at Titan). The rationale behind combining these systems is that they complement each other. The INS provides accurate position, velocity and full-attitude estimates at high data rates, but needs an absolute reference observation to compensate the accumulated errors caused by inertial sensor inaccuracies. Imaging observables, on the other hand, can provide absolute and relative position and attitude estimates, but they require the sensor head to point toward the ground (which may not be possible while the carrying platform is maneuvering) and cannot deliver the hundreds of Hz that an INS can. This mutual complementarity has been observed in PERIGEO, and because of it the two are combined into one system. The inertial navigation system implemented in PERIGEO is based on a classical loosely coupled INS/GNSS approach, very similar to the implementation of the INS/imaging navigation system mentioned above. The activities envisaged in PERIGEO cover algorithm development, validation, and technology testing on UAVs under representative conditions. Past activities covered the design and development of the algorithms and systems. This paper presents the most recent activities and results in the area of image processing for robust estimation within PERIGEO, related to the definition of the hardware platforms (including sensors) and their integration in UAVs. Results from the flight campaigns in representative outdoor environments (to be performed by the time of the full paper submission) will also be presented and analyzed, together with a roadmap for future developments.
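    The loosely coupled fusion idea described above, high-rate dead reckoning corrected by occasional absolute fixes, can be sketched as a one-dimensional toy example. This is a scalar stand-in for the full INS/GNSS filter (the function name, noise values, and 1D state are illustrative assumptions, not the PERIGEO implementation):

```python
def fuse(ins_vel, gnss_pos, dt=0.1, q=0.01, r=4.0):
    """Scalar loosely coupled fusion: integrate INS velocity every step
    (prediction) and correct with each absolute GNSS position (update).
    x, p: position estimate and its variance; q: process noise per step;
    r: GNSS measurement variance. gnss_pos entries may be None (no fix)."""
    x, p = 0.0, 1.0
    estimates = []
    for v, z in zip(ins_vel, gnss_pos):
        x += v * dt                 # dead reckoning: drifts if unaided
        p += q                      # uncertainty grows without aiding
        if z is not None:
            k = p / (p + r)         # Kalman gain
            x += k * (z - x)        # absolute fix bounds the drift
            p *= 1.0 - k
        estimates.append(x)
    return estimates

# Constant 1 m/s; GNSS reports the true position at every step.
truth = [0.1 * (i + 1) for i in range(50)]
est = fuse([1.0] * 50, truth)
print(round(est[-1], 6))  # -> 5.0
```

    Replacing the GNSS position in the update with a camera- or LiDAR-derived pose gives the INS/imaging variant the abstract describes, which is why the two implementations can be "very similar".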

  5. Plenoptic camera image simulation for reconstruction algorithm verification

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim

    2014-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. Two distinct camera forms have been proposed in the literature. In the first, the camera image is focused onto the lenslet array, and the lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. In the second plenoptic form, the lenslet array relays the image formed by the camera lens to the sensor. We have developed a raytracing package that can simulate images formed by a generalized version of the plenoptic camera. Several rays from each sensor pixel are traced backwards through the system to define a cone of rays emanating from the entrance pupil of the camera lens. Objects that lie within this cone are integrated to yield a color and exposure level for that pixel. To speed processing, three-dimensional objects are approximated as a series of planes at different depths. Repeating this process for each pixel in the sensor leads to a simulated plenoptic image on which different reconstruction algorithms can be tested.

  6. Image Processing Occupancy Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Image Processing Occupancy Sensor, or IPOS, is a novel sensor technology developed at the National Renewable Energy Laboratory (NREL). The sensor is based on low-cost embedded microprocessors widely used in the smartphone industry and leverages mature open-source computer vision software libraries. Compared to the traditional passive infrared and ultrasonic motion sensors currently used for occupancy detection, IPOS has shown the potential for improved accuracy and a richer set of feedback signals for occupant-optimized lighting, daylighting, temperature setback, ventilation control, and other occupancy- and location-based uses. Unlike traditional passive infrared (PIR) or ultrasonic occupancy sensors, which infer occupancy based only on motion, IPOS uses digital image-based analysis to detect and classify various aspects of occupancy, including the presence of occupants regardless of motion, their number, location, and activity levels, as well as the illuminance properties of the monitored space. The IPOS software leverages the recent availability of low-cost embedded computing platforms, computer vision software libraries, and camera elements.
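    The simplest image-based occupancy cue, which motion sensors approximate indirectly, is frame differencing. The sketch below flags a space as occupied when enough pixels change between consecutive grayscale frames; IPOS itself goes well beyond this (presence without motion, counting, localization), so the thresholds and function name here are illustrative only.

```python
import numpy as np

def occupied(prev_frame, frame, diff_thresh=25, pixel_frac=0.01):
    """Flag occupancy when the fraction of pixels that changed between two
    consecutive grayscale frames exceeds pixel_frac. diff_thresh is the
    per-pixel intensity change (0-255) counted as 'changed'."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return float((diff > diff_thresh).mean()) > pixel_frac

empty = np.zeros((120, 160), dtype=np.uint8)
entered = empty.copy()
entered[40:80, 60:100] = 200      # a bright region appears in the scene
print(occupied(empty, empty), occupied(empty, entered))  # -> False True
```

    Detecting presence without motion, one of IPOS's stated advantages over PIR, requires comparing against a learned background model or running a person detector rather than differencing adjacent frames.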

  7. CMOS image sensors: State-of-the-art

    NASA Astrophysics Data System (ADS)

    Theuwissen, Albert J. P.

    2008-09-01

    This paper gives an overview of the state of the art of CMOS image sensors. The main focus is on the shrinkage of the pixels: what is the effect on the performance characteristics of the imagers and on the various physical parameters of the camera? How is the CMOS pixel architecture optimized to cope with the negative performance effects of the ever-shrinking pixel size? On the other hand, the smaller dimensions in CMOS technology allow further integration at the column level and even at the pixel level. This will make CMOS imagers even smarter than they already are.

  8. CMOS Cell Sensors for Point-of-Care Diagnostics

    PubMed Central

    Adiguzel, Yekbun; Kulah, Haluk

    2012-01-01

    The burden of health-care related services in a global era with continuously increasing population and inefficient dissipation of the resources requires effective solutions. From this perspective, point-of-care diagnostics is a demanded field in clinics. It is also necessary both for prompt diagnosis and for providing health services evenly throughout the population, including the rural districts. The requirements can only be fulfilled by technologies whose productivity has already been proven, such as complementary metal-oxide-semiconductors (CMOS). CMOS-based products can enable clinical tests in a fast, simple, safe, and reliable manner, with improved sensitivities. Portability due to diminished sensor dimensions and compactness of the test set-ups, along with low sample and power consumption, is another vital feature. CMOS-based sensors for cell studies have the potential to become essential counterparts of point-of-care diagnostics technologies. Hence, this review attempts to inform on the sensors fabricated with CMOS technology for point-of-care diagnostic studies, with a focus on CMOS image sensors and capacitance sensors for cell studies. PMID:23112587

  9. CMOS cell sensors for point-of-care diagnostics.

    PubMed

    Adiguzel, Yekbun; Kulah, Haluk

    2012-01-01

    The burden of health-care related services in a global era with continuously increasing population and inefficient dissipation of the resources requires effective solutions. From this perspective, point-of-care diagnostics is a demanded field in clinics. It is also necessary both for prompt diagnosis and for providing health services evenly throughout the population, including the rural districts. The requirements can only be fulfilled by technologies whose productivity has already been proven, such as complementary metal-oxide-semiconductors (CMOS). CMOS-based products can enable clinical tests in a fast, simple, safe, and reliable manner, with improved sensitivities. Portability due to diminished sensor dimensions and compactness of the test set-ups, along with low sample and power consumption, is another vital feature. CMOS-based sensors for cell studies have the potential to become essential counterparts of point-of-care diagnostics technologies. Hence, this review attempts to inform on the sensors fabricated with CMOS technology for point-of-care diagnostic studies, with a focus on CMOS image sensors and capacitance sensors for cell studies.

  10. Review of Current Aided/Automatic Target Acquisition Technology for Military Target Acquisition Tasks

    DTIC Science & Technology

    2011-07-01

    radar [e.g., synthetic aperture radar (SAR)]. EO/IR includes multi- and hyperspectral imaging. Signal processing of data from nonimaging sensors, such... enhanced recognition ability. Other nonimage-based techniques, such as category theory, hierarchical systems, and gradient index flow, are possible... the battlefield. There is a plethora of imaging and nonimaging sensors on the battlefield that are being networked together for transmission of

  11. 3D imaging of translucent media with a plenoptic sensor based on phase space optics

    NASA Astrophysics Data System (ADS)

    Zhang, Xuanzhe; Shu, Bohong; Du, Shaojun

    2015-05-01

    Traditional stereo imaging technology does not work for dynamic translucent media, because such media carry no obvious characteristic patterns and multiple cameras are not permitted in most cases. Phase space optics can solve this problem by extracting depth information directly from the "space-spatial frequency" distribution of the target, obtained by a plenoptic sensor with a single lens. This paper discusses the representation of depth information in phase space data and the calculation algorithms for different transparencies. A 3D imaging example of a waterfall is given at the end.

  12. Design considerations for imaging charge-coupled device

    NASA Astrophysics Data System (ADS)

    1981-04-01

    The image dissector tube, formerly used as the detector in star trackers, is being replaced by solid-state imaging devices. Advances in charge-transfer devices, such as the charge-coupled device (CCD) and the charge-injection device (CID), have made their application to star trackers an immediate reality. In 1979 the Air Force funded an American aerospace company to develop an imaging CCD (ICCD) star sensor for the Multimission Attitude Determination and Autonomous Navigation (MADAN) system. MADAN is a technology development for a strapdown attitude and navigation system that can be used on all Air Force 3-axis stabilized satellites. The system will be autonomous and will provide real-time satellite attitude and position information. The star sensor accuracy provides an overall MADAN attitude accuracy of 2 arcsec for star rates up to 300 arcsec/sec. The ICCD is basically an integrating device; its pixel resolution is not yet satisfactory for precision applications.

  13. Helmet-Mounted Displays: Sensation, Perception and Cognition Issues

    DTIC Science & Technology

    2009-01-01

    Inc., web site: http://www.metavr.com/technology/papers/syntheticvision.html Helmetag, A., Halbig, C., Kubbat, W., and Schmidt, R. (1999... system-of-systems." One integral system is a "head-borne vision enhancement" system (an HMD) that provides fused I2/IR sensor imagery (U.S. Army Natick... Using microwave, radar, I2, infrared (IR), and other technology-based imaging sensors, the "seeing" range of the human eye is extended into the

  14. Advanced digital image archival system using MPEG technologies

    NASA Astrophysics Data System (ADS)

    Chang, Wo

    2009-08-01

    Digital information and records are vital to the human race regardless of the nationalities and eras in which they were produced. Digital image content is produced at a rapid pace: cultural heritage via digitization, scientific and experimental data via high-speed imaging sensors, national defense satellite images from governments, medical and healthcare imaging records from hospitals, and personal photo collections from digital cameras. With these massive amounts of precious and irreplaceable data and knowledge, what standard technologies can be applied to preserve them while providing an interoperable framework for accessing the data across a variety of systems and devices? This paper presents an advanced digital image archival system that applies the international MPEG standards to preserve digital image content.

  15. Laser beam welding quality monitoring system based in high-speed (10 kHz) uncooled MWIR imaging sensors

    NASA Astrophysics Data System (ADS)

    Linares, Rodrigo; Vergara, German; Gutiérrez, Raúl; Fernández, Carlos; Villamayor, Víctor; Gómez, Luis; González-Camino, Maria; Baldasano, Arturo; Castro, G.; Arias, R.; Lapido, Y.; Rodríguez, J.; Romero, Pablo

    2015-05-01

    The combination of flexibility, productivity, precision and zero-defect manufacturing in future laser-based equipment is a major challenge facing this enabling technology. New sensors for online monitoring and real-time control of laser-based processes are necessary for improving product quality and increasing manufacturing yields. New approaches to fully automated, zero-defect manufacturing demand smarter heads in which lasers, optics, actuators, sensors and electronics are integrated in a single compact and affordable device. Many defects in laser-based manufacturing processes arise from instabilities in the dynamics of the laser process, so temperature and heat dynamics are key parameters to monitor. Low-cost infrared imagers with a high speed of response will constitute the next generation of sensors for monitoring and control systems for laser-based processes, capable of providing simultaneous information about heat dynamics and spatial distribution. This work describes the results of using an innovative low-cost high-speed infrared imager based on the first quantum infrared imager on the market monolithically integrated with a Si-CMOS ROIC. The sensor provides low-resolution images at frame rates up to 10 kHz in uncooled operation, at the same cost as traditional infrared spot detectors. To demonstrate the capabilities of the new sensor technology, a low-cost camera was assembled on a standard production laser welding head, allowing melt-pool images to be registered at frame rates of 10 kHz. In addition, specific software was developed for defect detection and classification. Multiple laser welding processes were recorded in order to study the performance of the system and its application to real-time monitoring of laser welding processes. During the experiments, different types of defects were produced and monitored, and the classifier was fed with the experimental images obtained. Self-learning strategies were implemented with very promising results, demonstrating the feasibility of using low-cost high-speed infrared imagers to advance towards real-time, in-line, zero-defect production systems.

  16. Dual-mode lensless imaging device for digital enzyme linked immunosorbent assay

    NASA Astrophysics Data System (ADS)

    Sasagawa, Kiyotaka; Kim, Soo Heyon; Miyazawa, Kazuya; Takehara, Hironari; Noda, Toshihiko; Tokuda, Takashi; Iino, Ryota; Noji, Hiroyuki; Ohta, Jun

    2014-03-01

    Digital enzyme-linked immunosorbent assay (ELISA) is an ultra-sensitive technology for detecting biomarkers, viruses, etc. As in the conventional ELISA technique, a target molecule is bound to an enzyme-labeled antibody by the antigen-antibody reaction. In this technology, a femtoliter droplet chamber array is used as the reaction chambers: owing to the small volume, the concentration of fluorescent product generated by a single enzyme becomes sufficient for detection by fluorescence microscopy. In this work, we demonstrate a miniaturized lensless imaging device for digital ELISA using a custom image sensor. The pixel array of the sensor is coated with a 20 μm-thick yellow filter to eliminate excitation light at 470 nm and covered by a fiber optic plate (FOP) to protect the sensor without resolution degradation. The droplet chamber array, formed on a 50 μm-thick glass plate, is placed directly on the FOP. In digital ELISA, microbeads coated with antibody are loaded into the droplet chamber array, and the ratio of fluorescent to non-fluorescent chambers containing microbeads is observed. In fluorescence imaging, the spatial resolution is degraded by spreading through the glass plate because the fluorescence is emitted omnidirectionally; this degradation is compensated by image processing, and a resolution of ~35 μm was achieved. In bright-field imaging, the projected images of the beads under collimated illumination are observed. By varying the incident angle and compositing the images, the microbeads were successfully imaged.
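    The "ratio of fluorescent to non-fluorescent chambers" is the digital-ELISA readout because enzyme labels distribute over beads following Poisson statistics: P(0 labels) = exp(−λ) = 1 − f_on, so λ = −ln(1 − f_on). This is the standard digital-assay relation, sketched below, not a formula quoted from this paper (the function name is illustrative).

```python
import math

def mean_labels_per_bead(f_on):
    """Average number of enzyme labels per bead inferred from the
    fraction of fluorescent ('on') chambers, assuming labels are
    Poisson-distributed over the beads: lambda = -ln(1 - f_on)."""
    if not 0.0 <= f_on < 1.0:
        raise ValueError("f_on must lie in [0, 1)")
    return -math.log(1.0 - f_on)

# 10% of bead-containing chambers fluoresce:
print(round(mean_labels_per_bead(0.10), 4))  # -> 0.1054
```

    At low concentrations λ ≈ f_on, so simply counting "on" chambers is nearly linear in analyte concentration, which is what makes single-molecule counting with a lensless imager like this one attractive.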

  17. NeuroSeek dual-color image processing infrared focal plane array

    NASA Astrophysics Data System (ADS)

    McCarley, Paul L.; Massie, Mark A.; Baxter, Christopher R.; Huynh, Buu L.

    1998-09-01

    Several technologies have been developed in recent years to advance the state of the art of IR sensor systems, including affordable dual-color focal planes, on-focal-plane biologically inspired image and signal processing techniques, and spectral sensing techniques. Pacific Advanced Technology (PAT) and the Air Force Research Lab Munitions Directorate have developed a system which incorporates the best of these capabilities into a single device. The 'NeuroSeek' device integrates these technologies into an IR focal plane array (FPA) which combines multicolor midwave-IR/longwave-IR radiometric response with on-focal-plane 'smart' neuromorphic analog image processing. The readout-and-processing very-large-scale-integration integrated circuit developed under this effort will be hybridized to a dual-color detector array to produce the NeuroSeek FPA, which will have the capability to fuse multiple pixel-based sensor inputs directly on the focal plane. Great advantages are afforded by applying massively parallel processing algorithms to image data in the analog domain; the high speed and low power consumption of this device mimic operations performed in the human retina.

  18. Distributed multimodal data fusion for large scale wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Ertin, Emre

    2006-05-01

    Sensor network technology has enabled new surveillance systems in which sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large-scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities. Multiple sensor modalities improve tracking by reducing the uncertainty in the track estimates as well as resolving track-sensor data association problems. Our approach to solving the fusion problem with a large number of multimodal sensors is the construction of likelihood maps. The likelihood maps provide summary data for the solution of the detection, tracking and classification problems. The likelihood map presents the sensory information in a format that is easy for decision makers to interpret and is suitable for fusion with spatial prior information such as maps and imaging data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution. Each sensor data stream is transformed into a spatio-temporal likelihood map ideally suited for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip-based algorithm and present simulation results.
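    Under the usual assumption of conditionally independent sensor observations, fusing per-sensor likelihood maps reduces to an elementwise sum in the log domain. A toy sketch of that step (illustrative only; the paper's actual statistical model and gossip-based distributed computation are richer):

```python
import numpy as np

def fuse_likelihood_maps(log_maps):
    """Fuse per-sensor spatial log-likelihood maps by elementwise
    summation, which assumes conditionally independent sensor
    observations (a sketch of the general idea, not the paper's model)."""
    return np.sum(np.stack(log_maps), axis=0)

# Two toy 3x3 per-sensor maps: likelihood of target presence per grid cell
acoustic = np.log(np.array([[0.1, 0.2, 0.1],
                            [0.2, 0.8, 0.2],
                            [0.1, 0.2, 0.1]]))
seismic = np.log(np.array([[0.2, 0.1, 0.1],
                           [0.1, 0.9, 0.1],
                           [0.1, 0.1, 0.2]]))
fused = fuse_likelihood_maps([acoustic, seismic])
peak = np.unravel_index(np.argmax(fused), fused.shape)  # most likely cell
```

Because the fusion is a per-cell sum, it can be computed without gathering all raw data at one node, which is what makes gossip-style distributed averaging applicable.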

  19. A 45 nm Stacked CMOS Image Sensor Process Technology for Submicron Pixel †

    PubMed Central

    Takahashi, Seiji; Huang, Yi-Min; Sze, Jhy-Jyi; Wu, Tung-Ting; Guo, Fu-Sheng; Hsu, Wei-Cheng; Tseng, Tung-Hsiung; Liao, King; Kuo, Chin-Chia; Chen, Tzu-Hsiang; Chiang, Wei-Chieh; Chuang, Chun-Hao; Chou, Keng-Yu; Chung, Chi-Hsien; Chou, Kuo-Yu; Tseng, Chien-Hsien; Wang, Chuan-Joung; Yaung, Dun-Nien

    2017-01-01

    A submicron pixel's light and dark performance were studied by experiment and simulation. An advanced-node technology incorporated into a stacked CMOS image sensor (CIS) is promising because it may enhance performance. In this work, we demonstrated a low dark current of 3.2 e−/s at 60 °C, an ultra-low read noise of 0.90 e−·rms, a high full well capacity (FWC) of 4100 e−, and blooming of 0.5% in 0.9 μm pixels with a pixel supply voltage of 2.8 V. In addition, the simulation results for 0.8 μm pixels are discussed. PMID:29206162
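    The reported full well capacity and read noise imply a single-exposure dynamic range, a standard figure of merit that the abstract does not state explicitly but which follows directly from its numbers:

```python
import math

def dynamic_range_db(full_well_e, read_noise_e_rms):
    """Single-exposure dynamic range implied by full-well capacity and
    read noise (a standard figure of merit, computed here from the
    abstract's values rather than stated in the paper)."""
    return 20.0 * math.log10(full_well_e / read_noise_e_rms)

dr = dynamic_range_db(4100, 0.90)  # ~73 dB for the 0.9 um pixel
```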

  20. A model-based approach for detection of runways and other objects in image sequences acquired using an on-board camera

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Devadiga, Sadashiva; Tang, Yuan-Liang

    1994-01-01

    This research was initiated as a part of the Advanced Sensor and Imaging System Technology (ASSIST) program at NASA Langley Research Center. The primary goal of this research is the development of image analysis algorithms for the detection of runways and other objects using an on-board camera. Initial effort was concentrated on images acquired using a passive millimeter wave (PMMW) sensor. Images obtained using PMMW sensors under the poor visibility conditions caused by atmospheric fog are characterized by very low spatial resolution but good image contrast compared to images obtained using sensors operating in the visible spectrum. Algorithms developed for analyzing these images using a model of the runway and other objects are described in Part 1 of this report. Experimental verification of these algorithms was limited to a sequence of images simulated from a single frame of PMMW imagery. Subsequent development and evaluation of algorithms was done using video image sequences, which have better spatial and temporal resolution than PMMW images. Algorithms for reliable recognition of runways and accurate estimation of the spatial position of stationary objects on the ground have been developed and evaluated using several image sequences. These algorithms are described in Part 2 of this report. A list of all publications resulting from this work is also included.

  1. CMOS Active Pixel Sensors for Low Power, Highly Miniaturized Imaging Systems

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.

    1996-01-01

    The complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology has been developed over the past three years by NASA at the Jet Propulsion Laboratory, and has reached a level of performance comparable to CCDs, with greatly increased functionality at a much lower power level.

  2. Broadband image sensor array based on graphene-CMOS integration

    NASA Astrophysics Data System (ADS)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductor (CMOS) technology are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  3. A Method for Imaging Oxygen Distribution and Respiration at a Microscopic Level of Resolution.

    PubMed

    Rolletschek, Hardy; Liebsch, Gregor

    2017-01-01

    Conventional oxygen (micro-)sensors assess the oxygen concentration within a particular region or across a transect of tissue, but provide no information about its two-dimensional distribution. Here, a novel imaging technology is presented in which an optical sensor foil (i.e., a planar optode) is attached to the surface of the sample. The sensor converts a fluorescent signal into an oxygen value. Since each image captures an entire area of the sample surface, the system can deduce the distribution of oxygen at a resolution of a few micrometers. It can be deployed to dynamically monitor oxygen consumption, thereby providing a detailed respiration map at close to cellular resolution. Here, we demonstrate the application of the imaging tool to developing plant seeds; the protocol is explained step by step and some potential pitfalls are discussed.

  4. Advanced Image Processing for NASA Applications

    NASA Technical Reports Server (NTRS)

    LeMoign, Jacqueline

    2007-01-01

    The future of space exploration will involve cooperating fleets of spacecraft or sensor webs geared towards coordinated and optimal observation of Earth Science phenomena. The main advantage of such systems is to utilize multiple viewing angles as well as multiple spatial and spectral resolutions of sensors carried on multiple spacecraft but acting collaboratively as a single system. Within this framework, our research focuses on all areas related to sensing in collaborative environments, which means systems utilizing intracommunicating spatially distributed sensor pods or crafts being deployed to monitor or explore different environments. This talk will describe the general concept of sensing in collaborative environments, will give a brief overview of several technologies developed at NASA Goddard Space Flight Center in this area, and then will concentrate on specific image processing research related to that domain, specifically image registration and image fusion.

  5. Comparison of JPL-AIRSAR and DLR E-SAR images from the MAC Europe 1991 campaign over testsite Oberpfaffenhofen: Frequency and polarization dependent backscatter variations from agricultural fields

    NASA Technical Reports Server (NTRS)

    Schmullius, C.; Nithack, J.

    1992-01-01

    On July 12, the MAC Europe '91 (Multi-Sensor Airborne Campaign) took place over the Oberpfaffenhofen test site. The DLR Institute of Radio-Frequency Technology participated with its C-VV, X-VV, and X-HH Experimental Synthetic Aperture Radar (E-SAR). The high-resolution E-SAR images, with a pixel size between 1 and 2 m, and the polarimetric AIRSAR images were analyzed. Using both sensors in combination offers a unique opportunity to evaluate SAR images over a frequency range from P- to X-band and to investigate polarimetric information.

  6. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS) designed to monitor the active control system of the segments is described, a best-fit surface is defined, and the image and wavefront quality of the assembled array of reflecting panels is assessed.

  7. Electrochromic Molecular Imprinting Sensor for Visual and Smartphone-Based Detections.

    PubMed

    Capoferri, Denise; Álvarez-Diduk, Ruslan; Del Carlo, Michele; Compagnone, Dario; Merkoçi, Arben

    2018-05-01

    The electrochromic effect and molecular imprinting technology have been used to develop a sensitive and selective electrochromic sensor. The polymeric matrices obtained using the imprinting technology are robust molecular recognition elements and have the potential to mimic natural recognition entities with very high selectivity. The electrochromic behavior of iridium oxide nanoparticles (IrOx NPs) as the physicochemical transducer, together with a molecularly imprinted polymer (MIP) as the recognition layer, resulted in a fast and efficient translation of the detection event. The sensor was fabricated using screen-printing technology with indium tin oxide as a transparent working electrode; IrOx NPs were electrodeposited onto the electrode, followed by thermal polymerization of polypyrrole in the presence of the analyte (chlorpyrifos). Two different approaches were used to detect and quantify the pesticide: direct visual detection and smartphone imaging. Application of different oxidation potentials for 10 s resulted in color changes directly related to the concentration of the analyte. For smartphone imaging at a fixed potential, the color intensity of the electrode depended on the concentration of the analyte. The electrochromic sensor detects a highly toxic compound (chlorpyrifos) with a dynamic range from 100 fM to 1 mM. To the best of our knowledge, this is the first work in which an electrochromic MIP sensor uses the electrochromic properties of IrOx to detect a target analyte with high selectivity and sensitivity.

  8. Advanced shortwave infrared and Raman hyperspectral sensors for homeland security and law enforcement operations

    NASA Astrophysics Data System (ADS)

    Klueva, Oksana; Nelson, Matthew P.; Gardner, Charles W.; Gomer, Nathaniel R.

    2015-05-01

    The proliferation of chemical and explosive threats as well as illicit drugs continues to pose an escalating danger to civilian and military personnel. Conventional means of detecting and identifying hazardous materials often require the use of reagents and/or physical sampling, which is a time-consuming, costly and often dangerous process. Stand-off detection allows the operator to detect threat residues from a safer distance, minimizing danger to people and equipment. Current fielded technologies for standoff detection of chemical and explosive threats are challenged by low area search rates, poor targeting efficiency, lack of sensitivity and specificity, or the use of costly and potentially unsafe equipment such as lasers. A demand exists for stand-off systems that are fast, safe, reliable and user-friendly. To address this need, ChemImage Sensor Systems™ (CISS) has developed reagent-less, non-contact, non-destructive sensors for the real-time detection of hazardous materials based on widefield shortwave infrared (SWIR) and Raman hyperspectral imaging (HSI). Hyperspectral imaging enables automated target detection displayed in image form, making result analysis intuitive and user-friendly. The application of the CISS SWIR-HSI and Raman sensing technologies to homeland security and law enforcement for standoff detection of homemade explosives and illicit drugs and their precursors at vehicle and personnel checkpoints is discussed. The sensing technology is available in portable, robot-mounted and standalone variants. Test data are shown that support the use of SWIR and Raman HSI for explosive and drug screening at checkpoints as well as at suspected clandestine manufacturing facilities.

  9. A review of wearable technology in medicine.

    PubMed

    Iqbal, Mohammed H; Aydin, Abdullatif; Brunckhorst, Oliver; Dasgupta, Prokar; Ahmed, Kamran

    2016-10-01

    With rapid advances in technology, wearable devices have evolved and been adopted for various uses, ranging from simple devices that aid fitness to more complex devices that assist surgery. Wearable technology is broadly divided into head-mounted displays and body sensors. A broad search of the current literature revealed a total of 13 different body sensors and 11 head-mounted display devices. The latter have been reported for use in surgery (n = 7), imaging (n = 3), simulation and education (n = 2) and as navigation tools (n = 1). Body sensors have been used as vital-signs monitors (n = 9) and as posture- and fitness-related devices (n = 4). Body sensors were found to have excellent functionality in aiding patient posture and rehabilitation, while head-mounted displays can provide information to surgeons while maintaining sterility during operative procedures. There is a potential role for head-mounted wearable technology and body sensors in medicine and patient care. However, there is little scientific evidence available proving that the application of such technologies improves patient satisfaction or care. Further studies need to be conducted before a clear conclusion can be drawn. © The Royal Society of Medicine.

  10. Cross calibration of the Landsat-7 ETM+ and EO-1 ALI sensor

    USGS Publications Warehouse

    Chander, G.; Meyer, D.J.; Helder, D.L.

    2004-01-01

    As part of the Earth Observer 1 (EO-1) Mission, the Advanced Land Imager (ALI) demonstrates a potential technological direction for Landsat Data Continuity Missions. To evaluate ALI's capabilities in this role, a cross-calibration methodology has been developed using image pairs from the Landsat-7 (L7) Enhanced Thematic Mapper Plus (ETM+) and EO-1 ALI to verify the radiometric calibration of ALI with respect to the well-calibrated L7 ETM+ sensor. Results have been obtained using two different approaches. The first approach involves calibration of nearly simultaneous surface observations based on image statistics from areas observed simultaneously by the two sensors. The second approach uses vicarious calibration techniques to compare the predicted top-of-atmosphere radiance derived from ground reference data collected during the overpass to the measured radiance obtained from the sensor. The results indicate that the relative gains of the sensor chip assemblies agree with the ETM+ visible and near-infrared bands to within 2% and with the shortwave infrared bands to within 4%.
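    The first, image-statistics approach amounts to comparing band radiance statistics over an area the two sensors observed simultaneously. A minimal sketch, with illustrative radiance values that are not from the paper:

```python
def relative_gain(mean_radiance_ali, mean_radiance_etm):
    """Relative band gain of ALI with respect to ETM+ estimated from the
    mean radiances of a common, simultaneously observed area (a sketch
    of the image-statistics approach; values are illustrative only)."""
    return mean_radiance_ali / mean_radiance_etm

# Hypothetical band means in W m^-2 sr^-1 um^-1, not from the paper
gain = relative_gain(102.0, 100.0)
percent_diff = abs(gain - 1.0) * 100.0  # within the reported 2% agreement
```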

  11. A Low-Power High-Speed Smart Sensor Design for Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi

    1997-01-01

    A low-power high-speed smart sensor system based on a large-format active pixel sensor (APS) integrated with a programmable neural processor for space exploration missions is presented. The concept of building an advanced smart sensing system is demonstrated by a system-level microchip design composed of an APS sensor, a programmable neural processor, and an embedded microprocessor in SOI CMOS technology. This ultra-fast smart sensor system-on-a-chip design mimics what is inherent in biological vision systems. Moreover, it is programmable and capable of performing ultra-fast machine vision processing at all levels, such as image acquisition, image fusion, image analysis, scene interpretation, and control functions. The system provides about one tera-operation-per-second of computing power, a two-order-of-magnitude increase over state-of-the-art microcomputers. Its high performance is due to massively parallel computing structures, high data throughput rates, fast learning capabilities, and advanced VLSI system-on-a-chip implementation.

  12. Error modeling and analysis of star cameras for a class of 1U spacecraft

    NASA Astrophysics Data System (ADS)

    Fowler, David M.

    As spacecraft become increasingly smaller, the demand for smaller components and sensors rises as well. The smartphone, a cutting-edge consumer technology, has an impressive collection of both sensors and processing capabilities and may have the potential to fill this demand in the spacecraft market. If the technologies of a smartphone can be used in space, the cost of building miniature satellites would drop significantly and give a boost to the aerospace and scientific communities. Concentrating on the problem of spacecraft orientation, this study sets out to determine the capabilities of a smartphone camera acting as a star camera. Orientations determined from star images taken with a smartphone camera are compared to those from higher-quality cameras in order to determine the associated accuracies. The results of the study reveal the abilities of low-cost off-the-shelf imagers in space and give a starting point for future research in the field. The study began with a complete geometric calibration of each analyzed imager so that all comparisons start from the same baseline. After the cameras were calibrated, image processing techniques were introduced to correct for atmospheric, lens, and image sensor effects. Orientations for each test image are calculated by identifying the stars exposed in each image. Analyses of these orientations allow the overall errors of each camera to be defined and provide insight into the abilities of low-cost imagers.
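    Once stars in an image are identified and matched to a catalog, an orientation can be computed by solving Wahba's problem; the SVD solution below is one standard method, offered as a sketch rather than as the thesis's actual algorithm:

```python
import numpy as np

def attitude_from_stars(body_vecs, inertial_vecs):
    """Least-squares rotation (Wahba's problem) mapping catalog star
    directions to measured camera directions, solved with SVD -- a
    standard method, not necessarily the one used in this work."""
    # Attitude profile matrix B = sum of outer products b_i a_i^T
    B = np.asarray(body_vecs).T @ np.asarray(inertial_vecs)
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))  # enforce det(R) = +1
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# Toy check: recover a 90-degree yaw from three star directions
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
stars = np.array([[1., 0., 0.], [0., 1., 0.], [0.577, 0.577, 0.577]])
measured = stars @ R_true.T          # simulated camera-frame directions
R_est = attitude_from_stars(measured, stars)
```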

  13. Precise color images: a high-speed color video camera system with three intensified sensors

    NASA Astrophysics Data System (ADS)

    Oki, Sachio; Yamakawa, Masafumi; Gohda, Susumu; Etoh, Takeharu G.

    1999-06-01

    High-speed imaging systems are used in a wide range of fields in science and engineering. Although high-speed camera systems have been improved to high performance, most of their applications only capture high-speed motion pictures. However, in some fields of science and technology, it is useful to obtain other information as well, such as the temperature of combustion flames, thermal plasmas and molten materials. Recent digital high-speed video imaging technology should be able to extract such information from those objects. For this purpose, we have already developed a high-speed video camera system with three intensified sensors and a cubic-prism image splitter. The maximum frame rate is 40,500 pps (pictures per second) at 64 x 64 pixels and 4,500 pps at 256 x 256 pixels, with 256-level (8-bit) intensity resolution for each pixel. The camera system can store more than 1,000 pictures continuously in solid-state memory. In order to obtain precise color images from this camera system, we need a digital technique, consisting of a computer program and ancillary instruments, to adjust the displacement of images taken from the two or three image sensors and to calibrate the relationship between incident light intensity and the corresponding digital output signals. In this paper, a digital technique for pixel-based displacement adjustment is proposed. Although the displacement of the corresponding circle was more than 8 pixels in the original image, the displacement was adjusted to within 0.2 pixels by this method.
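    A common way to achieve the sub-pixel displacement adjustment described above is to locate the cross-correlation peak between corresponding image lines and refine it with a parabolic fit. A 1-D sketch of that generic technique (the paper's own method may differ):

```python
import numpy as np

def subpixel_shift(a, b):
    """Estimate the displacement of signal b relative to a by locating
    the cross-correlation peak and refining it with a three-point
    parabolic fit -- a common sub-pixel technique, shown here only as an
    illustration of the kind of adjustment the paper describes."""
    corr = np.correlate(b, a, mode="full")
    k = int(np.argmax(corr))
    if 0 < k < len(corr) - 1:  # parabola through the peak and neighbors
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        k = k + 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    return k - (len(a) - 1)   # shift in pixels, possibly fractional

x = np.arange(64, dtype=float)
a = np.exp(-0.5 * ((x - 30) / 3) ** 2)  # reference line profile
b = np.exp(-0.5 * ((x - 32) / 3) ** 2)  # same profile shifted +2 px
shift = subpixel_shift(a, b)
```

Applied per row and per column between two sensors' images of the same scene, such estimates drive the warp that brings the three color channels into registration.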

  14. Synthetic Foveal Imaging Technology

    NASA Technical Reports Server (NTRS)

    Hoenk, Michael; Monacos, Steve; Nikzad, Shouleh

    2009-01-01

    Synthetic Foveal Imaging Technology (SyFT) is an emerging discipline of image capture and image-data processing that offers the prospect of greatly increased capabilities for real-time processing of large, high-resolution images (including mosaic images) for such purposes as automated recognition and tracking of moving objects of interest. SyFT offers a solution to the image-data processing problem arising from the proposed development of gigapixel mosaic focal-plane image-detector assemblies for very wide field-of-view imaging with high resolution for detecting and tracking sparse objects or events within narrow subfields of view. Without the means of dynamic adaptation afforded by SyFT, identifying and tracking the objects or events would require post-processing an image-data space consisting of terabytes of data. Such post-processing would be time-consuming and, as a consequence, could result in missing significant events that could not be observed at all due to their time evolution, or could not be observed at the required levels of fidelity without such real-time adaptations as adjusting focal-plane operating conditions or aiming the focal plane in different directions to track the events. The basic concept of foveal imaging is straightforward: in imitation of a natural eye, a foveal-vision image sensor is designed to offer higher resolution in a small region of interest (ROI) within its field of view. Foveal vision reduces the amount of unwanted information that must be transferred from the image sensor to external image-data-processing circuitry. This basic concept is not new in itself: image sensors based on it have been described in several previous NASA Tech Briefs articles, for example active-pixel integrated-circuit image sensors that can be programmed in real time to effect foveal artificial vision on demand. What is new in SyFT is a synergistic combination of recent advances in foveal imaging, computing, and related fields, along with a generalization of the basic foveal-vision concept to admit a synthetic fovea that is not restricted to one contiguous region of an image.
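    The data reduction that foveal readout buys can be sketched as a coarse full-frame subsample plus a full-resolution region of interest. The code below is a generic illustration with assumed frame and window sizes, not the SyFT design (whose synthetic fovea need not be one contiguous window):

```python
import numpy as np

def foveal_readout(frame, roi, step=4):
    """Sketch of foveal readout: a coarsely subsampled full frame for
    the periphery plus a full-resolution window of interest. Generic
    illustration only; sizes and subsampling factor are assumptions."""
    y0, y1, x0, x1 = roi
    coarse = frame[::step, ::step]   # periphery at reduced resolution
    fovea = frame[y0:y1, x0:x1]      # region of interest at full res
    return coarse, fovea

frame = np.zeros((1024, 1024))
coarse, fovea = foveal_readout(frame, (100, 164, 200, 264))
# Pixels read per frame: 256*256 + 64*64, versus 1024*1024 for full readout
reduction = frame.size / (coarse.size + fovea.size)
```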

  15. Simulation and ground testing with the Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Johnston, Albert S.; Bryan, Thomas C.; Book, Michael L.

    2005-01-01

    The Advanced Video Guidance Sensor (AVGS), an active sensor system that provides near-range 6-degree-of-freedom sensor data, has been developed as part of an automatic rendezvous and docking system for the Demonstration of Autonomous Rendezvous Technology (DART). The sensor determines the relative positions and attitudes between the active sensor and the passive target at ranges up to 300 meters. The AVGS uses laser diodes to illuminate retro-reflectors in the target, a solid-state imager to detect the light returned from the target, and image capture electronics and a digital signal processor to convert the video information into the relative positions and attitudes. The development of the sensor, through initial prototypes, final prototypes, and three flight units, has required a great deal of testing at every phase, and the different types of testing, their effectiveness, and their results, are presented in this paper, focusing on the testing of the flight units. Testing has improved the sensor's performance.

  16. High-speed uncooled MWIR hostile fire indication sensor

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Pantuso, F. P.; Jin, G.; Mazurenko, A.; Erdtmann, M.; Radhakrishnan, S.; Salerno, J.

    2011-06-01

    Hostile fire indication (HFI) systems require high-resolution sensor operation at extremely high speeds to capture hostile fire events, including rocket-propelled grenades, anti-aircraft artillery, heavy machine guns, anti-tank guided missiles and small arms. HFI must also be conducted in a waveband with a large available signal and low background clutter, in particular the mid-wavelength infrared (MWIR). The shortcoming of current HFI sensors in the MWIR is that their bandwidth is not sufficient to achieve the required frame rate at high sensor resolution. Furthermore, current HFI sensors require cryogenic cooling, which adds size, weight, and power (SWAP) in aircraft-mounted applications where these factors are at a premium. Based on its uncooled photomechanical infrared imaging technology, Agiltron has developed a low-SWAP, high-speed MWIR HFI sensor that breaks the bandwidth bottleneck typical of current infrared sensors. This accomplishment is made possible by using a commercial-off-the-shelf, high-performance visible imager as the readout integrated circuit and physically separating this visible imager from the MWIR-optimized photomechanical sensor chip. With this approach, we have achieved high-resolution operation of our MWIR HFI sensor at 1000 fps, which is unprecedented for an uncooled infrared sensor. We have field tested our MWIR HFI sensor against all of the hostile fire events mentioned above at several test ranges under a wide range of environmental conditions. The field testing results will be presented.
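    The bandwidth bottleneck can be made concrete with back-of-the-envelope arithmetic: raw output data rate scales as resolution times frame rate times bit depth. The resolution and bit depth below are assumptions for illustration; the abstract gives only the 1000 fps figure:

```python
def sensor_data_rate_gbps(width, height, fps, bits_per_pixel):
    """Raw output data rate implied by resolution, frame rate, and bit
    depth -- generic arithmetic showing why 1000 fps stresses sensor
    bandwidth. Resolution and bit depth are assumed, not from the paper."""
    return width * height * fps * bits_per_pixel / 1e9

# Assumed VGA-class resolution at 14-bit depth, 1000 fps
rate = sensor_data_rate_gbps(640, 480, 1000, 14)  # ~4.3 Gbit/s raw
```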

  17. Robust Light Filters Support Powerful Imaging Devices

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Infrared (IR) light filters developed by Lake Shore Cryotronics Inc. of Westerville, Ohio -- using SBIR funding from NASA's Jet Propulsion Laboratory and Langley Research Center -- employ porous silicon and metal mesh technology to provide optical filtration even at the ultra-low temperatures required by many IR sensors. With applications in the astronomy community, Lake Shore's SBIR-developed filters are also promising tools for use in terahertz imaging, the next wave of technology for applications like medical imaging, the study of fragile artworks, and airport security.

  18. EOS image data processing system definition study

    NASA Technical Reports Server (NTRS)

    Gilbert, J.; Honikman, T.; Mcmahon, E.; Miller, E.; Pietrzak, L.; Yorsz, W.

    1973-01-01

    The Image Processing System (IPS) requirements and configuration are defined for the NASA-sponsored advanced-technology Earth Observatory System (EOS). The scope included investigation and definition of IPS operational, functional, and product requirements, considering overall system constraints and interfaces (sensor, etc.). The scope also included investigation of the technical feasibility and definition of a point design reflecting the system requirements. The design phase required a survey of present and projected technology related to general- and special-purpose processors, high-density digital tape recorders, and image recorders.

  19. Toward Optical Sensors: Review and Applications

    NASA Astrophysics Data System (ADS)

    Sabri, Naseer; Aljunid, S. A.; Salim, M. S.; Ahmad, R. B.; Kamaruddin, R.

    2013-04-01

    Recent advances in fiber optics (FO) and the numerous advantages of light-based over electronic systems have boosted the utility of and demand for optical sensors in various military, industrial, and social fields: environmental and atmospheric monitoring, earth and space sciences, industrial chemical processing and biotechnology, law enforcement, digital imaging, scanning, and printing, among others. The growing ubiquity of photonic technologies has driven down prices, reducing the cost of optical fibers and lasers. Fiber optic sensors (FOSs) offer a wide spectrum of advantages over traditional sensing systems, such as small size and longer lifetime. Immunity to electromagnetic interference, amenability to multiplexing, and high sensitivity make FOSs the sensor technology of choice in several fields, including the healthcare and aerospace sectors. FOSs perform sensing tasks more reliably and robustly than conventional electrical and electronic sensors. This paper presents an executive review of optical fiber sensors and their most beneficial applications.

  20. Vision Sensors and Cameras

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access and global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer megapixel resolution, and many have very high sensitivities equivalent to ASA 12,800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to one, constant contrast sensitivity and constant colors, which are important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill factor. 3D vision, which relies on stereo or on time-of-flight high-speed circuitry, will also benefit from scaled-down CMOS technologies, because of both their smaller size and their higher speed.

  1. Development of a 750x750 pixels CMOS imager sensor for tracking applications

    NASA Astrophysics Data System (ADS)

    Larnaudie, Franck; Guardiola, Nicolas; Saint-Pé, Olivier; Vignon, Bruno; Tulet, Michel; Davancens, Robert; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Estribeau, Magali

    2017-11-01

    Solid-state optical sensors are now commonly used in space applications (navigation cameras, astronomy imagers, tracking sensors...). Although charge-coupled devices are still widely used, the CMOS image sensor (CIS), whose performance is continuously improving, is a strong challenger for Guidance, Navigation and Control (GNC) systems. This paper describes a 750x750-pixel CMOS image sensor that has been specially designed and developed for star tracker and tracking sensor applications. The detector, which features a smart architecture enabling very simple and powerful operation, is built using the AMIS 0.5 μm CMOS technology. It contains 750x750 rectangular pixels with a 20 μm pitch, and the geometry of the pixel sensitive zone is optimized for applications based on centroiding measurements. The main feature of this device is the on-chip control and timing function, which makes device operation easier by drastically reducing the number of clocks to be applied. This powerful function allows the user to operate the sensor with high flexibility: measurement of dark level from masked lines, direct access to windows of interest… A temperature probe is also integrated within the CMOS chip, allowing very precise measurement through the video stream. A complete electro-optical characterization of the sensor has been performed, evaluating the major parameters: dark current and its uniformity, read-out noise, conversion gain, fixed pattern noise, photo-response non-uniformity, quantum efficiency, modulation transfer function, and intra-pixel scanning. The characterization tests are detailed in the paper. Co60 and proton irradiation tests have also been carried out on the image sensor and the results are presented. The specific features of the 750x750 image sensor, such as its low-power CMOS design (3.3 V, power consumption < 100 mW), natural windowing (which allows efficient and robust tracking algorithms), and simple proximity electronics (thanks to the on-chip control and timing function) enabling a highly flexible architecture, make this imager a good candidate for high-performance tracking applications.
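    The centroiding measurement the abstract refers to is, in its simplest form, an intensity-weighted center of mass over a window of interest. A minimal sketch with a synthetic star spot (illustrative only; not the sensor's on-chip implementation):

    ```python
    def centroid(window):
        """Intensity-weighted center of mass of a pixel window, as (row, col)."""
        total = sum(sum(row) for row in window)
        r = sum(i * sum(row) for i, row in enumerate(window)) / total
        c = sum(j * v for row in window for j, v in enumerate(row)) / total
        return r, c

    # A symmetric 3x3 star spot centroids to the window center (1.0, 1.0);
    # sub-pixel accuracy comes from the fractional weighting on real spots.
    spot = [[0, 1, 0],
            [1, 4, 1],
            [0, 1, 0]]
    print(centroid(spot))
    ```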

  2. Wide-angle vision for road views

    NASA Astrophysics Data System (ADS)

    Huang, F.; Fehrs, K.-K.; Hartmann, G.; Klette, R.

    2013-03-01

    The field-of-view of a wide-angle image is greater than (say) 90 degrees, and so contains more information than is available in a standard image. A wide field-of-view is more advantageous than standard input for understanding the geometry of 3D scenes, and for estimating the poses of panoramic sensors within such scenes. Thus, wide-angle imaging sensors and methodologies are commonly used in various road-safety, street-surveillance, street virtual-touring, and street 3D-modelling applications. The paper reviews related wide-angle vision technologies, focusing on mathematical issues rather than on hardware.
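    One such mathematical issue is the projection model. As a hedged sketch using generic textbook models (not the paper's own formulation): a perspective camera maps incidence angle θ to image radius r = f·tan θ, which diverges as θ approaches 90 degrees, whereas an equidistant fisheye model r = f·θ keeps wide angles finite.

    ```python
    import math

    def pinhole_radius(f, theta):
        """Perspective (pinhole) image radius; diverges as theta -> 90 deg."""
        return f * math.tan(theta)

    def equidistant_radius(f, theta):
        """Equidistant fisheye model r = f*theta: bounded even near 90 deg."""
        return f * theta

    # At 85 deg incidence the pinhole radius has already blown up, while
    # the fisheye radius stays modest -- why wide-angle sensors use it.
    f, wide = 1.0, math.radians(85)
    print(pinhole_radius(f, wide), equidistant_radius(f, wide))
    ```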

  3. A highly accurate wireless digital sun sensor based on profile detecting and detector multiplexing technologies

    NASA Astrophysics Data System (ADS)

    Wei, Minsong; Xing, Fei; You, Zheng

    2017-01-01

    The rapid growth of micro- and nano-satellites requires miniaturized sun sensors that can be conveniently applied in the attitude determination subsystem. In this work, a highly accurate wireless digital sun sensor based on profile-detecting technology is proposed, which transforms the two-dimensional image into two linear profile outputs, realizing a high update rate at very low power consumption. A multiple-spot recovery approach with an asymmetric mask pattern design principle is introduced to fit the multiplexing image detector method and improve the accuracy of the sun sensor over a large field of view (FOV). A FOV determination principle based on the concept of FOV regions is also proposed to facilitate both sub-FOV analysis and whole-FOV determination. An RF MCU, together with solar cells, is utilized to achieve wireless, self-powered functionality. The prototype is approximately one-tenth the size and weight of a conventional digital sun sensor (DSS). Test results indicate that the accuracy of the prototype is 0.01° within a 100° cone FOV. Such an autonomous DSS can be equipped flexibly on a micro- or nano-satellite, especially for highly accurate remote sensing applications.
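    The profile-detecting idea (collapsing a 2-D image into two 1-D outputs) can be sketched as row and column sums, from which a single sun spot can be located without reading the full frame. This is a toy illustration of the concept, not the sensor's actual readout circuit:

    ```python
    def profiles(image):
        """Collapse a 2-D image into row-sum and column-sum profiles."""
        rows = [sum(r) for r in image]
        cols = [sum(c) for c in zip(*image)]
        return rows, cols

    def spot_position(image):
        """Locate a single bright spot from the two 1-D profiles alone."""
        rows, cols = profiles(image)
        return rows.index(max(rows)), cols.index(max(cols))

    # A bright spot at (row 1, col 2) is recovered from just the profiles,
    # which is why the readout (and power) cost is so much lower than 2-D.
    img = [[0, 0, 0, 0],
           [0, 0, 9, 0],
           [0, 0, 0, 0]]
    print(spot_position(img))  # -> (1, 2)
    ```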

  4. Changing requirements and solutions for unattended ground sensors

    NASA Astrophysics Data System (ADS)

    Prado, Gervasio; Johnson, Robert

    2007-10-01

    Unattended Ground Sensors (UGS) were first used to monitor Viet Cong activity along the Ho Chi Minh Trail in the 1960s. In the 1980s, significant improvement in the capabilities of UGS became possible with the development of digital signal processors; this led to their use as fire control devices for smart munitions (for example, the Wide Area Mine) and later to monitor the movements of mobile missile launchers. In these applications, the targets of interest were large military vehicles with strong acoustic, seismic and magnetic signatures. Currently, the requirements imposed by new terrorist threats and illegal border crossings have shifted the emphasis to the monitoring of light vehicles and foot traffic. These new requirements have changed the way UGS are used. To improve performance against targets with lower emissions, sensors are used in multi-modal arrangements. Non-imaging sensors (acoustic, seismic, magnetic and passive infrared) are now used principally as activity sensors to cue imagers and remote cameras. The availability of better imaging technology has made imagers the preferred source of "actionable intelligence". Infrared cameras are now based on uncooled detector arrays whose cost and power consumption have made their application in UGS possible. Visible light imagers are also more sensitive, extending their utility well beyond twilight. The imagers are equipped with sophisticated image processing capabilities (image enhancement, moving target detection and tracking, image compression). Various commercial satellite services now provide relatively inexpensive long-range communications, and the Internet provides fast worldwide access to the data.

  5. Concept Study of Multi Sensor Detection Imaging and Explosive Confirmation of Mines

    DTIC Science & Technology

    1998-03-20

    surface feature removal can be achieved in LMR images. Small Business Technology Transfer (STTR) Solicitation Topic 97T006 Multi-Sensor Detection...divided by the applied voltage. This is mathematically given by the admittance Y = G + jB, where G = the input conductance...of detector operation that are incorporated into a mathematical algorithm to convert detector impedance characteristics into recognizable indicators

  6. LWIR hyperspectral imaging application and detection of chemical precursors

    NASA Astrophysics Data System (ADS)

    Lavoie, Hugo; Thériault, Jean-Marc; Bouffard, François; Puckrin, Eldon; Dubé, Denis

    2012-10-01

    Detection and identification of toxic industrial chemicals (TICs) represent a major challenge in protecting first responders and the public. In this context, passive hyperspectral imaging (HSI) is a promising technology for the standoff detection and identification of chemical vapors emanating from a distant location. To investigate this method, the Department of National Defence and Public Safety Canada have mandated Defence Research and Development Canada (DRDC) - Valcartier to develop and test very long wave infrared (VLWIR) HSI sensors for standoff detection, with an initial focus on the standoff detection and identification of TICs, surrogates and precursors. Sensors such as the Improved Compact ATmospheric Sounding Interferometer (iCATSI) and the Multi-option Differential Detection and Imaging Fourier Spectrometer (MoDDIFS) were developed for this application. This paper presents the sensor developments and preliminary results of standoff detection and identification of TICs and precursors. The iCATSI and MoDDIFS sensors are based on optical differential Fourier-transform infrared (FTIR) radiometric technology and are able to detect, spectrally resolve and identify small leaks at ranges in excess of 1 km. Results from a series of trials in asymmetric-threat scenarios are reported; they serve to establish the potential of passive standoff HSI detection of TICs, precursors and surrogates.

  7. Remote sensing advances in agricultural inventories

    NASA Technical Reports Server (NTRS)

    Dragg, J. L.; Bizzell, R. M.; Trichel, M. C.; Hatch, R. E.; Phinney, D. E.; Baker, T. C.

    1984-01-01

    As the complexity of the world's agricultural industry increases, more timely and more accurate world-wide agricultural information is required to support production and marketing decisions, policy formulation, and technology development. The Inventory Technology Development Project of the AgRISTARS Program has developed new automated technology that uses data sets acquired by spaceborne remote sensors. Research has emphasized the development of multistage, multisensor sampling and estimation techniques for use in global environments where reliable ground observations are not available. This paper presents research results obtained from data sets acquired by four different sensors: Landsat MSS, Landsat TM, Shuttle-Imaging Radar and environmental satellite (AVHRR).

  8. Proton-counting radiography for proton therapy: a proof of principle using CMOS APS technology

    NASA Astrophysics Data System (ADS)

    Poludniowski, G.; Allinson, N. M.; Anaxagoras, T.; Esposito, M.; Green, S.; Manolopoulos, S.; Nieto-Camero, J.; Parker, D. J.; Price, T.; Evans, P. M.

    2014-06-01

    Despite the early recognition of the potential of proton imaging to assist proton therapy (Cormack 1963 J. Appl. Phys. 34 2722), the modality is still removed from clinical practice, with various approaches in development. For proton-counting radiography applications such as computed tomography (CT), the water-equivalent-path-length that each proton has travelled through an imaged object must be inferred. Typically, scintillator-based technology has been used in various energy/range telescope designs. Here we propose a very different alternative of using radiation-hard CMOS active pixel sensor technology. The ability of such a sensor to resolve the passage of individual protons in a therapy beam has not been previously shown. Here, such capability is demonstrated using a 36 MeV cyclotron beam (University of Birmingham Cyclotron, Birmingham, UK) and a 200 MeV clinical radiotherapy beam (iThemba LABS, Cape Town, SA). The feasibility of tracking individual protons through multiple CMOS layers is also demonstrated using a two-layer stack of sensors. The chief advantages of this solution are the spatial discrimination of events intrinsic to pixelated sensors, combined with the potential provision of information on both the range and residual energy of a proton. The challenges in developing a practical system are discussed.
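    The water-equivalent-path-length inference can be sketched with the empirical Bragg-Kleeman rule R(E) = αE^p. The constants below are approximate published fit values for protons in water (α ≈ 0.0022 cm·MeV⁻ᵖ, p ≈ 1.77), used purely for illustration; the paper does not prescribe this particular implementation.

    ```python
    def range_cm(energy_mev, alpha=0.0022, p=1.77):
        """Bragg-Kleeman proton range in water (cm); alpha, p are
        approximate literature fit values, for illustration only."""
        return alpha * energy_mev ** p

    def wepl_cm(e_in, e_out):
        """Water-equivalent path length inferred from the proton's
        entry and residual (exit) energies."""
        return range_cm(e_in) - range_cm(e_out)

    # A 200 MeV proton exiting an object at 150 MeV has crossed roughly
    # 10 cm of water-equivalent material.
    print(round(wepl_cm(200.0, 150.0), 2))
    ```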

  9. Proton-counting radiography for proton therapy: a proof of principle using CMOS APS technology

    PubMed Central

    Poludniowski, G; Allinson, N M; Anaxagoras, T; Esposito, M; Green, S; Manolopoulos, S; Nieto-Camero, J; Parker, D J; Price, T; Evans, P M

    2014-01-01

    Despite the early recognition of the potential of proton imaging to assist proton therapy the modality is still removed from clinical practice, with various approaches in development. For proton-counting radiography applications such as Computed Tomography (CT), the Water-Equivalent-Path-Length (WEPL) that each proton has travelled through an imaged object must be inferred. Typically, scintillator-based technology has been used in various energy/range telescope designs. Here we propose a very different alternative of using radiation-hard CMOS Active Pixel Sensor (APS) technology. The ability of such a sensor to resolve the passage of individual protons in a therapy beam has not been previously shown. Here, such capability is demonstrated using a 36 MeV cyclotron beam (University of Birmingham Cyclotron, Birmingham, UK) and a 200 MeV clinical radiotherapy beam (iThemba LABS, Cape Town, SA). The feasibility of tracking individual protons through multiple CMOS layers is also demonstrated using a two-layer stack of sensors. The chief advantages of this solution are the spatial discrimination of events intrinsic to pixelated sensors, combined with the potential provision of information on both the range and residual energy of a proton. The challenges in developing a practical system are discussed. PMID:24785680

  10. Novel instrumentation of multispectral imaging technology for detecting tissue abnormity

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial and temporal dimensions to visualize tissue abnormality and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components, a band-pass selection device (such as a liquid crystal tunable filter or diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently, a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices and meet the clinical requirements of medical diagnostic imaging. However, the challenging issue of how to integrate and place the micro filter mosaic chip on the target focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not previously been reported. This paper presents methods and results for integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for early-stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector produces multiple images at different center wavelengths in a single shot, eliminating the image registration procedure required by traditional multispectral imaging technologies.
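    The single-shot property of a mosaic filter can be sketched as band extraction by subsampling: each band occupies a fixed offset within a repeating filter cell, so one frame splits into one lower-resolution image per band. This is a generic illustration of the principle, not the authors' device firmware:

    ```python
    def split_bands(mosaic, pattern=2):
        """Split a frame overlaid with a pattern x pattern filter mosaic
        into one lower-resolution image per band (4 bands for 2x2)."""
        bands = {}
        for dr in range(pattern):
            for dc in range(pattern):
                bands[(dr, dc)] = [row[dc::pattern] for row in mosaic[dr::pattern]]
        return bands

    # 4x4 frame where each 2x2 cell repeats the band layout [[b0, b1], [b2, b3]];
    # one exposure yields all four band images at half resolution.
    frame = [[0, 1, 0, 1],
             [2, 3, 2, 3],
             [0, 1, 0, 1],
             [2, 3, 2, 3]]
    bands = split_bands(frame)
    print(bands[(0, 1)])
    ```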

  11. Survey of computer vision technology for UAV navigation

    NASA Astrophysics Data System (ADS)

    Xie, Bo; Fan, Xiang; Li, Sijian

    2017-11-01

    Navigation based on computer vision technology, which is highly independent, highly precise, and not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision were mainly applied to autonomous ground robots; in recent years, visual navigation systems have been widely applied to unmanned aircraft, deep space probes and underwater robots, further stimulating research on integrated navigation algorithms based on computer vision. In China, with the development of many types of UAV and the start of the third phase of the lunar exploration program, there has been significant progress in the study of visual navigation. The paper surveys the development of computer-vision-based navigation in UAV research and concludes that visual navigation is mainly applied in three areas. (1) Acquisition of UAV navigation parameters: attitude, position and velocity information can be obtained from the relationship between sensor images and the carrier's attitude, between instant matching images and reference images, and between the carrier's velocity and features of sequential images. (2) Autonomous obstacle avoidance: among the many ways to achieve obstacle avoidance in UAV navigation, methods based on computer vision, including feature matching, template matching and image-frame methods, are mainly introduced. (3) Target tracking and positioning: using the acquired images, UAV position is calculated with optical flow methods, the MeanShift and CamShift algorithms, Kalman filtering and particle filter algorithms. The paper then describes three kinds of mainstream visual systems. (1) High-speed visual systems use a parallel structure in which image detection and processing are carried out at high speed; these are applied in rapid response systems. (2) Distributed-network visual systems place several discrete image data acquisition sensors in different locations, which transmit image data to a node processor to increase the sampling rate. (3) Observer-combined visual systems pair image sensors with external observers to make up for a lack of visual equipment. To some degree, these systems overcome the shortcomings of early visual systems, including low frame rate, low processing efficiency and strong noise. Finally, the difficulties of computer-vision-based navigation in practical applications are briefly discussed: (1) the heavy workload of image operations makes real-time performance poor; (2) strong environmental effects make anti-interference ability poor; (3) because such systems only work in particular environments, their adaptability is poor.
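    Of the tracking methods the survey lists, Kalman filtering is the simplest to sketch. A minimal one-dimensional, constant-position filter smoothing noisy position measurements (illustrative noise values; not code from any surveyed system):

    ```python
    def kalman_1d(zs, q=1e-3, r=0.25):
        """Minimal 1-D Kalman filter over measurements zs.
        q: process noise variance, r: measurement noise variance
        (both illustrative)."""
        x, p = zs[0], 1.0
        estimates = []
        for z in zs:
            p += q                   # predict: uncertainty grows
            k = p / (p + r)          # Kalman gain
            x += k * (z - x)         # update toward the measurement
            p *= (1.0 - k)           # uncertainty shrinks after update
            estimates.append(x)
        return estimates

    # Noisy position reports around a true value of ~10 are smoothed.
    noisy = [10.2, 9.8, 10.4, 9.9, 10.1, 10.05]
    est = kalman_1d(noisy)
    print(est[-1])
    ```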

  12. Design of an Intelligent Front-End Signal Conditioning Circuit for IR Sensors

    NASA Astrophysics Data System (ADS)

    de Arcas, G.; Ruiz, M.; Lopez, J. M.; Gutierrez, R.; Villamayor, V.; Gomez, L.; Montojo, Mª. T.

    2008-02-01

    This paper presents the design of an intelligent front-end signal conditioning system for IR sensors. The system has been developed as an interface between a PbSe IR sensor matrix and a TMS320C67x digital signal processor. The system architecture ensures scalability, so it can be used for sensors with different matrix sizes. It includes an integrator-based signal conditioning circuit, a data acquisition converter block, and an FPGA-based advanced control block that permits the inclusion of high-level image preprocessing routines, such as faulty pixel detection and sensor calibration, in the signal conditioning front-end. During the design phase, virtual instrumentation technologies proved to be a very valuable prototyping tool when choosing the best A/D converter type for the application, and their use significantly reduced development time.
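    Faulty pixel detection and replacement, named above as a front-end preprocessing routine, can be sketched as a generic median-of-neighbours correction (the paper's actual FPGA routine is not described in this abstract):

    ```python
    def fix_faulty_pixels(img, bad):
        """Replace each flagged faulty pixel with the median of its valid
        4-neighbours -- a simple front-end correction sketch."""
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for (r, c) in bad:
            nbrs = [img[rr][cc]
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if 0 <= rr < h and 0 <= cc < w and (rr, cc) not in bad]
            nbrs.sort()
            out[r][c] = nbrs[len(nbrs) // 2]   # median of surviving neighbours
        return out

    # A hot pixel (999) in a flat field of 5s is replaced by its neighbours.
    img = [[5, 5, 5],
           [5, 999, 5],
           [5, 5, 5]]
    print(fix_faulty_pixels(img, {(1, 1)}))
    ```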

  13. Evaluation of Algorithms for Compressing Hyperspectral Data

    NASA Technical Reports Server (NTRS)

    Cook, Sid; Harsanyi, Joseph; Faber, Vance

    2003-01-01

    With EO-1 Hyperion in orbit, NASA is showing its continued commitment to hyperspectral imaging (HSI). As HSI sensor technology continues to mature, the ever-increasing amounts of sensor data generated will result in a need for more cost-effective communication and data handling systems. Lockheed Martin, with considerable experience in spacecraft design and in developing special purpose onboard processors, has teamed with Applied Signal & Image Technology (ASIT), which has an extensive heritage in HSI spectral compression, and with Mapping Science (MSI), for JPEG 2000 spatial compression expertise, to develop a real-time, intelligent onboard processing (OBP) system to reduce HSI sensor downlink requirements. Our goal is to reduce the downlink requirement by a factor > 100, while retaining the spectral and spatial fidelity of the sensor data needed to satisfy the many science, military, and intelligence goals of these systems. Our compression algorithms leverage commercial-off-the-shelf (COTS) spectral and spatial exploitation algorithms. We are currently evaluating these compression algorithms using statistical analysis and assessments by NASA scientists. We are also developing special purpose processors for executing these algorithms onboard a spacecraft.
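    Spectral compression schemes exploit the strong correlation between adjacent bands. As a toy sketch of one simple decorrelation step (inter-band delta encoding; not ASIT's actual algorithm), a per-pixel spectrum collapses to a base value plus small residuals that entropy-code far better than raw bands:

    ```python
    def delta_encode_spectra(cube):
        """Spectral decorrelation by band differencing. cube is a list of
        per-pixel spectra; each spectrum becomes (first_band, deltas)."""
        encoded = []
        for pixel in cube:
            first, prev, deltas = pixel[0], pixel[0], []
            for v in pixel[1:]:
                deltas.append(v - prev)   # small residual between bands
                prev = v
            encoded.append((first, deltas))
        return encoded

    # Smooth spectra turn into tiny residuals around the first band value.
    cube = [[100, 101, 103, 102], [80, 82, 81, 85]]
    print(delta_encode_spectra(cube))
    ```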

  14. Magnetic resonance imaging-compatible tactile sensing device based on a piezoelectric array.

    PubMed

    Hamed, Abbi; Masamune, Ken; Tse, Zion Tsz Ho; Lamperth, Michael; Dohi, Takeyoshi

    2012-07-01

    Minimally invasive surgery is a widely used medical technique, one of the drawbacks of which is the loss of the direct sense of touch during the operation. Palpation is the use of the fingertips to explore and make fast assessments of tissue morphology. Although technologies have been developed to equip minimally invasive surgery tools with haptic feedback capabilities, the majority focus on tissue stiffness profiling and tool-tissue interaction force measurement. For greatly increased diagnostic capability, a magnetic resonance imaging-compatible tactile sensor design is proposed, which allows minimally invasive surgery to be performed under image guidance, combining the strong soft-tissue imaging capability of magnetic resonance imaging with intuitive palpation. The sensing unit is based on a piezoelectric sensor methodology, which conforms to the stringent mechanical and electrical design requirements imposed by the magnetic resonance environment. The sensor's mechanical design and the device's integration with a 0.2 Tesla open magnetic resonance imaging scanner are described, together with the device's magnetic resonance compatibility testing. Its design limitations and potential future improvements are also discussed. A tactile sensing unit based on a piezoelectric sensor principle is proposed, designed for magnetic resonance imaging-guided interventions.

  15. Smart sensing surveillance system

    NASA Astrophysics Data System (ADS)

    Hsu, Charles; Chu, Kai-Dee; O'Looney, James; Blake, Michael; Rutar, Colleen

    2010-04-01

    An effective public safety sensor system for heavily-populated applications requires sophisticated and geographically-distributed infrastructures, centralized supervision, and deployment of large-scale security and surveillance networks. Artificial intelligence in sensor systems is critical to raising awareness levels, improving system performance and adapting to changing scenarios and environments. In this paper, a highly-distributed, fault-tolerant, and energy-efficient Smart Sensing Surveillance System (S4) is presented to efficiently provide 24/7, all-weather security operation in crowded environments or restricted areas. Technically, the S4 consists of a number of distributed sensor nodes integrated with specific passive sensors to rapidly collect, process, and disseminate heterogeneous sensor data from near omni-directions. These distributed sensor nodes can work cooperatively to send immediate security information when new objects appear. When new objects are detected, the S4 smartly selects the available node with a pan-tilt-zoom (PTZ) electro-optical/infrared (EO/IR) camera to track the objects and capture associated imagery. The S4 provides advanced on-board digital image processing capabilities to detect and track specific objects. The imaging detection operations include unattended object detection, human feature and behavior detection, and configurable alert triggers, etc. Other imaging processes can be updated to meet specific requirements and operations. In the S4, all the sensor nodes are connected with a robust, reconfigurable, LPI/LPD (Low Probability of Intercept/Low Probability of Detect) wireless mesh network using ultra-wideband (UWB) RF technology. This UWB RF technology can provide an ad-hoc, secure mesh network and the capability to relay network information, communicate, and pass situational awareness and messages.
The Service Oriented Architecture of S4 enables remote applications to interact with the S4 network and use the specific presentation methods. In addition, the S4 is compliant with Open Geospatial Consortium - Sensor Web Enablement (OGC-SWE) standards to efficiently discover, access, use, and control heterogeneous sensors and their metadata. These S4 capabilities and technologies have great potential for both military and civilian applications, enabling highly effective security support tools for improving surveillance activities in densely crowded environments. The S4 system is directly applicable to solutions for emergency response personnel, law enforcement, and other homeland security missions, as well as in applications requiring the interoperation of sensor networks with handheld or body-worn interface devices.

  16. Bioinspired polarization navigation sensor for autonomous munitions systems

    NASA Astrophysics Data System (ADS)

    Giakos, G. C.; Quang, T.; Farrahi, T.; Deshpande, A.; Narayan, C.; Shrestha, S.; Li, Y.; Agarwal, M.

    2013-05-01

    Small unmanned aerial vehicles (SUAVs), micro air vehicles (MAVs), automated target recognition (ATR), and munitions guidance require extreme operational agility and robustness, which can be partially provided by efficient bioinspired imaging sensor designs capable of delivering enhanced guidance, navigation and control (GNC) capabilities. Bioinspired imaging technology can prove useful either for long-distance surveillance of targets in a cluttered environment, or at close distances limited by space surroundings and obstructions. The purpose of this study is to explore the phenomenology of image formation by different insect eye architectures, which would directly benefit defense and security, in four distinct areas: a) fabrication of the bioinspired sensor, b) optical architecture, c) topology, and d) artificial intelligence. The outcome of this study indicates that bioinspired imaging can significantly impact defense and security through dedicated designs fitted to different combat scenarios and applications.
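    Bioinspired polarization navigation of the kind this paper belongs to typically recovers the angle of polarization of skylight from a few analyzer orientations. A hedged sketch using the standard Stokes-parameter formulas for analyzers at 0°, 45° and 90° (generic polarization optics, not this paper's specific sensor design):

    ```python
    import math

    def angle_of_polarization(i0, i45, i90):
        """Angle of linear polarization (degrees) from intensities measured
        behind analyzers at 0, 45 and 90 degrees (Stokes formulation)."""
        s1 = i0 - i90               # Stokes S1
        s2 = 2.0 * i45 - (i0 + i90) # Stokes S2
        return 0.5 * math.degrees(math.atan2(s2, s1))

    # Fully polarized light at 30 degrees obeys Malus's law I(a) = cos^2(30 - a);
    # the three samples recover the 30-degree polarization angle.
    def malus(analyzer_deg, pol_deg=30.0):
        return math.cos(math.radians(pol_deg - analyzer_deg)) ** 2

    print(round(angle_of_polarization(malus(0), malus(45), malus(90)), 3))
    ```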

  17. Novel snapshot hyperspectral imager for fluorescence imaging

    NASA Astrophysics Data System (ADS)

    Chandler, Lynn; Chandler, Andrea; Periasamy, Ammasi

    2018-02-01

    Hyperspectral imaging has emerged as a new technique for the identification and classification of biological tissue. Benefitting from recent developments in sensor technology, the new class of hyperspectral imagers can capture entire hypercubes in a single shot, showing great potential for real-time imaging in the biomedical sciences. This paper explores the use of a SnapShot imager in fluorescence microscopy for the first time. Utilizing the latest imaging sensor, the SnapShot imager is compact and attaches via C-mount to any commercially available light microscope. Using this setup, fluorescence hypercubes of several cells were generated, containing both spatial and spectral information; the fluorescence images were acquired in a single shot across the emission range from visible to near infrared (VIS-IR). The paper presents hypercube images obtained from example tissues (475-630 nm). This study demonstrates the potential for real-time monitoring applications in cell biology and biomedicine.

  18. A multimodal image sensor system for identifying water stress in grapevines

    NASA Astrophysics Data System (ADS)

    Zhao, Yong; Zhang, Qin; Li, Minzan; Shao, Yongni; Zhou, Jianfeng; Sun, Hong

    2012-11-01

    Water stress is one of the most common limitations on fruit growth, and water is the most limiting resource for crop growth. In grapevines, as in other fruit crops, fruit quality benefits from a certain level of water deficit, which helps balance vegetative and reproductive growth and the flow of carbohydrates to reproductive structures. In this paper, a multi-modal sensor system is designed to measure the reflectance signature of grape plant surfaces and identify different water stress levels. The system is equipped with one 3CCD camera (three channels: R, G, and IR) and can capture and analyze the grape canopy from its reflectance features to identify different water stress levels. The core technology of this multi-modal sensor system could further be used in a decision support system that combines multi-modal sensory data to improve plant stress detection and identify the causes of stress. The images taken by the multi-modal sensor provide spectral bands in the near-infrared, green and red channels. Based on analysis of the acquired images, color features based on color space and reflectance features based on image processing methods were calculated. The results showed that these parameters have potential as water stress indicators; more experiments and analysis are needed to validate this conclusion.
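    With co-registered red and near-infrared channels like those of a 3CCD camera, a normalized-difference index is one standard reflectance feature for canopy stress. The reflectance values below are illustrative, and the abstract does not state that NDVI is the specific feature the authors used:

    ```python
    def ndvi(nir, red):
        """Normalized difference vegetation index; lower canopy NDVI is one
        candidate indicator of water stress."""
        return (nir - red) / (nir + red) if (nir + red) else 0.0

    # Illustrative reflectances: a well-watered canopy reflects more NIR and
    # absorbs more red than a stressed one, so its NDVI is higher.
    healthy  = ndvi(0.50, 0.08)
    stressed = ndvi(0.35, 0.12)
    print(round(healthy, 3), round(stressed, 3))
    ```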

  19. Flexible digital x-ray technology for far-forward remote diagnostic and conformal x-ray imaging applications

    NASA Astrophysics Data System (ADS)

    Smith, Joseph; Marrs, Michael; Strnad, Mark; Apte, Raj B.; Bert, Julie; Allee, David; Colaneri, Nicholas; Forsythe, Eric; Morton, David

    2013-05-01

    Today's flat panel digital x-ray image sensors, which have been in production since the mid-1990s, are produced exclusively on glass substrates. While acceptable for use in a hospital or doctor's office, conventional glass substrate digital x-ray sensors are too fragile for use outside these controlled environments without extensive reinforcement. Reinforcement, however, significantly increases weight, bulk, and cost, making them impractical for far-forward remote diagnostic applications, which demand rugged and lightweight x-ray detectors. Additionally, glass substrate x-ray detectors are inherently rigid. This limits their use in curved or bendable, conformal x-ray imaging applications such as the non-destructive testing (NDT) of oil pipelines. However, by extending low-temperature thin-film transistor (TFT) technology previously demonstrated on plastic-substrate-based electrophoretic and organic light emitting diode (OLED) flexible displays, it is now possible to manufacture durable, lightweight, as well as flexible digital x-ray detectors. In this paper, we discuss the principal technical approaches used to apply flexible display technology to two new large-area flexible digital x-ray sensors for defense, security, and industrial applications and demonstrate their imaging capabilities. Our results include a 4.8″ diagonal, 353 x 463 resolution, flexible digital x-ray detector, fabricated on a 6″ polyethylene naphthalate (PEN) plastic substrate; and a larger, 7.9″ diagonal, 720 x 640 resolution, flexible digital x-ray detector also fabricated on PEN and manufactured on a gen 2 (370 x 470 mm) substrate.

  20. A low cost thermal infrared hyperspectral imager for small satellites

    NASA Astrophysics Data System (ADS)

    Crites, S. T.; Lucey, P. G.; Wright, R.; Garbeil, H.; Horton, K. A.

    2011-06-01

    The traditional model for space-based earth observations involves long mission times, high cost, and long development time. Because of the significant time and monetary investment required, riskier instrument development missions or those with very specific scientific goals are unlikely to obtain funding. However, a niche for earth observations exploiting new technologies in focused, short-lifetime missions is opening with the growth of the small satellite market and launch opportunities for these satellites. These low-cost, short-lived missions provide an experimental platform for testing new sensor technologies that may transition to larger, longer-lived platforms. The low costs and short lifetimes also increase the acceptable risk to sensors, enabling large decreases in cost through commercial off-the-shelf (COTS) parts and allowing early-career scientists and engineers to gain experience on these projects. We are building a low-cost long-wave infrared spectral sensor, funded by the NASA Experimental Program to Stimulate Competitive Research (EPSCoR), to demonstrate the ways in which a university's scientific and instrument development programs can fill this niche. The sensor is a low-mass, power-efficient thermal hyperspectral imager with electronics contained in a pressure vessel to enable the use of COTS electronics, and will be compatible with small satellite platforms. The sensor, called the Thermal Hyperspectral Imager (THI), is based on a Sagnac interferometer and uses an uncooled 320x256 microbolometer array. It will collect calibrated radiance data at long-wave infrared (LWIR, 8-14 micron) wavelengths in 230-meter pixels with 20 cm-1 spectral resolution from a 400-km orbit.
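    A Sagnac-interferometer imager of this kind recovers spectra from interferograms via a Fourier transform. A minimal sketch of that step, a direct DFT on a synthetic cosine fringe pattern (greatly simplified relative to any real calibration pipeline):

    ```python
    import cmath
    import math

    def spectrum_from_interferogram(samples):
        """Magnitude DFT of an interferogram: a Fourier-transform
        spectrometer recovers the spectrum this way. Returns the first
        (non-redundant) half of the spectrum."""
        n = len(samples)
        return [abs(sum(s * cmath.exp(-2j * math.pi * k * t / n)
                        for t, s in enumerate(samples)))
                for k in range(n // 2)]

    # A monochromatic source produces cosine fringes; the transform
    # concentrates all energy in a single spectral bin (here, bin 4).
    n = 32
    fringes = [math.cos(2 * math.pi * 4 * t / n) for t in range(n)]
    spec = spectrum_from_interferogram(fringes)
    print(spec.index(max(spec)))  # -> 4
    ```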

  1. Potential of Future Hurricane Imaging Radiometer (HIRAD) Ocean Surface Wind Observations for Determining Tropical Storm Vortex Intensity and Structure

    NASA Technical Reports Server (NTRS)

    Atlas, Robert; Bailey, M. C.; Black, Peter; James, Mark; Johnson, James; Jones, Linwood; Miller, Timothy; Ruf, Christopher; Uhlhorn, Eric

    2008-01-01

    The Hurricane Imaging Radiometer (HIRAD) is an innovative technology development, which offers the potential of new and unique remotely sensed observations of both extreme oceanic wind events and strong precipitation from either UAS or satellite platforms. It is based on the airborne Stepped Frequency Microwave Radiometer (SFMR), which is a proven aircraft remote sensing technique for observing tropical cyclone ocean surface wind speeds and rain rates, including those of major hurricane intensity. The proposed HIRAD instrument advances beyond the current nadir viewing SFMR to an equivalent wide-swath SFMR imager using passive microwave synthetic thinned aperture radiometer technology. This sensor will operate over 4-7 GHz (C-band frequencies) where the required tropical cyclone remote sensing physics has been validated by both SFMR and WindSat radiometers. HIRAD incorporates a unique, technologically advanced array antenna and several other technologies successfully demonstrated by the NASA's Instrument Incubator Program. A brassboard version of the instrument is complete and has been successfully tested in an anechoic chamber, and development of the aircraft instrument is well underway. HIRAD will be a compact, lightweight, low-power instrument with no moving parts that will produce wide-swath imagery of ocean vector winds and rain during hurricane conditions when existing microwave sensors (radiometers or scatterometers) are hindered. Preliminary studies show that HIRAD will have a significant positive impact on analyses as either a new aircraft or satellite sensor.

  2. A CMOS-based large-area high-resolution imaging system for high-energy x-ray applications

    NASA Astrophysics Data System (ADS)

    Rodricks, Brian; Fowler, Boyd; Liu, Chiao; Lowes, John; Haeffner, Dean; Lienert, Ulrich; Almer, John

    2008-08-01

    CCDs have been the primary sensor in imaging systems for x-ray diffraction and imaging applications in recent years. CCDs have met the fundamental requirements of low noise, high sensitivity, high dynamic range, and spatial resolution necessary for these scientific applications. State-of-the-art CMOS image sensor (CIS) technology has experienced dramatic improvements recently, and its performance now rivals or surpasses that of most CCDs. The advancement of CIS technology is proceeding at an ever-accelerating pace, driven by the multi-billion dollar consumer market. CIS offers several advantages over traditional CCDs and other solid-state imaging devices, including low power, high-speed operation, system-on-chip integration, and lower manufacturing costs. The combination of superior imaging performance and system advantages makes CIS a good candidate for high-sensitivity imaging system development. This paper describes a 1344 x 1212 CIS imaging system with a 19.5 μm pitch optimized for x-ray scattering studies at high energies. Fundamental metrics of linearity, dynamic range, spatial resolution, conversion gain, and sensitivity are estimated, as is the Detective Quantum Efficiency (DQE). Representative x-ray diffraction images are presented and compared against a CCD-based imaging system.
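
    Conversion gain of the kind estimated above is commonly measured with the photon transfer method: for shot-noise-limited pixels, the signal variance (in DN) grows linearly with the signal mean, and the slope is 1/g with g in e-/DN. A minimal simulated sketch, with a made-up gain value (the paper does not publish its measurement code):

```python
import numpy as np

rng = np.random.default_rng(0)
true_gain = 2.0   # e- per DN (assumed for the sketch, not from the paper)

means, variances = [], []
for electrons in [500, 1000, 2000, 4000, 8000]:
    # Poisson photon shot noise in electrons, converted to digital numbers
    frame = rng.poisson(electrons, size=100_000) / true_gain
    means.append(frame.mean())
    variances.append(frame.var())

# Slope of the variance-vs-mean photon transfer curve estimates 1/gain
slope = np.polyfit(means, variances, 1)[0]
estimated_gain = 1.0 / slope
print(f"estimated conversion gain ≈ {estimated_gain:.2f} e-/DN")
```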

  3. ESTO Investments in Innovative Sensor Technologies for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Babu, Sachidananda R.

    2017-01-01

    For more than 18 years, the NASA Earth Science Technology Office (ESTO) has been investing in remote sensing technologies. During this period ESTO has invested in more than 900 tasks. These tasks are managed under multiple programs such as the Instrument Incubator Program (IIP), Advanced Component Technology (ACT), Advanced Information Systems Technology (AIST), In-Space Validation of Earth Science Technologies (InVEST), Sustainable Land Imaging - Technology (SLI-T), and others. This covers the whole spectrum of technologies, from components to full-up satellites in space and software. Over the years many of these technologies have been infused into space missions such as Aquarius, SMAP, CYGNSS, SWOT, TEMPO, and others. ESTO has also actively invested in infrared sensor technologies for space applications, most recently under the SLI-T and InVEST programs. On these tasks, technology development ranges from simple bolometers to advanced photonic-waveguide-based spectrometers. Some details of these missions and technologies will be presented.

  4. Naval sensor data database (NSDD)

    NASA Astrophysics Data System (ADS)

    Robertson, Candace J.; Tubridy, Lisa H.

    1999-08-01

    The Naval Sensor Data Database (NSDD) is a multi-year effort to archive, catalogue, and disseminate data from all types of sensors to the mine warfare, signal and image processing, and sensor development communities. The purpose is to improve and accelerate research and technology. Providing performers with the data required to develop and validate improvements in hardware, simulation, and processing will foster advances in sensor and system performance. The NSDD will provide a centralized source of sensor data and its associated ground truth, which will benefit signal processing, computer-aided detection and classification, data compression, data fusion, and geo-referencing, as well as sensor and sensor system design.

  5. High-End CMOS Active Pixel Sensors For Space-Borne Imaging Instruments

    DTIC Science & Technology

    2005-07-13

    Approved for public release; distribution unlimited. See also ADM001791, Potentially Disruptive Technologies and Their Impact in Space Programs, held in Marseille, France, 4-6 July 2005. The original document contains color images.

  6. Development of Gentle Slope Light Guide Structure in a 3.4 μm Pixel Pitch Global Shutter CMOS Image Sensor with Multiple Accumulation Shutter Technology.

    PubMed

    Sekine, Hiroshi; Kobayashi, Masahiro; Onuki, Yusuke; Kawabata, Kazunari; Tsuboi, Toshiki; Matsuno, Yasushi; Takahashi, Hidekazu; Inoue, Shunsuke; Ichikawa, Takeshi

    2017-12-09

    CMOS image sensors (CISs) with a global shutter (GS) function are strongly required in order to avoid image degradation. However, CISs with a GS function have generally been inferior to rolling shutter (RS) CISs in performance, because they have more components; this problem is especially pronounced at small pixel pitches. The newly developed 3.4 µm pitch GS CIS solves this problem by using multiple accumulation shutter technology and the gentle slope light guide structure. As a result, the developed GS pixel achieves 1.8 e− temporal noise and 16,200 e− full well capacity with charge domain memory in 120 fps operation. The sensitivity and parasitic light sensitivity are 28,000 e−/lx·s and −89 dB, respectively. Moreover, the incident light angle dependence of sensitivity and parasitic light sensitivity is improved by the gentle slope light guide structure.

  7. Very-large-area CCD image sensors: concept and cost-effective research

    NASA Astrophysics Data System (ADS)

    Bogaart, E. W.; Peters, I. M.; Kleimann, A. C.; Manoury, E. J. P.; Klaassens, W.; de Laat, W. T. F. M.; Draijer, C.; Frost, R.; Bosiers, J. T.

    2009-01-01

    A new-generation full-frame 36x48 mm2 48Mp CCD image sensor with vertical anti-blooming for professional digital still camera applications has been developed by means of the so-called building block concept. The 48Mp devices are formed by stitching 1kx1k building blocks with 6.0 µm pixel pitch in 6x8 (hxv) format. This concept allows us to design four large-area (48Mp) and sixty-two basic (1Mp) devices per 6" wafer. The basic image sensor is relatively small in order to obtain data from many devices; evaluation of basic parameters such as the image pixel and on-chip amplifier provides statistical data from a limited number of wafers, whereas the large-area devices are evaluated for aspects typical of large-sensor operation and performance, such as charge transport efficiency. Combined with the usability of multi-layer reticles, this makes the sensor development cost effective for prototyping. Optimisation of the sensor design and technology has resulted in a pixel charge capacity of 58 ke- and significantly reduced readout noise (12 electrons at 25 MHz pixel rate, after CDS). Hence, a dynamic range of 73 dB is obtained. Microlens and stack optimisation resulted in an excellent angular response that meets wide-angle photography demands.
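
    The quoted 73 dB dynamic range follows directly from the reported full-well capacity and readout noise via DR = 20·log10(full well / read noise):

```python
import math

full_well = 58_000   # e-, pixel charge capacity reported above
read_noise = 12      # e- rms, at 25 MHz pixel rate after CDS

dynamic_range_db = 20 * math.log10(full_well / read_noise)
print(f"{dynamic_range_db:.1f} dB")   # ≈ 73.7 dB, matching the quoted 73 dB
```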

  8. Distributed processing method for arbitrary view generation in camera sensor network

    NASA Astrophysics Data System (ADS)

    Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki

    2003-05-01

    A camera sensor network is a network in which each sensor node can capture video signals, process them, and communicate them with other nodes. The processing task in this network is to generate an arbitrary view, which can be requested from the central node or a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up processing time, we have distributed the processing tasks between nodes. In this method, each sensor node processes part of the interpolation algorithm to generate the interpolated image with local communication between nodes. The processing task in the camera sensor network is ray-space interpolation, which is an object-independent method based on MSE minimization using adaptive filtering. Two methods were proposed for distributing processing tasks, Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP), which share image data locally. Comparison of the proposed methods with the Centralized Processing (CP) method shows that PIS-DP has the highest processing speed after FIS-DP, and CP has the lowest processing speed. The communication rate of CP and PIS-DP is almost the same and better than that of FIS-DP. PIS-DP is therefore recommended because of its better overall performance than CP and FIS-DP.

  9. Interferometric Reflectance Imaging Sensor (IRIS)—A Platform Technology for Multiplexed Diagnostics and Digital Detection

    PubMed Central

    Avci, Oguzhan; Lortlar Ünlü, Nese; Yalçın Özkumur, Ayça; Ünlü, M. Selim

    2015-01-01

    Over the last decade, the growing need in disease diagnostics has stimulated rapid development of new technologies with unprecedented capabilities. Recent emerging infectious diseases and epidemics have revealed the shortcomings of existing diagnostics tools, and the necessity for further improvements. Optical biosensors can lay the foundations for future generation diagnostics by providing means to detect biomarkers in a highly sensitive, specific, quantitative and multiplexed fashion. Here, we review an optical sensing technology, Interferometric Reflectance Imaging Sensor (IRIS), and the relevant features of this multifunctional platform for quantitative, label-free and dynamic detection. We discuss two distinct modalities for IRIS: (i) low-magnification (ensemble biomolecular mass measurements) and (ii) high-magnification (digital detection of individual nanoparticles) along with their applications, including label-free detection of multiplexed protein chips, measurement of single nucleotide polymorphism, quantification of transcription factor DNA binding, and high sensitivity digital sensing and characterization of nanoparticles and viruses. PMID:26205273

  10. Thermoelectric infrared imaging sensors for automotive applications

    NASA Astrophysics Data System (ADS)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

    This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process and micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system incorporated in the Nissan ASV-2 is a blind spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body. The system can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. This system consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. This performance is suitable for consumer electronics as well as automotive applications.

  11. CMOS sensors for atmospheric imaging

    NASA Astrophysics Data System (ADS)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different to those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixel sizes of up to 250 μm square achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue read-out chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors and both of these sensors have been designed to be fully radiation hard with high latch-up and single-event-upset tolerances, which is now silicon proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements but with thicker than standard epitaxial silicon to give improved IR response (note that this makes latch up protection even more important). However with narrow band illumination reflections from the front and back of the dielectric stack on the top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow band filters inside the MTG package to provide a complete imaging subsystem. 
Control of reflected light is also critical in obtaining the required optical performance and this has driven the development of a black coating layer that can be applied between the active silicon regions.

  12. IR sensors and imagers in networked operations

    NASA Astrophysics Data System (ADS)

    Breiter, Rainer; Cabanski, Wolfgang

    2005-05-01

    "Network-centric Warfare" is a common slogan describing an overall concept of networked operation of sensors, information and weapons to gain command and control superiority. Referring to IR sensors, integration and fusion of different channels like day/night or SAR images or the ability to spread image data among various users are typical requirements. Looking for concrete implementations the German Army future infantryman IdZ is an example where a group of ten soldiers build a unit with every soldier equipped with a personal digital assistant (PDA) for information display, day photo camera and a high performance thermal imager for every unit. The challenge to allow networked operation among such a unit is bringing information together and distribution over a capable network. So also AIM's thermal reconnaissance and targeting sight HuntIR which was selected for the IdZ program provides this capabilities by an optional wireless interface. Besides the global approach of Network-centric Warfare network technology can also be an interesting solution for digital image data distribution and signal processing behind the FPA replacing analog video networks or specific point to point interfaces. The resulting architecture can provide capabilities of data fusion from e.g. IR dual-band or IR multicolor sensors. AIM has participated in a German/UK collaboration program to produce a demonstrator for day/IR video distribution via Gigabit Ethernet for vehicle applications. In this study Ethernet technology was chosen for network implementation and a set of electronics was developed for capturing video data of IR and day imagers and Gigabit Ethernet video distribution. The demonstrator setup follows the requirements of current and future vehicles having a set of day and night imager cameras and a crew station with several members. 
Replacing the analog video path by a digital video network also makes it easy to implement embedded training by simply feeding the network with simulation data. The paper addresses the special capabilities, requirements and design considerations of IR sensors and imagers in applications like thermal weapon sights and UAVs for networked operating infantry forces.

  13. Hyperspectral Systems Increase Imaging Capabilities

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In 1983, NASA started developing hyperspectral systems to image in the ultraviolet and infrared wavelengths. In 2001, the first on-orbit hyperspectral imager, Hyperion, was launched aboard the Earth Observing-1 spacecraft. Based on the hyperspectral imaging sensors used in Earth observation satellites, Stennis Space Center engineers and Institute for Technology Development researchers collaborated on a new design that was smaller and used an improved scanner. Featured in Spinoff 2007, the technology is now exclusively licensed by Themis Vision Systems LLC, of Richmond, Virginia, and is widely used in medical and life sciences, defense and security, forensics, and microscopy.

  14. Imaging detectors and electronics—a view of the future

    NASA Astrophysics Data System (ADS)

    Spieler, Helmuth

    2004-09-01

    Imaging sensors and readout electronics have made tremendous strides in the past two decades. The application of modern semiconductor fabrication techniques and the introduction of customized monolithic integrated circuits have made large-scale imaging systems routine in high-energy physics. This technology is now finding its way into other areas, such as space missions, synchrotron light sources, and medical imaging. I review current developments and discuss the promise and limits of new technologies. Several detector systems are described as examples of future trends. The discussion emphasizes semiconductor detector systems, but I also include recent developments for large-scale superconducting detector arrays.

  15. Advancements in Transmitters and Sensors for Biological Tissue Imaging in Magnetic Induction Tomography

    PubMed Central

    Zakaria, Zulkarnay; Rahim, Ruzairi Abdul; Mansor, Muhammad Saiful Badri; Yaacob, Sazali; Ayub, Nor Muzakkir Nor; Muji, Siti Zarina Mohd.; Rahiman, Mohd Hafiz Fazalul; Aman, Syed Mustafa Kamal Syed

    2012-01-01

    Magnetic Induction Tomography (MIT), also known as Electromagnetic Tomography (EMT) or Mutual Inductance Tomography, is among the imaging modalities of interest to many researchers around the world. This noninvasive modality applies an electromagnetic field and is sensitive to all three passive electromagnetic properties of a material: conductivity, permittivity, and permeability. MIT is categorized under the passive imaging family as an electrodeless technique that uses excitation coils to induce an electromagnetic field in the material, which is then measured at the receiving side by sensors. The aim of this review is to discuss the challenges of the MIT technique and summarize recent advancements in transmitters and sensors, with a focus on applications in biological tissue imaging. It is hoped that this review will provide valuable information on MIT for those interested in this modality, and that this knowledge may speed up the adoption of MIT as a medical imaging technology. PMID:22969341

  16. Forensics for flatbed scanners

    NASA Astrophysics Data System (ADS)

    Gloe, Thomas; Franz, Elke; Winkler, Antje

    2007-02-01

    Within this article, we investigate possibilities for identifying the origin of images acquired with flatbed scanners. A current method for the identification of digital cameras takes advantage of image sensor noise, strictly speaking, the spatial noise. Since flatbed scanners and digital cameras use similar technologies, the utilization of image sensor noise for identifying the origin of scanned images seems to be possible. As characterization of flatbed scanner noise, we considered array reference patterns and sensor line reference patterns. However, there are particularities of flatbed scanners which we expect to influence the identification. This was confirmed by extensive tests: Identification was possible to a certain degree, but less reliable than digital camera identification. In additional tests, we simulated the influence of flatfielding and down scaling as examples for such particularities of flatbed scanners on digital camera identification. One can conclude from the results achieved so far that identifying flatbed scanners is possible. However, since the analyzed methods are not able to determine the image origin in all cases, further investigations are necessary.
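
    A minimal sketch of the reference-pattern idea described above, with simulated devices: each scanner's array reference pattern is estimated by averaging noise residuals from several scans, and a questioned image is attributed to the device whose pattern correlates best with its residual. The uniform-scene model and global-mean "denoiser" are toy assumptions for the sketch; practical systems use wavelet-based denoising.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)

# Each simulated scanner has a fixed, zero-mean pixel nonuniformity pattern
patterns = {name: 0.05 * rng.standard_normal(shape) for name in ("A", "B", "C")}

def scan(name):
    # toy acquisition: uniform scene level + device pattern + random noise
    scene = rng.uniform(0.3, 0.7)
    return scene + patterns[name] + 0.02 * rng.standard_normal(shape)

def residual(img):
    # removing the (uniform) scene leaves pattern noise plus random noise
    return img - img.mean()

# Build array reference patterns by averaging residuals of many scans
refs = {name: np.mean([residual(scan(name)) for _ in range(50)], axis=0)
        for name in patterns}

def identify(img):
    r = residual(img).ravel()
    scores = {name: np.corrcoef(r, ref.ravel())[0, 1] for name, ref in refs.items()}
    return max(scores, key=scores.get)

print(identify(scan("B")))   # attributes the scan to scanner "B"
```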

  17. Scintillating Quantum Dots for Imaging X-Rays (SQDIX) for Aircraft Inspection

    NASA Technical Reports Server (NTRS)

    Burke, E. R.; DeHaven, S. L.; Williams, P. A.

    2015-01-01

    Scintillation is the process currently employed by conventional X-ray detectors to create X-ray images. Scintillating quantum dots (StQDs), or nano-crystals, are novel, nanometer-scale materials that, upon excitation by X-rays, re-emit the absorbed energy as visible light. StQDs theoretically have higher output efficiency than conventional scintillating materials and are more environmentally friendly. This paper presents the characterization of several critical elements in the use of StQDs, performed along a path toward the use of this technology in widespread X-ray imaging. Initial work on the scintillating quantum dots for imaging X-rays (SQDIX) system has shown great promise for creating state-of-the-art sensors using StQDs as a sensor material. In addition, this work also demonstrates a high degree of promise for using StQDs in microstructured fiber optics. Using the microstructured fiber as a light guide could greatly increase the capture efficiency of a StQD-based imaging sensor.

  18. Toward a digital camera to rival the human eye

    NASA Astrophysics Data System (ADS)

    Skorka, Orit; Joseph, Dileepan

    2011-07-01

    All things considered, electronic imaging systems do not rival the human visual system despite notable progress over 40 years since the invention of the CCD. This work presents a method that allows design engineers to evaluate the performance gap between a digital camera and the human eye. The method identifies limiting factors of the electronic systems by benchmarking against the human system. It considers power consumption, visual field, spatial resolution, temporal resolution, and properties related to signal and noise power. A figure of merit is defined as the performance gap of the weakest parameter. Experimental work done with observers and cadavers is reviewed to assess the parameters of the human eye, and assessment techniques are also covered for digital cameras. The method is applied to 24 modern image sensors of various types, where an ideal lens is assumed to complete a digital camera. Results indicate that dynamic range and dark limit are the most limiting factors. The substantial functional gap, from 1.6 to 4.5 orders of magnitude, between the human eye and digital cameras may arise from architectural differences between the human retina, arranged in a multiple-layer structure, and image sensors, mostly fabricated in planar technologies. Functionality of image sensors may be significantly improved by exploiting technologies that allow vertical stacking of active tiers.
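
    The paper's figure of merit, the performance gap of the weakest parameter expressed in orders of magnitude, can be sketched as follows. The parameter values below are invented placeholders, not results from the study.

```python
import math

# Hypothetical parameter values for the human eye and one camera
eye    = {"dynamic_range": 1e9, "dark_limit": 1e-6, "spatial_res": 1.0}
camera = {"dynamic_range": 1e5, "dark_limit": 1e-3, "spatial_res": 0.8}

# Gap in orders of magnitude for each parameter
gaps = {p: abs(math.log10(eye[p] / camera[p])) for p in eye}

figure_of_merit = max(gaps.values())      # the weakest parameter dominates
worst = max(gaps, key=gaps.get)
print(worst, round(figure_of_merit, 2))   # dynamic_range 4.0
```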

  19. SPIDER: Next Generation Chip Scale Imaging Sensor Update

    NASA Astrophysics Data System (ADS)

    Duncan, A.; Kendrick, R.; Ogden, C.; Wuchenich, D.; Thurman, S.; Su, T.; Lai, W.; Chun, J.; Li, S.; Liu, G.; Yoo, S. J. B.

    2016-09-01

    The Lockheed Martin Advanced Technology Center (LM ATC) and the University of California at Davis (UC Davis) are developing an electro-optical (EO) imaging sensor called SPIDER (Segmented Planar Imaging Detector for Electro-optical Reconnaissance) that seeks to provide a 10x to 100x size, weight, and power (SWaP) reduction alternative to the traditional bulky optical telescope and focal-plane detector array. The substantial reductions in SWaP would reduce cost and/or provide higher resolution by enabling a larger-aperture imager in a constrained volume. Our SPIDER imager replaces the traditional optical telescope and digital focal plane detector array with a densely packed interferometer array based on emerging photonic integrated circuit (PIC) technologies that samples the object being imaged in the Fourier domain (i.e., spatial frequency domain), and then reconstructs an image. Our approach replaces the large optics and structures required by a conventional telescope with PICs that are accommodated by standard lithographic fabrication techniques (e.g., complementary metal-oxide-semiconductor (CMOS) fabrication). The standard EO payload integration and test process that involves precision alignment and test of optical components to form a diffraction-limited telescope is, therefore, replaced by in-process integration and test as part of the PIC fabrication, which substantially reduces associated schedule and cost. This paper provides an overview of performance data on the second-generation PIC for SPIDER developed under the Defense Advanced Research Projects Agency (DARPA)'s SPIDER Zoom research funding. We also update the design description of the SPIDER Zoom imaging sensor and the second-generation PIC (high- and low-resolution versions).

  20. Robust optical sensors for safety critical automotive applications

    NASA Astrophysics Data System (ADS)

    De Locht, Cliff; De Knibber, Sven; Maddalena, Sam

    2008-02-01

    Optical sensors for the automotive industry need to be robust, high performing, and low cost. This paper focuses on the impact of automotive requirements on optical sensor design and packaging. The main strategies for lowering optical sensor entry barriers in the automotive market are: having the sensor manufacturer perform sensor calibration and tuning; providing on-chip sensor test modes to guarantee functional integrity during operation; and recognizing that package technology is key. In conclusion, optical sensor applications are growing in automotive, and optical sensor robustness has matured to the level of safety-critical applications: Electrical Power Assisted Steering (EPAS) and Drive-by-Wire with systems based on optical linear arrays, and Automated Cruise Control (ACC), Lane Change Assist, and Driver Classification/Smart Airbag Deployment with systems based on camera imagers.

  1. Design and testing of a dual-band enhanced vision system

    NASA Astrophysics Data System (ADS)

    Way, Scott P.; Kerr, Richard; Imamura, Joseph J.; Arnoldy, Dan; Zeylmaker, Dick; Zuro, Greg

    2003-09-01

    An effective enhanced vision system must operate over a broad spectral range in order to offer a pilot an optimized scene that includes runway background as well as airport lighting and aircraft operations. The large dynamic range of intensities of these images is best handled with separate imaging sensors. The EVS 2000 is a patented dual-band Infrared Enhanced Vision System (EVS) utilizing image fusion concepts. It has the ability to provide a single image from uncooled infrared imagers combined with SWIR, NIR or LLLTV sensors. The system is designed to provide commercial and corporate airline pilots with improved situational awareness at night and in degraded weather conditions but can also be used in a variety of applications where the fusion of dual band or multiband imagery is required. A prototype of this system was recently fabricated and flown on the Boeing Advanced Technology Demonstrator 737-900 aircraft. This paper will discuss the current EVS 2000 concept, show results taken from the Boeing Advanced Technology Demonstrator program, and discuss future plans for the fusion system.

  2. Evaluation of novel technologies for the miniaturization of flash imaging lidar

    NASA Astrophysics Data System (ADS)

    Mitev, V.; Pollini, A.; Haesler, J.; Perenzoni, D.; Stoppa, D.; Kolleck, Christian; Chapuy, M.; Kervendal, E.; Pereira do Carmo, João.

    2017-11-01

    Planetary exploration constitutes one of the main components of European space activities. Missions to Mars, the Moon, and asteroids are foreseen, where it is assumed that human missions will be preceded by robotic exploration flights. 3D vision is recognised as a key enabling technology for relative proximity navigation of spacecraft, and imaging LiDAR is one of the best candidates for such a 3D vision sensor.

  3. Sensor-oriented feature usability evaluation in fingerprint segmentation

    NASA Astrophysics Data System (ADS)

    Li, Ying; Yin, Yilong; Yang, Gongping

    2013-06-01

    Existing fingerprint segmentation methods usually process fingerprint images captured by different sensors with the same feature or feature set. We propose to improve fingerprint segmentation results in view of an important fact: images from different sensors have different characteristics for segmentation. Feature usability evaluation means evaluating the usability of features in order to find a personalized feature or feature set for each sensor and thereby improve segmentation performance. The need for feature usability evaluation in fingerprint segmentation is raised and analyzed as a new issue. To address it, we present a decision-tree-based feature usability evaluation method, which utilizes the C4.5 decision tree algorithm to evaluate and pick the most suitable feature or feature set for fingerprint segmentation from a typical candidate feature set. We apply the method to the FVC2002 database of fingerprint images, which were acquired by four different sensors and technologies. Experimental results show that the accuracy of segmentation is improved, and time consumption for feature extraction is dramatically reduced with the selected feature(s).
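
    To illustrate the kind of criterion a C4.5-based evaluation builds on, the sketch below ranks two synthetic candidate features by the information gain of their best threshold split. The feature names ("coherence", "mean") and data are illustrative stand-ins, not the paper's actual candidate set.

```python
import numpy as np

rng = np.random.default_rng(2)
labels = np.array([1] * 200 + [0] * 200)   # 1 = foreground block, 0 = background

features = {
    # "coherence" separates the two classes well in this toy data
    "coherence": np.concatenate([rng.normal(0.8, 0.1, 200), rng.normal(0.3, 0.1, 200)]),
    # gray-level "mean" barely separates the classes here
    "mean": np.concatenate([rng.normal(0.5, 0.2, 200), rng.normal(0.45, 0.2, 200)]),
}

def entropy(y):
    p = np.bincount(y, minlength=2) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def best_info_gain(x, y):
    # scan candidate thresholds; keep the largest entropy reduction
    base, best = entropy(y), 0.0
    for t in np.quantile(x, np.linspace(0.05, 0.95, 19)):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        best = max(best, base - cond)
    return best

gains = {name: best_info_gain(x, labels) for name, x in features.items()}
ranked = sorted(gains, key=gains.get, reverse=True)
print(ranked)   # "coherence" ranks above "mean" for this toy data
```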

  4. Imaging Beyond What Man Can See

    NASA Technical Reports Server (NTRS)

    May, George; Mitchell, Brian

    2004-01-01

    Three lightweight, portable hyperspectral sensor systems have been built that capture energy from 200 to 1700 nanometers (ultraviolet to shortwave infrared). The sensors incorporate a line scanning technique that requires no relative movement between the target and the sensor. This unique capability, combined with portability, opens up new uses of hyperspectral imaging for laboratory and field environments. Each system has a GUI-based software package that allows the user to communicate with the imaging device for setting spatial resolution, spectral bands and other parameters. NASA's Space Partnership Development has sponsored these innovative developments and their application to human problems on Earth and in space. Hyperspectral datasets have been captured and analyzed in numerous areas including precision agriculture, food safety, biomedical imaging, and forensics. Discussion of research results will include real-time detection of food contaminants, mold and toxin research on corn, identifying counterfeit documents, non-invasive wound monitoring, and aircraft applications. Future research will include development of a thermal infrared hyperspectral sensor that will support natural resource applications on Earth and thermal analyses during long-duration space flight. This paper incorporates a variety of disciplines and imaging technologies that have been linked together to allow the expansion of remote sensing across both traditional and non-traditional boundaries.

  5. Enhancing Spatial Resolution of Remotely Sensed Imagery Using Deep Learning

    NASA Astrophysics Data System (ADS)

    Beck, J. M.; Bridges, S.; Collins, C.; Rushing, J.; Graves, S. J.

    2017-12-01

    Researchers at the Information Technology and Systems Center at the University of Alabama in Huntsville are using Deep Learning with Convolutional Neural Networks (CNNs) to develop a method for enhancing the spatial resolutions of moderate resolution (10-60m) multispectral satellite imagery. This enhancement will effectively match the resolutions of imagery from multiple sensors to provide increased global temporal-spatial coverage for a variety of Earth science products. Our research is centered on using Deep Learning for automatically generating transformations for increasing the spatial resolution of remotely sensed images with different spatial, spectral, and temporal resolutions. One of the most important steps in using images from multiple sensors is to transform the different image layers into the same spatial resolution, preferably the highest spatial resolution, without compromising the spectral information. Recent advances in Deep Learning have shown that CNNs can be used to effectively and efficiently upscale or enhance the spatial resolution of multispectral images with the use of an auxiliary data source such as a high spatial resolution panchromatic image. In contrast, we are using both the spatial and spectral details inherent in low spatial resolution multispectral images for image enhancement without the use of a panchromatic image. This presentation will discuss how this technology will benefit many Earth Science applications that use remotely sensed images with moderate spatial resolutions.

  6. High frame rate imaging systems developed in Northwest Institute of Nuclear Technology

    NASA Astrophysics Data System (ADS)

    Li, Binkang; Wang, Kuilu; Guo, Mingan; Ruan, Linbo; Zhang, Haibing; Yang, Shaohua; Feng, Bing; Sun, Fengrong; Chen, Yanli

    2007-01-01

    This paper presents high frame rate imaging systems developed at the Northwest Institute of Nuclear Technology in recent years. Three types of imaging systems are included. The first type utilizes the EG&G RETICON photodiode array (PDA) RA100A as the image sensor, which can work at up to 1000 frames per second (fps). Besides continuous operation, the PDA system is also designed to switch to a mode for capturing transient flash events; a specific timing sequence was designed to satisfy this requirement. The camera image data can be transmitted to a remote area over coaxial or optical fiber cable and then stored. The second type utilizes the PHOTOBIT Complementary Metal Oxide Semiconductor (CMOS) PB-MV13 as the image sensor, which has a high resolution of 1280 (H) × 1024 (V) pixels per frame. The CMOS system can operate at up to 500 fps in full frame and 4000 fps in partial-frame mode. The prototype scheme of the system is presented. The third type adopts charge-coupled devices (CCDs) as the imagers; MINTRON MTV-1881EX, DALSA CA-D1 and CA-D6 camera heads are used in the systems' development. A feature comparison of the RA100A-, PB-MV13-, and CA-D6-based systems is given at the end.

  7. Radiometric cross-calibration of EO-1 ALI with L7 ETM+ and Terra MODIS sensors using near-simultaneous desert observations

    USGS Publications Warehouse

    Chander, Gyanesh; Angal, Amit; Choi, Taeyoung; Xiong, Xiaoxiong

    2013-01-01

    The Earth Observing-1 (EO-1) satellite was launched on November 21, 2000, as part of a one-year technology demonstration mission. The mission was extended because of the value it continued to add to the scientific community. EO-1 has now been operational for more than a decade, providing both multispectral and hyperspectral measurements. As part of the EO-1 mission, the Advanced Land Imager (ALI) sensor demonstrates a potential technological direction for the next generation of Landsat sensors. To evaluate the ALI sensor capabilities as a precursor to the Operational Land Imager (OLI) onboard the Landsat Data Continuity Mission (LDCM, or Landsat 8 after launch), its measured top-of-atmosphere (TOA) reflectances were compared to those of the well-calibrated Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) and the Terra Moderate Resolution Imaging Spectroradiometer (MODIS) sensors in the reflective solar bands (RSB). These three satellites operate in a near-polar, sun-synchronous orbit 705 km above the Earth's surface. EO-1 was designed to fly one minute behind L7 and approximately 30 minutes in front of Terra. In this configuration, all three sensors can view near-identical ground targets with similar atmospheric, solar, and viewing conditions. However, because of the differences in the relative spectral response (RSR), the measured physical quantities can be significantly different while observing the same target. The cross-calibration of ALI with ETM+ and MODIS was performed using near-simultaneous surface observations based on image statistics from areas observed by these sensors over four desert sites (Libya 4, Mauritania 2, Arabia 1, and Sudan 1). The differences in the measured TOA reflectances due to RSR mismatches were compensated for by using a spectral band adjustment factor (SBAF), which takes into account the spectral profile of the target and the RSR of each sensor.
For this study, the spectral profile of the target comes from the near-simultaneous EO-1 Hyperion data over these sites. The results indicate that the TOA reflectance measurements for ALI agree with those of ETM+ and MODIS to within 5% after the application of SBAF.
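
The SBAF correction described above can be sketched numerically: simulate each sensor's band reflectance by weighting the target's spectral profile with that sensor's RSR, then take the ratio. The band centers, widths, and the linear desert profile below are illustrative assumptions, not the actual Hyperion, ALI, or ETM+ values:

```python
import numpy as np

# Hypothetical target spectral profile (reflectance vs. wavelength in nm),
# standing in for the near-simultaneous Hyperion measurement
wl = np.arange(400.0, 1001.0, 10.0)
rho = 0.2 + 0.3 * (wl - 400.0) / 600.0      # bright desert: rises with wavelength

def gaussian_rsr(center, fwhm):
    """Idealized Gaussian relative spectral response."""
    sigma = fwhm / 2.355
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

def band_reflectance(rsr):
    """RSR-weighted mean reflectance the sensor would report for this target."""
    return np.sum(rho * rsr) / np.sum(rsr)

rsr_a = gaussian_rsr(660.0, 60.0)   # reference sensor's red band (assumed)
rsr_b = gaussian_rsr(640.0, 30.0)   # band of the sensor being cross-calibrated

sbaf = band_reflectance(rsr_a) / band_reflectance(rsr_b)
rho_b_adjusted = band_reflectance(rsr_b) * sbaf   # now comparable to sensor A
```

Because the two bands sample slightly different parts of the target spectrum, the SBAF here is a small factor near 1; applying it makes the second sensor's band reflectance directly comparable to the reference sensor's.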

  8. Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors

    PubMed Central

    Pagliari, Diana; Pinto, Livio

    2015-01-01

    In recent years, the videogame industry has been characterized by a great boost in gesture recognition and motion tracking, following the increasing demand for immersive game experiences. The Microsoft Kinect sensor allows the acquisition of RGB, IR and depth images at a high frame rate. Because of the complementary nature of the information provided, it has proved an attractive resource for researchers with very different backgrounds. In summer 2014, Microsoft launched a new generation of Kinect on the market, based on time-of-flight technology. This paper proposes a calibration of the Kinect for Xbox One imaging sensors, focusing on the depth camera. The mathematical model that describes the error committed by the sensor as a function of the distance between the sensor and the object has been estimated. All the analyses presented here have been conducted for both generations of Kinect, in order to quantify the improvements that characterize each imaging sensor. Experimental results show that the quality of the delivered model improved by applying the proposed calibration procedure, which is applicable to both point clouds and the mesh models created with the Microsoft Fusion Libraries. PMID:26528979

  9. Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors.

    PubMed

    Pagliari, Diana; Pinto, Livio

    2015-10-30

    In recent years, the videogame industry has been characterized by a great boost in gesture recognition and motion tracking, following the increasing demand for immersive game experiences. The Microsoft Kinect sensor allows the acquisition of RGB, IR and depth images at a high frame rate. Because of the complementary nature of the information provided, it has proved an attractive resource for researchers with very different backgrounds. In summer 2014, Microsoft launched a new generation of Kinect on the market, based on time-of-flight technology. This paper proposes a calibration of the Kinect for Xbox One imaging sensors, focusing on the depth camera. The mathematical model that describes the error committed by the sensor as a function of the distance between the sensor and the object has been estimated. All the analyses presented here have been conducted for both generations of Kinect, in order to quantify the improvements that characterize each imaging sensor. Experimental results show that the quality of the delivered model improved by applying the proposed calibration procedure, which is applicable to both point clouds and the mesh models created with the Microsoft Fusion Libraries.
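
The kind of distance-dependent error model estimated in the paper can be illustrated with a simple polynomial fit; the synthetic calibration data and the quadratic error shape below are assumptions for illustration, not the authors' actual model or measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calibration run: true target distances vs. mean measured depth (m)
true_d = np.linspace(0.8, 4.0, 17)
measured = true_d + 0.002 + 0.004 * true_d**2 + rng.normal(0.0, 0.001, true_d.size)

# Model the systematic error e(d) = measured - true as a low-order polynomial
coeffs = np.polyfit(true_d, measured - true_d, deg=2)
error_model = np.poly1d(coeffs)

# Applying the model as a correction shrinks the residual depth error
corrected = measured - error_model(true_d)
rms_before = np.sqrt(np.mean((measured - true_d) ** 2))
rms_after = np.sqrt(np.mean((corrected - true_d) ** 2))
```

The same fitted correction can then be applied per-pixel to point clouds before meshing, which is the sense in which such a calibration benefits both point clouds and fused models.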

  10. New optical sensor systems for high-resolution satellite, airborne and terrestrial imaging systems

    NASA Astrophysics Data System (ADS)

    Eckardt, Andreas; Börner, Anko; Lehmann, Frank

    2007-10-01

    The department of Optical Information Systems (OS) at the Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) has more than 25 years of experience with high-resolution imaging technology. Technology changes in the development of detectors, together with significant changes in manufacturing accuracy and ongoing engineering research, define the next generation of spaceborne sensor systems for Earth observation and remote sensing. The combination of large TDI lines, intelligent synchronization control, fast-readable sensors and new focal-plane concepts opens the door to new remote-sensing instruments. This class of instruments is suitable for high-resolution sensor systems in terms of geometry and radiometry, and for their data products, such as 3D virtual reality. Systemic approaches are essential for the design of such complex sensor systems for dedicated tasks. System theory of the instrument inside a simulated environment is the starting point of the optimization process for the optical, mechanical and electrical designs. Single modules and the entire system have to be calibrated and verified. Suitable procedures must be defined at component, module and system level for the assembly, test and verification process. This kind of development strategy allows hardware-in-the-loop design. The paper gives an overview of the current activities at DLR in the field of innovative sensor systems for photogrammetric and remote sensing purposes.

  11. Origin of high photoconductive gain in fully transparent heterojunction nanocrystalline oxide image sensors and interconnects.

    PubMed

    Jeon, Sanghun; Song, Ihun; Lee, Sungsik; Ryu, Byungki; Ahn, Seung-Eon; Lee, Eunha; Kim, Young; Nathan, Arokia; Robertson, John; Chung, U-In

    2014-11-05

    A technique for invisible image capture using a photosensor array based on transparent conducting oxide semiconductor thin-film transistors and transparent interconnection technologies is presented. A transparent conducting layer is employed for the sensor electrodes as well as interconnection in the array, providing about 80% transmittance at visible-light wavelengths. The phototransistor is a Hf-In-Zn-O/In-Zn-O heterostructure yielding a high quantum-efficiency in the visible range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Electrical Capacitance Volume Tomography: Design and Applications

    PubMed Central

    Wang, Fei; Marashdeh, Qussai; Fan, Liang-Shih; Warsito, Warsito

    2010-01-01

    This article reports recent advances and progress in the field of electrical capacitance volume tomography (ECVT). ECVT, developed from the two-dimensional electrical capacitance tomography (ECT), is a promising non-intrusive imaging technology that can provide real-time three-dimensional images of the sensing domain. Images are reconstructed from capacitance measurements acquired by electrodes placed on the outside boundary of the testing vessel. In this article, a review of progress on capacitance sensor design and applications to multi-phase flows is presented. The sensor shape, electrode configuration, and the number of electrodes that comprise three key elements of three-dimensional capacitance sensors are illustrated. The article also highlights applications of ECVT sensors on vessels of various sizes from 1 to 60 inches with complex geometries. Case studies are used to show the capability and validity of ECVT. The studies provide qualitative and quantitative real-time three-dimensional information of the measuring domain under study. Advantages of ECVT render it a favorable tool to be utilized for industrial applications and fundamental multi-phase flow research. PMID:22294905

  13. Preliminary Design of a Lightning Optical Camera and ThundEr (LOCATE) Sensor

    NASA Technical Reports Server (NTRS)

    Phanord, Dieudonne D.; Koshak, William J.; Rybski, Paul M.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The preliminary design of an optical/acoustical instrument is described for making highly accurate real-time determinations of the location of cloud-to-ground (CG) lightning. The instrument, named the Lightning Optical Camera And ThundEr (LOCATE) sensor, will also image the clear and cloud-obscured lightning channels produced by CGs and cloud flashes, and will record the transient optical waveforms produced by these discharges. The LOCATE sensor will consist of a full (360 degrees) field-of-view optical camera for obtaining the CG channel image and azimuth, a sensitive thunder microphone for obtaining CG range, and a fast photodiode system for time-resolving the lightning optical waveform. The optical waveform data will be used to discriminate CGs from cloud flashes. Together, the optical azimuth and thunder range are used to locate CGs, and it is anticipated that a network of LOCATE sensors would determine CG source locations to well within 100 meters. All of this would be accomplished at a relatively low cost compared to present RF lightning location technologies; of course, the range detection is limited and will be quantified in the future. The LOCATE sensor technology would have practical applications for electric power utility companies, government (e.g. NASA Kennedy Space Center lightning safety and warning), golf resort lightning safety, telecommunications, and other industries.

  14. DETECTION AND IDENTIFICATION OF TOXIC AIR POLLUTANTS USING FIELD PORTABLE AND AIRBORNE REMOTE IMAGING SYSTEMS

    EPA Science Inventory

    Remote sensing technologies are a class of instrument and sensor systems that include laser imageries, imaging spectrometers, and visible to thermal infrared cameras. These systems have been successfully used for gas phase chemical compound identification in a variety of field e...

  15. High-resolution CCD imaging alternatives

    NASA Astrophysics Data System (ADS)

    Brown, D. L.; Acker, D. E.

    1992-08-01

    High resolution CCD color cameras have recently stimulated the interest of a large number of potential end-users for a wide range of practical applications. Real-time High Definition Television (HDTV) systems are now being used or considered for use in applications ranging from entertainment program origination through digital image storage to medical and scientific research. HDTV generation of electronic images offers significant cost and time-saving advantages over the use of film in such applications. Further, in still-image systems electronic image capture is faster and more efficient than conventional image scanners: a CCD still camera can capture 3-dimensional objects into the computing environment directly, without having to shoot a picture on film, develop it, and then scan the image into a computer. Most standard production CCD sensor chips are made for broadcast-compatible systems. One popular CCD, and the basis for this discussion, offers arrays of roughly 750 x 580 picture elements (pixels), or a total array of approximately 435,000 pixels (see Fig. 1). FOR-A has developed a technique to increase the number of available pixels for a given image compared to that produced by the standard CCD itself. Using an inter-lined CCD with an overall spatial structure several times larger than the photo-sensitive sensor areas, each of the CCD sensors is shifted in two dimensions in order to fill in spatial gaps between adjacent sensors.

  16. Improved Space Object Observation Techniques Using CMOS Detectors

    NASA Astrophysics Data System (ADS)

    Schildknecht, T.; Hinze, A.; Schlatter, P.; Silha, J.; Peltonen, J.; Santti, T.; Flohrer, T.

    2013-08-01

    CMOS sensors, or in general Active Pixel Sensors (APS), are rapidly replacing CCDs in the consumer camera market. Due to significant technological advances during the past years, these devices have started to compete with CCDs also for demanding scientific imaging applications, in particular in the astronomy community. CMOS detectors offer a series of inherent advantages compared to CCDs, due to the structure of their basic pixel cells, which each contain their own amplifier and readout electronics. The most prominent advantages for space object observations are the extremely fast and flexible readout capabilities, the feasibility of electronic shuttering and precise epoch registration, and the potential to perform image processing operations on-chip and in real time. Presently applied and proposed optical observation strategies for space debris surveys and space surveillance applications were analyzed. The major design drivers were identified and potential benefits from using available and future CMOS sensors were assessed. The major challenges and design drivers for ground-based and space-based optical observation strategies have been analyzed. CMOS detector characteristics were critically evaluated and compared with the established CCD technology, especially with respect to the above-mentioned observations. Similarly, the desirable on-chip processing functionalities which would further enhance object detection and image segmentation were identified. Finally, the characteristics of a particular CMOS sensor available at the Zimmerwald observatory were analyzed by performing laboratory test measurements.

  17. Cadastral Audit and Assessments Using Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Cunningham, K.; Walker, G.; Stahlke, E.; Wilson, R.

    2011-09-01

    Ground surveys and remote sensing are integral to establishing fair and equitable property valuations necessary for real property taxation. The International Association of Assessing Officers (IAAO) has embraced aerial and street-view imaging as part of its standards related to property tax assessments and audits. New technologies, including unmanned aerial systems (UAS) paired with imaging sensors, will become more common as local governments work to ensure their cadastre and tax rolls are both accurate and complete. Trends in mapping technology have seen an evolution in platforms from large, expensive manned aircraft to very small, inexpensive UAS. Traditional methods of photogrammetry have also given way to new equipment and sensors: digital cameras, infrared imagers, light detection and ranging (LiDAR) laser scanners, and now synthetic aperture radar (SAR). At the University of Alaska Fairbanks (UAF), we work extensively with unmanned aerial systems equipped with each of these newer sensors. UAF has significant experience flying unmanned systems in the US National Airspace, having begun in 1969 with scientific rockets and expanded to unmanned aircraft in 2003. Ongoing field experience allows UAF to partner effectively with outside organizations to test and develop leading-edge research in UAS and remote sensing. This presentation will discuss our research related to various sensors and payloads for mapping. We will also share our experience with UAS and optical systems for creating some of the first cadastral surveys in rural Alaska.

  18. Event-based Sensing for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects (RSOs) from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both daytime and nighttime (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during daytime hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data-rate.
In addition to low-bandwidth communications requirements, the low weight, low-power and high-speed make them ideally suitable to meeting the demanding challenges required by space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground and space-based SSA tasks.
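
The per-pixel event generation described above can be sketched with a frame-based simulation; this is a simplification (real devices fire asynchronously with no frame clock), and the threshold and toy scene below are arbitrary choices for illustration:

```python
import numpy as np

def events_from_frames(frames, threshold=0.2, eps=1e-6):
    """Emit (t, x, y, polarity) whenever a pixel's log intensity changes
    by at least `threshold` relative to its last-fired reference level."""
    ref = np.log(frames[0] + eps)           # per-pixel reference log intensity
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logf = np.log(frame + eps)
        diff = logf - ref
        for y, x in zip(*np.nonzero(np.abs(diff) >= threshold)):
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
            ref[y, x] = logf[y, x]          # reset reference at fired pixels
    return events

# A dim 4x4 scene where one pixel briefly brightens: only that pixel fires,
# once with positive polarity and once with negative polarity
frames = [np.full((4, 4), 0.1) for _ in range(3)]
frames[1] = frames[1].copy()
frames[1][2, 3] = 0.5
evts = events_from_frames(frames)
```

Because only changing pixels emit anything, a mostly static star field produces very few events, which is the origin of the intrinsically low data-rate noted above.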

  19. Hyperspectral Sensors Final Report CRADA No. TC02173.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priest, R. E.; Sauvageau, J. E.

    This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and Science Applications International Corporation (SAIC), National Security Space Operations/SRBU, to develop longwave infrared (LWIR) hyperspectral imaging (HSI) sensors for airborne, and potentially ground and space, platforms. LLNL has designed and developed LWIR HSI sensors since 1995. The current generation of these sensors has applications for users within the U.S. Department of Defense and the Intelligence Community. User needs call for multiple copies provided by commercial industry. To gain the most benefit from the U.S. Government's prior investments in LWIR HSI sensors developed at LLNL, transfer of technology and know-how from LLNL HSI experts to commercial industry was needed. The overarching purpose of the CRADA project was to facilitate the transfer of the necessary technology from LLNL to SAIC, thereby allowing the U.S. Government to procure LWIR HSI sensors from this company.

  20. Three-dimensional estimates of tree canopies: Scaling from high-resolution UAV data to satellite observations

    NASA Astrophysics Data System (ADS)

    Sankey, T.; Donald, J.; McVay, J.

    2015-12-01

    High resolution remote sensing images and datasets are typically acquired at large cost, which poses a big challenge for many scientists. Northern Arizona University recently acquired a custom-engineered, cutting-edge UAV, and we can now generate our own images with the instrument. The UAV has a unique capability to carry a large payload, including a hyperspectral sensor, which images the Earth's surface in over 350 spectral bands at 5 cm resolution, and a lidar scanner, which images the land surface and vegetation in three dimensions. Both sensors represent the newest available technology with very high resolution, precision, and accuracy. Using the UAV sensors, we are monitoring the effects of regional forest restoration treatment efforts. Individual tree canopy width and height are measured in the field and via the UAV sensors. The high-resolution UAV images are then used to segment individual tree canopies and to derive 3-dimensional estimates. The UAV image-derived variables are then correlated to the field-based measurements and scaled to satellite-derived tree canopy measurements. The relationships between the field-based and UAV-derived estimates are then extrapolated to a larger area to scale the tree canopy dimensions and to estimate tree density within restored and control forest sites.

  1. Hyperspectral CMOS imager

    NASA Astrophysics Data System (ADS)

    Jerram, P. A.; Fryer, M.; Pratlong, J.; Pike, A.; Walker, A.; Dierickx, B.; Dupont, B.; Defernez, A.

    2017-11-01

    CCDs have been used for many years in hyperspectral imaging missions and have been extremely successful. These include the Medium Resolution Imaging Spectrometer (MERIS) [1] on Envisat, the Compact High Resolution Imaging Spectrometer (CHRIS) on Proba, and the Ozone Monitoring Instrument operating in the UV spectral region. ESA is also planning a number of further missions that are likely to use CCD technology (Sentinel 3, 4 and 5). However, CMOS sensors have a number of advantages which mean that they will probably be used for hyperspectral applications in the longer term. There are two main advantages. First, a hyperspectral image consists of spectral lines with a large difference in intensity; in a frame-transfer CCD the faint spectral lines have to be transferred through the part of the imager illuminated by intense lines. This can lead to cross-talk, and whilst the problem can be reduced by the use of split frame transfer and faster line rates, CMOS sensors do not require a frame transfer and hence inherently do not suffer from it. Second, with a CMOS sensor the intense spectral lines can be read multiple times within a frame to give a significant increase in dynamic range. We describe the design and initial tests of a CMOS sensor for use in hyperspectral applications. This device has been designed to give as high a dynamic range as possible with minimum cross-talk. The sensor has been manufactured on high-resistivity epitaxial silicon wafers and has been back-thinned, but left relatively thick, in order to obtain the maximum quantum efficiency across the entire spectral range.
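
The multiple-read dynamic-range advantage can be sketched as follows; the full-well capacity, noise level, and simple two-read scheme are illustrative assumptions (a real sensor would use many non-destructive reads per frame):

```python
import numpy as np

rng = np.random.default_rng(2)
full_well = 1000.0                          # assumed saturation level (electrons)
signal = np.array([100.0, 900.0, 5000.0])   # faint, mid, and intense spectral lines

def read(sig, t_frac):
    """Non-destructive read after a fraction t_frac of the frame time."""
    accumulated = np.minimum(sig * t_frac, full_well)  # clip at saturation
    return accumulated + rng.normal(0.0, 2.0, sig.shape)  # add read noise

early = read(signal, 0.1) / 0.1   # early read, rescaled to full-frame units
late = read(signal, 1.0)          # end-of-frame read
# Keep the low-noise late read unless the pixel saturated during the frame,
# in which case fall back on the rescaled early read
combined = np.where(late >= 0.95 * full_well, early, late)
```

The intense line that would saturate a single end-of-frame read is still recovered from the early read, extending the usable dynamic range beyond the full well.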

  2. Overview of Digital Forensics Algorithms in Dslr Cameras

    NASA Astrophysics Data System (ADS)

    Aminova, E.; Trapeznikov, I.; Priorov, A.

    2017-05-01

    The widespread use of mobile technologies and the improvement of digital photo devices have led to more frequent cases of image falsification, including in judicial practice. Consequently, an important task for up-to-date digital image processing tools is the development of algorithms for determining the source and model of the DSLR (Digital Single Lens Reflex) camera and improving image formation algorithms. Most research in this area is based on the observation that a unique sensor trace of a DSLR camera can be extracted at a certain stage of the in-camera imaging process. This study focuses on the problem of determining unique features of DSLR cameras based on optical subsystem artifacts and sensor noise.

  3. Comparison of a CCD and an APS for soft X-ray diffraction

    NASA Astrophysics Data System (ADS)

    Stewart, Graeme; Bates, R.; Blue, A.; Clark, A.; Dhesi, S. S.; Maneuski, D.; Marchal, J.; Steadman, P.; Tartoni, N.; Turchetta, R.

    2011-12-01

    We compare a new CMOS Active Pixel Sensor (APS) to a Princeton Instruments PIXIS-XO: 2048B Charge Coupled Device (CCD) with soft X-rays in a synchrotron beamline test at the Diamond Light Source (DLS). Despite CCDs being established in the field of scientific imaging, APS are an innovative technology that offers advantages over CCDs. These include faster readout, higher operational temperature, in-pixel electronics for advanced image processing, and reduced manufacturing cost. The APS employed was the Vanilla sensor designed by the MI3 collaboration and funded by an RCUK Basic Technology grant. This sensor has 520 x 520 square pixels of size 25 μm on each side and can operate at a full-frame readout rate of up to 20 Hz. The sensor had been back-thinned to the epitaxial layer. This was the first time that a back-thinned APS had been demonstrated at a beamline at DLS. In the synchrotron experiment, soft X-rays with an energy of approximately 708 eV were used to produce a diffraction pattern from a permalloy sample. The pattern was imaged at a range of integration times with both sensors. The CCD had to be operated at a temperature of -55°C, whereas the Vanilla was operated over a temperature range from 20°C to -10°C. We show that the APS detector can operate at frame rates up to two hundred times faster than the CCD, without excessive degradation of image quality. The signal-to-noise ratio of the APS is shown to be the same as that of the CCD at identical integration times, and the response is shown to be linear, with no charge-blooming effects. The experiment allowed a direct comparison of back-thinned APS and CCDs in a real soft X-ray synchrotron experiment.

  4. The precision-processing subsystem for the Earth Resources Technology Satellite.

    NASA Technical Reports Server (NTRS)

    Chapelle, W. E.; Bybee, J. E.; Bedross, G. M.

    1972-01-01

    Description of the precision processor, a subsystem in the image-processing system for the Earth Resources Technology Satellite (ERTS). This processor is a special-purpose image-measurement and printing system, designed to process user-selected bulk images to produce 1:1,000,000-scale film outputs and digital image data, presented in a Universal-Transverse-Mercator (UTM) projection. The system will remove geometric and radiometric errors introduced by the ERTS multispectral sensors and by the bulk-processor electron-beam recorder. The geometric transformations required for each input scene are determined by resection computations based on reseau measurements and image comparisons with a special ground-control base contained within the system; the images are then printed and digitized by electronic image-transfer techniques.

  5. Test and evaluation of Japanese GPR-EMI dual sensor systems at Benkovac test site in Croatia

    NASA Astrophysics Data System (ADS)

    Ishikawa, J.; Furuta, K.; Pavković, Nikola

    2007-04-01

    This paper presents an experimental design and the evaluation results of a trial that was carried out from 1 February to 9 March 2006 using real PMA-1A and PMA-2 landmines at the Benkovac test site in Croatia. The objective of the Croatia-Japan joint trial is to evaluate dual sensor systems, which use both ground penetrating radar (GPR) and electromagnetic inductive (EMI) sensors. A comparative trial was also carried out by Croatian deminers using an existing EMI sensor, i.e., a metal detector (MD). The trial aims at evaluating differences in performance between dual sensors and MDs, especially in terms of discrimination of landmines from metal fragments and extension of detectable range in the depth direction. The devices evaluated here are four prototypes of anti-personnel landmine detection systems developed under a project of the Japan Science and Technology Agency (JST), the supervising authority of which is the Ministry of Education, Culture, Sports, Science and Technology (MEXT). The prototypes provide operators with subsurface images, and the final decision as to whether a shadow in the image is a real landmine is left to the operator. This is similar to the way that medical doctors find cancer by reading CT images. Since operators' pre-knowledge of the locations of buried targets significantly influences the test result, three test lanes, which have 3 different kinds of soils, have been designed to be suitable for blind tests. The result showed that the dual sensor systems have the potential to discriminate landmines from metal fragments and that the probability of detection for small targets in mineralized soils can be improved by using GPR.

  6. CMOS cassette for digital upgrade of film-based mammography systems

    NASA Astrophysics Data System (ADS)

    Baysal, Mehmet A.; Toker, Emre

    2006-03-01

    While full-field digital mammography (FFDM) technology is gaining clinical acceptance, the overwhelming majority (96%) of the installed base of mammography systems are conventional film-screen (FSM) systems. A high-performance, economical digital cassette product to conveniently upgrade FSM systems to FFDM would accelerate the adoption of FFDM, and make the clinical and technical advantages of FFDM available to a larger population of women. The planned FFDM cassette is based on our commercial Digital Radiography (DR) cassette for 10 cm x 10 cm field-of-view spot imaging and specimen radiography, utilizing a 150 micron columnar CsI(Tl) scintillator and 48 micron active-pixel CMOS sensor modules. Unlike a Computed Radiography (CR) cassette, which requires an external digitizer, our DR cassette transfers acquired images to a display workstation within approximately 5 seconds of exposure, greatly enhancing patient flow. We will present the physical performance of our prototype system against other FFDM systems in clinical use today, using established objective criteria such as the Modulation Transfer Function (MTF) and Detective Quantum Efficiency (DQE), and subjective criteria, such as a contrast-detail (CD-MAM) observer performance study. Driven by the strong demand from the computer industry, CMOS technology is one of the lowest-cost, most readily accessible technologies available for FFDM today. Recent popular use of CMOS imagers in high-end consumer cameras has also resulted in significant advances in the imaging performance of CMOS sensors relative to rival CCD sensors. This study promises to take advantage of these unique features to develop the first CMOS-based FFDM upgrade cassette.

  7. Emergency Response Fire-Imaging UAS Missions over the Southern California Wildfire Disaster

    NASA Technical Reports Server (NTRS)

    DelFrate, John H.

    2008-01-01

    Objectives include: Demonstrate capabilities of UAS to overfly and collect sensor data on widespread fires throughout the Western US. Demonstrate long-endurance mission capabilities (20+ hours). Image multiple fires (greater than 4 fires per mission) to showcase extendable mission configuration and the ability to either linger over key fires or station over disparate regional fires. Demonstrate new UAV-compatible, autonomous sensor for improved thermal characterization of fires. Provide automated, on-board, terrain- and geo-rectified sensor imagery over OTH satcom links to national fire personnel and incident commanders. Deliver real-time imagery (within 10 minutes of acquisition). Demonstrate capabilities of OTS technologies (Google Earth) to serve and display mission-critical sensor data, coincident with other pertinent data elements to facilitate information processing (WX data, ground asset data, other satellite data, R/T video, flight track info, etc.).

  8. Emergency Response Fire-Imaging UAS Missions over the Southern California Wildfire Disaster

    NASA Technical Reports Server (NTRS)

    Cobleigh, Brent R.

    2007-01-01

    Objectives include: Demonstrate capabilities of UAS to overfly and collect sensor data on widespread fires throughout the Western US. Demonstrate long-endurance mission capabilities (20+ hours). Image multiple fires (greater than 4 fires per mission) to showcase extendable mission configuration and the ability to either linger over key fires or station over disparate regional fires. Demonstrate new UAV-compatible, autonomous sensor for improved thermal characterization of fires. Provide automated, on-board, terrain- and geo-rectified sensor imagery over OTH satcom links to national fire personnel and incident commanders. Deliver real-time imagery (within 10 minutes of acquisition). Demonstrate capabilities of OTS technologies (Google Earth) to serve and display mission-critical sensor data, coincident with other pertinent data elements to facilitate information processing (WX data, ground asset data, other satellite data, R/T video, flight track info, etc.).

  9. LIRIS flight database and its use toward noncooperative rendezvous

    NASA Astrophysics Data System (ADS)

    Mongrard, O.; Ankersen, F.; Casiez, P.; Cavrois, B.; Donnard, A.; Vergnol, A.; Southivong, U.

    2018-06-01

    ESA's fifth and last Automated Transfer Vehicle, ATV Georges Lemaître, tested new rendezvous technology before docking with the International Space Station (ISS) in August 2014. The technology demonstration, called Laser Infrared Imaging Sensors (LIRIS), provides an unseen view of the ISS. During Georges Lemaître's rendezvous, the LIRIS sensors, composed of two infrared cameras, one visible camera, and a scanning LIDAR (Light Detection and Ranging), were turned on two and a half hours before docking, at a range of 3500 m from the Space Station. All sensors worked as expected, and a large amount of data was recorded and stored within ATV-5's cargo hold before being returned to Earth with the Soyuz flight 38S in September 2014. As part of the LIRIS postflight activities, the information gathered by all sensors was collected into a flight database together with the reference ATV trajectory and attitude estimated by the ATV main navigation sensors. Although decoupled from the ATV main computer, the LIRIS data were carefully synchronized with ATV guidance, navigation, and control (GNC) data. Hence, the LIRIS database can be used to assess the performance of various image processing algorithms providing range and line-of-sight (LoS) navigation at long/medium range, as well as 6 degree-of-freedom (DoF) navigation at short range. The database also contains information on the overall ATV position with respect to Earth and the Sun direction within the ATV frame, such that the effect of the environment on the sensors can also be investigated. This paper introduces the structure of the LIRIS database and provides some examples of applications to increase the technology readiness level of noncooperative rendezvous.

  10. ACOUSTICAL IMAGING AND MECHANICAL PROPERTIES OF SOFT ROCK AND MARINE SEDIMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurman E. Scott, Jr., Ph.D.; Younane Abousleiman, Ph.D.; Musharraf Zaman, Ph.D., P.E.

    2002-11-18

    During the sixth quarter of this research project the research team developed a method and the experimental procedures for acquiring the data needed for ultrasonic tomography of rock core samples under triaxial stress conditions, as outlined in Task 10. Traditional triaxial compression experiments, where compressional and shear wave velocities are measured, provide little or no information about the internal spatial distribution of mechanical damage within the sample. The velocities measured platen-to-platen or sensor-to-sensor reflect an averaging of all the velocities occurring along that particular raypath across the boundaries of the rock. The research team is attempting to develop and refine a laboratory equivalent of seismic tomography for use on rock samples deformed under triaxial stress conditions. Seismic tomography, utilized for example in crosswell tomography, allows an imaging of the velocities within a discrete zone within the rock. Ultrasonic or acoustic tomography is essentially the extension of that field technology applied to rock samples deforming in the laboratory at high pressures. This report outlines the technical steps and procedures for developing this technology for use on weak, soft chalk samples. Laboratory tests indicate that the chalk samples exhibit major changes in compressional and shear wave velocities during compaction. Since chalk is the rock type responsible for the severe subsidence and compaction in the North Sea, it was selected for the first efforts at tomographic imaging of soft rocks. Field evidence from the North Sea suggests that compaction, which has resulted in over 30 feet of subsidence to date, is heterogeneously distributed within the reservoir. The research team will attempt to image this very process in chalk samples. The initial tomographic studies (Scott et al., 1994a,b; 1998) were accomplished on well-cemented, competent rocks such as Berea sandstone. The extension of the technology to weaker samples is more difficult but potentially much more rewarding. The chalk, since it is a weak material, also attenuates wave propagation more than other rock types. Three different types of sensors were considered (and tested) for the tomographic imaging project: 600 kHz PZT, 1 MHz PZT, and PVDF film sensors. The 600 kHz PZT crystals were selected because they generated a sufficiently high-amplitude pulse to propagate across the damaged chalk. A number of different configurations were considered for placement of the acoustic arrays. It was decided after preliminary testing that the optimal arrangement of the acoustic sensors was to place three arrays of sensors, with each array containing twenty sensors, around the sample. There would be two horizontal arrays to tomographically image two circular cross-sectional planes through the rock core sample. A third array would be vertically oriented to provide a vertical cross-sectional view of the sample. A total of 260 acoustic raypaths would be shot and acquired in the horizontal acoustic array to create each horizontal tomographic image. The sensors can be used both as acoustic sources and as acoustic receivers.

  11. The Advanced Linked Extended Reconnaissance & Targeting Technology Demonstration project

    NASA Astrophysics Data System (ADS)

    Edwards, Mark

    2008-04-01

    The Advanced Linked Extended Reconnaissance & Targeting (ALERT) Technology Demonstration (TD) project is addressing many operational needs of the future Canadian Army's Surveillance and Reconnaissance forces. Using the surveillance system of the Coyote reconnaissance vehicle as an experimental platform, the ALERT TD project aims to significantly enhance situational awareness by fusing multi-sensor and tactical data, developing automated processes, and integrating beyond line-of-sight sensing. The project is exploiting important advances made in computer processing capability, displays technology, digital communications, and sensor technology since the design of the original surveillance system. As the major research area within the project, concepts are discussed for displaying and fusing multi-sensor and tactical data within an Enhanced Operator Control Station (EOCS). The sensor data can originate from the Coyote's own visible-band and IR cameras, laser rangefinder, and ground-surveillance radar, as well as from beyond line-of-sight systems such as mini-UAVs and unattended ground sensors. Video-rate image processing has been developed to assist the operator in detecting poorly visible targets. As a second major area of research, automatic target cueing capabilities have been added to the system. These include scene change detection, automatic target detection, and aided target recognition algorithms processing both IR and visible-band images to draw the operator's attention to possible targets. The merits of incorporating scene change detection algorithms are also discussed. In the area of multi-sensor data fusion, up to Joint Defence Labs level 2 has been demonstrated. The human factors engineering aspects of the user interface in this complex environment are presented, drawing upon multiple user group sessions with military surveillance system operators. Lessons learned from the project are also presented. The ALERT system has been used in a number of C4ISR field trials, most recently at Exercise Empire Challenge at China Lake, CA, and at Trial Quest in Norway. Those exercises provided further opportunities to investigate operator interactions. The paper concludes with recommendations for future work in operator interface design.

  12. The fusion of satellite and UAV data: simulation of high spatial resolution band

    NASA Astrophysics Data System (ADS)

    Jenerowicz, Agnieszka; Siok, Katarzyna; Woroszkiewicz, Malgorzata; Orych, Agata

    2017-10-01

    Remote sensing techniques used in precision agriculture and farming that apply imagery data obtained with sensors mounted on UAV platforms have become more popular in the last few years due to the availability of low-cost UAV platforms and low-cost sensors. Data obtained from low altitudes with low-cost sensors can be characterised by high spatial and radiometric resolution but quite low spectral resolution; therefore the application of imagery data obtained with such technology is quite limited and can be used only for basic land cover classification. To enrich the spectral resolution of imagery data acquired with low-cost sensors from low altitudes, the authors proposed the fusion of RGB data obtained with a UAV platform with multispectral satellite imagery. The fusion is based on the pansharpening process, which aims to integrate the spatial details of a high-resolution panchromatic image with the spectral information of lower-resolution multispectral or hyperspectral imagery to obtain multispectral or hyperspectral images with high spatial resolution. The key to pansharpening is to properly estimate the missing spatial details of the multispectral images while preserving their spectral properties. In the research, the authors presented the fusion of RGB images (with high spatial resolution) obtained with sensors mounted on low-cost UAV platforms and multispectral imagery from satellite sensors, i.e. Landsat 8 OLI. To perform the fusion of UAV data with satellite imagery, the simulation of panchromatic bands from the RGB data, based on a linear combination of the spectral channels, was conducted. Next, for the simulated bands and multispectral satellite images, the Gram-Schmidt pansharpening method was applied. As a result of the fusion, the authors obtained several multispectral images with very high spatial resolution and then analysed the spatial and spectral accuracies of the processed images.
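The panchromatic simulation described above is simply a weighted linear combination of the RGB channels. A minimal NumPy sketch of that step (the weights are illustrative placeholders, not the coefficients used by the authors):

```python
import numpy as np

def simulate_pan(rgb, weights=(0.3, 0.5, 0.2)):
    """Simulate a panchromatic band as a weighted linear combination
    of the R, G, B channels; `weights` are hypothetical values."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalise so the pan band stays in the input range
    return np.tensordot(np.asarray(rgb, dtype=float), w, axes=([-1], [0]))

# Toy 2x2 RGB image with values in [0, 1]
rgb = np.array([[[0.2, 0.4, 0.6], [1.0, 1.0, 1.0]],
                [[0.0, 0.0, 0.0], [0.5, 0.5, 0.5]]])
pan = simulate_pan(rgb)  # shape (2, 2)
```

The simulated band would then stand in for the panchromatic input of a Gram-Schmidt (or similar component-substitution) pansharpening step applied to the multispectral satellite image.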

  13. The advanced linked extended reconnaissance and targeting technology demonstration project

    NASA Astrophysics Data System (ADS)

    Cruickshank, James; de Villers, Yves; Maheux, Jean; Edwards, Mark; Gains, David; Rea, Terry; Banbury, Simon; Gauthier, Michelle

    2007-06-01

    The Advanced Linked Extended Reconnaissance & Targeting (ALERT) Technology Demonstration (TD) project is addressing key operational needs of the future Canadian Army's Surveillance and Reconnaissance forces by fusing multi-sensor and tactical data, developing automated processes, and integrating beyond line-of-sight sensing. We discuss concepts for displaying and fusing multi-sensor and tactical data within an Enhanced Operator Control Station (EOCS). The sensor data can originate from the Coyote's own visible-band and IR cameras, laser rangefinder, and ground-surveillance radar, as well as beyond line-of-sight systems such as a mini-UAV and unattended ground sensors. The authors address technical issues associated with the use of fully digital IR and day video cameras and discuss video-rate image processing developed to assist the operator to recognize poorly visible targets. Automatic target detection and recognition algorithms processing both IR and visible-band images have been investigated to draw the operator's attention to possible targets. The machine generated information display requirements are presented with the human factors engineering aspects of the user interface in this complex environment, with a view to establishing user trust in the automation. The paper concludes with a summary of achievements to date and steps to project completion.

  14. VLC-based indoor location awareness using LED light and image sensors

    NASA Astrophysics Data System (ADS)

    Lee, Seok-Ju; Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    Recently, indoor LED lighting can be considered for constructing green infrastructure with energy saving, while additionally providing LED-IT convergence services such as visible light communication (VLC) based location awareness and navigation. For example, in a large shopping mall, location awareness for navigating to a destination is a very important issue. However, conventional navigation using GPS does not work indoors. Alternative location services based on WLAN have the problem that positioning accuracy is low; for example, it is difficult to estimate height exactly, and if the position error in height is greater than the height between floors, it may cause a serious problem. Therefore, conventional navigation is inappropriate for indoor use. An alternative solution for indoor navigation is a VLC-based location awareness scheme. Because indoor LED infrastructure will certainly be installed to provide lighting, indoor LED lighting has the potential to provide relatively high position-estimation accuracy when combined with VLC technology. In this paper, we provide a new VLC-based positioning system using visible LED lights and image sensors. Our system uses the location of the image sensor lens and the location of the reception plane. By using more than two image sensors, we can determine the transmitter position with less than 1 m of position error. Through simulation, we verify the validity of the proposed VLC-based positioning system using visible LED light and image sensors.
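The geometric idea behind using the lens location and the reception-plane location can be sketched as ray intersection: each image sensor defines a ray from the detected LED spot through its lens centre, and the transmitter lies where the rays (nearly) meet. A least-squares version of that idea, as an illustration only (the function and setup are ours, not the paper's method):

```python
import numpy as np

def led_position(lens_centres, image_spots):
    """Estimate the LED position as the least-squares intersection of
    the rays through each lens centre and the corresponding spot
    detected on the reception (image) plane. Illustrative sketch only."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for lens, spot in zip(lens_centres, image_spots):
        lens = np.asarray(lens, dtype=float)
        spot = np.asarray(spot, dtype=float)
        d = lens - spot
        d = d / np.linalg.norm(d)        # ray direction: spot -> lens -> LED
        P = np.eye(3) - np.outer(d, d)   # projector onto the ray's normal space
        A += P
        b += P @ lens
    return np.linalg.solve(A, b)         # needs >= 2 non-parallel rays
```

With two or more sensors whose rays are not parallel, the linear system is well conditioned and the estimate is unique; real measurement noise in the spot locations is what produces the sub-metre (rather than exact) accuracy the paper reports.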

  15. The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.

    2017-02-01

    Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits including increasingly high pixel counts and shrinking pixel sizes, nevertheless, they are also being hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and in some cases, imager response time. Recently invented is the Coded Access Optical Sensor (CAOS) Camera platform that works in unison with current Photo-Detector Array (PDA) technology to counter fundamental limitations of PDA-based imagers while providing high enough imaging spatial resolution and pixel counts. Using for example the Texas Instruments (TI) Digital Micromirror Device (DMD) to engineer the CAOS camera platform, ushered in is a paradigm change in advanced imager design, particularly for extreme dynamic range applications.

  16. Visible and infrared imaging radiometers for ocean observations

    NASA Technical Reports Server (NTRS)

    Barnes, W. L.

    1977-01-01

    The current status of visible and infrared sensors designed for the remote monitoring of the oceans is reviewed. Emphasis is placed on multichannel scanning radiometers that are either operational or under development. Present design practices and parameter constraints are discussed. Airborne sensor systems examined include the ocean color scanner and the ocean temperature scanner. The coastal zone color scanner and advanced very high resolution radiometer are reviewed with emphasis on design specifications. Recent technological advances and their impact on sensor design are examined.

  17. The Performance of a Tight Ins/gnss/photogrammetric Integration Scheme for Land Based MMS Applications in Gnss Denied Environments

    NASA Astrophysics Data System (ADS)

    Chu, Chien-Hsun; Chiang, Kai-Wei

    2016-06-01

    The early development of mobile mapping systems (MMS) was restricted to applications that permitted the determination of the elements of exterior orientation from existing ground control. Mobile mapping refers to a means of collecting geospatial data using mapping sensors mounted on a mobile platform. Research on mobile mapping dates back to the late 1980s, driven mainly by the need for highway infrastructure mapping and transportation corridor inventories. In the early nineties, advances in satellite and inertial technology made it possible to think about mobile mapping in a different way. Instead of using ground control points as references for orienting the images in space, the trajectory and attitude of the imager platform could now be determined directly. Cameras, along with navigation and positioning sensors, are integrated and mounted on a land vehicle for mapping purposes. Objects of interest can be directly measured and mapped from images that have been georeferenced using navigation and positioning sensors. Direct georeferencing (DG) is the determination of time-variable position and orientation parameters for a mobile digital imager. The most common technologies used for this purpose today are satellite positioning using the Global Navigation Satellite System (GNSS) and inertial navigation using an Inertial Measuring Unit (IMU). Although either technology used alone could in principle determine both position and orientation, they are usually integrated in such a way that the IMU is the main orientation sensor, while the GNSS receiver is the main position sensor. However, in GNSS-denied environments such as urban canyons, foliage, tunnels, and indoors, GNSS signals are obstructed by the limited number of visible satellites, causing GNSS gaps, or are interfered with by reflected signals that cause abnormal measurement residuals, deteriorating the positioning accuracy. This study aims at developing a novel method that uses ground control points to maintain the positioning accuracy of the MMS in GNSS-denied environments. Finally, the study analyses the performance of the proposed method using about 20 check-points through the DG process.
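The direct-georeferencing chain described above is usually written as a single vector equation: the mapping-frame coordinates of an object point are the GNSS/IMU position plus the rotated, scaled camera-frame ray and the lever-arm offset. A sketch of that textbook equation (variable names are ours, not the authors'):

```python
import numpy as np

def direct_georeference(r_gnss_m, R_b_to_m, lever_arm_b, R_c_to_b, scale, x_img_c):
    """Textbook direct-georeferencing equation: map a camera-frame
    image ray x_img_c to mapping-frame coordinates using the GNSS
    position, the IMU-derived body-to-mapping rotation R_b_to_m,
    the camera boresight rotation R_c_to_b and lever arm, and an
    object-distance scale factor."""
    ray_b = scale * (np.asarray(R_c_to_b) @ np.asarray(x_img_c, dtype=float))
    return (np.asarray(r_gnss_m, dtype=float)
            + np.asarray(R_b_to_m) @ (ray_b + np.asarray(lever_arm_b, dtype=float)))
```

With identity rotations, a lever arm of (0.1, 0, 0.2) m and a 10 m scaled ray along the camera axis, a GNSS position of (100, 200, 50) maps the point to (100.1, 200, 60.2), which is the sanity check one would expect from the equation.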

  18. Compact SPAD-Based Pixel Architectures for Time-Resolved Image Sensors

    PubMed Central

    Perenzoni, Matteo; Pancheri, Lucio; Stoppa, David

    2016-01-01

    This paper reviews the state of the art of single-photon avalanche diode (SPAD) image sensors for time-resolved imaging. The focus of the paper is on pixel architectures featuring small pixel size (<25 μm) and high fill factor (>20%) as a key enabling technology for the successful implementation of high spatial resolution SPAD-based image sensors. A summary of the main CMOS SPAD implementations, their characteristics and integration challenges, is provided from the perspective of targeting large pixel arrays, where one of the key drivers is the spatial uniformity. The main analog techniques aimed at time-gated photon counting and photon timestamping suitable for compact and low-power pixels are critically discussed. The main features of these solutions are the adoption of analog counting techniques and time-to-analog conversion, in NMOS-only pixels. Reliable quantum-limited single-photon counting, self-referenced analog-to-digital conversion, time gating down to 0.75 ns and timestamping with 368 ps jitter are achieved. PMID:27223284

  19. Real-time image processing of TOF range images using a reconfigurable processor system

    NASA Astrophysics Data System (ADS)

    Hussmann, S.; Knoll, F.; Edeler, T.

    2011-07-01

    During the last years, Time-of-Flight (TOF) sensors have had a significant impact on research fields in machine vision. In comparison to stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, and of camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
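The 4-phase-shift algorithm referred to above samples the correlation signal at four offsets 90° apart and recovers the phase, and hence the range, through the arctangent that the paper accelerates in hardware. A software sketch of the conventional formulation (the sample sign convention and modulation frequency are one common choice, not necessarily this sensor's):

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_range(a0, a1, a2, a3, f_mod):
    """4-phase-shift demodulation: samples a0..a3 are taken 90 degrees
    apart, assumed here to follow a_i = B + A*cos(phase + i*pi/2).
    The arctangent is the time-critical step accelerated in the paper."""
    phase = np.arctan2(a3 - a1, a0 - a2)   # recovered phase offset
    phase = np.mod(phase, 2.0 * np.pi)     # wrap into [0, 2*pi)
    return C * phase / (4.0 * np.pi * f_mod)
```

At a modulation frequency of, say, 20 MHz the phase wraps every c / (2 f_mod) ≈ 7.5 m, which is the unambiguous range of such a sensor.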

  20. Scintillating Quantum Dots for Imaging X-rays (SQDIX) for Aircraft Inspection

    NASA Technical Reports Server (NTRS)

    Burke, Eric (Principal Investigator); Williams, Phillip (Principal Investigator); Dehaven, Stan

    2015-01-01

    Scintillation is the process currently employed by conventional x-ray detectors to create x-ray images. Scintillating quantum dots, or nano-crystals (StQDs), are novel, nanometer-scale materials that, upon excitation by x-rays, re-emit the absorbed energy as visible light. StQDs theoretically have higher output efficiency than conventional scintillating materials and are more environmentally friendly. This paper will present the characterization of several critical elements in the use of StQDs that have been performed along a path to the use of this technology in widespread x-ray imaging. Initial work on the SQDIX system has shown great promise for creating state-of-the-art sensors using StQDs as a sensor material. In addition, this work also demonstrates a high degree of promise for using StQDs in microstructured fiber optics. Using the microstructured fiber as a light guide could greatly increase the capture efficiency of a StQD-based imaging sensor.

  1. A CMOS image sensor with programmable pixel-level analog processing.

    PubMed

    Massari, Nicola; Gottardi, Massimo; Gonzo, Lorenzo; Stoppa, David; Simoni, Andrea

    2005-11-01

    A prototype of a 34 x 34 pixel image sensor, implementing real-time analog image processing, is presented. Edge detection, motion detection, image amplification, and dynamic-range boosting are executed at pixel level by means of a highly interconnected pixel architecture based on the absolute value of the difference among neighbor pixels. The analog operations are performed over a kernel of 3 x 3 pixels. The square pixel, consisting of 30 transistors, has a pitch of 35 μm with a fill factor of 20%. The chip was fabricated in a 0.35 μm CMOS technology, and its power consumption is 6 mW with a 3.3 V power supply. The device was fully characterized and achieves a dynamic range of 50 dB with a light power density of 150 nW/mm² and a frame rate of 30 frames/s. The measured fixed pattern noise corresponds to 1.1% of the saturation level. The sensor's dynamic range can be extended up to 96 dB using the double-sampling technique.
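The pixel-level operation described, the absolute value of the difference among neighbor pixels, can be mimicked in software to see what the analog array computes. A minimal NumPy sketch using the four direct neighbours (a simplification of the chip's 3 x 3 kernel, not the authors' circuit):

```python
import numpy as np

def abs_diff_edges(img):
    """Edge map from absolute differences between each pixel and its
    4-connected neighbours; a software stand-in for the analog
    neighbour-difference operation performed inside the pixel array."""
    img = np.asarray(img, dtype=float)
    edges = np.zeros_like(img)
    dy = np.abs(np.diff(img, axis=0))  # |vertical neighbour differences|
    dx = np.abs(np.diff(img, axis=1))  # |horizontal neighbour differences|
    edges[:-1, :] += dy                # each difference contributes to
    edges[1:, :] += dy                 # both pixels that straddle it
    edges[:, :-1] += dx
    edges[:, 1:] += dx
    return edges
```

A uniform image yields an all-zero map, while an intensity step produces a response on both sides of the edge, which is the behaviour a per-pixel neighbour-difference circuit would exhibit.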

  2. Sensor and information fusion for improved hostile fire situational awareness

    NASA Astrophysics Data System (ADS)

    Scanlon, Michael V.; Ludwig, William D.

    2010-04-01

    A research-oriented Army Technology Objective (ATO) named Sensor and Information Fusion for Improved Hostile Fire Situational Awareness uniquely focuses on the underpinning technologies to detect and defeat any hostile threat: before, during, and after its occurrence. This is a joint effort led by the Army Research Laboratory, with the Armaments and the Communications and Electronics Research, Development, and Engineering Centers (ARDEC and CERDEC) as partners. It addresses distributed sensor fusion and collaborative situational awareness enhancements, focusing on the underpinning technologies to detect/identify potential hostile shooters prior to firing a shot and to detect/classify/locate the firing point of hostile small arms, mortars, rockets, RPGs, and missiles after the first shot. A field experiment was conducted that addressed not only diverse-modality sensor performance and sensor fusion benefits, but also gathered useful data to develop and demonstrate the ad hoc networking and dissemination of relevant data and actionable intelligence. Represented at this field experiment were various sensor platforms such as UGS, soldier-worn sensors, manned ground vehicles, UGVs, UAVs, and helicopters. This ATO continues to evaluate applicable technologies, including retro-reflection, UV, IR, visible, glint, LADAR, radar, acoustic, seismic, E-field, narrow-band emission, and image processing techniques, to detect the threats with very high confidence. Networked fusion of multi-modal data will reduce false alarms and improve actionable intelligence by distributing grid coordinates, detection report features, and imagery of threats.

  3. Monitoring Animal Behaviour and Environmental Interactions Using Wireless Sensor Networks, GPS Collars and Satellite Remote Sensing

    PubMed Central

    Handcock, Rebecca N.; Swain, Dave L.; Bishop-Hurley, Greg J.; Patison, Kym P.; Wark, Tim; Valencia, Philip; Corke, Peter; O'Neill, Christopher J.

    2009-01-01

    Remote monitoring of animal behaviour in the environment can assist in managing both the animal and its environmental impact. GPS collars which record animal locations with high temporal frequency allow researchers to monitor both animal behaviour and interactions with the environment. These ground-based sensors can be combined with remotely-sensed satellite images to understand animal-landscape interactions. The key to combining these technologies is communication methods such as wireless sensor networks (WSNs). We explore this concept using a case-study from an extensive cattle enterprise in northern Australia and demonstrate the potential for combining GPS collars and satellite images in a WSN to monitor behavioural preferences and social behaviour of cattle. PMID:22412327

  4. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Lagueux, P.; Farley, V.; Marcotte, F.; Chamberland, M.

    2009-09-01

    Hyperspectral ground mapping is being used in an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum help produce information of differing nature. The visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has been mostly used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have been typically used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self emission of the sensor itself. The Fourier-transform spectrometer technology has been extensively used in the infrared spectral range due to its high transmittance as well as throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on the Fourier-transform technology yielding high spectral resolution and enabling high accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320x256 pixels at spectral resolutions of up to 0.25 cm-1. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology. 
The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in this paper.
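The Fourier-transform principle behind the Hyper-Cam, recovering a spectrum from a measured interferogram, reduces at its core to an FFT. A minimal sketch (illustrative only; a real instrument also apodizes, zero-fills and phase-corrects before radiometric calibration):

```python
import numpy as np

def spectrum_from_interferogram(interferogram):
    """Recover spectral magnitudes from a Fourier-transform spectrometer
    interferogram with a real FFT. Minimal sketch: real processing chains
    also apodize, zero-fill and phase-correct before calibration."""
    return np.abs(np.fft.rfft(interferogram))

# A monochromatic source produces a cosine interferogram; its spectrum
# should be a single peak at the corresponding frequency bin.
n = np.arange(64)
interferogram = np.cos(2 * np.pi * 5 * n / 64)
spectrum = spectrum_from_interferogram(interferogram)
```

The spectral resolution of such an instrument improves with the maximum optical path difference sampled, which is why the Hyper-Cam's 0.25 cm−1 setting implies a long interferogram scan.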

  5. Airborne measurements in the infrared using FTIR-based imaging hyperspectral sensors

    NASA Astrophysics Data System (ADS)

    Puckrin, E.; Turcotte, C. S.; Lahaie, P.; Dubé, D.; Farley, V.; Lagueux, P.; Marcotte, F.; Chamberland, M.

    2009-05-01

    Hyperspectral ground mapping is being used to an ever-increasing extent for numerous applications in the military, geology and environmental fields. The different regions of the electromagnetic spectrum yield information of a different nature. Visible, near-infrared and short-wave infrared radiation (400 nm to 2.5 μm) has mostly been used to analyze reflected solar light, while the mid-wave (3 to 5 μm) and long-wave (8 to 12 μm, or thermal) infrared senses the self-emission of molecules directly, enabling the acquisition of data during night time. Push-broom dispersive sensors have typically been used for airborne hyperspectral mapping. However, extending the spectral range towards the mid-wave and long-wave infrared brings performance limitations due to the self-emission of the sensor itself. Fourier-transform spectrometer technology has been used extensively in the infrared spectral range due to its high transmittance as well as its throughput and multiplex advantages, thereby reducing the sensor self-emission problem. Telops has developed the Hyper-Cam, a rugged and compact infrared hyperspectral imager. The Hyper-Cam is based on Fourier-transform technology, yielding high spectral resolution and enabling high-accuracy radiometric calibration. It provides passive signature measurement capability, with up to 320 × 256 pixels at spectral resolutions of up to 0.25 cm−1. The Hyper-Cam has been used on the ground in several field campaigns, including the demonstration of standoff chemical agent detection. More recently, the Hyper-Cam has been integrated into an airplane to provide airborne measurement capabilities. A special pointing module was designed to compensate for airplane attitude and forward motion. To our knowledge, the Hyper-Cam is the first commercial airborne hyperspectral imaging sensor based on Fourier-transform infrared technology.
The first airborne measurements and some preliminary performance criteria for the Hyper-Cam are presented in this paper.

  6. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping.

    PubMed

    Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca

    2015-08-12

    Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are spectrally similar and alike in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data for altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights.
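The resampling step described above can be sketched as block averaging, which simulates the coarser ground sample distance of a higher flight altitude. This is a hypothetical illustration, not the authors' exact processing chain:

```python
import numpy as np

def block_average(img, factor):
    """Downsample a 2-D image by an integer factor using block averaging,
    e.g. factor=2 turns 30 m flight-altitude pixels into pixels equivalent
    to roughly double the ground sample distance."""
    h, w = img.shape
    if h % factor or w % factor:
        raise ValueError("image dimensions must be divisible by the factor")
    # Split each axis into (blocks, factor) and average within each block.
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# 4x4 toy image -> 2x2 resampled image of block means
img = np.arange(16, dtype=float).reshape(4, 4)
rs = block_average(img, 2)
```

Real resampling pipelines may use bilinear or cubic kernels instead of a box filter; block averaging is the simplest model of sensor footprint aggregation.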

  7. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping

    PubMed Central

    Borra-Serrano, Irene; Peña, José Manuel; Torres-Sánchez, Jorge; Mesas-Carrascosa, Francisco Javier; López-Granados, Francisca

    2015-01-01

    Unmanned aerial vehicles (UAVs) combined with different spectral range sensors are an emerging technology for providing early weed maps for optimizing herbicide applications. Considering that weeds, at very early phenological stages, are spectrally similar and alike in appearance, three major components are relevant: spatial resolution, type of sensor and classification algorithm. Resampling is a technique to create a new version of an image with a different width and/or height in pixels, and it has been used in satellite imagery with different spatial and temporal resolutions. In this paper, the efficiency of resampled images (RS-images) created from real UAV-images (UAV-images; the UAVs were equipped with two types of sensors, i.e., visible and visible plus near-infrared spectra) captured at different altitudes is examined to test the quality of the RS-image output. The performance of the object-based image analysis (OBIA) implemented for early weed mapping using different weed thresholds was also evaluated. Our results showed that resampling accurately extracted the spectral values from high-spatial-resolution UAV-images taken at an altitude of 30 m, and that the RS-image data for altitudes of 60 and 100 m were able to provide accurate weed cover and herbicide application maps compared with UAV-images from real flights. PMID:26274960

  8. Real time in vivo imaging and measurement of serine protease activity in the mouse hippocampus using a dedicated complementary metal-oxide semiconductor imaging device.

    PubMed

    Ng, David C; Tamura, Hideki; Tokuda, Takashi; Yamamoto, Akio; Matsuo, Masamichi; Nunoshita, Masahiro; Ishikawa, Yasuyuki; Shiosaka, Sadao; Ohta, Jun

    2006-09-30

    The aim of the present study is to demonstrate the application of complementary metal-oxide semiconductor (CMOS) imaging technology to studying the mouse brain. By using a dedicated CMOS image sensor, we have successfully imaged and measured brain serine protease activity in vivo, in real time, and for an extended period of time. We have developed a biofluorescence imaging device by packaging the CMOS image sensor to enable an on-chip imaging configuration. In this configuration, no optics are required; instead, an excitation filter is applied directly onto the sensor to replace the filter cube block found in conventional fluorescence microscopes. The fully packaged device measures 350 μm thick by 2.7 mm wide, consists of an array of 176 × 144 pixels, and is small enough for measurement inside a single hemisphere of the mouse brain, while still providing sufficient imaging resolution. In the experiment, intraperitoneally injected kainic acid induced upregulation of serine protease activity in the brain. These events were captured in real time by imaging and measuring the fluorescence from a fluorogenic substrate that detected this activity. The entire device, which weighs less than 1% of the body weight of the mouse, holds promise for studying freely moving animals.

  9. Attitude determination for high-accuracy submicroradian jitter pointing on space-based platforms

    NASA Astrophysics Data System (ADS)

    Gupta, Avanindra A.; van Houten, Charles N.; Germann, Lawrence M.

    1990-10-01

    A description of the requirement definition process is given for a new wideband attitude determination subsystem (ADS) for image motion compensation (IMC) systems. The subsystem consists of either lateral accelerometers functioning in differential pairs or gas-bearing gyros as high-frequency sensors, combined with CCD-based star trackers as low-frequency sensors. To minimize error, the sensor signals are combined with a mixing filter that introduces no phase distortion. The two ADS models are introduced in an IMC simulation to predict measurement error, correction capability, and residual image jitter for a variety of system parameters. The IMC three-axis testbed is utilized to simulate an incoming beam in inertial space. Results demonstrate that both mechanical and electronic IMC meet the requirements of image stabilization for space-based observation at submicroradian jitter levels. Currently available technology may be employed to implement IMC systems.
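The sensor-mixing idea, trusting the inertial sensors at high frequency and the star trackers at low frequency, is commonly realized as a complementary filter. A first-order sketch (illustrative only; the ADS described in the paper is more elaborate):

```python
import numpy as np

def complementary_filter(gyro_rate, star_angle, dt, tau):
    """Blend integrated gyro rates (trusted at high frequency) with
    star-tracker angles (trusted at low frequency). The two paths use
    complementary filters that sum to unity, avoiding phase distortion
    at the crossover. tau sets the crossover time constant."""
    alpha = tau / (tau + dt)             # weight on the gyro-propagated path
    est = np.empty_like(star_angle, dtype=float)
    est[0] = star_angle[0]
    for k in range(1, len(star_angle)):
        propagated = est[k - 1] + gyro_rate[k] * dt   # high-frequency path
        est[k] = alpha * propagated + (1.0 - alpha) * star_angle[k]
    return est

# A constant 0.1 rad/s slew observed consistently by both sensors is
# tracked exactly; noisy star measurements would be low-pass filtered.
dt, tau = 0.01, 1.0
true_angle = 0.1 * np.arange(100) * dt
est = complementary_filter(np.full(100, 0.1), true_angle, dt, tau)
```

In practice tau is chosen so the gyro covers the jitter band and the star tracker removes the gyro's low-frequency drift.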

  10. Multichannel imager for littoral zone characterization

    NASA Astrophysics Data System (ADS)

    Podobna, Yuliya; Schoonmaker, Jon; Dirbas, Joe; Sofianos, James; Boucher, Cynthia; Gilbert, Gary

    2010-04-01

    This paper describes an approach to utilizing a multi-channel, multi-spectral electro-optic (EO) system for littoral zone characterization. Advanced Coherent Technologies, LLC (ACT) presents its EO sensor systems for surf zone environmental assessment and potential surf zone target detection. Specifically, an approach is presented to determine a Surf Zone Index (SZI) from the multi-spectral EO sensor system. The SZI provides a single quantitative value of surf zone conditions, delivering an immediate understanding of the area and an assessment of how well an airborne optical system might perform in a mine countermeasures (MCM) operation. Utilizing consecutive frames of SZI images, ACT is able to measure variability over time. A surf zone nomograph, which incorporates target, sensor, and environmental data, including the SZI, to determine the environmental impact on system performance, is reviewed in this work. ACT's electro-optical multi-channel, multi-spectral imaging system and test results are presented and discussed.

  11. Basic Geometric Support of Systems for Earth Observation from Geostationary and Highly Elliptical Orbits

    NASA Astrophysics Data System (ADS)

    Gektin, Yu. M.; Egoshkin, N. A.; Eremeev, V. V.; Kuznecov, A. E.; Moskatinyev, I. V.; Smelyanskiy, M. B.

    2017-12-01

    A set of standardized models and algorithms for the geometric normalization and georeferencing of images from geostationary and highly elliptical Earth observation systems is considered. The algorithms can process information from modern scanning multispectral sensors with two-coordinate scanning and represent normalized images in an optimal projection. Problems of the high-precision ground calibration of the imaging equipment using reference objects, as well as issues of in-flight calibration and the refinement of geometric models using absolute and relative reference points, are considered. Practical testing of the models, algorithms, and technologies is performed in the calibration of sensors for spacecraft of the Electro-L series and during the simulation of the prospective Arktika system.

  12. Leonardo (formerly Selex ES) infrared sensors for astronomy: present and future

    NASA Astrophysics Data System (ADS)

    Baker, Ian; Maxey, Chris; Hipwood, Les; Barnes, Keith

    2016-07-01

    Many branches of science require infrared detectors sensitive to individual photons. Applications range from low-background astronomy to high-speed imaging. Leonardo in Southampton, UK, has been developing HgCdTe avalanche photodiode (APD) sensors for astronomy in collaboration with the European Southern Observatory (ESO) since 2008 and, more recently, with the University of Hawaii. The devices utilise Metal Organic Vapour Phase Epitaxy (MOVPE) growth on low-cost GaAs substrates and, in combination with a mesa device structure, achieve very low dark current and near-ideal MTF. MOVPE provides the ability to grow complex HgCdTe heterostructures, and these have proved crucial to suppress breakdown currents and allow high avalanche gain in low-background situations. A custom device called Saphira (320 × 256/24 μm) has been developed for wavefront sensors, interferometry and transient event imaging. This device has achieved read noise as low as 0.26 electrons rms and single-photon imaging with avalanche gain up to ×450. It is used in the ESO Gravity program for adaptive optics and fringe tracking and has been successfully trialled on the 3 m NASA IRTF, the 8.2 m Subaru telescope and the 60-inch Mt Palomar telescope for lucky imaging and wavefront sensing. In the future the technology offers much shorter observation times for read-noise-limited instruments, particularly spectroscopy. The paper will describe the MOVPE APD technology and its current performance status.

  13. A Wireless Sensor Network-Based Ubiquitous Paprika Growth Management System

    PubMed Central

    Hwang, Jeonghwan; Shin, Changsun; Yoe, Hyun

    2010-01-01

    Wireless Sensor Network (WSN) technology can facilitate advances in productivity, safety and human quality of life through its applications in various industries. In particular, applying WSN technology to agriculture, which is labor-intensive compared to other industries and typically lags in the adoption of IT, adds value and can increase agricultural productivity. This study attempts to establish a ubiquitous agricultural environment and improve the productivity of farms that grow paprika by proposing a ‘Ubiquitous Paprika Greenhouse Management System’ using WSN technology. The proposed system can collect and monitor information related to the growth environment of crops outside and inside paprika greenhouses by installing WSN sensors and monitoring images captured by CCTV cameras. In addition, the system provides a paprika greenhouse environment control facility for manual and automatic control from a distance, improves the convenience and productivity of users, and facilitates an optimized environment for growing paprika based on the growth environment data acquired by operating the system. PMID:22163543

  14. Earth imaging and scientific observations by SSTI ``Clark'' a NASA technology demonstration spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1997-01-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of ``Clark,'' a high technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb. satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  15. Demonstration of Airborne Wide Area Assessment Technologies at Pueblo Precision Bombing Ranges, Colorado. Hyperspectral Imaging, Version 2.0

    DTIC Science & Technology

    2007-09-27

    the spatial and spectral resolution ...variety of geological and vegetation mapping efforts, the Hymap sensor offered the best available combination of spectral and spatial resolution, signal... The limitations of the technology currently relate to spatial and spectral resolution and geo-correction accuracy. Secondly, HSI datasets

  16. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors.

    PubMed

    Kawahito, Shoji; Seo, Min-Woong

    2016-11-06

    This paper discusses the noise reduction effect of multiple-sampling-based signal-readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and a low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of the noise components in the signal readout chain from the pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the dependence of the noise reduction on the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e− rms) when compared with the CMS gains of two (2.4 e− rms) or 16 (1.1 e− rms).
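The scaling behind this noise reduction can be illustrated with a toy Monte Carlo model: averaging M reset samples and M signal samples before differencing shrinks white read noise as sqrt(2/M). The noise and signal values below are illustrative, not the paper's measured circuit:

```python
import numpy as np

rng = np.random.default_rng(0)
READ_NOISE = 2.4   # single-sample read noise, e- rms (illustrative value)
SIGNAL = 10.0      # photo-signal, electrons (illustrative value)

def cms_output(m, trials=40000):
    """Correlated multiple sampling: average m reset samples and m signal
    samples, then take the difference. For white noise, the result has
    rms noise READ_NOISE * sqrt(2 / m)."""
    reset = rng.normal(0.0, READ_NOISE, (trials, m)).mean(axis=1)
    sig = rng.normal(SIGNAL, READ_NOISE, (trials, m)).mean(axis=1)
    return sig - reset

noise_m2 = cms_output(2).std()    # expected near READ_NOISE * sqrt(2/2) = 2.4
noise_m16 = cms_output(16).std()  # expected near READ_NOISE * sqrt(2/16) ~ 0.85
```

Real sensors deviate from the pure sqrt law at large M because 1/f and RTS noise in the source follower are not white, which is why measured gains of 128 improve less than the model predicts.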

  17. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    PubMed Central

    Kawahito, Shoji; Seo, Min-Woong

    2016-01-01

    This paper discusses the noise reduction effect of multiple-sampling-based signal-readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and a low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of the noise components in the signal readout chain from the pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the dependence of the noise reduction on the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e− rms) when compared with the CMS gains of two (2.4 e− rms) or 16 (1.1 e− rms). PMID:27827972

  18. ASPECT (Airborne Spectral Photometric Environmental Collection Technology) Fact Sheet

    EPA Pesticide Factsheets

    This multi-sensor screening tool provides infrared and photographic images with geospatial, chemical, and radiological data within minutes to support emergency responses, home-land security missions, environmental surveys, and climate monitoring missions.

  19. Digital mammography, cancer screening: Factors important for image compression

    NASA Technical Reports Server (NTRS)

    Clarke, Laurence P.; Blaine, G. James; Doi, Kunio; Yaffe, Martin J.; Shtern, Faina; Brown, G. Stephen; Winfield, Daniel L.; Kallergi, Maria

    1993-01-01

    The use of digital mammography for breast cancer screening poses several novel problems, such as the development of digital sensors; computer-assisted diagnosis (CAD) methods for image noise suppression, enhancement, and pattern recognition; and compression algorithms for image storage, transmission, and remote diagnosis. X-ray digital mammography using novel direct digital detection schemes or film digitizers results in large data sets, and therefore image compression methods will play a significant role in the image processing and analysis by CAD techniques. In view of the extensive compression required, the relative merit of 'virtually lossless' versus lossy methods should be determined. A brief overview is presented here of the developments in digital sensors, CAD, and compression methods currently proposed and tested for mammography. The objective of the NCI/NASA Working Group on Digital Mammography is to stimulate the interest of the image processing and compression scientific community in this medical application and to identify possible dual-use technologies within the NASA centers.

  20. Accurate reconstruction of hyperspectral images from compressive sensing measurements

    NASA Astrophysics Data System (ADS)

    Greer, John B.; Flake, J. C.

    2013-05-01

    The emerging field of Compressive Sensing (CS) provides a new way to capture data by shifting the heaviest burden of data collection from the sensor to the computer on the user end. This new means of sensing requires fewer measurements for a given amount of information than traditional sensors. We investigate the efficacy of CS for capturing HyperSpectral Imagery (HSI) remotely. We also introduce a new family of algorithms for constructing HSI from CS measurements with Split Bregman Iteration [Goldstein and Osher, 2009]. These algorithms combine spatial Total Variation (TV) with smoothing in the spectral dimension. We examine models for three different CS sensors: the Coded Aperture Snapshot Spectral Imager-Single Disperser (CASSI-SD) [Wagadarikar et al., 2008] and Dual Disperser (CASSI-DD) [Gehm et al., 2007] cameras, and a hypothetical random sensing model closer to CS theory, but not necessarily implementable with existing technology. We simulate the capture of remotely sensed images by applying the sensor forward models to well-known HSI scenes: an AVIRIS image of Cuprite, Nevada and the HYMAP Urban image. To measure the accuracy of the CS models, we compare the scenes constructed with our new algorithm to the original AVIRIS and HYMAP cubes. The results demonstrate the possibility of accurately sensing HSI remotely with significantly fewer measurements than standard hyperspectral cameras.
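The core compressive-sensing recovery problem, reconstructing a sparse signal from fewer random measurements than unknowns, can be sketched with a basic iterative shrinkage-thresholding (ISTA) solver standing in for the Split Bregman method used in the paper. The problem sizes are toy values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: n-sample scene with k nonzeros, m << n random measurements.
n, m, k = 100, 40, 4
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(0.0, 1.0, k)
Phi = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))   # random sensing matrix
y = Phi @ x_true                                   # compressive measurements

def ista(y, Phi, lam=0.01, steps=3000):
    """Iterative shrinkage-thresholding for
    min 0.5*||Phi x - y||^2 + lam*||x||_1; a basic sparse solver used
    here in place of the paper's Split Bregman / TV formulation."""
    x = np.zeros(Phi.shape[1])
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(steps):
        g = x - Phi.T @ (Phi @ x - y) / L    # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(y, Phi)
```

The paper's algorithms replace the l1 penalty with spatial total variation plus spectral smoothing, but the measure-few, optimize-to-reconstruct structure is the same.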

  1. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process.

    PubMed

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-12

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low-noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. Introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with a low-noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7", 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.
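Merging the high-gain and low-gain readouts into one linear signal can be sketched as follows; the gain ratio and ADC clip level here are hypothetical, not the device's actual parameters:

```python
import numpy as np

GAIN_RATIO = 16.0    # high-gain / low-gain conversion ratio (hypothetical)
ADC_CLIP = 4095.0    # 12-bit ADC full scale (hypothetical)

def merge_dual_gain(low_dn, high_dn):
    """Use the low-noise high-gain sample wherever it has not clipped;
    otherwise fall back to the wide-range low-gain sample. Both paths are
    rescaled onto one common linear scale (low-gain units)."""
    low = np.asarray(low_dn, dtype=float)
    high = np.asarray(high_dn, dtype=float)
    return np.where(high < ADC_CLIP, high / GAIN_RATIO, low)

# Three pixels: dark, mid, and bright enough to clip the high-gain path.
signal = np.array([10.0, 100.0, 5000.0])          # "true" signal, low-gain DN
high = np.minimum(signal * GAIN_RATIO, ADC_CLIP)  # high-gain readout clips
merged = merge_dual_gain(signal, high)
```

Because both samples come from one exposure, a moving object is identical in the two channels, which is the artifact advantage over multiple-exposure HDR.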

  2. Plenoptic mapping for imaging and retrieval of the complex field amplitude of a laser beam.

    PubMed

    Wu, Chensheng; Ko, Jonathan; Davis, Christopher C

    2016-12-26

    The plenoptic sensor has been developed to sample complicated beam distortions produced by turbulence in the low atmosphere (deep turbulence or strong turbulence) with high-density data samples. In contrast with the conventional Shack-Hartmann wavefront sensor, which utilizes all the pixels under each lenslet of a micro-lens array (MLA) to obtain one data sample indicating sub-aperture phase gradient and photon intensity, the plenoptic sensor uses each illuminated pixel (with significant pixel value) under each MLA lenslet as a data point for local phase gradient and intensity. To characterize the working principle of a plenoptic sensor, we propose the concept of plenoptic mapping and its inverse mapping to describe the imaging and reconstruction processes, respectively. As a result, we show that plenoptic mapping is an efficient method to image and reconstruct the complex field amplitude of an incident beam from just one image. With a proof-of-concept experiment, we show that adaptive optics (AO) phase correction can be achieved instantaneously, without going through a phase reconstruction process, under the concept of plenoptic mapping. The plenoptic mapping technology has high potential for applications in imaging, free space optical (FSO) communication and directed energy (DE), where atmospheric turbulence distortion needs to be compensated.

  3. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process †

    PubMed Central

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-01

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low noise dual-gain readout circuit. The developed 3 μm pixel is capable of having three conversion gains. Introducing a new split-pinned photodiode structure, linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with a low noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single exposure dynamic rage (SEHDR) signal is obtained. Using this technology, a 1/2.7”, 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple exposure high dynamic range (MEHDR) approach. PMID:29329210

  4. Study on an agricultural environment monitoring server system using Wireless Sensor Networks.

    PubMed

    Hwang, Jeonghwan; Shin, Changsun; Yoe, Hyun

    2010-01-01

    This paper proposes an agricultural environment monitoring server system for monitoring information concerning an outdoor agricultural production environment utilizing Wireless Sensor Network (WSN) technology. The proposed system collects environmental and soil information outdoors through WSN-based environmental and soil sensors, collects image information through CCTVs, and collects location information using GPS modules. This collected information is converted into a database by the agricultural environment monitoring server, which consists of a sensor manager that manages information collected from the WSN sensors, an image information manager that manages image information collected from the CCTVs, and a GPS manager that processes the location information of the system; the information is then provided to producers. In addition, a solar-cell-based power supply is implemented for the server system so that it can be used in agricultural environments with insufficient power infrastructure. This agricultural environment monitoring server system can monitor environmental information outdoors remotely, and the use of such a system can be expected to contribute to increasing crop yields and improving quality in the agricultural field by supporting the decision making of crop producers through analysis of the collected information.

  5. Optimization of illuminating system to detect optical properties inside a finger

    NASA Astrophysics Data System (ADS)

    Sano, Emiko; Shikai, Masahiro; Shiratsuki, Akihide; Maeda, Takuji; Matsushita, Masahito; Sasakawa, Koichi

    2007-01-01

    Biometrics performs personal authentication using individual bodily features such as fingerprints and faces. These technologies have been studied and developed for many years. In particular, fingerprint authentication has evolved over many years, and fingerprinting is currently one of the world's most established biometric authentication techniques. Not long ago this technique was only used for personal identification in criminal investigations and high-security facilities. In recent years, however, various biometric authentication techniques have appeared in everyday applications. While providing great convenience, they have also produced a number of technical issues concerning operation. Generally, fingerprint authentication comprises a number of component technologies: (1) sensing technology for detecting the fingerprint pattern; (2) image processing technology for converting the captured pattern into feature data that can be used for verification; (3) verification technology for comparing the feature data with a reference and determining whether it matches. Current fingerprint authentication issues, revealed in research results, originate with fingerprint sensing technology. Sensing methods for detecting a person's fingerprint pattern are particularly important because they impact overall fingerprint authentication performance. The current problems with sensing methods are as follows: some fingerprints are difficult to detect with conventional sensors, and fingerprint patterns are easily affected by the finger's surface condition, so that noise such as discontinuities and thin spots can appear in patterns obtained from wrinkled or sweaty fingers. To address these problems, we proposed a novel fingerprint sensor based on new scientific knowledge. 
A characteristic of this new method is that the captured fingerprint patterns are not easily affected by the finger's surface condition, because the sensor detects the pattern inside the finger using transmitted light. We examined the optimization of the illumination system of this novel fingerprint sensor to obtain high-contrast fingerprint patterns over a wide area and to improve the image processing in step (2).

  6. Sensing, Spectra and Scaling: What's in Store for Land Observations

    NASA Technical Reports Server (NTRS)

    Goetz, Alexander F. H.

    2001-01-01

    Bill Pecora's 1960s vision of the future, using spacecraft-based sensors for mapping the environment and exploring for resources, is being implemented today. New technology has produced better sensors in space, such as the Landsat Thematic Mapper (TM) and SPOT, and creative researchers are continuing to find new applications. However, with existing sensors, and those intended for launch in this century, the potential for extracting information from the land surface is far from being exploited. The most recent technology development is imaging spectrometry, the acquisition of images in hundreds of contiguous spectral bands, such that a complete reflectance spectrum can be acquired for any pixel. Experience with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has shown that, with proper attention paid to absolute calibration, it is possible to acquire apparent surface reflectance to 5% accuracy without any ground-based measurement. The data reduction incorporates an educated guess of the aerosol scattering, development of a precipitable water vapor map from the data, and mapping of cirrus clouds in the 1.38 μm band. This is not possible with TM. The pixel size in images of the earth plays an important role in the type and quality of information that can be derived. Less understood is the coupling between spatial and spectral resolution in a sensor. Recent work has shown that in processing the data to derive the relative abundance of materials in a pixel, also known as unmixing, the pixel size is an important parameter. A variance in the relative abundance of materials among the pixels is necessary to be able to derive the endmembers, or pure material constituent spectra. In most cases, the 1 km pixel size of the Earth Observing System Moderate Resolution Imaging Spectroradiometer (MODIS) instrument is too large to meet the variance criterion. 
A pointable high spatial and spectral resolution imaging spectrometer in orbit will be necessary to make the major next step in our understanding of the solid earth surface and its changing face.
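    The unmixing step described in this abstract rests on the linear mixing model: each pixel spectrum is treated as a weighted sum of endmember spectra, and the abundances are recovered by least squares. A minimal sketch, using made-up 4-band endmember values purely for illustration:

```python
import numpy as np

# Hypothetical 4-band endmember spectra (columns: soil, vegetation, water).
# The values are illustrative, not measured reflectances.
E = np.array([
    [0.30, 0.05, 0.02],
    [0.35, 0.08, 0.03],
    [0.40, 0.45, 0.02],
    [0.45, 0.30, 0.01],
])

# Synthesize a mixed pixel: 60% soil, 30% vegetation, 10% water.
true_abundances = np.array([0.6, 0.3, 0.1])
pixel = E @ true_abundances

# Unmixing: solve E a ≈ pixel for the abundance vector a by least squares.
a, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(np.round(a, 3))  # recovers [0.6, 0.3, 0.1]
```

    The variance criterion the abstract mentions shows up here as a requirement on E: without enough spectral variation among pixels, the endmember columns cannot be estimated from the data in the first place.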

  7. An electromagnetic noncontacting sensor for thickness measurement in a dispersive medium

    NASA Technical Reports Server (NTRS)

    Chufo, Robert L.

    1994-01-01

    This paper describes a general purpose imaging technology developed by the U.S. Bureau of Mines (USBM) that, when fully implemented, will solve the general problem of 'seeing into the earth.' A first-generation radar coal thickness sensor, the RCTS-1, has been developed and field-tested in both underground and highwall mines. The noncontacting electromagnetic technique uses spatial modulation created by moving a simple sensor antenna in a direction along each axis to be measured while the complex reflection coefficient is measured at multiple frequencies over a two-to-one bandwidth. The antenna motion imparts spatial modulation to the data that enables signal processing to solve the problems of media, target, and antenna dispersion. Knowledge of the dielectric constant of the media is not necessary because the electrical properties of the media are determined automatically along with the distance to the target and thickness of each layer of the target. The sensor was developed as a navigation guidance sensor to accurately detect the coal/noncoal interface required for the USBM computer-assisted mining machine program. Other mining applications include the location of rock fractures, water-filled voids, and abandoned gas wells. These hazards can be detected in advance of the mining operation. This initiating technology is being expanded into a full three-dimensional (3-D) imaging system that will have applications in both the underground and surface environment.

  8. UV-visible sensors based on polymorphous silicon

    NASA Astrophysics Data System (ADS)

    Guedj, Cyril S.; Cabarrocas, Pere R. i.; Massoni, Nicolas; Moussy, Norbert; Morel, Damien; Tchakarov, Svetoslav; Bonnassieux, Yvan

    2003-09-01

    UV-based imaging systems can be used for low-altitude rocket detection or biological agent identification (for instance, weapons containing anthrax). Compared to conventional CCD technology, CMOS-based active pixel sensors provide several advantages, including excellent electro-optical performance, high integration, low-voltage operation, low power consumption, low cost, long lifetime, and environmental robustness. The monolithic integration of UV, visible, and infrared detectors on the same uncooled CMOS smart system would therefore represent a major advance in the combat field, for characterization and representation of targets and backgrounds. In this approach, we have recently developed a novel technology using polymorphous silicon. This new material, fully compatible with above-IC silicon technology, is made of nanometric-size ordered domains embedded in an amorphous matrix. The typical quantum efficiency of detectors made of this nano-material reaches up to 80% at 550 nm and 30% in the UV range, depending on the design and the growth parameters. Furthermore, a record dark current of 20 pA/cm2 at -3 V has been reached. In addition, this new generation of sensors is significantly faster and more stable than their amorphous silicon counterparts. In this paper, we present the relationship between the sensor technology and the overall performance.

  9. Percutaneous fiber-optic sensor for the detection of chemotherapy-induced apoptosis in vivo

    NASA Astrophysics Data System (ADS)

    O'Kelly, James; Liao, Kuo-Chih; Clifton, William; Lu, Daning; Koeffler, Phillip; Loeb, Gerald

    2010-02-01

    Early imaging of tumor response to chemotherapy has the potential for significant clinical benefits. We are developing a family of fiber-optic sensors called Sencils™ (sensory cilia), which are disposable, minimally invasive, and can provide in vivo monitoring of various analytes for several weeks. The objective of this study was to develop and test our sensor to image the labeling of phosphatidylserine (PS) by apoptotic cells in response to chemotherapeutic drugs. FM1-43 was a better fluorescent marker for detecting PS expression than Annexin V-FITC: both the proportion of labeled cells (Annexin V, 15%; FM1-43, 58%) and the relative fluorescence increase (Annexin V-FITC, 1.5-fold; FM1-43, 4.5-fold) were greater when FM1-43 was used to detect apoptosis. Initial testing of the optical sensing technology using Taxol-treated MCF-7 cells demonstrated that injection of FM1-43 resulted in a rapid, transient increase in fluorescence that was greater in apoptotic cells than in control cells (apoptotic cells, 4-fold increase; control cells, 2-fold increase). Using an established animal model, mice were injected with cyclophosphamide and hepatic apoptosis was assessed by imaging of PS expression. Both the amplitude of the fluorescence increase and the time taken for the amplitude to decay to half of its peak were increased in livers from animals treated with cyclophosphamide. Our optical sensing technology can be used to detect the early apoptotic response of cells to chemotherapeutic drugs both in vitro and in vivo. This novel technology represents a unique option for the imaging of tumor responses in vivo, and provides an inexpensive, specific system for the detection of early-stage apoptosis.

  10. Finite element model for MOI applications using A-V formulation

    NASA Astrophysics Data System (ADS)

    Xuan, L.; Shanker, B.; Udpa, L.; Shih, W.; Fitzpatrick, G.

    2001-04-01

    Magneto-optic imaging (MOI) is a relatively new sensor application that extends bubble memory technology to NDT to produce easy-to-interpret, real-time analog images. MOI systems use a magneto-optic (MO) sensor to produce analog images of magnetic flux leakage from surface and subsurface defects. The instrument's capability in detecting the relatively weak magnetic fields associated with subsurface defects depends on the sensitivity of the magneto-optic sensor. The availability of a theoretical model that can simulate the MOI system performance is extremely important for optimization of the MOI sensor and hardware system. A nodal finite element model based on a magnetic vector potential formulation has been developed for simulating the MOI phenomenon. This model has been used for predicting the magnetic fields in a simple test geometry with corrosion dome defects. In the case of test samples with multiple discontinuities, a more robust model using the magnetic vector potential Ā and electric scalar potential V is required. In this paper, a finite element model based on the A-V formulation is developed to model complex circumferential cracks under aluminum rivets in dimpled countersinks.

  11. Real-time, wide-area hyperspectral imaging sensors for standoff detection of explosives and chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Tazik, Shawna; Gardner, Charles W.; Nelson, Matthew P.

    2017-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the detection and analysis of targets located within complex backgrounds. HSI can detect threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Unfortunately, current generation HSI systems have size, weight, and power limitations that prohibit their use for field-portable and/or real-time applications. Current generation systems commonly provide an inefficient area search rate, require close proximity to the target for screening, and/or are not capable of making real-time measurements. ChemImage Sensor Systems (CISS) is developing a variety of real-time, wide-field hyperspectral imaging systems that utilize shortwave infrared (SWIR) absorption and Raman spectroscopy. SWIR HSI sensors provide wide-area imagery with at or near real time detection speeds. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors focusing on sensor design and detection results.

  12. Synthetic Aperture Radar (SAR) data processing

    NASA Technical Reports Server (NTRS)

    Beckner, F. L.; Ahr, H. A.; Ausherman, D. A.; Cutrona, L. J.; Francisco, S.; Harrison, R. E.; Heuser, J. S.; Jordan, R. L.; Justus, J.; Manning, B.

    1978-01-01

    The available and optimal methods for generating SAR imagery for NASA applications were identified. The SAR image quality and data processing requirements associated with these applications were studied. Mathematical operations and algorithms required to process sensor data into SAR imagery were defined. The architecture of SAR image formation processors was discussed, and technology necessary to implement the SAR data processors used in both general purpose and dedicated imaging systems was addressed.

  13. Research on multi-source image fusion technology in haze environment

    NASA Astrophysics Data System (ADS)

    Ma, GuoDong; Piao, Yan; Li, Bing

    2017-11-01

    In a haze environment, a visible image collected by a single sensor can express the shape, color, and texture details of the target very well, but because of the haze its sharpness is low and parts of the target subject are lost. Because it captures thermal radiation and has strong penetration ability, an infrared image collected by a single sensor can clearly express the target subject, but it loses detail information. Therefore, a multi-source image fusion method is proposed to exploit their respective advantages. Firstly, an improved Dark Channel Prior algorithm is used to preprocess the visible haze image. Secondly, an improved SURF algorithm is used to register the infrared image and the dehazed visible image. Finally, a weighted fusion algorithm based on information complementarity is used to fuse the images. Experiments show that the proposed method can improve the clarity of the visible target and highlight the occluded infrared target for target recognition.
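    The abstract does not specify the paper's improvements, but the standard Dark Channel Prior it builds on can be sketched as follows. The patch size, the omega weight, and the toy image are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def dark_channel(image, patch=3):
    """Per-pixel minimum over the color channels, then a local minimum
    filter over a patch x patch neighborhood (the 'dark channel')."""
    h, w, _ = image.shape
    min_rgb = image.min(axis=2)
    pad = patch // 2
    padded = np.pad(min_rgb, pad, mode='edge')
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def estimate_transmission(image, airlight, omega=0.95, patch=3):
    """He et al.-style transmission map: t = 1 - omega * dark_channel(I / A)."""
    return 1.0 - omega * dark_channel(image / airlight, patch)

# Toy 4 x 4 'hazy' image with channel values pushed toward the airlight.
rng = np.random.default_rng(0)
img = rng.uniform(0.4, 0.9, size=(4, 4, 3))
A = np.array([0.95, 0.95, 0.95])
t = estimate_transmission(img, A)   # values in (0, 1): lower = more haze
```

    The dehazed radiance is then recovered per pixel as J = (I - A) / max(t, t0) + A, with t0 a small floor to avoid amplifying noise where the transmission estimate is near zero.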

  14. Relating transverse ray error and light fields in plenoptic camera images

    NASA Astrophysics Data System (ADS)

    Schwiegerling, Jim; Tyo, J. Scott

    2013-09-01

    Plenoptic cameras have emerged in recent years as a technology for capturing light field data in a single snapshot. A conventional digital camera can be modified with the addition of a lenslet array to create a plenoptic camera. The camera image is focused onto the lenslet array. The lenslet array is placed over the camera sensor such that each lenslet forms an image of the exit pupil onto the sensor. The resultant image is an array of circular exit pupil images, each corresponding to the overlying lenslet. The position of the lenslet encodes the spatial information of the scene, whereas the sensor pixels encode the angular information for light incident on the lenslet. The 4D light field is therefore described by the 2D spatial information and 2D angular information captured by the plenoptic camera. In aberration theory, the transverse ray error relates the pupil coordinates of a given ray to its deviation from the ideal image point in the image plane and is consequently a 4D function as well. We demonstrate a technique for modifying the traditional transverse ray error equations to recover the 4D light field of a general scene. In the case of a well-corrected optical system, this light field is easily related to the depth of various objects in the scene. Finally, the effects of sampling with both the lenslet array and the camera sensor on the 4D light field data are analyzed to illustrate the limitations of such systems.
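    The 2D-to-4D decoding described above can be sketched with array reshapes, assuming an idealized layout in which each K × K block of sensor pixels sits exactly under one lenslet (all sizes here are illustrative, not from the paper):

```python
import numpy as np

# Idealized plenoptic layout: (s, t) index the lenslet (spatial
# coordinates); (u, v) index the pixel within a K x K block under that
# lenslet (angular coordinates).
K = 4            # pixels per lenslet, per axis
S, T = 6, 8      # lenslet grid size
raw = np.arange(S * K * T * K, dtype=float).reshape(S * K, T * K)

# Decode the 4D light field L[s, t, u, v] from the 2D raw sensor image.
L = raw.reshape(S, K, T, K).transpose(0, 2, 1, 3)

# A sub-aperture image (one fixed angular sample) takes one pixel per lenslet.
sub_aperture = L[:, :, 0, 0]
print(sub_aperture.shape)  # (6, 8)
```

    Real cameras also need vignetting correction and sub-pixel alignment of the lenslet grid, which is precisely the sampling limitation the abstract's final sentence refers to.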

  15. Multi-sensor millimeter-wave system for hidden objects detection by non-collaborative screening

    NASA Astrophysics Data System (ADS)

    Zouaoui, Rhalem; Czarny, Romain; Diaz, Frédéric; Khy, Antoine; Lamarque, Thierry

    2011-05-01

    In this work, we present the development of a multi-sensor system for the detection of objects concealed under clothes using passive and active millimeter-wave (mmW) technologies. This study concerns both the optimization of a commercial passive mmW imager at 94 GHz using a phase mask and the development of an active mmW detector at 77 GHz based on synthetic aperture radar (SAR). A first wide-field inspection is done by the passive imager while the person is walking. If a suspicious area is detected, the active imager is switched on and focused on this area in order to obtain more accurate data (shape of the object, nature of the material, etc.).

  16. Atmospheric aerosol measurements by employing a polarization scheimpflug lidar system

    NASA Astrophysics Data System (ADS)

    Mei, Liang; Guan, Peng; Yang, Yang

    2018-04-01

    A polarization lidar system based on the Scheimpflug principle has been developed at the Dalian University of Technology (DLUT), Dalian, China, employing a compact 808-nm multimode high-power laser diode and two highly integrated CMOS sensors. The parallel- and orthogonal-polarized backscattering signals are recorded by two 45-degree tilted image sensors, respectively. Atmospheric particle measurements were carried out with this polarization Scheimpflug lidar system.

  17. Neural Network Substorm Identification: Enabling TREx Sensor Web Modes

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Spanswick, E.; Arnason, K. M.; Donovan, E.; Liang, J.; Ahmad, S.; Jackel, B. J.

    2017-12-01

    Transition Region Explorer (TREx) is a ground-based sensor web of optical and radio instruments that is presently being deployed across central Canada. The project consists of an array of co-located blue-line, full-colour, and near-infrared all-sky imagers, imaging riometers, proton aurora spectrographs, and GNSS systems. A key goal of the TREx project is to create the world's first (artificially) intelligent sensor web for remote sensing of space weather. The sensor web will autonomously control and coordinate instrument operations in real time. To accomplish this, we will use real-time in-line analytics of TREx and other data to dynamically switch between operational modes. An operating mode could be, for example, to have a blue-line imager gather data at a cadence one or two orders of magnitude higher than its `baseline' mode. The decision to increase the imaging cadence would be made in response to an anticipated increase in auroral activity or other programmatic requirements. Our first test for TREx's sensor web technologies is to develop the capacity to autonomously alter the TREx operating mode prior to a substorm expansion phase onset. In this paper, we present our neural network analysis of historical optical and riometer data and our ability to predict an optical onset. We explore preliminary insights into using a neural network to pick out trends and features that it deems similar among substorms.

  18. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    PubMed Central

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-01-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision. PMID:27892454
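    The f/0.9 and 850 nm figures quoted above set the scale of "nearly diffraction-limited" performance: the Airy disk diameter 2.44 λN comes out under 2 µm. A quick check (the formula is standard optics, not taken from the paper):

```python
# Diffraction-limited spot for the doublet described above (f/0.9 at 850 nm).
# Airy disk diameter = 2.44 * wavelength * f-number.
wavelength_um = 0.850
f_number = 0.9
airy_diameter_um = 2.44 * wavelength_um * f_number
print(round(airy_diameter_um, 2))  # 1.87 micrometers
```

    A spot of roughly 1.9 µm is comparable to a small image-sensor pixel, which is why the camera can approach the diffraction limit without additional optics.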

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  20. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    NASA Astrophysics Data System (ADS)

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-11-01

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  1. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    NASA Astrophysics Data System (ADS)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested to develop inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs), in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products, and space technology. This is mainly because this radiation has high penetration and good navigability through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse materials. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low and the scattering is also low compared to NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced in research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD lamp-based Focal Plane Arrays (FPAs). The three cameras differ in the number of detectors, scanning operation, and detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively, both for direct detection and limited to fixed imaging. The latest sensor is a multiplexed 16 × 16 GDD FPA. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The principle of detection in this sensor is a frequency modulated continuous wave (FMCW) system in which each of the 16 GDD pixel lines is sampled simultaneously. Direct detection is also possible and can be done with a user-friendly interface. This FPA sensor is built from 256 commercial GDD lamps (International Light, Inc., Peabody, MA, model 527 Ne indicator lamps, 3 mm diameter) as pixel detectors. All three sensors are fully supported by a software Graphical User Interface (GUI). They were tested and characterized through different kinds of optical systems for imaging applications, super resolution, and calibration methods. The 16 × 16 sensor can employ a chirp-radar-like method to produce depth and reflectance information in the image. This enables 3-D MMW imaging in real time at video frame rate. In this work we demonstrate different kinds of optical imaging systems, with 3-D imaging capability at short range and at longer distances of at least 10-20 meters.
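    The FMCW principle behind the depth imaging above maps target range to a beat frequency through the chirp slope B/T. A sketch with illustrative chirp parameters (the paper's actual sweep values are not given in the abstract):

```python
C = 3.0e8  # speed of light, m/s

def beat_frequency(range_m, bandwidth_hz, sweep_s):
    """FMCW beat frequency f_b = 2 R B / (c T): the round-trip delay
    2R/c multiplied by the chirp slope B/T."""
    return 2.0 * range_m * bandwidth_hz / (C * sweep_s)

def range_from_beat(f_beat_hz, bandwidth_hz, sweep_s):
    """Invert the relation to recover target range from a measured beat."""
    return f_beat_hz * C * sweep_s / (2.0 * bandwidth_hz)

# Illustrative chirp (not the paper's): 10 GHz swept in 1 ms, target at 10 m.
fb = beat_frequency(10.0, 10e9, 1e-3)
print(fb)                                        # about 6.7e5 Hz
print(round(range_from_beat(fb, 10e9, 1e-3), 6)) # 10.0 m
```

    Sampling all 16 GDD pixel lines simultaneously means each line's beat spectrum yields a range profile, which is what allows the array to build a 3-D image at video rate.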

  2. Photonic hydrogel sensors.

    PubMed

    Yetisen, Ali K; Butt, Haider; Volpatti, Lisa R; Pavlichenko, Ida; Humar, Matjaž; Kwok, Sheldon J J; Koo, Heebeom; Kim, Ki Su; Naydenova, Izabela; Khademhosseini, Ali; Hahn, Sei Kwang; Yun, Seok Hyun

    2016-01-01

    Analyte-sensitive hydrogels that incorporate optical structures have emerged as sensing platforms for point-of-care diagnostics. The optical properties of the hydrogel sensors can be rationally designed and fabricated through self-assembly, microfabrication or laser writing. The advantages of photonic hydrogel sensors over conventional assay formats include label-free, quantitative, reusable, and continuous measurement capability that can be integrated with equipment-free text or image display. This Review explains the operation principles of photonic hydrogel sensors, presents syntheses of stimuli-responsive polymers, and provides an overview of qualitative and quantitative readout technologies. Applications in clinical samples are discussed, and potential future directions are identified. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Hydra Rendezvous and Docking Sensor

    NASA Technical Reports Server (NTRS)

    Roe, Fred; Carrington, Connie

    2007-01-01

    The U.S. technology to support a CEV AR&D activity is mature and was developed by NASA and supporting industry during an extensive research and development program conducted during the 1990s and early 2000s at the Marshall Space Flight Center. Development and demonstration of a rendezvous/docking sensor was identified early in the AR&D Program as the critical enabling technology that allows automated proximity operations and docking. A first-generation rendezvous/docking sensor, the Video Guidance Sensor (VGS), was developed and successfully flown on STS-87 and again on STS-95, proving the concept of a video-based sensor. Advances in both video and signal processing technologies and the lessons learned from the two successful flight experiments provided a baseline for the development of a new generation of video-based rendezvous/docking sensor. The Advanced Video Guidance Sensor (AVGS) has greatly increased performance and additional capability for longer-range operation. A Demonstration of Autonomous Rendezvous Technology (DART) flight experiment was flown in April 2005 using AVGS as the primary proximity operations sensor. Because of the absence of a docking mechanism on the target satellite, this mission did not demonstrate the ability of the sensor to control docking. Mission results indicate that the rendezvous sensor operated successfully in "spot mode" (2 km acquisition of the target, bearing data only) but was never commanded to "acquire and track" the docking target. Parts obsolescence issues prevent the construction of current-design AVGS units to support the NASA Exploration initiative. This flight-proven AR&D technology is being modularized and upgraded with additional capabilities through the Hydra project at the Marshall Space Flight Center. Hydra brings a unique engineering approach and sensor architecture to the table to solve the continuing issues of parts obsolescence and multiple sensor integration. This paper presents an approach to sensor hardware trades to address the needs of future vehicles that may rendezvous and dock with the International Space Station (ISS). It also discusses approaches for upgrading AVGS to address parts obsolescence, and concepts for modularizing the sensor to provide configuration flexibility for multiple vehicle applications. Options for complementary sensors to be integrated into the multi-head Hydra system are also presented. Complementary sensor options include ULTOR, a digital image correlator system that could provide relative six-degree-of-freedom information independently from AVGS, and time-of-flight sensors, which determine the range between vehicles by timing pulses that travel from the sensor to the target and back. Common targets and integrated targets, suitable for use with the multi-sensor options in Hydra, are also addressed.

  4. Development of a fusion approach selection tool

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Zeng, Y.

    2015-06-01

    During the last decades, the number and quality of remote sensing satellite sensors available for Earth observation have grown significantly. The amount of available multi-sensor imagery, along with its increased spatial and spectral resolution, presents new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST), the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means to produce images containing information that is not inherent in any single image alone. In the meantime, the user has access to sophisticated commercial image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, to say nothing of selecting the appropriate images, resolution, and bands. Image fusion can be a machine- and time-consuming endeavour. In addition, it requires knowledge of remote sensing, image fusion, digital image processing, and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for the available images, application parameters, and desired information, and process this input to produce a workflow that quickly obtains the best results. It will optimize data and image fusion techniques, and it provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to use advanced processing methods to maximize the benefit of multi-sensor image exploitation.

  5. The Earthscan Industry.

    ERIC Educational Resources Information Center

    Aviation/Space, 1982

    1982-01-01

    Spurred by National Aeronautics and Space Administration (NASA) technological advances, a budding industry is manufacturing equipment and providing services toward better management of earth's resources. Topics discussed include image processing, multispectral photography, ground use sensor, and weather data receiver. (Author/JN)

  6. New radiological material detection technologies for nuclear forensics: Remote optical imaging and graphene-based sensors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Richard Karl; Martin, Jeffrey B.; Wiemann, Dora K.

    We developed new detector technologies to identify the presence of radioactive materials for nuclear forensics applications. First, we investigated an optical radiation detection technique based on imaging nitrogen fluorescence excited by ionizing radiation. We demonstrated optical detection in air under indoor and outdoor conditions for alpha particles and gamma radiation at distances up to 75 meters. We also contributed to the development of next generation systems and concepts that could enable remote detection at distances greater than 1 km, and originated a concept that could enable daytime operation of the technique. A second area of research was the development of room-temperature graphene-based sensors for radiation detection and measurement. In this project, we observed tunable optical and charged particle detection, and developed improved devices. With further development, the advancements described in this report could enable new capabilities for nuclear forensics applications.

  7. High-Speed Scanning Interferometer Using CMOS Image Sensor and FPGA Based on Multifrequency Phase-Tracking Detection

    NASA Technical Reports Server (NTRS)

    Ohara, Tetsuo

    2012-01-01

    A sub-aperture stitching optical interferometer can provide a cost-effective solution for an in situ metrology tool for large optics; however, the currently available technologies are not suitable for high-speed, real-time continuous scanning. NanoWave's SPPE (Scanning Probe Position Encoder) has been proven to exhibit excellent stability and sub-nanometer precision with a large dynamic range. This same technology can transform many optical interferometers into real-time sub-nanometer precision tools with only minor modification. The proposed field-programmable gate array (FPGA) signal processing concept, coupled with a new-generation, high-speed, mega-pixel CMOS (complementary metal-oxide semiconductor) image sensor, enables high-speed (>1 m/s), real-time continuous surface profiling that is insensitive to variation of pixel sensitivity and/or optical transmission/reflection. This is especially useful for large-optics surface profiling.

  8. A portfolio of products from the rapid terrain visualization interferometric SAR

    NASA Astrophysics Data System (ADS)

    Bickel, Douglas L.; Doerry, Armin W.

    2007-04-01

    The Rapid Terrain Visualization interferometric synthetic aperture radar was designed and built at Sandia National Laboratories as part of an Advanced Concept Technology Demonstration (ACTD) to "demonstrate the technologies and infrastructure to meet the Army requirement for rapid generation of digital topographic data to support emerging crisis or contingencies." This sensor was built by Sandia National Laboratories for the Joint Programs Sustainment and Development (JPSD) Project Office to provide highly accurate digital elevation models (DEMs) for military and civilian customers, both inside and outside of the United States. The sensor achieved better than HRTe Level IV position accuracy in near real-time. The system was flown on a deHavilland DHC-7 Army aircraft. This paper presents a collection of images and data products from the Rapid Terrain Visualization interferometric synthetic aperture radar. The imagery includes orthorectified images and DEMs from the RTV interferometric SAR radar.

  9. A wireless narrowband imaging chip for capsule endoscope.

    PubMed

    Lan-Rong Dung; Yin-Yi Wu

    2010-12-01

    This paper presents a dual-mode capsule gastrointestinal endoscope device. An endoscope combined with narrowband imaging (NBI) has been shown to be a superior diagnostic tool for the detection of early-stage tissue neoplasms. Nevertheless, a wireless capsule endoscope with narrowband imaging technology has not yet been available on the market. Narrowband image acquisition and power dissipation reduction are the main challenges of an NBI capsule endoscope. In this paper, we present the first narrowband imaging capsule endoscope that can assist clinical doctors in effectively diagnosing early gastrointestinal cancers, enabled by our dedicated dual-mode complementary metal-oxide semiconductor (CMOS) sensor. The dedicated dual-mode CMOS sensor can offer both white-light and narrowband images. Implementation results show that the proposed 512 × 512 CMOS sensor consumes only 2 mA from a 3-V power supply. The average current of the NBI capsule with an 8-Mb/s RF transmitter is nearly 7 ~ 8 mA, allowing it to work continuously for 6 ~ 8 h on two 1.5-V 80-mAh button batteries at a frame rate of 2 fps. Experimental results on the backside mucosa of a human tongue and on a pig's small intestine showed that the wireless NBI capsule endoscope can significantly improve image quality compared with a commercial off-the-shelf capsule endoscope for gastrointestinal tract diagnosis.

  10. Research on auto-calibration technology of the image plane's center of 360-degree and all round looking camera

    NASA Astrophysics Data System (ADS)

    Zhang, Shaojun; Xu, Xiping

    2015-10-01

    The 360-degree all-round-looking camera, because it is well suited to automatic analysis and judgment of the carrier's ambient environment by image recognition algorithms, is usually applied in the opto-electronic radar of robots and smart cars. In order to ensure the stability and consistency of image processing results in mass production, the centers of the image planes of different cameras must coincide, which requires calibrating the position of the image plane's center. The traditional mechanical calibration method and the electronic adjustment mode of entering offsets manually both suffer from reliance on human eyes, inefficiency, and a wide error distribution. In this paper, an approach for auto-calibration of the image plane of this camera is presented. The 360-degree all-round-looking camera produces a ring-shaped image bounded by two concentric circles: the inner boundary is a smaller circle and the outer boundary is a bigger circle. The technique exploits exactly these characteristics. By recognizing the two circles through a Hough transform algorithm and calculating the center position, we obtain the accurate center of the image, that is, the deviation between the optical axis and the center of the image sensor. The program then configures the image sensor chip over the I2C bus automatically, so the center of the image plane can be adjusted automatically and accurately. The technique has been applied in practice; it improves productivity and guarantees consistent product quality.
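    The center-finding step can be illustrated with a simplified stand-in: the abstract uses a Hough transform, but once edge points on the ring boundary are segmented, an algebraic least-squares circle fit (the Kåsa method, used here purely for illustration and not taken from the paper) recovers the same center:

    ```python
    import numpy as np

    def fit_circle(xs, ys):
        # Algebraic (Kasa) least-squares circle fit:
        # model x^2 + y^2 = a*x + b*y + c and solve for (a, b, c);
        # the center is (a/2, b/2) and radius sqrt(c + cx^2 + cy^2).
        A = np.column_stack([xs, ys, np.ones_like(xs)])
        rhs = xs**2 + ys**2
        a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
        cx, cy = a / 2.0, b / 2.0
        r = np.sqrt(c + cx**2 + cy**2)
        return cx, cy, r
    ```

    Fitting both the inner and the outer boundary and comparing the two centers gives the offset between the optical axis and the sensor center that the calibration routine writes back over I2C.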

  11. High performance thermal imaging for the 21st century

    NASA Astrophysics Data System (ADS)

    Clarke, David J.; Knowles, Peter

    2003-01-01

    In recent years IR detector technology has developed significantly from the early short linear arrays. Such devices require high-performance signal processing electronics to meet today's thermal imaging requirements for military and para-military applications. This paper describes BAE SYSTEMS Avionics Group's Sensor Integrated Modular Architecture thermal imager, which has been developed alongside the group's Eagle 640×512 arrays to provide high-performance imaging capability. The electronics architecture also supports High Definition TV format 2D arrays for future growth capability.

  12. Spatial noise in microdisplays for near-to-eye applications

    NASA Astrophysics Data System (ADS)

    Hastings, Arthur R., Jr.; Draper, Russell S.; Wood, Michael V.; Fellowes, David A.

    2011-06-01

    Spatial noise in imaging systems has been characterized and its impact on image quality metrics has been addressed primarily with respect to the introduction of this noise at the sensor component. However, sensor fixed pattern noise is not the only source of fixed pattern noise in an imaging system. Display fixed pattern noise cannot be easily mitigated in processing and, therefore, must be addressed. In this paper, a thorough examination of the amount and the effect of display fixed pattern noise is presented. The specific manifestation of display fixed pattern noise is dependent upon the display technology. Utilizing a calibrated camera, US Army RDECOM CERDEC NVESD has developed a microdisplay (μdisplay) spatial noise data collection capability. Noise and signal power spectra were used to characterize the display signal to noise ratio (SNR) as a function of spatial frequency analogous to the minimum resolvable temperature difference (MRTD) of a thermal sensor. The goal of this study is to establish a measurement technique to characterize μdisplay limiting performance to assist in proper imaging system specification.

  13. A high sensitivity 20Mfps CMOS image sensor with readout speed of 1Tpixel/sec for visualization of ultra-high speed phenomena

    NASA Astrophysics Data System (ADS)

    Kuroda, R.; Sugawa, S.

    2017-02-01

    Ultra-high speed (UHS) CMOS image sensors with on-chip analog memories placed on the periphery of the pixel array for the visualization of UHS phenomena are overviewed in this paper. The developed UHS CMOS image sensors consist of 400H×256V pixels with 128 memories/pixel, and a readout speed of 1 Tpixel/sec is obtained, enabling 10 Mfps full-resolution video capture of 128 consecutive frames and 20 Mfps half-resolution video capture of 256 consecutive frames. The first development model was employed in a high-speed video camera and put into practical use in 2012. Through the development of dedicated process technologies, photosensitivity improvement and power consumption reduction were achieved simultaneously, and the improved version has been used since 2015 in a commercialized high-speed video camera that offers 10 Mfps with ISO 16,000 photosensitivity. Due to the improved photosensitivity, clear images can be captured and analyzed even under low-light conditions, such as under a microscope, as well as in the capture of UHS light emission phenomena.

  14. The fast and accurate 3D-face scanning technology based on laser triangle sensors

    NASA Astrophysics Data System (ADS)

    Wang, Jinjiang; Chang, Tianyu; Ge, Baozhen; Tian, Qingguo; Chen, Yang; Kong, Bin

    2013-08-01

    A laser triangle scanning method and the structure of a 3D-face measurement system are introduced. In the presented system, a line laser source was selected as the optical indicator so that one line is scanned at a time. A CCD image sensor was used to capture the image of the laser line modulated by the human face. The system parameters were obtained by calibration: the lens parameters of the imaging part were calibrated with a machine vision method, and the triangle structure parameters were calibrated with parallel-arranged fine wires. The CCD imaging part and the line laser indicator were mounted on a linear motor carriage, which scans the laser line from the top of the head to the neck. Because the nose protrudes and the eyes are recessed, a single CCD image sensor cannot capture a complete image of the laser line. In this system, two CCD image sensors were therefore placed symmetrically on the two sides of the laser indicator; in effect, this structure comprises two laser triangulation measurement units. Another novel design point is that three laser indicators were arranged in order to reduce the scanning time, since it is difficult for a person to remain still for a long time. The 3D data were calculated after scanning, and further data processing includes 3D coordinate refinement, mesh generation, and surface rendering. Experiments show that the system has a simple structure, high scanning speed, and good accuracy. The scanning range covers the whole head of an adult, and the typical resolution is 0.5 mm.
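    In the simplest pinhole model of a laser triangulation unit like the two used here, depth follows z = f·b/d, where f is the focal length, b the baseline between laser and camera, and d the lateral displacement of the laser line on the sensor. A minimal sketch (the function name and numbers are illustrative assumptions, not the paper's calibrated model):

    ```python
    def triangulation_depth(f_mm, baseline_mm, displacement_mm):
        """Depth from idealized pinhole laser triangulation: z = f * b / d.

        f_mm: camera focal length; baseline_mm: laser-to-camera baseline;
        displacement_mm: lateral shift of the laser line on the sensor.
        """
        if displacement_mm <= 0:
            raise ValueError("displacement must be positive")
        return f_mm * baseline_mm / displacement_mm
    ```

    The inverse relationship is the reason a single sensor fails on the nose and eyes: large local depth changes move the line far across (or out of) one camera's view, which the symmetric second sensor recovers.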

  15. JPRS Report, Science & Technology, Japan, 27th Aircraft Symposium

    DTIC Science & Technology

    1990-10-29

    screen; the relative attitude is then determined. 2) Video Sensor System Specific patterns (grapple target, etc.) drawn on the target spacecraft, or the...entire target spacecraft, is imaged by camera. Navigation information is obtained by on-board image processing, such as extraction of contours and...standard figure called "grapple target" located in the vicinity of the grapple fixture on the target spacecraft is imaged by camera. Contour lines and

  16. Modelling the influence of noise of the image sensor for blood cells recognition in computer microscopy

    NASA Astrophysics Data System (ADS)

    Nikitaev, V. G.; Nagornov, O. V.; Pronichev, A. N.; Polyakov, E. V.; Dmitrieva, V. V.

    2017-12-01

    The first stage in the diagnosis of blood cancer is the analysis of blood smears. The application of decision-support systems would reduce the subjectivity of the diagnostic process and help avoid errors that often result in irreversible changes in the patient's condition. Solving this problem therefore requires the use of modern technology. Texture features are one tool for the programmatic classification of blood cells, and the task of finding the informative ones among them is promising. The paper investigates the effect of image sensor noise on informative texture features using methods of mathematical modelling.

  17. Utility of BRDF Models for Estimating Optimal View Angles in Classification of Remotely Sensed Images

    NASA Technical Reports Server (NTRS)

    Valdez, P. F.; Donohoe, G. W.

    1997-01-01

    Statistical classification of remotely sensed images attempts to discriminate between surface cover types on the basis of the spectral response recorded by a sensor. It is well known that surfaces reflect incident radiation as a function of wavelength, producing a spectral signature specific to the material under investigation. Multispectral and hyperspectral sensors sample the spectral response over tens and even hundreds of wavelength bands to capture the variation of spectral response with wavelength. Classification algorithms then exploit these differences in spectral response to distinguish between materials of interest. Sensors of this type, however, collect detailed spectral information from one direction (usually nadir) and consequently do not consider the directional nature of reflectance potentially detectable at different sensor view angles. Improvements in sensor technology have resulted in remote sensing platforms capable of detecting reflected energy across wavelengths (spectral signatures) and from multiple view angles (angular signatures) in the fore and aft directions. Sensors of this type include the moderate resolution imaging spectroradiometer (MODIS), the multiangle imaging spectroradiometer (MISR), and the airborne solid-state array spectroradiometer (ASAS). A goal of this paper, then, is to explore the utility of Bidirectional Reflectance Distribution Function (BRDF) models in the selection of optimal view angles for the classification of remotely sensed images by employing a strategy of searching for the maximum difference between surface BRDFs. After a brief discussion of directional reflectance in Section 2, attention is directed to the Beard-Maxwell BRDF model and its use in predicting the bidirectional reflectance of a surface. The selection of optimal viewing angles is addressed in Section 3, followed by conclusions and future work in Section 4.

  18. Ultrahigh sensitivity endoscopic camera using a new CMOS image sensor: providing with clear images under low illumination in addition to fluorescent images.

    PubMed

    Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio

    2014-11-01

    We developed a new ultrahigh-sensitive CMOS camera using a specific sensor that has a wide range of spectral sensitivity characteristics. The objective of this study is to present our updated endoscopic technology that has successfully integrated two innovative functions; ultrasensitive imaging as well as advanced fluorescent viewing. Two different experiments were conducted. One was carried out to evaluate the function of the ultrahigh-sensitive camera. The other was to test the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscopic tip to the target was varied and those endoscopic images in each setting were taken for further comparison. In the first experiment, the 3-CCD camera failed to display the clear images under low illumination, and the target was hardly seen. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, imaging quality given by both cameras was quite alike. In the second experiment as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescent-activated organs. The ultrahigh sensitivity CMOS HD endoscopic camera is expected to provide us with clear images under low illumination in addition to the fluorescent images under high illumination in the field of laparoscopic surgery.

  19. Advances in time-of-flight PET

    PubMed Central

    Surti, Suleman; Karp, Joel S.

    2016-01-01

    This paper provides a review and an update on time-of-flight PET imaging with a focus on PET instrumentation, ranging from hardware design to software algorithms. We first present a short introduction to PET, followed by a description of TOF PET imaging and its history from the early days. Next, we introduce the current state-of-art in TOF PET technology and briefly summarize the benefits of TOF PET imaging. This is followed by a discussion of the various technological advancements in hardware (scintillators, photo-sensors, electronics) and software (image reconstruction) that have led to the current widespread use of TOF PET technology, and future developments that have the potential for further improvements in the TOF imaging performance. We conclude with a discussion of some new research areas that have opened up in PET imaging as a result of having good system timing resolution, ranging from new algorithms for attenuation correction, through efficient system calibration techniques, to potential for new PET system designs. PMID:26778577

  20. Uncooled Terahertz real-time imaging 2D arrays developed at LETI: present status and perspectives

    NASA Astrophysics Data System (ADS)

    Simoens, François; Meilhan, Jérôme; Dussopt, Laurent; Nicolas, Jean-Alain; Monnier, Nicolas; Sicard, Gilles; Siligaris, Alexandre; Hiberty, Bruno

    2017-05-01

    As for other imaging sensor markets, whatever the technology, the commercial spread of terahertz (THz) cameras has to fulfil simultaneously the criteria of high sensitivity and low cost and SWAP (size, weight and power). Monolithic silicon-based 2D sensors integrated in uncooled THz real-time cameras are good candidates to meet these requirements. Over the past decade, LETI has been studying and developing such arrays with two complementary technological approaches, i.e. antenna-coupled silicon bolometers and CMOS Field Effect Transistors (FET), both being compatible with standard silicon microelectronics processes. LETI has leveraged its know-how in thermal infrared bolometer sensors to develop a proprietary architecture for THz sensing. High technological maturity has been achieved, as illustrated by the demonstration of fast scanning of a large field of view and the recent launch of a commercial camera. In the FET-based THz field, recent work has focused on innovative CMOS read-out integrated circuit designs. The studied architectures take advantage of the large pixel pitch to enhance flexibility and sensitivity: an embedded, in-pixel, configurable signal processing chain dramatically reduces the noise. Video sequences at 100 frames per second using our 31x31-pixel 2D Focal Plane Arrays (FPA) have been achieved. The authors describe the present status of these developments, and perspectives on performance evolution are discussed. Several experimental imaging tests are also presented in order to illustrate the capability of these arrays to address industrial applications such as non-destructive testing (NDT), security, and quality control of food.

  1. Continued Development of Meandering Winding Magnetometer (MWM (Register Trademark)) Eddy Current Sensors for the Health Monitoring, Modeling and Damage Detection of Composite Overwrapped Pressure Vessels

    NASA Technical Reports Server (NTRS)

    Russell, Richard; Wincheski, Russell; Jablonski, David; Washabaugh, Andy; Sheiretov, Yanko; Martin, Christopher; Goldfine, Neil

    2011-01-01

    Composite Overwrapped Pressure Vessels (COPVs) are used in essentially all NASA spacecraft, launch vehicles and payloads to contain high-pressure fluids for propulsion, life support systems and science experiments. Failure of any COPV, either in flight or during ground processing, would result in catastrophic damage to the spacecraft or payload, and could lead to loss of life. Therefore, NASA continues to investigate new methods for nondestructive evaluation (NDE) of COPVs for structural anomalies and to provide a means for in-situ structural health monitoring (SHM) during operational service. Partnering with JENTEK Sensors, engineers at NASA Kennedy Space Center have successfully conducted a proof-of-concept study to develop Meandering Winding Magnetometer (MWM) eddy current sensors designed to make direct measurements of the stresses in the internal layers of a carbon fiber composite wrapped COPV. During this study three different MWM sensors were tested at three orientations to demonstrate the ability of the technology to measure stresses at various fiber orientations and depths. These results showed good correlation with actual surface strain gage measurements. MWM-Array technology for scanning COPVs can reliably be used to image and detect mechanical damage. To validate this conclusion, several COPVs were scanned to obtain a baseline, then each COPV was impacted at varying energy levels and rescanned. The baseline-subtracted images were used to demonstrate damage detection. These scans were performed with two different MWM-Arrays with different geometries for near-surface and deeper-penetration imaging, at multiple frequencies and in multiple orientations of the linear MWM drive.
This presentation will include a review of micromechanical models that relate measured sensor responses to composite material constituent properties, validated by the proof-of-concept study, as the basis for SHM and NDE data analysis, as well as potential improvements, including design changes to miniaturize the sensors and make them durable in the vacuum of space.

  2. 10000 pixels wide CMOS frame imager for earth observation from a HALE UAV

    NASA Astrophysics Data System (ADS)

    Delauré, B.; Livens, S.; Everaerts, J.; Kleihorst, R.; Schippers, Gert; de Wit, Yannick; Compiet, John; Banachowicz, Bartosz

    2009-09-01

    MEDUSA is a lightweight high-resolution camera designed to be operated from a solar-powered Unmanned Aerial Vehicle (UAV) flying at stratospheric altitudes. The instrument is a technology demonstrator within the Pegasus program and targets applications such as crisis management and cartography. A special wide-swath CMOS imager has been developed by Cypress Semiconductor Corporation Belgium to meet the specific sensor requirements of MEDUSA. The CMOS sensor has a stitched design comprising a panchromatic and a color sensor on the same die. Each sensor consists of 10000*1200 square pixels (5.5 μm size, novel 6T architecture) with micro-lenses. Exposure is performed by means of a high-efficiency snapshot shutter. The sensor is able to operate at 30 fps in full-frame readout. Due to a novel pixel design, the sensor has low dark leakage of the memory elements (PSNL) and low parasitic light sensitivity (PLS), while still maintaining a relatively high QE (quantum efficiency) and a fill factor (FF) of over 65%. It features an MTF (Modulation Transfer Function) higher than 60% at the Nyquist frequency in both X and Y directions. The measured optical/electrical crosstalk (expressed as MTF) of this 5.5 μm pixel is state-of-the-art. These properties make it possible to acquire sharp images even in low-light conditions.

  3. Display challenges resulting from the use of wide field of view imaging devices

    NASA Astrophysics Data System (ADS)

    Petty, Gregory J.; Fulton, Jack; Nicholson, Gail; Seals, Ean

    2012-06-01

    As focal plane array technologies advance and imagers increase in resolution, display technology must outpace the imaging improvements in order to adequately represent the complete data collection. Typical display devices tend to have an aspect ratio near 4:3 or 16:9; however, a breed of Wide Field of View (WFOV) imaging devices exists that deviates from the norm, with aspect ratios as high as 5:1. This quality, when coupled with high spatial resolution, presents a unique challenge for display devices. A standard display device must choose between resizing the image data to fit the display and displaying the image data at native resolution, truncating potentially important information. The problem compounds when considering the applications: WFOV high-situational-awareness imagers are sought for space-limited military vehicles. Tradeoffs between these issues are assessed with respect to the image quality of the WFOV sensor.

  4. A mobile unit for memory retrieval in daily life based on image and sensor processing

    NASA Astrophysics Data System (ADS)

    Takesumi, Ryuji; Ueda, Yasuhiro; Nakanishi, Hidenobu; Nakamura, Atsuyoshi; Kakimori, Nobuaki

    2003-10-01

    We developed a Mobile Unit whose purpose is to support memory retrieval in daily life. In this paper, we describe two characteristic capabilities of this unit: (1) behavior classification with an acceleration sensor, and (2) extraction of environmental differences with image processing technology. In (1), by analyzing the power and frequency of an acceleration sensor oriented along the direction of gravity, the user's activities can be classified into walking, staying, and so on. In (2), by extracting the difference between the beginning scene and the ending scene of a stay with image processing, what the user has done is recognized as a change in the environment. Using these two techniques, specific scenes of daily life can be extracted, and important information at scene changes can be recorded. In particular, we describe the effectiveness in supporting retrieval of important things, such as an item left behind or the state of unfinished work.

  5. A New Test Method of Circuit Breaker Spring Telescopic Characteristics Based Image Processing

    NASA Astrophysics Data System (ADS)

    Huang, Huimin; Wang, Feifeng; Lu, Yufeng; Xia, Xiaofei; Su, Yi

    2018-06-01

    This paper applies computer vision technology to the fatigue condition monitoring of springs, proposing a new test method for the telescopic characteristics of circuit breaker operating mechanism springs based on image processing. A high-speed camera captures image sequences of the spring's movement as the high-voltage circuit breaker operates. An image-matching method is then used to obtain the deformation-time curve and speed-time curve, from which the spring expansion and deformation parameters are extracted, laying a foundation for subsequent spring force analysis and matching-state evaluation. Simulation tests performed at the experimental site show that this image analysis method can avoid the complex installation problems of traditional mechanical sensors while supporting online monitoring and status assessment of the circuit breaker spring.
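    The image-matching step that yields the deformation-time curve can be sketched with plain normalized cross-correlation: cut a template around a spring feature in the first frame, find its best match in each later frame, and record the displacement. The function below is an illustrative stand-in (the paper does not specify its matching algorithm), using a brute-force search suitable only for small images:

    ```python
    import numpy as np

    def match_template(frame, template):
        # Brute-force normalized cross-correlation.
        # Returns the (row, col) of the best-matching window in `frame`.
        th, tw = template.shape
        t = template - template.mean()
        best, best_pos = -np.inf, (0, 0)
        for r in range(frame.shape[0] - th + 1):
            for c in range(frame.shape[1] - tw + 1):
                w = frame[r:r + th, c:c + tw]
                wz = w - w.mean()
                denom = np.sqrt((wz**2).sum() * (t**2).sum())
                score = (wz * t).sum() / denom if denom > 0 else 0.0
                if score > best:
                    best, best_pos = score, (r, c)
        return best_pos
    ```

    Tracking the matched position frame by frame, with the camera's frame interval as the time base, gives the deformation-time curve; differencing it gives the speed-time curve.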

  6. End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Kelly, Patrick; Rickman, Douglas; Smith, Eric

    1991-01-01

    The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) has been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL span all major areas of the field. STL includes the Sensor Development Laboratory (SDL), the Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.

  7. Restoration of non-uniform exposure motion blurred image

    NASA Astrophysics Data System (ADS)

    Luo, Yuanhong; Xu, Tingfa; Wang, Ningming; Liu, Feng

    2014-11-01

    Restoring motion-blurred images is a key technology in opto-electronic detection systems. Imaging sensors such as CCDs and infrared imagers mounted on fast-moving platforms move together with those platforms, and as a result the images become blurred. This degradation causes great trouble for subsequent tasks such as object detection, target recognition, and tracking, so motion-blurred images must be restored before moving targets can be detected in subsequent frames. Driven by the demands of real weapon tasks, and in order to deal with targets against complex backgrounds, this dissertation applies recent theories from image processing and computer vision to the problems of motion deblurring and motion detection. The principal content is as follows. 1) When prior knowledge of the degradation function is unavailable, uniform motion-blurred images are restored. First, the blur parameters of the PSF (point spread function), namely the motion blur extent and direction, are estimated individually in the logarithmic frequency domain. The direction of the PSF is calculated by extracting the central light line of the spectrum, and the extent is computed by minimizing the correlation between the Fourier spectrum of the blurred image and a detection function. Moreover, in order to remove striping in the deblurred image, a windowing technique is employed, which makes the deblurred image clear. 2) Based on the principle of infrared image non-uniform exposure, a new restoration model for infrared blurred images is developed. The infrared non-uniform exposure curve is fitted from experimental data, and the blurred images are restored using the fitted curve.
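    Once the PSF extent and direction are estimated, the restoration itself is commonly done by Wiener deconvolution in the frequency domain. The sketch below is a generic textbook formulation, not the dissertation's exact algorithm; the regularization constant `k` and the PSF builder are illustrative assumptions:

    ```python
    import numpy as np

    def motion_psf(length, angle_deg, size):
        """Linear motion-blur PSF of the given pixel length and direction."""
        psf = np.zeros((size, size))
        c = size // 2
        a = np.deg2rad(angle_deg)
        for i in range(length):
            t = i - (length - 1) / 2.0
            x = int(round(c + t * float(np.cos(a))))
            y = int(round(c + t * float(np.sin(a))))
            psf[y, x] = 1.0
        return psf / psf.sum()

    def wiener_deblur(blurred, psf, k=1e-3):
        """Frequency-domain Wiener filter: conj(H) / (|H|^2 + k)."""
        H = np.fft.fft2(psf, s=blurred.shape)
        G = np.fft.fft2(blurred)
        return np.real(np.fft.ifft2(G * np.conj(H) / (np.abs(H) ** 2 + k)))
    ```

    The constant `k` plays the role of the noise-to-signal power ratio; too small a value amplifies noise at the nulls of H, which is where the windowing technique mentioned in the abstract helps suppress striping.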

  8. Technology Needs for Gamma Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Gehrels, Neil

    2011-01-01

    Gamma ray astronomy is currently in an exciting period of multiple missions and a wealth of data. Results from INTEGRAL, Fermi, AGILE, Suzaku and Swift are making large contributions to our knowledge of high energy processes in the universe. The advances are due to new detector and imaging technologies. The steps to date have been from scintillators to solid state detectors for sensors and from light buckets to coded aperture masks and pair telescopes for imagers. A key direction for the future is toward focusing telescopes pushing into the hard X-ray regime and Compton telescopes and pair telescopes with fine spatial resolution for medium and high energy gamma rays. These technologies will provide finer imaging of gamma-ray sources. Importantly, they will also enable large steps forward in sensitivity by reducing background.

  9. TMA optics for HISUI HSS and MSS imagers

    NASA Astrophysics Data System (ADS)

    Rodolfo, J.; Geyl, R.; Leplan, H.; Ruch, E.

    2017-11-01

    Sagem is presently working on a new project for the Japanese HISUI instrument made from a Hyper Spectral Sensor and a Multi Spectral Sensor, both including a Three Mirror Anastigmat (TMA) main optics. Mirrors are made from Zerodur from Schott but also from NTSIC, the New Technology Silicon Carbide developed in Japan. This report is also the opportunity to show to the community Sagem recent progress in precision TMA optics polishing and alignment.

  10. Satellite Ocean Color Sensor Design Concepts and Performance Requirements

    NASA Technical Reports Server (NTRS)

    McClain, Charles R.; Meister, Gerhard; Monosmith, Bryan

    2014-01-01

    In late 1978, the National Aeronautics and Space Administration (NASA) launched the Nimbus-7 satellite with the Coastal Zone Color Scanner (CZCS) and several other sensors, all of which provided major advances in Earth remote sensing. The inspiration for the CZCS is usually attributed to an article in Science by Clarke et al. who demonstrated that large changes in open ocean spectral reflectance are correlated to chlorophyll-a concentrations. Chlorophyll-a is the primary photosynthetic pigment in green plants (marine and terrestrial) and is used in estimating primary production, i.e., the amount of carbon fixed into organic matter during photosynthesis. Thus, accurate estimates of global and regional primary production are key to studies of the earth's carbon cycle. Because the investigators used an airborne radiometer, they were able to demonstrate the increased radiance contribution of the atmosphere with altitude that would be a major issue for spaceborne measurements. Since 1978, there has been much progress in satellite ocean color remote sensing such that the technique is well established and is used for climate change science and routine operational environmental monitoring. Also, the science objectives and accompanying methodologies have expanded and evolved through a succession of global missions, e.g., the Ocean Color and Temperature Sensor (OCTS), the Seaviewing Wide Field-of-view Sensor (SeaWiFS), the Moderate Resolution Imaging Spectroradiometer (MODIS), the Medium Resolution Imaging Spectrometer (MERIS), and the Global Imager (GLI). With each advance in science objectives, new and more stringent requirements for sensor capabilities (e.g., spectral coverage) and performance (e.g., signal-to-noise ratio, SNR) are established. The CZCS had four bands for chlorophyll and aerosol corrections. 
The Ocean Color Imager (OCI) recommended for the NASA Pre-Aerosol, Cloud, and Ocean Ecosystems (PACE) mission includes 5-nanometer hyperspectral coverage from 350 to 800 nanometers with three additional discrete near-infrared (NIR) and shortwave infrared (SWIR) bands for ocean aerosol correction. Also, to avoid drift in sensor sensitivity from being interpreted as environmental change, climate change research requires rigorous monitoring of sensor stability. For SeaWiFS, monthly lunar imaging tracked stability to an accuracy of approximately 0.1%, which allowed the data to be used for climate studies [2]. It is now acknowledged by the international community that future missions and sensor designs need to accommodate lunar calibrations. An overview of ocean color remote sensing, a review of the progress made in the field, and the variety of research applications derived from global satellite ocean color data are provided. The purpose of this chapter is to discuss the design options for ocean color satellite radiometers; performance and testing criteria; and the sensor components (optics, detectors, electronics, etc.) that must be integrated into an instrument concept. These ultimately dictate the quality and quantity of data that can be delivered as a trade against mission cost. Historically, science and sensor technology have advanced in a "leap-frog" manner: sensor design requirements for a mission are defined many years before a sensor is launched, and by the end of the mission, perhaps 15-20 years later, science applications and requirements are well beyond the capabilities of the sensor. Section 3 provides a summary of historical mission science objectives and sensor requirements. This progression is expected to continue as long as sensor costs can be constrained to affordable levels while still allowing the incorporation of new technologies without incurring unacceptable risk to mission success.
The IOCCG Report Number 13 discusses future ocean biology mission Level-1 requirements in depth.
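The chlorophyll retrievals that motivated these sensors are typically computed as a polynomial in the log of a blue/green reflectance ratio. The sketch below illustrates that general band-ratio form only; the function name and the polynomial coefficients are illustrative placeholders, not the operational SeaWiFS/MODIS algorithm values.

```python
import math

def chlorophyll_band_ratio(rrs_blue, rrs_green, coeffs=(0.33, -3.0, 2.8, -1.4, -0.5)):
    """Estimate chlorophyll-a (mg m^-3) from a blue/green remote-sensing
    reflectance ratio using a 4th-order polynomial in log10 space, the
    general form of the SeaWiFS OC-style algorithms.  The default
    coefficients are placeholders for illustration only."""
    r = math.log10(max(rrs_blue) / rrs_green)  # maximum band ratio
    log_chl = sum(a * r ** i for i, a in enumerate(coeffs))
    return 10.0 ** log_chl

# Clear oligotrophic water: blue reflectance well above green, so the
# retrieved chlorophyll concentration is low.
chl = chlorophyll_band_ratio(rrs_blue=[0.010, 0.008, 0.006], rrs_green=0.002)
```

As the blue/green ratio falls toward 1 (greener, more productive water), the retrieved concentration rises, which is the behavior the polynomial encodes.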

  11. Design of a Low-Light-Level Image Sensor with On-Chip Sigma-Delta Analog-to- Digital Conversion

    NASA Technical Reports Server (NTRS)

    Mendis, Sunetra K.; Pain, Bedabrata; Nixon, Robert H.; Fossum, Eric R.

    1993-01-01

    The design and projected performance of a low-light-level active-pixel-sensor (APS) chip with semi-parallel analog-to-digital (A/D) conversion is presented. The individual elements have been fabricated and tested using the MOSIS 2-micrometer CMOS technology, although the integrated system has not yet been fabricated. The imager consists of a 128 x 128 array of active pixels at a 50 micrometer pitch. Each column of pixels shares a 10-bit A/D converter based on first-order oversampled sigma-delta (Sigma-Delta) modulation. The 10-bit outputs of the converters are multiplexed and read out through a single set of outputs. A semi-parallel architecture is chosen to achieve 30 frames/second operation even at low light levels. The sensor is designed for less than 12 e^- rms noise performance.
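The column-parallel conversion described above can be illustrated with a behavioral model of a first-order sigma-delta modulator followed by averaging (decimation). This is a sketch of the principle only, under assumed parameters (normalized input, a 0.5 quantizer threshold); it is not the circuit design from the abstract.

```python
def sigma_delta_adc(x, n_samples=1024):
    """Behavioral model of first-order sigma-delta (Sigma-Delta) conversion.

    x is a normalized pixel level in [0, 1].  The loop integrates the error
    between the input and the 1-bit feedback; the density of ones in the
    output bitstream converges to x, so averaging (decimation) recovers it.
    """
    integrator, bit, ones = 0.0, 0, 0
    for _ in range(n_samples):
        integrator += x - bit                  # input minus 1-bit DAC feedback
        bit = 1 if integrator >= 0.5 else 0    # 1-bit quantizer
        ones += bit
    return ones / n_samples                    # decimated (averaged) output

# A 10-bit output code, as each column converter in the abstract produces:
code = round(1023 * sigma_delta_adc(0.3))
```

Because the integrator stays bounded, the averaging error shrinks as roughly 1/n_samples, which is why oversampling yields 10-bit accuracy from a 1-bit quantizer.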

  12. NASA's small spacecraft technology initiative "Clark" spacecraft

    NASA Astrophysics Data System (ADS)

    Hayduk, Robert J.; Scott, Walter S.; Walberg, Gerald D.; Butts, James J.; Starr, Richard D.

    1996-11-01

    The Small Satellite Technology Initiative (SSTI) is a National Aeronautics and Space Administration (NASA) program to demonstrate smaller, high-technology satellites constructed rapidly and less expensively. Under SSTI, NASA funded the development of "Clark," a high-technology demonstration satellite to provide 3-m resolution panchromatic and 15-m resolution multispectral images, as well as collect atmospheric constituent and cosmic x-ray data. The 690-lb satellite, to be launched in early 1997, will be in a 476 km, circular, sun-synchronous polar orbit. This paper describes the program objectives, the technical characteristics of the sensors and satellite, image processing, archiving and distribution. Data archiving and distribution will be performed by NASA Stennis Space Center and by the EROS Data Center, Sioux Falls, South Dakota, USA.

  13. Multi-color IR sensors based on QWIP technology for security and surveillance applications

    NASA Astrophysics Data System (ADS)

    Sundaram, Mani; Reisinger, Axel; Dennis, Richard; Patnaude, Kelly; Burrows, Douglas; Cook, Robert; Bundas, Jason

    2006-05-01

    Room-temperature targets are detected at the greatest distance by imaging them in the long wavelength (LW: 8-12 μm) infrared spectral band, where they glow brightest. Focal plane arrays (FPAs) based on quantum well infrared photodetectors (QWIPs) have sensitivity, noise, and cost metrics that have enabled them to become the best commercial solution for certain security and surveillance applications. Recently, QWIP technology has advanced to provide pixel-registered dual-band imaging in both the midwave (MW: 3-5 μm) and longwave infrared spectral bands in a single chip. This elegant technology affords a degree of target discrimination as well as the ability to maximize detection range for hot targets (e.g. missile plumes) by imaging in the midwave and for room-temperature targets (e.g. humans, trucks) by imaging in the longwave with one simple camera. Detection-range calculations are illustrated and FPA performance is presented.

  14. Modification of measurement methods for evaluation of tissue-engineered cartilage function and biochemical properties using nanosecond pulsed laser

    NASA Astrophysics Data System (ADS)

    Ishihara, Miya; Sato, Masato; Kutsuna, Toshiharu; Ishihara, Masayuki; Mochida, Joji; Kikuchi, Makoto

    2008-02-01

    There is a demand in the field of regenerative medicine for measurement technology that enables determination of the functions and components of engineered tissue. To meet this demand, we developed a method for extracellular matrix characterization using time-resolved autofluorescence spectroscopy, which enabled simultaneous measurement of mechanical properties using the relaxation of a laser-induced stress wave. In this study, in addition to time-resolved fluorescence spectroscopy, a hyperspectral sensor, which captures both spectral and spatial information, was used to evaluate the biochemical characteristics of tissue-engineered cartilage. The hyperspectral imaging system provides a spectral resolution of 1.2 nm and an image rate of 100 images/sec. The imaging system consisted of the hyperspectral sensor, a scanner for x-y plane imaging, magnifying optics, and a xenon lamp for transmissive lighting. Cellular imaging with the hyperspectral imaging system was achieved by improving the spatial resolution to 9 micrometers. Spectroscopic cellular imaging was demonstrated using cultured chondrocytes as samples. At an early stage of culture, hyperspectral imaging offered information about cellular function associated with endogenous fluorescent biomolecules.

  15. Transport infrastructure surveillance and monitoring by electromagnetic sensing: the ISTIMES project.

    PubMed

    Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédéric; Cottineau, Louis-Marie; Cuomo, Vincenzo; Della Vecchia, Pietro; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Frédéric

    2010-01-01

    The ISTIMES project, funded by the European Commission in the frame of a joint Call "ICT and Security" of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optic fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project.

  16. Transport Infrastructure Surveillance and Monitoring by Electromagnetic Sensing: The ISTIMES Project

    PubMed Central

    Proto, Monica; Bavusi, Massimo; Bernini, Romeo; Bigagli, Lorenzo; Bost, Marie; Bourquin, Frédéric; Cottineau, Louis-Marie; Cuomo, Vincenzo; Vecchia, Pietro Della; Dolce, Mauro; Dumoulin, Jean; Eppelbaum, Lev; Fornaro, Gianfranco; Gustafsson, Mats; Hugenschmidt, Johannes; Kaspersen, Peter; Kim, Hyunwook; Lapenna, Vincenzo; Leggio, Mario; Loperte, Antonio; Mazzetti, Paolo; Moroni, Claudio; Nativi, Stefano; Nordebo, Sven; Pacini, Fabrizio; Palombo, Angelo; Pascucci, Simone; Perrone, Angela; Pignatti, Stefano; Ponzo, Felice Carlo; Rizzo, Enzo; Soldovieri, Francesco; Taillade, Frédéric

    2010-01-01

    The ISTIMES project, funded by the European Commission in the frame of a joint Call “ICT and Security” of the Seventh Framework Programme, is presented and preliminary research results are discussed. The main objective of the ISTIMES project is to design, assess and promote an Information and Communication Technologies (ICT)-based system, exploiting distributed and local sensors, for non-destructive electromagnetic monitoring of critical transport infrastructures. The integration of electromagnetic technologies with new ICT information and telecommunications systems enables remotely controlled monitoring and surveillance and real-time data imaging of the critical transport infrastructures. The project exploits different non-invasive imaging technologies based on electromagnetic sensing (optic fiber sensors, satellite-based Synthetic Aperture Radar, hyperspectral spectroscopy, infrared thermography, Ground Penetrating Radar, low-frequency geophysical techniques, and ground-based systems for displacement monitoring). In this paper, we show the preliminary results arising from the GPR and infrared thermographic measurements carried out on the Musmeci bridge in Potenza, located in a highly seismic area of the Apennine chain (Southern Italy) and representing one of the test beds of the project. PMID:22163489

  17. Battling Brittle Bones

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The accuDEXA(R) Bone Mineral Density Assessment System, manufactured by Schick Technologies, Inc., utilizes "camera on a chip" sensor technology invented and developed by NASA's Jet Propulsion Laboratory. Schick's accuDEXA system offers several advantages over traditional osteoporosis tests, which assess bone density loss in the hip and spine, and require specialized personnel to conduct. With accuDEXA, physicians can test the entire body's bone density at a peripheral site, such as the finger, without applying gels or having patients remove garments. Results are achieved in 30 seconds and printed out in less than a minute, compared to the estimated exam time of 15 minutes for hip and spine density analyses. Schick has also applied the CMOS APS technology to a new software product that performs dental radiography using up to 90 percent less radiation exposure than conventional X-rays. Called Computed Dental Radiography(R), the new digital imaging product utilizes an electronic sensor in place of X-ray film to generate sharp and clear images that appear on a computer screen within 3 seconds, and can be enlarged and enhanced to identify problems.

  18. USAF Space Sensing Cryogenic Considerations

    NASA Astrophysics Data System (ADS)

    Roush, F.

    2010-04-01

    Infrared (IR) space sensing missions of the future depend upon low-mass components and highly capable imaging technologies. Limitations in visible imaging due to the Earth's shadow drive the use of IR surveillance methods for a wide variety of Intelligence, Surveillance, and Reconnaissance (ISR) and Ballistic Missile Defense (BMD) applications, and almost certainly for Space Situational Awareness (SSA) and Operationally Responsive Space (ORS) missions. Utilization of IR sensors greatly expands and improves mission capabilities, including target and target-behavior discrimination. Background IR emissions and the electronic noise inherently present in Focal Plane Arrays (FPAs) and surveillance optics bench designs prevent their use unless they are cooled to cryogenic temperatures. This paper describes the role of cryogenic coolers as an enabling technology for generic ISR and BMD missions and provides ISR and BMD mission and requirement planners with a brief glimpse of this critical technology's implementation potential. The interaction between cryogenic refrigeration component performance and the IR sensor optics and FPA can be seen as not only mission enabling but also mission-performance enhancing when the refrigeration system is considered as part of an overall optimization problem.

  19. ARL participation in the C4ISR OTM experiment: integration and performance results

    NASA Astrophysics Data System (ADS)

    Zong, Lei; O'Brien, Barry J.

    2007-04-01

    The Command, Control, Communication, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) On-The-Move (OTM) demonstration is an annual showcase of how innovative technologies can help modern troops increase their situational awareness (SA) in battlefield environments. To evaluate the effectiveness these new technologies have on the soldiers' abilities to gather situational information, the demonstration involves United States Army National Guard troops in realistic war game scenarios at an Army Reserve training ground. The Army Research Laboratory (ARL) was invited to participate in the event, with the objective of demonstrating system-level integration of disparate technologies developed for gathering SA information in small-unit combat operations. ARL provided expertise in Unattended Ground Sensing (UGS) technology, Unmanned Ground Vehicle (UGV) technology, information processing, and wireless mobile ad hoc communication. The ARL C4ISR system included a system of multimodal sensors (MMS), a trip-wire imager, a man-portable robotic vehicle (PackBot), and low-power sensor radios for communication between the ARL system and a hosting platoon vehicle. This paper focuses on the integration effort of bringing the multiple families of sensor assets together into a working system.

  20. Technology for robotic surface inspection in space

    NASA Technical Reports Server (NTRS)

    Volpe, Richard; Balaram, J.

    1994-01-01

    This paper presents on-going research in robotic inspection of space platforms. Three main areas of investigation are discussed: machine vision inspection techniques, an integrated sensor end-effector, and an orbital environment laboratory simulation. Machine vision inspection utilizes automatic comparison of new and reference images to detect on-orbit induced damage such as micrometeorite impacts. The cameras and lighting used for this inspection are housed in a multisensor end-effector, which also contains a suite of sensors for detection of temperature, gas leaks, proximity, and forces. To fully test all of these sensors, a realistic space platform mock-up has been created, complete with visual, temperature, and gas anomalies. Further, changing orbital lighting conditions are effectively mimicked by a robotic solar simulator. In the paper, each of these technology components will be discussed, and experimental results are provided.

  1. Exploration of assistive technology for uniform laparoscopic surgery.

    PubMed

    Sato, Masakazu; Koizumi, Minako; Hino, Takahiro; Takahashi, Yu; Nagashima, Natsuki; Itaoka, Nao; Ueshima, Chiharu; Nakata, Maki; Hasumi, Yoko

    2018-02-19

    Laparoscopic surgery is less invasive than open surgery and is now common in various medical fields. However, laparoscopic surgery is more difficult than open surgery and often requires additional time for the operator to achieve mastery. Therefore, we investigated the use of assistive technology for uniform laparoscopic surgery. We used the OpenCV2 library for augmented reality, with an ArUco marker to detect and estimate forceps positioning. We used Sense HAT as the gyro sensor. The development platforms used were Mac OS X 10.11.3 and Raspberry Pi 3, model B. By attaching the ArUco marker to the needle holder, we could draw a line perpendicular to the marker. When the needle was held, a cube could be visualized, and both the needle and lines could be used to determine the appropriate position. By attaching the gyro sensor to the camera, we could detect its angle of rotation. We obtained stabilized images by rotating the image by the detected number of degrees; this was possible for any camera position. Assistive technology allowed us to obtain consecutive converted images in real time and may be readily applicable to clinical practice. © 2018 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and John Wiley & Sons Australia, Ltd.
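The gyro-based stabilization step described above reduces to counter-rotating the image by the detected camera roll. A minimal sketch of that geometry on marker-corner coordinates (plain coordinate math with hypothetical function names, not the authors' OpenCV/Sense HAT code):

```python
import math

def rotate_point(x, y, angle_deg, cx=0.0, cy=0.0):
    """Rotate a pixel coordinate about an image center by angle_deg."""
    a = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(a) - dy * math.sin(a),
            cy + dx * math.sin(a) + dy * math.cos(a))

def stabilize(marker_corners, gyro_roll_deg, center=(320.0, 240.0)):
    """Counter-rotate detected marker corners by the gyro-measured roll,
    so the displayed scene keeps a fixed orientation for any camera pose."""
    return [rotate_point(x, y, -gyro_roll_deg, *center)
            for (x, y) in marker_corners]

# Four corners of a detected ArUco marker in a 640x480 frame (illustrative).
corners = [(300.0, 200.0), (340.0, 200.0), (340.0, 240.0), (300.0, 240.0)]
steady = stabilize(corners, gyro_roll_deg=15.0)
```

In the real system the same rotation would be applied to the whole frame (e.g. with an OpenCV warp), but the per-point form shows the underlying transform.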

  2. Framework of passive millimeter-wave scene simulation based on material classification

    NASA Astrophysics Data System (ADS)

    Park, Hyuk; Kim, Sung-Hyun; Lee, Ho-Jin; Kim, Yong-Hoon; Ki, Jae-Sug; Yoon, In-Bok; Lee, Jung-Min; Park, Soon-Jun

    2006-05-01

    Over the past few decades, passive millimeter-wave (PMMW) sensors have emerged as useful tools in transportation and military applications such as autonomous flight-landing systems, smart weapons, and night- and all-weather vision systems. As an efficient way to predict the performance of a PMMW sensor and apply it to a system, testing in a SoftWare-In-the-Loop (SWIL) environment is required. PMMW scene simulation is a key component in implementing this simulator. However, no commercial off-the-shelf product is available for constructing the PMMW scene simulation; there have been only a few studies on this technology. We have studied PMMW scene simulation methods to develop the PMMW sensor SWIL simulator. This paper describes the framework of the PMMW scene simulation and tentative results. The purpose of the PMMW scene simulation is to generate sensor outputs (or images) from a visible image and environmental conditions. We organize it into four parts: material classification mapping, PMMW environmental setting, PMMW scene forming, and millimeter-wave (MMW) sensor modeling. The background and the objects in the scene are classified based on properties related to MMW radiation and reflectivity. The environmental setting part calculates the PMMW phenomenology: atmospheric propagation and emission, including sky temperature, weather conditions, and physical temperature. Then, PMMW raw images are formed using the surface geometry. Finally, PMMW sensor outputs are generated from the raw images by applying sensor characteristics such as aperture size and noise level. Through this process, PMMW phenomenology and sensor characteristics are simulated in the output scene. We have finished the design of the simulator framework and are working on the detailed implementation. As a tentative result, a flight observation was simulated under specific conditions. After completing the implementation, we plan to increase the reliability of the simulation by collecting data with actual PMMW sensors. With a reliable PMMW scene simulator, it will be more efficient to apply PMMW sensors to various applications.
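The four-part pipeline sketched above (classification, environment, scene forming, sensor effects) can be illustrated with a scalar radiometric model. The emissivity table, temperatures, and noise level below are illustrative assumptions, not the authors' values.

```python
import random

# Part 1: material classification -- assumed MMW emissivity per class
# (illustrative values, not a measured database).
EMISSIVITY = {"asphalt": 0.90, "grass": 0.93, "metal": 0.05, "water": 0.60}

def apparent_temperature(material, t_physical_k, t_sky_k):
    """Parts 2-3: a passive MMW radiometer sees emitted plus reflected sky
    radiation, T_app = e * T_phys + (1 - e) * T_sky (temperatures in kelvin)."""
    e = EMISSIVITY[material]
    return e * t_physical_k + (1.0 - e) * t_sky_k

def sensor_output(t_app_k, noise_k=1.0, seed=0):
    """Part 4: apply sensor characteristics; here just additive receiver
    noise with an NETD-like standard deviation in kelvin."""
    rng = random.Random(seed)
    return t_app_k + rng.gauss(0.0, noise_k)

# A metal plate reflects the cold sky and therefore looks far colder than
# the surrounding asphalt: the classic PMMW detection contrast.
t_metal = apparent_temperature("metal", 290.0, 70.0)
t_road = apparent_temperature("asphalt", 290.0, 70.0)
```

A full simulator would evaluate this per pixel over the classified map and add the aperture (blur) term, but the contrast mechanism is already visible in the scalar form.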

  3. Temperature measurement with industrial color camera devices

    NASA Astrophysics Data System (ADS)

    Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen

    1999-05-01

    This paper discusses color-camera-based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We will show that a well-selected color camera device can be a cheaper, more robust, and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element are discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies in infrared camera devices. With AVL List, our industry partner, we successfully used the proposed sensor to measure the temperature of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.

  4. Near-infrared fluorescence goggle system with complementary metal–oxide–semiconductor imaging sensor and see-through display

    PubMed Central

    Liu, Yang; Njuguna, Raphael; Matthews, Thomas; Akers, Walter J.; Sudlow, Gail P.; Mondal, Suman; Tang, Rui

    2013-01-01

    Abstract. We have developed a near-infrared (NIR) fluorescence goggle system based on the complementary metal–oxide–semiconductor active pixel sensor imaging and see-through display technologies. The fluorescence goggle system is a compact wearable intraoperative fluorescence imaging and display system that can guide surgery in real time. The goggle is capable of detecting fluorescence of indocyanine green solution in the picomolar range. Aided by NIR quantum dots, we successfully used the fluorescence goggle to guide sentinel lymph node mapping in a rat model. We further demonstrated the feasibility of using the fluorescence goggle in guiding surgical resection of breast cancer metastases in the liver in conjunction with NIR fluorescent probes. These results illustrate the diverse potential use of the goggle system in surgical procedures. PMID:23728180

  5. The selectable hyperspectral airborne remote sensing kit (SHARK) as an enabler for precision agriculture

    NASA Astrophysics Data System (ADS)

    Holasek, Rick; Nakanishi, Keith; Ziph-Schatzberg, Leah; Santman, Jeff; Woodman, Patrick; Zacaroli, Richard; Wiggins, Richard

    2017-04-01

    Hyperspectral imaging (HSI) has been used for over two decades in laboratory research, academic, environmental and defense applications. More recently, HSI has started to be adopted for commercial applications in machine vision, conservation, resource exploration, and precision agriculture, to name just a few of the economically viable uses for the technology. Corning Incorporated (Corning) has been developing and manufacturing HSI sensors, sensor systems, and sensor optical engines, as well as HSI sensor components such as gratings and slits, for over a decade and a half. This depth of experience and technological breadth have allowed Corning to design and develop unique HSI spectrometers with an unprecedented combination of high performance, low cost and low Size, Weight, and Power (SWaP). These sensors and sensor systems are offered with wavelength coverage ranges from the visible to the Long Wave Infrared (LWIR). The extremely low SWaP of Corning's HSI sensors and sensor systems enables their deployment on limited-payload platforms such as small unmanned aerial vehicles (UAVs). This paper discusses use of the Corning patented monolithic-design Offner spectrometer, the microHSI™, to build a highly compact 400-1000 nm HSI sensor, combined with a small Inertial Navigation System (INS) and micro-computer, to make a complete turn-key airborne remote sensing payload. This Selectable Hyperspectral Airborne Remote sensing Kit (SHARK) has industry-leading SWaP (1.5 lbs) at a disruptively low price due, in large part, to Corning's ability to manufacture the monolithic spectrometer out of polymers (i.e., plastic) and thereby reduce manufacturing costs considerably. The other factor in lowering costs is Corning's well-established in-house manufacturing capability in optical components and sensors, which further enables cost-effective fabrication. The SWaP and cost of the microHSI™ sensor are approaching, and in some cases fall below, the price point of Multi Spectral Imaging (MSI) sensors. Specific designs of the Corning microHSI™ SHARK visNIR turn-key system are presented along with salient performance characteristics. Initial focus market areas include precision agriculture, and test results from historic and recent microHSI™ SHARK prototypes are presented.

  6. Label-Free Biomedical Imaging Using High-Speed Lock-In Pixel Sensor for Stimulated Raman Scattering

    PubMed Central

    Mars, Kamel; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Yamada, Takahiro

    2017-01-01

    Raman imaging eliminates the need for staining procedures, providing label-free imaging to study biological samples. Recent developments in stimulated Raman scattering (SRS) have achieved fast acquisition speed and hyperspectral imaging. However, the lack of detectors suitable for MHz-modulation-rate parallel detection, capable of detecting multiple small SRS signals while eliminating the extremely strong offset due to direct laser light, has been a problem. In this paper, we present a complementary metal-oxide semiconductor (CMOS) image sensor using high-speed lock-in pixels for stimulated Raman scattering that is capable of obtaining the difference of the Stokes-on and Stokes-off signals at a modulation frequency of 20 MHz in the pixel before readout. The small SRS signal generated is extracted and amplified in a pixel using a high-speed, large-area lateral electric field charge modulator (LEFM) employing two-step ion implantation, together with an in-pixel low-pass filter, sample-and-hold circuit, and switched-capacitor integrator using a fully differential amplifier. A prototype chip was fabricated using a 0.11 μm CMOS image sensor technology process. SRS spectra and images of stearic acid and 3T3-L1 samples were successfully obtained. The outcomes suggest that hyperspectral and multi-focus SRS imaging at video rate is viable after slight modifications to the pixel architecture and the acquisition system. PMID:29120358

  7. Label-Free Biomedical Imaging Using High-Speed Lock-In Pixel Sensor for Stimulated Raman Scattering.

    PubMed

    Mars, Kamel; Lioe, De Xing; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Yamada, Takahiro; Hashimoto, Mamoru

    2017-11-09

    Raman imaging eliminates the need for staining procedures, providing label-free imaging to study biological samples. Recent developments in stimulated Raman scattering (SRS) have achieved fast acquisition speed and hyperspectral imaging. However, the lack of detectors suitable for MHz-modulation-rate parallel detection, capable of detecting multiple small SRS signals while eliminating the extremely strong offset due to direct laser light, has been a problem. In this paper, we present a complementary metal-oxide semiconductor (CMOS) image sensor using high-speed lock-in pixels for stimulated Raman scattering that is capable of obtaining the difference of the Stokes-on and Stokes-off signals at a modulation frequency of 20 MHz in the pixel before readout. The small SRS signal generated is extracted and amplified in a pixel using a high-speed, large-area lateral electric field charge modulator (LEFM) employing two-step ion implantation, together with an in-pixel low-pass filter, sample-and-hold circuit, and switched-capacitor integrator using a fully differential amplifier. A prototype chip was fabricated using a 0.11 μm CMOS image sensor technology process. SRS spectra and images of stearic acid and 3T3-L1 samples were successfully obtained. The outcomes suggest that hyperspectral and multi-focus SRS imaging at video rate is viable after slight modifications to the pixel architecture and the acquisition system.
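The in-pixel lock-in operation amounts to demodulating at the Stokes modulation frequency and integrating, which cancels the large unmodulated offset. A numerical sketch of that principle (a software illustration under assumed sample rates and signal levels, not the LEFM circuit itself):

```python
def lockin_demodulate(samples, period):
    """Multiply the photocurrent samples by a +/-1 square-wave reference
    locked to the Stokes modulation and average: the large unmodulated
    offset cancels, and the result is half the modulated amplitude."""
    acc = 0.0
    for n, s in enumerate(samples):
        ref = 1.0 if (n % period) < period // 2 else -1.0
        acc += s * ref
    return acc / len(samples)

PERIOD = 10                  # e.g. sampling a 20 MHz modulation at 200 MS/s
offset, srs = 1000.0, 0.1    # direct laser light vs. the tiny SRS difference
samples = [offset + (srs if (n % PERIOD) < PERIOD // 2 else 0.0)
           for n in range(10000)]
recovered = 2.0 * lockin_demodulate(samples, PERIOD)  # estimate of srs
```

The factor of 2 compensates for the half-amplitude output of square-wave demodulation; in the sensor, the same subtraction is done in the charge domain before readout.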

  8. Direct Detection Electron Energy-Loss Spectroscopy: A Method to Push the Limits of Resolution and Sensitivity.

    PubMed

    Hart, James L; Lang, Andrew C; Leff, Asher C; Longo, Paolo; Trevor, Colin; Twesten, Ray D; Taheri, Mitra L

    2017-08-15

    In many cases, electron counting with direct detection sensors offers improved resolution, lower noise, and higher pixel density compared to conventional, indirect detection sensors for electron microscopy applications. Direct detection technology has previously been utilized, with great success, for imaging and diffraction, but potential advantages for spectroscopy remain unexplored. Here we compare the performance of a direct detection sensor operated in counting mode and an indirect detection sensor (scintillator/fiber-optic/CCD) for electron energy-loss spectroscopy. Clear improvements in measured detective quantum efficiency and combined energy resolution/energy field-of-view are offered by counting mode direct detection, showing promise for efficient spectrum imaging, low-dose mapping of beam-sensitive specimens, trace element analysis, and time-resolved spectroscopy. Despite the limited counting rate imposed by the readout electronics, we show that both core-loss and low-loss spectral acquisition are practical. These developments will benefit biologists, chemists, physicists, and materials scientists alike.

  9. Putting a finishing touch on GECIs.

    PubMed

    Rose, Tobias; Goltstein, Pieter M; Portugues, Ruben; Griesbeck, Oliver

    2014-01-01

    More than a decade ago, genetically encoded calcium indicators (GECIs) entered the stage as promising new tools to image calcium dynamics and neuronal activity in living tissues and designated cell types in vivo. From a variety of initial designs, two have emerged as promising prototypes for further optimization: FRET (Förster Resonance Energy Transfer)-based sensors and single-fluorophore sensors of the GCaMP family. Recent efforts in structural analysis, engineering and screening have broken important performance thresholds in the latest generation of both classes. While these improvements have made GECIs a powerful means to perform physiology in living animals, a number of other aspects of sensor function deserve attention. These aspects include indicator linearity, toxicity and slow response kinetics. Furthermore, creating high-performance sensors with optically more favorable emission in red or infrared wavelengths, as well as new stably or conditionally GECI-expressing animal lines, remains on the wish list. When the remaining issues are solved, imaging of GECIs will finally have crossed the last milestone, evolving from an initial promise into a fully matured technology.

  10. Improved GSO Optimized ESN Soft-Sensor Model of Flotation Process Based on Multisource Heterogeneous Information Fusion

    PubMed Central

    Wang, Jie-sheng; Han, Shuang; Shen, Na-na

    2014-01-01

    For predicting the key technology indicators (concentrate grade and tailings recovery rate) of the flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by the improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and to extract the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of the flotation process is optimized by the GSO algorithm with a congestion factor. Simulation results show that the model has better generalization and prediction accuracy, meeting the online soft-sensor requirements of real-time control in the flotation process. PMID:24982935
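The GLCM texture step named above can be sketched in a few lines. This is a pure-Python illustration for a single horizontal pixel offset, computing two of the listed descriptors (angular second moment and inertia moment); the authors' full feature set, KPCA reduction, and ESN model are not reproduced here.

```python
def glcm_features(image, offset=(0, 1)):
    """Grey-level co-occurrence matrix for one pixel offset (here the
    horizontal neighbor), plus two texture descriptors: angular second
    moment (uniformity) and inertia moment (contrast)."""
    counts, total = {}, 0
    for row in image:
        for a, b in zip(row, row[1:]):       # horizontal (0, 1) pairs
            counts[(a, b)] = counts.get((a, b), 0) + 1
            total += 1
    asm = sum((c / total) ** 2 for c in counts.values())
    inertia = sum((c / total) * (i - j) ** 2 for (i, j), c in counts.items())
    return asm, inertia

# A uniform froth patch: maximal uniformity, zero contrast.
flat = [[1, 1, 1, 1]] * 4
# A checkerboard patch: low uniformity, high contrast.
checker = [[0, 3, 0, 3], [3, 0, 3, 0], [0, 3, 0, 3], [3, 0, 3, 0]]
asm_flat, inertia_flat = glcm_features(flat)
asm_chk, inertia_chk = glcm_features(checker)
```

In practice several offsets and angles are accumulated and the resulting feature vector, together with process data, feeds the KPCA stage described in the abstract.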

  11. Multifrequency Ultra-High Resolution Miniature Scanning Microscope Using Microchannel And Solid-State Sensor Technologies And Method For Scanning Samples

    NASA Technical Reports Server (NTRS)

    Wang, Yu (Inventor)

    2006-01-01

    A miniature, ultra-high resolution, and color scanning microscope using microchannel and solid-state technology that does not require focus adjustment. One embodiment includes a source of collimated radiant energy for illuminating a sample, a plurality of narrow angle filters comprising a microchannel structure to permit the passage of only unscattered radiant energy through the microchannels with some portion of the radiant energy entering the microchannels from the sample, a solid-state sensor array attached to the microchannel structure, the microchannels being aligned with an element of the solid-state sensor array, that portion of the radiant energy entering the microchannels parallel to the microchannel walls travels to the sensor element generating an electrical signal from which an image is reconstructed by an external device, and a moving element for movement of the microchannel structure relative to the sample. Discloses a method for scanning samples whereby the sensor array elements trace parallel paths that are arbitrarily close to the parallel paths traced by other elements of the array.

  12. Investigation of CMOS pixel sensor with 0.18 μm CMOS technology for high-precision tracking detector

    NASA Astrophysics Data System (ADS)

    Zhang, L.; Fu, M.; Zhang, Y.; Yan, W.; Wang, M.

    2017-01-01

    The Circular Electron Positron Collider (CEPC) proposed by the Chinese high energy physics community aims to measure Higgs particles and their interactions precisely. The tracking detector, comprising the Silicon Inner Tracker (SIT) and Forward Tracking Disks (FTD), places stringent requirements on sensor technologies in terms of spatial resolution, power consumption and readout speed. The CMOS Pixel Sensor (CPS) is a promising candidate to meet these requirements. This paper presents preliminary studies on sensor optimization for the tracking detector to achieve high collection efficiency while keeping the necessary spatial resolution. Detailed studies of charge collection have been performed using a 0.18 μm CMOS image sensor process. This process offers a high-resistivity epitaxial layer, leading to a significant improvement in charge collection and therefore in radiation tolerance. Based on the simulation results, the first exploratory prototype has been designed and fabricated. The prototype includes 9 different pixel arrays, which vary in terms of pixel pitch, diode size and geometry. The total area of the prototype amounts to 2 × 7.88 mm².

  13. A multi-sensor land mine detection system: hardware and architectural outline of the Australian RRAMNS CTD system

    NASA Astrophysics Data System (ADS)

    Abeynayake, Canicious; Chant, Ian; Kempinger, Siegfried; Rye, Alan

    2005-06-01

    The Rapid Route Area and Mine Neutralisation System (RRAMNS) Capability Technology Demonstrator (CTD) is a countermine detection project undertaken by DSTO and supported by the Australian Defence Force (ADF). The limited time and budget for this CTD resulted in some difficult strategic decisions with regard to hardware selection and system architecture. Although the delivered system has certain limitations arising from its experimental status, many lessons have been learned which illustrate a pragmatic path for future development. RRAMNS uses a sensor suite similar to other systems, in that three complementary sensors are included. These are Ground Probing Radar, a Metal Detector Array, and multi-band electro-optic sensors. However, RRAMNS uses a unique imaging system and a network-based real-time control and sensor fusion architecture. The relatively simple integration of each of these components could be the basis for a robust and cost-effective operational system. The RRAMNS imaging system consists of three cameras which cover the visible spectrum and the mid-wave and long-wave infrared regions. This subsystem can be used separately as a scouting sensor. This paper describes the system at its mid-2004 status, when full integration of all detection components was achieved.

  14. Multispectral Filter Arrays: Recent Advances and Practical Implementation

    PubMed Central

    Lapray, Pierre-Jean; Wang, Xingbo; Thomas, Jean-Baptiste; Gouton, Pierre

    2014-01-01

    Thanks to technical progress in interference filter design based on different technologies, we can finally successfully implement the concept of multispectral filter array-based sensors. This article provides the relevant state-of-the-art for multispectral imaging systems and presents the characteristics of the elements of our multispectral sensor as a case study. The spectral characteristics are based on two different spatial arrangements that distribute eight different bandpass filters in the visible and near-infrared area of the spectrum. We demonstrate that the system is viable and evaluate its performance through sensor spectral simulation. PMID:25407904

  15. Hierarchical classification in high dimensional numerous class cases

    NASA Technical Reports Server (NTRS)

    Kim, Byungyong; Landgrebe, D. A.

    1990-01-01

    As progress in new sensor technology continues, increasingly high resolution imaging sensors are being developed. These sensors give more detailed and complex data for each picture element and greatly increase the dimensionality of data over past systems. Three methods for designing a decision tree classifier are discussed: a top down approach, a bottom up approach, and a hybrid approach. Three feature extraction techniques are implemented. Canonical and extended canonical techniques are mainly dependent upon the mean difference between two classes. An autocorrelation technique is dependent upon the correlation differences. The mathematical relationship between sample size, dimensionality, and risk value is derived.
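The bottom-up design approach mentioned in this record can be sketched as follows (a hypothetical illustration, not the paper's algorithm: the class names, mean vectors, and centroid-merging rule are assumptions). Classes are merged pairwise, closest means first, yielding the binary hierarchy a decision tree classifier would descend:

```python
import math

def merge_bottom_up(class_means):
    """Repeatedly merge the two groups with the closest centroids into a
    nested tuple; the final tuple is the decision-tree hierarchy."""
    groups = [(name, mean) for name, mean in class_means.items()]
    while len(groups) > 1:
        best = None
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                d = math.dist(groups[i][1], groups[j][1])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        (na, ma), (nb, mb) = groups[i], groups[j]
        merged = tuple((a + b) / 2 for a, b in zip(ma, mb))  # new centroid
        groups = [g for k, g in enumerate(groups) if k not in (i, j)]
        groups.append(((na, nb), merged))
    return groups[0][0]

# Hypothetical 2-D class means (e.g., two spectral band averages)
means = {"water": (0.1, 0.2), "soil": (0.8, 0.6),
         "crop": (0.2, 0.9), "urban": (0.9, 0.5)}
print(merge_bottom_up(means))  # → (('soil', 'urban'), ('water', 'crop'))
```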

  16. Development of Meandering Winding Magnetometer (MWM (Register Trademark)) Eddy Current Sensors for the Health Monitoring, Modeling and Damage Detection of High Temperature Composite Materials

    NASA Technical Reports Server (NTRS)

    Russell, Richard; Washabaugh, Andy; Sheiretov, Yanko; Martin, Christopher; Goldfine, Neil

    2011-01-01

    The increased use of high-temperature composite materials in modern and next-generation aircraft and spacecraft has led to the need for improved nondestructive evaluation and health monitoring techniques. Such technologies are desirable to improve quality control, damage detection, stress evaluation and temperature measurement capabilities. Novel eddy current sensors and sensor arrays, such as Meandering Winding Magnetometers (MWMs), have provided alternate or complementary techniques to ultrasound and thermography for both nondestructive evaluation (NDE) and structural health monitoring (SHM). This includes imaging of composite material quality, damage detection, and the monitoring of fiber temperatures and multidirectional stresses. Historically, MWM technology for the inspection of the Space Shuttle Orbiter Reinforced Carbon-Carbon Composite (RCC) leading edge panels was developed by JENTEK Sensors and was subsequently transitioned by NASA into an operational pre- and post-flight in-situ inspection at the Kennedy Space Center. A manual scanner, which conformed automatically to the curvature of the RCC panels, was developed and used as a secondary technique if a defect was found during an infrared thermography screening. During a recent proof-of-concept study on composite overwrapped pressure vessels (COPVs), three different MWM sensors were tested at three orientations to demonstrate the ability of the technology to measure stresses at various fiber orientations and depths. These results showed excellent correlation with actual surface strain gage measurements. Recent advancements have applied MWM sensor technology to scanning COPVs for mechanical damage. This presentation will outline the recent advances in MWM technology and the development of MWM techniques for NDE and SHM of carbon-wrapped composite overwrapped pressure vessels (COPVs), including the measurement of internal stresses via a surface-mounted sensor array. In addition, this paper will outline recent efforts to produce sensors capable of making real-time measurements at temperatures up to 850 °C, and discuss previous results demonstrating the capability to monitor carbon fiber temperature changes within a composite material.

  17. Micromachined Chip Scale Thermal Sensor for Thermal Imaging.

    PubMed

    Shekhawat, Gajendra S; Ramachandran, Srinivasan; Jiryaei Sharahi, Hossein; Sarkar, Souravi; Hujsak, Karl; Li, Yuan; Hagglund, Karl; Kim, Seonghwan; Aden, Gary; Chand, Ami; Dravid, Vinayak P

    2018-02-27

    The lateral resolution of scanning thermal microscopy (SThM) has hitherto never approached that of mainstream atomic force microscopy, mainly due to poor performance of the thermal sensor. Herein, we report a nanomechanical system-based thermal sensor (thermocouple) that enables high lateral resolution that is often required in nanoscale thermal characterization in a wide range of applications. This thermocouple-based probe technology delivers excellent lateral resolution (∼20 nm), extended high-temperature measurements >700 °C without cantilever bending, and thermal sensitivity (∼0.04 °C). The origin of significantly improved figures-of-merit lies in the probe design that consists of a hollow silicon tip integrated with a vertically oriented thermocouple sensor at the apex (low thermal mass) which interacts with the sample through a metallic nanowire (50 nm diameter), thereby achieving high lateral resolution. The efficacy of this approach to SThM is demonstrated by imaging embedded metallic nanostructures in silica core-shell, metal nanostructures coated with polymer films, and metal-polymer interconnect structures. The nanoscale pitch and extremely small thermal mass of the probe promise significant improvements over existing methods and wide range of applications in several fields including semiconductor industry, biomedical imaging, and data storage.

  18. Advances in biologically inspired on/near sensor processing

    NASA Astrophysics Data System (ADS)

    McCarley, Paul L.

    1999-07-01

    As electro-optic sensors increase in size and frame rate, the data transfer and digital processing resource requirements also increase. In many missions, the spatial area of interest is but a small fraction of the available field of view. Choosing the right region of interest, however, is a challenge and still requires an enormous amount of downstream digital processing resources. In order to filter this ever-increasing amount of data, we look at how nature solves the problem. The Advanced Guidance Division of the Munitions Directorate, Air Force Research Laboratory at Eglin AFB, Florida, has been pursuing research in the area of advanced sensor and image processing concepts based on biologically inspired sensory information processing. A summary of two 'neuromorphic' processing efforts will be presented along with a seeker system concept utilizing this innovative technology. The Neuroseek program is developing a 256 × 256 two-color dual-band IRFPA coupled to an optimized silicon CMOS readout and processing integrated circuit that provides simultaneous full-frame imaging in the MWIR/LWIR wavebands along with built-in biologically inspired sensor image processing functions. Concepts and requirements for future such efforts will also be discussed.

  19. [Development of a Surgical Navigation System with Beam Split and Fusion of the Visible and Near-Infrared Fluorescence].

    PubMed

    Yang, Xiaofeng; Wu, Wei; Wang, Guoan

    2015-04-01

    This paper presents a surgical optical navigation system with non-invasive, real-time positioning characteristics for open surgical procedures. The design was based on the principle of near-infrared fluorescence molecular imaging. In vivo fluorescence excitation technology, multi-channel spectral camera technology and image fusion software technology were used. A visible and near-infrared ring LED excitation source, multi-channel band-pass filters, a two-CCD spectral camera optical sensor and computer systems were integrated, and, as a result, a new surgical optical navigation system was successfully developed. When a near-infrared fluorescent agent is injected, the system can display anatomical images of the tissue surface and near-infrared fluorescent functional images of the surgical field simultaneously. The system can identify the lymphatic vessels, lymph nodes, and tumor margins that the surgeon cannot find with the naked eye intra-operatively. Our research will effectively guide the surgeon in removing tumor tissue and significantly improve the success rate of surgery. The technologies have obtained a national patent, with patent No. ZI. 2011 1 0292374. 1.
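The visible/near-infrared fusion step described above might be sketched as follows (illustrative only, not the patented method: the threshold-and-blend rule and the green pseudo-color channel are assumptions):

```python
def fuse(visible, nir, threshold=0.3, alpha=0.6):
    """Overlay a NIR fluorescence image (pseudo-colored green) onto a
    visible RGB image. Pixels whose NIR signal exceeds `threshold` are
    alpha-blended toward green; all others pass through unchanged."""
    fused = []
    for vis_row, nir_row in zip(visible, nir):
        row = []
        for (r, g, b), f in zip(vis_row, nir_row):
            if f > threshold:
                row.append((r * (1 - alpha),
                            g * (1 - alpha) + alpha * f,
                            b * (1 - alpha)))
            else:
                row.append((r, g, b))
        fused.append(row)
    return fused

# Hypothetical 1x2 frame: strong fluorescence left, background right
visible = [[(0.5, 0.5, 0.5), (0.5, 0.5, 0.5)]]
nir = [[0.8, 0.1]]
fused = fuse(visible, nir)
```

A real system would run this per video frame so the fluorescent regions appear highlighted on the live anatomical view.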

  20. Plant health sensing

    NASA Technical Reports Server (NTRS)

    Manukian, Ara; Mckelvy, Colleen; Pearce, Michael; Syslo, Steph

    1988-01-01

    If plants are to be used as a food source for long term space missions, they must be grown in a stable environment where the health of the crops is continuously monitored. The sensor(s) to be used should detect any diseases or health problems before irreversible damage occurs. The method of analysis must be nondestructive and provide instantaneous information on the condition of the crop. In addition, the sensor(s) must be able to function in microgravity. This first semester, the plant health and disease sensing group concentrated on researching and consulting experts in many fields in attempts to find reliable plant health indicators. Once several indicators were found, technologies that could detect them were investigated. Eventually the three methods chosen to be implemented next semester were stimulus response monitoring, video image processing and chlorophyll level detection. Most of the other technologies investigated this semester are discussed here. They were rejected for various reasons but are included in the report because NASA may wish to consider pursuing them in the future.

  1. MEMS-based thermally-actuated image stabilizer for cellular phone camera

    NASA Astrophysics Data System (ADS)

    Lin, Chun-Ying; Chiou, Jin-Chern

    2012-11-01

    This work develops an image stabilizer (IS) that is fabricated using micro-electro-mechanical system (MEMS) technology and is designed to counteract hand vibrations when people use cellular phone cameras. The proposed IS has dimensions of 8.8 × 8.8 × 0.3 mm³ and is strong enough to suspend an image sensor. The processes utilized to fabricate the IS include inductively coupled plasma (ICP) processes, reactive ion etching (RIE) processes and the flip-chip bonding method. The IS is designed so that the electrical signals from the suspended image sensor can be successfully carried out over signal output beams, and the maximum actuating distance of the stage exceeds 24.835 µm when the driving current is 155 mA. By integrating the MEMS device with the designed controller, the proposed IS can decrease hand tremor by 72.5%.

  2. The AOLI Non-Linear Curvature Wavefront Sensor: High sensitivity reconstruction for low-order AO

    NASA Astrophysics Data System (ADS)

    Crass, Jonathan; King, David; Mackay, Craig

    2013-12-01

    Many adaptive optics (AO) systems in use today require bright reference objects to determine the effects of atmospheric distortions on incoming wavefronts. This requirement is because Shack Hartmann wavefront sensors (SHWFS) distribute incoming light from reference objects into a large number of sub-apertures. Bright natural reference objects occur infrequently across the sky leading to the use of laser guide stars which add complexity to wavefront measurement systems. The non-linear curvature wavefront sensor as described by Guyon et al. has been shown to offer a significant increase in sensitivity when compared to a SHWFS. This facilitates much greater sky coverage using natural guide stars alone. This paper describes the current status of the non-linear curvature wavefront sensor being developed as part of an adaptive optics system for the Adaptive Optics Lucky Imager (AOLI) project. The sensor comprises two photon-counting EMCCD detectors from E2V Technologies, recording intensity at four near-pupil planes. These images are used with a reconstruction algorithm to determine the phase correction to be applied by an ALPAO 241-element deformable mirror. The overall system is intended to provide low-order correction for a Lucky Imaging based multi CCD imaging camera. We present the current optical design of the instrument including methods to minimise inherent optical effects, principally chromaticity. Wavefront reconstruction methods are discussed and strategies for their optimisation to run at the required real-time speeds are introduced. Finally, we discuss laboratory work with a demonstrator setup of the system.

  3. Remote observations of reentering spacecraft including the space shuttle orbiter

    NASA Astrophysics Data System (ADS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David M.

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  4. Bundle Block Adjustment of Airborne Three-Line Array Imagery Based on Rotation Angles

    PubMed Central

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-01-01

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models. PMID:24811075
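The three-rotation-angle parameterization discussed in this record can be illustrated with a minimal sketch (assuming the common omega-phi-kappa photogrammetric convention; this is not the paper's adjustment code):

```python
import math

def rotation_matrix(omega, phi, kappa):
    """Exterior-orientation rotation matrix built directly from three
    rotation angles, composed as R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    co, so = math.cos(omega), math.sin(omega)
    cp, sp = math.cos(phi), math.sin(phi)
    ck, sk = math.cos(kappa), math.sin(kappa)
    rx = [[1, 0, 0], [0, co, -so], [0, so, co]]
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rz = [[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(matmul(rz, ry), rx)

# Small roll/pitch, larger heading (hypothetical values, radians)
R = rotation_matrix(0.01, -0.02, 1.5)
```

Because the angles appear explicitly, systematic angular errors in the POS observations can be modeled and compensated angle by angle, which is harder to do with a quaternion representation.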

  5. Bundle block adjustment of airborne three-line array imagery based on rotation angles.

    PubMed

    Zhang, Yongjun; Zheng, Maoteng; Huang, Xu; Xiong, Jinxin

    2014-05-07

    In the midst of the rapid developments in electronic instruments and remote sensing technologies, airborne three-line array sensors and their applications are being widely promoted and plentiful research related to data processing and high precision geo-referencing technologies is under way. The exterior orientation parameters (EOPs), which are measured by the integrated positioning and orientation system (POS) of airborne three-line sensors, however, have inevitable systematic errors, so the level of precision of direct geo-referencing is not sufficiently accurate for surveying and mapping applications. Consequently, a few ground control points are necessary to refine the exterior orientation parameters, and this paper will discuss bundle block adjustment models based on the systematic error compensation and the orientation image, considering the principle of an image sensor and the characteristics of the integrated POS. Unlike the models available in the literature, which mainly use a quaternion to represent the rotation matrix of exterior orientation, three rotation angles are directly used in order to effectively model and eliminate the systematic errors of the POS observations. Very good experimental results have been achieved with several real datasets that verify the correctness and effectiveness of the proposed adjustment models.

  6. Remote Observations of Reentering Spacecraft Including the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David

    2013-01-01

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  7. A geometric performance assessment of the EO-1 advanced land imager

    USGS Publications Warehouse

    Storey, James C.; Choate, M.J.; Meyer, D.J.

    2004-01-01

    The Earth Observing 1 (EO-1) Advanced Land Imager (ALI) demonstrates technology applicable to a successor system to the Landsat Thematic Mapper series. A study of the geometric performance characteristics of the ALI was conducted under the auspices of the EO-1 Science Validation Team. This study evaluated ALI performance with respect to absolute pointing knowledge, focal plane sensor chip assembly alignment, and band-to-band registration for purposes of comparing this new technology to the heritage Landsat systems. On-orbit geometric calibration procedures were developed that allowed the generation of ALI geometrically corrected products that compare favorably with their Landsat 7 counterparts with respect to absolute geodetic accuracy, internal image geometry, and band registration.

  8. Using a plenoptic camera to measure distortions in wavefronts affected by atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Eslami, Mohammed; Wu, Chensheng; Rzasa, John; Davis, Christopher C.

    2012-10-01

    Ideally, as planar wavefronts travel through an imaging system, all rays, or vectors pointing in the direction of the propagation of energy, are parallel, and thus the wavefront is focused to a particular point. If the wavefront arrives at an imaging system with energy vectors that point in different directions, each part of the wavefront will be focused at a slightly different point on the sensor plane and result in a distorted image. The Hartmann test, which involves the insertion of a series of pinholes between the imaging system and the sensor plane, was developed to sample the wavefront at different locations and measure the distortion angles at different points in the wavefront. An adaptive optic system, such as a deformable mirror, is then used to correct for these distortions and allow the planar wavefront to focus at the desired point on the sensor plane, thereby correcting the distorted image. The apertures of a pinhole array limit the amount of light that reaches the sensor plane; by replacing the pinholes with a microlens array, each bundle of rays is focused to brighten the image. Microlens arrays are making their way into newer imaging technologies, such as "light field" or "plenoptic" cameras. In these cameras, the microlens array is used to recover the ray information of the incoming light by using post-processing techniques to focus on objects at different depths. The goal of this paper is to demonstrate the use of these plenoptic cameras to recover the distortions in wavefronts. CODE V simulations show that a plenoptic camera, taking advantage of its microlens array, can provide more information than a Shack-Hartmann sensor. Using the microlens array to retrieve the ray information and then backstepping through the imaging system provides information about distortions in the arriving wavefront.
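The slope measurement shared by the Hartmann test and lenslet-based sensors can be sketched as follows (hypothetical values; the local wavefront tilt is recovered as the spot-centroid displacement divided by the lenslet focal length):

```python
def centroid(spot):
    """Intensity-weighted centroid (row, col) of a 2-D spot image."""
    total = wr = wc = 0.0
    for r, row in enumerate(spot):
        for c, v in enumerate(row):
            total += v
            wr += r * v
            wc += c * v
    return wr / total, wc / total

def local_tilt(spot, ref_rc, pixel_pitch, focal_length):
    """Local wavefront slope (radians, small-angle approximation) from
    the spot's centroid shift relative to an undistorted reference."""
    r, c = centroid(spot)
    dy = (r - ref_rc[0]) * pixel_pitch
    dx = (c - ref_rc[1]) * pixel_pitch
    return dy / focal_length, dx / focal_length

# Hypothetical 3x3 spot behind one lenslet, shifted slightly rightward
spot = [
    [0, 1, 0],
    [0, 4, 1],
    [0, 1, 0],
]
ty, tx = local_tilt(spot, ref_rc=(1.0, 1.0),
                    pixel_pitch=5e-6, focal_length=5e-3)
```

Repeating this per lenslet gives the slope map that a reconstructor integrates into a wavefront estimate; a plenoptic camera additionally retains per-lenslet ray direction detail for post-processing.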

  9. New frontiers for infrared

    NASA Astrophysics Data System (ADS)

    Corsi, C.

    2015-03-01

    Infrared (IR) science and technology has mainly been dedicated to surveillance and security: since the 1970s, specialized techniques have been emerging in thermal imaging for medical and cultural heritage diagnostics, building and aeronautic structure control, energy savings and remote sensing. Most of these applications were developed thanks to IR FPA sensors with high pixel counts, now working at room temperature. Besides these technological achievements in sensors/receivers, advanced IR laser sources up to far-IR bands have been developed in the form of quantum cascade lasers (QCLs), allowing wideband telecommunications and high-sensitivity systems for security. Recently, new sensors and sources with improved performance have been emerging in the very far IR region up to submillimeter wavelengths, the so-called terahertz (THz) region. A survey of the historical growth and a forecast of future developments in devices and systems for the new frontier of IR will be discussed, in particular for the key questions: "From where and when is IR coming?", "Where is it now?" and "Where will it go and when?". These questions will be treated for key systems (military/civil), key devices (sensors/sources), and new strategic technologies (nanotech/terahertz).

  10. The Employment Effects of High-Technology: A Case Study of Machine Vision. Research Report No. 86-19.

    ERIC Educational Resources Information Center

    Chen, Kan; Stafford, Frank P.

    A case study of machine vision was conducted to identify and analyze the employment effects of high technology in general. (Machine vision is the automatic acquisition and analysis of an image to obtain desired information for use in controlling an industrial activity, such as the visual sensor system that gives eyes to a robot.) Machine vision as…

  11. Collaborative Point Paper on Border Surveillance Technology

    DTIC Science & Technology

    2007-06-01

    Systems PLC LORHIS (Long Range Hyperspectral Imaging System) can be configured for either manned or unmanned aircraft to automatically detect and...Airships, and/or Aerostats, (RF, Electro-Optical, Infrared, Video) • Land-based Sensor Systems (Attended/Mobile and Unattended: e.g., CCD, Motion, Acoustic...electronic surveillance technologies for intrusion detection and warning. These ground-based systems are primarily short-range, up to around 500 meters

  12. Active Pixel Sensors: Are CCD's Dinosaurs?

    NASA Technical Reports Server (NTRS)

    Fossum, Eric R.

    1993-01-01

    Charge-coupled devices (CCD's) are presently the technology of choice for most imaging applications. In the 23 years since their invention in 1970, they have evolved to a sophisticated level of performance. However, as with all technologies, we can be certain that they will be supplanted someday. In this paper, the Active Pixel Sensor (APS) technology is explored as a possible successor to the CCD. An active pixel is defined as a detector array technology that has at least one active transistor within the pixel unit cell. The APS eliminates the need for nearly perfect charge transfer -- the Achilles' heel of CCDs. This perfect charge transfer makes CCD's radiation 'soft,' difficult to use under low light conditions, difficult to manufacture in large array sizes, difficult to integrate with on-chip electronics, difficult to use at low temperatures, difficult to use at high frame rates, and difficult to manufacture in non-silicon materials that extend wavelength response.

  13. The implementation of CMOS sensors within a real time digital mammography intelligent imaging system: The I-ImaS System

    NASA Astrophysics Data System (ADS)

    Esbrand, C.; Royle, G.; Griffiths, J.; Speller, R.

    2009-07-01

    The integration of technology with healthcare has undoubtedly propelled the medical imaging sector well into the twenty-first century. The concept of digital imaging introduced during the 1970s has since paved the way for established imaging techniques, of which digital mammography, phase contrast imaging and CT imaging are just a few examples. This paper presents a prototype intelligent digital mammography system designed and developed by a European consortium. The final system, the I-ImaS system, utilises CMOS monolithic active pixel sensor (MAPS) technology promoting on-chip data processing, enabling data processing and image acquisition to be performed simultaneously; consequently, statistical analysis of tissue is achievable in real time for the purpose of x-ray beam modulation via a feedback mechanism during the image acquisition procedure. The imager implements a dual array of twenty 520 pixel × 40 pixel CMOS MAPS sensing devices with a 32 μm pixel size, each individually coupled to a 100 μm thick thallium-doped structured CsI scintillator. This paper presents the first intelligent images obtained from the prototype system of real excised breast tissue, where the x-ray exposure was modulated via the statistical information extracted from the breast tissue itself. Conventional images were experimentally acquired and the statistical analysis of the data was done off-line, resulting in the production of simulated real-time intelligently optimised images. The results obtained indicate that real-time image optimisation using statistical information extracted from the breast as a feedback mechanism is beneficial and foreseeable in the near future.
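The kind of statistics-driven exposure feedback described in this record might look like the following sketch (illustrative only; the target mean, gain, and clamping limits are assumptions, not the I-ImaS algorithm):

```python
def region_stats(region):
    """Mean and variance of a 2-D list of normalized pixel values."""
    n = sum(len(row) for row in region)
    mean = sum(v for row in region for v in row) / n
    var = sum((v - mean) ** 2 for row in region for v in row) / n
    return mean, var

def next_exposure(exposure, region, target_mean=0.5, gain=0.8,
                  lo=0.1, hi=4.0):
    """Scale the exposure toward a target mean signal, clamped to safe
    limits. A dense (under-exposed) region pushes the exposure up."""
    mean, _ = region_stats(region)
    if mean <= 0:
        return hi  # no signal at all: fall back to maximum exposure
    scaled = exposure * (1 + gain * (target_mean / mean - 1))
    return max(lo, min(hi, scaled))

# Hypothetical under-exposed dense-tissue region (normalized intensities)
dense_tissue = [[0.12, 0.10], [0.11, 0.13]]
print(round(next_exposure(1.0, dense_tissue), 3))  # → 3.678
```

In the real system the decision would run on-chip during scanning, modulating the beam region by region rather than per whole image.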

  14. EOID Model Validation and Performance Prediction

    DTIC Science & Technology

    2002-09-30

    Our long-term goal is to accurately predict the capability of the current generation of laser-based underwater imaging sensors to perform Electro-Optic Identification (EOID) against relevant targets in a variety of realistic environmental conditions. The two most prominent technologies in this area

  15. NASA Tech Briefs, October 1998. Volume 22, No. 10

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Topics include: special coverage sections on sensors/imaging and mechanical technology, and sections on electronic components and circuits, electronic systems, software, materials, machinery/automation, manufacturing/fabrication, physical sciences, information sciences, book and reports, and a special section of Photonics Tech Briefs.

  16. Cameras Reveal Elements in the Short Wave Infrared

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.

  17. Plant stress analysis technology deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebadian, M.A.

    1998-01-01

    Monitoring vegetation is an active area of laser-induced fluorescence imaging (LIFI) research. The Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU) is assisting in the transfer of the LIFI technology to the agricultural private sector through a market survey. The market survey will help identify the key eco-agricultural issues of the nations that could benefit from the use of sensor technologies developed by the Office of Science and Technology (OST). The principal region of interest is the Western Hemisphere, particularly, the rapidly growing countries of Latin America and the Caribbean. The analysis of needs will assure that the focus of present and future research will center on economically important issues facing both hemispheres. The application of the technology will be useful to the agriculture industry for airborne crop analysis as well as in the detection and characterization of contaminated sites by monitoring vegetation. LIFI airborne and close-proximity systems will be evaluated as stand-alone technologies and additions to existing sensor technologies that have been used to monitor crops in the field and in storage.

  18. Measuring noise equivalent irradiance of a digital short-wave infrared imaging system using a broadband source to simulate the night spectrum

    NASA Astrophysics Data System (ADS)

    Green, John R.; Robinson, Timothy

    2015-05-01

    There is a growing interest in developing helmet-mounted digital imaging systems (HMDIS) for integration into military aircraft cockpits. This interest stems from the multiple advantages of digital over analog imaging, such as image fusion from multiple sensors, data processing to enhance image contrast, superposition of non-imaging data over the image, and sending images to a remote location for analysis. There are several properties an HMDIS must have in order to aid the pilot during night operations. In addition to resolution, image refresh rate, dynamic range, and sensor uniformity over the entire Focal Plane Array (FPA), the imaging system must have the sensitivity to detect the limited night light available, filtered through cockpit transparencies. Digital sensor sensitivity is generally measured monochromatically using a laser with a wavelength near the peak detector quantum efficiency, and is generally reported as either the Noise Equivalent Power (NEP) or Noise Equivalent Irradiance (NEI). This paper proposes a test system that measures NEI of Short-Wave Infrared (SWIR) digital imaging systems using a broadband source that simulates the night spectrum. This method has a few advantages over a monochromatic method. Namely, the test conditions provide a spectrum closer to what is experienced by the end user, and the resulting NEI may be compared directly to modeled night-glow irradiance calculations. This comparison may be used to assess the Technology Readiness Level of the imaging system for the application. The test system is being developed under a Cooperative Research and Development Agreement (CRADA) with the Air Force Research Laboratory.
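At its simplest, the noise-equivalent-irradiance figure discussed above reduces to a ratio: the sensor noise floor divided by the product of quantum efficiency, pixel area, and integration time. A minimal sketch of that relation (the function name and all numeric values below are illustrative assumptions, not parameters from the paper; a real broadband NEI test would also weight the source spectrum by the detector's spectral quantum efficiency):

```python
def nei_photons(noise_e, qe, pixel_area_cm2, t_int_s):
    """Noise-equivalent irradiance: the photon irradiance
    [photons / (s * cm^2)] at which the charge collected in one
    integration equals the sensor noise floor (SNR = 1).
    Illustrative scalar form only."""
    return noise_e / (qe * pixel_area_cm2 * t_int_s)

# e.g. 20 e- noise floor, QE 0.7, 15 um pixels, 10 ms integration
print(nei_photons(20, 0.7, (15e-4) ** 2, 0.01))  # ~1.3e9 photons/s/cm^2
```

The broadband method in the paper effectively folds the night-spectrum weighting into this same SNR = 1 condition.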

  19. Origami silicon optoelectronics for hemispherical electronic eye systems.

    PubMed

    Zhang, Kan; Jung, Yei Hwan; Mikael, Solomon; Seo, Jung-Hun; Kim, Munho; Mi, Hongyi; Zhou, Han; Xia, Zhenyang; Zhou, Weidong; Gong, Shaoqin; Ma, Zhenqiang

    2017-11-24

    Digital image sensors in hemispherical geometries offer unique imaging advantages over their planar counterparts, such as wide field of view and low aberrations. Deforming miniature semiconductor-based sensors with high spatial resolution into such a format is challenging. Here we report a simple origami approach for fabricating single-crystalline silicon-based focal plane arrays and artificial compound eyes that have hemisphere-like structures. Convex isogonal polyhedral concepts allow certain combinations of polygons to fold into spherical formats. Using each polygon block as a sensor pixel, the silicon-based devices are shaped into maps of a truncated icosahedron, fabricated on flexible sheets, and further folded into either a concave or convex hemisphere. These two electronic eye prototypes represent simple and low-cost methods as well as flexible optimization parameters in terms of pixel density and design. The results demonstrated in this work, combined with the miniature size and simplicity of the design, establish a practical technology for integration with conventional electronic devices.

  20. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond Earth orbit (and the commercial sector focused on low Earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long-term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach.
Motivations for development of an Agency level imagery-based measurement capability to support cross cutting applications that span the Agency mission directorates as well as meeting potential needs of the commercial sector and national interests of the Intelligence, Surveillance and Reconnaissance community are explored. A recommendation is made for an assessment study to baseline current imaging technology including the identification of future mission requirements. Development of requirements fostered by the applications suggested in this paper would be used to identify technology gaps and direct roadmapping for implementation of an affordable and sustainable next generation sensor/platform system.

  1. Design and application of a small size SAFT imaging system for concrete structure

    NASA Astrophysics Data System (ADS)

    Shao, Zhixue; Shi, Lihua; Shao, Zhe; Cai, Jian

    2011-07-01

    A method of ultrasonic imaging detection is presented for quick non-destructive testing (NDT) of concrete structures using the synthetic aperture focusing technique (SAFT). A low-cost ultrasonic sensor array consisting of 12 commercially available low-frequency ultrasonic transducers is designed and manufactured. A channel compensation method is proposed to improve the consistency of the different transducers. The controlling devices for array scanning as well as the virtual instrument for SAFT imaging are designed. In the coarse scan mode, with a scan step of 50 mm, the system can quickly give an image display of a cross section of 600 mm (L) × 300 mm (D) in one measurement. In the refined scan mode, the system can reduce the scan step and image the same cross section by moving the sensor array several times. Experiments on a staircase specimen, a concrete slab with an embedded target, and a building floor with an underground pipeline all verify the efficiency of the proposed method.
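SAFT reconstruction as used in the system above is, at its core, a delay-and-sum over the recorded pulse-echo traces: each image point accumulates, from every transducer, the sample whose arrival time matches the two-way travel time to that point. A minimal sketch under assumed geometry and units (the function name, array layout, and wave speed are illustrative, not the paper's values):

```python
import numpy as np

def saft_image(signals, elem_x, grid_x, grid_z, c, fs):
    """Delay-and-sum SAFT for a linear array of pulse-echo transducers.

    signals : (n_elem, n_samp) recorded traces, one per transducer
    elem_x  : (n_elem,) transducer x-positions [m]
    grid_x, grid_z : image-grid coordinates [m]
    c : wave speed [m/s];  fs : sampling rate [Hz]
    """
    n_elem, n_samp = signals.shape
    img = np.zeros((len(grid_z), len(grid_x)))
    rows = np.arange(n_elem)
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # two-way travel time from each element to this focus point
            t = 2.0 * np.hypot(elem_x - x, z) / c
            idx = np.round(t * fs).astype(int)
            ok = idx < n_samp
            img[iz, ix] = signals[rows[ok], idx[ok]].sum()
    return img
```

Echoes from a real reflector line up coherently at its true position and sum constructively, while elsewhere they add incoherently; that is the focusing effect the synthetic aperture provides.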

  2. Miniature optical planar camera based on a wide-angle metasurface doublet corrected for monochromatic aberrations

    DOE PAGES

    Arbabi, Amir; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; ...

    2016-11-28

    Optical metasurfaces are two-dimensional arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optics by enabling complex low-cost systems where multiple metasurfaces are lithographically stacked and integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has a small f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with 70% focusing efficiency. The camera exhibits nearly diffraction-limited image quality, which indicates the potential of this technology in the development of optical systems for microscopy, photography, and computer vision.

  3. Roadside IED detection using subsurface imaging radar and rotary UAV

    NASA Astrophysics Data System (ADS)

    Qin, Yexian; Twumasi, Jones O.; Le, Viet Q.; Ren, Yu-Jiun; Lai, C. P.; Yu, Tzuyang

    2016-05-01

    Modern improvised explosive device (IED) and mine detection sensors using microwave technology are based on ground penetrating radar operated from a ground vehicle. Vehicle size, road conditions, and obstacles along the troop marching direction limit the operation of such sensors. This paper presents a new conceptual design using a rotary unmanned aerial vehicle (UAV) to carry subsurface imaging radar for roadside IED detection. We have built a UAV flight simulator with the subsurface imaging radar running in a laboratory environment and tested it with non-metallic and metallic IED-like targets. From the initial lab results, we can detect an IED-like target 10 cm below the road surface while carried by a UAV platform. One of the challenges is to design the radar and antenna system for a very small payload (less than 3 lb). The motion compensation algorithm is also critical to the imaging quality. In this paper, we also present algorithm simulations and experimental imaging results with different IED target materials, sizes, and clutter.

  4. The Application of Virtex-II Pro FPGA in High-Speed Image Processing Technology of Robot Vision Sensor

    NASA Astrophysics Data System (ADS)

    Ren, Y. J.; Zhu, J. G.; Yang, X. Y.; Ye, S. H.

    2006-10-01

    The Virtex-II Pro FPGA is applied to the vision sensor tracking system of an IRB2400 robot. The hardware platform, which undertakes the task of improving SNR and compressing data, is built around the FPGA's high-speed image processing. The lower-level image-processing algorithm is realized by combining the FPGA fabric and the embedded CPU. Image processing is accelerated by the introduction of the FPGA and CPU, and the embedded CPU makes the interface logic design easy to realize. Some key techniques are presented in the text, such as the read-write process, template matching, and convolution, and several modules are simulated as well. Finally, a comparison is carried out among implementations using this design, a PC, and a DSP. Because the core of the high-speed image processing system is an FPGA chip, whose function can be conveniently updated, the measurement system is, to a degree, intelligent.

  5. Spinoff 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Burnishing Techniques Strengthen Hip Implants; Signal Processing Methods Monitor Cranial Pressure; Ultraviolet-Blocking Lenses Protect, Enhance Vision; Hyperspectral Systems Increase Imaging Capabilities; Programs Model the Future of Air Traffic Management; Tail Rotor Airfoils Stabilize Helicopters, Reduce Noise; Personal Aircraft Point to the Future of Transportation; Ducted Fan Designs Lead to Potential New Vehicles; Winglets Save Billions of Dollars in Fuel Costs; Sensor Systems Collect Critical Aerodynamics Data; Coatings Extend Life of Engines and Infrastructure; Radiometers Optimize Local Weather Prediction; Energy-Efficient Systems Eliminate Icing Danger for UAVs; Rocket-Powered Parachutes Rescue Entire Planes; Technologies Advance UAVs for Science, Military; Inflatable Antennas Support Emergency Communication; Smart Sensors Assess Structural Health; Hand-Held Devices Detect Explosives and Chemical Agents; Terahertz Tools Advance Imaging for Security, Industry; LED Systems Target Plant Growth; Aerogels Insulate Against Extreme Temperatures; Image Sensors Enhance Camera Technologies; Lightweight Material Patches Allow for Quick Repairs; Nanomaterials Transform Hairstyling Tools; Do-It-Yourself Additives Recharge Auto Air Conditioning; Systems Analyze Water Quality in Real Time; Compact Radiometers Expand Climate Knowledge; Energy Servers Deliver Clean, Affordable Power; Solutions Remediate Contaminated Groundwater; Bacteria Provide Cleanup of Oil Spills, Wastewater; Reflective Coatings Protect People and Animals; Innovative Techniques Simplify Vibration Analysis; Modeling Tools Predict Flow in Fluid Dynamics; Verification Tools Secure Online Shopping, Banking; Toolsets Maintain Health of Complex Systems; Framework Resources Multiply Computing Power; Tools Automate Spacecraft Testing, Operation; GPS Software Packages Deliver Positioning Solutions; Solid-State Recorders Enhance Scientific Data Collection; Computer Models Simulate 
Fine Particle Dispersion; Composite Sandwich Technologies Lighten Components; Cameras Reveal Elements in the Short Wave Infrared; Deformable Mirrors Correct Optical Distortions; Stitching Techniques Advance Optics Manufacturing; Compact, Robust Chips Integrate Optical Functions; Fuel Cell Stations Automate Processes, Catalyst Testing; Onboard Systems Record Unique Videos of Space Missions; Space Research Results Purify Semiconductor Materials; and Toolkits Control Motion of Complex Robotics.

  6. Concept of electro-optical sensor module for sniper detection system

    NASA Astrophysics Data System (ADS)

    Trzaskawka, Piotr; Dulski, Rafal; Kastek, Mariusz

    2010-10-01

    The paper presents an initial concept of an electro-optical sensor unit for sniper detection. This unit, comprising thermal and daylight cameras, can operate as a standalone device, but its primary application is in a multi-sensor sniper and shot detection system. As part of a larger system, it should contribute to greater overall system efficiency and a lower false alarm rate thanks to data and sensor fusion techniques. Additionally, it is expected to provide some pre-shot detection capabilities. Acoustic (or radar) systems used for shot detection generally offer only "after-the-shot" information and cannot prevent an enemy attack, which in the case of a skilled sniper opponent usually means trouble. The passive imaging sensors presented in this paper, together with active systems detecting pointed optics, are capable of detecting specific shooter signatures, or at least the presence of suspect objects in the vicinity. The proposed sensor unit uses a thermal camera as the primary sniper and shot detection tool. The basic camera parameters, such as focal plane array size and type, focal length, and aperture, were chosen on the basis of the assumed tactical characteristics of the system (mainly detection range) and the current technology level. In order to provide a cost-effective solution, commercially available daylight camera modules and infrared focal plane arrays were tested, including fast cooled infrared array modules capable of a 1000 fps image acquisition rate. The daylight camera operates as a support, providing a corresponding visual image that is easier for a human operator to comprehend. The initial assumptions concerning sensor operation were verified during laboratory and field tests, and some example shot recording sequences are presented.

  7. Band registration of tuneable frame format hyperspectral UAV imagers in complex scenes

    NASA Astrophysics Data System (ADS)

    Honkavaara, Eija; Rosnell, Tomi; Oliveira, Raquel; Tommaselli, Antonio

    2017-12-01

    A recent revolution in miniaturised sensor technology has provided markets with novel hyperspectral imagers operating on the frame format principle. In the case of unmanned aerial vehicle (UAV) based remote sensing, frame format technology is highly attractive in comparison to the commonly utilised pushbroom scanning technology, because it offers better stability and the possibility to capture stereoscopic data sets, bringing an opportunity for 3D hyperspectral object reconstruction. Tuneable filters are one approach to capturing multi- or hyperspectral frame images. The individual bands are not aligned when a sensor based on tuneable filters is operated from a mobile platform, such as a UAV, because the full spectrum is recorded time-sequentially. The objective of this investigation was to study the aspects of band registration of an imager based on tuneable filters and to develop a rigorous and efficient approach for band registration in complex 3D scenes, such as forests. The method first determines the orientations of selected reference bands and reconstructs the 3D scene using structure-from-motion and dense image matching technologies. The bands without orientation are then matched to the oriented bands, accounting for the 3D scene, to provide exterior orientations; afterwards, hyperspectral orthomosaics, or hyperspectral point clouds, are calculated. The uncertainty aspects of the novel approach were studied. An empirical assessment was carried out in a forested environment using hyperspectral images captured with a hyperspectral 2D frame format camera, based on a tuneable Fabry-Pérot interferometer (FPI), on board a multicopter and supported by a high-spatial-resolution consumer colour camera. A theoretical assessment showed that the method was capable of providing band registration accuracy better than 0.5 pixel.
The empirical assessment confirmed this performance and showed that, with the novel method, most of the band misalignments were smaller than the pixel size. Furthermore, it was shown that the performance of the band alignment depended on the spatial distance from the reference band.

  8. Infrared hyperspectral imaging miniaturized for UAV applications

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-02-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both MWIR and LWIR versions, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of the sensor. This new approach to the infrared hyperspectral imaging spectrometer uses micro-optics and is explained in this paper. The micro-optics consist of an area array of diffractive optical elements where each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough to serve as a payload on a mini-UAV or commercial quadcopter, along with an example of how this technology can easily be used to quantify a hydrocarbon gas leak's volume and mass flow rates. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. We have developed systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 × 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 × 512 focal plane array, gives a spatial resolution of 256 × 256 pixels for each spectral image. Another system we developed uses a 4 × 4 lenslet array on a 1024 × 1024 pixel focal plane array, which gives 16 spectral images of 256 × 256 pixel resolution each frame.
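The lenslet-array geometry described above implies a simple trade between the number of spectral bands and per-band spatial resolution: an N × N array yields N² bands, each at the focal-plane resolution divided by N per axis. A sketch of that arithmetic (the function name is hypothetical):

```python
def lenslet_layout(fpa_pixels, lenslets_per_side):
    """Each lenslet images the full scene onto its own sub-tile of the
    focal plane array, so an N x N lenslet array yields N*N spectral
    bands, each with the FPA resolution divided by N along each axis."""
    bands = lenslets_per_side ** 2
    sub = fpa_pixels // lenslets_per_side
    return bands, (sub, sub)

# The two configurations described in the abstract:
print(lenslet_layout(512, 2))   # 4 bands of 256 x 256
print(lenslet_layout(1024, 4))  # 16 bands of 256 x 256
```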

  9. A monolithic 640 × 512 CMOS imager with high-NIR sensitivity

    NASA Astrophysics Data System (ADS)

    Lauxtermann, Stefan; Fisher, John; McDougal, Michael

    2014-06-01

    In this paper we present first results from a backside illuminated CMOS image sensor that we fabricated on high resistivity silicon. Compared to conventional CMOS imagers, a thicker photosensitive membrane can be depleted when using silicon with low background doping concentration while maintaining low dark current and good MTF performance. The benefits of such a fully depleted silicon sensor are high quantum efficiency over a wide spectral range and a fast photo detector response. Combining these characteristics with the circuit complexity and manufacturing maturity available from a modern, mixed signal CMOS technology leads to a new type of sensor, with an unprecedented performance spectrum in a monolithic device. Our fully depleted, backside illuminated CMOS sensor was designed to operate at integration times down to 100nsec and frame rates up to 1000Hz. Noise in Integrate While Read (IWR) snapshot shutter operation for these conditions was simulated to be below 10e- at room temperature. 2×2 binning with a 4× increase in sensitivity and a maximum frame rate of 4000 Hz is supported. For application in hyperspectral imaging systems the full well capacity in each row can individually be programmed between 10ke-, 60ke- and 500ke-. On test structures we measured a room temperature dark current of 360pA/cm2 at a reverse bias of 3.3V. A peak quantum efficiency of 80% was measured with a single layer AR coating on the backside. Test images captured with the 50μm thick VGA imager between 30Hz and 90Hz frame rate show a strong response at NIR wavelengths.

  10. Knowledge-based imaging-sensor fusion system

    NASA Technical Reports Server (NTRS)

    Westrom, George

    1989-01-01

    An imaging system which applies knowledge-based technology to supervise and control both sensor hardware and computation in the imaging system is described. It includes the development of an imaging system breadboard which brings together into one system work that we and others have pursued for LaRC for several years. The goal is to combine Digital Signal Processing (DSP) with Knowledge-Based Processing, and also to include Neural Net processing. The system is considered a smart camera. Imagine that there is a microgravity experiment on board Space Station Freedom with a high-frame-rate, high-resolution camera. All the data cannot possibly be transmitted to a laboratory on Earth; in fact, only a small fraction of the data will be received. Again, imagine being responsible for some experiments on Mars with the Mars Rover: the data rate is a few kilobits per second for data from several sensors and instruments. Would it not be preferable to have a smart system which would have some human knowledge, follow instructions, and attempt to make the best use of the limited bandwidth for transmission? The system concept, the current status of the breadboard system, and some recent experiments at the Mars-like Amboy Lava Fields in California are discussed.

  11. A 256×256 low-light-level CMOS imaging sensor with digital CDS

    NASA Astrophysics Data System (ADS)

    Zou, Mei; Chen, Nan; Zhong, Shengyou; Li, Zhengfen; Zhang, Jicun; Yao, Li-bin

    2016-10-01

    In order to achieve high sensitivity for low-light-level CMOS image sensors (CIS), a capacitive transimpedance amplifier (CTIA) pixel circuit with a small integration capacitor is used. As the pixel and column areas are highly constrained, it is difficult to implement analog correlated double sampling (CDS) to remove the noise of a low-light-level CIS. A digital CDS is therefore adopted, which realizes the subtraction between the reset signal and the pixel signal off-chip. The pixel reset noise and part of the column fixed-pattern noise (FPN) can be greatly reduced. A 256×256 CIS with a CTIA array and digital CDS is implemented in a 0.35μm CMOS technology. The chip size is 7.7mm×6.75mm, and the pixel size is 15μm×15μm with a fill factor of 20.6%. The measured pixel noise with digital CDS is 24 LSB (RMS) in dark conditions, a 7.8× reduction compared to the same sensor without digital CDS. Running at 7 fps, this low-light-level CIS can capture recognizable images at illumination levels down to 0.1 lux.
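The off-chip digital CDS subtraction described above can be illustrated with synthetic frames: the reset noise and per-column offsets are common to both the reset sample and the pixel sample, so differencing the two frames cancels them and leaves only the integrated signal (all array sizes and noise levels here are illustrative, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

scene = rng.uniform(50.0, 200.0, (256, 256))    # true pixel signal (DN)
col_fpn = rng.normal(0.0, 8.0, (1, 256))        # per-column fixed-pattern offset
reset_noise = rng.normal(0.0, 5.0, (256, 256))  # kTC noise, frozen at reset

# The reset sample and the pixel sample share the same reset noise and
# column offsets, so subtracting them off-chip (the digital CDS) cancels
# both and recovers the scene.
reset_frame = 100.0 + col_fpn + reset_noise
signal_frame = reset_frame + scene

cds = signal_frame - reset_frame
print(np.allclose(cds, scene))   # True
```

Uncorrelated noise sources, such as shot noise accumulated during integration, are not removed by this subtraction, which is why the paper still reports a residual pixel noise floor.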

  12. GEOScan: A GEOScience Facility From Space

    NASA Astrophysics Data System (ADS)

    Dyrud, L. P.; Fentzke, J. T.; Anderson, B. J.; Bishop, R. L.; Bust, G. S.; Cahoy, K.; Erlandson, R. E.; Fish, C. S.; Gunter, B. C.; Hall, F. G.; Hilker, T.; Lorentz, S. R.; Mazur, J. E.; Murphy, S. D.; Mustard, J. F.; O'Brien, P. P.; Slagowski, S.; Trenberth, K. E.; Wiscombe, W. J.

    2012-12-01

    GEOScan is a proposed globally networked orbiting facility that will provide revolutionary, massively dense global geosciences observations. Major scientific research projects are typically conducted using two approaches: community facilities, or investigator-led focused missions. GEOScan is a new concept in space science, blending the PI mission and community facility models: it is PI-led, but it carries sensors that are the result of a grass-roots competition, and, uniquely, it preserves open slots for sensors which are purposely not yet decided. The goal is threefold: first, to select sensors that maximize science value for the greatest number of scientific disciplines; second, to target science questions that cannot be answered without simultaneous global space-based measurements; and third, to reap the cost advantages of scale manufacturing for space instrumentation. The relatively small size, mass, and power requirements of the GEOScan sensor suite would make it an ideal hosted payload aboard a global constellation of communication satellites, such as Iridium NEXT's 66-satellite constellation, or as a hosted small-sat payload. Each GEOScan sensor suite consists of 6 instruments: a Radiometer to measure Earth's total outgoing radiation; a GPS Compact Total Electron Content Sensor to image Earth's plasma environment and gravity field; a MicroCam Multispectral Imager to provide the first uniform, instantaneous image of Earth and measure global cloud cover, vegetation, land use, and bright aurora; a Radiation Belt Mapping System (dosimeter) to measure energetic electron and proton distributions; a Compact Earth Observing Spectrometer to measure aerosol-atmospheric composition and vegetation; and MEMS Accelerometers to deduce non-conservative forces aiding gravity and neutral drag studies.
These instruments, employed in a constellation, can provide major breakthroughs in Earth and Geospace science, as well as offering a low-cost technology demonstration for operational weather, climate, and land-imaging.

  13. Handheld and mobile hyperspectral imaging sensors for wide-area standoff detection of explosives and chemical warfare agents

    NASA Astrophysics Data System (ADS)

    Gomer, Nathaniel R.; Gardner, Charles W.; Nelson, Matthew P.

    2016-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the investigation and analysis of targets in complex backgrounds with a high degree of autonomy. HSI is beneficial for the detection of threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Two HSI techniques that have proven valuable are Raman and shortwave infrared (SWIR) HSI. Unfortunately, current generation HSI systems have numerous size, weight, and power (SWaP) limitations that make their potential integration onto a handheld or field-portable platform difficult. The systems that are field-portable achieve portability by sacrificing system performance, typically providing an inefficient area search rate, requiring close proximity to the target for screening, and/or eliminating the potential to conduct real-time measurements. To address these shortcomings, ChemImage Sensor Systems (CISS) is developing a variety of wide-field hyperspectral imaging systems. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot-based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide a higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors, focusing on sensor design and detection results.

  14. Bio-inspired multi-mode optic flow sensors for micro air vehicles

    NASA Astrophysics Data System (ADS)

    Park, Seokjun; Choi, Jaehyuk; Cho, Jihyun; Yoon, Euisik

    2013-06-01

    Monitoring wide-field surrounding information is essential for vision-based autonomous navigation of micro air vehicles (MAV). Our image-cube (iCube) module, which consists of multiple sensors facing different angles in 3-D space, can be applied to wide field-of-view optic flow estimation (μ-Compound eyes) and to attitude control (μ-Ocelli) in the Micro Autonomous Systems and Technology (MAST) platforms. In this paper, we report an analog/digital (A/D) mixed-mode optic-flow sensor, which generates both optic flows and normal images in different modes for μ-Compound eyes and μ-Ocelli applications. The sensor employs a time-stamp-based optic flow algorithm, modified from the conventional EMD (Elementary Motion Detector) algorithm to give an optimum partitioning of hardware blocks between the analog and digital domains as well as an adequate allocation of pixel-level, column-parallel, and chip-level signal processing. Temporal filtering, which would require huge hardware resources if implemented in the digital domain, is retained in a pixel-level analog processing unit. The rest of the blocks, including feature detection and time-stamp latching, are implemented using digital circuits in a column-parallel processing unit. Finally, time-stamp information is decoded into velocity using look-up tables, multiplications, and simple subtraction circuits in a chip-level processing unit, thus significantly reducing the core digital processing power consumption. In the normal image mode, the sensor generates 8-b digital images using single-slope ADCs in the column unit. In the optic flow mode, the sensor estimates 8-b 1-D optic flows from the integrated mixed-mode algorithm core and 2-D optic flows with external time-stamp processing.
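The final chip-level step, decoding a pair of feature time-stamps into a velocity, is essentially a time-of-travel division by the pixel pitch. A minimal sketch (the function name, units, and the 15 μm pitch are illustrative assumptions, not values from the paper):

```python
def flow_from_timestamps(t_a, t_b, pitch_um=15.0):
    """1-D time-of-travel optic flow: a feature detected at pixel A at
    time t_a reaches the neighbouring pixel B at time t_b; the velocity
    is the pixel pitch divided by the travel time, signed by direction."""
    dt = t_b - t_a
    if dt == 0:
        return None              # feature did not move between samples
    return pitch_um / dt         # um per time unit; negative = reverse

print(flow_from_timestamps(0.0, 2.0))   # 7.5, motion from A toward B
print(flow_from_timestamps(2.0, 0.0))   # -7.5, motion from B toward A
```

In hardware, the division would be replaced by the look-up tables and multipliers the abstract mentions; the sketch above only shows the relation being tabulated.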

  15. Development of an image converter of radical design. [employing solid state electronics towards the production of an advanced engineering model camera system

    NASA Technical Reports Server (NTRS)

    Irwin, E. L.; Farnsworth, D. L.

    1972-01-01

    A long-term investigation of thin-film sensors, monolithic photo-field-effect transistors, and epitaxially diffused phototransistors and photodiodes, undertaken to meet the requirements of an acceptable all-solid-state, electronically scanned imaging system, led to the production of an advanced engineering model camera that employs a 200,000-element phototransistor array (organized in a matrix of 400 rows by 500 columns) to secure resolution comparable to commercial television. The full investigation is described for the period July 1962 through July 1972 and covers the following broad topics in detail: (1) sensor monoliths; (2) fabrication technology; (3) functional theory; (4) system methodology; and (5) deployment profile. A summary of the work and conclusions are given, along with extensive schematic diagrams of the final solid-state imaging system product.

  16. High-Frequency Fiber-Optic Ultrasonic Sensor Using Air Micro-Bubble for Imaging of Seismic Physical Models.

    PubMed

    Gang, Tingting; Hu, Manli; Rong, Qiangzhou; Qiao, Xueguang; Liang, Lei; Liu, Nan; Tong, Rongxin; Liu, Xiaobo; Bian, Ce

    2016-12-14

    A micro-fiber-optic Fabry-Perot interferometer (FPI) is proposed and demonstrated experimentally for ultrasonic imaging of seismic physical models. The device consists of a micro-bubble formed at the end of a single-mode fiber (SMF). The micro-structure is formed by a discharge operation on a short segment of hollow-core fiber (HCF) spliced to the SMF. This micro FPI is sensitive to ultrasonic waves (UWs), especially high-frequency (up to 10 MHz) UWs, thanks to its ultra-thin cavity wall and micro-scale diameter. A side-band filter technique is employed for UW interrogation, yielding a UW signal with a high signal-to-noise ratio (SNR). Finally, the sensor is used for lateral imaging of the physical model by scanning UW detection and two-dimensional signal reconstruction.

  17. Event-Based Tone Mapping for Asynchronous Time-Based Image Sensor

    PubMed Central

    Simon Chane, Camille; Ieng, Sio-Hoi; Posch, Christoph; Benosman, Ryad B.

    2016-01-01

    The asynchronous time-based neuromorphic image sensor ATIS is an array of autonomously operating pixels able to encode luminance information with an exceptionally high dynamic range (>143 dB). This paper introduces an event-based methodology to display data from this type of event-based imager, taking into account the large dynamic range and high temporal accuracy that go beyond available mainstream display technologies. We introduce an event-based tone mapping methodology for asynchronously acquired, time-encoded gray-level data. A global and a local tone mapping operator are proposed. Both are designed to operate on a stream of incoming events rather than on time-frame windows. Experimental results on real outdoor scenes are presented to evaluate the performance of the tone mapping operators in terms of quality, temporal stability, adaptation capability, and computational time. PMID:27642275
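As an illustration of a global operator working event-by-event rather than on frames, the following sketch (not the paper's operator; the log compression and running-range update are assumptions) maps each incoming high-dynamic-range gray-level event into an 8-bit display code:

```python
import math

class GlobalToneMapper:
    """Schematic event-driven global tone mapper (illustrative only).

    Each ATIS-style event carries an absolute gray-level measurement
    with very high dynamic range. The mapper tracks the running
    log-intensity range over the stream and compresses each new event
    into 8 bits, updating pixel by pixel rather than frame by frame.
    """
    def __init__(self):
        self.lo = float("inf")
        self.hi = float("-inf")

    def map_event(self, intensity):
        v = math.log10(intensity)       # compress the >143 dB range
        self.lo = min(self.lo, v)
        self.hi = max(self.hi, v)
        if self.hi == self.lo:
            return 0                    # no range observed yet
        return round(255 * (v - self.lo) / (self.hi - self.lo))

tm = GlobalToneMapper()
stream = [1.0, 10.0, 100.0, 1e4, 31.6]   # 80 dB of intensity range
codes = [tm.map_event(i) for i in stream]
print(codes)
```

Note that the mapping adapts as the stream extends its observed range, which is the appeal of an event-driven global operator.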

  18. Welcome to health information science and systems.

    PubMed

    Zhang, Yanchun

    2013-01-01

    Health Information Science and Systems is an exciting new multidisciplinary journal that aims to use computer-science technologies to assist in disease diagnosis, treatment, prediction, and monitoring through the modeling, design, development, visualization, integration, and management of health-related information. These technologies include information systems, web technologies, data mining, image processing, user interaction and interfaces, sensors, and wireless networking, and are applicable to a wide range of health-related information, including medical, biomedical, bioinformatics, and public health data.

  19. Review of Fusion Systems and Contributing Technologies for SIHS-TD (Examen des Systemes de Fusion et des Technologies d’Appui pour la DT SIHS)

    DTIC Science & Technology

    2007-03-31

    Unlimited; Nivisys; Insight Technology; Elcan; FLIR Systems; Stanford Photonics. Hardware: sensor fusion processors; video processing boards; image, video... The SPIE Digital Library is a resource for optics and photonics information. It contains more than 70,000 full-text papers from SPIE... conditions. Top row: Stanford Photonics XR-Mega-10 Extreme, 1400 × 1024 pixel ICCD detector, 33 ms exposure, no binning. Middle row: Andor EEV iXon

  20. How Many Pixels Does It Take to Make a Good 4"×6" Print? Pixel Count Wars Revisited

    NASA Astrophysics Data System (ADS)

    Kriss, Michael A.

    Digital still cameras emerged following the introduction of the Sony Mavica analog prototype camera in 1981. These early cameras produced poor image quality and did not challenge film cameras for overall quality. By 1995, digital still cameras in expensive SLR formats had 6 mega-pixels and produced high-quality images (with significant image processing). In 2005, significant improvement in image quality was apparent, and lower prices for digital still cameras (DSCs) started a rapid decline in film usage and film camera sales. By 2010 film usage was mostly limited to professionals and the motion picture industry. The rise of DSCs was marked by a “pixel war” in which the driving feature of the cameras was the pixel count, where even moderate-cost (~120) DSCs would have 14 mega-pixels. The improvement of CMOS technology pushed this trend of lower prices and higher pixel counts. Only the single-lens reflex cameras had large sensors and large pixels. The drive for smaller pixels hurt the quality aspects of the final image (sharpness, noise, speed, and exposure latitude). Only today are camera manufacturers starting to reverse course and produce DSCs with larger sensors and pixels. This paper explores why larger pixels and sensors are key to the future of DSCs.
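The arithmetic behind the title question is straightforward: at the common 300 dpi "photo quality" printing benchmark (an assumption here; the paper's full answer also weighs sharpness, noise, and exposure latitude), a 4"×6" print needs only about 2.2 megapixels:

```python
def pixels_for_print(width_in, height_in, dpi=300):
    """Pixel dimensions and count needed to print at a given density.

    300 dpi is a widely used 'photo quality' benchmark; far below the
    14-megapixel counts reached during the pixel war.
    """
    w = width_in * dpi
    h = height_in * dpi
    return w, h, w * h

w, h, total = pixels_for_print(4, 6)
print(w, h, total / 1e6)  # 1200 1800 2.16 -> about 2.2 megapixels
```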

  1. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000-acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high-resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 × 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).
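The 10 cm pixel size quoted above follows from the pinhole-camera ground-sample-distance relation; the focal length and pixel pitch below are hypothetical values chosen only to make the numbers come out near the flight configuration described:

```python
def ground_sample_distance(altitude_m, pixel_pitch_m, focal_length_m):
    """Pinhole-camera ground sample distance.

    The ground footprint of one pixel is altitude * pitch / focal length
    (flat terrain, nadir view).
    """
    return altitude_m * pixel_pitch_m / focal_length_m

# Hypothetical values roughly consistent with the flight described above:
# ~305 m (1,000 ft) altitude, 9 um pixels, 28 mm lens.
gsd = ground_sample_distance(305.0, 9e-6, 28e-3)
print(round(gsd, 3))  # ~0.098 m, i.e. about 10 cm per pixel
```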

  2. Real-time two-dimensional imaging of potassium ion distribution using an ion semiconductor sensor with charged coupled device technology.

    PubMed

    Hattori, Toshiaki; Masaki, Yoshitomo; Atsumi, Kazuya; Kato, Ryo; Sawada, Kazuaki

    2010-01-01

    Two-dimensional real-time observation of potassium ion distributions was achieved using an ion imaging device based on charge-coupled device (CCD) and metal-oxide semiconductor technologies, and an ion-selective membrane. The CCD potassium ion image sensor was equipped with an array of 32 × 32 pixels (1024 pixels). It could record five frames per second over an area of 4.16 × 4.16 mm^2. Potassium ion images were produced instantly. The leaching of potassium ion from a 3.3 M KCl Ag/AgCl reference electrode was dynamically monitored in aqueous solution. The potassium-ion-selective membrane on the semiconductor consisted of plasticized poly(vinyl chloride) (PVC) with bis(benzo-15-crown-5). The addition of a polyhedral oligomeric silsesquioxane to the plasticized PVC membrane greatly improved adhesion of the membrane onto the Si3N4 of the semiconductor surface, and the potential response was stabilized. The potential response was linear over the logarithmic potassium ion concentration range from 10^-2 to 10^-5 M. The selectivity coefficients were K(K+,Li+)pot = 10^-2.85, K(K+,Na+)pot = 10^-2.30, K(K+,Rb+)pot = 10^-1.16, and K(K+,Cs+)pot = 10^-2.05.
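The quoted selectivity coefficients can be plugged into the standard Nikolsky-Eisenman expression to estimate how an interfering ion perturbs the measured potential (a textbook sketch, not taken from the paper; the 0.1 M Na+ background and the Nernstian 59.2 mV/decade slope are assumptions):

```python
import math

def effective_activity(a_k, interferents):
    """Nikolsky-Eisenman effective activity for a K+-selective electrode.

    For monovalent interferents j: a_eff = a_K + sum(K_pot_j * a_j).
    The selectivity coefficient used below is the measured value quoted
    in the abstract.
    """
    return a_k + sum(k_pot * a for k_pot, a in interferents)

K_NA = 10 ** -2.30   # K(K+,Na+)pot from the abstract

# 1e-4 M K+ in an assumed background of 0.1 M Na+:
a_eff = effective_activity(1e-4, [(K_NA, 0.1)])

# Nernstian slope ~59.2 mV/decade at 25 C: apparent potential error
# caused by the Na+ interference at this low K+ level.
error_mv = 59.2 * math.log10(a_eff / 1e-4)
print(round(error_mv, 1))
```

The point of the exercise: even a selectivity coefficient of 10^-2.3 produces a substantial apparent error when the interferent is a thousandfold more concentrated than the analyte.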

  3. High-Speed Monitoring of Multiple Grid-Connected Photovoltaic Array Configurations and Supplementary Weather Station.

    PubMed

    Boyd, Matthew T

    2017-06-01

    Three grid-connected monocrystalline silicon photovoltaic arrays have been instrumented with research-grade sensors on the Gaithersburg, MD campus of the National Institute of Standards and Technology (NIST). These arrays range from 73 kW to 271 kW and have different tilts, orientations, and configurations. Irradiance, temperature, wind, and electrical measurements at the arrays are recorded, and images are taken of the arrays to monitor shading and capture any anomalies. A weather station has also been constructed that includes research-grade instrumentation to measure all standard meteorological quantities plus additional solar irradiance spectral bands, full spectrum curves, and directional components using multiple irradiance sensor technologies. Reference photovoltaic (PV) modules are also monitored to provide comprehensive baseline measurements for the PV arrays. Images of the whole sky are captured, along with images of the instrumentation and reference modules to document any obstructions or anomalies. Nearly all measurements at the arrays and weather station are sampled and saved every 1 s, with monitoring having started on Aug. 1, 2014. This report describes the instrumentation approach used to monitor the performance of these photovoltaic systems, measure the meteorological quantities, and acquire the images for use in PV performance and weather monitoring and computer model validation.

  4. High-Speed Monitoring of Multiple Grid-Connected Photovoltaic Array Configurations and Supplementary Weather Station

    PubMed Central

    Boyd, Matthew T.

    2017-01-01

    Three grid-connected monocrystalline silicon photovoltaic arrays have been instrumented with research-grade sensors on the Gaithersburg, MD campus of the National Institute of Standards and Technology (NIST). These arrays range from 73 kW to 271 kW and have different tilts, orientations, and configurations. Irradiance, temperature, wind, and electrical measurements at the arrays are recorded, and images are taken of the arrays to monitor shading and capture any anomalies. A weather station has also been constructed that includes research-grade instrumentation to measure all standard meteorological quantities plus additional solar irradiance spectral bands, full spectrum curves, and directional components using multiple irradiance sensor technologies. Reference photovoltaic (PV) modules are also monitored to provide comprehensive baseline measurements for the PV arrays. Images of the whole sky are captured, along with images of the instrumentation and reference modules to document any obstructions or anomalies. Nearly all measurements at the arrays and weather station are sampled and saved every 1 s, with monitoring having started on Aug. 1, 2014. This report describes the instrumentation approach used to monitor the performance of these photovoltaic systems, measure the meteorological quantities, and acquire the images for use in PV performance and weather monitoring and computer model validation. PMID:28670044

  5. The Thermal Infrared Sensor on the Landsat Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Reuter, Dennis; Richardson, Cathy; Irons, James; Allen, Rick; Anderson, Martha; Budinoff, Jason; Casto, Gordon; Coltharp, Craig; Finneran, Paul; Forsbacka, Betsy; hide

    2010-01-01

    The Landsat Data Continuity Mission (LDCM), a joint NASA and USGS mission, is scheduled for launch in December, 2012. The LDCM instrument payload will consist of the Operational Land Imager (OLI), provided by Ball Aerospace and Technology Corporation (BATC} under contract to NASA and the Thermal Infrared Sensor (TIRS), provided by NASA's Goddard Space Flight Center (GSFC). This paper outlines the design of the TIRS instrument and gives an example of its application to monitoring water consumption by measuring evapotranspiration.

  6. Military microwaves '84; Proceedings of the Conference, London, England, October 24-26, 1984

    NASA Astrophysics Data System (ADS)

    The present conference on microwave-frequency electronic warfare and military sensor equipment developments considers radar warning receivers, optical-frequency spread-spectrum systems, troposcatter effects on mobile digital communications, wideband bulk encryption, long-range air defense radars (such as the AR320, W-2000, and Martello), multistatic radars, and multimode airborne and interceptor radars. IR system and subsystem component topics encompass thermal imaging and active IR countermeasures, class 1 modules, and diamond coatings, while additional radar-related topics include radar clutter in airborne maritime reconnaissance systems, microstrip antennas with dual polarization capability, the synthesis of shaped-beam antenna patterns, planar phased arrays, radar signal processing, radar cross-section measurement techniques, and radar imaging and pattern analysis. Attention is also given to optical control and signal processing, mm-wave control technology and EW systems, W-band operations, planar mm-wave arrays, mm-wave monolithic solid-state components, mm-wave sensor technology, GaAs monolithic ICs, and dielectric resonator and wideband tunable oscillators.

  7. Development and applications of 3-dimensional integration nanotechnologies.

    PubMed

    Kim, Areum; Choi, Eunmi; Son, Hyungbin; Pyo, Sung Gyu

    2014-02-01

    Unlike conventional two-dimensional (2D) planar structures, in three-dimensional (3D) integration technology signal or power is supplied through through-silicon vias (TSVs), which replace wires for connecting chips/wafers. TSVs have become an essential technology, as they help sustain Moore's law. 3D integration enables system and sensor functions at the nanoscale through the implementation of highly integrated nano-semiconductors, as well as the fabrication of a single chip with multiple functions. Thus, this technology is considered a new avenue of development for the systemization of the nano-bio area. In this review paper, the basic technology required for such 3D integration is described, and methods to measure bonding strength and to detect voids that occur during bonding are introduced. Currently, CMOS image sensors and memory chips associated with nanotechnology are being realized on the basis of 3D integration technology. In this paper, we describe the applications of high-performance nano-biosensor technology currently under development and the direction of development of a high-performance lab-on-a-chip (LOC).

  8. An efficient and secure partial image encryption for wireless multimedia sensor networks using discrete wavelet transform, chaotic maps and substitution box

    NASA Astrophysics Data System (ADS)

    Khan, Muazzam A.; Ahmad, Jawad; Javaid, Qaisar; Saqib, Nazar A.

    2017-03-01

    Wireless Sensor Networks (WSNs) are widely deployed for monitoring physical activity and/or environmental conditions. Data gathered from a WSN is transmitted via the network to a central location for further processing. Numerous applications of WSNs can be found in smart homes, intelligent buildings, health care, energy-efficient smart grids, and industrial control systems. In recent years, computer scientists have focused on finding more applications of WSNs in multimedia technologies, i.e. audio, video, and digital images. Due to the bulky nature of multimedia data, WSNs process large volumes of multimedia data, which significantly increases computational complexity and hence reduces battery time. Given battery-life constraints, image compression combined with secure transmission over a wide-ranged sensor network is an emerging and challenging task in Wireless Multimedia Sensor Networks. Due to the open nature of the Internet, transmitted data must be secured through a process known as encryption. As a result, there has been intensive demand for decades for schemes that are both energy efficient and highly secure. In this paper, a discrete-wavelet-based partial image encryption scheme using a hashing algorithm, chaotic maps, and Hussain's S-box is reported. The plaintext image is compressed via the discrete wavelet transform, and the image is then shuffled column-wise and row-wise via a Piece-wise Linear Chaotic Map (PWLCM) and a Nonlinear Chaotic Algorithm, respectively. For higher security, the initial conditions of the PWLCM are made dependent on a hash function. The permuted image is bitwise XORed with a random matrix generated from an Intertwining Logistic map. To enhance security further, the final ciphertext is obtained by substituting all elements with Hussain's substitution box. Experimental and statistical results confirm the strength of the anticipated scheme.
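A minimal sketch of the PWLCM-driven permutation and XOR-keystream steps described above (illustrative only: the seeds and parameters are arbitrary, and the paper's hash-dependent initial conditions, intertwining logistic map, and S-box substitution are omitted):

```python
def pwlcm(x, p):
    """Piece-wise linear chaotic map on (0,1), control parameter p in (0, 0.5)."""
    if x < p:
        return x / p
    if x <= 0.5:
        return (x - p) / (0.5 - p)
    return pwlcm(1.0 - x, p)          # symmetric upper half

def chaotic_permutation(n, x0=0.37, p=0.29):
    """Order indices 0..n-1 by successive PWLCM iterates, giving a
    permutation usable for row/column shuffling."""
    x, seq = x0, []
    for _ in range(n):
        x = pwlcm(x, p)
        seq.append(x)
    return sorted(range(n), key=lambda i: seq[i])

def xor_keystream(data, x0=0.61, p=0.23):
    """XOR bytes with a PWLCM-derived keystream (confusion step sketch)."""
    x, out = x0, bytearray()
    for b in data:
        x = pwlcm(x, p)
        out.append(b ^ (int(x * 256) & 0xFF))
    return bytes(out)

perm = chaotic_permutation(8)          # e.g. a row-shuffle order
ct = xor_keystream(b"pixels!!")        # "encrypt"
pt = xor_keystream(ct)                 # XOR with the same keystream inverts it
print(perm, pt)
```

Because both steps are keyed only by the map's initial condition and parameter, a receiver holding the same values regenerates the permutation and keystream exactly, which is what makes chaotic maps attractive for lightweight sensor-node encryption.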

  9. Automated information-analytical system for thunderstorm monitoring and early warning alarms using modern physical sensors and information technologies with elements of artificial intelligence

    NASA Astrophysics Data System (ADS)

    Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.

    2017-05-01

    Methods of artificial intelligence are a good solution for forecasting weather phenomena, as they make it possible to process large amounts of diverse data. In this paper, Recirculation Neural Networks are implemented in a system for predicting thunderstorm events. Large amounts of experimental data from lightning sensors and electric field mill networks are received and analyzed, and the average recognition accuracy for sensor signals is calculated. It is shown that Recirculation Neural Networks are a promising solution for forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals with high efficiency and allow images to be compressed and their characteristic features highlighted for subsequent recognition.

  10. The Space Technology-7 Disturbance Reduction Systems

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Hsu, Oscar C.; Hanson, John; Hruby, Vlad

    2004-01-01

    The Space Technology 7 Disturbance Reduction System (DRS) is an in-space technology demonstration designed to validate technologies that are required for future missions such as the Laser Interferometer Space Antenna (LISA) and the Micro-Arcsecond X-ray Imaging Mission (MAXIM). The primary sensors that will be used by DRS are two Gravitational Reference Sensors (GRSs) being developed by Stanford University. DRS will control the spacecraft so that it flies about one of the freely floating Gravitational Reference Sensor test masses, keeping it centered within its housing. The other GRS serves as a cross-reference for the first, as well as a reference for the spacecraft's attitude control. Colloidal MicroNewton Thrusters being developed by the Busek Co. will be used to control the spacecraft's position and attitude using a six-degree-of-freedom Dynamic Control System being developed by Goddard Space Flight Center. A laser interferometer being built by the Jet Propulsion Laboratory will be used to help validate the results of the experiment. The DRS will be launched in 2008 on the European Space Agency (ESA) LISA Pathfinder spacecraft along with a similar ESA experiment, the LISA Test Package.

  11. Characterization of pixel sensor designed in 180 nm SOI CMOS technology

    NASA Astrophysics Data System (ADS)

    Benka, T.; Havranek, M.; Hejtmanek, M.; Jakovenko, J.; Janoska, Z.; Marcisovska, M.; Marcisovsky, M.; Neue, G.; Tomasek, L.; Vrba, V.

    2018-01-01

    A new type of X-ray imaging Monolithic Active Pixel Sensor (MAPS), X-CHIP-02, was developed in a 180 nm deep-submicron Silicon On Insulator (SOI) CMOS commercial technology. Two pixel matrices, with pixel pitches of 50 μm and 100 μm, were integrated into the prototype chip. The X-CHIP-02 contains several test structures, which are useful for characterization of individual blocks. The sensitive part of the pixel, integrated in the handle wafer, is one of the key structures designed for testing. The purpose of this structure is to determine the capacitance of the sensitive part (the diode in the MAPS pixel). The measured capacitance is 2.9 fF for the 50 μm pixel pitch and 4.8 fF for the 100 μm pixel pitch at -100 V (the default operational voltage). This structure was also used to measure the IV characteristics of the sensitive diode. In this work, we report on a circuit designed for precise determination of sensor capacitance and the IV characteristics of both pixel types with respect to X-ray irradiation. The motivation for measuring the sensor capacitance is its importance for the design of front-end amplifier circuits. The design of the pixel elements, as well as circuit simulation and laboratory measurement techniques, are described. The experimental results are of great importance for further development of MAPS sensors in this technology.
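The reported capacitances translate directly into the pixel's charge-to-voltage conversion gain (q/C), which is one reason they matter for front-end amplifier design; a quick check:

```python
Q_E = 1.602e-19  # electron charge, coulombs

def conversion_gain_uV_per_e(c_farads):
    """Charge-to-voltage conversion gain of a pixel sense node, q/C,
    in microvolts per electron: a smaller sense-node capacitance gives
    a larger signal per collected electron."""
    return Q_E / c_farads * 1e6

print(round(conversion_gain_uV_per_e(2.9e-15), 1))  # 50 um pitch: ~55.2 uV/e-
print(round(conversion_gain_uV_per_e(4.8e-15), 1))  # 100 um pitch: ~33.4 uV/e-
```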

  12. Accurate positioning based on acoustic and optical sensors

    NASA Astrophysics Data System (ADS)

    Cai, Kerong; Deng, Jiahao; Guo, Hualing

    2009-11-01

    An unattended laser target designator (ULTD) was designed to partly take the place of conventional LTDs for accurate positioning and laser marking. After analyzing the precision, accuracy, and error sources of the acoustic sensor array, the requirements of the laser generator, and the image analysis and tracking technology, the major system modules were determined. The target's classification, velocity, and position can be measured by the sensors, and a coded laser beam is then emitted intelligently to mark the optimal position at the optimal time. The conclusion shows that ULTDs can not only avoid security threats, be deployed massively, and accomplish battle damage assessment (BDA), but are also fit for information-based warfare.

  13. FPGA-based real time processing of the Plenoptic Wavefront Sensor

    NASA Astrophysics Data System (ADS)

    Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.

    The plenoptic wavefront sensor combines measurements at the pupil and image planes in order to obtain wavefront information from different points of view simultaneously, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al.). This paper concentrates on the processing required for pupil-plane phase recovery and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows the system to be tailored to the requirements, maintaining flexibility, speed, and cost figures.

  14. Microscopic resolution broadband dielectric spectroscopy

    NASA Astrophysics Data System (ADS)

    Mukherjee, S.; Watson, P.; Prance, R. J.

    2011-08-01

    Results are presented for a non-contact measurement system capable of micron level spatial resolution. It utilises the novel electric potential sensor (EPS) technology, invented at Sussex, to image the electric field above a simple composite dielectric material. EP sensors may be regarded as analogous to a magnetometer and require no adjustments or offsets during either setup or use. The sample consists of a standard glass/epoxy FR4 circuit board, with linear defects machined into the surface by a PCB milling machine. The sample is excited with an a.c. signal over a range of frequencies from 10 kHz to 10 MHz, from the reverse side, by placing it on a conducting sheet connected to the source. The single sensor is raster scanned over the surface at a constant working distance, consistent with the spatial resolution, in order to build up an image of the electric field, with respect to the reference potential. The results demonstrate that both the surface defects and the internal dielectric variations within the composite may be imaged in this way, with good contrast being observed between the glass mat and the epoxy resin.

  15. Image fusion based on millimeter-wave for concealed weapon detection

    NASA Astrophysics Data System (ADS)

    Zhu, Weiwen; Zhao, Yuejin; Deng, Chao; Zhang, Cunlin; Zhang, Yalin; Zhang, Jingshui

    2010-11-01

    This paper describes a novel multi-sensor image fusion technology for concealed weapon detection (CWD). Because clothing is highly transparent at millimeter-wave bands, a millimeter-wave radiometer can image and distinguish concealed contraband beneath clothes, such as guns, knives, and detonators. We therefore adopt passive millimeter wave (PMMW) imaging technology for airport security. However, owing to the wavelength of millimeter waves and the single-channel mechanical scanning, the millimeter-wave image has low optical resolution, which cannot meet the needs of practical application. Therefore, a visible image (VI), which has higher resolution, is proposed for fusion with the millimeter-wave image to enhance readability. Before the fusion, a novel image pre-processing step specific to the fusion of millimeter-wave and visible images is adopted. In the fusion process, multi-resolution analysis (MRA) based on the Wavelet Transform (WT) is used. The experimental results show that this method has advantages for concealed weapon detection and practical significance.
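The fusion rule in wavelet-based MRA can be sketched with a single-level Haar transform (a simplified stand-in for the paper's WT pipeline; the average-LL/max-detail rule below is one common choice, not necessarily the authors'):

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar transform: returns (LL, LH, HL, HH) subbands."""
    a = (img[0::2, 0::2] + img[0::2, 1::2] + img[1::2, 0::2] + img[1::2, 1::2]) / 4
    h = (img[0::2, 0::2] - img[0::2, 1::2] + img[1::2, 0::2] - img[1::2, 1::2]) / 4
    v = (img[0::2, 0::2] + img[0::2, 1::2] - img[1::2, 0::2] - img[1::2, 1::2]) / 4
    d = (img[0::2, 0::2] - img[0::2, 1::2] - img[1::2, 0::2] + img[1::2, 1::2]) / 4
    return a, h, v, d

def ihaar2d(a, h, v, d):
    """Exact inverse of haar2d."""
    out = np.empty((a.shape[0] * 2, a.shape[1] * 2))
    out[0::2, 0::2] = a + h + v + d
    out[0::2, 1::2] = a - h + v - d
    out[1::2, 0::2] = a + h - v - d
    out[1::2, 1::2] = a - h - v + d
    return out

def fuse(img1, img2):
    """Wavelet-domain fusion sketch: average the coarse (LL) bands and
    keep the larger-magnitude detail coefficient from either image, so
    edges from both modalities survive in the fused result."""
    a1, h1, v1, d1 = haar2d(img1)
    a2, h2, v2, d2 = haar2d(img2)
    pick = lambda c1, c2: np.where(np.abs(c1) >= np.abs(c2), c1, c2)
    return ihaar2d((a1 + a2) / 2, pick(h1, h2), pick(v1, v2), pick(d1, d2))

x = np.arange(16.0).reshape(4, 4)
print(np.allclose(ihaar2d(*haar2d(x)), x))  # True: the transform is invertible
```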

  16. Integrated Georeferencing of Stereo Image Sequences Captured with a Stereovision Mobile Mapping System - Approaches and Practical Results

    NASA Astrophysics Data System (ADS)

    Eugster, H.; Huber, F.; Nebiker, S.; Gisi, A.

    2012-07-01

    Stereovision-based mobile mapping systems enable the efficient capture of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment, which builds the foundation for a wide range of 3d mapping applications and image-based geo web services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks, or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations - in our case the imaging sensors - normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow the INS/GNSS-based trajectory to be validated and updated with independently estimated positions during prolonged GNSS signal outages, in order to increase the georeferencing accuracy up to the project requirements.

  17. SPIDER: Next Generation Chip Scale Imaging Sensor

    NASA Astrophysics Data System (ADS)

    Duncan, Alan; Kendrick, Rick; Thurman, Sam; Wuchenich, Danielle; Scott, Ryan P.; Yoo, S. J. B.; Su, Tiehui; Yu, Runxiang; Ogden, Chad; Proietti, Roberto

    The LM Advanced Technology Center and UC Davis are developing an Electro-Optical (EO) imaging sensor called SPIDER (Segmented Planar Imaging Detector for Electro-optical Reconnaissance) that provides a 10x to 100x size, weight, and power (SWaP) reduction alternative to the traditional bulky optical telescope and focal plane detector array. The substantial reductions in SWaP would reduce cost and/or provide higher resolution by enabling a larger aperture imager in a constrained volume. The SPIDER concept consists of thousands of direct detection white-light interferometers densely packed onto Photonic Integrated Circuits (PICs) to measure the amplitude and phase of the visibility function at spatial frequencies that span the full synthetic aperture. In other words, SPIDER would sample the object being imaged in the Fourier domain (i.e., spatial frequency domain), and then digitally reconstruct an image. The conventional approach for imaging interferometers requires complex mechanical delay lines to form the interference fringes. This results in designs that are not traceable to more than a few simultaneous spatial frequency measurements. SPIDER seeks to achieve this traceability by employing micron-scale optical waveguides and nanophotonic structures fabricated on a PIC with micron-scale packing density to form the necessary interferometers. Prior LM IRAD and DARPA/NASA CRAD-funded SPIDER risk reduction experiments, design trades, and simulations have matured the SPIDER imager concept to a TRL 3 level. Current funding under the DARPA SPIDER Zoom program is maturing the underlying PIC technology for SPIDER to the TRL 4 level. This is done by developing and fabricating a second-generation PIC that is fully traceable to the multiple layers and low-power phase modulators required for higher-dimension waveguide arrays that are needed for higher field-of-view sensors.
Our project also seeks to extend the SPIDER concept to add a zoom capability that would provide simultaneous low-resolution, large field-of-view and steerable high-resolution, narrow field-of-view imaging modes. A proof of concept demo is being designed to validate this capability. Finally, data collected by this project would be used to benchmark and increase the fidelity of our SPIDER image simulations and enhance our ability to predict the performance of existing and future SPIDER sensor design variations. These designs and their associated performance characteristics could then be evaluated as candidates for future mission opportunities to identify specific transition paths. This paper provides an overview of performance data on the first-generation PIC for SPIDER developed under DARPA SeeMe program funding. We provide a design description of the SPIDER Zoom imaging sensor and the second-generation PIC (high- and low-resolution versions) currently under development on the DARPA SPIDER Zoom effort. Results of performance simulations and design trades are presented. Unique low-cost payload applications for future SSA missions are also discussed.
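The Fourier-domain sampling idea can be sketched numerically: each SPIDER baseline measures one complex visibility, i.e. one sample of the scene's 2-D Fourier transform, and an image is reconstructed digitally from whatever samples were taken (a toy sketch; simply zeroing unsampled frequencies yields only a "dirty image", and a real system would use more sophisticated reconstruction):

```python
import numpy as np

def reconstruct_from_visibilities(scene, sampled_mask):
    """Interferometric imaging sketch.

    The full visibility function is the scene's 2-D FFT; each baseline
    contributes one sample of it. Unsampled spatial frequencies are
    zeroed, and the inverse FFT gives the reconstructed (dirty) image.
    """
    vis = np.fft.fft2(scene)          # full visibility function
    measured = vis * sampled_mask     # keep only the sampled baselines
    return np.real(np.fft.ifft2(measured))

rng = np.random.default_rng(0)
scene = rng.random((32, 32))
mask = np.ones((32, 32))              # dense sampling of all frequencies
img = reconstruct_from_visibilities(scene, mask)
print(np.allclose(img, scene))        # True: full sampling recovers the scene
```

With a sparser mask the reconstruction degrades gracefully, which is why the density of interferometer baselines on the PIC drives achievable image quality.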

  18. New SOFRADIR 10μm pixel pitch infrared products

    NASA Astrophysics Data System (ADS)

    Lefoul, X.; Pere-Laperne, N.; Augey, T.; Rubaldo, L.; Aufranc, Sébastien; Decaens, G.; Ricard, N.; Mazaleyrat, E.; Billon-Lanfrey, D.; Gravrand, Olivier; Bisotto, Sylvette

    2014-10-01

    Recent advances in the miniaturization of IR imaging technology have led to a growing market for mini thermal-imaging sensors. In that respect, Sofradir's development of smaller pixel pitches has made much more compact products available to users. When this competitive advantage is combined with smaller coolers, made possible by HOT (high operating temperature) technology, we achieved valuable reductions in the size, weight and power of the overall package. At the same time, we are moving towards a global offer based on digital interfaces that simplifies the IR system design process for our customers while freeing up more space. This paper discusses recent developments in HOT and small-pixel-pitch technologies, as well as efforts made on the compact packaging solutions developed by SOFRADIR in collaboration with CEA-LETI.

  19. Earth Surface Monitoring with COSI-Corr, Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Leprince, S.; Ayoub, F.; Avouac, J.

    2009-12-01

    Co-registration of Optically Sensed Images and Correlation (COSI-Corr) is a software package developed at the California Institute of Technology (USA) for accurate geometrical processing of optical satellite and aerial imagery. Initially developed for the measurement of co-seismic ground deformation using optical imagery, COSI-Corr is now used for a wide range of applications in Earth Sciences, which take advantage of the software's capability to co-register, with very high accuracy, images taken from different sensors and acquired at different times. As long as a sensor is supported in COSI-Corr, all images between the supported sensors can be accurately orthorectified and co-registered. For example, it is possible to co-register a series of SPOT images, a series of aerial photographs, or to register a series of aerial photographs with a series of SPOT images. Currently supported sensors include the SPOT 1-5, Quickbird, Worldview 1 and Formosat 2 satellites, the ASTER instrument, and frame camera acquisitions from, e.g., aerial surveys or declassified satellite imagery. Potential applications include accurate change detection between multi-temporal and multi-spectral images, and the calibration of pushbroom cameras. In particular, COSI-Corr provides a powerful correlation tool, which allows for accurate estimation of surface displacement. The accuracy depends on many factors (e.g., cloud, snow, and vegetation cover, shadows, temporal changes in general, steadiness of the imaging platform, defects of the imaging system), but in practice the standard deviation of the measurements obtained from the correlation of multi-temporal images is typically around 1/20 to 1/10 of the pixel size. The software package also includes post-processing tools such as denoising, destriping, and stacking tools to facilitate data interpretation. Examples drawn from current research in, e.g., seismotectonics, glaciology, and geomorphology will be presented.
COSI-Corr is developed in IDL (Interactive Data Language), integrated under the user-friendly interface ENVI (Environment for Visualizing Images), and is distributed free of charge for academic research purposes.
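As an illustration of the correlation principle (not COSI-Corr's actual frequency-domain correlator, which refines shifts to a small fraction of a pixel), a minimal phase-correlation estimator for integer-pixel displacement can be sketched as:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer-pixel shift of image a relative to image b
    via phase correlation (normalized cross-power spectrum). A subpixel
    correlator would refine this estimate further."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image into negative offsets
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Shift a random test pattern by (3, -5) pixels and recover it.
rng = np.random.default_rng(1)
img = rng.random((128, 128))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(phase_correlation_shift(shifted, img))   # -> (3, -5)
```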

  20. Recce NG: from Recce sensor to image intelligence (IMINT)

    NASA Astrophysics Data System (ADS)

    Larroque, Serge

    2001-12-01

    Recce NG (Reconnaissance New Generation) is presented as a complete and optimized Tactical Reconnaissance System. Based on a new-generation Pod integrating high resolution Dual Band sensors, the system has been designed with the operational lessons learnt from the last Peace Keeping Operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system take advantage of the state of the art in the following key technologies: an Advanced Mission Planning System for long range stand-off Manned Recce, Aircraft and/or Pod tasking, operating sophisticated back-up software tools, high resolution 3D geo data and an improved, combat-proven MMI to reduce planning delays; mature Dual Band sensor technology to achieve the Day and Night Recce Mission, including advanced automatic operational functions such as azimuth and roll tracking capabilities; low risk in Pod integration and in carrier avionics, controls and displays upgrades, to save time in operational turnover and maintenance; a high-rate Imagery Down Link for Real Time or Near Real Time transmission, fully compatible with STANAG 7085 requirements; and an advanced, combat-proven, NATO-interoperable (STANAG 7023) IMINT Exploitation Ground Segment, integrating high-value software tools for accurate location, improved radiometric image processing and an open link to C4ISR systems. The choice of an industrial Prime contractor mastering, across the full system, all the key products and technologies listed above is mandatory for successful delivery in terms of low Cost, Risk and Time Schedule.

  1. Development and testing of the EVS 2000 enhanced vision system

    NASA Astrophysics Data System (ADS)

    Way, Scott P.; Kerr, Richard; Imamura, Joe J.; Arnoldy, Dan; Zeylmaker, Richard; Zuro, Greg

    2003-09-01

    An effective enhanced vision system must operate over a broad spectral range in order to offer a pilot an optimized scene that includes runway background as well as airport lighting and aircraft operations. The large dynamic range of intensities of these images is best handled with separate imaging sensors. The EVS 2000 is a patented dual-band Infrared Enhanced Vision System (EVS) utilizing image fusion concepts to provide a single image from uncooled infrared imagers in both the LWIR and SWIR. The system is designed to provide commercial and corporate airline pilots with improved situational awareness at night and in degraded weather conditions. A prototype of this system was recently fabricated and flown on the Boeing Advanced Technology Demonstrator 737-900 aircraft. This paper will discuss the current EVS 2000 concept, show results taken from the Boeing Advanced Technology Demonstrator program, and discuss future plans for EVS systems.

  2. Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor

    NASA Astrophysics Data System (ADS)

    Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui

    2018-05-01

    At present, both point-source and imaging polarization navigation devices can only output angle information, which means that the velocity information of the carrier cannot be extracted from the polarization field pattern directly. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. However, for ordinary optical flow, differences in pixel value diminish in weak light, and the calculation accuracy is reduced accordingly. Polarization imaging technology has the ability to improve both the detection accuracy and the recognition probability of the target because it can acquire extra multi-dimensional polarization information about the target's radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarized optical flow algorithm is proposed, and it is verified that the polarized optical flow algorithm adapts well to weak light and can extend the application range of polarization navigation sensors. This research lays the foundation for day-and-night, all-weather polarization navigation applications in the future.
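A minimal, hypothetical sketch of the optical flow building block: a Lucas-Kanade style least-squares estimate of a single global motion vector from image gradients. In the polarized variant proposed in the paper, the input frames would be polarization images (e.g. angle of polarization or DoLP) rather than raw intensity; the solver itself is unchanged.

```python
import numpy as np

def lk_global_flow(frame0, frame1):
    """Least-squares (Lucas-Kanade style) estimate of one global motion
    vector (u, v) in pixels/frame, from the brightness-constancy
    equation Ix*u + Iy*v + It = 0 stacked over all pixels."""
    Ix = np.gradient(frame0, axis=1)        # spatial gradient, x
    Iy = np.gradient(frame0, axis=0)        # spatial gradient, y
    It = frame1 - frame0                    # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Smooth synthetic pattern translated by a small subpixel amount.
y, x = np.mgrid[0:64, 0:64]
f0 = np.sin(0.2 * x) + np.cos(0.15 * y)
f1 = np.sin(0.2 * (x - 0.4)) + np.cos(0.15 * (y - 0.25))  # moved (+0.4, +0.25)
u, v = lk_global_flow(f0, f1)
print(u, v)    # close to 0.4 and 0.25
```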

  3. Ad Hoc Network Architecture for Multi-Media Networks

    DTIC Science & Technology

    2007-12-01

    sensor network . Video traffic is modeled and simulations are performed via the use of the Sun Small Programmable Object Technology (Sun SPOT) Java...characteristics of video traffic must be studied and understood. This thesis focuses on evaluating the possibility of routing video images over a wireless

  4. Three-dimensional cascaded system analysis of a 50 µm pixel pitch wafer-scale CMOS active pixel sensor x-ray detector for digital breast tomosynthesis.

    PubMed

    Zhao, C; Vassiljev, N; Konstantinidis, A C; Speller, R D; Kanicki, J

    2017-03-07

    High-resolution, low-noise x-ray detectors based on the complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology have been developed and proposed for digital breast tomosynthesis (DBT). In this study, we evaluated the three-dimensional (3D) imaging performance of a 50 µm pixel pitch CMOS APS x-ray detector named DynAMITe (Dynamic Range Adjustable for Medical Imaging Technology). The two-dimensional (2D) angle-dependent modulation transfer function (MTF), normalized noise power spectrum (NNPS), and detective quantum efficiency (DQE) were experimentally characterized and modeled using the cascaded system analysis at oblique incident angles up to 30°. The cascaded system model was extended to the 3D spatial frequency space in combination with the filtered back-projection (FBP) reconstruction method to calculate the 3D and in-plane MTF, NNPS and DQE parameters. The results demonstrate that the beam obliquity blurs the 2D MTF and DQE in the high spatial frequency range. However, this effect can be eliminated after FBP image reconstruction. In addition, impacts of the image acquisition geometry and detector parameters were evaluated using the 3D cascaded system analysis for DBT. The result shows that a wider projection angle range (e.g. ±30°) improves the low spatial frequency (below 5 mm-1) performance of the CMOS APS detector. In addition, to maintain a high spatial resolution for DBT, a focal spot size of smaller than 0.3 mm should be used. Theoretical analysis suggests that a pixelated scintillator in combination with the 50 µm pixel pitch CMOS APS detector could further improve the 3D image resolution. Finally, the 3D imaging performance of the CMOS APS and an indirect amorphous silicon (a-Si:H) thin-film transistor (TFT) passive pixel sensor (PPS) detector was simulated and compared.
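The metrics above combine according to the standard frequency-domain relation DQE(f) = MTF²(f) / (q·NNPS(f)), where q is the incident photon fluence and NNPS is the noise power spectrum normalized by the squared mean signal. A small sketch with illustrative (non-DynAMITe) numbers:

```python
import numpy as np

def dqe(mtf, nnps, fluence):
    """Detective quantum efficiency from measured MTF and normalized
    noise power spectrum: DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    with q the incident photon fluence (photons per mm^2)."""
    return mtf**2 / (fluence * nnps)

# Illustrative numbers (not DynAMITe data): a Gaussian-like MTF and a
# flat (white) NNPS at the quantum limit of an ideal detector.
f = np.linspace(0, 10, 6)               # spatial frequency, mm^-1
mtf = np.exp(-(f / 7.0)**2)
q = 2.5e5                               # photons per mm^2
nnps = np.full_like(f, 1.0 / q)         # white noise at the quantum limit
print(dqe(mtf, nnps, q))                # reduces to MTF^2 in this ideal case
```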

  5. Three-dimensional cascaded system analysis of a 50 µm pixel pitch wafer-scale CMOS active pixel sensor x-ray detector for digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Zhao, C.; Vassiljev, N.; Konstantinidis, A. C.; Speller, R. D.; Kanicki, J.

    2017-03-01

    High-resolution, low-noise x-ray detectors based on the complementary metal-oxide-semiconductor (CMOS) active pixel sensor (APS) technology have been developed and proposed for digital breast tomosynthesis (DBT). In this study, we evaluated the three-dimensional (3D) imaging performance of a 50 µm pixel pitch CMOS APS x-ray detector named DynAMITe (Dynamic Range Adjustable for Medical Imaging Technology). The two-dimensional (2D) angle-dependent modulation transfer function (MTF), normalized noise power spectrum (NNPS), and detective quantum efficiency (DQE) were experimentally characterized and modeled using the cascaded system analysis at oblique incident angles up to 30°. The cascaded system model was extended to the 3D spatial frequency space in combination with the filtered back-projection (FBP) reconstruction method to calculate the 3D and in-plane MTF, NNPS and DQE parameters. The results demonstrate that the beam obliquity blurs the 2D MTF and DQE in the high spatial frequency range. However, this effect can be eliminated after FBP image reconstruction. In addition, impacts of the image acquisition geometry and detector parameters were evaluated using the 3D cascaded system analysis for DBT. The result shows that a wider projection angle range (e.g. ±30°) improves the low spatial frequency (below 5 mm-1) performance of the CMOS APS detector. In addition, to maintain a high spatial resolution for DBT, a focal spot size of smaller than 0.3 mm should be used. Theoretical analysis suggests that a pixelated scintillator in combination with the 50 µm pixel pitch CMOS APS detector could further improve the 3D image resolution. Finally, the 3D imaging performance of the CMOS APS and an indirect amorphous silicon (a-Si:H) thin-film transistor (TFT) passive pixel sensor (PPS) detector was simulated and compared.

  6. Superconducting Detector Arrays for Astrophysics

    NASA Technical Reports Server (NTRS)

    Chervenak, James

    2008-01-01

    The next generation of astrophysics instruments will feature an order of magnitude more photon sensors, or sensors that have an order of magnitude greater sensitivity. Since detector noise scales with temperature, a number of candidate technologies have been developed that use the intrinsic advantages of detector systems that operate below 1 Kelvin. Many of these systems employ the superconducting phenomena that occur in metals at these temperatures to build ultrasensitive detectors and low-noise, low-power readout architectures. I will present one such system in use today to meet the needs of the astrophysics community at millimeter and x-ray wavelengths. Our group at NASA, in collaboration with Princeton, NIST Boulder and a number of other groups, is building large format arrays of superconducting transition edge sensors (TES) read out with multiplexed superconducting quantum interference devices (SQUIDs). I will present the high sensitivity we have achieved in multiplexed x-ray sensors with the TES technology and describe the construction of a 1000-sensor TES/SQUID array for microwave measurements. With our collaboration's deployment of a kilopixel TES array for 2 mm radiation at the Atacama Cosmology Telescope in November 2007, we have obtained first images of the lensed Cosmic Microwave Background at fine angular scales.

  7. Automatic segmentation and centroid detection of skin sensors for lung interventions

    NASA Astrophysics Data System (ADS)

    Lu, Kongkuo; Xu, Sheng; Xue, Zhong; Wong, Stephen T.

    2012-02-01

    Electromagnetic (EM) tracking has been recognized as a valuable tool for locating interventional devices in procedures such as lung and liver biopsy or ablation. The advantage of this technology is its real-time connection to the 3D volumetric roadmap, i.e. the CT, of a patient's anatomy while the intervention is performed. EM-based guidance requires tracking the tip of the interventional device, transforming the location of the device onto pre-operative CT images, and superimposing the device in the 3D images to assist physicians in completing the procedure more effectively. A key requirement of this data integration is to automatically find the mapping between the EM and CT coordinate systems. Thus, skin fiducial sensors are attached to patients before acquiring the pre-operative CTs. Those sensors can then be recognized in both CT and EM coordinate systems and used to calculate the transformation matrix. In this paper, to enable the EM-based navigation workflow and reduce procedural preparation time, an automatic fiducial detection method is proposed to obtain the centroids of the sensors from the pre-operative CT. The approach has been applied to 13 rabbit datasets derived from an animal study and eight human images from an observation study. The numerical results show that it is a reliable and efficient method for use in EM-guided applications.
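The centroid step can be sketched as follows; this is a simplified stand-in for the paper's method, which must also separate multiple sensors (e.g. by connected-component labeling). Here a single bright marker is assumed: threshold the CT and take the intensity-weighted centroid.

```python
import numpy as np

def marker_centroid(ct_slice, threshold):
    """Intensity-weighted centroid of all voxels above threshold.
    Assumes a single marker; a real pipeline would first label
    connected components so each skin sensor gets its own centroid."""
    w = np.where(ct_slice > threshold, ct_slice, 0.0)
    total = w.sum()
    ys, xs = np.indices(ct_slice.shape)
    return (ys * w).sum() / total, (xs * w).sum() / total

# Synthetic 2-D "CT slice": soft-tissue background plus a bright marker.
img = np.full((100, 100), 40.0)
img[30:34, 60:64] = 1200.0              # 4x4 high-density fiducial
cy, cx = marker_centroid(img, threshold=500.0)
print(cy, cx)                           # -> 31.5 61.5
```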

  8. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Astrophysics Data System (ADS)

    Watts, Louis A.

    1993-06-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the WW2 era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search and track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPAs) are now available with sufficient sensitivity for both high spatial and narrow-bandwidth spectral resolution imaging over large fields of view. Such devices, combined with emerging opto-electronic developments in integrated FPA data processing techniques, can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the mid-wave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPAs will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low-power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  9. Detection of chemical pollutants by passive LWIR hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Lavoie, Hugo; Thériault, Jean-Marc; Bouffard, François; Puckrin, Eldon; Dubé, Denis

    2012-09-01

    Toxic industrial chemicals (TICs) represent a major threat to public health and security. Their detection constitutes a real challenge to security and first responder's communities. One promising detection method is based on the passive standoff identification of chemical vapors emanating from the laboratory under surveillance. To investigate this method, the Department of National Defense and Public Safety Canada have mandated Defense Research and Development Canada (DRDC) - Valcartier to develop and test passive Long Wave Infrared (LWIR) hyperspectral imaging (HSI) sensors for standoff detection. The initial effort was focused to address the standoff detection and identification of toxic industrial chemicals (TICs) and precursors. Sensors such as the Multi-option Differential Detection and Imaging Fourier Spectrometer (MoDDIFS) and the Improved Compact ATmospheric Sounding Interferometer (iCATSI) were developed for this application. This paper describes the sensor developments and presents initial results of standoff detection and identification of TICs and precursors. The standoff sensors are based on the differential Fourier-transform infrared (FTIR) radiometric technology and are able to detect, spectrally resolve and identify small leak plumes at ranges in excess of 1 km. Results from a series of trials in asymmetric threat type scenarios will be presented. These results will serve to establish the potential of the method for standoff detection of TICs precursors and surrogates.

  10. Evolution of miniature detectors and focal plane arrays for infrared sensors

    NASA Technical Reports Server (NTRS)

    Watts, Louis A.

    1993-01-01

    Sensors that are sensitive in the infrared spectral region have been under continuous development since the WW2 era. A quest for the military advantage of 'seeing in the dark' has pushed thermal imaging technology toward high spatial and temporal resolution for night vision equipment, fire control, search and track, and seeker 'homing' guidance sensing devices. Similarly, scientific applications have pushed spectral resolution for chemical analysis, remote sensing of earth resources, and astronomical exploration applications. As a result of these developments, focal plane arrays (FPAs) are now available with sufficient sensitivity for both high spatial and narrow-bandwidth spectral resolution imaging over large fields of view. Such devices, combined with emerging opto-electronic developments in integrated FPA data processing techniques, can yield miniature sensors capable of imaging reflected sunlight in the near IR and emitted thermal energy in the mid-wave (MWIR) and longwave (LWIR) IR spectral regions. Robotic space sensors equipped with advanced versions of these FPAs will provide high resolution 'pictures' of their surroundings, perform remote analysis of solid, liquid, and gas matter, or selectively look for 'signatures' of specific objects. Evolutionary trends and projections of future low-power micro detector FPA developments for day/night operation or use in adverse viewing conditions are presented in the following text.

  11. Fusion of imaging and nonimaging data for surveillance aircraft

    NASA Astrophysics Data System (ADS)

    Shahbazian, Elisa; Gagnon, Langis; Duquet, Jean Remi; Macieszczak, Maciej; Valin, Pierre

    1997-06-01

    This paper describes a phased incremental integration approach for application of image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent research proof-of-concept activities provide techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).

  12. Putting a finishing touch on GECIs

    PubMed Central

    Rose, Tobias; Goltstein, Pieter M.; Portugues, Ruben; Griesbeck, Oliver

    2014-01-01

    More than a decade ago, genetically encoded calcium indicators (GECIs) entered the stage as promising new tools to image calcium dynamics and neuronal activity in living tissues and designated cell types in vivo. From a variety of initial designs, two have emerged as promising prototypes for further optimization: FRET (Förster Resonance Energy Transfer)-based sensors and single-fluorophore sensors of the GCaMP family. Recent efforts in structural analysis, engineering and screening have broken important performance thresholds in the latest generation of both classes. While these improvements have made GECIs a powerful means to perform physiology in living animals, a number of other aspects of sensor function deserve attention. These aspects include indicator linearity, toxicity and slow response kinetics. Furthermore, creating high performance sensors with optically more favorable emission at red or infrared wavelengths, as well as new stably or conditionally GECI-expressing animal lines, are on the wish list. When the remaining issues are solved, imaging of GECIs will finally have crossed the last milestone, evolving from an initial promise into a fully matured technology. PMID:25477779

  13. Evaluating sensor linearity of chosen infrared sensors

    NASA Astrophysics Data System (ADS)

    Walczykowski, P.; Orych, A.; Jenerowicz, A.; Karcz, P.

    2014-11-01

    The paper describes a series of experiments conducted as part of the IRAMSWater Project, the aim of which is to establish methodologies for detecting and identifying pollutants in water bodies using aerial imagery data. The main idea is based on the hypothesis that it is possible to identify certain types of physical, biological and chemical pollutants based on their spectral reflectance characteristics. The knowledge of these spectral curves is then used to determine very narrow spectral bands in which the greatest reflectance variations occur between these pollutants. A frame camera is then equipped with a band-pass filter, which allows only the selected bandwidth to be registered. In order to obtain reliable reflectance data straight from the images, the team at the Military University of Technology had developed a methodology for determining the necessary acquisition parameters for the sensor (integration time and f-stop, depending on the distance from the scene and its illumination). This methodology, however, is based on the assumption that the imaging sensors have a linear response. This paper shows the results of experiments used to evaluate this linearity.
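One simple way to quantify the linearity assumption is to fit the sensor response against exposure (here, integration time at fixed illumination) and report the coefficient of determination of the line fit. The sweep values below are hypothetical, not the paper's measurements:

```python
import numpy as np

def linearity_r2(exposure, response):
    """R^2 of a least-squares line fit of sensor response vs. exposure;
    values near 1 support the linear-response assumption behind
    reflectance retrieval directly from imagery."""
    slope, intercept = np.polyfit(exposure, response, 1)
    fit = slope * exposure + intercept
    ss_res = np.sum((response - fit)**2)
    ss_tot = np.sum((response - response.mean())**2)
    return 1.0 - ss_res / ss_tot

# Hypothetical integration-time sweep: near-linear response with noise.
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])        # integration time, ms
dn = np.array([101., 205., 398., 810., 1601.])  # mean digital numbers
print(linearity_r2(t, dn))                      # close to 1 for a linear sensor
```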

  14. Backthinned TDI CCD image sensor design and performance for the Pleiades high resolution Earth observation satellites

    NASA Astrophysics Data System (ADS)

    Materne, A.; Bardoux, A.; Geoffray, H.; Tournier, T.; Kubik, P.; Morris, D.; Wallace, I.; Renard, C.

    2017-11-01

    The PLEIADES-HR Earth observing satellites, under CNES development, combine a 0.7m resolution panchromatic channel, and a multispectral channel allowing a 2.8 m resolution, in 4 spectral bands. The 2 satellites will be placed on a sun-synchronous orbit at an altitude of 695 km. The camera operates in push broom mode, providing images across a 20 km swath. This paper focuses on the specifications, design and performance of the TDI detectors developed by e2v technologies under CNES contract for the panchromatic channel. Design drivers, derived from the mission and satellite requirements, architecture of the sensor and measurement results for key performances of the first prototypes are presented.

  15. Research and development program in fiber optic sensors and distributed sensing for high temperature harsh environment energy applications (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Romanosky, Robert R.

    2017-05-01

    The National Energy Technology Laboratory (NETL), under the Department of Energy (DOE) Fossil Energy (FE) Program, is leading the effort not only to develop near-zero-emission power generation systems, but to increase the efficiency and availability of current power systems. The overarching goal of the program is to provide clean, affordable power using domestic resources. Highly efficient, low emission power systems can have extreme conditions of high temperatures up to 1600 °C, high pressures up to 600 psi, high particulate loadings, and corrosive atmospheres that require monitoring. Sensing in these harsh environments can provide key information that directly impacts process control and system reliability. The lack of suitable measurement technology serves as a driver for innovation in harsh environment sensor development. Advancements in sensing using optical fibers are key efforts within NETL's sensor development program, as these approaches offer the potential to survive and provide critical information about these processes. An overview of the sensor development supported by NETL will be given, including research in the areas of sensor materials, designs, and measurement types. New approaches to intelligent sensing, sensor placement and process control using networked sensors will be discussed, as will novel approaches to fiber device design concurrent with materials development research and development in modified and coated silica and sapphire fiber based sensors. The use of these sensors for both single-point and distributed measurements of temperature, pressure, strain, and a select suite of gases will be addressed. Additional areas of research include novel control architectures and communication frameworks, device integration for distributed sensing, and imaging and other novel approaches to monitoring and controlling advanced processes.
The close coupling of the sensor program with process modeling and control will be discussed for the overarching goal of clean power production.

  16. Fluorescence Intensity- and Lifetime-Based Glucose Sensing Using Glucose/Galactose-Binding Protein

    PubMed Central

    Pickup, John C.; Khan, Faaizah; Zhi, Zheng-Liang; Coulter, Jonathan; Birch, David J. S.

    2013-01-01

    We review progress in our laboratories toward developing in vivo glucose sensors for diabetes that are based on fluorescence labeling of glucose/galactose-binding protein. Measurement strategies have included both monitoring glucose-induced changes in fluorescence resonance energy transfer and labeling with the environmentally sensitive fluorophore badan. Measuring fluorescence lifetime rather than intensity has particular potential advantages for in vivo sensing. A prototype fiber-optic-based glucose sensor using this technology is being tested. Fluorescence techniques are among the major solutions for achieving a continuous and noninvasive glucose sensor for diabetes. In this article, a highly sensitive nanostructured sensor is developed to detect extremely small amounts of aqueous glucose by applying fluorescence resonance energy transfer (FRET). A one-pot method is applied to produce the dextran-fluorescein isothiocyanate (FITC)-conjugated mesoporous silica nanoparticles (MSNs), which afterward interact with the tetramethylrhodamine isothiocyanate (TRITC)-labeled concanavalin A (Con A) to form the FRET nanoparticles (FITC-dextran-Con A-TRITC@MSNs). The nanostructured glucose sensor is then formed via the self-assembly of the FRET nanoparticles on a transparent, flexible, and biocompatible substrate, e.g., poly(dimethylsiloxane). Our results indicate the diameter of the MSNs is 60 ± 5 nm. The difference in the images before and after adding 20 μl of glucose (0.10 mmol/liter) to the FRET sensor can be detected in less than 2 min by confocal laser scanning microscopy. The correlation between the ratio of fluorescence intensity, I(donor)/I(acceptor), of the FRET sensor and the concentration of aqueous glucose in the range of 0.04–4 mmol/liter has been investigated; a linear relationship is found. Furthermore, the durability of the nanostructured FRET sensor is evaluated for 5 days.
In addition, the recorded images can be converted to digital images by obtaining the pixels from the resulting matrix using Matlab image processing functions. We have also studied the in vitro cytotoxicity of the device. The nanostructured FRET sensor may provide an alternative method to help patients manage the disease continuously. PMID:23439161
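The reported linear relationship suggests a straightforward calibration: fit the intensity ratio against known glucose concentrations, then invert the fit for unknown samples. The calibration numbers below are hypothetical, not the paper's data:

```python
import numpy as np

def fit_ratio_calibration(glucose, ratio):
    """Linear calibration of the donor/acceptor intensity ratio against
    glucose concentration (valid only over the sensor's linear range).
    Returns (slope, intercept)."""
    return tuple(np.polyfit(glucose, ratio, 1))

def glucose_from_ratio(ratio, slope, intercept):
    """Invert the calibration to estimate an unknown concentration."""
    return (ratio - intercept) / slope

# Hypothetical calibration points spanning the 0.04-4 mmol/liter range.
conc = np.array([0.04, 0.5, 1.0, 2.0, 4.0])   # mmol/liter
ratio = 0.8 + 0.35 * conc                     # I(donor)/I(acceptor)
slope, intercept = fit_ratio_calibration(conc, ratio)
unknown = glucose_from_ratio(0.8 + 0.35 * 1.5, slope, intercept)
print(round(unknown, 3))                      # -> 1.5
```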

  17. DynAMITe: a prototype large area CMOS APS for breast cancer diagnosis using x-ray diffraction measurements

    NASA Astrophysics Data System (ADS)

    Konstantinidis, A.; Anaxagoras, T.; Esposito, M.; Allinson, N.; Speller, R.

    2012-03-01

    X-ray diffraction studies are used to identify specific materials. Several laboratory-based x-ray diffraction studies have been made for breast cancer diagnosis. Ideally, a large area, low noise, linear and wide dynamic range digital x-ray detector is required to perform x-ray diffraction measurements. Recently, digital detectors based on Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor (APS) technology have been used in x-ray diffraction studies. Two APS detectors, namely Vanilla and the Large Area Sensor (LAS), were developed by the Multidimensional Integrated Intelligent Imaging (MI-3) consortium to cover a range of scientific applications including x-ray diffraction. The MI-3 Plus consortium developed a novel large area APS, named Dynamically Adjustable Medical Imaging Technology (DynAMITe), to combine the key characteristics of Vanilla and LAS with a number of extra features. The active area (12.8 × 13.1 cm2) of DynAMITe offers the ability to perform angle dispersive x-ray diffraction (ADXRD). The current study demonstrates the feasibility of using DynAMITe for breast cancer diagnosis by identifying six breast-equivalent plastics. Further work will be done to optimize the system in order to perform ADXRD for identification of suspicious areas of breast tissue following a conventional mammogram taken with the same sensor.

  18. Novel approach for low-cost muzzle flash detection system

    NASA Astrophysics Data System (ADS)

    Voskoboinik, Asher

    2008-04-01

    A low-cost muzzle flash detection system based on CMOS sensor technology is proposed. This low-cost technology makes it possible to detect various transient events with characteristic times from tens of microseconds up to tens of milliseconds, while sophisticated algorithms separate them from false alarms by exploiting differences in geometrical characteristics and/or temporal signatures. The proposed system consists of off-the-shelf smart CMOS cameras with built-in signal- and image-processing capabilities for pre-processing, together with allocated memory for storing a buffer of images for further post-processing. Such a sensor does not require sending large amounts of raw data to a real-time processing unit but performs all calculations in situ, with the processing results as the output of the sensor. This patented CMOS muzzle flash detection concept exhibits high-performance detection capability with very low false-alarm rates. It was found that most false alarms due to sun glints come from sources at distances of 500-700 meters from the sensor and can be distinguished from muzzle flash signals by time-examination techniques. This makes it possible to eliminate up to 80% of false alarms caused by sun specular reflections on the battlefield. An additional means of distinguishing sun glints from suspected muzzle flash signals is optimization of the spectral band in the near-IR region. The proposed system can be used for muzzle detection of small arms, missiles and rockets, and other military applications.
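The temporal-signature discrimination described above can be sketched as a simple duration gate. This is not the patented algorithm; the window bounds and event list below are hypothetical, chosen only to match the characteristic-time range quoted in the abstract:

```python
# Illustrative sketch: classify transient events by duration, keeping those
# inside the muzzle-flash window (tens of microseconds to tens of
# milliseconds) and rejecting longer-lived sources such as sun glints.
# Thresholds are assumed values, not the system's actual parameters.

FLASH_MIN_S = 20e-6  # assumed lower bound on flash duration, seconds
FLASH_MAX_S = 50e-3  # assumed upper bound

def classify_transient(duration_s):
    """Label a detected transient by its temporal signature."""
    if FLASH_MIN_S <= duration_s <= FLASH_MAX_S:
        return "candidate_flash"
    return "false_alarm"  # e.g. a persistent sun glint or sensor noise

# Durations (seconds) of four hypothetical detected transients
events = [5e-6, 100e-6, 2e-3, 0.5]
labels = [classify_transient(d) for d in events]
```

In a real system this gate would be only one stage, combined with the geometrical and spectral tests the abstract mentions.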

  19. Towards automated spectroscopic tissue classification in thyroid and parathyroid surgery.

    PubMed

    Schols, Rutger M; Alic, Lejla; Wieringa, Fokko P; Bouvy, Nicole D; Stassen, Laurents P S

    2017-03-01

    In (para-)thyroid surgery iatrogenic parathyroid injury should be prevented. To aid the surgeon's eye, a camera system enabling parathyroid-specific image enhancement would be useful. Hyperspectral camera technology might work, provided that the spectral signature of parathyroid tissue offers enough specific features to be reliably and automatically distinguished from surrounding tissues. As a first step to investigate this, we examined the feasibility of wide-band diffuse reflectance spectroscopy (DRS) for automated spectroscopic tissue classification, using silicon (Si) and indium-gallium-arsenide (InGaAs) sensors. DRS (350-1830 nm) was performed during (para-)thyroid resections. From the acquired spectra 36 features at predefined wavelengths were extracted. The best features for classifying parathyroid versus adipose or thyroid tissue were assessed by binary logistic regression for the Si and InGaAs sensor ranges. Classification performance was evaluated by leave-one-out cross-validation. In 19 patients 299 spectra were recorded (62 tissue sites: thyroid = 23, parathyroid = 21, adipose = 18). Classification accuracy of parathyroid-adipose was, respectively, 79% (Si), 82% (InGaAs) and 97% (Si/InGaAs combined). Parathyroid-thyroid classification accuracies were 80% (Si), 75% (InGaAs) and 82% (Si/InGaAs combined). Si and InGaAs sensors are fairly accurate for automated spectroscopic classification of parathyroid, adipose and thyroid tissues. Combining both sensor technologies improves accuracy. Follow-up research aimed towards hyperspectral imaging seems justified. Copyright © 2016 John Wiley & Sons, Ltd.
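The evaluation pipeline described here, binary logistic regression scored by leave-one-out cross-validation, can be sketched in plain Python. The one-feature synthetic dataset below stands in for the study's 36 spectral features; hyperparameters and data are assumptions for illustration:

```python
# Hedged sketch of logistic regression + leave-one-out cross-validation.
# The tiny synthetic dataset is NOT the study's reflectance data.
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Gradient-descent fit of w, b for P(y=1|x) = sigmoid(w.x + b)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi  # gradient of the log-loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 if z > 0 else 0

def loo_accuracy(X, y):
    """Leave-one-out CV: train on all-but-one sample, test on the held-out one."""
    hits = 0
    for i in range(len(X)):
        Xtr, ytr = X[:i] + X[i + 1:], y[:i] + y[i + 1:]
        w, b = train_logreg(Xtr, ytr)
        hits += predict(w, b, X[i]) == y[i]
    return hits / len(X)

# Synthetic, linearly separable 1-feature data standing in for spectra
X = [[0.1], [0.2], [0.3], [0.8], [0.9], [1.0]]
y = [0, 0, 0, 1, 1, 1]
acc = loo_accuracy(X, y)
```

With only a handful of samples per tissue site, leave-one-out is a natural choice because every fold still trains on nearly the full dataset.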

  20. Imaging lidar technology: development of a 3D-lidar elegant breadboard for rendezvous and docking, test results, and prospect to future sensor application

    NASA Astrophysics Data System (ADS)

    Moebius, B.; Pfennigbauer, M.; Pereira do Carmo, J.

    2017-11-01

    Over the past 15 years, Rendezvous and Docking Sensors (RVS) have been developed, manufactured and qualified. They have since been successfully applied in several space missions: for automatic docking of the European ATV "Jules Verne" to the International Space Station in 2008; for automatic berthing of the first Japanese HTV in 2009; and even earlier, the precursor model ARP-RVS made measurements during Shuttle Atlantis flights STS-84 and STS-86 to the MIR station. Up to now, about twenty RVS Flight Models for use on ATV, HTV and the American Cygnus spacecraft have been manufactured and delivered to the respective customers. RVS is designed to track customer-specific, cooperative targets (i.e. retroreflectors arranged in specific geometries). Once RVS has acquired the target, the sensor measures the distance to it by time-of-flight determination of a pulsed laser beam. Any echo return provokes an interrupt signal and thus the readout of the corresponding encoder positions of the two scan mirrors, which represent the azimuth and elevation directions to the target [2], [3]. The RVS's capability for 3D mapping of a scene makes the fully space-qualified RVS a true 3D lidar sensor, and thus a sound technical basis for the compact 3D lidar breadboard developed in the course of the Imaging Lidar Technology (ILT) project.
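The measurement principle above, range from pulse time-of-flight plus direction from the two scan-mirror encoder angles, reduces to a short calculation. A minimal sketch follows; the function names and the sensor-frame convention are assumptions, not details of the RVS design:

```python
# Minimal sketch of lidar point measurement: distance = c * t / 2 for the
# echo round trip, then spherical (range, azimuth, elevation) -> Cartesian.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Target distance from the round-trip time of a laser pulse."""
    return C * round_trip_s / 2.0

def to_cartesian(r, az_rad, el_rad):
    """Convert (range, azimuth, elevation) to sensor-frame x, y, z."""
    x = r * math.cos(el_rad) * math.cos(az_rad)
    y = r * math.cos(el_rad) * math.sin(az_rad)
    z = r * math.sin(el_rad)
    return x, y, z

r = tof_range(666.7e-9)           # ~100 m target (round trip ~667 ns)
point = to_cartesian(r, 0.0, 0.0)  # boresight direction
```

Sweeping the two mirrors while recording `(r, az, el)` per echo is what turns the single-point rangefinder into the 3D-mapping lidar described above.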

  1. Onboard Processor for Compressing HSI Data

    NASA Technical Reports Server (NTRS)

    Cook, Sid; Harsanyi, Joe; Day, John H. (Technical Monitor)

    2002-01-01

    With EO-1 Hyperion and MightySat in orbit NASA and the DoD are showing their continued commitment to hyperspectral imaging (HSI). As HSI sensor technology continues to mature, the ever-increasing amounts of sensor data generated will result in a need for more cost effective communication and data handling systems. Lockheed Martin, with considerable experience in spacecraft design and developing special purpose onboard processors, has teamed with Applied Signal & Image Technology (ASIT), who has an extensive heritage in HSI, to develop a real-time and intelligent onboard processing (OBP) system to reduce HSI sensor downlink requirements. Our goal is to reduce the downlink requirement by a factor greater than 100, while retaining the necessary spectral fidelity of the sensor data needed to satisfy the many science, military, and intelligence goals of these systems. Our initial spectral compression experiments leverage commercial-off-the-shelf (COTS) spectral exploitation algorithms for segmentation, material identification and spectral compression that ASIT has developed. ASIT will also support the modification and integration of this COTS software into the OBP. Other commercially available COTS software for spatial compression will also be employed as part of the overall compression processing sequence. Over the next year elements of a high-performance reconfigurable OBP will be developed to implement proven preprocessing steps that distill the HSI data stream in both spectral and spatial dimensions. The system will intelligently reduce the volume of data that must be stored, transmitted to the ground, and processed while minimizing the loss of information.

  2. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition.

    PubMed

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell'Orto, Vittorio; Savoini, Giovanni

    2017-05-24

    In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. As a consequence, ozone treatments of the housing areas could be performed to reduce the levels of atmospheric ammonia and bacterial load without risk, given ozone's toxicity, to either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. For this purpose a first sensor was developed, but limits related to the time necessary to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved by an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, carried out over three weeks. The first week was used to set up the sensor. Different templates for the pattern recognition were studied and different floor temperature shifts were investigated. At the end of these evaluations, an elliptical template of 135 × 63 pixels was chosen. Furthermore, a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the following field tests. The results showed an improvement in the sensor's detection accuracy, which reached sensitivity and specificity values of 95.1% and 98.7%. In addition, the time necessary to detect a hen, or classify a case, was reduced to two seconds. This result could allow the sensor to monitor a larger area of the housing system. Thus, the resulting monitoring system could make it possible to perform the sanitary treatments without risk to either animals or humans.
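The core detection idea, pixels warmer than the floor temperature plus a one-degree shift are foreground, can be sketched very simply. The real system matches an elliptical 135 × 63-pixel template; the toy grid, temperatures and area threshold below are hypothetical:

```python
# Simplified sketch of thermographic hen detection: threshold the image at
# floor temperature + 1 degree, then declare a detection when enough warm
# pixels are present. Grid size and min_area are illustrative assumptions.

SHIFT_C = 1.0  # temperature shift above the floor, per the abstract

def warm_mask(img, floor_temp):
    """Binary mask of pixels above the background threshold."""
    thr = floor_temp + SHIFT_C
    return [[1 if t > thr else 0 for t in row] for row in img]

def detect_hen(img, floor_temp, min_area=4):
    """Crude presence test: enough warm pixels to plausibly be a hen."""
    mask = warm_mask(img, floor_temp)
    return sum(map(sum, mask)) >= min_area

# Toy 4x4 thermographic frame (degrees C): a warm 2x2 blob on a ~20 C floor
frame = [
    [20.2, 20.4, 20.3, 20.1],
    [20.3, 35.0, 34.8, 20.2],
    [20.1, 34.9, 35.1, 20.3],
    [20.2, 20.3, 20.2, 20.4],
]
present = detect_hen(frame, floor_temp=20.3)
```

The published system refines this with template matching against the elliptical shape, which is what pushes specificity to the reported 98.7%.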

  3. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition

    PubMed Central

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell’Orto, Vittorio; Savoini, Giovanni

    2017-01-01

    In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. As a consequence, ozone treatments of the housing areas could be performed to reduce the levels of atmospheric ammonia and bacterial load without risk, given ozone's toxicity, to either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. For this purpose a first sensor was developed, but limits related to the time necessary to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved by an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, carried out over three weeks. The first week was used to set up the sensor. Different templates for the pattern recognition were studied and different floor temperature shifts were investigated. At the end of these evaluations, an elliptical template of 135 × 63 pixels was chosen. Furthermore, a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the following field tests. The results showed an improvement in the sensor's detection accuracy, which reached sensitivity and specificity values of 95.1% and 98.7%. In addition, the time necessary to detect a hen, or classify a case, was reduced to two seconds. This result could allow the sensor to monitor a larger area of the housing system. Thus, the resulting monitoring system could make it possible to perform the sanitary treatments without risk to either animals or humans. PMID:28538654

  4. Legally compatible design of digital dactyloscopy in future surveillance scenarios

    NASA Astrophysics Data System (ADS)

    Pocs, Matthias; Schott, Maik; Hildebrandt, Mario

    2012-06-01

    Innovation in multimedia systems impacts our society. For example, surveillance camera systems combine video and audio information. Currently a new sensor for capturing fingerprint traces is being researched. It combines greyscale images, to determine the intensity of the image signal, with topographic information, to determine fingerprint texture on a variety of surface materials. This research proposes new application areas, which are analyzed from a technical-legal viewpoint. It assesses how technology design can promote the legal criteria of German and European privacy and data protection, focusing on one technology goal as an example.

  5. Real-time image processing for non-contact monitoring of dynamic displacements using smartphone technologies

    NASA Astrophysics Data System (ADS)

    Min, Jae-Hong; Gelo, Nikolas J.; Jo, Hongki

    2016-04-01

    The smartphone application developed in this study, named RINO, measures absolute dynamic displacements and processes them in real time using state-of-the-art smartphone technologies: a high-performance graphics processing unit (GPU) in addition to an already powerful CPU and memory, an embedded high-speed, high-resolution camera, and open-source computer vision libraries. A carefully designed color-patterned target and a user-adjustable crop filter enable accurate and fast image processing, allowing up to 240 fps for complete displacement calculation and real-time display. The performance of the developed smartphone application is experimentally validated, showing accuracy comparable to that of a conventional laser displacement sensor.
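The displacement-measurement idea can be sketched as tracking the centroid of a segmented target between frames and scaling the pixel shift to physical units. This is a generic illustration, not the RINO app's actual pipeline; the binary-mask format and the scale factor are assumptions:

```python
# Hedged sketch of vision-based displacement measurement: locate the target
# in each frame as the centroid of a binary mask (e.g. from color
# thresholding), then convert the pixel shift to millimetres with a known
# scale factor. The masks and mm_per_pixel value are illustrative.

def centroid(mask):
    """Centroid (row, col) of the nonzero pixels in a binary mask."""
    pts = [(r, c) for r, row in enumerate(mask) for c, v in enumerate(row) if v]
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def displacement_mm(mask_ref, mask_cur, mm_per_pixel):
    """Physical displacement of the target between two frames."""
    r0, c0 = centroid(mask_ref)
    r1, c1 = centroid(mask_cur)
    return ((r1 - r0) ** 2 + (c1 - c0) ** 2) ** 0.5 * mm_per_pixel

ref = [[0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
cur = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 0, 0]]
d = displacement_mm(ref, cur, mm_per_pixel=0.5)  # target moved 1 px right
```

Cropping the image to a small region around the target, as the abstract's user-adjustable crop filter does, is what keeps such per-frame processing fast enough for 240 fps.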

  6. State of the art in video system performance

    NASA Technical Reports Server (NTRS)

    Lewis, Michael J.

    1990-01-01

    The closed-circuit television (CCTV) system onboard the Space Shuttle comprises a camera, a video signal switching and routing unit (VSU), and a video tape recorder. However, this system is inadequate for many experiments that require video imaging. In order to assess the state of the art in video technology and data storage systems, a survey was conducted of High Resolution, High Frame Rate Video Technology (HHVT) products. The performance of state-of-the-art solid-state cameras and image sensors, video recording systems, data transmission devices, and data storage systems is shown graphically against users' requirements.

  7. Analysis of the Advantages and Limitations of Stationary Imaging Fourier Transform Spectrometer. Revised

    NASA Technical Reports Server (NTRS)

    Beecken, Brian P.; Kleinman, Randall R.

    2004-01-01

    New developments in infrared sensor technology have potentially made possible a new space-based system which can measure far-infrared radiation at lower costs (mass, power and expense). The Stationary Imaging Fourier Transform Spectrometer (SIFTS) proposed by NASA Langley Research Center, makes use of new detector array technology. A mathematical model which simulates resolution and spectral range relationships has been developed for analyzing the utility of such a radically new approach to spectroscopy. Calculations with this forward model emulate the effects of a detector array on the ability to retrieve accurate spectral features. Initial computations indicate significant attenuation at high wavenumbers.

  8. On-chip copper-dielectric interference filters for manufacturing of ambient light and proximity CMOS sensors.

    PubMed

    Frey, Laurent; Masarotto, Lilian; D'Aillon, Patrick Gros; Pellé, Catherine; Armand, Marilyn; Marty, Michel; Jamin-Mornet, Clémence; Lhostis, Sandrine; Le Briz, Olivier

    2014-07-10

    Filter technologies implemented on CMOS image sensors for spectrally selective applications often use a combination of on-chip organic resists and an external substrate with multilayer dielectric coatings. The photopic-like and near-infrared bandpass filtering functions required by ambient light sensing and by user proximity detection through time-of-flight, respectively, can be fully integrated on chip with multilayer metal-dielectric filters. Copper, silicon nitride, and silicon oxide were selected for a technological proof of concept on functional wafers, due to their immediate availability in front-end semiconductor fabs. The filter optical designs are optimized with respect to specific performance criteria, and the robustness of the designs to process errors is evaluated for industrialization purposes.

  9. Membrane-mirror-based autostereoscopic display for tele-operation and telepresence applications

    NASA Astrophysics Data System (ADS)

    McKay, Stuart; Mair, Gordon M.; Mason, Steven; Revie, Kenneth

    2000-05-01

    An autostereoscopic display for telepresence and tele-operation applications has been developed at the University of Strathclyde in Glasgow, Scotland. The research is a collaborative effort between the Imaging Group and the Transparent Telepresence Research Group, both based at Strathclyde. A key component of the display is the directional screen; a 1.2-m diameter Stretchable Membrane Mirror is currently used. This patented technology enables large-diameter, small f-number mirrors to be produced at a fraction of the cost of conventional optics. Another key element of the present system is an anthropomorphic and anthropometric stereo camera sensor platform. Thus, in addition to mirror development, research areas include sensor platform design focused on sight, hearing, and smell; telecommunications; display systems for visual, aural and other senses; tele-operation; and augmented reality. The sensor platform is located at the remote site and transmits live video to the home location. Applications for this technology are as diverse as they are numerous, ranging from bomb disposal and other hazardous-environment applications to tele-conferencing, sales, education and entertainment.

  10. Gait Analysis Methods: An Overview of Wearable and Non-Wearable Systems, Highlighting Clinical Applications

    PubMed Central

    Muro-de-la-Herran, Alvaro; Garcia-Zapirain, Begonya; Mendez-Zorrilla, Amaia

    2014-01-01

    This article presents a review of the methods used in recognition and analysis of human gait from three different approaches: image processing, floor sensors, and sensors placed on the body. Progress in new technologies has led to the development of a series of devices and techniques which allow for objective evaluation, making measurements more efficient and effective and providing specialists with reliable information. Firstly, an introduction to the key gait parameters and semi-subjective methods is presented. Secondly, technologies and studies on the different objective methods are reviewed. Finally, based on the latest research, the characteristics of each method are discussed. Of the reviewed articles published in late 2012 and 2013, 40% were related to non-wearable systems, 37.5% presented inertial sensor-based systems, and the remaining 22.5% corresponded to other wearable systems. An increasing number of research works demonstrate that, on parameters such as precision, conformability, usability and transportability, portable systems based on body sensors are promising methods for gait analysis. PMID:24556672

  11. The pivotal role of multimodality reporter sensors in drug discovery: from cell based assays to real time molecular imaging.

    PubMed

    Ray, Pritha

    2011-04-01

    Development and marketing of new drugs require stringent validation that is expensive and time consuming. Non-invasive multimodality molecular imaging using reporter genes holds great potential to expedite these processes at reduced cost. New generations of smarter molecular imaging strategies, such as split-reporter, bioluminescence resonance energy transfer, and multimodality fusion reporter technologies, will further help streamline and shorten the drug discovery and development process. This review illustrates the importance and potential of molecular imaging using multimodality reporter genes in drug development at the preclinical phase.

  12. A Brief Overview of NASA Glenn Research Center Sensor and Electronics Activities

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.

    2012-01-01

    Aerospace applications require a range of sensing technologies. A variety of sensor and sensor-system technologies are being developed using microfabrication and micromachining to form smart sensor systems and intelligent microsystems, driving system intelligence down to the local (sensor) level as distributed smart sensor systems. Development examples include (1) thin-film physical sensors, (2) high-temperature electronics and wireless devices, and (3) "lick and stick" technology. NASA GRC is a world leader in aerospace sensor technology with a broad range of development and application experience, and its core microsystems technology is applicable to a range of application environments.

  13. BreedVision--a multi-sensor platform for non-destructive field-based phenotyping in plant breeding.

    PubMed

    Busemeyer, Lucas; Mentrup, Daniel; Möller, Kim; Wunder, Erik; Alheit, Katharina; Hahn, Volker; Maurer, Hans Peter; Reif, Jochen C; Würschum, Tobias; Müller, Joachim; Rahe, Florian; Ruckelshausen, Arno

    2013-02-27

    Achieving food and energy security for an increasing world population, likely to exceed nine billion by 2050, represents a major challenge for plant breeding. Our ability to measure traits under field conditions has improved little over the last decades and currently constitutes a major bottleneck in crop improvement. This work describes the development of a tractor-pulled multi-sensor phenotyping platform for small grain cereals, with a focus on the technological development of the system. Various optical sensors such as light curtain imaging, 3D time-of-flight cameras, laser distance sensors, and hyperspectral as well as color imaging are integrated into the system to collect spectral and morphological information about the plants. The study specifies the mechanical design, the system architecture for data collection and processing, the phenotyping procedure of the integrated system, results from field trials for data quality evaluation, and calibration results for plant height determination as a quantified example of a platform application. Repeated measurements were taken at three developmental stages of the plants in 2011 and 2012, employing triticale (×Triticosecale Wittmack L.) as a model species. The technical repeatability of the measurement results was high for nearly all sensor types, confirming the high suitability of the platform under field conditions. The developed platform constitutes a robust basis for the development and calibration of further sensor and multi-sensor fusion models to measure various agronomic traits such as plant moisture content, lodging, tiller density or biomass yield, and thus represents a major step towards widening the bottleneck of non-destructive phenotyping for crop improvement and plant genetic studies.

  14. BreedVision — A Multi-Sensor Platform for Non-Destructive Field-Based Phenotyping in Plant Breeding

    PubMed Central

    Busemeyer, Lucas; Mentrup, Daniel; Möller, Kim; Wunder, Erik; Alheit, Katharina; Hahn, Volker; Maurer, Hans Peter; Reif, Jochen C.; Würschum, Tobias; Müller, Joachim; Rahe, Florian; Ruckelshausen, Arno

    2013-01-01

    Achieving food and energy security for an increasing world population, likely to exceed nine billion by 2050, represents a major challenge for plant breeding. Our ability to measure traits under field conditions has improved little over the last decades and currently constitutes a major bottleneck in crop improvement. This work describes the development of a tractor-pulled multi-sensor phenotyping platform for small grain cereals, with a focus on the technological development of the system. Various optical sensors such as light curtain imaging, 3D time-of-flight cameras, laser distance sensors, and hyperspectral as well as color imaging are integrated into the system to collect spectral and morphological information about the plants. The study specifies the mechanical design, the system architecture for data collection and processing, the phenotyping procedure of the integrated system, results from field trials for data quality evaluation, and calibration results for plant height determination as a quantified example of a platform application. Repeated measurements were taken at three developmental stages of the plants in 2011 and 2012, employing triticale (×Triticosecale Wittmack L.) as a model species. The technical repeatability of the measurement results was high for nearly all sensor types, confirming the high suitability of the platform under field conditions. The developed platform constitutes a robust basis for the development and calibration of further sensor and multi-sensor fusion models to measure various agronomic traits such as plant moisture content, lodging, tiller density or biomass yield, and thus represents a major step towards widening the bottleneck of non-destructive phenotyping for crop improvement and plant genetic studies. PMID:23447014

  15. Development of a C-scan photoacoustic imaging probe for prostate cancer detection

    NASA Astrophysics Data System (ADS)

    Valluru, Keerthi S.; Chinni, Bhargava K.; Rao, Navalgund A.; Bhatt, Shweta; Dogra, Vikram S.

    2011-03-01

    Prostate cancer is the second leading cause of cancer death in American men after lung cancer. Current screening procedures include the Digital Rectal Exam (DRE) and Prostate Specific Antigen (PSA) test, along with Transrectal Ultrasound (TRUS). All suffer from low sensitivity and specificity in detecting prostate cancer in its early stages, so there is a pressing need for a new imaging modality. We are developing a prototype transrectal photoacoustic imaging probe to detect prostate malignancies in vivo that promises high sensitivity and specificity. To generate photoacoustic (PA) signals, the probe uses a high-energy 1064 nm laser that delivers light pulses onto the prostate at 10 Hz with 10 ns duration through a fiber-optic cable. The designed system will generate focused C-scan planar images using acoustic lens technology. A 5 MHz custom-fabricated ultrasound sensor array located in the image plane acquires the focused PA signals, eliminating the need for any synthetic-aperture focusing; the lens and sensor array design was optimized towards this objective. For fast acquisition, a custom-built 16-channel simultaneous backend electronics PCB has been developed, consisting of a low-noise variable-gain amplifier and a 16-channel ADC. Because 2D ultrasound arrays were unavailable, in the current implementation several B-scan (depth-resolved) data sets are first acquired by scanning a 1D array and then processed to reconstruct either 3D volumetric images or several C-scan planar images. Experimental results on excised tissue using an in-vitro prototype of this technology are presented to demonstrate the system's capability in terms of resolution and sensitivity.
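The reconstruction step mentioned above, assembling C-scan planar images from a stack of depth-resolved B-scans, is essentially a re-slicing of a 3D volume. A minimal sketch with an arbitrary toy volume (not the probe's actual data layout) follows:

```python
# Minimal sketch: stack B-scans (each indexed [lateral][depth]) into a
# volume, then slice out the planar C-scan image at one fixed depth.
# The toy volume dimensions and values are arbitrary.

def cscan_at_depth(bscans, depth_idx):
    """Extract the C-scan plane at one depth from stacked B-scans.

    bscans: list of 2D frames, each indexed [lateral][depth].
    Returns a 2D image indexed [scan position][lateral position]."""
    return [[line[depth_idx] for line in frame] for frame in bscans]

# Toy stack: 2 B-scans, 3 lateral positions, 4 depth samples each
bscans = [
    [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]],
    [[12, 13, 14, 15], [16, 17, 18, 19], [20, 21, 22, 23]],
]
cscan = cscan_at_depth(bscans, depth_idx=2)
```

Repeating the slice over every depth index yields the full 3D volumetric image the abstract describes as the alternative output.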

  16. A LWIR hyperspectral imager using a Sagnac interferometer and cooled HgCdTe detector array

    NASA Astrophysics Data System (ADS)

    Lucey, Paul G.; Wood, Mark; Crites, Sarah T.; Akagi, Jason

    2012-06-01

    LWIR hyperspectral imaging has a wide range of civil and military applications owing to its ability to sense chemical compositions at standoff ranges. Most recent implementations of this technology use spectrographs employing varying degrees of cryogenic cooling to reduce the sensor self-emission that can severely limit sensitivity. We have taken an interferometric approach that promises to reduce the need for cooling while preserving high resolution. Reduced cooling has multiple benefits, including faster system readiness from a power-off state, lower mass, and potentially lower cost owing to lower system complexity. We coupled an uncooled Sagnac interferometer with a 256 × 320 mercury cadmium telluride array with an 11-micron cutoff to produce a spatial interferometric LWIR hyperspectral imaging system operating from 7.5 to 11 microns. The sensor was tested in ground-to-ground applications and from a small aircraft, producing spectral imagery including detection of gas emission from high-vapor-pressure liquids.

  17. Intelligent Network-Centric Sensors Development Program

    DTIC Science & Technology

    2012-07-31

    The report documents image sensor configurations for the program, including cone 360-degree LWIR, MWIR and SWIR "PFx" sensors with image and video configurations, and describes an ontological reasoning process for matching sensor systems to algorithms. It also discusses nonuniformity in active imaging, attributed both to coherent-imaging effects caused by aberrations and to the specular nature of active imaging.

  18. Berkeley Lab Wins Seven 2015 R&D 100 Awards | Berkeley Lab

    Science.gov Websites

    The seven award-winning technologies, selected from products from industry, academia, and government-sponsored research, range from chemistry to materials to metrology techniques, including the quantitative characterization of imaging instrumentation, whose development was led by the Computational Research Division.

  19. Data Acquisition Using Xbox Kinect Sensor

    ERIC Educational Resources Information Center

    Ballester, Jorge; Pheatt, Charles B.

    2012-01-01

    The study of motion is central in physics education and has taken many forms as technology has provided numerous methods to acquire data. For example, the analysis of still or moving images is particularly effective in discussions of two-dimensional motion. Introductory laboratory measurement methods have progressed through water clocks, spark…

  20. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    PubMed Central

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function; its main contribution is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  1. I-ImaS: intelligent imaging sensors

    NASA Astrophysics Data System (ADS)

    Griffiths, J.; Royle, G.; Esbrand, C.; Hall, G.; Turchetta, R.; Speller, R.

    2010-08-01

    Conventional x-radiography uniformly irradiates the relevant region of the patient. Across that region, however, there is likely to be significant variation in both the thickness and pathological composition of the tissues present, which means that the x-ray exposure conditions selected, and consequently the image quality achieved, are a compromise. The I-ImaS concept eliminates this compromise by intelligently scanning the patient to identify the important diagnostic features, which are then used to adaptively control the x-ray exposure conditions at each point in the patient. In this way optimal image quality is achieved throughout the region of interest whilst maintaining or reducing the dose. An I-ImaS system has been built under an EU Framework 6 project and has undergone pre-clinical testing. The system is based upon two rows of sensors controlled via an FPGA based DAQ board. Each row consists of a 160 mm × 1 mm linear array of ten scintillator coated 3T CMOS APS devices with 32 μm pixels and a readable array of 520 × 40 pixels. The first sensor row scans the patient using a fraction of the total radiation dose to produce a preview image, which is then interrogated to identify the optimal exposure conditions at each point in the image. A signal is then sent to control a beam filter mechanism to appropriately moderate x-ray beam intensity at the patient as the second row of sensors follows behind. Tests performed on breast tissue sections found that the contrast-to-noise ratio in over 70% of the images was increased by an average of 15% at an average dose reduction of 9%. The same technology is currently also being applied to baggage scanning for airport security.
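The adaptive-exposure idea above, analyse a low-dose preview and modulate the beam per region, can be sketched as a simple per-region gain computation. The target level, clamping bounds, and region layout are hypothetical choices, not the I-ImaS system's parameters:

```python
# Illustrative sketch of preview-driven exposure adaptation: dark (dense)
# regions of the low-dose preview get a larger exposure factor, bright
# (thin) regions a smaller one. Target and clamp values are assumptions.

def exposure_factors(preview_means, target=100.0, lo=0.2, hi=5.0):
    """Per-region scale factor so that mean signal approaches the target."""
    factors = []
    for m in preview_means:
        f = target / m if m > 0 else hi  # darker region -> more exposure
        factors.append(max(lo, min(hi, f)))  # clamp to the filter's range
    return factors

# Mean preview signal in four regions along the scan direction
preview = [50.0, 100.0, 400.0, 10.0]
factors = exposure_factors(preview)
```

In the scanned geometry described above, such factors would drive the beam-filter mechanism ahead of the second sensor row, region by region.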

  2. OAST Space Theme Workshop. Volume 3: Working group summary. 3: Sensors (E-3). A. Statement. B. Technology needs (form 1). C. Priority assessment (form 2). D. Additional assessment

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Developments required to support the space power, SETI, solar system exploration and global services programs are identified. Instrumentation and calibration sensors (rather than scientific ones) are needed for the space power system. SETI needs highly sophisticated receivers for narrowband detection of microwave signals, and sensors for automated stellar cataloging to provide a mapping data base. Various phases of solar system exploration require large-area solid state imaging arrays from UV to IR; a long focal plane telescope; high energy particle detectors; advanced spectrometers; a gravimeter; an atmospheric dust analyzer; sensors for penetrometers; and in-situ sensors for surface chemical analysis, life detection, spectroscopic and microscopic analyses of surface soils, and meteorological measurements. Active and passive multiapplication sensors, advanced multispectral scanners with improved resolution in the UV and IR ranges, and laser techniques for advanced probing and oceanographic characterization will enhance the global services program.

  3. First Experiences with Kinect v2 Sensor for Close Range 3d Modelling

    NASA Astrophysics Data System (ADS)

    Lachat, E.; Macher, H.; Mittet, M.-A.; Landes, T.; Grussenmeyer, P.

    2015-02-01

    RGB-D cameras, also known as range imaging cameras, are a recent generation of sensors. As they are suitable for measuring distances to objects at a high frame rate, such sensors are increasingly used for 3D acquisition, and more generally for applications in robotics and computer vision. This kind of sensor became popular especially after the Kinect v1 (Microsoft) arrived on the market in November 2010. In July 2014, Microsoft released a new sensor, the Kinect for Windows v2, based on a different technology from the first device. However, because it was initially developed for video games, assessing the quality of this new device for 3D modelling is a major investigation axis. In this paper, first experiences with the Kinect v2 sensor are related, and its suitability for close-range 3D modelling is investigated. For this purpose, error sources in the output data as well as a calibration approach are presented.

  4. Next Generation Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Lee, Jimmy; Spencer, Susan; Bryan, Tom; Johnson, Jimmie; Robertson, Bryan

    2008-01-01

    The first autonomous rendezvous and docking in the history of the U.S. Space Program was successfully accomplished by Orbital Express, using the Advanced Video Guidance Sensor (AVGS) as the primary docking sensor. The United States now has a mature and flight-proven sensor technology for supporting Crew Exploration Vehicle (CEV) and Commercial Orbital Transportation Services (COTS) Automated Rendezvous and Docking (AR&D). AVGS has a proven pedigree, based on extensive ground testing and flight demonstrations. The AVGS on the Demonstration of Autonomous Rendezvous Technology (DART) mission operated successfully in "spot mode" out to 2 km. The first generation rendezvous and docking sensor, the Video Guidance Sensor (VGS), was developed and successfully flown on Space Shuttle flights in 1997 and 1998. Parts obsolescence issues prevent the construction of more AVGS units, and the next generation sensor must be updated to support the CEV and COTS programs. The flight-proven AR&D sensor is being redesigned to update parts and add capabilities for CEV and COTS with the development of the Next Generation AVGS (NGAVGS) at the Marshall Space Flight Center. The obsolete imager and processor are being replaced with new radiation-tolerant parts. In addition, new capabilities might include greater sensor range, auto ranging, and real-time video output. This paper presents an approach to sensor hardware trades, the use of highly integrated laser components, and the needs of future vehicles that may rendezvous and dock with the International Space Station (ISS) and other Constellation vehicles. It also discusses approaches for upgrading AVGS to address parts obsolescence, concepts for minimizing the sensor footprint, weight, and power requirements, and parts selection and test plans for the NGAVGS to provide a highly reliable, flight-qualified sensor. Expanded capabilities through innovative use of existing capabilities will also be discussed.

  5. Image quality evaluation of eight complementary metal-oxide semiconductor intraoral digital X-ray sensors.

    PubMed

    Teich, Sorin; Al-Rawi, Wisam; Heima, Masahiro; Faddoul, Fady F; Goldzweig, Gil; Gutmacher, Zvi; Aizenbud, Dror

    2016-10-01

    To evaluate the image quality generated by eight commercially available intraoral sensors, eighteen clinicians ranked the quality of a bitewing acquired from one subject using eight different intraoral sensors. The analytical methods used to evaluate clinical image quality included the Visual Grading Characteristics method, which quantifies subjective opinions to make them suitable for analysis. The Dexis sensor was ranked significantly better than the Sirona and Carestream-Kodak sensors, and the image captured using the Carestream-Kodak sensor was ranked significantly worse than those captured using the Dexis, Schick and Cyber Medical Imaging sensors. The Image Works sensor image was rated the lowest by all clinicians. Other comparisons were not statistically significant. Overall, none of the sensors was considered to generate images of significantly better quality than the other sensors tested. Further research should be directed towards determining the clinical significance of the differences in image quality reported in this study. © 2016 FDI World Dental Federation.

  6. Monolithic microwave integrated circuits for sensors, radar, and communications systems; Proceedings of the Meeting, Orlando, FL, Apr. 2-4, 1991

    NASA Technical Reports Server (NTRS)

    Leonard, Regis F. (Editor); Bhasin, Kul B. (Editor)

    1991-01-01

    Consideration is given to MMICs for airborne phased arrays, monolithic GaAs integrated circuit millimeter wave imaging sensors, accurate design of multiport low-noise MMICs up to 20 GHz, an ultralinear low-noise amplifier technology for space communications, variable-gain MMIC module for space applications, a high-efficiency dual-band power amplifier for radar applications, a high-density circuit approach for low-cost MMIC circuits, coplanar SIMMWIC circuits, recent advances in monolithic phased arrays, and system-level integrated circuit development for phased-array antenna applications. Consideration is also given to performance enhancement in future communications satellites with MMIC technology insertion, application of Ka-band MMIC technology for an Orbiter/ACTS communications experiment, a space-based millimeter wave debris tracking radar, low-noise high-yield octave-band feedback amplifiers to 20 GHz, quasi-optical MESFET VCOs, and a high-dynamic-range mixer using novel balun structure.

  7. Sensor for In-Motion Continuous 3D Shape Measurement Based on Dual Line-Scan Cameras

    PubMed Central

    Sun, Bo; Zhu, Jigui; Yang, Linghui; Yang, Shourui; Guo, Yin

    2016-01-01

    The acquisition of three-dimensional surface data plays an increasingly important role in the industrial sector. Numerous 3D shape measurement techniques have been developed. However, there are still limitations and challenges in the fast measurement of large-scale or high-speed moving objects. Innovative line-scan technology opens up new possibilities owing to its ultra-high resolution and line rate. To this end, a sensor for in-motion continuous 3D shape measurement based on dual line-scan cameras is presented. In this paper, the principle and structure of the sensor are investigated. The image matching strategy is addressed and the matching error is analyzed. The sensor has been verified by experiments and high-quality results are obtained. PMID:27869731
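
    Once a point has been matched between the two cameras, its depth follows from the usual rectified-stereo relation z = f·b/d. The helper below only illustrates that geometry under assumed units; it is not the calibration model of the sensor described in the paper.

    ```python
    def depth_from_disparity(disparity_px, focal_px, baseline_mm):
        """Depth of a matched point from rectified-stereo geometry:
        z = f * b / d, with the focal length in pixels, the baseline in
        millimetres, and the disparity (matching offset) in pixels.
        Returns depth in millimetres."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return focal_px * baseline_mm / disparity_px
    ```

    The formula also shows why matching error matters: depth uncertainty grows quadratically with range for a fixed disparity error, which is why the paper analyzes the matching error explicitly.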

  8. KSC-2010-4679

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  9. KSC-2010-4678

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  10. KSC-2010-4680

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  11. KSC-2010-4681

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  12. KSC-2010-4683

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  13. KSC-2010-4677

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is prepared for installation while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  14. KSC-2010-4682

    NASA Image and Video Library

    2010-07-28

    CAPE CANAVERAL, Fla. -- A DragonEye proximity sensor developed by Space Exploration Technologies (SpaceX) is installed while space shuttle Discovery is in Orbiter Processing Facility-3 at NASA's Kennedy Space Center in Florida. DragonEye is a Laser Imaging Detection and Ranging (LIDAR) sensor that will be tested on Discovery's docking operation with the International Space Station. Discovery's STS-133 mission, targeted to launch Nov. 1, will be the second demonstration of the sensor, following shuttle Endeavour's STS-127 mission in 2009. The DragonEye sensor will guide SpaceX's Dragon spacecraft as it approaches and berths to the station on future cargo re-supply missions. The Dragon spacecraft is a free-flying, reusable spacecraft being developed by SpaceX, which is contracted by NASA's Commercial Orbital Transportation Services (COTS) program. Photo credit: NASA/Jim Grossmann

  15. Nanocrystalline ZnON; High mobility and low band gap semiconductor material for high performance switch transistor and image sensor application

    PubMed Central

    Lee, Eunha; Benayad, Anass; Shin, Taeho; Lee, HyungIk; Ko, Dong-Su; Kim, Tae Sang; Son, Kyoung Seok; Ryu, Myungkwan; Jeon, Sanghun; Park, Gyeong-Su

    2014-01-01

    Interest in oxide semiconductors stems from their ease of processing, relatively high mobility (0.3–10 cm2/Vs), and wide band gap. However, for practical future electronic devices the channel mobility should be increased beyond 50 cm2/Vs, and a wide band gap is not suitable for photo/image sensor applications. The incorporation of nitrogen into the ZnO semiconductor can be tailored to increase channel mobility, enhance optical absorption across the whole visible range, and form a uniform microstructure, satisfying the attributes essential for high-performance transistors and visible-light photo-sensors on a large-area platform. Here we present the electronic, optical and microstructural properties of ZnON, a composite of Zn3N2 and ZnO. Well-optimized ZnON material presents a high mobility exceeding 100 cm2V−1s−1, a band gap of 1.3 eV and a multiphase nanocrystalline structure. We found that the mobility, microstructure, electronic structure, band gap and trap properties of ZnON vary with the nitrogen concentration in ZnO. Accordingly, the performance of ZnON-based devices can be adjusted to meet the requirements of both switching devices and image sensors. These results demonstrate how the device and material attributes of ZnON can be optimized for new device strategies in display technology, and we expect ZnON to be applicable to a wide range of imaging/display devices. PMID:24824778

  16. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization.

    PubMed

    Hakala, Teemu; Markelin, Lauri; Honkavaara, Eija; Scott, Barry; Theocharous, Theo; Nevalainen, Olli; Näsi, Roope; Suomalainen, Juha; Viljanen, Niko; Greenwell, Claire; Fox, Nigel

    2018-05-03

    Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common, as they provide more abundant information about the object compared to traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor enables physical modelling of the imaging process and efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing, based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications, as it does not require in situ reflectance panels for converting sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer based (FPI) hyperspectral camera at the National Physical Laboratory (NPL, Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of different channels was within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels due to multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK).
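
    The core of the direct-reflectance idea is the Lambertian relation R = π·L/E between the calibrated at-sensor radiance and the simultaneously measured downwelling irradiance. The helper below is a simplified sketch of that single step; the actual processing chain also applies the SI-traceable spectral and radiometric calibration discussed in the record.

    ```python
    import numpy as np

    def radiance_to_reflectance(radiance, irradiance):
        """Convert spectral radiance L (W m^-2 sr^-1 nm^-1) to a
        reflectance factor via the Lambertian relation R = pi * L / E,
        where E is the downwelling irradiance (W m^-2 nm^-1) measured at
        the same moment by the onboard irradiance spectrometer. Works
        per band thanks to NumPy broadcasting."""
        radiance = np.asarray(radiance, dtype=float)
        irradiance = np.asarray(irradiance, dtype=float)
        return np.pi * radiance / irradiance
    ```

    Because E is measured in flight, no ground reference panel is needed, which is precisely what makes the approach attractive for forest campaigns where placing panels is impractical.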

  17. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization

    PubMed Central

    Hakala, Teemu; Scott, Barry; Theocharous, Theo; Näsi, Roope; Suomalainen, Juha; Greenwell, Claire; Fox, Nigel

    2018-01-01

    Drone-based remote sensing has evolved rapidly in recent years. Miniaturized hyperspectral imaging sensors are becoming more common, as they provide more abundant information about the object compared to traditional cameras. Reflectance is a physically defined object property and is therefore often the preferred output of remote sensing data capture for use in further processing. Absolute calibration of the sensor enables physical modelling of the imaging process and efficient procedures for reflectance correction. Our objective is to develop a method for direct reflectance measurements for drone-based remote sensing, based on an imaging spectrometer and an irradiance spectrometer. This approach is highly attractive for many practical applications, as it does not require in situ reflectance panels for converting sensor radiance to ground reflectance factors. We performed SI-traceable spectral and radiance calibration of a tuneable Fabry-Pérot interferometer based (FPI) hyperspectral camera at the National Physical Laboratory (NPL, Teddington, UK). The camera represents novel technology, collecting 2D-format hyperspectral image cubes using a time-sequential spectral scanning principle. The radiance accuracy of different channels was within ±4% when evaluated using independent test data, and the linearity of the camera response was on average 0.9994. The spectral response calibration showed side peaks on several channels due to multiple orders of interference of the FPI. The drone-based direct reflectance measurement system showed promising results with imagery collected over Wytham Forest (Oxford, UK). PMID:29751560

  18. Assessment of the metrological performance of an in situ storage image sensor ultra-high speed camera for full-field deformation measurements

    NASA Astrophysics Data System (ADS)

    Rossi, Marco; Pierron, Fabrice; Forquin, Pascal

    2014-02-01

    Ultra-high speed (UHS) cameras allow images to be acquired at up to about 1 million frames s-1 at a full spatial resolution of the order of 1 Mpixel. Different technologies are available nowadays to achieve this performance; an interesting one is the so-called in situ storage image sensor architecture, where the image storage is incorporated into the sensor chip. Such an architecture is all solid state and contains no movable devices such as occur, for instance, in rotating-mirror UHS cameras. One of the disadvantages of this system is the low fill factor (around 76% in the vertical direction and 14% in the horizontal direction), since most of the space in the sensor is occupied by memory. This peculiarity introduces a series of systematic errors when the camera is used to perform full-field strain measurements. The aim of this paper is to develop an experimental procedure to thoroughly characterize the performance of this kind of camera in full-field deformation measurement and to identify the operating conditions which minimize the measurement errors. A series of tests was performed on a Shimadzu HPV-1 UHS camera, first using uniform scenes and then grids under rigid movements. The grid method was used as the full-field optical measurement technique. From these tests it was possible to characterize the camera behaviour and use this information to improve actual measurements.
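
    The grid method mentioned above recovers displacements from the phase change of a periodic pattern: u = −p/(2π)·Δφ for a grid of pitch p. The minimal 1-D demodulation below (the carrier and one-pitch averaging window are illustrative choices) shows the principle; it is not the processing actually used with the HPV-1 camera.

    ```python
    import numpy as np

    def grid_phase(signal, pitch_px):
        """Local phase of a 1-D grid pattern by spatial demodulation:
        multiply by a complex carrier at the grid frequency, then average
        over one pitch to suppress the double-frequency term."""
        x = np.arange(signal.shape[-1])
        carrier = np.exp(-2j * np.pi * x / pitch_px)
        analytic = signal * carrier
        kernel = np.ones(int(pitch_px)) / int(pitch_px)  # one-pitch window
        filtered = np.convolve(analytic, kernel, mode="same")
        return np.angle(filtered)

    def displacement(phase_ref, phase_def, pitch_px):
        """Displacement field from the phase change between the reference
        and deformed grid images, with the phase difference wrapped to
        (-pi, pi] before scaling by -p / (2*pi)."""
        dphi = np.angle(np.exp(1j * (phase_def - phase_ref)))
        return -pitch_px / (2 * np.pi) * dphi
    ```

    Systematic errors of the kind studied in the paper (fill-factor and pixel-layout effects) show up as spurious phase, and hence spurious displacement, in exactly this pipeline, which is why the camera characterization is done through the same grid processing.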

  19. NASA's Marshall Space Flight Center Recent Studies and Technology Developments in the Area of SSA/Orbital Debris

    NASA Technical Reports Server (NTRS)

    Wiegmann, Bruce M.; Hovater, Mary; Kos, Larry

    2012-01-01

    NASA/MSFC has been investigating the various aspects of the growing orbital debris problem since early 2009. Data shows that debris ranging in size from 5 mm to 10 cm presents the greatest threat to operational spacecraft today; therefore, MSFC has focused its efforts on small orbital debris. Using off-the-shelf analysis packages, such as the ESA MASTER software, analysts at MSFC have begun to characterize the small debris environment in LEO to support several spacecraft concept studies and hardware test programs addressing the characterization, mitigation, and ultimate removal, if necessary, of small debris. The Small Orbital Debris Active Removal (SODAR) architectural study investigated the overall effectiveness of removing small orbital debris from LEO using a low power, space-based laser. The Small Orbital Debris Detection, Acquisition, and Tracking (SODDAT) conceptual technology demonstration spacecraft was developed to address the challenges of in-situ small orbital debris environment classification, including debris observability and instrument requirements for small debris observation. Work is underway at MSFC in the areas of hardware and testing. By combining off-the-shelf digital video technology, telescope lenses, and advanced FPGA video image processing, MSFC is building a breadboard of a space-based, passive orbital tracking camera that can detect and track faint objects (including small debris, satellites, rocket bodies, and NEOs) at ranges of tens to hundreds of kilometers and at speeds in excess of 15 km/s. MSFC is also sponsoring the development of a one-of-a-kind Dynamic Star Field Simulator with a high-resolution large monochrome display and a custom collimator capable of projecting realistic star images with simple orbital debris spots (down to star magnitude 11-12) into a passive orbital detection and tracking system with simulated real-time angular motions of the vehicle-mounted sensor. The dynamic star field simulator can be expanded for multiple sensors (including advanced star trackers), real-time vehicle pointing inputs, and more complex orbital debris images. This system is also adaptable to other sensor optics, missions, and installed sensor testing.

  20. Calibration, reconstruction, and rendering of cylindrical millimeter-wave image data

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Hall, Thomas E.

    2011-05-01

    Cylindrical millimeter-wave imaging systems and technology have been under development at the Pacific Northwest National Laboratory (PNNL) for several years. This technology has been commercialized, and systems are currently being deployed widely across the United States and internationally. These systems are effective at screening for concealed items of all types; however, new sensor designs, image reconstruction techniques, and image rendering algorithms could potentially improve performance. At PNNL, a number of specific techniques have been developed recently to improve cylindrical imaging methods including wideband techniques, combining data from full 360-degree scans, polarimetric imaging techniques, calibration methods, and 3-D data visualization techniques. Many of these techniques exploit the three-dimensionality of the cylindrical imaging technique by optimizing the depth resolution of the system and using this information to enhance detection. Other techniques, such as polarimetric methods, exploit scattering physics of the millimeter-wave interaction with concealed targets on the body. In this paper, calibration, reconstruction, and three-dimensional rendering techniques will be described that optimize the depth information in these images and the display of the images to the operator.
