Sample records for airborne multispectral camera

  1. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    USDA-ARS's Scientific Manuscript database

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  2. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) were used to generate matching points. For the difficult registration problem between visible band images and near-infrared band images in cases lacking manmade objects, we presented an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high quality multispectral images and the band-to-band alignment error of the composed multiple spectral images is less than 2.5 pixels. PMID:26205264
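
    The band-alignment step described above follows a standard SIFT + RANSAC homography pipeline. A minimal sketch of that general approach is shown below using OpenCV; it is not the authors' code, and the synthetic images stand in for two co-acquired band images.

    ```python
    import cv2
    import numpy as np

    # Synthetic stand-ins: a textured reference band and a warped copy playing the
    # role of a misregistered NIR band (real use would load the two band images).
    rng = np.random.default_rng(0)
    noise = (rng.random((600, 800)) * 255).astype(np.uint8)
    ref = cv2.normalize(cv2.GaussianBlur(noise, (0, 0), 3), None, 0, 255, cv2.NORM_MINMAX)
    H_true = np.array([[1.0, 0.01, 8.0], [-0.01, 1.0, -5.0], [0.0, 0.0, 1.0]])
    mov = cv2.warpPerspective(ref, H_true, (800, 600))

    # Detect and match SIFT features between the two bands.
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(ref, None)
    k2, d2 = sift.detectAndCompute(mov, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d2, d1, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC rejects outlier matches while estimating the band-to-band homography.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    aligned = cv2.warpPerspective(mov, H, (ref.shape[1], ref.shape[0]))
    print("estimated homography:\n", np.round(H, 3))
    ```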

  3. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for small-aircraft multispectral imaging applications. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system) and an embedded computer. The embedded computer offers excellent universality and expansibility, and its small volume and weight are advantages on an airborne platform, so it can meet the requirements of the control system for the integrated airborne multispectral imaging system. The embedded computer controls the camera parameter settings, the operation of the filter wheel and stabilized platform, and the acquisition of image and POS data, and stores the images and data. Peripheral devices can be connected through the ports of the embedded computer, so system operation and management of the stored image data are straightforward. The system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiments show that the system has potential for multispectral remote sensing applications such as resource investigation and environmental monitoring.

  4. COBRA ATD multispectral camera response model

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions which estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA Camera Response Model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. The practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and computational tools for research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
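
    The data-fitting step described above can be illustrated with a generic nonlinear least-squares fit. The sketch below assumes an arbitrary power-law response form and made-up laboratory measurements, since the paper's actual expressions and data are not given here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def response(X, a, g, b):
        # Assumed illustrative form: output = a * (irradiance * gain * exposure)^g + b
        irradiance, gain, exposure = X
        return a * (irradiance * gain * exposure) ** g + b

    # Hypothetical laboratory measurements (irradiance, gain, exposure, digital output).
    E = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
    G = np.full(5, 3.0)
    t = np.full(5, 0.01)
    D = np.array([12.0, 21.0, 36.0, 64.0, 113.0])

    params, _ = curve_fit(response, (E, G, t), D, p0=(100.0, 1.0, 0.0))
    print("fitted a, g, b:", np.round(params, 3))
    ```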

  5. Highly Portable Airborne Multispectral Imaging System

    NASA Technical Reports Server (NTRS)

    Lehnemann, Robert; Mcnamee, Todd

    2001-01-01

    A portable instrumentation system is described that includes an airborne and a ground-based subsystem. It can acquire multispectral image data over swaths of terrain ranging in width from about 1 to 1.5 km. The system was developed especially for use in coastal environments and is well suited for performing remote sensing and general environmental monitoring. It includes a small, unpiloted, remotely controlled airplane that carries a forward-looking camera for navigation, three downward-looking monochrome video cameras for imaging terrain in three spectral bands, a video transmitter, and a Global Positioning System (GPS) receiver.

  6. Airborne multispectral identification of individual cotton plants using consumer-grade cameras

    USDA-ARS's Scientific Manuscript database

    Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...

  7. Design and development of an airborne multispectral imaging system

    NASA Astrophysics Data System (ADS)

    Kulkarni, Rahul R.; Bachnak, Rafic; Lyle, Stacey; Steidley, Carl W.

    2002-08-01

    Advances in imaging technology and sensors have made airborne remote sensing systems viable for many applications that require reasonably good resolution at low cost. Digital cameras are making their mark on the market by providing high resolution at very high rates. This paper describes an aircraft-mounted imaging system (AMIS) that is being designed and developed at Texas A&M University-Corpus Christi (A&M-CC) with the support of a grant from NASA. The approach is to first develop and test a one-camera system that will be upgraded into a five-camera system that offers multi-spectral capabilities. AMIS will be low cost, rugged, and portable, and will have its own battery power source. Its immediate use will be to acquire images of the coastal area of the Gulf of Mexico for a variety of studies covering a broad spectral range from the near-ultraviolet to the near-infrared region. This paper describes AMIS and its characteristics, discusses the process for selecting the major components, and presents the progress.

  8. Airborne multispectral detection of regrowth cotton fields

    NASA Astrophysics Data System (ADS)

    Westbrook, John K.; Suh, Charles P.-C.; Yang, Chenghai; Lan, Yubin; Eyster, Ritchie S.

    2015-01-01

    Effective methods are needed for timely areawide detection of regrowth cotton plants because boll weevils (a quarantine pest) can feed and reproduce on these plants beyond the cotton production season. Airborne multispectral images of regrowth cotton plots were acquired on several dates after three shredding (i.e., stalk destruction) dates. Linear spectral unmixing (LSU) classification was applied to high-resolution airborne multispectral images of regrowth cotton plots to estimate the minimum detectable size and subsequent growth of plants. We found that regrowth cotton fields can be identified when the mean plant width is ˜0.2 m for an image resolution of 0.1 m. LSU estimates of canopy cover of regrowth cotton plots correlated well (r2=0.81) with the ratio of mean plant width to row spacing, a surrogate measure of plant canopy cover. The height and width of regrowth plants were both well correlated (r2=0.94) with accumulated degree-days after shredding. The results will help boll weevil eradication program managers use airborne multispectral images to detect and monitor the regrowth of cotton plants after stalk destruction, and identify fields that may require further inspection and mitigation of boll weevil infestations.
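
    Linear spectral unmixing models each pixel as a non-negative mixture of endmember spectra. The sketch below illustrates that general idea with a constrained least-squares solve; the endmember spectra and pixel values are invented for illustration and are not from the study.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Columns are endmember spectra (cotton canopy, soil, shadow) over three bands (G, R, NIR).
    endmembers = np.array([[0.06, 0.20, 0.02],
                           [0.05, 0.30, 0.02],
                           [0.50, 0.40, 0.05]])

    pixel = np.array([0.112, 0.147, 0.415])          # observed pixel reflectance
    abundances, residual = nnls(endmembers, pixel)   # non-negative abundance estimates
    canopy_cover = abundances[0] / abundances.sum()
    print("estimated canopy fraction:", round(float(canopy_cover), 3))
    ```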

  9. Airborne multispectral data collection

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1974-01-01

    Multispectral mapping accomplishments using the M7 airborne scanner are summarized. The M7 system is described and overall results of specific data collection flight operations since June 1971 are reviewed. A major advantage of the M7 system is that all spectral bands of the scanner are in common spatial registration, whereas in the M5 they were not.

  10. Multispectral image dissector camera flight test

    NASA Technical Reports Server (NTRS)

    Johnson, B. L.

    1973-01-01

    It was demonstrated that the multispectral image dissector camera is able to provide composite pictures of the earth surface from high altitude overflights. An electronic deflection feature was used to inject the gyro error signal into the camera for correction of aircraft motion.

  11. The high resolution stereo camera (HRSC): acquisition of multi-spectral 3D-data and photogrammetric processing

    NASA Astrophysics Data System (ADS)

    Neukum, Gerhard; Jaumann, Ralf; Scholten, Frank; Gwinner, Klaus

    2017-11-01

    At the Institute of Space Sensor Technology and Planetary Exploration of the German Aerospace Center (DLR), the High Resolution Stereo Camera (HRSC) has been designed for international missions to the planet Mars. For more than three years an airborne version of this camera, the HRSC-A, has been successfully applied in many flight campaigns and in a variety of different applications. It combines 3D capabilities and high resolution with multispectral data acquisition. Variable resolutions can be generated depending on the camera control settings. A high-end GPS/INS system in combination with the multi-angle image information yields precise, high-frequency orientation data for the acquired image lines. In order to handle these data, a completely automated photogrammetric processing system has been developed, which allows multispectral 3D image products to be generated for large areas with planimetric and height accuracies in the decimeter range. This accuracy has been confirmed by detailed investigations.

  12. Water Mapping Using Multispectral Airborne LIDAR Data

    NASA Astrophysics Data System (ADS)

    Yan, W. Y.; Shaker, A.; LaRocque, P. E.

    2018-04-01

    This study investigates the use of the world's first multispectral airborne LiDAR sensor, the Optech Titan manufactured by Teledyne Optech, for automatic land-water classification with a particular focus on near-shore regions and river environments. Although recent studies have utilized airborne LiDAR data for shoreline detection and water surface mapping, the majority of them only perform experimental testing on clipped data subsets or rely on data fusion with aerial/satellite images. In addition, most of the existing approaches require manual intervention or existing tidal/datum data for the collection of training samples. To tackle the drawbacks of previous approaches, we propose and develop an automatic data processing workflow for land-water classification using multispectral airborne LiDAR data. Depending on the nature of the study scene, two methods are proposed for automatic training data selection. The first method utilizes the elevation/intensity histogram fitted with a Gaussian mixture model (GMM) to preliminarily split the land and water bodies. The second method mainly relies on a newly developed scan line elevation intensity ratio (SLIER) to estimate the water surface data points. Regardless of the training method used, feature spaces can be constructed using the multispectral LiDAR intensity, elevation, and other features derived from these parameters. The comprehensive workflow was tested with two datasets collected over different near-shore and river environments, where the overall accuracy was better than 96%.
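
    The first training-data selection method described above fits a Gaussian mixture to an elevation/intensity histogram. A minimal sketch of that idea on synthetic elevations is shown below; it is not the authors' workflow, and the components and data are illustrative only.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Hypothetical elevations: a low, flat water surface and higher, rougher land returns.
    elev = np.concatenate([rng.normal(0.2, 0.05, 5000),
                           rng.normal(3.0, 1.00, 5000)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(elev)
    water_comp = int(np.argmin(gmm.means_))      # the lower-mean component is taken as water
    labels = gmm.predict(elev)
    is_water = labels == water_comp
    print("points labelled water:", int(is_water.sum()))
    ```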

  13. Airborne Multi-Spectral Minefield Survey

    DTIC Science & Technology

    2005-05-01

    Swedish Defence Research Agency), GEOSPACE (Austria), GTD (Ingenieria de Sistemas y Software Industrial, Spain), IMEC (Interuniversity MicroElectronic... Airborne Multi-Spectral Minefield Survey, Dirk-Jan de Lange, Eric den... actions is the severe lack of baseline information. To respond to this in a rapid way, cost-efficient data acquisition methods are a key issue.

  14. Determining fast orientation changes of multi-spectral line cameras from the primary images

    NASA Astrophysics Data System (ADS)

    Wohlfeil, Jürgen

    2012-01-01

    Fast orientation changes of airborne and spaceborne line cameras cannot always be avoided. In such cases it is essential to measure them with high accuracy to ensure a good quality of the resulting imagery products. Several approaches exist to support the orientation measurement by using optical information received through the main objective/telescope. In this article an approach is proposed that allows the determination of non-systematic orientation changes between every captured line. It does not require any additional camera hardware or onboard processing capabilities but the payload images and a rough estimate of the camera's trajectory. The approach takes advantage of the typical geometry of multi-spectral line cameras with a set of linear sensor arrays for different spectral bands on the focal plane. First, homologous points are detected within the heavily distorted images of different spectral bands. With their help a connected network of geometrical correspondences can be built up. This network is used to calculate the orientation changes of the camera with the temporal and angular resolution of the camera. The approach was tested with an extensive set of aerial surveys covering a wide range of different conditions and achieved precise and reliable results.

  15. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  16. Geometric Calibration and Radiometric Correction of the Maia Multispectral Camera

    NASA Astrophysics Data System (ADS)

    Nocerino, E.; Dubbini, M.; Menna, F.; Remondino, F.; Gattelli, M.; Covi, D.

    2017-10-01

    Multispectral imaging is a widely used remote sensing technique, whose applications range from agriculture to environmental monitoring, from food quality checks to cultural heritage diagnostics. A variety of multispectral imaging sensors are available on the market, many of them designed to be mounted on different platforms, especially small drones. This work focuses on the geometric and radiometric characterization of a brand-new, lightweight, low-cost multispectral camera, called MAIA. The MAIA camera is equipped with nine sensors, allowing for the acquisition of images in the visible and near-infrared parts of the electromagnetic spectrum. Two versions are available, characterised by different sets of band-pass filters, inspired by the sensors mounted on the WorldView-2 and Sentinel-2 satellites, respectively. The camera details and the developed procedures for geometric calibration and radiometric correction are presented in the paper.

  17. Evaluation of multispectral plenoptic camera

    NASA Astrophysics Data System (ADS)

    Meng, Lingfei; Sun, Ting; Kosoglow, Rich; Berkner, Kathrin

    2013-01-01

    Plenoptic cameras enable capture of a 4D lightfield, allowing digital refocusing and depth estimation from data captured with a compact portable camera. Whereas most of the work on plenoptic camera design has been based on a simplistic geometric-optics characterization of the optical path only, little work has been done on optimizing end-to-end system performance for a specific application. Such design optimization requires design tools that include careful parameterization of the main lens elements, as well as microlens array and sensor characteristics. In this paper we are interested in evaluating the performance of a multispectral plenoptic camera, i.e. a camera with spectral filters inserted into the aperture plane of the main lens. Such a camera enables single-snapshot spectral data acquisition [1-3]. We first describe in detail an end-to-end imaging system model for a spectrally coded plenoptic camera that we briefly introduced in [4]. Different performance metrics are defined to evaluate the spectral reconstruction quality. We then present a prototype which is developed based on a modified DSLR camera containing a lenslet array on the sensor and a filter array in the main lens. Finally we evaluate the spectral reconstruction performance of a spectral plenoptic camera based on both simulation and measurements obtained from the prototype.

  18. Airborne system for multispectral, multiangle polarimetric imaging.

    PubMed

    Bowles, Jeffrey H; Korwan, Daniel R; Montes, Marcos J; Gray, Deric J; Gillis, David B; Lamela, Gia M; Miller, W David

    2015-11-01

    In this paper, we describe the design, fabrication, calibration, and deployment of an airborne multispectral polarimetric imager. The motivation for the development of this instrument was to explore its ability to provide information about water constituents, such as particle size and type. The instrument is based on four 16 MP cameras and uses wire grid polarizers (aligned at 0°, 45°, 90°, and 135°) to provide the separation of the polarization states. A five-position filter wheel provides for four narrow-band spectral filters (435, 550, 625, and 750 nm) and one blocked position for dark-level measurements. When flown, the instrument is mounted on a programmable stage that provides control of the view angles. View angles that range to ±65° from the nadir have been used. Data processing provides a measure of the polarimetric signature as a function of both the view zenith and view azimuth angles. As a validation of our initial results, we compare our measurements, over water, with the output of a Monte Carlo code, both of which show neutral points off the principal plane. The locations of the calculated and measured neutral points are compared. The random error level in the measured degree of linear polarization (8% at 435 nm) is shown to be better than 0.25%.
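
    The degree of linear polarization reported above is conventionally computed from the Stokes parameters formed from the 0°, 45°, 90° and 135° polarizer images. The sketch below shows that standard computation on synthetic frames; the instrument's own calibration and processing chain may differ.

    ```python
    import numpy as np

    def dolp(i0, i45, i90, i135):
        s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
        s1 = i0 - i90                        # linear polarization along 0/90
        s2 = i45 - i135                      # linear polarization along 45/135
        return np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-9)

    # Hypothetical co-registered frames from the four polarizer channels.
    frames = [np.random.rand(512, 512) for _ in range(4)]
    print("mean DoLP:", float(dolp(*frames).mean()))
    ```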

  19. Michigan experimental multispectral mapping system: A description of the M7 airborne sensor and its performance

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1974-01-01

    The development and characteristics of a multispectral band scanner for an airborne mapping system are discussed. The sensor operates in the ultraviolet, visual, and infrared frequencies. Any twelve of the bands may be selected for simultaneous, optically registered recording on a 14-track analog tape recorder. Multispectral imagery recorded on magnetic tape in the aircraft can be laboratory reproduced on film strips for visual analysis or optionally machine processed in analog and/or digital computers before display. The airborne system performance is analyzed.

  20. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument designated AA465 has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument was performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The

  1. Evaluation of eelgrass beds mapping using a high-resolution airborne multispectral scanner

    USGS Publications Warehouse

    Su, H.; Karna, D.; Fraim, E.; Fitzgerald, M.; Dominguez, R.; Myers, J.S.; Coffland, B.; Handley, L.R.; Mace, T.

    2006-01-01

    Eelgrass (Zostera marina) can provide vital ecological functions in stabilizing sediments, influencing current dynamics, and contributing significant amounts of biomass to numerous food webs in coastal ecosystems. Mapping eelgrass beds is important for coastal water and nearshore estuarine monitoring, management, and planning. This study demonstrated the use of a high spatial (approximately 5 m) and temporal (maximum low tide) resolution airborne multispectral scanner for mapping eelgrass beds in Northern Puget Sound, Washington. A combination of supervised and unsupervised classification approaches was applied to the multispectral scanner imagery. A normalized difference vegetation index (NDVI) derived from the red and near-infrared bands, together with ancillary spatial information, was used to extract and mask eelgrass beds and other submerged aquatic vegetation (SAV) in the study area. We evaluated the resulting thematic map (geocoded, classified image) against a conventional aerial photograph interpretation using 260 point locations randomly stratified over five defined classes from the thematic map. We achieved an overall accuracy of 92 percent with a 0.92 Kappa coefficient in the study area. This study demonstrates that the airborne multispectral scanner can be useful for mapping eelgrass beds at a local or regional scale, especially in regions where optical remote sensing from space is constrained by climatic and tidal conditions. © 2006 American Society for Photogrammetry and Remote Sensing.
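
    The NDVI masking step described above reduces to a simple band-ratio computation. A minimal sketch is shown below with synthetic red and near-infrared arrays and an illustrative threshold, not the values used in the study.

    ```python
    import numpy as np

    # Synthetic red and near-infrared reflectance bands standing in for the scanner imagery.
    red = np.random.rand(200, 200).astype(np.float32)
    nir = np.random.rand(200, 200).astype(np.float32)

    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)
    sav_mask = ndvi > 0.2                      # illustrative threshold for candidate SAV pixels
    print("candidate SAV fraction:", float(sav_mask.mean()))
    ```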

  2. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  3. Employing airborne multispectral digital imagery to map Brazilian pepper infestation in south Texas.

    USDA-ARS's Scientific Manuscript database

    A study was conducted in south Texas to determine the feasibility of using airborne multispectral digital imagery for differentiating the invasive plant Brazilian pepper (Schinus terebinthifolius) from other cover types. Imagery obtained in the visible, near infrared, and mid infrared regions of th...

  4. Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras.

    PubMed

    Brauers, Johannes; Aach, Til

    2011-02-01

    High-fidelity color image acquisition with a multispectral camera utilizes optical filters to separate the visible electromagnetic spectrum into several passbands. This is often realized with a computer-controlled filter wheel, where each position is equipped with an optical bandpass filter. For each filter wheel position, a grayscale image is acquired and the passbands are finally combined to a multispectral image. However, the different optical properties and non-coplanar alignment of the filters cause image aberrations since the optical path is slightly different for each filter wheel position. As in a normal camera system, the lens causes additional wavelength-dependent image distortions called chromatic aberrations. When transforming the multispectral image with these aberrations into an RGB image, color fringes appear, and the image exhibits a pincushion or barrel distortion. In this paper, we address both the distortions caused by the lens and by the filters. Based on a physical model of the bandpass filters, we show that the aberrations caused by the filters can be modeled by displaced image planes. The lens distortions are modeled by an extended pinhole camera model, which results in a remaining mean calibration error of only 0.07 pixels. Using an absolute calibration target, we then geometrically calibrate each passband and compensate for both lens and filter distortions simultaneously. We show that both types of aberrations can be compensated and present detailed results on the remaining calibration errors.
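
    The per-passband geometric correction described above can be illustrated with a standard pinhole-plus-distortion model in OpenCV. The camera matrix and distortion coefficients below are hypothetical placeholders; in practice they come from calibrating each filter-wheel position against the calibration target.

    ```python
    import cv2
    import numpy as np

    # Synthetic grayscale frame standing in for one filter-wheel passband image.
    img = np.tile(np.linspace(0, 255, 1280, dtype=np.uint8), (960, 1))

    K = np.array([[1200.0, 0.0, 640.0],      # hypothetical per-passband camera matrix
                  [0.0, 1200.0, 480.0],
                  [0.0, 0.0, 1.0]])
    dist = np.array([-0.12, 0.05, 0.0003, -0.0002, 0.0])  # radial/tangential coefficients

    corrected = cv2.undistort(img, K, dist)  # compensates the modelled lens/filter distortion
    print("corrected image shape:", corrected.shape)
    ```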

  5. A spectral reflectance estimation technique using multispectral data from the Viking lander camera

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Huck, F. O.

    1976-01-01

    A technique is formulated for constructing spectral reflectance curve estimates from multispectral data obtained with the Viking lander camera. The multispectral data are limited to six spectral channels in the wavelength range from 0.4 to 1.1 micrometers, and most of these channels exhibit appreciable out-of-band response. The output of each channel is expressed as a linear (integral) function of the (known) solar irradiance, atmospheric transmittance, and camera spectral responsivity, and the (unknown) spectral reflectance. This produces six equations which are used to determine the coefficients in a representation of the spectral reflectance as a linear combination of known basis functions. Natural cubic spline reflectance estimates are produced for a variety of materials that can reasonably be expected to occur on Mars. In each case the dominant reflectance features are accurately reproduced, but small-period features are lost due to the limited number of channels. This technique may be a valuable aid in selecting the number of spectral channels and their responsivity shapes when designing a multispectral imaging system.
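
    The estimation idea described above reduces to a small linear system once the reflectance is written as a sum of basis functions. The sketch below works that out with synthetic spectra and Gaussian basis functions in place of the paper's natural cubic splines; all curves and values are illustrative.

    ```python
    import numpy as np

    wl = np.linspace(0.4, 1.1, 200)                  # wavelength grid, micrometers
    d_wl = wl[1] - wl[0]

    # Synthetic "known" channel weighting: solar irradiance x transmittance x responsivity.
    weights = np.array([np.exp(-0.5 * ((wl - c) / 0.08) ** 2)
                        for c in np.linspace(0.50, 1.00, 6)])

    # Basis functions for the reflectance (Gaussians here; the paper uses natural cubic splines).
    basis = np.array([np.exp(-0.5 * ((wl - c) / 0.10) ** 2)
                      for c in np.linspace(0.45, 1.05, 6)])

    true_coeff = np.array([0.20, 0.30, 0.50, 0.40, 0.30, 0.25])
    channel_out = weights @ (basis.T @ true_coeff) * d_wl      # the six "measurements"

    # Each channel output is linear in the coefficients: A[i, j] = integral of weight_i * basis_j.
    A = weights @ basis.T * d_wl
    coeff = np.linalg.lstsq(A, channel_out, rcond=None)[0]
    reflectance_estimate = basis.T @ coeff
    print("recovered coefficients:", np.round(coeff, 3))
    ```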

  6. Design of a multi-spectral imager built using the compressive sensing single-pixel camera architecture

    NASA Astrophysics Data System (ADS)

    McMackin, Lenore; Herman, Matthew A.; Weston, Tyler

    2016-02-01

    We present the design of a multi-spectral imager built using the architecture of the single-pixel camera. The architecture is enabled by the novel sampling theory of compressive sensing implemented optically using the Texas Instruments DLP™ micro-mirror array. The array not only implements spatial modulation necessary for compressive imaging but also provides unique diffractive spectral features that result in a multi-spectral, high-spatial resolution imager design. The new camera design provides multi-spectral imagery in a wavelength range that extends from the visible to the shortwave infrared without reduction in spatial resolution. In addition to the compressive imaging spectrometer design, we present a diffractive model of the architecture that allows us to predict a variety of detailed functional spatial and spectral design features. We present modeling results, architectural design and experimental results that prove the concept.
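
    The single-pixel measurement model described above can be illustrated with a toy reconstruction: random binary DMD patterns produce y = Phi x, and a sparse solver recovers x. The sketch below uses a Lasso solver and made-up problem sizes, which is only a stand-in for the instrument's actual reconstruction pipeline.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    n_pixels, n_meas = 256, 128                 # 16x16 scene, undersampled by 2x

    x_true = np.zeros(n_pixels)                 # sparse synthetic scene
    x_true[rng.choice(n_pixels, 8, replace=False)] = 1.0

    Phi = rng.integers(0, 2, size=(n_meas, n_pixels)).astype(float)  # DMD patterns
    y = Phi @ x_true                                                  # single-pixel readings

    solver = Lasso(alpha=0.01, max_iter=10000)   # L1-regularized sparse recovery
    solver.fit(Phi, y)
    x_rec = solver.coef_
    print("reconstruction error:", round(float(np.linalg.norm(x_rec - x_true)), 3))
    ```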

  7. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  8. Interpretation of multispectral and infrared thermal surveys of the Suez Canal Zone, Egypt

    NASA Technical Reports Server (NTRS)

    Elshazly, E. M.; Hady, M. A. A. H.; Hafez, M. A. A.; Salman, A. B.; Morsy, M. A.; Elrakaiby, M. M.; Alaassy, I. E. E.; Kamel, A. F.

    1977-01-01

    Airborne remote sensing surveys of the Suez Canal Zone were conducted, as part of the rehabilitation plan, using an I2S multispectral camera and a Bendix LN-3 passive infrared scanner. The multispectral camera gives four separate photographs of the same scene in the blue, green, red, and near-infrared bands. The scanner was operated in the thermal infrared band of 8 to 14 microns, and the thermal surveying was carried out both at night and in the daytime. The surveys, coupled with intensive ground investigations, were utilized in the construction of new geological, structural lineation and drainage maps for the Suez Canal Zone at a scale of approximately 1:20,000, which are superior to the maps made by normal aerial photography. A considerable number of anomalies of various types were revealed through the interpretation of the multispectral and infrared thermal surveys.

  9. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  10. Object-based analysis of multispectral airborne laser scanner data for land cover classification and map updating

    NASA Astrophysics Data System (ADS)

    Matikainen, Leena; Karila, Kirsi; Hyyppä, Juha; Litkey, Paula; Puttonen, Eetu; Ahokas, Eero

    2017-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with passive multispectral information from aerial images, has shown its high feasibility for automated mapping processes. The main benefits have been achieved in the mapping of elevated objects such as buildings and trees. Recently, the first multispectral airborne laser scanners have been launched, and active multispectral information is for the first time available for 3D ALS point clouds from a single sensor. This article discusses the potential of this new technology in map updating, especially in automated object-based land cover classification and change detection in a suburban area. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from an object-based random forests analysis suggest that the multispectral ALS data are very useful for land cover classification, considering both elevated classes and ground-level classes. The overall accuracy of the land cover classification results with six classes was 96% compared with validation points. The classes under study included building, tree, asphalt, gravel, rocky area and low vegetation. Compared to classification of single-channel data, the main improvements were achieved for ground-level classes. According to feature importance analyses, multispectral intensity features based on several channels were more useful than those based on one channel. Automatic change detection for buildings and roads was also demonstrated by utilising the new multispectral ALS data in combination with old map vectors. In change detection of buildings, an old digital surface model (DSM) based on single-channel ALS data was also used. Overall, our analyses suggest that the new data have high potential for further increasing the automation level in mapping. Unlike passive aerial imaging commonly used in mapping, the multispectral ALS technology is independent of external illumination conditions, and there are
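
    The object-based random forests step described above can be sketched with scikit-learn on a per-segment feature table. The features, labels, and sizes below are synthetic placeholders, not the study's data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_segments = 600
    # Columns: mean height, mean intensity in channels 1-3 (hypothetical segment features).
    X = rng.random((n_segments, 4))
    # Pseudo land cover labels loosely tied to the features, standing in for training data.
    y = np.minimum((3 * X[:, 0] + 2 * X[:, 1]).astype(int), 5)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
    # Feature importances indicate which channels drive the classification.
    print("importances:", np.round(clf.feature_importances_, 3))
    ```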

  11. Clustering of Multispectral Airborne Laser Scanning Data Using Gaussian Decomposition

    NASA Astrophysics Data System (ADS)

    Morsy, S.; Shaker, A.; El-Rabbany, A.

    2017-09-01

    With the evolution of LiDAR technology, multispectral airborne laser scanning systems are currently available. The first operational multispectral airborne LiDAR sensor, the Optech Titan, acquires LiDAR point clouds at three different wavelengths (1.550, 1.064, 0.532 μm), allowing the acquisition of different spectral information of the land surface. Consequently, recent studies have been devoted to using the radiometric information (i.e., intensity) of the LiDAR data along with the geometric information (e.g., height) for classification purposes. In this study, a data clustering method based on Gaussian decomposition is presented. First, a ground filtering mechanism is applied to separate non-ground from ground points. Then, three normalized difference vegetation indices (NDVIs) are computed for both non-ground and ground points, followed by histogram construction from each NDVI. The Gaussian function model is used to decompose the histograms into a number of Gaussian components. The maximum likelihood estimate of the Gaussian components is then optimized using the Expectation-Maximization algorithm. The intersection points of the adjacent Gaussian components are subsequently used as threshold values, so that different classes can be clustered. This method is used to classify the terrain of an urban area in Oshawa, Ontario, Canada, into four main classes, namely roofs, trees, asphalt and grass. It is shown that the proposed method has achieved an overall accuracy of up to 95.1% using different NDVIs.
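
    The thresholding idea described above, fitting Gaussian components to an NDVI histogram and cutting at their intersection, is sketched below on synthetic NDVI values. The two-class example and the numerical intersection search are illustrative simplifications of the paper's EM-optimized decomposition.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    ndvi = np.concatenate([rng.normal(0.15, 0.05, 4000),     # low-NDVI surfaces (asphalt, roofs)
                           rng.normal(0.60, 0.08, 4000)])    # high-NDVI surfaces (grass, trees)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(ndvi.reshape(-1, 1))
    m = gmm.means_.ravel()
    s = np.sqrt(gmm.covariances_).ravel()
    w = gmm.weights_

    # Numerically locate where the two weighted component densities cross between the means.
    grid = np.linspace(ndvi.min(), ndvi.max(), 2000)
    pdf = [w[i] / (s[i] * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((grid - m[i]) / s[i]) ** 2)
           for i in range(2)]
    between = (grid > m.min()) & (grid < m.max())
    threshold = grid[between][np.argmin(np.abs(pdf[0] - pdf[1])[between])]
    print("NDVI threshold between classes:", round(float(threshold), 3))
    ```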

  12. Comparison of different detection methods for citrus greening disease based on airborne multispectral and hyperspectral imagery

    USDA-ARS's Scientific Manuscript database

    Citrus greening or Huanglongbing (HLB) is a devastating disease spread in many citrus groves since first found in 2005 in Florida. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were taken to detect citrus greening infected trees in 2007 and 2010. Ground truthi...

  13. Feasibility of Multispectral Airborne Laser Scanning for Land Cover Classification, Road Mapping and Map Updating

    NASA Astrophysics Data System (ADS)

    Matikainen, L.; Karila, K.; Hyyppä, J.; Puttonen, E.; Litkey, P.; Ahokas, E.

    2017-10-01

    This article summarises our first results and experiences on the use of multispectral airborne laser scanner (ALS) data. Optech Titan multispectral ALS data over a large suburban area in Finland were acquired on three different dates in 2015-2016. We investigated the feasibility of the data from the first date for land cover classification and road mapping. Object-based analyses with segmentation and random forests classification were used. The potential of the data for change detection of buildings and roads was also demonstrated. The overall accuracy of land cover classification results with six classes was 96 % compared with validation points. The data also showed high potential for road detection, road surface classification and change detection. The multispectral intensity information appeared to be very important for automated classifications. Compared to passive aerial images, the intensity images have interesting advantages, such as the lack of shadows. Currently, we focus on analyses and applications with the multitemporal multispectral data. Important questions include, for example, the potential and challenges of the multitemporal data for change detection.

  14. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition

    PubMed Central

    Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.

    2010-01-01

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475

  15. Performance analysis of a multispectral framing camera for detecting mines in the littoral zone and beach zone

    NASA Astrophysics Data System (ADS)

    Louchard, Eric; Farm, Brian; Acker, Andrew

    2008-04-01

    BAE Systems Sensor Systems Identification & Surveillance (IS) has developed, under contract with the Office of Naval Research, a multispectral airborne sensor system and processing algorithms capable of detecting mine-like objects in the surf zone and land mines in the beach zone. BAE Systems has used this system in a blind test at a test range established by the Naval Surface Warfare Center - Panama City Division (NSWC-PCD) at Eglin Air Force Base. The airborne and ground subsystems used in this test are described, with graphical illustrations of the detection algorithms. We report on the performance of the system configured to operate with a human operator analyzing data on a ground station. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone. Surface float detection and proud land mine detection capability is also demonstrated. Our analysis shows that this BAE Systems-developed multispectral airborne sensor provides a robust technical foundation for a viable system for mine counter-measures, and would be a valuable asset for use prior to an amphibious assault.

  16. Investigation of Parallax Issues for Multi-Lens Multispectral Camera Band Co-Registration

    NASA Astrophysics Data System (ADS)

    Jhan, J. P.; Rau, J. Y.; Haala, N.; Cramer, M.

    2017-08-01

    Multi-lens multispectral cameras (MSCs), such as the Micasense RedEdge and Parrot Sequoia, record multispectral information through separate lenses. Their light weight and small size make them well suited for mounting on an Unmanned Aerial System (UAS) to collect high-spatial-resolution images for vegetation investigation. However, the multi-sensor geometry of the multi-lens structure induces significant band misregistration effects in the original images, so band co-registration is necessary to obtain accurate spectral information. A robust and adaptive band-to-band image transform (RABBIT) is proposed to perform band co-registration of multi-lens MSCs. The first step is to obtain the camera rig information from camera system calibration and to use the calibrated results for image transformation and lens distortion correction. Since calibration uncertainty leads to different amounts of systematic error, the last step optimizes the results to achieve better co-registration accuracy. Because parallax can cause significant band misregistration when images are acquired close to the targets, four datasets acquired from the RedEdge and Sequoia, including aerial and close-range imagery, were used to evaluate the performance of RABBIT. The aerial results show that RABBIT can achieve sub-pixel accuracy suitable for band co-registration of any multi-lens MSC. The close-range results show the same performance when band co-registration focuses on a specific target for 3D modelling, or when the target is equidistant from the camera.

  17. Determination of the Actual Land Use Pattern Using Unmanned Aerial Vehicles and Multispectral Camera

    NASA Astrophysics Data System (ADS)

    Dindaroğlu, T.; Gündoğan, R.; Gülci, S.

    2017-11-01

    The international initiatives developed in the context of combating global warming are based on the monitoring of Land Use, Land-Use Change and Forestry (LULUCF). Determination of changes in land use patterns is used to assess the effects of greenhouse gas emissions and to reduce adverse effects in subsequent processes. This process, which requires the investigation and control of quite large areas, has undoubtedly increased the importance of technological tools and equipment. The use of carrier platforms and various commercially cheaper sensors has become widespread. In this study, a multispectral camera was used to determine the land use pattern with high sensitivity. Unmanned aerial flights were carried out over the research fields of the Kahramanmaras Sutcu Imam University campus area. An unmanned aerial vehicle (UAV, a multi-propeller hexacopter) was used as the carrier platform for aerial photographs. Within the scope of this study, multispectral cameras were used to determine the land use pattern with high sensitivity.

  18. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components, a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance the recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ camera and relative positions. Experimental results demonstrate that our proposed algorithm presents substantially

  19. Airborne camera and spectrometer experiments and data evaluation

    NASA Astrophysics Data System (ADS)

    Lehmann, F. F.; Bucher, T.; Pless, S.; Wohlfeil, J.; Hirschmüller, H.

    2009-09-01

    New stereo push broom camera systems have been developed at the German Aerospace Center (DLR). The new small multispectral systems (Multi Functional Camerahead - MFC, Advanced Multispectral Scanner - AMS) are lightweight and compact, and feature three or five RGB stereo lines of 8000, 10 000 or 14 000 pixels, which are used for stereo processing and the generation of Digital Surface Models (DSM) and near True Orthoimage Mosaics (TOM). Simultaneous acquisition with different types of MFC cameras for infrared and RGB data has been successfully tested. All spectral channels record the image data at full resolution, so pan-sharpening is not necessary. Analogous to the line scanner data, an automatic processing chain exists for UltraCamD and UltraCamX. The different systems have been flown for different types of applications; the main fields of interest, among others, are environmental applications (flooding simulations, monitoring tasks, classification) and 3D modelling (e.g. city mapping). From the DSM and TOM data, Digital Terrain Models (DTM) and 3D city models are derived. Textures for the facades are taken from oblique orthoimages, which are created from the same input data as the TOM and the DSM. The resulting models are characterised by high geometric accuracy and the perfect fit of image data and DSM. The DLR is permanently developing and testing a wide range of sensor types and imaging platforms for terrestrial and space applications. The MFC sensors have been flown in combination with laser systems and imaging spectrometers, and special data fusion products have been developed. These products include hyperspectral orthoimages and 3D hyperspectral data.

  20. Robust and adaptive band-to-band image transform of UAS miniature multi-lens multispectral camera

    NASA Astrophysics Data System (ADS)

    Jhan, Jyun-Ping; Rau, Jiann-Yeou; Haala, Norbert

    2018-03-01

    Utilizing miniature multispectral (MS) or hyperspectral (HS) cameras by mounting them on an Unmanned Aerial System (UAS) has the benefits of convenience and flexibility for collecting remote sensing imagery for precision agriculture, vegetation monitoring, and environment investigation applications. Most miniature MS cameras adopt a multi-lens structure to record discrete MS bands of visible and invisible information. The differences in lens distortion, mounting positions, and viewing angles among lenses mean that the acquired original MS images have significant band misregistration errors. We have developed a Robust and Adaptive Band-to-Band Image Transform (RABBIT) method for dealing with the band co-registration of various types of miniature multi-lens multispectral cameras (Mini-MSCs) to obtain band co-registered MS imagery for remote sensing applications. The RABBIT utilizes a modified projective transformation (MPT) to transfer the multiple image geometry of a multi-lens imaging system to one sensor geometry, and combines this with a robust and adaptive correction (RAC) procedure to correct several systematic errors and to obtain sub-pixel accuracy. This study applies three state-of-the-art Mini-MSCs to evaluate the RABBIT method's performance, specifically the Tetracam Miniature Multiple Camera Array (MiniMCA), Micasense RedEdge, and Parrot Sequoia. Six MS datasets acquired at different target distances, dates, and locations are also used to prove its reliability and applicability. Results prove that RABBIT is feasible for different types of Mini-MSCs with accurate, robust, and rapid image processing efficiency.
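
    At its core, the band-to-band transform described above maps one lens's image geometry onto a reference geometry with a projective transformation. The sketch below applies an arbitrary illustrative homography to a synthetic band image; RABBIT's calibration-derived parameters and adaptive correction are not reproduced.

    ```python
    import cv2
    import numpy as np

    # Synthetic stand-in for one lens's image (e.g., a NIR frame from a multi-lens MSC).
    band = (np.random.default_rng(0).random((960, 1280)) * 255).astype(np.uint8)

    # A 3x3 projective (homography) matrix; in RABBIT it is derived from the
    # camera rig calibration, here it is an arbitrary illustrative value.
    H = np.array([[1.002, 0.001, -4.5],
                  [-0.001, 1.003, 6.2],
                  [1.0e-6, -2.0e-6, 1.0]])

    registered = cv2.warpPerspective(band, H, (band.shape[1], band.shape[0]))
    ```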

  1. Airborne multicamera system for geo-spatial applications

    NASA Astrophysics Data System (ADS)

    Bachnak, Rafic; Kulkarni, Rahul R.; Lyle, Stacey; Steidley, Carl W.

    2003-08-01

    Airborne remote sensing has many applications that include vegetation detection, oceanography, marine biology, geographical information systems, and environmental coastal science analysis. Remotely sensed images, for example, can be used to study the aftermath of episodic events such as the hurricanes and floods that occur year round in the coastal bend area of Corpus Christi. This paper describes an Airborne Multi-Spectral Imaging System that uses digital cameras to provide high resolution at very high rates. The software is based on Delphi 5.0 and IC Imaging Control's ActiveX controls. Both time and the GPS coordinates are recorded. Three successful test flights have been conducted so far. The paper presents flight test results and discusses the issues being addressed to fully develop the system.

  2. Discriminating heavy aerosol, clouds, and fires during SCAR-B: Application of airborne multispectral MAS data

    NASA Astrophysics Data System (ADS)

    King, Michael D.; Tsay, Si-Chee; Ackerman, Steven A.; Larsen, North F.

    1998-12-01

    A multispectral scanning spectrometer was used to obtain measurements of the reflection function and brightness temperature of smoke, clouds, and terrestrial surfaces at 50 discrete wavelengths between 0.55 and 14.2 μm. These observations were obtained from the NASA ER-2 aircraft as part of the Smoke, Clouds, and Radiation-Brazil (SCAR-B) campaign, conducted over a 1500×1500 km region of cerrado and rain forest throughout Brazil between August 16 and September 11, 1995. Multispectral images of the reflection function and brightness temperature in 10 distinct bands of the MODIS airborne simulator (MAS) were used to derive a confidence in clear sky (or alternatively the probability of cloud), shadow, fire, and heavy aerosol. In addition to multispectral imagery, monostatic lidar data were obtained along the nadir ground track of the aircraft and used to assess the accuracy of the cloud mask results. This analysis shows that the cloud and aerosol mask being developed for operational use on the moderate-resolution imaging spectroradiometer (MODIS), and tested using MAS data in Brazil, is quite capable of separating cloud, aerosol, shadow, and fires during daytime conditions over land.

  3. MEDUSA: an airborne multispectral oil spill detection and characterization system

    NASA Astrophysics Data System (ADS)

    Wagner, Peter; Hengstermann, Theo; Zielinski, Oliver

    2000-12-01

    MEDUSA is a sensor network, consisting of and effectively combining a variety of different remote sensing instruments. Installed in 1998 it is operationally used in a maritime surveillance aircraft maintained by the German Ministry of Transport, Building and Housing. On one hand routine oil pollution monitoring with remote sensing equipment like Side Looking Airborne Radar (SLAR), Infrared/Ultraviolet Line Scanner (IR/UV line scanner), Microwave Radiometer (MWR), Imaging Airborne Laserfluorosensor (IALFS) and Forward Looking Infrared (FLIR) requires a complex network and communication structure to be operated by a single operator. On the other hand the operation of such a variety of sensors on board of one aircraft provides an excellent opportunity to establish new concepts of integrated sensor fusion and data evaluation. In this work a general survey of the German surveillance aircraft instrumentation is given and major features of the sensor package as well as advantages of the design and architecture are presented. Results from routine operation over North and Baltic Sea are shown to illustrate the successful application of MEDUSA in maritime patrol of oil slicks and polluters. Recently the combination of the different sensor results towards one multispectral information has met with increasing interest. Thus new application fields and parameter sets could be derived, like oceanography or river flood management. The basic concepts and first results in the fusion of sensoric information will conclude the paper.

  4. Mountain pine beetle detection and monitoring: evaluation of airborne imagery

    NASA Astrophysics Data System (ADS)

    Roberts, A.; Bone, C.; Dragicevic, S.; Ettya, A.; Northrup, J.; Reich, R.

    2007-10-01

    The processing and evaluation of digital airborne imagery for detection, monitoring and modeling of mountain pine beetle (MPB) infestations is evaluated. The most efficient and reliable remote sensing strategy for identification and mapping of infestation stages ("current" to "red" to "grey" attack) of MPB in lodgepole pine forests is determined for the most practical and cost effective procedures. This research was planned to specifically enhance knowledge by determining the remote sensing imaging systems and analytical procedures that optimize resource management for this critical forest health problem. Within the context of this study, airborne remote sensing of forest environments for forest health determinations (MPB) is most suitably undertaken using multispectral digitally converted imagery (aerial photography) at scales of 1:8000 for early detection of current MPB attack and 1:16000 for mapping and sequential monitoring of red and grey attack. Digital conversion should be undertaken at 10 to 16 microns for B&W multispectral imagery and 16 to 24 microns for colour and colour infrared imagery. From an "operational" perspective, the use of twin mapping-cameras with colour and B&W or colour infrared film will provide the best approximation of multispectral digital imagery with near comparable performance in a competitive private sector context (open bidding).

  5. Multispectral photography for earth resources

    NASA Technical Reports Server (NTRS)

    Wenderoth, S.; Yost, E.; Kalia, R.; Anderson, R.

    1972-01-01

    A guide for producing accurate multispectral results for earth resource applications is presented along with theoretical and analytical concepts of color and multispectral photography. Topics discussed include: capabilities and limitations of color and color infrared films; image color measurements; methods of relating ground phenomena to film density and color measurement; sensitometry; considerations in the selection of multispectral cameras and components; and mission planning.

  6. Multispectral Photography

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer, such as their Model 75, Spectral Data creates a color image from the black and white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.

  7. On-Orbit Calibration of a Multi-Spectral Satellite Sensor Using a High Altitude Airborne Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Green, R. O.; Shimada, M.

    1996-01-01

    Earth-looking satellites must be calibrated in order to quantitatively measure and monitor components of land, water and atmosphere of the Earth system. The inevitable change in performance due to the stress of satellite launch requires that the calibration of a satellite sensor be established and validated on-orbit. A new approach to on-orbit satellite sensor calibration has been developed using the flight of a high altitude calibrated airborne imaging spectrometer below a multi-spectral satellite sensor.

  8. Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone

    NASA Technical Reports Server (NTRS)

    Lemos, G. L.; Salinas, J.; Rebollo, M.

    1977-01-01

    A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed in a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras was used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.

  9. Development of low-cost high-performance multispectral camera system at Banpil

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

    Banpil Photonics (Banpil) has developed a low-cost high-performance multispectral camera system for Visible to Short-Wave Infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512 pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity requiring less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all the features highly desirable in military imaging applications to expand deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor with an innovation integrating advanced digital electronics functionality, which has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g., focal plane array (FPA) and Read-Out Integrated Circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions on optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system and their forecast cost structure is presented.

  10. Computational multispectral video imaging [Invited].

    PubMed

    Wang, Peng; Menon, Rajesh

    2018-01-01

    Multispectral imagers reveal information imperceptible to humans and conventional cameras. Here, we demonstrate a compact single-shot multispectral video-imaging camera by placing a micro-structured diffractive filter in close proximity to the image sensor. The diffractive filter converts spectral information to a spatial code on the sensor pixels. Following a calibration step, this code can be inverted via regularization-based linear algebra to compute the multispectral image. We experimentally demonstrated a spectral resolution of 9.6 nm within the visible band (430-718 nm). We further show that the spatial resolution is enhanced by over 30% compared with the case without the diffractive filter. We also demonstrate Vis-IR imaging with the same sensor. Because no absorptive color filters are utilized, sensitivity is preserved as well. Finally, the diffractive filters can be easily manufactured using optical lithography and replication techniques.
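
    As a minimal sketch of the "regularization-based linear algebra" inversion step described above: given a calibration matrix A that maps a scene's spectral vector s to coded sensor counts y (y = A s + noise), s can be recovered by ridge (Tikhonov) regression. The matrix sizes, synthetic data and regularization weight are illustrative assumptions, not the authors' calibration.

    ```python
    import numpy as np

    def recover_spectrum(A, y, lam=1e-2):
        """Solve min_s ||A s - y||^2 + lam * ||s||^2 in closed form."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

    # Example with synthetic calibration data: 64 coded sensor samples, 25 spectral bins.
    rng = np.random.default_rng(0)
    A = rng.random((64, 25))          # stand-in for the measured calibration matrix
    s_true = np.abs(rng.normal(size=25))
    y = A @ s_true + 0.01 * rng.normal(size=64)
    s_est = recover_spectrum(A, y)
    ```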

  11. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000 acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 × 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, the pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  12. Inflight Radiometric Calibration of New Horizons' Multispectral Visible Imaging Camera (MVIC)

    NASA Technical Reports Server (NTRS)

    Howett, C. J. A.; Parker, A. H.; Olkin, C. B.; Reuter, D. C.; Ennico, K.; Grundy, W. M.; Graps, A. L.; Harrison, K. P.; Throop, H. B.; Buie, M. W.; hide

    2016-01-01

    We discuss two semi-independent calibration techniques used to determine the inflight radiometric calibration for the New Horizons Multi-spectral Visible Imaging Camera (MVIC). The first calibration technique compares the measured number of counts (DN) observed from a number of well calibrated stars to those predicted using the component-level calibration. The ratio of these values provides a multiplicative factor that allows a conversion from the preflight calibration to the more accurate inflight one, for each detector. The second calibration technique is a channel-wise relative radiometric calibration for MVIC's blue, near-infrared and methane color channels, using Hubble and New Horizons observations of Charon and scaling from the red channel stellar calibration. Both calibration techniques produce very similar results (better than 7% agreement), providing strong validation for the techniques used. Since the stellar calibration described here can be performed without a color target in the field of view and covers all of MVIC's detectors, this calibration was used to provide the radiometric keyword values delivered by the New Horizons project to the Planetary Data System (PDS). These keyword values allow each observation to be converted from counts to physical units; a description of how these keyword values were generated is included. Finally, mitigation techniques adopted for the gain drift observed in the near-infrared detector and one of the panchromatic framing cameras are also discussed.
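
    A hedged sketch of the first technique's arithmetic: the ratio of predicted to measured stellar count rates yields a per-detector multiplicative factor that updates the preflight calibration. The count values below are placeholders, not mission data, and the direction of the ratio is an assumption for illustration.

    ```python
    import numpy as np

    measured_dn_per_s = np.array([1520.0, 980.0, 2210.0])    # observed count rates (placeholder)
    predicted_dn_per_s = np.array([1480.0, 1015.0, 2155.0])  # from component-level calibration (placeholder)

    # One multiplicative factor per detector, averaged over the calibration stars.
    inflight_factor = np.mean(predicted_dn_per_s / measured_dn_per_s)
    # Physical units would then be: counts * (preflight keyword value) * inflight_factor
    ```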

  13. Shift-variant linear system modeling for multispectral scanners

    NASA Astrophysics Data System (ADS)

    Amini, Abolfazl M.; Ioup, George E.; Ioup, Juliette W.

    1995-07-01

    Multispectral scanner data are affected both by the spatial impulse response of the sensor and the spectral response of each channel. To achieve a realistic representation for the output data for a given scene spectral input, both of these effects must be incorporated into a forward model. Each channel can have a different spatial response and each has its characteristic spectral response. A forward model is built which includes the shift invariant spatial broadening of the input for the channels and the shift variant spectral response across channels. The model is applied to the calibrated airborne multispectral scanner as well as the airborne terrestrial applications sensor developed at NASA Stennis Space Center.
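
    A minimal forward-model sketch in the spirit of the description above: each channel applies its own shift-invariant spatial point spread function to the scene, while the channels mix the input spectrum through channel-specific spectral responses (shift-variant across channels). The Gaussian PSFs and response curves here are illustrative assumptions, not the CAMS or ATLAS characterizations.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def forward_model(scene_cube, psf_sigmas, spectral_response):
        """scene_cube: (wavelengths, rows, cols); spectral_response: (channels, wavelengths)."""
        n_channels = spectral_response.shape[0]
        out = np.empty((n_channels,) + scene_cube.shape[1:])
        for c in range(n_channels):
            # Spectral integration with this channel's response...
            band = np.tensordot(spectral_response[c], scene_cube, axes=1)
            # ...followed by this channel's shift-invariant spatial broadening.
            out[c] = gaussian_filter(band, sigma=psf_sigmas[c])
        return out
    ```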

  14. Simultaneous multispectral framing infrared camera using an embedded diffractive optical lenslet array

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele

    2011-06-01

    Recent advances in micro-optical element fabrication using gray scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when using it with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral bin fill-in, allowing additional spectral coverage at the cost of simultaneity. Different lenslet array spectral imaging concept designs are presented in this paper along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as a liquid on a surface. It can also be applied to the detection of weaponized biological agents and to IED detection in various forms, from manufacturing to deployment and post-detection during forensic analysis.

  15. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of human operators on very high resolution (VHR) optical images. This work is highly time consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning data (lidar) and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database updating is proposed. The multispectral images give access to the tree species, whereas the 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed, both at pixel and object levels. The objects are individual trees extracted from the lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the existing tree species in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated from the tree species classification and combined with the pixel-based feature map in an energy-based framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. A sketch of a simplified regularized-labelling step is given below. Comparison with an existing forest land cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching rates between 94% and 99%).
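
    The following is a hedged sketch of the regularized labelling idea only. The paper minimizes its energy with QPBO/α-expansion graph cuts; as a lightweight stand-in, this sketch applies iterated conditional modes (ICM) to the same kind of energy: a unary term from the per-class probability map plus a Potts smoothness term. The smoothness weight and iteration count are assumptions.

    ```python
    import numpy as np

    def icm_segmentation(prob_map, beta=1.5, n_iter=5):
        """prob_map: (rows, cols, n_classes) class probabilities; returns a label map."""
        unary = -np.log(np.clip(prob_map, 1e-6, 1.0))   # data cost per class
        labels = prob_map.argmax(axis=2)
        rows, cols, n_classes = prob_map.shape
        for _ in range(n_iter):
            for i in range(rows):
                for j in range(cols):
                    costs = unary[i, j].copy()
                    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < rows and 0 <= nj < cols:
                            # Potts penalty for disagreeing with each neighbour's label.
                            costs += beta * (np.arange(n_classes) != labels[ni, nj])
                    labels[i, j] = costs.argmin()
        return labels
    ```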

  16. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters [1] into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches, including their passivity, spectral range, customization options, and scalable production.
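
    As an illustration of the de-mosaicing idea mentioned above, the sketch below splits a 2x2 filter mosaic (e.g., R, G, B, NIR) into four half-resolution spectral planes. The channel layout within the 2x2 cell is an assumption; real sensors document their own mosaic pattern.

    ```python
    import numpy as np

    def demosaic_2x2(raw, layout=("R", "G", "B", "NIR")):
        """raw: (rows, cols) mosaic frame with even dimensions; returns dict of half-resolution planes."""
        planes = {
            layout[0]: raw[0::2, 0::2],
            layout[1]: raw[0::2, 1::2],
            layout[2]: raw[1::2, 0::2],
            layout[3]: raw[1::2, 1::2],
        }
        # Each plane can then be interpolated back to full resolution if needed.
        return planes
    ```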

  17. Multispectral thermal airborne TASI-600 data to study the Pompeii (IT) archaeological area

    NASA Astrophysics Data System (ADS)

    Palombo, Angelo; Pascucci, Simone; Pergola, Nicola; Pignatti, Stefano; Santini, Federico; Soldovieri, Francesco

    2016-04-01

    The management of archaeological areas involves the conservation of the ruins/buildings and the eventual prospection of new areas having archaeological potential. In this framework, airborne remote sensing is a well-developed geophysical tool for supporting archaeological surveys of wide areas. The spectral regions applied in archaeological remote sensing span from the VNIR to the TIR. In particular, archaeological thermal imaging exploits the fact that materials absorb, emit, transmit, and reflect thermal infrared radiation at different rates according to their composition, density and moisture content. Despite this potential, applications of thermal imaging in archaeology are scarce. Among them, noteworthy are those related to the use of Landsat and ASTER [1] and airborne remote sensing [2, 3, 4 and 5]. In view of this potential in Cultural Heritage applications, the present study aims at analysing the usefulness of high spatial resolution thermal imaging over the Pompeii archaeological park. To this purpose, TASI-600 [6] airborne multispectral thermal imagery (32 channels from 8 to 11.5 μm with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) was acquired on December 7th, 2015. The airborne survey was acquired to obtain useful information on the characteristics of the building materials (both ancient and of consolidation) and, whenever possible, to retrieve quick indicators of their conservation status. Thermal images will moreover be processed to gain insight into the critical environmental issues impacting the structures (e.g. moisture). The proposed study shows the preliminary results of the airborne deployments, the pre-processing of the multispectral thermal imagery and the retrieval of accurate land surface temperatures (LST). The LST map will be analysed to describe the thermal pattern of the city of Pompeii and detect any thermal anomalies. As for the ongoing TASI-600 sensor pre-processing, it will include: (a) radiometric

  18. Survey of the Pompeii (IT) archaeological Regions with the multispectral thermal airborne TASI data

    NASA Astrophysics Data System (ADS)

    Pignatti, Stefano; Palombo, Angelo; Pascucci, Simone; Santini, Federico; Laneve, Giovanni

    2017-04-01

    Thermal remote sensing, as a tool for analyzing environmental variables with regard to archaeological prospecting, has been growing, mainly because airborne surveys make it possible to provide archaeologists with images at meter scale. The importance of this study lies in the evaluation of TIR imagery in view of the use of unmanned aerial vehicle (UAV) imagery for the Conservation of Cultural Heritage, which should provide very high spatial resolution thermal imaging at low cost. The research aims at analyzing the potential of thermal imaging [1] on some selected areas of the Pompeii archaeological park. To this purpose, on December 7th, 2015, a TASI-600 [2] airborne multispectral thermal imager (32 channels from 8 to 11.5 μm with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) surveyed the archaeological Pompeii Regions. The thermal images have been corrected and calibrated in order to obtain land surface temperature (LST) and emissivity data sets to be used in the further analysis. The thermal data pre-processing included: i) radiometric calibration of the raw data and correction of blinking pixels; ii) atmospheric correction performed using MODTRAN; iii) Temperature Emissivity Separation (TES) to obtain emissivity and LST maps [3]. Our objective is to show the major results of the IR survey and the pre-processing of the multispectral thermal imagery. The LST and emissivity maps have been analysed to describe the thermal/emissivity pattern of the different Regions as a function of the presence, in the shallow subsurface, of archaeological features. The preliminary results obtained are encouraging, even though the vegetation cover over the different Pompeii Regions is one of the major issues affecting the usefulness of TIR sensing. Of course, LST anomalies and emissivity maps need to be further integrated with classical geophysical investigation techniques to have a complete validation and to better evaluate the

  19. High Spatial Resolution Airborne Multispectral Thermal Infrared Remote Sensing Data for Analysis of Urban Landscape Characteristics

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.; Arnold, James E. (Technical Monitor)

    2000-01-01

    We have used airborne multispectral thermal infrared (TIR) remote sensing data collected at a high spatial resolution (i.e., 10 m) over several cities in the United States to study thermal energy characteristics of the urban landscape. These TIR data provide a unique opportunity to quantify thermal responses from discrete surfaces typical of the urban landscape and to identify both the spatial arrangement and patterns of thermal processes across the city. The information obtained from these data is critical to understanding how urban surfaces drive or force development of the Urban Heat Island (UHI) effect, which exists as a dome of elevated air temperatures that presides over cities in contrast to surrounding non-urbanized areas. The UHI is most pronounced in the summertime, when urban surfaces, such as rooftops and pavement, store solar radiation throughout the day and release this stored energy slowly after sunset, creating air temperatures over the city that are 2-4 °C warmer than non-urban or rural air temperatures. The UHI can also exist as a daytime phenomenon, with surface temperatures in downtown areas of cities exceeding 38 °C. The implications of the UHI are significant, particularly as an additive source of thermal energy input that exacerbates the overall production of ground-level ozone over cities. We have used the Airborne Thermal and Land Applications Sensor (ATLAS), flown onboard a Lear 23 jet aircraft from the NASA Stennis Space Center, to acquire high spatial resolution multispectral TIR data (i.e., six bandwidths between 8.2 and 12.2 μm) over Huntsville, Alabama, Atlanta, Georgia, Baton Rouge, Louisiana, Salt Lake City, Utah, and Sacramento, California. These TIR data have been used to produce maps and other products showing the spatial distribution of heating and cooling patterns over these cities to better understand how the morphology of the urban landscape affects development of the UHI. In turn, these data have been used

  20. HERCULES/MSI: a multispectral imager with geolocation for STS-70

    NASA Astrophysics Data System (ADS)

    Simi, Christopher G.; Kindsfather, Randy; Pickard, Henry; Howard, William, III; Norton, Mark C.; Dixon, Roberta

    1995-11-01

    A multispectral intensified CCD imager combined with a ring laser gyroscope based inertial measurement unit was flown on the Space Shuttle Discovery from July 13-22, 1995 (Space Transportation System Flight No. 70, STS-70). The camera includes a six-position filter wheel, a third-generation image intensifier, and a CCD camera. The camera is integrated with a laser gyroscope system that determines the ground position of the imagery to an accuracy of better than three nautical miles. The camera has two modes of operation: a panchromatic mode for high-magnification imaging [ground sample distance (GSD) of 4 m], or a multispectral mode consisting of six different user-selectable spectral ranges at reduced magnification (12 m GSD). This paper discusses the system hardware and the technical trade-offs involved in camera optimization, and presents imagery observed during the shuttle mission.

  1. Statistical correction of lidar-derived digital elevation models with multispectral airborne imagery in tidal marshes

    USGS Publications Warehouse

    Buffington, Kevin J.; Dugger, Bruce D.; Thorne, Karen M.; Takekawa, John Y.

    2016-01-01

    Airborne light detection and ranging (lidar) is a valuable tool for collecting large amounts of elevation data across large areas; however, the limited ability to penetrate dense vegetation with lidar hinders its usefulness for measuring tidal marsh platforms. Methods to correct lidar elevation data are available, but a reliable method that requires limited field work and maintains spatial resolution is lacking. We present a novel method, the Lidar Elevation Adjustment with NDVI (LEAN), to correct lidar digital elevation models (DEMs) with vegetation indices from readily available multispectral airborne imagery (NAIP) and RTK-GPS surveys. Using 17 study sites along the Pacific coast of the U.S., we achieved an average root mean squared error (RMSE) of 0.072 m, with a 40–75% improvement in accuracy from the lidar bare earth DEM. Results from our method compared favorably with results from three other methods (minimum-bin gridding, mean error correction, and vegetation correction factors), and a power analysis applying our extensive RTK-GPS dataset showed that on average 118 points were necessary to calibrate a site-specific correction model for tidal marshes along the Pacific coast. By using available imagery and with minimal field surveys, we showed that lidar-derived DEMs can be adjusted for greater accuracy while maintaining high (1 m) resolution.
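
    The correction idea above can be illustrated with a very small sketch: regress the lidar DEM error at the RTK-GPS points against NDVI from the multispectral imagery, then subtract the predicted vegetation bias from the full DEM. The simple linear form is an assumption for illustration; the published LEAN model may differ.

    ```python
    import numpy as np

    def fit_lean(dem_at_gps, gps_elev, ndvi_at_gps):
        """Least-squares fit of the model: dem_error = a * NDVI + b."""
        error = dem_at_gps - gps_elev
        A = np.column_stack([ndvi_at_gps, np.ones_like(ndvi_at_gps)])
        (a, b), *_ = np.linalg.lstsq(A, error, rcond=None)
        return a, b

    def apply_lean(dem, ndvi, a, b):
        """Correct the full DEM with the calibrated NDVI model."""
        return dem - (a * ndvi + b)
    ```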

  2. Quality evaluation of pansharpened hyperspectral images generated using multispectral images

    NASA Astrophysics Data System (ADS)

    Matsuoka, Masayuki; Yoshioka, Hiroki

    2012-11-01

    Hyperspectral remote sensing can provide a smooth spectral curve of a target by using a set of higher spectral resolution detectors. The spatial resolution of hyperspectral images, however, is generally much lower than that of multispectral images due to the lower energy of incident radiation in each narrow band. Pansharpening is an image-fusion technique that generates higher spatial resolution multispectral images by combining lower resolution multispectral images with higher resolution panchromatic images. In this study, higher resolution hyperspectral images were generated by pansharpening simulated lower resolution hyperspectral data with higher resolution multispectral data. The spectral and spatial qualities of the pansharpened images were then assessed in relation to the spectral bands of the multispectral images. Airborne hyperspectral data from AVIRIS were used in this study and pansharpened using six methods. Quantitative evaluations of the pansharpened images were carried out using two frequently used indices, ERGAS and the Q index.
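
    For reference, a short sketch of the ERGAS index used above to score the spectral quality of pansharpened images against a reference; the resolution ratio (high-resolution pixel size divided by low-resolution pixel size) is passed explicitly, and the array layout is an assumption.

    ```python
    import numpy as np

    def ergas(reference, fused, resolution_ratio):
        """reference, fused: (bands, rows, cols); resolution_ratio = high-res / low-res pixel size."""
        n_bands = reference.shape[0]
        terms = []
        for k in range(n_bands):
            rmse = np.sqrt(np.mean((reference[k] - fused[k]) ** 2))
            terms.append((rmse / reference[k].mean()) ** 2)
        # Lower ERGAS means better spectral fidelity of the fused product.
        return 100.0 * resolution_ratio * np.sqrt(np.mean(terms))
    ```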

  3. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) imagery, Large Format Camera color infrared photography, and black and white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis for this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  4. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit

    PubMed Central

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-01-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15−20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  5. Novel instrumentation of multispectral imaging technology for detecting tissue abnormity

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is becoming a powerful tool in a wide range of biological and clinical studies by adding spectral, spatial and temporal dimensions to visualize tissue abnormalities and the underlying biological processes. A conventional spectral imaging system includes two physically separated major components: a band-pass selection device (such as a liquid crystal tunable filter or a diffraction grating) and a scientific-grade monochromatic camera, and is expensive and bulky. Recently, micro-arrayed narrow-band optical mosaic filters were invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the clinical requirements of medical diagnostic imaging applications. However, the challenging issue of how to integrate and place the micro filter mosaic chip onto the target focal plane, i.e., the imaging sensor, of an off-the-shelf CMOS/CCD camera has not been reported. This paper presents the methods and results of integrating such a miniaturized filter with off-the-shelf CMOS imaging sensors to produce handheld real-time multispectral imaging devices for the application of early stage pressure ulcer (ESPU) detection. Unlike conventional multispectral imaging devices, which are bulky and expensive, the resulting handheld real-time multispectral ESPU detector can produce multiple images at different center wavelengths with a single shot, thereby eliminating the image registration procedure required by traditional multispectral imaging technologies.

  6. Multispectral image alignment using a three channel endoscope in vivo during minimally invasive surgery

    PubMed Central

    Clancy, Neil T.; Stoyanov, Danail; James, David R. C.; Di Marco, Aimee; Sauvage, Vincent; Clark, James; Yang, Guang-Zhong; Elson, Daniel S.

    2012-01-01

    Sequential multispectral imaging is an acquisition technique that involves collecting images of a target at different wavelengths to compile a spectrum for each pixel. In surgical applications it suffers from low illumination levels and motion artefacts. A three-channel rigid endoscope system has been developed that allows simultaneous recording of stereoscopic and multispectral images. Salient features on the tissue surface may be tracked during the acquisition in the stereo cameras and, using multiple-camera triangulation techniques, this information is used to align the multispectral images automatically even though the tissue or camera is moving. This paper describes a detailed validation of the set-up in a controlled experiment before presenting the first in vivo use of the device in a porcine minimally invasive surgical procedure. Multispectral images of the large bowel were acquired and used to extract the relative concentration of haemoglobin in the tissue despite motion due to breathing during the acquisition. Using the stereoscopic information it was also possible to overlay the multispectral information on the reconstructed 3D surface. This experiment demonstrates the ability of this system to measure blood perfusion changes in the tissue during surgery and its potential use as a platform for other sequential imaging modalities. PMID:23082296

  7. Image denoising and deblurring using multispectral data

    NASA Astrophysics Data System (ADS)

    Semenishchev, E. A.; Voronin, V. V.; Marchuk, V. I.

    2017-05-01

    Decision-making systems are becoming widespread. These systems are based on the analysis of video sequences as well as additional data such as object volume, size changes, the behavior of a single object or a group of objects, temperature gradients, the presence of local areas with strong differences, and others. Security and control systems are the main areas of application. Noise in the images strongly influences the subsequent processing and decision making. This paper considers the problem of primary signal processing for the tasks of denoising and deblurring multispectral image data. The additional information from multispectral channels can improve the efficiency of object classification. In this paper we use a method that combines information about the objects obtained by cameras in different frequency bands. We apply a method based on the simultaneous minimization of an L2 data-fidelity term and the squared first-order differences of the sequence of estimates to denoise the images and restore blur at the edges. In the case of information loss, an approach is applied based on interpolating data taken from the analysis of objects located in other areas and on information obtained from the multispectral camera. The effectiveness of the proposed approach is shown on a set of test images.
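
    A hedged sketch of the stated penalty idea, not the authors' exact method: simultaneously minimize an L2 data-fidelity term and the squared first-order differences of the estimate, min_x ||x - y||^2 + lam(||D_h x||^2 + ||D_v x||^2), solved here in closed form in the Fourier domain under periodic boundary assumptions. The weight lam is an illustrative choice.

    ```python
    import numpy as np

    def smooth_l2_firstdiff(y, lam=2.0):
        """y: noisy 2-D image; returns the penalized least-squares estimate."""
        rows, cols = y.shape
        fy = np.fft.fft2(y)
        # Eigenvalues of the periodic first-difference operators along each axis.
        wx = 2.0 - 2.0 * np.cos(2.0 * np.pi * np.arange(cols) / cols)
        wy = 2.0 - 2.0 * np.cos(2.0 * np.pi * np.arange(rows) / rows)
        denom = 1.0 + lam * (wy[:, None] + wx[None, :])
        return np.real(np.fft.ifft2(fy / denom))
    ```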

  8. Land use classification utilizing remote multispectral scanner data and computer analysis techniques

    NASA Technical Reports Server (NTRS)

    Leblanc, P. N.; Johannsen, C. J.; Yanner, J. E.

    1973-01-01

    An airborne multispectral scanner was used to collect the visible and reflective infrared data. A small subdivision near Lafayette, Indiana was selected as the test site for the urban land use study. Multispectral scanner data were collected over the subdivision on May 1, 1970 from an altitude of 915 meters. The data were collected in twelve wavelength bands from 0.40 to 1.00 micrometers by the scanner. The results indicated that computer analysis of multispectral data can be very accurate in classifying and estimating the natural and man-made materials that characterize land uses in an urban scene.

  9. Mapping of hydrothermally altered rocks using airborne multispectral scanner data, Marysvale, Utah, mining district

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Jones, O.D.

    1983-01-01

    Multispectral data covering an area near Marysvale, Utah, collected with the airborne National Aeronautics and Space Administration (NASA) 24-channel Bendix multispectral scanner, were analyzed to detect areas of hydrothermally altered, potentially mineralized rocks. Spectral bands were selected for analysis that approximate those of the Landsat 4 Thematic Mapper and which are diagnostic of the presence of hydrothermally derived products. Hydrothermally altered rocks, particularly volcanic rocks affected by solutions rich in sulfuric acid, are commonly characterized by concentrations of argillic minerals such as alunite and kaolinite. These minerals are important for identifying hydrothermally altered rocks in multispectral images because they have intense absorption bands centered near a wavelength of 2.2 μm. Unaltered volcanic rocks commonly do not contain these minerals and hence do not have the absorption bands. A color-composite image was constructed using the following spectral band ratios: 1.6 μm/2.2 μm, 1.6 μm/0.48 μm, and 0.67 μm/1.0 μm. The particular bands were chosen to emphasize the spectral contrasts that exist for argillic versus non-argillic rocks, limonitic versus nonlimonitic rocks, and rocks versus vegetation, respectively. The color-ratio composite successfully distinguished most types of altered rocks from unaltered rocks. Some previously unrecognized areas of hydrothermal alteration were mapped. The altered rocks included those having high alunite and/or kaolinite content, siliceous rocks containing some kaolinite, and ash-fall tuffs containing zeolitic minerals. The color-ratio-composite image allowed further division of these rocks into limonitic and nonlimonitic phases. The image did not allow separation of highly siliceous or hematitically altered rocks containing no clays or alunite from unaltered rocks. A color-coded density slice image of the 1.6 μm/2.2 μm band ratio allowed further discrimination among the altered units. Areas
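
    As an illustration of the band-ratio composite described above (1.6/2.2 μm, 1.6/0.48 μm and 0.67/1.0 μm in the red, green and blue channels), the sketch below builds such a composite with a simple percentile stretch. The stretch limits are assumptions; the band arrays are assumed to be co-registered reflectance images of equal shape.

    ```python
    import numpy as np

    def stretch(x, lo=2, hi=98):
        a, b = np.percentile(x, [lo, hi])
        return np.clip((x - a) / (b - a + 1e-9), 0.0, 1.0)

    def ratio_composite(b048, b067, b100, b160, b220):
        r = stretch(b160 / (b220 + 1e-6))   # highlights argillic (clay/alunite) absorption near 2.2 um
        g = stretch(b160 / (b048 + 1e-6))   # highlights limonitic staining
        b = stretch(b067 / (b100 + 1e-6))   # separates vegetation from rock
        return np.dstack([r, g, b])         # (rows, cols, 3) display composite
    ```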

  10. Performance analysis of a multispectral system for mine detection in the littoral zone

    NASA Astrophysics Data System (ADS)

    Hargrove, John T.; Louchard, Eric

    2004-09-01

    Science & Technology International (STI) has developed, under contract with the Office of Naval Research, a system of multispectral airborne sensors and processing algorithms capable of detecting mine-like objects in the surf zone. STI has used this system to detect mine-like objects in a littoral environment as part of blind tests at Kaneohe Marine Corps Base Hawaii, and Panama City, Florida. The airborne and ground subsystems are described. The detection algorithm is graphically illustrated. We report on the performance of the system configured to operate without a human in the loop. A subsurface (underwater bottom proud mine in the surf zone and moored mine in shallow water) mine detection capability is demonstrated in the surf zone, and in shallow water with wave spillage and foam. Our analysis demonstrates that this STI-developed multispectral airborne mine detection system provides a technical foundation for a viable mine counter-measures system for use prior to an amphibious assault.

  11. Adaptive illumination source for multispectral vision system applied to material discrimination

    NASA Astrophysics Data System (ADS)

    Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.

    2008-04-01

    A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color and infrared imaging have been successfully applied for this task. This work proposes a different approach, in which the relevant wavelengths for the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response covering the entire range of the selected wavelengths. Finally, the multispectral planes obtained are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled and specific illumination produces multispectral imaging with a simple monochrome camera and cold illumination restricted to specific relevant wavelengths, which is desirable for the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
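
    A minimal Spectral Angle Mapping (SAM) sketch for reference: each pixel spectrum is assigned to the reference material with the smallest spectral angle. The reference spectra and the rejection threshold are application-specific assumptions, not values from the paper.

    ```python
    import numpy as np

    def sam_classify(cube, references, max_angle=0.15):
        """cube: (rows, cols, bands); references: (n_classes, bands); returns label map (-1 = reject)."""
        pixels = cube.reshape(-1, cube.shape[-1]).astype(float)
        p_norm = pixels / (np.linalg.norm(pixels, axis=1, keepdims=True) + 1e-12)
        r_norm = references / (np.linalg.norm(references, axis=1, keepdims=True) + 1e-12)
        angles = np.arccos(np.clip(p_norm @ r_norm.T, -1.0, 1.0))   # (n_pixels, n_classes)
        labels = angles.argmin(axis=1)
        labels[angles.min(axis=1) > max_angle] = -1                 # reject poorly matched pixels
        return labels.reshape(cube.shape[:2])
    ```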

  12. Using a trichromatic CCD camera for spectral skylight estimation.

    PubMed

    López-Alvarez, Miguel A; Hernández-Andrés, Javier; Romero, Javier; Olmo, F J; Cazorla, A; Alados-Arboledas, L

    2008-12-01

    In a previous work [J. Opt. Soc. Am. A 24, 942-956 (2007)] we showed how to design an optimum multispectral system aimed at spectral recovery of skylight. Since high-resolution multispectral images of skylight could be interesting for many scientific disciplines, here we also propose a nonoptimum but much cheaper and faster approach to achieve this goal by using a trichromatic RGB charge-coupled device (CCD) digital camera. The camera is attached to a fish-eye lens, hence permitting us to obtain a spectrum of every point of the skydome corresponding to each pixel of the image. In this work we show how to apply multispectral techniques to the sensors' responses of a common trichromatic camera in order to obtain skylight spectra from them. This spectral information is accurate enough to estimate experimental values of some climate parameters or to be used in algorithms for automatic cloud detection, among many other possible scientific applications.
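
    The sketch below illustrates one common way to apply multispectral estimation techniques to trichromatic responses, as described above: a linear (Wiener-style) estimator learned from training spectra and the camera sensitivities. The training data, sensitivity curves and noise level are placeholders; the authors' estimator may differ.

    ```python
    import numpy as np

    def train_linear_estimator(train_spectra, sensitivities, noise_var=1e-4):
        """train_spectra: (n_samples, n_wavelengths); sensitivities: (3, n_wavelengths)."""
        responses = train_spectra @ sensitivities.T                 # simulated RGB responses
        Krr = responses.T @ responses / len(responses) + noise_var * np.eye(3)
        Ksr = train_spectra.T @ responses / len(responses)
        return Ksr @ np.linalg.inv(Krr)                             # (n_wavelengths, 3) estimator W

    def estimate_spectrum(W, rgb):
        """Recover an estimated skylight spectrum from one pixel's RGB response."""
        return W @ rgb
    ```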

  13. Multi-spectral imaging with infrared sensitive organic light emitting diode

    PubMed Central

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low-cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR-sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images, which are then recorded by a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  14. Multi-spectral imaging with infrared sensitive organic light emitting diode

    NASA Astrophysics Data System (ADS)

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-08-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low-cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR-sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images, which are then recorded by a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.

  15. The NEAR Multispectral Imager.

    NASA Astrophysics Data System (ADS)

    Hawkins, S. E., III

    1998-06-01

    The Multispectral Imager, one of the primary instruments on the Near Earth Asteroid Rendezvous (NEAR) spacecraft, uses a five-element refractive optics telescope, an eight-position filter wheel, and a charge-coupled device detector to acquire images over its sensitive wavelength range of approximately 400-1100 nm. The primary science objectives of the Multispectral Imager are to determine the morphology and composition of the surface of asteroid 433 Eros. The camera will have a critical role in navigating to the asteroid. Seven narrowband spectral filters have been selected to provide multicolor imaging for comparative studies with previous observations of asteroids in the same class as Eros. The eighth filter is broadband and will be used for optical navigation. An overview of the instrument is presented, and design parameters and tradeoffs are discussed.

  16. Multispectral Photography: the obscure becomes the obvious

    ERIC Educational Resources Information Center

    Polgrean, John

    1974-01-01

    Commonly used in map making, real estate zoning, and highway route location, aerial photography planes equipped with multispectral cameras may, among many environmental applications, now be used to locate mineral deposits, define marshland boundaries, study water pollution, and detect diseases in crops and forests. (KM)

  17. Commercial Applications Multispectral Sensor System

    NASA Technical Reports Server (NTRS)

    Birk, Ronald J.; Spiering, Bruce

    1993-01-01

    NASA's Office of Commercial Programs is funding a multispectral sensor system to be used in the development of remote sensing applications. The Airborne Terrestrial Applications Sensor (ATLAS) is designed to provide versatility in acquiring spectral and spatial information. The ATLAS system will be a test bed for the development of specifications for airborne and spaceborne remote sensing instrumentation for dedicated applications. This objective requires spectral coverage from the visible through thermal infrared wavelengths, variable spatial resolution from 2-25 meters; high geometric and geo-location accuracy; on-board radiometric calibration; digital recording; and optimized performance for minimized cost, size, and weight. ATLAS is scheduled to be available in 3rd quarter 1992 for acquisition of data for applications such as environmental monitoring, facilities management, geographic information systems data base development, and mineral exploration.

  18. Comparative performance between compressed and uncompressed airborne imagery

    NASA Astrophysics Data System (ADS)

    Phan, Chung; Rupp, Ronald; Agarwal, Sanjeev; Trang, Anh; Nair, Sumesh

    2008-04-01

    The US Army's RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD), Countermine Division is evaluating the compressibility of airborne multi-spectral imagery for mine and minefield detection applications. Of particular interest is assessing the highest image data compression rate that can be afforded without loss of image quality for warfighters in the loop and without degrading the performance of a near-real-time mine detection algorithm. The JPEG-2000 compression standard is used to perform the data compression. Both lossless and lossy compression are considered. A multi-spectral anomaly detector such as RX (Reed & Xiaoli), which is widely used as a core baseline algorithm in airborne mine and minefield detection across different mine types, minefields, and terrains to identify potential individual targets, is used to compare mine detection performance. This paper presents the compression scheme and compares detection performance between compressed and uncompressed imagery for various levels of compression. The compression efficiency is evaluated, and its dependence upon different backgrounds and other factors is documented and presented using multi-spectral data.
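
    For context, a short sketch of the RX (Reed-Xiaoli) anomaly detector used as the performance baseline above: each pixel spectrum is scored by its Mahalanobis distance to the global background statistics. Thresholding and local-window variants are omitted; this is the standard global formulation, not the program's specific implementation.

    ```python
    import numpy as np

    def rx_scores(cube):
        """cube: (rows, cols, bands) multispectral image; returns per-pixel anomaly scores."""
        pixels = cube.reshape(-1, cube.shape[-1]).astype(float)
        mu = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False)
        cov_inv = np.linalg.pinv(cov)
        centered = pixels - mu
        # Mahalanobis distance of every pixel spectrum to the background distribution.
        scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
        return scores.reshape(cube.shape[:2])
    ```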

  19. Semantic segmentation of forest stands of pure species combining airborne lidar data and very high resolution multispectral imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet-Brunet, Valérie

    2017-04-01

    Forest stands are the basic units for forest inventory and mapping. Stands are defined as large forested areas (e.g., ⩾ 2 ha) of homogeneous tree species composition and age. Their accurate delineation is usually performed by human operators through visual analysis of very high resolution (VHR) infra-red images. This task is tedious, highly time consuming, and should be automated for scalability and efficient updating purposes. In this paper, a method based on the fusion of airborne lidar data and VHR multispectral images is proposed for the automatic delineation of forest stands containing one dominant species (purity greater than 75%). This is the key preliminary task for forest land-cover database updating. The multispectral images give information about the tree species, whereas the 3D lidar point clouds provide geometric information on the trees and allow their individual extraction. Multi-modal features are computed, both at pixel and object levels: the objects are individual trees extracted from the lidar data. A supervised classification is then performed at the object level in order to coarsely discriminate the existing tree species in each area of interest. The classification results are further processed to obtain homogeneous areas with smooth borders by employing an energy minimization framework, where additional constraints are included in the energy function. The experimental results show that the proposed method provides very satisfactory results both in terms of stand labeling and delineation (overall accuracy ranges between 84% and 99%).

  20. Active and passive multispectral scanner for earth resources applications: An advanced applications flight experiment

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.; Peterson, L. M.; Thomson, F. J.; Work, E. A.; Kriegler, F. J.

    1977-01-01

    The development of an experimental airborne multispectral scanner to provide both active (laser illuminated) and passive (solar illuminated) data from a commonly registered surface scene is discussed. The system was constructed according to specifications derived in an initial program design study. The system was installed in an aircraft and test flown to produce illustrative active and passive multispectral imagery. However, data were neither collected nor analyzed for any specific application.

  1. [Nitrogen stress measurement of canola based on multi-spectral charged coupled device imaging sensor].

    PubMed

    Feng, Lei; Fang, Hui; Zhou, Wei-Jun; Huang, Min; He, Yong

    2006-09-01

    Site-specific variable nitrogen application is one of the major precision crop production management operations. Obtaining sufficient crop nitrogen stress information is essential for achieving effective site-specific nitrogen applications. The present paper describes the development of a multi-spectral nitrogen deficiency sensor, which uses three channels (green, red, near-infrared) of crop images to determine the nitrogen level of canola. The sensor assesses nitrogen stress by estimating the SPAD value of the canola from the canopy reflectance sensed in the three channels of the multi-spectral camera. The core of this investigation is the calibration method relating the multi-spectral references to the crop nitrogen levels measured using a SPAD 502 chlorophyll meter. Based on the results obtained from this study, it can be concluded that a multi-spectral CCD camera can provide sufficient information to perform reasonable SPAD value estimation during field operations.
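
    A hedged sketch of the calibration step described above: fit a regression from a vegetation index built from the red and near-infrared channels to SPAD readings, then predict SPAD (and hence nitrogen status) from new images. The NDVI-based linear form is an assumption for illustration; the paper's calibration model may differ and may also use the green channel.

    ```python
    import numpy as np

    def calibrate_spad(red, nir, spad_readings):
        """Inputs are per-plot mean reflectances and the matching SPAD 502 readings."""
        ndvi = (nir - red) / (nir + red + 1e-6)
        A = np.column_stack([ndvi, np.ones_like(ndvi)])
        coeffs, *_ = np.linalg.lstsq(A, spad_readings, rcond=None)
        return coeffs   # (slope, intercept)

    def predict_spad(coeffs, red, nir):
        """Apply the calibrated model to new red/NIR reflectance images or plot means."""
        ndvi = (nir - red) / (nir + red + 1e-6)
        return coeffs[0] * ndvi + coeffs[1]
    ```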

  2. A Web-GIS Procedure Based on Satellite Multi-Spectral and Airborne LIDAR Data to Map the Road blockage Due to seismic Damages of Built-Up Urban Areas

    NASA Astrophysics Data System (ADS)

    Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore

    2016-08-01

    In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology concerns (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; (3) the GIS-based mapping of road closures due to seismic-related building collapses, based on the characteristic building height and the width of the road. Experimental results, gathered for the Cosenza urban area, demonstrate the benefits of both the proposed approach and the GIS-based integration of multi-platform remote sensing sensors and techniques for seismic road assessment purposes.

  3. Spatial and temporal skin blood volume and saturation estimation using a multispectral snapshot imaging camera

    NASA Astrophysics Data System (ADS)

    Ewerlöf, Maria; Larsson, Marcus; Salerud, E. Göran

    2017-02-01

    Hyperspectral imaging (HSI) can estimate the spatial distribution of skin blood oxygenation using visible to near-infrared light. HSI oximeters often use a liquid-crystal tunable filter, an acousto-optic tunable filter or mechanically adjustable filter wheels, whose response/switching times are too long to monitor tissue hemodynamics. This work aims to evaluate a multispectral snapshot imaging system for estimating skin blood volume and oxygen saturation with high temporal and spatial resolution. We use a snapshot imager, the xiSpec camera (MQ022HG-IM-SM4X4-VIS, XIMEA), which has 16 wavelength-specific Fabry-Perot filters overlaid on a custom CMOS chip. The spectral distributions of the bands, however, overlap substantially, which needs to be taken into account for an accurate analysis. An inverse Monte Carlo analysis is performed using a two-layered skin tissue model defined by epidermal thickness, haemoglobin concentration and oxygen saturation, melanin concentration and a spectrally dependent reduced-scattering coefficient, all parameters relevant to human skin. The analysis takes into account the spectral detector response of the xiSpec camera. At each spatial location in the field of view, we compare the simulated output to the detected diffusely backscattered spectra to find the best fit. The imager is evaluated for spatial and temporal variations during arterial and venous occlusion protocols applied to the forearm. Estimated blood volume changes and oxygenation maps at 512x272 pixels show values that are comparable to reference measurements performed in contact with the skin tissue. We conclude that the snapshot xiSpec camera, paired with an inverse Monte Carlo algorithm, can be used for spatial and temporal measurement of varying physiological parameters, such as skin tissue blood volume and oxygenation.

  4. Leica ADS40 Sensor for Coastal Multispectral Imaging

    NASA Technical Reports Server (NTRS)

    Craig, John C.

    2007-01-01

    The Leica ADS40 Sensor as it is used for coastal multispectral imaging is presented. The contents include: 1) Project Area Overview; 2) Leica ADS40 Sensor; 3) Focal Plate Arrangements; 4) Trichroid Filter; 5) Gradient Correction; 6) Image Acquisition; 7) Remote Sensing and ADS40; 8) Band comparisons of Satellite and Airborne Sensors; 9) Impervious Surface Extraction; and 10) Impervious Surface Details.

  5. Airborne Thermal Infrared Multispectral Scanner (TIMS) images over disseminated gold deposits, Osgood Mountains, Humboldt County, Nevada

    NASA Technical Reports Server (NTRS)

    Krohn, M. Dennis

    1986-01-01

    The U.S. Geological Survey (USGS) acquired airborne Thermal Infrared Multispectral Scanner (TIMS) images over several disseminated gold deposits in northern Nevada in 1983. The aerial surveys were flown to determine whether TIMS data could depict jasperoids (siliceous replacement bodies) associated with the gold deposits. The TIMS data were collected over the Pinson and Getchell Mines in the Osgood Mountains, the Carlin, Maggie Creek, Bootstrap, and other mines in the Tuscarora Mountains, and the Jerritt Canyon Mine in the Independence Mountains. The TIMS data seem to be a useful supplement to conventional geochemical exploration for disseminated gold deposits in the western United States. Siliceous outcrops are readily separable in the TIMS image from other types of host rocks. Different forms of silicification are not readily separable, yet, due to limitations of spatial resolution and spectral dynamic range. Features associated with the disseminated gold deposits, such as the large intrusive bodies and fault structures, are also resolvable on TIMS data. Inclusion of high-resolution thermal inertia data would be a useful supplement to the TIMS data.

  6. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral lidar system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured by the sensor's receiver, and the return signal, together with the position and orientation information of the sensor, is recorded. These recorded data are processed with GNSS/IMU data in post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne lidar sensor, the Optech Titan system is capable of collecting point cloud data in all three channels: at 532 nm in the visible (green), at 1064 nm in the near infrared (NIR) and at 1550 nm in the intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral lidar point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories using customized classification-index features and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types; over 90% overall accuracy is achieved using multispectral lidar point clouds for 3D land cover classification.
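    As a toy illustration of the second step, the sketch below aggregates per-point channel intensities and heights into per-segment features, combining a spectral (NDVI-like) index from the 1064 nm and 532 nm channels with a vertical-distribution feature. The segmentation itself and the nine-class rule set are not reproduced; all names and the random data are placeholders.

```python
import numpy as np

def segment_features(seg_ids, intensity_nir, intensity_grn, height):
    """Aggregate per-point multispectral lidar attributes into per-segment features.

    seg_ids       : (P,) integer segment label for each point (from the OBIA segmentation)
    intensity_nir : (P,) calibrated intensity at 1064 nm
    intensity_grn : (P,) calibrated intensity at 532 nm
    height        : (P,) height above ground for each point
    Returns a dict: segment id -> (pseudo-NDVI, mean height) feature vector.
    """
    feats = {}
    for sid in np.unique(seg_ids):
        m = seg_ids == sid
        nir = intensity_nir[m].mean()
        grn = intensity_grn[m].mean()
        ndvi_like = (nir - grn) / (nir + grn + 1e-9)   # spectral feature
        feats[sid] = (ndvi_like, height[m].mean())     # spectral + vertical feature
    return feats

# toy usage: 1000 points in 3 segments
rng = np.random.default_rng(1)
seg = rng.integers(0, 3, 1000)
print(segment_features(seg, rng.random(1000), rng.random(1000), rng.random(1000) * 10))
```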

  7. Retinal oxygen saturation evaluation by multi-spectral fundus imaging

    NASA Astrophysics Data System (ADS)

    Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James

    2007-03-01

    Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in five monkeys with a commercial fundus camera equipped with a liquid crystal tunable filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Images at separate wavelengths that were misaligned due to slight eye motion were corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue between the artery/vein pairs were evaluated by an algorithm previously described, but now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition, that is, in a normal retina the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a time scale applicable to human subjects. Conclusions: Seven-wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal arteries, veins, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans. This work
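    The saturation algorithm itself is only cited in the abstract. As a generic illustration of the kind of calculation involved, the sketch below implements the classic two-wavelength optical-density-ratio oximetry estimate, not the authors' corrected multi-wavelength algorithm; the intensities and calibration coefficients are placeholders.

```python
import numpy as np

def od_ratio_saturation(i_vessel_sens, i_ref_sens, i_vessel_iso, i_ref_iso, a=1.0, b=0.0):
    """Classic two-wavelength oximetry: optical densities at an oxygen-sensitive
    and an isosbestic wavelength, with their ratio mapped linearly to saturation.

    i_vessel_* : intensity measured inside the vessel at each wavelength
    i_ref_*    : intensity measured on adjacent tissue (reference) at each wavelength
    a, b       : calibration coefficients (placeholders; fitted against reference
                 measurements in practice)
    """
    od_sens = -np.log10(i_vessel_sens / i_ref_sens)
    od_iso = -np.log10(i_vessel_iso / i_ref_iso)
    odr = od_sens / od_iso
    return a * odr + b   # relative saturation (arbitrary units until calibrated)

print(od_ratio_saturation(0.35, 0.60, 0.30, 0.55))
```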

  8. Dual multispectral and 3D structured light laparoscope

    NASA Astrophysics Data System (ADS)

    Clancy, Neil T.; Lin, Jianyu; Arya, Shobhit; Hanna, George B.; Elson, Daniel S.

    2015-03-01

    Intraoperative feedback on tissue function, such as blood volume and oxygenation, would be useful to the surgeon in cases where current clinical practice relies on subjective measures, such as identification of ischaemic bowel or of tissue viability during anastomosis formation. Tissue surface profiling may also be used to detect and identify certain pathologies, as well as to assess aspects of tissue health such as gut motility. In this paper a dual-modality laparoscopic system is presented that combines multispectral reflectance and 3D surface imaging. White light illumination from a xenon source is detected by a laparoscope-mounted fast filter wheel camera to assemble a multispectral image (MSI) cube. Surface shape is then calculated using a spectrally-encoded structured light (SL) pattern detected by the same camera and triangulated using an active stereo technique. Images of porcine small bowel were acquired during open surgery. Tissue reflectance spectra were acquired and blood volume was calculated at each spatial pixel across the bowel wall and mesentery. SL features were segmented and identified using a 'normalised cut' algorithm and the colour vector of each spot. Using the 3D geometry defined by the camera coordinate system, the multispectral data could be overlaid onto the surface mesh. Dual MSI and SL imaging has the potential to provide augmented views to the surgeon, supplying diagnostic information related to blood supply health and organ function. Future work on this system will include filter optimisation to reduce noise in tissue optical property measurement, and minimising spot identification errors in the SL pattern.

  9. Spatio-temporal monitoring of cotton cultivation using ground-based and airborne multispectral sensors in GIS environment.

    PubMed

    Papadopoulos, Antonis; Kalivas, Dionissios; Theocharopoulos, Sid

    2017-07-01

    The capability of multispectral sensors to capture reflectance data in several spectral channels, together with the characteristic reflectance responses of various soils and especially plant surfaces, has gained major interest in crop production. In the present study, two multispectral sensing systems, one ground-based and one aerial-based, were applied for the multi-spatial and temporal monitoring of two cotton fields in central Greece. The ground-based system was a Crop Circle ACS-430, while the aerial system consisted of a consumer-level quadcopter (Phantom 2) and a modified Hero3+ Black digital camera. The purpose of the research was to monitor crop growth with the two systems and investigate possible interrelations between the derived values of the well-known normalized difference vegetation index (NDVI). Five data collection campaigns were conducted during the cultivation period, involving scanning of soil and plants with the ground-based sensor and aerial photography of the fields with the unmanned aerial system. According to the results, both systems successfully monitored cotton growth stages in terms of space and time. The mean NDVI values over time retrieved by the ground-based system were satisfactorily modelled by a second-order polynomial equation (R² = 0.96 in Field 1 and 0.99 in Field 2). Further, they were highly correlated (r = 0.90 in Field 1 and 0.74 in Field 2) with the corresponding values calculated via the aerial-based system. The unmanned aerial system (UAS) can potentially substitute for crop scouting, as it offers a time-effective, non-destructive and reliable way of monitoring soil and plants.
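    The two summary statistics reported above (the second-order polynomial fit of NDVI over time and the ground-to-aerial correlation) can be reproduced with a short sketch such as the one below; the dates and NDVI values are invented stand-ins, not the study's data.

```python
import numpy as np

# day of year of five campaigns and mean NDVI values (illustrative values only)
doy = np.array([150, 170, 190, 215, 240], dtype=float)
ndvi_ground = np.array([0.25, 0.45, 0.70, 0.78, 0.60])   # ground-based means (made up)
ndvi_aerial = np.array([0.22, 0.48, 0.66, 0.80, 0.57])   # aerial-based means (made up)

# second-order polynomial model of NDVI over time, as in the study
coeffs = np.polyfit(doy, ndvi_ground, deg=2)
fitted = np.polyval(coeffs, doy)
ss_res = np.sum((ndvi_ground - fitted) ** 2)
ss_tot = np.sum((ndvi_ground - ndvi_ground.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Pearson correlation between ground-based and aerial-based NDVI
r = np.corrcoef(ndvi_ground, ndvi_aerial)[0, 1]
print(f"polynomial R^2 = {r2:.2f}, ground vs aerial r = {r:.2f}")
```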

  10. Fusion of remotely sensed data from airborne and ground-based sensors for cotton regrowth study

    USDA-ARS?s Scientific Manuscript database

    The study investigated the use of aerial multispectral imagery and ground-based hyperspectral data for the discrimination of different crop types and timely detection of cotton plants over large areas. Airborne multispectral imagery and ground-based spectral reflectance data were acquired at the sa...

  11. Feasibility study and quality assessment of unmanned aircraft system-derived multispectral images

    NASA Astrophysics Data System (ADS)

    Chang, Kuo-Jen

    2017-04-01

    The purpose of this study is to explore the precision and applicability of UAS-derived multispectral images. In this study, a Micro-MCA6 multispectral camera was mounted on a quadcopter. The Micro-MCA6 captures synchronized images for each single band. By means of geotagged images and control points, orthomosaic images of each single band were first generated at 14 cm resolution, and the complete multispectral image was merged from the six bands. In order to improve the spatial resolution, the six-band image was fused with a 9 cm resolution image taken by an RGB camera. The quality of each single band was evaluated using control points and check points; the standard deviations of the errors are within 1 to 2 pixels for each band. The quality of the multispectral image was also compared with a 3 cm resolution orthomosaic RGB image gathered by the UAV in the same mission; here the standard deviations of the errors are within 2 to 3 pixels. The results show that the errors arise from image blur and from band dislocation in identifying object edges. Finally, the normalized difference vegetation index (NDVI) was extracted from the image to explore the condition of the vegetation and the nature of the environment. This study demonstrates the feasibility and capability of high resolution multispectral images.

  12. Multispectral imaging system for contaminant detection

    NASA Technical Reports Server (NTRS)

    Poole, Gavin H. (Inventor)

    2003-01-01

    An automated inspection system for detecting digestive contaminants on food items as they are being processed for consumption includes a conveyor for transporting the food items, a light sealed enclosure which surrounds a portion of the conveyor, with a light source and a multispectral or hyperspectral digital imaging camera disposed within the enclosure. Operation of the conveyor, light source and camera are controlled by a central computer unit. Light reflected by the food items within the enclosure is detected in predetermined wavelength bands, and detected intensity values are analyzed to detect the presence of digestive contamination.

  13. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with the much greater spatial resolution necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than

  14. Multi-spectral endogenous fluorescence imaging for bacterial differentiation

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Babayants, Margarita V.; Korotkov, Oleg V.; Kudrin, Konstantin G.; Rimskaya, Elena N.; Shikunova, Irina A.; Kurlov, Vladimir N.; Cherkasova, Olga P.; Komandin, Gennady A.; Reshetov, Igor V.; Zaytsev, Kirill I.

    2017-07-01

    In this paper, multi-spectral endogenous fluorescence imaging was implemented for bacterial differentiation. The fluorescence imaging was performed using a digital camera equipped with a set of visible bandpass filters. Narrowband 365 nm ultraviolet radiation, passed through a beam homogenizer, was used to excite the sample fluorescence. In order to increase the signal-to-noise ratio and suppress the non-fluorescence background in the images, the intensity of the UV excitation was modulated using a mechanical chopper. Principal components were then used to differentiate the bacterial samples based on the multi-spectral endogenous fluorescence images.
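    As a sketch of the final differentiation step, the snippet below computes principal-component score maps from a stack of fluorescence images taken through several bandpass filters; the cube dimensions and random data are placeholders for real filtered acquisitions.

```python
import numpy as np

def pca_scores(cube, n_components=2):
    """Project each pixel's multi-band fluorescence spectrum onto its first
    principal components.

    cube : (H, W, B) stack of fluorescence images taken through B bandpass filters
    Returns (H, W, n_components) score maps used to separate the samples.
    """
    h, w, b = cube.shape
    x = cube.reshape(-1, b).astype(float)
    x -= x.mean(axis=0)                      # centre the spectra
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    scores = x @ vt[:n_components].T         # principal-component scores
    return scores.reshape(h, w, n_components)

# toy cube: 64x64 pixels, 6 filter bands
rng = np.random.default_rng(2)
print(pca_scores(rng.random((64, 64, 6))).shape)
```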

  15. Wetland Vegetation Integrity Assessment with Low Altitude Multispectral Uav Imagery

    NASA Astrophysics Data System (ADS)

    Boon, M. A.; Tesfamichael, S.

    2017-08-01

    Multispectral sensors for Unmanned Aerial Vehicles (UAVs) were until recently too heavy and bulky, but this has changed in recent times and they are now commercially available. The use of these sensors is mostly directed towards the agricultural sector, where the focus is on precision farming. Applications of these sensors for mapping wetland ecosystems are rare. Here, we evaluate the performance of low altitude multispectral UAV imagery for determining the state of wetland vegetation in a localised spatial area. Specifically, NDVI derived from multispectral UAV imagery was used to inform the determination of the integrity of the wetland vegetation. Furthermore, we tested different software applications for processing the imagery; the advantages and disadvantages we experienced with these applications are also presented briefly in this paper. A JAG-M fixed-wing imaging system equipped with a MicaSense RedEdge multispectral camera was utilised for the survey. A single surveying campaign was undertaken in early autumn over a 17 ha study area at the Kameelzynkraal farm, Gauteng Province, South Africa. Structure-from-motion photogrammetry software was used to reconstruct the camera positions and terrain features and to derive a high resolution orthorectified mosaic. The MicaSense Atlas cloud-based data platform, Pix4D and PhotoScan were utilised for the processing. The WET-Health level one methodology was followed for the vegetation assessment, where wetland health is a measure of the deviation of a wetland's structure and function from its natural reference condition. An on-site evaluation of the vegetation integrity was first completed. Disturbance classes were then mapped using the high resolution multispectral orthoimages and NDVI. The WET-Health vegetation module, completed with the aid of the multispectral UAV products, indicated that the vegetation of the wetland is largely modified ("D" PES Category) and that the condition is expected to

  16. A preliminary report of multispectral scanner data from the Cleveland harbor study

    NASA Technical Reports Server (NTRS)

    Shook, D.; Raquet, C.; Svehla, R.; Wachter, D.; Salzman, J.; Coney, T.; Gedney, D.

    1975-01-01

    Imagery obtained from an airborne multispectral scanner is presented. A synoptic view of the entire study area is shown for a number of time periods and for a number of spectral bands. Using several bands, sediment distributions, thermal plumes, and Rhodamine B dye distributions are shown.

  17. Uncertainty in multispectral lidar signals caused by incidence angle effects

    PubMed Central

    Nevalainen, Olli; Hakala, Teemu; Kaasalainen, Mikko

    2018-01-01

    Multispectral terrestrial laser scanning (TLS) is an emerging technology. Several manufacturers already offer commercial dual or three wavelength airborne laser scanners, while multispectral TLS is still carried out mainly with research instruments. Many of these research efforts have focused on the study of vegetation. The aim of this paper is to study the uncertainty of the measurement of spectral indices of vegetation with multispectral lidar. Using two spectral indices as examples, we find that the uncertainty is due to systematic errors caused by the wavelength dependency of laser incidence angle effects. This finding is empirical, and the error cannot be removed by modelling or instrument modification. The discovery and study of these effects has been enabled by hyperspectral and multispectral TLS, and it has become a subject of active research within the past few years. We summarize the most recent studies on multi-wavelength incidence angle effects and present new results on the effect of specular reflection from the leaf surface, and the surface structure, which have been suggested to play a key role. We also discuss the consequences to the measurement of spectral indices with multispectral TLS, and a possible correction scheme using a synthetic laser footprint. PMID:29503718
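    The systematic error described above can be mimicked with a toy calculation: if the reflectance in each laser channel falls off with incidence angle by a slightly different (here invented) power of the cosine, a normalised-difference index built from the two channels drifts with angle rather than staying constant. The exponents and reflectance values below are illustrative only, not measured results from the paper.

```python
import numpy as np

def nd_index(r1, r2):
    """Normalised-difference index from reflectances at two laser wavelengths."""
    return (r1 - r2) / (r1 + r2)

# nominal leaf reflectances at two wavelengths (illustrative values)
r_nir, r_red = 0.50, 0.10

incidence = np.radians([0, 20, 40, 60])
# hypothetical wavelength-dependent angular falloff: the two channels deviate
# from a common cosine law by different amounts
meas_nir = r_nir * np.cos(incidence) ** 1.0
meas_red = r_red * np.cos(incidence) ** 1.3

for ang, n, r in zip(np.degrees(incidence), meas_nir, meas_red):
    print(f"{ang:4.0f} deg  index = {nd_index(n, r):.3f}")
# the index is only angle-invariant if both channels scale identically with
# incidence angle; any wavelength dependence introduces a systematic bias
```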

  18. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.

  19. Field Test of the ExoMars Panoramic Camera in the High Arctic - First Results and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Schmitz, N.; Barnes, D.; Coates, A.; Griffiths, A.; Hauber, E.; Jaumann, R.; Michaelis, H.; Mosebach, H.; Paar, G.; Reissaus, P.; Trauthan, F.

    2009-04-01

    The ExoMars mission, as the first element of the ESA Aurora program, is scheduled to be launched to Mars in 2016. Part of the Pasteur Exobiology Payload onboard the ExoMars rover is a Panoramic Camera System ('PanCam') designed to obtain high-resolution color and wide-angle multi-spectral stereoscopic panoramic images from the mast of the ExoMars rover. The PanCam instrument consists of two wide-angle cameras (WACs), which will provide multispectral stereo images with a 34° field-of-view (FOV), and a High-Resolution RGB Channel (HRC) to provide close-up images with a 5° field-of-view. For field testing of the PanCam breadboard in a representative environment, the ExoMars PanCam team joined the 6th Arctic Mars Analogue Svalbard Expedition (AMASE) 2008. The expedition took place from 4-17 August 2008 in the Svalbard archipelago, Norway, which is considered an excellent analogue site for ancient Mars. 31 scientists and engineers involved in Mars exploration (among them the ExoMars WISDOM, MIMA and Raman-LIBS teams as well as several NASA MSL teams) combined their knowledge, instruments and techniques to study the geology, geophysics, biosignatures, and life forms that can be found in volcanic complexes, warm springs, subsurface ice, and sedimentary deposits. This work has been carried out using instruments, a rover (NASA's CliffBot), and techniques that will or may be used in future planetary missions, thereby providing the capability to simulate a full mission environment in a Mars analogue terrain. Besides demonstrating PanCam's general functionality in a field environment, a main objective was to test and verify the interpretability of PanCam data for in-situ geological context determination and scientific target selection. To process the collected data, a first version of the preliminary PanCam 3D reconstruction processing & visualization chain was used. Other objectives included testing and refining the operational scenario (based on ExoMars Rover

  20. Medium-sized aperture camera for Earth observation

    NASA Astrophysics Data System (ADS)

    Kim, Eugene D.; Choi, Young-Wan; Kang, Myung-Seok; Kim, Ee-Eul; Yang, Ho-Soon; Rasheed, Ad. Aziz Ad.; Arshad, Ahmad Sabirin

    2017-11-01

    Satrec Initiative and ATSB have been developing a medium-sized aperture camera (MAC) for an earth observation payload on a small satellite. Developed as a push-broom type high-resolution camera, the camera has one panchromatic and four multispectral channels. The panchromatic channel has a ground sampling distance of 2.5 m, and the multispectral channels have 5 m, at a nominal altitude of 685 km. The 300 mm-aperture Cassegrain telescope contains two aspheric mirrors and two spherical correction lenses. With a philosophy of building a simple and cost-effective camera, the mirrors incorporate no light-weighting, and the linear CCDs are mounted on a single PCB with no beam splitters. MAC is the main payload of RazakSAT, to be launched in 2005. RazakSAT is a 180 kg satellite including MAC, designed to provide high-resolution imagery with a 20 km swath width on a near equatorial orbit (NEqO). The mission objective is to demonstrate the capability of a high-resolution remote sensing satellite system on a near equatorial orbit. This paper describes the overview of the MAC and RazakSAT programmes, and presents the current development status of MAC, focusing on key optical aspects of the Qualification Model.

  1. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
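    The quoted 200 ps time resolution and the corresponding 6 cm optical path in free space are consistent, as a quick back-of-the-envelope check shows:

```python
# consistency check: time resolution multiplied by the speed of light
c = 299_792_458.0          # speed of light, m/s
dt = 200e-12               # time resolution, s
print(f"{c * dt * 100:.1f} cm")   # ~6.0 cm of free-space optical path
```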

  2. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-05

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.

  3. Remote sensing operations (multispectral scanner and photographic) in the New York Bight, 22 September 1975

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Hall, J. B., Jr.

    1977-01-01

    Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.

  4. Registration of 3D and Multispectral Data for the Study of Cultural Heritage Surfaces

    PubMed Central

    Chane, Camille Simon; Schütze, Rainer; Boochs, Frank; Marzani, Franck S.

    2013-01-01

    We present a technique for the multi-sensor registration of featureless datasets based on the photogrammetric tracking of the acquisition systems in use. This method is developed for the in situ study of cultural heritage objects and is tested by digitizing a small canvas successively with a 3D digitization system and a multispectral camera while simultaneously tracking the acquisition systems with four cameras and using a cubic target frame with a side length of 500 mm. The achieved tracking accuracy is better than 0.03 mm spatially and 0.150 mrad angularly. This allows us to seamlessly register the 3D acquisitions and to project the multispectral acquisitions on the 3D model. PMID:23322103

  5. Time-resolved multispectral imaging of combustion reactions

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Frédérick

    2015-10-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber, or to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity. This information is not directly available from broadband images; however, spectral information can be obtained using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder had been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from the information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
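    The temperature-estimation step can be illustrated with a simplified grey-body fit: if the filter channels sample the Planck curve, fitting only the spectral shape removes the unknown overall gain (emissivity, optics, integration time) and returns the source temperature. The channel wavelengths and the noise-free synthetic data below are assumptions for illustration, not the Telops processing chain.

```python
import numpy as np
from scipy.optimize import curve_fit

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # Planck, light speed, Boltzmann (SI)

def planck_shape(wl_um, temperature):
    """Normalised Planck spectral radiance across the filter wavelengths.

    Fitting only the spectral shape removes the unknown overall gain."""
    wl = wl_um * 1e-6
    b = (2 * H * C ** 2 / wl ** 5) / np.expm1(H * C / (wl * K * temperature))
    return b / b.max()

# hypothetical centre wavelengths of eight filter-wheel channels (micrometres)
channels = np.linspace(3.0, 5.0, 8)

# synthetic "measured" channel values for a 1400 K source
measured = planck_shape(channels, 1400.0)

(t_fit,), _ = curve_fit(planck_shape, channels, measured, p0=[1000.0])
print(f"retrieved temperature: {t_fit:.0f} K")
```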

  6. Time-resolved multispectral imaging of combustion reaction

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Fréderick

    2015-05-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral range. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber or even to observe gases produced by a flare or smokestack. Most combustion gases such as carbon dioxide (CO2) selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge about spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of combustion products of a candle in which black powder has been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.

  7. Generalized assorted pixel camera: postcapture control of resolution, dynamic range, and spectrum.

    PubMed

    Yasuma, Fumihito; Mitsunaga, Tomoo; Iso, Daisuke; Nayar, Shree K

    2010-09-01

    We propose the concept of a generalized assorted pixel (GAP) camera, which enables the user to capture a single image of a scene and, after the fact, control the tradeoff between spatial resolution, dynamic range and spectral detail. The GAP camera uses a complex array (or mosaic) of color filters. A major problem with using such an array is that the captured image is severely under-sampled for at least some of the filter types. This leads to reconstructed images with strong aliasing. We make four contributions in this paper: 1) we present a comprehensive optimization method to arrive at the spatial and spectral layout of the color filter array of a GAP camera. 2) We develop a novel algorithm for reconstructing the under-sampled channels of the image while minimizing aliasing artifacts. 3) We demonstrate how the user can capture a single image and then control the tradeoff of spatial resolution to generate a variety of images, including monochrome, high dynamic range (HDR) monochrome, RGB, HDR RGB, and multispectral images. 4) Finally, the performance of our GAP camera has been verified using extensive simulations that use multispectral images of real world scenes. A large database of these multispectral images has been made available at http://www1.cs.columbia.edu/CAVE/projects/gap_camera/ for use by the research community.

  8. The Panoramic Camera (PanCam) Instrument for the ESA ExoMars Rover

    NASA Astrophysics Data System (ADS)

    Griffiths, A.; Coates, A.; Jaumann, R.; Michaelis, H.; Paar, G.; Barnes, D.; Josset, J.

    The recently approved ExoMars rover is the first element of the ESA Aurora programme and is slated to deliver the Pasteur exobiology payload to Mars by 2013. The 0.7 kg Panoramic Camera will provide multispectral stereo images with 65° field-of- view (1.1 mrad/pixel) and high resolution (85 µrad/pixel) monoscopic "zoom" images with 5° field-of-view. The stereo Wide Angle Cameras (WAC) are based on Beagle 2 Stereo Camera System heritage. The Panoramic Camera instrument is designed to fulfil the digital terrain mapping requirements of the mission as well as providing multispectral geological imaging, colour and stereo panoramic images, solar images for water vapour abundance and dust optical depth measurements and to observe retrieved subsurface samples before ingestion into the rest of the Pasteur payload. Additionally the High Resolution Camera (HRC) can be used for high resolution imaging of interesting targets detected in the WAC panoramas and of inaccessible locations on crater or valley walls.

  9. Multispectral data processing from unmanned aerial vehicles: application in precision agriculture using different sensors and platforms

    NASA Astrophysics Data System (ADS)

    Piermattei, Livia; Bozzi, Carlo Alberto; Mancini, Adriano; Tassetti, Anna Nora; Karel, Wilfried; Pfeifer, Norbert

    2017-04-01

    Unmanned aerial vehicles (UAVs) in combination with consumer grade cameras have become standard tools for photogrammetric applications and surveying. The recent generation of multispectral, cost-efficient and lightweight cameras has fostered a breakthrough in the practical application of UAVs for precision agriculture. For this application, multispectral cameras typically use Green, Red, Red-Edge (RE) and Near Infrared (NIR) wavebands to capture both visible and invisible images of crops and vegetation. These bands are very effective for deriving characteristics like soil productivity, plant health and overall growth. However, the quality of results is affected by the sensor architecture, the spatial and spectral resolutions, the pattern of image collection, and the processing of the multispectral images. In particular, collecting data with multiple sensors requires an accurate spatial co-registration of the various UAV image datasets. Multispectral processed data in precision agriculture are mainly presented as orthorectified mosaics used to export information maps and vegetation indices. This work aims to investigate the acquisition parameters and processing approaches of this new type of image data in order to generate orthoimages using different sensors and UAV platforms. Within our experimental area we placed a grid of artificial targets, whose position was determined with differential global positioning system (dGPS) measurements. Targets were used as ground control points to georeference the images and as checkpoints to verify the accuracy of the georeferenced mosaics. The primary aim is to present a method for the spatial co-registration of visible, Red-Edge, and NIR image sets. To demonstrate the applicability and accuracy of our methodology, multi-sensor datasets were collected over the same area and approximately at the same time using the fixed-wing UAV senseFly "eBee". The images were acquired with the camera Canon S110 RGB, the multispectral cameras

  10. MOVING BEYOND COLOR: THE CASE FOR MULTISPECTRAL IMAGING IN BRIGHTFIELD PATHOLOGY.

    PubMed

    Cukierski, William J; Qi, Xin; Foran, David J

    2009-01-01

    A multispectral camera is capable of imaging a histologic slide at narrow bandwidths over the range of the visible spectrum. While several uses for multispectral imaging (MSI) have been demonstrated in pathology [1, 2], there is no unified consensus over when and how MSI might benefit automated analysis [3, 4]. In this work, we use a linear-algebra framework to investigate the relationship between the spectral image and its standard-image counterpart. The multispectral "cube" is treated as an extension of a traditional image in a high-dimensional color space. The concept of metamers is introduced and used to derive regions of the visible spectrum where MSI may provide an advantage. Furthermore, histological stains which are amenable to analysis by MSI are reported. We show the Commission internationale de l'éclairage (CIE) 1931 transformation from spectrum to color is non-neighborhood preserving. Empirical results are demonstrated on multispectral images of peripheral blood smears.
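    The spectrum-to-colour projection discussed here, and the existence of metamers, can be illustrated with a small numerical sketch. The Gaussian curves below are crude stand-ins for the CIE 1931 colour-matching functions rather than the tabulated values; the point is only that the 3 × B projection matrix has a large null space, so distinct spectra can map to the same colour.

```python
import numpy as np

# crude Gaussian stand-ins for the CIE 1931 colour-matching functions,
# sampled at 16 narrow bands (illustrative only, not the CIE tables)
wl = np.linspace(400, 700, 16)
def g(mu, sig): return np.exp(-0.5 * ((wl - mu) / sig) ** 2)
cmf = np.vstack([g(600, 40) + 0.35 * g(440, 25),    # x-bar (two-lobed)
                 g(555, 45),                        # y-bar
                 1.8 * g(450, 25)])                 # z-bar

spectrum1 = g(550, 60)                              # some reflected spectrum

# any vector in the null space of the 3x16 matrix leaves XYZ unchanged; adding
# one yields a different spectrum with the identical colour (a metamer)
# (a physically realisable metamer would additionally have to stay non-negative)
_, _, vt = np.linalg.svd(cmf)
null_vec = vt[-1]                                   # maps to (0, 0, 0) in XYZ
spectrum2 = spectrum1 + 0.2 * null_vec

xyz1, xyz2 = cmf @ spectrum1, cmf @ spectrum2
print(np.allclose(xyz1, xyz2), np.max(np.abs(spectrum1 - spectrum2)))
```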

  11. MOVING BEYOND COLOR: THE CASE FOR MULTISPECTRAL IMAGING IN BRIGHTFIELD PATHOLOGY

    PubMed Central

    Cukierski, William J.; Qi, Xin; Foran, David J.

    2009-01-01

    A multispectral camera is capable of imaging a histologic slide at narrow bandwidths over the range of the visible spectrum. While several uses for multispectral imaging (MSI) have been demonstrated in pathology [1, 2], there is no unified consensus over when and how MSI might benefit automated analysis [3, 4]. In this work, we use a linear-algebra framework to investigate the relationship between the spectral image and its standard-image counterpart. The multispectral “cube” is treated as an extension of a traditional image in a high-dimensional color space. The concept of metamers is introduced and used to derive regions of the visible spectrum where MSI may provide an advantage. Furthermore, histological stains which are amenable to analysis by MSI are reported. We show the Commission internationale de l’éclairage (CIE) 1931 transformation from spectrum to color is non-neighborhood preserving. Empirical results are demonstrated on multispectral images of peripheral blood smears. PMID:19997528

  12. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO), and use of this camera for observations of star-formation regions 1. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  13. The analysis on the relation between the compression method and the performance enhancement of MSC (Multi-Spectral Camera) image data

    NASA Astrophysics Data System (ADS)

    Yong, Sang-Soon; Ra, Sung-Woong

    2007-10-01

    The Multi-Spectral Camera (MSC) is the main payload on the KOMPSAT-2 satellite for earth remote sensing. The MSC instrument has one (1) channel for panchromatic imaging and four (4) channels for multi-spectral imaging, covering the spectral range from 450 nm to 900 nm using TDI CCD Focal Plane Arrays (FPAs). The instrument images the earth using a push-broom motion with a swath width of 15 km and a ground sample distance (GSD) of 1 m over the entire field of view (FOV) at an altitude of 685 km. The instrument is designed to have an on-orbit operation duty cycle of 20% over the mission lifetime of 3 years, with functions for programmable gain/offset and on-board image data compression/storage. The compression method for the KOMPSAT-2 MSC was selected and used to match the EOS input rate and the PDTS output data rate in the MSC image data chain. At the same time, MSC performance was carefully managed to minimize any degradation, and it was analyzed and restored at the KGS (KOMPSAT Ground Station) during the LEOP and Cal./Val. (Calibration and Validation) phases. In this paper, the on-orbit image data chain in the MSC and the image data processing at the KGS, including a general description of the MSC, are briefly described. The influence on image performance of the different on-board compression algorithms and of the different performance restoration methods in the ground station is analyzed, and the relation between the two is discussed.

  14. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths, however microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  15. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5. The summaries are contained in Volumes 1, 2, and 3, respectively.

  16. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  17. Thermal airborne multispectral aster simulator and its preliminary results

    NASA Astrophysics Data System (ADS)

    Mills, F.; Kannari, Y.; Watanabe, H.; Sano, M.; Chang, S. H.

    1994-03-01

    An Airborne ASTER Simulator (AAS) is being developed for the Japan Resources Observation System Organization (JAROS) by the Geophysical Environmental Research (GER) Corporation. The first test flights of the AAS were over Cuprite, Nevada; Long Valley, California; and Death Valley, California, in December 1991. Preliminary laboratory tests at NASA's Stennis Space Center (SSC) were completed in April 1992. The results of these tests indicate the AAS can discriminate between silicate and non-silicate rocks. The improvements planned for the next two years may give a spectral Full-Width at Half-Maximum (FWHM) of 0.3 μm and an NEΔT of 0.2-0.5 K. The AAS has the potential to become a good tool for airborne TIR research and can be used for simulations of future satellite-borne TIR sensors. Flight tests over Cuprite, Nevada, and Castaic Lake, California, are planned for October-December 1992.

  18. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Spectrometer (AVIRIS) workshop, on October 25-26, whose summaries appear in Volume 1; The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27, whose summaries appear in Volume 2; and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29, whose summaries appear in this volume, Volume 3.

  19. Summaries of the Seventh JPL Airborne Earth Science Workshop January 12-16, 1998. Volume 1; AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1998-01-01

    This publication contains the summaries for the Seventh JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 12-16, 1998. The main workshop is divided into three smaller workshops, and each workshop has a volume as follows: (1) Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Workshop; (2) Airborne Synthetic Aperture Radar (AIRSAR) Workshop; and (3) Thermal Infrared Multispectral Scanner (TIMS) Workshop. This Volume 1 publication contains 58 papers taken from the AVIRIS workshop.

  20. An airborne thematic thermal infrared and electro-optical imaging system

    NASA Astrophysics Data System (ADS)

    Sun, Xiuhong; Shu, Peter

    2011-08-01

    This paper describes an advanced Airborne Thematic Thermal InfraRed and Electro-Optical Imaging System (ATTIREOIS) and its potential applications. The ATTIREOIS sensor payload consists of two sets of advanced Focal Plane Arrays (FPAs) - a broadband Thermal InfraRed Sensor (TIRS) and a four (4) band Multispectral Electro-Optical Sensor (MEOS) - to approximate Landsat ETM+ bands 1, 2, 3, 4, and 6, and LDCM bands 2, 3, 4, 5, and 10+11. The airborne TIRS is a 3-axis stabilized payload capable of providing 3D photogrammetric images with a 1,850-pixel swath width via pushbroom operation. MEOS has a total of 116 million simultaneous sensor counts, capable of providing 3 cm spatial resolution multispectral orthophotos for continuous airborne mapping. ATTIREOIS is a complete standalone and easy-to-use portable imaging instrument for light aerial vehicle deployment. Its miniaturized backend data system operates all ATTIREOIS imaging sensor components, an INS/GPS, and an e-Gimbal™ Control Electronic Unit (ECU) with a data throughput of 300 Megabytes/sec. The backend provides advanced onboard processing, performing autonomous raw sensor imagery development, TIRS image track-recovery reconstruction, LWIR/VNIR multi-band co-registration, and photogrammetric image processing. With geometric optics and boresight calibrations, the ATTIREOIS data products are directly georeferenced with an accuracy of approximately one meter. A prototype ATTIREOIS has been configured, and sample LWIR/EO image data will be presented. Potential applications of ATTIREOIS include: 1) providing timely and cost-effective, precisely and directly georeferenced surface emissive and solar reflective LWIR/VNIR multispectral images via a private Google Earth Globe to enhance NASA's Earth science research capabilities; and 2) underflying satellites to support satellite measurement calibration and validation observations.

  1. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C., on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, October 25-26 (the summaries for this workshop appear in this volume, Volume 1); the Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27 (the summaries for this workshop appear in Volume 2); and the Airborne Synthetic Aperture Radar (AIRSAR) workshop, October 28-29 (the summaries for this workshop appear in Volume 3).

  2. Web camera as low cost multispectral sensor for quantification of chlorophyll in soybean leaves

    NASA Astrophysics Data System (ADS)

    Adhiwibawa, Marcelinus A.; Setiawan, Yonathan E.; Prilianti, Kestrilia R.; Brotosudarmo, Tatas H. P.

    2015-01-01

    Soybean is one of the main crops in Indonesia, but demand has not been matched by an increase in national production. One limiting factor is the availability of fertile cultivation area for soybean plantations. Indonesian farmers usually grow soybean in marginal cultivation areas, which requires varieties tolerant to environmental stresses such as drought, nutrient limitation, pests and diseases. Leaf chlorophyll content is a plant health indicator that can be used to identify stress-tolerant soybean varieties. However, soybean breeding research is hindered by manual data acquisition, which is time-consuming and labour-intensive. In this paper, the authors propose an automatic system for quantifying soybean leaf area and chlorophyll based on a low-cost multispectral sensor built from a web camera, as an indicator of soybean tolerance to environmental stress, particularly drought stress. The system acquires an image of the plant, placed in an acquisition box, from above. The image is segmented using the Normalized Difference Vegetation Index (NDVI) and quantified to yield an average NDVI value and leaf area. The acquired NDVI values showed a strong relationship with SPAD values (R² = 0.70), while the leaf area prediction had an error of 18.41%. Thus, the automated system can quantify plant data with good results.
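    The NDVI segmentation and leaf-area step can be pictured with the following sketch, assuming the modified web camera provides separate NIR and red channels and that the area covered by one pixel is known from a calibration target; the threshold value and the toy image are illustrative.

```python
import numpy as np

def leaf_stats(nir, red, pixel_area_cm2, ndvi_threshold=0.3):
    """Segment leaves by NDVI and report mean NDVI and leaf area.

    nir, red        : 2-D arrays from the modified web camera (NIR and red channels)
    pixel_area_cm2  : bench area covered by one pixel (from a calibration target)
    ndvi_threshold  : pixels above this NDVI value are counted as leaf
    """
    nir = nir.astype(float)
    red = red.astype(float)
    ndvi = (nir - red) / (nir + red + 1e-9)
    leaf_mask = ndvi > ndvi_threshold
    mean_ndvi = ndvi[leaf_mask].mean() if leaf_mask.any() else 0.0
    leaf_area = leaf_mask.sum() * pixel_area_cm2
    return mean_ndvi, leaf_area

# toy example: 100x100 image with a bright-NIR "leaf" patch in the middle
nir = np.full((100, 100), 0.2); nir[30:70, 30:70] = 0.8
red = np.full((100, 100), 0.2); red[30:70, 30:70] = 0.1
print(leaf_stats(nir, red, pixel_area_cm2=0.01))
```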

  3. Remote identification of individual volunteer cotton plants

    USDA-ARS?s Scientific Manuscript database

    Although airborne multispectral remote sensing can identify fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants that can similarly provide habitat for boll weevils. However, when consumer-grade cameras are used, each pix...

  4. Spectral difference analysis and airborne imaging classification for citrus greening infected trees

    USDA-ARS?s Scientific Manuscript database

    Citrus greening, also called Huanglongbing (HLB), became a devastating disease spread through citrus groves in Florida, since it was first found in 2005. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were acquired to detect citrus greening infected trees in 20...

  5. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing, analysis and near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely together with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in the case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  6. Airborne remote sensing in precision viticolture: assessment of quality and quantity vineyard production using multispectral imagery: a case study in Velletri, Rome surroundings (central Italy)

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Papale, Dario; Girard, Filippo; Belli, Claudio; Pietromarchi, Paolo; Tiberi, Domenico; Comandini, Maria C.

    2009-09-01

    During 2008, an experimental study was conducted to investigate the capabilities of a new airborne remote sensing platform as an aid to precision viticulture. The study was carried out on two areas located in the town of Velletri, near Rome; the acquisitions were conducted on 07-08-2008 and on 09-09-2008 using ASPIS (Advanced Spectroscopic Imager System), a new airborne multispectral sensor capable of acquiring 12 narrow spectral bands (10 nm) located in the visible and near-infrared region. Several vegetation indices, for a total of 22 independent variables, were tested for the estimation of different oenological parameters. An ANOVA test showed that several oenochemical parameters, such as sugars and acidity, differ according to the variety taken into consideration. The remotely sensed data were significantly correlated with the following oenological parameters: leaf surface exposed (SFE) (correlation coefficient R² ≈ 0.8), wood pruning (R² ≈ 0.8), reducing sugars (R² ≈ 0.6 with Root Mean Square Error ≈ 5 g/l), total acidity (R² ≈ 0.6 with RMSE ≈ 0.5 g/l), polyphenols (R² ≈ 0.9) and anthocyanin content (R² ≈ 0.89). In order to provide prescriptive thematic maps related to the oenological variables of interest, the relationships derived above were applied to the vegetation indices.

  7. Biooptical variability in the Greenland Sea observed with the Multispectral Airborne Radiometer System (MARS)

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Trees, Charles C.

    1989-01-01

    A site-specific ocean color remote sensing algorithm was developed and used to convert Multispectral Airborne Radiometer System (MARS) spectral radiance measurements to chlorophyll-a concentration profiles along aircraft tracklines in the Greenland Sea. The analysis is described and the results given in graphical or tabular form. Section 2 describes the salient characteristics and history of development of the MARS instrument. Section 3 describes the analyses of MARS flight segments over consolidated sea ice, resulting in a set of altitude dependent ratios used (over water) to estimate radiance reflected by the surface and atmosphere from total radiance measured. Section 4 presents optically weighted pigment concentrations calculated from profile data, and spectral reflectances measured in situ from the top meter of the water column; this data was analyzed to develop an algorithm relating chlorophyll-a concentrations to the ratio of radiance reflectances at 441 and 550 nm (with a selection of coefficients dependent upon whether significant gelvin presence is implied by a low ratio of reflectances at 410 and 550 nm). Section 5 describes the scaling adjustments which were derived to reconcile the MARS upwelled radiance ratios at 410:550 nm and 441:550 nm to in situ reflectance ratios measured simultaneously on the surface. Section 6 graphically presents the locations of MARS data tracklines and positions of the surface monitoring R/V. Section 7 presents stick-plots of MARS tracklines selected to illustrate two-dimensional spatial variability within the box covered by each day's flight. Section 8 presents curves of chlorophyll-a concentration profiles derived from MARS data along survey tracklines. Significant results are summarized in Section 1.
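    The band-ratio structure of such an algorithm, including the coefficient switch triggered by a low 410:550 nm ratio, can be sketched as below. The coefficients, cutoff and functional form are placeholders for illustration only; the report derives its own site-specific relationship.

```python
import numpy as np

def chl_from_ratios(r441_550, r410_550, gelvin_cutoff=0.6,
                    coeffs_clear=(0.5, -1.3), coeffs_gelvin=(0.8, -1.1)):
    """Band-ratio chlorophyll-a estimate in the spirit of the algorithm described
    above: log10(chl) is a linear function of log10(R441/R550), with the
    coefficients switched when a low R410/R550 ratio suggests significant gelvin.

    All coefficients and the cutoff are illustrative placeholders, not the
    site-specific values derived in the report.
    """
    a, b = coeffs_gelvin if r410_550 < gelvin_cutoff else coeffs_clear
    return 10.0 ** (a + b * np.log10(r441_550))

print(chl_from_ratios(r441_550=0.8, r410_550=0.9))   # "clear water" branch
print(chl_from_ratios(r441_550=0.8, r410_550=0.4))   # gelvin branch
```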

  8. Integrated active fire retrievals and biomass burning emissions using complementary near-coincident ground, airborne and spaceborne sensor data

    Treesearch

    Wilfrid Schroeder; Evan Ellicott; Charles Ichoku; Luke Ellison; Matthew B. Dickinson; Roger D. Ottmar; Craig Clements; Dianne Hall; Vincent Ambrosia; Robert Kremens

    2013-01-01

    Ground, airborne and spaceborne data were collected for a 450 ha prescribed fire implemented on 18 October 2011 at the Henry W. Coe State Park in California. The integration of various data elements allowed near-coincident active fire retrievals to be estimated. The Autonomous Modular Sensor-Wildfire (AMS) airborne multispectral imaging system was used as a bridge...

  9. Effectiveness of airborne multispectral thermal data for karst groundwater resources recognition in coastal areas

    NASA Astrophysics Data System (ADS)

    Pignatti, Stefano; Fusilli, Lorenzo; Palombo, Angelo; Santini, Federico; Pascucci, Simone

    2013-04-01

    Currently the detection, use and management of groundwater in karst regions can be considered one of the most significant means of addressing water scarcity during periods of low rainfall, because groundwater resources from karst aquifers play a key role in the water supply of karst areas worldwide [1]. In many countries of the Mediterranean area, where karst is widespread, groundwater resources are still underexploited, while surface waters are generally preferred [2]. Furthermore, carbonate aquifers constitute a crucial thermal water resource outside of volcanic areas, even though there is no detailed and reliable global assessment of thermal water resources. The composite hydrogeological characteristics of karst, particularly the directions and zones of groundwater distribution, have not yet been adequately explained [3]. For these reasons the present study analyzes the capability of high spatial resolution thermal remote sensing to detect karst water resources in coastal areas, in order to obtain useful information on karst spring flow and on other characteristics of these environments. To this purpose MIVIS [4, 5] and TASI-600 [6] airborne multispectral thermal imagery (see sensors' characteristics in Table 1), acquired over two Mediterranean coastal areas affected by karst activity, one located in Montenegro and one in Italy, were used. One study area is located in Kotor Bay, a winding bay on the Adriatic Sea surrounded by high mountains in south-western Montenegro and characterized by many subaerial and submarine coastal springs related to deep karstic channels. The other study area is located in Santa Cesarea (Italy), encompassing coastal cold springs, the main local source of high-quality water, as well as a noticeable thermal groundwater outflow. The proposed study shows the preliminary results of the two airborne deployments over these areas. The preprocessing of the multispectral thermal imagery

  10. Non-contact assessment of melanin distribution via multispectral temporal illumination coding

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.

    2015-03-01

    Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator of melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, which estimates the two-dimensional melanin distribution from its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and to determine the melanin distribution in an observed region from the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of an individual's skin type. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).
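
    A minimal sketch of the two ideas in the abstract, under assumed interfaces: the ambient contribution is removed by differencing frames captured with the coded source on and off, and a Beer-Lambert-style absorbance is then mapped to a relative melanin index. The wavelength choice and extinction coefficient are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def ambient_free_reflectance(frame_lit: np.ndarray, frame_ambient: np.ndarray,
                                 illum_reference: np.ndarray) -> np.ndarray:
        """Reflectance under the coded source only, normalized by a white-reference frame."""
        return (frame_lit - frame_ambient) / (illum_reference + 1e-9)

    def melanin_index(reflectance: np.ndarray, extinction: float = 1.0) -> np.ndarray:
        """Relative melanin map from Beer-Lambert absorbance A = -log10(R)."""
        absorbance = -np.log10(np.clip(reflectance, 1e-6, 1.0))
        return absorbance / extinction

    # Example with synthetic frames standing in for camera captures.
    lit = np.random.rand(240, 320) * 0.8 + 0.1
    ambient = np.random.rand(240, 320) * 0.05
    reference = np.ones((240, 320))
    melanin_map = melanin_index(ambient_free_reflectance(lit, ambient, reference))
    ```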

  11. SPEKTROP DPU: optoelectronic platform for fast multispectral imaging

    NASA Astrophysics Data System (ADS)

    Graczyk, Rafal; Sitek, Piotr; Stolarski, Marcin

    2010-09-01

    In recent years there has been an increasing need for high-quality Earth imaging in airborne and space applications, largely because governments and local authorities require up-to-date topographic data for administrative purposes. In addition, interest in environmental sciences, the push for ecological approaches, and efficient agriculture and forest management are heavily supported by Earth images at various resolutions and spectral ranges. This paper, "SPEKTROP DPU: Opto-electronic platform for fast multi-spectral imaging", describes the architectural details of the data processing unit, part of a universal and modular platform that provides high-quality imaging functionality in aerospace applications.

  12. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1995-01-01

    This publication is the third containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in this volume; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  13. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1995-01-01

    This publication is the first of three containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in this volume; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in Volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  14. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1995-01-01

    This publication is the second volume of the summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop on January 25-26. The summaries for this workshop appear in Volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop on January 26. The summaries for this workshop appear in this volume.

  15. Using Multitemporal and Multispectral Airborne Lidar to Assess Depth of Peat Loss and Correspondence With a New Active Normalized Burn Ratio for Wildfires

    NASA Astrophysics Data System (ADS)

    Chasmer, L. E.; Hopkinson, C. D.; Petrone, R. M.; Sitar, M.

    2017-12-01

    The accuracy of depth of burn (an indicator of consumption) in peatland soils, estimated from prefire and postfire airborne light detection and ranging (lidar) data, is determined within a wetland-upland forest environment near Fort McMurray, Alberta, Canada. The relationship between peat soil burn depth and an "active" normalized burn ratio (ANBR) is also examined beneath partially and fully burned forest and understory canopies, using state-of-the-art active reflectance from a multispectral lidar compared with the normalized burn ratio (NBR) derived from Landsat 7 ETM+. We find significant correspondence between depth of burn, lidar-derived ANBR, and differenced NBR (dNBR) from Landsat. However, low-resolution optical imagery excludes peatland burn losses in transition zones, which are highly sensitive to peat loss via combustion. The findings presented here illustrate the utility of this new remote sensing technology for expanding an area of research in which it has previously been challenging to spatially detect and quantify such wildfire burn losses.
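
    For reference, the passive burn-ratio indices mentioned above are simple band arithmetic; the sketch below shows the standard NBR and dNBR computed from near-infrared and shortwave-infrared reflectance (as from Landsat). The lidar-based "active" ANBR used in the study substitutes calibrated multispectral lidar return intensities for these bands; its exact band choice is not reproduced here.

    ```python
    import numpy as np

    def nbr(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
        """Normalized burn ratio from near-infrared and shortwave-infrared reflectance."""
        return (nir - swir) / (nir + swir + 1e-9)

    def dnbr(nir_pre, swir_pre, nir_post, swir_post):
        """Differenced NBR: prefire minus postfire (larger values indicate more severe burning)."""
        return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
    ```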

  16. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    NASA Astrophysics Data System (ADS)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the Wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm applied to LROC MS images. The hybrid method provides the best HRMS product, both in terms of spatial resolution and preservation of spectral details. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and John L. Van Genderen. "Review article: multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens. 70.6 (2004): 657-661. [3] Mahanti, Prasun, et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." XXIII ISPRS Congress Archives (2016).
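
    The sketch below illustrates only the generic IHS-style component-substitution step that the hybrid algorithm builds on (it is not the authors' hybrid IHS-Wavelet method): the intensity of an upsampled 3-band MS image is replaced by a histogram-matched higher-resolution Pan band. The array shapes and the simple mean-intensity choice are assumptions.

    ```python
    import numpy as np

    def ihs_fusion(ms_rgb: np.ndarray, pan: np.ndarray) -> np.ndarray:
        """Component-substitution pansharpening of a (H, W, 3) MS image with a (H, W) Pan band."""
        intensity = ms_rgb.mean(axis=2, keepdims=True)
        # Match the Pan band's mean and spread to the intensity component.
        pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-9)) + intensity.mean()
        return np.clip(ms_rgb + (pan_matched[..., None] - intensity), 0.0, 1.0)

    ms = np.random.rand(64, 64, 3)   # MS image already resampled to the Pan grid (placeholder)
    pan = np.random.rand(64, 64)     # higher-resolution panchromatic band (placeholder)
    hrms = ihs_fusion(ms, pan)
    ```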

  17. An automated geometric correction system for airborne multispectral scanner imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis-King, E.; Tinney, L.; Brickey, D.

    1996-10-01

    The United States Department of Energy (USDOE) maintains a Remote Sensing Laboratory (RSL) to support nuclear-related programs of the US Government. The mission of the organization includes both emergency response and more routine environmental assessments of nuclear facilities. The USDOE RSL maintains a small fleet of specially equipped aircraft that are used as platforms for remote sensor systems. The aircraft include helicopters, light aircraft, and a business jet suitable for high-altitude acquisitions. Multispectral scanners flown on these platforms are subject to geometric distortions related to variations in aircraft orientation (pitch, roll, and yaw), position, and velocity during data acquisitions.

  18. Red to far-red multispectral fluorescence image fusion for detection of fecal contamination on apples

    USDA-ARS?s Scientific Manuscript database

    This research developed a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet/blue LED excitation for detection of fecal contamination on Golden Delicious apples. Using a hyperspectral line-scan imaging system consisting of an EMCCD camera, spectrograph, an...

  19. Assessing exergy of forest ecosystem using airborne and satellite data

    NASA Astrophysics Data System (ADS)

    Brovkina, Olga; Fabianek, Tomas; Lukes, Petr; Zemek, Frantisek

    2017-04-01

    The interactions of a forest ecosystem's energy flows with the environment are shaped by forest structure, functions, and pathways of self-control. Following recent thermodynamic theory for open systems, the concept of exergy of solar radiation has been applied to estimate the energy consumed by evapotranspiration and biomass production in forest ecosystems, and to indicate forest decline and the impact of human land use on ecosystem stability. However, most methods for exergy estimation in forest ecosystems are not stable, and their physical meaning remains superficial. This study aimed to contribute to the understanding of forest ecosystem exergy using a combination of remote sensing (RS) and eddy covariance technologies, specifically (1) to explore the exergy of solar radiation as a function of the structure of the solar spectrum (the number of spectral bands of the RS data), and (2) to explore the relationship between exergy and flux tower eddy covariance measurements. Two forest study sites were located in the Western Beskids in the Czech Republic: the first dominated by young Norway spruce, the second by mature European beech. Airborne hyperspectral data in the VNIR, SWIR and TIR spectral regions were acquired nine times over the study sites during the vegetation periods of 2015-2016, and radiometric, geometric and atmospheric corrections were applied. Twenty-one cloud-free Landsat-8 multispectral scenes were downloaded and atmospherically corrected for the period from April to November of 2015-2016. Evapotranspiration and latent heat fluxes were collected from operating flux towers located at the study sites, matched to the date and time of the remote sensing acquisitions. Exergy was calculated for each satellite and airborne scene using various combinations of spectral bands as Ex = Eout (K + ln(Eout/Ein)) + R, where Ein is the incoming solar energy, Eout is the reflected solar energy, R = Ein - Eout is the absorbed energy, Eout/Ein is the albedo, and K is the Kullback increment
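
    A direct numerical reading of the stated formula, assuming the Kullback increment K has already been computed from the spectral distributions of the incoming and reflected radiation (that computation is not detailed in the abstract). The numbers in the example call are placeholders.

    ```python
    import numpy as np

    def exergy(e_in: float, e_out: float, kullback_increment: float) -> float:
        """Exergy of reflected solar radiation: Ex = Eout*(K + ln(Eout/Ein)) + R, with R = Ein - Eout."""
        absorbed = e_in - e_out
        return e_out * (kullback_increment + np.log(e_out / e_in)) + absorbed

    print(exergy(e_in=800.0, e_out=120.0, kullback_increment=0.35))  # W m^-2, assumed values
    ```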

  20. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  1. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  2. Application of multispectral color photography to flame flow visualization

    NASA Technical Reports Server (NTRS)

    Stoffers, G.

    1979-01-01

    For flames of short duration and low radiation intensity, spectroscopic flame diagnostics is difficult. In order to find other means of extracting information about the flame structure from its radiation, the feasibility of using multispectral color photography was successfully evaluated. Since the flame photographs are close-ups, there is considerable parallax between the individual images when several cameras are used, and additive color viewing is not possible. Because each image must be analyzed individually, it is advisable to use color film in all cameras. One can either use color films of different spectral sensitivities or color films of the same type with different color filters. Sharp-cutting filters are recommended.

  3. Multispectral and polarimetric photodetection using a plasmonic metasurface

    NASA Astrophysics Data System (ADS)

    Pelzman, Charles; Cho, Sang-Yeon

    2018-01-01

    We present a metasurface-integrated Si 2-D CMOS sensor array for multispectral and polarimetric photodetection applications. The demonstrated sensor is based on the polarization selective extraordinary optical transmission from periodic subwavelength nanostructures, acting as artificial atoms, known as meta-atoms. The meta-atoms were created by patterning periodic rectangular apertures that support optical resonance at the designed spectral bands. By spatially separating meta-atom clusters with different lattice constants and orientations, the demonstrated metasurface can convert the polarization and spectral information of an optical input into a 2-D intensity pattern. As a proof-of-concept experiment, we measured the linear components of the Stokes parameters directly from captured images using a CMOS camera at four spectral bands. Compared to existing multispectral polarimetric sensors, the demonstrated metasurface-integrated CMOS system is compact and does not require any moving components, offering great potential for advanced photodetection applications.

  4. High resolution multispectral photogrammetric imagery: enhancement, interpretation and evaluations

    NASA Astrophysics Data System (ADS)

    Roberts, Arthur; Haefele, Martin; Bostater, Charles; Becker, Thomas

    2007-10-01

    A variety of aerial mapping cameras were adapted and developed into simulated multiband digital photogrammetric mapping systems. Direct digital multispectral systems, two multiband cameras (IIS 4-band and Itek 9-band), and paired mapping and reconnaissance cameras were evaluated for digital spectral performance and photogrammetric mapping accuracy in an aquatic environment. The aerial films (24 cm x 24 cm format) tested were Agfa color negative and extended-red (visible and near-infrared) panchromatic, and Kodak color infrared and B&W (visible and near-infrared) infrared. All films were negative processed to published standards and digitally converted at either 16 (color) or 10 (B&W) microns. Excellent precision in the digital conversions was obtained, with scanning errors of less than one micron. Radiometric data conversion was undertaken using linear density conversion and centered 8-bit histogram exposure. This resulted in multiple 8-bit spectral image bands that were unaltered (not radiometrically enhanced) "optical count" conversions of film density. This provided the best film density conversion to a digital product while retaining the original film density characteristics. Data covering water depth, water quality, surface roughness, and bottom substrate were acquired using different measurement techniques, as well as different techniques to locate sampling points on the imagery. Despite extensive efforts to obtain accurate ground truth data, location errors, measurement errors, and variations in the correlation between water depth and the remotely sensed signal persisted. These errors must be considered endemic and may not be removed even through the most elaborate sampling setup. Results indicate that multispectral photogrammetric systems offer improved feature mapping capability.

  5. Monitoring Ephemeral Streams Using Airborne Very High Resolution Multispectral Remote Sensing in Arid Environments

    NASA Astrophysics Data System (ADS)

    Hamada, Y.; O'Connor, B. L.

    2012-12-01

    Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the necessary

  6. Sun and aureole spectrometer for airborne measurements to derive aerosol optical properties.

    PubMed

    Asseng, Hagen; Ruhtz, Thomas; Fischer, Jürgen

    2004-04-01

    We have designed an airborne spectrometer system for the simultaneous measurement of the direct Sun irradiance and aureole radiance. The instrument is based on diffraction grating spectrometers with linear image sensors. It is robust, lightweight, compact, and reliable, characteristics that are important for airborne applications. The multispectral radiation measurements are used to derive optical properties of tropospheric aerosols. We extract the altitude dependence of the aerosol volume scattering function and of the aerosol optical depth by using flight patterns with descents and ascents ranging from the surface level to the top of the boundary layer. The extinction coefficient and the product of single scattering albedo and phase function of separate layers can be derived from the airborne measurements.

  7. Synthesis of Multispectral Bands from Hyperspectral Data: Validation Based on Images Acquired by AVIRIS, Hyperion, ALI, and ETM+

    NASA Technical Reports Server (NTRS)

    Blonksi, Slawomir; Gasser, Gerald; Russell, Jeffrey; Ryan, Robert; Terrie, Greg; Zanoni, Vicki

    2001-01-01

    Multispectral data requirements for Earth science applications are not always rigorously studied before a new remote sensing system is designed. A study of the spatial resolution, spectral bandpasses, and radiometric sensitivity requirements of real-world applications would focus the design on providing maximum benefit to the end-user community. To support systematic studies of multispectral data requirements, the Applications Research Toolbox (ART) has been developed at NASA's Stennis Space Center. The ART software allows users to create and assess simulated datasets while varying a wide range of system parameters. The simulations are based on data acquired by existing multispectral and hyperspectral instruments, and the produced datasets can be further evaluated for specific end-user applications. Spectral synthesis of multispectral images from hyperspectral data is a key part of the ART software. In this process, hyperspectral image cubes are transformed into multispectral imagery without changes in spatial sampling and resolution. The transformation algorithm takes into account the spectral responses of both the synthesized, broad, multispectral bands and the utilized, narrow, hyperspectral bands. To validate the spectral synthesis algorithm, simulated multispectral images were compared with images collected near-coincidentally by the Landsat 7 ETM+ and EO-1 ALI instruments. Hyperspectral images acquired with the airborne AVIRIS instrument and with the Hyperion instrument onboard the EO-1 satellite were used as input data for the presented simulations.
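
    A minimal sketch of the core spectral-synthesis step described above: each broad multispectral band is formed as a response-weighted average of the narrow hyperspectral bands, leaving spatial sampling untouched. The function names, the Gaussian response, and the cube dimensions are assumptions for illustration; the ART algorithm also accounts for the narrow bands' own responses, which is omitted here.

    ```python
    import numpy as np

    def synthesize_broad_band(hyper_cube, hyper_centers_nm, broad_response):
        """Response-weighted average of narrow bands; hyper_cube is (H, W, B)."""
        weights = np.array([broad_response(w) for w in hyper_centers_nm])
        weights = weights / (weights.sum() + 1e-12)
        return np.tensordot(hyper_cube, weights, axes=([2], [0]))

    # Assumed Gaussian response for a broad red band centered at 660 nm (60 nm FWHM).
    red_response = lambda w: np.exp(-0.5 * ((w - 660.0) / (60.0 / 2.355)) ** 2)

    cube = np.random.rand(64, 64, 224)                    # placeholder AVIRIS-like cube
    centers = np.linspace(400.0, 2500.0, cube.shape[2])   # placeholder band centers (nm)
    red_band = synthesize_broad_band(cube, centers, red_response)
    ```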

  8. Mapping giant reed along the Rio Grande using airborne and satellite imagery

    USDA-ARS?s Scientific Manuscript database

    Giant reed (Arundo donax L.) is a perennial invasive weed that presents a severe threat to agroecosystems and riparian areas in the Texas and Mexican portions of the Rio Grande Basin. The objective of this presentation is to give an overview on the use of aerial photography, airborne multispectral a...

  9. Forest Stand Canopy Structure Attribute Estimation from High Resolution Digital Airborne Imagery

    Treesearch

    Demetrios Gatziolis

    2006-01-01

    A study of forest stand canopy variable assessment using digital, airborne, multispectral imagery is presented. Variable estimation involves stem density, canopy closure, and mean crown diameter, and it is based on quantification of spatial autocorrelation among pixel digital numbers (DN) using variogram analysis and an alternative, non-parametric approach known as...

  10. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data

    PubMed Central

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to establish the value of a machine vision approach for the classification of five types of land cover: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover types accurately. Multispectral data for these land covers were acquired with a handheld multispectral radiometer in five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features per image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). Using an 80-20 cross-validation split, we obtained accuracies of 91.332% for the texture data and 96.40% for the multispectral data.
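
    A minimal sketch of the final classification step only, assuming the feature matrix (texture or multispectral) has already been reduced to its most discriminant columns: an 80-20 split and a small neural network classifier, here via scikit-learn rather than whatever ANN implementation the authors used. The random data and network size are placeholders.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    X = np.random.rand(500, 30)              # placeholder feature matrix (30 selected features)
    y = np.random.randint(0, 5, size=500)    # five land cover classes, as in the study

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    scaler = StandardScaler().fit(X_train)
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
    clf.fit(scaler.transform(X_train), y_train)
    print("held-out accuracy:", clf.score(scaler.transform(X_test), y_test))
    ```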

  11. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1993-01-01

    This is volume 2 of a three volume set of publications that contain the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on October 25-26. The summaries for this workshop appear in Volume 1. The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27. The summaries for this workshop appear in Volume 2. The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29. The summaries for this workshop appear in Volume 3.

  12. Common aperture multispectral spotter camera: Spectro XR

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Freiman, Dov; Diamant, Idan; Giladi, Shira; Leibovich, Maor

    2017-10-01

    The Spectro XRTM is an advanced color/NIR/SWIR/MWIR 16'' payload recently developed by Elbit Systems / ELOP. The payload's primary sensor is a spotter camera with a common 7'' aperture. The sensor suite also includes an MWIR zoom, an EO zoom, a laser designator or rangefinder, a laser pointer / illuminator, and a laser spot tracker. A rigid structure, vibration damping, and 4-axis gimbals enable a high level of line-of-sight stabilization. The payload's features include a multi-target video tracker, precise boresight, a strap-on IMU, an embedded moving map, a geodetic calculation suite, and image fusion. The paper describes the main technical characteristics of the spotter camera. A visible-quality, all-metal front catadioptric telescope maintains optical performance over a wide range of environmental conditions. High-efficiency coatings separate the incoming light into EO, SWIR and MWIR band channels. Both the EO and SWIR bands have dual fields of view and 3 spectral filters each. Several variants of focal plane array formats are supported. The common aperture design provides superior DRI performance in EO and SWIR in comparison to conventionally configured payloads. Special spectral calibration and color correction extend the effective range of color imaging. An advanced CMOS FPA and the low F-number of the optics facilitate low-light performance. The SWIR band provides further atmospheric penetration, as well as a see-spot capability at especially long ranges due to asynchronous pulse detection. The MWIR band has good sharpness over the entire field of view and (with a full HD FPA) delivers a level of detail far exceeding that of VGA-equipped FLIRs. The Spectro XR offers a level of performance typically associated with larger and heavier payloads.

  13. Cinematic camera emulation using two-dimensional color transforms

    NASA Astrophysics Data System (ADS)

    McElvain, Jon S.; Gish, Walter

    2015-02-01

    For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional motion picture camera to establish a particular look, the use of a smaller form factor DSLR is considered for this purpose because of its increased agility. Because the spectral response characteristics differ between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics of both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
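
    The paper's contribution is the 2D transform; the sketch below shows only the 3x3 matrix baseline it is compared against, fitted by least squares from paired source- and destination-camera responses (e.g., both cameras' measured sensitivities applied to a multispectral image database). The random arrays stand in for such paired responses.

    ```python
    import numpy as np

    src = np.random.rand(1000, 3)    # placeholder source-camera (DSLR) raw RGB responses
    dst = np.random.rand(1000, 3)    # placeholder destination-camera (cinema) responses

    M, *_ = np.linalg.lstsq(src, dst, rcond=None)   # (3, 3) emulation matrix
    emulated = src @ M
    print("RMSE:", np.sqrt(np.mean((emulated - dst) ** 2)))
    ```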

  14. Pre-flight and On-orbit Geometric Calibration of the Lunar Reconnaissance Orbiter Camera

    NASA Astrophysics Data System (ADS)

    Speyerer, E. J.; Wagner, R. V.; Robinson, M. S.; Licht, A.; Thomas, P. C.; Becker, K.; Anderson, J.; Brylow, S. M.; Humm, D. C.; Tschimmel, M.

    2016-04-01

    The Lunar Reconnaissance Orbiter Camera (LROC) consists of two imaging systems that provide multispectral and high resolution imaging of the lunar surface. The Wide Angle Camera (WAC) is a seven-color push-frame imager with a 90° field of view in monochrome mode and a 60° field of view in color mode. From the nominal 50 km polar orbit, the WAC acquires images with a nadir ground sampling distance of 75 m for each of the five visible bands and 384 m for the two ultraviolet bands. The Narrow Angle Camera (NAC) consists of two identical cameras capable of acquiring images with a ground sampling distance of 0.5 m from an altitude of 50 km. The LROC team geometrically calibrated each camera before launch at Malin Space Science Systems in San Diego, California, and the resulting measurements enabled the generation of a detailed camera model for all three cameras. The cameras were mounted and subsequently launched on the Lunar Reconnaissance Orbiter (LRO) on 18 June 2009. Using a subset of the over 793,000 NAC and 207,000 WAC images of illuminated terrain collected between 30 June 2009 and 15 December 2013, we improved the interior and exterior orientation parameters for each camera, including the addition of a wavelength-dependent radial distortion model for the multispectral WAC. These geometric refinements, along with refined ephemeris, enable seamless projections of NAC image pairs with a geodetic accuracy better than 20 meters, and sub-pixel precision and accuracy when orthorectifying WAC images.

  15. The NASA 2003 Mars Exploration Rover Panoramic Camera (Pancam) Investigation

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Morris, R. V.; Athena Team

    2002-12-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover. Pancam utilizes two 1024x2048 Mitel frame transfer CCD detector arrays, each having a 1024x1024 active imaging area and 32 optional additional reference pixels per row for offset monitoring. Each array is combined with optics and a small filter wheel to become one "eye" of a multispectral, stereoscopic imaging system. The optics for both cameras consist of identical 3-element symmetrical lenses with an effective focal length of 42 mm and a focal ratio of f/20, yielding an IFOV of 0.28 mrad/pixel or a rectangular FOV of 16° x 16° per eye. The two eyes are separated by 30 cm horizontally and have a 1° toe-in to provide adequate parallax for stereo imaging. The cameras are boresighted with adjacent wide-field stereo Navigation Cameras, as well as with the Mini-TES instrument. The Pancam optical design is optimized for best focus at 3 meters range, and allows Pancam to maintain acceptable focus from infinity to within 1.5 meters of the rover, with a graceful degradation (defocus) at closer ranges. Each eye also contains a small 8-position filter wheel to allow multispectral sky imaging, direct Sun imaging, and surface mineralogic studies in the 400-1100 nm wavelength region. Pancam has been designed and calibrated to operate within specifications from -55°C to +5°C. An onboard calibration target and fiducial marks provide
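
    A quick consistency check of the optical figures quoted above (focal length, IFOV, and detector format); the pixel count per eye is taken from the stated 1024x1024 active area.

    ```python
    import numpy as np

    focal_length_mm = 42.0   # stated effective focal length
    ifov_mrad = 0.28         # stated instantaneous field of view per pixel
    pixels = 1024            # active pixels across one eye

    pixel_pitch_um = focal_length_mm * ifov_mrad        # mm * mrad -> micrometers (~12 um)
    fov_deg = np.degrees(pixels * ifov_mrad * 1e-3)     # ~16 degrees, matching the abstract
    print(f"pixel pitch ~ {pixel_pitch_um:.1f} um, FOV ~ {fov_deg:.1f} deg")
    ```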

  16. Remote Sensing of Liquid Water and Ice Cloud Optical Thickness and Effective Radius in the Arctic: Application of Airborne Multispectral MAS Data

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Yang, Ping; Arnold, G. Thomas; Gray, Mark A.; Riedi, Jerome C.; Ackerman, Steven A.; Liou, Kuo-Nan

    2003-01-01

    A multispectral scanning spectrometer was used to obtain measurements of the reflection function and brightness temperature of clouds, sea ice, snow, and tundra surfaces at 50 discrete wavelengths between 0.47 and 14.0 microns. These observations were obtained from the NASA ER-2 aircraft as part of the FIRE Arctic Clouds Experiment, conducted over a 1600 x 500 km region of the north slope of Alaska and surrounding Beaufort and Chukchi Seas between 18 May and 6 June 1998. Multispectral images of the reflection function and brightness temperature in 11 distinct bands of the MODIS Airborne Simulator (MAS) were used to derive a confidence in clear sky (or alternatively the probability of cloud), shadow, and heavy aerosol over five different ecosystems. Based on the results of individual tests run as part of the cloud mask, an algorithm was developed to estimate the phase of the clouds (water, ice, or undetermined phase). Finally, the cloud optical thickness and effective radius were derived for both water and ice clouds that were detected during one flight line on 4 June. This analysis shows that the cloud mask developed for operational use on MODIS, and tested using MAS data in Alaska, is quite capable of distinguishing clouds from bright sea ice surfaces during daytime conditions in the high Arctic. Results of individual tests, however, make it difficult to distinguish ice clouds over snow and sea ice surfaces, so additional tests were added to enhance the confidence in the thermodynamic phase of clouds over the Beaufort Sea. The cloud optical thickness and effective radius retrievals used 3 distinct bands of the MAS, with the newly developed 1.62 and 2.13 micron bands being used quite successfully over snow and sea ice surfaces. These results are contrasted with a MODIS-based algorithm that relies on spectral reflectance at 0.87 and 2.13 micron.

  17. Airborne hyperspectral remote sensing in Italy

    NASA Astrophysics Data System (ADS)

    Bianchi, Remo; Marino, Carlo M.; Pignatti, Stefano

    1994-12-01

    The Italian National Research Council (CNR), in the framework of its 'Strategic Project for Climate and Environment in Southern Italy', established a new laboratory for airborne hyperspectral imaging devoted to environmental problems. Since the end of June 1994, the LARA (Laboratorio Aereo per Ricerche Ambientali -- Airborne Laboratory for Environmental Studies) Project has been fully operative, providing hyperspectral data to the national and international scientific community by means of deployments of its CASA-212 aircraft carrying the Daedalus AA5000 MIVIS (multispectral infrared and visible imaging spectrometer) system. MIVIS is a modular instrument consisting of 102 spectral channels that use independent optical sensors, simultaneously sampled and recorded onto a compact computer-compatible magnetic tape medium with a data capacity of 10.2 Gbytes. To support the preprocessing and production pipeline for the large hyperspectral data sets, CNR set up in Pomezia, a town close to Rome, a ground-based computer system with software designed to handle MIVIS data. The software (MIDAS - Multispectral Interactive Data Analysis System), besides managing data production, gives users a powerful and highly extensible hyperspectral analysis system. The Pomezia ground station is designed to maintain and check the MIVIS instrument performance through the evaluation of data quality (such as spectral accuracy, signal-to-noise performance, signal variations, etc.), and to produce, archive, and distribute MIVIS data in the form of geometrically and radiometrically corrected data sets on low-cost and easy-access CC media.

  18. Multispectral Thermal Infrared Mapping of Sulfur Dioxide Plumes: A Case Study from the East Rift Zone of Kilauea Volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, V. J.; Sutton, A. J.; Elias, T.

    1996-01-01

    The synoptic perspective and rapid mode of data acquisition provided by remote sensing are well-suited for the study of volcanic SO2 plumes. In this paper we describe a plume-mapping procedure that is based on image data acquired with NASA's airborne Thermal Infrared Multispectral Scanner (TIMS).

  19. A comparison of digital multi-spectral imagery versus conventional photography for mapping seagrass in Indian River Lagoon, Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virnstein, R.; Tepera, M.; Beazley, L.

    1997-06-01

    A pilot study is very briefly summarized in the article. The study tested the potential of multi-spectral digital imagery for discrimination of seagrass densities and species, algae, and bottom types. Imagery was obtained with the Compact Airborne Spectral Imager (casi), with two flight lines flown in hyperspectral mode. The photogrammetric method used allowed interpretation of the highest quality product, eliminating limitations caused by outdated or poor quality base maps and the errors associated with transfer of polygons. Initial image analysis indicates that the multi-spectral imagery has several advantages, including sophisticated spectral signature recognition and classification, ease of geo-referencing, and rapid mosaicking.

  20. The Beagle 2 Stereo Camera System: Scientific Objectives and Design Characteristics

    NASA Astrophysics Data System (ADS)

    Griffiths, A.; Coates, A.; Josset, J.; Paar, G.; Sims, M.

    2003-04-01

    The Stereo Camera System (SCS) will provide wide-angle (48 degree) multi-spectral stereo imaging of the Beagle 2 landing site in Isidis Planitia with an angular resolution of 0.75 milliradians. Based on the SpaceX Modular Micro-Imager, the SCS is composed of twin cameras (with 1024 by 1024 pixel frame transfer CCD) and twin filter wheel units (with a combined total of 24 filters). The primary mission objective is to construct a digital elevation model of the area in reach of the lander’s robot arm. The SCS specifications and following baseline studies are described: Panoramic RGB colour imaging of the landing site and panoramic multi-spectral imaging at 12 distinct wavelengths to study the mineralogy of landing site. Solar observations to measure water vapour absorption and the atmospheric dust optical density. Also envisaged are multi-spectral observations of Phobos &Deimos (observations of the moons relative to background stars will be used to determine the lander’s location and orientation relative to the Martian surface), monitoring of the landing site to detect temporal changes, observation of the actions and effects of the other PAW experiments (including rock texture studies with a close-up-lens) and collaborative observations with the Mars Express orbiter instrument teams. Due to be launched in May of this year, the total system mass is 360 g, the required volume envelope is 747 cm^3 and the average power consumption is 1.8 W. A 10Mbit/s RS422 bus connects each camera to the lander common electronics.

  1. A new method of building footprints detection using airborne laser scanning data and multispectral image

    NASA Astrophysics Data System (ADS)

    Luo, Yiping; Jiang, Ting; Gao, Shengli; Wang, Xin

    2010-10-01

    This paper presents a new approach to detecting building footprints from a combination of a registered aerial image with multispectral bands and airborne laser scanning data, obtained synchronously by a Leica-Geosystems ALS40 and an Applanix DACS-301 on the same platform. A two-step method for building detection is presented, consisting of selecting 'building' candidate points and then classifying them. A digital surface model (DSM) derived from last-pulse laser scanning data was first filtered, and the laser points were classified into the classes 'ground' and 'building or tree' using a mathematical morphological filter. The 'ground' points were then resampled into a digital elevation model (DEM), and a normalized DSM (nDSM) was generated from the DEM and DSM. Candidate points were selected from the 'building or tree' points by height value and area threshold in the nDSM, and were further classified into building points and tree points using a support vector machine (SVM) classifier. Two classification tests were carried out, using features only from the laser scanning data and using associated features from both input data sources. The features included height, height finite difference, RGB band values, and so on. The RGB values of the points were obtained by matching the laser scanning data and the image using the collinearity equations. The features of the training points were used as input to the SVM, and cross-validation was used to select the best classification parameters. The decision function was constructed from these parameters, and the class of each candidate point was determined by that function. The results showed that the associated features from both input data sources were superior to features from the laser scanning data alone. An accuracy of more than 90% was achieved for buildings with the first feature set.
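
    A minimal sketch of the candidate-selection step described above, under assumed names and thresholds: the normalized DSM is the last-pulse surface model minus the interpolated ground model, and cells above a height threshold become 'building or tree' candidates. The subsequent SVM separation of buildings from trees is not shown.

    ```python
    import numpy as np

    def normalized_dsm(dsm: np.ndarray, dem: np.ndarray) -> np.ndarray:
        """Object heights above ground (nDSM = DSM - DEM)."""
        return dsm - dem

    def candidate_mask(ndsm: np.ndarray, min_height_m: float = 2.5) -> np.ndarray:
        """Boolean mask of cells tall enough to be buildings or trees (threshold is an assumption)."""
        return ndsm >= min_height_m

    dsm = np.random.rand(100, 100) * 20.0     # placeholder last-pulse surface heights
    dem = np.random.rand(100, 100) * 2.0      # placeholder ground heights
    candidates = candidate_mask(normalized_dsm(dsm, dem))
    ```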

  2. Classification of human carcinoma cells using multispectral imagery

    NASA Astrophysics Data System (ADS)

    Çinar, Umut; Y. Çetin, Yasemin; Çetin-Atalay, Rengul; Çetin, Enis

    2016-03-01

    In this paper, we present a technique for automatically classifying human carcinoma cell images using textural features. An image dataset containing microscopy biopsy images from different patients for 14 distinct cancer cell line types is studied. The images are captured using an RGB camera attached to an inverted microscopy device. Texture-based Gabor features are extracted from the multispectral input images, and an SVM classifier is used to generate a descriptive model for the purpose of cell line classification. The experimental results show satisfactory performance, and the proposed method is versatile across various microscopy magnification options.

  3. Automated oil spill detection with multispectral imagery

    NASA Astrophysics Data System (ADS)

    Bradford, Brian N.; Sanchez-Reyes, Pedro J.

    2011-06-01

    In this publication we present an automated detection method for ocean surface oil, like that which existed in the Gulf of Mexico as a result of the April 20, 2010 Deepwater Horizon drilling rig explosion. Regions of surface oil in airborne imagery are isolated using red, green, and blue bands from multispectral data sets. The oil shape isolation procedure involves a series of image processing functions to draw out the visual phenomenological features of the surface oil. These functions include selective color band combinations, contrast enhancement and histogram warping. An image segmentation process then separates out contiguous regions of oil to provide a raster mask to an analyst. We automate the detection algorithm to allow large volumes of data to be processed in a short time period, which can provide timely oil coverage statistics to response crews. Geo-referenced and mosaicked data sets enable the largest identified oil regions to be mapped to exact geographic coordinates. In our simulation, multispectral imagery came from multiple sources including first-hand data collected from the Gulf. Results of the simulation show the oil spill coverage area as a raster mask, along with histogram statistics of the oil pixels. A rough square footage estimate of the coverage is reported if the image ground sample distance is available.
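
    A minimal sketch of the isolation pipeline described above: combine the RGB bands to emphasize the oil's visual signature, stretch the contrast, threshold to a raster mask, and report coverage from the ground sample distance. The band weights and threshold are invented placeholders, not the values used in the publication.

    ```python
    import numpy as np

    def oil_mask(red, green, blue, threshold=0.6):
        """Boolean raster mask of likely surface-oil pixels."""
        feature = 0.6 * red + 0.3 * green - 0.4 * blue                    # assumed band combination
        stretched = (feature - feature.min()) / (np.ptp(feature) + 1e-9)  # contrast stretch to [0, 1]
        return stretched > threshold

    def coverage_m2(mask, gsd_m):
        """Rough area estimate from pixel count and ground sample distance (meters)."""
        return mask.sum() * gsd_m ** 2

    r, g, b = (np.random.rand(256, 256) for _ in range(3))   # placeholder bands
    mask = oil_mask(r, g, b)
    print("approx. coverage:", coverage_m2(mask, gsd_m=0.5), "m^2")
    ```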

  4. Lossless compression algorithm for multispectral imagers

    NASA Astrophysics Data System (ADS)

    Gladkova, Irina; Grossberg, Michael; Gottipati, Srikanth

    2008-08-01

    Multispectral imaging is becoming an increasingly important tool for monitoring the Earth and its environment from spaceborne and airborne platforms. Multispectral imaging data consist of visible and IR measurements of a scene across space and spectrum. Growing data rates resulting from faster scanning and finer spatial and spectral resolution make compression an increasingly critical tool for reducing data volume for transmission and archiving. Research for NOAA NESDIS has been directed at determining, for satellite atmospheric Earth science imager data, what lossless compression ratio can be obtained, as well as the types of mathematics and approaches that can come close to the data's entropy level. Conventional lossless methods do not achieve the theoretical limits for lossless compression of imager data as estimated from the Shannon entropy. In a previous paper, the authors introduced a lossless compression algorithm developed for MODIS as a proxy for future NOAA-NESDIS satellite-based Earth science multispectral imagers such as GOES-R. The algorithm is based on capturing spectral correlations using spectral prediction, and spatial correlations with a linear transform encoder. In decompression, the algorithm uses a statistically computed look-up table to iteratively predict each channel from a channel decompressed in the previous iteration. In this paper we present a new approach that fundamentally differs from our prior work: instead of a single predictor for each pair of bands, we introduce a piecewise spatially varying predictor, which significantly improves the compression results. Our new algorithm also optimizes the sequence of channels used for prediction. Our results are evaluated by comparison with a state-of-the-art wavelet-based image compression scheme, JPEG 2000. We present results on the 14-channel subset of the MODIS imager, which serves as a proxy for the GOES-R imager. We

  5. Multi-Angle Imager for Aerosols (MAIA) Investigation of Airborne Particle Health Impacts

    NASA Astrophysics Data System (ADS)

    Diner, D. J.

    2016-12-01

    Airborne particulate matter (PM) is a well-known cause of heart disease, cardiovascular and respiratory illness, low birth weight, and lung cancer. The Global Burden of Disease (GBD) Study ranks PM as a major environmental risk factor worldwide. Global maps of PM2.5 concentrations derived from satellite instruments, including MISR and MODIS, have provided key contributions to the GBD and many other health-related investigations. Although it is well established that PM exposure increases the risks of mortality and morbidity, our understanding of the relative toxicity of specific PM types is relatively poor. To address this, the Multi-Angle Imager for Aerosols (MAIA) investigation was proposed to NASA's third Earth Venture Instrument (EVI-3) solicitation. The satellite instrument that is part of the investigation is a multiangle, multispectral, and polarimetric camera system based on the first and second generation Airborne Multiangle SpectroPolarimetric Imagers, AirMSPI and AirMSPI-2. MAIA was selected for funding in March 2016. Estimates of the abundances of different aerosol types from the WRF-Chem model will be combined with MAIA instrument data. Geostatistical models derived from collocated surface and MAIA retrievals will then be used to relate retrieved fractional column aerosol optical depths to near-surface concentrations of major PM constituents, including sulfate, nitrate, organic carbon, black carbon, and dust. Epidemiological analyses of geocoded birth, death, and hospital records will be used to associate exposure to PM types with adverse health outcomes. MAIA launch is planned for early in the next decade. The MAIA instrument incorporates a pair of cameras on a two-axis gimbal to provide regional multiangle observations of selected, globally distributed target areas. Primary Target Areas (PTAs) on five continents are chosen to include major population centers covering a range of PM concentrations and particle types, surface-based aerosol sunphotometers

  6. Application of airborne remote sensing to the ancient Pompeii site

    NASA Astrophysics Data System (ADS)

    Vitiello, Fausto; Giordano, Antonio; Borfecchia, Flavio; Martini, Sandro; De Cecco, Luigi

    1996-12-01

    The ancient Pompeii site is in the Sarno Valley, an area of about 400 km2 in the south of Italy near Naples, which has been utilized by man since ancient times (thousands of years ago). Currently the valley is under critical environmental conditions because of substantial industrial development. ENEA is conducting various studies and research in the valley, employing historical research, ground campaigns, cartography and up-to-date airborne multispectral remote sensing technologies to build a geographical information system. Airborne remote sensing technologies are very suitable for situations such as that of the Sarno Valley. The paper describes the archaeological application of the research in progress regarding the ancient site of Pompeii and its fluvial port.

  7. Tree Species Classification of Broadleaved Forests in Nagano, Central Japan, Using Airborne Laser Data and Multispectral Images

    NASA Astrophysics Data System (ADS)

    Deng, S.; Katoh, M.; Takenaka, Y.; Cheung, K.; Ishii, A.; Fujii, N.; Gao, T.

    2017-10-01

    This study attempted to classify three coniferous and ten broadleaved tree species by combining airborne laser scanning (ALS) data and multispectral images. The study area, located in Nagano, central Japan, is within the broadleaved forests of the Afan Woodland area. A total of 235 trees were surveyed in 2016, and we recorded the species, DBH, and tree height. The geographical position of each tree was collected using a Global Navigation Satellite System (GNSS) device. Tree crowns were manually detected using GNSS position data, field photographs, true-color orthoimages with three bands (red-green-blue, RGB), 3D point clouds, and a canopy height model derived from ALS data. Then a total of 69 features, including 27 image-based and 42 point-based features, were extracted from the RGB images and the ALS data to classify tree species. Finally, the detected tree crowns were classified into two classes for the first level (coniferous and broadleaved trees), four classes for the second level (Pinus densiflora, Larix kaempferi, Cryptomeria japonica, and broadleaved trees), and 13 classes for the third level (three coniferous and ten broadleaved species), using the 27 image-based features, 42 point-based features, all 69 features, and the best combination of features identified using a neighborhood component analysis algorithm, respectively. The overall classification accuracies reached 90 % at the first and second levels but less than 60 % at the third level. The classifications using the best combinations of features had higher accuracies than those using the image-based and point-based features and the combination of all of the 69 features.

  8. Apollo 9 Mission image - S0-65 Multispectral Photography - Texas

    NASA Image and Video Library

    2009-01-21

    Earth observation taken by the Apollo 9 crew. The view is of Galveston and Freeport in Texas. Latitude was 28.42 N by longitude 94.54 W, overlap was 80%, altitude was 105 miles, and cloud cover was 5%. This imagery was taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set, within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. The band designation for this frame is A. Film and filter: Ektachrome SO-368, Infrared Color, Wratten 15. Mean wavelength of sensitivity: green, red and infrared. The nominal bandpass is the total sensitivity of all dye layers, 510-900 nm.

  9. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location in each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the vector quantization algorithm were further compressed by a lossless technique called difference-mapped shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.041 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
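
    A minimal sketch of the vector-quantization stage described above, assuming a stand-in 7-channel data cube: each per-pixel spectral vector is replaced by the index of its nearest entry in a k-means codebook. The codebook size and data are placeholders, and the subsequent difference-mapped shift-extended Huffman stage is not shown.

        # Sketch: vector quantization of multispectral pixels (one value per channel).
        # The data cube and codebook size are illustrative placeholders.
        import numpy as np
        from sklearn.cluster import KMeans

        cube = np.random.randint(0, 256, size=(256, 256, 7)).astype(np.float32)  # stand-in for 7-channel CAMS data
        vectors = cube.reshape(-1, cube.shape[-1])

        codebook_size = 256                     # one index byte per pixel instead of seven channel values
        kmeans = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(vectors)

        indices = kmeans.predict(vectors).astype(np.uint8)            # the compressed representation
        reconstructed = kmeans.cluster_centers_[indices].reshape(cube.shape)

        rms = np.sqrt(np.mean((reconstructed - cube) ** 2))
        print(f"codebook size {codebook_size}, RMS reconstruction error {rms:.1f} counts")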

  10. Characterization of previously unidentified lunar pyroclastic deposits using Lunar Reconnaissance Orbiter Camera (LROC) data

    USGS Publications Warehouse

    Gustafson, J. Olaf; Bell, James F.; Gaddis, Lisa R.; Hawke, B. Ray; Giguere, Thomas A.

    2012-01-01

    We used a Lunar Reconnaissance Orbiter Camera (LROC) global monochrome Wide-angle Camera (WAC) mosaic to conduct a survey of the Moon to search for previously unidentified pyroclastic deposits. Promising locations were examined in detail using LROC multispectral WAC mosaics, high-resolution LROC Narrow Angle Camera (NAC) images, and Clementine multispectral (ultraviolet-visible or UVVIS) data. Out of 47 potential deposits chosen for closer examination, 12 were selected as probable newly identified pyroclastic deposits. Potential pyroclastic deposits were generally found in settings similar to previously identified deposits, including areas within or near mare deposits adjacent to highlands, within floor-fractured craters, and along fissures in mare deposits. However, a significant new finding is the discovery of localized pyroclastic deposits within floor-fractured craters Anderson E and F on the lunar farside, isolated from other known similar deposits. Our search confirms that most major regional and localized low-albedo pyroclastic deposits have been identified on the Moon down to ~100 m/pix resolution, and that additional newly identified deposits are likely to be either isolated small deposits or additional portions of discontinuous, patchy deposits.

  11. COMPARISON OF RETINAL PATHOLOGY VISUALIZATION IN MULTISPECTRAL SCANNING LASER IMAGING.

    PubMed

    Meshi, Amit; Lin, Tiezhu; Dans, Kunny; Chen, Kevin C; Amador, Manuel; Hasenstab, Kyle; Muftuoglu, Ilkay Kilic; Nudleman, Eric; Chao, Daniel; Bartsch, Dirk-Uwe; Freeman, William R

    2018-03-16

    To compare retinal pathology visualization in multispectral scanning laser ophthalmoscope imaging between the Spectralis and Optos devices. This retrospective cross-sectional study included 42 eyes from 30 patients with age-related macular degeneration (19 eyes), diabetic retinopathy (10 eyes), and epiretinal membrane (13 eyes). All patients underwent retinal imaging with a color fundus camera (broad-spectrum white light), the Spectralis HRA-2 system (3-color monochromatic lasers), and the Optos P200 system (2-color monochromatic lasers). The Optos image was cropped to a similar size as the Spectralis image. Seven masked graders marked retinal pathologies in each image within a 5 × 5 grid that included the macula. The average area with detected retinal pathology in all eyes was larger in the Spectralis images than in the Optos images (32.4% larger, P < 0.0001), mainly because of better visualization of epiretinal membrane and retinal hemorrhage. The average detection rate of age-related macular degeneration and diabetic retinopathy pathologies was similar across the three modalities, whereas the epiretinal membrane detection rate was significantly higher in the Spectralis images. Spectralis tricolor multispectral scanning laser ophthalmoscope imaging had a higher rate of pathology detection than Optos bicolor multispectral scanning laser ophthalmoscope imaging, primarily because of better epiretinal membrane and retinal hemorrhage visualization.

  12. Determining density of maize canopy. 2: Airborne multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Cipra, J. E.

    1971-01-01

    Multispectral scanner data were collected in two flights over a light-colored soil background cover plot at an altitude of 305 m. Energy in eleven reflective wavelength bands from 0.45 to 2.6 microns was recorded. Four growth stages of maize (Zea mays L.) gave a wide range of canopy densities for each flight date. Leaf area index measurements were taken from the twelve subplots and were used as a measure of canopy density. Ratio techniques were used to relate uncalibrated scanner response to leaf area index. The ratios of scanner data values for the 0.72 to 0.92 micron wavelength band over the 0.61 to 0.70 micron wavelength band were calculated for each plot. The ratios related very well to leaf area index for a given flight date. The results indicated that spectral data from maize canopies could be of value in determining canopy density.
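
    For illustration only, the ratio technique mentioned above can be expressed in a few lines; the per-subplot band responses and leaf area indices below are invented placeholders, not data from the study.

        # Sketch: relate the (0.72-0.92 um) / (0.61-0.70 um) response ratio to leaf area index.
        import numpy as np

        nir = np.array([42.0, 55.0, 71.0, 88.0])   # mean scanner response, 0.72-0.92 um band (hypothetical)
        red = np.array([60.0, 48.0, 36.0, 27.0])   # mean scanner response, 0.61-0.70 um band (hypothetical)
        lai = np.array([0.5, 1.4, 2.6, 3.9])       # measured leaf area index per subplot (hypothetical)

        ratio = nir / red
        slope, intercept = np.polyfit(ratio, lai, 1)   # a simple linear relation for one flight date
        r = np.corrcoef(ratio, lai)[0, 1]
        print(f"LAI ~ {slope:.2f} * ratio + {intercept:.2f} (r = {r:.3f})")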

  13. ASPIRE - Airborne Spectro-Polarization InfraRed Experiment

    NASA Astrophysics Data System (ADS)

    DeLuca, E.; Cheimets, P.; Golub, L.; Madsen, C. A.; Marquez, V.; Bryans, P.; Judge, P. G.; Lussier, L.; McIntosh, S. W.; Tomczyk, S.

    2017-12-01

    Direct measurements of coronal magnetic fields are critical for taking the next step in active region and solar wind modeling and for building the next generation of physics-based space-weather models. We are proposing a new airborne instrument to make these key observations. Building on the successful Airborne InfraRed Spectrograph (AIR-Spec) experiment for the 2017 eclipse, we will design and build a spectro-polarimeter to measure coronal magnetic field during the 2019 South Pacific eclipse. The new instrument will use the AIR-Spec optical bench and the proven pointing, tracking, and stabilization optics. A new cryogenic spectro-polarimeter will be built focusing on the strongest emission lines observed during the eclipse. The AIR-Spec IR camera, slit jaw camera and data acquisition system will all be reused. The poster will outline the optical design and the science goals for ASPIRE.

  14. Topological anomaly detection performance with multispectral polarimetric imagery

    NASA Astrophysics Data System (ADS)

    Gartley, M. G.; Basener, W.

    2009-05-01

    Polarimetric imaging has demonstrated utility for increasing contrast of manmade targets above natural background clutter. Manual detection of manmade targets in multispectral polarimetric imagery can be challenging and a subjective process for large datasets. Analyst exploitation may be improved utilizing conventional anomaly detection algorithms such as RX. In this paper we examine the performance of a relatively new approach to anomaly detection, which leverages topology theory, applied to spectral polarimetric imagery. Detection results for manmade targets embedded in a complex natural background will be presented for both the RX and Topological Anomaly Detection (TAD) approaches. We will also present detailed results examining detection sensitivities relative to: (1) the number of spectral bands, (2) utilization of Stokes images versus intensity images, and (3) airborne versus spaceborne measurements.

  15. Apollo 9 Mission image - S0-65 Multispectral Photography - California

    NASA Image and Video Library

    2009-01-21

    Earth Observation taken by the Apollo 9 crew. View is of Salton Sea and Imperial Valley in California. Latitude was 33.09 N by Longitude 116.14 W, Overlap was 50%, Altitude miles were 103 and cloud cover was 35%. This imagery taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. Band designation for this frame is A. Film and filter is Ektachrome SO-368,Infrared Color Wratten 15. Mean Wavelength of Sensitivity is green,red and infrared. The Nominal Bandpass is total sensitivity of all dye layers 510-900nm.

  16. Applying Neural Networks to Hyperspectral and Multispectral Field Data for Discrimination of Cruciferous Weeds in Winter Crops

    PubMed Central

    de Castro, Ana-Isabel; Jurado-Expósito, Montserrat; Gómez-Casero, María-Teresa; López-Granados, Francisca

    2012-01-01

    In the context of detection of weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step to determine the potential of remote spectral data to classify weeds and crops. Field studies were conducted for four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between cruciferous weeds, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, specifically, multilayer perceptron (MLP) and radial basis function (RBF). Hyperspectral and multispectral signatures of cruciferous weeds, and of wheat and broad bean crops, can be classified using STEPDISC analysis and MLP and RBF neural networks with varying success, the MLP model being the most accurate, with classification performance of 100%, or above 98.1%, for all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can potentially discriminate these weeds and crops. PMID:22629171

  17. Applying neural networks to hyperspectral and multispectral field data for discrimination of cruciferous weeds in winter crops.

    PubMed

    de Castro, Ana-Isabel; Jurado-Expósito, Montserrat; Gómez-Casero, María-Teresa; López-Granados, Francisca

    2012-01-01

    In the context of detection of weeds in crops for site-specific weed control, on-ground spectral reflectance measurements are the first step to determine the potential of remote spectral data to classify weeds and crops. Field studies were conducted for four years at different locations in Spain. We aimed to distinguish cruciferous weeds in wheat and broad bean crops, using hyperspectral and multispectral readings in the visible and near-infrared spectrum. To identify differences in reflectance between cruciferous weeds, we applied three classification methods: stepwise discriminant (STEPDISC) analysis and two neural networks, specifically, multilayer perceptron (MLP) and radial basis function (RBF). Hyperspectral and multispectral signatures of cruciferous weeds, and of wheat and broad bean crops, can be classified using STEPDISC analysis and MLP and RBF neural networks with varying success, the MLP model being the most accurate, with classification performance of 100%, or above 98.1%, for all years. Classification accuracy from hyperspectral signatures was similar to that from multispectral and spectral indices, suggesting that little advantage would be obtained by using more expensive airborne hyperspectral imagery. Therefore, for future investigations, we recommend using multispectral remote imagery to explore whether it can potentially discriminate these weeds and crops.

  18. Road Asphalt Pavements Analyzed by Airborne Thermal Remote Sensing: Preliminary Results of the Venice Highway.

    PubMed

    Pascucci, Simone; Bassani, Cristiana; Palombo, Angelo; Poscolieri, Maurizio; Cavalli, Rosa

    2008-02-22

    This paper describes a fast procedure for evaluating asphalt pavement surface defects using airborne emissivity data. To develop this procedure, we used airborne multispectral emissivity data covering an urban test area close to Venice (Italy). For this study, we first identified and selected the roads' asphalt pavements on Multispectral Infrared Visible Imaging Spectrometer (MIVIS) imagery using a segmentation procedure. Next, since surface defects in asphalt pavements are strictly related to the decrease of oily components, which causes an increase in the abundance of surfacing limestone, the diagnostic absorption emissivity peak of limestone at 11.2 μm was used to retrieve from the MIVIS emissivity data the areas exhibiting defects on the asphalt pavement surface. The results showed that MIVIS emissivity data allow a threshold to be established that identifies the asphalt road sites requiring a maintenance check. Therefore, this technique can supply local government authorities with an efficient, rapid and repeatable road mapping procedure providing the location of the asphalt pavements to be checked.
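
    The thresholding step can be sketched as follows, assuming a per-pixel emissivity band near 11.2 μm and a binary road mask from the segmentation step; the reference emissivity, the threshold value and the file names are assumptions, not values from the paper.

        # Sketch: flag asphalt pixels whose 11.2 um emissivity departs from an assumed
        # intact-asphalt value by more than a chosen threshold.
        import numpy as np

        emissivity_11_2 = np.load("mivis_emissivity_11_2um.npy")   # per-pixel emissivity (hypothetical file)
        road_mask = np.load("asphalt_segmentation_mask.npy")       # 1 = asphalt pixel, 0 = other (hypothetical file)

        asphalt_reference = 0.96    # assumed emissivity of intact asphalt at 11.2 um
        threshold = 0.02            # assumed maximum acceptable departure

        departure = np.abs(emissivity_11_2 - asphalt_reference)
        needs_check = (road_mask == 1) & (departure > threshold)
        print(f"{int(needs_check.sum())} road pixels flagged for a maintenance check")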

  19. Road Asphalt Pavements Analyzed by Airborne Thermal Remote Sensing: Preliminary Results of the Venice Highway

    PubMed Central

    Pascucci, Simone; Bassani, Cristiana; Palombo, Angelo; Poscolieri, Maurizio; Cavalli, Rosa

    2008-01-01

    This paper describes a fast procedure for evaluating asphalt pavement surface defects using airborne emissivity data. To develop this procedure, we used airborne multispectral emissivity data covering an urban test area close to Venice (Italy). For this study, we first identified and selected the roads' asphalt pavements on Multispectral Infrared Visible Imaging Spectrometer (MIVIS) imagery using a segmentation procedure. Next, since surface defects in asphalt pavements are strictly related to the decrease of oily components, which causes an increase in the abundance of surfacing limestone, the diagnostic absorption emissivity peak of limestone at 11.2 μm was used to retrieve from the MIVIS emissivity data the areas exhibiting defects on the asphalt pavement surface. The results showed that MIVIS emissivity data allow a threshold to be established that identifies the asphalt road sites requiring a maintenance check. Therefore, this technique can supply local government authorities with an efficient, rapid and repeatable road mapping procedure providing the location of the asphalt pavements to be checked. PMID:27879765

  20. The correlation and quantification of airborne spectroradiometer data to turbidity measurements at Lake Powell, Utah

    NASA Technical Reports Server (NTRS)

    Merry, C. J.

    1979-01-01

    A water sampling program was accomplished at Lake Powell, Utah, during June 1975 for correlation to multispectral data obtained with a 500-channel airborne spectroradiometer. Field measurements were taken of percentage of light transmittance, surface temperature, pH and Secchi disk depth. Percentage of light transmittance was also measured in the laboratory for the water samples. Analyses of electron micrographs and suspended sediment concentration data for four water samples located at Hite Bridge, Mile 168, Mile 150 and Bullfrog Bay indicated differences in the composition and concentration of the particulate matter. Airborne spectroradiometer multispectral data were analyzed for the four sampling locations. The results showed that: (1) as the percentage of light transmittance of the water samples decreased, the reflected radiance increased; and (2) as the suspended sediment concentration (mg/l) increased, the reflected radiance increased in the 1-80 mg/l range. In conclusion, valuable qualitative information was obtained on surface turbidity for the Lake Powell water spectra. Also, the reflected radiance measured at a wavelength of 0.58 micron was directly correlated to the suspended sediment concentration.

  1. Multispectral imaging of the ocular fundus using light emitting diode illumination

    NASA Astrophysics Data System (ADS)

    Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  2. Multispectral imaging of the ocular fundus using light emitting diode illumination.

    PubMed

    Everdell, N L; Styles, I B; Calcagni, A; Gibson, J; Hebden, J; Claridge, E

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  3. A versatile photogrammetric camera automatic calibration suite for multispectral fusion and optical helmet tracking

    NASA Astrophysics Data System (ADS)

    de Villiers, Jason; Jermy, Robert; Nicolls, Fred

    2014-06-01

    This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, accommodates an arbitrary number of radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
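
    For readers unfamiliar with the model, a forward Brown-style distortion mapping with 5 radial and 3 tangential coefficients can be written as in the sketch below; the coefficient values are placeholders, and the paper's actual calibration and inversion machinery is not reproduced.

        # Sketch: forward Brown-style lens distortion with 5 radial (k1..k5) and
        # 3 tangential (p1..p3) coefficients about a principal point (cx, cy).
        def distort(xu, yu, cx, cy, k, p):
            """Map undistorted pixel coordinates to distorted ones."""
            x, y = xu - cx, yu - cy
            r2 = x * x + y * y
            radial = 1.0 + sum(k[i] * r2 ** (i + 1) for i in range(len(k)))
            tang = 1.0 + p[2] * r2
            xd = x * radial + (p[0] * (r2 + 2 * x * x) + 2 * p[1] * x * y) * tang
            yd = y * radial + (2 * p[0] * x * y + p[1] * (r2 + 2 * y * y)) * tang
            return xd + cx, yd + cy

        k = [1.2e-7, -3.0e-13, 0.0, 0.0, 0.0]   # radial coefficients (made-up values)
        p = [2.0e-6, -1.5e-6, 0.0]              # tangential coefficients (made-up values)
        print(distort(600.0, 400.0, 320.0, 240.0, k, p))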

  4. Large Multispectral and Albedo Panoramas Acquired by the Pancam Instruments on the Mars Exploration Rovers Spirit and Opportunity

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.

    2005-01-01

    Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images were acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.

  5. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test.

    PubMed

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-11-17

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.

  6. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces. PMID:27873930

  7. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.; hide

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC "Pop-Up" Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(Registered Trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the CalTech Submillimeter Observatory (CSO) are presented.

  8. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  9. Novelty Detection Classifiers in Weed Mapping: Silybum marianum Detection on UAV Multispectral Images.

    PubMed

    Alexandridis, Thomas K; Tamouridou, Afroditi Alexandra; Pantazi, Xanthoula Eirini; Lagopodi, Anastasia L; Kashefi, Javid; Ovakoglou, Georgios; Polychronos, Vassilios; Moshou, Dimitrios

    2017-09-01

    In the present study, the detection and mapping of Silybum marianum (L.) Gaertn. weed using novelty detection classifiers is reported. A multispectral camera (green-red-NIR) on board a fixed-wing unmanned aerial vehicle (UAV) was employed for obtaining high-resolution images. Four novelty detection classifiers were used to identify S. marianum among other vegetation in a field. The classifiers were One Class Support Vector Machine (OC-SVM), One Class Self-Organizing Maps (OC-SOM), Autoencoders and One Class Principal Component Analysis (OC-PCA). As input features to the novelty detection classifiers, the three spectral bands and texture were used. The S. marianum identification using OC-SVM reached an overall accuracy of 96%. The results show the feasibility of effective S. marianum mapping by means of novelty detection classifiers acting on multispectral UAV imagery.
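
    A minimal sketch of one-class novelty detection on per-pixel feature vectors (three spectral bands plus a texture measure), in the spirit of the OC-SVM classifier named above; the feature layout, file names and SVM parameters are assumptions.

        # Sketch: train a one-class SVM on pixels labelled S. marianum and score the
        # remaining pixels as target (+1) or other vegetation (-1).
        import numpy as np
        from sklearn.svm import OneClassSVM
        from sklearn.preprocessing import StandardScaler

        features = np.load("uav_pixel_features.npy")      # (n_pixels, 4): green, red, NIR, texture (hypothetical)
        train_idx = np.load("silybum_training_idx.npy")   # indices of labelled S. marianum pixels (hypothetical)

        scaler = StandardScaler().fit(features[train_idx])
        oc_svm = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
        oc_svm.fit(scaler.transform(features[train_idx]))

        pred = oc_svm.predict(scaler.transform(features))  # +1 resembles S. marianum, -1 does not
        print(f"pixels classified as S. marianum: {int((pred == 1).sum())}")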

  10. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC)and Submillemeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  11. The use of optical microscope equipped with multispectral detector to distinguish different types of acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Pronichev, A. N.; Polyakov, E. V.; Tupitsyn, N. N.; Frenkel, M. A.; Mozhenkova, A. V.

    2017-01-01

    The article describes the use of computer-based optical microscopy with a multispectral camera to characterize the texture of bone marrow blasts from patients with different variants of acute lymphoblastic leukemia: B- and T-types. Specific characteristics of the nuclear chromatin of the blasts were obtained for the different types of acute lymphoblastic leukemia.

  12. Integration, Testing, and Analysis of Multispectral Imager on Small Unmanned Aerial System for Skin Detection

    DTIC Science & Technology

    2014-03-01

    U.S. Air Force, and others have demonstrated the utility of SUAS in natural disasters such as the Fukushima Daiichi meltdown to take photographs at...factor. Multispectral Imagery (MSI) has proven capable of dismount detection with several distinct wavelengths. This research proposes a spectral...Epipolar lines depicted in blue, show the geometric relationship between the two cameras after stereo rectification

  13. Laser- and Multi-Spectral Monitoring of Natural Objects from UAVs

    NASA Astrophysics Data System (ADS)

    Reiterer, Alexander; Frey, Simon; Koch, Barbara; Stemmler, Simon; Weinacker, Holger; Hoffmann, Annemarie; Weiler, Markus; Hergarten, Stefan

    2016-04-01

    The paper describes the research, development and evaluation of a lightweight sensor system for UAVs. The system is composed of three main components: (1) a laser scanning module, (2) a multi-spectral camera system, and (3) a processing/storage unit. All three components are newly developed. Besides measurement precision and frequency, low weight has been one of the most challenging requirements. The current system has a total weight of about 2.5 kg and is designed as a self-contained unit (incl. storage and battery units). The main features of the system are: laser-based multi-echo 3D measurement at a wavelength of 905 nm (fully eye-safe), measurement range up to 200 m, measurement frequency of 40 kHz, scanning frequency of 16 Hz, relative distance accuracy of 10 mm. The system is equipped with both GNSS and IMU. Alternatively, a multi-visual-odometry system has been integrated to estimate the trajectory of the UAV from image features (based on this system a calculation of 3D coordinates without GNSS is possible). The integrated multi-spectral camera system is based on conventional CMOS image chips equipped with special sets of band-pass interference filters with a full width at half maximum (FWHM) of 50 nm. Good results for calculating the normalized difference vegetation index (NDVI) and the wide dynamic range vegetation index (WDRVI) have been achieved using the band-pass interference filter set with a FWHM of 50 nm and exposure times between 5,000 μs and 7,000 μs. The system is currently used for monitoring of natural objects and surfaces, such as forest, as well as for geo-risk analysis (landslides). By measuring 3D geometric and multi-spectral information, a reliable monitoring and interpretation of the data set is possible. The paper gives an overview of the development steps, the system, the evaluation and first results.
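
    The two vegetation indices named above are simple per-pixel band combinations; a sketch is given below, assuming co-registered red and near-infrared reflectance arrays. The WDRVI weighting factor a is an assumed value (commonly chosen in the 0.1-0.2 range), not one reported by the authors.

        # Sketch: per-pixel NDVI and wide dynamic range vegetation index (WDRVI)
        # from co-registered red and NIR reflectance images.
        import numpy as np

        def ndvi(nir, red, eps=1e-6):
            return (nir - red) / (nir + red + eps)

        def wdrvi(nir, red, a=0.15, eps=1e-6):    # `a` is an assumed weighting factor
            return (a * nir - red) / (a * nir + red + eps)

        nir = np.random.rand(512, 512).astype(np.float32)   # stand-in reflectance images
        red = np.random.rand(512, 512).astype(np.float32)
        print(float(ndvi(nir, red).mean()), float(wdrvi(nir, red).mean()))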

  14. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost, but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  15. Development of an Airborne High Resolution TV System (AHRTS)

    DTIC Science & Technology

    1975-11-01

    Development of an Airborne High Resolution TV System ... System Elements: The essential Airborne Subsystem elements of camera, video tape recorder, transmitter and antennas are required to have ... The camera operated over the 3000:1 light change as required. A solar shutter was incorporated to protect the vidicon from damage from direct view

  16. Multispectral and colour analysis for ubiquinone solutions and biological samples

    NASA Astrophysics Data System (ADS)

    Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.

    2017-02-01

    Oxidative damage to cell structures underlies most of the mechanisms that lead to disease and to senescence of the human body. The assessment of antioxidant status, such as redox potential imbalance in the human body, is a very important question for modern clinical diagnostics. Applying multispectral and colour analysis of the human skin to the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, can be one step toward developing a device for clinical diagnostics of redox potential or for quality control of cosmetics. In the current research, multispectral images of the hand skin were recorded with a monochromatic camera and a set of coloured filters. The data recorded with the multispectral imaging technique were processed using principal component analysis. Colour characteristics of the skin were also calculated before and after treatment with a facial mask containing ubiquinone, and the results of the mask treatment were compared with treatment using an oily ubiquinone solution. Although the results did not give a clear indication of healthy skin versus skin stressed by reactive oxygen species, the methods described in this research are able to identify how the skin surface changes after antioxidant treatment. In the future it will be important to perform biomedical tests alongside the optical tests of the human skin.

  17. Multispectral Coatings

    DTIC Science & Technology

    2010-01-01

    ... failure, whereas the polymer nanocomposite gave ductile failure with less surface damage. Task 2: Highly reflective self-assembled coatings. ... (Report AFRL-RX-WP-TR-2010-4036, "Multispectral Coatings", Eric Grulke, University of Kentucky; Thad Druffel, Optical Dynamics; final report, January 2010, covering 28 November 2005 - 30 September 2008.)

  18. An operational multispectral scanner for bathymetric surveys - The ABS NORDA scanner

    NASA Technical Reports Server (NTRS)

    Haimbach, Stephen P.; Joy, Richard T.; Hickman, G. Daniel

    1987-01-01

    The Naval Ocean Research and Development Activity (NORDA) is developing the Airborne Bathymetric Survey (ABS) system, which will take shallow-water depth soundings from a Navy P-3 aircraft. The system combines active and passive sensors to obtain optical measurements of water depth. The ABS NORDA Scanner is the system's passive multispectral scanner, whose design goal is to provide 100 percent coverage of the seafloor to depths of 20 m in average coastal waters. The ABS NORDA Scanner hardware and operational environment are discussed in detail. The optical model providing the basis for depth extraction is reviewed and the proposed data processing routine is discussed.

  19. Apollo 9 Mission image - S0-65 Multispectral Photography - New Mexico

    NASA Image and Video Library

    2009-01-21

    Earth Observation taken by the Apollo 9 crew. View is of Carrizozo in New Mexico and includes lava flow and snow. Latitude was 33.42 N by Longitude 106.10 W, Overlap was 7.5%, Altitude miles were 121 and cloud cover was 0%. This imagery taken as part of the NASA S0-65 Experiment "Multispectral Terrain Photography". The experiment provides simultaneous satellite photography of the Earth's surface in three distinct spectral bands. The photography consists of four almost spatially identical photographs. The images of ground objects appear in the same coordinate positions on all four photos in the multispectral set within the opto-mechanical tolerances of the Hasselblad cameras in the Apollo 9 spacecraft. Band designation for this frame is A. Film and filter is Ektachrome SO-368, Infrared Color Wratten 15. Mean Wavelength of Sensitivity is green, red and infrared. The Nominal Bandpass is total sensitivity of all dye layers 510-900 nm.

  20. Airborne Dust Monitoring Activities at the National Environmental Satellite, Data and Information Service

    NASA Astrophysics Data System (ADS)

    Stephens, G.; McNamara, D.; Taylor, J.

    2002-12-01

    Wind blown dust can be a hazard to transportation, industrial, and military operations, and much work has been devoted to its analysis and prediction from a meteorological viewpoint. The detection and forecasting of dust outbreaks in near real time is difficult, particularly in remote desert areas with sparse observation networks. The Regional Haze Regulation, passed by Congress in 1999, mandates a reduction in man made inputs to haze in 156 Class I areas (national parks and wilderness areas). Studies have demonstrated that satellite data can be useful in detection and tracking of dust storms. Environmental satellites offer frequent coverage of large geographic areas. The National Environmental Satellite, Data, and Information Service (NESDIS) of the U.S. National Oceanic and Atmospheric Administration (NOAA) operates a system of polar orbiting and geostationary environmental satellites, which sense data in two visible and three infrared channels. Promising results in the detection of airborne dust have been obtained using multispectral techniques to combine information from two or more channels to detect subtle spectral differences. One technique, using a ratio of two thermal channels, detects the presence of airborne dust, and discriminates it from both underlying ground and meteorological clouds. In addition, NESDIS accesses and is investigating for operational use data from several other satellites. The Total Ozone Mapping Spectrometer on board NASA's Earth Probe mission provides an aerosol index product which can detect dust and smoke, and the Moderate Resolution Imaging Spectroradiometer on NASA's Terra and Aqua satellites provide several channels which can detect aerosols in multispectral channel combinations. NESDIS, in cooperation with NOAA's Air Resources Laboratory, produces a daily smoke transport forecast, combining satellite derived smoke source points with a mathematical transport prediction model; such a scheme could be applied to other aerosol
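
    A schematic of the two-thermal-channel ratio test mentioned above might look like the fragment below; the channel data, the threshold value and its direction are illustrative assumptions, and the operational NESDIS algorithm is not reproduced.

        # Sketch: flag possible airborne dust from the ratio of two thermal-channel
        # brightness temperatures. Files, threshold and sense of the test are assumed.
        import numpy as np

        bt_11um = np.load("bt_channel_11um.npy")   # ~11 um brightness temperature, K (hypothetical file)
        bt_12um = np.load("bt_channel_12um.npy")   # ~12 um brightness temperature, K (hypothetical file)

        ratio = bt_11um / bt_12um
        dust_flag = ratio < 0.995    # assumed threshold: silicate dust tends to depress the ~11 um channel
        print(f"pixels flagged as possible airborne dust: {int(dust_flag.sum())}")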

  1. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging.

    PubMed

    Zhang, Dongyan; Zhou, Xingen; Zhang, Jian; Lan, Yubin; Xu, Chao; Liang, Dong

    2018-01-01

    Detection and monitoring are the first essential steps for effective management of sheath blight (ShB), a major disease of rice worldwide. Unmanned aerial systems have high potential to improve this detection process, since they can reduce the time needed for scouting for the disease at a field scale and are affordable and user-friendly in operation. In this study, a commercial quadrotor unmanned aerial vehicle (UAV), equipped with digital and multispectral cameras, was used to capture imagery of research plots with 67 rice cultivars and elite lines. The collected imagery was then processed and analyzed to characterize the development of ShB and quantify different levels of the disease in the field. Through color feature extraction and color space transformation of the images, it was found that the color transformation could qualitatively detect the infected areas of ShB in the field plots, but was less effective at detecting different levels of the disease. Five vegetation indices were then calculated from the multispectral images, and ground truths of disease severity and GreenSeeker-measured NDVI (Normalized Difference Vegetation Index) were collected. The relationship analyses indicate a strong correlation between ground-measured NDVIs and image-extracted NDVIs, with an R2 of 0.907 and a root mean square error (RMSE) of 0.0854, and a good correlation between image-extracted NDVIs and disease severity, with an R2 of 0.627 and an RMSE of 0.0852. Use of image-based NDVIs extracted from multispectral images could quantify different levels of ShB in the field plots with an accuracy of 63%. These results demonstrate that a consumer-grade UAV integrated with digital and multispectral cameras can be an effective tool to detect the ShB disease at a field scale.
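
    The correlation figures quoted above amount to a simple linear regression between image-extracted NDVI and the reference measurements; a sketch with invented per-plot values is shown below.

        # Sketch: regress a ground reference (GreenSeeker NDVI or scored disease severity)
        # against per-plot NDVI extracted from the UAV imagery; report R^2 and RMSE.
        import numpy as np

        image_ndvi = np.array([0.62, 0.58, 0.71, 0.49, 0.55, 0.66])   # per-plot means (hypothetical)
        ground_ref = np.array([0.64, 0.57, 0.74, 0.47, 0.54, 0.69])   # reference values (hypothetical)

        slope, intercept = np.polyfit(image_ndvi, ground_ref, 1)
        predicted = slope * image_ndvi + intercept
        r2 = 1.0 - np.sum((ground_ref - predicted) ** 2) / np.sum((ground_ref - ground_ref.mean()) ** 2)
        rmse = np.sqrt(np.mean((ground_ref - predicted) ** 2))
        print(f"R^2 = {r2:.3f}, RMSE = {rmse:.4f}")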

  2. Multispectral imaging based on a Smartphone with an external C-MOS camera for detection of seborrheic dermatitis on the scalp

    NASA Astrophysics Data System (ADS)

    Kim, Manjae; Kim, Sewoong; Hwang, Minjoo; Kim, Jihun; Je, Minkyu; Jang, Jae Eun; Lee, Dong Hun; Hwang, Jae Youn

    2017-02-01

    To date, the incidence rates of various skin diseases have increased due to hereditary and environmental factors including stress, irregular diet, pollution, etc. Among these skin diseases, seborrheic dermatitis and psoriasis are chronic, relapsing forms of dermatitis involving infection and temporary alopecia. However, they typically exhibit similar symptoms, resulting in difficulty in discriminating between them. To prevent their associated complications and provide appropriate treatments, it is crucial to discriminate between seborrheic dermatitis and psoriasis with high specificity and sensitivity, and further to monitor the skin lesions continuously and quantitatively during their treatment at locations other than a hospital. Thus, we here demonstrate a mobile multispectral imaging system connected to a smartphone for self-diagnosis of seborrheic dermatitis and, further, for discrimination between seborrheic dermatitis and psoriasis on the scalp, which is the more challenging case. Using the developed system, multispectral imaging and analysis of seborrheic dermatitis and psoriasis on the scalp was carried out. It was found that the spectral signatures of seborrheic dermatitis and psoriasis were discernible, and thus seborrheic dermatitis on the scalp could be distinguished from psoriasis using the system. In particular, the smartphone-based multispectral imaging and analysis offered better discrimination between seborrheic dermatitis and psoriasis than RGB imaging and analysis. These results suggest that the multispectral imaging system based on a smartphone has potential for self-diagnosis of seborrheic dermatitis with high portability and specificity.

  3. Optimized Multi-Spectral Filter Array Based Imaging of Natural Scenes.

    PubMed

    Li, Yuqi; Majumder, Aditi; Zhang, Hao; Gopi, M

    2018-04-12

    Multi-spectral imaging using a camera with more than three channels is an efficient method to acquire and reconstruct spectral data and is used extensively in tasks like object recognition, relighted rendering, and color constancy. Recently developed methods are used to only guide content-dependent filter selection where the set of spectral reflectances to be recovered are known a priori. We present the first content-independent spectral imaging pipeline that allows optimal selection of multiple channels. We also present algorithms for optimal placement of the channels in the color filter array yielding an efficient demosaicing order resulting in accurate spectral recovery of natural reflectance functions. These reflectance functions have the property that their power spectrum statistically exhibits a power-law behavior. Using this property, we propose power-law based error descriptors that are minimized to optimize the imaging pipeline. We extensively verify our models and optimizations using large sets of commercially available wide-band filters to demonstrate the greater accuracy and efficiency of our multi-spectral imaging pipeline over existing methods.

  4. Optimized Multi-Spectral Filter Array Based Imaging of Natural Scenes

    PubMed Central

    Li, Yuqi; Majumder, Aditi; Zhang, Hao; Gopi, M.

    2018-01-01

    Multi-spectral imaging using a camera with more than three channels is an efficient method to acquire and reconstruct spectral data and is used extensively in tasks like object recognition, relighted rendering, and color constancy. Recently developed methods are used to only guide content-dependent filter selection where the set of spectral reflectances to be recovered are known a priori. We present the first content-independent spectral imaging pipeline that allows optimal selection of multiple channels. We also present algorithms for optimal placement of the channels in the color filter array yielding an efficient demosaicing order resulting in accurate spectral recovery of natural reflectance functions. These reflectance functions have the property that their power spectrum statistically exhibits a power-law behavior. Using this property, we propose power-law based error descriptors that are minimized to optimize the imaging pipeline. We extensively verify our models and optimizations using large sets of commercially available wide-band filters to demonstrate the greater accuracy and efficiency of our multi-spectral imaging pipeline over existing methods. PMID:29649114

  5. PtSi gimbal-based FLIR for airborne applications

    NASA Astrophysics Data System (ADS)

    Wallace, Joseph; Ornstein, Itzhak; Nezri, M.; Fryd, Y.; Bloomberg, Steve; Beem, S.; Bibi, B.; Hem, S.; Perna, Steve N.; Tower, John R.; Lang, Frank B.; Villani, Thomas S.; McCarthy, D. R.; Stabile, Paul J.

    1997-08-01

    A new gimbal-based FLIR camera for several types of airborne platforms has been developed. The FLIR is based on PtSi-on-silicon technology developed for high volume and minimum cost. The gimbal scans an area of 360 degrees in azimuth and an elevation range of plus 15 degrees to minus 105 degrees. It is stabilized to 25 μrad rms. A combination of uniformity correction, defect substitution, and compact optics results in a long-range, low-cost FLIR for all low-speed airborne platforms.

  6. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J.; Hon, Ken; Kahle, Anne B.; Abbott, Elsa A.; Pieri, David C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows.

  7. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    NASA Astrophysics Data System (ADS)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52~0.60 μm for green, 0.63~0.69 μm for red and 0.76~0.89 μm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands are acquired by three lines of CCDs (NIR, red and green). These CCDs are physically separated in the focal plane and their first pixels are not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the three bands with a simple linear transformation. In the camera model developed, this platform instability was compensated for with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs red and green vs red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixels for the NIR vs red and green vs red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower operating temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
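
    The attitude-compensation idea described above, smoothing roll, pitch and yaw with low-order polynomials over an image acquisition, can be sketched as a per-angle least-squares fit; the synthetic attitude data and polynomial orders below are assumptions, and the full camera model is not reproduced.

        # Sketch: fit 3rd/4th-order polynomials to roll, pitch and yaw as a function of
        # normalised scan-line time, to model platform instability across an image.
        import numpy as np

        t = np.linspace(0.0, 1.0, 500)                                  # normalised line time (synthetic)
        rng = np.random.default_rng(0)
        attitude = {
            "roll":  0.02 * np.sin(3 * t) + 0.001 * rng.standard_normal(t.size),
            "pitch": 0.01 * t ** 2 + 0.001 * rng.standard_normal(t.size),
            "yaw":   0.005 * np.cos(2 * t) + 0.001 * rng.standard_normal(t.size),
        }
        orders = {"roll": 4, "pitch": 3, "yaw": 3}                      # assumed polynomial orders

        for name, angle in attitude.items():
            coeff = np.polyfit(t, angle, orders[name])
            resid = np.sqrt(np.mean((np.polyval(coeff, t) - angle) ** 2))
            print(f"{name}: order-{orders[name]} fit, RMS residual {resid:.5f} rad")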

  8. Design framework for a spectral mask for a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Berkner, Kathrin; Shroff, Sapna A.

    2012-01-01

    Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield. Such camera designs, capturing directional ray information, enable applications such as digital refocusing, rotation, or depth estimation. Only a few address capturing spectral information of the scene. It has been demonstrated that by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the pupil plane of the main lens, sampling of the spectral dimension of the plenoptic function is performed. As a result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial against spectral information captured with a single sensor. Little work has been performed so far on analyzing the effects of diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper we demonstrate simulation of a spectrally-coded plenoptic camera optical system via wave propagation analysis, evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for optimization of the spectral mask for a few sample applications.

  9. The Panoramic Camera (Pancam) Investigation on the NASA 2003 Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.

    2003-01-01

    The Panoramic Camera System (Pancam) is part of the Athena science payload to be launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The Pancam imaging system on each rover consists of two major components: a pair of digital CCD cameras, and the Pancam Mast Assembly (PMA), which provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover.

  10. Pancam: A Multispectral Imaging Investigation on the NASA 2003 Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.; Schwochert, M.; Dingizian, A.; Brown, D.; Morris, R. V.; Arneson, H. M.; Johnson, M. J.

    2003-01-01

    One of the six science payload elements carried on each of the NASA Mars Exploration Rovers (MER; Figure 1) is the Panoramic Camera System, or Pancam. Pancam consists of three major components: a pair of digital CCD cameras, the Pancam Mast Assembly (PMA), and a radiometric calibration target. The PMA provides the azimuth and elevation actuation for the cameras as well as a 1.5 meter high vantage point from which to image. The calibration target provides a set of reference color and grayscale standards for calibration validation, and a shadow post for quantification of the direct vs. diffuse illumination of the scene. Pancam is a multispectral, stereoscopic, panoramic imaging system, with a field of regard provided by the PMA that extends across 360° of azimuth and from zenith to nadir, providing a complete view of the scene around the rover in up to 12 unique wavelengths. The major characteristics of Pancam are summarized.

  11. Multispectral and hyperspectral measurements of soldier's camouflage equipment

    NASA Astrophysics Data System (ADS)

    Kastek, Mariusz; Piątkowski, Tadeusz; Dulski, Rafal; Chamberland, Martin; Lagueux, Philippe; Farley, Vincent

    2012-06-01

    In today's electro-optic warfare era, it is more than vital for a nation's defense to possess the most advanced measurement and signature intelligence (MASINT) capabilities. This is critical to gain a strategic advantage in the planning of military operations and deployments. The thermal infrared region of the electromagnetic spectrum is a key region exploited for infrared reconnaissance and surveillance missions. The Military University of Technology has conducted an intensive measurement campaign on various soldiers' camouflage devices with the aim of building a database of infrared signatures. Infrared hyperspectral and broadband/multispectral imaging sensors have become key technologies for performing such signature measurements. The Telops Hyper-Cam LW product represents a unique commercial offering with outstanding performance and versatility for the collection of hyperspectral infrared images. The Hyper-Cam provides infrared imagery of a target (320 × 256 pixels) at very high spectral resolution (down to 0.25 cm-1). Moreover, the Military University of Technology has made use of a suite of scientific-grade commercial infrared cameras to further measure and assess the targets from a broadband/multispectral perspective. The experiment concept and measurement results are presented in this paper.

  12. Rapid overt airborne reconnaissance (ROAR) for mines and obstacles in very shallow water, surf zone, and beach

    NASA Astrophysics Data System (ADS)

    Moran, Steven E.; Austin, William L.; Murray, James T.; Roddier, Nicolas A.; Bridges, Robert; Vercillo, Richard; Stettner, Roger; Phillips, Dave; Bisbee, Al; Witherspoon, Ned H.

    2003-09-01

    Under the Office of Naval Research's Organic Mine Countermeasures Future Naval Capabilities (OMCM FNC) program, Lite Cycles, Inc. is developing an innovative and highly compact airborne active sensor system for mine and obstacle detection in very shallow water (VSW), through the surf zone (SZ) and onto the beach. The system uses an innovative LCI proprietary integrated scanner, detector, and telescope (ISDT) receiver architecture. The ISDT tightly couples all receiver components and LIDAR electronics to achieve the system compaction required for tactical UAV integration while providing a large aperture. It also includes an advanced compact multifunction laser transmitter, an industry-first high-resolution compact 3-D camera, a scanning function for wide-area search, and temporally displaced multiple looks on the fly over the ocean surface for clutter reduction. Additionally, the laser will provide time-multiplexed multi-color output to perform day/night multispectral imaging for beach surveillance. New processing algorithms for mine detection in the very challenging surf-zone clutter environment are under development, which offer the potential for significant processing gains in comparison to the legacy approaches. This paper reviews the legacy system approaches, describes the mission challenges, and provides an overview of the ROAR system architecture.

  13. A multi-sensor lidar, multi-spectral and multi-angular approach for mapping canopy height in boreal forest regions

    USGS Publications Warehouse

    Selkowitz, David J.; Green, Gordon; Peterson, Birgit E.; Wylie, Bruce

    2012-01-01

    Spatially explicit representations of vegetation canopy height over large regions are necessary for a wide variety of inventory, monitoring, and modeling activities. Although airborne lidar data has been successfully used to develop vegetation canopy height maps in many regions, for vast, sparsely populated regions such as the boreal forest biome, airborne lidar is not widely available. An alternative approach to canopy height mapping in areas where airborne lidar data is limited is to use spaceborne lidar measurements in combination with multi-angular and multi-spectral remote sensing data to produce comprehensive canopy height maps for the entire region. This study uses spaceborne lidar data from the Geosciences Laser Altimeter System (GLAS) as training data for regression tree models that incorporate multi-angular and multi-spectral data from the Multi-Angle Imaging Spectroradiometer (MISR) and the Moderate Resolution Imaging SpectroRadiometer (MODIS) to map vegetation canopy height across a 1,300,000 km2 swath of boreal forest in Interior Alaska. Results are compared to in situ height measurements as well as airborne lidar data. Although many of the GLAS-derived canopy height estimates are inaccurate, applying a series of filters incorporating both data associated with the GLAS shots as well as ancillary data such as land cover can identify the majority of height estimates with significant errors, resulting in a filtered dataset with much higher accuracy. Results from the regression tree models indicate that late winter MISR imagery acquired under snow-covered conditions is effective for mapping canopy heights ranging from 5 to 15 m, which includes the vast majority of forests in the region. It appears that neither MISR nor MODIS imagery acquired during the growing season is effective for canopy height mapping, although including summer multi-spectral MODIS data along with winter MISR imagery does appear to provide a slight increase in the accuracy of

  14. Analysis of multispectral signatures and investigation of multi-aspect remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Hieber, R. H.; Sarno, J. E.

    1974-01-01

    Two major aspects of remote sensing with multispectral scanners (MSS) are investigated. The first, multispectral signature analysis, includes the effects on classification performance of systematic variations found in the average signals received from various ground covers, as well as the prediction of these variations with theoretical models of physical processes. The foremost effects studied are those associated with the time of day at which airborne MSS data are collected. Six data collection runs made over the same flight line in a period of five hours are analyzed; it is found that the time span significantly affects classification performance. Variations associated with scan angle are also studied. The second major topic of discussion is multi-aspect remote sensing, a new concept in remote sensing with scanners. Here, data are collected on multiple passes by a scanner that can be tilted to scan forward of the aircraft at different angles on different passes. The use of such spatially registered data to achieve improved classification of agricultural scenes is investigated and found promising. Also considered are the possibilities of extracting, from multi-aspect data, information on the condition of corn canopies and the stand characteristics of forests.

  15. Camera system for multispectral imaging of documents

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Boydston, Kenneth; France, Fenella G.; Knox, Keith T.; Easton, Roger L., Jr.; Toth, Michael B.

    2009-02-01

    A spectral imaging system comprising a 39-Mpixel monochrome camera, LED-based narrowband illumination, and acquisition/control software has been designed for investigations of cultural heritage objects. Notable attributes of this system, referred to as EurekaVision, include: streamlined workflow, flexibility, provision of well-structured data and metadata for downstream processing, and illumination that is safer for the artifacts. The system design builds upon experience gained while imaging the Archimedes Palimpsest and has been used in studies of a number of important objects in the LOC collection. This paper describes practical issues that were considered in the design of EurekaVision to address key research questions for the study of fragile and unique cultural objects over a range of spectral bands. The system is intended to capture important digital records for access by researchers, professionals, and the public. The system was first used for spectral imaging of the 1507 world map by Martin Waldseemueller, the first printed map to reference "America." It was also used to image sections of the 1516 Carta Marina map by the same cartographer for comparative purposes. An updated version of the system is now being utilized by the Preservation Research and Testing Division of the Library of Congress.

  16. Get the Picture?

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Positive Systems has worked in conjunction with Stennis Space Center to design the ADAR System 5500. This is a four-band airborne digital imaging system used to capture multispectral imagery similar to that available from satellite platforms such as Landsat, SPOT and the new generation of high resolution satellites. Positive Systems has provided remote sensing services for the development of digital aerial camera systems and software for commercial aerial imaging applications.

  17. Multispectral Microscopic Imager (MMI): Multispectral Imaging of Geological Materials at a Handlens Scale

    NASA Astrophysics Data System (ADS)

    Farmer, J. D.; Nunez, J. I.; Sellar, R. G.; Gardner, P. B.; Manatt, K. S.; Dingizian, A.; Dudik, M. J.; McDonnell, G.; Le, T.; Thomas, J. A.; Chu, K.

    2011-12-01

    The Multispectral Microscopic Imager (MMI) is a prototype instrument presently under development for future astrobiological missions to Mars. The MMI is designed to be an arm-mounted rover instrument for use in characterizing the microtexture and mineralogy of materials along geological traverses [1,2,3]. Such geological information is regarded as essential for interpreting petrogenesis and geological history, and when acquired in near real-time, can support hypothesis-driven exploration and optimize science return. Correlated microtexture and mineralogy also provide essential data for selecting samples for analysis with onboard lab instruments, and for prioritizing samples for potential Earth return. The MMI design employs multispectral light-emitting diodes (LEDs) and an uncooled focal plane array to achieve the low mass (<1 kg), low cost, and high reliability (no moving parts) required for an arm-mounted instrument on a planetary rover [2,3]. The MMI acquires multispectral reflectance images at 62 μm/pixel, in which each image pixel is comprised of a 21-band VNIR spectrum (0.46 to 1.73 μm). This capability enables the MMI to discriminate and resolve the spatial distribution of minerals and textures at the microscale [2,3]. By extending the spectral range into the infrared and increasing the number of spectral bands, the MMI exceeds the capabilities of current microimagers, including the MER Microscopic Imager (MI) [4], the Phoenix mission Robotic Arm Camera (RAC) [5], and the Mars Science Laboratory's Mars Hand Lens Imager (MAHLI) [6]. In this report we will review the capabilities of the MMI by highlighting recent lab and field applications, including: 1) glove box deployments in the Astromaterials lab at Johnson Space Center to analyze Apollo lunar samples; 2) GeoLab glove box deployments during the 2011 Desert RATS field trials in northern AZ to characterize analog materials collected by astronauts during simulated EVAs; 3) field deployments on Mauna Kea

  18. Multispectral LiDAR Data for Land Cover Classification of Urban Areas

    PubMed Central

    Morsy, Salem; Shaker, Ahmed; El-Rabbany, Ahmed

    2017-01-01

    Airborne Light Detection And Ranging (LiDAR) systems usually operate at a monochromatic wavelength measuring the range and the strength of the reflected energy (intensity) from objects. Recently, multispectral LiDAR sensors, which acquire data at different wavelengths, have emerged. This allows for recording of a diversity of spectral reflectance from objects. In this context, we aim to investigate the use of multispectral LiDAR data in land cover classification using two different techniques. The first is image-based classification, where intensity and height images are created from LiDAR points and then a maximum likelihood classifier is applied. The second is point-based classification, where ground filtering and Normalized Difference Vegetation Indices (NDVIs) computation are conducted. A dataset of an urban area located in Oshawa, Ontario, Canada, is classified into four classes: buildings, trees, roads and grass. An overall accuracy of up to 89.9% and 92.7% is achieved from image classification and 3D point classification, respectively. A radiometric correction model is also applied to the intensity data in order to remove the attenuation due to the system distortion and terrain height variation. The classification process is then repeated, and the results demonstrate that there are no significant improvements achieved in the overall accuracy. PMID:28445432
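
    A minimal sketch of the point-based NDVI step described above, assuming per-point NIR and red intensities and a height-above-ground attribute; the thresholds and class rules are illustrative only, not the paper's.

      import numpy as np

      def ndvi(nir, red, eps=1e-6):
          """Normalized Difference Vegetation Index from per-point intensities."""
          return (nir - red) / (nir + red + eps)

      # Hypothetical per-point attributes: two-channel intensity and height above
      # the filtered ground surface (metres).
      nir_i = np.array([0.62, 0.10, 0.55, 0.08])
      red_i = np.array([0.20, 0.12, 0.18, 0.09])
      height = np.array([6.0, 0.1, 0.2, 4.5])

      v = ndvi(nir_i, red_i)

      # Toy decision rules (thresholds are illustrative):
      labels = np.where((v > 0.3) & (height > 2.0), "tree",
                np.where(v > 0.3, "grass",
                 np.where(height > 2.0, "building", "road")))
      print(labels)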

  19. Multispectral LiDAR Data for Land Cover Classification of Urban Areas.

    PubMed

    Morsy, Salem; Shaker, Ahmed; El-Rabbany, Ahmed

    2017-04-26

    Airborne Light Detection And Ranging (LiDAR) systems usually operate at a monochromatic wavelength measuring the range and the strength of the reflected energy (intensity) from objects. Recently, multispectral LiDAR sensors, which acquire data at different wavelengths, have emerged. This allows for recording of a diversity of spectral reflectance from objects. In this context, we aim to investigate the use of multispectral LiDAR data in land cover classification using two different techniques. The first is image-based classification, where intensity and height images are created from LiDAR points and then a maximum likelihood classifier is applied. The second is point-based classification, where ground filtering and Normalized Difference Vegetation Indices (NDVIs) computation are conducted. A dataset of an urban area located in Oshawa, Ontario, Canada, is classified into four classes: buildings, trees, roads and grass. An overall accuracy of up to 89.9% and 92.7% is achieved from image classification and 3D point classification, respectively. A radiometric correction model is also applied to the intensity data in order to remove the attenuation due to the system distortion and terrain height variation. The classification process is then repeated, and the results demonstrate that there are no significant improvements achieved in the overall accuracy.

  20. Improved capabilities of the Multispectral Atmospheric Mapping Sensor (MAMS)

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Batson, K. Bryan; Atkinson, Robert J.; Moeller, Chris C.; Menzel, W. Paul; James, Mark W.

    1989-01-01

    The Multispectral Atmospheric Mapping Sensor (MAMS) is an airborne instrument being investigated as part of NASA's high altitude research program. Findings from work on this and other instruments have been important as the scientific justification of new instrumentation for the Earth Observing System (EOS). This report discusses changes to the instrument which have led to new capabilities, improved data quality, and more accurate calibration methods. In order to provide a summary of the data collected with MAMS, a complete list of flight dates and locations is provided. For many applications, registration of MAMS imagery with landmarks is required. The navigation of this data on the Man-computer Interactive Data Access System (McIDAS) is discussed. Finally, research applications of the data are discussed and specific examples are presented to show the applicability of these measurements to NASA's Earth System Science (ESS) objectives.

  1. Optimized lighting method of applying shaped-function signal for increasing the dynamic range of LED-multispectral imaging system

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Hu, Yajia; Li, Gang; Lin, Ling

    2018-02-01

    This paper proposes an optimized lighting method of applying a shaped-function signal for increasing the dynamic range of a light emitting diode (LED)-multispectral imaging system. The optimized lighting method is based on the linear response zone of the analog-to-digital conversion (ADC) and the spectral response of the camera. Auxiliary light in a region of higher camera sensitivity is introduced to increase the number of A/D quantization levels that fall within the linear response zone of the ADC and to improve the signal-to-noise ratio. The active light is modulated by the shaped-function signal to improve the gray-scale resolution of the image, while the auxiliary light is modulated by a constant-intensity signal, which makes it easy to acquire the images under active light irradiation. The least squares method is employed to precisely extract the desired images. One wavelength in multispectral imaging based on LED illumination was taken as an example. It has been proven by experiments that the gray-scale resolution and the accuracy of information of the images acquired by the proposed method were both significantly improved. The optimized method opens up avenues for the hyperspectral imaging of biological tissue.
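
    The least-squares extraction step can be sketched as a per-pixel linear fit: each recorded frame is modeled as the image under active light scaled by the known shaped-function modulation plus a constant auxiliary/background term. The modulation waveform, image sizes and noise level below are assumptions for illustration, not the paper's settings.

      import numpy as np

      # Hypothetical acquisition: T frames of size H x W, captured while the active
      # LED follows a known modulation s[t] and the auxiliary LED stays constant.
      T, H, W = 16, 4, 4
      s = np.linspace(0.2, 1.0, T)                      # assumed modulation waveform
      A_true = np.random.rand(H, W)                     # image under active light (unknown)
      B_true = 0.3 * np.random.rand(H, W)               # constant auxiliary + offset (unknown)
      frames = s[:, None, None] * A_true + B_true + 0.01 * np.random.randn(T, H, W)

      # Per-pixel least squares: frames[t] ~ s[t] * A + 1 * B
      X = np.column_stack([s, np.ones_like(s)])         # (T, 2) design matrix
      Y = frames.reshape(T, -1)                         # (T, H*W)
      coef, *_ = np.linalg.lstsq(X, Y, rcond=None)      # (2, H*W)
      A_est = coef[0].reshape(H, W)                     # desired image under active light
      B_est = coef[1].reshape(H, W)

      print(np.abs(A_est - A_true).max())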

  2. Optimized lighting method of applying shaped-function signal for increasing the dynamic range of LED-multispectral imaging system.

    PubMed

    Yang, Xue; Hu, Yajia; Li, Gang; Lin, Ling

    2018-02-01

    This paper proposes an optimized lighting method of applying a shaped-function signal for increasing the dynamic range of a light emitting diode (LED)-multispectral imaging system. The optimized lighting method is based on the linear response zone of the analog-to-digital conversion (ADC) and the spectral response of the camera. Auxiliary light in a region of higher camera sensitivity is introduced to increase the number of A/D quantization levels that fall within the linear response zone of the ADC and to improve the signal-to-noise ratio. The active light is modulated by the shaped-function signal to improve the gray-scale resolution of the image, while the auxiliary light is modulated by a constant-intensity signal, which makes it easy to acquire the images under active light irradiation. The least squares method is employed to precisely extract the desired images. One wavelength in multispectral imaging based on LED illumination was taken as an example. It has been proven by experiments that the gray-scale resolution and the accuracy of information of the images acquired by the proposed method were both significantly improved. The optimized method opens up avenues for the hyperspectral imaging of biological tissue.

  3. SWUIS-A: A Versatile, Low-Cost UV/VIS/IR Imaging System for Airborne Astronomy and Aeronomy Research

    NASA Technical Reports Server (NTRS)

    Durda, Daniel D.; Stern, S. Alan; Tomlinson, William; Slater, David C.; Vilas, Faith

    2001-01-01

    We have developed and successfully flight-tested on 14 different airborne missions the hardware and techniques for routinely conducting valuable astronomical and aeronomical observations from high-performance, two-seater military-type aircraft. The SWUIS-A (Southwest Universal Imaging System - Airborne) system consists of an image-intensified CCD camera with broad band response from the near-UV to the near IR, high-quality foreoptics, a miniaturized video recorder, an aircraft-to-camera power and telemetry interface with associated camera controls, and associated cables, filters, and other minor equipment. SWUIS-A's suite of high-quality foreoptics gives it selectable, variable focal length/variable field-of-view capabilities. The SWUIS-A camera frames at 60 Hz video rates, which is a key requirement for both jitter compensation and high time resolution (useful for occultation, lightning, and auroral studies). Broadband SWUIS-A image coadds can exceed a limiting magnitude of V = 10.5 in <1 sec with dark sky conditions. A valuable attribute of SWUIS-A airborne observations is the fact that the astronomer flies with the instrument, thereby providing Space Shuttle-like "payload specialist" capability to "close-the-loop" in real-time on the research done on each research mission. Key advantages of the small, high-performance aircraft on which we can fly SWUIS-A include significant cost savings over larger, more conventional airborne platforms, worldwide basing obviating the need for expensive, campaign-style movement of specialized large aircraft and their logistics support teams, and ultimately faster reaction times to transient events. Compared to ground-based instruments, airborne research platforms offer superior atmospheric transmission, the mobility to reach remote and often-times otherwise unreachable locations over the Earth, and virtually-guaranteed good weather for observing the sky. Compared to space-based instruments, airborne platforms typically offer

  4. Multispectral analysis tools can increase utility of RGB color images in histology

    NASA Astrophysics Data System (ADS)

    Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard

    2018-04-01

    Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.

  5. Hemodynamic and morphologic responses in mouse brain during acute head injury imaged by multispectral structured illumination

    NASA Astrophysics Data System (ADS)

    Volkov, Boris; Mathews, Marlon S.; Abookasis, David

    2015-03-01

    Multispectral imaging has received significant attention over the last decade as it integrates spectroscopy, imaging, and tomographic analysis concurrently to acquire both spatial and spectral information from biological tissue. In the present study, a multispectral setup based on projection of structured illumination at several near-infrared wavelengths and at different spatial frequencies is applied to quantitatively assess brain function before, during, and after the onset of traumatic brain injury in an intact mouse brain (n=5). For the production of head injury, we used the weight drop method, in which a cylindrical metallic rod falling along a metal tube strikes the mouse's head. Structured light was projected onto the scalp surface and diffusely reflected light was recorded by a CCD camera positioned perpendicular to the mouse head. Following data analysis, we were able to concurrently show a series of hemodynamic and morphologic changes over time, including higher deoxyhemoglobin, reduction in oxygen saturation, and cell swelling, in comparison with baseline measurements. Overall, the results demonstrate the capability of multispectral imaging based on structured illumination to detect and map brain tissue optical and physiological properties following brain injury in a simple, noninvasive and noncontact manner.

  6. Three-Dimensional Reconstruction from Single Image Base on Combination of CNN and Multi-Spectral Photometric Stereo.

    PubMed

    Lu, Liang; Qi, Lin; Luo, Yisong; Jiao, Hengchao; Dong, Junyu

    2018-03-02

    Multi-spectral photometric stereo can recover pixel-wise surface normal from a single RGB image. The difficulty lies in that the intensity in each channel is the tangle of illumination, albedo and camera response; thus, an initial estimate of the normal is required in optimization-based solutions. In this paper, we propose to make a rough depth estimation using the deep convolutional neural network (CNN) instead of using depth sensors or binocular stereo devices. Since high-resolution ground-truth data is expensive to obtain, we designed a network and trained it with rendered images of synthetic 3D objects. We use the model to predict initial normal of real-world objects and iteratively optimize the fine-scale geometry in the multi-spectral photometric stereo framework. The experimental results illustrate the improvement of the proposed method compared with existing methods.
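
    The classic closed-form relation underlying multi-spectral photometric stereo (before any CNN-based initialization, as described above) can be sketched as solving a 3x3 linear system per pixel; the per-channel light directions and test image below are assumed, not values from the paper.

      import numpy as np

      # Assumed effective light directions for the R, G and B channels (rows),
      # e.g. obtained from a prior calibration step.
      L = np.array([[ 0.5,  0.0, 0.866],
                    [-0.4,  0.3, 0.866],
                    [ 0.0, -0.5, 0.866]])

      def normals_from_rgb(rgb):
          """Recover per-pixel surface normals from a single RGB image (H, W, 3),
          assuming Lambertian shading i_c = albedo * (l_c . n) in each channel."""
          h, w, _ = rgb.shape
          g = np.linalg.solve(L, rgb.reshape(-1, 3).T)   # (3, H*W), g = albedo * n
          norm = np.linalg.norm(g, axis=0) + 1e-8
          normals = (g / norm).T.reshape(h, w, 3)        # unit normals
          albedo = norm.reshape(h, w)
          return normals, albedo

      rgb = np.random.rand(8, 8, 3)                      # placeholder single RGB frame
      normals, albedo = normals_from_rgb(rgb)
      print(normals.shape, albedo.shape)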

  7. Three-Dimensional Reconstruction from Single Image Base on Combination of CNN and Multi-Spectral Photometric Stereo

    PubMed Central

    Lu, Liang; Qi, Lin; Luo, Yisong; Jiao, Hengchao; Dong, Junyu

    2018-01-01

    Multi-spectral photometric stereo can recover pixel-wise surface normal from a single RGB image. The difficulty lies in that the intensity in each channel is the tangle of illumination, albedo and camera response; thus, an initial estimate of the normal is required in optimization-based solutions. In this paper, we propose to make a rough depth estimation using the deep convolutional neural network (CNN) instead of using depth sensors or binocular stereo devices. Since high-resolution ground-truth data is expensive to obtain, we designed a network and trained it with rendered images of synthetic 3D objects. We use the model to predict initial normal of real-world objects and iteratively optimize the fine-scale geometry in the multi-spectral photometric stereo framework. The experimental results illustrate the improvement of the proposed method compared with existing methods. PMID:29498703

  8. Multispectral imaging probe

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Descour, Michael R.; Armour, David L.; Craig, Marcus J.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector.

  9. Multispectral imaging probe

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Descour, M.R.; Armour, D.L.; Craig, M.J.; Richards-Kortum, R.

    1999-07-27

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector. 8 figs.

  10. Real-time moving objects detection and tracking from airborne infrared camera

    NASA Astrophysics Data System (ADS)

    Zingoni, Andrea; Diani, Marco; Corsini, Giovanni

    2017-10-01

    Detecting and tracking moving objects in real-time from an airborne infrared (IR) camera offers interesting possibilities in video surveillance, remote sensing and computer vision applications, such as monitoring large areas simultaneously, quickly changing the point of view on the scene and pursuing objects of interest. To fully exploit such a potential, versatile solutions are needed, but, in the literature, the majority of them works only under specific conditions about the considered scenario, the characteristics of the moving objects or the aircraft movements. In order to overcome these limitations, we propose a novel approach to the problem, based on the use of a cheap inertial navigation system (INS), mounted on the aircraft. To exploit jointly the information contained in the acquired video sequence and the data provided by the INS, a specific detection and tracking algorithm has been developed. It consists of three main stages performed iteratively on each acquired frame: the detection stage, in which a coarse detection map is computed using a local statistic that is both fast to calculate and robust to noise and self-deletion of the targeted objects; the registration stage, in which the position of the detected objects is coherently reported on a common reference frame by exploiting the INS data; and the tracking stage, in which steady objects are rejected, moving objects are tracked, and an estimation of their future position is computed, to be used in the subsequent iteration. The algorithm has been tested on a large dataset of simulated IR video sequences, recreating different environments and different movements of the aircraft. Promising results have been obtained, both in terms of detection and false alarm rate, and in terms of accuracy in the estimation of position and velocity of the objects. In addition, for each frame, the detection and tracking map has been generated by the algorithm, before the acquisition of the subsequent frame, proving its
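
    A minimal sketch of a coarse detection stage based on a local statistic, in the spirit of the description above; the window size, threshold and synthetic frame are assumptions for illustration only.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def coarse_detection_map(frame, win=15, k=3.0):
          """Flag pixels that deviate strongly from the local background statistics.
          A simple local mean / local std anomaly test; win and k are illustrative."""
          frame = frame.astype(np.float64)
          mean = uniform_filter(frame, size=win)
          sq_mean = uniform_filter(frame**2, size=win)
          std = np.sqrt(np.maximum(sq_mean - mean**2, 1e-12))
          return np.abs(frame - mean) > k * std          # boolean coarse detection map

      ir_frame = np.random.rand(128, 128)                # placeholder IR frame
      ir_frame[60:64, 60:64] += 3.0                      # synthetic warm object
      print(coarse_detection_map(ir_frame).sum())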

  11. Multispectral Snapshot Imagers Onboard Small Satellite Formations for Multi-Angular Remote Sensing

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; Hewagama, Tilak; Georgiev, Georgi; Pasquale, Bert; Aslam, Shahid; Gatebe, Charles K.

    2017-01-01

    Multispectral snapshot imagers are capable of producing 2D spatial images with a single exposure at selected, numerous wavelengths using the same camera, and therefore operate differently from push broom or whiskbroom imagers. They are payloads of choice in multi-angular, multi-spectral imaging missions that use small satellites flying in controlled formation to retrieve Earth science measurements dependent on the target's Bidirectional Reflectance Distribution Function (BRDF). Narrow fields of view are needed to capture images with moderate spatial resolution. This paper quantifies the dependencies of the imager's optical system, spectral elements and camera on the requirements of the formation mission and their impact on performance metrics such as spectral range, swath and signal to noise ratio (SNR). All variables and metrics have been generated from a comprehensive payload design tool. The baseline optical parameters selected (diameter 7 cm, focal length 10.5 cm, pixel size 20 micron, field of view 1.15 deg) and snapshot imaging technologies are available. The spectral components shortlisted were waveguide spectrometers, acousto-optic tunable filters (AOTF), electronically actuated Fabry-Perot interferometers, and integral field spectrographs. Qualitative evaluation favored AOTFs because of their low weight, small size, and flight heritage. Quantitative analysis showed that waveguide spectrometers perform better in terms of achievable swath (10-90 km) and SNR (greater than 20) for 86 wavebands, but the data volume generated will need very high bandwidth communication to downlink. AOTFs meet the external data volume caps as well as the minimum spectral (waveband) and radiometric (SNR) requirements, and are therefore found to be currently feasible in spite of lower swath and SNR.
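
    Using the baseline optics quoted above, the nadir ground sample distance and swath follow from simple geometry; the orbit altitude below is an assumed value, since the abstract does not state one.

      import math

      # Baseline optics from the abstract; the orbit altitude is an assumption.
      focal_length_m = 0.105       # 10.5 cm
      pixel_pitch_m = 20e-6        # 20 micron
      fov_deg = 1.15
      altitude_m = 500e3           # assumed LEO altitude

      gsd = altitude_m * pixel_pitch_m / focal_length_m              # ground sample distance
      swath = 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))

      print(f"GSD ~ {gsd:.1f} m, swath ~ {swath / 1000.0:.1f} km")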

  12. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    USDA-ARS?s Scientific Manuscript database

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  13. Application of Multilayer Perceptron with Automatic Relevance Determination on Weed Mapping Using UAV Multispectral Imagery.

    PubMed

    Tamouridou, Afroditi A; Alexandridis, Thomas K; Pantazi, Xanthoula E; Lagopodi, Anastasia L; Kashefi, Javid; Kasampalis, Dimitris; Kontouris, Georgios; Moshou, Dimitrios

    2017-10-11

    Remote sensing techniques are routinely used in plant species discrimination and weed mapping. In the presented work, successful Silybum marianum detection and mapping using multilayer neural networks is demonstrated. A multispectral camera (green-red-near infrared) attached to a fixed-wing unmanned aerial vehicle (UAV) was utilized for the acquisition of high-resolution images (0.1 m resolution). The Multilayer Perceptron with Automatic Relevance Determination (MLP-ARD) was used to identify S. marianum among other vegetation, mostly Avena sterilis L. The three spectral bands of Red, Green, Near Infrared (NIR) and the texture layer resulting from local variance were used as input. The S. marianum identification rates using MLP-ARD reached an accuracy of 99.54%. The study had a one-year duration, meaning that the results are specific, although the accuracy shows the interesting potential of S. marianum mapping with MLP-ARD on multispectral UAV imagery.
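
    A minimal sketch of a multilayer-perceptron pixel classifier on the four inputs named above (Green, Red, NIR and local-variance texture). It uses scikit-learn's plain MLP on synthetic data; the Automatic Relevance Determination prior of MLP-ARD is not reproduced here.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      # Hypothetical per-pixel feature table: Green, Red, NIR reflectance plus a
      # local-variance texture layer; label 1 = target weed, 0 = other vegetation.
      rng = np.random.default_rng(0)
      X = rng.random((2000, 4))
      y = (X[:, 2] - X[:, 1] + 0.2 * X[:, 3] > 0.4).astype(int)   # synthetic labels

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      # Plain multilayer perceptron; ARD-style relevance weighting is omitted.
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
      clf.fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))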

  14. Dual light-emitting diode-based multichannel microscopy for whole-slide multiplane, multispectral and phase imaging.

    PubMed

    Liao, Jun; Wang, Zhe; Zhang, Zibang; Bian, Zichao; Guo, Kaikai; Nambiar, Aparna; Jiang, Yutong; Jiang, Shaowei; Zhong, Jingang; Choma, Michael; Zheng, Guoan

    2018-02-01

    We report the development of a multichannel microscopy platform for whole-slide multiplane, multispectral and phase imaging. We use trinocular heads to split the beam path into 6 independent channels and employ a camera array for parallel data acquisition, achieving a maximum data throughput of approximately 1 gigapixel per second. To perform single-frame rapid autofocusing, we place 2 near-infrared light-emitting diodes (LEDs) at the back focal plane of the condenser lens to illuminate the sample from 2 different incident angles. A hot mirror is used to direct the near-infrared light to an autofocusing camera. For multiplane whole-slide imaging (WSI), we acquire 6 different focal planes of a thick specimen simultaneously. For multispectral WSI, we relay the 6 independent image planes to the same focal position and simultaneously acquire information at 6 spectral bands. For whole-slide phase imaging, we acquire images at 3 focal positions simultaneously and use the transport-of-intensity equation to recover the phase information. We also provide an open-source design to further increase the number of channels from 6 to 15. The reported platform provides a simple solution for multiplexed fluorescence imaging and multimodal WSI. Acquiring an instant focal stack without z-scanning may also enable fast 3-dimensional dynamic tracking of various biological samples.

  15. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring.

    PubMed

    Allison, Robert S; Johnston, Joshua M; Craig, Gregory; Jennings, Sion

    2016-08-18

    For decades detection and monitoring of forest and other wildland fires has relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems including a discussion of the objective evaluation of these systems in a realistic context.

  16. Airborne Optical and Thermal Remote Sensing for Wildfire Detection and Monitoring

    PubMed Central

    Allison, Robert S.; Johnston, Joshua M.; Craig, Gregory; Jennings, Sion

    2016-01-01

    For decades detection and monitoring of forest and other wildland fires has relied heavily on aircraft (and satellites). Technical advances and improved affordability of both sensors and sensor platforms promise to revolutionize the way aircraft detect, monitor and help suppress wildfires. Sensor systems like hyperspectral cameras, image intensifiers and thermal cameras that have previously been limited in use due to cost or technology considerations are now becoming widely available and affordable. Similarly, new airborne sensor platforms, particularly small, unmanned aircraft or drones, are enabling new applications for airborne fire sensing. In this review we outline the state of the art in direct, semi-automated and automated fire detection from both manned and unmanned aerial platforms. We discuss the operational constraints and opportunities provided by these sensor systems including a discussion of the objective evaluation of these systems in a realistic context. PMID:27548174

  17. Active/passive scanning. [airborne multispectral laser scanners for agricultural and water resources applications

    NASA Technical Reports Server (NTRS)

    Woodfill, J. R.; Thomson, F. J.

    1979-01-01

    The paper deals with the design, construction, and applications of an active/passive multispectral scanner combining lasers with conventional passive remote sensors. An application investigation was first undertaken to identify remote sensing applications where active/passive scanners (APS) would provide improvement over current means. Calibration techniques and instrument sensitivity are evaluated to provide predictions of the APS's capability to meet user needs. A preliminary instrument design was developed from the initial conceptual scheme. A design review settled the issues of worthwhile applications, calibration approach, hardware design, and laser complement. Next, a detailed mechanical design was drafted and construction of the APS commenced. The completed APS was tested and calibrated in the laboratory, then installed in a C-47 aircraft and ground tested. Several flight tests completed the test program.

  18. A multispectral automatic target recognition application for maritime surveillance, search, and rescue

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Reed, Scott; Podobna, Yuliya; Vazquez, Jose; Boucher, Cynthia

    2010-04-01

    Due to increased security concerns, the commitment to monitor and maintain security in the maritime environment is increasingly a priority. A country's coast is the most vulnerable area for the incursion of illegal immigrants, terrorists and contraband. This work illustrates the ability of a low-cost, light-weight, multi-spectral, multi-channel imaging system to handle the environment and see under difficult marine conditions. The system and its implemented detecting and tracking technologies should be organic to the maritime homeland security community for search and rescue, fisheries, defense, and law enforcement. It is tailored for airborne and ship based platforms to detect, track and monitor suspected objects (such as semi-submerged targets like marine mammals, vessels in distress, and drug smugglers). In this system, automated detection and tracking technology is used to detect, classify and localize potential threats or objects of interest within the imagery provided by the multi-spectral system. These algorithms process the sensor data in real time, thereby providing immediate feedback when features of interest have been detected. A supervised detection system based on Haar features and Cascade Classifiers is presented and results are provided on real data. The system is shown to be extendable and reusable for a variety of different applications.
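
    A minimal sketch of applying a trained Haar-feature cascade to a single channel, as the supervised detection system described above does; the cascade file and image names are hypothetical, and training the cascade itself is out of scope.

      import cv2

      # A cascade trained offline on labelled maritime imagery would be loaded here;
      # the file name is hypothetical.
      cascade = cv2.CascadeClassifier("vessel_cascade.xml")

      frame = cv2.imread("multispectral_channel.png", cv2.IMREAD_GRAYSCALE)
      if frame is not None and not cascade.empty():
          # detectMultiScale slides the Haar-feature cascade over an image pyramid.
          detections = cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=3)
          for (x, y, w, h) in detections:
              cv2.rectangle(frame, (x, y), (x + w, y + h), 255, 2)
          cv2.imwrite("detections.png", frame)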

  19. Dual-emissive quantum dots for multispectral intraoperative fluorescence imaging.

    PubMed

    Chin, Patrick T K; Buckle, Tessa; Aguirre de Miguel, Arantxa; Meskers, Stefan C J; Janssen, René A J; van Leeuwen, Fijs W B

    2010-09-01

    Fluorescence molecular imaging is rapidly gaining popularity in image-guided surgery applications. To help develop its full surgical potential, it remains a challenge to generate dual-emissive imaging agents that allow for combined visible assessment and sensitive camera-based imaging. To this end, we now describe multispectral InP/ZnS quantum dots (QDs) that exhibit a bright visible green/yellow exciton emission combined with a long-lived far-red defect emission. The intensity of the latter emission was enhanced by X-ray irradiation and allows for: 1) inverted QD density dependent defect emission intensity, showing improved efficacies at lower QD densities, and 2) detection without direct illumination and interference from autofluorescence.

  20. Research on airborne infrared leakage detection of natural gas pipeline

    NASA Astrophysics Data System (ADS)

    Tan, Dongjie; Xu, Bin; Xu, Xu; Wang, Hongchao; Yu, Dongliang; Tian, Shengjie

    2011-12-01

    An airborne laser remote sensing technology is proposed to detect natural gas pipeline leakage from a helicopter carrying a detector that can detect traces of methane on the ground at high spatial resolution. The principle of the airborne laser remote sensing system is based on tunable diode laser absorption spectroscopy (TDLAS). The system consists of an optical unit containing the laser, camera and helicopter mount, an electronic unit with a DGPS antenna, a notebook computer and a pilot monitor, and is mounted on a helicopter. The principle and the architecture of the airborne laser remote sensing system are presented. Field test experiments were carried out on the West-East Natural Gas Pipeline of China, and the results show that the airborne detection method is suitable for detecting gas leaks from pipelines on plains, deserts and hills, but is not suitable for areas with large altitude variations.

  1. Efficient Feature Extraction and Likelihood Fusion for Vehicle Tracking in Low Frame Rate Airborne Video

    DTIC Science & Technology

    2010-07-01

    imagery, persistent sensor array. I. Introduction: New device fabrication technologies and heterogeneous embedded processors have led to the emergence of a ... geometric occlusions between target and sensor, motion blur, urban scene complexity, and high data volumes. In practical terms the targets are small ... distributed airborne narrow-field-of-view video sensor networks. Airborne camera arrays combined with computational photography techniques enable the

  2. Recent improvements in hydrometeor sampling using airborne holography

    NASA Astrophysics Data System (ADS)

    Stith, J. L.; Bansemer, A.; Glienke, S.; Shaw, R. A.; Aquino, J.; Fugal, J. P.

    2017-12-01

    Airborne digital holography provides a new technique to study the sizes, shapes and locations of hydrometeors. Airborne holographic cameras are able to capture more optical information than traditional airborne hydrometeor instruments, which allows for more detailed information, such as the location and shape of individual hydrometeors over a relatively wide range of sizes. These cameras can be housed in an anti-shattering probe arm configuration, which minimizes the effects of probe tip shattering. Holographic imagery, with its three dimensional view of hydrometeor spacing, is also well suited to detecting shattering events when present. A major problem with digital holographic techniques has been the amount of machine time and human analysis involved in analyzing holographic data. Here, we present some recent examples showing how holographic analysis can improve our measurements of liquid and ice particles and we describe a format we have developed for routine archiving of holographic data, so that processed results can be utilized more routinely by a wider group of investigators. We present a side-by-side comparison of the imagery obtained from holographic reconstruction of ice particles from a holographic camera (HOLODEC) with imagery from a 3VCPI instrument, which utilizes a tube-based sampling geometry. Both instruments were carried on the NSF/NCAR GV aircraft. In a second application of holographic imaging, we compare measurements of cloud droplets from a Cloud Droplet Probe (CDP) with simultaneous measurements from HOLODEC. In some cloud regions the CDP data exhibits a bimodal size distribution, while the more local data from HOLODEC suggests that two mono-modal size distributions are present in the cloud and that the bimodality observed in the CDP is due to the averaging length. Thus, the holographic techniques have the potential to improve our understanding of the warm rain process in future airborne field campaigns. The development of this instrument has

  3. Computationally efficient target classification in multispectral image data with Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Cavigelli, Lukas; Bernath, Dominic; Magno, Michele; Benini, Luca

    2016-10-01

    Detecting and classifying targets in video streams from surveillance cameras is a cumbersome, error-prone and expensive task. Often, the incurred costs are prohibitive for real-time monitoring. This leads to data being stored locally or transmitted to a central storage site for post-incident examination. The required communication links and archiving of the video data are still expensive and this setup excludes preemptive actions to respond to imminent threats. An effective way to overcome these limitations is to build a smart camera that analyzes the data on-site, close to the sensor, and transmits alerts when relevant video sequences are detected. Deep neural networks (DNNs) have come to outperform humans in visual classification tasks and are also performing exceptionally well on other computer vision tasks. The concept of DNNs and Convolutional Networks (ConvNets) can easily be extended to make use of higher-dimensional input data such as multispectral data. We explore this opportunity in terms of achievable accuracy and required computational effort. To analyze the precision of DNNs for scene labeling in an urban surveillance scenario we have created a dataset with 8 classes obtained in a field experiment. We combine an RGB camera with a 25-channel VIS-NIR snapshot sensor to assess the potential of multispectral image data for target classification. We evaluate several new DNNs, showing that the spectral information fused together with the RGB frames can be used to improve the accuracy of the system or to achieve similar accuracy with a 3x smaller computation effort. We achieve a very high per-pixel accuracy of 99.1%. Even for scarcely occurring, but particularly interesting classes, such as cars, 75% of the pixels are labeled correctly with errors occurring only around the border of the objects. This high accuracy was obtained with a training set of only 30 labeled images, paving the way for fast adaptation to various application scenarios.
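
    A minimal sketch of how a small ConvNet can ingest fused RGB plus VIS-NIR data simply by widening the first convolution's input channels (28 = 3 + 25 here); the architecture is illustrative, not the network evaluated in the paper.

      import torch
      import torch.nn as nn

      class SmallMultispectralNet(nn.Module):
          """Tiny ConvNet whose first layer takes 28 input channels instead of 3."""
          def __init__(self, in_channels=28, n_classes=8):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.classifier = nn.Linear(64, n_classes)

          def forward(self, x):
              x = self.features(x).flatten(1)
              return self.classifier(x)

      net = SmallMultispectralNet()
      patch = torch.randn(4, 28, 32, 32)       # batch of fused RGB + VIS-NIR patches
      print(net(patch).shape)                  # torch.Size([4, 8])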

  4. Multispectral determination of vegetative cover in corn crop canopy

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.

    1972-01-01

    The relationship between different amounts of vegetative ground cover and the energy reflected by corn canopies was investigated. Low altitude photography and an airborne multispectral scanner were used to measure this reflected energy. Field plots were laid out, representing four growth stages of corn. Two plot locations were chosen, one on a very dark and one on a very light surface soil. Color and color infrared photographs were taken from a vertical distance of 10 m. Estimates of ground cover were made from these photographs and were related to field measurements of leaf area index. Ground cover could be predicted from leaf area index measurements by a second order equation. Microdensitometry and digitization of the three separated dye layers of color infrared film showed that the near infrared dye layer is most valuable in ground cover determinations. Computer analysis of the digitized photography provided an accurate method of determining percent ground cover.
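
    The second-order relationship between leaf area index and percent ground cover mentioned above can be sketched as a simple polynomial fit; the paired values below are illustrative placeholders, not the study's measurements.

      import numpy as np

      # Hypothetical paired field measurements: leaf area index and photo-derived
      # percent ground cover.
      lai   = np.array([0.3, 0.8, 1.5, 2.2, 3.0, 3.8])
      cover = np.array([8.0, 22.0, 45.0, 63.0, 80.0, 90.0])   # percent

      # Second-order polynomial fit, as the abstract describes.
      a, b, c = np.polyfit(lai, cover, 2)
      predict_cover = lambda x: a * x**2 + b * x + c
      print(f"cover(LAI=2.5) ~ {predict_cover(2.5):.1f} %")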

  5. Gimbaled multispectral imaging system and method

    DOEpatents

    Brown, Kevin H.; Crollett, Seferino; Henson, Tammy D.; Napier, Matthew; Stromberg, Peter G.

    2016-01-26

    A gimbaled multispectral imaging system and method is described herein. In a general embodiment, the gimbaled multispectral imaging system has a cross support that defines a first gimbal axis and a second gimbal axis, wherein the cross support is rotatable about the first gimbal axis. The gimbaled multispectral imaging system comprises a telescope fixed to an upper end of the cross support, such that rotation of the cross support about the first gimbal axis alters the tilt of the telescope. The gimbaled multispectral imaging system includes optics that facilitate on-gimbal detection of visible light and off-gimbal detection of infrared light.

  6. For geological investigations with airborne thermal infrared multispectral images: Transfer of calibration from laboratory spectrometer to TIMS as alternative for removing atmospheric effects

    NASA Technical Reports Server (NTRS)

    Edgett, Kenneth S.; Anderson, Donald L.

    1995-01-01

    This paper describes an empirical method to correct TIMS (Thermal Infrared Multispectral Scanner) data for atmospheric effects by transferring calibration from a laboratory thermal emission spectrometer to the TIMS multispectral image. The method does so by comparing the laboratory spectra of samples gathered in the field with TIMS 6-point spectra for pixels at the location of field sampling sites. The transference of calibration also makes it possible to use spectra from the laboratory as endmembers in unmixing studies of TIMS data.

  7. Uav Cameras: Overview and Geometric Calibration Benchmark

    NASA Astrophysics Data System (ADS)

    Cramer, M.; Przybilla, H.-J.; Zurhorst, A.

    2017-08-01

    Different UAV platforms and sensors are already used in mapping, many of them equipped with (sometimes modified) cameras known from the consumer market. Even though these systems normally fulfil their requested mapping accuracy, the question arises: which system performs best? This calls for a benchmark to check selected UAV-based camera systems in well-defined, reproducible environments. Such a benchmark is attempted in this work. Nine different cameras used on UAV platforms, representing typical camera classes, are considered. The focus here is on geometry, which is tightly linked to the process of geometric calibration of the system. In most applications the calibration is performed in-situ, i.e. calibration parameters are obtained as part of the project data itself. This is often motivated by the fact that consumer cameras do not keep constant geometry and thus cannot be seen as metric cameras. Still, some of the commercial systems are quite stable over time, as has been proven from repeated (terrestrial) calibration runs. Already (pre-)calibrated systems may offer advantages, especially when the block geometry of the project does not allow for a stable and sufficient in-situ calibration. Especially for such a scenario, close-to-metric UAV cameras may have advantages. Empirical airborne test flights over a calibration field have shown how block geometry influences the estimated calibration parameters and how consistently the parameters from lab calibration can be reproduced.

  8. Airborne laser altimetry and multispectral imagery for modeling Golden-cheeked Warbler (Setophaga chrysoparia) density

    Treesearch

    Steven E. Sesnie; James M. Mueller; Sarah E. Lehnen; Scott M. Rowin; Jennifer L. Reidy; Frank R. Thompson

    2016-01-01

    Robust models of wildlife population size, spatial distribution, and habitat relationships are needed to more effectively monitor endangered species and prioritize habitat conservation efforts. Remotely sensed data such as airborne laser altimetry (LiDAR) and digital color infrared (CIR) aerial photography combined with well-designed field studies can help fill these...

  9. Multispectral radiation envelope characteristics of aerial infrared targets

    NASA Astrophysics Data System (ADS)

    Kou, Tian; Zhou, Zhongliang; Liu, Hongqiang; Yang, Yuanzhi; Lu, Chunguang

    2018-07-01

    Multispectral detection signals are relatively stable and complementary to single spectral detection signals with deficiencies of severe scintillation and poor anti-interference. To take advantage of multispectral radiation characteristics in the application of infrared target detection, the concept of a multispectral radiation envelope is proposed. To build the multispectral radiation envelope model, the temperature distribution of an aerial infrared target is calculated first. By considering the coupling heat transfer process, the heat balance equation is built by using the node network, and the convective heat transfer laws as a function of target speed are uncovered. Then, the tail flame temperature distribution model is built and the temperature distributions at different horizontal distances are calculated. Second, to obtain the optimal detection angles, envelope models of reflected background multispectral radiation and target multispectral radiation are built. Finally, the envelope characteristics of the aerial target multispectral radiation are analyzed in different wavebands in detail. The results we obtained reflect Wien's displacement law and prove the effectiveness and reasonableness of the envelope model, and also indicate that the major difference between multispectral wavebands is greatly influenced by the target speed. Moreover, optimal detection angles are obtained by numerical simulation, and these are very important for accurate and fast target detection, attack decision-making and developing multispectral detection platforms.
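
    The band radiance behavior underlying such envelope models follows from Planck's law integrated over a detection band, with the peak wavelength given by Wien's displacement law; the temperature and bands below are illustrative assumptions, not values from the paper.

      import numpy as np

      H = 6.626e-34   # Planck constant (J s)
      C = 2.998e8     # speed of light (m/s)
      KB = 1.381e-23  # Boltzmann constant (J/K)

      def planck_radiance(wavelength_m, temp_k):
          """Blackbody spectral radiance (W sr^-1 m^-3) at a given wavelength and temperature."""
          a = 2.0 * H * C**2 / wavelength_m**5
          return a / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0)

      temp = 600.0                                 # illustrative hot-part temperature (K)
      bands = {"3-5 um": (3e-6, 5e-6), "8-12 um": (8e-6, 12e-6)}
      for name, (lo, hi) in bands.items():
          wl = np.linspace(lo, hi, 200)
          band_radiance = np.sum(planck_radiance(wl, temp)) * (wl[1] - wl[0])
          print(name, f"{band_radiance:.3e} W sr^-1 m^-2")

      # Wien's displacement law: the peak shifts to shorter wavelengths as temperature rises.
      print("peak wavelength (um):", 2.898e-3 / temp * 1e6)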

  10. Application of Multilayer Perceptron with Automatic Relevance Determination on Weed Mapping Using UAV Multispectral Imagery

    PubMed Central

    Tamouridou, Afroditi A.; Lagopodi, Anastasia L.; Kashefi, Javid; Kasampalis, Dimitris; Kontouris, Georgios; Moshou, Dimitrios

    2017-01-01

    Remote sensing techniques are routinely used for plant species discrimination and weed mapping. In the presented work, successful Silybum marianum detection and mapping using multilayer neural networks is demonstrated. A multispectral camera (green-red-near-infrared) attached to a fixed-wing unmanned aerial vehicle (UAV) was utilized for the acquisition of high-resolution images (0.1 m resolution). The Multilayer Perceptron with Automatic Relevance Determination (MLP-ARD) was used to identify S. marianum among other vegetation, mostly Avena sterilis L. The three spectral bands (green, red, near-infrared (NIR)) and a texture layer derived from local variance were used as input. The S. marianum identification rates using MLP-ARD reached an accuracy of 99.54%. The study covered a single year, so the results are site- and season-specific, although the accuracy shows the promising potential of S. marianum mapping with MLP-ARD on multispectral UAV imagery. PMID:29019957
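
    The sketch below illustrates the input layout described above (three bands plus a local-variance texture layer), using scikit-learn's plain MLPClassifier as a stand-in because automatic relevance determination is not available there; the imagery and labels are synthetic placeholders.

      import numpy as np
      from scipy.ndimage import uniform_filter
      from sklearn.neural_network import MLPClassifier

      def local_variance(band, size=5):
          """Local-variance texture layer: E[x^2] - E[x]^2 in a moving window."""
          mean = uniform_filter(band, size)
          mean_sq = uniform_filter(band * band, size)
          return mean_sq - mean * mean

      # Synthetic green/red/NIR image and labels stand in for the UAV data
      rng = np.random.default_rng(0)
      g, r, nir = (rng.random((100, 100)) for _ in range(3))
      labels = (nir - r > 0.1).astype(int)          # placeholder ground truth

      features = np.stack([g, r, nir, local_variance(nir)], axis=-1).reshape(-1, 4)
      y = labels.ravel()

      # Plain MLP as a stand-in for MLP-ARD (ARD would prune irrelevant inputs)
      clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=300, random_state=0)
      clf.fit(features, y)
      print("training accuracy:", clf.score(features, y))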

  11. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera.

    PubMed

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated hemoglobin and that of deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method by which to evaluate the pathophysiological conditions of in vivo brain due to the loss of tissue viability.
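
    A minimal sketch of Wiener-style estimation of a multiband spectrum from an RGB response, assuming synthetic training spectra and an invented sensitivity matrix; it only illustrates the linear estimator structure, not the authors' calibrated pipeline or their Monte Carlo regression step.

      import numpy as np

      rng = np.random.default_rng(1)
      n_wl, n_train = 9, 200                 # nine wavelengths, training samples

      # Synthetic stand-ins: training reflectance spectra (columns) and an RGB
      # sensitivity matrix; real data would come from measurements.
      S = rng.random((n_wl, n_train))
      M = rng.random((3, n_wl))
      C = M @ S                              # corresponding camera responses

      # Wiener estimation matrix from training correlation matrices, with a
      # small diagonal term standing in for measurement noise.
      noise = 1e-3 * np.eye(3)
      W = (S @ C.T) @ np.linalg.inv(C @ C.T + noise)

      c_new = M @ rng.random(n_wl)           # RGB response of an unseen pixel
      r_hat = W @ c_new                      # estimated 9-band spectrum
      print(r_hat.shape)                     # (9,)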

  12. On-board multispectral classification study

    NASA Technical Reports Server (NTRS)

    Ewalt, D.

    1979-01-01

    The factors relating to onboard multispectral classification were investigated. The functions implemented in ground-based processing systems for current Earth observation sensors were reviewed. The Multispectral Scanner, Thematic Mapper, Return Beam Vidicon, and Heat Capacity Mapper were studied. The concept of classification was reviewed and extended from the ground-based image processing functions to an onboard system capable of multispectral classification. Eight different onboard configurations, each with varying amounts of ground-spacecraft interaction, were evaluated. Each configuration was evaluated in terms of turnaround time, onboard processing and storage requirements, geometric and classification accuracy, onboard complexity, and ancillary data required from the ground.

  13. Overview of the Multi-Spectral Imager on the NEAR spacecraft

    NASA Astrophysics Data System (ADS)

    Hawkins, S. E., III

    1996-07-01

    The Multi-Spectral Imager on the Near Earth Asteroid Rendezvous (NEAR) spacecraft is a 1 Hz frame rate CCD camera sensitive in the visible and near infrared bands (~400-1100 nm). MSI is the primary instrument on the spacecraft to determine morphology and composition of the surface of asteroid 433 Eros. In addition, the camera will be used to assist in navigation to the asteroid. The instrument uses refractive optics and has an eight position spectral filter wheel to select different wavelength bands. The MSI optical focal length of 168 mm gives a 2.9 ° × 2.25 ° field of view. The CCD is passively cooled and the 537×244 pixel array output is digitized to 12 bits. Electronic shuttering increases the effective dynamic range of the instrument by more than a factor of 100. A one-time deployable cover protects the instrument during ground testing operations and launch. A reduced aperture viewport permits full field of view imaging while the cover is in place. A Data Processing Unit (DPU) provides the digital interface between the spacecraft and the Camera Head and uses an RTX2010 processor. The DPU provides an eight frame image buffer, lossy and lossless data compression routines, and automatic exposure control. An overview of the instrument is presented and design parameters and trade-offs are discussed.

  14. Mars Exploration Rover Athena Panoramic Camera (Pancam) investigation

    USGS Publications Warehouse

    Bell, J.F.; Squyres, S. W.; Herkenhoff, K. E.; Maki, J.N.; Arneson, H.M.; Brown, D.; Collins, S.A.; Dingizian, A.; Elliot, S.T.; Hagerott, E.C.; Hayes, A.G.; Johnson, M.J.; Johnson, J. R.; Joseph, J.; Kinch, K.; Lemmon, M.T.; Morris, R.V.; Scherr, L.; Schwochert, M.; Shepard, M.K.; Smith, G.H.; Sohl-Dickstein, J. N.; Sullivan, R.J.; Sullivan, W.T.; Wadsworth, M.

    2003-01-01

    The Panoramic Camera (Pancam) investigation is part of the Athena science payload launched to Mars in 2003 on NASA's twin Mars Exploration Rover (MER) missions. The scientific goals of the Pancam investigation are to assess the high-resolution morphology, topography, and geologic context of each MER landing site, to obtain color images to constrain the mineralogic, photometric, and physical properties of surface materials, and to determine dust and aerosol opacity and physical properties from direct imaging of the Sun and sky. Pancam also provides mission support measurements for the rovers, including Sun-finding for rover navigation, hazard identification and digital terrain modeling to help guide long-term rover traverse decisions, high-resolution imaging to help guide the selection of in situ sampling targets, and acquisition of education and public outreach products. The Pancam optical, mechanical, and electronics design were optimized to achieve these science and mission support goals. Pancam is a multispectral, stereoscopic, panoramic imaging system consisting of two digital cameras mounted on a mast 1.5 m above the Martian surface. The mast allows Pancam to image the full 360° in azimuth and ±90° in elevation. Each Pancam camera utilizes a 1024 × 1024 active imaging area frame transfer CCD detector array. The Pancam optics have an effective focal length of 43 mm and a focal ratio f/20, yielding an instantaneous field of view of 0.27 mrad/pixel and a field of view of 16° × 16°. Each rover's two Pancam "eyes" are separated by 30 cm and have a 1° toe-in to provide adequate stereo parallax. Each eye also includes a small eight position filter wheel to allow surface mineralogic studies, multispectral sky imaging, and direct Sun imaging in the 400-1100 nm wavelength region. Pancam was designed and calibrated to operate within specifications on Mars at temperatures from -55° to +5°C. An onboard calibration target and fiducial marks provide the capability
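
    The figures quoted above can be cross-checked with simple small-angle geometry; the sketch below is only a consistency check using the published focal length, IFOV, pixel count, and focal ratio.

      import math

      f_mm = 43.0            # effective focal length
      f_number = 20.0
      ifov_mrad = 0.27       # per pixel
      n_pixels = 1024        # active imaging area per axis

      pixel_pitch_um = ifov_mrad * 1e-3 * f_mm * 1e3   # small-angle approximation
      fov_deg = math.degrees(n_pixels * ifov_mrad * 1e-3)
      aperture_mm = f_mm / f_number

      print(f"pixel pitch ~ {pixel_pitch_um:.1f} um")   # ~11.6 um
      print(f"field of view ~ {fov_deg:.1f} deg")       # ~15.8 deg, i.e. ~16 x 16 deg
      print(f"aperture diameter ~ {aperture_mm:.2f} mm")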

  15. Classification by Using Multispectral Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Liao, C. T.; Huang, H. H.

    2012-07-01

    Remote sensing images are generally recorded in a two-dimensional format containing multispectral information. Because the semantic information is clearly visualized, ground features can be recognized and classified easily via supervised or unsupervised classification methods. The shortcomings of multispectral images, however, are a strong dependence on lighting conditions and classification results that lack three-dimensional information. LiDAR, on the other hand, has become a main technology for acquiring high-accuracy point cloud data. Its advantages are a high data acquisition rate, independence from lighting conditions and the ability to produce three-dimensional coordinates directly; its disadvantage, compared with multispectral images, is the lack of multispectral information, which remains a challenge for ground feature classification from massive point cloud data. Combining the advantages of both LiDAR and multispectral images, point clouds carrying both three-dimensional coordinates and multispectral information offer an integrated solution for point cloud classification. This research therefore acquires visible-light and near-infrared images via close-range photogrammetry and matches the images automatically through a free online service to generate a multispectral point cloud. A three-dimensional affine coordinate transformation is then used to compare the data increment, and finally thresholds on height and color information are applied for classification.
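
    A toy sketch of the final thresholding step, assuming a synthetic point cloud with XYZ plus red, green, and NIR attributes; the thresholds and class names are illustrative, not the ones used in the study.

      import numpy as np

      # Multispectral point cloud: columns X, Y, Z, R, G, NIR (synthetic example)
      rng = np.random.default_rng(2)
      pts = rng.random((1000, 6))
      z, red, nir = pts[:, 2], pts[:, 3], pts[:, 5]

      ndvi = (nir - red) / (nir + red + 1e-9)   # simple greenness index
      height_thr, ndvi_thr = 0.5, 0.2           # illustrative thresholds

      labels = np.full(len(pts), "ground", dtype=object)
      labels[(z >= height_thr) & (ndvi >= ndvi_thr)] = "high vegetation"
      labels[(z >= height_thr) & (ndvi < ndvi_thr)] = "building"
      labels[(z < height_thr) & (ndvi >= ndvi_thr)] = "low vegetation"

      print({c: int((labels == c).sum()) for c in set(labels)})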

  16. Aspects of detection and tracking of ground targets from an airborne EO/IR sensor

    NASA Astrophysics Data System (ADS)

    Balaji, Bhashyam; Sithiravel, Rajiv; Daya, Zahir; Kirubarajan, Thiagalingam

    2015-05-01

    An airborne EO/IR (electro-optical/infrared) camera system comprises a suite of sensors, such as narrow and wide field of view (FOV) EO and mid-wave IR sensors. EO/IR camera systems are regularly employed on military and search-and-rescue aircraft. The EO/IR system can be used to detect and identify objects rapidly in daylight and at night, often with superior performance in challenging conditions such as fog. Several algorithms exist for detecting potential targets in the bearing-elevation grid. The nonlinear filtering problem is one of estimating the kinematic parameters from bearing and elevation measurements taken from a moving platform. In this paper, we develop a complete model for the state of a target as detected by an airborne EO/IR system and simulate a typical scenario with a single target and one or two airborne sensors. We demonstrate the ability to track the target with high precision and note the improvement from using two sensors on a single platform or on separate platforms. The performance of the Extended Kalman Filter (EKF) is investigated on simulated data. Image/video data collected from an IR sensor on an airborne platform are processed using a tracking-by-detection algorithm.
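
    A compact sketch of one EKF measurement update with a bearing/elevation observation, using a numerical Jacobian for brevity; the state layout, noise values, and geometry are invented for illustration and do not reproduce the paper's full motion model.

      import numpy as np

      def h(x, sensor):
          """Bearing and elevation of target position x (x, y, z) from a sensor."""
          d = x[:3] - sensor
          bearing = np.arctan2(d[1], d[0])
          elevation = np.arctan2(d[2], np.hypot(d[0], d[1]))
          return np.array([bearing, elevation])

      def numerical_jacobian(fun, x, eps=1e-6):
          y0 = fun(x)
          J = np.zeros((len(y0), len(x)))
          for i in range(len(x)):
              dx = np.zeros_like(x)
              dx[i] = eps
              J[:, i] = (fun(x + dx) - y0) / eps
          return J

      def ekf_update(x, P, z, R, sensor):
          """One EKF measurement update with a bearing/elevation observation."""
          H = numerical_jacobian(lambda s: h(s, sensor), x)
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          innov = z - h(x, sensor)
          innov = (innov + np.pi) % (2 * np.pi) - np.pi   # wrap angle residuals
          return x + K @ innov, (np.eye(len(x)) - K @ H) @ P

      # Target state (position + velocity) and an airborne sensor position
      x = np.array([1000.0, 500.0, 0.0, 10.0, -5.0, 0.0])
      P = np.diag([100.0, 100.0, 50.0, 10.0, 10.0, 5.0])
      sensor = np.array([0.0, 0.0, 2000.0])
      z = h(np.array([1020.0, 510.0, 0.0]), sensor)       # simulated measurement
      R = np.diag([1e-4, 1e-4])
      x, P = ekf_update(x, P, z, R, sensor)
      print(x[:3])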

  17. Multispectral hypercolorimetry and automatic guided pigment identification: some masterpieces case studies

    NASA Astrophysics Data System (ADS)

    Melis, Marcello; Miccoli, Matteo; Quarta, Donato

    2013-05-01

    A couple of years ago we proposed, in this same session, an extension to standard colorimetry (CIE '31) that we called Hypercolorimetry. It was based on an even sampling of the 300-1000 nm wavelength range, with the definition of 7 hypercolor matching functions optimally shaped to minimize metamerism. Since then we have consolidated the approach through a large number of multispectral analyses and specialized the system for non-invasive diagnosis of paintings and frescos. In this paper we describe the whole process, from the multispectral image acquisition to the final 7-band computation, and we show the results on paintings by masters of colour. We describe and propose a systematic approach to non-invasive diagnosis that turns a subjective analysis into a repeatable measure, independent of the specific lighting conditions and of the specific acquisition system. Along with Hypercolorimetry and its consolidation in the field of non-invasive diagnosis, we also developed a standard spectral reflectance database of pure pigments and of pigments painted with different bindings. As we will show, this database can be compared with the reflectances of the painting to help the diagnostician identify the proper matter. We used a Nikon D800FR (Full Range) camera, a 36-megapixel reflex camera modified under a Nikon/Profilocolore joint project to achieve 300-1000 nm sensitivity. The large amount of data allowed us to perform very accurate pixel comparisons based on their spectral reflectance. All the original pigments and their bindings were provided by the Opificio delle Pietre Dure, Firenze, Italy, while the analyzed masterpieces belong to the collection of the Pinacoteca Nazionale of Bologna, Italy.
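
    A minimal sketch of reflectance-based pigment matching, assuming a tiny invented database and a spectral-angle score; the real system compares measured 7-band reflectances against the reference pigment database, which is not reproduced here.

      import numpy as np

      def spectral_angle(a, b):
          """Spectral angle (radians) between two reflectance spectra."""
          cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos, -1.0, 1.0))

      # Hypothetical 7-band reflectances of reference pigments and of one pixel;
      # real values would come from the measured database and the camera.
      reference = {
          "lead white": np.array([0.80, 0.82, 0.85, 0.86, 0.87, 0.88, 0.88]),
          "vermilion":  np.array([0.08, 0.10, 0.55, 0.70, 0.72, 0.73, 0.74]),
          "azurite":    np.array([0.35, 0.40, 0.20, 0.12, 0.10, 0.15, 0.25]),
      }
      pixel = np.array([0.10, 0.12, 0.52, 0.68, 0.71, 0.72, 0.73])

      scores = {name: spectral_angle(pixel, spec) for name, spec in reference.items()}
      best = min(scores, key=scores.get)
      print(best, scores[best])   # smallest angle = closest spectral match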

  18. A multisensor system for airborne surveillance of oil pollution

    NASA Technical Reports Server (NTRS)

    Edgerton, A. T.; Ketchal, R.; Catoe, C.

    1973-01-01

    The U.S. Coast Guard is developing a prototype airborne oil surveillance system for use in its Marine Environmental Protection Program. The prototype system utilizes an X-band side-looking radar, a 37-GHz imaging microwave radiometer, a multichannel line scanner, and a multispectral low light level system. The system is geared to detecting and mapping oil spills and potential pollution violators anywhere within a 25 nmi range of the aircraft flight track under all but extreme weather conditions. The system provides for false target discrimination and maximum identification of spilled materials. The system also provides an automated detection alarm, as well as a color display to achieve maximum coupling between the sensor data and the equipment operator.

  19. Reciprocity testing of Kodak film type SO-289 multispectral infrared aerial film

    NASA Technical Reports Server (NTRS)

    Lockwood, H. E.

    1975-01-01

    Kodak multispectral infrared aerial film type SO-289 was tested for reciprocity characteristics because of the variance between the I-B sensitometer exposure times (8 seconds and 4 seconds) and the camera exposure time (1/500 second) used on the ASTP stratospheric aerosol measurement project. Test exposures were made on the flight emulsion using a Mead star system sensitometer, the films were processed to ASTP control standards, and the resulting densities read and reciprocity data calculated. It was found that less exposure was required to produce a typical density (1.3) at 1/500 second exposure time than at an 8 second exposure time. This exposure factor was 2.8.

  20. Integrating optical satellite data and airborne laser scanning in habitat classification for wildlife management

    NASA Astrophysics Data System (ADS)

    Nijland, W.; Coops, N. C.; Nielsen, S. E.; Stenhouse, G.

    2015-06-01

    Wildlife habitat selection is determined by a wide range of factors including food availability, shelter, security and landscape heterogeneity, all of which are closely related to the more readily mapped landcover types and disturbance regimes. Regional wildlife habitat studies have often used moderate-resolution multispectral satellite imagery for wall-to-wall mapping because it offers a favourable mix of availability, cost and resolution. However, certain habitat characteristics such as canopy structure and topographic factors are not well discriminated with these passive, optical datasets. Airborne laser scanning (ALS) provides highly accurate three-dimensional data on canopy structure and the underlying terrain, thereby offering significant enhancements to wildlife habitat mapping. In this paper, we introduce an approach that integrates ALS data and multispectral images to develop a new heuristic wildlife habitat classifier for western Alberta. Our method combines ALS direct measures of canopy height and cover with optical estimates of species composition (conifer vs. deciduous) in a decision tree classifier for habitat or landcover types. We believe this new approach is highly versatile and transferable because the class rules can be easily adapted for other species or functional groups. We discuss the implications of increased ALS availability for habitat mapping and wildlife management and provide recommendations for integrating multispectral and ALS data into wildlife management.
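
    A toy version of such heuristic class rules, combining ALS canopy height and cover with an optical conifer fraction; the thresholds and class labels are placeholders, not the rules used for western Alberta.

      def habitat_class(canopy_height_m, canopy_cover_frac, conifer_frac):
          """Toy class rules combining ALS structure with optical composition.

          Thresholds are illustrative placeholders only.
          """
          if canopy_cover_frac < 0.10:
              return "open / herbaceous"
          if canopy_height_m < 2.0:
              return "shrub / regenerating"
          if conifer_frac >= 0.7:
              return "conifer forest"
          if conifer_frac <= 0.3:
              return "deciduous forest"
          return "mixedwood forest"

      for sample in [(0.5, 0.05, 0.0), (1.5, 0.4, 0.2), (18.0, 0.7, 0.9), (16.0, 0.6, 0.5)]:
          print(sample, "->", habitat_class(*sample))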

  1. Digital staining for histopathology multispectral images by the combined application of spectral enhancement and spectral transformation.

    PubMed

    Bautista, Pinky A; Yagi, Yukako

    2011-01-01

    In this paper we introduced a digital staining method for histopathology images captured with an n-band multispectral camera. The method consisted of two major processes: enhancement of the original spectral transmittance and transformation of the enhanced transmittance to its target spectral configuration. Enhancement is accomplished by shifting the original transmittance by the scaled difference between the original transmittance and the transmittance estimated with m dominant principal component (PC) vectors; the m PC vectors were determined from the transmittance samples of the background image. Transformation of the enhanced transmittance to the target spectral configuration was done using an n×n transformation matrix, which was derived by applying a least-squares method to the enhanced and target spectral training data samples of the different tissue components. Experimental results on the digital conversion of a hematoxylin and eosin (H&E) stained multispectral image to its Masson's trichrome (MT) stained equivalent show the viability of the method.
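
    A numpy sketch of the two processes as described above (PC-based enhancement of the transmittance, then an n×n least-squares transformation), using synthetic spectra; the scale factor, number of PCs, and data are placeholders.

      import numpy as np

      rng = np.random.default_rng(3)
      n_bands, n_bg, n_train = 16, 500, 300

      # Synthetic stand-ins for n-band transmittance spectra (columns are pixels)
      background = rng.random((n_bands, n_bg))     # background-image samples
      t_original = rng.random((n_bands, n_train))  # H&E-stained training pixels
      t_target = rng.random((n_bands, n_train))    # corresponding MT-stained pixels

      # 1) Enhancement: shift by the scaled residual of an m-PC reconstruction,
      #    where the PC vectors come from the background samples.
      m, scale = 3, 1.5                            # illustrative settings
      mean_bg = background.mean(axis=1, keepdims=True)
      U, _, _ = np.linalg.svd(background - mean_bg, full_matrices=False)
      V = U[:, :m]                                 # m dominant PC vectors

      def enhance(t):
          t_hat = V @ (V.T @ (t - mean_bg)) + mean_bg   # PC-based estimate
          return t + scale * (t - t_hat)

      t_enh = enhance(t_original)

      # 2) Transformation: n x n matrix mapping enhanced to target spectra,
      #    obtained by least squares over the training pairs.
      T, *_ = np.linalg.lstsq(t_enh.T, t_target.T, rcond=None)
      t_digital_stain = T.T @ enhance(t_original)       # digitally stained spectra
      print(T.shape, t_digital_stain.shape)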

  2. Imager-to-Radiometer In-flight Cross Calibration: RSP Radiometric Comparison with Airborne and Satellite Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Cairns, Brian; Wasilewski, Andrzej

    2016-01-01

    This work develops a method to compare the radiometric calibration between a radiometer and imagers hosted on aircraft and satellites. The radiometer is the airborne Research Scanning Polarimeter (RSP), which takes multi-angle, photo-polarimetric measurements in several spectral channels. The RSP measurements used in this work were coincident with measurements made by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), which was on the same aircraft. These airborne measurements were also coincident with an overpass of the Landsat 8 Operational Land Imager (OLI). First we compare the RSP and OLI radiance measurements to AVIRIS since the spectral response of the multispectral instruments can be used to synthesize a spectrally equivalent signal from the imaging spectrometer data. We then explore a method that uses AVIRIS as a transfer between RSP and OLI to show that radiometric traceability of a satellite-based imager can be used to calibrate a radiometer despite differences in spectral channel sensitivities. This calibration transfer shows agreement within the uncertainty of both the various instruments for most spectral channels.
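
    The band-synthesis step described above can be illustrated as a response-weighted average of the spectrometer spectrum; the spectrum and the Gaussian relative spectral response below are invented stand-ins for AVIRIS data and a multispectral band.

      import numpy as np

      # Synthetic imaging-spectrometer spectrum (radiance vs. wavelength) and a
      # Gaussian relative spectral response standing in for one band.
      wl = np.arange(400.0, 2500.0, 10.0)                     # nm
      radiance = 100.0 / (1.0 + ((wl - 550.0) / 400.0) ** 2)  # made-up spectrum

      center, width = 865.0, 40.0                             # hypothetical band
      rsr = np.exp(-0.5 * ((wl - center) / width) ** 2)       # spectral response

      # Band-equivalent radiance: response-weighted average of the spectrum
      band_radiance = np.trapz(rsr * radiance, wl) / np.trapz(rsr, wl)
      print(band_radiance)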

  3. Multispectral imaging with vertical silicon nanowires

    PubMed Central

    Park, Hyunsung; Crozier, Kenneth B.

    2013-01-01

    Multispectral imaging is a powerful tool that extends the capabilities of the human eye. However, multispectral imaging systems generally are expensive and bulky, and multiple exposures are needed. Here, we report the demonstration of a compact multispectral imaging system that uses vertical silicon nanowires to realize a filter array. Multiple filter functions covering visible to near-infrared (NIR) wavelengths are simultaneously defined in a single lithography step using a single material (silicon). Nanowires are then etched and embedded into polydimethylsiloxane (PDMS), thereby realizing a device with eight filter functions. By attaching it to a monochrome silicon image sensor, we successfully realize an all-silicon multispectral imaging system. We demonstrate visible and NIR imaging. We show that the latter is highly sensitive to vegetation and furthermore enables imaging through objects opaque to the eye. PMID:23955156

  4. Application of phase matching autofocus in airborne long-range oblique photography camera

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

    The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, by day and night. The front Ritchey-Chretien optics are made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease in the structure's length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small, but it nevertheless alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, optimal focus must be maintained constantly. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To achieve even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement in which an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
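
    A simplified sketch of a phase-correlation shift estimate and the low-pass filtering of the raw defocus signal; the images, shift, and filter constant are synthetic, and the camera-specific mapping from phase shift to defocus is omitted.

      import numpy as np

      def phase_shift(a, b):
          """Translation of image b relative to a via phase correlation."""
          F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
          corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          # map peak indices to signed shifts
          return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

      # Synthetic overlap images shifted by 3 pixels in x (standing in for the
      # defocus-induced phase shift between the two sub-apertures).
      rng = np.random.default_rng(4)
      img = rng.random((128, 128))
      shifted = np.roll(img, 3, axis=1)
      raw_shift = phase_shift(img, shifted)[1]

      # Low-pass (exponential) filtering of the raw defocus estimate to reject
      # scene-dependent noise before the closed-loop focus correction.
      alpha, filtered = 0.2, 0.0
      for _ in range(20):                      # repeated frames of the same scene
          filtered = alpha * raw_shift + (1 - alpha) * filtered
      print(raw_shift, round(filtered, 2))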

  5. Airborne observed solar elevation and row direction effects on the near-IR/red ratio of cotton

    NASA Technical Reports Server (NTRS)

    Millard, J. P.; Jackson, R. D.; Goettelman, R. C.; Leroy, M. J. (Principal Investigator)

    1981-01-01

    An airborne multispectral scanner was used to obtain data over two adjacent cotton fields having rows perpendicular to one another, at three times of day (different solar elevations), and on two dates (different plant size). The near IR/red ratios were displayed in image form, so that within-field variations and differences between fields could be easily assessed. The ratio varied with changing Sun elevation for north-south oriented rows, but no variation was detected for east-west oriented rows.

  6. Estimating atmospheric parameters and reducing noise for multispectral imaging

    DOEpatents

    Conger, James Lynn

    2014-02-25

    A method and system for estimating atmospheric radiance and transmittance. An atmospheric estimation system is divided into a first phase and a second phase. The first phase inputs an observed multispectral image and an initial estimate of the atmospheric radiance and transmittance for each spectral band and calculates the atmospheric radiance and transmittance for each spectral band, which can be used to generate a "corrected" multispectral image that is an estimate of the surface multispectral image. The second phase inputs the observed multispectral image and the surface multispectral image that was generated by the first phase and removes noise from the surface multispectral image by smoothing out change in average deviations of temperatures.
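
    A minimal sketch of how first-phase estimates of this kind could be applied per band to form the "corrected" image, with a simple smoothing pass standing in for the second, noise-reduction phase; all radiance, transmittance, and image values are invented.

      import numpy as np
      from scipy.ndimage import uniform_filter

      # Observed multispectral cube (rows x cols x bands), synthetic here, with
      # assumed per-band path radiance and transmittance estimates.
      rng = np.random.default_rng(5)
      observed = rng.random((64, 64, 4)) * 10.0
      path_radiance = np.array([1.2, 0.9, 0.6, 0.4])       # per band
      transmittance = np.array([0.75, 0.80, 0.85, 0.90])   # per band

      # "Corrected" image: estimate of the surface-leaving signal in each band
      surface = (observed - path_radiance) / transmittance

      # Simple per-band smoothing as a stand-in for the second (denoising) phase
      smoothed = np.stack(
          [uniform_filter(surface[..., b], size=3) for b in range(4)], axis=-1)
      print(surface.shape, smoothed.shape)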

  7. The International SubMillimetre Airborne Radiometer (ISMAR) - First results from the STICCS and COSMIC campaigns

    NASA Astrophysics Data System (ADS)

    Mendrok, Jana; Eriksson, Patrick; Fox, Stuart; Brath, Manfred; Buehler, Stefan

    2016-04-01

    Multispectral millimeter- and submillimeter-wave observations bear the potential to measure properties of non-thin ice clouds such as mass content and mean particle size. The next generation of European meteorological satellites, the MetOp-SG series, will carry the first satellite-borne submillimeter sounder, the Ice Cloud Imager (ICI). An airborne demonstrator, the International SubMillimetre Airborne Radiometer (ISMAR), is operated together with other remote sensing instruments and in-situ probes on the FAAM aircraft. Scientific measurements from two campaigns in the North Atlantic region, STICCS and COSMIC, are available so far. Here we introduce the ISMAR instrument, present the measurements acquired during the STICCS and COSMIC campaigns and show some first results. This includes an estimation of instrument performance, a first analysis of clear-sky and cloudy cases and a discussion of selected features observed in the measurements (e.g. polarisation signatures).

  8. Introducing a Low-Cost Mini-UAV for Thermal- and Multispectral-Imaging

    NASA Astrophysics Data System (ADS)

    Bendig, J.; Bolten, A.; Bareth, G.

    2012-07-01

    The trend towards miniaturizing electronic devices also applies to Unmanned Airborne Vehicles (UAVs) as well as to sensor technologies and imaging devices. Consequently, it is not surprising that UAVs are already part of our daily life, and the current pace of development will increase civil applications. A well-known and already widespread example is the so-called flying video game based on Parrot's AR.Drone, which is remotely controlled by an iPod, iPhone, or iPad (http://ardrone.parrot.com). The latter can be considered a low-weight and low-cost Mini-UAV. In this contribution a Mini-UAV is taken to weigh less than 5 kg and to be able to carry 0.2 kg to 1.5 kg of sensor payload. While up to now Mini-UAVs like Parrot's AR.Drone have mainly been equipped with RGB cameras for videotaping or imaging, the development of such carrier systems is clearly also moving towards multi-sensor platforms like the ones introduced for larger UAVs (5 to 20 kg) by Jaakkola et al. (2010) for forestry applications or by Berni et al. (2009) for agricultural applications. The problem when designing a Mini-UAV for multi-sensor imaging is the payload limitation of up to 1.5 kg and a total weight of the whole system below 5 kg. Consequently, the Mini-UAV without sensors but including navigation system and GPS sensors must weigh less than 3.5 kg. A Mini-UAV system with these characteristics is HiSystems' MK-Okto (www.mikrokopter.de). Its total weight including battery but without sensors is less than 2.5 kg. The payload of an MK-Okto is approx. 1 kg and its maximum speed is around 30 km/h. The MK-Okto can be operated up to a wind speed of less than 19 km/h, which corresponds to Beaufort scale number 3. In our study, the MK-Okto is equipped with a handheld low-weight NEC F30IS thermal imaging system. The F30IS, which was developed for veterinary applications, covers 8 to 13 μm and weighs only 300 g

  9. Monitoring Geothermal Features in Yellowstone National Park with ATLAS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Berglund, Judith

    2000-01-01

    The National Park Service (NPS) must produce an Environmental Impact Statement for each proposed development in the vicinity of known geothermal resource areas (KGRAs) in Yellowstone National Park. In addition, the NPS monitors indicator KGRAs for environmental quality and is still in the process of mapping many geothermal areas. The NPS currently maps geothermal features with field survey techniques. High resolution aerial multispectral remote sensing in the visible, NIR, SWIR, and thermal spectral regions could enable YNP geothermal features to be mapped more quickly and in greater detail. In response, Yellowstone Ecosystems Studies, in partnership with NASA's Commercial Remote Sensing Program, is conducting a study on the use of Airborne Terrestrial Applications Sensor (ATLAS) multispectral data for monitoring geothermal features in the Upper Geyser Basin. ATLAS data were acquired at 2.5 meter resolution on August 17, 2000. These data were processed into land cover classifications and relative temperature maps. For sufficiently large features, the ATLAS data can map geothermal areas in terms of geyser pools and hot springs, plus multiple categories of geothermal runoff that are apparently indicative of temperature gradients and microbial matting communities. In addition, the ATLAS maps clearly identify geyserite areas. The thermal bands contributed to classification success and to the computation of relative temperature. With masking techniques, one can assess the influence of geothermal features on the Firehole River. Preliminary results appear to confirm ATLAS data utility for mapping and monitoring geothermal features. Future work will include classification refinement and additional validation.

  10. Airborne remote sensing for geology and the environment; present and future

    USGS Publications Warehouse

    Watson, Ken; Knepper, Daniel H.

    1994-01-01

    In 1988, a group of leading experts from government, academia, and industry attended a workshop on airborne remote sensing sponsored by the U.S. Geological Survey (USGS) and hosted by the Branch of Geophysics. The purpose of the workshop was to examine the scientific rationale for airborne remote sensing in support of government earth science in the next decade. This report has arranged the six resulting working-group reports under two main headings: (1) Geologic Remote Sensing, for the reports on geologic mapping, mineral resources, and fossil fuels and geothermal resources; and (2) Environmental Remote Sensing, for the reports on environmental geology, geologic hazards, and water resources. The intent of the workshop was to provide an evaluation of demonstrated capabilities, their direct extensions, and possible future applications, and this was the organizational format used for the geologic remote sensing reports. The working groups in environmental remote sensing chose to present their reports in a somewhat modified version of this format. A final section examines future advances and limitations in the field. There is a large, complex, and often bewildering array of remote sensing data available. Early remote sensing studies were based on data collected from airborne platforms. Much of that technology was later extended to satellites. The original 80-m-resolution Landsat Multispectral Scanner System (MSS) has now been largely superseded by the 30-m-resolution Thematic Mapper (TM) system that has additional spectral channels. The French satellite SPOT provides higher spatial resolution for channels equivalent to MSS. Low-resolution (1 km) data are available from the National Oceanographic and Atmospheric Administration's AVHRR system, which acquires reflectance and day and night thermal data daily. Several experimental satellites have acquired limited data, and there are extensive plans for future satellites including those of Japan (JERS), Europe (ESA), Canada

  11. Multispectral fundus imaging for early detection of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Beach, James M.; Tiedeman, James S.; Hopkins, Mark F.; Sabharwal, Yashvinder S.

    1999-04-01

    Functional imaging of the retina and associated structures may provide information for early assessment of risks of developing retinopathy in diabetic patients. Here we show results of retinal oximetry performed using multi-spectral reflectance imaging techniques to assess hemoglobin (Hb) oxygen saturation (OS) in blood vessels of the inner retina and oxygen utilization at the optic nerve in diabetic patients without retinopathy and early disease during experimental hyperglycemia. Retinal images were obtained through a fundus camera and simultaneously recorded at up to four wavelengths using image-splitting modules coupled to a digital camera. Changes in OS in large retinal vessels, in average OS in disk tissue, and in the reduced state of cytochrome oxidase (CO) at the disk were determined from changes in reflectance associated with the oxidation/reduction states of Hb and CO. Step to high sugar lowered venous oxygen saturation to a degree dependent on disease duration. Moderate increase in sugar produced higher levels of reduced CO in both the disk and surrounding tissue without a detectable change in average tissue OS. Results suggest that regulation of retinal blood supply and oxygen consumption are altered by hyperglycemia and that such functional changes are present before clinical signs of retinopathy.

  12. Multispectral image fusion for target detection

    NASA Astrophysics Data System (ADS)

    Leviner, Marom; Maltz, Masha

    2009-09-01

    Various methods to perform multi-spectral image fusion have been suggested, mostly at the pixel level. However, the jury is still out on the benefits of a fused image compared to its source images. We present here a new multi-spectral image fusion method, multi-spectral segmentation fusion (MSSF), which uses a feature-level processing paradigm. To test our method, we compared human observer performance in an experiment using MSSF against two established methods, averaging and Principal Component Analysis (PCA), and against its two source bands, visible and infrared. The task studied was target detection in a cluttered environment. MSSF proved superior to the other fusion methods. Based on these findings, current speculation about the circumstances in which multi-spectral image fusion in general, and specific fusion methods in particular, would be superior to using the original image sources can be further addressed.
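
    For reference, the two baseline pixel-level methods named above can be sketched as follows, assuming two registered source images; the PCA weights come from the 2×2 band covariance, which is one common variant of PCA fusion rather than the exact implementation used in the experiment.

      import numpy as np

      rng = np.random.default_rng(6)
      vis = rng.random((100, 100))
      ir = 0.6 * vis + 0.4 * rng.random((100, 100))   # registered source images

      # Averaging fusion
      fused_avg = 0.5 * (vis + ir)

      # PCA fusion: weight each source by the first principal component of the
      # 2 x 2 covariance of the two bands
      X = np.stack([vis.ravel(), ir.ravel()])
      eigvals, eigvecs = np.linalg.eigh(np.cov(X))
      w = np.abs(eigvecs[:, -1])      # eigenvector of the largest eigenvalue
      w = w / w.sum()
      fused_pca = w[0] * vis + w[1] * ir

      print(fused_avg.shape, fused_pca.shape, w)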

  13. Comparison of Hyperspectral and Multispectral Satellites for Discriminating Land Cover in Northern California

    NASA Astrophysics Data System (ADS)

    Clark, M. L.; Kilham, N. E.

    2015-12-01

    Land-cover maps are important science products needed for natural resource and ecosystem service management, biodiversity conservation planning, and assessing human-induced and natural drivers of land change. Most land-cover maps at regional to global scales are produced with remote sensing techniques applied to multispectral satellite imagery with 30-500 m pixel sizes (e.g., Landsat, MODIS). Hyperspectral, or imaging spectrometer, imagery measuring the visible to shortwave infrared (VSWIR) regions of the spectrum has shown impressive capacity to map plant species and coarser land-cover associations, yet the techniques have not been widely tested at regional and greater spatial scales. The Hyperspectral Infrared Imager (HyspIRI) mission is a VSWIR hyperspectral and thermal satellite being considered for development by NASA. The goal of this study was to assess multi-temporal, HyspIRI-like satellite imagery for improved land cover mapping relative to multispectral satellites. We mapped FAO Land Cover Classification System (LCCS) classes over 22,500 km2 in the San Francisco Bay Area, California using 30-m HyspIRI, Landsat 8 and Sentinel-2 imagery simulated from data acquired by NASA's AVIRIS airborne sensor. Random Forests (RF) and Multiple-Endmember Spectral Mixture Analysis (MESMA) classifiers were applied to the simulated images and accuracies were compared to those from real Landsat 8 images. The RF classifier was superior to MESMA, and multi-temporal data yielded higher accuracy than summer-only data. With RF, hyperspectral data had overall accuracies of 72.2% and 85.1% with the full 20-class and reduced 12-class schemes, respectively. Multispectral imagery had lower accuracy. For example, simulated and real Landsat data had 7.5% and 4.6% lower accuracy than HyspIRI data with 12 classes, respectively. In summary, our results indicate increased mapping accuracy using HyspIRI multi-temporal imagery, particularly in discriminating different natural vegetation types, such as
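
    A minimal Random Forest classification sketch in the spirit of the RF workflow described above, using a synthetic pixels-by-features matrix in place of the simulated HyspIRI/Landsat reflectances and LCCS training data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      # Synthetic multi-temporal band stack: pixels x (bands * dates) features
      # and land-cover labels; real inputs would be the simulated reflectances
      # and LCCS training polygons.
      rng = np.random.default_rng(7)
      X = rng.random((2000, 20))                 # e.g. 10 bands x 2 dates
      y = rng.integers(0, 12, size=2000)         # 12-class scheme

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      rf.fit(X_tr, y_tr)
      print("overall accuracy:", rf.score(X_te, y_te))   # near chance on random data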

  14. MultiSpec—a tool for multispectral-hyperspectral image data analysis

    NASA Astrophysics Data System (ADS)

    Biehl, Larry; Landgrebe, David

    2002-12-01

    MultiSpec is a multispectral image data analysis software application. It is intended to provide a fast, easy-to-use means for analysis of multispectral image data, such as that from the Landsat, SPOT, MODIS or IKONOS series of Earth observational satellites, hyperspectral data such as that from the Airborne Visible-Infrared Imaging Spectrometer (AVIRIS) and EO-1 Hyperion satellite system or the data that will be produced by the next generation of Earth observational sensors. The primary purpose for the system was to make new, otherwise complex analysis tools available to the general Earth science community. It has also found use in displaying and analyzing many other types of non-space related digital imagery, such as medical image data and in K-12 and university level educational activities. MultiSpec has been implemented for both the Apple Macintosh ® and Microsoft Windows ® operating systems (OS). The effort was first begun on the Macintosh OS in 1988. The GLOBE ( http://www.globe.gov) program supported the development of a subset of MultiSpec for the Windows OS in 1995. Since then most (but not all) of the features in the Macintosh OS version have been ported to the Windows OS version. Although copyrighted, MultiSpec with its documentation is distributed without charge. The Macintosh and Windows versions and documentation on its use are available from the World Wide Web at URL: http://dynamo.ecn.purdue.edu/˜biehl/MultiSpec/ MultiSpec is copyrighted (1991-2001) by Purdue Research Foundation, West Lafayette, Indiana 47907.

  15. CNR LARA project, Italy: Airborne laboratory for environmental research

    NASA Technical Reports Server (NTRS)

    Bianchi, R.; Cavalli, R. M.; Fiumi, L.; Marino, C. M.; Pignatti, S.

    1995-01-01

    The increasing interest in environmental problems and in studying the environmental impact of anthropic activity has produced a growth in remote sensing applications. The Italian National Research Council (CNR) established a new laboratory for airborne hyperspectral imaging, the LARA Project (Laboratorio Aereo per Ricerche Ambientali - Airborne Laboratory for Environmental Research), equipping its airborne laboratory, a CASA-212, mainly with the Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument. MIVIS's channels, spectral bandwidths, and locations are chosen to meet the needs of scientific research for advanced applications of remote sensing data. MIVIS can make significant contributions to solving problems in many diverse areas such as geologic exploration, land use studies, mineralogy, agricultural crop studies, energy loss analysis, pollution assessment, volcanology, forest fire management and others. The broad spectral range and the many discrete narrow channels of MIVIS provide a fine quantization of spectral information that permits accurate definition of absorption features from a variety of materials, allowing the extraction of chemical and physical information about our environment. The availability of such a hyperspectral imager, which will operate mainly in the Mediterranean area, at present represents a unique opportunity for those involved in environmental studies and land management to collect systematically large-scale and high spectral-spatial resolution data of this part of the world. Nevertheless, MIVIS deployments will touch other parts of the world, where a major interest from the international scientific community is present.

  16. Multispectral imaging for biometrics

    NASA Astrophysics Data System (ADS)

    Rowe, Robert K.; Corcoran, Stephen P.; Nixon, Kristin A.; Ostrom, Robert E.

    2005-03-01

    Automated identification systems based on fingerprint images are subject to two significant types of error: an incorrect decision about the identity of a person due to a poor quality fingerprint image and incorrectly accepting a fingerprint image generated from an artificial sample or altered finger. This paper discusses the use of multispectral sensing as a means to collect additional information about a finger that significantly augments the information collected using a conventional fingerprint imager based on total internal reflectance. In the context of this paper, "multispectral sensing" is used broadly to denote a collection of images taken under different polarization conditions and illumination configurations, as well as using multiple wavelengths. Background information is provided on conventional fingerprint imaging. A multispectral imager for fingerprint imaging is then described and a means to combine the two imaging systems into a single unit is discussed. Results from an early-stage prototype of such a system are shown.

  17. A multispectral study of an extratropical cyclone with Nimbus 3 medium resolution infrared radiometer data

    NASA Technical Reports Server (NTRS)

    Holub, R.; Shenk, W. E.

    1973-01-01

    Four registered channels (0.2 to 4, 6.5 to 7, 10 to 11, and 20 to 23 microns) of the Nimbus 3 Medium Resolution Infrared Radiometer (MRIR) were used to study 24-hr changes in the structure of an extratropical cyclone during a 6-day period in May 1969. Use of a stereographic-horizon map projection insured that the storm was mapped with a single perspective throughout the series and allowed the convenient preparation of 24-hr difference maps of the infrared radiation fields. Single-channel and multispectral analysis techniques were employed to establish the positions and vertical slopes of jetstreams, large cloud systems, and major features of middle and upper tropospheric circulation. Use of these techniques plus the difference maps and continuity of observation allowed the early detection of secondary cyclones developing within the circulation of the primary cyclone. An automated, multispectral cloud-type identification technique was developed, and comparisons that were made with conventional ship reports and with high-resolution visual data from the image dissector camera system showed good agreement.

  18. Galileo multispectral imaging of Earth.

    PubMed

    Geissler, P; Thompson, W R; Greenberg, R; Moersch, J; McEwen, A; Sagan, C

    1995-08-25

    Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the "red edge" of chlorophyll and the depth of the 1-micrometer water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at approximately 2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-micrometer band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 micrometer, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global

  19. Airborne Polarimeter Intercomparison for the NASA Aerosols-Clouds-Ecosystems (ACE) Mission

    NASA Technical Reports Server (NTRS)

    Knobelspiesse, Kirk; Redemann, Jens

    2014-01-01

    The Aerosols-Clouds-Ecosystems (ACE) mission, recommended by the National Research Council's Decadal Survey, calls for a multi-angle, multi-spectral polarimeter devoted to observations of atmospheric aerosols and clouds. In preparation for ACE, NASA funds the deployment of airborne polarimeters, including the Airborne Multi-angle SpectroPolarimeter Imager (AirMSPI), the Passive Aerosol and Cloud Suite (PACS) and the Research Scanning Polarimeter (RSP). These instruments have been operated together on NASA's ER-2 high altitude aircraft as part of field campaigns such as the POlarimeter DEfinition EXperiment (PODEX) (California, early 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, California and Texas, summer 2013). Our role in these efforts has been to serve as an assessment team performing level 1 (calibrated radiance, polarization) and level 2 (retrieved geophysical parameter) instrument intercomparisons, and to promote unified and generalized calibration, uncertainty assessment and retrieval techniques. We will present our progress in this endeavor thus far and describe upcoming research in 2015.

  20. Airborne Mission Concept for Coastal Ocean Color and Ecosystems Research

    NASA Technical Reports Server (NTRS)

    Guild, Liane S.; Hooker, Stanford B.; Morrow, John H.; Kudela, Raphael M.; Palacios, Sherry L.; Torres Perez, Juan L.; Hayashi, Kendra; Dunagan, Stephen E.

    2016-01-01

    NASA airborne missions in 2011 and 2013 over Monterey Bay, CA, demonstrated novel above- and in-water calibration and validation measurements supporting a combined airborne sensor approach (imaging spectrometer, microradiometers, and a sun photometer). The resultant airborne data characterize contemporaneous coastal atmospheric and aquatic properties plus sea-truth observations from state-of-the-art instrument systems spanning a next-generation spectral domain (320-875 nm). This airborne instrument suite for calibration, validation, and research flew at the lowest safe altitude (ca. 100 ft or 30 m) as well as higher altitudes (e.g., 6,000 ft or 1,800 m) above the sea surface covering a larger area in a single synoptic sortie than ship-based measurements at a few stations during the same sampling period. Data collection of coincident atmospheric and aquatic properties near the sea surface and at altitude allows the input of relevant variables into atmospheric correction schemes to improve the output of corrected imaging spectrometer data. Specific channels support legacy and next-generation satellite capabilities, and flights are planned to within 30 min of satellite overpass. This concept supports calibration and validation activities of ocean color phenomena (e.g., river plumes, algal blooms) and studies of water quality and coastal ecosystems. The 2011 COAST mission flew at 100 and 6,000 ft on a Twin Otter platform with flight plans accommodating the competing requirements of the sensor suite, which included the Coastal-Airborne In-situ Radiometers (C-AIR) for the first time. C-AIR (Biospherical Instruments Inc.) also flew in the 2013 OCEANIA mission at 100 and 1,000 ft on the Twin Otter below the California airborne simulation of the proposed NASA HyspIRI satellite system comprised of an imaging spectrometer and thermal infrared multispectral imager on the ER-2 at 65,000 ft (20,000 m). For both missions, the Compact-Optical Profiling System (Biospherical

  1. Characterizing tropical forests with multispectral imagery

    Treesearch

    Eileen Helmer; Nicholas R. Goodwin; Valery Gond; Carlos M. Souza, Jr.; Gregory P. Asner

    2015-01-01

    Multispectral satellite imagery, that is, remotely sensed imagery with discrete bands ranging from visible to shortwave infrared (SWIR) wavelengths, is the timeliest and most accessible remotely sensed data for monitoring tropical forests. Given this relevance, we summarize here how multispectral imagery can help characterize tropical forest attributes of widespread...

  2. Multispectral Palmprint Recognition Using a Quaternion Matrix

    PubMed Central

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049

  3. Multispectral palmprint recognition using a quaternion matrix.

    PubMed

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%.

  4. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    USGS Publications Warehouse

    Realmuto, V.J.; Hon, K.; Kahle, A.B.; Abbott, E.A.; Pieri, D.C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 °C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. In general, the emissivity of the flows varied systematically with age but the relationship between age and emissivity was not unique. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows. Such incipient alteration may have been the cause for virtually all of the emissivity variations observed on the flow field, the spectral anomalies representing areas where the acid attack was most intense. © 1992 Springer-Verlag.
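
    Surface kinetic temperature retrieval from such data also requires emissivity separation, but the basic radiance-to-temperature step can be illustrated by inverting the Planck function for a single thermal channel; the wavelength and radiance values below are generic examples, not TIMS calibration values.

      import numpy as np

      H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

      def brightness_temperature(radiance, wavelength_um):
          """Invert the Planck function for one TIR channel.

          radiance: band-averaged spectral radiance in W m^-2 sr^-1 um^-1
          """
          wl = wavelength_um * 1e-6
          L = radiance * 1e6                        # convert to per metre
          return (H * C / (wl * KB)) / np.log(1.0 + 2 * H * C**2 / (wl**5 * L))

      # Example: a warm tumulus vs. ambient ground observed around 10.5 um
      for L in (8.0, 10.0, 12.0):                   # illustrative radiances
          print(L, round(float(brightness_temperature(L, 10.5)), 1), "K")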

  5. System for critical infrastructure security based on multispectral observation-detection module

    NASA Astrophysics Data System (ADS)

    Trzaskawka, Piotr; Kastek, Mariusz; Życzkowski, Marek; Dulski, Rafał; Szustakowski, Mieczysław; Ciurapiński, Wiesław; Bareła, Jarosław

    2013-10-01

    Recent terrorist attacks and the possibility of such actions in the future have forced the development of security systems for critical infrastructures that embrace sensor technologies and the technical organization of systems. The perimeter protection of stationary objects used until now, based on a ring with two-zone fencing and visible-light cameras with illumination, is being effectively displaced by multisensor systems consisting of: visible technology - day/night cameras registering the optical contrast of a scene; thermal technology - inexpensive bolometric cameras recording the thermal contrast of a scene; and active ground radars - microwave and millimetre wavelengths that record and detect reflected radiation. Merging these three different technologies into one system requires a methodology for selecting the technical conditions of installation and the parameters of the sensors. This procedure enables the construction of a system with correlated range, resolution, field of view and object identification. An important technical problem connected with the multispectral system is its software, which couples the radar with the cameras. This software can be used for automatic focusing of the cameras, automatic guiding of the cameras to an object detected by the radar, tracking of the object and localization of the object on a digital map, as well as target identification and alerting. Based on a "plug and play" architecture, the system provides unmatched flexibility and simple integration of sensors and devices in TCP/IP networks. Using a graphical user interface it is possible to control sensors, monitor streaming video and other data over the network, visualize the results of the data fusion process and obtain detailed information about detected intruders on a digital map. The system provides high-level applications and operator workload reduction with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification and alarm triggering. The paper presents

  6. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic real-time, long-term tracking of individual vehicles is presented. This combines the detection and tracking of the vehicle in low frame rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM model uses the traffic conditions in the proximities of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500m in a complex urban traffic situation. An artificial tracking outage of 240m is simulated, which is handled by the CTM. For this, all the vehicles in the close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM model. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared showing good correspondence also in congested traffic situations.
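
    The record does not detail the lagged CTM formulation used by the authors; as a rough illustration of the underlying idea, the Python sketch below advances a basic (non-lagged) Cell Transmission Model by one time step, with hypothetical parameter names. Cell occupancies estimated from the vehicles tracked around the outage could be propagated this way to predict where the target vehicle should reappear.

        import numpy as np

        def ctm_step(n, q_max, n_max, w_over_v):
            """One update of a basic Cell Transmission Model (cells of length v*dt).

            n        : vehicles currently in each cell, ordered upstream to downstream
            q_max    : maximum vehicles that can cross a cell boundary per time step
            n_max    : maximum vehicles a cell can hold (jam density times cell length)
            w_over_v : ratio of backward wave speed to free-flow speed
            """
            n = np.asarray(n, dtype=float)
            demand = np.minimum(n[:-1], q_max)                      # what upstream cells can send
            supply = np.minimum(q_max, w_over_v * (n_max - n[1:]))  # what downstream cells can take
            y = np.minimum(demand, supply)                          # realized inter-cell flows
            n_new = n.copy()
            n_new[:-1] -= y
            n_new[1:] += y
            return n_new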

  7. The Mars NetLander panoramic camera

    NASA Astrophysics Data System (ADS)

    Jaumann, Ralf; Langevin, Yves; Hauber, Ernst; Oberst, Jürgen; Grothues, Hans-Georg; Hoffmann, Harald; Soufflot, Alain; Bertaux, Jean-Loup; Dimarellis, Emmanuel; Mottola, Stefano; Bibring, Jean-Pierre; Neukum, Gerhard; Albertz, Jörg; Masson, Philippe; Pinet, Patrick; Lamy, Philippe; Formisano, Vittorio

    2000-10-01

    The panoramic camera (PanCam) imaging experiment is designed to obtain high-resolution multispectral stereoscopic panoramic images from each of the four Mars NetLander 2005 sites. The main scientific objectives to be addressed by the PanCam experiment are (1) to locate the landing sites and support the NetLander network sciences, (2) to geologically investigate and map the landing sites, and (3) to study the properties of the atmosphere and of variable phenomena. To place in situ measurements at a landing site into a proper regional context, it is necessary to determine the lander orientation on the ground and to exactly locate the position of the landing site with respect to the available cartographic database. This is not possible by tracking alone due to the lack of on-ground orientation and the so-called map-tie problem. Images provided by the PanCam allow accurate tilt and north directions to be determined for each lander and the lander locations to be identified based on landmarks, which can also be recognized in appropriate orbiter imagery. With this information, it will be further possible to improve the Mars-wide geodetic control point network and the resulting geometric precision of global map products. The major geoscientific objectives of the PanCam lander images are the recognition of surface features like ripples, ridges and troughs, and the identification and characterization of different rock and surface units based on their morphology, distribution, spectral characteristics, and physical properties. The analysis of the PanCam imagery will finally result in the generation of precise map products for each of the landing sites. So far, comparative geologic studies of the Martian surface are restricted to the temporally separated Mars Pathfinder and the two Viking Lander missions. Further lander missions are in preparation (Beagle-2, Mars Surveyor 03). NetLander provides the unique opportunity to nearly double the number of accessible landing site data by providing

  8. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost and increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs from excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. As such, the

  9. Scattering and absorption measurements of cervical tissues measured using low cost multi-spectral imaging

    NASA Astrophysics Data System (ADS)

    Bernat, Amir S.; Bar-Am, Kfir; Cataldo, Leigh; Bolton, Frank J.; Kahn, Bruce S.; Levitz, David

    2018-02-01

    Cervical cancer is a leading cause of death for women in low resource settings. In order to better detect cervical dysplasia, a low cost multi-spectral colposcope was developed utilizing low-cost LEDs and an area scan camera. The device is capable of both traditional colposcopic imaging and multi-spectral image capture. Following initial bench testing, the device was deployed to a gynecology clinic where it was used to image patients in a colposcopy setting. Both traditional colposcopic images and spectral data from patients were uploaded to a cloud server for remote analysis. Multi-spectral imaging (30-second capture) took place before any clinical procedure; the standard of care was followed thereafter. If acetic acid was used in the standard of care, a post-acetowhitening colposcopic image was also captured. In analyzing the data, normal and abnormal regions were identified in the colposcopic images by an expert clinician. Spectral data were fit to a theoretical model based on diffusion theory, yielding information on scattering and absorption parameters. Data were grouped according to clinician labeling of the tissue, as well as any additional clinical test results available (Pap, HPV, biopsy). Altogether, N=20 patients were imaged in this study, with 9 of them abnormal. In comparing normal and abnormal regions of interest from patients, substantial differences were measured in blood content, while differences in oxygen saturation parameters were more subtle. These results suggest that optical measurements made using low cost spectral imaging systems can distinguish between normal and pathological tissues.

  10. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases like heart infarction or stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that could evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber having 0.7 mm outer diameter, and an irradiation fiber which consists of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected by an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed up to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
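
    The spectral angle mapper (SAM) step described above is a standard algorithm; the sketch below is a minimal per-pixel implementation in Python (NumPy), not the authors' code, with the array shapes and the lipid reference spectrum assumed for illustration.

        import numpy as np

        def spectral_angle_map(cube, reference):
            """Spectral Angle Mapper: angle (radians) between each pixel spectrum
            and a reference spectrum; small angles indicate similar composition.

            cube      : (rows, cols, bands) multispectral image, e.g. the 1150/1200/1300 nm bands
            reference : (bands,) reference spectrum, e.g. measured from a pure lipid sample
            """
            cube = np.asarray(cube, dtype=float)
            ref = np.asarray(reference, dtype=float)
            dot = np.tensordot(cube, ref, axes=([2], [0]))
            norm = np.linalg.norm(cube, axis=2) * np.linalg.norm(ref)
            cos = np.clip(dot / np.maximum(norm, 1e-12), -1.0, 1.0)
            return np.arccos(cos)

        # Pixels whose angle falls below a chosen threshold can be flagged as lipid-rich.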

  11. Gyrocopter-Based Remote Sensing Platform

    NASA Astrophysics Data System (ADS)

    Weber, I.; Jenal, A.; Kneer, C.; Bongartz, J.

    2015-04-01

    In this paper the development of a lightweight and highly modularized airborne sensor platform for remote sensing applications utilizing a gyrocopter as a carrier platform is described. The current sensor configuration consists of a high resolution DSLR camera for VIS-RGB recordings. As a second sensor modality, a snapshot hyperspectral camera was integrated in the aircraft. Moreover a custom-developed thermal imaging system composed of a VIS-PAN camera and a LWIR-camera is used for aerial recordings in the thermal infrared range. Furthermore another custom-developed highly flexible imaging system for high resolution multispectral image acquisition with up to six spectral bands in the VIS-NIR range is presented. The performance of the overall system was tested during several flights with all sensor modalities and the precalculated demands with respect to spatial resolution and reliability were validated. The collected data sets were georeferenced, georectified, orthorectified and then stitched to mosaics.

  12. MSS D Multispectral Scanner System

    NASA Technical Reports Server (NTRS)

    Lauletta, A. M.; Johnson, R. L.; Brinkman, K. L. (Principal Investigator)

    1982-01-01

    The development and acceptance testing of the 4-band Multispectral Scanners to be flown on the LANDSAT D and LANDSAT D' Earth resources satellites are summarized. Emphasis is placed on the acceptance test phase of the program. Test history and acceptance test algorithms are discussed. Trend data of all the key performance parameters are included and discussed separately for each of the two multispectral scanner instruments. Anomalies encountered and their resolutions are included.

  13. Light-Weight Multispectral Uav Sensors and Their Capabilities for Predicting Grain Yield and Detecting Plant Diseases

    NASA Astrophysics Data System (ADS)

    Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S.

    2016-06-01

    In this paper we investigate the performance of new light-weight multispectral sensors for micro UAVs and their application to selected tasks in agronomical research and agricultural practice. The investigations are based on a series of flight campaigns in 2014 and 2015 covering a number of agronomical test sites with experiments on rape, barley, onion, potato and other crops. In our sensor comparison we included a high-end multispectral multiSPEC 4C camera with bandpass colour filters and a reference channel in the zenith direction, and a low-cost, consumer-grade Canon S110 NIR camera with Bayer pattern colour filters. Ground-based reference measurements were obtained using a terrestrial hyperspectral field spectrometer. The investigations show that measurements with the high-end system consistently match ground-based field spectrometer measurements very well, with a mean deviation of just 0.01-0.04 NDVI values. The low-cost system, while delivering better spatial resolution, exhibited significant biases. The sensors were subsequently used to address selected agronomical questions. These included crop yield estimation in rape and barley and plant disease detection in potato and onion cultivations. High levels of correlation between different vegetation indices and reference yield measurements were obtained for rape and barley. In the case of barley, the NDRE index shows an average correlation of 87% with reference yield when species are taken into account. With high geometric resolutions and ground sampling distances (GSD) down to 2.5 cm, the effects of a thrips infestation in onion could be analysed and potato blight was successfully detected at an early stage of infestation.
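
    For reference, the two vegetation indices named above are simple band ratios; the following minimal Python functions compute them from co-registered reflectance bands (the band names and array shapes are assumptions, not the authors' processing chain).

        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index from NIR and red reflectance."""
            nir, red = np.asarray(nir, float), np.asarray(red, float)
            return (nir - red) / np.maximum(nir + red, 1e-12)

        def ndre(nir, red_edge):
            """Normalized Difference Red Edge index from NIR and red-edge reflectance."""
            nir, re = np.asarray(nir, float), np.asarray(red_edge, float)
            return (nir - re) / np.maximum(nir + re, 1e-12)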

  14. Comparison of multispectral remote-sensing techniques for monitoring subsurface drain conditions. [Imperial Valley, California

    NASA Technical Reports Server (NTRS)

    Goettelman, R. C.; Grass, L. B.; Millard, J. P.; Nixon, P. R.

    1983-01-01

    The following multispectral remote-sensing techniques were compared to determine the most suitable method for routinely monitoring agricultural subsurface drain conditions: airborne scanning, covering the visible through thermal-infrared (IR) portions of the spectrum; color-IR photography; and natural-color photography. Color-IR photography was determined to be the best approach, from the standpoint of both cost and information content. Aerial monitoring of drain conditions for early warning of tile malfunction appears practical. With careful selection of season and rain-induced soil-moisture conditions, extensive regional surveys are possible. Certain locations, such as the Imperial Valley, Calif., are precluded from regional monitoring because of year-round crop rotations and soil stratification conditions. Here, farms with similar crops could time local coverage for bare-field and saturated-soil conditions.

  15. Common aperture multispectral optics for military applications

    NASA Astrophysics Data System (ADS)

    Thompson, N. A.

    2012-06-01

    With the recent developments in multi-spectral detector technology the interest in common aperture, common focal plane multi-spectral imaging systems is increasing. Such systems are particularly desirable for military applications where increased levels of target discrimination and identification are required in cost-effective, rugged, lightweight systems. During the optical design of dual waveband or multi-spectral systems, the options for material selection are limited. This selection becomes even more restrictive for military applications as material resilience and thermal properties must be considered in addition to colour correction. In this paper we discuss the design challenges that lightweight multi-spectral common aperture systems present along with some potential design solutions. Consideration will be given to material selection for optimum colour correction as well as material resilience and thermal correction. This discussion is supported using design examples that are currently in development at Qioptiq.

  16. Nanohole-array-based device for 2D snapshot multispectral imaging

    PubMed Central

    Najiminaini, Mohamadreza; Vasefi, Fartash; Kaminska, Bozena; Carson, Jeffrey J. L.

    2013-01-01

    We present a two-dimensional (2D) snapshot multispectral imager that utilizes the optical transmission characteristics of nanohole arrays (NHAs) in a gold film to resolve a mixture of input colors into multiple spectral bands. The multispectral device consists of blocks of NHAs, wherein each NHA has a unique periodicity that results in transmission resonances and minima in the visible and near-infrared regions. The multispectral device was illuminated over a wide spectral range, and the transmission was spectrally unmixed using a least-squares estimation algorithm. A NHA-based multispectral imaging system was built and tested in both reflection and transmission modes. The NHA-based multispectral imager was capable of extracting 2D multispectral images representative of four independent bands within the spectral range of 662 nm to 832 nm for a variety of targets. The multispectral device can potentially be integrated into a variety of imaging sensor systems. PMID:24005065
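
    The least-squares unmixing step mentioned above can be sketched as an ordinary linear inverse problem; the Python snippet below illustrates one plausible formulation, with the calibration basis and array shapes assumed rather than taken from the paper.

        import numpy as np

        def unmix_least_squares(measured, basis):
            """Estimate per-band weights from a mixed transmission measurement.

            measured : (n_samples,) transmission measured through one NHA block
            basis    : (n_samples, n_bands) calibration transmission of each spectral band
            Returns the least-squares weights of the n_bands spectral components.
            """
            weights, *_ = np.linalg.lstsq(np.asarray(basis, float),
                                          np.asarray(measured, float), rcond=None)
            return weights

        # Applying this per pixel over the NHA blocks yields the 2D multispectral bands.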

  17. Application of the NASA airborne oceanographic lidar to the mapping of chlorophyll and other organic pigments

    NASA Technical Reports Server (NTRS)

    Hoge, F. E.; Swift, R. N.

    1981-01-01

    Laser fluorosensing techniques used for the airborne measurement of chlorophyll a and other naturally occurring waterborne pigments are reviewed. Previous experiments demonstrating the utility of the airborne oceanographic lidar (AOL) for assessment of various marine parameters are briefly discussed. The configuration of the AOL during the NOAA/NASA Superflux experiments is described. The participation of the AOL in these experiments is presented and the preliminary results are discussed. The importance of multispectral receiving capability in a laser fluorosensing system for providing reproducible measurements over wide areas having spatial variations in water column transmittance properties is addressed. This capability minimizes the number of truthing points required and is usable even in shallow estuarine areas where resuspension of bottom sediment is common. Finally, problems encountered on the Superflux missions and the resulting limitations on the AOL data sets are addressed and feasible solutions to these problems are provided.

  18. Real-time implementation of a multispectral mine target detection algorithm

    NASA Astrophysics Data System (ADS)

    Samson, Joseph W.; Witter, Lester J.; Kenton, Arthur C.; Holloway, John H., Jr.

    2003-09-01

    Spatial-spectral anomaly detection (the "RX Algorithm") has been exploited on the USMC's Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) and several associated technology base studies, and has been found to be a useful method for the automated detection of surface-emplaced antitank land mines in airborne multispectral imagery. RX is a complex image processing algorithm that involves the direct spatial convolution of a target/background mask template over each multispectral image, coupled with a spatially variant background spectral covariance matrix estimation and inversion. The RX throughput on the ATD was about 38X real time using a single Sun UltraSparc system. A goal to demonstrate RX in real-time was begun in FY01. We now report the development and demonstration of a Field Programmable Gate Array (FPGA) solution that achieves a real-time implementation of the RX algorithm at video rates using COBRA ATD data. The approach uses an Annapolis Microsystems Firebird PMC card containing a Xilinx XCV2000E FPGA with over 2,500,000 logic gates and 18MBytes of memory. A prototype system was configured using a Tek Microsystems VME board with dual-PowerPC G4 processors and two PMC slots. The RX algorithm was translated from its C programming implementation into the VHDL language and synthesized into gates that were loaded into the FPGA. The VHDL/synthesizer approach allows key RX parameters to be quickly changed and a new implementation automatically generated. Reprogramming the FPGA is done rapidly and in-circuit. Implementation of the RX algorithm in a single FPGA is a major first step toward achieving real-time land mine detection.
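
    For orientation, the core of the RX statistic is a Mahalanobis distance of each pixel spectrum from a background estimate. The sketch below implements the simpler global-covariance variant in Python; the spatially variant, sliding-window background estimation and the FPGA mapping described above are not reproduced here.

        import numpy as np

        def rx_anomaly(cube):
            """Global RX anomaly score for a multispectral cube (rows, cols, bands).

            Returns the squared Mahalanobis distance of each pixel spectrum from the
            scene mean; large values flag spectral anomalies such as surface targets.
            """
            rows, cols, bands = cube.shape
            x = cube.reshape(-1, bands).astype(float)
            mu = x.mean(axis=0)
            cov = np.cov(x, rowvar=False)
            cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(bands))   # regularized inverse
            d = x - mu
            scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
            return scores.reshape(rows, cols)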

  19. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

    Visibility and clarity of remotely sensed images acquired by consumer grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog or gaseous smoke particles, caused, for example, by an active bushfire at the time of data capture, will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow bands in the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or to assist search and rescue or similar applications which require high resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution using a single modified DSLR camera in conjunction with image processing techniques that effectively improve the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200 nm (R, G and Near Infrared) instead of the standard RGB bands (400-700 nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Processed data using our proposed method show significant visibility improvements compared with other existing solutions.

  20. Development of infrared scene projectors for testing fire-fighter cameras

    NASA Astrophysics Data System (ADS)

    Neira, Jorge E.; Rice, Joseph P.; Amon, Francine K.

    2008-04-01

    We have developed two types of infrared scene projectors for hardware-in-the-loop testing of thermal imaging cameras such as those used by fire-fighters. In one, direct projection, images are projected directly into the camera. In the other, indirect projection, images are projected onto a diffuse screen, which is then viewed by the camera. Both projectors use a digital micromirror array as the spatial light modulator, in the form of a Micromirror Array Projection System (MAPS) engine having a resolution of 800 x 600 with aluminum-coated mirrors on a 17 micrometer pitch and a ZnSe protective window. Fire-fighter cameras are often based upon uncooled microbolometer arrays and typically have resolutions of 320 x 240 or lower. For direct projection, we use an argon-arc source, which provides spectral radiance equivalent to a 10,000 Kelvin blackbody over the 7 micrometer to 14 micrometer wavelength range, to illuminate the micromirror array. For indirect projection, an expanded 4 W CO2 laser beam at a wavelength of 10.6 micrometers illuminates the micromirror array and the scene formed by the first-order diffracted light from the array is projected onto a diffuse aluminum screen. In both projectors, a well-calibrated reference camera is used to provide non-uniformity correction and brightness calibration of the projected scenes, and the fire-fighter cameras alternately view the same scenes. In this paper, we compare the two methods for this application and report on our quantitative results. Indirect projection has the advantage of being able to more easily fill the wide field of view of the fire-fighter cameras, which is typically about 50 degrees. Direct projection more efficiently utilizes the available light, which will become important in emerging multispectral and hyperspectral applications.

  1. Multispectral histogram normalization contrast enhancement

    NASA Technical Reports Server (NTRS)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
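
    A decorrelation (histogram normalization) enhancement of this kind can be sketched as a principal-component whitening followed by a rotation back to the original band axes. The Python example below illustrates that idea; the target standard deviation and the exact transform used in the paper are assumptions.

        import numpy as np

        def decorrelation_stretch(cube, target_std=50.0):
            """Decorrelation stretch of a multispectral image (rows, cols, bands).

            Rotates the data to principal components, equalizes the component
            variances, then rotates back so band-to-band correlation is removed
            while the original colour axes are retained.
            """
            rows, cols, bands = cube.shape
            x = cube.reshape(-1, bands).astype(float)
            mu = x.mean(axis=0)
            cov = np.cov(x - mu, rowvar=False)
            eigval, eigvec = np.linalg.eigh(cov)
            scale = target_std / np.sqrt(np.maximum(eigval, 1e-12))
            transform = eigvec @ np.diag(scale) @ eigvec.T          # whiten, then rotate back
            stretched = (x - mu) @ transform + mu
            return stretched.reshape(rows, cols, bands)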

  2. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Vargo, Timothy D.; Lockhart, Randal R.; Descour, Michael R.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging method and apparatus adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging.

  3. Software defined multi-spectral imaging for Arctic sensor networks

    NASA Astrophysics Data System (ADS)

    Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi

    2016-05-01

    Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop

  4. Solar-Powered Airplane with Cameras and WLAN

    NASA Technical Reports Server (NTRS)

    Higgins, Robert G.; Dunagan, Steve E.; Sullivan, Don; Slye, Robert; Brass, James; Leung, Joe G.; Gallmeyer, Bruce; Aoyagi, Michio; Wei, Mei Y.; Herwitz, Stanley R.

    2004-01-01

    An experimental airborne remote sensing system includes a remotely controlled, lightweight, solar-powered airplane (see figure) that carries two digital-output electronic cameras and communicates with a nearby ground control and monitoring station via a wireless local-area network (WLAN). The speed of the airplane -- typically <50 km/h -- is low enough to enable loitering over farm fields, disaster scenes, or other areas of interest to collect high-resolution digital imagery that could be delivered to end users (e.g., farm managers or disaster-relief coordinators) in nearly real time.

  5. Polarimetric Multispectral Imaging Technology

    NASA Technical Reports Server (NTRS)

    Cheng, L.-J.; Chao, T.-H.; Dowdy, M.; Mahoney, C.; Reyes, G.

    1993-01-01

    The Jet Propulsion Laboratory is developing a remote sensing technology on which a new generation of compact, lightweight, high-resolution, low-power, reliable, versatile, programmable scientific polarimetric multispectral imaging instruments can be built to meet the challenge of future planetary exploration missions. The instrument is based on the fast programmable acousto-optic tunable filter (AOTF) of tellurium dioxide (TeO2) that operates in the wavelength range of 0.4-5 microns. Basically, the AOTF multispectral imaging instrument measures incoming light intensity as a function of spatial coordinates, wavelength, and polarization. Its operation can be in either sequential, random access, or multiwavelength mode as required. This provides observation flexibility, allowing real-time alternation among desired observations, collecting needed data only, minimizing data transmission, and permitting implementation of new experiments. These will result in optimization of the mission performance with minimal resources. Recently we completed a polarimetric multispectral imaging prototype instrument and performed outdoor field experiments for evaluating application potentials of the technology. We also investigated potential improvements on AOTF performance to strengthen technology readiness for applications. This paper will give a status report on the technology and a prospect toward future planetary exploration.

  6. A mobile device-based imaging spectrometer for environmental monitoring by attaching a lightweight small module to a commercial digital camera.

    PubMed

    Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing

    2017-11-15

    Spatially-explicit data are essential for remote sensing of ecological phenomena. Recent innovations in mobile device platforms have led to an upsurge in rapid on-site detection. For instance, CMOS chips in smart phones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and mounted on a single-lens reflex camera. Utilizing this lightweight module, as well as commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging experiments, including ocean (or lake) water-color sensing and plant reflectance measurement. Based on the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial cameras and underwater cameras. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation in many applications.

  7. Principal component analysis and linear discriminant analysis of multi-spectral autofluorescence imaging data for differentiating basal cell carcinoma and healthy skin

    NASA Astrophysics Data System (ADS)

    Chernomyrdin, Nikita V.; Zaytsev, Kirill I.; Lesnichaya, Anastasiya D.; Kudrin, Konstantin G.; Cherkasova, Olga P.; Kurlov, Vladimir N.; Shikunova, Irina A.; Perchik, Alexei V.; Yurchenko, Stanislav O.; Reshetov, Igor V.

    2016-09-01

    In the present paper, the ability to differentiate basal cell carcinoma (BCC) from healthy skin by combining multi-spectral autofluorescence imaging, principal component analysis (PCA), and linear discriminant analysis (LDA) is demonstrated. For this purpose, an experimental setup, which includes excitation and detection branches, has been assembled. The excitation branch utilizes a mercury arc lamp equipped with a 365-nm narrow-linewidth excitation filter, a beam homogenizer, and a mechanical chopper. The detection branch employs a set of bandpass filters with central wavelengths of spectral transparency of λ = 400, 450, 500, and 550 nm, and a digital camera. The setup has been used to study three samples of freshly excised BCC. PCA and LDA have been implemented to analyze the data of multi-spectral fluorescence imaging. The observed results of this pilot study highlight the advantages of the proposed imaging technique for skin cancer diagnosis.
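
    As an illustration of the PCA plus LDA analysis chain described above, the sketch below fits both steps to per-pixel band intensities using scikit-learn; the feature layout, labels and number of components are assumptions for illustration, not the authors' processing.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def fit_autofluorescence_classifier(pixels, labels, n_components=3):
            """Fit a PCA + LDA pipeline to per-pixel multi-band intensities.

            pixels : (n_pixels, n_bands) autofluorescence intensities,
                     e.g. the 400/450/500/550 nm detection bands
            labels : (n_pixels,) 0 = healthy skin, 1 = BCC
            """
            pca = PCA(n_components=n_components).fit(pixels)
            scores = pca.transform(pixels)
            lda = LinearDiscriminantAnalysis().fit(scores, labels)
            return pca, lda

        # New pixels are then classified with lda.predict(pca.transform(new_pixels)).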

  8. Multispectral computational ghost imaging with multiplexed illumination

    NASA Astrophysics Data System (ADS)

    Huang, Jian; Shi, Dongfeng

    2017-07-01

    Computational ghost imaging has attracted wide attention from researchers in many fields over the last two decades. Multispectral imaging as one application of computational ghost imaging possesses spatial and spectral resolving abilities, and is very useful for surveying scenes and extracting detailed information. Existing multispectral imagers mostly utilize narrow band filters or dispersive optical devices to separate light of different wavelengths, and then use multiple bucket detectors or an array detector to record them separately. Here, we propose a novel multispectral ghost imaging method that uses one single bucket detector with multiplexed illumination to produce a colored image. The multiplexed illumination patterns are produced by three binary encoded matrices (corresponding to the red, green and blue colored information, respectively) and random patterns. The results of the simulation and experiment have verified that our method can be effective in recovering the colored object. Multispectral images are produced simultaneously by one single-pixel detector, which significantly reduces the amount of data acquisition.
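
    The reconstruction at the heart of computational ghost imaging is a correlation between the projected patterns and the single-pixel (bucket) signal. The Python sketch below shows that basic step; the multiplexed colour demultiplexing described above is only indicated in a comment, and its encoding matrices are an assumption here.

        import numpy as np

        def ghost_reconstruct(patterns, bucket):
            """Correlation-based reconstruction for computational ghost imaging.

            patterns : (n_meas, rows, cols) illumination patterns projected on the scene
            bucket   : (n_meas,) total intensity recorded by the single-pixel detector
            """
            patterns = np.asarray(patterns, float)
            bucket = np.asarray(bucket, float)
            mean_pat = patterns.mean(axis=0)
            mean_b = bucket.mean()
            # <B * P> - <B><P>, evaluated per pixel
            image = np.tensordot(bucket - mean_b, patterns - mean_pat, axes=1) / len(bucket)
            return image

        # For the multiplexed colour scheme, the red/green/blue images would be
        # reconstructed after demultiplexing the bucket signal with the corresponding
        # binary encoding matrices (assumed here, not reproduced from the paper).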

  9. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.

  10. Study on multispectral imaging detection and recognition

    NASA Astrophysics Data System (ADS)

    Jun, Wang; Na, Ding; Gao, Jiaobo; Yu, Hu; Jun, Wu; Li, Junna; Zheng, Yawei; Fei, Gao; Sun, Kefeng

    2009-07-01

    Multispectral imaging detection technology uses the spectral and spatial distribution of target radiation, and the relation between spectrum and image, to detect targets and make remote sensing measurements. Its characteristics are multiple channels, narrow bandwidth, a large amount of information, and high accuracy. The ability to detect targets in conditions of clutter, camouflage, concealment and deception is thereby improved. At present, spectral imaging technology in both the multispectral and hyperspectral ranges is developing rapidly. Multispectral imaging equipment on unmanned aerial vehicles can be used for mine detection, intelligence, surveillance and reconnaissance. Imaging spectrometers operating in the MWIR and LWIR have already been applied in the remote sensing and military fields in advanced countries. The paper presents the technology of multispectral imaging. It can enhance the reflectance, scattering and radiation contrast of artificial targets against natural backgrounds, so that targets in complex backgrounds and camouflaged or stealth targets can be effectively identified. Experimental results and spectral imaging data are presented.

  11. Development, characterization, and modeling of a tunable filter camera

    NASA Astrophysics Data System (ADS)

    Sartor, Mark Alan

    1999-10-01

    This paper describes the development, characterization, and modeling of a Tunable Filter Camera (TFC). The TFC is a new multispectral instrument with electronically tuned spectral filtering and low-light-level sensitivity. It represents a hybrid between hyperspectral and multispectral imaging spectrometers that incorporates advantages from each, addressing issues such as complexity, cost, lack of sensitivity, and adaptability. These capabilities allow the TFC to be applied to low-altitude video surveillance for real-time spectral and spatial target detection and image exploitation. Described herein are the theory and principles of operation for the TFC, which includes a liquid crystal tunable filter, an intensified CCD, and a custom apochromatic lens. The results of proof-of-concept testing and characterization of two prototype cameras are included, along with a summary of the design analyses for the development of a multiple-channel system. A significant result of this effort was the creation of a system-level model, which was used to facilitate development and predict performance. It includes models for the liquid crystal tunable filter and intensified CCD. Such modeling was necessary in the design of the system and is useful for evaluation of the system in remote-sensing applications. Also presented are characterization data from component testing, which included quantitative results for linearity, signal-to-noise ratio (SNR), and radiometric response. These data were used to help refine and validate the model. For a pre-defined source, the spatial and spectral response and the noise of the camera system can now be predicted. The innovation that sets this development apart is the fact that this instrument has been designed for integrated, multi-channel operation for the express purpose of real-time detection/identification in low-light-level conditions. Many of the requirements for the TFC were derived from this mission. In order to provide

  12. Analysis of remote sensing data collected for detection and mapping of oil spills: Reduction and analysis of multi-sensor airborne data of the NASA Wallops oil spill exercise of November 1978

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Airborne, remotely sensed data of the NASA Wallops controlled oil spill were corrected, reduced and analysed. Sensor performance comparisons were made by registering data sets from different sensors, which were near-coincident in time and location. Multispectral scanner images were, in turn, overlaid with profiles of correlation between airborne and laboratory-acquired fluorosensor spectra of oil; oil-thickness contours derived (by NASA) from a scanning fluorosensor and also from a two-channel scanning microwave radiometer; and synthetic aperture radar X-HH images. Microwave scatterometer data were correlated with dual-channel (UV and TIR) line scanner images of the oil slick.

  13. Airborne multispectral remote sensing data to estimate several oenological parameters in vineyard production. A case study of application of remote sensing data to precision viticulture in central Italy.

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Girard, Filippo; Belli, Claudio; Comandini, Maria Cristina; Pietromarchi, Paolo; Tiberi, Domenico; Papale, Dario

    2010-05-01

    It is widely recognized that environmental differences within the vineyard, with respect to soils, microclimate, and topography, can influence grape characteristics and crop yields. In addition, the central Italian landscape is characterized by a high level of fragmentation and heterogeneity, which requires stringent remote sensing technical features in terms of spectral, geometric and temporal resolution to support applications for precision viticulture. In response to the needs of the Italian grape and wine industry for an evaluation of precision viticulture technologies, the DISAFRI (University of Tuscia) and the Agricultural Research Council - Oenological research unit (ENC-CRA) jointly carried out an experimental study during the year 2008. The study was carried out on 2 areas located in the town of Velletri, near Rome; for each area, two varieties (red and white grape) were studied: Nero d'Avola and Sauvignon blanc in the first area, Merlot and Sauvignon blanc in the second. Remote sensing data were acquired in different periods using a low cost multisensor airborne remote sensing platform developed by DISAFRI (ASPIS-2, Advanced Spectroscopic Imager System). ASPIS-2, an evolution of the ASPIS sensor (Papale et al 2008, Sensors), is a multispectral sensor based on 4 CCDs and 3 interferential filters per CCD. The filters are user selectable during the flight, and in this way ASPIS is able to acquire data in 12 bands in the visible and near infrared regions with a bandwidth of 10 or 20 nm. For the purposes of this study 7 spectral bands were acquired and 15 vegetation indices calculated. During the ripening period several vegetative and oenochemical parameters were monitored. ANOVA tests showed that several oenochemical variables, such as sugars, total acidity, polyphenols and anthocyanins, differ according to the variety taken into consideration. In order to evaluate the temporal autocorrelation of several oenological parameter values, a simple linear regression between

  14. Remote sensing of the earth's surface with an airborne polarized laser

    NASA Technical Reports Server (NTRS)

    Kalshoven, James E.; Dabney, Philip W.

    1993-01-01

    Attention is given to the Airborne Laser Polarization Sensor (ALPS), which makes multispectral radiometric and polarization measurements of the earth's surface using a polarized laser light source. Results from data flights taken over boreal forests in Maine at two wavelengths (1060 and 532 nm) using an Nd:YAG laser source show distinct depolarization signatures for three broadleaf and five coniferous tree species. A statistically significant increase in depolarization is found to correlate with increasing leaf surface roughness for the broadleaf species in the near-IR. The ALPS system employs 12 photomultiplier tube detectors configurable to measure desired parameters such as the total backscatter and the polarization state, including the azimuthal angle and ellipticity, at different UV to near-IR wavelengths simultaneously.

  15. A method for measuring aircraft height and velocity using dual television cameras

    NASA Technical Reports Server (NTRS)

    Young, W. R.

    1977-01-01

    A unique electronic optical technique, consisting of two closed circuit television cameras and timing electronics, was devised to measure an aircraft's horizontal velocity and height above ground without the need for airborne cooperative devices. The system is intended to be used where the aircraft has a predictable flight path and a height of less than 660 meters (2,000 feet) at or near the end of an air terminal runway, but is suitable for greater aircraft altitudes whenever the aircraft remains visible. Two television cameras, pointed at zenith, are placed in line with the expected path of travel of the aircraft. Velocity is determined by measuring the time it takes the aircraft to travel the measured distance between cameras. Height is determined by correlating this speed with the time required to cross the field of view of either camera. Preliminary tests with a breadboard version of the system and a small model aircraft indicate the technique is feasible.
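
    The geometry described above reduces to two simple relations: ground speed is the camera baseline divided by the transit time between cameras, and height follows from the time needed to cross one camera's field of view. The Python sketch below illustrates this; the parameter names and example numbers are chosen for illustration only.

        import math

        def aircraft_velocity_height(baseline_m, t_between_s, t_cross_s, fov_deg):
            """Estimate aircraft ground speed and height from two zenith-pointing cameras.

            baseline_m  : measured distance between the two cameras along the flight path
            t_between_s : time for the aircraft to travel from one camera to the other
            t_cross_s   : time for the aircraft to cross the field of view of one camera
            fov_deg     : full along-track field of view of that camera, in degrees
            """
            velocity = baseline_m / t_between_s                  # ground speed [m/s]
            footprint = velocity * t_cross_s                     # ground length spanned by the FOV
            height = footprint / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
            return velocity, height

        # Example: cameras 200 m apart, 4 s between overflights, 1.5 s to cross a 40-degree
        # field of view -> about 50 m/s ground speed and roughly 100 m altitude.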

  16. Target-Tracking Camera for a Metrology System

    NASA Technical Reports Server (NTRS)

    Liebe, Carl; Bartman, Randall; Chapsky, Jacob; Abramovici, Alexander; Brown, David

    2009-01-01

    An analog electronic camera that is part of a metrology system measures the varying direction to a light-emitting diode that serves as a bright point target. In the original application for which the camera was developed, the metrological system is used to determine the varying relative positions of radiating elements of an airborne synthetic aperture-radar (SAR) antenna as the airplane flexes during flight; precise knowledge of the relative positions as a function of time is needed for processing SAR readings. It has been common metrology system practice to measure the varying direction to a bright target by use of an electronic camera of the charge-coupled-device or active-pixel-sensor type. A major disadvantage of this practice arises from the necessity of reading out and digitizing the outputs from a large number of pixels and processing the resulting digital values in a computer to determine the centroid of a target: Because of the time taken by the readout, digitization, and computation, the update rate is limited to tens of hertz. In contrast, the analog nature of the present camera makes it possible to achieve an update rate of hundreds of hertz, and no computer is needed to determine the centroid. The camera is based on a position-sensitive detector (PSD), which is a rectangular photodiode with output contacts at opposite ends. PSDs are usually used in triangulation for measuring small distances. PSDs are manufactured in both one- and two-dimensional versions. Because it is very difficult to calibrate two-dimensional PSDs accurately, the focal-plane sensors used in this camera are two orthogonally mounted one-dimensional PSDs.
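
    For a one-dimensional PSD, the spot position follows directly from the two end-contact photocurrents, which is why no pixel readout or centroiding computation is needed. A minimal sketch of that relation, with hypothetical argument names, is given below.

        def psd_position(i_a, i_b, length_mm):
            """Spot position on a one-dimensional position-sensitive detector (PSD).

            i_a, i_b  : photocurrents at the two end contacts
            length_mm : active length of the PSD
            Returns the centroid position measured from the detector centre; the result
            is available as fast as the two currents can be digitized.
            """
            return 0.5 * length_mm * (i_b - i_a) / (i_a + i_b)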

  17. A multispectral sorting device for wheat kernels

    USDA-ARS?s Scientific Manuscript database

    A low-cost multispectral sorting device was constructed using three visible and three near-infrared light-emitting diodes (LED) with peak emission wavelengths of 470 nm (blue), 527 nm (green), 624 nm (red), 850 nm, 940 nm, and 1070 nm. The multispectral data were collected by rapidly (~12 kHz) blin...

  18. A multispectral imaging approach for diagnostics of skin pathologies

    NASA Astrophysics Data System (ADS)

    Lihacova, Ilze; Derjabo, Aleksandrs; Spigulis, Janis

    2013-06-01

    A noninvasive multispectral imaging method was applied to the diagnostics of different skin pathologies such as nevi, basal cell carcinoma, and melanoma. A melanoma diagnostic parameter, using three spectral bands (540 nm, 650 nm and 950 nm), was developed and calculated for nevi, melanoma and basal cell carcinoma. A simple multispectral diagnostic device was built and applied for skin assessment. The development and application of the multispectral diagnostic method are described further in this article.

  19. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Vargo, T.D.; Lockhart, R.R.; Descour, M.R.; Richards-Kortum, R.

    1999-07-06

    A multispectral imaging method and apparatus are described which are adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging. 5 figs.

  20. Using the thermal infrared multispectral scanner (TIMS) to estimate surface thermal responses

    NASA Technical Reports Server (NTRS)

    Luvall, J. C.; Holbo, H. R.

    1987-01-01

    A series of measurements was conducted over the H.J. Andrews experimental coniferous forest in Oregon using the airborne Thermal Infrared Multispectral Scanner (TIMS). Flight lines overlapped, with a 28-min time difference between flight lines. Concurrent radiosonde measurements of atmospheric profiles of air temperature and moisture were used for atmospheric radiance corrections of the TIMS data. Surface temperature differences over time between flight lines, together with the measured incoming solar radiation, were used to develop thermal response numbers (TRNs), in kJ/sq m/deg C, which characterized the thermal response of the different surface types. The surface types included a mature forest (canopy dominated by dense crowns of Pseudotsuga menziesii, with a secondary canopy of dense Tsuga heterophylla and a tall shrub layer of Acer circinatum) and a two-year-old clear-cut. The temperature distribution within the TIMS thermal images was found to reflect the surface type examined. The clear-cut surface had the lowest TRN, while the mature Douglas-fir forest had the highest.
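
    The abstract gives the TRN essentially by its units; one plausible reading, used in the sketch below, is the incoming solar radiation accumulated between the two flight lines divided by the observed surface temperature change. Treat the exact formula as an assumption rather than the authors' definition.

        def thermal_response_number(solar_kj_per_m2, temp_first_c, temp_second_c):
            """Thermal response number (TRN) for one surface type.

            solar_kj_per_m2 : incoming solar radiation measured between the two flights [kJ/m^2]
            temp_first_c    : mean surface temperature on the first flight line [deg C]
            temp_second_c   : mean surface temperature on the second flight line [deg C]
            Returns TRN in kJ m^-2 C^-1; larger values indicate a more damped thermal
            response, as reported above for the mature Douglas-fir canopy.
            """
            delta_t = temp_second_c - temp_first_c
            return solar_kj_per_m2 / delta_t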

  1. Skin Parameter Map Retrieval from a Dedicated Multispectral Imaging System Applied to Dermatology/Cosmetology

    PubMed Central

    2013-01-01

    In vivo quantitative assessment of skin lesions is an important step in the evaluation of skin condition. An objective measurement device can help as a valuable tool for skin analysis. We propose an explorative new multispectral camera specifically developed for dermatology/cosmetology applications. The multispectral imaging system provides images of skin reflectance at different wavebands covering the visible and near-infrared domain. It is coupled with a neural network-based algorithm for the reconstruction of the reflectance cube of cutaneous data. This cube contains only the skin optical reflectance spectrum in each pixel of the bidimensional spatial information. The reflectance cube is analyzed by an algorithm based on a Kubelka-Munk model combined with an evolutionary algorithm. The technique allows quantitative measurement of cutaneous tissue and retrieves five skin parameter maps: melanin concentration, epidermis thickness, dermis thickness, haemoglobin concentration, and oxygenated haemoglobin. The results retrieved on healthy participants by the algorithm are in good accordance with data from the literature. The usefulness of the developed technique was demonstrated in two experiments: a clinical study based on vitiligo and melasma skin lesions, and a skin oxygenation experiment (induced ischemia) with a healthy participant, where normal tissue is recorded both at its normal state and when temporary ischemia is induced. PMID:24159326

  2. Modeling of estuarine chlorophyll a from an airborne scanner

    USGS Publications Warehouse

    Khorram, Siamak; Catts, Glenn P.; Cloern, James E.; Knight, Allen W.

    1987-01-01

    Near simultaneous collection of 34 surface water samples and airborne multispectral scanner data provided input for regression models developed to predict surface concentrations of estuarine chlorophyll a. Two wavelength ratios were employed in model development. The ratios were chosen to capitalize on the spectral characteristics of chlorophyll a, while minimizing atmospheric influences. Models were then applied to data previously acquired over the study area three years earlier. Results are in the form of color-coded displays of predicted chlorophyll a concentrations and comparisons of the agreement among measured surface samples and predictions based on coincident remotely sensed data. The influence of large variations in fresh-water inflow to the estuary is clearly apparent in the results. The synoptic view provided by remote sensing is another method of examining important estuarine dynamics that are difficult to observe from in situ sampling alone.
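
    A regression model of the kind described above can be sketched as an ordinary least-squares fit of chlorophyll a against the two wavelength ratios; the linear form and variable names below are assumptions for illustration, since the record does not give the model equation.

        import numpy as np

        def fit_band_ratio_model(ratio1, ratio2, chl_a):
            """Fit a linear regression predicting chlorophyll a from two band ratios.

            ratio1, ratio2 : (n,) wavelength-ratio values extracted from the scanner
                             data at the surface sample locations
            chl_a          : (n,) measured surface chlorophyll a concentrations
            Returns coefficients (b0, b1, b2) of  chl_a ~ b0 + b1*ratio1 + b2*ratio2.
            """
            ratio1 = np.asarray(ratio1, float)
            ratio2 = np.asarray(ratio2, float)
            X = np.column_stack([np.ones_like(ratio1), ratio1, ratio2])
            coef, *_ = np.linalg.lstsq(X, np.asarray(chl_a, float), rcond=None)
            return coef

        # The fitted coefficients can then be applied pixel-by-pixel to the scanner
        # ratios to map predicted chlorophyll a over the whole scene.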

  3. Light, shadows and surface characteristics: the multispectral Portable Light Dome

    NASA Astrophysics Data System (ADS)

    Watteeuw, Lieve; Hameeuw, Hendrik; Vandermeulen, Bruno; Van der Perre, Athena; Boschloos, Vanessa; Delvaux, Luc; Proesmans, Marc; Van Bos, Marina; Van Gool, Luc

    2016-11-01

    A multispectral, multidirectional, portable and dome-shaped acquisition system is developed within the framework of the research projects RICH (KU Leuven) and EES (RMAH, Brussels) in collaboration with the ESAT-VISICS research group (KU Leuven). The multispectral Portable Light Dome (MS PLD) consists of a hemispherical structure, an overhead camera and LEDs emitting in five parts of the electromagnetic spectrum regularly covering the dome's inside surface. With the associated software solution, virtual relighting and enhancements can be applied in a real-time, interactive manner. The system extracts genuine 3D and shading information based on a photometric stereo algorithm. This innovative approach allows for instantaneous alternations between the computations in the infrared, red, green, blue and ultraviolet spectra. The MS PLD system has been tested for research ranging from medieval manuscript illuminations to ancient Egyptian artefacts. Preliminary results have shown that it documents and measures the 3D surface structure of objects, re-visualises underdrawings, faded pigments and inscriptions, and examines the MS results in combination with the actual relief characteristics of the physical object. Newly developed features are reflection maps and histograms, analytic visualisations of the reflection properties of all separate LEDs or selected areas. In its capacity as imaging technology, the system acts as a tool for the analysis of surface materials (e.g. identification of blue pigments, gold and metallic surfaces). Besides offering support in answering questions of attribution and monitoring changes and decay of materials, the PLD also contributes to the identification of materials, all essential factors when making decisions in the conservation protocol.
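
    The photometric stereo step mentioned above recovers per-pixel surface orientation from images taken under the dome's individual LEDs. The Python sketch below shows the classical least-squares formulation for a Lambertian surface; the MS PLD's actual algorithm, calibration and multispectral handling are not reproduced here.

        import numpy as np

        def photometric_stereo(images, light_dirs):
            """Per-pixel surface normals and albedo from multi-directional images.

            images     : (n_lights, rows, cols) intensity images, one per LED
            light_dirs : (n_lights, 3) unit vectors pointing from the object to each LED
            Solves I = L . (albedo * n) per pixel in the least-squares sense.
            """
            n_lights, rows, cols = images.shape
            I = images.reshape(n_lights, -1).astype(float)           # stack all pixels
            L = np.asarray(light_dirs, float)
            G, *_ = np.linalg.lstsq(L, I, rcond=None)                # (3, rows*cols)
            albedo = np.linalg.norm(G, axis=0)
            normals = G / np.maximum(albedo, 1e-12)
            return normals.reshape(3, rows, cols), albedo.reshape(rows, cols)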

  4. Evaluation of port-wine stain treatment outcomes using multispectral imaging

    NASA Astrophysics Data System (ADS)

    Samatham, Ravikant; Choudhury, Niloy; Krol, Alfons L.; Jacques, Steven L.

    2012-02-01

    Port-wine Stain (PWS) is a vascular malformation characterized by ectasia of superficial dermal capillaries. The flash-lamp pumped pulsed dye laser (PDL) has been the mainstay of PWS treatment for the last decade. Despite the success of the PDL in significantly fading the PWS, the overall cure rate is less than 10%. The precise efficacy of an individual PDL treatment is hard to evaluate, and the treatment outcome is measured by visual observation of clinical fading. A hand-held multi-spectral imaging system was developed to image PWS before and after PDL treatment. In an NIH-funded pilot study, the multi-spectral camera was used to image PWS in children (2-17 years). Oxygen saturation (S) and blood content (B) of PWS before and after the treatment were determined by analysis of the reflectance spectra. The outcome of the treatment was evaluated during follow-up visits of the patients. One of the major causes of failure of laser therapy of port-wine stains (PWS) is reperfusion of the lesion after laser treatment. Oxygen saturation and blood content maps of PWS before and after treatment can predict regions of reperfusion and subsequent failure of the treatment. The ability to measure reperfusion and to predict lesions or areas susceptible to reperfusion will help in the selection of patients/lesions for laser treatment and help to optimize laser dosimetry for maximum effect. The current studies should also provide a basis for monitoring future alternative therapies or enhancers of laser treatment in resistant cases.

  5. Near-infrared extension of a visible spectrum airborne Sun photometer

    NASA Astrophysics Data System (ADS)

    Starace, Marco; von Bismarck, Jonas; Hollstein, André; Ruhtz, Thomas; Preusker, René; Fischer, Jürgen

    2013-05-01

    The continuously-measuring, multispectral airborne Sun and aureole photometers FUBISS-ASA and FUBISS-ASA2 were developed at the Institute for Space Sciences of the Freie Universität Berlin in 2002 and 2006 respectively, for the retrieval of aerosol optical and microphysical parameters at wavelengths ranging from 400 to 900 nm. A multispectral near-infrared direct sun radiometer measuring in a spectral range of 1000 to 1700 nm has now been added to FUBISS-ASA2. The main objective of this NIR extension is to enhance the characterization of larger aerosol particles, as Mie scattering theory offers a more accurate approximation for their interaction with electromagnetic radiation if both the VIS and NIR parts of the spectrum are considered, rather than the VIS part only. The spectral transmissivity of atmospheric models was computed using the HITRAN2008 database in order to determine local absorption minima suitable for aerosol retrieval. Measurements were first carried out aboard the research vessel FS Polarstern on its transatlantic voyage ANT-XXVI/1. Additional measurements were performed from the Sphinx High Altitude Research Station on the Jungfraujoch and in the nearby Kleine Scheidegg locality during the CLACE2010 measurement campaign. Aerosol optical parameters derived from VIS aureole and direct sun measurements were compared to those of simulated aerosol mixtures in order to estimate the composition of the measured aerosol.

  6. Photometry of Galactic and Extragalactic Far-Infrared Sources using the 91.5 cm Airborne Infrared Telescope

    NASA Technical Reports Server (NTRS)

    Harper, D. A.

    1996-01-01

    The objective of this grant was to construct a series of far infrared photometers, cameras, and supporting systems for use in astronomical observations in the Kuiper Airborne Observatory. The observations have included studies of galaxies, star formation regions, and objects within the Solar System.

  7. Progress in Airborne Polarimeter Inter Comparison for the NASA Aerosols-Clouds-Ecosystems (ACE) Mission

    NASA Technical Reports Server (NTRS)

    Knobelspiesse, Kirk; Redemann, Jens

    2014-01-01

    The Aerosols-Clouds-Ecosystems (ACE) mission, recommended by the National Research Council's Decadal Survey, calls for a multi-angle, multi-spectral polarimeter devoted to observations of atmospheric aerosols and clouds. In preparation for ACE, NASA funds the deployment of airborne polarimeters, including the Airborne Multiangle SpectroPolarimeter Imager (AirMSPI), the Passive Aerosol and Cloud Suite (PACS) and the Research Scanning Polarimeter (RSP). These instruments have been operated together on NASA's ER-2 high altitude aircraft as part of field campaigns such as the POlarimeter DEfinition EXperiment (PODEX) (California, early 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, California and Texas, summer 2013). Our role in these efforts has been to serve as an assessment team performing level 1 (calibrated radiance, polarization) and level 2 (retrieved geophysical parameter) instrument intercomparisons, and to promote unified and generalized calibration, uncertainty assessment and retrieval techniques. We will present our progress in this endeavor thus far and describe upcoming research in 2015.

  8. Materials to enable vehicle and personnel identification from surveillance aircraft equipped with visible and IR cameras

    NASA Astrophysics Data System (ADS)

    O'Keefe, Eoin S.

    2005-10-01

    As thermal imaging technology matures and ownership costs decrease, there is a trend to equip a greater proportion of airborne surveillance vehicles used by security and defence forces with both visible band and thermal infrared cameras. These cameras are used for tracking vehicles on the ground, to aid in pursuit of villains in vehicles and on foot, while also assisting in the direction and co-ordination of emergency service vehicles as the occasion arises. These functions rely on unambiguous identification of police and other emergency service vehicles. In the visible band this is achieved by dark markings against high-contrast (light) backgrounds on the roofs of vehicles. When there is no ambient lighting, for example at night, thermal imaging is used to track both vehicles and people. In the thermal IR, the visible markings are not obvious. At the wavelengths at which thermal imagers operate, either 3-5 microns or 8-12 microns, dark and light coloured materials have similarly low reflectivity. To maximise the usefulness of IR airborne surveillance, a method of passively and unobtrusively marking vehicles concurrently in the visible and thermal infrared is needed. In this paper we discuss the design, application and operation of some vehicle and personnel marking materials and show airborne IR and visible imagery of the materials in use.

  9. Multipurpose Hyperspectral Imaging System

    NASA Technical Reports Server (NTRS)

    Mao, Chengye; Smith, David; Lanoue, Mark A.; Poole, Gavin H.; Heitschmidt, Jerry; Martinez, Luis; Windham, William A.; Lawrence, Kurt C.; Park, Bosoon

    2005-01-01

    A hyperspectral imaging system of high spectral and spatial resolution that incorporates several innovative features, including a focal plane scanner (U.S. Patent 6,166,373), has been developed. This feature enables the system to be used for both airborne/spaceborne and laboratory hyperspectral imaging with or without relative movement of the imaging system, and it can be used to scan a target of any size as long as the target can be imaged at the focal plane; for example, automated inspection of food items and identification of single-celled organisms. The spectral resolution of this system is greater than that of prior terrestrial multispectral imaging systems. Moreover, unlike prior high-spectral-resolution airborne and spaceborne hyperspectral imaging systems, this system does not rely on relative movement of the target and the imaging system to sweep an imaging line across a scene. This compact system consists of a front objective mounted on a translation stage with a motorized actuator, and a line-slit imaging spectrograph mounted within a rotary assembly with a rear adaptor to a charge-coupled-device (CCD) camera. Push-broom scanning is carried out by the motorized actuator, which can be controlled either manually by an operator or automatically by a computer to drive the line slit across an image at a focal plane of the front objective. To reduce cost, the system has been designed to integrate as many off-the-shelf components as possible, including the CCD camera and spectrograph. The system has achieved high spectral and spatial resolutions by using a high-quality CCD camera, spectrograph, and front objective lens. Fixtures for attachment of the system to a microscope (U.S. Patent 6,495,818 B1) make it possible to acquire multispectral images of single cells and other microscopic objects.

  10. Multiplexing and de-multiplexing with scattering media for large field of view and multispectral imaging

    NASA Astrophysics Data System (ADS)

    Sahoo, Sujit Kumar; Tang, Dongliang; Dang, Cuong

    2018-02-01

    Large field-of-view multispectral imaging through a scattering medium is a fundamental quest in the optics community. It has gained special attention from researchers in recent years for its wide range of potential applications. However, the main bottlenecks of current imaging systems are the requirements on specific illumination, poor image quality and limited field of view. In this work, we demonstrated single-shot high-resolution colour imaging through scattering media using a monochromatic camera. This novel imaging technique is enabled by the spatial and spectral decorrelation properties and the optical memory effect of the scattering media. Moreover, the use of deconvolution image processing eliminates the above-mentioned drawbacks arising from iterative refocusing, scanning or phase-retrieval procedures.

  11. Detection of contamination on selected apple cultivars using reflectance hyperspectral and multispectral analysis

    NASA Astrophysics Data System (ADS)

    Mehl, Patrick M.; Chao, Kevin; Kim, Moon S.; Chen, Yud-Ren

    2001-03-01

    The presence of natural or exogenous contamination on apple cultivars is a food safety and quality concern that touches the general public and strongly affects this commodity market. Accumulations of human pathogens are usually observed on surface lesions of commodities. Detection of either the lesions or the pathogens themselves is essential for assuring the quality and safety of commodities. We present the application of hyperspectral image analysis towards the development of multispectral techniques for the detection of defects on chosen apple cultivars, such as Golden Delicious, Red Delicious, and Gala apples. Different apple cultivars possess different spectral characteristics, leading to different approaches for analysis. General preprocessing with morphological treatments is followed by different image treatments and condition analysis for highlighting lesions and contaminations on the apple cultivars. Good isolation of scabs, fungal and soil contaminations, and bruises is observed with hyperspectral imaging processing using either principal component analysis or the chlorophyll absorption peak. Application of the hyperspectral results to multispectral detection is limited by the spectral capabilities of our RGB camera, whether using specific bandpass filters or neutral-density filters. Good separation of defects is obtained for Golden Delicious apples; it is, however, limited for the other cultivars. Adding a near-infrared channel would improve detection by exploiting the chlorophyll absorption band, as demonstrated by the present hyperspectral imaging analysis.
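
    As an illustration of the principal-component route mentioned above (an assumed generic workflow, not the authors' exact pipeline), a hyperspectral cube can be projected onto its leading components so that spectrally anomalous lesions stand out:

```python
# PCA score images from a hyperspectral cube; defect pixels often separate in the
# leading components (illustrative sketch only).
import numpy as np
from sklearn.decomposition import PCA

def pca_score_image(cube, n_components=3):
    """cube: (H, W, bands) reflectance cube."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b).astype(float)
    scores = PCA(n_components=n_components).fit_transform(pixels)
    return scores.reshape(h, w, n_components)   # inspect component images for lesions
```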

  12. Development of a multispectral imagery device devoted to weed detection

    NASA Astrophysics Data System (ADS)

    Vioix, Jean-Baptiste; Douzals, Jean-Paul; Truchetet, Frederic; Navar, Pierre

    2003-04-01

    Multispectral imagery is a large domain with a number of practical applications: thermography, quality control in industry, food science, agronomy, etc. The main interest is to obtain spectral information about objects whose reflectance signal can be associated with physical, chemical and/or biological properties. Agronomic applications of multispectral imagery generally involve the acquisition of several images at visible and near-infrared wavelengths. This paper first presents different kinds of multispectral devices used for agronomic applications, then introduces an original multispectral design based on a single CCD. Finally, early results obtained for weed detection are presented.

  13. Feature Relevance Assessment of Multispectral Airborne LIDAR Data for Tree Species Classification

    NASA Astrophysics Data System (ADS)

    Amiri, N.; Heurich, M.; Krzystek, P.; Skidmore, A. K.

    2018-04-01

    The presented experiment investigates the potential of Multispectral Laser Scanning (MLS) point clouds for single-tree species classification. The basic idea is to simulate an MLS sensor by combining two different lidar sensors providing three different wavelengths. The available data were acquired in the summer of 2016 on the same date in leaf-on condition with an average point density of 37 points/m². For the purpose of classification, we segmented the combined 3D point clouds, consisting of three different spectral channels, into 3D clusters using a Normalized Cut segmentation approach. Then, we extracted four groups of features from the 3D point cloud space. Once a variety of features had been extracted, we applied forward stepwise feature selection in order to reduce the number of irrelevant or redundant features. For the classification, we used multinomial logistic regression with L1 regularization. Our study was conducted using 586 ground-measured single trees from 20 sample plots in the Bavarian Forest National Park, Germany. Due to a lack of reference data for some rare species, we focused on four species classes. The results show an improvement of 4-10 percentage points in tree species classification by using MLS data in comparison to a single-wavelength approach. A cross-validated (15-fold) accuracy of 0.75 can be achieved when all feature sets from the three spectral channels are used. Our results clearly indicate that the use of MLS point clouds has great potential to improve detailed forest species mapping.
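
    A minimal sketch of the classification stage described above, assuming per-cluster features have already been extracted (feature content and regularization strength are assumptions): L1-regularized multinomial logistic regression evaluated with 15-fold cross-validation in scikit-learn:

```python
# L1-regularized multinomial logistic regression with 15-fold cross-validation
# (illustrative stand-in for the classification step; not the authors' code).
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def evaluate_species_classifier(X, y):
    """X: (n_trees, n_features) cluster features; y: species labels."""
    clf = make_pipeline(
        StandardScaler(),
        LogisticRegression(penalty="l1", solver="saga", C=1.0, max_iter=5000),
    )
    scores = cross_val_score(clf, X, y, cv=15)    # 15-fold CV as in the study
    return scores.mean()
```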

  14. Multispectral Filter Arrays: Recent Advances and Practical Implementation

    PubMed Central

    Lapray, Pierre-Jean; Wang, Xingbo; Thomas, Jean-Baptiste; Gouton, Pierre

    2014-01-01

    Thanks to technical progress in interference-filter design based on different technologies, we can finally successfully implement the concept of multispectral filter array-based sensors. This article provides the relevant state of the art for multispectral imaging systems and presents the characteristics of the elements of our multispectral sensor as a case study. The spectral characteristics are based on two different spatial arrangements that distribute eight different bandpass filters in the visible and near-infrared area of the spectrum. We demonstrate that the system is viable and evaluate its performance through sensor spectral simulation. PMID:25407904

  15. Fast Lossless Compression of Multispectral-Image Data

    NASA Technical Reports Server (NTRS)

    Klimesh, Matthew

    2006-01-01

    An algorithm that effects fast lossless compression of multispectral-image data is based on low-complexity, proven adaptive-filtering algorithms. This algorithm is intended for use in compressing multispectral-image data aboard spacecraft for transmission to Earth stations. Variants of this algorithm could be useful for lossless compression of three-dimensional medical imagery and, perhaps, for compressing image data in general.
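
    For illustration only, and not the algorithm of this work, a simple inter-band linear predictor conveys the general predictive-coding idea behind such compressors: predict each band from the previous one, keep integer residuals (invertible given the stored predictor coefficients), and entropy-code the residuals:

```python
# Generic predictive-coding sketch for multispectral cubes (hypothetical example).
import numpy as np

def interband_residuals(cube):
    """cube: (bands, H, W) integer image; returns integer residuals plus predictor coefficients."""
    res = np.empty(cube.shape, dtype=np.int64)
    coeffs = []
    res[0] = cube[0]
    for k in range(1, cube.shape[0]):
        x = cube[k - 1].ravel().astype(float)
        y = cube[k].ravel().astype(float)
        a, b = np.polyfit(x, y, 1)                        # per-band gain/offset predictor
        coeffs.append((a, b))
        pred = np.rint(a * cube[k - 1] + b).astype(np.int64)
        res[k] = cube[k].astype(np.int64) - pred          # small residuals are cheap to encode
    return res, coeffs

def residual_entropy_bits(res):
    vals, counts = np.unique(res, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())                 # bits/pixel estimate
```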

  16. Non-contact tissue perfusion and oxygenation imaging using a LED based multispectral and a thermal imaging system, first results of clinical intervention studies

    NASA Astrophysics Data System (ADS)

    Klaessens, John H. G. M.; Nelisse, Martin; Verdaasdonk, Rudolf M.; Noordmans, Herke Jan

    2013-03-01

    During clinical interventions, objective and quantitative information on tissue perfusion, oxygenation or temperature can be useful for the surgical strategy. Local (point) measurements give limited information and affected areas can easily be missed; therefore, imaging large areas is required. In this study an LED-based multispectral imaging system (MSI, 17 different wavelengths, 370 nm-880 nm) and a thermal camera were applied during clinical interventions: tissue flap transplantations (ENT), local anesthetic block, and open brain surgery (epileptic seizure). The images covered an area of 20x20 cm. Measurements in an operating room turned out to be more complicated than laboratory experiments due to light fluctuations, movement of the patient and a limited angle of view. By constantly measuring the background light and using a white reference, light fluctuations and movement were corrected. Oxygenation concentration images could be calculated and combined with the thermal images. The effectiveness of local anesthesia of a hand could be predicted at an early stage using the thermal camera, and the reperfusion of a transplanted skin flap could be imaged. During brain surgery, a temporary hyper-perfused area was witnessed which was probably related to an epileptic attack. An LED-based multispectral imaging system combined with thermal imaging provides complementary information on perfusion and oxygenation changes; together these are promising techniques for real-time diagnostics during clinical interventions.

  17. Classification of Dual-Wavelength Airborne Laser Scanning Point Cloud Based on the Radiometric Properties of the Objects

    NASA Astrophysics Data System (ADS)

    Pilarska, M.

    2018-05-01

    Airborne laser scanning (ALS) is a well-known and widely used technology. One of its main advantages is fast and accurate data registration. In recent years ALS has been continuously developed. One of the latest achievements is multispectral ALS, which consists in simultaneously acquiring data at more than one laser wavelength. In this article the results of dual-wavelength ALS data classification are presented. The data were acquired with the RIEGL VQ-1560i sensor, which is equipped with two laser scanners operating at different wavelengths: 532 nm and 1064 nm. Two classification approaches are presented: one based on geometric relationships between points, and one that relies mostly on the radiometric properties of the registered objects. The overall accuracy of the geometric classification was 86%, whereas for the radiometric classification it was 81%. As a result, it can be assumed that the radiometric features provided by multispectral ALS have the potential to be successfully used in ALS point cloud classification.

  18. Automated detection and mapping of crown discolouration caused by jack pine budworm with 2.5 m resolution multispectral imagery

    NASA Astrophysics Data System (ADS)

    Leckie, Donald G.; Cloney, Ed; Joyce, Steve P.

    2005-05-01

    Jack pine budworm (Choristoneura pinus pinus (Free.)) is a native insect defoliator of mainly jack pine (Pinus banksiana Lamb.) in North America east of the Rocky Mountains. Periodic outbreaks of this insect, which generally last two to three years, can cause growth loss and mortality and have an important impact ecologically and economically in terms of timber production and harvest. The jack pine budworm prefers to feed on current-year needles. Its characteristic feeding habits cause discolouration or reddening of the canopy. This red colouration is used to map the distribution and intensity of defoliation that has taken place that year (current defoliation). An accurate and consistent map of the distribution and intensity of budworm defoliation (as represented by the red discolouration) at the stand and within-stand level is desirable. Automated classification of multispectral imagery, such as is available from airborne and new high resolution satellite systems, was explored as a viable tool for objectively classifying current discolouration. Airborne multispectral imagery was acquired at a 2.5 m resolution with the Multispectral Electro-optical Imaging Sensor (MEIS). It recorded imagery in six nadir-looking spectral bands specifically designed to detect discolouration caused by budworm; a near-infrared band viewing forward at 35° was also used. A 2200 nm middle infrared image was acquired with a Daedalus scanner. Training and test areas of different levels of discolouration were created based on field observations, and a maximum likelihood supervised classification was used to estimate four classes of discolouration (nil-trace, light, moderate and severe). Good discrimination was achieved, with an overall accuracy of 84% for the four discolouration levels. The moderate discolouration class was the poorest at 73%, because of confusion with both the severe and light classes. Accuracy on a stand basis was also good, and regional and within stand
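
    A comparable per-pixel maximum-likelihood (Gaussian) classification can be sketched with scikit-learn's quadratic discriminant analysis, which fits one Gaussian per class (band count, labels and training-area handling here are assumptions, not the MEIS processing chain):

```python
# Gaussian maximum-likelihood classification of multispectral pixels (illustrative sketch).
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

def classify_discolouration(train_pixels, train_labels, image):
    """train_pixels: (n, bands); train_labels e.g. nil/light/moderate/severe; image: (H, W, bands)."""
    h, w, b = image.shape
    qda = QuadraticDiscriminantAnalysis(store_covariance=True).fit(train_pixels, train_labels)
    return qda.predict(image.reshape(-1, b)).reshape(h, w)   # per-pixel class map
```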

  19. Optimal wavelength band clustering for multispectral iris recognition.

    PubMed

    Gong, Yazhuo; Zhang, David; Shi, Pengfei; Yan, Jingqi

    2012-07-01

    This work explores the possibility of clustering spectral wavelengths based on the maximum dissimilarity of iris textures. The eventual goal is to determine how many bands of spectral wavelengths will be enough for iris multispectral fusion and to find the bands that provide higher performance in iris multispectral recognition. A multispectral acquisition system was first designed for imaging the iris in narrow spectral bands in the range of 420 to 940 nm. Next, a set of 60 human iris images corresponding to the right and left eyes of 30 different subjects were acquired for analysis. Finally, we determined that 3 clusters were enough to represent the 10 feature bands of spectral wavelengths, using agglomerative clustering based on two-dimensional principal component analysis. The experimental results suggest (1) the number, center, and composition of the clusters of spectral wavelengths and (2) higher performance of iris multispectral recognition based on a three-wavelength-band fusion.
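
    A simplified stand-in for the band-clustering step (scikit-learn PCA replaces the paper's two-dimensional PCA, which operates on image matrices, and the per-band texture features are assumed to be precomputed):

```python
# Group spectral bands by agglomerative clustering on reduced texture features
# (simplified illustration of the band-clustering idea).
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

def cluster_bands(band_features, n_clusters=3):
    """band_features: (n_bands, n_texture_features), one row per spectral band."""
    reduced = PCA(n_components=2).fit_transform(band_features)
    return AgglomerativeClustering(n_clusters=n_clusters).fit_predict(reduced)
```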

  20. Blast investigation by fast multispectral radiometric analysis

    NASA Astrophysics Data System (ADS)

    Devir, A. D.; Bushlin, Y.; Mendelewicz, I.; Lessin, A. B.; Engel, M.

    2011-06-01

    Knowledge regarding the processes involved in blasts and detonations is required in various applications, e.g. missile interception, blasts of high-explosive materials, final ballistics and IED identification. Blasts release a large amount of energy over a short duration. Part of this energy is released as intense radiation in the optical spectral bands. This paper proposes to measure the blast radiation with a fast multispectral radiometer. The measurement is made simultaneously in appropriately chosen spectral bands. These spectral bands provide extensive information on the physical and chemical processes that govern the blast through the time dependence of the molecular and aerosol contributions to the detonation products. Multispectral blast measurements are performed in the visible, SWIR and MWIR spectral bands. Analysis of the cross-correlation between the measured multispectral signals gives the time dependence of the temperature, aerosol and gas composition of the blast. Further analysis of the development of these quantities over time may indicate the order of the detonation and the amount and type of explosive materials. Examples of analysis of measured explosions are presented to demonstrate the power of the suggested fast multispectral radiometric analysis approach.

  1. Design and fabrication of multispectral optics using expanded glass map

    NASA Astrophysics Data System (ADS)

    Bayya, Shyam; Gibson, Daniel; Nguyen, Vinh; Sanghera, Jasbinder; Kotov, Mikhail; Drake, Gryphon; Deegan, John; Lindberg, George

    2015-06-01

    As the desire to have compact multispectral imagers in various DoD platforms is growing, the dearth of multispectral optics is widely felt. With the limited number of material choices for optics, these multispectral imagers are often very bulky and impractical on several weight sensitive platforms. To address this issue, NRL has developed a large set of unique infrared glasses that transmit from 0.9 to > 14 μm in wavelength and expand the glass map for multispectral optics with refractive indices from 2.38 to 3.17. They show a large spread in dispersion (Abbe number) and offer some unique solutions for multispectral optics designs. The new NRL glasses can be easily molded and also fused together to make bonded doublets. A Zemax compatible glass file has been created and is available upon request. In this paper we present some designs, optics fabrication and imaging, all using NRL materials.

  2. Multispectral imaging approach for simplified non-invasive in-vivo evaluation of gingival erythema

    NASA Astrophysics Data System (ADS)

    Eckhard, Timo; Valero, Eva M.; Nieves, Juan L.; Gallegos-Rueda, José M.; Mesa, Francisco

    2012-03-01

    Erythema is a common visual sign of gingivitis. In this work, a new and simple low-cost image capture and analysis method for erythema assessment is proposed. The method is based on digital still images of gingivae and applied on a pixel-by-pixel basis. Multispectral images are acquired with a conventional digital camera and multiplexed LED illumination panels at 460nm and 630nm peak wavelength. An automatic work-flow segments teeth from gingiva regions in the images and creates a map of local blood oxygenation levels, which relates to the presence of erythema. The map is computed from the ratio of the two spectral images. An advantage of the proposed approach is that the whole process is easy to manage by dental health care professionals in clinical environment.
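
    A minimal sketch of the two-band ratio map described above (which band forms the numerator, and the segmentation mask, are assumptions on our part):

```python
# Per-pixel ratio of the two spectral band images over the segmented gingiva region.
import numpy as np

def erythema_ratio_map(img_630, img_460, gingiva_mask):
    ratio = img_630.astype(float) / np.clip(img_460.astype(float), 1e-6, None)
    ratio[~gingiva_mask] = np.nan        # keep only gingiva pixels
    return ratio                         # relates to local blood oxygenation / erythema
```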

  3. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    USDA-ARS?s Scientific Manuscript database

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications have not been well documented in related ...

  4. Analysis of multispectral and hyperspectral longwave infrared (LWIR) data for geologic mapping

    NASA Astrophysics Data System (ADS)

    Kruse, Fred A.; McDowell, Meryl

    2015-05-01

    Multispectral MODIS/ASTER Airborne Simulator (MASTER) data and Hyperspectral Thermal Emission Spectrometer (HyTES) data covering the 8 - 12 μm spectral range (longwave infrared or LWIR) were analyzed for an area near Mountain Pass, California. Decorrelation stretched images were initially used to highlight spectral differences between geologic materials. Both datasets were atmospherically corrected using the ISAC method, and the Normalized Emissivity approach was used to separate temperature and emissivity. The MASTER data had 10 LWIR spectral bands and approximately 35-meter spatial resolution and covered a larger area than the HyTES data, which were collected with 256 narrow (approximately 17nm-wide) spectral bands at approximately 2.3-meter spatial resolution. Spectra for key spatially-coherent, spectrally-determined geologic units for overlap areas were overlain and visually compared to determine similarities and differences. Endmember spectra were extracted from both datasets using n-dimensional scatterplotting and compared to emissivity spectral libraries for identification. Endmember distributions and abundances were then mapped using Mixture-Tuned Matched Filtering (MTMF), a partial unmixing approach. Multispectral results demonstrate separation of silica-rich vs non-silicate materials, with distinct mapping of carbonate areas and general correspondence to the regional geology. Hyperspectral results illustrate refined mapping of silicates with distinction between similar units based on the position, character, and shape of high resolution emission minima near 9 μm. Calcite and dolomite were separated, identified, and mapped using HyTES based on a shift of the main carbonate emissivity minimum from approximately 11.3 to 11.2 μm respectively. Both datasets demonstrate the utility of LWIR spectral remote sensing for geologic mapping.
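
    One common formulation of the decorrelation stretch used to highlight such spectral differences (illustrative; the authors' exact processing may differ): rotate the bands to principal components, equalize their variances, and rotate back:

```python
# Decorrelation stretch of a multiband image (one standard formulation).
import numpy as np

def decorrelation_stretch(cube, target_sigma=50.0):
    """cube: (H, W, bands) image."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    stretch = evecs @ np.diag(target_sigma / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    return ((X - mu) @ stretch + mu).reshape(h, w, b)
```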

  5. Multispectral system analysis through modeling and simulation

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Gleason, J. M.; Cicone, R. C.

    1977-01-01

    The design and development of multispectral remote sensor systems and associated information extraction techniques should be optimized under the physical and economic constraints encountered and yet be effective over a wide range of scene and environmental conditions. Direct measurement of the full range of conditions to be encountered can be difficult, time consuming, and costly. Simulation of multispectral data by modeling scene, atmosphere, sensor, and data classifier characteristics is set forth as a viable alternative, particularly when coupled with limited sets of empirical measurements. A multispectral system modeling capability is described. Use of the model is illustrated for several applications - interpretation of remotely sensed data from agricultural and forest scenes, evaluating atmospheric effects in Landsat data, examining system design and operational configuration, and development of information extraction techniques.

  6. Multispectral system analysis through modeling and simulation

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Gleason, J. M.; Cicone, R. C.

    1977-01-01

    The design and development of multispectral remote sensor systems and associated information extraction techniques should be optimized under the physical and economic constraints encountered and yet be effective over a wide range of scene and environmental conditions. Direct measurement of the full range of conditions to be encountered can be difficult, time consuming, and costly. Simulation of multispectral data by modeling scene, atmosphere, sensor, and data classifier characteristics is set forth as a viable alternative, particularly when coupled with limited sets of empirical measurements. A multispectral system modeling capability is described. Use of the model is illustrated for several applications - interpretation of remotely sensed data from agricultural and forest scenes, evaluating atmospheric effects in LANDSAT data, examining system design and operational configuration, and development of information extraction techniques.

  7. Unsupervised classification of remote multispectral sensing data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The new unsupervised classification technique for classifying multispectral remote sensing data, which can be either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, which is a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. Applications of the technique using an IBM-7094 computer on multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site have been accomplished. Comparisons between the classification maps produced by the unsupervised technique and the supervised maximum likelihood technique indicate that the classification accuracies are in agreement.
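
    The two-part idea can be sketched as follows, with scikit-learn's K-means standing in for the generalized K-means of the paper and the output of the sequential clustering pass supplied as initial centres:

```python
# Composite clustering sketch: initial centres from a cheap first pass seed K-means refinement.
import numpy as np
from sklearn.cluster import KMeans

def two_stage_clustering(pixels, initial_centres):
    """pixels: (n_pixels, n_bands); initial_centres: (k, n_bands) from the first pass."""
    centres = np.asarray(initial_centres, dtype=float)
    km = KMeans(n_clusters=len(centres), init=centres, n_init=1)
    return km.fit_predict(pixels)          # cluster label per pixel
```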

  8. The Need for High Spatial Resolution Multispectral Thermal Remote Sensing Data In Urban Heat Island Research

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.

    2006-01-01

    Although the study of the Urban Heat Island (UHI) effect dates back to the early 1800s, when Luke Howard discovered London's heat island, it has only been with the advent of thermal remote sensing systems that the extent, characteristics, and impacts of the UHI have come to be understood. Analysis of the UHI effect is important because, above all, this phenomenon can directly influence the health and welfare of urban residents. For example, in 1995, over 700 people died in Chicago due to heat-related causes. UHIs are characterized by increased temperature in comparison to rural areas, and mortality rates during a heat wave increase exponentially with the maximum temperature, an effect that is exacerbated by the UHI. Aside from the direct impacts of the UHI on temperature, UHIs can produce secondary effects on local meteorology, including altering local wind patterns, increased development of clouds and fog, and increased rates of precipitation either over, or downwind of, cities. Because of the extreme heterogeneity of the urban surface, in combination with the sprawl associated with urban growth, thermal infrared (TIR) remote sensing data have become of significant importance in understanding how land cover and land use characteristics affect the development and intensification of the UHI. TIR satellite data have been used extensively to analyze the surface temperature regimes of cities to help observe and measure the impacts of surface temperatures across the urban landscape. However, the spatial scales at which satellite TIR data are collected are, for the most part, coarse, with the finest readily available TIR data collected by the Landsat ETM+ sensor at 60 m spatial resolution. For many years, we have collected high spatial resolution (10 m) data using an airborne multispectral TIR sensor over a number of cities across the United States. These high resolution data have been used to develop an understanding of how discrete surfaces across the urban environment

  9. [Detecting fire smoke based on the multispectral image].

    PubMed

    Wei, Ying-Zhuo; Zhang, Shao-Wu; Liu, Yan-Wei

    2010-04-01

    Smoke detection is very important for preventing forest fires in their early stages. Because traditional technologies based on video and image processing are easily affected by dynamic background information, they suffer from three limitations: low anti-interference ability, a high false detection rate, and difficulty distinguishing fire smoke from water fog. A novel method for detecting smoke based on multispectral images is proposed in the present paper. Using a multispectral digital imaging technique, multispectral image series of fire smoke and water fog were obtained in the band range of 400 to 720 nm, and the images were divided into bins. The Euclidean distance among the bins was taken as a measure of the spectral difference. After obtaining the spectral feature vectors of the dynamic region, the regions of fire smoke and water fog were extracted according to the spectral feature difference between target and background. Indoor and outdoor experiments show that the multispectral-image-based smoke detection method can effectively distinguish fire smoke from water fog. Combined with video image processing, the multispectral image detection method can also be applied to forest fire surveillance, reducing the false alarm rate in forest fire detection.

  10. Nondestructive prediction of pork freshness parameters using multispectral scattering images

    NASA Astrophysics Data System (ADS)

    Tang, Xiuying; Li, Cuiling; Peng, Yankun; Chao, Kuanglin; Wang, Mingwu

    2012-05-01

    Optical technology is an important and emerging technology for non-destructive and rapid detection of pork freshness. This paper studied the possibility of using multispectral imaging and scattering characteristics to predict the freshness parameters of pork meat. The pork freshness parameters selected for prediction included total volatile basic nitrogen (TVB-N), color parameters (L*, a*, b*), and pH value. Multispectral scattering images were obtained from the pork sample surface by a multispectral imaging system developed in-house; they were acquired at selected narrow wavebands with center wavelengths of 517, 550, 560, 580, 600, 760, 810 and 910 nm. In order to extract scattering characteristics from the multispectral images at multiple wavelengths, a Lorentzian distribution (LD) function with four parameters (a: scattering asymptotic value; b: scattering peak; c: scattering width; d: scattering slope) was used to fit the scattering curves at the selected wavelengths. The results show that multispectral imaging combined with scattering characteristics is promising for predicting the freshness parameters of pork meat.
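
    A hedged sketch of the profile-fitting step: one common four-parameter Lorentzian-type form, R(x) = a + b / (1 + (x/c)^d), fitted to a radial scattering profile with SciPy (the exact functional form and initial guesses used by the authors may differ):

```python
# Fit a four-parameter Lorentzian-type profile to a radial scattering curve at one waveband.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, a, b, c, d):
    return a + b / (1.0 + (x / c) ** d)

def fit_scattering_profile(distance_mm, reflectance):
    p0 = [reflectance.min(), reflectance.max(), 5.0, 2.0]   # rough initial guess
    params, _ = curve_fit(lorentzian, distance_mm, reflectance, p0=p0, maxfev=10000)
    return dict(zip("abcd", params))   # a: asymptote, b: peak, c: width, d: slope
```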

  11. DEIMOS-2: cost-effective, very-high resolution multispectral imagery

    NASA Astrophysics Data System (ADS)

    Pirondini, Fabrizio; López, Julio; González, Enrique; González, José Antonio

    2014-10-01

    ELECNOR DEIMOS is a private Spanish company, part of the Elecnor industrial group, which owns and operates DEIMOS-1, the first Spanish Earth Observation satellite. DEIMOS-1, launched in 2009, is among the world's leading sources of high resolution data. On June 19th, 2014 ELECNOR DEIMOS launched its second satellite, DEIMOS-2, which is a very-high resolution, agile satellite capable of providing 75-cm pan-sharpened imagery with a 12 km-wide swath. The DEIMOS-2 camera delivers multispectral imagery in 5 bands: Panchromatic, G, R, B and NIR. DEIMOS-2 is the first European satellite completely owned by private capital that is capable of providing submetric multispectral imagery. The whole end-to-end DEIMOS-2 system is designed to provide a cost-effective, dependable and highly responsive service to cope with the increasing need for fast access to very-high resolution imagery. The same 24/7 commercial service which is now available for DEIMOS-1, including tasking, download, processing and delivery, will become available for DEIMOS-2 as well, as soon as the satellite enters commercial operations at the end of its in-orbit commissioning. The DEIMOS-2 satellite has been co-developed by ELECNOR DEIMOS and SATREC-I (South Korea), and it has been integrated and tested in the new ELECNOR DEIMOS Satellite Systems premises in Puertollano (Spain). The DEIMOS-2 ground segment, which includes four receiving/commanding ground stations in Spain, Sweden and Canada, has been completely developed in-house by ELECNOR DEIMOS, based on its Ground Segment for Earth Observation (gs4EO®) suite. In this paper we describe the main features of the DEIMOS-2 system, with emphasis on its initial operations and the quality of the initial imagery, and provide updated information on its mission status.

  12. Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging.

    PubMed

    Chaudhari, Abhijit J; Darvas, Felix; Bading, James R; Moats, Rex A; Conti, Peter S; Smith, Desmond J; Cherry, Simon R; Leahy, Richard M

    2005-12-07

    For bioluminescence imaging studies in small animals, it is important to be able to accurately localize the three-dimensional (3D) distribution of the underlying bioluminescent source. The spectrum of light produced by the source that escapes the subject varies with the depth of the emission source because of the wavelength-dependence of the optical properties of tissue. Consequently, multispectral or hyperspectral data acquisition should help in the 3D localization of deep sources. In this paper, we describe a framework for fully 3D bioluminescence tomographic image acquisition and reconstruction that exploits spectral information. We describe regularized tomographic reconstruction techniques that use semi-infinite slab or FEM-based diffusion approximations of photon transport through turbid media. Singular value decomposition analysis was used for data dimensionality reduction and to illustrate the advantage of using hyperspectral rather than achromatic data. Simulation studies in an atlas-mouse geometry indicated that sub-millimeter resolution may be attainable given accurate knowledge of the optical properties of the animal. A fixed arrangement of mirrors and a single CCD camera were used for simultaneous acquisition of multispectral imaging data over most of the surface of the animal. Phantom studies conducted using this system demonstrated our ability to accurately localize deep point-like sources and show that a resolution of 1.5 to 2.2 mm for depths up to 6 mm can be achieved. We also include an in vivo study of a mouse with a brain tumour expressing firefly luciferase. Co-registration of the reconstructed 3D bioluminescent image with magnetic resonance images indicated good anatomical localization of the tumour.

  13. Reproducible high-resolution multispectral image acquisition in dermatology

    NASA Astrophysics Data System (ADS)

    Duliu, Alexandru; Gardiazabal, José; Lasser, Tobias; Navab, Nassir

    2015-07-01

    Multispectral image acquisitions are increasingly popular in dermatology, due to their improved spectral resolution which enables better tissue discrimination. Most applications however focus on restricted regions of interest, imaging only small lesions. In this work we present and discuss an imaging framework for high-resolution multispectral imaging on large regions of interest.

  14. The High Resolution Stereo Camera (HRSC): 10 Years of Imaging Mars

    NASA Astrophysics Data System (ADS)

    Jaumann, R.; Neukum, G.; Tirsch, D.; Hoffmann, H.

    2014-04-01

    The HRSC Experiment: Imagery is the major source for our current understanding of the geologic evolution of Mars in qualitative and quantitative terms. Imaging is required to enhance our knowledge of Mars with respect to geological processes occurring on local, regional and global scales and is an essential prerequisite for detailed surface exploration. The High Resolution Stereo Camera (HRSC) of ESA's Mars Express mission (MEx) is designed to simultaneously map the morphology, topography, structure and geologic context of the surface of Mars as well as atmospheric phenomena [1]. The HRSC directly addresses two of the main scientific goals of the Mars Express mission: (1) high-resolution three-dimensional photogeologic surface exploration and (2) the investigation of surface-atmosphere interactions over time; it also significantly supports (3) the study of atmospheric phenomena by multi-angle coverage and limb sounding as well as (4) multispectral mapping by providing high-resolution three-dimensional color context information. In addition, the stereoscopic imagery especially characterizes landing sites and their geologic context [1]. The HRSC surface resolution and the digital terrain models bridge the gap in scales between the highest ground resolution images (e.g., HiRISE) and global coverage observations (e.g., Viking). This is also the case with respect to DTMs (e.g., MOLA and local high-resolution DTMs). HRSC is also used as a cartographic basis to correlate between panchromatic and multispectral stereo data. The unique multi-angle imaging technique of the HRSC supports its stereo capability by providing not only a stereo triplet but a stereo quintuplet, making the photogrammetric processing very robust [1, 3]. The capabilities for three-dimensional orbital reconnaissance of the Martian surface are ideally met by HRSC, making this camera unique in the international Mars exploration effort.

  15. Compact multispectral photodiode arrays using micropatterned dichroic filters

    NASA Astrophysics Data System (ADS)

    Chandler, Eric V.; Fish, David E.

    2014-05-01

    The next generation of multispectral instruments requires significant improvements in both spectral band customization and portability to support the widespread deployment of application-specific optical sensors. The benefits of spectroscopy are well established for numerous applications including biomedical instrumentation, industrial sorting and sensing, chemical detection, and environmental monitoring. In this paper, spectroscopic (and by extension hyperspectral) and multispectral measurements are considered. The technology, tradeoffs, and application fit of each are evaluated. In the majority of applications, monitoring 4-8 targeted spectral bands of optimized wavelength and bandwidth provides the necessary spectral contrast and correlation. An innovative approach integrates precision spectral filters at the photodetector level to enable smaller sensors, simplify optical designs, and reduce device integration costs. This method supports user-defined spectral bands to create application-specific sensors in a small footprint with scalable cost efficiencies. A range of design configurations, filter options and combinations are presented together with typical applications ranging from basic multi-band detection to stringent multi-channel fluorescence measurement. An example implementation packages 8 narrowband silicon photodiodes into a 9x9 mm ceramic LCC (leadless chip carrier) footprint. This package is designed for multispectral applications ranging from portable color monitors to purpose-built OEM industrial and scientific instruments. Use of an eight-channel multispectral photodiode array typically eliminates 10-20 components from a device bill of materials (BOM), streamlining the optical path and shrinking the footprint by 50% or more. A stepwise design approach for multispectral sensors is discussed, including spectral band definition, optical design tradeoffs and constraints, and device integration from prototype through scalable volume production.

  16. Real-Time On-Board Processing Validation of MSPI Ground Camera Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Werne, Thomas A.; Bekker, Dmitriy L.

    2010-01-01

    The Earth Sciences Decadal Survey identifies a multiangle, multispectral, high-accuracy polarization imager as one requirement for the Aerosol-Cloud-Ecosystem (ACE) mission. JPL has been developing a Multiangle SpectroPolarimetric Imager (MSPI) as a candidate to fill this need. A key technology development needed for MSPI is on-board signal processing to calculate polarimetry data as imaged by each of the 9 cameras forming the instrument. With funding from NASA's Advanced Information Systems Technology (AIST) Program, JPL is solving the real-time data processing requirements to demonstrate, for the first time, how signal data at 95 Mbytes/sec over 16 channels for each of the 9 multiangle cameras in the spaceborne instrument can be reduced on-board to 0.45 Mbytes/sec. This will produce the intensity and polarization data needed to characterize aerosol and cloud microphysical properties. Using the Xilinx Virtex-5 FPGA, which includes PowerPC440 processors, we have implemented a least squares fitting algorithm that extracts intensity and polarimetric parameters in real time, thereby substantially reducing the image data volume for spacecraft downlink without loss of science information.

  17. Semiconductor Laser Multi-Spectral Sensing and Imaging

    PubMed Central

    Le, Han Q.; Wang, Yang

    2010-01-01

    Multi-spectral laser imaging is a technique that can offer a combination of the laser capability of accurate spectral sensing with the desirable features of passive multispectral imaging. The technique can be used for detection, discrimination, and identification of objects by their spectral signature. This article describes and reviews the development and evaluation of semiconductor multi-spectral laser imaging systems. Although the method is certainly not specific to any laser technology, the use of semiconductor lasers is significant with respect to practicality and affordability. More relevantly, semiconductor lasers have their own characteristics; they offer excellent wavelength diversity but usually with modest power. Thus, system design and engineering issues are analyzed for approaches and trade-offs that can make the best use of semiconductor laser capabilities in multispectral imaging. A few systems were developed and the technique was tested and evaluated on a variety of natural and man-made objects. It was shown capable of high spectral resolution imaging which, unlike non-imaging point sensing, allows detecting and discriminating objects of interest even without a priori spectroscopic knowledge of the targets. Examples include material and chemical discrimination. It was also shown capable of dealing with the complexity of interpreting diffuse scattered spectral images and produced results that could otherwise be ambiguous with conventional imaging. Examples with glucose and spectral imaging of drug pills were discussed. Lastly, the technique was shown with conventional laser spectroscopy such as wavelength modulation spectroscopy to image a gas (CO). These results suggest the versatility and power of multi-spectral laser imaging, which can be practical with the use of semiconductor lasers. PMID:22315555

  18. Semiconductor laser multi-spectral sensing and imaging.

    PubMed

    Le, Han Q; Wang, Yang

    2010-01-01

    Multi-spectral laser imaging is a technique that can offer a combination of the laser capability of accurate spectral sensing with the desirable features of passive multispectral imaging. The technique can be used for detection, discrimination, and identification of objects by their spectral signature. This article describes and reviews the development and evaluation of semiconductor multi-spectral laser imaging systems. Although the method is certainly not specific to any laser technology, the use of semiconductor lasers is significant with respect to practicality and affordability. More relevantly, semiconductor lasers have their own characteristics; they offer excellent wavelength diversity but usually with modest power. Thus, system design and engineering issues are analyzed for approaches and trade-offs that can make the best use of semiconductor laser capabilities in multispectral imaging. A few systems were developed and the technique was tested and evaluated on a variety of natural and man-made objects. It was shown capable of high spectral resolution imaging which, unlike non-imaging point sensing, allows detecting and discriminating objects of interest even without a priori spectroscopic knowledge of the targets. Examples include material and chemical discrimination. It was also shown capable of dealing with the complexity of interpreting diffuse scattered spectral images and produced results that could otherwise be ambiguous with conventional imaging. Examples with glucose and spectral imaging of drug pills were discussed. Lastly, the technique was shown with conventional laser spectroscopy such as wavelength modulation spectroscopy to image a gas (CO). These results suggest the versatility and power of multi-spectral laser imaging, which can be practical with the use of semiconductor lasers.

  19. Multispectral infrared target detection: phenomenology and modeling

    NASA Astrophysics Data System (ADS)

    Cederquist, Jack N.; Rogne, Timothy J.; Schwartz, Craig R.

    1993-10-01

    Many targets of interest provide only very small signature differences from the clutter background. The ability to detect these small difference targets should be improved by using data which is diverse in space, time, wavelength or some other observable. Target materials often differ from background materials in the variation of their reflectance or emittance with wavelength. A multispectral sensor is therefore considered as a means to improve detection of small signal targets. If this sensor operates in the thermal infrared, it will not need solar illumination and will be useful at night as well as during the day. An understanding of the phenomenology of the spectral properties of materials and an ability to model and simulate target and clutter signatures is needed to understand potential target detection performance from multispectral infrared sensor data. Spectral variations in material emittance are due to vibrational energy transitions in molecular bonds. The spectral emittances of many materials of interest have been measured. Examples are vegetation, soil, construction and road materials, and paints. A multispectral infrared signature model has been developed which includes target and background temperature and emissivity, sky, sun, cloud and background irradiance, multiple reflection effects, path radiance, and atmospheric attenuation. This model can be used to predict multispectral infrared signatures for small signal targets.
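
    A first-order illustration of the kind of band-radiance bookkeeping such a signature model performs (a simplified sketch, not the authors' model): emitted plus reflected-sky radiance at the surface, attenuated by the atmosphere and added to path radiance:

```python
# Simplified at-sensor LWIR/MWIR band radiance for an opaque surface (illustrative only).
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 sr m)."""
    return (2 * H * C**2 / wavelength_m**5) / np.expm1(H * C / (wavelength_m * KB * temp_k))

def band_radiance(wl_m, emissivity, temp_k, sky_radiance, tau, path_radiance):
    surface = emissivity * planck(wl_m, temp_k) + (1 - emissivity) * sky_radiance
    return tau * surface + path_radiance      # at-sensor radiance per band
```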

  20. Multispectral Imaging of Mars from the Mars Science Laboratory Mastcam Instruments: Spectral Properties and Mineralogic Implications Along the Gale Crater Traverse

    NASA Astrophysics Data System (ADS)

    Bell, James F.; Wellington, Danika; Hardgrove, Craig; Godber, Austin; Rice, Melissa S.; Johnson, Jeffrey R.; Fraeman, Abigail

    2016-10-01

    The Mars Science Laboratory (MSL) Curiosity rover Mastcam is a pair of multispectral CCD cameras that have been imaging the surface and atmosphere in three broadband visible RGB color channels as well as nine additional narrowband color channels between 400 and 1000 nm since the rover's landing in August 2012. As of Curiosity sol 1159 (the most recent PDS data release as of this writing), approximately 140 multispectral imaging targets have been imaged using all twelve unique bandpasses. Near-simultaneous imaging of an onboard calibration target allows rapid relative reflectance calibration of these data to radiance factor and estimated Lambert albedo, for direct comparison to lab reflectance spectra of rocks, minerals, and mixtures. Surface targets among this data set include a variety of outcrop and float rocks (some containing light-toned veins), unconsolidated pebbles and clasts, and loose sand and soil. Some of these targets have been brushed, scuffed, or otherwise disturbed by the rover in order to reveal the (less dusty) interiors of these materials, and those targets and each of Curiosity's drill holes and tailings piles have been specifically targeted for multispectral imaging. Analysis of the relative reflectance spectra of these materials, sometimes in concert with additional compositional and/or mineralogic information from Curiosity's ChemCam LIBS and passive-mode spectral data and CheMin XRD data, reveals the presence of relatively broad solid state crystal field and charge transfer absorption features characteristic of a variety of common iron-bearing phases, including hematite (both nanophase and crystalline), ferric sulfate, olivine, and pyroxene. In addition, Mastcam is sensitive to a weak hydration feature in the 900-1000 nm region that can provide insight on the hydration state of some of these phases, especially sulfates. Here we summarize the Mastcam multispectral data set and the major potential phase identifications made using that data set.

  1. Multispectral Imaging for Determination of Astaxanthin Concentration in Salmonids

    PubMed Central

    Dissing, Bjørn S.; Nielsen, Michael E.; Ersbøll, Bjarne K.; Frosch, Stina

    2011-01-01

    Multispectral imaging has been evaluated for characterization of the concentration of a specific carotenoid pigment, astaxanthin. 59 rainbow trout, Oncorhynchus mykiss, were filleted and imaged using a rapid multispectral imaging device for quantitative analysis. The multispectral imaging device captures reflection properties in 19 distinct wavelength bands, prior to determination of the true concentration of astaxanthin. The samples ranged from 0.20 to 4.34 µg astaxanthin per g fish. A PLSR model was calibrated to predict astaxanthin concentration from novel images and showed good results with an RMSEP of 0.27. For comparison, a similar model was built for normal color images, which yielded an RMSEP of 0.45. The acquisition speed of the multispectral imaging system and the accuracy of the PLSR model obtained suggest this method as a promising technique for rapid in-line estimation of astaxanthin concentration in rainbow trout fillets. PMID:21573000
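
    A sketch of the calibration/validation step under stated assumptions (the hold-out split, component count and feature layout are ours, not the authors'): PLS regression from 19-band reflectance features to astaxanthin concentration, reporting RMSEP:

```python
# PLSR calibration from multispectral reflectance features to astaxanthin concentration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def calibrate_plsr(X, y, n_components=10):
    """X: (n_samples, 19) mean reflectance per band; y: measured astaxanthin concentration."""
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=n_components).fit(X_tr, y_tr)
    rmsep = np.sqrt(mean_squared_error(y_te, pls.predict(X_te).ravel()))
    return pls, rmsep
```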

  2. Spring wheat-leaf phytomass and yield estimates from airborne scanner and hand-held radiometer measurements

    NASA Technical Reports Server (NTRS)

    Aase, J. K.; Siddoway, F. H.; Millard, J. P.

    1984-01-01

    An attempt has been made to relate hand-held radiometer measurements and airborne multispectral scanner readings both to different wheat stand densities and to grain yield. Aircraft overflights were conducted during the tillering, stem extension and heading stages of growth, while hand-held radiometer readings were taken throughout the growing season. The near-IR/red ratio was used in the analysis, which indicated that both the aircraft and the ground measurements made it possible to differentiate and evaluate wheat stand densities at an early enough growth stage to serve as the basis of management decisions. The aircraft data also corroborated the hand-held radiometer measurements with respect to yield prediction. Winterkill was readily evaluated.
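
    The index used here is simply the per-pixel near-IR/red band ratio, for example (band arrays are placeholders):

```python
# Near-IR / red band ratio; higher values generally indicate denser, healthier vegetation.
import numpy as np

def nir_red_ratio(nir_band, red_band):
    return nir_band.astype(float) / np.clip(red_band.astype(float), 1e-6, None)
```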

  3. Forest tree species classification based on airborne hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Dian, Yuanyong; Li, Zengyuan; Pang, Yong

    2013-10-01

    Forest precision classification products are basic data for surveying forest resources, updating forest subplot information, and planning logging and forest design. However, due to the diversity of stand structure and the complexity of the forest growth environment, it is difficult to discriminate forest tree species using multispectral imagery. Airborne hyperspectral images provide high spatial and spectral resolution imagery of the forest canopy, making them well suited for tree-species-level classification. The aim of this paper was to test the effectiveness of combining spatial and spectral features in airborne hyperspectral image classification. The CASI hyperspectral image data were acquired over the Liangshui natural reserve area. First, we used the minimum noise fraction (MNF) transform to reduce the dimensionality of the hyperspectral image and highlight variation. Second, we used the grey level co-occurrence matrix (GLCM) to extract texture features of the forest tree canopy from the hyperspectral image. Third, we fused the texture and spectral features of the forest canopy to classify tree species using support vector machines (SVM) with different kernel functions. The results showed that, when using the SVM classifier, MNF and texture-based features combined with a linear kernel achieved the best overall accuracy, which was 85.92%. This confirms that combining spatial and spectral information can improve the accuracy of tree species classification.
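
    A hedged sketch of the texture extraction and feature fusion described above (window handling, GLCM settings and the linear kernel are assumptions): grey-level co-occurrence features from scikit-image fused with spectral features and fed to an SVM:

```python
# GLCM texture features per canopy window, fused with spectral/MNF features for SVM training.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC

def glcm_features(window_u8):
    """window_u8: 2-D uint8 image window around a tree crown."""
    glcm = graycomatrix(window_u8, distances=[1], angles=[0, np.pi / 2], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean() for p in ("contrast", "homogeneity", "energy", "correlation")]

def train_svm(spectral_feats, texture_feats, labels):
    X = np.hstack([spectral_feats, texture_feats])   # fuse spectral and texture features
    return SVC(kernel="linear").fit(X, labels)
```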

  4. Lattice algebra approach to multispectral analysis of ancient documents.

    PubMed

    Valdiviezo-N, Juan C; Urcid, Gonzalo

    2013-02-01

    This paper introduces a lattice algebra procedure that can be used for the multispectral analysis of historical documents and artworks. Assuming the presence of linearly mixed spectral pixels captured in a multispectral scene, the proposed method computes the scaled min- and max-lattice associative memories to determine the purest pixels that best represent the spectra of single pigments. The estimation of fractional proportions of pure spectra at each image pixel is used to build pigment abundance maps that can be used for subsequent restoration of damaged parts. Application examples include multispectral images acquired from the Archimedes Palimpsest and a Mexican pre-Hispanic codex.

  5. Artificial Neural Network to Predict Vine Water Status Spatial Variability Using Multispectral Information Obtained from an Unmanned Aerial Vehicle (UAV).

    PubMed

    Poblete, Tomas; Ortega-Farías, Samuel; Moreno, Miguel Angel; Bardeen, Matthew

    2017-10-30

    Water stress, which affects yield and wine quality, is often evaluated using the midday stem water potential (Ψstem). However, this measurement is acquired on a per-plant basis and does not account for the spatial variability of vine water status. Multispectral cameras mounted on an unmanned aerial vehicle (UAV) make it possible to capture the variability of vine water stress across a whole field. It has been reported that conventional multispectral indices (CMI) that use information between 500 and 800 nm do not accurately predict plant water status since they are not sensitive to water content. The objective of this study was to develop artificial neural network (ANN) models derived from multispectral images to predict the Ψstem spatial variability of a drip-irrigated Carménère vineyard in Talca, Maule Region, Chile. The coefficient of determination (R²) obtained between ANN outputs and ground-truth measurements of Ψstem was between 0.56 and 0.87, with the best performance observed for the model that included the bands 550, 570, 670, 700 and 800 nm. Validation analysis indicated that the ANN model could estimate Ψstem with a mean absolute error (MAE) of 0.1 MPa, root mean square error (RMSE) of 0.12 MPa, and relative error (RE) of -9.1%. For the validation of the CMI, the MAE, RMSE and RE values were between 0.26-0.27 MPa, 0.32-0.34 MPa and -24.2 to -25.6%, respectively.
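
    A minimal sketch, assuming per-vine reflectances in the five selected bands have already been extracted from the UAV imagery; the network architecture, data and preprocessing below are placeholders rather than the authors' configuration.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    rng = np.random.default_rng(3)
    X = rng.random((200, 5))                 # reflectance in the 5 selected bands (placeholder)
    psi = -1.2 + 0.8 * X[:, 4] - 0.5 * X[:, 2] + 0.05 * rng.standard_normal(200)  # synthetic stem water potential (MPa)

    model = make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
    model.fit(X[:150], psi[:150])

    pred = model.predict(X[150:])
    mae = mean_absolute_error(psi[150:], pred)
    rmse = np.sqrt(mean_squared_error(psi[150:], pred))
    print(f"MAE = {mae:.2f} MPa, RMSE = {rmse:.2f} MPa")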

  6. Multispectral Terrain Background Simulation Techniques For Use In Airborne Sensor Evaluation

    NASA Astrophysics Data System (ADS)

    Weinberg, Michael; Wohlers, Ronald; Conant, John; Powers, Edward

    1988-08-01

    A background simulation code developed at Aerodyne Research, Inc., called AERIE is designed to reflect the major sources of clutter that are of concern to staring and scanning sensors of the type being considered for various airborne threat warning (both aircraft and missiles) sensors. The code is a first principles model that could be used to produce a consistent image of the terrain for various spectral bands, i.e., provide the proper scene correlation both spectrally and spatially. The code utilizes both topographic and cultural features to model terrain, typically from DMA data, with a statistical overlay of the critical underlying surface properties (reflectance, emittance, and thermal factors) to simulate the resulting texture in the scene. Strong solar scattering from water surfaces is included with allowance for wind driven surface roughness. Clouds can be superimposed on the scene using physical cloud models and an analytical representation of the reflectivity obtained from scattering off spherical particles. The scene generator is augmented by collateral codes that allow for the generation of images at finer resolution. These codes provide interpolation of the basic DMA databases using fractal procedures that preserve the high frequency power spectral density behavior of the original scene. Scenes are presented illustrating variations in altitude, radiance, resolution, material, thermal factors, and emissivities. The basic models used to simulate the various scene components are described, along with the "engineering level" approximations incorporated to reduce the computational complexity of the simulation.

  7. Optimal attributes for the object based detection of giant reed in riparian habitats: A comparative study between Airborne High Spatial Resolution and WorldView-2 imagery

    NASA Astrophysics Data System (ADS)

    Fernandes, Maria Rosário; Aguiar, Francisca C.; Silva, João M. N.; Ferreira, Maria Teresa; Pereira, José M. C.

    2014-10-01

    Giant reed is an aggressive invasive plant of riparian ecosystems in many sub-tropical and warm-temperate regions, including Mediterranean Europe. In this study we tested a set of geometric, spectral and textural attributes in an object based image analysis (OBIA) approach to map giant reed invasions in riparian habitats. Bagging Classification and Regression Trees were used to select the optimal attributes and to build the classification rule sets. Mapping accuracy was assessed using landscape metrics and the Kappa coefficient to compare the topographical and geometric similarity between the giant reed patches obtained with the OBIA map and with a validation map derived from on-screen digitizing. The methodology was applied to two high spatial resolution image sets: airborne multispectral imagery and the new WorldView-2 imagery. A temporal coverage of the airborne multispectral images was radiometrically calibrated with the IR-MAD transformation and used to assess the influence of the phenological variability of the invader. We found that the optimal attributes for giant reed OBIA detection are a combination of spectral, geometric and textural information, with different scoring selection depending on the spectral and spatial characteristics of the imagery. WorldView-2 showed higher mapping accuracy (Kappa coefficient of 77%) and spectral attributes, including the new yellow band, were preferentially selected, although a tendency to overestimate the total invaded area, due to the lower spatial resolution (2 m pixel size vs. 50 cm), was observed. When airborne images were used, geometric attributes were primarily selected and a higher spatial detail of the invasive patches was obtained, due to the higher spatial resolution. However, in highly heterogeneous landscapes, the lower spectral resolution of the airborne images (4 bands instead of the 8 of WorldView-2) reduces the capability to detect giant reed patches. Giant reed displays peculiar spectral and geometric

  8. Air-borne shape measurement of parabolic trough collector fields

    NASA Astrophysics Data System (ADS)

    Prahl, Christoph; Röger, Marc; Hilgert, Christoph

    2017-06-01

    The optical and thermal efficiency of parabolic trough collector solar fields depends on the performance and assembly accuracy of components such as the concentrator and absorber. For the purposes of optical inspection/approval, yield analysis, localization of low-performing areas, and optimization of the solar field, it is essential to create a complete view of the optical properties of the field. Existing optical measurement tools are based on ground-based cameras and face restrictions concerning speed, volume and automation. QFly is an airborne qualification system which provides holistic and accurate information on geometrical, optical, and thermal properties of the entire solar field. It consists of an unmanned aerial vehicle, cameras and related software for flight path planning, data acquisition and evaluation. This article presents recent advances of the QFly measurement system and proposes a methodology for holistic qualification of the complete solar field with minimum impact on plant operation.

  9. Autocalibrating vision guided navigation of unmanned air vehicles via tactical monocular cameras in GPS denied environments

    NASA Astrophysics Data System (ADS)

    Celik, Koray

    This thesis presents a novel robotic navigation strategy using a conventional tactical monocular camera, proving the feasibility of using a monocular camera as the sole proximity sensing, object avoidance, mapping, and path-planning mechanism to fly and navigate small to medium scale unmanned rotary-wing aircraft in an autonomous manner. The range measurement strategy is scalable, self-calibrating, indoor-outdoor capable, and biologically inspired by the key adaptive mechanisms for depth perception and pattern recognition found in humans and intelligent animals (particularly bats), and is designed for operation in previously unknown, GPS-denied environments. The thesis proposes novel electronics, aircraft, aircraft systems, procedures, and algorithms that come together to form airborne systems which measure absolute ranges from a monocular camera via passive photometry, mimicking human-pilot-like judgement. The research is intended to bridge the gap between practical GPS coverage and the precision localization and mapping problem in a small aircraft. In the context of this study, several robotic platforms, airborne and ground alike, have been developed, some of which have been integrated in real-life field trials for experimental validation. Although the emphasis is on miniature robotic aircraft, this research has been tested and found compatible with tactical vests and helmets, and it can be used to augment the reliability of many other types of proximity sensors.

  10. Radiometric characterization of hyperspectral imagers using multispectral sensors

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-08-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal of the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands, as well as similar agreement between results that employ the different MODIS sensors as a reference.
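
    A minimal sketch of the band-averaging step described above: convolving a high-resolution (Hyperion-like) radiance spectrum with a broader (MODIS-like) relative spectral response. Both spectra below are synthetic placeholders.

    import numpy as np

    wl = np.arange(400.0, 901.0, 10.0)                       # Hyperion-like band centers (nm)
    radiance = 100.0 * np.exp(-((wl - 650.0) / 200.0) ** 2)  # hyperspectral radiance samples (placeholder)

    # Gaussian stand-in for a MODIS band relative spectral response centered near 645 nm
    rsr = np.exp(-0.5 * ((wl - 645.0) / 20.0) ** 2)

    band_avg = np.trapz(rsr * radiance, wl) / np.trapz(rsr, wl)   # response-weighted band average
    print(f"band-averaged radiance: {band_avg:.2f}")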

  11. Airborne ultrasound surface motion camera: Application to seismocardiography

    NASA Astrophysics Data System (ADS)

    Shirkovskiy, P.; Laurin, A.; Jeger-Madiot, N.; Chapelle, D.; Fink, M.; Ing, R. K.

    2018-05-01

    The recent achievements in the accelerometer-based seismocardiography field indicate a strong potential for this technique to address a wide variety of clinical needs. Recordings from different locations on the chest can give a more comprehensive observation and interpretation of wave propagation phenomena than a single-point recording, can validate existing modeling assumptions (such as the representation of the sternum as a single solid body), and provide better identifiability for models using richer recordings. Ultimately, the goal is to advance our physiological understanding of the processes to provide useful data to promote cardiovascular health. An accelerometer-based multichannel system is a contact method that is laborious to use in practice, and even ultralight accelerometers can cause non-negligible loading effects. We propose a contactless ultrasound imaging method to measure thoracic and abdominal surface motions, demonstrating that it is adequate for typical seismocardiogram (SCG) use. The developed method extends non-contact surface vibrometry to fast 2D mapping by combining multi-element airborne ultrasound arrays, a synthetic aperture implementation, and pulsed waves. Experimental results show the ability of the developed method to obtain 2D seismocardiographic maps of the body surface, 30 × 40 cm2 in dimension, with a temporal sampling rate of several hundred Hz, using ultrasound waves with a central frequency of 40 kHz. Our implementation was validated in vivo on eight healthy human participants. The shape and position of the zone of maximal absolute acceleration and velocity during the cardiac cycle were also observed. This technology could potentially be used to obtain more complete cardio-vascular information than single-source SCG in and out of clinical environments, due to enhanced identifiability provided by the distributed measurements, and observation of propagation phenomena.

  12. Radiometric sensitivity comparisons of multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Lu, Nadine C.; Slater, Philip N.

    1989-01-01

    Multispectral imaging systems provide much of the basic data used by the land and ocean civilian remote-sensing community. There are numerous multispectral imaging systems which have been and are being developed. A common way to compare the radiometric performance of these systems is to examine their noise-equivalent change in reflectance, NE Delta-rho. The NE Delta-rho of a system is the reflectance difference that is equal to the noise in the recorded signal. A comparison is made of the noise equivalent change in reflectance of seven different multispectral imaging systems (AVHRR, AVIRIS, ETM, HIRIS, MODIS-N, SPOT-1 HRV, and TM) for a set of three atmospheric conditions (continental aerosol with 23-km visibility, continental aerosol with 5-km visibility, and a Rayleigh atmosphere), five values of ground reflectance (0.01, 0.10, 0.25, 0.50, and 1.00), a nadir viewing angle, and a solar zenith angle of 45 deg.

  13. Experimental Demonstration of Adaptive Infrared Multispectral Imaging Using Plasmonic Filter Array (Postprint)

    DTIC Science & Technology

    2016-10-10

    AFRL-RX-WP-JA-2017-0189. Interim report (March 2016 - 23 May 2016) presenting an experimental demonstration of adaptive infrared multispectral imaging using fabricated plasmonic spectral filter arrays, together with proposed target detection scenarios.

  14. Multi-Spectral Stereo Atmospheric Remote Sensing (STARS) for Retrieval of Cloud Properties and Cloud-Motion Vectors

    NASA Astrophysics Data System (ADS)

    Kelly, M. A.; Boldt, J.; Wilson, J. P.; Yee, J. H.; Stoffler, R.

    2017-12-01

    The multi-spectral STereo Atmospheric Remote Sensing (STARS) concept aims to provide high-spatial- and high-temporal-resolution observations of 3D cloud structures related to hurricane development and other severe weather events. The rapid evolution of severe weather demonstrates a critical need for mesoscale observations of severe weather dynamics, but such observations are rare, particularly over the ocean where extratropical and tropical cyclones can undergo explosive development. Coincident space-based measurements of wind velocity and cloud properties at the mesoscale remain a great challenge, but are critically needed to improve the understanding and prediction of severe weather and cyclogenesis. STARS employs a mature stereoscopic imaging technique on two satellites (e.g. two CubeSats, two hosted payloads) to simultaneously retrieve cloud motion vectors (CMVs), cloud-top temperatures (CTTs), and cloud geometric heights (CGHs) from multi-angle, multi-spectral observations of cloud features. STARS is a pushbroom system based on separate wide-field-of-view co-boresighted multi-spectral cameras in the visible, midwave infrared (MWIR), and longwave infrared (LWIR) with high spatial resolution (better than 1 km). The visible system is based on a pan-chromatic, low-light imager to resolve cloud structures under nighttime illumination down to ¼ moon. The MWIR instrument, which is being developed as a NASA ESTO Instrument Incubator Program (IIP) project, is based on recent advances in MWIR detector technology that requires only modest cooling. The STARS payload provides flexible options for spaceflight due to its low size, weight, power (SWaP) and very modest cooling requirements. STARS also meets AF operational requirements for cloud characterization and theater weather imagery. In this paper, an overview of the STARS concept, including the high-level sensor design, the concept of operations, and the measurement capability, is presented.

  15. Eliminate background interference from latent fingerprints using ultraviolet multispectral imaging

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Xu, Xiaojing; Wang, Guiqiang

    2014-02-01

    Fingerprints are the most important evidence at crime scenes. The technology of developing latent fingerprints is one of the most active research areas in forensic science. Recently, multispectral imaging, which has shown great capability in fingerprint development, questioned document examination and trace evidence examination, has been used in detecting material evidence. This paper studied how to eliminate background interference from latent fingerprints on non-porous and porous surfaces using rotating-filter-wheel ultraviolet multispectral imaging. The results showed that background interference could be clearly removed from latent fingerprints by using multispectral imaging in the ultraviolet band.

  16. Airborne mapping of chemical plumes in the aftermath of Hurricanes Katrina and Rita

    NASA Astrophysics Data System (ADS)

    Lewis, Paul E.; Thomas, Mark J.; Kroutil, Robert T.; Combs, Roger; Cummings, Alan S.; Miller, Dave; Curry, Tim; Shen, Sylvia S.

    2006-05-01

    Infrared airborne spectral measurements were collected over the Gulf Coast area during the aftermath of Hurricanes Katrina and Rita. These measurements allowed surveillance for potentially hazardous chemical vapor releases from industrial facilities caused by storm damage. Data was collected with a mid-longwave infrared multispectral imager and a hyperspectral Fourier transform infrared spectrometer operating in a low altitude aircraft. Signal processing allowed detection and identification of targeted spectral signatures in the presence of interferents, atmospheric contributions, and thermal clutter. Results confirmed the presence of a number of chemical vapors. All detection results were immediately passed along to emergency first responders on the ground. The chemical identification, location, and vapor species concentration information were used by the emergency response ground teams for identification of critical plume releases and subsequent mitigation.

  17. Real-time full-motion color Flash lidar for target detection and identification

    NASA Astrophysics Data System (ADS)

    Nelson, Roy; Coppock, Eric; Craig, Rex; Craner, Jeremy; Nicks, Dennis; von Niederhausern, Kurt

    2015-05-01

    Greatly improved understanding of areas and objects of interest can be gained when real time, full-motion Flash LiDAR is fused with inertial navigation data and multi-spectral context imagery. On its own, full-motion Flash LiDAR provides the opportunity to exploit the z dimension for improved intelligence vs. 2-D full-motion video (FMV). The intelligence value of these data is enhanced when they are combined with inertial navigation data to produce an extended, georegistered data set suitable for a variety of analyses. Further, when fused with multispectral context imagery, the typical point cloud becomes a rich 3-D scene which is intuitively obvious to the user and allows rapid cognitive analysis with little or no training. Ball Aerospace has developed and demonstrated a real-time, full-motion LIDAR system that fuses context imagery (VIS to MWIR demonstrated) and inertial navigation data in real time, and can stream these information-rich geolocated/fused 3-D scenes from an airborne platform. In addition, since the higher-resolution context camera is boresighted and frame synchronized to the LiDAR camera and the LiDAR camera is an array sensor, techniques have been developed to rapidly interpolate the LIDAR pixel values, creating a point cloud that has the same resolution as the context camera, effectively creating a high definition (HD) LiDAR image. This paper presents a design overview of the Ball TotalSight™ LIDAR system along with typical results over urban and rural areas collected from both rotary and fixed-wing aircraft. We conclude with a discussion of future work.

  18. Applying Lidar and High-Resolution Multispectral Imagery for Improved Quantification and Mapping of Tundra Vegetation Structure and Distribution in the Alaskan Arctic

    NASA Astrophysics Data System (ADS)

    Greaves, Heather E.

    Climate change is disproportionately affecting high northern latitudes, and the extreme temperatures, remoteness, and sheer size of the Arctic tundra biome have always posed challenges that make application of remote sensing technology especially appropriate. Advances in high-resolution remote sensing continually improve our ability to measure characteristics of tundra vegetation communities, which have been difficult to characterize previously due to their low stature and their distribution in complex, heterogeneous patches across large landscapes. In this work, I apply terrestrial lidar, airborne lidar, and high-resolution airborne multispectral imagery to estimate tundra vegetation characteristics for a research area near Toolik Lake, Alaska. Initially, I explored methods for estimating shrub biomass from terrestrial lidar point clouds, finding that a canopy-volume based algorithm performed best. Although shrub biomass estimates derived from airborne lidar data were less accurate than those from terrestrial lidar data, algorithm parameters used to derive biomass estimates were similar for both datasets. Additionally, I found that airborne lidar-based shrub biomass estimates were just as accurate whether calibrated against terrestrial lidar data or harvested shrub biomass, suggesting that terrestrial lidar potentially could replace destructive biomass harvest. Along with smoothed Normalized Difference Vegetation Index (NDVI) derived from airborne imagery, airborne lidar-derived canopy volume was an important predictor in a Random Forest model trained to estimate shrub biomass across the 12.5 km2 covered by our lidar and imagery data. The resulting 0.80 m resolution shrub biomass maps should provide important benchmarks for change detection in the Toolik area, especially as deciduous shrubs continue to expand in tundra regions. Finally, I applied 33 lidar- and imagery-derived predictor layers in a validated Random Forest modeling approach to map vegetation
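
    A minimal sketch of a Random Forest regression using two of the predictors highlighted above (lidar-derived canopy volume and smoothed NDVI); the data and predictor set are placeholders, not the 33-layer model of the study.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(4)
    canopy_volume = rng.uniform(0.0, 2.0, 300)      # lidar-derived canopy volume per grid cell (placeholder)
    ndvi = rng.uniform(0.2, 0.9, 300)               # smoothed NDVI from airborne imagery (placeholder)
    biomass = 0.4 * canopy_volume + 0.6 * ndvi + 0.05 * rng.standard_normal(300)  # synthetic shrub biomass

    X = np.column_stack([canopy_volume, ndvi])
    rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, biomass)

    print(dict(zip(["canopy_volume", "ndvi"], rf.feature_importances_.round(2))))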

  19. Data fusion concept in multispectral system for perimeter protection of stationary and moving objects

    NASA Astrophysics Data System (ADS)

    Ciurapiński, Wieslaw; Dulski, Rafal; Kastek, Mariusz; Szustakowski, Mieczyslaw; Bieszczad, Grzegorz; Życzkowski, Marek; Trzaskawka, Piotr; Piszczek, Marek

    2009-09-01

    The paper presents the concept of a multispectral protection system for perimeter protection of stationary and moving objects. The system consists of an active ground radar and thermal and visible cameras. The radar allows the system to locate potential intruders and to control the observation area for the system cameras. The multisensor construction of the system ensures a significant improvement in the probability of intruder detection and a reduction of false alarms. The final decision of the system is made using image data. The method of data fusion used in the system is presented. The system works under the control of the FLIR Nexus system. The Nexus offers complete technology and components to create network-based, high-end integrated systems for security and surveillance applications. Based on a unique "plug and play" architecture, the system provides flexibility and straightforward integration of sensors and devices in TCP/IP networks. Using a graphical user interface it is possible to control sensors and monitor streaming video and other data over the network, visualize the results of the data fusion process and obtain detailed information about detected intruders over a digital map. The system provides high-level applications and operator workload reduction with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification and alarm triggering.

  20. Comparison of preliminary results from Airborne Aster Simulator (AAS) with TIMS data

    NASA Technical Reports Server (NTRS)

    Kannari, Yoshiaki; Mills, Franklin; Watanabe, Hiroshi; Ezaka, Teruya; Narita, Tatsuhiko; Chang, Sheng-Huei

    1992-01-01

    The Japanese Advanced Spaceborne Thermal Emission and Reflection radiometer (ASTER), being developed for a NASA EOS-A satellite, will have 3 VNIR, 6 SWIR, and 5 TIR (8-12 micron) bands. An Airborne ASTER Simulator (AAS) was developed for the Japan Resources Observation System Organization (JAROS) by the Geophysical Environmental Research Group (GER) Corp. to research surface temperature and emission features in the MWIR/TIR, to simulate ASTER's TIR bands, and to study further possibilities for MWIR/TIR bands. The AAS has 1 VNIR, 3 MWIR (3-5 microns), and 20 (currently 24) TIR bands. Data were collected over 3 sites - Cuprite, Nevada; Long Valley/Mono Lake, California; and Death Valley, California - with simultaneous ground truth measurements. Preliminary data collected by the AAS for Cuprite, Nevada are presented and AAS data are compared with Thermal Infrared Multispectral Scanner (TIMS) data.

  1. Catchment-Scale Terrain Modelling with Structure-from-Motion Photogrammetry: a replacement for airborne lidar?

    NASA Astrophysics Data System (ADS)

    Brasington, J.

    2015-12-01

    Over the last five years, Structure-from-Motion photogrammetry has dramatically democratized the availability of high quality topographic data. This approach involves the use of a non-linear bundle adjustment to estimate simultaneously camera position, pose, distortion and 3D model coordinates. In contrast to traditional aerial photogrammetry, the bundle adjustment is typically solved without external constraints and instead ground control is used a posteriori to transform the modelled coordinates to an established datum using a similarity transformation. The limited data requirements, coupled with the ability to self-calibrate compact cameras, has led to a burgeoning of applications using low-cost imagery acquired terrestrially or from low-altitude platforms. To date, most applications have focused on relatively small spatial scales where relaxed logistics permit the use of dense ground control and high resolution, close-range photography. It is less clear whether this low-cost approach can be successfully upscaled to tackle larger, watershed-scale projects extending over 10^2-10^3 km2 where it could offer a competitive alternative to landscape modelling with airborne lidar. At such scales, compromises over the density of ground control, the speed and height of sensor platform and related image properties are inevitable. In this presentation we provide a systematic assessment of large-scale SfM terrain products derived for over 80 km2 of the braided Dart River and its catchment in the Southern Alps of NZ. Reference data in the form of airborne and terrestrial lidar are used to quantify the quality of 3D reconstructions derived from helicopter photography and used to establish baseline uncertainty models for geomorphic change detection. Results indicate that camera network design is a key determinant of model quality, and that standard aerial networks based on strips of nadir photography can lead to unstable camera calibration and systematic errors that are difficult
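
    A minimal sketch of the a posteriori similarity transformation used to place arbitrary SfM model coordinates onto surveyed ground control, estimated here with the closed-form Umeyama/Procrustes solution; the control points below are synthetic placeholders.

    import numpy as np

    def similarity_transform(src, dst):
        """Estimate s, R, t such that dst ~ s * R @ src + t (least squares)."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        A, B = src - mu_s, dst - mu_d
        U, S, Vt = np.linalg.svd(B.T @ A / len(src))       # cross-covariance SVD
        D = np.eye(3)
        if np.linalg.det(U @ Vt) < 0:                       # guard against reflections
            D[2, 2] = -1.0
        R = U @ D @ Vt
        s = np.trace(np.diag(S) @ D) / A.var(axis=0).sum()  # scale factor
        t = mu_d - s * R @ mu_s                              # translation
        return s, R, t

    rng = np.random.default_rng(5)
    model_pts = rng.random((6, 3)) * 100.0                   # SfM model coordinates of control points (placeholder)
    gcp_pts = 2.5 * model_pts + np.array([1000.0, 2000.0, 50.0])  # surveyed coordinates (placeholder datum)

    s, R, t = similarity_transform(model_pts, gcp_pts)
    print(round(s, 3), np.round(t, 1))                       # recovers the scale and offset used above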

  2. Catchment-Scale Terrain Modelling with Structure-from-Motion Photogrammetry: a replacement for airborne lidar?

    NASA Astrophysics Data System (ADS)

    Brasington, James; James, Joe; Cook, Simon; Cox, Simon; Lotsari, Eliisa; McColl, Sam; Lehane, Niall; Williams, Richard; Vericat, Damia

    2016-04-01

    In recent years, 3D terrain reconstructions based on Structure-from-Motion photogrammetry have dramatically democratized the availability of high quality topographic data. This approach involves the use of a non-linear bundle adjustment to estimate simultaneously camera position, pose, distortion and 3D model coordinates. In contrast to traditional aerial photogrammetry, the bundle adjustment is typically solved without external constraints and instead ground control is used a posteriori to transform the modelled coordinates to an established datum using a similarity transformation. The limited data requirements, coupled with the ability to self-calibrate compact cameras, has led to a burgeoning of applications using low-cost imagery acquired terrestrially or from low-altitude platforms. To date, most applications have focused on relatively small spatial scales (0.1-5 Ha), where relaxed logistics permit the use of dense ground control networks and high resolution, close-range photography. It is less clear whether this low-cost approach can be successfully upscaled to tackle larger, watershed-scale projects extending over 10^2-10^3 km2 where it could offer a competitive alternative to established landscape modelling with airborne lidar. At such scales, compromises over the density of ground control, the speed and height of sensor platform and related image properties are inevitable. In this presentation we provide a systematic assessment of the quality of large-scale SfM terrain products derived for over 80 km2 of the braided Dart River and its catchment in the Southern Alps of NZ. Reference data in the form of airborne and terrestrial lidar are used to quantify the quality of 3D reconstructions derived from helicopter photography and used to establish baseline uncertainty models for geomorphic change detection. Results indicate that camera network design is a key determinant of model quality, and that standard aerial photogrammetric networks based on strips of nadir

  3. Multi-spectral confocal microendoscope for in-vivo imaging

    NASA Astrophysics Data System (ADS)

    Rouse, Andrew Robert

    The concept of in-vivo multi-spectral confocal microscopy is introduced. A slit-scanning multi-spectral confocal microendoscope (MCME) was built to demonstrate the technique. The MCME employs a flexible fiber-optic catheter coupled to a custom built slit-scan confocal microscope fitted with a custom built imaging spectrometer. The catheter consists of a fiber-optic imaging bundle linked to a miniature objective and focus assembly. The design and performance of the miniature objective and focus assembly are discussed. The 3 mm diameter catheter may be used on its own or routed through the instrument channel of a commercial endoscope. The confocal nature of the system provides optical sectioning with 3 μm lateral resolution and 30 μm axial resolution. The prism based multi-spectral detection assembly is typically configured to collect 30 spectral samples over the visible chromatic range. The spectral sampling rate varies from 4 nm/pixel at 490 nm to 8 nm/pixel at 660 nm and the minimum resolvable wavelength difference varies from 7 nm to 18 nm over the same spectral range. Each of these characteristics is primarily dictated by the dispersive power of the prism. The MCME is designed to examine cellular structures during optical biopsy and to exploit the diagnostic information contained within the spectral domain. The primary applications for the system include diagnosis of disease in the gastro-intestinal tract and female reproductive system. Recent data from the grayscale imaging mode are presented. Preliminary multi-spectral results from phantoms, cell cultures, and excised human tissue are presented to demonstrate the potential of in-vivo multi-spectral imaging.

  4. Assessment of Satellite Albedos Using NASA-CAR Airborne Data

    NASA Astrophysics Data System (ADS)

    Kharbouche, S.; Charles, G.; Muller, J. P.

    2016-12-01

    Airborne BRF (Bidirectional Reflectance Factor) data have been acquired at multiple altitudes by the NASA CAR (Cloud Absorption Radiometer) multi-spectral instrument since the late 1990s in order to study the reflectance of different types of landscapes as a function of wavelength, view angle and spatial scale, and to assess BRFs derived from multispectral satellites. Because the measured BRFs are taken over a very short period (< 2 minutes), the effects of solar angle and the atmosphere are minimized. This allows the derivation of a dense set of BRFs from which polar plots of the BRDF can be displayed directly for different sites in the Arctic. Also, as the measurements have been taken at different flight heights, the upscaling issue can be addressed and detailed with concrete samples. The CAR instrument is well calibrated (traceable to NIST standards) and can be compared with ground measurements, so the derived BRF data are likely to be highly reliable and can be used in the validation of satellite products such as radiance, reflectance and albedo, as well as in BRDF (Bidirectional Reflectance Distribution Function) modelling and in the development of new atmospheric correction techniques. The NASA CAR, developed by NASA-GSFC, can be carried by and integrated into many experimental aircraft, so the CAR can be considered an airborne multi-wavelength scanning radiometer that measures radiance with an instantaneous field of view of 1°. Over targeted sites, the CAR flies circularly and scans through 180° from straight above, through the horizon, to straight down. Data are recorded in 14 narrow spectral bands located in the ultraviolet, visible and near-infrared regions of the electromagnetic spectrum (0.340-2.301 μm). The spot size at nadir depends on the flight height; it varies from 1 m (height = 110 m) to 48 m (height = 5500 m). We will show in this presentation the accuracy of BRF, BRDF and Black-Sky-Albedo of MODIS, MISR, MERIS, VGT

  5. Multispectral image fusion for illumination-invariant palmprint recognition

    PubMed Central

    Zhang, Xinman; Xu, Xuebin; Shang, Dongpeng

    2017-01-01

    Multispectral palmprint recognition has shown broad prospects for personal identification due to its high accuracy and great stability. In this paper, we develop a novel illumination-invariant multispectral palmprint recognition method. To combine the information from multiple spectral bands, an image-level fusion framework is constructed based on a fast and adaptive bidimensional empirical mode decomposition (FABEMD) and a weighted Fisher criterion. The FABEMD technique decomposes the multispectral images into their bidimensional intrinsic mode functions (BIMFs), on which an illumination compensation operation is performed. The weighted Fisher criterion is used to construct the fusion coefficients at the decomposition level, so that the images are correctly separated in the fusion space. The image fusion framework has shown strong robustness against illumination variation. In addition, a tensor-based extreme learning machine (TELM) mechanism is presented for feature extraction and classification of two-dimensional (2D) images. In general, this method has fast learning speed and satisfying recognition accuracy. Comprehensive experiments conducted on the PolyU multispectral palmprint database illustrate that the proposed method can achieve favorable results. For testing under ideal illumination, the recognition accuracy is as high as 99.93%, and the result is 99.50% under unsatisfactory lighting conditions. PMID:28558064

  6. Multispectral image fusion for illumination-invariant palmprint recognition.

    PubMed

    Lu, Longbin; Zhang, Xinman; Xu, Xuebin; Shang, Dongpeng

    2017-01-01

    Multispectral palmprint recognition has shown broad prospects for personal identification due to its high accuracy and great stability. In this paper, we develop a novel illumination-invariant multispectral palmprint recognition method. To combine the information from multiple spectral bands, an image-level fusion framework is constructed based on a fast and adaptive bidimensional empirical mode decomposition (FABEMD) and a weighted Fisher criterion. The FABEMD technique decomposes the multispectral images into their bidimensional intrinsic mode functions (BIMFs), on which an illumination compensation operation is performed. The weighted Fisher criterion is used to construct the fusion coefficients at the decomposition level, so that the images are correctly separated in the fusion space. The image fusion framework has shown strong robustness against illumination variation. In addition, a tensor-based extreme learning machine (TELM) mechanism is presented for feature extraction and classification of two-dimensional (2D) images. In general, this method has fast learning speed and satisfying recognition accuracy. Comprehensive experiments conducted on the PolyU multispectral palmprint database illustrate that the proposed method can achieve favorable results. For testing under ideal illumination, the recognition accuracy is as high as 99.93%, and the result is 99.50% under unsatisfactory lighting conditions.

  7. Characterization of instream hydraulic and riparian habitat conditions and stream temperatures of the Upper White River Basin, Washington, using multispectral imaging systems

    USGS Publications Warehouse

    Black, Robert W.; Haggland, Alan; Crosby, Greg

    2003-01-01

    Instream hydraulic and riparian habitat conditions and stream temperatures were characterized for selected stream segments in the Upper White River Basin, Washington. An aerial multispectral imaging system used digital cameras to photograph the stream segments across multiple wavelengths to characterize fish habitat and temperature conditions. All imagery was georeferenced. Fish habitat features were photographed at a resolution of 0.5 meter and temperature imagery was acquired at a 1.0-meter resolution. The digital multispectral imagery was classified using commercially available software. Aerial photographs were taken on September 21, 1999. Field habitat data were collected from August 23 to October 12, 1999, to evaluate the measurement accuracy and effectiveness of the multispectral imaging in determining the extent of the instream habitat variables. Fish habitat types assessed by this method were the abundance of instream hydraulic features such as pool and riffle habitats, turbulent and non-turbulent habitats, riparian composition, the abundance of large woody debris in the stream and riparian zone, and stream temperatures. Factors such as the abundance of instream woody debris, the location and frequency of pools, and stream temperatures generally are known to have a significant impact on salmon. Instream woody debris creates the habitat complexity necessary to maintain a diverse and healthy salmon population. The abundance of pools is indicative of a stream's ability to support fish and other aquatic organisms. Changes in water temperature can affect aquatic organisms by altering metabolic rates and oxygen requirements, altering their sensitivity to toxic materials and affecting their ability to avoid predators. The specific objectives of this project were to evaluate the use of an aerial multispectral imaging system to accurately identify instream hydraulic features and surface-water temperatures in the Upper White River Basin, to use the

  8. [A spatial adaptive algorithm for endmember extraction on multispectral remote sensing image].

    PubMed

    Zhu, Chang-Ming; Luo, Jian-Cheng; Shen, Zhan-Feng; Li, Jun-Li; Hu, Xiao-Dong

    2011-10-01

    Because the convex cone analysis (CCA) method can extract only a limited number of endmembers from multispectral imagery, this paper proposed a new endmember extraction method based on spatially adaptive spectral feature analysis of multispectral remote sensing images, using spatial clustering and image slicing. Firstly, in order to remove spatial and spectral redundancies, the principal component analysis (PCA) algorithm was used to reduce the dimensionality of the multispectral data. Secondly, the iterative self-organizing data analysis technique algorithm (ISODATA) was used to cluster the image according to the spectral similarity of pixels. Then, through clustering post-processing and the merging of small clusters, the whole image was divided into several blocks (tiles). Lastly, according to the complexity of each block's landscape and analysis of the scatter diagrams, the number of endmembers can be determined, and the hourglass algorithm is then used to extract the endmembers. An endmember extraction experiment on TM multispectral imagery showed that the method can extract endmember spectra from multispectral imagery effectively. Moreover, the method resolves the limitation on the number of endmembers and improves the accuracy of endmember extraction. The method provides a new way to extract endmembers from multispectral images.
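
    A hedged sketch of the overall flow (PCA, unsupervised clustering, per-cluster endmember candidates); k-means stands in for ISODATA, a simple extreme-pixel pick stands in for the hourglass algorithm, and the pixels are placeholders, so this is not the paper's implementation.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(6)
    pixels = rng.random((2000, 6))                     # multispectral pixels (rows) x bands (cols), placeholder

    scores = PCA(n_components=3).fit_transform(pixels) # 1) reduce spectral dimensionality
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)  # 2) clustering (ISODATA in the paper)

    # 3) candidate endmember per cluster: the pixel farthest from the global mean in PC space
    center = scores.mean(axis=0)
    endmembers = []
    for c in range(4):
        idx = np.where(labels == c)[0]
        far = idx[np.argmax(np.linalg.norm(scores[idx] - center, axis=1))]
        endmembers.append(pixels[far])
    print(np.array(endmembers).shape)                  # (4, 6): one candidate spectrum per cluster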

  9. Inverse analysis of non-uniform temperature distributions using multispectral pyrometry

    NASA Astrophysics Data System (ADS)

    Fu, Tairan; Duan, Minghao; Tian, Jibin; Shi, Congling

    2016-05-01

    Optical diagnostics can be used to obtain sub-pixel temperature information in remote sensing. A multispectral pyrometry method was developed that uses multiple spectral radiation intensities to deduce the temperature-area distribution in the measurement region. The method transforms a spot multispectral pyrometer with a fixed field of view into a pyrometer with enhanced spatial resolution that can give sub-pixel temperature information from a "one pixel" measurement region. A temperature-area fraction function was defined to represent the spatial temperature distribution in the measurement region. The method is illustrated by simulations of a multispectral pyrometer with a spectral range of 8.0-13.0 μm measuring a non-isothermal region with a temperature range of 500-800 K in the spot pyrometer field of view. The inverse algorithm for the sub-pixel temperature distribution (temperature-area fractions) within the "one pixel" verifies this multispectral pyrometry method. The results show that an improved Levenberg-Marquardt algorithm is effective for this ill-posed inverse problem, with relative errors in the temperature-area fractions within ±3% for most of the temperatures. The analysis provides a valuable reference for the use of spot multispectral pyrometers for sub-pixel temperature distributions in remote sensing measurements.
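
    A minimal sketch of the inverse step, assuming a fixed grid of candidate temperatures and solving for the temperature-area fractions from simulated band radiances; scipy's bounded trust-region least-squares solver stands in for the paper's improved Levenberg-Marquardt algorithm, and the band set, grid and noise level are placeholders.

    import numpy as np
    from scipy.optimize import least_squares

    C1, C2 = 1.191e-16, 1.4388e-2            # Planck radiation constants (W m^2 sr^-1, m K)

    def planck(lam_m, T):
        return C1 / (lam_m**5 * (np.exp(C2 / (lam_m * T)) - 1.0))

    lams = np.linspace(8e-6, 13e-6, 6)       # 6 spectral channels spanning 8-13 um
    T_grid = np.array([500.0, 600.0, 700.0, 800.0])   # candidate temperatures (K)

    f_true = np.array([0.4, 0.3, 0.2, 0.1])  # true area fractions (sum to 1)
    meas = planck(lams[:, None], T_grid[None, :]) @ f_true
    meas *= 1.0 + 0.005 * np.random.default_rng(7).standard_normal(meas.size)  # simulated measurement noise

    def residual(f):
        return planck(lams[:, None], T_grid[None, :]) @ f - meas

    sol = least_squares(residual, x0=np.full(4, 0.25), bounds=(0.0, 1.0))
    print(np.round(sol.x / sol.x.sum(), 3))  # recovered area fractions, renormalized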

  10. A system to geometrically rectify and map airborne scanner imagery and to estimate ground area. [by computer

    NASA Technical Reports Server (NTRS)

    Spencer, M. M.; Wolf, J. M.; Schall, M. A.

    1974-01-01

    A system of computer programs was developed that performs geometric rectification and line-by-line mapping of airborne multispectral scanner data to ground coordinates and estimates ground area. The system requires aircraft attitude and positional information furnished by ancillary aircraft equipment, as well as ground control points. The geometric correction and mapping procedure locates the scan lines, or the pixels on each line, in terms of map grid coordinates. The area estimation procedure gives the ground area for each pixel or for a predesignated parcel specified in map grid coordinates. Exercising the system with simulated data produced both uncorrected and corrected imagery and yielded area estimates accurate to better than 99.7%.

  11. Artificial Neural Network to Predict Vine Water Status Spatial Variability Using Multispectral Information Obtained from an Unmanned Aerial Vehicle (UAV)

    PubMed Central

    Bardeen, Matthew

    2017-01-01

    Water stress, which affects yield and wine quality, is often evaluated using the midday stem water potential (Ψstem). However, this measurement is acquired on a per-plant basis and does not account for the spatial variability of vine water status. Multispectral cameras mounted on an unmanned aerial vehicle (UAV) make it possible to capture the variability of vine water stress across a whole field. It has been reported that conventional multispectral indices (CMI) that use information between 500 and 800 nm do not accurately predict plant water status since they are not sensitive to water content. The objective of this study was to develop artificial neural network (ANN) models derived from multispectral images to predict the Ψstem spatial variability of a drip-irrigated Carménère vineyard in Talca, Maule Region, Chile. The coefficient of determination (R2) obtained between ANN outputs and ground-truth measurements of Ψstem was between 0.56 and 0.87, with the best performance observed for the model that included the bands 550, 570, 670, 700 and 800 nm. Validation analysis indicated that the ANN model could estimate Ψstem with a mean absolute error (MAE) of 0.1 MPa, root mean square error (RMSE) of 0.12 MPa, and relative error (RE) of −9.1%. For the validation of the CMI, the MAE, RMSE and RE values were between 0.26–0.27 MPa, 0.32–0.34 MPa and −24.2 to −25.6%, respectively. PMID:29084169

  12. Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary

    2006-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of high spatial resolution satellite remote sensing system products from Space Imaging IKONOS, Digital Globe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and its Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near-infrared spectrum and gives examples of calibration results.

  13. Integrated Active Fire Retrievals and Biomass Burning Emissions Using Complementary Near-Coincident Ground, Airborne and Spaceborne Sensor Data

    NASA Technical Reports Server (NTRS)

    Schroeder, Wilfrid; Ellicott, Evan; Ichoku, Charles; Ellison, Luke; Dickinson, Matthew B.; Ottmar, Roger D.; Clements, Craig; Hall, Dianne; Ambrosia, Vincent; Kremens, Robert

    2013-01-01

    Ground, airborne and spaceborne data were collected for a 450 ha prescribed fire implemented on 18 October 2011 at the Henry W. Coe State Park in California. The integration of various data elements allowed near-coincident active fire retrievals to be estimated. The Autonomous Modular Sensor-Wildfire (AMS) airborne multispectral imaging system was used as a bridge between ground and spaceborne data sets, providing high quality reference information to support satellite fire retrieval error analyses and fire emissions estimates. We found excellent agreement between peak fire radiant heat flux data (less than 1% error) derived from near-coincident ground radiometers and AMS. Both MODIS and GOES imager active fire products were negatively influenced by the presence of thick smoke, which was misclassified as cloud by their algorithms, leading to the omission of fire pixels beneath the smoke and resulting in the underestimation of their retrieved fire radiative power (FRP) values for the burn plot compared to the reference airborne data. Agreement between airborne and spaceborne FRP data improved significantly after correction for omission errors and atmospheric attenuation, resulting in as little as 5% difference between Aqua MODIS and AMS. Use of in situ fuel and fire energy estimates in combination with a collection of AMS, MODIS, and GOES FRP retrievals provided a fuel consumption factor of 0.261 kg per MJ, total energy release of 14.5 x 10^6 MJ, and total fuel consumption of 3.8 x 10^6 kg. Fire emissions were calculated using two separate techniques, resulting in as little as 15% difference for various species.
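
    A worked check of the emissions bookkeeping quoted above: total fuel consumption is the total fire radiative energy multiplied by the fuel consumption factor.

    consumption_factor = 0.261      # kg of fuel consumed per MJ of fire radiative energy
    total_energy_MJ = 14.5e6        # total fire radiative energy release (MJ)

    total_fuel_kg = consumption_factor * total_energy_MJ
    print(f"{total_fuel_kg:.2e} kg")   # ~3.8e6 kg, matching the abstract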

  14. Multispectral data compression through transform coding and block quantization

    NASA Technical Reports Server (NTRS)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single-sample PCM encoder.
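
    A minimal sketch of block transform coding and quantization on a single band, using a fixed orthonormal Hadamard transform on 8 x 8 blocks; the data-dependent Karhunen-Loeve transform of the paper is not reproduced, and the image and quantization step are placeholders.

    import numpy as np
    from scipy.linalg import hadamard

    rng = np.random.default_rng(8)
    band = rng.integers(0, 256, (64, 64)).astype(float)   # one band of scanner data (placeholder)

    H = hadamard(8) / np.sqrt(8)         # orthonormal 8x8 Hadamard transform
    step = 16.0                          # uniform quantization step size

    recon = np.zeros_like(band)
    for i in range(0, 64, 8):
        for j in range(0, 64, 8):
            block = band[i:i+8, j:j+8]
            coeff = H @ block @ H.T                     # forward transform of the block
            coeff_q = step * np.round(coeff / step)     # block quantization of the coefficients
            recon[i:i+8, j:j+8] = H.T @ coeff_q @ H     # inverse transform / reconstruction

    mse = np.mean((band - recon) ** 2)
    print(f"reconstruction MSE: {mse:.1f}")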

  15. Intelligent multi-spectral IR image segmentation

    NASA Astrophysics Data System (ADS)

    Lu, Thomas; Luong, Andrew; Heim, Stephen; Patel, Maharshi; Chen, Kang; Chao, Tien-Hsin; Chow, Edward; Torres, Gilbert

    2017-05-01

    This article presents a neural network based multi-spectral image segmentation method. A neural network is trained on the selected features of both the objects and background in the longwave (LW) Infrared (IR) images. Multiple iterations of training are performed until the accuracy of the segmentation reaches satisfactory level. The segmentation boundary of the LW image is used to segment the midwave (MW) and shortwave (SW) IR images. A second neural network detects the local discontinuities and refines the accuracy of the local boundaries. This article compares the neural network based segmentation method to the Wavelet-threshold and Grab-Cut methods. Test results have shown increased accuracy and robustness of this segmentation scheme for multi-spectral IR images.

  16. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  17. Verification of small-scale water vapor features in VAS imagery using high resolution MAMS imagery. [VISSR Atmospheric Sounder - Multispectral Atmospheric Mapping Sensor

    NASA Technical Reports Server (NTRS)

    Menzel, Paul W.; Jedlovec, Gary; Wilson, Gregory

    1986-01-01

    The Multispectral Atmospheric Mapping Sensor (MAMS), a modification of NASA's Airborne Thematic Mapper, is described, and radiances collected simultaneously on May 18, 1985, by the MAMS and the VISSR Atmospheric Sounder (VAS) are compared. Thermal emission from the earth-atmosphere system in eight visible and three infrared spectral bands (12.3, 11.2 and 6.5 microns) is measured by the MAMS at up to 50 m horizontal resolution, and the infrared bands are similar to three of the VAS infrared bands. Similar radiometric performance was found for the two systems, though the MAMS showed somewhat less attenuation from water vapor than VAS because its spectral bands are shifted to shorter wavelengths, away from the absorption band center.

  18. Skin condition measurement by using multispectral imaging system (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jung, Geunho; Kim, Sungchul; Kim, Jae Gwan

    2017-02-01

    There are a number of commercially available low level light therapy (LLLT) devices on the market, and face whitening or wrinkle reduction is one of the targets of LLLT. Facial improvement can be judged simply by visual observation of the face, but this provides neither quantitative data nor the ability to recognize subtle changes. Clinical diagnostic instruments such as a mexameter can provide quantitative data, but their cost is too high for home users. Therefore, we designed a low cost multi-spectral imaging device by adding additional LEDs (470 nm, 640 nm, white LED, 905 nm) to a commercial USB microscope which has two LEDs (395 nm, 940 nm) as light sources. Among various LLLT skin treatments, we focused on obtaining melanin and wrinkle information. For melanin index measurements, multi-spectral images of nevi were acquired, and melanin index values from color images (the conventional method) and from multi-spectral images were compared. The results showed that multi-spectral analysis of the melanin index can visualize nevi of different depths and concentrations. A cross section of a wrinkle on skin resembles a wedge, which can be a source of high frequency components when the skin image is Fourier transformed into a spatial frequency domain map. In that case, the entropy of the spatial frequency map can represent the frequency distribution, which is related to the amount and thickness of wrinkles. Entropy values from multi-spectral images can potentially separate the contribution of thin and shallow wrinkles from that of thick and deep wrinkles. From the results, we found that this low cost multi-spectral imaging system could be beneficial for home users of LLLT by providing treatment efficacy in a quantitative way.
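
    A minimal sketch of the wrinkle metric described above: the entropy of the normalized spatial-frequency (Fourier power) distribution of a grayscale skin image; the image is a placeholder.

    import numpy as np

    rng = np.random.default_rng(9)
    img = rng.random((128, 128))                       # grayscale skin patch (placeholder)

    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    p = power / power.sum()                            # normalized spatial-frequency distribution
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))    # higher entropy -> broader frequency content

    print(f"spectral entropy: {entropy:.2f} bits")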

  19. Cross-Calibration of Ground and Airborne TIR and VSWIR Instruments for NASA's SnowEx 2017 Grand Mesa Campaign

    NASA Astrophysics Data System (ADS)

    Crawford, C. J.; Chickadel, C. C.; Hall, D. K.; Jennings, D. E.; Jhabvala, M. D.; Kim, E. J.; Jessica, L.; Lunsford, A.

    2017-12-01

    The NASA Terrestrial Hydrology Program sponsored a ground and airborne snow experiment (SnowEx) in the Grand Mesa area and Senator Beck Basin in western Colorado during February 2017. This communication summarizes efforts to develop traceable instrument calibration requirements for SnowEx Grand Mesa in support of thermal infrared (TIR) and visible-to-shortwave infrared (VSWIR) snow measurement science. Cross-calibration outcomes for TIR instruments (7-10 µm and 8-14 µm response functions) indicate that an at-sensor measurement accuracy of within 1.5 degrees Celsius was achieved across ground and airborne sensors using laboratory and field blackbody sources. A cross-calibration assessment of VSWIR spectrometers (0.35 to 2.5 µm response functions) using a National Institute of Standards and Technology (NIST) traceable source indicates an at-sensor measurement accuracy of within 5% for visible-near infrared spectral radiance (W cm^-2 sr^-1 nm^-1) and irradiance (W m^-2 nm^-1), and within 20% for shortwave infrared measurements before a radiometric cross-calibration correction was applied. Additional validation is undertaken to assess the ground and airborne SnowEx Grand Mesa TIR and VSWIR instrument cross-calibration quality by benchmarking against on-orbit image acquisitions of the snow surface on February 14th and 15th, 2017, from Landsat Enhanced Thematic Mapper Plus (ETM+), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), and Sentinel-2A Multi-Spectral Instrument (MSI).

  20. The Multispectral Imaging Science Working Group. Volume 2: Working group reports

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    Summaries of the various multispectral imaging science working groups are presented. Current knowledge of the spectral and spatial characteristics of the Earth's surface is outlined and the present and future capabilities of multispectral imaging systems are discussed.

  1. Accuracy Potential and Applications of MIDAS Aerial Oblique Camera System

    NASA Astrophysics Data System (ADS)

    Madani, M.

    2012-07-01

    Airborne oblique cameras such as the Fairchild T-3A were initially used for military reconnaissance in the 1930s. A modern professional digital oblique camera system such as MIDAS (Multi-camera Integrated Digital Acquisition System) is used to generate lifelike three-dimensional imagery for visualization, GIS applications, architectural modeling, city modeling, games, simulators, etc. Oblique imagery provides the best vantage for accessing and reviewing changes to the local government tax base, property valuation assessment, and the buying and selling of residential/commercial properties, enabling better decisions in a more timely manner. Oblique imagery is also used for infrastructure monitoring, ensuring safe operation of transportation, utilities, and facilities. Sanborn Mapping Company acquired one MIDAS from TrackAir in 2011. This system consists of four tilted (45 degrees) cameras and one vertical camera connected to a dedicated data acquisition computer system. The 5 digital cameras are based on the Canon EOS 1DS Mark3 with Zeiss lenses. The CCD size is 5,616 by 3,744 (21 MPixels) with a pixel size of 6.4 microns. Multiple flights using different camera configurations (nadir/oblique (28 mm/50 mm) and (50 mm/50 mm)) were flown over downtown Colorado Springs, Colorado. Boresight flights for the 28 mm nadir camera were flown at 600 m and 1,200 m and for the 50 mm nadir camera at 750 m and 1,500 m. Cameras were calibrated by using a 3D cage and multiple convergent images utilizing the Australis model. In this paper, the MIDAS system is described, a number of real data sets collected during the aforementioned flights are presented together with their associated flight configurations, the data processing workflow, system calibration and quality control workflows are highlighted, and the achievable accuracy is presented in some detail. This study revealed that an expected accuracy of about 1 to 1.5 GSD (Ground Sample Distance) in planimetry and about 2 to 2.5 GSD in the vertical can be achieved. Remaining systematic
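
    A worked example of the ground sample distance (GSD) behind the quoted accuracy figures, GSD = pixel size x flying height / focal length, evaluated for the nadir-camera flying heights listed above.

    pixel_size_m = 6.4e-6                 # 6.4 micron CCD pixels

    for focal_mm, height_m in [(28, 600), (28, 1200), (50, 750), (50, 1500)]:
        gsd_m = pixel_size_m * height_m / (focal_mm * 1e-3)
        print(f"f = {focal_mm} mm, H = {height_m} m -> GSD = {gsd_m * 100:.1f} cm")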

  2. Comparing the Potential of Multispectral and Hyperspectral Data for Monitoring Oil Spill Impact.

    PubMed

    Khanna, Shruti; Santos, Maria J; Ustin, Susan L; Shapiro, Kristen; Haverkamp, Paul J; Lay, Mui

    2018-02-12

    Oil spills from offshore drilling and coastal refineries often cause significant degradation of coastal environments. Early oil detection may prevent losses and speed up recovery if monitoring of the initial oil extent, oil impact, and recovery are in place. Satellite imagery data can provide a cost-effective alternative to expensive airborne imagery or labor intensive field campaigns for monitoring effects of oil spills on wetlands. However, these satellite data may be restricted in their ability to detect and map ecosystem recovery post-spill given their spectral measurement properties and temporal frequency. In this study, we assessed whether spatial and spectral resolution, and other sensor characteristics, influence the ability to detect and map vegetation stress and mortality due to oil. We compared how well three satellite multispectral sensors (WorldView2, RapidEye, and Landsat ETM+) match the ability of the airborne hyperspectral AVIRIS sensor to map oil-induced vegetation stress, recovery, and mortality after the DeepWater Horizon oil spill in the Gulf of Mexico in 2010. We found that finer spatial resolution (3.5 m) provided better delineation of the oil-impacted wetlands and better detection of vegetation stress along oiled shorelines in saltmarsh wetland ecosystems. As spatial resolution became coarser (3.5 m to 30 m), the ability to accurately detect and map stressed vegetation decreased. Spectral resolution did improve the detection and mapping of oil-impacted wetlands, but less strongly than spatial resolution, suggesting that broad-band data may be sufficient to detect and map oil-impacted wetlands. AVIRIS narrow-band data performed best at detecting vegetation stress, followed by WorldView2, RapidEye and then Landsat 15 m (pan-sharpened) data. Higher quality sensor optics and higher signal-to-noise ratio (SNR) may also improve detection and mapping of oil-impacted wetlands; we found that resampled coarser resolution AVIRIS data with higher SNR performed

  3. Comparing the Potential of Multispectral and Hyperspectral Data for Monitoring Oil Spill Impact

    PubMed Central

    Santos, Maria J.; Ustin, Susan L.; Haverkamp, Paul J.; Lay, Mui

    2018-01-01

    Oil spills from offshore drilling and coastal refineries often cause significant degradation of coastal environments. Early oil detection may prevent losses and speed up recovery if monitoring of the initial oil extent, oil impact, and recovery are in place. Satellite imagery data can provide a cost-effective alternative to expensive airborne imagery or labor intensive field campaigns for monitoring effects of oil spills on wetlands. However, these satellite data may be restricted in their ability to detect and map ecosystem recovery post-spill given their spectral measurement properties and temporal frequency. In this study, we assessed whether spatial and spectral resolution, and other sensor characteristics, influence the ability to detect and map vegetation stress and mortality due to oil. We compared how well three satellite multispectral sensors (WorldView2, RapidEye, and Landsat ETM+) match the ability of the airborne hyperspectral AVIRIS sensor to map oil-induced vegetation stress, recovery, and mortality after the DeepWater Horizon oil spill in the Gulf of Mexico in 2010. We found that finer spatial resolution (3.5 m) provided better delineation of the oil-impacted wetlands and better detection of vegetation stress along oiled shorelines in saltmarsh wetland ecosystems. As spatial resolution became coarser (3.5 m to 30 m), the ability to accurately detect and map stressed vegetation decreased. Spectral resolution did improve the detection and mapping of oil-impacted wetlands, but less strongly than spatial resolution, suggesting that broad-band data may be sufficient to detect and map oil-impacted wetlands. AVIRIS narrow-band data performed best at detecting vegetation stress, followed by WorldView2, RapidEye and then Landsat 15 m (pan-sharpened) data. Higher quality sensor optics and higher signal-to-noise ratio (SNR) may also improve detection and mapping of oil-impacted wetlands; we found that resampled coarser resolution AVIRIS data with higher SNR performed

  4. New Capabilities in the Astrophysics Multispectral Archive Search Engine

    NASA Astrophysics Data System (ADS)

    Cheung, C. Y.; Kelley, S.; Roussopoulos, N.

    The Astrophysics Multispectral Archive Search Engine (AMASE) uses object-oriented database techniques to provide a uniform multi-mission and multi-spectral interface to search for data in the distributed archives. We describe our experience of porting AMASE from Illustra object-relational DBMS to the Informix Universal Data Server. New capabilities and utilities have been developed, including a spatial datablade that supports Nearest Neighbor queries.

  5. Wide field-of-view dual-band multispectral muzzle flash detection

    NASA Astrophysics Data System (ADS)

    Montoya, J.; Melchor, J.; Spiliotis, P.; Taplin, L.

    2013-06-01

    Sensor technologies are undergoing revolutionary advances, as seen in the rapid growth of multispectral methodologies. Increases in spatial, spectral, and temporal resolution, and in breadth of spectral coverage, render feasible sensors that function with unprecedented performance. A system was developed that addresses many of the key hardware requirements for a practical dual-band multispectral acquisition system, including wide field of view and spectral/temporal shift between dual bands. The system was designed using a novel dichroic beam splitter and dual band-pass filter configuration that creates two side-by-side images of a scene on a single sensor. A high-speed CMOS sensor was used to simultaneously capture data from the entire scene in both spectral bands using a short focal-length lens that provided a wide field of view. The beam-splitter components were arranged such that the two images were maintained in optical alignment and real-time intra-band processing could be carried out using only simple arithmetic on the image halves. An experiment was performed to assess the system's limitations in addressing multispectral detection requirements; it characterized the system's low spectral variation across its wide field of view. This paper provides lessons learned on the general limitations of the key hardware components required for multispectral muzzle flash detection, using the system as a hardware example combined with simulated multispectral muzzle flash and background signatures.
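
    As a minimal sketch of the "simple arithmetic on the image halves" idea, the snippet below splits a single sensor frame into its two side-by-side spectral bands and thresholds a normalized band difference. It assumes the halves are already optically aligned (as the beam-splitter arrangement provides); the frame layout, threshold value, and function name are illustrative, not taken from the paper.

```python
import numpy as np

def dual_band_detect(frame: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Return a boolean detection mask from per-pixel arithmetic on the image halves."""
    h, w = frame.shape
    band_a = frame[:, : w // 2].astype(np.float32)   # left half = first spectral band
    band_b = frame[:, w // 2 :].astype(np.float32)   # right half = second spectral band
    # Normalised band difference: a flash is expected to stand out in one band
    # relative to the other, while broadband background largely cancels.
    ndiff = (band_a - band_b) / (band_a + band_b + 1e-6)
    return ndiff > threshold
```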

  6. Spectral correction algorithm for multispectral CdTe x-ray detectors

    NASA Astrophysics Data System (ADS)

    Christensen, Erik D.; Kehres, Jan; Gu, Yun; Feidenhans'l, Robert; Olsen, Ulrik L.

    2017-09-01

    Compared to the dual energy scintillator detectors widely used today, pixelated multispectral X-ray detectors show the potential to improve material identification in various radiography and tomography applications used for industrial and security purposes. However, detector effects, such as charge sharing and photon pileup, distort the measured spectra in high flux pixelated multispectral detectors. These effects significantly reduce the detectors' capabilities to be used for material identification, which requires accurate spectral measurements. We have developed a semi-analytical computational algorithm for multispectral CdTe X-ray detectors which corrects the measured spectra for severe spectral distortions caused by the detector. The algorithm is developed for the Multix ME100 CdTe X-ray detector, but could potentially be adapted for any pixelated multispectral CdTe detector. The calibration of the algorithm is based on simple attenuation measurements of commercially available materials using standard laboratory sources, making the algorithm applicable in any X-ray setup. The validation of the algorithm has been done using experimental data acquired with both standard lab equipment and synchrotron radiation. The experiments show that the algorithm is fast, reliable even at X-ray fluxes up to 5 Mph/s/mm2, and greatly improves the accuracy of the measured X-ray spectra, making the algorithm very useful for both security and industrial applications where multispectral detectors are used.
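
    For orientation only, the sketch below illustrates the generic idea of undoing a known detector response (true spectrum mapped to measured spectrum by a response matrix) with a regularised least-squares inversion. This is not the paper's semi-analytical algorithm, which is instead calibrated from attenuation measurements; the matrix, spectra, and regularisation weight are all assumed placeholders.

```python
import numpy as np

def correct_spectrum(measured: np.ndarray, response: np.ndarray,
                     reg: float = 1e-3) -> np.ndarray:
    """Estimate the incident spectrum given measured counts and a detector response matrix.

    measured: (n_measured_bins,) counts; response: (n_measured_bins, n_true_bins).
    """
    n = response.shape[1]
    # Tikhonov-regularised normal equations to stabilise the inversion.
    a = response.T @ response + reg * np.eye(n)
    b = response.T @ measured
    return np.linalg.solve(a, b)
```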

  7. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Often, ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands, as well as similar agreement between results that employ the different MODIS sensors as a reference.
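
    The band-averaging step described above amounts to weighting the high-spectral-resolution radiance by the broad band's relative spectral response (RSR). A minimal sketch is given below; the arrays are placeholders and uniform spectral sampling of the hyperspectral grid is assumed.

```python
import numpy as np

def band_average(wl_nm: np.ndarray, radiance: np.ndarray,
                 rsr_wl_nm: np.ndarray, rsr: np.ndarray) -> float:
    """RSR-weighted mean radiance simulating one multispectral band.

    wl_nm, radiance: hyperspectral wavelength grid and spectrum (e.g. Hyperion);
    rsr_wl_nm, rsr: the reference band's relative spectral response (e.g. MODIS).
    """
    # Resample the RSR onto the hyperspectral wavelength grid (zero outside its support).
    rsr_on_grid = np.interp(wl_nm, rsr_wl_nm, rsr, left=0.0, right=0.0)
    return float(np.sum(rsr_on_grid * radiance) / np.sum(rsr_on_grid))
```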

  8. Investigation related to multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Erickson, J. D.

    1974-01-01

    A summary of technical progress made during a five year research program directed toward the development of operational information systems based on multispectral sensing and the use of these systems in earth-resource survey applications is presented. Efforts were undertaken during this program to: (1) improve the basic understanding of the many facets of multispectral remote sensing, (2) develop methods for improving the accuracy of information generated by remote sensing systems, (3) improve the efficiency of data processing and information extraction techniques to enhance the cost-effectiveness of remote sensing systems, (4) investigate additional problems having potential remote sensing solutions, and (5) apply the existing and developing technology for specific users and document and transfer that technology to the remote sensing community.

  9. Analyzing RCD30 Oblique Performance in a Production Environment

    NASA Astrophysics Data System (ADS)

    Soler, M. E.; Kornus, W.; Magariños, A.; Pla, M.

    2016-06-01

    In 2014 the Institut Cartogràfic i Geològic de Catalunya (ICGC) decided to incorporate digital oblique imagery in its portfolio in response to the growing demand for this product. The reason can be attributed to its useful applications in a wide variety of fields and, most recently, to an increasing interest in 3D modeling. The selection phase for a digital oblique camera led to the purchase of the Leica RCD30 Oblique system, an 80 MPixel multispectral medium-format camera which consists of one nadir camera and four oblique viewing cameras acquiring images at an off-nadir angle of 35°. The system also has an on-board multi-directional motion compensation system to deliver the highest image quality. The emergence of airborne oblique cameras has run in parallel with the inclusion of computer vision algorithms into traditional photogrammetric workflows. Such algorithms rely on having multiple views of the same area of interest and take advantage of the image redundancy for automatic feature extraction. The multiview capability is highly fostered by the use of oblique systems, which capture different points of view simultaneously with each camera shot. Different companies and NMAs have started pilot projects to assess the capabilities of the 3D mesh that can be obtained using correlation techniques. Beyond a software prototyping phase, and taking into account the currently immature state of several components of the oblique imagery workflow, the ICGC has focused on deploying a real production environment with special interest in matching the performance and quality of the existing production lines based on classical nadir images. This paper introduces different test scenarios and layouts to analyze the impact of different variables on the geometric and radiometric performance. Different variables such as flight altitude, side and forward overlap, and ground control point measurements and location have been considered for the evaluation of aerial triangulation and

  10. Spatial arrangement of color filter array for multispectral image acquisition

    NASA Astrophysics Data System (ADS)

    Shrestha, Raju; Hardeberg, Jon Y.; Khan, Rahat

    2011-03-01

    In the past few years there has been a significant volume of research work carried out in the field of multispectral image acquisition. Most of this work has focused on multispectral image acquisition systems that usually require multiple subsequent shots (e.g. systems based on filter wheels, liquid crystal tunable filters, or active lighting). Recently, an alternative approach for one-shot multispectral image acquisition has been proposed, based on an extension of the color filter array (CFA) standard to produce more than three channels. We can thus introduce the concept of the multispectral color filter array (MCFA). However, this field has not been explored much; in particular, little attention has been given to developing systems that focus on the reconstruction of scene spectral reflectance. In this paper, we have explored how the spatial arrangement of the multispectral color filter array affects acquisition accuracy, constructing MCFAs of different sizes. We have simulated acquisitions of several spectral scenes using different numbers of filters/channels, and compared the results with those obtained by the conventional regular MCFA arrangement, evaluating the precision of the reconstructed scene spectral reflectance in terms of spectral RMS error and colorimetric ΔE*ab color differences. It has been found that the precision and the quality of the reconstructed images are significantly influenced by the spatial arrangement of the MCFA, and that the effect becomes more prominent as the number of channels increases. We believe that MCFA-based systems can be a viable alternative for affordable acquisition of multispectral color images, in particular for applications where spatial resolution can be traded off for spectral resolution. We have shown that the spatial arrangement of the array is an important design issue.
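
    As a small sketch of the evaluation metric mentioned above, the snippet below computes the spectral RMS error between reconstructed and reference reflectance spectra averaged over pixels. The colorimetric ΔE*ab evaluation would additionally require conversion to CIELAB under a chosen illuminant, which is omitted here; array shapes are assumed.

```python
import numpy as np

def spectral_rmse(reconstructed: np.ndarray, reference: np.ndarray) -> float:
    """Spectral RMS error over all pixels and wavelengths.

    reconstructed, reference: arrays of shape (n_pixels, n_wavelengths).
    """
    return float(np.sqrt(np.mean((reconstructed - reference) ** 2)))
```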

  11. Airborne seeker evaluation and test system

    NASA Astrophysics Data System (ADS)

    Jollie, William B.

    1991-08-01

    The Airborne Seeker Evaluation Test System (ASETS) is an airborne platform for development, test, and evaluation of air-to-ground seekers and sensors. ASETS consists of approximately 10,000 pounds of equipment, including sixteen racks of control, display, and recording electronics, and a very large stabilized airborne turret, all carried by a modified C-130A aircraft. The turret measures 50 in. in diameter and extends over 50 in. below the aircraft. Because of the low ground clearance of the C-130, a unique retractor mechanism was designed to raise the turret inside the aircraft for take-offs and landings, and deploy the turret outside the aircraft for testing. The turret has over 7 cubic feet of payload space and can accommodate up to 300 pounds of instrumentation, including missile seekers, thermal imagers, infrared mapping systems, laser systems, millimeter wave radar units, television cameras, and laser rangers. It contains a 5-axis gyro-stabilized gimbal system that will maintain a line of sight in the pitch, roll, and yaw axes to an accuracy better than ±125 µrad. The rack-mounted electronics in the aircraft cargo bay can be interchanged to operate any type of sensor and record the data. Six microcomputer subsystems operate and maintain all of the system components during a test mission. ASETS is capable of flying at altitudes between 200 and 20,000 feet, and at airspeeds ranging from 100 to 250 knots. Mission scenarios can include air-to-surface seeker testing, terrain mapping, surface target measurement, air-to-air testing, atmospheric transmission studies, weather data collection, aircraft or missile tracking, background signature measurements, and surveillance. ASETS is fully developed and available to support test programs.

  12. Lossless, Multi-Spectral Data Compressor for Improved Compression for Pushbroom-Type Instruments

    NASA Technical Reports Server (NTRS)

    Klimesh, Matthew

    2008-01-01

    A low-complexity lossless algorithm for compression of multispectral data has been developed that takes into account the properties of pushbroom-type multispectral imagers in order to make the compression more effective.
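
    To illustrate the general idea behind prediction-based lossless compression for pushbroom imagers (predict each scan line from the previous line in the same band, then entropy-code the small residuals), a generic sketch follows. This is not the NASA algorithm itself, and the function names and the along-track predictor are assumptions for illustration only.

```python
import numpy as np

def line_residuals(band: np.ndarray) -> np.ndarray:
    """band: (n_lines, n_samples) integer image; returns residuals to entropy-code."""
    residuals = band.astype(np.int64).copy()
    residuals[1:] -= band[:-1].astype(np.int64)   # difference from the previous along-track line
    return residuals

def reconstruct(residuals: np.ndarray) -> np.ndarray:
    """Invert line_residuals exactly, demonstrating that the scheme is lossless."""
    return np.cumsum(residuals, axis=0)
```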

  13. Classification of non native tree species in Adda Park (Italy) through multispectral and multitemporal surveys from UAV

    NASA Astrophysics Data System (ADS)

    Pinto, Livio; Sona, Giovanna; Biffi, Andrea; Dosso, Paolo; Passoni, Daniele; Baracani, Matteo

    2014-05-01

    The project ITACA (Innovation, Technologies, Actions to Contrast Alloctonous species) arises from the need to protect natural habitats in parks where native vegetation is threatened by the ever-increasing spread of alloctonous species. Starting from preliminary results obtained in previous experimental studies performed inside Adda Park (Lombardy Region, Northern Italy), the aim of the project is a further development and optimization of some tested techniques and procedures. In the frame of the ITACA project, which involves Politecnico di Milano and some local enterprises, 11 separate areas of the Adda Park, globally covering 50 hectares, will be surveyed with UAV-borne multispectral sensors through different seasons (summer, autumn and spring). The summer and autumn flights have already been realized by the fixed-wing UAV Sensefly SwingletCAM carrying a Canon Ixus 220HS, producing real color images (RGB), and an identical camera, modified to produce false color images (NIR-RG). The 'multisensor-multitemporal' flights have been planned with high longitudinal and transversal overlaps, always in the range 60% to 80%, and a GSD of around 4 cm. Pre-signalized artificial points or natural elements have been surveyed on the ground by GPS RTK Trimble 5700, making use of a network GPS service (NRTK). For each survey two flights have been realized, one with the standard camera and the second one with the NIR-modified one, with the double purpose of: - producing a multispectral orthomosaic, formed by the four coregistered bands NIR-R-G-B; - increasing the coverage of the area, yielding in the block adjustment phase a more robust solution and a higher metric accuracy of digital products (digital orthomosaics). The first two flights have been scheduled taking into account information on the phenology of the species under observation (both native and invasive) given by expert botanists involved in the project. The first set of acquisitions, originally planned for the first half of

  14. Monitoring temporal microstructural variations of skeletal muscle tissues by multispectral Mueller matrix polarimetry

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2017-02-01

    Mueller matrix polarimetry is a powerful tool for detecting microscopic structures and can therefore be used to monitor physiological changes of tissue samples. Meanwhile, spectral features of scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire 2D multispectral backscattering Mueller matrix images of bovine skeletal muscle tissues and analyze their temporal variation behavior using multispectral Mueller matrix parameters. The 2D images of the Mueller matrix elements are reduced to multispectral frequency distribution histograms (mFDHs) to reveal the dominant structural features of the muscle samples more clearly. For quantitative analysis, the multispectral Mueller matrix transformation (MMT) parameters are calculated to characterize the microstructural variations during the rigor mortis and proteolysis processes of the skeletal muscle tissue samples. The experimental results indicate that the multispectral MMT parameters can be used to judge different physiological stages of bovine skeletal muscle tissues over 24 hours, and that, combined with the multispectral technique, Mueller matrix polarimetry and FDH analysis can monitor the microstructural variation features of skeletal muscle samples. These techniques may be used for quick assessment and quantitative monitoring of meat quality in the food industry.
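
    A minimal sketch of the reduction from 2D element images to frequency distribution histograms is given below. The array layout (wavelength, 4x4 element, height, width), the assumption that the elements are normalised to lie in [-1, 1], and the bin count are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

def mueller_fdh(mm_images: np.ndarray, bins: int = 100):
    """Reduce Mueller matrix element images to per-element, per-wavelength histograms.

    mm_images: array of shape (n_wavelengths, 4, 4, height, width), elements in [-1, 1].
    Returns (histograms of shape (n_wavelengths, 4, 4, bins), bin edges).
    """
    n_wl = mm_images.shape[0]
    edges = np.linspace(-1.0, 1.0, bins + 1)
    hists = np.zeros((n_wl, 4, 4, bins))
    for k in range(n_wl):
        for i in range(4):
            for j in range(4):
                hists[k, i, j], _ = np.histogram(mm_images[k, i, j].ravel(),
                                                 bins=edges, density=True)
    return hists, edges
```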

  15. Multispectral Observations of Explosive Gas Emissions from Santiaguito, Guatemala

    NASA Astrophysics Data System (ADS)

    Carn, S. A.; Watson, M.; Thomas, H.; Rodriguez, L. A.; Campion, R.; Prata, F. J.

    2016-12-01

    Santiaguito volcano, Guatemala, has been persistently active for decades, producing frequent explosions from its actively growing lava dome. The repeated release of volcanic gases carries information about conduit processes during the cyclical explosions at Santiaguito, but the composition of the gas phase and the amount of volatiles released in each explosion remain poorly constrained. In addition to its persistent activity, Santiaguito offers an exceptional opportunity to investigate lava dome degassing processes since the upper surface of the active lava dome can be viewed from the summit of neighboring Santa Maria. In January 2016 we conducted multi-spectral observations of Santiaguito's explosive eruption plumes and passive degassing from multiple perspectives as part of the first NSF-sponsored 'Workshop on Volcanoes' instrument deployment. Gas measurements included open-path Fourier-Transform infrared (OP-FTIR) spectroscopy from the Santa Maria summit, coincident with ultraviolet (UV) and infrared (IR) camera and UV Differential Optical Absorption Spectroscopy (DOAS) measurements from the El Mirador site below Santiaguito's active Caliente lava dome. Using the OP-FTIR in passive mode with the Caliente lava dome as the source of IR radiation, we were able to collect IR spectra at high temporal resolution prior to and during two explosions of Santiaguito on 7-8 January, with volcanic SO2 and H2O emissions detected. UV and IR camera data provide constraints on the total SO2 burden in the emissions (and potentially the volcanic ash burden), which, coupled with the FTIR gas ratios, provides new constraints on the mass and composition of volatiles driving explosions at Santiaguito. All gas measurements indicate significant volatile release during explosions with limited degassing during repose periods. In this presentation we will present ongoing analysis of the unique Santiaguito gas dataset including estimation of the total volatile mass released in explosions and an

  16. Perceptual evaluation of color transformed multispectral imagery

    NASA Astrophysics Data System (ADS)

    Toet, Alexander; de Jong, Michael J.; Hogervorst, Maarten A.; Hooge, Ignace T. C.

    2014-04-01

    Color remapping can give multispectral imagery a realistic appearance. We assessed the practical value of this technique in two observer experiments using monochrome intensified (II) and long-wave infrared (IR) imagery, and color daylight (REF) and fused multispectral (CF) imagery. First, we investigated the amount of detail observers perceive in a short timespan. REF and CF imagery yielded the highest precision and recall measures, while II and IR imagery yielded significantly lower values. This suggests that observers have more difficulty in extracting information from monochrome than from color imagery. Next, we measured eye fixations during free image exploration. Although the overall fixation behavior was similar across image modalities, the order in which certain details were fixated varied. Persons and vehicles were typically fixated first in REF, CF, and IR imagery, while they were fixated later in II imagery. In some cases, color remapping II imagery and fusion with IR imagery restored the fixation order of these image details. We conclude that color remapping can yield enhanced scene perception compared to conventional monochrome nighttime imagery, and may be deployed to tune multispectral image representations such that the resulting fixation behavior resembles the fixation behavior corresponding to daylight color imagery.

  17. Image processing of underwater multispectral imagery

    USGS Publications Warehouse

    Zawada, D. G.

    2003-01-01

    Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.
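
    As a hedged approximation of the two-step cell-extraction procedure described above, the sketch below uses a background-adaptive global threshold followed by binary morphology to separate bright cells from a weak background fluorescence field. The filter sizes, the threshold rule, and the function name are illustrative assumptions, not the LUMIS processing parameters.

```python
import numpy as np
from scipy import ndimage

def extract_cells(image: np.ndarray, k: float = 3.0) -> np.ndarray:
    """Return a boolean mask of candidate cell pixels in a fluorescence image."""
    background = ndimage.median_filter(image, size=31)     # smooth background field estimate
    signal = image - background
    mask = signal > (signal.mean() + k * signal.std())     # dynamic global threshold
    mask = ndimage.binary_opening(mask, iterations=2)      # remove isolated speckle
    mask = ndimage.binary_closing(mask, iterations=1)      # fill small gaps within cells
    return mask
```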

  18. Mapping Urban Tree Canopy Cover Using Fused Airborne LIDAR and Satellite Imagery Data

    NASA Astrophysics Data System (ADS)

    Parmehr, Ebadat G.; Amati, Marco; Fraser, Clive S.

    2016-06-01

    Urban green spaces, particularly urban trees, play a key role in enhancing the liveability of cities. The availability of accurate and up-to-date maps of tree canopy cover is important for sustainable development of urban green spaces. LiDAR point clouds are widely used for the mapping of buildings and trees, and several LiDAR point cloud classification techniques have been proposed for automatic mapping. However, the effectiveness of point cloud classification techniques for automated tree extraction from LiDAR data can be impacted to the point of failure by the complexity of tree canopy shapes in urban areas. Multispectral imagery, which provides complementary information to LiDAR data, can improve point cloud classification quality. This paper proposes a reliable method for the extraction of tree canopy cover from fused LiDAR point cloud and multispectral satellite imagery data. The proposed method initially associates each LiDAR point with spectral information from the co-registered satellite imagery data. It calculates the normalised difference vegetation index (NDVI) value for each LiDAR point and corrects tree points which have been misclassified as buildings. Then, region growing of tree points, taking the NDVI value into account, is applied. Finally, the LiDAR points classified as tree points are utilised to generate a canopy cover map. The performance of the proposed tree canopy cover mapping method is experimentally evaluated on a data set of airborne LiDAR and WorldView 2 imagery covering a suburb in Melbourne, Australia.
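
    The per-point NDVI correction described above can be sketched as follows, assuming each LiDAR point has already been assigned the red and near-infrared reflectance of its co-registered image pixel. The class codes and the NDVI cut-off below are illustrative assumptions rather than the paper's values.

```python
import numpy as np

def relabel_misclassified_buildings(labels: np.ndarray, red: np.ndarray, nir: np.ndarray,
                                    ndvi_threshold: float = 0.4,
                                    building_code: int = 6, tree_code: int = 5) -> np.ndarray:
    """Reassign building-labelled LiDAR points with vegetation-like NDVI to the tree class."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    corrected = labels.copy()
    corrected[(labels == building_code) & (ndvi > ndvi_threshold)] = tree_code
    return corrected
```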

  19. Application of EREP, LANDSAT, and aircraft image data to environmental problems related to coal mining

    NASA Technical Reports Server (NTRS)

    Amato, R. V.; Russell, O. R.; Martin, K. R.; Wier, C. E.

    1975-01-01

    Remote sensing techniques were used to study coal mining sites within the Eastern Interior Coal Basin (Indiana, Illinois, and western Kentucky), the Appalachian Coal Basin (Ohio, West Virginia, and Pennsylvania) and the anthracite coal basins of northeastern Pennsylvania. Remote sensor data evaluated during these studies were acquired by LANDSAT, Skylab and both high and low altitude aircraft. Airborne sensors included multispectral scanners, multiband cameras and standard mapping cameras loaded with panchromatic, color and color infrared films. The research conducted in these areas is a useful prerequisite to the development of an operational monitoring system that can be periodically employed to supply state and federal regulatory agencies with supportive data. Further research, however, must be undertaken to systematically examine those mining processes and features that can be monitored cost effectively using remote sensors and to determine what combination of sensors and ground sampling processes provides the optimum combination for an operational system.

  20. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis

    PubMed Central

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn

    2016-01-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743

  1. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.

    PubMed

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn

    2016-12-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.

  2. Modelling forest canopy height by integrating airborne LiDAR samples with satellite Radar and multispectral imagery

    NASA Astrophysics Data System (ADS)

    García, Mariano; Saatchi, Sassan; Ustin, Susan; Balzter, Heiko

    2018-04-01

    Spatially-explicit information on forest structure is paramount to estimating aboveground carbon stocks for designing sustainable forest management strategies and mitigating greenhouse gas emissions from deforestation and forest degradation. LiDAR measurements provide samples of forest structure that must be integrated with satellite imagery to predict and to map landscape scale variations of forest structure. Here we evaluate the capability of existing satellite synthetic aperture radar (SAR) with multispectral data to estimate forest canopy height over five study sites across two biomes in North America, namely temperate broadleaf and mixed forests and temperate coniferous forests. Pixel size affected the modelling results, with an improvement in model performance as pixel resolution coarsened from 25 m to 100 m. Likewise, the sample size was an important factor in the uncertainty of height prediction using the Support Vector Machine modelling approach. A larger sample size yielded better results, but the improvement stabilised when the sample size reached approximately 10% of the study area. We also evaluated the impact of surface moisture (soil and vegetation moisture) on the modelling approach. Whereas surface moisture had a moderate effect on the proportion of the variance explained by the model (up to 14%), its impact was more evident in the bias of the models, with bias reaching values up to 4 m. Averaging the incidence angle corrected radar backscatter coefficient (γ°) reduced the impact of surface moisture on the models and improved their performance at all study sites, with R2 ranging between 0.61 and 0.82, RMSE between 2.02 m and 5.64 m, and bias between 0.02 m and -0.06 m at 100 m spatial resolution. An evaluation of the relative importance of the variables in the model performance showed that for the study sites located within the temperate broadleaf and mixed forests biome, ALOS-PALSAR HV polarised backscatter was the most important
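
    A minimal sketch of the kind of Support Vector Machine regression described above is given below, relating LiDAR-derived canopy height samples to radar backscatter and multispectral predictors. The feature names, kernel settings, and synthetic data are placeholder assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# Hypothetical predictor columns, e.g. gamma0_HV, gamma0_HH, NDVI, elevation.
X = rng.normal(size=(500, 4))
# Synthetic canopy height (m) loosely driven by the first predictor, for illustration.
y = 10.0 + 3.0 * X[:, 0] + rng.normal(scale=2.0, size=500)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X[:400], y[:400])                       # train on the LiDAR-sampled pixels
print("R^2 on held-out samples:", round(model.score(X[400:], y[400:]), 2))
```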

  3. Implementation and evaluation of ILLIAC 4 algorithms for multispectral image processing

    NASA Technical Reports Server (NTRS)

    Swain, P. H.

    1974-01-01

    Data concerning a multidisciplinary and multi-organizational effort to implement multispectral data analysis algorithms on a revolutionary computer, the Illiac 4, are reported. The effectiveness and efficiency of implementing the digital multispectral data analysis techniques for producing useful land use classifications from satellite collected data were demonstrated.

  4. Comparison of Hyperspectral and Multispectral Satellites for Forest Alliance Classification in the San Francisco Bay Area

    NASA Astrophysics Data System (ADS)

    Clark, M. L.

    2016-12-01

    The goal of this study was to assess multi-temporal, Hyperspectral Infrared Imager (HyspIRI) satellite imagery for improved forest class mapping relative to multispectral satellites. The study area was the western San Francisco Bay Area, California and forest alliances (e.g., forest communities defined by dominant or co-dominant trees) were defined using the U.S. National Vegetation Classification System. Simulated 30-m HyspIRI, Landsat 8 and Sentinel-2 imagery were processed from image data acquired by NASA's AVIRIS airborne sensor in year 2015, with summer and multi-temporal (spring, summer, fall) data analyzed separately. HyspIRI reflectance was used to generate a suite of hyperspectral metrics that targeted key spectral features related to chemical and structural properties. The Random Forests classifier was applied to the simulated images and overall accuracies (OA) were compared to those from real Landsat 8 images. For each image group, broad land cover (e.g., Needle-leaf Trees, Broad-leaf Trees, Annual agriculture, Herbaceous, Built-up) was classified first, followed by a finer-detail forest alliance classification for pixels mapped as closed-canopy forest. There were 5 needle-leaf tree alliances and 16 broad-leaf tree alliances, including 7 Quercus (oak) alliance types. No forest alliance classification exceeded 50% OA, indicating that there was broad spectral similarity among alliances, most of which were not spectrally pure but rather a mix of tree species. In general, needle-leaf (Pine, Redwood, Douglas Fir) alliances had better class accuracies than broad-leaf alliances (Oaks, Madrone, Bay Laurel, Buckeye, etc). Multi-temporal data classifications all had 5-6% greater OA than with comparable summer data. For simulated data, HyspIRI metrics had 4-5% greater OA than Landsat 8 and Sentinel-2 multispectral imagery and 3-4% greater OA than HyspIRI reflectance. Finally, HyspIRI metrics had 8% greater OA than real Landsat 8 imagery. In conclusion, forest

  5. Filter Selection for Optimizing the Spectral Sensitivity of Broadband Multispectral Cameras Based on Maximum Linear Independence.

    PubMed

    Li, Sui-Xian

    2018-05-07

    Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal due to the need to predefine the first filter of the selected filter set as the filter with the maximum ℓ₂ norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter was conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set were discovered. Besides minimization of the condition number, the geometric features of the best-performing filter set comprise a distinct transmittance peak along the wavelength axis for the first filter, a generally uniform distribution of the peaks of the filters, and substantial overlaps of the transmittance curves of adjacent filters. Therefore, the best-performing filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting an optimal filter set is recommended, which guarantees a significant enhancement of the performance of such systems. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.
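
    As a hedged sketch of one ingredient mentioned above, the snippet below compares candidate filter sets by the condition number of the resulting spectral-sensitivity matrix (camera sensitivity multiplied by filter transmittance), smaller being better conditioned. It illustrates the selection criterion only, not the paper's full two-step framework; the function names and array shapes are assumptions.

```python
import numpy as np

def condition_number(camera_sensitivity: np.ndarray, filter_set: np.ndarray) -> float:
    """camera_sensitivity: (n_wavelengths,); filter_set: (n_filters, n_wavelengths)."""
    system = filter_set * camera_sensitivity[None, :]   # effective channel sensitivities
    return float(np.linalg.cond(system))

def best_filter_set(candidate_sets, camera_sensitivity):
    """Pick the candidate set whose sensitivity matrix is best conditioned."""
    return min(candidate_sets, key=lambda fs: condition_number(camera_sensitivity, fs))
```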

  6. Multispectral image analysis for object recognition and classification

    NASA Astrophysics Data System (ADS)

    Viau, C. R.; Payeur, P.; Cretu, A.-M.

    2016-05-01

    Computer and machine vision applications are used in numerous fields to analyze static and dynamic imagery in order to assist or automate decision-making processes. Advancements in sensor technologies now make it possible to capture and visualize imagery at various wavelengths (or bands) of the electromagnetic spectrum. Multispectral imaging has countless applications in various fields including (but not limited to) security, defense, space, medical, manufacturing and archeology. The development of advanced algorithms to process and extract salient information from the imagery is a critical component of the overall system performance. The fundamental objective of this research project was to investigate the benefits of combining imagery from the visual and thermal bands of the electromagnetic spectrum to improve the recognition rates and accuracy of commonly found objects in an office setting. A multispectral dataset (visual and thermal) was captured and features from the visual and thermal images were extracted and used to train support vector machine (SVM) classifiers. The SVM's class prediction ability was evaluated separately on the visual, thermal and multispectral testing datasets.
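
    A minimal sketch of the classification step is shown below, assuming per-object feature vectors have already been extracted from the visual and thermal images (e.g. histogram or texture descriptors). Concatenating the two modalities and training a single SVM illustrates the multispectral case; the feature dimensions, class count, and data are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
visual_feats = rng.normal(size=(200, 32))      # hypothetical visual-band descriptors
thermal_feats = rng.normal(size=(200, 16))     # hypothetical thermal-band descriptors
labels = rng.integers(0, 4, size=200)          # four synthetic object classes

fused = np.hstack([visual_feats, thermal_feats])   # multispectral feature vector
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
print("Cross-validated accuracy:", cross_val_score(clf, fused, labels, cv=5).mean())
```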

  7. Novel approach to multispectral image compression on the Internet

    NASA Astrophysics Data System (ADS)

    Zhu, Yanqiu; Jin, Jesse S.

    2000-10-01

    Still image coding techniques such as JPEG have always been applied to intra-plane images, and coding fidelity is commonly used to measure the performance of intra-plane coding methods. In many imaging applications it is increasingly necessary to deal with multi-spectral images, such as color images. In this paper, a novel approach to multi-spectral image compression is proposed that uses transformations among planes for further compression of the spectral planes. Moreover, a mechanism for incorporating the human visual system into the transformation is provided to exploit psychovisual redundancy. The new technique for multi-spectral image compression, which is designed to be compatible with the JPEG standard, is demonstrated by extracting correlation among planes based on the human visual system. The scheme achieves a high degree of compactness in the data representation and compression.

  8. Changes of multispectral soil patterns with increasing crop canopy

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Baumgardner, M. F.

    1972-01-01

    Multispectral data and automatic data processing were used to map surface soil patterns and to follow the changes in multispectral radiation from a field of maize (Zea mays L.) during a period from seeding to maturity. Panchromatic aerial photography was obtained in early May 1970 and multispectral scanner missions were flown on May 6, June 30, August 11 and September 5, 1970 to obtain energy measurements in 13 wavelength bands. The orange portion of the visible spectrum was used in analyzing the May and June data to cluster relative radiance of the soils into eight different radiance levels. The reflective infrared spectral band was used in analyzing the August and September data to cluster maize into different spectral categories. The computer-produced soil patterns had a striking similarity to the soil pattern of the aerial photograph. These patterns became less distinct as the maize canopy increased.

  9. Tissue classification for laparoscopic image understanding based on multispectral texture analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Yan; Wirkert, Sebastian J.; Iszatt, Justin; Kenngott, Hannes; Wagner, Martin; Mayer, Benjamin; Stock, Christian; Clancy, Neil T.; Elson, Daniel S.; Maier-Hein, Lena

    2016-03-01

    Intra-operative tissue classification is one of the prerequisites for providing context-aware visualization in computer-assisted minimally invasive surgeries. As many anatomical structures are difficult to differentiate in conventional RGB medical images, we propose a classification method based on multispectral image patches. In a comprehensive ex vivo study we show (1) that multispectral imaging data is superior to RGB data for organ tissue classification when used in conjunction with widely applied feature descriptors and (2) that combining the tissue texture with the reflectance spectrum improves the classification performance. Multispectral tissue analysis could thus evolve as a key enabling technique in computer-assisted laparoscopy.

  10. Evaluation of Chilling Injury in Mangoes Using Multispectral Imaging.

    PubMed

    Hashim, Norhashila; Onwude, Daniel I; Osman, Muhamad Syafiq

    2018-05-01

    Commodities originating from tropical and subtropical climes are prone to chilling injury (CI). This injury can affect the quality and marketing potential of mango after harvest, and in turn the quality of the produce and subsequent consumer acceptance. In this study, the appearance of CI symptoms in mango was evaluated non-destructively using multispectral imaging. The fruit were stored at 4 °C to induce CI and at 12 °C to preserve the quality of the control samples for 4 days before they were taken out and stored at ambient temperature for 24 hr. Measurements using multispectral imaging and standard reference methods were conducted before and after storage. The performance of multispectral imaging was compared against standard reference properties including moisture content (MC), total soluble solids (TSS) content, firmness, pH, and color. Least squares support vector machine (LS-SVM) analysis combined with principal component analysis (PCA) was used to discriminate CI samples from control samples and from samples measured before storage. The statistical results demonstrated significant changes in the reference quality properties of samples before and after storage. The results also revealed that the multispectral parameters have a strong correlation with the reference parameters L*, a*, TSS, and MC. The MC and L* were found to be the best reference parameters for identifying the severity of CI in mangoes. PCA and LS-SVM analysis indicated that the fruit were successfully classified into their categories, that is, before storage, control, and CI. This indicates that the multispectral imaging technique is feasible for detecting CI in mangoes during postharvest storage and processing. This paper demonstrates a fast, easy, and accurate method of identifying the effect of cold storage on mango nondestructively. The method presented in this paper can be used industrially to efficiently differentiate fruits from each other after low-temperature storage.

  11. Multispectral Image Compression Based on DSC Combined with CCSDS-IDC

    PubMed Central

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoders require low complexity, high robustness, and high performance because they usually operate on a satellite, where resources such as power, memory, and processing capacity are limited. For multispectral images, compression algorithms based on 3D transforms (like the 3D DWT and 3D DCT) are too complex to be implemented in space missions. In this paper, we propose a compression algorithm based on distributed source coding (DSC) combined with the image data compression (IDC) approach recommended by CCSDS for multispectral images, which has low complexity, high robustness, and high performance. First, each band is sparsely represented by the DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by a bit plane encoder (BPE). Finally, the BPE is merged with the Slepian-Wolf (SW) DSC strategy based on QC-LDPC codes in a deeply coupled way to remove the residual redundancy between adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than traditional compression approaches. PMID:25110741

  12. Multispectral image compression based on DSC combined with CCSDS-IDC.

    PubMed

    Li, Jin; Xing, Fei; Sun, Ting; You, Zheng

    2014-01-01

    Remote sensing multispectral image compression encoders require low complexity, high robustness, and high performance because they usually operate on a satellite, where resources such as power, memory, and processing capacity are limited. For multispectral images, compression algorithms based on 3D transforms (like the 3D DWT and 3D DCT) are too complex to be implemented in space missions. In this paper, we propose a compression algorithm based on distributed source coding (DSC) combined with the image data compression (IDC) approach recommended by CCSDS for multispectral images, which has low complexity, high robustness, and high performance. First, each band is sparsely represented by the DWT to obtain wavelet coefficients. Then, the wavelet coefficients are encoded by a bit plane encoder (BPE). Finally, the BPE is merged with the Slepian-Wolf (SW) DSC strategy based on QC-LDPC codes in a deeply coupled way to remove the residual redundancy between adjacent bands. A series of multispectral images is used to test our algorithm. Experimental results show that the proposed DSC combined with CCSDS-IDC (DSC-CCSDS)-based algorithm has better compression performance than traditional compression approaches.

  13. Mapping playa evaporite minerals and associated sediments in Death Valley, California, with multispectral thermal infrared images

    USGS Publications Warehouse

    Crowley, J.K.; Hook, S.J.

    1996-01-01

    Efflorescent salt crusts and associated sediments in Death Valley, California, were studied with remote-sensing data acquired by the NASA thermal infrared multispectral scanner (TIMS). Nine spectral classes that represent a variety of surface materials were distinguished, including several classes that reflect important aspects of the playa groundwater chemistry and hydrology. Evaporite crusts containing abundant thenardite (sodium sulfate) were mapped along the northern and eastern margins of the Cottonball Basin, areas where the inflow waters are rich in sodium. Gypsum (calcium sulfate) crusts were more common in the Badwater Basin, particularly near springs associated with calcic groundwaters along the western basin margin. Evaporite-rich crusts generally marked areas where groundwater is periodically near the surface and thus able to replenish the crusts through capillary evaporation. Detrital silicate minerals were prevalent in other parts of the salt pan where shallow groundwater does not affect the surface composition. The surface features in Death Valley change in response to climatic variations on several different timescales. For example, salt crusts on low-lying mudflats form and redissolve during seasonal-to-interannual cycles of wetting and desiccation. In contrast, recent flooding and erosion of rough-salt surfaces in Death Valley probably reflect increased regional precipitation spanning several decades. Remote-sensing observations of playas can provide a means for monitoring changes in evaporite facies and for better understanding the associated climatic processes. At present, such studies are limited by the availability of suitable airborne scanner data. However, with the launch of the Earth Observing System (EOS) AM-1 Platform in 1998, multispectral visible/near-infrared and thermal infrared remote-sensing data will become globally available. Copyright 1996 by the American Geophysical Union.

  14. Multispectral fluorescence imaging techniques for nondestructive food safety inspection

    NASA Astrophysics Data System (ADS)

    Kim, Moon S.; Lefcourt, Alan M.; Chen, Yud-Ren

    2004-03-01

    The use of spectral sensing has gained acceptance as a rapid means for nondestructive inspection of postharvest food produce. Current technologies generally use color or a single wavelength camera technology. The applicability and sensitivity of these techniques can be expanded through the use of multiple wavelengths. Reflectance in the Vis/NIR is the prevalent spectral technique. Fluorescence, compared to reflectance, is regarded as a more sensitive technique due to its dynamic responses to subtle changes in biological entities. Our laboratory has been exploring fluorescence as a potential means for detection of quality and wholesomeness of food products. Applications of fluorescence sensing require an understanding of the spectral characteristics emanating from constituents and potential contaminants. A number of factors affecting fluorescence emission characteristics are discussed. Because of relatively low fluorescence quantum yield from biological samples, a system with a powerful pulse light source such as a laser coupled with a gated detection device is used to harvest fluorescence, in the presence of ambient light. Several fluorescence sensor platforms developed in our laboratory, including hyperspectral imaging, and laser-induced fluorescence (LIF) and steady-state fluorescence imaging systems with multispectral capabilities are presented. We demonstrate the potential uses of recently developed fluorescence imaging platforms in food safety inspection of apples contaminated with animal feces.

  15. Airborne wireless communication systems, airborne communication methods, and communication methods

    DOEpatents

    Deaton, Juan D [Menan, ID; Schmitt, Michael J [Idaho Falls, ID; Jones, Warren F [Idaho Falls, ID

    2011-12-13

    An airborne wireless communication system includes circuitry configured to access information describing a configuration of a terrestrial wireless communication base station that has become disabled. The terrestrial base station is configured to implement wireless communication between wireless devices located within a geographical area and a network when the terrestrial base station is not disabled. The circuitry is further configured, based on the information, to configure the airborne station to have the configuration of the terrestrial base station. An airborne communication method includes answering a 911 call from a terrestrial cellular wireless phone using an airborne wireless communication system.

  16. Ensemble classification of individual Pinus crowns from multispectral satellite imagery and airborne LiDAR

    NASA Astrophysics Data System (ADS)

    Kukunda, Collins B.; Duque-Lazo, Joaquín; González-Ferreiro, Eduardo; Thaden, Hauke; Kleinn, Christoph

    2018-03-01

    Distinguishing tree species is relevant in many contexts of remote sensing assisted forest inventory. Accurate tree species maps support management and conservation planning, pest and disease control and biomass estimation. This study evaluated the performance of applying ensemble techniques with the goal of automatically distinguishing Pinus sylvestris L. and Pinus uncinata Mill. Ex Mirb within a 1.3 km2 mountainous area in Barcelonnette (France). Three modelling schemes were examined, based on: (1) high-density LiDAR data (160 returns m-2), (2) Worldview-2 multispectral imagery, and (3) Worldview-2 and LiDAR in combination. Variables related to the crown structure and height of individual trees were extracted from the normalized LiDAR point cloud at individual-tree level, after performing individual tree crown (ITC) delineation. Vegetation indices and the Haralick texture indices were derived from Worldview-2 images and served as independent spectral variables. Selection of the best predictor subset was done after a comparison of three variable selection procedures: (1) Random Forests with cross validation (AUCRFcv), (2) Akaike Information Criterion (AIC) and (3) Bayesian Information Criterion (BIC). To classify the species, 9 regression techniques were combined using ensemble models. Predictions were evaluated using cross validation and an independent dataset. Integration of datasets and models improved individual tree species classification (True Skills Statistic, TSS; from 0.67 to 0.81) over individual techniques and maintained strong predictive power (Relative Operating Characteristic, ROC = 0.91). Assemblage of regression models and integration of the datasets provided more reliable species distribution maps and associated tree-scale mapping uncertainties. Our study highlights the potential of model and data assemblage at improving species classifications needed in present-day forest planning and management.

  17. Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion

    NASA Astrophysics Data System (ADS)

    Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei

    2018-06-01

    Infrared and visible light image fusion has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion technologies require image registration before fusion because they use two separate cameras; however, the practical performance of such registration techniques still needs improvement. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion in which, by using a beam splitter prism, the coaxial light incident from the same lens is projected onto an infrared charge coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied together with the signal acquisition and fusion process. A simulation experiment, which involves the entire process of the optical system, signal acquisition, and signal fusion, is constructed based on an imaging effect model. Additionally, a quality evaluation index is adopted to analyze the simulation results. The experimental results demonstrate that the proposed sensor device is effective and feasible.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leary, T.J.; Lamb, A.

    The Department of Energy's Office of Arms Control and Non-Proliferation (NN-20) has developed a suite of airborne remote sensing systems that simultaneously collect coincident data from a US Navy P-3 aircraft. The primary objective of the Airborne Multisensor Pod System (AMPS) Program is "to collect multisensor data that can be used for data research, both to reduce interpretation problems associated with data overload and to develop information products more complete than can be obtained from any single sensor." The sensors are housed in wing-mounted pods and include: a Ku-Band Synthetic Aperture Radar; a CASI Hyperspectral Imager; a Daedalus 3600 Airborne Multispectral Scanner; a Wild Heerbrugg RC-30 motion compensated large format camera; various high resolution, light intensified and thermal video cameras; and several experimental sensors (e.g., the Portable Hyperspectral Imager for Low-Light Spectroscopy (PHILLS)). Over the past year or so, the Coastal Marine Resource Assessment (CAMRA) group at the Florida Department of Environmental Protection's Marine Research Institute (FMRI) has been working with the Department of Energy through the Naval Research Laboratory to develop applications and products from existing data. Considerable effort has been spent identifying image formats and integration parameters. 2 refs., 3 figs., 2 tabs.

  19. Imaging Spectroscopy Enables Novel Applications and Continuity with the Landsat Record to Sustain Legacy Applications: An Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Landsat 8 OLI Case Study

    NASA Astrophysics Data System (ADS)

    Stavros, E. N.; Seidel, F.; Cable, M. L.; Green, R. O.; Freeman, A.

    2017-12-01

    While imaging spectrometers offer additional information that provides value-added products for otherwise underserved applications, there is a need to demonstrate their ability to augment the multi-spectral (e.g., Landsat) optical record by both providing more frequent temporal revisit and lengthening the existing record. Here we test the hypothesis that imaging spectroscopic optical data are compatible with multi-spectral data to within ±5% radiometric accuracy, as desired to continue the long-term Landsat data record. We use an Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) flight coincident with an overpass of the Operational Land Imager (OLI) on Landsat 8 to document a procedure for simulating OLI multi-spectral bands from AVIRIS, evaluate factors influencing the observed radiance, and assess AVIRIS radiometric accuracy relative to OLI. The procedure for simulating OLI data includes spectral convolution, correction for atmospheric effects introduced by the different sensor altitudes and viewing geometries, and spatial resampling. After accounting for these influences, we attribute the remaining differences between the simulated and the real OLI data to differences in sensor calibration, surface bi-directional reflectance arising from the different viewing geometries, and spatial sampling. The median radiometric percent difference for each band in the data used ranges from 0.6% to 8.3%. After bias correction to minimize potential calibration discrepancies, we find no more than 1.2% radiometric percent difference for any OLI band. This analysis therefore demonstrates that imaging spectrometer data can not only address novel applications, but also contribute to the Landsat-type or other multi-spectral data records to sustain legacy applications.
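
    The band-simulation step described above can be sketched as a response-weighted average of narrow hyperspectral channels. In the snippet below the wavelength grid, the Gaussian relative spectral response, and all radiance values are stand-ins for illustration; they are not the actual AVIRIS or OLI characteristics.

```python
# Minimal sketch of spectral convolution: weight narrow hyperspectral channels
# by an assumed relative spectral response (RSR) to synthesize one broad
# multispectral band, then compute a radiometric percent difference.
import numpy as np

wl = np.arange(400, 2501, 10.0)                 # hyperspectral band centers (nm)
hyper_rad = np.random.rand(wl.size) * 100.0     # stand-in narrowband radiances

def simulate_band(wavelengths, radiance, center, fwhm):
    """Simulate a broad band as an RSR-weighted mean of narrow channels."""
    sigma = fwhm / 2.3548                        # Gaussian approximation of the RSR
    rsr = np.exp(-0.5 * ((wavelengths - center) / sigma) ** 2)
    return np.sum(rsr * radiance) / np.sum(rsr)

sim_red = simulate_band(wl, hyper_rad, center=655.0, fwhm=40.0)
obs_red = 52.0                                   # stand-in observed multispectral radiance
pct_diff = 100.0 * (sim_red - obs_red) / obs_red
print(f"simulated {sim_red:.1f}, percent difference {pct_diff:+.1f}%")
```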

  20. Real Time Data/Video/Voice Uplink and Downlink for Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Harper, Doyal A.

    1997-01-01

    LFS was an educational outreach adventure which brought the excitement of astronomical exploration on NASA's Kuiper Airborne Observatory (KAO) to a nationwide audience of children and parents through live, interactive television, broadcast from the KAO at an altitude of 41,000 feet during an actual scientific observing mission. The project encompassed three KAO flights during the fall of 1995, including a short practice mission, a daytime observing flight from Moffett Field, California, to Houston, Texas, and a nighttime mission from Houston back to Moffett Field. The University of Chicago infrared research team participated in planning the program, developing auxiliary materials including background information and lesson plans, developing software which allowed students on the ground to control the telescope and on-board cameras via the Internet from the Adler Planetarium in Chicago, and acting as on-camera correspondents to explain and answer questions about the scientific research conducted during the flights.

  1. Fourier Spectral Filter Array for Optimal Multispectral Imaging.

    PubMed

    Jia, Jie; Barnard, Kenneth J; Hirakawa, Keigo

    2016-04-01

    Limitations of existing multispectral imaging modalities include speed, cost, range, spatial resolution, and application-specific system designs that lack the versatility of hyperspectral imaging modalities. In this paper, we propose a novel general-purpose single-shot passive multispectral imaging modality. Central to this design is a new type of spectral filter array (SFA) based not on the notion of spatially multiplexing narrowband filters, but instead aimed at enabling single-shot Fourier transform spectroscopy. We refer to this new SFA pattern as the Fourier SFA, and we prove that this design solves the problem of optimally sampling the hyperspectral image data.

  2. Characterizing the solar reflection from wildfire smoke plumes using airborne multiangle measurements

    NASA Astrophysics Data System (ADS)

    Gatebe, C. K.; Varnai, T.; Gautam, R.; Poudyal, R.; Singh, M. K.

    2016-12-01

    To help better understand forest fire smoke plumes, this study examines sunlight reflected from plumes that were observed over Canada during the ARCTAS campaign in summer 2008. In particular, the study analyzes multiangle and multispectral measurements of smoke scattering by the airborne Cloud Absorption Radiometer (CAR). In combination with other in-situ and remote sensing information and radiation modeling, CAR data is used for characterizing the radiative properties and radiative impact of smoke particles—which inherently depend on smoke particle properties that influence air quality. In addition to estimating the amount of reflected and absorbed sunlight, the work includes using CAR data to create spectral and broadband top-of-atmosphere angular distribution models (ADMs) of solar radiation reflected by smoke plumes, and examining the sensitivity of such angular models to scene parameters. Overall, the results help better understand the radiative properties and radiative effects of smoke particles, and are anticipated to help better interpret satellite data on smoke plumes.

  3. A COST EFFECTIVE MULTI-SPECTRAL SCANNER FOR NATURAL GAS DETECTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yudaya Sivathanu; Jongmook Lim; Vinoo Narayanan

    The objective of this project is to design, fabricate and field demonstrate a cost-effective, multi-spectral scanner for natural gas leak detection in transmission and distribution pipelines. During the first six months of the project, the design for a laboratory version of the multispectral scanner was completed. The optical, mechanical, and electronic design for the scanner was completed. The optical design was analyzed using Zemax optical design software and found to provide sufficiently resolved performance for the scanner. The electronic design was evaluated using a breadboard and very high signal-to-noise ratios were obtained. Fabrication of a laboratory version of the multi-spectral scanner is currently in progress. A technology status report and a research management plan were also completed during the same period.

  4. Interactive color display for multispectral imagery using correlation clustering

    NASA Technical Reports Server (NTRS)

    Haskell, R. E. (Inventor)

    1979-01-01

    A method for processing multispectral data is provided, which permits an operator to make parameter level changes during the processing of the data. The system is directed to production of a color classification map on a video display in which a given color represents a localized region in multispectral feature space. Interactive controls permit an operator to alter the size and change the location of these regions, permitting the classification of such region to be changed from a broad to a narrow classification.

  5. Digital computer processing of peach orchard multispectral aerial photography

    NASA Technical Reports Server (NTRS)

    Atkinson, R. J.

    1976-01-01

    Several methods of analysis using digital computers, applicable to digitized multispectral aerial photography, are described with particular application to peach orchard test sites. This effort was stimulated by the recent premature death of peach trees in the Southeastern United States. The techniques discussed are: (1) correction of intensity variations by digital filtering, (2) automatic detection and enumeration of trees in five size categories, (3) determination of unhealthy foliage by infrared reflectances, and (4) four-band multispectral classification into healthy and declining categories.

  6. Positioning in Time and Space - Cost-Effective Exterior Orientation for Airborne Archaeological Photographs

    NASA Astrophysics Data System (ADS)

    Verhoeven, G.; Wieser, M.; Briese, C.; Doneus, M.

    2013-07-01

    Since manned, airborne aerial reconnaissance for archaeological purposes is often characterised by more-or-less random photographing of archaeological features on the Earth, the exact position and orientation of the camera during image acquisition becomes very important in an effective inventorying and interpretation workflow for these aerial photographs. Although positioning is generally achieved by simultaneously logging the flight path or directly recording the camera's position with a GNSS receiver, this approach does not allow recording of the necessary roll, pitch and yaw angles of the camera. The latter are essential elements of the complete exterior orientation of the camera, which - together with the inner orientation of the camera - accurately defines the portion of the Earth recorded in the photograph. This paper proposes a cost-effective, accurate and precise GNSS/IMU solution (image position: 2.5 m and orientation: 2°, both at 1σ) to record all essential exterior orientation parameters for the direct georeferencing of the images. After an introduction to the hardware used, this paper presents the software developed to record and estimate these parameters. Furthermore, this direct georeferencing information can be embedded into the image's metadata. Subsequently, the first results of the estimation of the mounting calibration (i.e. the misalignment between the camera and GNSS/IMU coordinate frames) are provided. Furthermore, a comparison with a dedicated commercial photographic GNSS/IMU solution demonstrates the superiority of the introduced solution. Finally, an outlook on future tests and improvements finalises this article.

  7. Efficient single-pixel multispectral imaging via non-mechanical spatio-spectral modulation.

    PubMed

    Li, Ziwei; Suo, Jinli; Hu, Xuemei; Deng, Chao; Fan, Jingtao; Dai, Qionghai

    2017-01-27

    Combining spectral imaging with compressive sensing (CS) enables efficient data acquisition by fully utilizing the intrinsic redundancies in natural images. Current compressive multispectral imagers, which are mostly based on array sensors (e.g., CCD or CMOS), suffer from limited spectral range and relatively low photon efficiency. To address these issues, this paper reports a multispectral imaging scheme with a single-pixel detector. Inspired by the spatial resolution redundancy of current spatial light modulators (SLMs) relative to the target reconstruction, we design an all-optical spectral splitting device to spatially split the light emitted from the object into several counterparts with different spectra. The separated spectral channels are spatially modulated simultaneously with individual codes by an SLM. This no-moving-part modulation ensures a stable and fast system, and the spatial multiplexing ensures efficient acquisition. A proof-of-concept setup is built and validated for 8-channel multispectral imaging within the 420~720 nm wavelength range on both macro and micro objects, showing the potential for an efficient multispectral imager in macroscopic and biomedical applications.
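
    A minimal sketch of the single-pixel measurement model such schemes rely on is given below: each spectral channel is modulated by binary SLM patterns, a single detector records one inner product per pattern, and the channel image is recovered by sparse regression. The pattern count, image size, and the Lasso-based recovery are toy assumptions, not the paper's reconstruction pipeline.

```python
# Minimal sketch of single-pixel compressive acquisition and sparse recovery
# for one spectral channel.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
side, n_patterns = 8, 40                      # 64-pixel channel image, 40 measurements
x_true = np.zeros(side * side)
x_true[rng.choice(side * side, size=6, replace=False)] = 1.0   # sparse toy scene

A = rng.integers(0, 2, size=(n_patterns, side * side)).astype(float)  # SLM masks
y = A @ x_true                                                        # single-pixel readings

x_hat = Lasso(alpha=0.01, max_iter=10000).fit(A, y).coef_            # sparse recovery
print("reconstruction error:", np.round(np.linalg.norm(x_hat - x_true), 3))
```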

  8. Discriminating plant species across California's diverse ecosystems using airborne VSWIR and TIR imagery

    NASA Astrophysics Data System (ADS)

    Meerdink, S.; Roberts, D. A.; Roth, K. L.

    2015-12-01

    Accurate knowledge of the spatial distribution of plant species is required for many research and management agendas that track ecosystem health. Because of this, there is continuous development of research focused on remotely-sensed species classifications for many diverse ecosystems. While plant species have been mapped using airborne imaging spectroscopy, the geographic extent has been limited due to data availability and spectrally similar species continue to be difficult to separate. The proposed Hyperspectral Infrared Imager (HyspIRI) space-borne mission, which includes a visible near infrared/shortwave infrared (VSWIR) imaging spectrometer and thermal infrared (TIR) multi-spectral imager, would present an opportunity to improve species discrimination over a much broader scale. Here we evaluate: 1) the capability of VSWIR and/or TIR spectra to discriminate plant species; 2) the accuracy of species classifications within an ecosystem; and 3) the potential for discriminating among species across a range of ecosystems. Simulated HyspIRI imagery was acquired in spring/summer of 2013 spanning from Santa Barbara to Bakersfield, CA with the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the MODIS/ASTER Airborne Simulator (MASTER) instruments. Three spectral libraries were created from these images: AVIRIS (224 bands from 0.4 - 2.5 µm), MASTER (8 bands from 7.5 - 12 µm), and AVIRIS + MASTER. We used canonical discriminant analysis (CDA) as a dimension reduction technique and then classified plant species using linear discriminant analysis (LDA). Our results show the inclusion of TIR spectra improved species discrimination, but only for plant species with emissivities departing from that of a gray body. Ecosystems with species that have high spectral contrast had higher classification accuracies. Mapping plant species across all ecosystems resulted in a classification with lower accuracies than a single ecosystem due to the complex nature of

  9. A multilevel multispectral data set analysis in the visible and infrared wavelength regions. [for land use remote sensing

    NASA Technical Reports Server (NTRS)

    Biehl, L. L.; Silva, L. F.

    1975-01-01

    Skylab multispectral scanner data, digitized Skylab color infrared (IR) photography, digitized Skylab black and white multiband photography, and Earth Resources Technology Satellite (ERTS) multispectral scanner data collected within a 24-hr time period over an area in south-central Indiana near Bloomington on June 9 and 10, 1973, were compared in a machine-aided land use analysis of the area. The overall classification performance results, obtained with nine land use classes, were 87% correct classification using the 'best' 4 channels of the Skylab multispectral scanner, 80% for the channels on the Skylab multispectral scanner which are spectrally comparable to the ERTS multispectral scanner, 88% for the ERTS multispectral scanner, 83% for the digitized color IR photography, and 76% for the digitized black and white multiband photography. The results indicate that the Skylab multispectral scanner may yield even higher classification accuracies when a noise-filtered multispectral scanner data set becomes available in the near future.

  10. Optical design of common aperture, common focal plane, multispectral optics for military applications

    NASA Astrophysics Data System (ADS)

    Thompson, Nicholas Allan

    2013-06-01

    With recent developments in multispectral detector technology, the interest in common aperture, common focal plane multispectral imaging systems is increasing. Such systems are particularly desirable for military applications, where increased levels of target discrimination and identification are required in cost-effective, rugged, lightweight systems. During the optical design of dual waveband or multispectral systems, the options for material selection are limited. This selection becomes even more restrictive for military applications, where material resilience, thermal properties, and color correction must be considered. We discuss the design challenges that lightweight multispectral common aperture systems present, along with some potential design solutions. Consideration is given to material selection for optimum color correction, as well as material resilience and thermal correction. This discussion is supported using design examples currently in development at Qioptiq.

  11. Atmospheric transformation of multispectral remote sensor data. [Great Lakes

    NASA Technical Reports Server (NTRS)

    Turner, R. E. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. The effects of earth's atmosphere were accounted for, and a simple algorithm, based upon a radiative transfer model, was developed to determine the radiance at earth's surface free of atmospheric effects. Actual multispectral remote sensor data for Lake Erie and associated optical thickness data were used to demonstrate the effectiveness of the atmospheric transformation algorithm. The basic transformation was general in nature and could be applied to the large scale processing of multispectral aircraft or satellite remote sensor data.
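
    A minimal sketch of this kind of atmospheric transformation is shown below: invert the simple radiative-transfer relation, observed radiance = path radiance + transmittance × surface radiance, band by band. The path-radiance and transmittance values are illustrative, not derived from the Lake Erie data.

```python
# Minimal sketch: recover surface-leaving radiance per band from at-sensor
# radiance, given an assumed path radiance and atmospheric transmittance.
import numpy as np

observed = np.array([52.0, 44.0, 31.0, 18.0])      # at-sensor radiance per band
path_radiance = np.array([9.0, 6.5, 3.8, 1.9])     # atmospheric path term
transmittance = np.array([0.72, 0.78, 0.84, 0.90]) # band transmittance

surface_radiance = (observed - path_radiance) / transmittance
print(np.round(surface_radiance, 1))
```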

  12. Fluorescence multispectral imaging-based diagnostic system for atherosclerosis.

    PubMed

    Ho, Cassandra Su Lyn; Horiuchi, Toshikatsu; Taniguchi, Hiroaki; Umetsu, Araya; Hagisawa, Kohsuke; Iwaya, Keiichi; Nakai, Kanji; Azmi, Amalina; Zulaziz, Natasha; Azhim, Azran; Shinomiya, Nariyoshi; Morimoto, Yuji

    2016-08-20

    The composition of atherosclerotic arterial walls is rich in lipids such as cholesterol, unlike that of normal arterial walls. In this study, we aimed to exploit this difference to diagnose atherosclerosis via multispectral fluorescence imaging, which allows identification of fluorescence originating from substances in the arterial wall. The inner surface of extracted arteries (rabbit abdominal aorta, human coronary artery) was illuminated by 405 nm excitation light and multispectral fluorescence images were obtained. Pathological examination of the human coronary artery samples was carried out and the thickness of the arteries was calculated by measuring combined media and intima thickness. The fluorescence spectra in atherosclerotic sites were different from those in normal sites. Multiple regions of interest (ROI) were selected within each sample and a ratio between two fluorescence intensity differences (where each intensity difference is calculated between an identifier wavelength and a base wavelength) was determined for each ROI, allowing discrimination of atherosclerotic sites. Fluorescence intensity and artery thickness were found to be significantly correlated. These results indicate that multispectral fluorescence imaging provides qualitative and quantitative evaluations of atherosclerosis and is therefore a viable method of diagnosing the disease.
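
    The ROI discrimination metric described above, a ratio of two (identifier minus base wavelength) intensity differences, can be sketched as follows. The wavelength choices, the toy spectrum, and the decision threshold are illustrative assumptions only.

```python
# Minimal sketch of the ratio-of-differences metric computed per ROI.
import numpy as np

def roi_ratio(spectrum, wavelengths, ident1, ident2, base):
    """Ratio of two (identifier - base) fluorescence intensity differences."""
    I = lambda w: spectrum[np.argmin(np.abs(wavelengths - w))]
    return (I(ident1) - I(base)) / (I(ident2) - I(base))

wl = np.linspace(450, 700, 251)                            # emission wavelengths (nm)
roi_spectrum = 100 + 20 * np.exp(-((wl - 520) / 30) ** 2)  # toy ROI fluorescence spectrum
r = roi_ratio(roi_spectrum, wl, ident1=520, ident2=560, base=630)
print("atherosclerotic" if r > 1.5 else "normal", f"(ratio = {r:.2f})")
```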

  13. Systematic variations in multi-spectral lidar representations of canopy height profiles and gap probability

    NASA Astrophysics Data System (ADS)

    Chasmer, L.; Hopkinson, C.; Gynan, C.; Mahoney, C.; Sitar, M.

    2015-12-01

    Airborne and terrestrial lidar are increasingly used in forest attribute modeling for carbon, ecosystem and resource monitoring. The near infra-red wavelength at 1064 nm has been utilised most in airborne applications due to, for example, diode manufacture costs, surface reflectance and eye safety. Foliage reflects well at 1064 nm and most of the literature on airborne lidar forest structure is based on data from this wavelength. However, lidar systems also operate at wavelengths further from the visible spectrum (e.g. 1550 nm) for eye safety reasons. This corresponds to a water absorption band and can be sensitive to attenuation if surfaces contain moisture. Alternatively, some systems operate in the visible range (e.g. 532 nm) for specialised applications requiring simultaneous mapping of terrestrial and bathymetric surfaces. All these wavelengths provide analogous 3D canopy structure reconstructions and thus offer the potential to be combined for spatial comparisons or temporal monitoring. However, a systematic comparison of wavelength-dependent foliage profile and gap probability (index of transmittance) is needed. Here we report on two multispectral lidar missions carried out in 2013 and 2015 over conifer, deciduous and mixed stands in Ontario, Canada. The first used separate lidar sensors acquiring comparable data at three wavelengths, while the second used a single sensor with 3 integrated laser systems. In both cases, the wavelengths sampled were 532 nm, 1064 nm and 1550 nm. The experiment revealed significant differences in proportions of returns at ground level, the vertical foliage distribution and gap probability across wavelengths. Canopy attenuation was greatest at 532 nm due to photosynthetic plant tissue absorption. Relative to 1064 nm, foliage was systematically undersampled at the 10% to 60% height percentiles at both 1550 nm and 532 nm (this was confirmed with coincident terrestrial lidar data). When using all returns to calculate gap probability, all

  14. CAROLS: a new airborne L-band radiometer for ocean surface and land observations.

    PubMed

    Zribi, Mehrez; Pardé, Mickael; Boutin, Jacqueline; Fanise, Pascal; Hauser, Daniele; Dechambre, Monique; Kerr, Yann; Leduc-Leballeur, Marion; Reverdin, Gilles; Skou, Niels; Søbjærg, Sten; Albergel, Clement; Calvet, Jean Christophe; Wigneron, Jean Pierre; Lopez-Baeza, Ernesto; Rius, Antonio; Tenerelli, Joseph

    2011-01-01

    The "Cooperative Airborne Radiometer for Ocean and Land Studies" (CAROLS) L-Band radiometer was designed and built as a copy of the EMIRAD II radiometer constructed by the Technical University of Denmark team. It is a fully polarimetric and direct sampling correlation radiometer. It is installed on board a dedicated French ATR42 research aircraft, in conjunction with other airborne instruments (C-Band scatterometer-STORM, the GOLD-RTR GPS system, the infrared CIMEL radiometer and a visible wavelength camera). Following initial laboratory qualifications, three airborne campaigns involving 21 flights were carried out over South West France, the Valencia site and the Bay of Biscay (Atlantic Ocean) in 2007, 2008 and 2009, in coordination with in situ field campaigns. In order to validate the CAROLS data, various aircraft flight patterns and maneuvers were implemented, including straight horizontal flights, circular flights, wing and nose wags over the ocean. Analysis of the first two campaigns in 2007 and 2008 leads us to improve the CAROLS radiometer regarding isolation between channels and filter bandwidth. After implementation of these improvements, results show that the instrument is conforming to specification and is a useful tool for Soil Moisture and Ocean Salinity (SMOS) satellite validation as well as for specific studies on surface soil moisture or ocean salinity.

  15. Multispectral Imaging in Cultural Heritage Conservation

    NASA Astrophysics Data System (ADS)

    Del Pozo, S.; Rodríguez-Gonzálvez, P.; Sánchez-Aparicio, L. J.; Muñoz-Nieto, A.; Hernández-López, D.; Felipe-García, B.; González-Aguilera, D.

    2017-08-01

    This paper sums up the main contributions of the thesis entitled "Multispectral imaging for the analysis of materials and pathologies in civil engineering, constructions and natural spaces", awarded by CIPA-ICOMOS for its connection with the preservation of Cultural Heritage. The thesis is framed within close-range remote sensing approaches based on the fusion of sensors operating in the optical domain (visible to shortwave infrared spectrum). In the field of heritage preservation, multispectral imaging is a suitable technique due to its non-destructive nature and its versatility. It combines imaging and spectroscopy to analyse materials and land covers and enables the use of a variety of different geomatic sensors for this purpose. These sensors collect both spatial and spectral information for a given scenario and a specific spectral range, so that their smallest storage units record the spectral properties of the radiation reflected by the surface of interest. The main goal of this research work is to characterise different construction materials as well as the main pathologies of Cultural Heritage elements by combining active and passive sensors recording data in different ranges. Conclusions about the suitability of each type of sensor and spectral range are drawn in relation to each particular case study and damage. It should be emphasised that the results are not limited to images, since 3D intensity data from laser scanners can be integrated with 2D data from passive sensors, yielding high quality products due to the added value that metric information brings to multispectral images.

  16. Fingerprint enhancement using a multispectral sensor

    NASA Astrophysics Data System (ADS)

    Rowe, Robert K.; Nixon, Kristin A.

    2005-03-01

    The level of performance of a biometric fingerprint sensor is critically dependent on the quality of the fingerprint images. One of the most common types of optical fingerprint sensors relies on the phenomenon of total internal reflectance (TIR) to generate an image. Under ideal conditions, a TIR fingerprint sensor can produce high-contrast fingerprint images with excellent feature definition. However, images produced by the same sensor under conditions that include dry skin, dirt on the skin, and marginal contact between the finger and the sensor, are likely to be severely degraded. This paper discusses the use of multispectral sensing as a means to collect additional images with new information about the fingerprint that can significantly augment the system performance under both normal and adverse sample conditions. In the context of this paper, "multispectral sensing" is used to broadly denote a collection of images taken under different illumination conditions: different polarizations, different illumination/detection configurations, as well as different wavelength illumination. Results from three small studies using an early-stage prototype of the multispectral-TIR (MTIR) sensor are presented along with results from the corresponding TIR data. The first experiment produced data from 9 people, 4 fingers from each person and 3 measurements per finger under "normal" conditions. The second experiment provided results from a study performed to test the relative performance of TIR and MTIR images when taken under extreme dry and dirty conditions. The third experiment examined the case where the area of contact between the finger and sensor is greatly reduced.

  17. Detecting early stage pressure ulcer on dark skin using multispectral imager

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua; Sprigle, Stephen; Wang, Fengtao; Wang, Chao; Liu, Fuhan; Adibi, Ali; Tummala, Rao

    2010-02-01

    We are developing a handheld multispectral imaging device to non-invasively inspect stage I pressure ulcers in darkly pigmented skin without the need to touch the patient's skin. This paper reports preliminary test results obtained with a proof-of-concept prototype. It also discusses the innovation's impact on traditional multispectral imaging technologies and the fields that will potentially benefit from it.

  18. Single sensor that outputs narrowband multispectral images

    PubMed Central

    Kong, Linghua; Yi, Dingrong; Sprigle, Stephen; Wang, Fengtao; Wang, Chao; Liu, Fuhan; Adibi, Ali; Tummala, Rao

    2010-01-01

    We report the work of developing a hand-held (or miniaturized), low-cost, stand-alone, real-time-operation, narrow bandwidth multispectral imaging device for the detection of early stage pressure ulcers. PMID:20210418

  19. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    NASA Astrophysics Data System (ADS)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.

  20. Maximum-likelihood techniques for joint segmentation-classification of multispectral chromosome images.

    PubMed

    Schwartzkopf, Wade C; Bovik, Alan C; Evans, Brian L

    2005-12-01

    Traditional chromosome imaging has been limited to grayscale images, but recently a 5-fluorophore combinatorial labeling technique (M-FISH) was developed wherein each class of chromosomes binds with a different combination of fluorophores. This results in a multispectral image, where each class of chromosomes has distinct spectral components. In this paper, we develop new methods for automatic chromosome identification by exploiting the multispectral information in M-FISH chromosome images and by jointly performing chromosome segmentation and classification. We (1) develop a maximum-likelihood hypothesis test that uses multispectral information, together with conventional criteria, to select the best segmentation possibility; (2) use this likelihood function to combine chromosome segmentation and classification into a robust chromosome identification system; and (3) show that the proposed likelihood function can also be used as a reliable indicator of errors in segmentation, errors in classification, and chromosome anomalies, which can be indicators of radiation damage, cancer, and a wide variety of inherited diseases. We show that the proposed multispectral joint segmentation-classification method outperforms past grayscale segmentation methods when decomposing touching chromosomes. We also show that it outperforms past M-FISH classification techniques that do not use segmentation information.
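
    The likelihood comparison at the heart of this approach can be sketched as scoring a candidate segment under per-class Gaussian models of its multispectral pixels and keeping the class with the highest log-likelihood. Class statistics and pixel values below are toy stand-ins, not M-FISH calibration data.

```python
# Minimal sketch of a maximum-likelihood class assignment for one candidate
# chromosome segment described by its multispectral pixel values.
import numpy as np
from scipy.stats import multivariate_normal

n_channels, n_classes = 5, 3
rng = np.random.default_rng(2)
class_means = rng.uniform(0.2, 0.8, size=(n_classes, n_channels))
class_covs = [0.01 * np.eye(n_channels) for _ in range(n_classes)]

segment_pixels = rng.normal(class_means[1], 0.1, size=(40, n_channels))  # toy segment

log_liks = [multivariate_normal(mean=class_means[k], cov=class_covs[k])
            .logpdf(segment_pixels).sum() for k in range(n_classes)]
best = int(np.argmax(log_liks))
print(f"segment assigned to class {best}, log-likelihoods = {np.round(log_liks, 1)}")
```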

  1. Target Detection over the Diurnal Cycle Using a Multispectral Infrared Sensor.

    PubMed

    Zhao, Huijie; Ji, Zheng; Li, Na; Gu, Jianrong; Li, Yansong

    2016-12-29

    When detecting a target over the diurnal cycle, a conventional infrared thermal sensor might lose the target due to thermal crossover, which can happen at any time of day when the temperature variation makes the infrared image contrast between target and background indistinguishable. In this paper, the benefits of using a multispectral infrared sensor over the diurnal cycle are shown. First, a brief theoretical analysis is presented of how thermal crossover affects a conventional thermal sensor, under which conditions it occurs, and why mid-infrared (3~5 μm) multispectral technology is effective. Second, the prototype design is described, along with how multispectral technology is employed to help solve the thermal crossover detection problem. Third, several targets were set up outdoors and imaged in a field experiment over a 24-h period. The experimental results show that the multispectral infrared imaging system can enhance the contrast of the detected images and effectively overcome the failure of a conventional infrared sensor during the diurnal cycle, which is of great significance for infrared surveillance applications.
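
    The thermal-crossover effect itself can be illustrated numerically: when a diurnally varying background temperature crosses the target temperature, the in-band radiance contrast passes through zero and a single-band thermal sensor loses the target. The temperatures and band edges below are illustrative assumptions.

```python
# Minimal sketch: integrate the Planck spectral radiance over a mid-infrared
# band for target and background temperatures across a day, and flag the
# lowest-contrast (thermal crossover) hours.
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def band_radiance(T, lam_lo=3e-6, lam_hi=5e-6, n=200):
    """Integrate the Planck spectral radiance over a mid-infrared band."""
    lam = np.linspace(lam_lo, lam_hi, n)
    B = 2 * h * c**2 / lam**5 / (np.exp(h * c / (lam * k * T)) - 1)
    return np.trapz(B, lam)

hours = np.arange(24)
T_background = 293 + 8 * np.sin(2 * np.pi * (hours - 9) / 24)   # diurnal swing
T_target = np.full(24, 296.0)                                    # roughly constant target

contrast = np.abs([band_radiance(t) - band_radiance(b)
                   for t, b in zip(T_target, T_background)])
print("lowest-contrast hours (thermal crossover):", hours[np.argsort(contrast)[:3]])
```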

  2. Target Detection over the Diurnal Cycle Using a Multispectral Infrared Sensor

    PubMed Central

    Zhao, Huijie; Ji, Zheng; Li, Na; Gu, Jianrong; Li, Yansong

    2016-01-01

    When detecting a target over the diurnal cycle, a conventional infrared thermal sensor might lose the target due to thermal crossover, which can happen at any time of day when the temperature variation makes the infrared image contrast between target and background indistinguishable. In this paper, the benefits of using a multispectral infrared sensor over the diurnal cycle are shown. First, a brief theoretical analysis is presented of how thermal crossover affects a conventional thermal sensor, under which conditions it occurs, and why mid-infrared (3~5 μm) multispectral technology is effective. Second, the prototype design is described, along with how multispectral technology is employed to help solve the thermal crossover detection problem. Third, several targets were set up outdoors and imaged in a field experiment over a 24-h period. The experimental results show that the multispectral infrared imaging system can enhance the contrast of the detected images and effectively overcome the failure of a conventional infrared sensor during the diurnal cycle, which is of great significance for infrared surveillance applications. PMID:28036073

  3. Remote sensing of soil moisture using airborne hyperspectral data

    USGS Publications Warehouse

    Finn, M.; Lewis, M.; Bosch, D.; Giraldo, Mario; Yamamoto, K.; Sullivan, D.; Kincaid, R.; Luna, R.; Allam, G.; Kvien, Craig; Williams, M.

    2011-01-01

    Landscape assessment of soil moisture is critical to understanding the hydrological cycle at the regional scale and in broad-scale studies of biophysical processes affected by global climate changes in temperature and precipitation. Traditional efforts to measure soil moisture have been principally restricted to in situ measurements, so remote sensing techniques are often employed. Hyperspectral sensors with finer spatial resolution and narrow band widths may offer an alternative to traditional multispectral analysis of soil moisture, particularly in landscapes with high spatial heterogeneity. This preliminary research evaluates the ability of remotely sensed hyperspectral data to quantify soil moisture for the Little River Experimental Watershed (LREW), Georgia. An airborne hyperspectral instrument with a short-wavelength infrared (SWIR) sensor was flown in 2005 and 2007 and the results were correlated to in situ soil moisture values. A significant statistical correlation (R2 value above 0.7 for both sampling dates) for the hyperspectral instrument data and the soil moisture probe data at 5.08 cm (2 inches) was determined. While models for the 20.32 cm (8 inches) and 30.48 cm (12 inches) depths were tested, they were not able to estimate soil moisture to the same degree.

  4. Atmospheric effects in multispectral remote sensor data

    NASA Technical Reports Server (NTRS)

    Turner, R. E.

    1975-01-01

    The problem of radiometric variations in multispectral remote sensing data which occur as a result of a change in geometric and environmental factors is studied. The case of spatially varying atmospheres is considered and the effect of atmospheric scattering is analyzed for realistic conditions. Emphasis is placed upon a simulation of LANDSAT spectral data for agricultural investigations over the United States. The effect of the target-background interaction is thoroughly analyzed in terms of various atmospheric states, geometric parameters, and target-background materials. Results clearly demonstrate that variable atmospheres can alter the classification accuracy and that the presence of various backgrounds can change the effective target radiance by a significant amount. A failure to include these effects in multispectral data analysis will result in a decrease in the classification accuracy.

  5. Mapping within-field variations of soil organic carbon content using UAV multispectral visible near-infrared images

    NASA Astrophysics Data System (ADS)

    Gilliot, Jean-Marc; Vaudour, Emmanuelle; Michelin, Joël

    2016-04-01

    This study was carried out in the framework of the PROSTOCK-Gessol3 project supported by the French Environment and Energy Management Agency (ADEME), the TOSCA-PLEIADES-CO project of the French Space Agency (CNES) and the SOERE PRO network working on the long-term environmental impacts of recycling Organic Waste Products on field crops. Organic matter is an important soil fertility parameter and previous studies have shown the potential of spectral information measured in the laboratory or directly in the field using field spectro-radiometers or satellite imagery to predict the soil organic carbon (SOC) content. This work proposes a method for the spatial prediction of the SOC content of bare cultivated topsoil from Unmanned Aerial Vehicle (UAV) multispectral imagery. An agricultural plot of 13 ha, located west of Paris, France, was analysed in April 2013, shortly before sowing, while the soil was still bare. Soils comprised haplic luvisols, rendzic cambisols and calcaric or colluvic cambisols. The UAV platform used was a fixed-wing aircraft provided by Airinov® flying at an altitude of 150 m, equipped with a four-channel multispectral visible near-infrared camera, the MultiSPEC 4C® (550 nm, 660 nm, 735 nm and 790 nm). Twenty-three ground control points (GCP) were sampled within the plot according to soil descriptions. GCP positions were determined with a centimetric DGPS. Different observations and measurements were made synchronously with the drone flight: soil surface description, spectral measurements (with an ASD FieldSpec 3® spectroradiometer), and roughness measurements by a photogrammetric method. Each of these locations was sampled for both standard soil physico-chemical analysis and soil water content. Structure-from-Motion (SfM) processing was applied to the UAV imagery to produce a 15 cm resolution multispectral mosaic using the Agisoft Photoscan® software. The SOC content was modelled by partial least squares regression (PLSR) between the

  6. Retinex Preprocessing for Improved Multi-Spectral Image Classification

    NASA Technical Reports Server (NTRS)

    Thompson, B.; Rahman, Z.; Park, S.

    2000-01-01

    The goal of multi-image classification is to identify and label "similar regions" within a scene. The ability to correctly classify a remotely sensed multi-image of a scene is affected by the ability of the classification process to adequately compensate for the effects of atmospheric variations and sensor anomalies. Better classification may be obtained if the multi-image is preprocessed before classification, so as to reduce the adverse effects of image formation. In this paper, we discuss the overall impact on multi-spectral image classification when the retinex image enhancement algorithm is used to preprocess multi-spectral images. The retinex is a multi-purpose image enhancement algorithm that performs dynamic range compression, reduces the dependence on lighting conditions, and generally enhances apparent spatial resolution. The retinex has been successfully applied to the enhancement of many different types of grayscale and color images. We show in this paper that retinex preprocessing improves the spatial structure of multi-spectral images and thus provides better within-class variations than would otherwise be obtained without the preprocessing. For a series of multi-spectral images obtained with diffuse and direct lighting, we show that without retinex preprocessing the class spectral signatures vary substantially with the lighting conditions. Whereas multi-dimensional clustering without preprocessing produced one-class homogeneous regions, the classification on the preprocessed images produced multi-class non-homogeneous regions. This lack of homogeneity is explained by the interaction between different agronomic treatments applied to the regions: the preprocessed images are closer to ground truth. The principle advantage that the retinex offers is that for different lighting conditions classifications derived from the retinex preprocessed images look remarkably "similar", and thus more consistent, whereas classifications derived from the original

  7. Multispectral Landsat images of Antarctica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucchitta, B.K.; Bowell, J.A.; Edwards, K.L.

    1988-01-01

    The U.S. Geological Survey has a program to map Antarctica by using colored, digitally enhanced Landsat multispectral scanner images to increase existing map coverage and to improve upon previously published Landsat maps. This report is a compilation of images and image mosaic that covers four complete and two partial 1:250,000-scale quadrangles of the McMurdo Sound region.

  8. Use of reflectance spectra of native plant species for interpreting airborne multispectral scanner data in the East Tintic Mountains, Utah.

    USGS Publications Warehouse

    Milton, N.M.

    1983-01-01

    Analysis of in situ reflectance spectra of native vegetation was used to interpret airborne MSS data. Representative spectra from three plant species in the E Tintic Mountains, Utah, were used to interpret the color components on a color ratio composite image made from MSS data in the visible and near-infrared regions. A map of plant communities was made from the color ratio composite image and field checked. -from Author

  9. Ortho-Rectification of Narrow Band Multi-Spectral Imagery Assisted by Dslr RGB Imagery Acquired by a Fixed-Wing Uas

    NASA Astrophysics Data System (ADS)

    Rau, J.-Y.; Jhan, J.-P.; Huang, C.-Y.

    2015-08-01

    The Miniature Multiple Camera Array (MiniMCA-12) is a frame-based multilens/multispectral sensor composed of 12 lenses with narrow band filters. Due to its small size and light weight, it is suitable for mounting on an Unmanned Aerial System (UAS) to acquire imagery of high spectral, spatial and temporal resolution for various remote sensing applications. However, because each band's wavelength range is only 10 nm, the images have low resolution and signal-to-noise ratio, which makes them unsuitable for image matching and digital surface model (DSM) generation. In addition, since the spectral correlation among the 12 MiniMCA bands is low, it is difficult to perform tie-point matching and aerial triangulation at the same time. In this study, we therefore propose the use of a DSLR camera to assist automatic aerial triangulation of MiniMCA-12 imagery and to produce a higher spatial resolution DSM for MiniMCA-12 ortho-image generation. Depending on the maximum payload weight of the UAS used, these two sensors can be carried at the same time or individually. In this study, we adopt a fixed-wing UAS carrying a Canon EOS 5D Mark II DSLR camera and a MiniMCA-12 multi-spectral camera. To perform automatic aerial triangulation between the DSLR camera and the MiniMCA-12, we choose one master band from the MiniMCA-12 whose spectral range overlaps with the DSLR camera. However, since all MiniMCA-12 lenses have different perspective centers and viewing angles, the original 12 channels show a significant band misregistration effect. Thus, the first issue encountered is to reduce this band misregistration. Because all 12 MiniMCA lenses are frame-based, their spatial offsets are smaller than 15 cm and all images overlap by almost 98%; we therefore propose a modified projective transformation (MPT) method together with two systematic error correction procedures to register all 12 bands of imagery in the same image space. It means that those 12 bands of images acquired at

  10. Wavelength band selection method for multispectral target detection.

    PubMed

    Karlholm, Jörgen; Renhorn, Ingmar

    2002-11-10

    A framework is proposed for the selection of wavelength bands for multispectral sensors by use of hyperspectral reference data. Using results from detection theory, we derive a cost function that is minimized by a set of spectral bands that is optimal in terms of detection performance for discrimination between a class of small rare targets and clutter with a known spectral distribution. The method may be used, e.g., in the design of multispectral infrared search and track and electro-optical missile warning sensors, where a low false-alarm rate and a high detection probability for small targets against a clutter background are of critical importance, but the required high frame rate prevents the use of hyperspectral sensors.
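
    As a generic stand-in for the detection-driven selection described above (not the authors' cost function), the sketch below greedily picks bands that maximize the Mahalanobis separation between a target signature and clutter statistics estimated from hyperspectral reference data.

```python
# Minimal sketch of greedy, detection-oriented band selection from
# hyperspectral reference data.
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_bands = 500, 60
clutter = rng.normal(size=(n_pix, n_bands))                 # hyperspectral clutter samples
target = clutter.mean(0) + 0.8 * rng.normal(size=n_bands)   # toy target signature

def separation(bands):
    """Squared Mahalanobis distance of the target from the clutter, using only 'bands'."""
    mu = clutter[:, bands].mean(0)
    cov = np.atleast_2d(np.cov(clutter[:, bands], rowvar=False)) + 1e-6 * np.eye(len(bands))
    d = target[bands] - mu
    return float(d @ np.linalg.solve(cov, d))

selected = []
for _ in range(4):                                          # choose four multispectral bands
    remaining = [b for b in range(n_bands) if b not in selected]
    best = max(remaining, key=lambda b: separation(selected + [b]))
    selected.append(best)
print("selected band indices:", selected)
```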

  11. A new multi-spectral feature level image fusion method for human interpretation

    NASA Astrophysics Data System (ADS)

    Leviner, Marom; Maltz, Masha

    2009-03-01

    Various methods to perform multi-spectral image fusion have been suggested, mostly at the pixel level. However, the jury is still out on the benefits of a fused image compared to its source images. We present here a new multi-spectral image fusion method, multi-spectral segmentation fusion (MSSF), which uses a feature-level processing paradigm. To test our method, we compared human observer performance in a three-task experiment using MSSF against two established methods, averaging and principal components analysis (PCA), and against its two source bands, visible and infrared. The three tasks that we studied were: (1) simple target detection, (2) spatial orientation, and (3) camouflaged target detection. MSSF proved superior to the other fusion methods in all three tests; MSSF also outperformed the source images in the spatial orientation and camouflaged target detection tasks. Based on these findings, current speculation about the circumstances in which multi-spectral image fusion in general and specific fusion methods in particular would be superior to using the original image sources can be further addressed.

  12. Comparative Performance of Ground vs. Aerially Assessed RGB and Multispectral Indices for Early-Growth Evaluation of Maize Performance under Phosphorus Fertilization

    PubMed Central

    Gracia-Romero, Adrian; Kefauver, Shawn C.; Vergara-Díaz, Omar; Zaman-Allah, Mainassara A.; Prasanna, Boddupalli M.; Cairns, Jill E.; Araus, José L.

    2017-01-01

    Low soil fertility is one of the factors most limiting agricultural production, with phosphorus deficiency being among the main factors, particularly in developing countries. To deal with such environmental constraints, remote sensing measurements can be used to rapidly assess crop performance and to phenotype a large number of plots in a rapid and cost-effective way. We evaluated the performance of a set of remote sensing indices derived from Red-Green-Blue (RGB) images and multispectral (visible and infrared) data as phenotypic traits and crop monitoring tools for early assessment of maize performance under phosphorus fertilization. Thus, a set of 26 maize hybrids grown under field conditions in Zimbabwe was assayed under contrasting phosphorus fertilization conditions. Remote sensing measurements were conducted in seedlings at two different levels: at the ground and from an aerial platform. Within a particular phosphorus level, some of the RGB indices strongly correlated with grain yield. In general, RGB indices assessed at both ground and aerial levels correlated in a comparable way with grain yield except for indices a* and u*, which correlated better when assessed at the aerial level than at ground level and Greener Area (GGA) which had the opposite correlation. The Normalized Difference Vegetation Index (NDVI) evaluated at ground level with an active sensor also correlated better with grain yield than the NDVI derived from the multispectral camera mounted in the aerial platform. Other multispectral indices like the Soil Adjusted Vegetation Index (SAVI) performed very similarly to NDVI assessed at the aerial level but overall, they correlated in a weaker manner with grain yield than the best RGB indices. This study clearly illustrates the advantage of RGB-derived indices over the more costly and time-consuming multispectral indices. Moreover, the indices best correlated with GY were in general those best correlated with leaf phosphorous content. However

  13. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    . Each pulse is focused into an illumination area that has a radius of about 20 centimeters on the ground. The pulse-repetition frequency of the EAARL transmitter varies along each across-track scan to produce equal cross-track sample spacing and near uniform density (Nayegandhi and others, 2006). Targets can have varying physical and optical characteristics that cause extreme fluctuations in laser backscatter complexity and signal strength. To accommodate this dynamic range, EAARL has the real-time ability to detect, capture, and automatically adapt to each laser return backscatter. The backscattered energy is collected by an array of four high-speed waveform digitizers connected to an array of four sub-nanosecond photodetectors. Each of the four photodetectors receives a finite range of the returning laser backscatter photons. The most sensitive channel receives 90% of the photons, the least sensitive receives 0.9%, and the middle channel receives 9% (Wright and Brock, 2002). The fourth channel is available for detection but is not currently being utilized. All four channels are digitized simultaneously into 65,536 samples for every laser pulse. Receiver optics consists of a 15-centimeter-diameter dielectric-coated Newtonian telescope, a computer-driven raster scanning mirror oscillating at 12.5 hertz (25 rasters per second), and an array of sub-nanosecond photodetectors. The signal emitted by the pulsed laser transmitter is amplified as backscatter by the optical telescope receiver. The photomultiplier tube (PMT) then converts the optical energy into electrical impulses (Nayegandhi and others, 2006). In addition to the full-waveform resolving laser, the EAARL sensor suite includes a down-looking 70-centimeter-resolution Red-Green-Blue (RGB) digital network camera, a high-resolution color infrared (CIR) multispectral camera (14-centimeter-resolution), two precision dual-frequency kinematic carrier-phase global positioning system (GPS) receivers, and an

  14. A multispectral sorting device for isolating single wheat kernels with high protein content

    USDA-ARS?s Scientific Manuscript database

    Automated sorting of single wheat kernels according to protein content was demonstrated using two novel multispectral sorting devices with different spectral ranges: 470-1070 nm (silicon-based detector) and 910-1550 nm (InGaAs-based detector). The multispectral data were acquired by rapidly (~12...

  15. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation need to be implemented to handle the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, a rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on Visual Studio 2010. Experimental results show that the system realizes acquisition and display for both cameras.
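
    The sub-pixel resampling such a coordinate transformation needs can be sketched with plain bilinear interpolation, as below; the retina-like mapping itself is omitted and the sample coordinates are illustrative.

```python
# Minimal sketch: sample a source image at fractional pixel coordinates with
# bilinear interpolation.
import numpy as np

def bilinear_sample(img, x, y):
    """Bilinearly interpolate img at fractional pixel position (x, y)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    x1, y1 = min(x0 + 1, img.shape[1] - 1), min(y0 + 1, img.shape[0] - 1)
    return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x1] +
            (1 - dx) * dy * img[y1, x0] + dx * dy * img[y1, x1])

img = np.arange(25, dtype=float).reshape(5, 5)
print(bilinear_sample(img, 1.5, 2.25))   # value interpolated between neighbouring pixels
```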

  16. Differentiating aquatic plant communities in a eutrophic river using hyperspectral and multispectral remote sensing

    USGS Publications Warehouse

    Tian, Y.Q.; Yu, Q.; Zimmerman, M.J.; Flint, S.; Waldron, M.C.

    2010-01-01

    This study evaluates the efficacy of remote sensing technology to monitor species composition, areal extent and density of aquatic plants (macrophytes and filamentous algae) in impoundments where their presence may violate water-quality standards. Multispectral satellite (IKONOS) images and more than 500 in situ hyperspectral samples were acquired to map aquatic plant distributions. By analyzing field measurements, we created a library of hyperspectral signatures for a variety of aquatic plant species, associations and densities. We also used three vegetation indices, the Normalized Difference Vegetation Index (NDVI), the near-infrared (NIR)-Green Angle Index (NGAI) and the normalized water absorption depth (DH), at wavelengths 554, 680, 820 and 977 nm to differentiate among aquatic plant species composition, areal density and thickness in cases where hyperspectral analysis yielded potentially ambiguous interpretations. We compared the NDVI derived from IKONOS imagery with the in situ, hyperspectral-derived NDVI. The IKONOS-based images were also compared to data obtained through routine visual observations. Our results confirmed that aquatic species composition alters spectral signatures and affects the accuracy of remote sensing of aquatic plant density. The results also demonstrated that the NGAI has apparent advantages over the NDVI and the DH in estimating density. In the feature space of the three indices, 3D scatter plot analysis revealed that hyperspectral data can differentiate several aquatic plant associations. High-resolution multispectral imagery provided useful information to distinguish among biophysical aquatic plant characteristics. Classification analysis indicated that using satellite imagery to assess Lemna coverage yielded an overall agreement of 79% with visual observations and >90% agreement for the densest aquatic plant coverages. Interpretation of biophysical parameters derived from high-resolution satellite or airborne imagery should prove to be a

  17. Bayer Filter Snapshot Hyperspectral Fundus Camera for Human Retinal Imaging

    PubMed Central

    Liu, Wenzhong; Nesper, Peter; Park, Justin; Zhang, Hao F.; Fawzi, Amani A.

    2016-01-01

    Purpose To demonstrate the versatility and performance of a compact Bayer filter snapshot hyperspectral fundus camera for in-vivo clinical applications including retinal oximetry and macular pigment optical density measurements. Methods 12 healthy volunteers were recruited under an Institutional Review Board (IRB) approved protocol. Fundus images were taken with a custom hyperspectral camera with a spectral range of 460–630 nm. We determined retinal vascular oxygen saturation (sO2) for the healthy population using the captured spectra by least squares curve fitting. Additionally, macular pigment optical density was localized and visualized using multispectral reflectometry from selected wavelengths. Results We successfully determined the mean sO2 of arteries and veins of each subject (ages 21–80) with excellent intrasubject repeatability (1.4% standard deviation). The mean arterial sO2 for all subjects was 90.9% ± 2.5%, whereas the mean venous sO2 for all subjects was 64.5% ± 3.5%. The mean artery–vein (A–V) difference in sO2 varied between 20.5% and 31.9%. In addition, we were able to reveal and quantify macular pigment optical density. Conclusions We demonstrated a single imaging tool capable of oxygen saturation and macular pigment density measurements in vivo. The unique combination of broad spectral range, high spectral–spatial resolution, rapid and robust imaging capability, and compact design make this system a valuable tool for multifunction spectral imaging that can be easily performed in a clinic setting. PMID:27767345
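
    The least-squares oximetry step can be sketched as fitting the vessel attenuation spectrum with a linear mix of oxy- and deoxyhemoglobin extinction spectra and taking sO2 as the oxygenated fraction. The extinction values and the synthetic spectrum below are illustrative stand-ins, not tabulated constants or the study's calibration.

```python
# Minimal sketch: linear least-squares unmixing of an attenuation spectrum
# into oxy- and deoxyhemoglobin contributions, then sO2 as the oxy fraction.
import numpy as np

wavelengths = np.array([460, 500, 540, 560, 580, 600, 630], dtype=float)  # nm
eps_hbo2 = np.array([3.3, 2.0, 5.8, 3.5, 5.4, 0.3, 0.1])   # stand-in HbO2 extinction
eps_hb   = np.array([2.7, 2.3, 5.3, 5.2, 4.4, 1.5, 0.8])   # stand-in Hb extinction

true_so2 = 0.9
attenuation = true_so2 * eps_hbo2 + (1 - true_so2) * eps_hb
attenuation += np.random.default_rng(0).normal(0, 0.05, attenuation.size)  # measurement noise

A = np.column_stack([eps_hbo2, eps_hb])
(c_oxy, c_deoxy), *_ = np.linalg.lstsq(A, attenuation, rcond=None)
so2 = c_oxy / (c_oxy + c_deoxy)
print(f"estimated sO2 = {100 * so2:.1f}%")
```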

  18. Satellite land remote sensing advancements for the eighties; Proceedings of the Eighth Pecora Symposium, Sioux Falls, SD, October 4-7, 1983

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Among the topics discussed are NASA's land remote sensing plans for the 1980s, the evolution of Landsat 4 and the performance of its sensors, the Landsat 4 thematic mapper image processing system radiometric and geometric characteristics, data quality, image data radiometric analysis and spectral/stratigraphic analysis, and thematic mapper agricultural, forest resource and geological applications. Also covered are geologic applications of side-looking airborne radar, digital image processing, the large format camera, the RADARSAT program, the SPOT 1 system's program status, distribution plans, and simulation program, Space Shuttle multispectral linear array studies of the optical and biological properties of terrestrial land cover, orbital surveys of solar-stimulated luminescence, the Space Shuttle imaging radar research facility, and Space Shuttle-based polar ice sounding altimetry.

  19. NEON Airborne Remote Sensing of Terrestrial Ecosystems

    NASA Astrophysics Data System (ADS)

    Kampe, T. U.; Leisso, N.; Krause, K.; Karpowicz, B. M.

    2012-12-01

    The National Ecological Observatory Network (NEON) is the continental-scale research platform that will collect information on ecosystems across the United States to advance our understanding and ability to forecast environmental change at the continental scale. One of NEON's observing systems, the Airborne Observation Platform (AOP), will fly an instrument suite consisting of a high-fidelity visible-to-shortwave infrared imaging spectrometer, a full waveform small footprint LiDAR, and a high-resolution digital camera on a low-altitude aircraft platform. NEON AOP is focused on acquiring data on several terrestrial Essential Climate Variables including bioclimate, biodiversity, biogeochemistry, and land use products. These variables are collected throughout a network of 60 sites across the Continental United States, Alaska, Hawaii and Puerto Rico via ground-based and airborne measurements. Airborne remote sensing plays a critical role by providing measurements at the scale of individual shrubs and larger plants over hundreds of square kilometers. The NEON AOP plays the role of bridging the spatial scales from that of individual organisms and stands to the scale of satellite-based remote sensing. NEON is building 3 airborne systems to facilitate the routine coverage of NEON sites and provide the capacity to respond to investigator requests for specific projects. The first NEON imaging spectrometer, a next-generation VSWIR instrument, was recently delivered to NEON by JPL. This instrument has been integrated with a small-footprint waveform LiDAR on the first NEON airborne platform (AOP-1). A series of AOP-1 test flights were conducted during the first year of NEON's construction phase. The goal of these flights was to test out instrument functionality and performance, exercise remote sensing collection protocols, and provide provisional data for algorithm and data product validation. These test flights focused on the following questions: What is the optimal remote

  20. Summary of Michigan multispectral investigations program

    NASA Technical Reports Server (NTRS)

    Legault, R. R.

    1970-01-01

    The development of techniques to extend spectral signatures in space and time is reported. Signatures that were valid for 30 miles have been extended to 129 miles using transformation and sun sensor data, so that a complicated multispectral recognition problem that required 219 learning sets can now be done with 13 learning sets.

  1. An integrated multispectral video and environmental monitoring system for the study of coastal processes and the support of beach management operations

    NASA Astrophysics Data System (ADS)

    Ghionis, George; Trygonis, Vassilis; Karydis, Antonis; Vousdoukas, Michalis; Alexandrakis, George; Drakopoulos, Panos; Amdreadis, Olympos; Psarros, Fotis; Velegrakis, Antonis; Poulos, Serafim

    2016-04-01

    Effective beach management requires environmental assessments that are based on sound science, are cost-effective and are available to beach users and managers in an accessible, timely and transparent manner. The most common problems are: 1) The available field data are scarce and of sub-optimal spatio-temporal resolution and coverage, 2) our understanding of local beach processes needs to be improved in order to accurately model/forecast beach dynamics under a changing climate, and 3) the information provided by coastal scientists/engineers in the form of data, models and scientific interpretation is often too complicated to be of direct use by coastal managers/decision makers. A multispectral video system has been developed, consisting of one or more video cameras operating in the visible part of the spectrum, a passive near-infrared (NIR) camera, an active NIR camera system, a thermal infrared camera and a spherical video camera, coupled with innovative image processing algorithms and a telemetric system for the monitoring of coastal environmental parameters. The complete system has the capability to record, process and communicate (in quasi-real time) high frequency information on shoreline position, wave breaking zones, wave run-up, erosion hot spots along the shoreline, nearshore wave height, turbidity, underwater visibility, wind speed and direction, air and sea temperature, solar radiation, UV radiation, relative humidity, barometric pressure and rainfall. An innovative, remotely-controlled interactive visual monitoring system, based on the spherical video camera (with 360° field of view), combines the video streams from all cameras and can be used by beach managers to monitor (in real time) beach user numbers, flow activities and safety at beaches of high touristic value. The high resolution near infrared cameras permit 24-hour monitoring of beach processes, while the thermal camera provides information on beach sediment temperature and moisture, can

  2. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.
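
    A toy sketch (not PaySim's actual implementation) of the draping step: each orthorectified pixel is assigned an elevation sampled from a geo-registered DEM, yielding a 3-D reflectance surface. The array shapes, nearest-neighbour lookup and function name are simplifying assumptions.

      import numpy as np

      def drape_ortho_on_dem(ortho, dem, scale):
          """ortho: (H, W) reflectance; dem: coarser elevation grid; scale: ortho pixels per DEM cell."""
          h, w = ortho.shape
          rows, cols = np.mgrid[0:h, 0:w]
          # nearest-neighbour elevation lookup for brevity (bilinear interpolation would be smoother)
          dem_r = np.clip((rows / scale).astype(int), 0, dem.shape[0] - 1)
          dem_c = np.clip((cols / scale).astype(int), 0, dem.shape[1] - 1)
          z = dem[dem_r, dem_c]
          # stack x, y, z, reflectance into an (H*W, 4) point cloud describing the reflectance surface
          return np.column_stack([cols.ravel(), rows.ravel(), z.ravel(), ortho.ravel()])

      points = drape_ortho_on_dem(np.random.rand(100, 100), np.random.rand(10, 10) * 500.0, scale=10)
      print(points.shape)  # (10000, 4)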

  3. Miniaturized Airborne Imaging Central Server System

    NASA Technical Reports Server (NTRS)

    Sun, Xiuhong

    2011-01-01

    In recent years, some remote-sensing applications have required advanced airborne multi-sensor systems to provide high-performance reflective and emissive spectral imaging measurements rapidly over large areas. The key challenge is the black-box back-end system that operates a suite of cutting-edge imaging sensors to collect, simultaneously and with precision georeference, the high-throughput reflective and emissive spectral imaging data. This back-end system needs to be portable, easy to use, and reliable, with advanced onboard processing. The innovation of the black-box back-end is a miniaturized airborne imaging central server system (MAICSS). MAICSS integrates a complex embedded system of systems with dedicated power and signal electronic circuits inside to serve a suite of configurable cutting-edge electro-optical (EO), long-wave infrared (LWIR), and medium-wave infrared (MWIR) cameras, a hyperspectral imaging scanner, and a GPS and inertial measurement unit (IMU) for atmospheric and surface remote sensing. Its compatible sensor packages include NASA's 1,024 × 1,024 pixel LWIR quantum well infrared photodetector (QWIP) imager; a 60.5 megapixel BuckEye EO camera; and a fast (e.g., 200+ scanlines/s) and wide swath-width (e.g., 1,920+ pixels) CCD/InGaAs imager-based visible/near infrared reflectance (VNIR) and shortwave infrared (SWIR) imaging spectrometer. MAICSS records continuous precision-georeferenced and time-tagged multisensor throughputs to mass storage devices at a high aggregate rate, typically 60 MB/s for its LWIR/EO payload. MAICSS is a complete stand-alone imaging server instrument with an easy-to-use software package for either autonomous data collection or interactive airborne operation. Advanced multisensor data acquisition and onboard processing software features have been implemented for MAICSS. With the onboard processing for real-time image development, correction, histogram-equalization, compression, georeference, and

  4. Application of airborne hyperspectral remote sensing for the retrieval of forest inventory parameters

    NASA Astrophysics Data System (ADS)

    Dmitriev, Yegor V.; Kozoderov, Vladimir V.; Sokolov, Anton A.

    2016-04-01

    Collecting and updating forest inventory data play an important part in forest management. The data can be obtained directly using accurate but low-efficiency ground-based methods, as well as from remote sensing measurements. We present applications of airborne hyperspectral remote sensing for the retrieval of such important inventory parameters as the forest species and age composition. The hyperspectral images of the test region were obtained from an airplane equipped with a Russian-made lightweight airborne video-spectrometer covering the visible and near-infrared spectral range and a high-resolution photo camera on the same gyro-stabilized platform. The quality of the thematic processing depends on many factors, such as the atmospheric conditions, the characteristics of the measuring instruments, and the correction and preprocessing methods. The construction of the classifier, together with methods for reducing the feature space, also plays an important role. The performance of different spectral classification methods is analyzed for the problem of hyperspectral remote sensing of soil and vegetation. For the reduction of the feature space we used the previously proposed stable feature selection method. The results of classifying hyperspectral airborne images using the multiclass Support Vector Machine method with a Gaussian kernel and the parametric Bayesian classifier based on the Gaussian mixture model, together with their comparative analysis, are demonstrated.
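
    A rough sketch of the Gaussian-kernel multiclass SVM step on synthetic pixel spectra, using scikit-learn; the authors' stable feature selection and Bayesian Gaussian-mixture classifier are not reproduced, and the class means and hyperparameters below are arbitrary.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n_bands, n_per_class = 60, 200
      # three synthetic "forest classes" with slightly shifted mean spectra
      spectra = np.vstack([rng.normal(loc=mu, scale=0.05, size=(n_per_class, n_bands))
                           for mu in (0.30, 0.35, 0.40)])
      labels = np.repeat([0, 1, 2], n_per_class)

      X_train, X_test, y_train, y_test = train_test_split(spectra, labels, test_size=0.3, random_state=0)
      clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)  # Gaussian (RBF) kernel SVM
      print("test accuracy:", clf.score(X_test, y_test))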

  5. Multispectral code excited linear prediction coding and its application in magnetic resonance images.

    PubMed

    Hu, J H; Wang, Y; Cahill, P T

    1997-01-01

    This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D microblocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further specified using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 b/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographic Experts Group (JPEG) method, the wavelet transform based embedded zero-tree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.
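
    A simplified one-dimensional sketch of forward-adaptive AR prediction: the coefficients are estimated from the block being coded, and the residual is what an excitation codebook would then approximate. The 3-D macroblock structure and the analysis-by-synthesis search of MFCELP are not shown; the signal and model order are illustrative.

      import numpy as np

      def forward_ar_residual(block, order=3):
          """Fit AR(order) to the block by least squares and return (coefficients, residual)."""
          block = np.asarray(block, dtype=float)
          # column k holds the samples delayed by k+1, aligned with the targets block[order:]
          X = np.column_stack([block[order - k - 1:len(block) - k - 1] for k in range(order)])
          y = block[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          residual = y - X @ coeffs
          return coeffs, residual

      signal = np.cumsum(np.random.default_rng(1).normal(size=256))  # smooth synthetic block
      coeffs, res = forward_ar_residual(signal, order=3)
      print(coeffs, np.std(res) / np.std(signal))  # residual energy is much lower than signal energy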

  6. Innovative Airborne Sensors for Disaster Management

    NASA Astrophysics Data System (ADS)

    Altan, M. O.; Kemper, G.

    2016-06-01

    Disaster management by analyzing changes in the DSM before and after the "event". An advantage of LiDAR is that, apart from rain and clouds, no other weather conditions limit its use. As an active sensor, missions at night are possible. The new mid-format cameras that make use of CMOS sensors (e.g., Phase One IXU1000) can capture data even under poor and difficult light conditions and may well be the first choice for remotely sensed data acquisition from aircraft and UAVs. UAVs will surely become more and more a part of disaster management at the detailed level. Equipped today with live video cameras using RGB and thermal IR, they assist in looking inside and behind buildings. Thus, they can continue the aerial survey where airborne anomalies have been detected.

  7. Multispectral Analysis of NMR Imagery

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.; Vannier, M. W. And Associates; Jordan, D.

    1985-01-01

    Conference paper discusses initial efforts to adapt multispectral satellite-image analysis to nuclear magnetic resonance (NMR) scans of human body. Flexibility of these techniques makes it possible to present NMR data in variety of formats, including pseudocolor composite images of pathological internal features. Techniques do not have to be greatly modified from form in which used to produce satellite maps of such Earth features as water, rock, or foliage.

  8. Multispectral imaging reveals biblical-period inscription unnoticed for half a century.

    PubMed

    Faigenbaum-Golovin, Shira; Mendel-Geberovich, Anat; Shaus, Arie; Sober, Barak; Cordonsky, Michael; Levin, David; Moinester, Murray; Sass, Benjamin; Turkel, Eli; Piasetzky, Eli; Finkelstein, Israel

    2017-01-01

    Most surviving biblical-period Hebrew inscriptions are ostraca (ink-on-clay texts). They are poorly preserved and, once unearthed, fade rapidly. Therefore, proper and timely documentation of ostraca is essential. Here we show a striking example of a hitherto invisible text on the back side of an ostracon revealed via multispectral imaging. This ostracon, found at the desert fortress of Arad and dated to ca. 600 BCE (the eve of Judah's destruction by Nebuchadnezzar), has been on display for half a century. Its front side has been thoroughly studied, while its back side was considered blank. Our research revealed three lines of text on the supposedly blank side and four "new" lines on the front side. Our results demonstrate the need for multispectral image acquisition of both sides of all ancient ink ostraca. Moreover, in certain cases we recommend employing multispectral techniques for screening newly unearthed ceramic potsherds prior to disposal.

  9. Memoris, A Wide Angle Camera For Bepicolombo

    NASA Astrophysics Data System (ADS)

    Cremonese, G.; Memoris Team

    In response to ESA's Announcement of Opportunity for the BepiColombo payload, we are working on a wide angle camera concept named MEMORIS (MErcury MOderate Resolution Imaging System). MEMORIS will acquire stereoscopic images of the whole Mercury surface using two different channels at +/- 20 degrees from the nadir point. It will achieve a spatial resolution of 50 m per pixel at 400 km from the surface (peri-Herm), corresponding to a vertical resolution of about 75 m with the stereo performance. The scientific objectives to be addressed by MEMORIS may be identified as follows: estimation of surface age based on crater counting; crater morphology and degradation; stratigraphic sequence of geological units; identification of volcanic features and related deposits; origin of plain units from morphological observations; distribution and type of the tectonic structures; determination of relative ages among the structures based on cross-cutting relationships; 3D tectonics; global mineralogical mapping of the main geological units; and identification of weathering products. The last two items will come from the multispectral capabilities of the camera, utilizing 8 to 12 (TBD) broad-band filters. MEMORIS will be equipped with a further channel devoted to observations of the tenuous exosphere. It will look at the limb on a given arc of the BepiColombo orbit, thereby observing the exosphere above a surface latitude range of 25-75 degrees in the northern hemisphere. The exosphere images will be obtained above the surface just observed by the other two channels, in order to find possible relationships, as ground-based observations suggest. The exospheric channel will have four narrow-band filters centered on the sodium and potassium emissions and the adjacent continua.
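
    A back-of-the-envelope consistency check, assuming a simple base-to-height stereo model (not the MEMORIS error budget): two looks at +/- 20 degrees give B/H = 2 tan(20 deg), so the height precision is roughly the ground pixel size divided by B/H.

      import math

      pixel = 50.0                         # m per pixel at 400 km, as stated in the record
      b_over_h = 2.0 * math.tan(math.radians(20.0))
      print(pixel / b_over_h)              # ~69 m, of the same order as the quoted ~75 m vertical resolution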

  10. Quality assessment of butter cookies applying multispectral imaging

    PubMed Central

    Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne

    2013-01-01

    A method for characterization of butter cookie quality by assessing the surface browning and water content using multispectral images is presented. Based on evaluations of the browning of butter cookies, cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating multispectral images with a browning score. The browning score is calculated as a function of oven temperature and baking time. It is presented as a quadratic response surface. The investigated process window was the intervals 4–16 min and 160–200°C in a forced convection electrically heated oven. In addition to the browning score, a model for predicting the average water content based on the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical analysis showed that the most significant wavelengths for browning predictions were in the interval 400–700 nm and the wavelengths significant for water prediction were primarily located in the near-infrared spectrum. The water prediction model was found to correctly estimate the average water content with an absolute error of 0.22%. From the images it was also possible to follow the browning and drying propagation from the cookie edge toward the center. PMID:24804036
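
    An illustrative sketch of fitting a quadratic response surface, browning as a function of baking time and oven temperature, by ordinary least squares; the data and coefficients below are synthetic, not the paper's model.

      import numpy as np

      rng = np.random.default_rng(2)
      time_min = rng.uniform(4, 16, 50)      # baking times within the studied window
      temp_c = rng.uniform(160, 200, 50)     # oven temperatures within the studied window
      browning = (0.02 * time_min**2 + 0.0004 * temp_c**2
                  + 0.01 * time_min * temp_c + rng.normal(0, 0.1, 50))  # synthetic scores

      def design(t, T):
          """Quadratic response-surface design matrix: [1, t, T, t^2, T^2, t*T]."""
          return np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])

      coeffs, *_ = np.linalg.lstsq(design(time_min, temp_c), browning, rcond=None)
      print(coeffs)                                                    # fitted surface coefficients
      print(design(np.array([10.0]), np.array([180.0])) @ coeffs)      # predicted score at 10 min, 180 degC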

  11. Airborne imaging spectrometers developed in China

    NASA Astrophysics Data System (ADS)

    Wang, Jianyu; Xue, Yongqi

    1998-08-01

    Airborne imaging spectral technology, principle means in airborne remote sensing, has been developed rapidly both in the world and in China recently. This paper describes Modular Airborne Imaging Spectrometer (MAIS), Operational Modular Airborne Imaging Spectrometer (OMAIS) and Pushbroom Hyperspectral Imagery (PHI) that have been developed or are being developed in Airborne Remote Sensing Lab of Shanghai Institute of Technical Physics, CAS.

  12. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array.

    PubMed

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J; Urbas, Augustine

    2016-10-10

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed "algorithmic spectrometry". We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imagery using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the spectrum of blackbody radiation and a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme.
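
    A conceptual sketch of the weighted-superposition idea: channel weights are chosen so that the combined spectral response approximates a narrow band at a target wavelength, and the same weights are then applied to the channel images. The Gaussian filter responses, wavelength grid and image sizes are stand-ins, not the fabricated plasmonic transmission curves.

      import numpy as np

      wl = np.linspace(3.0, 5.0, 201)                     # microns, illustrative mid-IR grid
      centers = np.linspace(3.2, 4.8, 9)
      responses = np.exp(-0.5 * ((wl[None, :] - centers[:, None]) / 0.25) ** 2)  # (9 channels, 201 wl)

      target = np.exp(-0.5 * ((wl - 4.0) / 0.05) ** 2)    # desired narrow band at 4.0 um
      weights, *_ = np.linalg.lstsq(responses.T, target, rcond=None)  # least-squares channel weights

      channel_images = np.random.rand(9, 64, 64)          # stand-in frames from the filter array
      reconstructed = np.tensordot(weights, channel_images, axes=1)   # weighted superposition
      print(reconstructed.shape)                          # (64, 64) image at ~4.0 um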

  13. Experimental Demonstration of Adaptive Infrared Multispectral Imaging using Plasmonic Filter Array

    PubMed Central

    Jang, Woo-Yong; Ku, Zahyun; Jeon, Jiyeon; Kim, Jun Oh; Lee, Sang Jun; Park, James; Noyola, Michael J.; Urbas, Augustine

    2016-01-01

    In our previous theoretical study, we performed target detection using a plasmonic sensor array incorporating the data-processing technique termed “algorithmic spectrometry”. We achieved the reconstruction of a target spectrum by extracting intensity at multiple wavelengths with high resolution from the image data obtained from the plasmonic array. The ultimate goal is to develop a full-scale focal plane array with a plasmonic opto-coupler in order to move towards the next generation of versatile infrared cameras. To this end, and as an intermediate step, this paper reports the experimental demonstration of adaptive multispectral imagery using fabricated plasmonic spectral filter arrays and proposed target detection scenarios. Each plasmonic filter was designed using periodic circular holes perforated through a gold layer, and an enhanced target detection strategy was proposed to refine the original spectrometry concept for spatial and spectral computation of the data measured from the plasmonic array. Both the spectrum of blackbody radiation and a metal ring object at multiple wavelengths were successfully reconstructed using the weighted superposition of plasmonic output images as specified in the proposed detection strategy. In addition, plasmonic filter arrays were theoretically tested on a target at extremely high temperature as a challenging scenario for the detection scheme. PMID:27721506

  14. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  15. Airborne Tactical Crossload Planner

    DTIC Science & Technology

    2017-12-01

    set out in the Airborne Standard Operating Procedure (ASOP). Subject terms: crossload, airborne, optimization, integer linear programming. ...they land to their respective sub-mission locations. In this thesis, we formulate and implement an integer linear program called the Tactical... to meet any desired crossload objectives. We demonstrate TCP with two real-world tactical problems from recent airborne operations: one by the

  16. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  17. Quantitatively differentiating microstructural variations of skeletal muscle tissues by multispectral Mueller matrix imaging

    NASA Astrophysics Data System (ADS)

    Dong, Yang; He, Honghui; He, Chao; Ma, Hui

    2016-10-01

    Polarized light is sensitive to the microstructures of biological tissues and can be used to detect physiological changes. Meanwhile, spectral features of the scattered light can also provide abundant microstructural information about tissues. In this paper, we acquire backscattering polarization Mueller matrix images of bovine skeletal muscle tissues over a 24-hour experimental period, and analyze their multispectral behavior using quantitative Mueller matrix parameters. During the rigor mortis and proteolysis of the muscle samples, multispectral frequency distribution histograms (FDHs) of the Mueller matrix elements can reveal rich qualitative structural information. In addition, we analyze the temporal variations of the sample using the multispectral Mueller matrix transformation (MMT) parameters. The experimental results indicate that the different stages of rigor mortis and proteolysis for bovine skeletal muscle samples can be judged from these MMT parameters. The results presented in this work show that, combined with the multispectral technique, the FDHs and MMT parameters can characterize the microstructural variations of skeletal muscle tissues. The techniques have the potential to be used as tools for quantitative assessment of meat quality in the food industry.
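
    A minimal sketch of the frequency distribution histogram (FDH) step, assuming one Mueller matrix element image per wavelength; the images below are synthetic stand-ins, and the MMT parameters themselves are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(3)
      wavelengths_nm = [500, 550, 600, 650]
      # stand-in m22 element images (values in [-1, 1]) for each wavelength
      m22_images = {w: np.clip(rng.normal(0.3 + 0.05 * i, 0.1, (128, 128)), -1, 1)
                    for i, w in enumerate(wavelengths_nm)}

      bins = np.linspace(-1, 1, 101)
      for w, img in m22_images.items():
          hist, _ = np.histogram(img, bins=bins, density=True)  # normalized FDH of the element
          print(w, "nm: FDH peak near", bins[np.argmax(hist)])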

  18. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservoir, Azusa, Los Angeles County, CA

  19. Multispectral linear array visible and shortwave infrared sensors

    NASA Astrophysics Data System (ADS)

    Tower, J. R.; Warren, F. B.; Pellon, L. E.; Strong, R.; Elabd, H.; Cope, A. D.; Hoffmann, D. M.; Kramer, W. M.; Longsderff, R. W.

    1984-08-01

    All-solid state pushbroom sensors for multispectral linear array (MLA) instruments to replace mechanical scanners used on LANDSAT satellites are introduced. A buttable, four-spectral-band, linear-format charge coupled device (CCD) and a buttable, two-spectral-band, linear-format, shortwave infrared CCD are described. These silicon integrated circuits may be butted end to end to provide multispectral focal planes with thousands of contiguous, in-line photosites. The visible CCD integrated circuit is organized as four linear arrays of 1024 pixels each. Each array views the scene in a different spectral window, resulting in a four-band sensor. The shortwave infrared (SWIR) sensor is organized as 2 linear arrays of 512 detectors each. Each linear array is optimized for performance at a different wavelength in the SWIR band.

  20. AIRES: An Airborne Infra-Red Echelle Spectrometer for SOFIA

    NASA Technical Reports Server (NTRS)

    Dotson, Jessie J.; Erickson, Edwin F.; Haas, Michael R.; Colgan, Sean W. J.; Simpson, Janet P.; Telesco, Charles M.; Pina, Robert K.; Wolf, Juergen; Young, Erick T.

    1999-01-01

    SOFIA will enable astronomical observations with unprecedented angular resolution at infrared wavelengths obscured from the ground. To help open this new chapter in the exploration of the infrared universe, we are building AIRES, an Airborne Infra-Red Echelle Spectrometer. AIRES will be operated as a first-generation, general-purpose facility instrument by USRA, NASA's prime contractor for SOFIA. AIRES is a long-slit spectrograph operating from 17 to 210 microns. In high-resolution mode the spectral resolving power is approximately 10^6 microns/λ, or approximately 10^4 at 100 microns. Unfortunately, since the conference, a low-resolution mode with resolving power about 100 times lower has been deleted due to budgetary constraints. AIRES includes a slit-viewing camera which operates in broad bands at 18 and 25 microns.