Science.gov

Sample records for airborne multispectral camera

  1. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS) sensor with 5616 × 3744 pixels. One came...

  2. A Multispectral Image Creating Method for a New Airborne Four-Camera System with Different Bandpass Filters

    PubMed Central

    Li, Hanlun; Zhang, Aiwu; Hu, Shaoxing

    2015-01-01

    This paper describes an airborne high-resolution four-camera multispectral system which mainly consists of four identical monochrome cameras equipped with four interchangeable bandpass filters. For this multispectral system, an automatic multispectral data composing method was proposed. The homography registration model was chosen, and the scale-invariant feature transform (SIFT) and random sample consensus (RANSAC) algorithms were used to generate matching points. For the difficult registration problem between visible-band and near-infrared-band images in scenes lacking manmade objects, we present an effective method based on the structural characteristics of the system. Experiments show that our method can acquire high-quality multispectral images, and the band-to-band alignment error of the composed multispectral images is less than 2.5 pixels. PMID:26205264
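
    The band-to-band registration step described above (SIFT matches filtered with RANSAC to fit a homography) can be sketched with OpenCV as below. This is a minimal illustration, not the authors' pipeline: the file names, the Lowe ratio of 0.75 and the RANSAC reprojection threshold of 3.0 pixels are assumptions.

```python
# Minimal sketch of SIFT + RANSAC homography registration between two
# spectral band images. Requires opencv-contrib-python; file names and
# thresholds are illustrative placeholders.
import cv2
import numpy as np

ref = cv2.imread("band_green.tif", cv2.IMREAD_GRAYSCALE)   # reference band
mov = cv2.imread("band_nir.tif", cv2.IMREAD_GRAYSCALE)     # band to align

sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(ref, None)
kp_mov, des_mov = sift.detectAndCompute(mov, None)

# Match descriptors and keep matches passing Lowe's ratio test
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des_mov, des_ref, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

src = np.float32([kp_mov[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

# Robustly estimate the homography with RANSAC, then warp the moving band
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
aligned = cv2.warpPerspective(mov, H, (ref.shape[1], ref.shape[0]))
```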

  3. Airborne multispectral identification of individual cotton plants using consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although multispectral remote sensing using consumer-grade cameras has successfully identified fields of small cotton plants, improvements to detection sensitivity are needed to identify individual or small clusters of plants. The imaging sensors of consumer-grade cameras are based on a Bayer patter...

  4. Assessing the application of an airborne intensified multispectral video camera to measure chlorophyll a in three Florida estuaries

    SciTech Connect

    Dierberg, F.E.; Zaitzeff, J.

    1997-08-01

    After absolute and spectral calibration, an airborne intensified, multispectral video camera was field tested for water quality assessments over three Florida estuaries (Tampa Bay, Indian River Lagoon, and the St. Lucie River Estuary). Univariate regression analysis of upwelling spectral energy vs. ground-truthed uncorrected chlorophyll a (Chl a) for each estuary yielded lower coefficients of determination (R²) with increasing concentrations of Gelbstoff within an estuary. More predictive relationships were established by adding true color as a second independent variable in a bivariate linear regression model. These regressions successfully explained most of the variation in upwelling light energy (R² = 0.94, 0.82 and 0.74 for the Tampa Bay, Indian River Lagoon, and St. Lucie estuaries, respectively). Ratioed wavelength bands within the 625-710 nm range produced the highest correlations with ground-truthed uncorrected Chl a, and were similar to those reported as being the most predictive for Chl a in Tennessee reservoirs. However, the ratioed wavebands producing the best predictive algorithms for Chl a differed among the three estuaries due to the effects of varying concentrations of Gelbstoff on upwelling spectral signatures, which precluded combining the data into a common data set for analysis.
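
    The bivariate regression idea above (a 625-710 nm band ratio modeled as a linear function of Chl a and true color) reduces to an ordinary least-squares fit with two predictors. The sketch below uses made-up placeholder arrays, not the study's data; the wavelengths and units in the comments are assumptions.

```python
# Hedged sketch of a bivariate linear regression: band ratio ~ Chl a + color.
# All numbers are illustrative placeholders.
import numpy as np

chl_a = np.array([2.1, 5.4, 8.7, 12.3, 20.5])        # ground-truthed Chl a (ug/L), illustrative
color = np.array([15.0, 22.0, 30.0, 41.0, 55.0])     # true color (Pt-Co units), illustrative
band_ratio = np.array([0.82, 0.95, 1.10, 1.31, 1.62])  # e.g. L(671 nm)/L(625 nm)

# Design matrix [1, Chl a, color] for ordinary least squares
X = np.column_stack([np.ones_like(chl_a), chl_a, color])
coef, *_ = np.linalg.lstsq(X, band_ratio, rcond=None)

fitted = X @ coef
ss_res = np.sum((band_ratio - fitted) ** 2)
ss_tot = np.sum((band_ratio - band_ratio.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"coefficients: {coef}, R^2 = {r2:.3f}")
```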

  5. Study on airborne multispectral imaging fusion detection technology

    NASA Astrophysics Data System (ADS)

    Ding, Na; Gao, Jiaobo; Wang, Jun; Cheng, Juan; Gao, Meng; Gao, Fei; Fan, Zhe; Sun, Kefeng; Wu, Jun; Li, Junna; Gao, Zedong; Cheng, Gang

    2014-11-01

    An airborne multispectral imaging fusion detection technology is proposed in this paper. In this design, the airborne multispectral imaging system consists of a multispectral camera, an image processing unit, and a stabilized platform. The multispectral camera operates in the spectral region from the visible to the near infrared (0.4-1.0 µm); it has four identical, independent imaging channels, and sixteen typical wavelengths can be selected according to the targets and backgrounds of interest. Experiments were carried out with the airborne multispectral imaging system. In particular, camouflaged targets were detected through image fusion in different complex environments, such as land vegetation backgrounds, hot desert backgrounds, and underwater scenes. Within the 0.4-1.0 µm region, three characteristic wavelengths are selected from the sixteen typical wavelengths and combined according to the background and target. The spectral images corresponding to the three characteristic wavelengths are registered and fused in real time by the image processing unit, and a fused video highlighting the target properties is output. In these fused images, the contrast between target and background is greatly increased. Experimental results confirm that the airborne multispectral imaging fusion detection technology can acquire high-contrast multispectral fusion images in real time and can detect and identify camouflaged objects against complex backgrounds, including targets underwater.

  6. Airborne system for testing multispectral reconnaissance technologies

    NASA Astrophysics Data System (ADS)

    Schmitt, Dirk-Roger; Doergeloh, Heinrich; Keil, Heiko; Wetjen, Wilfried

    1999-07-01

    There is an increasing demand for future airborne reconnaissance systems to obtain aerial images for tactical or peacekeeping operations. Unmanned Aerial Vehicles (UAVs) equipped with multispectral sensor systems and real-time, jam-resistant data transmission capabilities are of particular interest. An airborne experimental platform has been developed as a testbed to investigate different reconnaissance system concepts before their application in UAVs. It is based on a Dornier DO 228 aircraft, which is used as the flying platform. Great care has been taken to allow testing of different kinds of multispectral sensors: the platform can be equipped with an IR sensor head, high-resolution aerial cameras covering the whole optical spectrum, and radar systems. The onboard equipment further includes systems for digital image processing, compression, coding, and storage. The data are RF-transmitted to the ground station using highly jam-resistant technologies. The images, after merging with enhanced vision components, are delivered to the observer, who has an uplink data channel available to control flight and imaging parameters.

  7. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  8. Airborne multispectral detection of regrowth cotton fields

    NASA Astrophysics Data System (ADS)

    Westbrook, John K.; Suh, Charles P.-C.; Yang, Chenghai; Lan, Yubin; Eyster, Ritchie S.

    2015-01-01

    Effective methods are needed for timely areawide detection of regrowth cotton plants because boll weevils (a quarantine pest) can feed and reproduce on these plants beyond the cotton production season. Airborne multispectral images of regrowth cotton plots were acquired on several dates after three shredding (i.e., stalk destruction) dates. Linear spectral unmixing (LSU) classification was applied to high-resolution airborne multispectral images of regrowth cotton plots to estimate the minimum detectable size and subsequent growth of plants. We found that regrowth cotton fields can be identified when the mean plant width is ~0.2 m for an image resolution of 0.1 m. LSU estimates of canopy cover of regrowth cotton plots correlated well (r² = 0.81) with the ratio of mean plant width to row spacing, a surrogate measure of plant canopy cover. The height and width of regrowth plants were both well correlated (r² = 0.94) with accumulated degree-days after shredding. The results will help boll weevil eradication program managers use airborne multispectral images to detect and monitor the regrowth of cotton plants after stalk destruction, and identify fields that may require further inspection and mitigation of boll weevil infestations.
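
    Linear spectral unmixing of the kind used above expresses each pixel spectrum as a non-negative mixture of endmember spectra (e.g. cotton canopy and soil) and solves for the abundances. A minimal sketch follows; the endmember and pixel values are made-up placeholders, not values from the study.

```python
# Illustrative linear spectral unmixing (LSU) of one pixel into canopy and
# soil fractions via non-negative least squares.
import numpy as np
from scipy.optimize import nnls

# Columns are endmember spectra (cotton canopy, bare soil) over 4 bands
endmembers = np.array([
    [0.04, 0.12],   # blue
    [0.08, 0.18],   # green
    [0.05, 0.25],   # red
    [0.45, 0.30],   # near-infrared
])

pixel = np.array([0.07, 0.12, 0.13, 0.36])  # observed pixel reflectance

# Non-negative least squares gives abundance estimates; normalize to sum to 1
abund, residual = nnls(endmembers, pixel)
fractions = abund / abund.sum()
print("canopy fraction:", fractions[0], "soil fraction:", fractions[1])
```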

  9. Development of a multispectral camera system

    NASA Astrophysics Data System (ADS)

    Sugiura, Hiroaki; Kuno, Tetsuya; Watanabe, Norihiro; Matoba, Narihiro; Hayashi, Junichiro; Miyake, Yoichi

    2000-05-01

    A highly accurate multispectral camera and its application software have been developed as a practical system for capturing digital images of artworks stored in galleries and museums. Instead of recording color data in the conventional three RGB primary colors, the newly developed camera and software carry out a pixel-wise estimation of spectral reflectance, the color data specific to the object, to enable practical multispectral imaging. To realize accurate multispectral imaging, the dynamic range of the camera is set to 14 bits or more and the output to 14 bits, so that capture is possible even when the difference in light quantity between channels is large. Further, a small rotary color filter was developed in parallel to keep the camera at a practical size. We have also developed software capable of selecting the optimum combination of color filters available on the market. Using this software, n types of color filter can be selected from m candidate filters so as to give the minimum Euclidean distance or minimum color difference in CIELAB color space between the actual and estimated spectral reflectances for 147 types of oil paint samples.
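
    The filter-selection idea above can be illustrated by a brute-force search: for every combination of n filters out of m, simulate the camera responses of the training reflectances, reconstruct the spectra linearly, and keep the combination with the smallest Euclidean reconstruction error. The sketch below uses random stand-in spectra; a real run would use measured filter transmittances, the sensor sensitivity and the 147 paint samples, and could also score in CIELAB instead.

```python
# Brute-force filter subset selection minimizing spectral reconstruction error.
# All spectra here are random placeholders.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_wl = 31                                        # e.g. 400-700 nm at 10 nm steps
filters = rng.uniform(0, 1, size=(8, n_wl))      # m = 8 candidate filter responses
samples = rng.uniform(0, 1, size=(147, n_wl))    # training reflectances
n_select = 4

def reconstruction_error(subset):
    S = filters[list(subset)]                # selected spectral responses (n x wl)
    responses = samples @ S.T                # simulated camera signals
    # Linear minimum-norm reconstruction of each reflectance from its responses
    estimate = responses @ np.linalg.pinv(S).T
    return np.sqrt(np.mean((estimate - samples) ** 2))

best = min(itertools.combinations(range(len(filters)), n_select),
           key=reconstruction_error)
print("best filter combination:", best)
```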

  10. Astronaut Jack Lousma works at Multispectral camera experiment

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Astronaut Jack R. Lousma, Skylab 3 pilot, works at the S190A multispectral camera experiment in the Multiple Docking Adapter (MDA), seen from a color television transmission made by a TV camera aboard the Skylab space station cluster in Earth orbit. Lousma later used a small brush to clean the six lenses of the multispectral camera.

  11. Aerosol Remote Sensing Applications for Airborne Multiangle, Multispectral Shortwave Radiometers

    NASA Astrophysics Data System (ADS)

    von Bismarck, Jonas; Ruhtz, Thomas; Starace, Marco; Hollstein, André; Preusker, René; Fischer, Jürgen

    2010-05-01

    Aerosol particles have an important impact on the surface net radiation budget by direct scattering and absorption (direct aerosol effect) of solar radiation, and also by influencing cloud formation processes (semi-direct and indirect aerosol effects). To study the former, a number of multispectral sky- and sunphotometers have been developed at the Institute for Space Sciences of the Free University of Berlin in the past two decades. The latest operational developments were the multispectral aureole- and sunphotometer FUBISS-ASA2, the zenith radiometer FUBISS-ZENITH, and the nadir polarimeter AMSSP-EM, all designed for flexible use on moving platforms like aircraft or ships. Currently the multiangle, multispectral radiometer URMS/AMSSP (Universal Radiation Measurement System/Airborne Multispectral Sunphotometer and Polarimeter) is under construction for a wing pod of the high-altitude research aircraft HALO operated by DLR. The system is expected to have its first mission on HALO in 2011. The algorithms for the retrieval of aerosol and trace gas properties from the recorded multidirectional, multispectral radiation measurements allow more than the derivation of standard products such as the aerosol optical depth and the Angstrom exponent. The radiation measured in the solar aureole contains information about the aerosol phase function and therefore allows conclusions about the particle type. Furthermore, airborne instrument operation allows vertically resolved measurements. An inversion algorithm, based on radiative transfer simulations and additionally including measured vertical zenith-radiance profiles, allows conclusions about the aerosol single scattering albedo and the relative soot fraction in aerosol layers. Ozone column retrieval is performed by evaluating measurements from pixels in the Chappuis absorption band. A retrieval algorithm to derive the water-vapor column from the sunphotometer measurements is currently under development. Of the various airborne

  12. Multispectral Airborne Laser Scanning for Automated Map Updating

    NASA Astrophysics Data System (ADS)

    Matikainen, Leena; Hyyppä, Juha; Litkey, Paula

    2016-06-01

    During the last 20 years, airborne laser scanning (ALS), often combined with multispectral information from aerial images, has shown its high feasibility for automated mapping processes. Recently, the first multispectral airborne laser scanners have been launched, and multispectral information is for the first time directly available for 3D ALS point clouds. This article discusses the potential of this new single-sensor technology in map updating, especially in automated object detection and change detection. For our study, Optech Titan multispectral ALS data over a suburban area in Finland were acquired. Results from a random forests analysis suggest that the multispectral intensity information is useful for land cover classification, including for ground-surface objects and classes such as roads. An out-of-bag estimate of classification error was about 3% for separating the classes asphalt, gravel, rocky areas and low vegetation from each other. For buildings and trees, it was under 1%. According to feature importance analyses, multispectral features based on several channels were more useful than those based on one channel. Automatic change detection utilizing the new multispectral ALS data, an old digital surface model (DSM) and old building vectors was also demonstrated. Overall, our first analyses suggest that the new data are very promising for further increasing the automation level in mapping. The multispectral ALS technology is independent of external illumination conditions, and intensity images produced from the data do not include shadows. These are significant advantages when the development of automated classification and change detection procedures is considered.
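
    The random forests step with an out-of-bag (OOB) error estimate, as used above, can be sketched in a few lines of scikit-learn. The feature and label arrays below are random placeholders standing in for per-point or per-cell multispectral intensity and height features; the number of trees is an assumption.

```python
# Hedged sketch: random forest land-cover classification with OOB error.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 6))      # e.g. intensities of 3 channels + height features (placeholder)
y = rng.integers(0, 6, size=5000)   # classes: asphalt, gravel, rock, low veg., building, tree

clf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
clf.fit(X, y)

print("OOB error:", 1.0 - clf.oob_score_)          # out-of-bag estimate of classification error
print("feature importances:", clf.feature_importances_)
```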

  13. CCD video camera and airborne applications

    NASA Astrophysics Data System (ADS)

    Sturz, Richard A.

    2000-11-01

    The human need to see for oneself, and to do so remotely, has given rise to video camera applications never before imagined, and the list is constantly growing. The instant understanding and verification offered by video lends its application to every facet of life. Once an entertainment medium, video is now ever-present in our daily life. Application to the aircraft platform is one aspect of the video camera's versatility; integrating the video camera into the aircraft platform is yet another story. The typical video camera, when applied to more standard scene imaging, faces less demanding parameters and considerations. This paper explores the video camera as applied to the more complicated airborne environment.

  14. A tiny VIS-NIR snapshot multispectral camera

    NASA Astrophysics Data System (ADS)

    Geelen, Bert; Blanch, Carolina; Gonzalez, Pilar; Tack, Nicolaas; Lambrechts, Andy

    2015-03-01

    Spectral imaging can reveal a lot of hidden details about the world around us, but is currently confined to laboratory environments due to the need for complex, costly and bulky cameras. Imec has developed a unique spectral sensor concept in which the spectral unit is monolithically integrated on top of a standard CMOS image sensor at wafer level, hence enabling the design of compact, low-cost and high-acquisition-speed spectral cameras with high design flexibility. This flexibility has previously been demonstrated by imec in the form of three spectral camera architectures: firstly a high spatial and spectral resolution scanning camera, secondly a multichannel snapshot multispectral camera and thirdly a per-pixel mosaic snapshot spectral camera. These snapshot spectral cameras sense an entire multispectral data cube at one discrete point in time, extending the domain of spectral imaging towards dynamic, video-rate applications. This paper describes the integration of our per-pixel mosaic snapshot spectral sensors inside a tiny, portable and extremely user-friendly camera. Our prototype demonstrator cameras can acquire multispectral image cubes, either of 272x512 pixels over 16 bands in the VIS (470-620 nm) or of 217x409 pixels over 25 bands in the VNIR (600-900 nm), at 170 cubes per second for normal machine vision illumination levels. The cameras themselves, based on Ximea xiQ cameras, are extremely compact, measuring only 26x26x30 mm, and can be operated from a laptop-based USB3 connection, making them easily deployable in very diverse environments.

  15. Sandia Multispectral Airborne Lidar for UAV Deployment

    SciTech Connect

    Daniels, J.W.; Hargis, P.J., Jr.; Henson, T.D.; Jordan, J.D.; Lang, A.R.; Schmitt, R.L.

    1998-10-23

    Sandia National Laboratories has initiated the development of an airborne system for UV laser remote sensing measurements. System applications include the detection of effluents associated with the proliferation of weapons of mass destruction and the detection of biological weapon aerosols. This paper discusses the status of the conceptual design development and plans for both the airborne payload (pointing and tracking, laser transmitter, and telescope receiver) and the Altus unmanned aerospace vehicle platform. Hardware design constraints necessary to maintain the system weight, power, and volume limitations of the flight platform are identified.

  16. Generic MSFA mosaicking and demosaicking for multispectral cameras

    NASA Astrophysics Data System (ADS)

    Miao, Lidan; Qi, Hairong; Ramanath, Rajeev

    2006-02-01

    In this paper, we investigate the potential application of multispectral filter array (MSFA) techniques in multispectral imaging, motivated by their low cost, exact registration, and strong robustness. In both human and many animal visual systems, different types of photoreceptors are organized into mosaic patterns. This behavior has been emulated in industry in the color filter array (CFA) used in the manufacture of digital color cameras. In this way, only one color component is measured at each pixel, and the sensed image is a mosaic of different color bands. We extend this idea to multispectral imaging by developing generic mosaicking and demosaicking algorithms. The binary-tree-driven MSFA design process guarantees that the pixel distributions of different spectral bands are uniform and highly correlated. These spatial features facilitate the design of the generic demosaicking algorithm based on the same binary tree, which considers three interrelated issues: band selection, pixel selection, and interpolation. We evaluate the reconstructed images from two aspects: reconstruction quality and target classification performance. The experimental results demonstrate that the mosaicking and demosaicking process preserves image quality effectively, which further supports the MSFA technique as a feasible solution for multispectral cameras.
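
    A toy version of the MSFA idea above, for a simple 2x2, four-band pattern (a direct generalization of the Bayer CFA), is sketched below: mosaicking samples one band per pixel, and each band is then filled in by normalized bilinear interpolation. The binary-tree design and the band/pixel-selection logic of the paper are not reproduced; the 2x2 offsets and kernel are assumptions for illustration only.

```python
# Toy MSFA mosaicking and bilinear demosaicking for a 2x2, 4-band pattern.
import numpy as np
from scipy.ndimage import convolve

OFFSETS = [(0, 0), (0, 1), (1, 0), (1, 1)]      # band b is sampled at phase (dy, dx)

def mosaic(cube):
    """cube: (H, W, 4) multispectral image -> single-plane MSFA raw image."""
    h, w, _ = cube.shape
    raw = np.zeros((h, w), dtype=cube.dtype)
    for b, (dy, dx) in enumerate(OFFSETS):
        raw[dy::2, dx::2] = cube[dy::2, dx::2, b]
    return raw

def demosaic(raw):
    """Normalized bilinear interpolation of each band from its sparse samples."""
    h, w = raw.shape
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    cube = np.zeros((h, w, 4))
    for b, (dy, dx) in enumerate(OFFSETS):
        sparse = np.zeros((h, w))
        sparse[dy::2, dx::2] = raw[dy::2, dx::2]
        weight = np.zeros((h, w))
        weight[dy::2, dx::2] = 1.0
        num = convolve(sparse, kernel, mode="mirror")
        den = convolve(weight, kernel, mode="mirror")
        cube[..., b] = num / np.maximum(den, 1e-9)
    return cube
```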

  17. Calibration of a multispectral camera system using interference filters

    NASA Astrophysics Data System (ADS)

    Nishi, Shogo; Tominaga, Shoji

    2011-08-01

    The present paper proposes a calibration method for a multispectral camera system using interference filters. Spectral image processing is effective for acquiring the inherent information of an object in a general way. However, filter registration errors often occur when interference filters are used. Therefore, a calibration method for correcting the observed images is presented. Moreover, we describe a method for digital archiving of oil paintings based on the present imaging system.

  18. [Radiometric calibration of LCTF-based multispectral area CCD camera].

    PubMed

    Du, Li-Li; Yi, Wei-Ning; Zhang, Dong-Ying; Huang, Hong-Lian; Qiao, Yan-Li; Zhang, Xie

    2011-01-01

    The multispectral area-CCD camera based on a liquid crystal tunable filter (LCTF) is a new spectral imaging system, which records the image at one wavelength at a time on the area CCD by utilizing the electrically controlled birefringence of the liquid crystal and the interference of polarized light. Because of the special working principles of the LCTF and the frame-transfer area CCD, existing radiometric calibration methods cannot meet the precision needs of remote sensing applications when applied to an LCTF camera. An improved radiometric calibration method is proposed, in which the camera performance test and calibration experiment are carried out using an integrating sphere and a standard detector, and the absolute calibration coefficient is calculated after correcting frame-transfer smear and improving the data processing algorithm. The validity of the laboratory calibration coefficient is then checked by a field validation experiment. Experimental results indicate that the calibration coefficient is valid and that the radiation information at the ground can be accurately retrieved from the calibrated image data. With the radiometric calibration of the LCTF camera accomplished and the calibration precision improved, the application field of the image data acquired by the camera can be extended effectively.

  19. Airborne multisensor pod system (AMPS) data: Multispectral data integration and processing hints

    SciTech Connect

    Leary, T.J.; Lamb, A.

    1996-11-01

    The Department of Energy's Office of Arms Control and Non-Proliferation (NN-20) has developed a suite of airborne remote sensing systems that simultaneously collect coincident data from a US Navy P-3 aircraft. The primary objective of the Airborne Multisensor Pod System (AMPS) Program is "to collect multisensor data that can be used for data research, both to reduce interpretation problems associated with data overload and to develop information products more complete than can be obtained from any single sensor." The sensors are housed in wing-mounted pods and include: a Ku-Band Synthetic Aperture Radar; a CASI Hyperspectral Imager; a Daedalus 3600 Airborne Multispectral Scanner; a Wild Heerbrugg RC-30 motion-compensated large-format camera; various high-resolution, light-intensified and thermal video cameras; and several experimental sensors (e.g. the Portable Hyperspectral Imager for Low-Light Spectroscopy (PHILLS)). Over the past year or so, the Coastal Marine Resource Assessment (CAMRA) group at the Florida Department of Environmental Protection's Marine Research Institute (FMRI) has been working with the Department of Energy through the Naval Research Laboratory to develop applications and products from existing data. Considerable effort has been spent identifying image formats and integration parameters. 2 refs., 3 figs., 2 tabs.

  20. Development of a portable multispectral thermal infrared camera

    NASA Technical Reports Server (NTRS)

    Osterwisch, Frederick G.

    1991-01-01

    The purpose of this research and development effort was to design and build a prototype instrument designated the 'Thermal Infrared Multispectral Camera' (TIRC). The Phase 2 effort was a continuation of the Phase 1 feasibility study and preliminary design for such an instrument. The completed instrument, designated AA465, has application in the field of geologic remote sensing and exploration. The AA465 Thermal Infrared Camera (TIRC) System is a field-portable multispectral thermal infrared camera operating over the 8.0 - 13.0 micron wavelength range. Its primary function is to acquire two-dimensional thermal infrared images of user-selected scenes. Thermal infrared energy emitted by the scene is collected, dispersed into ten 0.5 micron wide channels, and then measured and recorded by the AA465 System. This multispectral information is presented in real time on a color display to be used by the operator to identify spectral and spatial variations in the scene's emissivity and/or irradiance. This fundamental instrument capability has a wide variety of commercial and research applications. While ideally suited for two-man operation in the field, the AA465 System can be transported and operated effectively by a single user. Functionally, the instrument operates as if it were a single-exposure camera. System measurement sensitivity requirements dictate relatively long (several minutes) instrument exposure times. As such, the instrument is not suited for recording time-variant information. The AA465 was fabricated, assembled, tested, and documented during this Phase 2 work period. The detailed design and fabrication of the instrument were performed during the period of June 1989 to July 1990. The software development effort and instrument integration/test extended from July 1990 to February 1991. Software development included an operator interface/menu structure, instrument internal control functions, DSP image processing code, and a display algorithm coding program. The

  1. The new airborne Thermal Infrared Multispectral Scanner (TIMS)

    NASA Technical Reports Server (NTRS)

    Kahle, A. B.

    1983-01-01

    A new airborne Thermal Infrared Multispectral Scanner (TIMS) with six bands between 8 and 12 microns is briefly characterized, and some results of remote sensing experiments are reported. The instrument has an instantaneous field of view of 2.5 milliradians, a total field of view of 80 deg, and a NE Delta T of approximately 0.1-0.3 C depending on the band. In the TIMS image of Death Valley, silica-rich rocks were easily separable from the nonsilicates. The Eureka Quartzite stood out in sharp contrast to other Ordovician and Cambrian metasediments, and Tertiary volcanic rocks were easily separable from both. Also distinguishable were various units in the fan gravels.

  2. Laboratory and field portable system for calibrating airborne multispectral scanners

    SciTech Connect

    Kuhlow, W.W.

    1981-01-01

    Manufacturers of airborne multispectral scanners suggest procedures for calibration and alignment that are usually awkward and even questionable. For example, the procedures may require separating the scanner from calibration and alignment sources by 100 feet or more, employing folding mirrors, tampering with the detectors after the procedures are finished, etc. Under the best of conditions, such procedures require about three hours and yield questionable confidence in the results; under many conditions, however, the procedures commonly take six to eight hours and yield no satisfactory results. EG&G, Inc. has designed and built a calibration and alignment system for airborne scanners which solves those problems, permitting the procedures to be carried out in about two to three hours. This equipment can be quickly disassembled, transported with the scanner in all but the smallest single-engine aircraft, and reassembled in a few hours. The subsystems of this equipment are commonly available from manufacturers of optical and electronic equipment. The other components are easily purchased or fabricated. The scanner discussed is the Model DS-1260 digital line scanner manufactured by Daedalus Enterprises, Inc. It is a dual-sensor system which is operated with one of two combinations of sensors: one spectrometer head (which provides simultaneous coverage in ten visible channels) and one thermal infrared detector, or simply two thermal infrared detectors.

  3. Michigan experimental multispectral mapping system: A description of the M7 airborne sensor and its performance

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.

    1974-01-01

    The development and characteristics of a multispectral band scanner for an airborne mapping system are discussed. The sensor operates in the ultraviolet, visual, and infrared frequencies. Any twelve of the bands may be selected for simultaneous, optically registered recording on a 14-track analog tape recorder. Multispectral imagery recorded on magnetic tape in the aircraft can be laboratory reproduced on film strips for visual analysis or optionally machine processed in analog and/or digital computers before display. The airborne system performance is analyzed.

  4. Use of airborne multispectral video data for water quality evaluation in Sandy Hook, New Jersey

    NASA Astrophysics Data System (ADS)

    Bagheri, Sima; Stein, Matt

    1992-05-01

    A local mission of short duration was carried out to investigate the relationship between signals acquired by an airborne multispectral camera (MSC-02) developed by XYbion Corporation and in situ water sampling. The MSC-02 was used to produce video images in six spectral bands in the reflective and near-infrared region of the spectrum from which all below-surface hydrological signals originate. Images of halon-coated panels were obtained in all bands to calculate relative radiometric calibration functions. These functions were applied to corresponding spectral images to calculate relative radiances of both panel and estuarine water targets. These values were then input to regression equations to establish a correlation between water constituents (organic/inorganic) and MSC-02 signals indicating the degree of eutrophication in the estuary. It is hypothesized that if reliable relationships between MSC-02 data with fine spatial resolution and selected water quality parameters are obtained, then it would be possible to calibrate the concurrently acquired Landsat 5 thematic mapper (TM) data with coarser spatial resolution for monitoring of estuarine water quality.
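
    The panel-based relative calibration described above amounts to ratioing scene digital numbers against the mean signal of the halon panel imaged in the same band. A minimal sketch follows; the function name, array shapes and the assumed panel reflectance are illustrative, not values from the study.

```python
# Simplified panel-based relative radiometric calibration for one band.
import numpy as np

def relative_radiance(band_dn, panel_dn, panel_reflectance=0.99):
    """band_dn: (H, W) scene digital numbers; panel_dn: pixels over the panel."""
    panel_mean = float(np.mean(panel_dn))
    # Relative (unitless) radiance, scaled by the panel's assumed reflectance
    return band_dn.astype(float) / panel_mean * panel_reflectance

# Applied per spectral band; the relative radiances of the water targets can
# then be regressed against the measured water-quality parameters.
```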

  5. Comparison of different detection methods for citrus greening disease based on airborne multispectral and hyperspectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Citrus greening, or Huanglongbing (HLB), is a devastating disease that has spread through many citrus groves since it was first found in Florida in 2005. Multispectral (MS) and hyperspectral (HS) airborne images of citrus groves in Florida were taken to detect citrus greening-infected trees in 2007 and 2010. Ground truthi...

  6. Television camera on RMS surveys insulation on Airborne Support Equipment

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The television camera on the end effector of the Canadian-built Remote Manipulator System (RMS) is seen surveying some of the insulation on the Airborne Support Equipment (ASE). Flight controllers called for the survey following the departure of the Advanced Communications Technology Satellite (ACTS) and its Transfer Orbit Stage (TOS).

  7. Evaluating the Potential of Multispectral Airborne LIDAR for Topographic Mapping and Land Cover Classification

    NASA Astrophysics Data System (ADS)

    Wichmann, V.; Bremer, M.; Lindenberger, J.; Rutzinger, M.; Georges, C.; Petrini-Monteferri, F.

    2015-08-01

    Multispectral LiDAR has recently become a promising research field for enhanced LiDAR classification workflows and, for example, the assessment of vegetation health. Current analyses of multispectral LiDAR are mainly based on experimental setups, which are often of limited transferability to operational tasks. In late 2014, Optech Inc. announced the first commercially available multispectral LiDAR system for airborne topographic mapping. The combined system makes synchronous multispectral LiDAR measurements possible, solving the time-shift problems of experimental acquisitions. This paper presents an exploratory analysis of the first airborne-collected data with a focus on class-specific spectral signatures. Spectral patterns are used for a classification approach, which is evaluated against a manual reference classification. Typical spectral patterns comparable to optical imagery could be observed for homogeneous and planar surfaces. For rough and volumetric objects such as trees, the spectral signature becomes biased by signal modification due to multi-return effects. However, we show that this first flight data set is suitable for conventional geometric classification and mapping procedures. Additional classes such as sealed and unsealed ground can be separated with high classification accuracy. For vegetation classification, the distinction of species and health classes is possible.

  8. Calibrated and geocoded clutter from an airborne multispectral scanner

    NASA Astrophysics Data System (ADS)

    Heuer, Markus; Bruehlmann, Ralph; John, Marc-Andre; Schmid, Konrad J.; Hueppi, Rudolph; Koenig, Reto

    1999-07-01

    Robustness of automatic target recognition (ATR) to varying observation conditions and countermeasures is substantially increased by the use of multispectral sensors. Assessment of such ATR systems is performed by captive flight tests and simulations (HWIL or complete modeling). Although the clutter components of a scene can be generated with specified statistics, clutter maps directly obtained from measurement are required for validation of a simulation. In addition, urban scenes have non-stationary characteristics and are difficult to simulate. The present paper describes a scanner, data acquisition and processing system used for the generation of realistic clutter maps incorporating infrared, passive and active millimeter wave channels. The sensors are mounted on a helicopter with coincident line-of-sight, enabling us to measure consistent clutter signatures under varying observation conditions. Position and attitude data from GPS and an inertial measurement unit, respectively, are used to geometrically correct the raw scanner data. After sensor calibration, the original voltage signals are converted to physical units, i.e. temperatures and reflectivities, describing the clutter independently of the scanning sensor, thus allowing the clutter maps to be used in tests of a priori unknown multispectral sensors. The data correction procedures are described and results are presented.

  9. Determining density of maize canopy. 2: Airborne multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.; Cipra, J. E.

    1971-01-01

    Multispectral scanner data were collected in two flights over a light-colored soil background cover plot at an altitude of 305 m. Energy in eleven reflective wavelength bands from 0.45 to 2.6 microns was recorded. Four growth stages of maize (Zea mays L.) gave a wide range of canopy densities for each flight date. Leaf area index measurements were taken from the twelve subplots and were used as a measure of canopy density. Ratio techniques were used to relate uncalibrated scanner response to leaf area index. The ratios of scanner data values for the 0.72 to 0.92 micron wavelength band over the 0.61 to 0.70 micron wavelength band were calculated for each plot. The ratios related very well to leaf area index for a given flight date. The results indicated that spectral data from maize canopies could be of value in determining canopy density.

  10. Multispectral filter wheel cameras: modeling aberrations for filters in front of lens

    NASA Astrophysics Data System (ADS)

    Klein, Julie; Aach, Til

    2012-01-01

    Aberrations occur in multispectral cameras featuring filter wheels because of color filters with different optical properties being present in the ray path. In order to ensure an exact compensation of these aberrations, a mathematical model of the distortions has to be developed and its parameters have to be calculated using the measured data. Such a model already exists for optical filters placed between the sensor and the lens, but not for bandpass filters placed in front of the lens. For this configuration, the rays are first distorted by the filters and then by the lens. In this paper, we derive a model for aberrations caused by filters placed in front of the lens in multispectral cameras. We compare this model with distortions obtained with simulations as well as with distortions measured during real multispectral acquisitions. In both cases, the difference between modeled and measured aberrations remains low, which corroborates the physical model. Multispectral acquisitions with filters placed between the sensor and the lens or in front of the lens are compared: the latter exhibit smaller distortions and the aberrations in both images can be compensated using the same algorithm.

  11. Cubic spline reflectance estimates using the Viking lander camera multispectral data

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Huck, F. O.

    1976-01-01

    A technique was formulated for constructing spectral reflectance estimates from multispectral data obtained with the Viking lander cameras. The output of each channel was expressed as a linear function of the unknown spectral reflectance producing a set of linear equations which were used to determine the coefficients in a representation of the spectral reflectance estimate as a natural cubic spline. The technique was used to produce spectral reflectance estimates for a variety of actual and hypothetical spectral reflectances.

  12. Evaluation of eelgrass beds mapping using a high-resolution airborne multispectral scanner

    USGS Publications Warehouse

    Su, H.; Karna, D.; Fraim, E.; Fitzgerald, M.; Dominguez, R.; Myers, J.S.; Coffland, B.; Handley, L.R.; Mace, T.

    2006-01-01

    Eelgrass (Zostera marina) can provide vital ecological functions in stabilizing sediments, influencing current dynamics, and contributing significant amounts of biomass to numerous food webs in coastal ecosystems. Mapping eelgrass beds is important for coastal water and nearshore estuarine monitoring, management, and planning. This study demonstrated the possible use of a high spatial (approximately 5 m) and temporal (maximum low tide) resolution airborne multispectral scanner for mapping eelgrass beds in northern Puget Sound, Washington. A combination of supervised and unsupervised classification approaches was applied to the multispectral scanner imagery. A normalized difference vegetation index (NDVI), derived from the red and near-infrared bands, together with ancillary spatial information, was used to extract and mask eelgrass beds and other submerged aquatic vegetation (SAV) in the study area. We evaluated the resulting thematic map (geocoded, classified image) against a conventional aerial photograph interpretation using 260 point locations randomly stratified over five defined classes from the thematic map. We achieved an overall accuracy of 92 percent with a 0.92 kappa coefficient in the study area. This study demonstrates that the airborne multispectral scanner can be useful for mapping eelgrass beds at a local or regional scale, especially in regions for which optical remote sensing from space is constrained by climatic and tidal conditions. © 2006 American Society for Photogrammetry and Remote Sensing.
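
    The NDVI extraction-and-masking step mentioned above reduces to a per-pixel band arithmetic followed by a threshold. A minimal sketch, with the threshold value being an assumption rather than the study's:

```python
# NDVI computation and a simple vegetation mask from red and NIR bands.
import numpy as np

def ndvi(red, nir):
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-9)   # avoid division by zero

def vegetation_mask(red, nir, threshold=0.2):
    """Boolean mask of pixels whose NDVI exceeds an (assumed) threshold."""
    return ndvi(red, nir) > threshold
```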

  13. Towards Automatic Single-Sensor Mapping by Multispectral Airborne Laser Scanning

    NASA Astrophysics Data System (ADS)

    Ahokas, E.; Hyyppä, J.; Yu, X.; Liang, X.; Matikainen, L.; Karila, K.; Litkey, P.; Kukko, A.; Jaakkola, A.; Kaartinen, H.; Holopainen, M.; Vastaranta, M.

    2016-06-01

    This paper describes the possibilities of the Optech Titan multispectral airborne laser scanner in the fields of mapping and forestry. Investigation was targeted to six land cover classes. Multispectral laser scanner data can be used to distinguish land cover classes of the ground surface, including the roads and separate road surface classes. For forest inventory using point cloud metrics and intensity features combined, total accuracy of 93.5% was achieved for classification of three main boreal tree species (pine, spruce and birch). When using intensity features - without point height metrics - a classification accuracy of 91% was achieved for these three tree species. It was also shown that deciduous trees can be further classified into more species. We propose that intensity-related features and waveform-type features are combined with point height metrics for forest attribute derivation in area-based prediction, which is an operatively applied forest inventory process in Scandinavia. It is expected that multispectral airborne laser scanning can provide highly valuable data for city and forest mapping and is a highly relevant data asset for national and local mapping agencies in the near future.

  14. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  15. Development of a portable 3CCD camera system for multispectral imaging of biological samples.

    PubMed

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples.

  16. A spectral reflectance estimation technique using multispectral data from the Viking lander camera

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Huck, F. O.

    1976-01-01

    A technique is formulated for constructing spectral reflectance curve estimates from multispectral data obtained with the Viking lander camera. The multispectral data are limited to six spectral channels in the wavelength range from 0.4 to 1.1 micrometers and most of these channels exhibit appreciable out-of-band response. The output of each channel is expressed as a linear (integral) function of the (known) solar irradiance, atmospheric transmittance, and camera spectral responsivity and the (unknown) spectral reflectance. This produces six equations which are used to determine the coefficients in a representation of the spectral reflectance as a linear combination of known basis functions. Natural cubic spline reflectance estimates are produced for a variety of materials that can be reasonably expected to occur on Mars. In each case the dominant reflectance features are accurately reproduced, but small-period features are lost due to the limited number of channels. This technique may be a valuable aid in selecting the number of spectral channels and their responsivity shapes when designing a multispectral imaging system.
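
    The estimation scheme described above can be written as a small linear system: each channel output is a known linear functional of the unknown reflectance, and expanding the reflectance in a few basis functions gives as many unknowns as channels. The sketch below uses synthetic Gaussian stand-ins for the known spectra and basis; the paper's natural cubic spline basis would simply replace those rows.

```python
# Sketch: channel outputs -> linear system -> reflectance estimate on a grid.
import numpy as np

wl = np.linspace(0.4, 1.1, 141)                     # wavelength grid (micrometers)
dlam = wl[1] - wl[0]

irradiance = np.ones_like(wl)                       # E(lambda), placeholder
transmittance = np.ones_like(wl)                    # T(lambda), placeholder
centers = np.linspace(0.45, 1.0, 6)
# Six channel responsivities R_i(lambda): Gaussian stand-ins
responsivity = np.exp(-((wl[None, :] - centers[:, None]) / 0.05) ** 2)
# Basis B_j(lambda) for the reflectance (broad Gaussians here; natural cubic
# splines in the paper)
basis = np.exp(-((wl[None, :] - centers[:, None]) / 0.15) ** 2)

weighted = responsivity * irradiance * transmittance
M = weighted @ basis.T * dlam                       # M[i, j] = integral E*T*R_i*B_j

true_reflectance = 0.3 + 0.2 * np.sin(4 * wl)       # hypothetical target
channels = weighted @ true_reflectance * dlam       # simulated channel outputs

coeffs, *_ = np.linalg.lstsq(M, channels, rcond=None)
estimate = basis.T @ coeffs                         # reconstructed reflectance on the grid
```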

  18. Development of a Portable 3CCD Camera System for Multispectral Imaging of Biological Samples

    PubMed Central

    Lee, Hoyoung; Park, Soo Hyun; Noh, Sang Ha; Lim, Jongguk; Kim, Moon S.

    2014-01-01

    Recent studies have suggested the need for imaging devices capable of multispectral imaging beyond the visible region, to allow for quality and safety evaluations of agricultural commodities. Conventional multispectral imaging devices lack flexibility in spectral waveband selectivity for such applications. In this paper, a recently developed portable 3CCD camera with significant improvements over existing imaging devices is presented. A beam-splitter prism assembly for 3CCD was designed to accommodate three interference filters that can be easily changed for application-specific multispectral waveband selection in the 400 to 1000 nm region. We also designed and integrated electronic components on printed circuit boards with firmware programming, enabling parallel processing, synchronization, and independent control of the three CCD sensors, to ensure the transfer of data without significant delay or data loss due to buffering. The system can stream 30 frames (3-waveband images in each frame) per second. The potential utility of the 3CCD camera system was demonstrated in the laboratory for detecting defect spots on apples. PMID:25350510

  19. Multi-spectral CCD camera system for ocean water color and seacoast observation

    NASA Astrophysics Data System (ADS)

    Zhu, Min; Chen, Shiping; Wu, Yanlin; Huang, Qiaolin; Jin, Weiqi

    2001-10-01

    One of the Earth-observing instruments on the HY-1 satellite, which will be launched in 2001, the multispectral CCD camera system, was developed by the Beijing Institute of Space Mechanics & Electricity (BISME), Chinese Academy of Space Technology (CAST). From a 798 km orbit, the system can provide images with 250 m ground resolution and a swath of 500 km. It is mainly used for coastal zone dynamic mapping and ocean water color monitoring, covering pollution of offshore and coastal zones, plant cover, water color, ice, underwater terrain, suspended sediment, mudflats, soil and water vapor. The multispectral camera system is composed of four monochrome CCD cameras, which are line-array, push-broom scanning cameras, each covering one of four spectral bands. The camera system adopts view-field registration; that is, each camera scans the same region at the same moment. Each camera contains optics, a focal plane assembly, electrical circuits, an installation structure, a calibration system, thermal control and so on. The primary features of the camera system are: (1) offset of the central wavelength is better than 5 nm; (2) degree of polarization is less than 0.5%; (3) signal-to-noise ratio is about 1000; (4) dynamic range is better than 2000:1; (5) registration precision is better than 0.3 pixel; (6) quantization is 12 bits.

  20. Multispectral integral imaging acquisition and processing using a monochrome camera and a liquid crystal tunable filter.

    PubMed

    Latorre-Carmona, Pedro; Sánchez-Ortiga, Emilio; Xiao, Xiao; Pla, Filiberto; Martínez-Corral, Manuel; Navarro, Héctor; Saavedra, Genaro; Javidi, Bahram

    2012-11-01

    This paper presents an acquisition system and a procedure to capture 3D scenes in different spectral bands. The acquisition system is formed by a monochrome camera and a Liquid Crystal Tunable Filter (LCTF) that allows images to be acquired in different spectral bands within the [480, 680] nm wavelength interval. The Synthetic Aperture Integral Imaging acquisition technique is used to obtain the elemental images for each wavelength. These elemental images are used to computationally obtain reconstructions of the 3D scene at different depth planes. The 3D profile of the acquired scene is also obtained by minimizing the variance of the contributions of the elemental images at each image pixel. Experimental results show the viability of recovering the 3D multispectral information of the scene. Integration of 3D and multispectral information could have important benefits in different areas, including skin cancer detection, remote sensing and pattern recognition, among others.
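
    A hedged sketch of the computational reconstruction step described above: each elemental image is shifted by a disparity proportional to its camera offset divided by the reconstruction depth, and the shifted images are averaged; the per-pixel variance across elemental contributions is the quantity minimized over depth to obtain the 3D profile. The geometry constants and function names below are assumptions, not the experimental values.

```python
# Shift-and-sum reconstruction of one depth plane from elemental images.
import numpy as np

def reconstruct_plane(elementals, offsets_px, depth, scale=1.0):
    """elementals: list of (H, W) images; offsets_px: (N, 2) camera offsets in
    pixel units; depth: reconstruction depth in the same arbitrary units."""
    h, w = elementals[0].shape
    stack = np.zeros((len(elementals), h, w))
    for k, (img, (dy, dx)) in enumerate(zip(elementals, offsets_px)):
        shift_y = int(round(scale * dy / depth))
        shift_x = int(round(scale * dx / depth))
        stack[k] = np.roll(img, (shift_y, shift_x), axis=(0, 1))
    recon = stack.mean(axis=0)
    variance = stack.var(axis=0)     # low variance indicates in-focus pixels
    return recon, variance

# Repeating this for each LCTF band gives a multispectral set of depth-plane
# reconstructions; taking, per pixel, the depth that minimizes the variance
# yields a 3D profile in the spirit of the approach described above.
```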

  1. Integration of multispectral face recognition and multi-PTZ camera automated surveillance for security applications

    NASA Astrophysics Data System (ADS)

    Chen, Chung-Hao; Yao, Yi; Chang, Hong; Koschan, Andreas; Abidi, Mongi

    2013-06-01

    Due to increasing security concerns, a complete security system should consist of two major components: a computer-based face-recognition system and a real-time automated video surveillance system. A computer-based face-recognition system can be used in gate access control for identity authentication. In recent studies, multispectral imaging and fusion of multispectral narrow-band images in the visible spectrum have been employed and proven to enhance recognition performance over conventional broad-band images, especially when the illumination changes. Thus, we present an automated method that specifies the optimal spectral ranges under the given illumination. Experimental results verify the consistent performance of our algorithm via the observation that an identical set of spectral band images is selected under all tested conditions. Our discovery can be practically used for a new customized sensor design associated with given illuminations for improved face recognition performance over conventional broad-band images. In addition, once a person is authorized to enter a restricted area, we still need to continuously monitor his/her activities for the sake of security. Because pan-tilt-zoom (PTZ) cameras are capable of covering a panoramic area and maintaining high resolution imagery for real-time behavior understanding, research in automated surveillance systems with multiple PTZ cameras has become increasingly important. Most existing algorithms require prior knowledge of the intrinsic parameters of the PTZ camera to infer the relative positioning and orientation among multiple PTZ cameras. To overcome this limitation, we propose a novel mapping algorithm that derives the relative positioning and orientation between two PTZ cameras based on a unified polynomial model. This reduces the dependence on knowledge of the intrinsic parameters of the PTZ cameras and their relative positions. Experimental results demonstrate that our proposed algorithm presents substantially

  2. Multispectral airborne laser scanning - a new trend in the development of LiDAR technology

    NASA Astrophysics Data System (ADS)

    Bakuła, K.

    2015-12-01

    Airborne laser scanning (ALS) is one of the most accurate remote sensing techniques for data acquisition where the terrain and its coverage are concerned. Recently, scanners have become able to scan in two or more channels (laser frequencies). This gives rise to the possibility of obtaining diverse information about an area from the different spectral properties of objects. The paper presents an example of a multispectral ALS system - Titan by Optech - together with an analysis of the data, including digital elevation model accuracy and data density. As a result of the study, the high relative accuracy of LiDAR acquisition in three spectral bands was proven. The mean differences between digital terrain models (DTMs) were less than 0.03 m. The data density analysis showed the influence of the laser wavelength: the point clouds that were tested had average densities of 25, 23 and 20 points per square metre, respectively, for the green (G), near-infrared (NIR) and shortwave-infrared (SWIR) lasers. In this paper, the generation of colour composites from orthoimages of laser intensity reflectance, and their classification capabilities for land cover mapping using data from airborne multispectral laser scanning, are also discussed and compared with conventional photogrammetric techniques.

  3. Testing of Land Cover Classification from Multispectral Airborne Laser Scanning Data

    NASA Astrophysics Data System (ADS)

    Bakuła, K.; Kupidura, P.; Jełowicki, Ł.

    2016-06-01

    Multispectral Airborne Laser Scanning provides a new opportunity for airborne data collection. It provides high-density topographic surveying and is also a useful tool for land cover mapping. Use of a minimum of three intensity images from a multiwavelength laser scanner and 3D information included in the digital surface model has the potential for land cover/use classification and a discussion about the application of this type of data in land cover/use mapping has recently begun. In the test study, three laser reflectance intensity images (orthogonalized point cloud) acquired in green, near-infrared and short-wave infrared bands, together with a digital surface model, were used in land cover/use classification where six classes were distinguished: water, sand and gravel, concrete and asphalt, low vegetation, trees and buildings. In the tested methods, different approaches for classification were applied: spectral (based only on laser reflectance intensity images), spectral with elevation data as additional input data, and spectro-textural, using morphological granulometry as a method of texture analysis of both types of data: spectral images and the digital surface model. The method of generating the intensity raster was also tested in the experiment. Reference data were created based on visual interpretation of ALS data and traditional optical aerial and satellite images. The results have shown that multispectral ALS data are unlike typical multispectral optical images, and they have a major potential for land cover/use classification. An overall accuracy of classification over 90% was achieved. The fusion of multi-wavelength laser intensity images and elevation data, with the additional use of textural information derived from granulometric analysis of images, helped to improve the accuracy of classification significantly. The method of interpolation for the intensity raster was not very helpful, and using intensity rasters with both first and last return

  4. Identification of landslides in clay terrains using Airborne Thematic Mapper (ATM) multispectral imagery

    NASA Astrophysics Data System (ADS)

    Whitworth, Malcolm; Giles, David; Murphy, William

    2002-01-01

    The slopes of the Cotswolds Escarpment in the United Kingdom are mantled by extensive landslide deposits, including both relict and active features. These landslides pose a significant threat to engineering projects and have been the focus of research into the use of airborne remote sensing data sets for landslide mapping. Due to the availability of extensive ground investigation data, a test site was chosen on the slopes of the Cotswolds Escarpment above the village of Broadway, Worcestershire, United Kingdom. Daedalus Airborne Thematic Mapper (ATM) imagery was subsequently acquired by the UK Natural Environment Research Council (NERC) to provide high-resolution multispectral imagery of the Broadway site. This paper assesses the textural enhancement of ATM imagery as an image processing technique for landslide mapping at the Broadway site. Results for three kernel-based textural measures are presented: variance, mean Euclidean distance (MEUC) and grey-level co-occurrence matrix (GLCM) entropy. Problems encountered during textural analysis, associated with the presence of dense woodland within the project area, are discussed, and a solution using Principal Component Analysis (PCA) is described. Landslide features in clay-dominated terrains can be identified through textural enhancement of airborne multispectral imagery. The kernel-based textural measures tested in the current study were all able to enhance areas of slope instability within the ATM imagery. Additionally, results from supervised classification of the combined texture-principal component dataset show that texture-based image classification can accurately classify landslide regions and that, by including a Principal Component image, woodland and landslide classes can be differentiated successfully during the classification process.
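
    One of the kernel-based texture measures named above, local variance, can be computed in a sliding window with a few lines of NumPy/SciPy. The window size is an assumption; MEUC and GLCM entropy would follow the same windowed pattern over a single-band image or a principal component.

```python
# Local (windowed) variance texture of a single-band image.
import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(band, window=7):
    band = band.astype(float)
    mean = uniform_filter(band, size=window)
    mean_sq = uniform_filter(band * band, size=window)
    return np.maximum(mean_sq - mean * mean, 0.0)   # Var[x] = E[x^2] - E[x]^2

# High local variance highlights rough, hummocky ground texture of the kind
# that distinguishes landslide deposits from smoother, unfailed slopes.
```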

  5. Combining multi-spectral proximal sensors and digital cameras for monitoring grazed tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, R. N.; Gobbett, D. L.; González, L. A.; Bishop-Hurley, G. J.; McGavin, S. L.

    2015-11-01

    Timely and accurate monitoring of pasture biomass and ground-cover is necessary in livestock production systems to ensure productive and sustainable management of forage for livestock. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since such sensors can return data in near real-time, and have the potential to be deployed on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. However, there are unresolved challenges in developing calibrations to convert raw sensor data to quantitative biophysical values, such as pasture biomass or vegetation ground-cover, to allow meaningful interpretation of sensor data by livestock producers. We assessed the use of multiple proximal sensors for monitoring tropical pastures with a pilot deployment of sensors at two sites on Lansdown Research Station near Townsville, Australia. Each site was monitored by a Skye SKR-four-band multi-spectral sensor (every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each operated over 18 months. Raw data from each sensor were processed to calculate a number of multispectral vegetation indices. Visual observations of pasture characteristics, including above-ground standing biomass and ground cover, were made every 2 weeks. A methodology was developed to manage the sensor deployment and the quality control of the data collected. The data capture from the digital cameras was more reliable than the multi-spectral sensors, which had up to 63 % of data discarded after data cleaning and quality control. We found a strong relationship between sensor and pasture measurements during the wet season period of maximum pasture growth (January to April), especially when data from the multi-spectral sensors were combined with weather data. RatioNS34 (a simple band ratio between the near infrared (NIR) and lower shortwave infrared (SWIR) bands) and rainfall since 1
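
    The kind of calibration hinted at above, relating a NIR/SWIR band ratio plus a rainfall covariate to observed biomass, can be sketched as an ordinary least-squares fit; the synthetic values, variable names and model form below are illustrative assumptions, not the relationship fitted in the study.

```python
import numpy as np

def fit_ratio_model(nir, swir, rainfall, biomass):
    """Least-squares fit of biomass against a NIR/SWIR band ratio and a
    rainfall covariate: biomass ~ b0 + b1*ratio + b2*rainfall."""
    ratio = np.asarray(nir, float) / np.asarray(swir, float)
    X = np.column_stack([np.ones_like(ratio), ratio, rainfall])
    coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)
    return coef, X @ coef

# Synthetic demonstration values only -- not measurements from the study.
rng = np.random.default_rng(0)
nir, swir = rng.uniform(0.2, 0.6, 20), rng.uniform(0.1, 0.3, 20)
rainfall = rng.uniform(0, 150, 20)
biomass = 500 + 2000 * (nir / swir) + 5 * rainfall + rng.normal(0, 100, 20)
coef, fitted = fit_ratio_model(nir, swir, rainfall, biomass)
```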

  6. Validation of a 2D multispectral camera: application to dermatology/cosmetology on a population covering five skin phototypes

    NASA Astrophysics Data System (ADS)

    Jolivot, Romuald; Nugroho, Hermawan; Vabres, Pierre; Ahmad Fadzil, M. H.; Marzani, Franck

    2011-07-01

    This paper presents the validation of a new multispectral camera specifically developed for dermatological applications, based on healthy participants from five different Skin PhotoTypes (SPT). The multispectral system provides images of the skin reflectance at different spectral bands, coupled with a neural network-based algorithm that reconstructs a hyperspectral cube of cutaneous data from a multispectral image. The flexibility of the neural network-based algorithm allows reconstruction over different wavelength ranges. The hyperspectral cube provides both high spectral and high spatial information. The study population involves 150 healthy participants, classified by skin phototype according to the Fitzpatrick scale; the population covers five of the six types. Acquisition for each participant is performed at three body locations: two skin areas exposed to the sun (hand, face) and one area not exposed to the sun (lower back), and each is reconstructed over three different wavelength ranges. The validation is performed by comparing data acquired with a commercial spectrophotometer against the reconstructed spectrum obtained by averaging the hyperspectral cube. The comparison is calculated between 430 and 740 nm due to the limits of the spectrophotometer used. The results reveal that the multispectral camera is able to reconstruct the hyperspectral cube with a goodness-of-fit coefficient greater than 0.997 for the average of all SPT at each location. The study shows that the multispectral camera provides an accurate reconstruction of the hyperspectral cube, which can be used for analysis of the skin reflectance spectrum.
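
    The comparison between reconstructed and spectrophotometer-measured spectra can be summarised by a goodness-of-fit coefficient; the sketch below assumes the GFC commonly used for spectral reconstruction (the normalised inner product of the two spectra on a common wavelength grid), which may differ in detail from the metric applied by the authors.

```python
import numpy as np

def goodness_of_fit(measured: np.ndarray, reconstructed: np.ndarray) -> float:
    """Goodness-of-fit coefficient between two reflectance spectra sampled
    on the same wavelength grid (1.0 means identical spectral shape)."""
    num = np.abs(np.dot(measured, reconstructed))
    den = np.linalg.norm(measured) * np.linalg.norm(reconstructed)
    return num / den

# Example: spectra resampled to a common 430-740 nm grid before comparison.
# gfc = goodness_of_fit(spectrophotometer_curve, hypercube_mean_spectrum)
```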

  7. Design of a multi-spectral imager built using the compressive sensing single-pixel camera architecture

    NASA Astrophysics Data System (ADS)

    McMackin, Lenore; Herman, Matthew A.; Weston, Tyler

    2016-02-01

    We present the design of a multi-spectral imager built using the architecture of the single-pixel camera. The architecture is enabled by the novel sampling theory of compressive sensing implemented optically using the Texas Instruments DLP™ micro-mirror array. The array not only implements spatial modulation necessary for compressive imaging but also provides unique diffractive spectral features that result in a multi-spectral, high-spatial resolution imager design. The new camera design provides multi-spectral imagery in a wavelength range that extends from the visible to the shortwave infrared without reduction in spatial resolution. In addition to the compressive imaging spectrometer design, we present a diffractive model of the architecture that allows us to predict a variety of detailed functional spatial and spectral design features. We present modeling results, architectural design and experimental results that prove the concept.
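
    A toy illustration of the single-pixel measurement model underlying such a design: each micro-mirror pattern produces one scalar measurement, and a sparse scene is recovered from far fewer measurements than pixels. The ±1 patterns, the pixel-domain sparsity and the orthogonal-matching-pursuit solver are stand-in assumptions, not the instrument's actual patterns or reconstruction algorithm.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n_pixels, n_meas, k = 256, 96, 8          # scene size, measurements, sparsity

# Sparse toy "scene" (k nonzero pixels) and random +/-1 DMD-like patterns.
x = np.zeros(n_pixels)
x[rng.choice(n_pixels, k, replace=False)] = rng.uniform(0.5, 1.0, k)
phi = rng.choice([-1.0, 1.0], size=(n_meas, n_pixels))

y = phi @ x                                # one detector reading per pattern

# Sparse recovery from the under-determined system y = phi x.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(phi, y)
x_hat = omp.coef_
```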

  8. A technique for constructing spectral reflectance curves from Viking lander camera multispectral data

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Huck, F. O.; Martin, B. D.

    1975-01-01

    A technique for evaluating the construction of spectral reflectance curves from multispectral data obtained with the Viking lander cameras is presented. The multispectral data are limited to 6 channels in the wavelength range 0.4 to 1.1 microns, and several of the channels suffer from appreciable out-of-band response. The technique represents the estimated reflectance curves as a linear combination of known basis functions with coefficients determined to minimize the error in the representation, and it permits all channels, with and without out-of-band response, to contribute equally valid information. The technique is evaluated for known spectral reflectance curves of 8 materials considered likely to be present on the Martian surface. It provides an essentially exact fit if the reflectance curve has no pronounced maxima and minima. Even if the curve has pronounced maxima and minima, the fit is good and reveals the most dominant features. Since only 6 samples are available, some short-period features are lost. This loss is almost certainly due to undersampling rather than out-of-band channel response.
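
    A sketch of the representation described above: the unknown reflectance is modelled as a linear combination of basis functions, and the coefficients are chosen by least squares so that the modelled channel responses match the six measured channel values. The Gaussian basis, the box-shaped channel responsivities and the sample measurements are placeholders, not the Viking camera characteristics.

```python
import numpy as np

wl = np.linspace(0.4, 1.1, 141)                      # wavelength grid (micrometres)

# Placeholder basis: a few Gaussian bumps spanning the spectral range.
centres = np.linspace(0.45, 1.05, 6)
basis = np.exp(-0.5 * ((wl[:, None] - centres) / 0.08) ** 2)   # (n_wl, n_basis)

# Placeholder channel responsivities S_c(lambda): six contiguous boxes.
edges = np.linspace(0.4, 1.1, 7)
resp = np.array([(wl >= lo) & (wl < hi) for lo, hi in zip(edges[:-1], edges[1:])],
                dtype=float)                                    # (6, n_wl)

# Channel response of each basis function: A[c, i] = sum_l S_c(l) * B_i(l).
A = resp @ basis

# Given six measured channel values d, solve A a ~= d and rebuild R(lambda).
d = np.array([0.12, 0.15, 0.22, 0.30, 0.28, 0.25])              # illustrative values
coeffs, *_ = np.linalg.lstsq(A, d, rcond=None)
reflectance = basis @ coeffs
```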

  9. Simultaneous multispectral framing infrared camera using an embedded diffractive optical lenslet array

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele

    2011-06-01

    Recent advances in micro-optical element fabrication using gray scale technology have opened up the opportunity to create simultaneous multi-spectral imaging with fine-structure diffractive lenses. This paper discusses an approach that uses diffractive optical lenses configured in an array (lenslet array) and placed in close proximity to the focal plane array, which enables a small, compact, simultaneous multispectral imaging camera [1]. The lenslet array is designed so that all lenslets have a common focal length, with each lenslet tuned for a different wavelength. The number of simultaneous spectral images is determined by the number of individually configured lenslets in the array. The number of spectral images can be increased by a factor of 2 when using the array with a dual-band focal plane array (MWIR/LWIR) by exploiting multiple diffraction orders. In addition, modulation of the focal length of the lenslet array with piezoelectric actuation will enable spectral bin fill-in, allowing additional spectral coverage while giving up simultaneity. Different lenslet-array spectral imaging concept designs are presented in this paper, along with a unique concept for prefiltering the radiation focused on the detector. This approach to spectral imaging has applications in the detection of chemical agents in both aerosolized form and as liquids on surfaces. It can also be applied to the detection of weaponized biological agents and to IED detection in various forms, from manufacturing to deployment and post-detection forensic analysis.

  10. Statistical correction of lidar-derived digital elevation models with multispectral airborne imagery in tidal marshes

    USGS Publications Warehouse

    Buffington, Kevin J.; Dugger, Bruce D.; Thorne, Karen M.; Takekawa, John

    2016-01-01

    Airborne light detection and ranging (lidar) is a valuable tool for collecting large amounts of elevation data across large areas; however, the limited ability to penetrate dense vegetation with lidar hinders its usefulness for measuring tidal marsh platforms. Methods to correct lidar elevation data are available, but a reliable method that requires limited field work and maintains spatial resolution is lacking. We present a novel method, the Lidar Elevation Adjustment with NDVI (LEAN), to correct lidar digital elevation models (DEMs) with vegetation indices from readily available multispectral airborne imagery (NAIP) and RTK-GPS surveys. Using 17 study sites along the Pacific coast of the U.S., we achieved an average root mean squared error (RMSE) of 0.072 m, with a 40–75% improvement in accuracy from the lidar bare earth DEM. Results from our method compared favorably with results from three other methods (minimum-bin gridding, mean error correction, and vegetation correction factors), and a power analysis applying our extensive RTK-GPS dataset showed that on average 118 points were necessary to calibrate a site-specific correction model for tidal marshes along the Pacific coast. By using available imagery and with minimal field surveys, we showed that lidar-derived DEMs can be adjusted for greater accuracy while maintaining high (1 m) resolution.
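
    A simplified sketch of the correction idea behind LEAN, assuming co-located lidar DEM elevations, NDVI values and RTK-GPS elevations are available at calibration points; the synthetic data, the purely NDVI-based linear error model and the variable names are illustrative assumptions that omit details of the published method.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical calibration points: lidar DEM elevation, NDVI from airborne
# imagery, and RTK-GPS "truth" elevation at the same locations (metres).
rng = np.random.default_rng(2)
n = 120
ndvi = rng.uniform(0.2, 0.8, n)
rtk_z = rng.uniform(1.0, 2.0, n)
lidar_z = rtk_z + 0.4 * ndvi + rng.normal(0, 0.03, n)   # vegetation bias

# Model the lidar error (lidar minus truth) as a function of NDVI, then
# subtract the predicted error from the full DEM to get a corrected DEM.
error = lidar_z - rtk_z
model = LinearRegression().fit(ndvi.reshape(-1, 1), error)

dem = rng.uniform(1.0, 2.5, (50, 50))                   # placeholder lidar DEM
dem_ndvi = rng.uniform(0.2, 0.8, (50, 50))              # co-registered NDVI
corrected = dem - model.predict(dem_ndvi.reshape(-1, 1)).reshape(dem.shape)
```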

  11. Land surface temperature retrieved from airborne multispectral scanner mid-infrared and thermal-infrared data.

    PubMed

    Qian, Yong-Gang; Wang, Ning; Ma, Ling-Ling; Liu, Yao-Kai; Wu, Hua; Tang, Bo-Hui; Tang, Ling-Li; Li, Chuan-Rong

    2016-01-25

    Land surface temperature (LST) is one of the key parameters in the physics of land surface processes at local/global scales. In this paper, an LST retrieval method is proposed for airborne multispectral scanner data that combines one mid-infrared (MIR) channel and one thermal infrared (TIR) channel, with the land surface emissivity given as a priori knowledge. To remove the influence of the direct solar radiance efficiently, a relationship was established between the direct solar radiance and the water vapor content, view zenith angle and solar zenith angle. LST could then be retrieved with a split-window algorithm from the MIR/TIR data. Finally, the proposed algorithm was applied to actual airborne flight data and validated with in situ measurements of land surface types at the Baotou site in China on 17 October 2014. The results demonstrate that the difference between the retrieved and in situ LST was less than 1.5 K. The bias, RMSE, and standard deviation of the retrieved LST were 0.156 K, 0.883 K, and 0.869 K, respectively, for the samples. PMID:26832579
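
    For orientation, a generic split-window formulation combining two brightness temperatures takes the form below; the coefficients are purely hypothetical placeholders and the sketch ignores the emissivity, water-vapour and solar terms that the paper's retrieval accounts for.

```python
def split_window_lst(t_i: float, t_j: float,
                     a0: float = 1.0, a1: float = 1.0, a2: float = 2.0) -> float:
    """Generic split-window form: LST = a0 + a1*Ti + a2*(Ti - Tj), with Ti
    and Tj brightness temperatures (K) in two channels. The coefficients
    here are placeholders, not the paper's retrieval coefficients."""
    return a0 + a1 * t_i + a2 * (t_i - t_j)

# Example: lst = split_window_lst(t_i=300.2, t_j=298.9)
```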

  12. Charon's Color: A view from the New Horizons Ralph/Multispectral Visible Imaging Camera

    NASA Astrophysics Data System (ADS)

    Olkin, C.; Howett, C.; Grundy, W. M.; Parker, A. H.; Ennico Smith, K.; Stern, S. A.; Binzel, R. P.; Cook, J. C.; Cruikshank, D. P.; Dalle Ore, C.; Earle, A. M.; Jennings, D. E.; Linscott, I.; Lunsford, A.; Parker, J. W.; Protopapa, S.; Reuter, D.; Singer, K. N.; Spencer, J. R.; Tsang, C.; Verbiscer, A.; Weaver, H. A., Jr.; Young, L. A.

    2015-12-01

    The Multispectral Visible Imaging Camera (MVIC; Reuter et al., 2008) is part of Ralph, an instrument on NASA's New Horizons spacecraft. MVIC is the color 'eyes' of New Horizons, observing objects in five bands from blue to infrared wavelengths. MVIC's images of Charon show it to be an intriguing place, a far cry from the grey, heavily cratered world once postulated. Rather, Charon is observed to have large surface areas free of craters, and a northern polar region that is much redder than its surroundings. This talk will describe these initial results in more detail, along with Charon's global geological color variations, to put these results into their wider context. Finally, possible surface coloration mechanisms due to global processes and/or seasonal cycles will be discussed.

  13. Airborne Multispectral and Thermal Remote Sensing for Detecting the Onset of Crop Stress Caused by Multiple Factors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing technology has been developed and applied to provide spatiotemporal information on crop stress for precision management. A series of multispectral images over a field planted cotton, corn and soybean were obtained by a Geospatial Systems MS4100 camera mounted on an Air Tractor 402B ai...

  14. Multispectral thermal airborne TASI-600 data to study the Pompeii (IT) archaeological area

    NASA Astrophysics Data System (ADS)

    Palombo, Angelo; Pascucci, Simone; Pergola, Nicola; Pignatti, Stefano; Santini, Federico; Soldovieri, Francesco

    2016-04-01

    The management of archaeological areas involves the conservation of the ruins/buildings and the eventual prospection of new areas of archaeological potential. In this framework, airborne remote sensing is a well-developed geophysical tool for supporting archaeological surveys of wide areas. The spectral regions applied in archaeological remote sensing span from the VNIR to the TIR. In particular, archaeological thermal imaging exploits the fact that materials absorb, emit, transmit, and reflect thermal infrared radiation at different rates according to their composition, density and moisture content. Despite its potential, applications of thermal imaging in archaeology are scarce. Among them, noteworthy are those related to the use of Landsat and ASTER [1] and airborne remote sensing [2, 3, 4 and 5]. In view of this potential for Cultural Heritage applications, the present study aims at analysing the usefulness of high spatial resolution thermal imaging for the Pompeii archaeological park. To this purpose, TASI-600 [6] airborne multispectral thermal imagery (32 channels from 8 to 11.5 μm with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) was acquired on 7 December 2015. The airborne survey was acquired to get useful information on the characteristics of the building materials (both ancient and used for consolidation) and, whenever possible, to retrieve quick indicators of their conservation status. Thermal images will, moreover, be processed to gain insight into the critical environmental issues impacting the structures (e.g. moisture). The proposed study shows the preliminary results of the airborne deployments, the pre-processing of the multispectral thermal imagery and the retrieval of accurate land surface temperatures (LST). The LST map will be analysed to describe the thermal pattern of the city of Pompeii and detect any thermal anomalies. The ongoing TASI-600 sensor pre-processing will include: (a) radiometric

  15. Development of low-cost high-performance multispectral camera system at Banpil

    NASA Astrophysics Data System (ADS)

    Oduor, Patrick; Mizuno, Genki; Olah, Robert; Dutta, Achyut K.

    2014-05-01

    Banpil Photonics (Banpil) has developed a low-cost, high-performance multispectral camera system for Visible to Short-Wave Infrared (VIS-SWIR) imaging for the most demanding high-sensitivity and high-speed military, commercial and industrial applications. The 640x512-pixel InGaAs uncooled camera system is designed to provide a compact, small form factor within a cubic inch, high sensitivity of less than 100 electrons, high dynamic range exceeding 190 dB, high frame rates greater than 1000 frames per second (FPS) at full resolution, and low power consumption below 1 W. These are practically all the features highly desirable in military imaging applications for expanding deployment to every warfighter, while also maintaining the low-cost structure demanded for scaling into commercial markets. This paper describes Banpil's development of the camera system, including the features of the image sensor, with an innovation integrating advanced digital electronics functionality that has made the confluence of high-performance capabilities on the same imaging platform practical at low cost. It discusses the strategies employed, including innovations in the key components (e.g. the focal plane array (FPA) and Read-Out Integrated Circuitry (ROIC)) within our control while maintaining a fabless model, and strategic collaboration with partners to attain additional cost reductions in optics, electronics, and packaging. We highlight the challenges and potential opportunities for further cost reductions to achieve the goal of a sub-$1000 uncooled high-performance camera system. Finally, a brief overview of emerging military, commercial and industrial applications that will benefit from this high-performance imaging system, and their forecast cost structure, is presented.

  16. Application of combined Landsat thematic mapper and airborne thermal infrared multispectral scanner data to lithologic mapping in Nevada

    USGS Publications Warehouse

    Podwysocki, M.H.; Ehmann, W.J.; Brickey, D.W.

    1987-01-01

    Future Landsat satellites are to include the Thematic Mapper (TM) and also may incorporate additional multispectral scanners. One such scanner being considered for geologic and other applications is a four-channel thermal-infrared multispectral scanner having 60-m spatial resolution. This paper discusses the results of studies using combined Landsat TM and airborne Thermal Infrared Multispectral Scanner (TIMS) digital data for lithologic discrimination, identification, and geologic mapping in two areas within the Basin and Range province of Nevada. Field and laboratory reflectance spectra in the visible and reflective-infrared and laboratory spectra in the thermal-infrared parts of the spectrum were used to verify distinctions made between rock types in the image data sets.

  17. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of very high resolution (VHR) optical images by human operators. This work is highly time-consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning data (lidar) and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database updating is proposed. The multispectral images give access to the tree species, whereas the 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed, both at pixel and object levels. The objects are individual trees extracted from the lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the existing tree species in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated through the tree species classification and combined with the pixel-based feature map in an energy framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. Comparison with an existing forest land cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching ranges between 94% and 99%).

  18. Development of the multi-spectral auroral camera onboard the INDEX satellite

    NASA Astrophysics Data System (ADS)

    Sakanoi, T.; Okano, S.; Obuchi, Y.; Kobayashi, T.; Ejiri, M.; Asamura, K.; Hirahara, M.

    To investigate fine-scale auroral structures, high time and spatial resolution imaging observations of optical auroras will be made by a multi-spectral auroral camera (MAC) onboard the INDEX satellite, which will be launched by an H2A rocket as a piggyback satellite into a polar orbit at an altitude of ˜700 km. Monochromatic auroral image data at the emissions of the N2+ first negative band (427.8 nm), OI (557.7 nm), and the N2 first positive band (670 nm) are obtained by MAC with a field-of-view (FOV) of 7.6° using three independent CCD cameras in combination with interference filters. MAC will operate in the nightside auroral region in the following two operation modes. (1) Simultaneous measurement with particle sensors (ESA/ISA). In this mode, MAC observes an imaging area of ˜80×80 km (at a 100 km altitude) around a magnetic footprint with spatial and time resolutions of ˜1.2 km and 120 msec, respectively. (2) Auroral height distribution measurement. The attitude of the INDEX satellite is changed to direct the FOV of MAC at the limb of the Earth. In this mode, MAC observes an imaging area of ˜270×270 km (at a 2000 km distance from the satellite) with spatial and time resolutions of ˜4 km and 1 sec, respectively. In this paper, the science mission, the instrumentation, and the observation modes of MAC will be presented.

  19. Ground-based analysis of volcanic ash plumes using a new multispectral thermal infrared camera approach

    NASA Astrophysics Data System (ADS)

    Williams, D.; Ramsey, M. S.

    2015-12-01

    Volcanic plumes are complex mixtures of mineral, lithic and glass fragments of varying size, together with multiple gas species. These plumes vary in size depending on a number of factors, including vent diameter, magma composition and the quantity of volatiles within a melt. However, determining the chemical and mineralogical properties of a volcanic plume immediately after an eruption is a great challenge. Thermal infrared (TIR) satellite remote sensing of these plumes is routinely used to calculate volcanic ash particle size variations and sulfur dioxide concentration. These analyses are commonly performed using high temporal, low spatial resolution satellites, which can only reveal large-scale trends. What is lacking is a high spatial resolution study specifically of the properties of the proximal plumes. Using the emissive properties of volcanic ash, a new method has been developed to determine a plume's particle size and petrology in spaceborne and ground-based TIR data. A multispectral adaptation of a FLIR TIR camera has been developed that simulates the TIR channels found on several current orbital instruments. Using this instrument, data on volcanic plumes from Fuego and Santiaguito volcanoes in Guatemala were recently obtained. Preliminary results indicate that the camera is capable of detecting silicate absorption features in the emissivity spectra over the TIR wavelength range, which can be linked to both mineral chemistry and particle size. It is hoped that this technique can be expanded to isolate different volcanic species within a plume, to validate the orbital data, and ultimately to use the results to better inform eruption dynamics modelling.

  20. Optical system design of multi-spectral and large format color CCD aerial photogrammetric camera

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Sun, Tianxiang; Gao, Xiaodong; Liang, Wei

    2007-12-01

    high-spatial resolution. The merits of the aerial photogrammetric camera are its multi-spectral capability, high resolution, low distortion, light weight and wide field. It can be applied to aerial photography and remote sensing in place of traditional film cameras. Trials and analysis of the design results show that the system can meet the requirements of large-scale aerial surveys.

  1. Airborne multispectral and thermal remote sensing for detecting the onset of crop stress caused by multiple factors

    NASA Astrophysics Data System (ADS)

    Huang, Yanbo; Thomson, Steven J.

    2010-10-01

    Remote sensing technology has been developed and applied to provide spatiotemporal information on crop stress for precision management. A series of multispectral images over a field planted with cotton, corn and soybean was obtained by a Geospatial Systems MS4100 camera mounted on an Air Tractor 402B airplane, equipped with Camera Link in a Magma converter box and triggered by Terraverde Dragonfly® flight navigation and imaging control software. The field crops were intentionally stressed by applying glyphosate herbicide via aircraft and allowing it to drift near-field. Aerial multispectral images in the visible and near-infrared bands were processed to produce vegetation indices, which were used to quantify the onset of herbicide-induced crop stress. The normalized difference vegetation index (NDVI) and the soil-adjusted vegetation index (SAVI) showed the ability to monitor crop response to herbicide-induced injury by revealing stress at different phenological stages. Two other fields were managed with irrigated versus non-irrigated treatments, and those fields were imaged with both the multispectral system and an Electrophysics PV-320T thermal imaging camera on board an Air Tractor 402B aircraft. Thermal imagery indicated water stress due to deficits in soil moisture, and a proposed method of determining crop cover percentage using thermal imagery was compared with a multispectral imaging method. Development of an image fusion scheme may be necessary to provide synergy and improve overall water stress detection ability.
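
    For reference, the two vegetation indices used above are computed per pixel from the red and near-infrared bands; the sketch below uses the standard definitions (with the common soil factor L = 0.5 for SAVI), not any sensor-specific calibration, and the band ordering in the usage comment is an assumption.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir, red = nir.astype(np.float64), red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-12)

def savi(nir: np.ndarray, red: np.ndarray, L: float = 0.5) -> np.ndarray:
    """Soil-adjusted vegetation index with soil brightness factor L."""
    nir, red = nir.astype(np.float64), red.astype(np.float64)
    return (1.0 + L) * (nir - red) / (nir + red + L)

# ndvi_map = ndvi(ms_image[..., 3], ms_image[..., 2])  # band order is an assumption
```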

  2. Mapping of hydrothermally altered rocks using airborne multispectral scanner data, Marysvale, Utah, mining district

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Jones, O.D.

    1983-01-01

    Multispectral data covering an area near Marysvale, Utah, collected with the airborne National Aeronautics and Space Administration (NASA) 24-channel Bendix multispectral scanner, were analyzed to detect areas of hydrothermally altered, potentially mineralized rocks. Spectral bands were selected for analysis that approximate those of the Landsat 4 Thematic Mapper and which are diagnostic of the presence of hydrothermally derived products. Hydrothermally altered rocks, particularly volcanic rocks affected by solutions rich in sulfuric acid, are commonly characterized by concentrations of argillic minerals such as alunite and kaolinite. These minerals are important for identifying hydrothermally altered rocks in multispectral images because they have intense absorption bands centered near a wavelength of 2.2 μm. Unaltered volcanic rocks commonly do not contain these minerals and hence do not have the absorption bands. A color-composite image was constructed using the following spectral band ratios: 1.6 μm/2.2 μm, 1.6 μm/0.48 μm, and 0.67 μm/1.0 μm. The particular bands were chosen to emphasize the spectral contrasts that exist for argillic versus non-argillic rocks, limonitic versus nonlimonitic rocks, and rocks versus vegetation, respectively. The color-ratio composite successfully distinguished most types of altered rocks from unaltered rocks. Some previously unrecognized areas of hydrothermal alteration were mapped. The altered rocks included those having high alunite and/or kaolinite content, siliceous rocks containing some kaolinite, and ash-fall tuffs containing zeolitic minerals. The color-ratio-composite image allowed further division of these rocks into limonitic and nonlimonitic phases. The image did not allow separation of highly siliceous or hematitically altered rocks containing no clays or alunite from unaltered rocks. A color-coded density slice image of the 1.6 μm/2.2 μm band ratio allowed further discrimination among the altered units. Areas
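
    A schematic of assembling such a color-ratio composite, assuming calibrated bands near the listed wavelengths are available as arrays; the percentile stretch is a generic display choice rather than the processing applied in the study.

```python
import numpy as np

def stretch(band: np.ndarray, lo: float = 2.0, hi: float = 98.0) -> np.ndarray:
    """Linear percentile stretch of a ratio image to the 0-1 range for display."""
    p_lo, p_hi = np.percentile(band, [lo, hi])
    return np.clip((band - p_lo) / (p_hi - p_lo + 1e-12), 0.0, 1.0)

def color_ratio_composite(b048, b067, b10, b16, b22):
    """Stack the 1.6/2.2, 1.6/0.48 and 0.67/1.0 micrometre band ratios
    into the R, G and B channels of a display composite."""
    r = stretch(b16 / (b22 + 1e-12))    # sensitive to clays/alunite
    g = stretch(b16 / (b048 + 1e-12))   # sensitive to limonite
    b = stretch(b067 / (b10 + 1e-12))   # separates vegetation from rock
    return np.dstack([r, g, b])
```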

  3. High Spatial Resolution Airborne Multispectral Thermal Infrared Remote Sensing Data for Analysis of Urban Landscape Characteristics

    NASA Technical Reports Server (NTRS)

    Quattrochi, Dale A.; Luvall, Jeffrey C.; Estes, Maurice G., Jr.; Arnold, James E. (Technical Monitor)

    2000-01-01

    We have used airborne multispectral thermal infrared (TIR) remote sensing data collected at a high spatial resolution (i.e., 10 m) over several cities in the United States to study thermal energy characteristics of the urban landscape. These TIR data provide a unique opportunity to quantify thermal responses from discrete surfaces typical of the urban landscape and to identify both the spatial arrangement and patterns of thermal processes across the city. The information obtained from these data is critical to understanding how urban surfaces drive or force development of the Urban Heat Island (UHI) effect, which exists as a dome of elevated air temperatures that presides over cities in contrast to surrounding non-urbanized areas. The UHI is most pronounced in the summertime where urban surfaces, such as rooftops and pavement, store solar radiation throughout the day, and release this stored energy slowly after sunset creating air temperatures over the city that are in excess of 2-4 °C warmer in contrast with non-urban or rural air temperatures. The UHI can also exist as a daytime phenomenon with surface temperatures in downtown areas of cities exceeding 38 °C. The implications of the UHI are significant, particularly as an additive source of thermal energy input that exacerbates the overall production of ground level ozone over cities. We have used the Airborne Thermal and Land Applications Sensor (ATLAS), flown onboard a Lear 23 jet aircraft from the NASA Stennis Space Center, to acquire high spatial resolution multispectral TIR data (i.e., 6 bandwidths between 8.2 and 12.2 μm) over Huntsville, Alabama, Atlanta, Georgia, Baton Rouge, Louisiana, Salt Lake City, Utah, and Sacramento, California. These TIR data have been used to produce maps and other products showing the spatial distribution of heating and cooling patterns over these cities to better understand how the morphology of the urban landscape affects development of the UHI. In turn, these data have been used

  4. Airborne Thermal Infrared Multispectral Scanner (TIMS) images over disseminated gold deposits, Osgood Mountains, Humboldt County, Nevada

    NASA Technical Reports Server (NTRS)

    Krohn, M. Dennis

    1986-01-01

    The U.S. Geological Survey (USGS) acquired airborne Thermal Infrared Multispectral Scanner (TIMS) images over several disseminated gold deposits in northern Nevada in 1983. The aerial surveys were flown to determine whether TIMS data could depict jasperoids (siliceous replacement bodies) associated with the gold deposits. The TIMS data were collected over the Pinson and Getchell Mines in the Osgood Mountains, the Carlin, Maggie Creek, Bootstrap, and other mines in the Tuscarora Mountains, and the Jerritt Canyon Mine in the Independence Mountains. The TIMS data seem to be a useful supplement to conventional geochemical exploration for disseminated gold deposits in the western United States. Siliceous outcrops are readily separable in the TIMS image from other types of host rocks. Different forms of silicification are not readily separable, yet, due to limitations of spatial resolution and spectral dynamic range. Features associated with the disseminated gold deposits, such as the large intrusive bodies and fault structures, are also resolvable on TIMS data. Inclusion of high-resolution thermal inertia data would be a useful supplement to the TIMS data.

  5. Simulated radiance profiles for automating the interpretation of airborne passive multi-spectral infrared images.

    PubMed

    Sulub, Yusuf; Small, Gary W

    2008-10-01

    Methodology is developed for simulating the radiance profiles acquired from airborne passive multispectral infrared imaging measurements of ground sources of volatile organic compounds (VOCs). The simulation model allows the superposition of pure-component laboratory spectra of VOCs onto spectral backgrounds that simulate those acquired during field measurements conducted with a downward-looking infrared line scanner mounted on an aircraft flying at an altitude of 2000-3000 ft (approximately 600-900 m). Wavelength selectivity in the line scanner is accomplished through the use of a multichannel Hg:Cd:Te detector with up to 16 integrated optical filters. These filters allow the detection of absorption and emission signatures of VOCs superimposed on the upwelling infrared background radiance within the instrumental field of view (FOV). By combining simulated radiance profiles containing analyte signatures with field-collected background signatures, supervised pattern recognition methods can be employed to train automated classifiers for use in detecting the signatures of VOCs during field measurements. The targeted application for this methodology is the use of the imaging system to detect releases of VOCs during emergency response scenarios. In the work described here, the simulation model is combined with piecewise linear discriminant analysis to build automated classifiers for detecting ethanol and methanol. Field data collected during controlled releases of ethanol, as well as during a methanol release from an industrial facility, are used to evaluate the methodology.
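
    A compact sketch of the training-set idea described above: a pure-component signature is superimposed on background radiance profiles at varying strengths and a discriminant classifier is trained to separate the two classes. The synthetic profiles are placeholders, and ordinary linear discriminant analysis is used here only as a stand-in for the piecewise linear discriminant analysis employed by the authors.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
n_bg, n_chan = 400, 16                      # background profiles, filter channels

# Hypothetical field-collected background radiance profiles and a
# pure-component analyte signature resampled to the 16 filter channels.
background = rng.normal(1.0, 0.05, (n_bg, n_chan))
analyte_sig = np.sin(np.linspace(0, np.pi, n_chan)) * 0.1

# Simulated "analyte present" profiles: signature superimposed on the
# backgrounds at random strengths, mimicking the training-set generation.
strength = rng.uniform(0.2, 1.0, (n_bg, 1))
with_analyte = background + strength * analyte_sig

X = np.vstack([background, with_analyte])
y = np.concatenate([np.zeros(n_bg), np.ones(n_bg)])

# Ordinary LDA as a stand-in classifier (the paper uses piecewise LDA).
clf = LinearDiscriminantAnalysis().fit(X, y)
```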

  6. A pilot project combining multispectral proximal sensors and digital cameras for monitoring tropical pastures

    NASA Astrophysics Data System (ADS)

    Handcock, Rebecca N.; Gobbett, D. L.; González, Luciano A.; Bishop-Hurley, Greg J.; McGavin, Sharon L.

    2016-08-01

    Timely and accurate monitoring of pasture biomass and ground cover is necessary in livestock production systems to ensure productive and sustainable management. Interest in the use of proximal sensors for monitoring pasture status in grazing systems has increased, since data can be returned in near real time. Proximal sensors have the potential for deployment on large properties where remote sensing may not be suitable due to issues such as spatial scale or cloud cover. There are unresolved challenges in gathering reliable sensor data and in calibrating raw sensor data to values such as pasture biomass or vegetation ground cover, which allow meaningful interpretation of sensor data by livestock producers. Our goal was to assess whether a combination of proximal sensors could be reliably deployed to monitor tropical pasture status in an operational beef production system, as a precursor to designing a full sensor deployment. We use this pilot project to (1) illustrate practical issues around sensor deployment, (2) develop the methods necessary for the quality control of the sensor data, and (3) assess the strength of the relationships between vegetation indices derived from the proximal sensors and field observations across the wet and dry seasons. Proximal sensors were deployed at two sites in a tropical pasture on a beef production property near Townsville, Australia. Each site was monitored by a Skye SKR-four-band multispectral sensor (every 1 min), a digital camera (every 30 min), and a soil moisture sensor (every 1 min), each of which were operated over 18 months. Raw data from each sensor was processed to calculate multispectral vegetation indices. The data capture from the digital cameras was more reliable than the multispectral sensors, which had up to 67 % of data discarded after data cleaning and quality control for technical issues related to the sensor design, as well as environmental issues such as water incursion and insect infestations. We recommend

  7. Curved CCD detector devices and arrays for multispectral astrophysical applications and terrestrial stereo panoramic cameras

    NASA Astrophysics Data System (ADS)

    Swain, Pradyumna; Mark, David

    2004-09-01

    The emergence of curved CCD detectors as individual devices or as contoured mosaics assembled to match the curved focal planes of astronomical telescopes and terrestrial stereo panoramic cameras represents a major optical design advancement that greatly enhances the scientific potential of such instruments. In altering the primary detection surface within the telescope's optical instrumentation system from flat to curved, and conforming the applied CCD's shape precisely to the contour of the telescope's curved focal plane, a major increase in the amount of transmittable light at various wavelengths through the system is achieved. This in turn enables multi-spectral ultra-sensitive imaging with much greater spatial resolution, necessary for large and very large telescope applications, including those involving infrared image acquisition and spectroscopy, conducted over very wide fields of view. For earth-based and space-borne optical telescopes, the advent of curved CCDs as the principal detectors provides a simplification of the telescope's adjoining optics, reducing the number of optical elements and the occurrence of optical aberrations associated with large corrective optics used to conform to flat detectors. New astronomical experiments may be devised in the presence of curved CCD applications, in conjunction with large format cameras and curved mosaics, including three-dimensional imaging spectroscopy conducted over multiple wavelengths simultaneously, wide-field real-time stereoscopic tracking of remote objects within the solar system at high resolution, and deep-field survey mapping of distant objects such as galaxies with much greater multi-band spatial precision over larger sky regions. Terrestrial stereo panoramic cameras equipped with arrays of curved CCDs joined with associative wide-field optics will require less optical glass and no mechanically moving parts to maintain continuous proper stereo convergence over wider perspective viewing fields than

  8. Multispectral thermal airborne TASI-600 data to study the Pompeii (IT) archaeological area

    NASA Astrophysics Data System (ADS)

    Palombo, Angelo; Pascucci, Simone; Pergola, Nicola; Pignatti, Stefano; Santini, Federico; Soldovieri, Francesco

    2016-04-01

    The management of archaeological areas involves the conservation of the ruins/buildings and the eventual prospection of new areas of archaeological potential. In this framework, airborne remote sensing is a well-developed geophysical tool for supporting archaeological surveys of wide areas. The spectral regions applied in archaeological remote sensing span from the VNIR to the TIR. In particular, archaeological thermal imaging exploits the fact that materials absorb, emit, transmit, and reflect thermal infrared radiation at different rates according to their composition, density and moisture content. Despite its potential, applications of thermal imaging in archaeology are scarce. Among them, noteworthy are those related to the use of Landsat and ASTER [1] and airborne remote sensing [2, 3, 4 and 5]. In view of this potential for Cultural Heritage applications, the present study aims at analysing the usefulness of high spatial resolution thermal imaging for the Pompeii archaeological park. To this purpose, TASI-600 [6] airborne multispectral thermal imagery (32 channels from 8 to 11.5 μm with a spectral resolution of 100 nm and a spatial resolution of 1 m/pixel) was acquired on 7 December 2015. The airborne survey was acquired to get useful information on the characteristics of the building materials (both ancient and used for consolidation) and, whenever possible, to retrieve quick indicators of their conservation status. Thermal images will, moreover, be processed to gain insight into the critical environmental issues impacting the structures (e.g. moisture). The proposed study shows the preliminary results of the airborne deployments, the pre-processing of the multispectral thermal imagery and the retrieval of accurate land surface temperatures (LST). The LST map will be analysed to describe the thermal pattern of the city of Pompeii and detect any thermal anomalies. The ongoing TASI-600 sensor pre-processing will include: (a) radiometric

  9. Development of the multi-spectral auroral camera onboard the INDEX satellite

    NASA Astrophysics Data System (ADS)

    Sakanoi, T.; Okano, S.; Obuchi, Y.; Kobayashi, T.; Ejiri, M.; Asamura, K.; Hirahara, M.

    To clarify the mechanism of formation of auroral fine structures, high time and spatial resolution imaging observations of optical auroras will be made by a multispectral auroral camera (MAC) onboard the INDEX satellite, which will be launched by an H2A rocket as a piggyback satellite into a polar orbit at an altitude of 680 km. Monochromatic auroral image data at the emissions of the N2+ 1st negative band (427.8 nm), OI (557.7 nm), and the N2 1st positive band (670 nm) are simultaneously obtained by MAC with a field-of-view (FOV) of 7.6 deg using three independent CCD cameras in combination with interference filters. MAC will operate only in the nightside auroral region in the following two operation modes. (1) Simultaneous observation with particle sensors (ESA/ISA). In this mode, MAC observes an imaging area of ~80x80 km (at a 100-km altitude) around a magnetic footprint with spatial and time resolutions of ~1.2 km and 120 msec, respectively. (2) Auroral height distribution observation. The attitude of the INDEX satellite is changed to train the FOV of MAC on the limb of the Earth. In this mode, MAC observes an imaging area of ~270x270 km (at a 2000-km distance from the satellite) with spatial and time resolutions of ~4 km and 1 sec, respectively. In this paper, we report the detailed specifications and the current development status of MAC, and also discuss the possibility of collaboration with ground-based observations.

  10. New, Flexible Applications with the Multi-Spectral Titan Airborne Lidar

    NASA Astrophysics Data System (ADS)

    Swirski, A.; LaRocque, D. P.; Shaker, A.; Smith, B.

    2015-12-01

    Traditional lidar designs have been restricted to using a single laser channel operating at one particular wavelength. Single-channel systems excel at collecting high-precision spatial (XYZ) data, with accuracies down to a few centimeters. However, target classification is difficult with spatial data alone, and single-wavelength systems are limited to the strengths and weaknesses of the wavelength they use. To resolve these limitations in lidar design, Teledyne Optech developed the Titan, the world's first multispectral lidar system, which uses three independent laser channels operating at 532, 1064, and 1550 nm. Since Titan collects 12 bit intensity returns for each wavelength separately, users can compare how strongly targets in the survey area reflect each wavelength. Materials such as soil, rock and foliage all reflect the wavelengths differently, enabling post-processing algorithms to identify the material of targets easily and automatically. Based on field tests in Canada, automated classification algorithms have combined this with elevation data to classify targets into six basic types with 78% accuracy. Even greater accuracy is possible with further algorithm enhancement and the use of an in-sensor passive imager such as a thermal, multispectral, CIR or RGB camera. Titan therefore presents an important new tool for applications such as land-cover classification and environmental modeling while maintaining lidar's traditional strengths: high 3D accuracy and day/night operation. Multispectral channels also enable a single lidar to handle both topographic and bathymetric surveying efficiently, which previously required separate specialized lidar systems operating at different wavelengths. On land, Titan can survey efficiently from 2000 m AGL with a 900 kHz PRF (300 kHz per channel), or up to 2500 m if only the infrared 1064 and 1550 nm channels are used. Over water, the 532 nm green channel penetrates water to collect seafloor returns while the infrared

  11. Pluto's Global Color Variability as Seen by the New Horizons Multispectral Visible Imaging Camera

    NASA Astrophysics Data System (ADS)

    Binzel, R. P.; Stern, A.; Weaver, H. A., Jr.; Young, L. A.; Olkin, C.; Grundy, W. M.; Earle, A. M.

    2015-12-01

    While variability in Pluto's albedo, color, and methane distribution had been previously discerned from ground-based and Hubble Space Telescope observations [e.g. 1,2], the sharp juxtaposition of contrasting units forms one of the greatest surprises returned (to date) from the New Horizons mission. Here we present a global analysis of the color distribution of Pluto's surface factoring in both seasonal and large scale geologic processes. We will also explore the possible role of long-term (million year) precession cycles [3] in shaping the surface morphology and the distribution of volatiles. We utilize data returned by the New Horizons Multispectral Visible Imaging Camera (MVIC) operating as part of the Ralph instrument [4]. MVIC captures images over five wavelength bands from blue to the near-infrared, including a broad panchromatic band and a narrow band centered on the 0.89-micron methane absorption feature. References: [1] Young, E. F., Binzel, R. P., Crane, K. 2001; Astron. J. 121, 552-561. [2] Grundy, W.M., Olkin, C.B., Young, L.A., Buie, M. W., Young, E. F. 2013; Icarus 223, 710-721. [3] Earle, A. M., Binzel, R. P. 2015; Icarus 250, 405-412. [4] Reuter, D.C., Stern, S.A., Scherrer, J., et al. 2008; Space Science Reviews, 140, 129-154.

  12. Web camera as low cost multispectral sensor for quantification of chlorophyll in soybean leaves

    NASA Astrophysics Data System (ADS)

    Adhiwibawa, Marcelinus A.; Setiawan, Yonathan E.; Prilianti, Kestrilia R.; Brotosudarmo, Tatas H. P.

    2015-01-01

    Soybean is one of the main crops in Indonesia, but the demand for soybean is not matched by an increase in national soybean production. One of the factors limiting production is the availability of lush cultivation areas for soybean plantation. Indonesian farmers usually grow soybean in marginal cultivation areas, which requires soybean varieties that are tolerant of environmental stresses such as drought, nutrient limitation, pests, diseases and many others. Chlorophyll content in the leaf is one plant health indicator that can be used to identify stress-tolerant soybean varieties. However, soybean breeding research is hindered by manual data acquisition, which is time-consuming and labour-intensive. In this paper the authors propose an automatic system for quantifying soybean leaf area and chlorophyll based on a low-cost multispectral sensor using a web camera, as an indicator of soybean plant tolerance to environmental stress, particularly drought stress. The system acquires an image of the plant, which is placed in an acquisition box, from above. The image is segmented using the NDVI (Normalized Difference Vegetation Index) and quantified to yield an average NDVI value and leaf area. The proposed system showed that the acquired NDVI value has a strong relationship with the SPAD value, with an r-squared value of 0.70, while the leaf area prediction has an error of 18.41%. Thus the automated system can quantify plant data with good results.
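
    A minimal sketch of the segmentation and quantification step, assuming red and near-infrared channels are available from the modified web camera and that the pixel footprint is known; the threshold and parameter names are illustrative assumptions, not the system's actual settings.

```python
import numpy as np

def leaf_stats(nir: np.ndarray, red: np.ndarray,
               pixel_area_cm2: float, ndvi_threshold: float = 0.3):
    """Segment leaves as pixels with NDVI above a threshold, then return the
    mean NDVI of the leaf mask and the leaf area in cm^2. Band availability,
    the threshold and the pixel size are assumptions."""
    nir, red = nir.astype(np.float64), red.astype(np.float64)
    ndvi = (nir - red) / (nir + red + 1e-12)
    mask = ndvi > ndvi_threshold
    return float(ndvi[mask].mean()), float(mask.sum() * pixel_area_cm2)
```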

  13. Biooptical variability in the Greenland Sea observed with the Multispectral Airborne Radiometer System (MARS)

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Trees, Charles C.

    1989-01-01

    A site-specific ocean color remote sensing algorithm was developed and used to convert Multispectral Airborne Radiometer System (MARS) spectral radiance measurements to chlorophyll-a concentration profiles along aircraft tracklines in the Greenland Sea. The analysis is described and the results are given in graphical or tabular form. Section 2 describes the salient characteristics and development history of the MARS instrument. Section 3 describes the analyses of MARS flight segments over consolidated sea ice, resulting in a set of altitude-dependent ratios used (over water) to estimate the radiance reflected by the surface and atmosphere from the total radiance measured. Section 4 presents optically weighted pigment concentrations calculated from profile data, and spectral reflectances measured in situ from the top meter of the water column; these data were analyzed to develop an algorithm relating chlorophyll-a concentrations to the ratio of radiance reflectances at 441 and 550 nm (with a selection of coefficients dependent upon whether significant gilvin presence is implied by a low ratio of reflectances at 410 and 550 nm). Section 5 describes the scaling adjustments derived to reconcile the MARS upwelled radiance ratios at 410:550 nm and 441:550 nm with the in situ reflectance ratios measured simultaneously on the surface. Section 6 graphically presents the locations of the MARS data tracklines and the positions of the surface monitoring R/V. Section 7 presents stick-plots of MARS tracklines selected to illustrate two-dimensional spatial variability within the box covered by each day's flight. Section 8 presents curves of chlorophyll-a concentration profiles derived from MARS data along survey tracklines. Significant results are summarized in Section 1.
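
    As a heavily hedged illustration of the algorithm structure described in Section 4, a band-ratio chlorophyll estimate with a coefficient switch driven by the 410:550 nm ratio might look like the sketch below; every coefficient and the cutoff value are hypothetical placeholders, not the algorithm derived in the report.

```python
import numpy as np

def chlorophyll_from_ratios(r410, r441, r550,
                            low_ratio_coeffs=(1.0, -1.5),
                            default_coeffs=(0.8, -1.3),
                            gilvin_cutoff=0.8):
    """Generic band-ratio chlorophyll estimate of the form
    log10(chl) = a + b * log10(R441/R550), switching coefficient sets when a
    low R410/R550 ratio implies significant gilvin. All coefficients and the
    cutoff are hypothetical placeholders, not the MARS algorithm."""
    a, b = (low_ratio_coeffs if r410 / r550 < gilvin_cutoff else default_coeffs)
    return 10.0 ** (a + b * np.log10(r441 / r550))
```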

  14. Multispectral Photometry of the Moon and Absolute Calibration of the Clementine UV/Vis Camera

    NASA Astrophysics Data System (ADS)

    Hillier, John K.; Buratti, Bonnie J.; Hill, Kathryn

    1999-10-01

    We present a multispectral photometric study of the Moon between solar phase angles of 0 and 85°. Using Clementine images obtained between 0.4 and 1.0 μm, we produce a comprehensive study of the lunar surface containing the following results: (1) empirical photometric functions for the spectral range and viewing and illumination geometries mentioned, (2) photometric modeling that derives the physical properties of the upper regolith and includes a detailed study of the causes of the lunar opposition surge, and (3) an absolute calibration of the Clementine UV/Vis camera. The calibration procedure given on the Clementine calibration web site produces reflectances relative to a halon standard, which furthermore appear significantly higher than those seen in ground-based observations. By comparing Clementine observations with prior ground-based observations of 15 sites on the Moon we have determined a good absolute calibration of the Clementine UV/Vis camera. A correction factor of 0.532 has been determined to convert the web site (www.planetary.brown.edu/clementine/calibration.html) reflectances to absolute values. From the calibrated data, we calculate empirical phase functions useful for performing photometric corrections to observations of the Moon between solar phase angles of 0 and 85° and in the spectral range 0.4 to 1.0 μm. Finally, the calibrated data are used to fit a version of Hapke's photometric model modified to incorporate a new formulation, developed in this paper, of the lunar opposition surge which includes coherent backscatter. Recent studies of the lunar opposition effect have yielded contradictory results as to the mechanism responsible: shadow hiding, coherent backscatter, or both. We find that most of the surge can be explained by shadow hiding with a halfwidth of ˜8°. However, for the brightest regions (the highlands at 0.75-1.0 μm) a small additional narrow component (halfwidth of <2°) of total amplitude ˜1/6 to 1/4 that of the shadow hiding surge is

  15. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition

    PubMed Central

    Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.

    2010-01-01

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475

  16. Effectiveness of airborne multispectral thermal data for karst groundwater resources recognition in coastal areas

    NASA Astrophysics Data System (ADS)

    Pignatti, Stefano; Fusilli, Lorenzo; Palombo, Angelo; Santini, Federico; Pascucci, Simone

    2013-04-01

    Currently, the detection, use and management of groundwater in karst regions can be considered one of the most important approaches to solving water scarcity problems during periods of low rainfall, because groundwater resources from karst aquifers play a key role in the water supply of karst areas worldwide [1]. In many countries of the Mediterranean area, where karst is widespread, groundwater resources are still underexploited, while surface waters are generally preferred [2]. Furthermore, carbonate aquifers constitute a crucial thermal water resource outside of volcanic areas, even though there is no detailed and reliable global assessment of thermal water resources. The composite hydrogeological characteristics of karst, particularly the directions and zones of groundwater distribution, have not yet been adequately explained [3]. In view of the abovementioned reasons, the present study analyses the detection capability of high spatial resolution thermal remote sensing for karst water resources in coastal areas, in order to obtain useful information on karst spring flow and on different characteristics of these environments. To this purpose, MIVIS [4, 5] and TASI-600 [6] airborne multispectral thermal imagery (see sensors' characteristics in Table 1) acquired over two coastal areas of the Mediterranean affected by karst activity, one located in Montenegro and one in Italy, were used. One study area is located in the Kotor Bay, a winding bay on the Adriatic Sea surrounded by high mountains in south-western Montenegro and characterized by many subaerial and submarine coastal springs related to deep karstic channels. The other study area is located in Santa Cesarea (Italy), encompassing coastal cold springs, the main local source of high-quality water, and also a noticeable thermal groundwater outflow. The proposed study shows the preliminary results of the two airborne deployments over these areas. The preprocessing of the multispectral thermal imagery

  17. Airborne Multispectral LIDAR Data for Land-Cover Classification and Land/water Mapping Using Different Spectral Indexes

    NASA Astrophysics Data System (ADS)

    Morsy, S.; Shaker, A.; El-Rabbany, A.; LaRocque, P. E.

    2016-06-01

    Airborne Light Detection And Ranging (LiDAR) data are widely used in remote sensing applications, such as topographic and land-water mapping. Recently, airborne multispectral LiDAR sensors, which acquire data at different wavelengths, have become available, allowing a diversity of intensity values to be recorded from different land features. In this study, three normalized difference feature indexes (NDFI), for vegetation, water, and built-up area mapping, were evaluated. The NDFIs, namely NDFI_G-NIR, NDFI_G-MIR, and NDFI_NIR-MIR, were calculated using data collected at three wavelengths (green: 532 nm, near-infrared (NIR): 1064 nm, and mid-infrared (MIR): 1550 nm) by the world's first airborne multispectral LiDAR sensor, "Optech Titan". The Jenks natural breaks optimization method was used to determine the threshold values for each NDFI, in order to cluster the 3D point data into two classes (water and land, or vegetation and built-up area). Two sites at Scarborough, Ontario, Canada were tested to evaluate the performance of the NDFIs for land-water, vegetation, and built-up area mapping. The use of the three NDFIs succeeded in discriminating vegetation from built-up areas with an overall accuracy of 92.51%. Based on the classification results, it is suggested to use NDFI_G-MIR and NDFI_NIR-MIR for vegetation and built-up area extraction, respectively. The clustering results show that the direct use of NDFIs for land-water mapping has low performance. Therefore, the clustered classes based on the NDFIs were constrained by the recorded number of returns from different wavelengths, which improved the overall accuracy to 96.98%.
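
    A sketch of the NDFI computation and a simple two-class natural-breaks threshold (minimising within-class variance, as a stand-in for the Jenks optimisation used in the study); the index choice and the water/land assignment in the usage comment are assumptions.

```python
import numpy as np

def ndfi(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Normalized difference feature index between two intensity channels."""
    a, b = band_a.astype(np.float64), band_b.astype(np.float64)
    return (a - b) / (a + b + 1e-12)

def two_class_break(values: np.ndarray, n_candidates: int = 512) -> float:
    """Two-class natural-breaks threshold: pick the split that minimises the
    summed within-class variance (a simple stand-in for Jenks optimisation)."""
    v = np.sort(values[np.isfinite(values)])
    if n_candidates < v.size:                       # subsample for speed
        v = v[np.linspace(0, v.size - 1, n_candidates).astype(int)]
    best_t, best_cost = v[0], np.inf
    for t in v[1:-1]:
        lo, hi = v[v <= t], v[v > t]
        cost = lo.var() * lo.size + hi.var() * hi.size
        if cost < best_cost:
            best_t, best_cost = t, cost
    return float(best_t)

# Example: split points into water/land with the green-NIR index.
# idx = ndfi(green_intensity, nir_intensity)
# threshold = two_class_break(idx.ravel())
# water_mask = idx < threshold              # class assignment is an assumption
```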

  18. A simple method for vignette correction of airborne digital camera data

    SciTech Connect

    Nguyen, A.T.; Stow, D.A.; Hope, A.S.

    1996-11-01

    Airborne digital camera systems have gained popularity in recent years due to their flexibility, high geometric fidelity and spatial resolution, and fast data turn-around time. However, a common problem that plagues these types of framing systems is vignetting, which causes a falloff in image brightness away from the principal point. This paper presents a simple method for vignetting correction that utilizes laboratory images of a uniform illumination source. Multiple lab images are averaged and inverted to create digital correction templates, which are then applied to actual airborne data. The vignette correction was effective in removing the systematic falloff in spectral values. We have shown that vignette correction is a necessary part of the preprocessing of raw digital airborne remote sensing data. The consequences of not correcting for these effects are demonstrated in the context of monitoring salt marsh habitat. 4 refs.
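
    A minimal sketch of the correction procedure described above, assuming NumPy arrays for the frames (all frame sizes and pixel values below are hypothetical): laboratory images of a uniform source are averaged, inverted into a multiplicative gain template, and applied to airborne frames.

```python
import numpy as np

def vignette_template(lab_frames):
    """Average several laboratory images of a uniform light source and invert
    the result into a per-pixel multiplicative correction template."""
    mean_frame = np.mean(np.stack(lab_frames).astype(float), axis=0)
    # Normalise so the brightest (least vignetted) region keeps its value
    return mean_frame.max() / np.clip(mean_frame, 1e-6, None)

def correct_vignetting(image, template):
    """Apply the multiplicative template to an airborne frame."""
    return image.astype(float) * template

# Hypothetical 8-bit frames of an integrating-sphere style uniform target
lab_frames = [np.random.randint(180, 220, (1024, 1024)) for _ in range(5)]
template = vignette_template(lab_frames)
airborne = np.random.randint(0, 255, (1024, 1024))
corrected = correct_vignetting(airborne, template)
```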

  19. GIS Meets Airborne MSS: Geospatial Applications of High-Resolution Multispectral Data

    SciTech Connect

    Albert Guber

    1999-07-27

    Bechtel Nevada operates and flies Daedalus multispectral scanners for funded project tasks at the Department of Energy's Remote Sensing Laboratory. Historically, processing and analysis of multispectral data has afforded scientists the opportunity to see natural phenomena not visible to the naked eye. However, only recently has a system, more specifically a Geometric Correction System, existed to automatically geo-reference these data directly into a Geographic Information System (GIS) database. Now, analyses previously performed in a nongeospatial environment are integrated directly into an Arc/Info GIS. This technology is of direct benefit to environmental and emergency response applications.

  20. Comparison of airborne multispectral and hyperspectral imagery for estimating grain sorghum yield

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Both multispectral and hyperspectral images are being used to monitor crop conditions and map yield variability, but limited research has been conducted to compare the differences between these two types of imagery for assessing crop growth and yields. The objective of this study was to compare airb...

  1. Estimating Evapotranspiration over Heterogeneously Vegetated Surfaces using Large Aperture Scintillometer, LiDAR, and Airborne Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Geli, H. M.; Neale, C. M.; Pack, R. T.; Watts, D. R.; Osterberg, J.

    2011-12-01

    Estimating evapotranspiration (ET) over heterogeneous areas is challenging, especially in water-limited, sparsely vegetated environments. New techniques such as airborne full-waveform LiDAR (Light Detection and Ranging) and high resolution multispectral and thermal imagery can provide enough detail of sparse canopies to improve energy balance model estimates as well as footprint analysis of scintillometer data. The objectives of this study were to estimate ET over such areas and to develop methodologies for the use of these airborne data technologies. Because of the associated heterogeneity, this study was conducted over the Cibola National Wildlife Refuge, southern California, on an area dominated by tamarisk (salt cedar) forest (90%) interspersed with arrowweed and bare soil (10%). A set of two large aperture scintillometers (LASs) was deployed over the area to provide estimates of sensible heat flux (H_LAS). The LASs were distributed over the area in a way that captured different surface spatial heterogeneity. Bowen ratio (BR) systems were used to provide hydrometeorological variables and surface energy balance flux (SEBF) measurements (i.e. Rn, G, H, and LE). Scintillometer-based estimates of H_LAS were improved by considering the effect of the corresponding 3D footprint and the associated displacement height (d) and roughness length (z0), following Geli et al. (2011). The LiDAR data were acquired using the LASSI LiDAR developed at Utah State University (USU). The data were used to obtain 1-m spatial resolution DEMs and vegetation canopy heights to improve the H_LAS estimates. The BR measurements of Rn and G were combined with the LAS estimates, H_LAS, to provide estimates of LE_LAS as a residual of the energy balance equation. A thermal remote sensing model, namely the two-source energy balance (TSEB) model of Norman et al. (1995), was applied to provide spatial estimates of SEBF. Four airborne images at 1-4 meter spatial resolution acquired using the USU airborne
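
    The residual step mentioned above reduces to a one-line calculation; the flux values in this sketch are hypothetical.

```python
def latent_heat_residual(rn, g, h_las):
    """Latent heat flux as the residual of the surface energy balance,
    LE = Rn - G - H, with H taken from the scintillometer-based estimate
    (all fluxes in W m-2)."""
    return rn - g - h_las

# Hypothetical half-hourly fluxes
le_las = latent_heat_residual(rn=520.0, g=80.0, h_las=290.0)   # -> 150.0 W m-2
```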

  2. A comparison between satellite and airborne multispectral data for the assessment of Mangrove areas in the eastern Caribbean

    SciTech Connect

    Green, E.P.; Edwards, A.J.; Mumby, P.J.

    1997-06-01

    Satellite (SPOT XS and Landsat TM) and airborne multispectral (CASI) imagery was acquired from the Turks and Caicos Islands, British West Indies. The descriptive resolution and accuracy of each image type is compared for two applications: mangrove habitat mapping and the measurement of mangrove canopy characteristics (leaf area index and canopy closure). Mangroves could be separated from non-mangrove vegetation to an accuracy of only 57% with SPOT XS data but better discrimination could be achieved with either Landsat TM or CASI (in both cases accuracy was >90%). CASI data permitted a more accurate classification of different mangrove habitats than was possible using Landsat TM. Nine mangrove habitats could be mapped to an accuracy of 85% with the high-resolution airborne data compared to 31% obtained with TM. A maximum of three mangrove habitats were separable with Landsat TM: the accuracy of this classification was 83%. Measurement of mangrove canopy characteristics is achieved more accurately with CASI than with either satellite sensor, but high costs probably make it a less cost-effective option. The cost-effectiveness of each sensor is discussed for each application.

  3. A versatile photogrammetric camera automatic calibration suite for multispectral fusion and optical helmet tracking

    NASA Astrophysics Data System (ADS)

    de Villiers, Jason; Jermy, Robert; Nicolls, Fred

    2014-06-01

    This paper presents a system to determine the photogrammetric parameters of a camera. The lens distortion, focal length and camera six degree of freedom (DOF) position are calculated. The system caters for cameras of different sensitivity spectra and fields of view without any mechanical modifications. The distortion characterization, a variant of Brown's classic plumb line method, allows for many radial and tangential distortion coefficients and finds the optimal principal point. Typical values are 5 radial and 3 tangential coefficients. These parameters are determined stably and demonstrably produce superior results to low-order models, despite popular and prevalent misconceptions to the contrary. The system produces coefficients to model both the distorted-to-undistorted pixel coordinate transformation (e.g. for target designation) and the inverse transformation (e.g. for image stitching and fusion), allowing deterministic rates far exceeding real time. The focal length is determined so as to minimise the error in absolute photogrammetric positional measurement for both multi-camera and monocular (e.g. helmet tracker) systems. The system determines the 6 DOF position of the camera in a chosen coordinate system. It can also determine the 6 DOF offset of the camera relative to its mechanical mount. This allows faulty cameras to be replaced without requiring a recalibration of the entire system (such as an aircraft cockpit). Results from two simple applications of the calibration results are presented: stitching and fusion of the images from a dual-band visual/LWIR camera array, and a simple laboratory optical helmet tracker.
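
    The paper's exact parameterisation is not reproduced here; the sketch below uses the standard Brown-Conrady form (an arbitrary number of radial terms plus the first two tangential terms) as a stand-in, with purely illustrative coefficient values, to show how distorted coordinates would be generated from undistorted ones.

```python
import numpy as np

def brown_distort(x, y, k, p, cx=0.0, cy=0.0):
    """Map undistorted normalised coordinates to distorted ones using a
    Brown-style model with radial (k) and tangential (p) coefficients,
    relative to the principal point (cx, cy)."""
    xc, yc = x - cx, y - cy
    r2 = xc**2 + yc**2
    radial = 1.0 + sum(ki * r2**(i + 1) for i, ki in enumerate(k))
    xd = xc * radial + 2*p[0]*xc*yc + p[1]*(r2 + 2*xc**2)
    yd = yc * radial + p[0]*(r2 + 2*yc**2) + 2*p[1]*xc*yc
    return xd + cx, yd + cy

# Illustrative coefficients: 5 radial terms, first 2 tangential terms
k = [-0.12, 0.03, -0.004, 0.0002, 0.0]
p = [1e-4, -5e-5]
xd, yd = brown_distort(0.3, -0.2, k, p)
```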

  4. Optical characterization of UV multispectral imaging cameras for SO2 plume measurements

    NASA Astrophysics Data System (ADS)

    Stebel, K.; Prata, F.; Dauge, F.; Durant, A.; Amigo, A.

    2012-04-01

    Only a few years ago, spectral imaging cameras for SO2 plume monitoring were developed for remote sensing of volcanic plumes. We describe the development from a first camera using a single filter in the absorption band of SO2 to more advanced systems using several filters and an integrated spectrometer. The first system was based on the Hamamatsu C8484 UV camera (1344 x 1024 pixels) with high quantum efficiency in the UV region from 280 nm onward. At the heart of the second UV camera system, EnviCam, is a cooled Alta U47 camera, equipped with two on-band (310 and 315 nm) and two off-band (325 and 330 nm) filters. The third system again utilizes the uncooled Hamamatsu camera for faster sampling (~10 Hz) and a four-position filter wheel equipped with two 10 nm filters centered at 310 and 330 nm, a UV broadband view and a blackened plate for dark-current measurement. Both cameras have been tested with lenses of different focal lengths. A co-aligned spectrometer provides a ~0.3 nm resolution spectrum within the field of view of the camera. We describe the ground-based imaging camera systems developed and utilized at our Institute. Custom-made cylindrical quartz calibration cells with 50 mm diameter, covering the entire field of view of the camera optics, are filled with various amounts of gaseous SO2 (typically between 100 and 1500 ppm•m). They are used for calibration and characterization of the cameras in the laboratory. We report on the procedures for monitoring and analyzing SO2 path-concentration and fluxes. This includes a comparison of the calibration in the atmosphere using the SO2 cells versus the SO2 retrieval from the integrated spectrometer. The first UV cameras have been used to monitor ship emissions (Ny-Ålesund, Svalbard and Genova, Italy). The second generation of cameras was first tested for industrial stack monitoring during a field campaign close to the Rovinari (Romania) power plant in September 2010, revealing very high SO2 emissions

  5. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test

    PubMed Central

    Lebourgeois, Valentine; Bégué, Agnès; Labbé, Sylvain; Mallavan, Benjamin; Prévot, Laurent; Roux, Bruno

    2008-01-01

    The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.

  6. Integrating an RGB - CIR Digital Camera With an Airborne Laser Swath Mapping System

    NASA Astrophysics Data System (ADS)

    Lee, M.; Carter, W.; Shrestha, R.

    2003-12-01

    The National Science Foundation supported Center for Airborne Laser Mapping (NCALM) utilizes the airborne laser swath mapping (ALSM) system jointly owned by the University of Florida (UF) and Florida International University (FIU). The UF/FIU ALSM system is comprised of an Optech Inc. Model 1233 ALTM unit, with supporting GPS receiver and real-time navigation display, mounted in a twin-inline-engine Cessna 337 aircraft. Shortly after taking delivery of the ALSM system, UF researchers, in collaboration with a commercial partner, added a small format digital camera (Kodak 420) to the system, rigidly mounting it to the ALSM sensor head. Software was developed to use the GPS position and orientation parameters from the IMU unit in the ALSM sensor to rectify and mosaic the digital images. The ALSM height and intensity values were combined pixel by pixel with the RGB digital images to classify surface materials. Based on our experience with the initial camera, and recommendations received at the NCALM workshop, UF researchers decided to upgrade the system to a Redlake MASD Inc. model MS4100 RGB/CIR camera. The MS4100 contains three CCD arrays, which simultaneously capture full spatial resolution images in the red and near-IR bands, and images at a factor of two lower spatial resolution in the blue and green bands (these two bands share a single CCD array, with the colors separated by a Bayer filter). The CCD arrays are rectangular with 1920 x 1080 elements, each element being 7.4 x 7.4 micrometers. With a 28 mm focal length lens, and at a flying height of 550 meters, the effective ground element (groundel) is approximately 15 x 15 cm. The new digital camera should be particularly useful for studies of vegetation, including agricultural and forestry applications, and for computer-automated classification of surface materials. Examples of early results using the improved ALSM-digital imaging capabilities will be presented.
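
    The quoted ~15 cm ground element follows from a simple pinhole relation; the short calculation below reproduces it from the numbers given in the abstract.

```python
def ground_sample_distance(pixel_pitch_m, altitude_m, focal_length_m):
    """Approximate nadir ground footprint of one detector element."""
    return pixel_pitch_m * altitude_m / focal_length_m

# 7.4 um pitch, 28 mm lens, 550 m flying height -> ~0.145 m, i.e. roughly 15 cm
gsd = ground_sample_distance(7.4e-6, 550.0, 0.028)
```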

  7. Simulation of LANDSAT multispectral scanner spatial resolution with airborne scanner data

    NASA Technical Reports Server (NTRS)

    Hlavka, C. A.

    1986-01-01

    A technique for simulating low spatial resolution satellite imagery using high resolution scanner data is described. The scanner data are convolved with the approximate point spread function of the low resolution data and then resampled to emulate low resolution imagery. The technique was successfully applied to Daedalus airborne scanner data to simulate a portion of a LANDSAT multispectral scanner scene.
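
    A minimal sketch of the convolve-then-resample idea, assuming a Gaussian point spread function and SciPy's ndimage routines; the actual sensor PSF and grid sizes used in the study are not specified here, so all parameters below are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def simulate_coarse_sensor(fine_image, fine_gsd_m, coarse_gsd_m, psf_sigma_m):
    """Approximate a low-resolution scanner image from high-resolution airborne
    data: blur with an assumed Gaussian point spread function, then resample."""
    blurred = gaussian_filter(fine_image.astype(float), sigma=psf_sigma_m / fine_gsd_m)
    return zoom(blurred, fine_gsd_m / coarse_gsd_m, order=1)

# Hypothetical 5 m airborne band resampled to an ~80 m MSS-like grid
fine = np.random.rand(512, 512)
coarse = simulate_coarse_sensor(fine, fine_gsd_m=5.0, coarse_gsd_m=80.0, psf_sigma_m=40.0)
```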

  8. Comparison of multispectral airborne scanner reflectance images with ground surface reflectance measurements

    SciTech Connect

    Kollewe, M.; Bienlein, J.; Kollewe, T.; Spitzer, H.

    1996-11-01

    Simultaneously with an airborne data-taking campaign near the city of Nurnberg (FRG), performed with an imaging 11-channel scanner of type Daedalus AADS 1268, ground reference measurements of reflectance spectra were conducted with a spectrally high-resolving spectroradiometer of type IRIS at selected test sites. Using a previously developed method, reflectance images are calculated from the raw aerial data. Thus, physical quantities of the surfaces are generated which are independent of illumination and registration conditions. The airborne scanner reflectance images are compared with ground reference reflectance measurements. The comparison yields deviations of up to 35%. These can partially be explained by an inaccurate calibration of the airborne scanner. In addition, errors arise during calculation of the reflectances due to simplifying model assumptions and inexact knowledge of the values of the model input parameters. It is shown that calibration of the airborne scanner data with the ground reference measurements improves the results, as compared to calibration based on laboratory test-bench measurements. 8 refs., 4 figs., 1 tab.

  9. Estimating vegetation coverage in St. Joseph Bay, Florida with an airborne multispectral scanner

    NASA Technical Reports Server (NTRS)

    Savastano, K. J.; Faller, K. H.; Iverson, R. L.

    1984-01-01

    A four-channel multispectral scanner (MSS) carried aboard an aircraft was used to collect data along several flight paths over St. Joseph Bay, FL. Various classifications of benthic features were defined from the results of ground-truth observations. The classes were statistically correlated with MSS channel signal intensity using multivariate methods. Application of the classification measures to the MSS data set allowed computer construction of a detailed map of benthic features of the bay. Various densities of seagrasses, various bottom types, and algal coverage were distinguished from water of various depths. The areal vegetation coverage of St. Joseph Bay was not significantly different from the results of a survey conducted six years previously, suggesting that seagrasses are a very stable feature of the bay bottom.

  10. Active/passive scanning. [airborne multispectral laser scanners for agricultural and water resources applications

    NASA Technical Reports Server (NTRS)

    Woodfill, J. R.; Thomson, F. J.

    1979-01-01

    The paper deals with the design, construction, and applications of an active/passive multispectral scanner combining lasers with conventional passive remote sensors. An application investigation was first undertaken to identify remote sensing applications where active/passive scanners (APS) would provide improvement over current means. Calibration techniques and instrument sensitivity are evaluated to provide predictions of the APS's capability to meet user needs. A preliminary instrument design was developed from the initial conceptual scheme. A design review settled the issues of worthwhile applications, calibration approach, hardware design, and laser complement. Next, a detailed mechanical design was drafted and construction of the APS commenced. The completed APS was tested and calibrated in the laboratory, then installed in a C-47 aircraft and ground tested. Several flight tests completed the test program.

  11. Capturing the Green River -- Multispectral airborne videography to evaluate the environmental impacts of hydropower operations

    SciTech Connect

    Snider, M.A.; Hayse, J.W.; Hlohowskyj, I.; LaGory, K.E.

    1996-02-01

    The 500-mile long Green River is the largest tributary of the Colorado River. From its origin in the Wind River Range mountains of western Wyoming to its confluence with the Colorado River in southeastern Utah, the Green River is vital to the arid region through which it flows. Large portions of the area remain near-wilderness with the river providing a source of recreation in the form of fishing and rafting, irrigation for farming and ranching, and hydroelectric power. In the late 1950s and early 1960s hydroelectric facilities were built on the river. One of these, Flaming Gorge Dam, is located just south of the Utah-Wyoming border near the town of Dutch John, Utah. Hydropower operations result in hourly and daily fluctuations in the releases of water from the dam that alter the natural stream flow below the dam and affect natural resources in and along the river corridor. In the present study, the authors were interested in evaluating the potential impacts of hydropower operations at Flaming Gorge Dam on the downstream natural resources. Considering the size of the area affected by the daily pattern of water release at the dam as well as the difficult terrain and limited accessibility of many reaches of the river, evaluating these impacts using standard field study methods was virtually impossible. Instead an approach was developed that used multispectral aerial videography to determine changes in the affected parameters at different flows, hydrologic modeling to predict flow conditions for various hydropower operating scenarios, and ecological information on the biological resources of concern to assign impacts.

  12. Airborne Camera System for Real-Time Applications - Support of a National Civil Protection Exercise

    NASA Astrophysics Data System (ADS)

    Gstaiger, V.; Romer, H.; Rosenbaum, D.; Henkel, F.

    2015-04-01

    In the VABENE++ project of the German Aerospace Center (DLR), powerful tools are being developed to aid public authorities and organizations with security responsibilities, as well as traffic authorities, when dealing with disasters and large public events. One focus lies on the acquisition of high resolution aerial imagery, its fully automatic processing and analysis, and its near real-time provision to decision makers in emergency situations. For this purpose a camera system was developed to be operated from a helicopter, with light-weight processing units and a microwave link for fast data transfer. In order to meet end-users' requirements, DLR works closely with the German Federal Office of Civil Protection and Disaster Assistance (BBK) within this project. One task of BBK is to establish, maintain and train the German Medical Task Force (MTF), which is deployed nationwide in case of large-scale disasters. In October 2014, several units of the MTF were deployed for the first time in the framework of a national civil protection exercise in Brandenburg. The VABENE++ team joined the exercise and provided near real-time aerial imagery, videos and derived traffic information to support the direction of the MTF and to identify needs for further improvements and developments. In this contribution the authors introduce the new airborne camera system together with its near real-time processing components and share experiences gained during the national civil protection exercise.

  13. Airborne multispectral and hyperspectral remote sensing: Examples of applications to the study of environmental and engineering problems

    SciTech Connect

    Bianchi, R.; Marino, C.M.

    1997-10-01

    The availability of a new aerial survey capability operated by CNR/LARA (National Research Council - Airborne Laboratory for Environmental Research), based on the AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) spectroradiometer on board a CASA 212/200 aircraft, enables scientists to obtain innovative data sets for different approaches to defining and understanding a variety of environmental and engineering problems. The spectral bandwidths of the 102 MIVIS channels were chosen to meet the needs of scientific research for advanced applications of remote sensing data. In this configuration, MIVIS can offer significant contributions to problem solving in broad sectors such as geologic exploration, agricultural crop studies, forestry, land use mapping, hydrogeology, oceanography and others. Between 1994 and 1996, LARA was active over different test sites in joint ventures with JPL (Pasadena), various European institutions, and Italian universities and research institutes. These aerial surveys allow the national and international scientific community to approach the use of hyperspectral remote sensing in environmental problems of very wide interest. The deployments have so far gathered more than 300 GBytes of MIVIS data in more than 30 hours of VLDS data recording. The purpose of this work is to present and comment on the procedures and results, at both research and operational levels, of the past campaigns, with special reference to the study of environmental and engineering problems.

  14. Hydrological characterization of a riparian vegetation zone using high resolution multi-spectral airborne imagery

    NASA Astrophysics Data System (ADS)

    Akasheh, Osama Z.

    The Middle Rio Grande River (MRGR) is the main source of fresh water for the state of New Mexico. Because the river flows through an arid area with scarce local water resources, its water has been extensively diverted to supply the high demand from municipalities and irrigated agriculture. The extensive water diversions over the last few decades have affected the composition of the native riparian vegetation, decreasing the area of cottonwood and coyote willow and increasing the spread of invasive species such as tamarisk and Russian olive, which are harmful to the river system due to their high transpiration rates and their effect on the river aquatic system. Studying the river's hydrological processes and their relation to its health is important for preserving the river ecosystem. To this end, a detailed vegetation map was produced for 286 km of river reach using a Utah State University airborne remote sensing system. A groundwater model was also built in the ArcGIS environment to estimate soil water potential in the root zone and above the modeled water table. The modified Penman-Monteith empirical equation was used in the ArcGIS environment to estimate riparian vegetation ET, taking advantage of the detailed vegetation map and the spatial soil water potential layers. Vegetation water use per linear river reach was estimated to help decision makers better manage water releases that maintain a sound river ecosystem and support agricultural activities.

  15. Multispectral Photography

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Model II Multispectral Camera is an advanced aerial camera that provides optimum enhancement of a scene by recording spectral signatures of ground objects only in narrow, preselected bands of the electromagnetic spectrum. Its photos have applications in such areas as agriculture, forestry, water pollution investigations, soil analysis, geologic exploration, water depth studies and camouflage detection. The target scene is simultaneously photographed in four separate spectral bands. Using a multispectral viewer such as its Model 75, Spectral Data creates a color image from the black-and-white positives taken by the camera. With this optical image analysis unit, all four bands are superimposed in accurate registration and illuminated with combinations of blue, green, red, and white light. The best color combination for displaying the target object is selected and printed. Spectral Data Corporation produces several types of remote sensing equipment and also provides aerial survey, image processing and analysis, and a number of other remote sensing services.

  16. Potential of Uav-Based Laser Scanner and Multispectral Camera Data in Building Inspection

    NASA Astrophysics Data System (ADS)

    Mader, D.; Blaskow, R.; Westfeld, P.; Weller, C.

    2016-06-01

    Conventional building inspection of bridges, dams or large constructions in general is rather time-consuming and often costly due to traffic closures and the need for special heavy vehicles such as under-bridge inspection units or other large lifting platforms. In view of this, an unmanned aerial vehicle (UAV) can be more reliable and efficient as well as less expensive and simpler to operate, and the use of UAVs as an assisting tool in building inspections is an obvious step. Furthermore, light-weight special sensors such as infrared and thermal cameras as well as laser scanners are available and well suited for use on unmanned aircraft systems. Such a flexible low-cost system is realized in the ADFEX project with the goal of time-efficient object exploration, monitoring and damage detection. For this purpose, a fleet of UAVs, equipped with several sensors for navigation, obstacle avoidance and 3D object-data acquisition, has been developed and constructed. This contribution deals with the potential of UAV-based data in building inspection. Therefore, an overview of the ADFEX project, sensor specifications and the requirements of building inspections in general is given. On the basis of results achieved in practical studies, the applicability and potential of the UAV system in building inspection are presented and discussed.

  17. Preliminary investigation of multispectral retinal tissue oximetry mapping using a hyperspectral retinal camera.

    PubMed

    Desjardins, Michèle; Sylvestre, Jean-Philippe; Jafari, Reza; Kulasekara, Susith; Rose, Kalpana; Trussart, Rachel; Arbour, Jean Daniel; Hudson, Chris; Lesage, Frédéric

    2016-05-01

    Oximetry measurement of principal retinal vessels represents a first step towards understanding retinal metabolism, but the technique could be significantly enhanced by spectral imaging of the fundus outside of main vessels. In this study, a recently developed Hyperspectral Retinal Camera was used to measure relative oximetric (SatO2) and total hemoglobin (HbT) maps of the retina, outside of large vessels, in healthy volunteers at baseline (N = 7) and during systemic hypoxia (N = 11), as well as in patients with glaucoma (N = 2). Images of the retina, on a field of view of ∼30°, were acquired between 500 and 600 nm with 2 and 5 nm steps, in under 3 s. The reflectance spectrum from each pixel was fitted to a model having oxy- and deoxyhemoglobin as the main absorbers and scattering modeled by a power law, yielding estimates of relative SatO2 and HbT over the fundus. Average optic nerve head (ONH) saturation over 8 eyes was 68 ± 5%. During systemic hypoxia, mean ONH saturation decreased by 12.5% on average. Upon further development and validation, the relative SatO2 and HbT maps of microvasculature obtained with this imaging system could ultimately contribute to the diagnosis and management of diseases affecting the ONH and retina. PMID:27060375

  18. Photointerpretation of Skylab 2 multispectral camera (S-190A) data: Advance report of significant results

    NASA Technical Reports Server (NTRS)

    Jensen, M. L. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. A significant and possibly major economic example of the practical value of Skylab photographs was provided by locating, on Skylab Camera Station Number 4, frame 010, SL-2, an area of exposed limestone rocks which had been thought to be completely covered by volcanic rocks based upon prior mapping. The area is located less than 12 miles north of the Ruth porphyry copper deposit, White Pine County, Nevada. This is a major copper-producing open pit mine owned by Kennecott Copper Corporation. Geophysical maps consisting of gravity and aeromagnetic studies have been published indicating three large positive magnetic anomalies located at the Ruth ore deposits, at Ward Mountain (not a mineralized area), and in the area previously thought to be completely covered by post-ore volcanics. Skylab photos indicate, however, that erosion has removed the volcanic cover at specific sites sufficiently to expose the underlying older rocks, suggesting that the volcanic rocks may not be the cause of the aeromagnetic anomaly. Field studies have verified the initial interpretations made from the Skylab photos. The potential significance of this study is that the large positive aeromagnetic anomaly suggests the presence of cooled and solidified magma below the anomalies, from which ore-bearing solutions may have been derived, possibly forming large ore deposits.

  19. Biophysical control of intertidal benthic macroalgae revealed by high-frequency multispectral camera images

    NASA Astrophysics Data System (ADS)

    van der Wal, Daphne; van Dalen, Jeroen; Wielemaker-van den Dool, Annette; Dijkstra, Jasper T.; Ysebaert, Tom

    2014-07-01

    Intertidal benthic macroalgae are a biological quality indicator in estuaries and coasts. While remote sensing has been applied to quantify the spatial distribution of such macroalgae, it is generally not used for their monitoring. We examined the day-to-day and seasonal dynamics of macroalgal cover on a sandy intertidal flat using visible and near-infrared images from a time-lapse camera mounted on a tower. Benthic algae were identified using supervised, semi-supervised and unsupervised classification techniques, validated with monthly ground-truthing over one year. A supervised classification (based on maximum likelihood, using training areas identified in the field) performed best in discriminating between sediment, benthic diatom films and macroalgae, with highest spectral separability between macroalgae and diatoms in spring/summer. An automated unsupervised classification (based on the Normalised Difference Vegetation Index, NDVI) allowed detection of daily changes in macroalgal coverage without the need for calibration. This method showed a bloom of macroalgae (filamentous green algae, Ulva sp.) in summer with > 60% cover, but with pronounced superimposed day-to-day variation in cover. Waves were a major factor in regulating macroalgal cover, but regrowth of the thalli after a summer storm was fast (2 weeks). Images and in situ data demonstrated that the protruding tubes of the polychaete Lanice conchilega facilitated both settlement (anchorage) and survival (resistance to waves) of the macroalgae. Thus, high-frequency, high resolution images revealed the mechanisms for regulating the dynamics in cover of the macroalgae and for their spatial structuring. Ramifications for the mode, timing, frequency and evaluation of monitoring macroalgae by field and remote sensing surveys are discussed.
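
    As a compact illustration of the supervised step described above, the sketch below implements a per-class Gaussian maximum likelihood classifier in Python/NumPy; the band values and class statistics are synthetic and stand in for training areas digitised in the field.

```python
import numpy as np

def train_ml_classifier(samples_by_class):
    """Fit a Gaussian (mean, covariance) per class from training-area pixels."""
    stats = {}
    for name, pixels in samples_by_class.items():   # pixels: (n_samples, n_bands)
        mean = pixels.mean(axis=0)
        cov = np.cov(pixels, rowvar=False)
        stats[name] = (mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return stats

def classify_ml(pixels, stats):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    names, scores = list(stats), []
    for name in names:
        mean, inv_cov, logdet = stats[name]
        d = pixels - mean
        maha = np.einsum('ij,jk,ik->i', d, inv_cov, d)   # Mahalanobis distances
        scores.append(-0.5 * (maha + logdet))
    return np.array(names)[np.argmax(np.stack(scores), axis=0)]

# Synthetic training pixels in (red, NIR) reflectance space
rng = np.random.default_rng(0)
train = {
    "sediment":   rng.normal([0.20, 0.25], 0.02, (200, 2)),
    "diatoms":    rng.normal([0.15, 0.35], 0.02, (200, 2)),
    "macroalgae": rng.normal([0.08, 0.55], 0.03, (200, 2)),
}
stats = train_ml_classifier(train)
labels = classify_ml(rng.normal([0.10, 0.50], 0.05, (1000, 2)), stats)
```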

  20. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    NASA Astrophysics Data System (ADS)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is evident all over the world. In our research we focus on the impact of the now non-operating ferrous "Kremikovtsi" open pit mine and the related waste dumps and tailings, which we consider to be the major factor responsible for pollution of a densely populated region in Bulgaria. The adopted approach is based on correct estimation of the distribution of iron oxides inside the open pit mine and the neighboring regions, considered in this case to be the key issue for assessing the ecological state of soils, vegetation and water. For this study the foremost data source is airborne imagery, which, combined with ground-based in-situ and laboratory data, was used for verification of the environmental variables and thus for assessment of the present environmental status influenced by previous mining activities. The percentage of iron content was selected as the main indicator of metal pollution, since it can be reliably identified in the multispectral data used in this study and since iron compounds are widespread in most minerals, rocks and soils. The number of samples from every source (air, field, lab) was chosen so as to be statistically sound. In order to establish a relationship between the degree of soil pollution and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory analysis: chemical and mineralogical analysis, and non-destructive spectroscopy. For verification of environmental variables over large areas, multispectral satellite data from the Landsat TM/ETM+ and ALI/OLI (Operational Land Imager) instruments were used. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  1. Land cover/use classification of Cairns, Queensland, Australia: A remote sensing study involving the conjunctive use of the airborne imaging spectrometer, the large format camera and the thematic mapper simulator

    NASA Technical Reports Server (NTRS)

    Heric, Matthew; Cox, William; Gordon, Daniel K.

    1987-01-01

    In an attempt to improve the land cover/use classification accuracy obtainable from remotely sensed multispectral imagery, Airborne Imaging Spectrometer-1 (AIS-1) images were analyzed in conjunction with Thematic Mapper Simulator (NS001) data, Large Format Camera color infrared photography, and black-and-white aerial photography. Specific portions of the combined data set were registered and used for classification. Following this procedure, the resulting derived data were tested using an overall accuracy assessment method. Precise photogrammetric 2D-3D-2D geometric modeling techniques are not the basis of this study. Instead, the discussion presents the spectral findings resulting from the image-to-image registrations. Problems associated with the AIS-1/TMS integration are considered, and useful applications of the imagery combination are presented. More advanced methodologies for imagery integration are needed if multisystem data sets are to be utilized fully. Nevertheless, the research described herein provides a formulation for future Earth Observation Station related multisensor studies.

  2. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit.

    PubMed

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-09-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15-20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  3. Multispectral airborne imagery in the field reveals genetic determinisms of morphological and transpiration traits of an apple tree hybrid population in response to water deficit

    PubMed Central

    Virlet, Nicolas; Costes, Evelyne; Martinez, Sébastien; Kelner, Jean-Jacques; Regnard, Jean-Luc

    2015-01-01

    Genetic studies of response to water deficit in adult trees are limited by low throughput of the usual phenotyping methods in the field. Here, we aimed at overcoming this bottleneck, applying a new methodology using airborne multispectral imagery and in planta measurements to compare a high number of individuals. An apple tree population, grafted on the same rootstock, was submitted to contrasting summer water regimes over two years. Aerial images acquired in visible, near- and thermal-infrared at three dates each year allowed calculation of vegetation and water stress indices. Tree vigour and fruit production were also assessed. Linear mixed models were built accounting for date and year effects on several variables and including the differential response of genotypes between control and drought conditions. Broad-sense heritability of most variables was high and 18 quantitative trait loci (QTLs) independent of the dates were detected on nine linkage groups of the consensus apple genetic map. For vegetation and stress indices, QTLs were related to the means, the intra-crown heterogeneity, and differences induced by water regimes. Most QTLs explained 15−20% of variance. Airborne multispectral imaging proved relevant to acquire simultaneous information on a whole tree population and to decipher genetic determinisms involved in response to water deficit. PMID:26208644

  5. Mastcam-Z: Designing a Geologic, Stereoscopic, and Multispectral Pair of Zoom Cameras for the NASA Mars 2020 Rover

    NASA Astrophysics Data System (ADS)

    Bell, J. F.; Maki, J. N.; Mehall, G. L.; Ravine, M. A.; Caplinger, M. A.; Mastcam-Z Team

    2016-10-01

    Mastcam-Z is a stereoscopic, multispectral imaging investigation selected for flight on the Mars 2020 rover mission. In this presentation we review our science goals and requirements and describe our CDR-level design and operational plans.

  6. Coastal survey with a multispectral video system

    NASA Astrophysics Data System (ADS)

    Niedrauer, Terren M.

    1991-09-01

    Xybion Corporation has developed an airborne multispectral measurement system (AMMS) as part of a small business innovative research contract with the Department of Commerce. The AMMS is a low-cost portable system that can provide multispectral data suitable for frequent measurement and mapping. It has been used for measurement of estuarine concentrations of chlorophyll and suspended sediments and mapping of submerged aquatic vegetation fields. Other applications include the identification of tree and plant species, the detection of crop stress, and the detection of man-made objects in a background of vegetation. The AMMS provides high spatial resolution multispectral image data in six user-defined bands in the 400-900 nm wavelength region. The AMMS includes a highly innovative, computer-controlled, intensified, multispectral video camera (IMC), a spectroradiometer, an S-VHS VCR, and a portable IBM-PC-compatible computer system. An airborne trial over Chesapeake Bay in June 1990 showed its ability to detect variations in water parameters. Simultaneous measurements from a ship provided sea-surface data, including continuous fluorometer readings, and discrete samples of chlorophyll, suspended sediments, and several other water parameters. Two spectroradiometers were included in the airborne equipment. One pointed downward to provide a high-resolution spectrum of a large water area under the plane. The other spectroradiometer measured downwelling irradiance. This allowed for conversion of the upwelling radiances measured by the IMC into reflectances. Calibrations for the IMC and the spectroradiometers were done before and after the trials. The results of this airborne trial are presented.
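
    The radiance-to-reflectance conversion mentioned above can be sketched as follows, assuming an approximately Lambertian upwelling field (the band values are hypothetical).

```python
import numpy as np

def irradiance_reflectance(upwelling_radiance, downwelling_irradiance):
    """Apparent reflectance from upwelling radiance and downwelling irradiance,
    R = pi * Lu / Ed, under a Lambertian assumption."""
    return np.pi * upwelling_radiance / downwelling_irradiance

# Hypothetical single-band values from the IMC and the upward-looking spectroradiometer
reflectance = irradiance_reflectance(upwelling_radiance=0.012, downwelling_irradiance=1.1)
```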

  7. Non-invasive skin oxygenation imaging using a multi-spectral camera system: effectiveness of various concentration algorithms applied on human skin

    NASA Astrophysics Data System (ADS)

    Klaessens, John H. G. M.; Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf M.

    2009-02-01

    This study describes noninvasive noncontact methods to acquire and analyze functional information from the skin. Multispectral images at several selected wavelengths in the visible and near infrared region are collected and used in mathematical methods to calculate concentrations of different chromophores in the epidermis and dermis of the skin. This is based on the continuous wave Near Infrared Spectroscopy method, which is a well known non-invasive technique for measuring oxygenation changes in the brain and in muscle tissue. Concentration changes of hemoglobin (dO2Hb, dHHb and dtHb) can be calculated from light attenuations using the modified Lambert-Beer equation. We applied this technique to multi-spectral images taken from the skin surface using different algorithms for calculating changes in O2Hb, HHb and tHb. In clinical settings, the imaging of local oxygenation variations and/or blood perfusion in the skin can be useful for e.g. detection of skin cancer, detection of early inflammation, checking the level of peripheral nerve block anesthesia, study of wound healing and tissue viability by skin flap transplantations. Images from the skin are obtained with a multi-spectral imaging system consisting of a 12-bit CCD camera in combination with a Liquid Crystal Tunable Filter. The skin is illuminated with either a broad band light source or a tunable multi wavelength LED light source. A polarization filter is used to block the direct reflected light. The collected multi-spectral imaging data are images of the skin surface radiance; each pixel contains either the full spectrum (420 - 730 nm) or a set of selected wavelengths. These images were converted to reflectance spectra. The algorithms were validated during skin oxygen saturation changes induced by temporary arm clamping and applied to some clinical examples. The initial results with the multi-spectral skin imaging system show good performance in detecting dynamic changes in oxygen concentration. However, the
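
    A minimal sketch of the modified Lambert-Beer step described above: attenuation changes at a few wavelengths are inverted for concentration changes of oxy- and deoxyhemoglobin by linear least squares. The extinction coefficients, wavelengths, pathlength and differential pathlength factor below are illustrative placeholders, not the values used in the study.

```python
import numpy as np

# Illustrative (not tabulated) extinction coefficients [1/(mM*cm)] at a few
# assumed wavelengths; real applications use published hemoglobin spectra.
wavelengths = np.array([500, 560, 577, 600])          # nm
eps_hbo2    = np.array([4.9, 8.4, 15.0, 0.8])
eps_hhb     = np.array([4.6, 13.5, 9.2, 3.1])

def hemoglobin_changes(delta_attenuation, pathlength_cm=0.1, dpf=5.0):
    """Solve the modified Lambert-Beer system dA = E @ dC * d * DPF for the
    concentration changes (dO2Hb, dHHb) by linear least squares."""
    E = np.column_stack([eps_hbo2, eps_hhb]) * pathlength_cm * dpf
    dC, *_ = np.linalg.lstsq(E, delta_attenuation, rcond=None)
    return dC   # [dO2Hb, dHHb] in mM; dtHb is their sum

dA = np.array([0.02, 0.05, 0.07, 0.01])               # hypothetical attenuation changes
d_o2hb, d_hhb = hemoglobin_changes(dA)
```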

  8. Multispectral photography for earth resources

    NASA Technical Reports Server (NTRS)

    Wenderoth, S.; Yost, E.; Kalia, R.; Anderson, R.

    1972-01-01

    A guide for producing accurate multispectral results for earth resource applications is presented along with theoretical and analytical concepts of color and multispectral photography. Topics discussed include: capabilities and limitations of color and color infrared films; image color measurements; methods of relating ground phenomena to film density and color measurement; sensitometry; considerations in the selection of multispectral cameras and components; and mission planning.

  9. Russian multispectral-hyperspectral airborne scanner for geological and environmental investigations - {open_quotes}Vesuvius-EC{close_quotes}

    SciTech Connect

    Yassinsky, G.I.; Shilin, B.V.

    1996-07-01

    Small variations of spectral characteristics in the 0.3-14 micron band are of great significance in geological and environmental investigations. A multipurpose multispectral digital scanner with a narrow field of view, high spectral resolution and radiometric calibration has been designed in Russia. Interchangeable modules allow the parameters of the device to be adapted for practical use.

  10. Airborne remote sensing in precision viticulture: assessment of quality and quantity of vineyard production using multispectral imagery: a case study in Velletri, Rome surroundings (central Italy)

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Papale, Dario; Girard, Filippo; Belli, Claudio; Pietromarchi, Paolo; Tiberi, Domenico; Comandini, Maria C.

    2009-09-01

    During 2008, an experimental study was conducted to investigate the capabilities of a new airborne remote sensing platform as an aid in precision viticulture. The study was carried out on 2 areas located in the town of Velletri, near Rome; the acquisitions were conducted on 07-08-2008 and on 09-09-2008 using ASPIS (Advanced Spectroscopic Imager System), a new airborne multispectral sensor capable of acquiring 12 narrow spectral bands (10 nm) located in the visible and near-infrared region. Several vegetation indices, for a total of 22 independent variables, were tested for the estimation of different oenological parameters. ANOVA tests showed that several oenochemical parameters, such as sugars and acidity, differ according to the variety taken into consideration. The remotely sensed data were significantly correlated with the following oenochemical parameters: exposed leaf surface (SFE) (coefficient of determination R2 ~ 0.8), pruning wood (R2 ~ 0.8), reducing sugars (R2 ~ 0.6 and root mean square error ~ 5 g/l), total acidity (R2 ~ 0.6 and RMSE ~ 0.5 g/l), polyphenols (R2 ~ 0.9) and anthocyanin content (R2 ~ 0.89). In order to produce prescriptive thematic maps of the oenological variables of interest, the previously derived relationships were applied to the vegetation indices.
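
    The index-versus-parameter relationships reported above amount to simple least-squares fits; the sketch below shows one such fit together with an R2 computation (the vegetation index and sugar values are invented for illustration).

```python
import numpy as np

def fit_index_to_parameter(index_values, measured_values):
    """Least-squares linear fit of an oenological parameter against a
    vegetation index, returning slope, intercept and R^2."""
    slope, intercept = np.polyfit(index_values, measured_values, 1)
    predicted = slope * index_values + intercept
    ss_res = np.sum((measured_values - predicted) ** 2)
    ss_tot = np.sum((measured_values - measured_values.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical per-plot index values and reducing-sugar concentrations (g/l)
index  = np.array([0.52, 0.58, 0.61, 0.66, 0.70, 0.74, 0.79])
sugars = np.array([175., 182., 186., 194., 199., 205., 213.])
slope, intercept, r2 = fit_index_to_parameter(index, sugars)
```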

  11. In vivo multispectral imaging of the absorption and scattering properties of exposed brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Ishizuka, Tomohiro; Mizushima, Chiharu; Nishidate, Izumi; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-04-01

    To evaluate multi-spectral images of the absorption and scattering properties in the cerebral cortex of rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital red-green-blue camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters. The spectral images of absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters. We performed in vivo experiments on exposed rat brain to confirm the feasibility of this method. The estimated images of the absorption coefficients were dominated by hemoglobin spectra. The estimated images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature.
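
    The Wiener estimation step can be sketched in its data-driven form, in which the estimation matrix is built from correlation matrices of paired training spectra and camera responses; the training data below are random placeholders, and the Monte Carlo regression that follows this step in the study is not shown.

```python
import numpy as np

def wiener_matrix(train_spectra, train_rgb):
    """Data-driven Wiener estimator W = <r c^T> <c c^T>^(-1), built from pairs
    of known reflectance spectra (n x L) and camera responses (n x 3)."""
    cross = train_spectra.T @ train_rgb          # (L, 3) cross-correlation
    auto = train_rgb.T @ train_rgb               # (3, 3) autocorrelation
    return cross @ np.linalg.inv(auto)

def estimate_spectra(rgb_pixels, W):
    """Estimate an L-band reflectance spectrum for every RGB pixel."""
    return rgb_pixels @ W.T

# Placeholder training set: 100 spectra sampled at 9 wavelengths, with RGB responses
rng = np.random.default_rng(1)
train_spectra = rng.random((100, 9))
train_rgb = rng.random((100, 3))
W = wiener_matrix(train_spectra, train_rgb)
spectra = estimate_spectra(rng.random((5000, 3)), W)   # shape (5000, 9)
```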

  12. Remote Sensing of Liquid Water and Ice Cloud Optical Thickness and Effective Radius in the Arctic: Application of Airborne Multispectral MAS Data

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Yang, Ping; Arnold, G. Thomas; Gray, Mark A.; Riedi, Jerome C.; Ackerman, Steven A.; Liou, Kuo-Nan

    2003-01-01

    A multispectral scanning spectrometer was used to obtain measurements of the reflection function and brightness temperature of clouds, sea ice, snow, and tundra surfaces at 50 discrete wavelengths between 0.47 and 14.0 microns. These observations were obtained from the NASA ER-2 aircraft as part of the FIRE Arctic Clouds Experiment, conducted over a 1600 x 500 km region of the north slope of Alaska and surrounding Beaufort and Chukchi Seas between 18 May and 6 June 1998. Multispectral images of the reflection function and brightness temperature in 11 distinct bands of the MODIS Airborne Simulator (MAS) were used to derive a confidence in clear sky (or alternatively the probability of cloud), shadow, and heavy aerosol over five different ecosystems. Based on the results of individual tests run as part of the cloud mask, an algorithm was developed to estimate the phase of the clouds (water, ice, or undetermined phase). Finally, the cloud optical thickness and effective radius were derived for both water and ice clouds that were detected during one flight line on 4 June. This analysis shows that the cloud mask developed for operational use on MODIS, and tested using MAS data in Alaska, is quite capable of distinguishing clouds from bright sea ice surfaces during daytime conditions in the high Arctic. Results of individual tests, however, make it difficult to distinguish ice clouds over snow and sea ice surfaces, so additional tests were added to enhance the confidence in the thermodynamic phase of clouds over the Beaufort Sea. The cloud optical thickness and effective radius retrievals used 3 distinct bands of the MAS, with the newly developed 1.62 and 2.13 micron bands being used quite successfully over snow and sea ice surfaces. These results are contrasted with a MODIS-based algorithm that relies on spectral reflectance at 0.87 and 2.13 micron.
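
    The optical thickness / effective radius retrieval is, at its core, a bispectral lookup-table inversion: a conservative (visible or near-IR) band mainly constrains optical thickness, while an absorbing SWIR band (such as 2.13 micron) mainly constrains effective radius. The sketch below shows only the table-search step; the lookup-table values are fabricated placeholders rather than radiative-transfer output.

```python
import numpy as np

def retrieve_tau_re(obs_vis, obs_swir, lut_vis, lut_swir, taus, res):
    """Nearest-neighbour lookup-table retrieval: find the (optical thickness,
    effective radius) pair whose modelled reflectances best match the observed
    non-absorbing and absorbing band reflectances."""
    cost = (lut_vis - obs_vis) ** 2 + (lut_swir - obs_swir) ** 2
    i, j = np.unravel_index(np.argmin(cost), cost.shape)
    return taus[i], res[j]

# Fabricated lookup table on a (tau, re) grid, for illustration only
taus = np.linspace(1, 64, 32)                      # cloud optical thickness grid
res = np.linspace(4, 30, 14)                       # effective radius grid (micrometres)
lut_vis = np.tile((1 - np.exp(-0.1 * taus))[:, None], (1, res.size))   # tau-driven
lut_swir = lut_vis * np.exp(-0.02 * res)[None, :]                      # re-driven absorption
tau, re = retrieve_tau_re(0.62, 0.35, lut_vis, lut_swir, taus, res)
```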

  13. Documenting and Communicating the Dynamics of a Rapidly Changing Cryosphere Through the Use of Repeat Ground-Based, Airborne, and Space-Based Photography and Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Molnia, B. F.

    2009-04-01

    Alaska supports thousands of glaciers, covering an area of about 75,000 square kilometers. Today, most large low elevation Alaskan glaciers are rapidly retreating and/or thinning in response to increasing temperature. Considering the breadth of Alaska's glacier cover, documenting the response of these glaciers to changing climate is only possible through a comprehensive collection and assessment of ground-based, airborne, and space-based photography and multispectral imagery. Pairing these data with historical imagery provides unequivocal visual evidence of changes within the glacier component of the Alaskan cryosphere. Since 1972, all Alaskan glaciers have been sequentially imaged with space-based multispectral sensors. Additionally, many Alaskan glaciers have been repeatedly photographed from the ground (beginning in 1893), from the air (beginning in 1926), and from space (beginning in the early 1960s). Analysis of this massive compilation of repeat photographs and multispectral images has been used to quantitatively and qualitatively determine the distribution, extent, and multiple decadal-scale behavior of glaciers throughout Alaska. These results have recently been published by the U.S. Geological Survey in "Glaciers of Alaska", Chapter K of the "Satellite Image Atlas of the Glaciers of the World", Professional Paper 1386-K. Additionally, a website ("Glacier and Landscape Change in Response to Changing Climate" - www.usgs.gov/global_change/glaciers/default.asp) has been developed to broadly communicate and distribute this information to the general public, scientists and engineers, the press, civil protection government agencies, and a multitude of other governmental and non-governmental agencies. This poster presents details about the new book and website. For the poster, several areas with extensive records of historic ground-based photography and space-based imagery were selected to demonstrate the effectiveness of this approach to communicate information

  14. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Images captured from airborne imaging systems have the advantages of relatively low cost, high spatial resolution, and real/near-real-time availability. Multiple images taken from one or more flight lines could be used to generate a high-resolution mosaic image, which could be useful for diverse rem...

  15. Verification of sensitivity enhancement of SWIR imager technology in advanced multispectral SWIR/VIS zoom cameras with constant and variable F-number

    NASA Astrophysics Data System (ADS)

    Hübner, M.; Achtner, B.; Kraus, M.; Siemens, C.; Münzberg, M.

    2016-05-01

    Current designs of combined VIS-color/SWIR camera optics use a constant F-number over the full field of view (FOV) range. Especially in the SWIR, limited space for the camera integration in existing system volumes and relatively large pitch dimensions of 15 μm or even 20 μm force the use of relatively high F-numbers to achieve narrow fields of view of less than 2.0° with reasonable resolution for long-range observation and targeting applications. Constant F-number designs have already been reported and considered [1] for submarine applications. The comparison of electro-optical performance was based on the detector noise and sensitivity data given by the detector manufacturer [1] and on further modelling of the imaging chain within linear MTF system theory. The visible channel provides limited twilight capability at F/2.6, but in the SWIR the twilight capability is degraded due to the relatively high F-number of F/7 or F/5.25 for 20 μm and 15 μm pitch, respectively. Differences between prediction and experimental verification of sensitivity, in terms of noise equivalent irradiance (NEI) and scenery-based limiting illumination levels, are shown for the visible and the SWIR spectral ranges. Within this context, currently developed improvements using optical zoom designs for the multispectral SWIR/VIS camera optics with continuously variable F-number are discussed; these offer increased low-light-level capability at wide and medium fields of view while still enabling an NFOV < 2° with superior long-range targeting capability under limited atmospheric visibility conditions in daytime.
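
    The low-light argument rests on the fact that detector-plane irradiance scales roughly with 1/F#^2; the short calculation below quantifies the gain between the two F-numbers quoted for the SWIR channel.

```python
def relative_irradiance_gain(f_slow, f_fast):
    """Approximate gain in detector-plane irradiance when the F-number is
    reduced from f_slow to f_fast (irradiance ~ 1/F#^2)."""
    return (f_slow / f_fast) ** 2

gain_swir = relative_irradiance_gain(7.0, 5.25)   # ~1.78x more flux at F/5.25 than at F/7
```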

  16. Use of reflectance spectra of native plant species for interpreting airborne multispectral scanner data in the East Tintic Mountains, Utah.

    USGS Publications Warehouse

    Milton, N.M.

    1983-01-01

    Analysis of in situ reflectance spectra of native vegetation was used to interpret airborne MSS data. Representative spectra from three plant species in the E Tintic Mountains, Utah, were used to interpret the color components on a color ratio composite image made from MSS data in the visible and near-infrared regions. A map of plant communities was made from the color ratio composite image and field checked. -from Author

  17. Feasibility of an airborne TV camera as a size spectrometer for cloud droplets in daylight.

    PubMed

    Roscoe, H K; Lachlan-Cope, T A; Roscoe, J

    1999-01-20

    Photographs of clouds taken with a camera with a large aperture ratio must have a short depth of focus to resolve small droplets. Hence the sampling volume is small, which limits the number of droplets and gives rise to a large statistical error on the number counted. However, useful signals can be obtained with a small aperture ratio, which allows for a sample volume large enough for counting cloud droplets at aircraft speeds with useful spatial resolution. The signal is sufficient to discriminate against noise from a sunlit cloud as background, provided the bandwidth of the light source and camera are restricted, and against readout noise. Hence, in principle, an instrument to sample the size distribution of cloud droplets from aircraft in daylight can be constructed from a simple TV camera and an array of laser diodes, without any components or screens external to the aircraft window.

  18. Visibility through the gaseous smoke in airborne remote sensing using a DSLR camera

    NASA Astrophysics Data System (ADS)

    Chabok, Mirahmad; Millington, Andrew; Hacker, Jorg M.; McGrath, Andrew J.

    2016-08-01

    Visibility and clarity of remotely sensed images acquired by consumer-grade DSLR cameras, mounted on an unmanned aerial vehicle or a manned aircraft, are critical factors in obtaining accurate and detailed information from any area of interest. The presence of substantial haze, fog or gaseous smoke particles, caused for example by an active bushfire at the time of data capture, will dramatically reduce image visibility and quality. Although most modern hyperspectral imaging sensors are capable of capturing a large number of narrow bands in the shortwave and thermal infrared spectral range, which have the potential to penetrate smoke and haze, the resulting images do not contain sufficient spatial detail to enable locating important objects or to assist search and rescue or similar applications which require high-resolution information. We introduce a new method for penetrating gaseous smoke without compromising spatial resolution, using a single modified DSLR camera in conjunction with image processing techniques, which effectively improves the visibility of objects in the captured images. This is achieved by modifying a DSLR camera and adding a custom optical filter to enable it to capture wavelengths from 480-1200 nm (R, G and near infrared) instead of the standard RGB bands (400-700 nm). With this modified camera mounted on an aircraft, images were acquired over an area polluted by gaseous smoke from an active bushfire. Processed data using our proposed method show significant visibility improvements compared with other existing solutions.

  19. Application of phase matching autofocus in airborne long-range oblique photography camera

    NASA Astrophysics Data System (ADS)

    Petrushevsky, Vladimir; Guberman, Asaf

    2014-06-01

    The Condor2 long-range oblique photography (LOROP) camera is mounted in an aerodynamically shaped pod carried by a fast jet aircraft. The large-aperture, dual-band (EO/MWIR) camera is equipped with TDI focal plane arrays and provides high-resolution imagery of extended areas at long stand-off ranges, by day and night. The front Ritchey-Chretien optics are made of highly stable materials. However, the camera temperature varies considerably in flight conditions. Moreover, the composite-material structure of the reflective objective undergoes gradual dehumidification in the dry nitrogen atmosphere inside the pod, causing a small decrease of the structure length. The temperature and humidity effects change the distance between the mirrors by just a few microns. The distance change is small, but it nevertheless alters the camera's infinity focus setpoint significantly, especially in the EO band. To realize the optics' resolution potential, optimal focus must be constantly maintained. In-flight best-focus calibration and temperature-based open-loop focus control give mostly satisfactory performance. To obtain even better focusing precision, a closed-loop phase-matching autofocus method was developed for the camera. The method makes use of an existing beam-sharer prism FPA arrangement, in which an aperture partition exists inherently in the area of overlap between adjacent detectors. The defocus is proportional to the image phase shift in the area of overlap. Low-pass filtering of the raw defocus estimate reduces random errors related to variable scene content. The closed-loop control converges robustly to the precise focus position. The algorithm uses the temperature- and range-based focus prediction as an initial guess for the closed-loop phase-matching control. The autofocus algorithm achieves excellent results and works robustly in various conditions of scene illumination and contrast.
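    A minimal Python sketch of the closed-loop phase-matching idea summarized above, assuming a simple phase-correlation shift estimate and a first-order low-pass filter; the gain constants are hypothetical and this is not the Condor2 flight software:

    import numpy as np

    K_DEFOCUS = 12.0   # hypothetical sensitivity [um of defocus per pixel of phase shift]
    ALPHA = 0.2        # low-pass filter coefficient for the raw defocus estimate
    LOOP_GAIN = 0.5    # closed-loop gain applied to the filtered defocus

    def phase_shift(img_a, img_b):
        """Horizontal shift between the two overlap images via the cross-power spectrum."""
        F = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
        corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        dx = peak[1] if peak[1] <= img_a.shape[1] // 2 else peak[1] - img_a.shape[1]
        return float(dx)

    def autofocus_step(img_a, img_b, focus_pos, filt_state):
        """One closed-loop iteration: returns the new focus command and filter state."""
        raw_defocus = K_DEFOCUS * phase_shift(img_a, img_b)           # noisy, scene dependent
        filt_state = (1 - ALPHA) * filt_state + ALPHA * raw_defocus   # low-pass filtering
        return focus_pos - LOOP_GAIN * filt_state, filt_state         # drive defocus toward zero

    The initial focus position would come from the temperature- and range-based prediction mentioned in the abstract.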

  20. Analysis of testbed airborne multispectral scanner data from Superflux II. [Chesapeake Bay plume and James Shelf data

    NASA Technical Reports Server (NTRS)

    Bowker, D. E.; Hardesty, C. A.; Jobson, D. J.; Bahn, G. S.

    1981-01-01

    A test bed aircraft multispectral scanner (TBAMS) was flown during the James Shelf, Plume Scan, and Chesapeake Bay missions as part of the Superflux 2 experiment. Excellent correlations were obtained between water sample measurements of chlorophyll and sediment and TBAMS radiance data. The three-band algorithms used were insensitive to aircraft altitude and varying atmospheric conditions. This was particularly fortunate due to the hazy conditions during most of the experiments. A contour map of sediment, and also chlorophyll, was derived for the Chesapeake Bay plume along the southern Virginia-Carolina coastline. A sediment maximum occurs about 5 nautical miles off the Virginia Beach coast with a chlorophyll maximum slightly shoreward of this. During the James Shelf mission, a thermal anomaly (or front) was encountered about 50 miles from the coast. There was a minor variation in chlorophyll and sediment across the boundary. During the Chesapeake Bay mission, the Sun elevation increased from 50 degrees to over 70 degrees, interfering with the generation of data products.

  1. Comparison of mosaicking techniques for airborne images from consumer-grade cameras

    NASA Astrophysics Data System (ADS)

    Song, Huaibo; Yang, Chenghai; Zhang, Jian; Hoffmann, Wesley Clint; He, Dongjian; Thomasson, J. Alex

    2016-01-01

    Images captured from airborne imaging systems can be mosaicked for diverse remote sensing applications. The objective of this study was to identify appropriate mosaicking techniques and software to generate mosaicked images for use by aerial applicators and other users. Three software packages (Photoshop CC, Autostitch, and Pix4Dmapper) were selected for mosaicking airborne images acquired from a large cropping area. Ground control points were collected for georeferencing the mosaicked images and for evaluating the accuracy of eight mosaicking techniques. Analysis and accuracy assessment showed that Pix4Dmapper can be the first choice if georeferenced imagery with high accuracy is required. The spherical method in Photoshop CC can be an alternative for cost considerations, and Autostitch can be used to quickly mosaic images with reduced spatial resolution. The results also showed that the accuracy of image mosaicking techniques could be greatly affected by the size of the imaging area or the number of images, and that the accuracy would be higher for a small area than for a large area. The results from this study will provide useful information for the selection of image mosaicking software and techniques for aerial applicators and other users.

  2. Long-Term Tracking of a Specific Vehicle Using Airborne Optical Camera Systems

    NASA Astrophysics Data System (ADS)

    Kurz, F.; Rosenbaum, D.; Runge, H.; Cerra, D.; Mattyus, G.; Reinartz, P.

    2016-06-01

    In this paper we present two low-cost, airborne sensor systems capable of long-term vehicle tracking. Based on the properties of the sensors, a method for automatic, real-time, long-term tracking of individual vehicles is presented. This combines the detection and tracking of the vehicle in low-frame-rate image sequences and applies the lagged Cell Transmission Model (CTM) to handle longer tracking outages occurring in complex traffic situations, e.g. tunnels. The CTM uses the traffic conditions in the proximity of the target vehicle and estimates its motion to predict the position where it reappears. The method is validated on an airborne image sequence acquired from a helicopter. Several reference vehicles are tracked within a range of 500 m in a complex urban traffic situation. An artificial tracking outage of 240 m is simulated, which is handled by the CTM. For this, all the vehicles in close proximity are automatically detected and tracked to estimate the basic density-flow relations of the CTM. Finally, the real and simulated trajectories of the reference vehicles in the outage are compared, showing good correspondence even in congested traffic situations.
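    A minimal Python sketch of a Cell Transmission Model update of the kind referred to above, assuming a single uniform road segment; the free-flow speed, congestion wave speed, jam density and capacity are illustrative values, not those estimated in the paper:

    import numpy as np

    def ctm_step(density, dt=1.0, dx=50.0, vf=15.0, w=5.0, rho_jam=0.15, q_max=0.5):
        """Advance cell densities [veh/m] by one time step using demand/supply flows."""
        demand = np.minimum(vf * density, q_max)              # what each cell can send [veh/s]
        supply = np.minimum(w * (rho_jam - density), q_max)   # what each cell can receive [veh/s]
        flow = np.minimum(demand[:-1], supply[1:])            # flow across each cell boundary
        new_density = density.copy()
        new_density[:-1] -= dt / dx * flow                    # vehicles leaving each cell
        new_density[1:]  += dt / dx * flow                    # vehicles entering the next cell
        return new_density

    Iterating this update over the outage duration gives expected downstream densities and speeds, from which the position where the target reappears can be predicted.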

  3. Multispectral imaging and image processing

    NASA Astrophysics Data System (ADS)

    Klein, Julie

    2014-02-01

    The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position. An overview will be given over several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by filters and camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated based on models for the optical elements and the imaging chain by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images. Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.

  4. Airborne imaging for heritage documentation using the Fotokite tethered flying camera

    NASA Astrophysics Data System (ADS)

    Verhoeven, Geert; Lupashin, Sergei; Briese, Christian; Doneus, Michael

    2014-05-01

    Since the beginning of aerial photography, researchers have used all kinds of devices (from pigeons, kites, poles, and balloons to rockets) to take still cameras aloft and remotely gather aerial imagery. To date, many of these unmanned devices are still used for what has been referred to as Low-Altitude Aerial Photography or LAAP. In addition to these more traditional camera platforms, radio-controlled (multi-)copter platforms have recently added a new aspect to LAAP. Although model airplanes have been around for several decades, the decreasing cost, increasing functionality and stability of ready-to-fly multi-copter systems have proliferated their use among non-hobbyists. As such, they have become a very popular tool for aerial imaging. The overwhelming number of currently available brands and types (heli-, dual-, tri-, quad-, hexa-, octo-, dodeca-, deca-hexa and deca-octocopters), together with the wide variety of navigation options (e.g. altitude and position hold, waypoint flight) and camera mounts, indicates that these platforms are here to stay for some time. Given the multitude of still camera types and the image quality they are currently capable of, endless combinations of low- and high-cost LAAP solutions are available. In addition, LAAP allows for the exploitation of new imaging techniques, as it is often only a matter of lifting the appropriate device (e.g. video cameras, thermal frame imagers, hyperspectral line sensors). Archaeologists were among the first to adopt this technology, as it provided them with a means to easily acquire essential data from a unique point of view, whether for simple illustration purposes of standing historic structures or to compute three-dimensional (3D) models and orthophotographs of excavation areas. However, even very cheap multi-copter models require certain skills to pilot them safely. Additionally, malfunction or overconfidence might lift these devices to altitudes where they can interfere with manned aircraft. As such, the

  5. Airborne Imagery

    NASA Technical Reports Server (NTRS)

    1983-01-01

    ATM (Airborne Thematic Mapper) was developed for NSTL (National Space Technology Laboratories) by the Daedalus Company. It offers expanded capabilities for timely, accurate and cost-effective identification of areas with prospecting potential. A related system is TIMS, the Thermal Infrared Multispectral Scanner. Originating from Landsat 4, it is also used for agricultural studies, etc.

  6. Airborne multispectral remote sensing data to estimate several oenological parameters in vineyard production. A case study of application of remote sensing data to precision viticulture in central Italy.

    NASA Astrophysics Data System (ADS)

    Tramontana, Gianluca; Girard, Filippo; Belli, Claudio; Comandini, Maria Cristina; Pietromarchi, Paolo; Tiberi, Domenico; Papale, Dario

    2010-05-01

    It is widely recognized that environmental differences within the vineyard, with respect to soils, microclimate, and topography, can influence grape characteristics and crop yields. Moreover, the central Italy landscape is characterized by a high level of fragmentation and heterogeneity, which imposes stringent remote sensing requirements in terms of spectral, geometric and temporal resolution to support precision viticulture applications. In response to the needs of the Italian grape and wine industry for an evaluation of precision viticulture technologies, DISAFRI (University of Tuscia) and the Agricultural Research Council - Oenological Research Unit (ENC-CRA) jointly carried out an experimental study during 2008. The study was carried out on two areas located in the town of Velletri, near Rome; for each area, two varieties (one red and one white grape) were studied: Nero d'Avola and Sauvignon blanc in the first area, and Merlot and Sauvignon blanc in the second. Remote sensing data were acquired in different periods using a low-cost multisensor airborne remote sensing platform developed by DISAFRI (ASPIS-2, Advanced Spectroscopic Imager System). ASPIS-2, an evolution of the ASPIS sensor (Papale et al. 2008, Sensors), is a multispectral sensor based on 4 CCDs and 3 interferential filters per CCD. The filters are user-selectable during the flight, so ASPIS-2 is able to acquire data in 12 bands in the visible and near-infrared regions with a bandwidth of 10 or 20 nm. For the purposes of this study, 7 spectral bands were acquired and 15 vegetation indices calculated. During the ripening period, several vegetative and oenochemical parameters were monitored. ANOVA tests showed that several oenochemical variables, such as sugars, total acidity, polyphenols and anthocyanins, differ according to the variety taken into consideration. In order to evaluate the temporal autocorrelation of several oenological parameter values, a simple linear regression between
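    A minimal Python sketch of the kind of vegetation index computation mentioned above, assuming co-registered band images; the band names are generic and do not correspond to specific ASPIS-2 channel definitions:

    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index from NIR and red reflectance arrays."""
        return (nir - red) / (nir + red + 1e-9)

    def gndvi(nir, green):
        """Green NDVI, one of many related band-ratio indices."""
        return (nir - green) / (nir + green + 1e-9)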

  7. An algorithm for the estimation of bounds on the emissivity and temperatures from thermal multispectral airborne remotely sensed data

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Quattrochi, D.; Baskin, R.

    1992-01-01

    The effective flux incident upon the detectors of a thermal sensor, after it has been corrected for atmospheric effects, is a function of a non-linear combination of the emissivity of the target for that channel and the temperature of the target. The sensor system cannot separate the contributions of the emissivity and the temperature that constitute the flux value. A method that estimates the bounds on these temperatures and emissivities from thermal data is described. This method is then tested with remotely sensed data obtained from NASA's Thermal Infrared Multispectral Scanner (TIMS), a 6-channel thermal sensor. Since this is an under-determined set of equations, i.e. there are 7 unknowns (6 emissivities and 1 temperature) and 6 equations (corresponding to the 6 channel fluxes), there exists theoretically an infinite number of combinations of emissivity and temperature values that can satisfy these equations. Using some realistic bounds on the emissivities, bounds on the temperature are calculated. These bounds on the temperature are refined to estimate a tighter bound on the emissivity of the source. An error analysis is also carried out to quantitatively determine the extent of uncertainty introduced in the estimate of these parameters. This method is useful only when a realistic set of bounds can be obtained for the emissivities of the data. In the case of water, the lower and upper bounds were set at 0.97 and 1.00, respectively. Five flights were flown in succession at altitudes of 2 km (low), 6 km (mid), 12 km (high), and then back again at 6 km and 2 km. The area selected was the Ross Barnett Reservoir near Jackson, Mississippi. The mission was flown during the predawn hours of 1 Feb. 1992. Radiosonde data were collected for that duration to profile the characteristics of the atmosphere. Ground truth temperatures using thermometers and radiometers were also obtained over an area of the reservoir. The results of two independent runs of the radiometer data averaged
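    A hedged Python sketch of the bounding idea described above: each channel radiance L_i = eps_i * B(lambda_i, T), so with eps_i constrained to [0.97, 1.00] the Planck function can be inverted to bound T per channel and the bounds intersected across channels. Band-averaged radiances are treated as monochromatic here for simplicity, which the actual TIMS processing would not do:

    import numpy as np

    C1 = 1.191042e8   # W um^4 m^-2 sr^-1 (first radiation constant for spectral radiance)
    C2 = 1.4387752e4  # um K (second radiation constant)

    def inverse_planck(radiance, wavelength_um):
        """Temperature [K] of a blackbody with the given spectral radiance [W m^-2 sr^-1 um^-1]."""
        return C2 / (wavelength_um * np.log(1.0 + C1 / (wavelength_um**5 * radiance)))

    def temperature_bounds(radiances, wavelengths_um, eps_lo=0.97, eps_hi=1.00):
        """Intersect the per-channel temperature bounds implied by the emissivity bounds."""
        t_min = max(inverse_planck(L / eps_hi, w) for L, w in zip(radiances, wavelengths_um))
        t_max = min(inverse_planck(L / eps_lo, w) for L, w in zip(radiances, wavelengths_um))
        return t_min, t_max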

  8. <5cm Ground Resolution DEMs for the Atacama Fault System (Chile), Acquired With the Modular Airborne Camera System (MACS)

    NASA Astrophysics Data System (ADS)

    Zielke, O.; Victor, P.; Oncken, O.; Bucher, T. U.; Lehmann, F.

    2011-12-01

    A primary step towards assessing the time and size of future earthquakes is the identification of earthquake recurrence patterns in the existing seismic record. Geologic and geomorphic data are commonly analyzed for this purpose, given the lack of sufficiently long historical or instrumental seismic data sets. Until recently, those geomorphic data sets encompassed field observation, local total station surveys, and aerial photography. Over the last decade, LiDAR-based high-resolution topographic data sets became an additional powerful means, contributing distinctly to a better understanding of earthquake rupture characteristics (e.g., single-event along-fault slip distribution, along-fault slip accumulation patterns) and their relation to fault geometric complexities. Typical shot densities of such data sets (e.g., airborne LiDAR data along the San Andreas Fault) permit generation of digital elevation models (DEM) with <50 cm ground resolution, sufficient for depiction of meter-scale tectonic landforms. Identification of submeter-scale features is, however, prevented by the DEM resolution limit. Here, we present a high-resolution topographic and visual data set from the Atacama fault system near Antofagasta, Chile. Data were acquired with the Modular Airborne Camera System (MACS), developed by the DLR (German Aerospace Center) in Berlin, Germany. The photogrammetrically derived DEM and true ortho images with <5 cm ground resolution permit identification of very small-scale geomorphic features, thus enabling fault zone and earthquake rupture characterization at unprecedented detail. Compared to typical LiDAR DEMs, ground resolution is increased by an order of magnitude while the spatial extent of the data set is essentially the same. Here, we present examples of the <5 cm resolution data set (DEM and visual results) and further explore its resolution capabilities and potential with regard to the aforementioned tectono-geomorphic questions.

  9. [In-flight absolute radiometric calibration of UAV multispectral sensor].

    PubMed

    Chen, Wei; Yan, Lei; Gou, Zhi-Yang; Zhao, Hong-Ying; Liu, Da-Ping; Duan, Yi-Ni

    2012-12-01

    Based on data from the scientific experiment in Urad Front Banner for the UAV Remote Sensing Load Calibration Field project, and with the help of 6 hyperspectral radiometric targets with good Lambertian properties, the wide-view multispectral camera on the UAV was calibrated using the reflectance-based method. The results reveal that for the green, red and infrared channels, whose images were successfully captured, the linear correlation coefficients between DN and radiance are all larger than 99%. In the final analysis, the comprehensive error is no more than 6%. The calibration results demonstrate that the hyperspectral targets deployed at the calibration field are well suited for in-flight calibration of airborne multispectral payloads. The calibration result is reliable and can be used in the retrieval of geophysical parameters.
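    A minimal Python sketch of the reflectance-based calibration fit implied above: for each channel, the camera DN over the calibration targets is regressed against the at-sensor radiance predicted from the measured target reflectances. The sample values and units here are hypothetical:

    import numpy as np

    def fit_channel(dn, radiance):
        """Least-squares gain and offset so that radiance ~= gain * DN + offset."""
        gain, offset = np.polyfit(dn, radiance, 1)
        r = np.corrcoef(dn, radiance)[0, 1]        # linear correlation coefficient
        return gain, offset, r

    # six hypothetical targets for one channel
    dn = np.array([310.0, 620.0, 1180.0, 1790.0, 2450.0, 3080.0])
    radiance = np.array([12.1, 24.4, 46.0, 70.2, 95.8, 120.5])  # W m^-2 sr^-1 um^-1
    gain, offset, r = fit_channel(dn, radiance)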

  10. Multispectral imaging of absorption and scattering properties of in vivo exposed rat brain using a digital red-green-blue camera

    NASA Astrophysics Data System (ADS)

    Yoshida, Keiichiro; Nishidate, Izumi; Ishizuka, Tomohiro; Kawauchi, Satoko; Sato, Shunichi; Sato, Manabu

    2015-05-01

    In order to estimate multispectral images of the absorption and scattering properties in the cerebral cortex of in vivo rat brain, we investigated spectral reflectance images estimated by the Wiener estimation method using a digital RGB camera. A Monte Carlo simulation-based multiple regression analysis for the corresponding spectral absorbance images at nine wavelengths (500, 520, 540, 560, 570, 580, 600, 730, and 760 nm) was then used to specify the absorption and scattering parameters of brain tissue. In this analysis, the concentrations of oxygenated and deoxygenated hemoglobin were estimated as the absorption parameters, whereas the coefficient a and the exponent b of the reduced scattering coefficient spectrum approximated by a power law function were estimated as the scattering parameters. The spectra of the absorption and reduced scattering coefficients were reconstructed from the absorption and scattering parameters, and the spectral images of the absorption and reduced scattering coefficients were then estimated. In order to confirm the feasibility of this method, we performed in vivo experiments on exposed rat brain. The estimated images of the absorption coefficients were dominated by the spectral characteristics of hemoglobin. The estimated spectral images of the reduced scattering coefficients had a broad scattering spectrum, exhibiting a larger magnitude at shorter wavelengths, corresponding to the typical spectrum of brain tissue published in the literature. The changes in the estimated absorption and scattering parameters during normoxia, hyperoxia, and anoxia indicate the potential applicability of the method for evaluating the pathophysiological conditions of in vivo brain due to loss of tissue viability.
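    A minimal Python sketch of the Wiener estimation step described above, using synthetic placeholder matrices rather than the rat-brain measurements: spectral reflectance is estimated from RGB values via the autocorrelation of a training set of reflectance spectra.

    import numpy as np

    def wiener_matrix(train_spectra, system_matrix, noise_var=1e-4):
        """W such that r_hat = W @ rgb; train_spectra is (n_samples, n_wavelengths)."""
        R = train_spectra.T @ train_spectra / len(train_spectra)   # spectral autocorrelation
        S = system_matrix                                          # (3, n_wavelengths): sensitivity x illuminant
        return R @ S.T @ np.linalg.inv(S @ R @ S.T + noise_var * np.eye(S.shape[0]))

    # per-pixel use:  r_hat = wiener_matrix(train_spectra, S) @ rgb_pixel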

  11. Retrieval of water quality algorithms from airborne HySpex camera for oxbow lakes in north-eastern Poland

    NASA Astrophysics Data System (ADS)

    Slapinska, Malgorzata; Berezowski, Tomasz; Frąk, Magdalena; Chormański, Jarosław

    2016-04-01

    The aim of this study was to retrieve empirical formulas for the water quality of oxbow lakes in the Lower Biebrza Basin (NE Poland) using the HySpex airborne imaging spectrometer. The Biebrza River is one of the biggest wetlands in Europe. It is characterised by a low contamination level and little human influence. Because of these characteristics, the Biebrza River can be treated as a reference area for other floodplain and fen ecosystems in Europe. Oxbow lakes are an important part of the Lower Biebrza Basin due to their retention and habitat functions. Hyperspectral remote sensing data were acquired by the HySpex sensor (which covers the range of 400-2500 nm) on 01-02.08.2015, with the ground measurement campaign conducted on 03-04.08.2015. The ground measurements consisted of two parts. The first part included spectral reflectance sampling with the ASD FieldSpec 3 spectroradiometer, which covers the wavelength range of 350-2500 nm at 1 nm intervals. In situ data were collected both for water and for specific objects within the area. The second part of the campaign included water parameters such as Secchi disc depth (SDD), electric conductivity (EC), pH, temperature and phytoplankton. The measured reflectance enabled an empirical line atmospheric correction, which was applied to the HySpex data. Our results indicated that proper atmospheric correction was very important for further data analysis. The empirical formulas for the water parameters were retrieved based on the reflectance data. This study confirmed the applicability of the HySpex camera for retrieving water quality.
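    A minimal Python sketch of an empirical line correction of the kind described above, assuming per-band gains and offsets fitted from pixels over field-measured reference targets; the target values below are hypothetical:

    import numpy as np

    def empirical_line(image_band, target_dn, target_reflectance):
        """Fit reflectance = a * DN + b from reference targets and apply it to a band."""
        a, b = np.polyfit(target_dn, target_reflectance, 1)
        return a * image_band + b

    # e.g. a dark water body and a bright target measured with the ASD FieldSpec 3
    band = np.random.rand(100, 100) * 4000                        # placeholder HySpex band (DN)
    corrected = empirical_line(band, np.array([350.0, 3600.0]), np.array([0.02, 0.48]))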

  12. Improved Airborne System for Sensing Wildfires

    NASA Technical Reports Server (NTRS)

    McKeown, Donald; Richardson, Michael

    2008-01-01

    The Wildfire Airborne Sensing Program (WASP) is engaged in a continuing effort to develop an improved airborne instrumentation system for sensing wildfires. The system could also be used for other aerial-imaging applications, including mapping and military surveillance. Unlike prior airborne fire-detection instrumentation systems, the WASP system would not be based on custom-made multispectral line scanners and associated custom-made complex optomechanical servomechanisms, sensors, readout circuitry, and packaging. Instead, the WASP system would be based on commercial off-the-shelf (COTS) equipment that would include (1) three or four electronic cameras (one for each of three or four wavelength bands) instead of a multispectral line scanner; (2) all associated drive and readout electronics; (3) a camera-pointing gimbal; (4) an inertial measurement unit (IMU) and a Global Positioning System (GPS) receiver for measuring the position, velocity, and orientation of the aircraft; and (5) a data-acquisition subsystem. It would be necessary to custom-develop an integrated sensor optical-bench assembly, a sensor-management subsystem, and software. The use of mostly COTS equipment is intended to reduce development time and cost, relative to those of prior systems.

  13. Estimation of the Spectral Sensitivity Functions of Un-Modified and Modified Commercial Off-The-Shelf Digital Cameras to Enable Their Use as a Multispectral Imaging System for UAVs

    NASA Astrophysics Data System (ADS)

    Berra, E.; Gibson-Poole, S.; MacArthur, A.; Gaulton, R.; Hamilton, A.

    2015-08-01

    Commercial off-the-shelf (COTS) digital cameras on-board unmanned aerial vehicles (UAVs) have the potential to be used as multispectral imaging systems; however, their spectral sensitivity is usually unknown and needs to be either measured or estimated. This paper details a step-by-step methodology for identifying the spectral sensitivity of modified (to be responsive to near-infrared wavelengths) and un-modified COTS digital cameras, showing the results of its application for three different models of camera. Six digital still cameras, which are being used as imaging systems on-board different UAVs, were selected to have their spectral sensitivities measured by a monochromator. Each camera was exposed to monochromatic light ranging from 370 nm to 1100 nm in 10 nm steps, with images of each step recorded in RAW format. The RAW images were converted linearly into TIFF images using DCRaw, an open-source program, before being batch processed through ImageJ (also open-source), which calculated the mean and standard deviation values from each of the red-green-blue (RGB) channels over a fixed central region within each image. These mean values were then related to the relative spectral radiance from the monochromator and its integrating sphere, in order to obtain the relative spectral response (RSR) for each of the cameras' colour channels. It was found that different un-modified camera models present very different RSRs in some channels, and one of the modified cameras showed a response that was unexpected. This highlights the need to determine the RSR of a camera before using it for any quantitative studies.
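    A minimal Python sketch of the relative spectral response computation described above: the linearised channel means at each monochromator step are divided by the relative spectral radiance of the monochromator/integrating-sphere output and normalised to the peak. The array names are assumptions:

    import numpy as np

    def relative_spectral_response(channel_means, sphere_radiance):
        """channel_means and sphere_radiance are 1-D arrays sampled at the same wavelengths."""
        rsr = channel_means / sphere_radiance    # remove the source spectrum
        return rsr / rsr.max()                   # normalise so the peak response equals 1

    wavelengths = np.arange(370, 1101, 10)       # nm, matching the 10 nm monochromator steps
    # rsr_red = relative_spectral_response(red_channel_means, sphere_radiance)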

  14. Molecular Shocks Associated with Massive Young Stars: CO Line Images with a New Far-Infrared Spectroscopic Camera on the Kuiper Airborne Observatory

    NASA Technical Reports Server (NTRS)

    Watson, Dan M.

    1997-01-01

    Under the terms of our contract with NASA Ames Research Center, the University of Rochester (UR) offers the following final technical report on grant NAG 2-958, Molecular shocks associated with massive young stars: CO line images with a new far-infrared spectroscopic camera, given for implementation of the UR Far-Infrared Spectroscopic Camera (FISC) on the Kuiper Airborne Observatory (KAO) and use of this camera for observations of star-formation regions. Two KAO flights in FY 1995, the final year of KAO operations, were awarded to this program, conditional upon a technical readiness confirmation which was given in January 1995. The funding period covered in this report is 1 October 1994 - 30 September 1996. The project was supported with $30,000, and no funds remained at the conclusion of the project.

  15. A Cryogenic, Insulating Suspension System for the High Resolution Airborne Wideband Camera (HAWC)and Submillemeter And Far Infrared Experiment (SAFIRE) Adiabatic Demagnetization Refrigerators (ADRs)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Jackson, Michael L.; Shirron, Peter J.; Tuttle, James G.

    2002-01-01

    The High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter And Far Infrared Experiment (SAFIRE) will use identical Adiabatic Demagnetization Refrigerators (ADR) to cool their detectors to 200mK and 100mK, respectively. In order to minimize thermal loads on the salt pill, a Kevlar suspension system is used to hold it in place. An innovative, kinematic suspension system is presented. The suspension system is unique in that it consists of two parts that can be assembled and tensioned offline, and later bolted onto the salt pill.

  16. Evaluation of an airborne remote sensing platform consisting of two consumer-grade cameras for crop identification

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing systems based on consumer-grade cameras have been increasingly used in scientific research and remote sensing applications because of their low cost and ease of use. However, the performance of consumer-grade cameras for practical applications has not been well documented in related ...

  17. Remote sensing techniques applied to multispectral recognition of the Aranjuez pilot zone

    NASA Technical Reports Server (NTRS)

    Lemos, G. L.; Salinas, J.; Rebollo, M.

    1977-01-01

    A rectangular (7 x 14 km) area 40 km S of Madrid was remote-sensed with a three-stage recognition process. Ground truth was established in the first phase, airborne sensing with a multispectral scanner and photographic cameras were used in the second phase, and Landsat satellite data were obtained in the third phase. Agronomic and hydrological photointerpretation problems are discussed. Color, black/white, and labeled areas are displayed for crop recognition in the land-use survey; turbidity, concentrations of pollutants and natural chemicals, and densitometry of the water are considered in the evaluation of water resources.

  18. Optical design of high resolution and large format CCD airborne remote sensing camera on unmanned aerial vehicle

    NASA Astrophysics Data System (ADS)

    Qian, Yixian; Cheng, Xiaowei; Shao, Jie

    2010-11-01

    Unmanned aerial vehicle remote sensing (UAVRS) is low in cost, flexible in task arrangement, and automatic and intelligent in application; it has been used widely for mapping, surveillance, reconnaissance and city planning. Airborne remote sensing missions require sensors with both high resolution and large fields of view, and large-format CCD digital airborne imaging systems are now a reality. A refractive system was designed to meet these requirements with the help of Code V software. It has a focal length of 150 mm, an F-number of 5.6, a waveband of 0.45-0.7 μm, and a field of view of 20°. It is shown that the modulation transfer function is higher than 0.5 at 55 lp/mm, distortion is less than 0.1%, and image quality reaches the diffraction limit. The system, with its large-format CCD and wide field, can satisfy the demand for wide ground coverage and high resolution. The optical system, with a simpler structure, smaller size and lighter weight, can be used in airborne remote sensing.

  19. Design and Fabrication of Two-Dimensional Semiconducting Bolometer Arrays for the High Resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC-II)

    NASA Technical Reports Server (NTRS)

    Voellmer, George M.; Allen, Christine A.; Amato, Michael J.; Babu, Sachidananda R.; Bartels, Arlin E.; Benford, Dominic J.; Derro, Rebecca J.; Dowell, C. Darren; Harper, D. Al; Jhabvala, Murzy D.

    2002-01-01

    The High resolution Airborne Wideband Camera (HAWC) and the Submillimeter High Angular Resolution Camera II (SHARC II) will use almost identical versions of an ion-implanted silicon bolometer array developed at the National Aeronautics and Space Administration's Goddard Space Flight Center (GSFC). The GSFC 'Pop-up' Detectors (PUD's) use a unique folding technique to enable a 12 x 32-element close-packed array of bolometers with a filling factor greater than 95 percent. A kinematic Kevlar(trademark) suspension system isolates the 200 mK bolometers from the helium bath temperature, and GSFC - developed silicon bridge chips make electrical connection to the bolometers, while maintaining thermal isolation. The JFET preamps operate at 120 K. Providing good thermal heat sinking for these, and keeping their conduction and radiation from reaching the nearby bolometers, is one of the principal design challenges encountered. Another interesting challenge is the preparation of the silicon bolometers. They are manufactured in 32-element, planar rows using Micro Electro Mechanical Systems (MEMS) semiconductor etching techniques, and then cut and folded onto a ceramic bar. Optical alignment using specialized jigs ensures their uniformity and correct placement. The rows are then stacked to create the 12 x 32-element array. Engineering results from the first light run of SHARC II at the Caltech Submillimeter Observatory (CSO) are presented.

  1. Close-up multispectral images of the surface of comet 67P/Churyumov-Gerasimenko by the ROLIS camera onboard the Rosetta Philae lander

    NASA Astrophysics Data System (ADS)

    Schroeder, S.; Mottola, S.; Arnold, G.; Grothues, H. G.; Jaumann, R.; Michaelis, H.; Neukum, G.; Pelivan, I.; Bibring, J. P.

    2014-12-01

    In November 2014 the Philae lander onboard Rosetta is scheduled to land on the surface of comet 67P/Churyumov-Gerasimenko. The ROLIS camera will provide the ground truth for the Rosetta OSIRIS camera. ROLIS will acquire images both during the descent and after landing. In this paper we concentrate on the post-landing images. The close-up images will enable us to characterize the morphology and texture of the surface, and the shape, albedo, and size distribution of the particles on scales as small as 0.3 mm per pixel. We may see evidence for a dust mantle, a refractory crust, and exposed ice. In addition, we hope to identify features such as pores, cracks, or vents that allow volatiles to escape the surface. We will image the surface not only during the day but also at night, when LEDs will illuminate the surface in four different colors (blue, green, red, near-IR). This will characterize the spectral properties and heterogeneity of the surface, helping us to identify its composition. Although the ROLIS spectral range and resolution are too limited to allow an exact mineralogical characterization, a study of the spectral slope and albedo will allow a broad classification of the solid surface phases. We expect to be able to distinguish between organic material, silicates and ices. By repeated imaging over the course of the mission, ROLIS may detect long-term changes associated with cometary activity.

  2. Novel x-ray multispectral imaging of ultraintense laser plasmas by a single-photon charge coupled device based pinhole camera.

    PubMed

    Labate, L; Giulietti, A; Giulietti, D; Köster, P; Levato, T; Gizzi, L A; Zamponi, F; Lübcke, A; Kämpfer, T; Uschmann, I; Förster, E

    2007-10-01

    Spectrally resolved two-dimensional imaging of ultrashort laser-produced plasmas is described, obtained by means of an advanced technique. The technique has been tested with microplasmas produced by ultrashort relativistic laser pulses. The technique is based on the use of a pinhole camera equipped with a charge coupled device detector operating in the single-photon regime. The spectral resolution is about 150 eV in the 4-10 keV range, and images in any selected photon energy range have a spatial resolution of 5 μm. The potential of the technique to study fast electron propagation in ultraintense laser interaction with multilayer targets is discussed and some preliminary results are shown.

  3. Novel x-ray multispectral imaging of ultraintense laser plasmas by a single-photon charge coupled device based pinhole camera

    SciTech Connect

    Labate, L.; Giulietti, A.; Giulietti, D.; Koester, P.; Levato, T.; Gizzi, L. A.; Zamponi, F.; Luebcke, A.; Kaempfer, T.; Uschmann, I.; Foerster, E.

    2007-10-15

    Spectrally resolved two-dimensional imaging of ultrashort laser-produced plasmas is described, obtained by means of an advanced technique. The technique has been tested with microplasmas produced by ultrashort relativistic laser pulses. The technique is based on the use of a pinhole camera equipped with a charge coupled device detector operating in the single-photon regime. The spectral resolution is about 150 eV in the 4-10 keV range, and images in any selected photon energy range have a spatial resolution of 5 μm. The potential of the technique to study fast electron propagation in ultraintense laser interaction with multilayer targets is discussed and some preliminary results are shown.

  4. Cucumber disease diagnosis using multispectral images

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Li, Hongning; Shi, Junsheng; Yang, Weiping; Liao, Ningfang

    2009-07-01

    In this paper, a multispectral imaging technique for plant disease diagnosis is presented. Firstly, the multispectral imaging system is designed. This system utilizes 15 narrow-band filters, a panchromatic band, a monochrome CCD camera, and a standard illumination observing environment. The spectral reflectance and color of 8 Macbeth color patches are reproduced between 400 nm and 700 nm in the process. In addition, the spectral reflectance angle and color difference are obtained through measurements and analysis of the color patches using a spectrometer and the multispectral imaging system. The results show that the 16 narrow-band multispectral imaging system achieves good accuracy in spectral reflectance and color reproduction. Secondly, the familiar diseases of cucumber, a horticultural plant, are the research objects. 210 multispectral samples are obtained by the multispectral system and are classified by a BP artificial neural network. The classification accuracies for Sphaerotheca fuliginea, Corynespora cassiicola and Pseudoperonospora cubensis are 100%; those for Trichothecium roseum and Cladosporium cucumerinum are 96.67% and 90.00%. It is confirmed that the multispectral imaging system achieves good accuracy in cucumber disease diagnosis.
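    An illustrative Python sketch of the classification stage described above, using scikit-learn's MLPClassifier as a stand-in for the original back-propagation (BP) network; the synthetic feature vectors and labels are placeholders, not the 210 real samples:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X = np.random.rand(210, 16)          # 210 samples x 16 narrow-band features (placeholder)
    y = np.random.randint(0, 5, 210)     # 5 disease classes (placeholder labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))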

  5. High Resolution Airborne Digital Imagery for Precision Agriculture

    NASA Technical Reports Server (NTRS)

    Herwitz, Stanley R.

    1998-01-01

    The Environmental Research Aircraft and Sensor Technology (ERAST) program is a NASA initiative that seeks to demonstrate the application of cost-effective aircraft and sensor technology to private commercial ventures. In 1997-98, a series of flight demonstrations and image acquisition efforts were conducted over the Hawaiian Islands using a remotely piloted solar-powered platform (Pathfinder) and a fixed-wing piloted aircraft (Navajo) equipped with a Kodak DCS450 CIR (color infrared) digital camera. As an ERAST Science Team Member, I defined a set of flight lines over the largest coffee plantation in Hawaii: the Kauai Coffee Company's 4,000 acre Koloa Estate. Past studies have demonstrated the applications of airborne digital imaging to agricultural management. Few studies have examined the usefulness of high-resolution airborne multispectral imagery with 10 cm pixel sizes. The Kodak digital camera was integrated with ERAST's Airborne Real Time Imaging System (ARTIS), which generated multiband CCD images consisting of 6 x 10^6 pixel elements. At the designated flight altitude of 1,000 feet over the coffee plantation, pixel size was 10 cm. The study involved the analysis of imagery acquired on 5 March 1998 for the detection of anomalous reflectance values and for the definition of spectral signatures as indicators of tree vigor and treatment effectiveness (e.g., drip irrigation; fertilizer application).

  6. Commercial Applications Multispectral Sensor System

    NASA Technical Reports Server (NTRS)

    Birk, Ronald J.; Spiering, Bruce

    1993-01-01

    NASA's Office of Commercial Programs is funding a multispectral sensor system to be used in the development of remote sensing applications. The Airborne Terrestrial Applications Sensor (ATLAS) is designed to provide versatility in acquiring spectral and spatial information. The ATLAS system will be a test bed for the development of specifications for airborne and spaceborne remote sensing instrumentation for dedicated applications. This objective requires spectral coverage from the visible through thermal infrared wavelengths, variable spatial resolution from 2-25 meters; high geometric and geo-location accuracy; on-board radiometric calibration; digital recording; and optimized performance for minimized cost, size, and weight. ATLAS is scheduled to be available in 3rd quarter 1992 for acquisition of data for applications such as environmental monitoring, facilities management, geographic information systems data base development, and mineral exploration.

  7. Multispectral Photography: the obscure becomes the obvious

    ERIC Educational Resources Information Center

    Polgrean, John

    1974-01-01

    Commonly used in map making, real estate zoning, and highway route location, aerial photography planes equipped with multispectral cameras may, among many environmental applications, now be used to locate mineral deposits, define marshland boundaries, study water pollution, and detect diseases in crops and forests. (KM)

  8. Processing Of Multispectral Data For Identification Of Rocks

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.

    1990-01-01

    Linear discriminant analysis and supervised classification evaluated. Report discusses processing of multispectral remote-sensing imagery to identify kinds of sedimentary rocks by spectral signatures in geological and geographical contexts. Raw image data are spectra of picture elements in images of seven sedimentary rock units exposed on margin of Wind River Basin in Wyoming. Data acquired by Landsat Thematic Mapper (TM), Thermal Infrared Multispectral Scanner (TIMS), and NASA/JPL airborne synthetic-aperture radar (SAR).

  9. Linear array CCD sensor for multispectral camera

    NASA Astrophysics Data System (ADS)

    Chabbal, J.; Boucharlat, G.; Capppechi, F.; Benoit-Gonin, R.

    1985-10-01

    Design, operational and performance features are described for a new 2048-element CCD array in a ceramic package for beam-sharing focal plane arrangements on remote sensing satellites. The device, labeled the TH 7805, furnishes 13-micron square pixels at 13-micron pitch over the 480-930 nm interval, two video outputs and a single-phase, buried-channel CCD register. Each n-p photodiode is linked to a Si coating by a gate storing the photocharges. Crosstalk between elements is less than 1 percent and the rms noise level is 180 μV. The array output sensitivity is 1.37 μV/electron, linearity is within 1 percent, and the maximum data rate is 10 MHz. The entire sensor package draws under 150 mW of power from the spacecraft. The TH 7805 has withstood over 10 krad in tests without exhibiting faults.

  10. Multi-spectral image dissector camera system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The image dissector sensor for the Earth Resources Program is evaluated using contrast and reflectance data. The ground resolution obtainable for low contrast at the targeted signal to noise ratio of 1.8 was defined. It is concluded that the system is capable of achieving the detection of small, low contrast ground targets from satellites.

  11. 3D Land Cover Classification Based on Multispectral LIDAR Point Clouds

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    A multispectral Lidar system can emit simultaneous laser pulses at different wavelengths. The reflected multispectral energy is captured through a receiver of the sensor, and the return signal together with the position and orientation information of the sensor is recorded. These recorded data are combined with GNSS/IMU data in further post-processing, forming high-density multispectral 3D point clouds. As the first commercial multispectral airborne Lidar sensor, the Optech Titan system is capable of collecting point cloud data in all three channels: at 532 nm visible (green), at 1064 nm near infrared (NIR) and at 1550 nm intermediate infrared (IR). It has become a new source of data for 3D land cover classification. The paper presents an Object Based Image Analysis (OBIA) approach that uses only multispectral Lidar point cloud datasets for 3D land cover classification. The approach consists of three steps. Firstly, multispectral intensity images are segmented into image objects on the basis of multi-resolution segmentation integrating different scale parameters. Secondly, intensity objects are classified into nine categories by using customized classification indexes and a combination of the multispectral reflectance with the vertical distribution of object features. Finally, accuracy assessment is conducted by comparing random reference sample points from Google imagery tiles with the classification results. The classification results show high overall accuracy for most of the land cover types; over 90% overall accuracy is achieved using multispectral Lidar point clouds for 3D land cover classification.

  12. Multispectral system for perimeter protection of stationary and moving objects

    NASA Astrophysics Data System (ADS)

    Szustakowski, Mieczyslaw; Ciurapinski, Wieslaw M.; Zyczkowski, Marek; Palka, Norbert; Kastek, Mariusz; Dulski, Rafal; Bieszczad, Grzegorz; Sosnowski, Tomasz

    2009-09-01

    The introduction of ground-based multispectral detection has changed the organization and construction of perimeter security systems. Perimeter systems with linear zone sensors and cables have been replaced with a point arrangement of sensors with multispectral detection. Such multispectral sensors generally consist of an active ground radar, which scans the protected area with microwaves or millimeter waves, a thermal camera, which detects temperature contrast, and a visible-range camera. Combining these three different technologies into one system requires a methodology for selecting the installation conditions and the sensor parameters. This procedure enables us to construct a system with correlated range, resolution, field of view and object identification. The second technical problem connected with the multispectral system is its software, which couples the radar with the cameras. This software can be used for automatic focusing of the cameras, automatic guiding of the cameras to an object detected by the radar, tracking of the object, localization of the object on a digital map, as well as identification and alarming. In this paper, two essential issues connected with the multispectral system are described. We focus on the methodology for selecting sensor parameters, and present the use of a spider chart adapted to the proposed methodology. Next, we describe the methodology for automating the system with regard to object detection, tracking, identification, localization and alarming.

  13. Application of multispectral systems for the diagnosis of plant diseases

    NASA Astrophysics Data System (ADS)

    Feng, Jie; Liao, Ningfang; Wang, Guolong; Luo, Yongdao; Liang, Minyong

    2008-03-01

    Multispectral imaging combines spatial imaging and spectral detection; it can obtain the spectral information and image information of an object at the same time. Based on this concept, a multispectral camera system is proposed for the diagnosis of plant diseases. In this paper, a multispectral camera was used as the image capturing device. It consists of a monochrome CCD camera and 16 narrow-band filters. In this experiment, the multispectral images of the Macbeth 24 color patches were captured under incandescent lamp illumination. The 64 spectral reflectance values of each color patch were calculated using spline interpolation from 400 to 700 nm, and the color of the object was reproduced from the estimated spectral reflectance. The reproduction result was compared with the color signal measured using an X-Rite PULSE spectrophotometer. The average and maximum ΔE*ab are 9.23 and 12.81. It is confirmed that the multispectral system achieves color reproduction of plant diseases from narrow-band multispectral images.
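    A minimal Python sketch of the spline-interpolation step described above: reflectance sampled at the 16 narrow-band filter wavelengths is interpolated to 64 samples across 400-700 nm. The filter centre wavelengths used here are assumptions:

    import numpy as np
    from scipy.interpolate import CubicSpline

    band_centres = np.linspace(400, 700, 16)   # nm, assumed filter centre wavelengths
    dense_grid = np.linspace(400, 700, 64)     # 64 reflectance samples, as in the abstract

    def interpolate_reflectance(band_reflectance):
        """Cubic-spline interpolation of one pixel's 16-band reflectance to 64 samples."""
        return CubicSpline(band_centres, band_reflectance)(dense_grid)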

  14. Multispectral imaging system for contaminant detection

    NASA Technical Reports Server (NTRS)

    Poole, Gavin H. (Inventor)

    2003-01-01

    An automated inspection system for detecting digestive contaminants on food items as they are being processed for consumption includes a conveyor for transporting the food items, a light sealed enclosure which surrounds a portion of the conveyor, with a light source and a multispectral or hyperspectral digital imaging camera disposed within the enclosure. Operation of the conveyor, light source and camera are controlled by a central computer unit. Light reflected by the food items within the enclosure is detected in predetermined wavelength bands, and detected intensity values are analyzed to detect the presence of digestive contamination.

  15. Multispectral imaging using a single bucket detector

    PubMed Central

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-01-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector’s fast response, a scene’s 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers. PMID:27103168

  16. Multispectral imaging using a single bucket detector

    NASA Astrophysics Data System (ADS)

    Bian, Liheng; Suo, Jinli; Situ, Guohai; Li, Ziwei; Fan, Jingtao; Chen, Feng; Dai, Qionghai

    2016-04-01

    Existing multispectral imagers mostly use available array sensors to separately measure 2D data slices in a 3D spatial-spectral data cube. Thus they suffer from low photon efficiency, limited spectrum range and high cost. To address these issues, we propose to conduct multispectral imaging using a single bucket detector, to take full advantage of its high sensitivity, wide spectrum range, low cost, small size and light weight. Technically, utilizing the detector’s fast response, a scene’s 3D spatial-spectral information is multiplexed into a dense 1D measurement sequence and then demultiplexed computationally under the single pixel imaging scheme. A proof-of-concept setup is built to capture multispectral data of 64 pixels × 64 pixels × 10 wavelength bands ranging from 450 nm to 650 nm, with the acquisition time being 1 minute. The imaging scheme holds great potentials for various low light and airborne applications, and can be easily manufactured as production-volume portable multispectral imagers.
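    A toy Python sketch of the single-pixel demultiplexing idea described in the two records above: each band of a small scene is measured through a sequence of random binary patterns by one bucket detector, and the band images are recovered by least squares. The sizes are tiny placeholders, not the 64 x 64 x 10 configuration of the actual setup:

    import numpy as np

    rng = np.random.default_rng(0)
    n_pix, n_meas, n_bands = 16 * 16, 400, 3
    patterns = rng.integers(0, 2, size=(n_meas, n_pix)).astype(float)  # projected patterns
    scene = rng.random((n_pix, n_bands))                               # unknown spatial-spectral cube

    measurements = patterns @ scene                       # bucket-detector sequence per band
    recovered, *_ = np.linalg.lstsq(patterns, measurements, rcond=None)
    images = recovered.T.reshape(n_bands, 16, 16)         # one 16 x 16 image per band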

  17. Analysis of multispectral signatures of the shot

    NASA Astrophysics Data System (ADS)

    Kastek, Mariusz; Dulski, Rafał; Piątkowski, Tadeusz; Madura, Henryk; Bareła, Jarosław; Polakowski, Henryk

    2011-06-01

    The paper presents some practical aspects of sniper IR signature measurements. Descriptions of the signatures of sniper shots in typical scenarios are presented. We take into consideration sniper activities in open areas as well as in urban environments. The measurements were made at a field test ground; high-precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions. Some of the cameras are measurement-class devices with high accuracy and frame rates. The registrations were made simultaneously in the UV, NWIR, SWIR and LWIR spectral bands. The infrared cameras allow optical filters to be installed for multispectral measurements. An ultra-fast visual camera was also used for visible-spectrum registration. Exemplary sniper IR signatures for typical situations are presented. The LWIR imaging spectroradiometer HyperCam was also used during the laboratory measurements and field experiments. The signatures collected by HyperCam were useful for determining the spectral characteristics of the shot.

  18. Remote sensing of shorelines using data fusion of hyperspectral and multispectral imagery acquired from mobile and fixed platforms

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Frystacky, Heather

    2012-06-01

    An optimized data fusion methodology is presented that makes use of airborne and vessel-mounted hyperspectral and multispectral imagery acquired at littoral zones in Florida and the northern Gulf of Mexico. The results demonstrate the use of hyperspectral-multispectral data fusion for anomaly detection along shorelines and in surface and subsurface waters. Hyperspectral imagery utilized in the data fusion analysis was collected using a 64-1024 channel, 1376-pixel-swath, temperature-stabilized sensing system with an integrated inertial motion unit and differential GPS. The imaging system is calibrated using dual 18 inch calibration spheres, spectral line sources, and custom line targets. Simultaneously collected three-band multispectral imagery used in the data fusion analysis was derived from either a 12 inch focal length large format camera using 9 inch high speed AGFA color negative film, a 12.3 megapixel digital camera, or dual high speed full definition video cameras. Pushbroom sensor imagery is corrected using Kalman filtering and smoothing in order to remove the effects of airborne platform motions or motions of a small vessel. Custom software developed for the hyperspectral system and the optimized data fusion process allows for post-processing using atmospherically corrected and georeferenced reflectance imagery. The optimized data fusion approach allows for detecting spectral anomalies in the resolution-enhanced data cubes. Spectral-spatial anomaly detection is demonstrated using simulated embedded targets in actual imagery. The approach allows one to utilize spectral signature anomalies to identify features and targets that would otherwise not be detectable. The optimized data fusion techniques and software have been developed in order to perform sensitivity analysis of the synthetic images and to optimize the singular value decomposition model building process and the 2-D Butterworth cutoff frequency selection process, using the concept of user defined "feature
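
    The abstract's mention of selecting a 2-D Butterworth cutoff frequency suggests a frequency-domain step in the resolution-enhancement and fusion workflow. Below is a minimal sketch of a 2-D Butterworth low-pass transfer function and of blending low frequencies from one co-registered image with high frequencies from a sharper one; the cutoff, filter order, and blending scheme are illustrative assumptions, not the authors' optimized fusion model.

    ```python
    import numpy as np

    def butterworth_lowpass(shape, cutoff, order=2):
        """2-D Butterworth low-pass transfer function H(u, v) = 1 / (1 + (D/D0)^(2n))."""
        rows, cols = shape
        u = np.fft.fftfreq(rows)[:, None]          # normalized frequencies along rows
        v = np.fft.fftfreq(cols)[None, :]          # normalized frequencies along columns
        dist = np.sqrt(u**2 + v**2)                # radial frequency D(u, v)
        return 1.0 / (1.0 + (dist / cutoff) ** (2 * order))

    def fuse(low_res_band, high_res_pan, cutoff=0.08, order=2):
        """Toy frequency-domain fusion: low frequencies from the spectral band,
        high frequencies from the sharper co-registered image."""
        H = butterworth_lowpass(low_res_band.shape, cutoff, order)
        low = np.fft.fft2(low_res_band) * H
        high = np.fft.fft2(high_res_pan) * (1.0 - H)
        return np.real(np.fft.ifft2(low + high))

    # Example with synthetic co-registered images of the same size.
    rng = np.random.default_rng(1)
    band = rng.random((128, 128))
    pan = rng.random((128, 128))
    fused = fuse(band, pan)
    print(fused.shape)
    ```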

  19. Low SWaP multispectral sensors using dichroic filter arrays

    NASA Astrophysics Data System (ADS)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4 band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
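
    De-mosaicing a dichroic filter array follows the same pattern as color de-mosaicing: sub-sample the raw frame according to the mosaic layout and interpolate each channel back to full resolution. The sketch below assumes a hypothetical 2x2 R/G/NIR/B unit cell and a nearest-neighbour fill; the actual mosaic layouts and interpolation methods used in commercial filter-array cameras are not specified here.

    ```python
    import numpy as np

    # Assumed 2x2 unit cell of an RGB + NIR filter mosaic (layout is illustrative):
    #   R   G
    #   NIR B
    def split_mosaic(raw):
        """Return the four sub-sampled channel planes from a raw mosaic frame."""
        return {
            "R":   raw[0::2, 0::2],
            "G":   raw[0::2, 1::2],
            "NIR": raw[1::2, 0::2],
            "B":   raw[1::2, 1::2],
        }

    def upsample(plane, shape):
        """Nearest-neighbour fill back to full resolution (a stand-in for real de-mosaicing)."""
        return np.kron(plane, np.ones((2, 2)))[: shape[0], : shape[1]]

    raw = np.random.default_rng(2).integers(0, 4096, size=(480, 640)).astype(float)
    channels = split_mosaic(raw)
    full_res = {name: upsample(plane, raw.shape) for name, plane in channels.items()}
    print({name: img.shape for name, img in full_res.items()})
    ```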

  20. A real-time multispectral imaging system for low- or mid-altitude remote sensing

    NASA Astrophysics Data System (ADS)

    Yi, Dingrong; Kong, Linghua

    2012-10-01

    Multispectral imaging is a powerful tool in remote sensing applications. Recently, a micro-arrayed narrow-band optical mosaic filter was invented and successfully fabricated to reduce the size and cost of multispectral imaging devices in order to meet the requirements of low- or mid-altitude remote sensing. Such a filter with four narrow bands is integrated with an off-the-shelf CCD camera, resulting in an economical, lightweight multispectral imaging camera capable of producing multiple images at different center wavelengths in a single shot. The multispectral imaging camera is then integrated with a wireless transmitter and battery to produce a remote sensing multispectral imaging system. The design and some preliminary results of a prototype multispectral imaging system, weighing only 200 grams and intended for remote sensing applications, are reported. The prototype eliminates the image registration procedure required by traditional multispectral imaging technologies and offers further advantages such as low cost, light weight, and a compact design.

  1. Active and passive multispectral scanner for earth resources applications: An advanced applications flight experiment

    NASA Technical Reports Server (NTRS)

    Hasell, P. G., Jr.; Peterson, L. M.; Thomson, F. J.; Work, E. A.; Kriegler, F. J.

    1977-01-01

    The development of an experimental airborne multispectral scanner to provide both active (laser illuminated) and passive (solar illuminated) data from a commonly registered surface scene is discussed. The system was constructed according to specifications derived in an initial program design study. The system was installed in an aircraft and test flown to produce illustrative active and passive multi-spectral imagery. However, data were neither collected nor analyzed for any specific application.

  2. CAOS-CMOS camera.

    PubMed

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point-photo-detector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems. PMID:27410361
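
    The time-frequency agile-pixel idea can be illustrated with a toy simulation: each selected pixel is modulated at its own temporal frequency on the digital micromirror device, the point photodetector records the summed signal, and a Fourier transform recovers the per-pixel irradiances even when they span a very large brightness range. The sample rate, assigned frequencies, and simple FFT readout below are assumptions for illustration, not the actual CAOS signal processing.

    ```python
    import numpy as np

    # Toy CAOS-style acquisition: three "agile pixels" of very different brightness,
    # each modulated at its own temporal frequency (frequencies are assumed).
    fs = 8192.0                                  # point-detector sample rate (Hz), chosen so the
    n = 8192                                     # assigned frequencies fall on exact FFT bins
    t = np.arange(n) / fs
    pixel_irradiance = np.array([1.0, 1e-2, 1e-4])    # ~80 dB brightness spread
    pixel_freq = np.array([400.0, 700.0, 1100.0])     # assigned modulation frequencies (Hz)

    # The point photodetector records the sum of all modulated pixels; the on/off DMD
    # modulation is approximated here by a raised cosine for simplicity.
    signal = sum(I * 0.5 * (1.0 + np.cos(2 * np.pi * f * t))
                 for I, f in zip(pixel_irradiance, pixel_freq))

    # Frequency-domain decoding: the cosine amplitude at each assigned frequency is I/2,
    # so the pixel irradiance is recovered as 4 * |FFT| / n at that bin.
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    for I, f in zip(pixel_irradiance, pixel_freq):
        k = int(np.argmin(np.abs(freqs - f)))
        print(f"{f:6.0f} Hz  true={I:.1e}  recovered={4 * spectrum[k]:.1e}")
    ```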

  4. Galileo multispectral imaging of Earth.

    PubMed

    Geissler, P; Thompson, W R; Greenberg, R; Moersch, J; McEwen, A; Sagan, C

    1995-08-25

    Nearly 6000 multispectral images of Earth were acquired by the Galileo spacecraft during its two flybys. The Galileo images offer a unique perspective on our home planet through the spectral capability made possible by four narrowband near-infrared filters, intended for observations of methane in Jupiter's atmosphere, which are not incorporated in any of the currently operating Earth orbital remote sensing systems. Spectral variations due to mineralogy, vegetative cover, and condensed water are effectively mapped by the visible and near-infrared multispectral imagery, showing a wide variety of biological, meteorological, and geological phenomena. Global tectonic and volcanic processes are clearly illustrated by these images, providing a useful basis for comparative planetary geology. Differences between plant species are detected through the narrowband IR filters on Galileo, allowing regional measurements of variation in the "red edge" of chlorophyll and the depth of the 1-micrometer water band, which is diagnostic of leaf moisture content. Although evidence of life is widespread in the Galileo data set, only a single image (at approximately 2 km/pixel) shows geometrization plausibly attributable to our technical civilization. Water vapor can be uniquely imaged in the Galileo 0.73-micrometer band, permitting spectral discrimination of moist and dry clouds with otherwise similar albedo. Surface snow and ice can be readily distinguished from cloud cover by narrowband imaging within the sensitivity range of Galileo's silicon CCD camera. Ice grain size variations can be mapped using the weak H2O absorption at 1 micrometer, a technique which may find important applications in the exploration of the moons of Jupiter. The Galileo images have the potential to make unique contributions to Earth science in the areas of geological, meteorological and biological remote sensing, due to the inclusion of previously untried narrowband IR filters. The vast scale and near global

  6. Land use classification utilizing remote multispectral scanner data and computer analysis techniques

    NASA Technical Reports Server (NTRS)

    Leblanc, P. N.; Johannsen, C. J.; Yanner, J. E.

    1973-01-01

    An airborne multispectral scanner was used to collect the visible and reflective infrared data. A small subdivision near Lafayette, Indiana was selected as the test site for the urban land use study. Multispectral scanner data were collected over the subdivision on May 1, 1970 from an altitude of 915 meters. The data were collected in twelve wavelength bands from 0.40 to 1.00 micrometers by the scanner. The results indicated that computer analysis of multispectral data can be very accurate in classifying and estimating the natural and man-made materials that characterize land uses in an urban scene.
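
    As a hedged illustration of the kind of per-pixel pattern recognition referred to above (and not the classification procedure actually used in the 1973 study), the sketch below treats each pixel's band values as a feature vector and assigns it to the nearest class mean learned from labelled training pixels.

    ```python
    import numpy as np

    def train_class_means(pixels, labels):
        """pixels: (n_samples, n_bands) training spectra; labels: (n_samples,) class ids."""
        classes = np.unique(labels)
        means = np.stack([pixels[labels == c].mean(axis=0) for c in classes])
        return classes, means

    def classify(image, classes, means):
        """image: (rows, cols, n_bands). Assign each pixel to the nearest class mean."""
        flat = image.reshape(-1, image.shape[-1])
        # squared Euclidean distance from every pixel to every class mean
        d2 = ((flat[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        return classes[np.argmin(d2, axis=1)].reshape(image.shape[:2])

    # Synthetic example: 12-band image, 3 hypothetical cover classes.
    rng = np.random.default_rng(3)
    means_true = rng.random((3, 12))
    train_x = np.vstack([m + 0.05 * rng.standard_normal((50, 12)) for m in means_true])
    train_y = np.repeat(np.arange(3), 50)
    image = means_true[rng.integers(0, 3, size=(64, 64))] + 0.05 * rng.standard_normal((64, 64, 12))

    classes, means = train_class_means(train_x, train_y)
    label_map = classify(image, classes, means)
    print(label_map.shape, np.bincount(label_map.ravel()))
    ```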

  7. Multispectral imaging for biometrics

    NASA Astrophysics Data System (ADS)

    Rowe, Robert K.; Corcoran, Stephen P.; Nixon, Kristin A.; Ostrom, Robert E.

    2005-03-01

    Automated identification systems based on fingerprint images are subject to two significant types of error: an incorrect decision about the identity of a person due to a poor quality fingerprint image and incorrectly accepting a fingerprint image generated from an artificial sample or altered finger. This paper discusses the use of multispectral sensing as a means to collect additional information about a finger that significantly augments the information collected using a conventional fingerprint imager based on total internal reflectance. In the context of this paper, "multispectral sensing" is used broadly to denote a collection of images taken under different polarization conditions and illumination configurations, as well as using multiple wavelengths. Background information is provided on conventional fingerprint imaging. A multispectral imager for fingerprint imaging is then described and a means to combine the two imaging systems into a single unit is discussed. Results from an early-stage prototype of such a system are shown.

  8. Multispectral imaging probe

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Descour, Michael R.; Armour, David L.; Craig, Marcus J.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector.

  9. Multispectral imaging probe

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Descour, M.R.; Armour, D.L.; Craig, M.J.; Richards-Kortum, R.

    1999-07-27

    A multispectral imaging probe delivers a range of wavelengths of excitation light to a target and collects a range of expressed light wavelengths. The multispectral imaging probe is adapted for mobile use and use in confined spaces, and is sealed against the effects of hostile environments. The multispectral imaging probe comprises a housing that defines a sealed volume that is substantially sealed from the surrounding environment. A beam splitting device mounts within the sealed volume. Excitation light is directed to the beam splitting device, which directs the excitation light to a target. Expressed light from the target reaches the beam splitting device along a path coaxial with the path traveled by the excitation light from the beam splitting device to the target. The beam splitting device directs expressed light to a collection subsystem for delivery to a detector. 8 figs.

  10. SWNT Imaging Using Multispectral Image Processing

    NASA Astrophysics Data System (ADS)

    Blades, Michael; Pirbhai, Massooma; Rotkin, Slava V.

    2012-02-01

    A flexible optical system was developed to image carbon single-wall nanotube (SWNT) photoluminescence using the multispectral capabilities of a typical CCD camcorder. The built-in Bayer filter of the CCD camera was utilized, together with OpenCV C++ libraries for image processing, to decompose the image generated in a high-magnification epifluorescence microscope setup into three pseudo-color channels. By carefully calibrating the filter beforehand, it was possible to extract spectral data from these channels and effectively isolate the SWNT signals from the background.
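
    The calibration-then-extraction step can be sketched as a small linear unmixing problem: if the response of the three pseudo-color channels to each emission component has been measured beforehand, the per-pixel component strengths follow from inverting that sensitivity matrix. The matrix values and component names below are hypothetical placeholders for whatever calibration was actually performed.

    ```python
    import numpy as np

    # Hypothetical calibrated sensitivity matrix: rows = camera channels (R, G, B),
    # columns = emission components (SWNT photoluminescence, background, excitation bleed-through).
    S = np.array([
        [0.70, 0.20, 0.05],
        [0.20, 0.60, 0.15],
        [0.05, 0.10, 0.75],
    ])

    def unmix(rgb_image):
        """rgb_image: (rows, cols, 3) demosaiced frame.
        Returns per-pixel component strengths by solving S @ c = rgb for each pixel."""
        flat = rgb_image.reshape(-1, 3).T                # shape (3, n_pixels)
        components = np.linalg.solve(S, flat)            # shape (3, n_pixels)
        return components.T.reshape(rgb_image.shape)

    frame = np.random.default_rng(4).random((240, 320, 3))
    components = unmix(frame)
    swnt_map = components[..., 0]                        # first component: assumed SWNT signal
    print(swnt_map.shape)
    ```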

  11. Simultaneous multispectral imaging using lenslet arrays

    NASA Astrophysics Data System (ADS)

    Hinnrichs, Michele; Jensen, James

    2013-03-01

    There is a need for small, compact multispectral and hyperspectral imaging systems that simultaneously image in many spectral bands across the infrared spectral region from short- to long-wave infrared. This is a challenge for conventional optics and usually requires large, costly and complex optical systems. However, with the advances in materials and photolithographic technology, Micro-Optical-Electrical-Machine-Systems (MOEMS) can meet these goals. In this paper Pacific Advanced Technology and ECBC present the work that we are doing under a SBIR contract to the US Army using a MOEMS based diffractive optical lenslet array to perform simultaneous multispectral and hyperspectral imaging with relatively high spatial resolution. Under this program we will develop a proof of concept system that demonstrates how a diffractive optical (DO) lenslet array can image 1024 x 1024 pixels in 16 colors every frame of the camera. Each color image has a spatial resolution of 256 x 256 pixels with an IFOV of 1.7 mrads and FOV of 25 degrees. The purpose of this work is to simultaneously image multiple colors each frame and reduce the temporal changes between colors that are apparent in sequential multispectral imaging. Translating the lenslet array will collect hyperspectral image data cubes, as will be explained later in this paper. Because the optics is integrated with the detector, the entire multispectral/hyperspectral system can be contained in a miniature package. The spectral images are collected simultaneously, providing high-resolution spectral-spatial-temporal information in each frame of the camera. This enables the implementation of spectral-temporal-spatial algorithms in real time with high sensitivity for the detection of weak signals in a high background clutter environment with low sensitivity to camera motion. Using MOEMS actuation the DO lenslet array is translated along the optical axis to complete the full hyperspectral data cube in just a few frames of the

  12. Automated Data Production For A Novel Airborne Multiangle Spectropolarimetric Imager (AIRMSPI)

    NASA Technical Reports Server (NTRS)

    Jovanovic, V .M.; Bull, M.; Diner, D. J.; Geier, S.; Rheingans, B.

    2012-01-01

    A novel polarimetric imaging technique making use of rapid retardance modulation has been developed by JPL as a part of NASA's Instrument Incubator Program. It has been built into the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) under NASA's Airborne Instrument Technology Transition Program, and is aimed primarily at remote sensing of the amounts and microphysical properties of aerosols and clouds. AirMSPI includes an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera that measures polarization in a subset of the bands (470, 660, and 865 nm). The camera is mounted on a gimbal and acquires imagery in a configurable set of along-track viewing angles ranging between +67 deg and -67 deg relative to nadir. As a result, near simultaneous multi-angle, multi-spectral, and polarimetric measurements of the targeted areas at a spatial resolution ranging from 7 m to 20 m (depending on the viewing angle) can be derived. An automated data production system is being built to support high data acquisition rate in concert with co-registration and orthorectified mapping requirements. To date, a number of successful engineering checkout flights were conducted in October 2010, August-September 2011, and January 2012. Data products resulting from these flights will be presented.

  13. Multispectral Mapping of the Moon by Clementine

    NASA Technical Reports Server (NTRS)

    Eliason, Eric M.; McEwen, Alfred S.; Robinson, M.; Lucey, Paul G.; Duxbury, T.; Malaret, E.; Pieters, Carle; Becker, T.; Isbell, C.; Lee, E.

    1998-01-01

    One of the chief scientific objectives of the Clementine mission at the Moon was to acquire global multispectral mapping. A global digital map of the Moon in 11 spectral bandpasses and at a scale of 100 m/pixel is being produced at the U.S. Geological Survey in Flagstaff, Arizona. Near-global coverage was acquired with the UVVIS camera (central wavelengths of 415, 750, 900, 950, and 1000 nm) and the NIR camera (1102, 1248, 1499, 1996, 2620, and 2792 nm). We expect to complete processing of the UVVIS mosaics before the fall of 1998, and to complete the NIR mosaics a year later. The purpose of this poster is to provide an update on the processing and to show examples of the products or perhaps even a wall-sized display of color products from the UVVIS mosaics.

  14. Compact multi-spectral imaging system for dermatology and neurosurgery

    NASA Astrophysics Data System (ADS)

    Noordmans, Herke Jan; de Roode, Rowland; Verdaasdonk, Rudolf

    2007-03-01

    A compact multi-spectral imaging system is presented as a diagnostic tool in dermatology and neurosurgery. Using an electronically tunable filter and a sensitive high resolution digital camera, 140 spectral images from 400 nm up to 720 nm are acquired in 40 s. Advanced image processing algorithms are used to enable interactive acquisition, viewing, image registration and image analysis. Experiments in the departments of dermatology and neurosurgery show that multispectral imaging reveals much more detail than conventional medical photography or a surgical microscope, as images can be reprocessed to enhance the view of, e.g., tumor boundaries. Using a hardware-based interactive registration algorithm, multi-spectral images can be aligned to correct for motion that occurred during image acquisition or to compare acquisitions from different moments in time. The system proves to be a powerful diagnostic tool for medical imaging in the visible and near-IR range.

  15. Polarimetric Multispectral Imaging Technology

    NASA Technical Reports Server (NTRS)

    Cheng, L.-J.; Chao, T.-H.; Dowdy, M.; Mahoney, C.; Reyes, G.

    1993-01-01

    The Jet Propulsion Laboratory is developing a remote sensing technology on which a new generation of compact, lightweight, high-resolution, low-power, reliable, versatile, programmable scientific polarimetric multispectral imaging instruments can be built to meet the challenge of future planetary exploration missions. The instrument is based on the fast programmable acousto-optic tunable filter (AOTF) of tellurium dioxide (TeO2) that operates in the wavelength range of 0.4-5 microns. Basically, the AOTF multispectral imaging instrument measures incoming light intensity as a function of spatial coordinates, wavelength, and polarization. Its operation can be in either sequential, random access, or multiwavelength mode as required. This provides observation flexibility, allowing real-time alternation among desired observations, collecting needed data only, minimizing data transmission, and permitting implementation of new experiments. These will result in optimization of the mission performance with minimal resources. Recently we completed a polarimetric multispectral imaging prototype instrument and performed outdoor field experiments for evaluating application potentials of the technology. We also investigated potential improvements on AOTF performance to strengthen technology readiness for applications. This paper will give a status report on the technology and a prospect toward future planetary exploration.

  16. Wetlands mapping with spot multispectral scanner data

    SciTech Connect

    Mackey, H.E. Jr. ); Jensen, J.R. . Dept. of Geography)

    1989-01-01

    Government facilities such as the US Department of Energy's Savannah River Plant (SRP) near Aiken, South Carolina, often use remote sensing data to assist in environmental management. Airborne multispectral scanner (MSS) data have been acquired at SRP since 1981. Various types of remote sensing data have been used to map and characterize wetlands. Regional Landsat MSS and TM satellite data have been used for wetlands mapping by various government agencies and private organizations. Furthermore, SPOT MSS data are becoming available and provide opportunities for increased spatial resolution and temporal coverage for wetlands mapping. This paper summarizes the initial results from using five dates of SPOT MSS data from April through October, 1987, as a means to monitor seasonal wetland changes in freshwater wetlands of the SRP. 11 refs., 4 figs.

  17. Use of multispectral scanner images for assessment of hydrothermal alteration in the Marysvale, Utah, mining area.

    USGS Publications Warehouse

    Podwysocki, M.H.; Segal, D.B.; Abrams, M.J.

    1983-01-01

    Airborne multispectral scanner data were used to assess hydrothermal alteration. A color composite image was constructed using the following spectral band ratios: 1.6/2.2 μm, 1.6/0.48 μm, and 0.67/1.0 μm. The color ratio composite successfully distinguished most types of altered rocks from unaltered rocks and allowed further division of altered rocks into ferric oxide-rich and -poor types.
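
    The color-ratio composite lends itself to a short sketch: compute the three band ratios named above, stretch each for display, and stack them as the red, green, and blue planes. The percentile stretch and the assignment of each ratio to a display color are assumptions for illustration, not the USGS processing.

    ```python
    import numpy as np

    def stretch(band, low_pct=2, high_pct=98):
        """Linear contrast stretch of a ratio image to 8-bit for display."""
        lo, hi = np.percentile(band, [low_pct, high_pct])
        return np.clip((band - lo) / (hi - lo + 1e-12) * 255, 0, 255).astype(np.uint8)

    def ratio_composite(b_048, b_067, b_10, b_16, b_22):
        """Color-ratio composite from the band ratios named in the abstract
        (assumed display assignment: red = 1.6/2.2 um, green = 1.6/0.48 um, blue = 0.67/1.0 um)."""
        eps = 1e-12
        red = b_16 / (b_22 + eps)
        green = b_16 / (b_048 + eps)
        blue = b_067 / (b_10 + eps)
        return np.dstack([stretch(red), stretch(green), stretch(blue)])

    # Synthetic co-registered scanner bands (reflectance-like values).
    rng = np.random.default_rng(5)
    bands = [rng.random((256, 256)) + 0.1 for _ in range(5)]
    composite = ratio_composite(*bands)
    print(composite.shape, composite.dtype)
    ```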

  18. Multispectral observations of the surf zone

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon S.; Dirbas, Joseph; Gilbert, Gary

    2003-09-01

    Airborne multispectral imagery was collected over various targets on the beach and in the water in an attempt to characterize the surf zone environment with respect to electro-optical system capabilities and to assess the utility of very low cost, small multispectral systems in mine counter measures (MCM) and intelligence, surveillance and reconnaissance applications. The data were collected by PAR Government Systems Corporation (PGSC) at the Army Corps of Engineers Field Research Facility at Duck, North Carolina, and on the beaches of Camp Pendleton Marine Corps Base in Southern California. PGSC flew the first two of its MANTIS (Mission Adaptable Narrowband Tunable Imaging Sensor) systems. Both MANTIS systems were flown in an IR - red - green - blue (700, 600, 550, 480 nm) configuration from altitudes ranging from 200 to 700 meters. The data collected have been lightly analyzed, and a surf zone index (SZI) has been defined and calculated. This index allows mine hunting system performance measurements in the surf zone to be normalized by environmental conditions. The SZI takes into account water clarity, wave energy, and foam persistence.

  19. Multispectral imaging with type II superlattice detectors

    NASA Astrophysics Data System (ADS)

    Ariyawansa, Gamini; Duran, Joshua M.; Grupen, Matt; Scheihing, John E.; Nelson, Thomas R.; Eismann, Michael T.

    2012-06-01

    Infrared (IR) focal plane arrays (FPAs) with multispectral detector elements promise significant advantages for airborne threat warning, surveillance, and targeting applications. At present, the use of type II superlattice (T2SL) structures based on the 6.1Å-family materials (InAs, GaSb, and AlSb) has become an area of interest for developing IR detectors and their FPAs. The ability to vary the bandgap in the IR range, suppression of Auger processes, prospective reduction of Shockley-Read-Hall centers by improved material growth capabilities, and the material stability are a few reasons for the predicted dominance of the T2SL technology over presently leading HgCdTe and quantum well technologies. The focus of the work reported here is on the development of T2SL based dual-band IR detectors and their applicability for multispectral imaging. A new NpBPN detector designed for the detection of IR in the 3-5 and 8-12 μm atmospheric windows is presented; comparing its advantages over other T2SL based approaches. One of the key challenges of the T2SL dual-band detectors is the spectral crosstalk associated with the LWIR band. The properties of the state-of-the-art T2SLs (i.e., absorption coefficient, minority carrier lifetime and mobility, etc.) and the present growth limitations that impact spectral crosstalk are discussed.

  20. Camera Optics.

    ERIC Educational Resources Information Center

    Ruiz, Michael J.

    1982-01-01

    The camera presents an excellent way to illustrate principles of geometrical optics. Basic camera optics of the single-lens reflex camera are discussed, including interchangeable lenses and accessories available to most owners. Several experiments are described and results compared with theoretical predictions or manufacturer specifications.…

  1. Airborne imaging sensors for environmental monitoring & surveillance in support of oil spills & recovery efforts

    NASA Astrophysics Data System (ADS)

    Bostater, Charles R.; Jones, James; Frystacky, Heather; Coppin, Gaelle; Leavaux, Florian; Neyt, Xavier

    2011-11-01

    Collection of pushbroom sensor imagery from a mobile platform requires corrections using inertial measurement units (IMUs) and DGPS in order to create useable imagery for environmental monitoring and surveillance of shorelines in freshwater systems, coastal littoral zones and harbor areas. This paper describes a suite of imaging systems used during collection of hyperspectral imagery in northern Florida panhandle and Gulf of Mexico airborne missions to detect weathered oil in coastal littoral zones. Underlying concepts of pushbroom imagery, the needed corrections for directional changes using DGPS, and corrections for platform yaw, pitch, and roll using IMU data are described, as well as the development and application of optimal bands and spectral regions associated with weathered oil. Pushbroom sensor and frame camera data collected in response to the recent Gulf of Mexico oil spill disaster are presented as the scenario documenting environmental monitoring and surveillance techniques using mobile sensing platforms. Data were acquired during the months of February, March, April and May of 2011. The low altitude airborne systems include a temperature stabilized hyperspectral imaging system capable of up to 1024 spectral channels and 1376 spatial across-track pixels, flown at altitudes from 3,000 to 4,500 feet. The hyperspectral imaging system is collocated with a full resolution high definition video recorder for simultaneous HD video imagery, a 12.3 megapixel digital camera, and a mapping camera using 9 inch film that yields scanned aerial imagery of approximately 22,200 by 22,200 pixels (~255 megapixel RGB multispectral images) used for spectral-spatial sharpening of fused multispectral and hyperspectral imagery. Two solid state spectrographs with high spectral (252 channels) and radiometric sensitivity are used for collecting upwelling radiance (sub-meter pixels) with a downwelling irradiance fiber optic attachment. These sensors are utilized for

  2. Dual multispectral and 3D structured light laparoscope

    NASA Astrophysics Data System (ADS)

    Clancy, Neil T.; Lin, Jianyu; Arya, Shobhit; Hanna, George B.; Elson, Daniel S.

    2015-03-01

    Intraoperative feedback on tissue function, such as blood volume and oxygenation, would be useful to the surgeon in cases where current clinical practice relies on subjective measures, such as identification of ischaemic bowel or tissue viability during anastomosis formation. Also, tissue surface profiling may be used to detect and identify certain pathologies, as well as to diagnose aspects of tissue health such as gut motility. In this paper a dual modality laparoscopic system is presented that combines multispectral reflectance and 3D surface imaging. White light illumination from a xenon source is detected by a laparoscope-mounted fast filter wheel camera to assemble a multispectral image (MSI) cube. Surface shape is then calculated using a spectrally-encoded structured light (SL) pattern detected by the same camera and triangulated using an active stereo technique. Images of porcine small bowel were acquired during open surgery. Tissue reflectance spectra were acquired and blood volume was calculated at each spatial pixel across the bowel wall and mesentery. SL features were segmented and identified using a `normalised cut' algorithm and the colour vector of each spot. Using the 3D geometry defined by the camera coordinate system, the multispectral data could be overlaid onto the surface mesh. Dual MSI and SL imaging has the potential to provide augmented views to the surgeon, supplying diagnostic information related to blood supply health and organ function. Future work on this system will include filter optimisation to reduce noise in tissue optical property measurement, and minimise spot identification errors in the SL pattern.

  3. Video rate multispectral imaging for camouflaged target detection

    NASA Astrophysics Data System (ADS)

    Henry, Sam

    2015-05-01

    The ability to detect and identify camouflaged targets is critical in combat environments. Hyperspectral and multispectral cameras allow a soldier to identify threats more effectively than traditional RGB cameras due to both increased color resolution and the ability to see beyond visible light. Static imagers have proven successful; however, the development of video rate imagers allows for continuous real time target identification and tracking. This paper presents an analysis of existing anomaly detection algorithms and how they can be adapted to video rates, and presents a general purpose semisupervised real time anomaly detection algorithm using multiple frame sampling.
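
    As an example of the kind of anomaly detector commonly adapted to multispectral video (the standard Reed-Xiaoli detector, given here as a baseline rather than the semisupervised algorithm proposed in the paper), the sketch below scores each pixel by its Mahalanobis distance from the frame's background statistics and thresholds at a fixed false-alarm rate.

    ```python
    import numpy as np

    def rx_scores(frame):
        """Reed-Xiaoli (RX) anomaly score per pixel.
        frame: (rows, cols, bands) multispectral frame."""
        rows, cols, bands = frame.shape
        x = frame.reshape(-1, bands)
        mu = x.mean(axis=0)
        cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(bands)   # regularized background covariance
        diff = x - mu
        scores = np.einsum("ij,jk,ik->i", diff, np.linalg.inv(cov), diff)  # squared Mahalanobis distance
        return scores.reshape(rows, cols)

    # Synthetic frame with one anomalous patch in the NIR-like band.
    rng = np.random.default_rng(6)
    frame = rng.normal(0.3, 0.02, size=(120, 160, 4))
    frame[60:64, 80:84, 3] += 0.2
    scores = rx_scores(frame)
    threshold = np.percentile(scores, 99.9)            # simple fixed-false-alarm-rate threshold
    print("detections:", int((scores > threshold).sum()))
    ```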

  4. Remote sensing operations (multispectral scanner and photographic) in the New York Bight, 22 September 1975

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Hall, J. B., Jr.

    1977-01-01

    Ocean dumping of waste materials is a significant environmental concern in the New York Bight. One of these waste materials, sewage sludge, was monitored in an experiment conducted in the New York Bight on September 22, 1975. Remote sensing over controlled sewage sludge dumping included an 11-band multispectral scanner, five multispectral cameras, and one mapping camera. Concurrent in situ water samples were taken and acoustical measurements were made of the sewage sludge plumes. Data were obtained for sewage sludge plumes resulting from line (moving barge) and spot (stationary barge) dumps. Multiple aircraft overpasses were made to evaluate temporal effects on the plume signature.

  5. MULTISPECTRAL THERMAL IMAGER - OVERVIEW

    SciTech Connect

    P. WEBER

    2001-03-01

    The Multispectral Thermal Imager satellite fills a new and important role in advancing the state of the art in remote sensing sciences. Initial results with the full calibration system operating indicate that the system was already close to achieving the very ambitious goals which we laid out in 1993, and we are confident of reaching all of these goals as we continue our research and improve our analyses. In addition to the DOE interests, the satellite is tasked about one-third of the time with requests from other users supporting research ranging from volcanology to atmospheric sciences.

  6. Multispectral thermal imaging

    SciTech Connect

    Weber, P.G.; Bender, S.C.; Borel, C.C.; Clodius, W.B.; Smith, B.W.; Garrett, A.; Pendergast, M.M.; Kay, R.R.

    1998-12-01

    Many remote sensing applications rely on imaging spectrometry. Here the authors use imaging spectrometry for thermal and multispectral signatures measured from a satellite platform enhanced with a combination of accurate calibrations and on-board data for correcting atmospheric distortions. The approach is supported by physics-based end-to-end modeling and analysis, which permits a cost-effective balance between various hardware and software aspects. The goal is to develop and demonstrate advanced technologies and analysis tools toward meeting the needs of the customer; at the same time, the attributes of this system can address other applications in such areas as environmental change, agriculture, and volcanology.

  7. Lossless compression algorithm for multispectral imagers

    NASA Astrophysics Data System (ADS)

    Gladkova, Irina; Grossberg, Michael; Gottipati, Srikanth

    2008-08-01

    Multispectral imaging is becoming an increasingly important tool for monitoring the earth and its environment from spaceborne and airborne platforms. Multispectral imaging data consist of visible and IR measurements from a scene across space and spectrum. Growing data rates resulting from faster scanning and finer spatial and spectral resolution make compression an increasingly critical tool to reduce data volume for transmission and archiving. Research for NOAA NESDIS has been directed at determining, for the characteristics of satellite atmospheric Earth science imager sensor data, what lossless compression ratio can be obtained, as well as the types of mathematics and approaches that come closest to this data's entropy level. Conventional lossless methods do not achieve the theoretical limits for lossless compression on imager data as estimated from the Shannon entropy. In a previous paper, the authors introduced a lossless compression algorithm developed for MODIS as a proxy for future NOAA-NESDIS satellite based Earth science multispectral imagers such as GOES-R. The algorithm is based on capturing spectral correlations using spectral prediction, and spatial correlations with a linear transform encoder. In decompression, the algorithm uses a statistically computed look-up table to iteratively predict each channel from a channel decompressed in the previous iteration. In this paper we present a new approach which fundamentally differs from our prior work: instead of having a single predictor for each pair of bands, we introduce a piecewise spatially varying predictor which significantly improves the compression results. Our new algorithm also now optimizes the sequence of channels used for prediction. Our results are evaluated by comparison with a state-of-the-art wavelet based image compression scheme, JPEG2000. We present results on the 14 channel subset of the MODIS imager, which serves as a proxy for the GOES-R imager. We
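
    The spectral-prediction idea can be illustrated with a small sketch: predict one band as an affine function of a previously decoded band and compare the empirical entropy of the raw band with that of the prediction residual. The single affine predictor and histogram entropy estimate below are simplifications of the piecewise, look-up-table-based scheme the abstract describes.

    ```python
    import numpy as np

    def empirical_entropy(values):
        """Shannon entropy (bits/sample) of integer-valued data, from its histogram."""
        _, counts = np.unique(values, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def predict_band(reference, target):
        """Affine least-squares prediction of `target` from `reference`; returns integer residual."""
        a, b = np.polyfit(reference.ravel().astype(float), target.ravel().astype(float), 1)
        prediction = np.rint(a * reference + b).astype(np.int32)
        return target.astype(np.int32) - prediction

    # Two synthetic, strongly correlated 12-bit bands (a crude stand-in for adjacent imager channels).
    rng = np.random.default_rng(7)
    base = rng.integers(0, 4096, size=(256, 256))
    band_a = base
    band_b = np.clip(0.9 * base + 100 + rng.normal(0, 4, base.shape), 0, 4095).astype(np.int64)

    residual = predict_band(band_a, band_b)
    print("entropy of raw band:  %.2f bits/sample" % empirical_entropy(band_b))
    print("entropy of residual:  %.2f bits/sample" % empirical_entropy(residual))
    ```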

  8. Digital preprocessing and classification of multispectral earth observation data

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1976-01-01

    The development of airborne and satellite multispectral image scanning sensors has generated wide-spread interest in application of these sensors to earth resource mapping. These point scanning sensors permit scenes to be imaged in a large number of electromagnetic energy bands between .3 and 15 micrometers. The energy sensed in each band can be used as a feature in a computer based multi-dimensional pattern recognition process to aid in interpreting the nature of elements in the scene. Images from each band can also be interpreted visually. Visual interpretation of five or ten multispectral images simultaneously becomes impractical especially as area studied increases; hence, great emphasis has been placed on machine (computer) techniques for aiding in the interpretation process. This paper describes a computer software system concept called LARSYS for analysis of multivariate image data and presents some examples of its application.

  9. Multispectral Thermal Imager: overview

    NASA Astrophysics Data System (ADS)

    Bell, W. Randy; Weber, Paul G.

    2001-08-01

    The Multispectral Thermal Imager, MTI, is a research and development project sponsored by the United States Department of Energy. The primary mission is to demonstrate advanced multispectral and thermal imaging from a satellite, including new technologies, data processing and analysis techniques. The MTI builds on the efforts of a number of earlier efforts, including Landsat, NASA remote sensing missions, and others, but the MTI incorporates a unique combination of attributes. The MTI satellite was launched on 12 March 2000 into a 580 km x 610 km, sun-synchronous orbit with nominal 1 am and 1 pm equatorial crossing times. The Air Force Space Test Program provided the Orbital Sciences Taurus launch vehicle. The satellite has a design lifetime of a year, with the goal of three years. The satellite and payload can typically observe six sites per day, with either one or two observations per site from nadir and off-nadir angles. Data are stored in the satellite memory and down-linked to a ground station at Sandia National Laboratory. Data are then forwarded to the Data Processing and Analysis Center at Los Alamos National Laboratory for processing, analysis and distribution to the MTI team and collaborators. We will provide an overview of the Project, a few examples of data products, and an introduction to more detailed presentations in this special session.

  10. Remote detection of past habitability at Mars-analogue hydrothermal alteration terrains using an ExoMars Panoramic Camera emulator

    NASA Astrophysics Data System (ADS)

    Harris, J. K.; Cousins, C. R.; Gunn, M.; Grindrod, P. M.; Barnes, D.; Crawford, I. A.; Cross, R. E.; Coates, A. J.

    2015-05-01

    A major scientific goal of the European Space Agency's ExoMars 2018 rover is to identify evidence of life within the martian rock record. Key to this objective is the remote detection of geological substrates that are indicative of past habitable environments, which will rely on visual (stereo wide-angle, and high resolution images) and multispectral (440-1000 nm) data produced by the Panoramic Camera (PanCam) instrument. We deployed a PanCam emulator at four hydrothermal sites in the Námafjall volcanic region of Iceland, a Mars-analogue hydrothermal alteration terrain. At these sites, sustained acidic-neutral aqueous interaction with basaltic substrates (crystalline and sedimentary) has produced phyllosilicate, ferric oxide, and sulfate-rich alteration soils, and secondary mineral deposits including gypsum veins and zeolite amygdales. PanCam emulator datasets from these sites were complemented with (i) NERC Airborne Research and Survey Facility aerial hyperspectral images of the study area; (ii) in situ reflectance spectroscopy (400-1000 nm) of PanCam spectral targets; (iii) laboratory X-ray Diffraction, and (iv) laboratory VNIR (350-2500 nm) spectroscopy of target samples to identify their bulk mineralogy and spectral properties. The mineral assemblages and palaeoenvironments characterised here are analogous to neutral-acidic alteration terrains on Mars, such as at Mawrth Vallis and Gusev Crater. Combined multispectral and High Resolution Camera datasets were found to be effective at capturing features of astrobiological importance, such as secondary gypsum and zeolite mineral veins, and phyllosilicate-rich substrates. Our field observations with the PanCam emulator also uncovered stray light problems which are most significant in the NIR wavelengths and investigations are being undertaken to ensure that the flight model PanCam cameras are not similarly affected.

  11. Retinal oxygen saturation evaluation by multi-spectral fundus imaging

    NASA Astrophysics Data System (ADS)

    Khoobehi, Bahram; Ning, Jinfeng; Puissegur, Elise; Bordeaux, Kimberly; Balasubramanian, Madhusudhanan; Beach, James

    2007-03-01

    Purpose: To develop a multi-spectral method to measure oxygen saturation of the retina in the human eye. Methods: Five Cynomolgus monkeys with normal eyes were anesthetized with intramuscular ketamine/xylazine and intravenous pentobarbital. Multi-spectral fundus imaging was performed in five monkeys with a commercial fundus camera equipped with a liquid crystal tuned filter in the illumination light path and a 16-bit digital camera. Recording parameters were controlled with software written specifically for the application. Seven images at successively longer oxygen-sensing wavelengths were recorded within 4 seconds. Individual images for each wavelength were captured in less than 100 msec of flash illumination. Slightly misaligned images of separate wavelengths due to slight eye motion were registered and corrected by translational and rotational image registration prior to analysis. Numerical values of relative oxygen saturation of retinal arteries and veins and the underlying tissue in between the artery/vein pairs were evaluated by an algorithm previously described, but which is now corrected for blood volume from averaged pixels (n > 1000). Color saturation maps were constructed by applying the algorithm at each image pixel using a Matlab script. Results: Both the numerical values of relative oxygen saturation and the saturation maps correspond to the physiological condition, that is, in a normal retina, the artery is more saturated than the tissue and the tissue is more saturated than the vein. With the multi-spectral fundus camera and proper registration of the multi-wavelength images, we were able to determine oxygen saturation in the primate retinal structures on a tolerable time scale which is applicable to human subjects. Conclusions: Seven wavelength multi-spectral imagery can be used to measure oxygen saturation in retinal artery, vein, and tissue (microcirculation). This technique is safe and can be used to monitor oxygen uptake in humans. This work
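
    For background, a common two-wavelength formulation of retinal oximetry computes an optical density for a vessel at an oxygen-sensitive and an oxygen-insensitive (isosbestic) wavelength and maps their ratio linearly to saturation. The sketch below uses hypothetical calibration constants and illustrative intensities; it is not the corrected multi-wavelength algorithm cited in the abstract.

    ```python
    import numpy as np

    def optical_density(i_vessel, i_background):
        """OD = log10(I_background / I_vessel) for a retinal vessel segment."""
        return np.log10(i_background / i_vessel)

    def oxygen_saturation(i_vessel_sens, i_bg_sens, i_vessel_iso, i_bg_iso, a=1.25, b=-1.0):
        """Two-wavelength estimate: SO2 ~= a + b * ODR, where ODR = OD_sensitive / OD_isosbestic.
        The calibration constants a and b are hypothetical placeholders."""
        odr = optical_density(i_vessel_sens, i_bg_sens) / optical_density(i_vessel_iso, i_bg_iso)
        return a + b * odr

    # Averaged pixel intensities along a vessel and adjacent tissue (illustrative numbers only).
    so2_artery = oxygen_saturation(161.0, 180.0, 125.0, 180.0)
    so2_vein = oxygen_saturation(139.0, 180.0, 125.0, 180.0)
    print(f"artery SO2 ~ {so2_artery:.2f}, vein SO2 ~ {so2_vein:.2f}")
    ```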

  12. Using Google Earth for Rapid Dissemination of Airborne Remote Sensing Lidar and Photography

    NASA Astrophysics Data System (ADS)

    Wright, C. W.; Nayegandhi, A.; Brock, J. C.

    2006-12-01

    In order to visualize and disseminate vast amounts of lidar and digital photography data, we present a unique method that makes these data layers available via the Google Earth interface. The NASA Experimental Advanced Airborne Research Lidar (EAARL) provides unprecedented capabilities to survey coral reefs, nearshore benthic habitats, coastal vegetation, and sandy beaches. The EAARL sensor suite includes a water-penetrating lidar that provides high-resolution topographic information, a down-looking color digital camera, a down-looking high-resolution color-infrared (CIR) digital camera, and precision kinematic GPS receivers which provide for sub-meter geo-referencing of each laser and multispectral sample. Google Earth "kml" files are created for each EAARL multispectral and processed lidar image. A hierarchical structure of network links allows the user to download high-resolution images within the region of interest. The first network link (kmz file) downloaded by the user contains a color coded flight path and "minute marker" icons along the flight path. Each "minute" icon provides access to the image overlays, and additional network links for each second along the flight path as well as flight navigation information. Layers of false-color-coded lidar Digital Elevation Model (DEM) data are made available in 2 km by 2 km tiles. These layers include canopy-top, bare-Earth, submerged topography, and links to any other lidar products. The user has the option to download the x,y,z ascii point data or a DEM in the GeoTIFF file format for each tile. The NASA EAARL project captured roughly 250,000 digital photographs in five flights conducted a few days after Hurricane Katrina made landfall along the Gulf Coast in 2005. All of the photos and DEM layers are georeferenced and viewable online using Google Earth.

  13. Airborne Transparencies.

    ERIC Educational Resources Information Center

    Horne, Lois Thommason

    1984-01-01

    Starting from a science project on flight, art students discussed and investigated various means of moving in space. Then they made acetate illustrations which could be used as transparencies. The projection phenomenon made the illustrations look airborne. (CS)

  14. Multispectral information hiding in RGB image using bit-plane-based watermarking and its application

    NASA Astrophysics Data System (ADS)

    Shinoda, Kazuma; Watanabe, Aya; Hasegawa, Madoka; Kato, Shigeo

    2015-06-01

    Although it was expected that multispectral images would be adopted in many applications, such as remote sensing and medical imaging, their use has not become widespread in these fields. The development of a compact multispectral camera and display will be needed for practical use, but the format compatibility between multispectral and RGB images is also important for reducing introduction costs and ensuring high usability. We propose a method of embedding the spectral information into an RGB image by watermarking. The RGB image is calculated from the multispectral image, and then the original multispectral image is estimated from the RGB image using Wiener estimation. The residual data between the original and the estimated multispectral image are compressed and embedded in the lower bit planes of the RGB image. The experimental results show that, as compared with Wiener estimation, the proposed method leads to more than a 10 dB gain in the peak signal-to-noise ratio of the reconstructed multispectral image, while there are almost no significant perceptual differences in the watermarked RGB image.
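
    A minimal sketch of the bit-plane embedding step is shown below: a residual byte stream is compressed (zlib here, standing in for whatever coder the authors used) and its bits are written into, and read back from, the least-significant bit plane of an 8-bit RGB cover image. Capacity handling and the residual itself are simplified.

    ```python
    import numpy as np
    import zlib

    def embed_lsb(rgb, payload: bytes):
        """Write `payload` (length-prefixed) into the least-significant bits of an 8-bit RGB image."""
        data = len(payload).to_bytes(4, "big") + payload
        bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
        flat = rgb.reshape(-1).copy()
        if bits.size > flat.size:
            raise ValueError("payload too large for this cover image")
        flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
        return flat.reshape(rgb.shape)

    def extract_lsb(rgb):
        """Recover the embedded byte payload from the LSB plane."""
        bits = rgb.reshape(-1) & 1
        length = int.from_bytes(np.packbits(bits[:32]).tobytes(), "big")
        body_bits = bits[32: 32 + 8 * length]
        return np.packbits(body_bits).tobytes()

    rng = np.random.default_rng(8)
    rgb = rng.integers(0, 256, size=(256, 256, 3), dtype=np.uint8)
    residual = rng.integers(-4, 5, size=(64, 64, 4), dtype=np.int8).tobytes()   # toy spectral residual
    stego = embed_lsb(rgb, zlib.compress(residual))
    assert zlib.decompress(extract_lsb(stego)) == residual
    print("embedded", len(zlib.compress(residual)), "bytes into the LSB plane of the cover image")
    ```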

  15. Multispectral scanner optical system

    NASA Technical Reports Server (NTRS)

    Stokes, R. C.; Koch, N. G. (Inventor)

    1980-01-01

    An optical system for use in a multispectral scanner of the type used in video imaging devices is disclosed. Electromagnetic radiation reflected by a rotating scan mirror is focused by a concave primary telescope mirror and collimated by a second concave mirror. The collimated beam is split by a dichroic filter which transmits radiant energy in the infrared spectrum and reflects visible and near infrared energy. The long wavelength beam is filtered and focused on an infrared detector positioned in a cryogenic environment. The short wavelength beam is dispersed by a pair of prisms, then projected on an array of detectors also mounted in a cryogenic environment and oriented at an angle relative to the optical path of the dispersed short wavelength beam.

  16. Multispectral Resource Sampler Workshop

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The utility of the multispectral resource sampler (MRS) was examined by users in the following disciplines: agriculture, atmospheric studies, engineering, forestry, geology, hydrology/oceanography, land use, and rangelands/soils. Modifications to the sensor design were recommended and the desired types of products and number of scenes required per month were indicated. The history, design, capabilities, and limitations of the MRS are discussed as well as the multilinear spectral array technology which it uses. Designed for small area inventory, the MRS can provide increased temporal, spectral, and spatial resolution, facilitate polarization measurement and atmospheric correction, and test onboard data compression techniques. The advantages of using it along with the thematic mapper are considered.


  17. Multispectral imaging radar

    NASA Technical Reports Server (NTRS)

    Porcello, L. J.; Rendleman, R. A.

    1972-01-01

    A side-looking radar, installed in a C-46 aircraft, was modified to provide it with an initial multispectral imaging capability. The radar is capable of radiating at either of two wavelengths, these being approximately 3 cm and 30 cm, with either horizontal or vertical polarization on each wavelength. Both the horizontally- and vertically-polarized components of the reflected signal can be observed for each wavelength/polarization transmitter configuration. At present, two-wavelength observation of a terrain region can be accomplished within the same day, but not with truly simultaneous observation on both wavelengths. A multiplex circuit to permit this simultaneous observation has been designed. A brief description of the modified radar system and its operating parameters is presented. Emphasis is then placed on initial flight test data and preliminary interpretation. Some considerations pertinent to the calibration of such radars are presented in passing.

  18. Scene segmentation from motion in multispectral imagery to aid automatic human gait recognition

    NASA Astrophysics Data System (ADS)

    Pearce, Daniel; Harvey, Christophe; Day, Simon; Goffredo, Michela

    2007-10-01

    Primarily focused on military and security environments where there is a need to identify humans covertly and remotely, this paper outlines how recovering human gait biometrics from a multi-spectral imaging system can overcome the failings of traditional biometrics to fulfil those needs. With the intention of aiding single camera human gait recognition, an algorithm was developed to accurately segment a walking human from multi-spectral imagery. 16-band imagery from the image replicating imaging spectrometer (IRIS) camera system is used to overcome some of the common problems associated with standard change detection techniques. Fusing the concepts of scene segmentation by spectral characterisation and background subtraction by image differencing gives a uniquely robust approach. This paper presents the results of real trials with human subjects and a prototype IRIS camera system, and compares performance to typical broadband camera systems.

  19. Multispectral Microimager for Astrobiology

    NASA Technical Reports Server (NTRS)

    Sellar, R. Glenn; Farmer, Jack D.; Kieta, Andrew; Huang, Julie

    2006-01-01

    A primary goal of the astrobiology program is the search for fossil records. The astrobiology exploration strategy calls for the location and return of samples indicative of environments conducive to life, and that best capture and preserve biomarkers. Successfully returning samples from environments conducive to life requires two primary capabilities: (1) in situ mapping of the mineralogy in order to determine whether the desired minerals are present; and (2) nondestructive screening of samples for additional in-situ testing and/or selection for return to laboratories for more in-depth examination. Two of the most powerful identification techniques are micro-imaging and visible/infrared spectroscopy. The design and test results are presented from a compact rugged instrument that combines micro-imaging and spectroscopic capability to provide in-situ analysis, mapping, and sample screening capabilities. Accurate reflectance spectra should be a measure of reflectance as a function of wavelength only. Other compact multispectral microimagers use separate LEDs (light-emitting diodes) for each wavelength and therefore vary the angles of illumination when changing wavelengths. When observing a specularly-reflecting sample, this produces grossly inaccurate spectra due to the variation in the angle of illumination. An advanced design and test results are presented for a multispectral microimager which demonstrates two key advances relative to previous LED-based microimagers: (i) acquisition of actual reflectance spectra in which the flux is a function of wavelength only, rather than a function of both wavelength and illumination geometry; and (ii) increase in the number of spectral bands to eight bands covering a spectral range of 468 to 975 nm.

  20. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-05

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
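
    As a rough cross-check of the figures quoted in this record (assuming the full 8-12 GHz sweep is used for ranging), the FMCW depth resolution follows ΔR = c/(2B), and a 200 ps time bin corresponds to about 6 cm of one-way path in free space; a minimal sketch:

      # Minimal sketch: consistency check of the FMCW figures quoted in the abstract.
      # Assumes the full X-band sweep (8-12 GHz) is used for ranging.
      C = 299_792_458.0          # speed of light, m/s

      bandwidth_hz = 12e9 - 8e9  # 4 GHz FMCW sweep
      time_res_s = 200e-12       # quoted time resolution

      range_res_m = C / (2.0 * bandwidth_hz)   # two-way FMCW depth resolution
      path_per_tick_m = C * time_res_s         # one-way free-space path per time bin

      print(f"FMCW depth resolution : {range_res_m * 100:.2f} cm")     # ~3.7 cm
      print(f"Path per 200 ps       : {path_per_tick_m * 100:.1f} cm") # ~6.0 cm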

  1. Time-resolved multispectral imaging of combustion reaction

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Fréderick

    2015-05-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber, or even to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder had been burnt to create a burst. It was then possible to estimate the temperature by modeling the spectral profile derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.

  2. Time-resolved multispectral imaging of combustion reactions

    NASA Astrophysics Data System (ADS)

    Huot, Alexandrine; Gagnon, Marc-André; Jahjah, Karl-Alexandre; Tremblay, Pierre; Savary, Simon; Farley, Vincent; Lagueux, Philippe; Guyot, Éric; Chamberland, Martin; Marcotte, Frédérick

    2015-10-01

    Thermal infrared imaging is a field of science that evolves rapidly. For years, scientists have used the simplest tool: thermal broadband cameras. These allow target characterization to be performed in both the longwave (LWIR) and midwave (MWIR) infrared spectral ranges. Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. For example, it can be used to follow combustion reactions, in order to characterize the injection and the ignition in a combustion chamber, or even to observe gases produced by a flare or smokestack. Most combustion gases, such as carbon dioxide (CO2), selectively absorb/emit infrared radiation at discrete energies, i.e. over a very narrow spectral range. Therefore, temperatures derived from broadband imaging are not reliable without prior knowledge of spectral emissivity. This information is not directly available from broadband images. However, spectral information is available using spectral filters. In this work, combustion analysis was carried out using a Telops MS-IR MW camera, which allows multispectral imaging at a high frame rate. A motorized filter wheel allowing synchronized acquisitions on eight (8) different channels was used to provide time-resolved multispectral imaging of the combustion products of a candle in which black powder had been burnt to create a burst. It was then possible to estimate the temperature by modeling spectral profiles derived from information obtained with the different spectral filters. Comparison with temperatures obtained using conventional broadband imaging illustrates the benefits of time-resolved multispectral imaging for the characterization of combustion processes.
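
    The temperature-estimation step mentioned above can be illustrated with a single-channel brightness-temperature inversion of Planck's law; the sketch below assumes calibrated spectral radiance and unit emissivity, and the band centres and radiance values are placeholders rather than values from the paper:

      import numpy as np

      # Physical constants (SI)
      H = 6.62607015e-34   # Planck constant, J s
      C = 2.99792458e8     # speed of light, m/s
      KB = 1.380649e-23    # Boltzmann constant, J/K

      def brightness_temperature(radiance, wavelength_m):
          """Invert Planck's law for spectral radiance L (W m^-2 sr^-1 m^-1)
          at a single wavelength, assuming emissivity = 1."""
          c1 = 2.0 * H * C**2 / wavelength_m**5
          c2 = H * C / (wavelength_m * KB)
          return c2 / np.log1p(c1 / radiance)

      # Hypothetical radiances from four midwave channels of a filter wheel
      wavelengths = np.array([3.0e-6, 3.5e-6, 4.0e-6, 4.5e-6])   # metres (assumed band centres)
      radiances  = np.array([2.1e9, 4.0e9, 5.5e9, 6.3e9])        # illustrative values only

      for lam, L in zip(wavelengths, radiances):
          print(f"{lam*1e6:.1f} um -> T_b ~ {brightness_temperature(L, lam):.0f} K")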

  3. MSS D Multispectral Scanner System

    NASA Technical Reports Server (NTRS)

    Lauletta, A. M.; Johnson, R. L.; Brinkman, K. L. (Principal Investigator)

    1982-01-01

    The development and acceptance testing of the 4-band Multispectral Scanners to be flown on LANDSAT D and LANDSAT D Earth resources satellites are summarized. Emphasis is placed on the acceptance test phase of the program. Test history and acceptance test algorithms are discussed. Trend data of all the key performance parameters are included and discussed separately for each of the two multispectral scanner instruments. Anomalies encountered and their resolutions are included.

  4. Red to far-red multispectral fluorescence image fusion for detection of fecal contamination on apples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This research developed a multispectral algorithm derived from hyperspectral line-scan fluorescence imaging under violet/blue LED excitation for detection of fecal contamination on Golden Delicious apples. Using a hyperspectral line-scan imaging system consisting of an EMCCD camera, spectrograph, an...

  5. Uniqueness in multispectral constant-wave epi-illumination imaging.

    PubMed

    Garcia-Allende, P B; Radrich, K; Symvoulidis, P; Glatz, J; Koch, M; Jentoft, K M; Ripoll, J; Ntziachristos, V

    2016-07-01

    Multispectral tissue imaging based on optical cameras and continuous-wave tissue illumination is commonly used in medicine and biology. Surprisingly, there is a characteristic absence of a critical look at the quantities that can be uniquely characterized from optically diffuse matter by multispectral imaging. Here, we investigate the fundamental question of uniqueness in epi-illumination measurements from turbid media obtained at multiple wavelengths. By utilizing an analytical model, tissue-mimicking phantoms, and an in vivo imaging experiment we show that independent of the bands employed, spectral measurements cannot uniquely retrieve absorption and scattering coefficients. We also establish that it is, nevertheless, possible to uniquely quantify oxygen saturation and the Mie scattering power, a previously undocumented uniqueness condition. PMID:27367111
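
    The oxygen-saturation result can be illustrated with a simple two-chromophore least-squares fit; the extinction coefficients and absorption values below are placeholder numbers for illustration only, not those used in the study:

      import numpy as np

      # Placeholder molar extinction spectra for oxy- and deoxy-haemoglobin at four
      # illustrative wavelengths (cm^-1 / M); real values would come from tabulated data.
      eps_hbo2 = np.array([290.0, 610.0, 1058.0, 1050.0])
      eps_hb   = np.array([1540.0, 880.0, 700.0, 1310.0])

      # Hypothetical absorption coefficients recovered at those wavelengths
      mu_a = np.array([0.9, 0.7, 0.8, 1.1])

      # Solve mu_a(lambda) ~= c_hbo2 * eps_hbo2(lambda) + c_hb * eps_hb(lambda)
      A = np.column_stack([eps_hbo2, eps_hb])
      (c_hbo2, c_hb), *_ = np.linalg.lstsq(A, mu_a, rcond=None)

      so2 = c_hbo2 / (c_hbo2 + c_hb)    # oxygen saturation from relative concentrations
      print(f"Estimated SO2: {so2:.2f}")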

  6. Multispectral Remote Sensing at the Savannah River Plant

    SciTech Connect

    Shines, J.E.; Tinney, L.R.; Hawley, D.L.

    1984-01-01

    Aerial Measurements Operations (AMO) is the remote sensing arm of the Department of Energy (DOE). The purpose of AMO is to provide timely, accurate, and cost-effective remote sensing data on a non-interference basis over DOE facilities located around the country. One of the programs administered by AMO is the Comprehensive Integrated Remote Sensing (CIRS) program, which involves the use of a wide range of data acquisition systems - aerial cameras, multispectral and infrared scanners, and nuclear detectors - to acquire data at DOE sites. The data are then processed, analyzed and interpreted to provide useful information, which is then catalogued into a database for future use. This report describes some of the data acquisition and analysis capabilities of the Multispectral Remote Sensing Department (MRSD) as they relate to the CIRS program. 3 tables.

  7. Fourth Airborne Geoscience Workshop

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The focus of the workshop was on how the airborne community can assist in achieving the goals of the Global Change Research Program. The many activities that employ airborne platforms and sensors were discussed: platforms and instrument development; airborne oceanography; lidar research; SAR measurements; Doppler radar; laser measurements; cloud physics; airborne experiments; airborne microwave measurements; and airborne data collection.

  8. Space Camera

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Nikon's F3 35mm camera was specially modified for use by Space Shuttle astronauts. The modification work produced a spinoff lubricant. Because lubricants in space have a tendency to migrate within the camera, Nikon conducted extensive development to produce nonmigratory lubricants; variations of these lubricants are used in the commercial F3, giving it better performance than conventional lubricants. Another spinoff is the coreless motor which allows the F3 to shoot 140 rolls of film on one set of batteries.

  9. An interactive lake survey program. [airborne multispectral sensor image processing

    NASA Technical Reports Server (NTRS)

    Smith, A. Y.

    1977-01-01

    Consideration is given to the development and operation of the interactive lake survey program developed by the Jet Propulsion Laboratory and the Environmental Protection Agency. The program makes it possible to locate, isolate, and store any number of water bodies on the basis of a given digital image. The stored information may be used to generate statistical analyses of each body of water including the lake surface area and the shoreline perimeter. The hardware includes a 360/65 host computer, a Ramtek G100B display controller, and a trackball cursor. The system is illustrated by the LAKELOC operation as it would be applied to a Landsat scene, noting the FARINA and STATUS programs. The water detection algorithm, which increases the accuracy with which water and land data may be separated, is discussed.
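
    The per-lake statistics described above (surface area and shoreline perimeter from a detected water body) can be sketched as follows; the binary mask, pixel size, and function names are illustrative assumptions, not the FARINA or STATUS implementation:

      import numpy as np

      def lake_statistics(water_mask, pixel_size_m):
          """Surface area and an approximate shoreline perimeter from a binary
          water mask (True = water). Perimeter is counted as exposed pixel edges."""
          mask = water_mask.astype(bool)
          area_m2 = mask.sum() * pixel_size_m ** 2

          # Pad so that lakes touching the array border still contribute edges.
          padded = np.pad(mask, 1, constant_values=False)
          exposed = 0
          for shift_axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
              neighbour = np.roll(padded, shift, axis=shift_axis)
              exposed += np.logical_and(padded, ~neighbour).sum()
          perimeter_m = exposed * pixel_size_m
          return area_m2, perimeter_m

      # Toy 6x6 scene with a single small lake, 80 m Landsat MSS-like pixels
      mask = np.zeros((6, 6), dtype=bool)
      mask[2:5, 2:4] = True
      area, perim = lake_statistics(mask, pixel_size_m=80.0)
      print(f"area ~ {area/1e4:.1f} ha, shoreline ~ {perim:.0f} m")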

  10. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    USGS Publications Warehouse

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently shadowed and sunlit polar regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  11. Multispectral bilateral video fusion.

    PubMed

    Bennett, Eric P; Mason, John L; McMillan, Leonard

    2007-05-01

    We present a technique for enhancing underexposed visible-spectrum video by fusing it with simultaneously captured video from sensors in nonvisible spectra, such as Short Wave IR or Near IR. Although IR sensors can accurately capture video in low-light and night-vision applications, they lack the color and relative luminances of visible-spectrum sensors. RGB sensors do capture color and correct relative luminances, but are underexposed, noisy, and lack fine features due to short video exposure times. Our enhanced fusion output is a reconstruction of the RGB input assisted by the IR data, not an incorporation of elements imaged only in IR. With a temporal noise reduction, we first remove shot noise and increase the color accuracy of the RGB footage. The IR video is then normalized to ensure cross-spectral compatibility with the visible-spectrum video using ratio images. To aid fusion, we decompose the video sources with edge-preserving filters. We introduce a multispectral version of the bilateral filter called the "dual bilateral" that robustly decomposes the RGB video. It utilizes the less-noisy IR for edge detection but also preserves strong visible-spectrum edges not in the IR. We fuse the RGB low frequencies, the IR texture details, and the dual bilateral edges into a noise-reduced video with sharp details, correct chrominances, and natural relative luminances. PMID:17491451
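
    A brute-force sketch of the dual bilateral idea is given below: the range weight is a product of kernels over both the IR and the visible intensity differences, so edges present in either band suppress smoothing across them. Window size, sigmas, and the toy data are assumptions; the paper's full pipeline (temporal denoising, chrominance handling, layer fusion) is omitted:

      import numpy as np

      def dual_bilateral(rgb_lum, ir, radius=3, sigma_s=2.0, sigma_ir=0.05, sigma_rgb=0.15):
          """Sketch of a 'dual bilateral' decomposition: the range weight is the product
          of kernels over both the IR and the visible intensity differences, so edges
          present in either band are preserved (parameter values are assumptions)."""
          h, w = rgb_lum.shape
          pad = radius
          lum_p = np.pad(rgb_lum, pad, mode="edge")
          ir_p = np.pad(ir, pad, mode="edge")

          # Precompute the spatial Gaussian over the window
          yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
          w_spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2))

          out = np.zeros_like(rgb_lum)
          for y in range(h):
              for x in range(w):
                  win_lum = lum_p[y:y + 2*radius + 1, x:x + 2*radius + 1]
                  win_ir = ir_p[y:y + 2*radius + 1, x:x + 2*radius + 1]
                  w_ir = np.exp(-((win_ir - ir_p[y + pad, x + pad])**2) / (2 * sigma_ir**2))
                  w_rgb = np.exp(-((win_lum - lum_p[y + pad, x + pad])**2) / (2 * sigma_rgb**2))
                  weights = w_spatial * w_ir * w_rgb
                  out[y, x] = (weights * win_lum).sum() / weights.sum()
          return out  # low-frequency layer; detail layer = rgb_lum - out

      # Toy example: noisy visible luminance with an edge that is sharp in the IR frame
      rng = np.random.default_rng(0)
      ir = np.zeros((32, 32))
      ir[:, 16:] = 1.0
      rgb_lum = ir * 0.6 + 0.2 + rng.normal(0, 0.1, ir.shape)
      base = dual_bilateral(rgb_lum, ir)
      detail = rgb_lum - base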

  12. Summaries of the Seventh JPL Airborne Earth Science Workshop January 12-16, 1998. Volume 1; AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1998-01-01

    This publication contains the summaries for the Seventh JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 12-16, 1998. The main workshop is divided into three smaller workshops, and each workshop has a volume as follows: (1) Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) Workshop; (2) Airborne Synthetic Aperture Radar (AIRSAR) Workshop; and (3) Thermal Infrared Multispectral Scanner (TIMS) Workshop. This Volume 1 publication contains 58 papers taken from the AVIRIS workshop.

  13. Compact multispectral imaging system for contaminant detection on poultry carcass

    NASA Astrophysics Data System (ADS)

    Kise, Michio; Park, Bosoon; Lawrence, Kurt C.; Windham, William R.

    2007-02-01

    The objective of this research was to design and fabricate a compact, cost-effective multispectral instrument and to collect and analyze spectra for real-time contaminant detection in poultry processing plants. The prototype system developed in this research consisted of a multispectral imaging system, an illumination system and an industrial portable computer. The dual-band spectral imaging system developed in this study was a two-port imaging system that consisted of two identical monochrome cameras, an optical system and two narrow bandpass filters whose center wavelengths are 520 and 560 nm with 10 nm FWHM, respectively. The spectral reflectance from a chicken carcass was collected and split in two directions by an optical system including a beamsplitter and lenses, and then two identical collimated beams were filtered by the narrow bandpass filters and delivered to the cameras. Lens distortions and geometric misalignment of the two cameras were mathematically corrected. The prototype system was tested at a real-time processing line and the preliminary results showed that the dual-band spectral imaging system could effectively detect feces and ingesta on the surface of poultry carcasses.
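
    One simple per-pixel test consistent with the dual-band design described above is a 560/520 nm band ratio with a threshold; the threshold and reflectance values below are illustrative assumptions, not the calibrated values of the prototype:

      import numpy as np

      def contaminant_mask(band_520, band_560, ratio_threshold=1.05, min_reflectance=0.05):
          """Sketch of a dual-band test: flag pixels whose 560/520 nm ratio exceeds a
          threshold, ignoring dark background pixels. The thresholds are illustrative
          assumptions, not the calibrated values used on the prototype."""
          b520 = band_520.astype(float)
          b560 = band_560.astype(float)
          valid = b520 > min_reflectance                     # drop background/dark pixels
          ratio = np.divide(b560, b520, out=np.zeros_like(b560), where=valid)
          return np.logical_and(valid, ratio > ratio_threshold)

      # Toy 4x4 reflectance images: one 'contaminated' pixel with an elevated 560/520 ratio
      b520 = np.full((4, 4), 0.30)
      b560 = np.full((4, 4), 0.28)
      b560[2, 1] = 0.45
      print(contaminant_mask(b520, b560))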

  14. Estimating evapotranspiration of riparian vegetation using high resolution multispectral, thermal infrared and lidar data

    NASA Astrophysics Data System (ADS)

    Neale, Christopher M. U.; Geli, Hatim; Taghvaeian, Saleh; Masih, Ashish; Pack, Robert T.; Simms, Ronald D.; Baker, Michael; Milliken, Jeff A.; O'Meara, Scott; Witherall, Amy J.

    2011-11-01

    High resolution airborne multispectral and thermal infrared imagery was acquired over the Mojave River, California with the Utah State University airborne remote sensing system integrated with the LASSI imaging Lidar, also built and operated at USU. The data were acquired in pre-established mapping blocks over a 2 day period covering approximately 144 km of the Mojave River floodplain and riparian zone, approximately 1500 meters in width. The multispectral imagery (green, red and near-infrared bands) was ortho-rectified using the Lidar point cloud data through a direct geo-referencing technique. Thermal infrared imagery was rectified to the multispectral ortho-mosaics. The lidar point cloud data was classified to separate ground surface returns from vegetation returns as well as structures such as buildings, bridges etc. One-meter DEMs were produced from the surface returns, along with vegetation canopy height, also on 1-meter grids. Two surface energy balance models that use remote sensing inputs were applied to the high resolution imagery, namely SEBAL and the Two Source Model. The model parameterizations were slightly modified to accept high resolution imagery (1-meter) as well as the lidar-based vegetation height product, which was used to estimate the aerodynamic roughness length. Both models produced very similar results in terms of latent heat fluxes (LE). Instantaneous LE values were extrapolated to daily evapotranspiration rates (ET) using the reference ET fraction, with data obtained from a local weather station. Seasonal rates were obtained by extrapolating the reference ET fraction according to the seasonal growth habits of the different species. Vegetation species distribution and area were obtained from classification of the multispectral imagery. Results indicate that cottonwood and salt cedar (tamarisk) had the highest evapotranspiration rates followed by mesophytes, arundo, mesquite and desert shrubs. This research showed that high
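
    The extrapolation from instantaneous latent heat flux to daily ET via the reference ET fraction can be sketched as below; the latent-heat constant and the sample numbers are assumptions for illustration, not values from the study:

      # Sketch of the reference-ET-fraction extrapolation described above (constants and
      # the sample numbers are assumptions for illustration, not values from the study).
      LATENT_HEAT = 2.45e6          # J per kg of water vaporised (approximate)

      def instantaneous_et_mm_per_hr(le_w_m2):
          """Convert an instantaneous latent heat flux (W m^-2) to mm of water per hour."""
          return le_w_m2 * 3600.0 / LATENT_HEAT     # 1 kg m^-2 of water = 1 mm depth

      def daily_et_mm(le_inst_w_m2, etr_inst_mm_hr, etr_daily_mm):
          """Extrapolate instantaneous LE to daily ET using the reference ET fraction."""
          et_inst = instantaneous_et_mm_per_hr(le_inst_w_m2)
          etrf = et_inst / etr_inst_mm_hr           # reference ET fraction at overpass time
          return etrf * etr_daily_mm

      # Example: LE = 450 W m^-2 at overpass, weather-station reference ET of
      # 0.8 mm/hr at that time and 8.5 mm for the day.
      print(f"ETd ~ {daily_et_mm(450.0, 0.8, 8.5):.1f} mm/day")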

  15. Infrared Camera

    NASA Technical Reports Server (NTRS)

    1997-01-01

    A sensitive infrared camera that observes the blazing plumes from the Space Shuttle or expendable rocket lift-offs is capable of scanning for fires, monitoring the environment and providing medical imaging. The hand-held camera uses highly sensitive arrays in infrared photodetectors known as quantum well infrared photo detectors (QWIPS). QWIPS were developed by the Jet Propulsion Laboratory's Center for Space Microelectronics Technology in partnership with Amber, a Raytheon company. In October 1996, QWIP detectors pointed out hot spots of the destructive fires speeding through Malibu, California. Night vision, early warning systems, navigation, flight control systems, weather monitoring, security and surveillance are among the duties for which the camera is suited. Medical applications are also expected.

  16. Multi-spectral imaging with infrared sensitive organic light emitting diode.

    PubMed

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R; So, Franky

    2014-01-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions. PMID:25091589

  17. Multi-spectral imaging with infrared sensitive organic light emitting diode

    NASA Astrophysics Data System (ADS)

    Kim, Do Young; Lai, Tzung-Han; Lee, Jae Woong; Manders, Jesse R.; So, Franky

    2014-08-01

    Commercially available near-infrared (IR) imagers are fabricated by integrating expensive epitaxially grown III-V compound semiconductor sensors with Si-based readout integrated circuits (ROIC) by indium bump bonding, which significantly increases the fabrication costs of these image sensors. Furthermore, these typical III-V compound semiconductors are not sensitive to the visible region and thus cannot be used for multi-spectral (visible to near-IR) sensing. Here, a low cost infrared (IR) imaging camera is demonstrated with a commercially available digital single-lens reflex (DSLR) camera and an IR sensitive organic light emitting diode (IR-OLED). With an IR-OLED, IR images at a wavelength of 1.2 µm are directly converted to visible images which are then recorded in a Si-CMOS DSLR camera. This multi-spectral imaging system is capable of capturing images at wavelengths in the near-infrared as well as visible regions.

  18. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, D.R.; Platzbecker, M.R.; Vargo, T.D.; Lockhart, R.R.; Descour, M.R.; Richards-Kortum, R.

    1999-07-06

    A multispectral imaging method and apparatus are described which are adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging. 5 figs.

  19. Multispectral imaging method and apparatus

    DOEpatents

    Sandison, David R.; Platzbecker, Mark R.; Vargo, Timothy D.; Lockhart, Randal R.; Descour, Michael R.; Richards-Kortum, Rebecca

    1999-01-01

    A multispectral imaging method and apparatus adapted for use in determining material properties, especially properties characteristic of abnormal non-dermal cells. A target is illuminated with a narrow band light beam. The target expresses light in response to the excitation. The expressed light is collected and the target's response at specific response wavelengths to specific excitation wavelengths is measured. From the measured multispectral response the target's properties can be determined. A sealed, remote probe and robust components can be used for cervical imaging.

  20. CCD Camera

    DOEpatents

    Roth, R.R.

    1983-08-02

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other. 7 figs.

  1. CCD Camera

    DOEpatents

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  2. Nikon Camera

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Nikon FM compact has a simplification feature derived from cameras designed for easy yet accurate use in a weightless environment. The innovation is a plastic-cushioned advance lever which advances the film and simultaneously switches on a built-in light meter. With a turn of the lens aperture ring, a glowing signal in the viewfinder confirms correct exposure.

  3. New uses for the Zeiss KS-153A camera system

    NASA Astrophysics Data System (ADS)

    Spiller, Rudolf H.

    1995-09-01

    The Zeiss KS-153A aerial reconnaissance framing camera complements satellite, mapping, and remote sensor data with imagery that is geometrically correct. KS-153A imagery is in a format for tactical 3-D mapping, targeting, and high-resolution intelligence data collection. This system is based upon rugged microprocessor technology that allows a wide variety of mission parameters. Geometrically correct horizon-to-horizon photography, multi-spectral mine detection, stand-off photography, NIRS nine high speed, and very low altitude anti-terrorist surveillance are KS-153A capabilities that have been proven in tests and actual missions. Civilian use of the KS-153A has ranged from measuring flood levels to mapping forest infestations. These are everyday tasks for the KS-153A throughout the world. Zeiss optics have superb spectral response and resolution. Surprisingly effective haze penetration was shown on a day when the pilot himself could not see the terrain. Tests with CCD arrays have also produced outstanding results. This superb spectral response can be used for camouflage detection in wartime, or used for effective environmental control in peacetime, with its ability to detect subtle changes in the signature of vegetation, calling attention to man-induced stress such as disease, drought, and pollution. One serious man-induced problem in many parts of the world deserves even more attention in these times: the locating and safe removal of mines. The KS-153A is currently configured with four different optics. High acuity horizon-to-horizon Pentalens and Multi-spectral Lens (MUC) modules have been added to the basic KS-153A with Trilens and Telelens. This modular concept nearly meets all of today's airborne reconnaissance requirements. Modern recce programs, for example German Air Force Recce Tornado (GAF Recce), have selected the KS-153A. By simply adding additional focal length lens assemblies to an existing KS-153A configuration, the user can instantly and economically adapt

  4. Multispectral Analysis of Indigenous Rock Art Using Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Skoog, B.; Helmholz, P.; Belton, D.

    2016-06-01

    Multispectral analysis is a widely used technique in the photogrammetric and remote sensing industry. The use of Terrestrial Laser Scanning (TLS) in combination with imagery is becoming increasingly common, with its applications spreading to a wider range of fields. Both systems benefit from being non-contact techniques that can be used to accurately capture data regarding the target surface. Although multispectral analysis is actively performed within the spatial sciences field, its extent of application within an archaeological context has been limited. This study aims to apply these commonly used multispectral techniques to a remote Indigenous site that contains an extensive gallery of aging rock art. The ultimate goal for this research is the development of a systematic procedure that could be applied to numerous similar sites for the purpose of heritage preservation and research. The study consisted of extensive data capture of the rock art gallery using two different TLS systems and a digital SLR camera. The data was combined into a common 2D reference frame that allowed standard image processing to be applied. An unsupervised k-means classifier was applied to the multiband images to detect the different types of rock art present. The result was unsatisfactory as the subsequent classification accuracy was relatively low. The procedure and technique do, however, show potential, and further testing with different classification algorithms could possibly improve the result significantly.
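
    The unsupervised classification step described above can be sketched as a k-means clustering of the per-pixel multiband vectors; the band stack, number of classes, and data are illustrative assumptions:

      import numpy as np
      from sklearn.cluster import KMeans

      def classify_rock_art(bands, n_classes=4, random_state=0):
          """Cluster each co-registered pixel by its multiband vector.
          bands: list of co-registered 2-D arrays (e.g. TLS intensity + RGB).
          Returns a per-pixel class label image."""
          stack = np.dstack(bands)                           # H x W x B
          h, w, b = stack.shape
          pixels = stack.reshape(-1, b).astype(float)
          labels = KMeans(n_clusters=n_classes, n_init=10,
                          random_state=random_state).fit_predict(pixels)
          return labels.reshape(h, w)

      # Toy data: three synthetic 2-D "bands"
      rng = np.random.default_rng(1)
      bands = [rng.random((50, 50)) for _ in range(3)]
      label_image = classify_rock_art(bands)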

  5. Real-time multispectral imaging application for poultry safety inspection

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Lawrence, Kurt C.; Windham, William R.; Snead, Matthew P.

    2006-02-01

    The ARS imaging research group in Athens, Georgia has developed a real-time multispectral imaging system for fecal and ingesta contaminant detection on broiler carcasses for the poultry industry. The industrial-scale system includes a common aperture camera with three visible wavelength optical trim filters. This paper demonstrates calibration of the common aperture multispectral imaging hardware and real-time image processing software. For the software design, the Unified Modeling Language (UML) design approach was used to develop real-time image processing software for on-line application. The UML models, including class, object, activity, sequence, and collaboration diagrams, are presented. Both hardware and software for real-time fecal and ingesta contaminant detection were tested at the pilot-scale poultry processing line. The test results of the industrial-scale real-time system showed that the multispectral imaging technique performed well for detecting fecal contaminants at a commercial processing speed (currently 140 birds per minute). The accuracy for the detection of fecal and ingesta contaminants was approximately 96%.

  6. Multispectral slice of APXS

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Portions of Sojourner's Alpha Proton X-Ray Spectrometer (APXS), a deployment spring, and the rock Barnacle Bill are visible in this color image. The image was taken by Sojourner's rear camera, and shows that the APXS made good contact with Barnacle Bill.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  7. Multispectral Landsat images of Antarctica

    SciTech Connect

    Lucchitta, B.K.; Bowell, J.A.; Edwards, K.L.; Eliason, E.M.; Fergurson, H.M.

    1988-01-01

    The U.S. Geological Survey has a program to map Antarctica by using colored, digitally enhanced Landsat multispectral scanner images to increase existing map coverage and to improve upon previously published Landsat maps. This report is a compilation of images and image mosaics that cover four complete and two partial 1:250,000-scale quadrangles of the McMurdo Sound region.

  8. Multispectral fluorescence imaging of atherosclerosis

    SciTech Connect

    Davenport, C.M.C.

    1992-01-01

    Multispectral fluorescence imaging is a new diagnostic technique with the potential to provide improved detection and classification of atherosclerotic disease. This technique involves imaging the fluorescence response of a tissue region through a tunable band-pass filtering device. The result is a set of images in which each individual image is composed of the fluorescence emission within a specified band of wavelengths. Multispectral imaging combined with angioscopic technology allows direct access to important spectral information and spatial attributes, providing the potential for more informed clinical decisions about which, if any, treatment modality is indicated. In this dissertation, the system requirements for an angioscopic system with multispectral imaging capability are identified. This analysis includes a description of the necessary optical components and their characteristics as well as the experimental determination of spectral radiance values for the fluorescence response of human aorta specimens and the estimation of anticipated signal-to-noise ratios for the spectral images. Other issues investigated include the number of spectral images required to provide good classification potential and the best normalization method to be utilized. Finally, the potential utility of the information contained within a multispectral data set is demonstrated. Two methods of utilizing the multispectral data are presented. The first method involves generating a ratio-image from the ratio of the intensities of two spectrally filtered images. The second method consists of using histologically verified training data to train a projector and then applying that projector to a set of spectral images. The result is an image with improved contrast. White-light images (generated using an incandescent light source), total-fluorescence images (the fluorescence response without spectral filtering), ratio-images, and optimized contrast images are compared.

  9. Application of multispectral color photography to flame flow visualization

    NASA Technical Reports Server (NTRS)

    Stoffers, G.

    1979-01-01

    For flames of short duration and low intensity of radiation, spectroscopic flame diagnostics are difficult. In order to find some other means of extracting information about the flame structure from its radiation, the feasibility of using multispectral color photography was successfully evaluated. Since the flame photographs are close-ups, there is a considerable parallax between the single images when several cameras are used, and additive color viewing is not possible. Each image must be analyzed individually, so it is advisable to use color film in all cameras. One can either use color films of different spectral sensitivities or color films of the same type with different color filters. Sharp cutting filters are recommended.

  10. Airborne thermography applications in Argentina

    NASA Astrophysics Data System (ADS)

    Castro, Eduardo H.; Selles, Eduardo J.; Costanzo, Marcelo; Franco, Oscar; Diaz, Jose

    2002-03-01

    Forest fires in summer and sheep buried under the snow in winter have become important problems in the south of our country, in the region named Patagonia. We are working to find a solution by means of an airborne imaging system whose construction we have just finished. It is a 12 channel multispectral airborne scanner system that can be mounted in a Guarani airplane or in a Learjet; the first is a non-pressurized aircraft for flight at low height and the second is a pressurized one for higher flights. The scanner system is briefly described. Its sensors can detect radiation from the ultraviolet to the thermal infrared. The images are visualized in real time on a monitor screen and can be stored on the hard disk of the PC for later processing. The scanner is now being applied to tasks that include the prevention and fighting of forest fires and the study of the possibility of detecting sheep under snow in Patagonia. Theoretical and experimental results in fire detection and a theoretical model for studying the possibility of detection of the buried sheep are presented.

  11. Mapping giant reed along the Rio Grande using airborne and satellite imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Giant reed (Arundo donax L.) is a perennial invasive weed that presents a severe threat to agroecosystems and riparian areas in the Texas and Mexican portions of the Rio Grande Basin. The objective of this presentation is to give an overview on the use of aerial photography, airborne multispectral a...

  12. Daily evapotranspiration estimates from extrapolating instantaneous airborne remote sensing ET values

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, six extrapolation methods have been compared for their ability to estimate daily crop evapotranspiration (ETd) from instantaneous latent heat flux estimates derived from digital airborne multispectral remote sensing imagery. Data used in this study were collected during an experiment...

  13. Multispectral polarized scene projector (MPSP)

    NASA Astrophysics Data System (ADS)

    Yu, Haiping; Wei, Hong; Guo, Lei; Wang, Shenggang; Li, Le; Lippert, Jack R.; Serati, Steve; Gupta, Neelam; Carlen, Frank R.

    2011-06-01

    This newly developed prototype Multispectral Polarized Scene Projector (MPSP), configured for the short wave infrared (SWIR) regime, can be used for the test & evaluation (T&E) of spectro-polarimetric imaging sensors. The MPSP system generates both static and video images (up to 200 Hz) with 512×512 spatial resolution with active spatial, spectral, and polarization modulation with controlled bandwidth. It projects input SWIR radiant intensity scenes from stored memory with user selectable wavelength (850-1650 nm) and bandwidth (12-100 nm), as well as polarization states (six different states) controllable on a pixel by pixel basis. The system consists of one spectrally tunable liquid crystal filter with variable bandpass, and multiple liquid crystal on silicon (LCoS) spatial light modulators (SLMs) for intensity control and polarization modulation. In addition to spectro-polarimetric sensor testing, the instrument also simulates polarized multispectral images of military scenes/targets for hardware-in-the-loop (HIL) testing.

  14. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data

    PubMed Central

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to find out the importance of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover data types accurately. Multispectral data of these land covers were acquired by using a handheld device named a multispectral radiometer in the form of five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features for each image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). By implementing a cross validation method (80-20), we obtained an accuracy of 91.332% for texture data and 96.40% for multispectral data, respectively. PMID:27376088
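
    The classification stage described above (an 80-20 split and a feed-forward network over the selected features) can be sketched as follows; the synthetic data, network size, and resulting accuracy are placeholders, not the study's data or results:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Placeholder data standing in for the 30 selected features and 5 land cover classes
      rng = np.random.default_rng(0)
      n_samples, n_features, n_classes = 500, 30, 5
      X = rng.normal(size=(n_samples, n_features))
      y = rng.integers(0, n_classes, size=n_samples)

      # 80-20 split as in the study's cross validation scheme
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.2, random_state=0, stratify=y)

      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000,
                                        random_state=0))
      clf.fit(X_train, y_train)
      print(f"held-out accuracy: {clf.score(X_test, y_test):.3f}")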

  15. A Comparative Study of Land Cover Classification by Using Multispectral and Texture Data.

    PubMed

    Qadri, Salman; Khan, Dost Muhammad; Ahmad, Farooq; Qadri, Syed Furqan; Babar, Masroor Ellahi; Shahid, Muhammad; Ul-Rehman, Muzammil; Razzaq, Abdul; Shah Muhammad, Syed; Fahad, Muhammad; Ahmad, Sarfraz; Pervez, Muhammad Tariq; Naveed, Nasir; Aslam, Naeem; Jamil, Mutiullah; Rehmani, Ejaz Ahmad; Ahmad, Nazir; Akhtar Khan, Naeem

    2016-01-01

    The main objective of this study is to find out the importance of a machine vision approach for the classification of five types of land cover data: bare land, desert rangeland, green pasture, fertile cultivated land, and Sutlej river land. A novel spectra-statistical framework is designed to classify these land cover data types accurately. Multispectral data of these land covers were acquired by using a handheld device named a multispectral radiometer in the form of five spectral bands (blue, green, red, near infrared, and shortwave infrared), while texture data were acquired with a digital camera by transforming the acquired images into 229 texture features for each image. The 30 most discriminant features of each image were obtained by integrating three statistical feature selection techniques: Fisher, Probability of Error plus Average Correlation, and Mutual Information (F + PA + MI). The clustering of the selected texture data was verified by nonlinear discriminant analysis, while a linear discriminant analysis approach was applied to the multispectral data. For classification, the texture and multispectral data were fed to an artificial neural network (ANN: n-class). By implementing a cross validation method (80-20), we obtained an accuracy of 91.332% for texture data and 96.40% for multispectral data, respectively. PMID:27376088

  17. Multispectral Microscopic Imager (MMI): Multispectral Imaging of Geological Materials at a Handlens Scale

    NASA Astrophysics Data System (ADS)

    Farmer, J. D.; Nunez, J. I.; Sellar, R. G.; Gardner, P. B.; Manatt, K. S.; Dingizian, A.; Dudik, M. J.; McDonnell, G.; Le, T.; Thomas, J. A.; Chu, K.

    2011-12-01

    The Multispectral Microscopic Imager (MMI) is a prototype instrument presently under development for future astrobiological missions to Mars. The MMI is designed to be an arm-mounted rover instrument for use in characterizing the microtexture and mineralogy of materials along geological traverses [1,2,3]. Such geological information is regarded as essential for interpreting petrogenesis and geological history, and when acquired in near real-time, can support hypothesis-driven exploration and optimize science return. Correlated microtexture and mineralogy also provide essential data for selecting samples for analysis with onboard lab instruments, and for prioritizing samples for potential Earth return. The MMI design employs multispectral light-emitting diodes (LEDs) and an uncooled focal plane array to achieve the low mass (<1 kg), low cost, and high reliability (no moving parts) required for an arm-mounted instrument on a planetary rover [2,3]. The MMI acquires multispectral reflectance images at 62 μm/pixel, in which each image pixel comprises a 21-band VNIR spectrum (0.46 to 1.73 μm). This capability enables the MMI to discriminate and resolve the spatial distribution of minerals and textures at the microscale [2, 3]. By extending the spectral range into the infrared, and increasing the number of spectral bands, the MMI exceeds the capabilities of current microimagers, including the MER Microscopic Imager (MI; 4), the Phoenix mission Robotic Arm Camera (RAC; 5), and the Mars Science Laboratory's Mars Hand Lens Imager (MAHLI; 6). In this report we will review the capabilities of the MMI by highlighting recent lab and field applications, including: 1) glove box deployments in the Astromaterials lab at Johnson Space Center to analyze Apollo lunar samples; 2) GeoLab glove box deployments during the 2011 Desert RATS field trials in northern AZ to characterize analog materials collected by astronauts during simulated EVAs; 3) field deployments on Mauna Kea

  18. Multispectral determination of vegetative cover in corn crop canopy

    NASA Technical Reports Server (NTRS)

    Stoner, E. R.; Baumgardner, M. F.

    1972-01-01

    The relationship between different amounts of vegetative ground cover and the energy reflected by corn canopies was investigated. Low altitude photography and an airborne multispectral scanner were used to measure this reflected energy. Field plots were laid out, representing four growth stages of corn. Two plot locations were chosen, one on a very dark and one on a very light surface soil. Color and color infrared photographs were taken from a vertical distance of 10 m. Estimates of ground cover were made from these photographs and were related to field measurements of leaf area index. Ground cover could be predicted from leaf area index measurements by a second order equation. Microdensitometry and digitization of the three separated dye layers of color infrared film showed that the near infrared dye layer is most valuable in ground cover determinations. Computer analysis of the digitized photography provided an accurate method of determining percent ground cover.
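
    The second-order relationship between leaf area index and percent ground cover mentioned above can be illustrated with a quadratic least-squares fit; the sample values below are assumptions, not data from the experiment:

      import numpy as np

      # Illustrative leaf area index and percent ground cover pairs (assumed values)
      lai   = np.array([0.3, 0.8, 1.5, 2.4, 3.2, 4.0])      # leaf area index
      cover = np.array([8.0, 22.0, 43.0, 66.0, 82.0, 93.0]) # percent ground cover

      coeffs = np.polyfit(lai, cover, deg=2)     # [a, b, c] for a*LAI^2 + b*LAI + c
      predict = np.poly1d(coeffs)
      print(f"predicted cover at LAI = 2.0: {predict(2.0):.1f} %")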

  19. Gimbaled multispectral imaging system and method

    DOEpatents

    Brown, Kevin H.; Crollett, Seferino; Henson, Tammy D.; Napier, Matthew; Stromberg, Peter G.

    2016-01-26

    A gimbaled multispectral imaging system and method are described herein. In a general embodiment, the gimbaled multispectral imaging system has a cross support that defines a first gimbal axis and a second gimbal axis, wherein the cross support is rotatable about the first gimbal axis. The gimbaled multispectral imaging system comprises a telescope that is fixed to an upper end of the cross support, such that rotation of the cross support about the first gimbal axis causes the tilt of the telescope to change. The gimbaled multispectral imaging system includes optics that facilitate on-gimbal detection of visible light and off-gimbal detection of infrared light.

  20. Registration of 3D and multispectral data for the study of cultural heritage surfaces.

    PubMed

    Chane, Camille Simon; Schütze, Rainer; Boochs, Frank; Marzani, Franck S

    2013-01-01

    We present a technique for the multi-sensor registration of featureless datasets based on the photogrammetric tracking of the acquisition systems in use. This method is developed for the in situ study of cultural heritage objects and is tested by digitizing a small canvas successively with a 3D digitization system and a multispectral camera while simultaneously tracking the acquisition systems with four cameras and using a cubic target frame with a side length of 500 mm. The achieved tracking accuracy is better than 0.03 mm spatially and 0.150 mrad angularly. This allows us to seamlessly register the 3D acquisitions and to project the multispectral acquisitions on the 3D model. PMID:23322103
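
    The projection of the multispectral acquisitions onto the 3D model relies on the tracked camera pose; a minimal pinhole-projection sketch is given below, where the intrinsic matrix, pose, and points are illustrative assumptions rather than the system's calibration:

      import numpy as np

      def project_points(points_world, K, R, t):
          """Pinhole projection of Nx3 world points given rotation R (3x3),
          translation t (3,) and intrinsic matrix K (3x3). Returns Nx2 pixel coords."""
          cam = (R @ points_world.T).T + t          # world -> camera frame
          uvw = (K @ cam.T).T                       # camera frame -> homogeneous pixels
          return uvw[:, :2] / uvw[:, 2:3]

      # Assumed intrinsics and a pose as it might be recovered from the tracking cameras
      K = np.array([[2400.0, 0.0, 640.0],
                    [0.0, 2400.0, 512.0],
                    [0.0, 0.0, 1.0]])
      R = np.eye(3)                                 # tracked rotation (assumed)
      t = np.array([0.0, 0.0, 800.0])               # mm, canvas ~0.8 m in front of the camera
      points = np.array([[0.0, 0.0, 0.0], [100.0, 50.0, 0.0]])   # mm, points on the canvas
      print(project_points(points, K, R, t))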

  1. Introducing a Low-Cost Mini-UAV for Thermal- and Multispectral-Imaging

    NASA Astrophysics Data System (ADS)

    Bendig, J.; Bolten, A.; Bareth, G.

    2012-07-01

    The trend to minimize electronic devices also accounts for Unmanned Airborne Vehicles (UAVs) as well as for sensor technologies and imaging devices. Consequently, it is not surprising that UAVs are already part of our daily life and the current pace of development will increase civil applications. A well known and already widespread example is the so-called flying video game based on Parrot's AR.Drone, which is remotely controlled by an iPod, iPhone, or iPad (http://ardrone.parrot.com). The latter can be considered as a low-weight and low-cost Mini-UAV. In this contribution a Mini-UAV is considered to weigh less than 5 kg and to be able to carry 0.2 kg to 1.5 kg of sensor payload. While up to now Mini-UAVs like Parrot's AR.Drone are mainly equipped with RGB cameras for videotaping or imaging, the development of such carriage systems clearly also goes to multi-sensor platforms like the ones introduced for larger UAVs (5 to 20 kg) by Jaakkolla et al. (2010) for forestry applications or by Berni et al. (2009) for agricultural applications. The problem when designing a Mini-UAV for multi-sensor imaging is the limitation of payload of up to 1.5 kg and a total weight of the whole system below 5 kg. Consequently, the Mini-UAV without sensors but including navigation system and GPS sensors must weigh less than 3.5 kg. A Mini-UAV system with these characteristics is HiSystems' MK-Okto (www.mikrokopter.de). Total weight including battery without sensors is less than 2.5 kg. Payload of a MK-Okto is approx. 1 kg and maximum speed is around 30 km/h. The MK-Okto can be operated up to a wind speed of less than 19 km/h, which corresponds to Beaufort scale number 3 for wind speed. In our study, the MK-Okto is equipped with a handheld low-weight NEC F30IS thermal imaging system. The F30IS, which was developed for veterinary applications, covers 8 to 13 μm and weighs only 300 g

  2. Multispectral imaging with optical bandpass filters: tilt angle and position estimation

    NASA Astrophysics Data System (ADS)

    Brauers, Johannes; Aach, Til

    2009-01-01

    Optical bandpass filters play a decisive role in multispectral imaging. Various multispectral cameras use this type of color filter for the sequential acquisition of different spectral bands. Practically unavoidable, small tilt angles of the filters with respect to the optical axis influence the imaging process: First, by tilting the filter, the center wavelength of the filter is shifted, causing color variations. Second, due to refractions of the filter, the image is distorted geometrically depending on the tilt angle. Third, reflections between sensor and filter glass may cause ghosting, i.e., a weak and shifted copy of the image, which also depends on the filter angle. A method to measure the filter position parameters from multispectral color components is thus highly desirable. We propose a method to determine the angle and position of an optical filter brought into the optical path in, e.g., filter-wheel multispectral cameras, with respect to the camera coordinate system. We determine the position and angle of the filter by presenting a calibration chart to the camera, which is always partly reflected by the highly reflective optical bandpass filter. The extrinsic parameters of the original and mirrored chart can then be estimated. We derive the angle and position of the filter from the coordinates of the charts. We compare the results of the angle measurements to a ground truth obtained from the settings of a high-precision rotation table and thus validate our measurement method. Furthermore, we simulate the refraction effect of the optical filter and show that the results match quite well with the reality, thus also confirming our method.
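
    The tilt-induced shift of the filter centre wavelength mentioned above is commonly approximated with the standard interference-filter relation; in the sketch below the effective refractive index and the example values are assumptions, not the paper's calibration results:

      import numpy as np

      def tilted_centre_wavelength(lambda_0_nm, tilt_deg, n_eff=2.0):
          """Approximate centre wavelength of an interference bandpass filter tilted
          by tilt_deg away from normal incidence (standard approximation)."""
          theta = np.radians(tilt_deg)
          return lambda_0_nm * np.sqrt(1.0 - (np.sin(theta) / n_eff) ** 2)

      # Example: a nominal 550 nm filter at a few small tilt angles
      for tilt in (0.0, 2.0, 5.0, 10.0):
          print(f"{tilt:4.1f} deg -> {tilted_centre_wavelength(550.0, tilt):.2f} nm")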

  3. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1993-01-01

    This is volume 2 of a three volume set of publications that contain the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on October 25-26. The summaries for this workshop appear in Volume 1. The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27. The summaries for this workshop appear in Volume 2. The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29. The summaries for this workshop appear in Volume 3.

  4. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5. The summaries are contained in Volumes 1, 2, and 3, respectively.

  5. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C. on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Spectrometer (AVIRIS) workshop, on October 25-26, whose summaries appear in Volume 1; The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27, whose summaries appear in Volume 2; and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on October 28-29, whose summaries appear in this volume, Volume 3.

  6. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  7. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1995-01-01

    This publication is the first of three containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in this volume; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in Volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  8. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 2: TIMS Workshop

    NASA Technical Reports Server (NTRS)

    Realmuto, Vincent J. (Editor)

    1995-01-01

    This publication is the second volume of the summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop on January 25-26. The summaries for this workshop appear in volume 3; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop on January 26. The summaries for this workshop appear in this volume.

  9. Summaries of the 4th Annual JPL Airborne Geoscience Workshop. Volume 1: AVIRIS Workshop

    NASA Technical Reports Server (NTRS)

    Green, Robert O. (Editor)

    1993-01-01

    This publication contains the summaries for the Fourth Annual JPL Airborne Geoscience Workshop, held in Washington, D.C., on October 25-29, 1993. The main workshop is divided into three smaller workshops as follows: The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, October 25-26 (the summaries for this workshop appear in this volume, Volume 1); The Thermal Infrared Multispectral Scanner (TIMS) workshop, on October 27 (the summaries for this workshop appear in Volume 2); and The Airborne Synthetic Aperture Radar (AIRSAR) workshop, October 28-29 (the summaries for this workshop appear in Volume 3).

  10. Summaries of the Third Annual JPL Airborne Geoscience Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1992-01-01

    This publication contains the preliminary agenda and summaries for the Third Annual JPL Airborne Geoscience Workshop, held at the Jet Propulsion Laboratory, Pasadena, California, on 1-5 June 1992. This main workshop is divided into three smaller workshops as follows: (1) the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on June 1 and 2; the summaries for this workshop appear in Volume 1; (2) the Thermal Infrared Multispectral Scanner (TIMS) workshop, on June 3; the summaries for this workshop appear in Volume 2; and (3) the Airborne Synthetic Aperture Radar (AIRSAR) workshop, on June 4 and 5; the summaries for this workshop appear in Volume 3.

  11. Summaries of the Fifth Annual JPL Airborne Earth Science Workshop. Volume 3: AIRSAR Workshop

    NASA Technical Reports Server (NTRS)

    Vanzyl, Jakob (Editor)

    1995-01-01

    This publication is the third containing summaries for the Fifth Annual JPL Airborne Earth Science Workshop, held in Pasadena, California, on January 23-26, 1995. The main workshop is divided into three smaller workshops as follows: (1) The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) workshop, on January 23-24. The summaries for this workshop appear in Volume 1; (2) The Airborne Synthetic Aperture Radar (AIRSAR) workshop, on January 25-26. The summaries for this workshop appear in this volume; and (3) The Thermal Infrared Multispectral Scanner (TIMS) workshop, on January 26. The summaries for this workshop appear in Volume 2.

  12. Multispectral comparison of water ice deposits observed on cometary nuclei

    NASA Astrophysics Data System (ADS)

    Oklay Vincent, Nilda; Sunshine, Jessica M.; Pajola, Maurizio; Pommerol, Antoine; Vincent, Jean-Baptiste; Sierks, Holger; OSIRIS Team

    2016-10-01

    Cometary missions Deep Impact, EPOXI and Rosetta investigated the nuclei of comets 9P/Tempel 1, 103P/Hartley 2 and 67P/Churyumov-Gerasimenko, respectively. Each of these three missions was equipped with multispectral cameras, allowing imaging at various wavelengths from NUV to NIR. In this spectral range, water ice-rich features display bluer spectral slopes than the average surface and some have very flat spectra. Features enriched in water ice are bright in the monochromatic images and are blue in the RGB color composites generated by using images taken in NUV, visible and NIR wavelengths. Using these properties, water ice-rich features were detected on the nuclei of comets 9P [1], 103P [2] and 67P [3] via multispectral imaging cameras. Moreover, there were visual detections of jets and outbursts associated with some of these water ice-rich features when the right observing conditions were fulfilled [4, 5]. We analyzed multispectral properties of different types of water ice-rich features [3] observed via OSIRIS NAC on comet 67P in the wavelength range of 260 nm to 1000 nm and then compared them with those observed on comets 9P and 103P. Our multispectral analysis shows that the water ice deposits observed on comet 9P are very similar to the large bright blue clusters observed on comet 67P, while the large water ice deposit observed on comet 103P is similar to the large isolated water ice-rich features observed on comet 67P. The ice-rich deposits on comet 103P are the bluest of any comet, which indicates that the deposits on 103P contain more water ice than the ones observed on comets 9P and 67P [6]. [1] Sunshine et al. 2006, Science; [2] Sunshine et al. 2011, LPSC; [3] Pommerol et al. 2015, A&A; [4] Oklay et al. 2016, A&A; [5] Vincent et al. 2016, A&A; [6] Oklay et al. 2016, submitted.

  13. Two-port multispectral imaging system for contaminant detection on poultry carcasses

    NASA Astrophysics Data System (ADS)

    Kise, Michio; Park, Bosoon; Lawrence, Kurt C.; Windham, Robert R.

    2006-10-01

    The objective of this research is to design and fabricate a compact, cost-effective multispectral instrument and to collect and analyze spectra for real-time contaminant detection in poultry processing plants. Our previous research revealed that fecal contamination on the surface of a poultry carcass can be detected by sensing the spectral reflectance of the carcass surface at two specific wavelengths, namely 517 nm and 565 nm. The prototype system developed in this research consists of a multispectral imaging system, an illumination system and a handheld PC. To develop the system cost-effectively, all components were selected from off-the-shelf products and manually assembled. The multispectral imaging sensor developed in this research is a two-port imaging system that consists of two identical monochrome cameras, an optical system and two narrow bandpass filters whose center wavelengths are 520 and 560 nm, respectively. The spectral reflectance from a chicken carcass is collected and split into two directions by an optical system including a beamsplitter and lenses, and the two identical collimated beams are then filtered by the narrow bandpass filters and delivered to the cameras. Lens distortions and geometric misalignment of the two cameras are mathematically compensated to register the two images. The prototype system was tested in a real processing environment and shown to effectively detect feces and ingesta on the surface of poultry carcasses.
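
    For illustration only, the following is a minimal sketch of how band-to-band registration between two such camera ports is commonly implemented with feature matching and a homography, assuming OpenCV; it is not the paper's calibration procedure, and the file names, feature detector, and thresholds are placeholders.

    ```python
    # Hypothetical sketch: warp the 560 nm port image onto the 520 nm port's
    # pixel grid via a feature-based homography (not the paper's exact method).
    import cv2
    import numpy as np

    img_520 = cv2.imread("port_520nm.png", cv2.IMREAD_GRAYSCALE)  # illustrative paths
    img_560 = cv2.imread("port_560nm.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img_520, None)
    k2, d2 = orb.detectAndCompute(img_560, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]

    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Warp the 560 nm band into the 520 nm band's pixel grid so the two
    # planes can be compared pixel-by-pixel for contaminant detection.
    registered_560 = cv2.warpPerspective(img_560, H, img_520.shape[::-1])
    ```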

  14. Adaptive illumination source for multispectral vision system applied to material discrimination

    NASA Astrophysics Data System (ADS)

    Conde, Olga M.; Cobo, Adolfo; Cantero, Paulino; Conde, David; Mirapeix, Jesús; Cubillas, Ana M.; López-Higuera, José M.

    2008-04-01

    A multispectral system based on a monochrome camera and an adaptive illumination source is presented in this paper. Its preliminary application is focused on material discrimination for the food and beverage industries, where monochrome, color and infrared imaging have been successfully applied to this task. This work proposes a different approach, in which the wavelengths relevant to the required discrimination task are selected in advance using a Sequential Forward Floating Selection (SFFS) algorithm. A light source based on Light Emitting Diodes (LEDs) at these wavelengths is then used to sequentially illuminate the material under analysis, and the resulting images are captured by a CCD camera with spectral response across the entire range of the selected wavelengths. Finally, the resulting multispectral planes are processed using a Spectral Angle Mapping (SAM) algorithm, whose output is the desired material classification. Among other advantages, this approach of controlled and specific illumination produces multispectral imaging with a simple monochrome camera, and cold illumination restricted to the relevant wavelengths, which is desirable for the food and beverage industry. The proposed system has been tested successfully for the automatic detection of foreign objects in the tobacco processing industry.
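
    As an illustration of the Spectral Angle Mapping step mentioned above, the following minimal sketch classifies each pixel spectrum by its smallest spectral angle to a set of reference spectra; the array shapes, reference values, and class meanings are assumptions, not details from the paper.

    ```python
    import numpy as np

    def spectral_angle_mapper(pixels, references):
        """Classify each pixel spectrum by its minimum spectral angle.

        pixels:     (N, B) array, one B-band spectrum per pixel
        references: (C, B) array, one library spectrum per material class
        Returns the index of the best-matching class for every pixel.
        """
        p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
        r = references / np.linalg.norm(references, axis=1, keepdims=True)
        cosines = np.clip(p @ r.T, -1.0, 1.0)      # (N, C)
        angles = np.arccos(cosines)                # spectral angle in radians
        return np.argmin(angles, axis=1)

    # Illustrative use: two hypothetical classes (e.g., tobacco vs. foreign object)
    refs = np.array([[0.12, 0.35, 0.60], [0.40, 0.42, 0.45]])
    labels = spectral_angle_mapper(np.random.rand(1000, 3), refs)
    ```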

  15. Joint spatio-spectral based edge detection for multispectral infrared imagery.

    SciTech Connect

    Krishna, Sanjay; Hayat, Majeed M.; Bender, Steven C.; Sharma, Yagya D.; Jang, Woo-Yong; Paskalva, Biliana S.

    2010-06-01

    Image segmentation is one of the most important and difficult tasks in digital image processing. It represents a key stage of automated image analysis and interpretation. Segmentation algorithms for gray-scale images utilize basic properties of intensity values such as discontinuity and similarity. However, it is possible to enhance edge-detection capability by using the spectral information provided by multispectral (MS) or hyperspectral (HS) imagery. In this paper we consider image segmentation algorithms for multispectral images with particular emphasis on the detection of multi-color or multispectral edges. More specifically, we report on an algorithm for joint spatio-spectral (JSS) edge detection. By joint we mean the simultaneous utilization of spatial and spectral characteristics of a given MS or HS image. The JSS-based edge-detection approach, termed the Spectral Ratio Contrast (SRC) edge-detection algorithm, utilizes the novel concept of matching edge signatures. The edge signature represents a combination of spectral ratios calculated using bands that enhance the spectral contrast between the two materials. In conjunction with a spatial mask, the edge signature gives rise to a multispectral operator that can be viewed as a three-dimensional extension of the mask. In the extended mask, the third (spectral) dimension of each hyper-pixel can be chosen independently. The SRC is verified using MS and HS imagery from a quantum-dots-in-a-well infrared (IR) focal plane array and the Airborne Hyperspectral Imager.
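
    The following toy sketch is only meant to illustrate the general joint spatio-spectral idea of combining a band ratio with a spatial mask; it is not the SRC algorithm itself, and the band indices, Sobel mask, and data are arbitrary placeholders.

    ```python
    import numpy as np
    from scipy.ndimage import sobel

    def ratio_edge_map(cube, band_a, band_b, eps=1e-6):
        """Toy joint spatio-spectral edge detector (illustration only).

        cube: (rows, cols, bands) multispectral image.
        A spectral ratio between two contrast-enhancing bands is formed, then a
        spatial gradient operator is applied to that ratio plane, so the response
        follows material changes rather than pure brightness changes.
        """
        ratio = cube[..., band_a] / (cube[..., band_b] + eps)
        gx, gy = sobel(ratio, axis=1), sobel(ratio, axis=0)
        return np.hypot(gx, gy)

    edges = ratio_edge_map(np.random.rand(128, 128, 8), band_a=3, band_b=5)
    ```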

  16. A multispectral scanner survey of the Tonopah Test Range, Nevada. Date of survey: August 1993

    SciTech Connect

    Brewster, S.B. Jr.; Howard, M.E.; Shines, J.E.

    1994-08-01

    The Multispectral Remote Sensing Department of the Remote Sensing Laboratory conducted an airborne multispectral scanner survey of a portion of the Tonopah Test Range, Nevada. The survey was conducted on August 21 and 22, 1993, using a Daedalus AADS1268 scanner and coincident aerial color photography. Flight altitudes were 5,000 feet (1,524 meters) above ground level for systematic coverage and 1,000 feet (304 meters) for selected areas of special interest. The multispectral scanner survey was initiated as part of an interim and limited investigation conducted to gather preliminary information regarding historical hazardous material release sites which could have environmental impacts. The overall investigation also includes an inventory of environmental restoration sites, a ground-based geophysical survey, and an aerial radiological survey. The multispectral scanner imagery and coincident aerial photography were analyzed for the detection, identification, and mapping of man-made soil disturbances. Several standard image enhancement techniques were applied to the data to assist image interpretation. A geologic ratio enhancement and a color composite consisting of AADS1268 channels 10, 7, and 9 (mid-infrared, red, and near-infrared spectral bands) proved most useful for detecting soil disturbances. A total of 358 disturbance sites were identified on the imagery and mapped using a geographic information system. Of these sites, 326 were located within the Tonopah Test Range while the remaining sites were present on the imagery but outside the site boundary. The mapped site locations are being used to support ongoing field investigations.

  17. Spectral stratigraphy: multispectral remote sensing as a stratigraphic tool, Wind River/Big Horn basin, Wyoming

    SciTech Connect

    Lang, H.R.; Paylor, E.D.

    1987-05-01

    Stratigraphic and structural analyses of the Wind River and Big Horn basin areas of central Wyoming are in progress. One result has been the development of a new approach to stratigraphic and structural analysis that uses photogeologic and spectral interpretation of multispectral image data to remotely characterize the attitude, thickness, and lithology of strata. New multispectral systems that have only been available since 1982 are used with topographic data to map upper Paleozoic and Mesozoic strata exposed on the southern margin of the Bighorn Mountains. Thematic Mapper (TM) satellite data together with topographic data are used to map lithologic contacts, measure dip and strike, and develop a stratigraphic column that is correlated with conventional surface and subsurface sections. Aircraft-acquired Airborne Imaging Spectrometer and Thermal Infrared Multispectral Scanner data add mineralogical information to the TM column, including the stratigraphic distribution of quartz, calcite, dolomite, montmorillonite, and gypsum. Results illustrate an approach that has general applicability in other geologic investigations that could benefit from remotely acquired information about areal variations in attitude, sequence, thickness, and lithology of strata exposed at the Earth's surface. Application of these methods elsewhere is limited primarily by the availability of multispectral and topographic data and the quality of bedrock exposures.

  18. Caught on Camera.

    ERIC Educational Resources Information Center

    Milshtein, Amy

    2002-01-01

    Describes the benefits of and rules to be followed when using surveillance cameras for school security. Discusses various camera models, including indoor and outdoor fixed position cameras, pan-tilt zoom cameras, and pinhole-lens cameras for covert surveillance. (EV)

  19. A multispectral sorting device for wheat kernels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A low-cost multispectral sorting device was constructed using three visible and three near-infrared light-emitting diodes (LED) with peak emission wavelengths of 470 nm (blue), 527 nm (green), 624 nm (red), 850 nm, 940 nm, and 1070 nm. The multispectral data were collected by rapidly (~12 kHz) blin...

  20. Sandia multispectral analyst remote sensing toolkit (SMART).

    SciTech Connect

    Post, Brian Nelson; Smith, Jody Lynn; Geib, Peter L.; Nandy, Prabal; Wang, Nancy Nairong

    2003-03-01

    This remote sensing science and exploitation work focused on exploitation algorithms and methods targeted at the analyst. SMART is a 'plug-in' to commercial remote sensing software that provides algorithms to enhance the utility of the Multispectral Thermal Imager (MTI) and other multispectral satellite data. This toolkit has been licensed to 22 government organizations.

  1. A compact snapshot multispectral imager with a monolithically integrated per-pixel filter mosaic

    NASA Astrophysics Data System (ADS)

    Geelen, Bert; Tack, Nicolaas; Lambrechts, Andy

    2014-03-01

    The adoption of spectral imaging by industry has so far been limited by the lack of high-speed, low-cost and compact spectral cameras. Moreover, most state-of-the-art spectral cameras utilize some form of spatial or spectral scanning during acquisition, making them ill-suited for analyzing dynamic scenes containing movement. This paper introduces a novel snapshot multispectral imager concept based on optical filters monolithically integrated on top of a standard CMOS image sensor. It overcomes the problems of scanning approaches through snapshot acquisition, in which an entire multispectral data cube is sensed at one discrete point in time. This is enabled by depositing interference filters per pixel directly on a CMOS image sensor, extending the traditional Bayer color imaging concept to multi- or hyperspectral imaging without a need for dedicated fore-optics. The monolithic deposition leads to a high degree of design flexibility. This enables systems ranging from application-specific, high-spatial-resolution cameras with 1 to 4 spectral filters, to hyperspectral snapshot cameras at medium spatial resolutions with filters laid out in cells of 4x4 to 6x6 or more. Through the use of monolithically integrated optical filters it further retains the qualities of compactness, low cost and high acquisition speed, differentiating it from other snapshot spectral cameras.
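
    As a rough illustration of how a per-pixel filter mosaic is typically unpacked into a multispectral cube, the sketch below subsamples each filter position of an assumed 4x4 cell; the cell layout and frame size are placeholders, and full-resolution demosaicing/interpolation is omitted.

    ```python
    import numpy as np

    def mosaic_to_cube(raw, cell=4):
        """Rearrange a snapshot mosaic frame into a (rows/cell, cols/cell, cell*cell) cube.

        Each pixel inside a cell x cell tile carries a different spectral filter,
        so band k is obtained by subsampling the raw frame at that filter's offset.
        """
        rows, cols = raw.shape
        bands = []
        for dy in range(cell):
            for dx in range(cell):
                bands.append(raw[dy:rows:cell, dx:cols:cell])
        return np.stack(bands, axis=-1)

    cube = mosaic_to_cube(np.random.rand(1024, 2048), cell=4)   # -> (256, 512, 16)
    ```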

  2. Multispectral Image Processing for Plants

    NASA Technical Reports Server (NTRS)

    Miles, Gaines E.

    1991-01-01

    The development of a machine vision system to monitor plant growth and health is one of three essential steps towards establishing an intelligent system capable of accurately assessing the state of a controlled ecological life support system for long-term space travel. Besides a network of sensors, simulators are needed to predict plant features, and artificial intelligence algorithms are needed to determine the state of a plant based life support system. Multispectral machine vision and image processing can be used to sense plant features, including health and nutritional status.

  3. Monitoring of maize chlorophyll content based on multispectral vegetation indices

    NASA Astrophysics Data System (ADS)

    Sun, Hong; Li, Minzan; Zheng, Lihua; Zhang, Yane; Zhang, Yajing

    2012-11-01

    In order to estimate the nutrient status of maize, multispectral images were used to monitor chlorophyll content in the field. The experiments were conducted under three different fertilizer treatments (high, normal and low). A multispectral CCD camera was used to collect ground-based images of the maize canopy in the green (G, 520-600 nm), red (R, 630-690 nm) and near-infrared (NIR, 760-900 nm) bands. Maize leaves were randomly sampled to determine chlorophyll content with a UV-Vis spectrophotometer. The images were processed in three steps: preprocessing, canopy segmentation and parameter calculation. Firstly, median filtering was used to improve the visual contrast of the images. Secondly, the leaves of the maize canopy were segmented in the NIR image. Thirdly, the average gray values (GIA, RIA and NIRIA) and the vegetation indices widely used in remote sensing (DVI, RVI, NDVI, etc.) were calculated. A new vegetation index, a combination of normalized difference vegetation indices (CNDVI), was developed. After correlation analysis between the image parameters and chlorophyll content, six parameters (GIA, RIA, NIRIA, GRVI, GNDVI and CNDVI) were selected to estimate chlorophyll content at the shooting and trumpet stages, respectively. The results of the MLR prediction models showed that R2 was 0.88 and the adjusted R2 was 0.64 at the shooting stage, and R2 was 0.77 and the adjusted R2 was 0.31 at the trumpet stage. This indicates that vegetation indices derived from multispectral images can be used to monitor chlorophyll content, providing a feasible method for chlorophyll content detection.
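
    For reference, the standard indices named above can be computed from the per-band canopy averages as in the short sketch below; interpreting GRVI and GNDVI as green-band analogues of RVI and NDVI is an assumption, and the authors' CNDVI is not reproduced here.

    ```python
    import numpy as np

    def vegetation_indices(nir, red, green):
        """Standard multispectral vegetation indices from band-average gray values."""
        nir, red, green = map(np.asarray, (nir, red, green))
        return {
            "DVI":   nir - red,                      # difference vegetation index
            "RVI":   nir / red,                      # ratio vegetation index
            "NDVI":  (nir - red) / (nir + red),      # normalized difference
            "GRVI":  nir / green,                    # green-band ratio (assumed form)
            "GNDVI": (nir - green) / (nir + green),  # green-band NDVI (assumed form)
        }

    # Illustrative canopy-average values from the G, R and NIR image planes
    print(vegetation_indices(nir=0.55, red=0.08, green=0.12))
    ```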

  4. Estimation of thermal flux and emissivity of the land surface from multispectral aircraft data

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.

    1989-01-01

    In order to evaluate the importance of surface thermal flux and emissivity variations on surface and boundary layer processes, a technique that uses thermal data from an airborne multispectral scanner to determine the surface skin temperature and thermal emissivity over a regional area has been developed. These values are used to estimate the total flux density emanating from the surface and at the top of the atmosphere. Data from the multispectral atmospheric mapping sensor (MAMS) collected during the First ISLSCP Field Experiment (FIFE) are used to develop the technique, and to show the time and space variability of the flux values. The ground truth data available during FIFE provide a unique resource to evaluate this technique.

  5. Large Multispectral and Albedo Panoramas Acquired by the Pancam Instruments on the Mars Exploration Rovers Spirit and Opportunity

    NASA Technical Reports Server (NTRS)

    Bell, J. F., III; Arneson, H. M.; Farrand, W. H.; Goetz, W.; Hayes, A. G.; Herkenhoff, K.; Johnson, M. J.; Johnson, J. R.; Joseph, J.; Kinch, K.

    2005-01-01

    Introduction. The panoramic camera (Pancam) multispectral, stereoscopic imaging systems on the Mars Exploration Rovers Spirit and Opportunity [1] have acquired and downlinked more than 45,000 images (35 Gbits of data) over more than 700 combined sols of operation on Mars as of early January 2005. A large subset of these images was acquired as part of 26 large multispectral and/or broadband "albedo" panoramas (15 on Spirit, 11 on Opportunity) covering large ranges of azimuth (12 spanning 360°) and designed to characterize major regional color and albedo characteristics of the landing sites and various points along both rover traverses.

  6. Classification of emerald based on multispectral image and PCA

    NASA Astrophysics Data System (ADS)

    Yang, Weiping; Zhao, Dazun; Huang, Qingmei; Ren, Pengyuan; Feng, Jie; Zhang, Xiaoyan

    2005-02-01

    Traditionally, the grade discrimination and classification of bowlders (emeralds) are carried out using methods based on human experience. In our previous work, a method based on the NCS (Natural Color System) color system and sRGB color space conversion was employed for a coarse grade classification of emeralds. However, it is well known that the color match of two colors is not a true "match" unless their spectra are the same. Because metameric colors cannot be differentiated by a three-channel (RGB) camera, a multispectral camera (MSC) is used as the image capturing device in this paper. It consists of a trichromatic digital camera and a set of wide-band filters. Spectra were obtained by measuring a series of natural bowlder (emerald) samples. Principal component analysis (PCA) was employed to obtain spectral eigenvectors. During the fine classification, the color difference and the RMS of the spectral difference between estimated and original spectra are used as criteria. It has been shown that 6 eigenvectors are enough to reconstruct the reflection spectra of the test samples.
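
    A minimal sketch of the eigenvector-based reconstruction step described above is given below, assuming the measured spectra are stored as a samples-by-wavelengths array; the synthetic data and the SVD-based PCA are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def pca_reconstruct(spectra, n_components=6):
        """Reconstruct spectra from their first n principal components.

        spectra: (samples, wavelengths) measured reflectance spectra.
        Returns the reconstruction and its per-sample RMS error.
        """
        mean = spectra.mean(axis=0)
        centered = spectra - mean
        # Principal-component vectors are the rows of vt from the SVD
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        basis = vt[:n_components]                      # (n, wavelengths)
        coeffs = centered @ basis.T                    # projection onto the basis
        recon = coeffs @ basis + mean
        rms = np.sqrt(np.mean((recon - spectra) ** 2, axis=1))
        return recon, rms

    spectra = np.random.rand(50, 31)                   # synthetic 31-band spectra
    recon, rms = pca_reconstruct(spectra, n_components=6)
    ```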

  7. Fast application multispectral camouflage appliques

    NASA Astrophysics Data System (ADS)

    Meeker, David L.; Hall, Kenneth G.

    1995-05-01

    With reconnaissance, surveillance, and target acquisition systems becoming increasingly sophisticated in both sensor performance and processing capabilities, there exists a requirement to increase the camoufleur's ability to control and manipulate target signatures beyond those currently available. To assist in accomplishing this, a hybrid technology is required: one that combines the features of multispectral signature control, rapid deployment, and low cost. The WES fixed-facility CCD team is developing a suite of signature-controlling materials termed 'Multispectral Camouflage Appliques' (MCAs). Due to the nature of this material, the spectral characteristics (e.g. emissivity, radar scattering properties, UV-NIR reflectance, color) can be controlled with great latitude by the designer by adding underlying material layers or external coatings. It is the designer's ability to manipulate the fundamental characteristics of the MCA material that gives it its uniqueness and maximum utility. The first series of MCAs is composed of an adhesive-backed metal foil overlaid with a visual color coating that is transparent at thermal IR wavelengths. The effect is that of a visual camouflage combined with a thermal mirror that reflects emissions of the natural surroundings. The 'peel and stick' adhesive backing provides a rapid method of applying MCAs to fixed and semimobile assets that would be beneficial to bare-base and force-projection missions. Dual uses of this material include drug and border aerial surveillance as position markers that are visually disguised from ground observation but provide high contrast in thermal IR imaging systems by reflecting cold sky temperatures.

  8. Quasi-microscope concept for planetary missions. [optically augmented lander camera for high resolution microscopy

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Wall, S. D.; Arvidson, R. E.; Giat, O.

    1977-01-01

    Viking lander cameras have returned stereo and multispectral views of the Martian surface with a resolution that approaches 2 mm/lp in the near field. A two-orders-of-magnitude increase in resolution could be obtained for collected surface samples by augmenting these cameras with auxiliary optics that would neither impose special camera design requirements nor limit the cameras' field of view of the terrain. Quasi-microscope images would provide valuable data on the physical and chemical characteristics of planetary regoliths.

  9. Time-of-Flight Microwave Camera

    NASA Astrophysics Data System (ADS)

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-10-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum.
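
    As a quick consistency check (not from the paper), the quoted 200 ps time resolution indeed corresponds to the stated 6 cm optical path in free space:

    ```latex
    \[
    \Delta d = c\,\Delta t \approx \left(3\times10^{8}\ \mathrm{m/s}\right)\left(200\times10^{-12}\ \mathrm{s}\right) = 0.06\ \mathrm{m} = 6\ \mathrm{cm}.
    \]
    ```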

  10. Time-of-Flight Microwave Camera.

    PubMed

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable "stealth" regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz-12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows "camera-like" behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  11. Time-of-Flight Microwave Camera

    PubMed Central

    Charvat, Gregory; Temme, Andrew; Feigin, Micha; Raskar, Ramesh

    2015-01-01

    Microwaves can penetrate many obstructions that are opaque at visible wavelengths; however, microwave imaging is challenging due to resolution limits associated with relatively small apertures and unrecoverable “stealth” regions due to the specularity of most objects at microwave frequencies. We demonstrate a multispectral time-of-flight microwave imaging system which overcomes these challenges with a large passive aperture to improve lateral resolution, multiple illumination points with a data fusion method to reduce stealth regions, and a frequency modulated continuous wave (FMCW) receiver to achieve depth resolution. The camera captures images with a resolution of 1.5 degrees, multispectral images across the X frequency band (8 GHz–12 GHz), and a time resolution of 200 ps (6 cm optical path in free space). Images are taken of objects in free space as well as behind drywall and plywood. This architecture allows “camera-like” behavior from a microwave imaging system and is practical for imaging everyday objects in the microwave spectrum. PMID:26434598

  12. CNR LARA Project: Evaluation of two years of airborne imaging spectrometry

    SciTech Connect

    Bianchi, R.; Cavalli, R.M.; Fiumi, L.; Marino, C.M.

    1996-10-01

    Since July 1994 the Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument, acquired by CNR (Italian National Research Council) in the framework of its LARA (Airborne Laboratory for Environmental Studies) Project, has been in intensive operation. A number of MIVIS deployments have been carried out in Italy and Europe in cooperation with national and international institutions on a variety of sites, including active volcanoes, coastlines, lagoons and ocean, vegetated and cultivated areas, oil-polluted surfaces, waste discharges, and archeological sites. Two years of activity have shown high system efficiency, from survey to data preprocessing and dissemination. 12 refs., 3 figs.

  13. Assessment of Pen Branch delta and corridor vegetation changes using multispectral scanner data 1992--1994

    SciTech Connect

    1996-01-01

    Airborne multispectral scanner data were used to monitor natural succession of wetland vegetation species over a three-year period from 1992 through 1994 for Pen Branch on the Savannah River Site in South Carolina. Image processing techniques were used to identify and measure wetland vegetation communities in the lower portion of the Pen Branch corridor and delta. The study provided a reliable means for monitoring medium- and large-scale changes in a diverse environment. Findings from the study will be used to support decisions regarding remediation efforts following the cessation of cooling water discharge from K reactor at the Department of Energy`s Savannah River Site in South Carolina.

  14. Uav Multispectral Survey to Map Soil and Crop for Precision Farming Applications

    NASA Astrophysics Data System (ADS)

    Sonaa, Giovanna; Passoni, Daniele; Pinto, Livio; Pagliari, Diana; Masseroni, Daniele; Ortuani, Bianca; Facchi, Arianna

    2016-06-01

    New sensors mounted on UAVs and optimal procedures for survey, data acquisition and analysis are continuously developed and tested for applications in precision farming. Procedures to integrate multispectral aerial data about soil and crop with ground-based proximal geophysical data are a recent research topic aimed at delineating homogeneous zones for the management of agricultural inputs (i.e., water, nutrients). Multispectral and multitemporal orthomosaics were produced over a test field (a 100 m x 200 m plot within a maize field) to map vegetation and soil indices, as well as crop heights, with suitable ground resolution. UAV flights were performed at two moments during the crop season: before sowing on bare soil, and just before flowering when the maize was nearly at its maximum height. Two cameras, for color (RGB) and false color (NIR-RG) images, were used. The images were processed in Agisoft Photoscan to produce Digital Surface Models (DSMs) of bare soil and crop, and multispectral orthophotos. To overcome some difficulties in the automatic search for matching points for the block adjustment of the crop images, the scientific software developed by Politecnico di Milano was also used to enhance image orientation. The surveys and image processing are described, as well as results of the classification of the multispectral-multitemporal orthophotos and of the soil indices.

  15. Multispectral Scanner for Monitoring Plants

    NASA Technical Reports Server (NTRS)

    Gat, Nahum

    2004-01-01

    A multispectral scanner has been adapted to capture spectral images of living plants under various types of illumination for purposes of monitoring the health of, or monitoring the transfer of genes into, the plants. In a health-monitoring application, the plants are illuminated with full-spectrum visible and near infrared light and the scanner is used to acquire a reflected-light spectral signature known to be indicative of the health of the plants. In a gene-transfer- monitoring application, the plants are illuminated with blue or ultraviolet light and the scanner is used to capture fluorescence images from a green fluorescent protein (GFP) that is expressed as result of the gene transfer. The choice of wavelength of the illumination and the wavelength of the fluorescence to be monitored depends on the specific GFP.

  16. Multispectral sensing of moisture stress

    NASA Technical Reports Server (NTRS)

    Olson, C. E., Jr.

    1970-01-01

    Laboratory reflectance data and field tests with multispectral remote sensors support the hypothesis that differences in moisture content and water deficits are closely related to foliar reflectance from woody plants. When these relationships are taken into account, automatic recognition techniques become more powerful than when they are ignored. Evidence is increasing that moisture relationships inside plant foliage are much more closely related to foliar reflectance characteristics than are external variables such as soil moisture, wind, and air temperature. Short-term changes in water deficits seem to have little influence on foliar reflectance, however. This is in distinct contrast to significant short-term changes in foliar emittance from the same plants with changing wind, air temperature, incident radiation, or water deficit conditions.

  17. Cinematic camera emulation using two-dimensional color transforms

    NASA Astrophysics Data System (ADS)

    McElvain, Jon S.; Gish, Walter

    2015-02-01

    For cinematic and episodic productions, on-set look management is an important component of the creative process, and involves iterative adjustments of the set, actors, lighting and camera configuration. Instead of using the professional motion-picture camera to establish a particular look, the use of a smaller-form-factor DSLR is considered for this purpose due to its increased agility. Because the spectral response characteristics differ between the two camera systems, a camera emulation transform is needed to approximate the behavior of the destination camera. Recently, two-dimensional transforms have been shown to provide high-accuracy conversion of raw camera signals to a defined colorimetric state. In this study, the same formalism is used for camera emulation, whereby a Canon 5D Mark III DSLR is used to approximate the behavior of a Red Epic cinematic camera. The spectral response characteristics of both cameras were measured and used to build 2D as well as 3x3 matrix emulation transforms. When tested on multispectral image databases, the 2D emulation transforms outperform their matrix counterparts, particularly for images containing highly chromatic content.
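
    As a baseline illustration, the 3x3 matrix emulation transform mentioned above can be fit by least squares from corresponding raw responses of the two cameras, as in the sketch below; the synthetic patch data and function names are placeholders, and the paper's two-dimensional transform is not reproduced.

    ```python
    import numpy as np

    def fit_matrix_emulation(src_rgb, dst_rgb):
        """Least-squares 3x3 transform mapping source-camera RGB to target-camera RGB.

        src_rgb, dst_rgb: (N, 3) arrays of corresponding raw camera responses
        (e.g., DSLR vs. cinema-camera shots of the same color patches).
        """
        X, *_ = np.linalg.lstsq(src_rgb, dst_rgb, rcond=None)   # dst ~ src @ X
        return X.T

    def apply_emulation(M, rgb):
        return rgb @ M.T

    # Synthetic example: recover a known matrix from noisy patch measurements
    true_M = np.array([[0.9, 0.1, 0.0], [0.05, 1.0, -0.05], [0.0, 0.1, 0.95]])
    src = np.random.rand(200, 3)
    dst = src @ true_M.T + 0.001 * np.random.randn(200, 3)
    M = fit_matrix_emulation(src, dst)          # M should be close to true_M
    emulated = apply_emulation(M, src)
    ```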

  18. Use of a Multispectral Uav Photogrammetry for Detection and Tracking of Forest Disturbance Dynamics

    NASA Astrophysics Data System (ADS)

    Minařík, R.; Langhammer, J.

    2016-06-01

    This study presents a new methodological approach for the assessment of spatial and qualitative aspects of forest disturbance, based on the use of a multispectral imaging camera with UAV photogrammetry. We used the miniaturized multispectral sensor Tetracam Micro Multiple Camera Array (μ-MCA) Snap 6 with a multirotor imaging platform to obtain multispectral imagery with high spatial resolution. The study area is located in the Sumava Mountains, Central Europe, heavily affected by windstorms followed by extensive and repeated bark beetle (Ips typographus [L.]) outbreaks in the past 20 years. After two decades, there is an apparent continuous spread of forest disturbance as well as rapid regeneration of forest vegetation, related to changes in species and their diversity. To test the suggested methodology, we launched an imaging campaign at an experimental site under various stages of forest disturbance and regeneration. The imagery of high spatial and spectral resolution enabled analysis of the inner structure and dynamics of the processes. The most informative bands for the detection of tree stress caused by bark beetle infestation are band 2 (650 nm) and band 3 (700 nm), followed by band 4 (800 nm), from the red-edge and NIR parts of the spectrum. We identified only three indices which seem able to correctly detect the different forest disturbance categories in the complex conditions of a mixture of categories: the Normalized Difference Vegetation Index (NDVI), the Simple 800/650 Ratio (pigment-specific simple ratio B1), and the Red-edge Index.

  19. Color image reproduction based on multispectral and multiprimary imaging: experimental evaluation

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Masahiro; Teraji, Taishi; Ohsawa, Kenro; Uchiyama, Toshio; Motomura, Hideto; Murakami, Yuri; Ohyama, Nagaaki

    2001-12-01

    Multispectral imaging is a significant technology for the acquisition and display of accurate color information. Natural color reproduction under arbitrary illumination becomes possible using spectral information of both the image and the illumination light. In addition, multiprimary color display, i.e., using more than three primary colors, has also been developed for the reproduction of an expanded color gamut and for discounting observer metamerism. In this paper, we present the concept of multispectral data interchange for natural color reproduction, and experimental results using a 16-band multispectral camera and a six-primary color display. In the experiment, the accuracy of color reproduction is evaluated in CIE ΔE*ab for both the image capture and display systems. The average and maximum ΔE*ab were 1.0 and 2.1 for the 16-band multispectral camera system, using the 24 Macbeth color patches. For the six-primary color projection display, the average and maximum ΔE*ab were 1.3 and 2.7 with 30 test colors inside the display gamut. Moreover, color reproduction results with different spectral distributions but the same CIE tristimulus values are visually compared, and it is confirmed that the six-primary display gives improved agreement between the original and reproduced colors.

  20. Experimental Advanced Airborne Research Lidar (EAARL) Data Processing Manual

    USGS Publications Warehouse

    Bonisteel, Jamie M.; Nayegandhi, Amar; Wright, C. Wayne; Brock, John C.; Nagle, David

    2009-01-01

    . Each pulse is focused into an illumination area that has a radius of about 20 centimeters on the ground. The pulse-repetition frequency of the EAARL transmitter varies along each across-track scan to produce equal cross-track sample spacing and near uniform density (Nayegandhi and others, 2006). Targets can have varying physical and optical characteristics that cause extreme fluctuations in laser backscatter complexity and signal strength. To accommodate this dynamic range, EAARL has the real-time ability to detect, capture, and automatically adapt to each laser return backscatter. The backscattered energy is collected by an array of four high-speed waveform digitizers connected to an array of four sub-nanosecond photodetectors. Each of the four photodetectors receives a finite range of the returning laser backscatter photons. The most sensitive channel receives 90% of the photons, the least sensitive receives 0.9%, and the middle channel receives 9% (Wright and Brock, 2002). The fourth channel is available for detection but is not currently being utilized. All four channels are digitized simultaneously into 65,536 samples for every laser pulse. Receiver optics consists of a 15-centimeter-diameter dielectric-coated Newtonian telescope, a computer-driven raster scanning mirror oscillating at 12.5 hertz (25 rasters per second), and an array of sub-nanosecond photodetectors. The signal emitted by the pulsed laser transmitter is amplified as backscatter by the optical telescope receiver. The photomultiplier tube (PMT) then converts the optical energy into electrical impulses (Nayegandhi and others, 2006). In addition to the full-waveform resolving laser, the EAARL sensor suite includes a down-looking 70-centimeter-resolution Red-Green-Blue (RGB) digital network camera, a high-resolution color infrared (CIR) multispectral camera (14-centimeter-resolution), two precision dual-frequency kinematic carrier-phase global positioning system (GPS) receivers, and an

  1. Study on multispectral imaging detection and recognition

    NASA Astrophysics Data System (ADS)

    Jun, Wang; Na, Ding; Gao, Jiaobo; Yu, Hu; Jun, Wu; Li, Junna; Zheng, Yawei; Fei, Gao; Sun, Kefeng

    2009-07-01

    Multispectral imaging detection technology uses the spatial and spectral distribution of target radiation, and the relation between spectra and images, to detect targets and perform remote sensing measurements. Its strengths are multiple channels, narrow bandwidths, a large amount of information, and high accuracy. The ability to detect targets in environments of clutter, camouflage, concealment and deception is improved. At present, spectral imaging technology in both the multispectral and hyperspectral ranges is developing rapidly. Multispectral imaging equipment on unmanned aerial vehicles can be used for mine detection, intelligence, surveillance and reconnaissance. Imaging spectrometers operating in the MWIR and LWIR have already been applied in the fields of remote sensing and the military in advanced countries. This paper presents multispectral imaging technology, which can enhance the reflectance, scattering and radiation contrast of artificial targets against natural backgrounds. Targets in complex backgrounds and camouflaged/stealth targets can be effectively identified. Experimental results and spectral imaging data are presented.

  2. Remote sensing of clouds by multispectral sensors.

    PubMed

    Lindner, B L; Isaacs, R G

    1993-05-20

    A multispectral minimization approach that uses the wavelength dependence of the radiance rather than the magnitude of the radiance is advocated for the retrieval of cloud optical thickness, phase, and particle size by future sensors.

  3. Multispectral imaging with vertical silicon nanowires

    PubMed Central

    Park, Hyunsung; Crozier, Kenneth B.

    2013-01-01

    Multispectral imaging is a powerful tool that extends the capabilities of the human eye. However, multispectral imaging systems generally are expensive and bulky, and multiple exposures are needed. Here, we report the demonstration of a compact multispectral imaging system that uses vertical silicon nanowires to realize a filter array. Multiple filter functions covering visible to near-infrared (NIR) wavelengths are simultaneously defined in a single lithography step using a single material (silicon). Nanowires are then etched and embedded into polydimethylsiloxane (PDMS), thereby realizing a device with eight filter functions. By attaching it to a monochrome silicon image sensor, we successfully realize an all-silicon multispectral imaging system. We demonstrate visible and NIR imaging. We show that the latter is highly sensitive to vegetation and furthermore enables imaging through objects opaque to the eye. PMID:23955156

  4. Determining Camera Gain in Room Temperature Cameras

    SciTech Connect

    Joshua Cogliati

    2010-12-01

    James R. Janesick provides a method for determining the amplification of a CCD or CMOS camera when only the raw images are available. However, the equation that is provided ignores the contribution of dark current. For CCD or CMOS cameras that are cooled well below room temperature this is not a problem; however, the technique needs adjustment for use with room-temperature cameras. This article describes the adjustment made to the equation, and a test of this method.
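
    In the spirit of the adjustment described (though not necessarily the article's exact equation), one common form of the photon-transfer gain estimate subtracts dark-frame statistics from flat-frame statistics, as sketched below with synthetic frames; the frame sizes, dark level, and noise parameters are assumptions.

    ```python
    import numpy as np

    def camera_gain(flat1, flat2, dark1, dark2):
        """Photon-transfer gain estimate (e-/DN) from two flat and two dark frames.

        Differencing paired frames cancels fixed-pattern noise; subtracting the
        dark-frame statistics removes the offset and read/dark-noise contribution
        that matters for room-temperature sensors.
        """
        signal = flat1.mean() + flat2.mean() - dark1.mean() - dark2.mean()
        noise_var = np.var(flat1 - flat2) - np.var(dark1 - dark2)
        return signal / noise_var

    # Illustrative synthetic frames: Poisson photon noise, true gain = 2 e-/DN
    rng = np.random.default_rng(0)
    gain, offset = 2.0, 100.0
    flat1, flat2 = (offset + rng.poisson(10000, (512, 512)) / gain for _ in range(2))
    dark1, dark2 = (offset + rng.normal(0.0, 3.0, (512, 512)) for _ in range(2))
    print(camera_gain(flat1, flat2, dark1, dark2))   # ~2 for this synthetic data
    ```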

  5. Multispectral palmprint recognition using a quaternion matrix.

    PubMed

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049

  6. Multispectral Palmprint Recognition Using a Quaternion Matrix

    PubMed Central

    Xu, Xingpeng; Guo, Zhenhua; Song, Changjiang; Li, Yafeng

    2012-01-01

    Palmprints have been widely studied for biometric recognition for many years. Traditionally, a white light source is used for illumination. Recently, multispectral imaging has drawn attention because of its high recognition accuracy. Multispectral palmprint systems can provide more discriminant information under different illuminations in a short time, thus they can achieve better recognition accuracy. Previously, multispectral palmprint images were taken as a kind of multi-modal biometrics, and the fusion scheme on the image level or matching score level was used. However, some spectral information will be lost during image level or matching score level fusion. In this study, we propose a new method for multispectral images based on a quaternion model which could fully utilize the multispectral information. Firstly, multispectral palmprint images captured under red, green, blue and near-infrared (NIR) illuminations were represented by a quaternion matrix, then principal component analysis (PCA) and discrete wavelet transform (DWT) were applied respectively on the matrix to extract palmprint features. After that, Euclidean distance was used to measure the dissimilarity between different features. Finally, the sum of two distances and the nearest neighborhood classifier were employed for recognition decision. Experimental results showed that using the quaternion matrix can achieve a higher recognition rate. Given 3000 test samples from 500 palms, the recognition rate can be as high as 98.83%. PMID:22666049

  7. Long-distance eye-safe laser TOF camera design

    NASA Astrophysics Data System (ADS)

    Kovalev, Anton V.; Polyakov, Vadim M.; Buchenkov, Vyacheslav A.

    2016-04-01

    We present a new TOF camera design based on a compact, actively Q-switched, diode-pumped solid-state laser operating in the 1.5 μm range and a receiver system based on a short-wave infrared InGaAs PIN-diode focal plane array with an image intensifier and a special readout integration circuit. The compact camera is capable of depth imaging up to 4 kilometers at 10 frames/s with a 1.2 m range error. The camera could be applied to airborne and space geodesy, location and navigation.

  8. Digital staining for histopathology multispectral images by the combined application of spectral enhancement and spectral transformation.

    PubMed

    Bautista, Pinky A; Yagi, Yukako

    2011-01-01

    In this paper we introduce a digital staining method for histopathology images captured with an n-band multispectral camera. The method consists of two major processes: enhancement of the original spectral transmittance and transformation of the enhanced transmittance to its target spectral configuration. Enhancement is accomplished by shifting the original transmittance by the scaled difference between the original transmittance and the transmittance estimated with m dominant principal component (PC) vectors; the m PC vectors were determined from transmittance samples of the background image. Transformation of the enhanced transmittance to the target spectral configuration was done using an n×n transformation matrix, which was derived by applying a least squares method to the enhanced and target spectral training data samples of the different tissue components. Experimental results on the digital conversion of a hematoxylin and eosin (H&E) stained multispectral image to its Masson's trichrome stained (MT) equivalent show the viability of the method.
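
    The sketch below illustrates the two steps described, under assumed array shapes and an assumed scaling factor: an m-PC reconstruction from background samples used to shift the original transmittance, followed by a least-squares n×n transformation fitted on paired training spectra. It is a simplified reading of the method, not the authors' code.

    ```python
    import numpy as np

    def enhance_transmittance(t, background, m=3, alpha=1.0):
        """Shift each transmittance spectrum by the scaled residual from an m-PC fit.

        t:          (pixels, n_bands) original transmittance spectra
        background: (samples, n_bands) transmittance samples of the background image,
                    used to derive the m dominant principal-component vectors
        alpha:      assumed scaling factor for the residual shift
        """
        mean = background.mean(axis=0)
        _, _, vt = np.linalg.svd(background - mean, full_matrices=False)
        basis = vt[:m]                                  # m dominant PC vectors
        estimate = (t - mean) @ basis.T @ basis + mean  # m-PC reconstruction of t
        return t + alpha * (t - estimate)               # enhanced transmittance

    def fit_stain_transform(enhanced_train, target_train):
        """Least-squares n x n matrix mapping enhanced H&E spectra to target (MT) spectra."""
        W, *_ = np.linalg.lstsq(enhanced_train, target_train, rcond=None)
        return W                                        # apply as spectra @ W

    # Synthetic example with an assumed 8-band camera
    n_bands = 8
    background = np.random.rand(500, n_bands)
    he_train, mt_train = np.random.rand(200, n_bands), np.random.rand(200, n_bands)
    W = fit_stain_transform(enhance_transmittance(he_train, background), mt_train)
    ```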

  9. Multispectral fundus imaging for early detection of diabetic retinopathy

    NASA Astrophysics Data System (ADS)

    Beach, James M.; Tiedeman, James S.; Hopkins, Mark F.; Sabharwal, Yashvinder S.

    1999-04-01

    Functional imaging of the retina and associated structures may provide information for early assessment of the risk of developing retinopathy in diabetic patients. Here we show results of retinal oximetry performed using multi-spectral reflectance imaging techniques to assess hemoglobin (Hb) oxygen saturation (OS) in blood vessels of the inner retina and oxygen utilization at the optic nerve, in diabetic patients without retinopathy or with early disease, during experimental hyperglycemia. Retinal images were obtained through a fundus camera and simultaneously recorded at up to four wavelengths using image-splitting modules coupled to a digital camera. Changes in OS in large retinal vessels, in average OS in disk tissue, and in the reduced state of cytochrome oxidase (CO) at the disk were determined from changes in reflectance associated with the oxidation/reduction states of Hb and CO. A step to high blood sugar lowered venous oxygen saturation to a degree dependent on disease duration. A moderate increase in blood sugar produced higher levels of reduced CO in both the disk and surrounding tissue without a detectable change in average tissue OS. The results suggest that regulation of retinal blood supply and oxygen consumption are altered by hyperglycemia and that such functional changes are present before clinical signs of retinopathy.

  10. An airborne laser polarimeter system (ALPS) for terrestrial physics research

    NASA Technical Reports Server (NTRS)

    Kalshoven, James E., Jr.; Dabney, Philip W.

    1988-01-01

    The design of a multispectral polarized laser system for characterizing the depolarization properties of the earth's surface is described. Using a laser as the light source, this airborne system measures the Stokes parameters of the surface to simultaneously arrive at the polarization degree, azimuthal angle, and ellipticity for each wavelength. The technology will be studied for the feasibility of expansion of the sensor to do surface polarization imaging. The data will be used in support of solar polarization studies and to develop laser radiometry as a tool in environmental remote sensing.

  11. Multispectral Imaging from Mars PATHFINDER

    NASA Technical Reports Server (NTRS)

    Ferrand, William H.; Bell, James F., III; Johnson, Jeffrey R.; Bishop, Janice L.; Morris, Richard V.

    2007-01-01

    The Imager for Mars Pathfinder (IMP) was a mast-mounted instrument on the Mars Pathfinder lander, which landed on Mars' Ares Vallis floodplain on July 4, 1997. During the 83 sols of Mars Pathfinder's landed operations, the IMP collected over 16,600 images. Multispectral images were collected using twelve narrowband filters at wavelengths between 400 and 1000 nm in the visible and near-infrared (VNIR) range. The IMP provided VNIR spectra of the materials surrounding the lander, including rocks, bright soils, dark soils, and atmospheric observations. During the primary mission, only a single primary rock spectral class, Gray Rock, was recognized; since then, a second class, Black Rock, has been identified. The Black Rock spectra have a stronger absorption at longer wavelengths than do Gray Rock spectra. A number of coated rocks have also been described, the Red and Maroon Rock classes, and perhaps indurated soils in the form of the Pink Rock class. A number of different soil types were also recognized, with the primary ones being Bright Red Drift, Dark Soil, Brown Soil, and Disturbed Soil. Examination of spectral parameter plots indicated two trends, which were interpreted as representing alteration products formed in at least two different environmental epochs of the Ares Vallis area. Subsequent analysis of the data and comparison with terrestrial analogs have supported the interpretation that the rock coatings provide evidence of earlier martian environments. However, the presence of relatively uncoated examples of the Gray and Black Rock classes indicates that relatively unweathered materials can persist on the martian surface.

  12. Multi-spectral IR reflectography

    NASA Astrophysics Data System (ADS)

    Fontana, Raffaella; Bencini, Davide; Carcagnì, Pierluigi; Greco, Marinella; Mastroianni, Maria; Materazzi, Marzia; Pampaloni, Enrico; Pezzati, Luca

    2007-07-01

    A variety of scientific investigation methods applied to paintings are, by now, an integral part of the restoration process, both to plan the restoration intervention and to monitor its various phases. Optical techniques are widely diffused and extremely well received in the field of painting diagnostics because of their effectiveness and safety. Among them, infrared reflectography is traditionally employed in the non-destructive diagnostics of ancient paintings to reveal features underlying the pictorial layer, thanks to the transparency to NIR radiation of the materials composing the paint layers. High-resolution reflectography was introduced in the 90s at the Istituto Nazionale di Ottica Applicata, where a prototype of an innovative scanner was developed, working in the 900-1700 nm spectral range. This technique was recently improved with the introduction of an optical head able to acquire the reflectogram and the color image simultaneously and perfectly superimposed. In this work we present a scanning device for multi-spectral IR reflectography, based on contact-less, single-point measurement of the reflectance of painted surfaces. The back-scattered radiation is focused on a square-shaped fiber bundle that carries the light to an array of 14 photodiodes equipped with band-pass filters so as to cover the NIR spectral range from 800 to 2500 nm.

  13. Vacuum Camera Cooler

    NASA Technical Reports Server (NTRS)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring cheap, moving video was impossible in a vacuum environment, due to camera overheating. This overheating is brought on by the lack of cooling media in vacuum. A water-jacketed camera cooler enclosure machined and assembled from copper plate and tube has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum. With some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test, within a vacuum environment.

  14. Making Ceramic Cameras

    ERIC Educational Resources Information Center

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  15. Constrained space camera assembly

    DOEpatents

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  16. Adaptation of the Camera Link Interface for Flight-Instrument Applications

    NASA Technical Reports Server (NTRS)

    Randall, David P.; Mahoney, John C.

    2010-01-01

    COTS (commercial-off-the-shelf) hardware using an industry-standard Camera Link interface is proposed to accomplish the task of designing, building, assembling, and testing electronics for an airborne spectrometer that would be low-cost but sustain the required data speed and volume. The focal plane electronics were designed to support that hardware standard. Analysis was done to determine how these COTS electronics could be interfaced with space-qualified camera electronics. Interfaces available for spaceflight application do not support the industry-standard Camera Link interface, but with careful design, COTS EGSE (electronics ground support equipment), including camera interfaces and camera simulators, can still be used.

  17. Integration of visible-through microwave-range multispectral image data sets for geologic mapping

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dietz, John B.

    1991-01-01

    Multispectral remote sensing data sets collected during the Geologic Remote Sensing Field Experiment (GRSFE) conducted during 1989 in the southwestern U.S. were used to produce thematic image maps showing details of the surface geology. LANDSAT TM (Thematic Mapper) images were used to map the distribution of clays, carbonates, and iron oxides. AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) data were used to identify and map calcite, dolomite, sericite, hematite, and goethite, including mixtures. TIMS (Thermal Infrared Multispectral Scanner) data were used to map the distribution of igneous rock phases and carbonates based on their silica contents. AIRSAR (Airborne Synthetic Aperture Radar) data were used to map surface textures related to the scale of surface roughness. The AIRSAR data also allowed identification of previously unmapped fault segments and of structural control of lithology and mineralogy. Because all of the above data sets were geographically referenced, combination of different data types and direct comparison of the results with conventional field and laboratory data sets allowed improved geologic mapping of the test site.

  18. Nanosecond frame cameras

    SciTech Connect

    Frank, A M; Wilkins, P R

    2001-01-05

    The advent of CCD cameras and computerized data recording has spurred the development of several new cameras and techniques for recording nanosecond images. We have made a side-by-side comparison of three nanosecond frame cameras, examining both their performance and their operational characteristics. The cameras are three gating/data-recording combinations: Micro-Channel Plate/CCD, Image Diode/CCD, and Image Diode/Film. The advantages and disadvantages of each device are discussed.

  19. Harpicon camera for HDTV

    NASA Astrophysics Data System (ADS)

    Tanada, Jun

    1992-08-01

    Ikegami has been involved in broadcast equipment ever since it was established as a company. In conjunction with NHK it has brought forth countless television cameras, from black-and-white cameras to color cameras, HDTV cameras, and special-purpose cameras. In the early days of HDTV (high-definition television, also known as "High Vision"), camera specifications differed from those of the present-day system, and cameras using all kinds of components, with different component arrangements and different appearances, were developed into products, with much time spent on experimentation, design, fabrication, adjustment, and inspection. But recently the know-how built up thus far in components, printed circuit boards, and wiring methods has been incorporated into camera fabrication, making it possible to make HDTV cameras by methods similar to those of the present system. In addition, more efficient production, lower costs, and better after-sales service are being achieved by using the same circuits, components, mechanical parts, and software for both HDTV cameras and cameras that operate by the present system.

  20. Digital Pinhole Camera

    ERIC Educational Resources Information Center

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  1. Radiometric performance of the Viking Mars lander cameras

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Burcher, E. E.; Taylor, E. J.; Wall, S. D.

    1975-01-01

    The Viking lander cameras feature an array of 12 silicon photodiodes for electronic focus selection and multispectral imaging. Comparisons of absolute radiometric calibrations of the four cameras selected for the mission to Mars with performance predictions based on their design data revealed minor discrepancies. These discrepancies were caused primarily by the method used to calibrate the photosensor array, and apparently also by light reflections internal to the array. The sensitivity and dynamic range of all camera channels are found to be sufficient for high-quality pictures, provided that the commandable gains and offsets can be optimized for the scene radiance; otherwise, the quantization noise may be too high or the dynamic range too low for an adequate characterization of the scene.

  2. Detection in urban scenario using combined airborne imaging sensors

    NASA Astrophysics Data System (ADS)

    Renhorn, Ingmar; Axelsson, Maria; Benoist, Koen; Bourghys, Dirk; Boucher, Yannick; Briottet, Xavier; De Ceglie, Sergio; Dekker, Rob; Dimmeler, Alwin; Dost, Remco; Friman, Ola; Kåsen, Ingebjørg; Maerker, Jochen; van Persie, Mark; Resta, Salvatore; Schwering, Piet; Shimoni, Michal; Haavardsholm, Trym Vegard

    2012-06-01

    The EDA project "Detection in Urban scenario using Combined Airborne imaging Sensors" (DUCAS) is in progress. The aim of the project is to investigate the potential benefit of combined high spatial and spectral resolution airborne imagery for several defense applications in the urban area. The project is taking advantage of the combined resources from 7 contributing nations within the EDA framework. An extensive field trial has been carried out in the city of Zeebrugge at the Belgian coast in June 2011. The Belgian armed forces contributed with platforms, weapons, personnel (soldiers) and logistics for the trial. Ground truth measurements with respect to geometrical characteristics, optical material properties and weather conditions were obtained in addition to hyperspectral, multispectral and high resolution spatial imagery. High spectral/spatial resolution sensor data are used for detection, classification, identification and tracking.

  3. 1994-1995 CNR LARA project airborne hyperspectral campaigns

    SciTech Connect

    Bianchi, R.; Cavalli, R.M.; Fiumi, L.

    1996-08-01

    CNR (the Italian National Research Council) established a new laboratory for airborne hyperspectral imaging devoted to environmental problems, and since late June 1994 the LARA (Airborne Laboratory for Environmental Studies) Project has been fully operational, providing hyperspectral data to the national and international scientific community. The Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument, acquired by CNR in the framework of the LARA Project, has been in intensive operation. A number of MIVIS deployments have been carried out in Italy and Europe in cooperation with national and international institutions on a variety of sites, including active volcanoes, coastlines, lagoons and ocean, vegetated and cultivated areas, oil-polluted surfaces, waste discharges, and archeological sites. One year of activity has shown the high efficiency of the system, from the survey to data preprocessing and dissemination.

  4. 2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING WEST TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  5. 6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VAL CAMERA CAR, DETAIL OF COMMUNICATION EQUIPMENT INSIDE CAMERA CAR WITH CAMERA MOUNT IN FOREGROUND. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. 7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA CAR, DETAIL OF 'FLARE' OR TRAJECTORY CAMERA INSIDE CAMERA CAR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. Geometric calibration and accuracy assessment of a multispectral imager on UAVs

    NASA Astrophysics Data System (ADS)

    Zheng, Fengjie; Yu, Tao; Chen, Xingfeng; Chen, Jiping; Yuan, Guoti

    2012-11-01

    The increasing development of Unmanned Aerial Vehicle (UAV) platforms and associated sensing technologies has widely promoted UAV remote sensing applications. UAVs, especially low-cost UAVs, limit the sensor payload in weight and dimension. Cameras on UAVs are mostly panoramic, fisheye-lens, or small-format CCD planar-array cameras; unknown intrinsic parameters and lens optical distortion cause serious image aberrations, leading to errors of a few meters or even tens of meters on the ground per pixel. At the same time, the high spatial resolution of UAV imagery makes accurate geolocation all the more critical for quantitative remote sensing research. A geometric calibration method for the MCC4-12F multispectral imager, designed to be carried on UAVs, has been developed and implemented. A multi-image space resection algorithm was used to estimate calibration parameters from images taken at random positions and different photogrammetric altitudes in a 3D test field, an approach suitable for multispectral cameras. Both theoretical and practical accuracy assessments were performed. For the theoretical assessment, which resolves object-space and image-point coordinate differences by space intersection, the object-space RMSE was 0.2 pixel in the X direction and 0.14 pixel in the Y direction, and the image-space RMSE was better than 0.5 pixel. To verify the accuracy and reliability of the calibration parameters, a practical study was carried out in UAV flight experiments over Tianjin; the accuracy of the corrected imagery, validated with ground checkpoints, was better than 0.3 m. Typical surface reflectance retrieved from the geo-rectified data was compared with ground ASD measurements, showing a 4% discrepancy. Hence, the approach presented here is suitable for UAV multispectral imagers.

  8. Hyperspectral and multispectral bioluminescence optical tomography for small animal imaging

    NASA Astrophysics Data System (ADS)

    Chaudhari, Abhijit J.; Darvas, Felix; Bading, James R.; Moats, Rex A.; Conti, Peter S.; Smith, Desmond J.; Cherry, Simon R.; Leahy, Richard M.

    2005-12-01

    For bioluminescence imaging studies in small animals, it is important to be able to accurately localize the three-dimensional (3D) distribution of the underlying bioluminescent source. The spectrum of light produced by the source that escapes the subject varies with the depth of the emission source because of the wavelength-dependence of the optical properties of tissue. Consequently, multispectral or hyperspectral data acquisition should help in the 3D localization of deep sources. In this paper, we describe a framework for fully 3D bioluminescence tomographic image acquisition and reconstruction that exploits spectral information. We describe regularized tomographic reconstruction techniques that use semi-infinite slab or FEM-based diffusion approximations of photon transport through turbid media. Singular value decomposition analysis was used for data dimensionality reduction and to illustrate the advantage of using hyperspectral rather than achromatic data. Simulation studies in an atlas-mouse geometry indicated that sub-millimeter resolution may be attainable given accurate knowledge of the optical properties of the animal. A fixed arrangement of mirrors and a single CCD camera were used for simultaneous acquisition of multispectral imaging data over most of the surface of the animal. Phantom studies conducted using this system demonstrated our ability to accurately localize deep point-like sources and show that a resolution of 1.5 to 2.2 mm for depths up to 6 mm can be achieved. We also include an in vivo study of a mouse with a brain tumour expressing firefly luciferase. Co-registration of the reconstructed 3D bioluminescent image with magnetic resonance images indicated good anatomical localization of the tumour.
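
    The regularized reconstruction step described in this record can be illustrated with a small numerical sketch. The forward matrix, source vector and regularization weight below are invented placeholders, not the authors' FEM-based diffusion model; the sketch only shows the SVD-filtered (Tikhonov-style) inversion idea on which such reconstructions rest.

```python
import numpy as np

# Hypothetical forward model: A maps an unknown 3D source distribution x to
# the multispectral surface measurements b (rows = detector/wavelength pairs).
rng = np.random.default_rng(2)
A = rng.standard_normal((800, 2000))
x_true = np.zeros(2000)
x_true[1234] = 1.0                       # a single deep point-like source
b = A @ x_true + 0.01 * rng.standard_normal(800)

# Tikhonov-regularized inverse computed through the SVD of the forward model.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
lam = 0.1                                # assumed regularization weight
filt = s / (s ** 2 + lam ** 2)           # damped inverse singular values
x_hat = Vt.T @ (filt * (U.T @ b))        # regularized source estimate
```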

  9. Tower Camera Handbook

    SciTech Connect

    Moudry, D

    2005-01-01

    The tower camera in Barrow provides hourly images of ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for comparison with the albedo that can be calculated from downward-looking radiometers, as well as some indication of present weather. Similarly, during spring time, the camera images show the changes in the ground albedo as the snow melts. The tower images are saved in hourly intervals. In addition, two other cameras, the skydeck camera in Barrow and the piling camera in Atqasuk, show the current conditions at those sites.

  10. Unsupervised classification of remote multispectral sensing data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The new unsupervised classification technique for multispectral remote sensing data, which can come either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters that are input to (b) for further improvement by an iterative scheme. The technique was applied, using an IBM 7094 computer, to multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site. Comparisons between the classification maps produced by the unsupervised technique and by the supervised maximum-likelihood technique indicate that the classification accuracies are in agreement.
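
    The two-stage scheme can be sketched compactly. The code below is a minimal illustration, not the original IBM 7094 implementation: the one-pass step is a simplified distance-threshold stand-in for the sequential variance analysis, scikit-learn's KMeans performs the iterative refinement, and the data, band count and threshold are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

def sequential_seeds(pixels, threshold):
    """Simplified one-pass clustering: start a new cluster whenever a pixel
    is farther than `threshold` from every existing cluster mean."""
    means, counts = [], []
    for x in pixels:
        if means:
            d = np.linalg.norm(np.array(means) - x, axis=1)
            j = int(np.argmin(d))
            if d[j] < threshold:
                counts[j] += 1
                means[j] = means[j] + (x - means[j]) / counts[j]  # running mean
                continue
        means.append(x.astype(float))
        counts.append(1)
    return np.array(means)

# pixels: (n_pixels, n_bands) multispectral samples; random placeholder data.
pixels = np.random.rand(10000, 4)
seeds = sequential_seeds(pixels, threshold=0.35)
labels = KMeans(n_clusters=len(seeds), init=seeds, n_init=1).fit_predict(pixels)
```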

  11. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
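
    As a rough illustration of the 3D-to-2D calibration step that ACAL automates, the sketch below synthesizes correspondences for an assumed planar fiducial target and recovers a camera model with OpenCV's calibrateCamera. OpenCV is not the tool described in the record, and the target geometry, poses and intrinsics are invented for the example.

```python
import numpy as np
import cv2

# Synthetic 3D-to-2D correspondences: a planar 9x6 grid of fiducials with an
# assumed 25 mm spacing, viewed from four invented poses by an assumed camera.
obj = np.zeros((9 * 6, 3), np.float32)
obj[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2) * 0.025

K_true = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist_true = np.zeros(5)
poses = [((0.1, 0.1, 0.0), (-0.10, -0.05, 0.60)),
         ((-0.3, 0.2, 0.1), (0.05, -0.10, 0.70)),
         ((0.2, -0.3, -0.1), (-0.05, 0.05, 0.50)),
         ((0.0, 0.4, 0.2), (0.10, 0.00, 0.65))]

obj_pts, img_pts = [], []
for rvec, tvec in poses:
    proj, _ = cv2.projectPoints(obj, np.array(rvec, float),
                                np.array(tvec, float), K_true, dist_true)
    obj_pts.append(obj)
    img_pts.append(proj.astype(np.float32))

# Recover the camera model (intrinsics + distortion) from the correspondences.
rms, K_est, dist_est, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, (640, 480), None, None)
print("reprojection RMS (px):", rms)
```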

  12. Optimal out-of-band correction for multispectral remote sensing.

    PubMed

    Chen, Wei

    2012-11-20

    In this paper, an optimal out-of-band (OOB) correction transform (OOBCT) for dealing with onboard Visible/Infrared Imaging Radiometer Suite (VIIRS) OOB effects is proposed. This paper addresses the OOB response issue without consideration of the impact of other error sources on the correction processing. The OOBCT matrix is derived by minimizing an objective function summing the errors between the expected and the actually recovered band-averaged spectral radiances. Using the VIIRS filter transmittance functions for all multiband sensors obtained from prelaunch laboratory measurements and a simulated dataset obtained from Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) hyperspectral data, the OOBCT matrix is numerically computed. The processing of the OOB correction is straightforward and can be performed by a product between the OOBCT matrix and a measured multispectral image vector. The experimental results with both AVIRIS and Hyperspectral Imager for the Coastal Ocean datasets demonstrate that the ratios of average errors of recovered band-averaged spectral radiances divided by the measured radiances with the OOB responses are less than 4%. The average values of the relative errors for all pixels and bands indicate that the OOBCT method outperforms the methods previously reported in the literature.
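
    A minimal sketch of the correction idea follows, under the assumption that matched pairs of ideal (in-band) and OOB-contaminated band-averaged radiances have already been simulated, for instance from hyperspectral data. The random arrays below stand in for that simulation, and the plain least-squares fit is a generic stand-in for the paper's objective-function minimization.

```python
import numpy as np

# Hypothetical simulation: L_ideal holds band-averaged radiances computed with
# in-band-only responses, L_meas the same samples with OOB-contaminated
# responses. Both are (n_samples, n_bands); here they are random placeholders.
rng = np.random.default_rng(0)
n_samples, n_bands = 5000, 7
L_ideal = rng.uniform(0, 100, (n_samples, n_bands))
L_meas = L_ideal @ (np.eye(n_bands) + 0.03 * rng.standard_normal((n_bands, n_bands)))

# Least-squares fit of a band-to-band correction matrix M such that
# L_meas @ M.T ~= L_ideal (one linear transform shared by all pixels).
M, *_ = np.linalg.lstsq(L_meas, L_ideal, rcond=None)
M = M.T

# Applying the correction to a measured multispectral pixel vector.
pixel = L_meas[0]
corrected = M @ pixel
```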

  13. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Kurt, Thome; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also developed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal of the cross-calibration method is to transfer the calibration of a well-known sensor to a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands, as well as similar agreement between results that employ the different MODIS sensors as a reference.
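
    The band-averaging comparison can be sketched as weighting the hyperspectral radiance spectrum by the broad-band sensor's relative spectral response (RSR). The spectra and response below are invented placeholders, not actual Hyperion or MODIS data.

```python
import numpy as np

def band_average(wavelengths, radiance, rsr_wl, rsr):
    """Convolve a hyperspectral radiance spectrum with a broad-band sensor's
    relative spectral response to simulate that sensor's band radiance."""
    rsr_i = np.interp(wavelengths, rsr_wl, rsr, left=0.0, right=0.0)
    return np.trapz(rsr_i * radiance, wavelengths) / np.trapz(rsr_i, wavelengths)

# Placeholder spectra: a Hyperion-like radiance curve and a red-band-like RSR.
wl = np.arange(400, 2500, 10.0)                 # nm
radiance = 50 + 0.01 * wl                       # hypothetical radiance spectrum
rsr_wl = np.arange(620, 671, 1.0)               # hypothetical band response grid
rsr = np.exp(-0.5 * ((rsr_wl - 645) / 10) ** 2)

print(band_average(wl, radiance, rsr_wl, rsr))
```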

  14. High-speed multispectral confocal imaging

    NASA Astrophysics Data System (ADS)

    Carver, Gary E.; Locknar, Sarah A.; Morrison, William A.; Farkas, Daniel L.

    2013-02-01

    A new approach for generating high-speed multispectral images has been developed. The central concept is that spectra can be acquired for each pixel in a confocal spatial scan by using a fast spectrometer based on optical fiber delay lines. This concept merges fast spectroscopy with standard spatial scanning to create datacubes in real time. The spectrometer is based on a serial array of reflecting spectral elements, delay lines between these elements, and a single element detector. The spatial, spectral, and temporal resolution of the instrument is described, and illustrated by multispectral images of laser-induced autofluorescence in biological tissues.

  15. High-speed multispectral confocal biomedical imaging

    PubMed Central

    Carver, Gary E.; Locknar, Sarah A.; Morrison, William A.; Krishnan Ramanujan, V.; Farkas, Daniel L.

    2014-01-01

    Abstract. A new approach for generating high-speed multispectral confocal images has been developed. The central concept is that spectra can be acquired for each pixel in a confocal spatial scan by using a fast spectrometer based on optical fiber delay lines. This approach merges fast spectroscopy with standard spatial scanning to create datacubes in real time. The spectrometer is based on a serial array of reflecting spectral elements, delay lines between these elements, and a single element detector. The spatial, spectral, and temporal resolution of the instrument is described and illustrated by multispectral images of laser-induced autofluorescence in biological tissues. PMID:24658777

  16. Coastal and estuarine applications of multispectral photography

    NASA Technical Reports Server (NTRS)

    Yost, E.; Wenderoth, S.

    1972-01-01

    An evaluation of multispectral photographic techniques for optical penetration of water in the northeastern United States and Gulf of Mexico coastal waters is presented. The 493 to 543 nanometer spectral band, when exposed so as to place the water mass at about unit density on the photographic emulsion, was found to provide the best water penetration, independent of altitude or time of day, as long as solar glitter from the surface of the water is avoided. An isoluminous color technique was perfected, which eliminates the dimension of brightness from a multispectral color presentation.

  17. Airborne remote sensing for Deepwater Horizon oil spill emergency response

    NASA Astrophysics Data System (ADS)

    Kroutil, Robert T.; Shen, Sylvia S.; Lewis, Paul E.; Miller, David P.; Cardarelli, John; Thomas, Mark; Curry, Timothy; Kudaraskus, Paul

    2010-08-01

    On April 28, 2010, the Environmental Protection Agency's (EPA) Airborne Spectral Photometric Environmental Collection Technology (ASPECT) aircraft was deployed to Gulfport, Mississippi to provide airborne remotely sensed air monitoring and situational awareness data and products in response to the Deepwater Horizon oil rig disaster. The ASPECT aircraft was released from service on August 9, 2010 after having flown over 75 missions that included over 250 hours of flight operation. ASPECT's initial mission responsibility was to provide air quality monitoring (i.e., identification of vapor species) during various oil burning operations. The ASPECT airborne wide-area infrared remote sensing spectral data was used to evaluate the hazard potential of vapors being produced from open water oil burns near the Deepwater Horizon rig site. Other significant remote sensing data products and innovations included the development of an advanced capability to correctly identify, locate, characterize, and quantify surface oil that could reach beaches and wetland areas. This advanced identification product provided the Incident Command an improved capability to locate surface oil in order to improve the effectiveness of oil skimmer vessel recovery efforts directed by the US Coast Guard. This paper discusses the application of infrared spectroscopy and multispectral infrared imagery to address significant issues associated with this national crisis. More specifically, this paper addresses the airborne remote sensing capabilities, technology, and data analysis products developed specifically to optimize the resources and capabilities of the Deepwater Horizon Incident Command structure personnel and their remediation efforts.

  18. Data Requirements For The Water Directive - Role of Remote Sensing Using The Hrsc-ax Camera Illustrated By The Blue City Project

    NASA Astrophysics Data System (ADS)

    Martin, J.; O'Kane, J. P.

    A completely digital aerial survey of Cork City and the Lee Catchment, flown in May 2001 at an altitude of 3000 m, covered an area of 325 sq km. The campaign used the German Aerospace Centre's HRSC-AX system. The AX (Airborne, Extended Generation) version of the HRSC (High Resolution Stereo Camera), in commercial operation since late 2000, produces very high resolution Digital Elevation Models (DEMs) and multi-spectral ortho-rectified images. The system contains nine CCD line sensors mounted in parallel behind a single optic. Operating on a push-broom principle, these acquire nine superimposed image tracks simultaneously (along-track). Five of the nine CCD lines are panchromatic sensors arranged at specific viewing angles to provide the multiple-stereo and photogrammetric capabilities of the instrument. The other four are covered with colour filters (red, green, blue and near infrared) to acquire multi-spectral images. The fully automatic photogrammetric and cartographic processing system, developed at the DLR Institute of Space Sensor Technology and Planetary Exploration in co-operation with the Technical University of Berlin, yields mean absolute accuracies of 15-20 cm in both the horizontal and vertical directions for all products. For the Blue City Project, the system yielded a DEM with a resolution of 100 cm, a nadir (panchromatic) ortho-mosaic with a 15 cm resolution, and individual colour-band (R, G, B and nIR) ortho-mosaics with a 30 cm resolution. Joining the colour bands yielded true-colour (RGB) and false-colour (nIR, G, B) ortho-image mosaics of the full area. The combination of these high resolution products allows detailed topographic and multi-spectral analysis of the study area. This data set will provide the foundation for a multi-purpose geographic information system, linked to hydrodynamic, hydraulic and hydrologic computer models of surface and subsurface flow, including water quality, throughout the catchment of the Lee and the City of Cork.

  19. Non-contact assessment of melanin distribution via multispectral temporal illumination coding

    NASA Astrophysics Data System (ADS)

    Amelard, Robert; Scharfenberger, Christian; Wong, Alexander; Clausi, David A.

    2015-03-01

    Melanin is a pigment that is highly absorptive in the UV and visible electromagnetic spectra. It is responsible for perceived skin tone, and protects against harmful UV effects. Abnormal melanin distribution is often an indicator of melanoma. We propose a novel approach for non-contact assessment of melanin distribution via multispectral temporal illumination coding, estimating the two-dimensional melanin distribution from its absorptive characteristics. In the proposed system, a novel multispectral, cross-polarized, temporally-coded illumination sequence is synchronized with a camera to measure reflectance under both multispectral and ambient illumination. This allows us to eliminate the ambient illumination contribution from the acquired reflectance measurements, and also to determine the melanin distribution in an observed region based on the spectral properties of melanin using the Beer-Lambert law. Using this information, melanin distribution maps can be generated for objective, quantitative assessment of the skin type of individuals. We show that the melanin distribution map correctly identifies areas with high melanin densities (e.g., nevi).
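
    A minimal sketch of the Beer-Lambert step is given below, assuming the ambient-corrected per-band reflectance is already available. The band count and melanin extinction coefficients are placeholders rather than the published values, and a single-absorber model is assumed, so the output is a relative index rather than the authors' calibrated map.

```python
import numpy as np

# Hypothetical per-band reflectance stack (H, W, n_bands) after ambient
# subtraction, plus assumed melanin extinction coefficients at those bands.
reflectance = np.random.rand(64, 64, 4).clip(0.05, 1.0)
eps_melanin = np.array([1.8, 1.2, 0.7, 0.4])     # assumed, per band

# Beer-Lambert: absorbance A = -log10(R); with a single dominant absorber,
# A ~= eps * (concentration * path length), so a per-pixel least-squares
# projection onto eps gives a relative melanin density map (arbitrary units).
A = -np.log10(reflectance)                        # (H, W, n_bands)
melanin_map = (A @ eps_melanin) / (eps_melanin @ eps_melanin)
```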

  20. Hemodynamic and morphologic responses in mouse brain during acute head injury imaged by multispectral structured illumination

    NASA Astrophysics Data System (ADS)

    Volkov, Boris; Mathews, Marlon S.; Abookasis, David

    2015-03-01

    Multispectral imaging has received significant attention over the last decade, as it integrates spectroscopy, imaging, and tomographic analysis to acquire both spatial and spectral information from biological tissue. In the present study, a multispectral setup based on the projection of structured illumination at several near-infrared wavelengths and at different spatial frequencies is applied to quantitatively assess brain function before, during, and after the onset of traumatic brain injury in the intact mouse brain (n=5). Head injury was produced with the weight-drop method, in which a cylindrical metallic rod falling along a metal tube strikes the mouse's head. Structured light was projected onto the scalp surface and diffusely reflected light was recorded by a CCD camera positioned perpendicular to the mouse head. Following data analysis, we were able to show concurrently a series of hemodynamic and morphologic changes over time, including higher deoxyhemoglobin, reduced oxygen saturation, cell swelling, etc., in comparison with baseline measurements. Overall, the results demonstrate the capability of multispectral imaging based on structured illumination to detect and map brain tissue optical and physiological properties following brain injury in a simple, noninvasive, and noncontact manner.
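
    A common way to process such data is the standard three-phase structured-illumination demodulation, sketched below for one wavelength and one spatial frequency. The frames are random placeholders, and the authors' actual pipeline (including the subsequent optical-property inversion) may differ.

```python
import numpy as np

def demodulate(i1, i2, i3):
    """Standard three-phase demodulation of structured-illumination images
    taken at 0, 120 and 240 degree phase shifts of the projected pattern."""
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt((i1 - i2) ** 2 +
                                        (i2 - i3) ** 2 +
                                        (i3 - i1) ** 2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc

# Placeholder camera frames for one wavelength and one spatial frequency.
i1, i2, i3 = (np.random.rand(256, 256) for _ in range(3))
m_ac, m_dc = demodulate(i1, i2, i3)   # diffuse reflectance follows after
                                      # calibration against a reference phantom
```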

  1. Microchannel plate streak camera

    DOEpatents

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  2. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1984-09-28

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (uv to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  3. Microchannel plate streak camera

    DOEpatents

    Wang, C.L.

    1989-03-21

    An improved streak camera in which a microchannel plate electron multiplier is used in place of or in combination with the photocathode used in prior streak cameras is disclosed. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1,000 keV x-rays. 3 figs.

  4. GRACE star camera noise

    NASA Astrophysics Data System (ADS)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  5. Estimating noise and information for multispectral imagery

    NASA Astrophysics Data System (ADS)

    Aiazzi, Bruno; Alparone, Luciano; Barducci, Alessandro; Baronti, Stefano; Pippi, Ivan

    2002-03-01

    We focus on reliably estimating the information conveyed to a user by multispectral image data. The goal is to establish the extent to which an increase in spectral resolution can increase the amount of usable information. As a matter of fact, a trade-off exists between spatial and spectral resolution, due to the physical constraints of sensors imaging with a prescribed SNR. After describing some methods developed for automatically estimating the variance of the noise introduced by multispectral imagers, lossless data compression is exploited to measure the useful information content of the multispectral data. In fact, the bit rate achieved by the reversible compression process accounts both for the contribution of the 'observation' noise, i.e., information regarded as statistical uncertainty, whose relevance to a user is null, and for the intrinsic information of hypothetically noise-free multispectral data. An entropic model of the image source is defined and, once the standard deviation of the noise, assumed to be white and Gaussian, has been preliminarily estimated, this model is inverted to yield an estimate of the information content of the noise-free source from the code rate. Results of both noise and information assessment are reported and discussed for synthetic noisy images and for Landsat Thematic Mapper (TM) data.
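
    The noise-plus-information bookkeeping can be sketched as follows, assuming white Gaussian noise: a rough noise estimate is subtracted, in entropy terms, from the lossless code rate. The difference-based noise estimator and the example code rate are illustrative stand-ins, not the authors' estimators or entropic source model.

```python
import numpy as np

def noise_sigma_mad(band):
    """Rough noise estimate from the median absolute deviation of the
    horizontal first differences of a band (a common homogeneous-area proxy)."""
    d = np.diff(band.astype(float), axis=1)
    return 1.4826 * np.median(np.abs(d - np.median(d))) / np.sqrt(2.0)

def useful_bits_per_pixel(code_rate_bpp, sigma):
    """Subtract the entropy of white Gaussian noise quantized at unit step
    from the lossless code rate, approximating the information content of the
    hypothetically noise-free source."""
    noise_entropy = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)
    return max(code_rate_bpp - noise_entropy, 0.0)

band = np.random.normal(100, 5, (512, 512))        # placeholder noisy band
print(useful_bits_per_pixel(code_rate_bpp=6.1, sigma=noise_sigma_mad(band)))
```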

  6. Multispectral laser imaging for advanced food analysis

    NASA Astrophysics Data System (ADS)

    Senni, L.; Burrascano, P.; Ricci, M.

    2016-07-01

    A hardware-software apparatus for food inspection capable of realizing multispectral NIR laser imaging at four different wavelengths is herein discussed. The system was designed to operate in a through-transmission configuration to detect the presence of unwanted foreign bodies inside samples, whether packed or unpacked. A modified Lock-In technique was employed to counterbalance the significant signal intensity attenuation due to transmission across the sample and to extract the multispectral information more efficiently. The NIR laser wavelengths used to acquire the multispectral images can be varied to deal with different materials and to focus on specific aspects. In the present work the wavelengths were selected after a preliminary analysis to enhance the image contrast between foreign bodies and food in the sample, thus identifying the location and nature of the defects. Experimental results obtained from several specimens, with and without packaging, are presented and the multispectral image processing as well as the achievable spatial resolution of the system are discussed.
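
    The lock-in idea can be sketched with a simple digital demodulator: mix the detected waveform with in-phase and quadrature references at the modulation frequency and average. The sampling rate, modulation frequency and noise level below are placeholders, and the "modified" aspects of the authors' technique are not reproduced here.

```python
import numpy as np

def lock_in(signal, fs, f_mod):
    """Digital lock-in demodulation: mix the detected waveform with in-phase
    and quadrature references at the modulation frequency and average."""
    t = np.arange(len(signal)) / fs
    i = np.mean(signal * np.cos(2 * np.pi * f_mod * t))
    q = np.mean(signal * np.sin(2 * np.pi * f_mod * t))
    return 2 * np.hypot(i, q)            # recovered modulation amplitude

fs, f_mod = 100_000.0, 1_000.0           # placeholder sample/modulation rates
t = np.arange(0, 0.1, 1 / fs)
detected = 0.02 * np.sin(2 * np.pi * f_mod * t) + 0.5 * np.random.randn(len(t))
print(lock_in(detected, fs, f_mod))      # ~0.02 despite the strong noise
```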

  7. Analytical multicollimator camera calibration

    USGS Publications Warehouse

    Tayman, W.P.

    1978-01-01

    Calibration with the U.S. Geological Survey multicollimator determines the calibrated focal length, the point of symmetry, the radial distortion referred to the point of symmetry, and the asymmetric characteristics of the camera lens. For this project, two cameras were calibrated, a Zeiss RMK A 15/23 and a Wild RC 8. Four test exposures were made with each camera. Results are tabulated for each exposure and averaged for each set. Copies of the standard USGS calibration reports are included. © 1978.

  8. Polarization encoded color camera.

    PubMed

    Schonbrun, Ethan; Möller, Guðfríður; Di Caprio, Giuseppe

    2014-03-15

    Digital cameras would be colorblind if they did not have pixelated color filters integrated into their image sensors. Integration of conventional fixed filters, however, comes at the expense of an inability to modify the camera's spectral properties. Instead, we demonstrate a micropolarizer-based camera that can reconfigure its spectral response. Color is encoded into a linear polarization state by a chiral dispersive element and then read out in a single exposure. The polarization encoded color camera is capable of capturing three-color images at wavelengths spanning the visible to the near infrared. PMID:24690806

  9. Ringfield lithographic camera

    DOEpatents

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large-area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  10. LSST Camera Optics Design

    SciTech Connect

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror, telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. Optical design of the camera lenses and filters is integrated in with the optical design of telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  11. Mars Airborne Prospecting Spectrometer

    NASA Astrophysics Data System (ADS)

    Steinkraus, J. M.; Wright, M. W.; Rheingans, B. E.; Steinkraus, D. E.; George, W. P.; Aljabri, A.; Hall, J. L.; Scott, D. C.

    2012-06-01

    One novel approach towards addressing the need for innovative instrumentation and investigation approaches is the integration of a suite of four spectrometer systems to form the Mars Airborne Prospecting Spectrometers (MAPS) for prospecting on Mars.

  12. Software defined multi-spectral imaging for Arctic sensor networks

    NASA Astrophysics Data System (ADS)

    Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi

    2016-05-01

    Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop-in-place deployments.

  13. Vein visualization using a smart phone with multispectral Wiener estimation for point-of-care applications.

    PubMed

    Song, Jae Hee; Kim, Choye; Yoo, Yangmo

    2015-03-01

    Effective vein visualization is clinically important for various point-of-care applications, such as needle insertion. It can be achieved by utilizing ultrasound imaging or by applying infrared laser excitation and monitoring its absorption. However, while these approaches can be used for vein visualization, they are not suitable for point-of-care applications because of their cost, time, and accessibility. In this paper, a new vein visualization method based on multispectral Wiener estimation is proposed and its real-time implementation on a smart phone is presented. In the proposed method, a conventional RGB camera on a commercial smart phone (i.e., Galaxy Note 2, Samsung Electronics Inc., Suwon, Korea) is used to acquire reflectance information from veins. Wiener estimation is then applied to extract the multispectral information from the veins. To evaluate the performance of the proposed method, an experiment was conducted using a color calibration chart (ColorChecker Classic, X-rite, Grand Rapids, MI, USA) and an average root-mean-square error of 12.0% was obtained. In addition, an in vivo subcutaneous vein imaging experiment was performed to explore the clinical performance of the smart phone-based Wiener estimation. From the in vivo experiment, the veins at various sites were successfully localized using the reconstructed multispectral images and these results were confirmed by ultrasound B-mode and color Doppler images. These results indicate that the presented multispectral Wiener estimation method can be used for visualizing veins using a commercial smart phone for point-of-care applications (e.g., vein puncture guidance). PMID:24691170
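
    A compact sketch of Wiener (least-squares) spectral estimation from RGB follows. The training reflectances and camera responses are random placeholders standing in for the color-chart measurements, and the small ridge term is an assumption rather than the paper's exact noise model.

```python
import numpy as np

# Training data from a color chart (placeholders): R holds the known spectral
# reflectances (n_patches x n_wavelengths), C the corresponding camera RGB
# values (n_patches x 3) measured under the same illumination.
rng = np.random.default_rng(1)
R = rng.uniform(0, 1, (24, 36))
C = rng.uniform(0, 1, (24, 3))

# Wiener estimation matrix (ridge term guards against an ill-conditioned fit).
lam = 1e-3
W = R.T @ C @ np.linalg.inv(C.T @ C + lam * np.eye(3))

# Per-pixel spectral reconstruction from an RGB frame (H, W, 3).
rgb = rng.uniform(0, 1, (120, 160, 3))
spectra = rgb @ W.T          # (120, 160, 36) estimated reflectance spectra
```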

  14. Hybrid Image Fusion for Sharpness Enhancement of Multi-Spectral Lunar Images

    NASA Astrophysics Data System (ADS)

    Awumah, Anna; Mahanti, Prasun; Robinson, Mark

    2016-10-01

    Image fusion enhances the sharpness of a multi-spectral (MS) image by incorporating spatial details from a higher-resolution panchromatic (Pan) image [1,2]. Known applications of image fusion for planetary images are rare, although image fusion is well-known for its applications to Earth-based remote sensing. In a recent work [3], six different image fusion algorithms were implemented and their performances were verified with images from the Lunar Reconnaissance Orbiter (LRO) Camera. The image fusion procedure obtained a high-resolution multi-spectral (HRMS) product from the LRO Narrow Angle Camera (used as Pan) and LRO Wide Angle Camera (used as MS) images. The results showed that the Intensity-Hue-Saturation (IHS) algorithm yields a product of high spatial quality, while the wavelet-based image fusion algorithm best preserves spectral quality among all the algorithms. In this work we show the results of a hybrid IHS-Wavelet image fusion algorithm when applied to LROC MS images. The hybrid method provides the best HRMS product in terms of both spatial resolution and preservation of spectral detail. Results from hybrid image fusion can enable new science and increase the science return from existing LROC images. [1] Pohl, C., and J. L. Van Genderen. "Review article multisensor image fusion in remote sensing: concepts, methods and applications." International Journal of Remote Sensing 19.5 (1998): 823-854. [2] Zhang, Yun. "Understanding image fusion." Photogramm. Eng. Remote Sens 70.6 (2004): 657-661. [3] Mahanti, Prasun et al. "Enhancement of spatial resolution of the LROC Wide Angle Camera images." Archives, XXIII ISPRS Congress Archives (2016).
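
    For reference, the basic intensity-substitution (IHS-style) fusion step can be sketched as below. The arrays are placeholders for the WAC multispectral and NAC panchromatic images, and the hybrid IHS-wavelet method of the record additionally wavelet-filters the injected detail, which this sketch omits.

```python
import numpy as np
import cv2

def ihs_fuse(ms, pan):
    """Basic intensity-substitution pansharpening: upsample the multispectral
    bands to the Pan grid, then add the detail (Pan - intensity) to every
    band. A hybrid IHS-wavelet scheme would inject only filtered detail."""
    h, w = pan.shape
    ms_up = cv2.resize(ms, (w, h), interpolation=cv2.INTER_CUBIC)
    intensity = ms_up.mean(axis=2)
    detail = pan - intensity
    return ms_up + detail[..., None]

ms = np.random.rand(128, 128, 3).astype(np.float32)    # low-res MS placeholder
pan = np.random.rand(512, 512).astype(np.float32)      # high-res Pan placeholder
hrms = ihs_fuse(ms, pan)
```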

  15. The PanCam Calibration Target (PCT) and multispectral image processing for the ExoMars 2018 mission

    NASA Astrophysics Data System (ADS)

    Barnes, D.; Wilding, M.; Gunn, M.; Tyler, L.; Pugh, S.; Coates, A.; Griffiths, A.; Cousins, C.; Schmitz, N.; Paar, G.

    2011-10-01

    The Panoramic Camera (PanCam) instrument for the ESA/NASA 2018 ExoMars mission is designed to be the 'eyes' of the Mars rover and is equipped with two wide angle multispectral cameras (WACs) from MSSL, and a focusable High Resolution Camera (HRC) from DLR. To achieve its science role within the ExoMars mission, the PanCam will generate terrain reflectance spectra to help identify the mineralogy of the Martian surface, and generate true-colour images of the Martian environment. The PanCam Calibration Target (PCT) is an essential component for the science operations of the PanCam instrument. Its purpose is to allow radiometric calibration and to support geometric calibration check-out of the PanCam instrument during the mission. Unlike other camera calibration targets flown to Mars, the PCT target regions are being made from stained glass. The paper describes the work undertaken during the early build and testing of the PCT, together with results from the baseline algorithms that have been designed and implemented to process the multispectral PanCam images.

  16. Camera Operator and Videographer

    ERIC Educational Resources Information Center

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  17. The Camera Cook Book.

    ERIC Educational Resources Information Center

    Education Development Center, Inc., Newton, MA.

    Intended for use with the photographic materials available from the Workshop for Learning Things, Inc., this "camera cookbook" describes procedures that have been tried in classrooms and workshops and proven to be the most functional and inexpensive. Explicit starting off instructions--directions for exploring and loading the camera and for taking…

  18. Constrained space camera assembly

    DOEpatents

    Heckendorn, F.M.; Anderson, E.K.; Robinson, C.W.; Haynes, H.B.

    1999-05-11

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity is disclosed. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras. 17 figs.

  19. CCD Luminescence Camera

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Elliott, Tom

    1987-01-01

    New diagnostic tool used to understand performance and failures of microelectronic devices. A microscope was integrated with a low-noise charge-coupled-device (CCD) camera to produce a new instrument for analyzing the performance and failures of microelectronic devices that emit infrared light during operation. The CCD camera is also used to identify very clearly parts that have failed, where luminescence is typically found.

  20. Multispectral fluorescence imaging techniques for nondestructive food safety inspection

    NASA Astrophysics Data System (ADS)

    Kim, Moon S.; Lefcourt, Alan M.; Chen, Yud-Ren

    2004-03-01

    The use of spectral sensing has gained acceptance as a rapid means for nondestructive inspection of postharvest food produce. Current technologies generally use color or a single wavelength camera technology. The applicability and sensitivity of these techniques can be expanded through the use of multiple wavelengths. Reflectance in the Vis/NIR is the prevalent spectral technique. Fluorescence, compared to reflectance, is regarded as a more sensitive technique due to its dynamic responses to subtle changes in biological entities. Our laboratory has been exploring fluorescence as a potential means for detection of quality and wholesomeness of food products. Applications of fluorescence sensing require an understanding of the spectral characteristics emanating from constituents and potential contaminants. A number of factors affecting fluorescence emission characteristics are discussed. Because of relatively low fluorescence quantum yield from biological samples, a system with a powerful pulse light source such as a laser coupled with a gated detection device is used to harvest fluorescence, in the presence of ambient light. Several fluorescence sensor platforms developed in our laboratory, including hyperspectral imaging, and laser-induced fluorescence (LIF) and steady-state fluorescence imaging systems with multispectral capabilities are presented. We demonstrate the potential uses of recently developed fluorescence imaging platforms in food safety inspection of apples contaminated with animal feces.

  1. Active multispectral near-IR detection of small surface targets

    NASA Astrophysics Data System (ADS)

    de Jong, Arie N.; Winkel, Hans; Roos, Marco J. J.

    2001-10-01

    The detection and identification of small surface targets with electro-optical (EO) sensors is seriously hampered by ground clutter, leading to false alarms and reduced detection probabilities. Active ground illumination can improve the detection performance of EO sensors compared with passive skylight illumination because the illumination level is known and temporally stable; sun and sky cannot provide this due to weather variability. In addition, multispectral sensors with carefully chosen spectral bands, ranging from the visual into the near IR (400-2500 nm wavelength), can take advantage of a variety of cheap active light sources, from lasers to xenon or halogen lamps. Results are presented that were obtained with a two-color laser scanner having one wavelength in the chlorophyll absorption dip. Another active scanner is described operating in four wavebands between 1400 and 2300 nm, using tungsten halogen lamps. Finally, a simple TV camera was used with either a set of narrow-band spectral filters or polarization filters in front of the lamps. The targets consisted of an array of mixed objects, most of them real mines. The results show great promise for enhancing the detection and identification probabilities of EO sensors against small surface targets.

  2. Dry imaging cameras

    PubMed Central

    Indrajit, IK; Alam, Aftab; Sahni, Hirdesh; Bhatia, Mukul; Sahu, Samaresh

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images from digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes in diverse areas such as computing, mechanics, thermal engineering, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technology. When compared with the working knowledge and technical awareness of different modalities in radiology, the understanding of a dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow. PMID:21799589

  3. Optical Communications Link to Airborne Transceiver

    NASA Technical Reports Server (NTRS)

    Regehr, Martin W.; Kovalik, Joseph M.; Biswas, Abhijit

    2011-01-01

    An optical link from Earth to an aircraft demonstrates the ability to establish a link from a ground platform to a transceiver moving overhead. An airplane has a challenging disturbance environment including airframe vibrations and occasional abrupt changes in attitude during flight. These disturbances make it difficult to maintain pointing lock in an optical transceiver in an airplane. Acquisition can also be challenging. In the case of the aircraft link, the ground station initially has no precise knowledge of the aircraft's location. An airborne pointing system has been designed, built, and demonstrated using direct-drive brushless DC motors for passive isolation of pointing disturbances and for high-bandwidth control feedback. The airborne transceiver uses a GPS-INS system to determine the aircraft's position and attitude, and to then illuminate the ground station initially for acquisition. The ground transceiver participates in link-pointing acquisition by first using a wide-field camera to detect initial illumination from the airborne beacon, and to perform coarse pointing. It then transfers control to a high-precision pointing detector. Using this scheme, live video was successfully streamed from the ground to the aircraft at 270 Mb/s while simultaneously downlinking a 50 kb/s data stream from the aircraft to the ground.

  4. Air Pollution Determination Using a Surveillance Internet Protocol Camera Images

    NASA Astrophysics Data System (ADS)

    Chow Jeng, C. J.; Hwee San, Hslim; Matjafri, M. Z.; Abdullah, Abdul, K.

    Air pollution has long been a problem in the industrial nations of the West. It has now become an increasing source of environmental degradation in the developing nations of East Asia. The Malaysian government has built a network to monitor air pollution, but the cost of such networks is high and limits knowledge of pollutant concentrations to specific points in the cities. A methodology based on a surveillance internet protocol (IP) camera for the determination of air pollution concentrations is presented in this study. The objective was to test the feasibility of using IP camera data for estimating real-time particulate matter of size less than 10 microns (PM10) on the campus of USM. The proposed PM10 retrieval algorithm, derived from atmospheric optical properties, was employed in the present study. In situ PM10 measurements and sun radiation measurements at the ground surface were collected simultaneously with the IP camera images using a DustTrak meter and a handheld spectroradiometer, respectively. The digital images were separated into three bands, namely red, green and blue, for multispectral algorithm calibration. The digital numbers (DN) of the IP camera images were converted into radiance and reflectance values. The reflectance recorded by the digital camera was then subtracted by the reflectance of a known surface to obtain the reflectance caused by the atmospheric components. These atmospheric reflectance values were used for regression analysis. A regression technique was employed to determine suitable
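
    The retrieval chain described here (DN to reflectance, removal of the known-surface term, multiband regression against PM10) can be outlined as follows. The dark offset, surface reflectance values and data are invented placeholders, and the actual algorithm is derived from atmospheric optical properties rather than this simple linear form.

```python
import numpy as np

# Hypothetical calibration data: per-image mean DN of the red, green and blue
# channels over a fixed target patch, plus co-located PM10 readings (ug/m3).
dn = np.random.uniform(60, 200, (40, 3))
pm10 = np.random.uniform(20, 120, 40)

dn_dark = 5.0                                  # assumed camera dark offset
refl_surface = np.array([0.18, 0.20, 0.19])    # assumed known-surface reflectance

# DN -> relative reflectance, then remove the known-surface contribution so
# that the remainder is attributed to the atmospheric path.
refl = (dn - dn_dark) / (255.0 - dn_dark)
refl_atm = refl - refl_surface

# Multiband linear regression: PM10 ~ a*R_red + b*R_green + c*R_blue + d.
X = np.column_stack([refl_atm, np.ones(len(pm10))])
coef, *_ = np.linalg.lstsq(X, pm10, rcond=None)
pm10_pred = X @ coef
```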

  5. 3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VAL CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH THE VAL TO THE RIGHT, LOOKING NORTHEAST. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  6. 7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VAL CAMERA STATION, INTERIOR VIEW OF CAMERA MOUNT, COMMUNICATION EQUIPMENT AND STORAGE CABINET. - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  7. Spectral Characterization of a Prototype SFA Camera for Joint Visible and NIR Acquisition.

    PubMed

    Thomas, Jean-Baptiste; Lapray, Pierre-Jean; Gouton, Pierre; Clerc, Cédric

    2016-01-01

    Multispectral acquisition improves machine vision since it permits capturing more information on object surface properties than color imaging. The concept of spectral filter arrays has been developed recently and allows multispectral single-shot acquisition with a compact camera design. Due to filter manufacturing difficulties, until recently there was no system available for a large span of spectrum, i.e., visible and near-infrared acquisition. This article presents a prototype camera that captures seven visible bands and one near-infrared band on the same sensor chip. A calibration is proposed to characterize the sensor, and images are captured. Data are provided as supplementary material for further analysis and simulations. This opens a new range of applications in security, robotics, automotive and medical fields. PMID:27367690

  9. Integrated radar-camera security system: experimental results

    NASA Astrophysics Data System (ADS)

    Zyczkowski, M.; Palka, N.; Trzcinski, T.; Dulski, R.; Kastek, M.; Trzaskawka, P.

    2011-06-01

    The nature of the recent military conflicts and terrorist attacks, along with the necessity to protect bases, convoys and patrols, has made a serious impact on the development of more effective security systems. Current widely used perimeter protection systems with zone sensors will soon be replaced with multi-sensor systems. Multi-sensor systems can utilize day/night cameras, uncooled thermal IR cameras, and millimeter-wave radars which detect radiation reflected from targets. Ranges of detection, recognition and identification for all targets depend on the parameters of the sensors used and on the observed scene itself. In this paper two essential issues connected with multispectral systems are described. We focus on the autonomous operation of the system with regard to object detection, tracking, identification, localization and alarm notification. We also present the possibility of configuring the system as a stationary, mobile or portable device, as illustrated by our experimental results.

  10. UV/visible camera for the Clementine mission

    SciTech Connect

    Kordas, J.F.; Lewis, I.T.; Priest, R.E.

    1995-04-01

    This article describes the Clementine UV/Visible (UV/Vis) multispectral camera, discusses design goals and preliminary estimates of on-orbit performance, and summarizes lessons learned in building and using the sensor. While the primary objective of the Clementine Program was to qualify a suite of 6 light-weight, low-power imagers for future Department of Defense flights, the mission also has provided the first systematic mapping of the complete lunar surface in the visible and near-infrared spectral regions. The 410 g, 4.65 W UV/Vis camera uses a 384 x 288 frame-transfer silicon CCD FPA and operates at 6 user-selectable wavelength bands between 0.4 and 1.1 µm. It has yielded lunar imagery and mineralogy data with up to 120 m spatial resolution (band dependent) at 400 km periselene along a 39 km cross-track swath.

  11. Investigation related to multispectral imaging systems

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F.; Erickson, J. D.

    1974-01-01

    A summary of technical progress made during a five year research program directed toward the development of operational information systems based on multispectral sensing and the use of these systems in earth-resource survey applications is presented. Efforts were undertaken during this program to: (1) improve the basic understanding of the many facets of multispectral remote sensing, (2) develop methods for improving the accuracy of information generated by remote sensing systems, (3) improve the efficiency of data processing and information extraction techniques to enhance the cost-effectiveness of remote sensing systems, (4) investigate additional problems having potential remote sensing solutions, and (5) apply the existing and developing technology for specific users and document and transfer that technology to the remote sensing community.

  12. Information extraction techniques for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Crane, R. B.; Turner, R. E.

    1972-01-01

    The applicability of recognition-processing procedures, trained on multispectral scanner data from the areas and conditions used for programming the recognition computers, to other data from different areas viewed under different measurement conditions was studied. The reflective spectral region of approximately 0.3 to 3.0 micrometers is considered. A potential application of such techniques is in conducting area surveys. Work in three general areas is reported: (1) the nature of sources of systematic variation in multispectral scanner radiation signals; (2) an investigation of various techniques for overcoming systematic variations in scanner data; and (3) the use of decision rules based upon empirical distributions of scanner signals rather than upon the usually assumed multivariate normal (Gaussian) signal distributions.
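
    To make the contrast in item (3) concrete, the sketch below places a conventional multivariate-Gaussian maximum-likelihood decision rule next to a simple empirical (histogram-based) rule applied to a single band. The class statistics, bin edges and single-band simplification are illustrative assumptions, not the report's actual procedures.

      # Gaussian maximum-likelihood rule vs. an empirical histogram rule (sketch).
      import numpy as np

      def gaussian_ml_classify(pixels, means, covs):
          """pixels: (n, bands); means[k]: (bands,); covs[k]: (bands, bands)."""
          scores = []
          for mu, cov in zip(means, covs):
              inv = np.linalg.inv(cov)
              logdet = np.linalg.slogdet(cov)[1]
              d = pixels - mu
              scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet))
          return np.argmax(scores, axis=0)

      def empirical_classify(band_values, class_histograms, bin_edges):
          """Assign each value to the class whose training histogram (density) is
          highest in that value's bin; a one-band stand-in for an empirical rule."""
          bins = np.clip(np.digitize(band_values, bin_edges) - 1, 0, len(bin_edges) - 2)
          likelihoods = np.stack([h[bins] for h in class_histograms])
          return np.argmax(likelihoods, axis=0)

      rng = np.random.default_rng(1)
      px = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
      means = [np.zeros(4), np.full(4, 2.0)]
      covs = [np.eye(4), np.eye(4)]
      print("Gaussian-rule labels:", gaussian_ml_classify(px, means, covs)[:5])

      edges = np.linspace(-4.0, 6.0, 21)
      hists = [np.histogram(px[:100, 0], bins=edges, density=True)[0],
               np.histogram(px[100:, 0], bins=edges, density=True)[0]]
      print("Empirical-rule labels:", empirical_classify(px[:, 0], hists, edges)[:5])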

  13. Multispectral device for help in diagnosis

    NASA Astrophysics Data System (ADS)

    Delporte, Céline; Ben Chouikha, Mohamed; Sautrot, Sylvie; Viénot, Françoise; Alquié, Georges

    2012-03-01

    In order to build a database of biological tissue spectral characteristics to be used in a multispectral imaging system, a tissue optical characterization bench was developed and validated. Several biological tissue types have been characterized in vitro and ex vivo with our device, such as beef, turkey and pork muscle and beef liver. The multispectral images obtained have been analyzed in order to study the dispersion of the spectral luminance factor of biological tissues. Tissue internal structure inhomogeneity was identified as a phenomenon contributing to the dispersion of the spectral luminance factor, and this dispersion could be a characteristic of the tissue. A method based on an envelope technique has been developed to identify and differentiate biological tissues in the same scene. Applied to pork tissues containing muscle and fat, this method gives detection rates of 59% for pork muscle and 14% for pork fat.

  14. Atmospheric effects in multispectral remote sensor data

    NASA Technical Reports Server (NTRS)

    Turner, R. E.

    1975-01-01

    The problem of radiometric variations in multispectral remote sensing data which occur as a result of a change in geometric and environmental factors is studied. The case of spatially varying atmospheres is considered and the effect of atmospheric scattering is analyzed for realistic conditions. Emphasis is placed upon a simulation of LANDSAT spectral data for agricultural investigations over the United States. The effect of the target-background interaction is thoroughly analyzed in terms of various atmospheric states, geometric parameters, and target-background materials. Results clearly demonstrate that variable atmospheres can alter the classification accuracy and that the presence of various backgrounds can change the effective target radiance by a significant amount. A failure to include these effects in multispectral data analysis will result in a decrease in the classification accuracy.

  15. Multi-spectral photoacoustic elasticity tomography

    PubMed Central

    Liu, Yubin; Yuan, Zhen

    2016-01-01

    The goal of this work was to develop and validate a spectrally resolved photoacoustic imaging method, namely multi-spectral photoacoustic elasticity tomography (PAET) for quantifying the physiological parameters and elastic modulus of biological tissues. We theoretically and experimentally examined the PAET imaging method using simulations and in vitro experimental tests. Our simulation and in vitro experimental results indicated that the reconstructions were quantitatively accurate in terms of sizes, the physiological and elastic properties of the targets. PMID:27699101

  16. Investigations in adaptive processing of multispectral data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Horwitz, H. M.

    1973-01-01

    Adaptive data processing procedures are applied to the problem of classifying objects in a scene scanned by a multispectral sensor. These procedures show a performance improvement over standard nonadaptive techniques. Some sources of error in classification are identified, and those correctable by adaptive processing are discussed. Experiments in adaptation of signature means by decision-directed methods are described. Some of these methods assume correlation between the trajectories of different signature means; for others this assumption is not made.
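
    One minimal form of decision-directed adaptation of signature means is sketched below: pixels are classified with the current class means and each mean is then nudged toward the pixels just assigned to it. The nearest-mean classifier, learning rate and synthetic data are illustrative assumptions, not the procedures evaluated in the report.

      # Decision-directed adaptation of class (signature) means, minimal sketch.
      import numpy as np

      def adapt_means(pixels, means, rate=0.1):
          """pixels: (n, bands); means: (classes, bands). Returns updated means."""
          # Classify with the current signatures (nearest mean in Euclidean distance).
          dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
          labels = np.argmin(dists, axis=1)
          new_means = means.copy()
          for k in range(len(means)):
              members = pixels[labels == k]
              if len(members):
                  # Move the signature mean toward the pixels it just claimed.
                  new_means[k] = (1 - rate) * means[k] + rate * members.mean(axis=0)
          return new_means

      rng = np.random.default_rng(2)
      data = np.vstack([rng.normal(0.0, 1.0, (200, 4)), rng.normal(3.0, 1.0, (200, 4))])
      means = np.array([[0.5] * 4, [2.5] * 4])
      for _ in range(5):
          means = adapt_means(data, means)
      print(means.round(2))   # drifts toward the two cluster centers (0 and 3)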

  18. Night Vision Camera

    NASA Technical Reports Server (NTRS)

    1996-01-01

    PixelVision, Inc. developed the Night Video NV652 Back-illuminated CCD Camera, based on the expertise of a former Jet Propulsion Laboratory employee and a former employee of Scientific Imaging Technologies, Inc. The camera operates without an image intensifier, using back-illuminated and thinned CCD technology to achieve extremely low light level imaging performance. The advantages of PixelVision's system over conventional cameras include greater resolution and better target identification under low light conditions, lower cost and a longer lifetime. It is used commercially for research and aviation.

  19. Gamma camera purchasing.

    PubMed

    Wells, C P; Buxton-Thomas, M

    1995-03-01

    The purchase of a new gamma camera is a major undertaking and represents a long-term commitment for most nuclear medicine departments. The purpose of tendering for gamma cameras is to assess the best match between the requirements of the clinical department and the equipment available and not necessarily to buy the 'best camera' [1-3]. After many years of drawing up tender specifications, this paper tries to outline some of the traps and pitfalls of this potentially perilous, although largely rewarding, exercise. PMID:7770241

  20. Ringfield lithographic camera

    DOEpatents

    Sweatt, W.C.

    1998-09-08

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors. 11 figs.

  1. Kitt Peak speckle camera

    NASA Technical Reports Server (NTRS)

    Breckinridge, J. B.; Mcalister, H. A.; Robinson, W. G.

    1979-01-01

    The speckle camera in regular use at Kitt Peak National Observatory since 1974 is described in detail. The design of the atmospheric dispersion compensation prisms, the use of film as a recording medium, the accuracy of double star measurements, and the next generation speckle camera are discussed. Photographs of double star speckle patterns with separations from 1.4 sec of arc to 4.7 sec of arc are shown to illustrate the quality of image formation with this camera, the effects of seeing on the patterns, and to illustrate the isoplanatic patch of the atmosphere.

  2. Interferometry based multispectral photon-limited 2D and 3D integral image encryption employing the Hartley transform.

    PubMed

    Muniraj, Inbarasan; Guo, Changliang; Lee, Byung-Geun; Sheridan, John T

    2015-06-15

    We present a method of securing multispectral 3D photon-counted integral imaging (PCII) using classical Hartley transform (HT) based encryption employing optical interferometry. This method has the simultaneous advantages of minimizing complexity by eliminating the need for holography recording and of addressing the phase sensitivity problem encountered when using digital cameras. These advantages, together with single-channel multispectral 3D data compactness and the inherent properties of the classical photon-counting detection model, i.e., sparse sensing and the capability for nonlinear transformation, permit better authentication of the retrieved 3D scene at various depth cues. Furthermore, the proposed technique works for both spatially and temporally incoherent illumination. To validate the proposed technique, simulations were carried out for both the 2D and 3D cases. Experimental data are processed and the results support the feasibility of the encryption method. PMID:26193568
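
    As background for the transform used above, the sketch below shows a bare-bones Hartley-domain encryption of a 2D image with two random masks, exploiting the facts that the discrete Hartley transform is real-valued and is its own inverse up to a scale factor. Photon counting, interferometry and the 3D integral-imaging stage of the actual method are not modeled, and the mask design is an arbitrary choice.

      # Simplified Hartley-transform encryption/decryption of a real-valued image.
      import numpy as np

      def dht2(x):
          """2D discrete Hartley transform via the FFT: H = Re(F) - Im(F)."""
          F = np.fft.fft2(x)
          return F.real - F.imag

      def idht2(x):
          """Inverse DHT; the DHT is its own inverse up to a 1/(M*N) scale."""
          return dht2(x) / x.size

      rng = np.random.default_rng(3)
      img = rng.random((64, 64))
      R1 = rng.uniform(0.5, 1.5, img.shape)   # random masks kept away from zero
      R2 = rng.uniform(0.5, 1.5, img.shape)

      encrypted = dht2(dht2(img * R1) * R2)   # real-valued ciphertext
      decrypted = idht2(idht2(encrypted) / R2) / R1
      print("max reconstruction error:", float(np.abs(decrypted - img).max()))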

  3. Perceptual evaluation of color transformed multispectral imagery

    NASA Astrophysics Data System (ADS)

    Toet, Alexander; de Jong, Michael J.; Hogervorst, Maarten A.; Hooge, Ignace T. C.

    2014-04-01

    Color remapping can give multispectral imagery a realistic appearance. We assessed the practical value of this technique in two observer experiments using monochrome intensified (II) and long-wave infrared (IR) imagery, and color daylight (REF) and fused multispectral (CF) imagery. First, we investigated the amount of detail observers perceive in a short timespan. REF and CF imagery yielded the highest precision and recall measures, while II and IR imagery yielded significantly lower values. This suggests that observers have more difficulty in extracting information from monochrome than from color imagery. Next, we measured eye fixations during free image exploration. Although the overall fixation behavior was similar across image modalities, the order in which certain details were fixated varied. Persons and vehicles were typically fixated first in REF, CF, and IR imagery, while they were fixated later in II imagery. In some cases, color remapping II imagery and fusion with IR imagery restored the fixation order of these image details. We conclude that color remapping can yield enhanced scene perception compared to conventional monochrome nighttime imagery, and may be deployed to tune multispectral image representations such that the resulting fixation behavior resembles the fixation behavior corresponding to daylight color imagery.

  4. Image processing of underwater multispectral imagery

    USGS Publications Warehouse

    Zawada, D.G.

    2003-01-01

    Capturing in situ fluorescence images of marine organisms presents many technical challenges. The effects of the medium, as well as the particles and organisms within it, are intermixed with the desired signal. Methods for extracting and preparing the imagery for analysis are discussed in reference to a novel underwater imaging system called the low-light-level underwater multispectral imaging system (LUMIS). The instrument supports both uni- and multispectral collections, each of which is discussed in the context of an experimental application. In unispectral mode, LUMIS was used to investigate the spatial distribution of phytoplankton. A thin sheet of laser light (532 nm) induced chlorophyll fluorescence in the phytoplankton, which was recorded by LUMIS. Inhomogeneities in the light sheet led to the development of a beam-pattern-correction algorithm. Separating individual phytoplankton cells from a weak background fluorescence field required a two-step procedure consisting of edge detection followed by a series of binary morphological operations. In multispectral mode, LUMIS was used to investigate the bio-assay potential of fluorescent pigments in corals. Problems with the commercial optical-splitting device produced nonlinear distortions in the imagery. A tessellation algorithm, including an automated tie-point-selection procedure, was developed to correct the distortions. Only pixels corresponding to coral polyps were of interest for further analysis. Extraction of these pixels was performed by a dynamic global-thresholding algorithm.
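
    A generic sketch of the cell-extraction and global-thresholding operations described above follows, using standard edge detection and binary morphology from SciPy. The Sobel edge detector, structuring-element size and threshold rule are illustrative stand-ins, not the LUMIS processing chain's actual parameters.

      # Edge detection + binary morphology + a simple global threshold (sketch).
      import numpy as np
      from scipy import ndimage

      def extract_cells(image, edge_thresh=0.1, struct_size=3):
          # Step 1: edge detection via the Sobel gradient magnitude.
          grad = np.hypot(ndimage.sobel(image, axis=0), ndimage.sobel(image, axis=1))
          edges = grad > edge_thresh * grad.max()
          # Step 2: binary morphology to close cell boundaries and remove speckle.
          structure = np.ones((struct_size, struct_size), dtype=bool)
          mask = ndimage.binary_closing(edges, structure=structure)
          mask = ndimage.binary_fill_holes(mask)
          return ndimage.binary_opening(mask, structure=structure)

      def dynamic_global_threshold(image, k=2.0):
          # Threshold chosen from global image statistics rather than a fixed value.
          return image > image.mean() + k * image.std()

      rng = np.random.default_rng(4)
      frame = rng.normal(0.05, 0.01, (128, 128))
      frame[40:50, 60:70] += 0.5              # a bright "cell" on a weak background
      print("cell pixels found:", int(extract_cells(frame).sum()))
      print("above-threshold pixels:", int(dynamic_global_threshold(frame).sum()))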

  5. Airborne thermography for condition monitoring of a public baths building

    NASA Astrophysics Data System (ADS)

    Mattsson, Mats; Hellman, Erik; Ljungberg, Sven-Ake

    2001-03-01

    Airborne and ground-based thermography surveys have been performed in order to detect moisture and energy related problems in the construction of a public swimming bath building. This paper describes the information potential and the advantages and limitations using a standard IR-camera and traditional inspection methods to gather information for retrofit priorities. The damage conditions indicated in the thermal images are confirmed by field inspections and photographic documentation.

  6. Case studies of aerosol remote sensing with the Airborne Multiangle SpectroPolarimetric Imager (AirMSPI)

    NASA Astrophysics Data System (ADS)

    Diner, D. J.; Xu, F.; Garay, M. J.; Martonchik, J. V.; Kalashnikova, O. V.; Davis, A. B.; Rheingans, B.; Geier, S.; Jovanovic, V.; Bull, M.

    2012-12-01

    The Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) is an 8-band (355, 380, 445, 470, 555, 660, 865, 935 nm) pushbroom camera, measuring polarization in the 470, 660, and 865 nm bands, mounted on a gimbal to acquire multiangular observations over a ±67° along-track range with 10-m spatial resolution across an 11-km wide swath. Among the instrument objectives are exploration of methodologies for combining multiangle, multispectral, polarimetric, and imaging observations to retrieve the optical depth and microphysical properties of tropospheric aerosols. AirMSPI was integrated on NASA's ER-2 high-altitude aircraft in 2010 and has successfully completed a number of flights over land and ocean targets in the Southern California vicinity. In this paper, we present case studies of AirMSPI imagery, interpreted using vector radiative transfer theory. AirMSPI observations over California's Central Valley are compared with model calculations using aerosol properties reported by the Fresno AERONET sunphotometer. Because determination of the radiative impact of different types of aerosols requires accurate attribution of the source of the reflected light along with characterization of the aerosol optical and microphysical properties, we explore the sensitivity of the Fresno measurements to variations in different aerosol properties, demonstrating the value of combining intensity and polarimetry at multiple view angles and spectral bands for constraining particle microphysical properties. Images over ocean to be presented include scenes over nearly cloud-free skies and scenes containing scattered clouds. It is well known that imperfect cloud screening confounds the determination of aerosol impact on radiation; it is perhaps less well appreciated that the effect of cloud reflections in the water can also be problematic. We calculate the magnitude of this effect in intensity and polarization and discuss its potential impact on aerosol retrievals, underscoring the value

  7. MESSENGER Multispectral Imaging of the Surface of Mercury

    NASA Astrophysics Data System (ADS)

    Blewett, D. T.; Robinson, M. S.; Denevi, B. W.; Prockter, L. M.; Murchie, S. L.; Gillis-Davis, J. J.; Head, J. W.; Domingue, D. L.; Izenberg, N. R.; McClintock, W. E.; Holsclaw, G. M.; Sprague, A. L.; Vilas, F.

    2008-05-01

    The January 2008 flyby of Mercury by the MESSENGER spacecraft provided the first close-up images of the planet since the Mariner 10 observations of 1974-75. MESSENGER's Mercury Dual Imaging System (MDIS) collected high signal-to-noise-ratio images with two cameras: a high-spatial-resolution broadband visible Narrow Angle Camera (NAC), and a lower-spatial- resolution multispectral Wide Angle Camera (WAC). The WAC has 11 narrow-band color filters with center wavelengths in the range 430 to 1020 nm. A spot spectrometer, the Visible and Infrared Spectrograph (VIRS) component of the Mercury Atmospheric and Surface Composition Spectrometer (MASCS), collected spectra over the range 350 to 1450 nm. The MESSENGER observations cover portions of the planet seen by Mariner 10, as well as new regions not previously imaged by spacecraft. Studies of the Moon over the past forty years — via spectral analysis of returned lunar samples, Earth-based telescopic reflectance spectra, and imaging by the Galileo and Clementine spacecraft, along with theoretical work on the interaction of light with a silicate regolith — provide a framework for interpreting spectra of Mercury. Ferrous iron in silicate minerals (primarily pyroxene and olivine) and glasses has a strong influence on lunar reflectance spectra, producing diagnostic absorption features. "Space weathering," the response of a regolith to micrometeorite bombardment and solar-wind sputtering, tends to reduce the contrast of absorption bands and introduces an overall strong positive ("red") slope to the spectrum. These spectral effects are attributed to tiny metallic iron particles created by vapor deposition during space weathering. Spectrally neutral opaque phases are dark with a flat "bluish" spectral slope and lack strong absorptions. Ilmenite, an iron-titanium oxide, is the key opaque phase in lunar samples. The MESSENGER multispectral observations validate and greatly extend the knowledge gained from analysis of Mariner 10

  8. Airborne data acquisition techniques

    SciTech Connect

    Arro, A.A.

    1980-01-01

    The introduction of standards on acceptable procedures for assessing building heat loss has created a dilemma for the contractor performing airborne thermographic surveys. These standards impose specifications on instrumentation, data acquisition, recording, interpretation, and presentation. Under the standard, the contractor has both the obligation of compliance and the requirement of offering his services at a reasonable price. This paper discusses the various aspects of data acquisition for airborne thermographic surveys and various techniques to reduce the costs of this operation. These techniques include the calculation of flight parameters for economical data acquisition, the selection and use of maps for mission planning, and the use of meteorological forecasts for flight scheduling and the actual execution of the mission. The proper consideration of these factors will result in a cost effective data acquisition and will place the contractor in a very competitive position in offering airborne thermographic survey services.

  9. Comparison of multispectral remote-sensing techniques for monitoring subsurface drain conditions. [Imperial Valley, California

    NASA Technical Reports Server (NTRS)

    Goettelman, R. C.; Grass, L. B.; Millard, J. P.; Nixon, P. R.

    1983-01-01

    The following multispectral remote-sensing techniques were compared to determine the most suitable method for routinely monitoring agricultural subsurface drain conditions: airborne scanning, covering the visible through thermal-infrared (IR) portions of the spectrum; color-IR photography; and natural-color photography. Color-IR photography was determined to be the best approach, from the standpoint of both cost and information content. Aerial monitoring of drain conditions for early warning of tile malfunction appears practical. With careful selection of season and rain-induced soil-moisture conditions, extensive regional surveys are possible. Certain locations, such as the Imperial Valley, Calif., are precluded from regional monitoring because of year-round crop rotations and soil stratification conditions. Here, farms with similar crops could time local coverage for bare-field and saturated-soil conditions.

  10. The Multispectral Imaging Science Working Group. Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    Cox, S. C. (Editor)

    1982-01-01

    The status and technology requirements for using multispectral sensor imagery in geographic, hydrologic, and geologic applications are examined. Critical issues in image and information science are identified.

  11. Comparison of Hyperspectral and Multispectral Satellites for Discriminating Land Cover in Northern California

    NASA Astrophysics Data System (ADS)

    Clark, M. L.; Kilham, N. E.

    2015-12-01

    Land-cover maps are important science products needed for natural resource and ecosystem service management, biodiversity conservation planning, and assessing human-induced and natural drivers of land change. Most land-cover maps at regional to global scales are produced with remote sensing techniques applied to multispectral satellite imagery with 30-500 m pixel sizes (e.g., Landsat, MODIS). Hyperspectral, or imaging spectrometer, imagery measuring the visible to shortwave infrared regions (VSWIR) of the spectrum has shown impressive capacity to map plant species and coarser land-cover associations, yet techniques have not been widely tested at regional and greater spatial scales. The Hyperspectral Infrared Imager (HyspIRI) mission is a VSWIR hyperspectral and thermal satellite being considered for development by NASA. The goal of this study was to assess multi-temporal, HyspIRI-like satellite imagery for improved land cover mapping relative to multispectral satellites. We mapped FAO Land Cover Classification System (LCCS) classes over 22,500 km2 in the San Francisco Bay Area, California using 30-m HyspIRI, Landsat 8 and Sentinel-2 imagery simulated from data acquired by NASA's AVIRIS airborne sensor. Random Forests (RF) and Multiple-Endmember Spectral Mixture Analysis (MESMA) classifiers were applied to the simulated images and accuracies were compared to those from real Landsat 8 images. The RF classifier was superior to MESMA, and multi-temporal data yielded higher accuracy than summer-only data. With RF, hyperspectral data had overall accuracy of 72.2% and 85.1% with full 20-class and reduced 12-class schemes, respectively. Multispectral imagery had lower accuracy. For example, simulated and real Landsat data had 7.5% and 4.6% lower accuracy than HyspIRI data with 12 classes, respectively. In summary, our results indicate increased mapping accuracy using HyspIRI multi-temporal imagery, particularly in discriminating different natural vegetation types, such as
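
    The Random Forests step described above amounts to fitting an ensemble classifier to a pixels-by-bands matrix with training labels. A minimal scikit-learn sketch is shown below, with synthetic band values and class labels standing in for the simulated HyspIRI spectra and the LCCS training data.

      # Random Forest classification of pixel spectra (synthetic stand-in data).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.metrics import accuracy_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(5)
      X = rng.random((5000, 30))         # 30 band values per pixel (placeholder)
      y = rng.integers(0, 12, 5000)      # 12-class labels (placeholder)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
      clf.fit(X_tr, y_tr)
      # Random labels, so the accuracy here is only a smoke test of the pipeline.
      print("overall accuracy:", accuracy_score(y_te, clf.predict(X_te)))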

  12. Advanced CCD camera developments

    SciTech Connect

    Condor, A.

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, with a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  13. The MKID Camera

    NASA Astrophysics Data System (ADS)

    Maloney, P. R.; Czakon, N. G.; Day, P. K.; Duan, R.; Gao, J.; Glenn, J.; Golwala, S.; Hollister, M.; LeDuc, H. G.; Mazin, B.; Noroozian, O.; Nguyen, H. T.; Sayers, J.; Schlaerth, J.; Vaillancourt, J. E.; Vayonakis, A.; Wilson, P.; Zmuidzinas, J.

    2009-12-01

    The MKID Camera project is a collaborative effort of Caltech, JPL, the University of Colorado, and UC Santa Barbara to develop a large-format, multi-color millimeter and submillimeter-wavelength camera for astronomy using microwave kinetic inductance detectors (MKIDs). These are superconducting, micro-resonators fabricated from thin aluminum and niobium films. We couple the MKIDs to multi-slot antennas and measure the change in surface impedance produced by photon-induced breaking of Cooper pairs. The readout is almost entirely at room temperature and can be highly multiplexed; in principle hundreds or even thousands of resonators could be read out on a single feedline. The camera will have 576 spatial pixels that image simultaneously in four bands at 750, 850, 1100 and 1300 microns. It is scheduled for deployment at the Caltech Submillimeter Observatory in the summer of 2010. We present an overview of the camera design and readout and describe the current status of testing and fabrication.

  14. The Complementary Pinhole Camera.

    ERIC Educational Resources Information Center

    Bissonnette, D.; And Others

    1991-01-01

    Presents an experiment based on the principles of rectilinear motion of light operating in a pinhole camera that projects the image of an illuminated object through a small hole in a sheet to an image screen. (MDH)

  15. Miniature TV Camera

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Originally devised to observe Saturn stage separation during Apollo flights, Marshall Space Flight Center's Miniature Television Camera, measuring only 4 x 3 x 1 1/2 inches, quickly made its way to the commercial telecommunications market.

  16. Airborne oceanographic lidar system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Specifications and preliminary design of an Airborne Oceanographic Lidar (AOL) system, which is to be constructed for installation and used on a NASA Wallops Flight Center (WFC) C-54 research aircraft, are reported. The AOL system is to provide an airborne facility for use by various government agencies to demonstrate the utility and practicality of hardware of this type in the wide area collection of oceanographic data on an operational basis. System measurement and performance requirements are presented, followed by a description of the conceptual system approach and the considerations attendant to its development. System performance calculations are addressed, and the system specifications and preliminary design are presented and discussed.

  17. Airborne rain mapping radar

    NASA Technical Reports Server (NTRS)

    Wilson, W. J.; Parks, G. S.; Li, F. K.; Im, K. E.; Howard, R. J.

    1988-01-01

    An airborne scanning radar system for remote rain mapping is described. The airborne rain mapping radar is composed of two radar frequency channels at 13.8 and 24.1 GHz. The radar is proposed to scan its antenna beam over ±20° from the antenna boresight, with a swath width of 7 km, a horizontal spatial resolution at nadir of about 500 m, and a range resolution of 120 m. The radar is designed to be applicable for retrieving rainfall rates of 0.1-60 mm/hr at the earth's surface, and for measuring linear polarization signatures and raindrops' fall velocities.

  18. Multispectral thermal infrared mapping of the 1 October 1988 Kupaianaha flow field, Kilauea volcano, Hawaii

    USGS Publications Warehouse

    Realmuto, V.J.; Hon, K.; Kahle, A.B.; Abbott, E.A.; Pieri, D.C.

    1992-01-01

    Multispectral thermal infrared radiance measurements of the Kupaianaha flow field were acquired with the NASA airborne Thermal Infrared Multispectral Scanner (TIMS) on the morning of 1 October 1988. The TIMS data were used to map both the temperature and emissivity of the surface of the flow field. The temperature map depicted the underground storage and transport of lava. The presence of molten lava in a tube or tumulus resulted in surface temperatures that were at least 10 °C above ambient. The temperature map also clearly defined the boundaries of hydrothermal plumes which resulted from the entry of lava into the ocean. The emissivity map revealed the boundaries between individual flow units within the Kupaianaha field. In general, the emissivity of the flows varied systematically with age but the relationship between age and emissivity was not unique. Distinct spectral anomalies, indicative of silica-rich surface materials, were mapped near fumaroles and ocean entry sites. This apparent enrichment in silica may have resulted from an acid-induced leaching of cations from the surfaces of glassy flows. Such incipient alteration may have been the cause for virtually all of the emissivity variations observed on the flow field, the spectral anomalies representing areas where the acid attack was most intense. © 1992 Springer-Verlag.
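
    For readers unfamiliar with the thermal-mapping step, the sketch below converts a thermal-infrared spectral radiance to a brightness temperature by inverting the Planck function. This is generic background rather than the TIMS temperature-emissivity separation procedure, and the example radiance value is assumed.

      # Brightness temperature from spectral radiance via the inverse Planck function.
      import numpy as np

      C1 = 1.191042e-16   # 2*h*c**2 in W m^2 sr^-1
      C2 = 1.438777e-2    # h*c/k_B in m K

      def brightness_temperature(radiance, wavelength):
          """radiance in W m^-2 sr^-1 m^-1, wavelength in m; returns kelvin."""
          return C2 / (wavelength * np.log(1.0 + C1 / (wavelength**5 * radiance)))

      wavelength = 10.5e-6   # a band near 10.5 micrometres
      radiance = 9.5e6       # assumed value, roughly the magnitude expected near 300 K
      print(round(float(brightness_temperature(radiance, wavelength)), 1), "K")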

  19. Combination of multispectral remote sensing, variable rate technology and environmental modeling for citrus pest management.

    PubMed

    Du, Qian; Chang, Ni-Bin; Yang, Chenghai; Srilakshmi, Kanth R

    2008-01-01

    The Lower Rio Grande Valley (LRGV) of south Texas is an agriculturally rich area supporting intensive production of vegetables, fruits, grain sorghum, and cotton. Modern agricultural practices involve the combined use of irrigation with the application of large amounts of agrochemicals to maximize crop yields. Intensive agricultural activities in past decades might have caused potential contamination of soil, surface water, and groundwater due to leaching of pesticides in the vadose zone. In an effort to promote precision farming in citrus production, this paper aims at developing an airborne multispectral technique for identifying tree health problems in a citrus grove that can be combined with variable rate technology (VRT) for required pesticide application and environmental modeling for assessment of pollution prevention. An unsupervised linear unmixing method was applied to classify the image for the grove and quantify the symptom severity for appropriate infection control. The PRZM-3 model was used to estimate environmental impacts that contribute to nonpoint source pollution with and without the use of multispectral remote sensing and VRT. Research findings using site-specific environmental assessment clearly indicate that combination of remote sensing and VRT may result in benefit to the environment by reducing the nonpoint source pollution by 92.15%. Overall, this study demonstrates the potential of precision farming for citrus production in the nexus of industrial ecology and agricultural sustainability.
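
    As an illustration of the linear unmixing step mentioned above, the sketch below estimates per-pixel endmember abundances by non-negative least squares. The endmember spectra, band count and class interpretations are placeholders, and the unsupervised endmember-extraction stage of the study is not reproduced.

      # Per-pixel linear spectral unmixing by non-negative least squares (sketch).
      import numpy as np
      from scipy.optimize import nnls

      def unmix_pixel(pixel, endmembers):
          """pixel: (bands,) spectrum; endmembers: (bands, n_endmembers) matrix."""
          abundances, _ = nnls(endmembers, pixel)
          total = abundances.sum()
          return abundances / total if total > 0 else abundances  # sum-to-one option

      # Three placeholder endmembers (e.g., healthy canopy, stressed canopy, soil)
      # observed in four bands; the mixed pixel has known abundances 0.6/0.3/0.1.
      E = np.random.default_rng(6).random((4, 3))
      pixel = E @ np.array([0.6, 0.3, 0.1])
      print("recovered abundances:", np.round(unmix_pixel(pixel, E), 3))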

  20. A multispectral automatic target recognition application for maritime surveillance, search, and rescue

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Reed, Scott; Podobna, Yuliya; Vazquez, Jose; Boucher, Cynthia

    2010-04-01

    Due to increased security concerns, the commitment to monitor and maintain security in the maritime environment is increasingly a priority. A country's coast is the most vulnerable area for the incursion of illegal immigrants, terrorists and contraband. This work illustrates the ability of a low-cost, light-weight, multi-spectral, multi-channel imaging system to handle the environment and see under difficult marine conditions. The system and its implemented detecting and tracking technologies should be organic to the maritime homeland security community for search and rescue, fisheries, defense, and law enforcement. It is tailored for airborne and ship based platforms to detect, track and monitor suspected objects (such as semi-submerged targets like marine mammals, vessels in distress, and drug smugglers). In this system, automated detection and tracking technology is used to detect, classify and localize potential threats or objects of interest within the imagery provided by the multi-spectral system. These algorithms process the sensor data in real time, thereby providing immediate feedback when features of interest have been detected. A supervised detection system based on Haar features and Cascade Classifiers is presented and results are provided on real data. The system is shown to be extendable and reusable for a variety of different applications.
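
    The detection stage described above (Haar features with a cascade classifier) can be exercised with OpenCV as sketched below. The cascade file, image file and detection parameters are hypothetical; in practice a cascade would be trained on the application's own maritime imagery.

      # Haar-cascade detection with OpenCV; file names are placeholders.
      import cv2

      detector = cv2.CascadeClassifier("boat_cascade.xml")   # hypothetical trained cascade
      frame = cv2.imread("frame_0001.png")                   # one channel/frame to search
      if detector.empty() or frame is None:
          raise SystemExit("provide a trained cascade and an input frame")

      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      # detectMultiScale slides the cascade over an image pyramid and returns boxes.
      boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4,
                                        minSize=(24, 24))
      for (x, y, w, h) in boxes:
          cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
      cv2.imwrite("detections.png", frame)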

  1. Monitoring Geothermal Features in Yellowstone National Park with ATLAS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Berglund, Judith

    2000-01-01

    The National Park Service (NPS) must produce an Environmental Impact Statement for each proposed development in the vicinity of known geothermal resource areas (KGRAs) in Yellowstone National Park. In addition, the NPS monitors indicator KGRAs for environmental quality and is still in the process of mapping many geothermal areas. The NPS currently maps geothermal features with field survey techniques. High resolution aerial multispectral remote sensing in the visible, NIR, SWIR, and thermal spectral regions could enable YNP geothermal features to be mapped more quickly and in greater detail. In response, Yellowstone Ecosystems Studies, in partnership with NASA's Commercial Remote Sensing Program, is conducting a study on the use of Airborne Terrestrial Applications Sensor (ATLAS) multispectral data for monitoring geothermal features in the Upper Geyser Basin. ATLAS data were acquired at 2.5 meter resolution on August 17, 2000. These data were processed into land cover classifications and relative temperature maps. For sufficiently large features, the ATLAS data can map geothermal areas in terms of geyser pools and hot springs, plus multiple categories of geothermal runoff that are apparently indicative of temperature gradients and microbial matting communities. In addition, the ATLAS maps clearly identify geyserite areas. The thermal bands contributed to classification success and to the computation of relative temperature. With masking techniques, one can assess the influence of geothermal features on the Firehole River. Preliminary results appear to confirm ATLAS data utility for mapping and monitoring geothermal features. Future work will include classification refinement and additional validation.

  2. Analysis of multispectral signatures and investigation of multi-aspect remote sensing techniques

    NASA Technical Reports Server (NTRS)

    Malila, W. A.; Hieber, R. H.; Sarno, J. E.

    1974-01-01

    Two major aspects of remote sensing with multispectral scanners (MSS) are investigated. The first, multispectral signature analysis, includes the effects on classification performance of systematic variations found in the average signals received from various ground covers as well as the prediction of these variations with theoretical models of physical processes. The foremost effects studied are those associated with the time of day airborne MSS data are collected. Six data collection runs made over the same flight line in a period of five hours are analyzed; it is found that the time span significantly affects classification performance. Variations associated with scan angle also are studied. The second major topic of discussion is multi-aspect remote sensing, a new concept in remote sensing with scanners. Here, data are collected on multiple passes by a scanner that can be tilted to scan forward of the aircraft at different angles on different passes. The use of such spatially registered data to achieve improved classification of agricultural scenes is investigated and found promising. Also considered are the possibilities of extracting from multi-aspect data information on the condition of corn canopies and the stand characteristics of forests.

  3. CNR LARA project, Italy: Airborne laboratory for environmental research

    NASA Technical Reports Server (NTRS)

    Bianchi, R.; Cavalli, R. M.; Fiumi, L.; Marino, C. M.; Pignatti, S.

    1995-01-01

    The increasing interest in environmental problems and in the study of the environmental impact of anthropic activity has produced an expansion of remote sensing applications. The Italian National Research Council (CNR) established a new laboratory for airborne hyperspectral imaging, the LARA Project (Laboratorio Aereo per Ricerche Ambientali - Airborne Laboratory for Environmental Research), equipping its airborne laboratory, a CASA-212, mainly with the Daedalus AA5000 MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) instrument. MIVIS's channels, spectral bandwidths, and locations are chosen to meet the needs of scientific research for advanced applications of remote sensing data. MIVIS can make significant contributions to solving problems in many diverse areas such as geologic exploration, land use studies, mineralogy, agricultural crop studies, energy loss analysis, pollution assessment, volcanology, forest fire management and others. The broad spectral range and the many discrete narrow channels of MIVIS provide a fine quantization of spectral information that permits accurate definition of absorption features from a variety of materials, allowing the extraction of chemical and physical information about our environment. The availability of such a hyperspectral imager, which will operate mainly in the Mediterranean area, at present represents a unique opportunity for those who are involved in environmental studies and land management to collect systematically large-scale and high spectral-spatial resolution data of this part of the world. Nevertheless, MIVIS deployments will touch other parts of the world, where a major interest from the international scientific community is present.

  4. Gamma ray camera

    SciTech Connect

    Robbins, C.D.; Wang, S.

    1980-09-09

    An Anger gamma ray camera is improved by the substitution of a gamma ray sensitive, proximity type image intensifier tube for the scintillator screen in the Anger camera, the image intensifier tube having a negatively charged flat scintillator screen, a flat photocathode layer, and a grounded, flat output phosphor display screen, all of the same dimension (unity image magnification), all within a grounded metallic tube envelope, and having a metallic, inwardly concave input window between the scintillator screen and the collimator.

  5. NASA Airborne Lidar July 1991

    Atmospheric Science Data Center

    2016-05-26

    Data from the 1991 NASA Langley Airborne Lidar flights following the eruption of Pinatubo in July ... and Osborn [1992a, 1992b].

  6. NASA Airborne Lidar May 1992

    Atmospheric Science Data Center

    2016-05-26

    An airborne Nd:YAG (532 nm) lidar was operated by the NASA Langley Research Center about a year following the June 1991 eruption of ... Osborn [1992a, 1992b].

  7. Multispectral imaging of the ocular fundus using light emitting diode illumination

    NASA Astrophysics Data System (ADS)

    Everdell, N. L.; Styles, I. B.; Calcagni, A.; Gibson, J.; Hebden, J.; Claridge, E.

    2010-09-01

    We present an imaging system based on light emitting diode (LED) illumination that produces multispectral optical images of the human ocular fundus. It uses a conventional fundus camera equipped with a high power LED light source and a highly sensitive electron-multiplying charge coupled device camera. It is able to take pictures at a series of wavelengths in rapid succession at short exposure times, thereby eliminating the image shift introduced by natural eye movements (saccades). In contrast with snapshot systems the images retain full spatial resolution. The system is not suitable for applications where the full spectral resolution is required as it uses discrete wavebands for illumination. This is not a problem in retinal imaging where the use of selected wavelengths is common. The modular nature of the light source allows new wavelengths to be introduced easily and at low cost. The use of wavelength-specific LEDs as a source is preferable to white light illumination and subsequent filtering of the remitted light as it minimizes the total light exposure of the subject. The system is controlled via a graphical user interface that enables flexible control of intensity, duration, and sequencing of sources in synchrony with the camera. Our initial experiments indicate that the system can acquire multispectral image sequences of the human retina at exposure times of 0.05 s in the range of 500-620 nm with mean signal to noise ratio of 17 dB (min 11, std 4.5), making it suitable for quantitative analysis with application to the diagnosis and screening of eye diseases such as diabetic retinopathy and age-related macular degeneration.

  8. 1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VARIABLE-ANGLE LAUNCHER CAMERA CAR, VIEW OF CAMERA CAR AND TRACK WITH CAMERA STATION ABOVE LOOKING NORTH TAKEN FROM RESERVOIR. - Variable Angle Launcher Complex, Camera Car & Track, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  9. 9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW OF CAMERA STATIONS UNDER CONSTRUCTION INCLUDING CAMERA CAR ON RAILROAD TRACK AND FIXED CAMERA STATION 1400 (BUILDING NO. 42021) ABOVE, ADJACENT TO STATE HIGHWAY 39, LOOKING WEST, March 23, 1948. (Original photograph in possession of Dave Willis, San Diego, California.) - Variable Angle Launcher Complex, Camera Stations, CA State Highway 39 at Morris Reservior, Azusa, Los Angeles County, CA

  10. Deployable Wireless Camera Penetrators

    NASA Technical Reports Server (NTRS)

    Badescu, Mircea; Jones, Jack; Sherrit, Stewart; Wu, Jiunn Jeng

    2008-01-01

    A lightweight, low-power camera dart has been designed and tested for context imaging of sampling sites and ground surveys from an aerobot or an orbiting spacecraft in a microgravity environment. The camera penetrators also can be used to image any line-of-sight surface, such as cliff walls, that is difficult to access. Tethered cameras to inspect the surfaces of planetary bodies use both power and signal transmission lines to operate. A tether adds the possibility of inadvertently anchoring the aerobot, and requires some form of station-keeping capability of the aerobot if extended examination time is required. The new camera penetrators are deployed without a tether, weigh less than 30 grams, and are disposable. They are designed to drop from any altitude with the boost in transmitting power currently demonstrated at approximately 100-m line-of-sight. The penetrators also can be deployed to monitor lander or rover operations from a distance, and can be used for surface surveys or for context information gathering from a touch-and-go sampling site. Thanks to wireless operation, the complexity of the sampling or survey mechanisms may be reduced. The penetrators may be battery powered for short-duration missions, or have solar panels for longer or intermittent duration missions. The imaging device is embedded in the penetrator, which is dropped or projected at the surface of a study site at 90° to the surface. Mirrors can be used in the design to image the ground or the horizon. Some of the camera features were tested using commercial "nanny" or "spy" camera components with the charge-coupled device (CCD) looking in a direction parallel to the ground. Figure 1 shows components of one camera that weighs less than 8 g and occupies a volume of 11 cm3. This camera could transmit a standard television signal, including sound, up to 100 m. Figure 2 shows the CAD models of a version of the penetrator. A low-volume array of such penetrator cameras could be deployed from an

  11. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm were further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
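
    The vector-quantization stage described above can be sketched as follows: each cross-channel pixel vector is replaced by the index of its nearest codeword in a small learned codebook. The k-means codebook training, codebook size and synthetic data cube are illustrative choices and do not reproduce the paper's algorithm or its difference-mapped Huffman coding stage.

      # Cross-channel vector quantization of a multispectral cube (sketch).
      import numpy as np
      from sklearn.cluster import KMeans

      def build_codebook(cube, codebook_size=64):
          """cube: (rows, cols, channels). Returns (codebook, per-pixel indices)."""
          rows, cols, ch = cube.shape
          vectors = cube.reshape(-1, ch)          # one vector per pixel location
          km = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(vectors)
          return km.cluster_centers_, km.labels_.reshape(rows, cols)

      def reconstruct(codebook, indices):
          return codebook[indices]                # (rows, cols, channels)

      cube = np.random.default_rng(7).random((64, 64, 7))   # synthetic 7-channel cube
      codebook, indices = build_codebook(cube)
      rms = np.sqrt(np.mean((reconstruct(codebook, indices) - cube) ** 2))
      print("codebook entries:", len(codebook), " RMS error:", round(float(rms), 4))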

  12. Airborne Fraunhofer Line Discriminator

    NASA Technical Reports Server (NTRS)

    Gabriel, F. C.; Markle, D. A.

    1969-01-01

    The Airborne Fraunhofer Line Discriminator enables prospecting for fluorescent materials, hydrography with fluorescent dyes, and plant studies based on the fluorescence of chlorophyll. The optical unit design is based on the coincidence of Fraunhofer lines in the solar spectrum occurring at the characteristic wavelengths of some fluorescent materials.

  13. Recognizing Airborne Hazards.

    ERIC Educational Resources Information Center

    Schneider, Christian M.

    1990-01-01

    The heating, ventilating, and air conditioning (HVAC) systems in older buildings often do not adequately handle air-borne contaminants. Outlines a three-stage Indoor Air Quality (IAQ) assessment and describes a case in point at a Pittsburgh, Pennsylvania, school. (MLF)

  14. Airborne asbestos in buildings.

    PubMed

    Lee, R J; Van Orden, D R

    2008-03-01

    The concentration of airborne asbestos in buildings nationwide is reported in this study. A total of 3978 indoor samples from 752 buildings, representing nearly 32 man-years of sampling, have been analyzed by transmission electron microscopy. The buildings that were surveyed were the subject of litigation related to suits alleging that the general building occupants were exposed to a potential health hazard as a result of the presence of asbestos-containing materials (ACM). The average concentration of all airborne asbestos structures was 0.01 structures/ml (s/ml) and the average concentration of airborne asbestos ≥5 µm long was 0.00012 fibers/ml (f/ml). For all samples, 99.9% of the samples were <0.01 f/ml for fibers longer than 5 µm; no building averaged above 0.004 f/ml for fibers longer than 5 µm. No asbestos was detected in 27% of the buildings, and in 90% of the buildings no asbestos was detected that would have been seen optically (≥5 µm long and ≥0.25 µm wide). Background outdoor concentrations have been reported at 0.0003 f/ml for fibers ≥5 µm. These results indicate that in-place ACM does not result in elevated airborne asbestos in building atmospheres approaching regulatory levels and that it does not result in a significantly increased risk to building occupants.

  15. International Symposium on Airborne Geophysics

    NASA Astrophysics Data System (ADS)

    Mogi, Toru; Ito, Hisatoshi; Kaieda, Hideshi; Kusunoki, Kenichiro; Saltus, Richard W.; Fitterman, David V.; Okuma, Shigeo; Nakatsuka, Tadashi

    2006-05-01

    Airborne geophysics can be defined as the measurement of Earth properties from sensors in the sky. The airborne measurement platform is usually a traditional fixed-wing airplane or helicopter, but could also include lighter-than-air craft, unmanned drones, or other specialty craft. The earliest history of airborne geophysics includes kite and hot-air balloon experiments. However, modern airborne geophysics dates from the mid-1940s when military submarine-hunting magnetometers were first used to map variations in the Earth's magnetic field. The current gamut of airborne geophysical techniques spans a broad range, including potential fields (both gravity and magnetics), electromagnetics (EM), radiometrics, spectral imaging, and thermal imaging.

  16. Photoreactivation in Airborne Mycobacterium parafortuitum

    PubMed Central

    Peccia, Jordan; Hernandez, Mark

    2001-01-01

    Photoreactivation was observed in airborne Mycobacterium parafortuitum exposed concurrently to UV radiation (254 nm) and visible light. Photoreactivation rates of airborne cells increased with increasing relative humidity (RH) and decreased with increasing UV dose. Under a constant UV dose with visible light absent, the UV inactivation rate of airborne M. parafortuitum cells decreased by a factor of 4 as RH increased from 40 to 95%; however, under identical conditions with visible light present, the UV inactivation rate of airborne cells decreased only by a factor of 2. When irradiated in the absence of visible light, cellular cyclobutane thymine dimer content of UV-irradiated airborne M. parafortuitum and Serratia marcescens increased in response to RH increases. Results suggest that, unlike in waterborne bacteria, cyclobutane thymine dimers are not the most significant form of UV-induced DNA damage incurred by airborne bacteria and that the distribution of DNA photoproducts incorporated into UV-irradiated airborne cells is a function of RH. PMID:11526027

  17. THE DARK ENERGY CAMERA

    SciTech Connect

    Flaugher, B.; Diehl, H. T.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Buckley-Geer, E. J.; Honscheid, K.; Abbott, T. M. C.; Bonati, M.; Antonik, M.; Brooks, D.; Ballester, O.; Cardiel-Sas, L.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Boprie, D.; Campa, J.; Castander, F. J.; Collaboration: DES Collaboration; and others

    2015-11-15

    The Dark Energy Camera is a new imager with a 2.2° diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263″ per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6–9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  18. The CAMCAO infrared camera

    NASA Astrophysics Data System (ADS)

    Amorim, Antonio; Melo, Antonio; Alves, Joao; Rebordao, Jose; Pinhao, Jose; Bonfait, Gregoire; Lima, Jorge; Barros, Rui; Fernandes, Rui; Catarino, Isabel; Carvalho, Marta; Marques, Rui; Poncet, Jean-Marc; Duarte Santos, Filipe; Finger, Gert; Hubin, Norbert; Huster, Gotthard; Koch, Franz; Lizon, Jean-Louis; Marchetti, Enrico

    2004-09-01

    The CAMCAO instrument is a high resolution near infrared (NIR) camera conceived to operate together with the new ESO Multi-conjugate Adaptive optics Demonstrator (MAD) with the goal of evaluating the feasibility of Multi-Conjugate Adaptive Optics techniques (MCAO) on the sky. It is a high-resolution wide field of view (FoV) camera that is optimized to use the extended correction of the atmospheric turbulence provided by MCAO. While the first purpose of this camera is on-sky observation in the MAD setup, to validate the MCAO technology, in a second phase the CAMCAO camera is planned to be attached directly to the VLT for scientific astrophysical studies. The camera is based on the 2kx2k HAWAII2 infrared detector controlled by an ESO external IRACE system and includes standard IR band filters mounted on a positional filter wheel. The CAMCAO design requires that the optical components and the IR detector be kept at low temperatures in order to avoid thermal emission and to lower detector noise in the analysis region. The cryogenic system includes a LN2 tank and a specially developed pulse tube cryocooler. Field and pupil cold stops are implemented to reduce the infrared background and the stray light. The CAMCAO optics provide diffraction limited performance down to J band, but the detector sampling fulfills the Nyquist criterion for the K band (2.2 μm).

  19. The Dark Energy Camera

    SciTech Connect

    Flaugher, B.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k x 4k CCDs for imaging and 12 2k x 2k CCDs for guiding and focus. The CCDs have 15μm x 15μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  20. Satellite camera image navigation

    NASA Technical Reports Server (NTRS)

    Kamel, Ahmed A. (Inventor); Graul, Donald W. (Inventor); Savides, John (Inventor); Hanson, Charles W. (Inventor)

    1987-01-01

    Pixels within a satellite camera (1, 2) image are precisely located in terms of latitude and longitude on a celestial body, such as the earth, being imaged. A computer (60) on the earth generates models (40, 50) of the satellite's orbit and attitude, respectively. The orbit model (40) is generated from measurements of stars and landmarks taken by the camera (1, 2), and by range data. The orbit model (40) is an expression of the satellite's latitude and longitude at the subsatellite point, and of the altitude of the satellite, as a function of time, using as coefficients (K) the six Keplerian elements at epoch. The attitude model (50) is based upon star measurements taken by each camera (1, 2). The attitude model (50) is a set of expressions for the deviations in a set of mutually orthogonal reference optical axes (x, y, z) as a function of time, for each camera (1, 2). Measured data is fit into the models (40, 50) using a walking least squares fit algorithm. A transformation computer (66) transforms pixel coordinates as telemetered by the camera (1, 2) into earth latitude and longitude coordinates, using the orbit and attitude models (40, 50).
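
    The abstract describes fitting measured star and landmark data to orbit and attitude models with a walking least squares algorithm. The sketch below is only a generic, hypothetical linearized least-squares coefficient update; it is not the patented walking least squares procedure or the actual Keplerian/attitude parameterization.

```python
import numpy as np

def least_squares_update(coeffs, jacobian, residuals):
    """One linearized least-squares update of model coefficients.

    coeffs    : current estimate of the model coefficients (e.g. orbit/attitude terms)
    jacobian  : partial derivatives of the predicted measurements w.r.t. the coefficients
    residuals : measured minus predicted star/landmark observations
    """
    delta, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
    return coeffs + delta

# Hypothetical toy example: 3 coefficients, 10 observations.
rng = np.random.default_rng(0)
true = np.array([1.0, -0.5, 0.2])
J = rng.normal(size=(10, 3))
obs = J @ true + rng.normal(scale=1e-3, size=10)
est = least_squares_update(np.zeros(3), J, obs - J @ np.zeros(3))
print(est)   # close to `true`
```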

  1. The Dark Energy Camera

    NASA Astrophysics Data System (ADS)

    Flaugher, B.; Diehl, H. T.; Honscheid, K.; Abbott, T. M. C.; Alvarez, O.; Angstadt, R.; Annis, J. T.; Antonik, M.; Ballester, O.; Beaufore, L.; Bernstein, G. M.; Bernstein, R. A.; Bigelow, B.; Bonati, M.; Boprie, D.; Brooks, D.; Buckley-Geer, E. J.; Campa, J.; Cardiel-Sas, L.; Castander, F. J.; Castilla, J.; Cease, H.; Cela-Ruiz, J. M.; Chappa, S.; Chi, E.; Cooper, C.; da Costa, L. N.; Dede, E.; Derylo, G.; DePoy, D. L.; de Vicente, J.; Doel, P.; Drlica-Wagner, A.; Eiting, J.; Elliott, A. E.; Emes, J.; Estrada, J.; Fausti Neto, A.; Finley, D. A.; Flores, R.; Frieman, J.; Gerdes, D.; Gladders, M. D.; Gregory, B.; Gutierrez, G. R.; Hao, J.; Holland, S. E.; Holm, S.; Huffman, D.; Jackson, C.; James, D. J.; Jonas, M.; Karcher, A.; Karliner, I.; Kent, S.; Kessler, R.; Kozlovsky, M.; Kron, R. G.; Kubik, D.; Kuehn, K.; Kuhlmann, S.; Kuk, K.; Lahav, O.; Lathrop, A.; Lee, J.; Levi, M. E.; Lewis, P.; Li, T. S.; Mandrichenko, I.; Marshall, J. L.; Martinez, G.; Merritt, K. W.; Miquel, R.; Muñoz, F.; Neilsen, E. H.; Nichol, R. C.; Nord, B.; Ogando, R.; Olsen, J.; Palaio, N.; Patton, K.; Peoples, J.; Plazas, A. A.; Rauch, J.; Reil, K.; Rheault, J.-P.; Roe, N. A.; Rogers, H.; Roodman, A.; Sanchez, E.; Scarpine, V.; Schindler, R. H.; Schmidt, R.; Schmitt, R.; Schubnell, M.; Schultz, K.; Schurter, P.; Scott, L.; Serrano, S.; Shaw, T. M.; Smith, R. C.; Soares-Santos, M.; Stefanik, A.; Stuermer, W.; Suchyta, E.; Sypniewski, A.; Tarle, G.; Thaler, J.; Tighe, R.; Tran, C.; Tucker, D.; Walker, A. R.; Wang, G.; Watson, M.; Weaverdyck, C.; Wester, W.; Woods, R.; Yanny, B.; DES Collaboration

    2015-11-01

    The Dark Energy Camera is a new imager with a 2.°2 diameter field of view mounted at the prime focus of the Victor M. Blanco 4 m telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five-element optical corrector, seven filters, a shutter with a 60 cm aperture, and a charge-coupled device (CCD) focal plane of 250 μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 megapixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.″263 pixel⁻¹. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 s with 6-9 electron readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.

  2. A multispectral method of determining sea surface temperatures

    NASA Technical Reports Server (NTRS)

    Shenk, W. E.

    1972-01-01

    A multispectral method for determining sea surface temperatures is discussed. The specifications of the equipment and the atmospheric conditions required for successful multispectral data acquisition are described. Examples of data obtained in the North Atlantic Ocean are presented. The differences between the actual sea surface temperatures and the equivalent blackbody temperatures as determined by a radiometer are plotted.
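
    The record does not give the exact formulation; multichannel sea surface temperature retrievals are, however, commonly written as a linear "split-window" combination of two thermal channels. The sketch below shows that generic form with placeholder coefficients and should not be read as the specific method of this paper.

```python
def split_window_sst(t11, t12, a0=0.0, a1=1.0, a2=2.5):
    """Classic two-channel sea surface temperature estimate.

    t11, t12 : brightness temperatures (K) in two thermal window channels.
    a0,a1,a2 : regression coefficients (placeholder values here; in practice
               they are fitted against in-situ or radiative-transfer data).
    """
    return a0 + a1 * t11 + a2 * (t11 - t12)

print(split_window_sst(288.5, 287.1))  # ~292.0 K with the placeholder coefficients
```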

  3. Multispectral data compression through transform coding and block quantization

    NASA Technical Reports Server (NTRS)

    Ready, P. J.; Wintz, P. A.

    1972-01-01

    Transform coding and block quantization techniques are applied to multispectral aircraft scanner data, and digitized satellite imagery. The multispectral source is defined and an appropriate mathematical model proposed. The Karhunen-Loeve, Fourier, and Hadamard encoders are considered and are compared to the rate distortion function for the equivalent Gaussian source and to the performance of the single sample PCM encoder.
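
    As a rough illustration of Karhunen-Loeve transform coding across spectral bands, the sketch below computes the KLT basis from the band covariance and applies a single uniform quantizer. Real block quantization allocates bits per coefficient according to its variance, so this is a simplified, assumed form rather than the encoders evaluated in the paper.

```python
import numpy as np

def klt_encode(bands, bits=6):
    """Karhunen-Loeve (principal-component) transform across spectral bands,
    followed by uniform quantization of the transform coefficients.

    bands : array of shape (n_bands, n_pixels)
    """
    mean = bands.mean(axis=1, keepdims=True)
    centered = bands - mean
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered))
    basis = eigvecs[:, np.argsort(eigvals)[::-1]]      # strongest component first
    coeffs = basis.T @ centered
    step = (coeffs.max() - coeffs.min()) / (2**bits - 1)
    quantized = np.round((coeffs - coeffs.min()) / step)
    return quantized, basis, mean, coeffs.min(), step

def klt_decode(quantized, basis, mean, offset, step):
    return basis @ (quantized * step + offset) + mean

# Toy example with 4 correlated "bands" of 1000 pixels each.
rng = np.random.default_rng(1)
base = rng.normal(size=1000)
bands = np.stack([base + 0.05 * rng.normal(size=1000) for _ in range(4)])
q, B, m, off, step = klt_encode(bands)
print(np.abs(klt_decode(q, B, m, off, step) - bands).max())  # error ~ quantization step
```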

  4. Reproducible high-resolution multispectral image acquisition in dermatology

    NASA Astrophysics Data System (ADS)

    Duliu, Alexandru; Gardiazabal, José; Lasser, Tobias; Navab, Nassir

    2015-07-01

    Multispectral image acquisitions are increasingly popular in dermatology, due to their improved spectral resolution which enables better tissue discrimination. Most applications however focus on restricted regions of interest, imaging only small lesions. In this work we present and discuss an imaging framework for high-resolution multispectral imaging on large regions of interest.

  5. Extraction of topographic and spectral albedo information from multispectral images.

    USGS Publications Warehouse

    Eliason, P.T.; Soderblom, L.A.; Chavez, P.A., Jr.

    1981-01-01

    A technique has been developed to separate and extract spectral-reflectivity variations and topographic information from multispectral images. The process is a completely closed system employing only the image data and can be applied to any digital multispectral data set. -from Authors

  6. Measurement of water depth by multispectral ratio techniques

    NASA Technical Reports Server (NTRS)

    Polcyn, F. C.

    1970-01-01

    The technique for measuring the depth of water using a multispectral scanner is discussed. The procedure takes advantage of the absorption properties of different wavelengths of light. Making use of the selective transmission of light at different wavelengths, an equation was developed that relates the outputs of at least two channels of the multispectral scanner to water depth.
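
    The specific equation is not reproduced in the record; one common two-channel form models depth as a linear function of the log-ratio of two bands, exploiting their different attenuation. The sketch below uses that assumed form with hypothetical calibration constants.

```python
import numpy as np

def depth_from_band_ratio(radiance_green, radiance_blue, a, b):
    """Estimate water depth from two multispectral channels.

    Uses the common log-ratio form  z = a + b * ln(L_blue / L_green),
    exploiting the stronger attenuation of the longer-wavelength band.
    a, b are calibration constants fitted against known depths (hypothetical here).
    """
    return a + b * np.log(radiance_blue / radiance_green)

# Hypothetical calibration constants and water-leaving radiances.
print(depth_from_band_ratio(np.array([0.8, 0.4]), np.array([1.2, 1.1]), a=-1.0, b=5.0))
```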

  7. Fingerprint enhancement using a multispectral sensor

    NASA Astrophysics Data System (ADS)

    Rowe, Robert K.; Nixon, Kristin A.

    2005-03-01

    The level of performance of a biometric fingerprint sensor is critically dependent on the quality of the fingerprint images. One of the most common types of optical fingerprint sensors relies on the phenomenon of total internal reflectance (TIR) to generate an image. Under ideal conditions, a TIR fingerprint sensor can produce high-contrast fingerprint images with excellent feature definition. However, images produced by the same sensor under conditions that include dry skin, dirt on the skin, and marginal contact between the finger and the sensor, are likely to be severely degraded. This paper discusses the use of multispectral sensing as a means to collect additional images with new information about the fingerprint that can significantly augment the system performance under both normal and adverse sample conditions. In the context of this paper, "multispectral sensing" is used to broadly denote a collection of images taken under different illumination conditions: different polarizations, different illumination/detection configurations, as well as different wavelength illumination. Results from three small studies using an early-stage prototype of the multispectral-TIR (MTIR) sensor are presented along with results from the corresponding TIR data. The first experiment produced data from 9 people, 4 fingers from each person and 3 measurements per finger under "normal" conditions. The second experiment provided results from a study performed to test the relative performance of TIR and MTIR images when taken under extreme dry and dirty conditions. The third experiment examined the case where the area of contact between the finger and sensor is greatly reduced.

  8. Multispectral rock-type separation and classification.

    SciTech Connect

    Moya, Mary M.; Fogler, Robert Joseph; Paskaleva, Biliana; Hayat, Majeed M.

    2004-06-01

    This paper explores the possibility of separating and classifying remotely-sensed multispectral data from rocks and minerals into seven geological rock-type groups. These groups are extracted from the general categories of metamorphic, igneous and sedimentary rocks. The study is performed under ideal conditions for which the data are generated according to laboratory hyperspectral data for the members, which are, in turn, passed through the Multi-spectral Thermal Imager (MTI) filters yielding 15 bands. The main challenge in separability is the small size of the training data sets, which initially did not permit direct application of Bayesian decision theory. To enable Bayesian classification, the original training data are linearly perturbed with the addition of minerals, vegetation, soil, water and other valid impurities. As a result, the size of the training data is significantly increased and accurate estimates of the covariance matrices are achieved. In addition, a set of reduced (five) linearly-extracted canonical features that are optimal in providing the most important information about the data is determined. An alternative nonlinear feature-selection method is also employed based on spectral indices comprising a small subset of all possible ratios between bands. By applying three optimization strategies, combinations of two and three ratios are found that provide reliable separability and classification between all seven groups according to the Bhattacharyya distance. To set a benchmark against which the MTI capability in rock classification can be compared, an optimization strategy is performed for the selection of optimal multispectral filters, other than the MTI filters, and an improvement in classification is predicted.
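
    The Bhattacharyya distance used above as the separability measure has a closed form for Gaussian class models; a minimal sketch (with hypothetical two-band class statistics) is:

```python
import numpy as np

def bhattacharyya_distance(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two Gaussian class models,
    the separability measure mentioned in the abstract."""
    cov = 0.5 * (cov1 + cov2)
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.solve(cov, diff)
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

# Toy two-band example with hypothetical class statistics.
mu_a, mu_b = np.array([0.2, 0.5]), np.array([0.35, 0.4])
cov_a = np.array([[0.01, 0.002], [0.002, 0.02]])
cov_b = np.array([[0.015, 0.0], [0.0, 0.01]])
print(bhattacharyya_distance(mu_a, mu_b, cov_a, cov_b))
```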

  9. Multispectral fingerprint imaging for spoof detection

    NASA Astrophysics Data System (ADS)

    Nixon, Kristin A.; Rowe, Robert K.

    2005-03-01

    Fingerprint systems are the most widespread form of biometric authentication. Used in locations such as airports and in PDA's and laptops, fingerprint readers are becoming more common in everyday use. As they become more familiar, the security weaknesses of fingerprint sensors are becoming better known. Numerous websites now exist describing in detail how to create a fake fingerprint usable for spoofing a biometric system from both a cooperative user and from latent prints. While many commercial fingerprint readers claim to have some degree of spoof detection incorporated, they are still generally susceptible to spoof attempts using various artificial fingerprint samples made from gelatin or silicone or other materials and methods commonly available on the web. This paper describes a multispectral sensor that has been developed to collect data for spoof detection. The sensor has been designed to work in conjunction with a conventional optical fingerprint reader such that all images are collected during a single placement of the finger on the sensor. The multispectral imaging device captures sub-surface information about the finger that makes it very difficult to spoof. Four attributes of the finger that are collected with the multispectral imager will be described and demonstrated in this paper: spectral qualities of live skin, chromatic texture of skin, sub-surface image of live skin, and blanching on contact. Each of these attributes is well suited to discriminating against particular kinds of spoofing samples. A series of experiments was conducted to demonstrate the capabilities of the individual attributes as well as the collective spoof detection performance.

  10. Digital rectification of ERTS multispectral imagery

    NASA Technical Reports Server (NTRS)

    Rifman, S. S.

    1973-01-01

    Rectified ERTS multispectral imagery has been produced utilizing all-digital techniques, as the first step toward producing precision corrected imagery. Errors arising from attitude and ephemeris sources have been corrected, and the resultant image is represented in a meter/meter mapping utilizing an intensity resampling technique. Early results from available data indicate negligible degradation of the photometric and resolution properties of the source data as a consequence of the geometric correction process. Work utilizing ground control points to produce precision rectified imagery, and including photometric corrections resulting from available sensor calibration data, is currently in progress.
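
    The record mentions an intensity resampling technique but does not specify it; the sketch below shows plain bilinear resampling at fractional source coordinates as one simple, assumed instance of such a step.

```python
import numpy as np

def bilinear_resample(image, rows, cols):
    """Sample `image` at fractional (row, col) positions produced by a
    geometric-correction mapping, using bilinear interpolation."""
    r0 = np.clip(np.floor(rows).astype(int), 0, image.shape[0] - 2)
    c0 = np.clip(np.floor(cols).astype(int), 0, image.shape[1] - 2)
    dr, dc = rows - r0, cols - c0
    top = (1 - dc) * image[r0, c0] + dc * image[r0, c0 + 1]
    bottom = (1 - dc) * image[r0 + 1, c0] + dc * image[r0 + 1, c0 + 1]
    return (1 - dr) * top + dr * bottom

img = np.arange(16, dtype=float).reshape(4, 4)
print(bilinear_resample(img, np.array([1.5]), np.array([2.25])))  # [8.25]
```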

  11. Multispectral imaging for diagnosis and treatment

    NASA Astrophysics Data System (ADS)

    Carver, Gary E.; Locknar, Sarah A.; Morrison, William A.; Farkas, Daniel L.

    2014-03-01

    A new approach for generating high-speed multispectral images has been previously reported by our team. The central concept is that spectra can be acquired for each pixel in a confocal spatial laser scan by using a fast spectrometer based on optical fiber delay lines. This method merges fast spectroscopy with standard spatial scanning to create image datacubes in real time. The datacubes can be analyzed to define regions of interest (ROIs) containing diseased tissue. Firmware and software have been developed for selectively scanning these ROIs with increased optical power. This enables real time image-guided laser treatment with a spatial resolution of a few microns.

  12. Wavelet-based multispectral face recognition

    NASA Astrophysics Data System (ADS)

    Liu, Dian-Ting; Zhou, Xiao-Dan; Wang, Cheng-Wen

    2008-09-01

    This paper proposes a novel wavelet-based face recognition method using thermal infrared (IR) and visible-light face images. The method applies the combination of Gabor and the Fisherfaces method to the reconstructed IR and visible images derived from wavelet frequency subbands. Our objective is to search for the subbands that are insensitive to the variation in expression and in illumination. The classification performance is improved by combining the multispectral information coming from the subbands that individually attain a low equal error rate. Experimental results on the Notre Dame face database show that the proposed wavelet-based algorithm outperforms previous multispectral image fusion methods as well as monospectral methods.

  13. Multispectral scanner imagery for plant community classification.

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.; Spencer, M. M.

    1973-01-01

    Optimum channel selection among 12 channels of multispectral scanner imagery identified six as providing the best information for computerized classification of 11 plant communities and two nonvegetation classes. Intensive preprocessing of the spectral data was required to eliminate bidirectional reflectance effects of the spectral imagery caused by scanner view angle and varying geometry of the plant canopy. Generalized plant community types - forest, grassland, and hydrophytic systems - were acceptably classified based on ecological analysis. Serious, but soluble, errors occurred with attempts to classify specific community types within the grassland system. However, special clustering analyses provided for improved classification of specific grassland communities.

  14. Mapping soil types from multispectral scanner data.

    NASA Technical Reports Server (NTRS)

    Kristof, S. J.; Zachary, A. L.

    1971-01-01

    Multispectral remote sensing and computer-implemented pattern recognition techniques were used for automatic 'mapping' of soil types. This approach involves subjective selection of a set of reference samples from a gray-level display of spectral variations which was generated by a computer. Each resolution element is then classified using a maximum likelihood ratio. Output is a computer printout on which the researcher assigns a different symbol to each class. Four soil test areas in Indiana were experimentally examined using this approach, and partially successful results were obtained.
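
    A minimal sketch of the per-pixel maximum likelihood rule described above, assuming Gaussian class statistics estimated from the reference samples (the class statistics below are hypothetical):

```python
import numpy as np

def max_likelihood_classify(pixels, class_means, class_covs):
    """Assign each pixel spectrum to the class with the highest Gaussian
    log-likelihood, the kind of maximum-likelihood rule described above.

    pixels      : (n_pixels, n_bands)
    class_means : list of (n_bands,) mean vectors from reference samples
    class_covs  : list of (n_bands, n_bands) covariance matrices
    """
    scores = []
    for mu, cov in zip(class_means, class_covs):
        diff = pixels - mu
        mahal = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(cov), diff)
        scores.append(-0.5 * (mahal + np.log(np.linalg.det(cov))))
    return np.argmax(np.stack(scores, axis=1), axis=1)

# Toy two-class, two-band example with hypothetical reference statistics.
means = [np.array([30.0, 60.0]), np.array([55.0, 40.0])]
covs = [np.eye(2) * 16.0, np.eye(2) * 25.0]
pixels = np.array([[32.0, 58.0], [54.0, 42.0]])
print(max_likelihood_classify(pixels, means, covs))   # [0, 1]
```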

  15. Multispectral-image fusion using neural networks

    NASA Astrophysics Data System (ADS)

    Kagel, Joseph H.; Platt, C. A.; Donaven, T. W.; Samstad, Eric A.

    1990-08-01

    A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.

  16. Multispectral image fusion using neural networks

    NASA Technical Reports Server (NTRS)

    Kagel, J. H.; Platt, C. A.; Donaven, T. W.; Samstad, E. A.

    1990-01-01

    A prototype system is being developed to demonstrate the use of neural network hardware to fuse multispectral imagery. This system consists of a neural network IC on a motherboard, a circuit card assembly, and a set of software routines hosted by a PC-class computer. Research in support of this consists of neural network simulations fusing 4 to 7 bands of Landsat imagery and fusing (separately) multiple bands of synthetic imagery. The simulations, results, and a description of the prototype system are presented.

  17. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  18. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Experiments conducted in the Atlantic coastal zone indicated that plumes resulting from ocean dumping of acid wastes and sewage sludge have unique spectral characteristics. Remotely sensed wide area synoptic coverage provided information on these pollution features that was not readily available from other sources. Aircraft remotely sensed photographic and multispectral scanner data were interpreted by two methods. First, qualitative analyses in which pollution features were located, mapped, and identified without concurrent sea truth and, second, quantitative analyses in which concurrently collected sea truth was used to calibrate the remotely sensed data and to determine quantitative distributions of one or more parameters in a plume.

  19. Design and fabrication of multispectral optics using expanded glass map

    NASA Astrophysics Data System (ADS)

    Bayya, Shyam; Gibson, Daniel; Nguyen, Vinh; Sanghera, Jasbinder; Kotov, Mikhail; Drake, Gryphon; Deegan, John; Lindberg, George

    2015-06-01

    As the desire to have compact multispectral imagers in various DoD platforms is growing, the dearth of multispectral optics is widely felt. With the limited number of material choices for optics, these multispectral imagers are often very bulky and impractical on several weight sensitive platforms. To address this issue, NRL has developed a large set of unique infrared glasses that transmit from 0.9 to > 14 μm in wavelength and expand the glass map for multispectral optics with refractive indices from 2.38 to 3.17. They show a large spread in dispersion (Abbe number) and offer some unique solutions for multispectral optics designs. The new NRL glasses can be easily molded and also fused together to make bonded doublets. A Zemax compatible glass file has been created and is available upon request. In this paper we present some designs, optics fabrication and imaging, all using NRL materials.

  20. Solid state television camera

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The design, fabrication, and tests of a solid state television camera using a new charge-coupled imaging device are reported. An RCA charge-coupled device arranged in a 512 by 320 format and directly compatible with EIA format standards was the sensor selected. This is a three-phase, sealed surface-channel array that has 163,840 sensor elements, which employs a vertical frame transfer system for image readout. Included are test results of the complete camera system, circuit description and changes to such circuits as a result of integration and test, maintenance and operation section, recommendations to improve the camera system, and a complete set of electrical and mechanical drawing sketches.

  1. HIGH SPEED CAMERA

    DOEpatents

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera having this short a resolution time is possible.

  2. Selective-imaging camera

    NASA Astrophysics Data System (ADS)

    Szu, Harold; Hsu, Charles; Landa, Joseph; Cha, Jae H.; Krapels, Keith A.

    2015-05-01

    How can we design cameras that image selectively in Full Electro-Magnetic (FEM) spectra? Without selective imaging, we cannot use, for example, ordinary tourist cameras to see through fire, smoke, or other obscurants contributing to creating a Visually Degraded Environment (VDE). This paper addresses a possible new design of selective-imaging cameras at firmware level. The design is consistent with physics of the irreversible thermodynamics of Boltzmann's molecular entropy. It enables imaging in appropriate FEM spectra for sensing through the VDE, and displaying in color spectra for Human Visual System (HVS). We sense within the spectra the largest entropy value of obscurants such as fire, smoke, etc. Then we apply a smart firmware implementation of Blind Sources Separation (BSS) to separate all entropy sources associated with specific Kelvin temperatures. Finally, we recompose the scene using specific RGB colors constrained by the HVS, by up/down shifting Planck spectra at each pixel and time.

  3. An airborne real-time hyperspectral target detection system

    NASA Astrophysics Data System (ADS)

    Skauli, Torbjorn; Haavardsholm, Trym V.; Kåsen, Ingebjørg; Arisholm, Gunnar; Kavara, Amela; Opsahl, Thomas Olsvik; Skaugen, Atle

    2010-04-01

    An airborne system for hyperspectral target detection is described. The main sensor is a HySpex pushbroom hyperspectral imager for the visible and near-infrared spectral range with 1600 pixels across track, supplemented by a panchromatic line imager. An optional third sensor can be added, either a SWIR hyperspectral camera or a thermal camera. In real time, the system performs radiometric calibration and georeferencing of the images, followed by image processing for target detection and visualization. The current version of the system implements only spectral anomaly detection, based on normal mixture models. Image processing runs on a PC with a multicore Intel processor and an Nvidia graphics processing unit (GPU). The processing runs in a software framework optimized for large sustained data rates. The platform is a Cessna 172 aircraft based close to FFI, modified with a camera port in the floor.
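
    The system's detector is based on normal mixture models, which are not detailed here. As background, the sketch below implements the simpler single-Gaussian (global RX) anomaly score that such mixture-based detectors generalize.

```python
import numpy as np

def rx_anomaly_scores(cube):
    """Global RX anomaly score for each pixel of a hyperspectral cube.

    cube : array of shape (rows, cols, bands).  The score is the squared
    Mahalanobis distance of each spectrum from the scene mean; the normal
    mixture model used by the system generalizes this single-Gaussian case.
    """
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    mean = pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    diff = pixels - mean
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(rows, cols)

cube = np.random.default_rng(2).normal(size=(64, 64, 20))
cube[10, 10] += 5.0                      # implant one anomalous spectrum
print(rx_anomaly_scores(cube).argmax())  # flattened index 10*64 + 10 = 650
```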

  4. Electronic Still Camera

    NASA Technical Reports Server (NTRS)

    Holland, S. Douglas (Inventor)

    1992-01-01

    A handheld, programmable, digital camera is disclosed that supports a variety of sensors and has program control over the system components to provide versatility. The camera uses a high performance design which produces near film quality images from an electronic system. The optical system of the camera incorporates a conventional camera body that was slightly modified, thus permitting the use of conventional camera accessories, such as telephoto lenses, wide-angle lenses, auto-focusing circuitry, auto-exposure circuitry, flash units, and the like. An image sensor, such as a charge coupled device ('CCD') collects the photons that pass through the camera aperture when the shutter is opened, and produces an analog electrical signal indicative of the image. The analog image signal is read out of the CCD and is processed by preamplifier circuitry, a correlated double sampler, and a sample and hold circuit before it is converted to a digital signal. The analog-to-digital converter has an accuracy of eight bits to insure accuracy during the conversion. Two types of data ports are included for two different data transfer needs. One data port comprises a general purpose industrial standard port and the other a high speed/high performance application specific port. The system uses removable hard disks as its permanent storage media. The hard disk receives the digital image signal from the memory buffer and correlates the image signal with other sensed parameters, such as longitudinal or other information. When the storage capacity of the hard disk has been filled, the disk can be replaced with a new disk.

  5. A Field Evaluation of Airborne Techniques for Detection of Unexploded Ordnance

    SciTech Connect

    Bell, D.; Doll, W.E.; Hamlett, P.; Holladay, J.S.; Nyquist, J.E.; Smyre, J.; Gamey, T.J.

    1999-03-14

    US Defense Department estimates indicate that as many as 11 million acres of government land in the U.S. may contain unexploded ordnance (UXO), with the cost of identifying and disposing of this material estimated at nearly $500 billion. The size and character of the ordnance, types of interference, vegetation, geology, and topography vary from site to site. Because of size or composition, some ordnance is difficult to detect with any geophysical method, even under favorable soil and cultural interference conditions. For some sites, airborne methods may provide the most time- and cost-effective means for detection of UXO. Airborne methods offer lower risk to field crews from proximity to unstable ordnance, and less disturbance of sites that may be environmentally sensitive. Data were acquired over a test site at Edwards AFB, CA using airborne magnetic, electromagnetic, multispectral and thermal sensors. Survey areas included sites where trenches might occur, and a test site in which we placed deactivated ordnance, ranging in size from small "bomblets" to large bombs. Magnetic data were acquired with the Aerodat HM-3 system, which consists of three cesium magnetometers within booms extending to the front and sides of the helicopter, and mounted such that the helicopter can be flown within 3 m of the surface. Electromagnetic data were acquired with an Aerodat 5-frequency coplanar induction system deployed as a sling load from a helicopter, with a sensor altitude of 15 m. Surface data, acquired at selected sites, provide a comparison with airborne data. Multispectral and thermal data were acquired with a Daedelus AADS 1268 system. Preliminary analysis of the test data demonstrates the value of airborne systems for UXO detection and provides insight into improvements that might make the systems even more effective.

  6. A multisensor system for airborne surveillance of oil pollution

    NASA Technical Reports Server (NTRS)

    Edgerton, A. T.; Ketchal, R.; Catoe, C.

    1973-01-01

    The U.S. Coast Guard is developing a prototype airborne oil surveillance system for use in its Marine Environmental Protection Program. The prototype system utilizes an X-band side-looking radar, a 37-GHz imaging microwave radiometer, a multichannel line scanner, and a multispectral low light level system. The system is geared to detecting and mapping oil spills and potential pollution violators anywhere within a 25 nmi range of the aircraft flight track under all but extreme weather conditions. The system provides for false target discrimination and maximum identification of spilled materials. The system also provides an automated detection alarm, as well as a color display to achieve maximum coupling between the sensor data and the equipment operator.

  7. High spectral resolution airborne short wave infrared hyperspectral imager

    NASA Astrophysics Data System (ADS)

    Wei, Liqing; Yuan, Liyin; Wang, Yueming; Zhuang, Xiaoqiong

    2016-05-01

    Short Wave InfraRed (SWIR) spectral imagers are good at detecting differences between materials and at penetrating fog and mist. High spectral resolution SWIR hyperspectral imagers play a key role in developing Earth observation technology. Hyperspectral data cubes can support band selection, which is very important for multispectral imager design. Up to now, the spectral resolution of many SWIR hyperspectral imagers has been about 10 nm. A high sensitivity airborne SWIR hyperspectral imager with narrower spectral bands will be presented. The system consists of a TMA telescope, a slit, a spectrometer with a planar blazed grating, and a high sensitivity MCT FPA. The spectral sampling interval is about 3 nm. The IFOV is 0.5 mrad. To eliminate the influence of the thermal background, a cold shield is designed into the Dewar. The pixel number in the spatial dimension is 640. Performance measurements in the laboratory and image analysis for the flight test will also be presented.
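
    A small worked example of the ground footprint implied by the stated IFOV and pixel count; the flight altitude is an assumption for illustration and is not given in the abstract.

```python
# Ground footprint of the SWIR imager for an assumed flight altitude.
ifov = 0.5e-3            # instantaneous field of view, rad (from the abstract)
pixels_across = 640      # spatial pixels (from the abstract)
altitude = 3000.0        # m above ground, assumed for illustration

gsd = altitude * ifov            # ground sample distance per pixel
swath = gsd * pixels_across      # across-track swath width
print(gsd, swath)                # 1.5 m, 960.0 m
```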

  8. Multiresolution processing for fractal analysis of airborne remotely sensed data

    NASA Technical Reports Server (NTRS)

    Jaggi, S.; Quattrochi, D.; Lam, N.

    1992-01-01

    Images acquired by NASA's Calibrated Airborne Multispectral Scanner are used to compute the fractal dimension as a function of spatial resolution. Three methods are used to determine the fractal dimension: Shelberg's (1982, 1983) line-divider method, the variogram method, and the triangular prism method. A description of these methods and the results of applying them to a remotely sensed image are also presented. The scanner data were acquired over western Puerto Rico in January 1990, over both land and water. The aim is to study impacts of man-induced changes on land that affect sedimentation into the near-shore environment. The data were obtained over the same area at three different pixel sizes: 10 m, 20 m, and 30 m.
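
    Of the three methods named above, the variogram method is straightforward to sketch: fit log semivariance against log lag and convert the slope to a fractal dimension. The code below is a simplified version (horizontal lags only), not the authors' implementation.

```python
import numpy as np

def variogram_fractal_dimension(surface, max_lag=16):
    """Estimate the fractal dimension of an image surface by the variogram
    method: fit log semivariance against log lag and use D = 3 - slope/2.
    (Horizontal lags only, for brevity.)"""
    lags = np.arange(1, max_lag + 1)
    gamma = []
    for h in lags:
        d = surface[:, h:] - surface[:, :-h]       # horizontal lag differences
        gamma.append(0.5 * np.mean(d ** 2))
    slope, _ = np.polyfit(np.log(lags), np.log(gamma), 1)
    return 3.0 - slope / 2.0

rng = np.random.default_rng(3)
smooth = np.add.outer(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
rough = rng.normal(size=(256, 256))
print(variogram_fractal_dimension(smooth))  # close to 2 (smooth surface)
print(variogram_fractal_dimension(rough))   # close to 3 (very rough surface)
```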

  9. Modeling of estuarine chlorophyll a from an airborne scanner

    USGS Publications Warehouse

    Khorram, Siamak; Catts, Glenn P.; Cloern, James E.; Knight, Allen W.

    1987-01-01

    Near simultaneous collection of 34 surface water samples and airborne multispectral scanner data provided input for regression models developed to predict surface concentrations of estuarine chlorophyll a. Two wavelength ratios were employed in model development. The ratios were chosen to capitalize on the spectral characteristics of chlorophyll a, while minimizing atmospheric influences. Models were then applied to data previously acquired over the study area three years earlier. Results are in the form of color-coded displays of predicted chlorophyll a concentrations and comparisons of the agreement between measured surface samples and predictions based on coincident remotely sensed data. The influence of large variations in fresh-water inflow to the estuary is clearly apparent in the results. The synoptic view provided by remote sensing is another method of examining important estuarine dynamics that are difficult to observe from in situ sampling alone.
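
    A minimal sketch of the kind of band-ratio regression model described above, fitted with ordinary least squares; the ratio/chlorophyll pairs below are hypothetical, not the study's sea-truth data.

```python
import numpy as np

def fit_chlorophyll_model(band_ratio, chl_a):
    """Fit a simple linear regression of surface chlorophyll a against a
    two-band radiance ratio, mirroring the kind of model described above.
    Returns (intercept, slope)."""
    A = np.column_stack([np.ones_like(band_ratio), band_ratio])
    coeffs, *_ = np.linalg.lstsq(A, chl_a, rcond=None)
    return coeffs

# Hypothetical sea-truth pairs: band ratio vs measured chlorophyll a (mg/m^3).
ratio = np.array([0.62, 0.71, 0.80, 0.95, 1.10])
chl = np.array([2.1, 3.0, 3.9, 5.5, 7.0])
b0, b1 = fit_chlorophyll_model(ratio, chl)
print(b0 + b1 * 0.85)   # predicted chlorophyll for a new ratio value
```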

  10. GROT in NICMOS Cameras

    NASA Astrophysics Data System (ADS)

    Sosey, M.; Bergeron, E.

    1999-09-01

    Grot is exhibited as small areas of reduced sensitivity, most likely due to flecks of antireflective paint scraped off the optical baffles as they were forced against each other. This paper characterizes grot associated with all three cameras. Flat field images taken from March 1997 through January 1999 have been investigated for changes in the grot, including possible wavelength dependency and throughput characteristics. The main products of this analysis are grot masks for each of the cameras which may also contain any new cold or dead pixels not specified in the data quality arrays.

  11. Wide angle pinhole camera

    NASA Technical Reports Server (NTRS)

    Franke, J. M.

    1978-01-01

    Hemispherical refracting element gives pinhole camera 180 degree field-of-view without compromising its simplicity and depth-of-field. Refracting element, located just behind pinhole, bends light coming in from sides so that it falls within image area of film. In contrast to earlier pinhole cameras that used water or other transparent fluids to widen field, this model is not subject to leakage and is easily loaded and unloaded with film. Moreover, by selecting glass with different indices of refraction, field at film plane can be widened or reduced.

  12. Artificial human vision camera

    NASA Astrophysics Data System (ADS)

    Goudou, J.-F.; Maggio, S.; Fagno, M.

    2014-10-01

    In this paper we present a real-time vision system modeling the human vision system. Our purpose is to inspire from human vision bio-mechanics to improve robotic capabilities for tasks such as objects detection and tracking. This work describes first the bio-mechanical discrepancies between human vision and classic cameras and the retinal processing stage that takes place in the eye, before the optic nerve. The second part describes our implementation of these principles on a 3-camera optical, mechanical and software model of the human eyes and associated bio-inspired attention model.

  13. Bandpass filter arrays patterned by photolithography for multispectral remote sensing

    NASA Astrophysics Data System (ADS)

    Bauer, T.; Thome, Heidi; Eisenhammer, Thomas

    2014-10-01

    Optical remote sensing of the earth from air and space typically utilizes several channels from the visible (VIS) and near infrared (NIR) up to the short wave infrared (SWIR) spectral region. Thin-film optical filters are applied to select these channels. Filter wheels and arrays of discrete stripe filters are standard configurations. To achieve compact and lightweight camera designs, multi-channel filter plates or assemblies can be mounted close to the electronic detectors. Optics Balzers has implemented a micro-structuring process based on a sequence of multiple coatings and photolithography on the same substrate. High-performance bandpass filters are applied by plasma assisted evaporation (plasma IAD) with advanced plasma source (APS) technology and optical broad-band monitoring (BBM). This technology has already been proven for various multispectral imager (MSI) configurations on fused silica, sapphire and other substrates for remote sensing applications. The optical filter design and performance are limited by the maximum coating thickness that can be micro-structured by photolithographic lift-off processes and by the thermal and radiation load on the photoresist mask during the process. Recent progress in image resolution and sensor selectivity requires improvements in optical filter performance. Blocking in the UV and NIR and between the spectral channels, in-band transmission, and filter edge steepness are subjects of current development. Technological limits of the IAD coating accuracy can be overcome by more precise coating technologies such as plasma assisted reactive magnetron sputtering (PARMS) in combination with optical broadband monitoring (BBM). We present an overview of concepts and technologies for bandpass filter arrays for multispectral imaging at Optics Balzers. Recent performance improvements of filter arrays made by micro-structuring will be presented.

  14. Real-time multispectral imaging system for online poultry fecal inspection using UML

    NASA Astrophysics Data System (ADS)

    Park, Bosoon; Kise, Michio; Lawrence, Kurt C.; Windham, William R.; Smith, Douglas P.; Thai, Chi N.

    2006-10-01

    A prototype real-time multispectral imaging system for fecal and ingesta contaminant detection on broiler carcasses has been developed. The prototype system includes a common aperture camera with three optical trim filters (517, 565 and 802-nm wavelength), which were selected by visible/NIR spectroscopy and validated by a hyperspectral imaging system with a decision tree algorithm. The on-line testing results showed that the multispectral imaging technique can be used effectively for detecting feces (from duodenum, ceca, and colon) and ingesta on the surface of poultry carcasses at a processing speed of 140 birds per minute. This paper describes both the multispectral imaging hardware and the real-time image processing software. For the software development, the Unified Modeling Language (UML) design approach was used for the on-line application. The UML models included class, object, activity, sequence, and collaboration diagrams. The user interface model included seventeen inputs and six outputs. A window-based real-time image processing software package was composed of eleven components, which represented class, architecture, and activity. Both hardware and software for real-time fecal detection were tested at a pilot-scale poultry processing plant. The run-time of the software, including online calibration, was fast enough to inspect carcasses on-line in compliance with industry requirements. Based on the preliminary test at the pilot-scale processing line, the system was able to acquire poultry images in real time. According to the test results, the imaging system is reliable in harsh processing environments, and the UML-based image processing software is flexible and easy to update when additional parameters are needed for in-plant trials.

  15. [Air-borne disease].

    PubMed

    Lameiro Vilariño, Carmen; del Campo Pérez, Victor M; Alonso Bürger, Susana; Felpeto Nodar, Irene; Guimarey Pérez, Rosa; Pérez Alvarellos, Alberto

    2003-11-01

    Respiratory protection is a growing concern for nursing professionals who care for patients capable of transmitting microorganisms through the air. This type of protection ranges from the use of surgical or hygienic masks against droplet-transmitted infection to the use of high-efficiency masks or respirators against airborne diseases such as tuberculosis or SARS, a recently discovered disease. Appropriate selection of the protective device and its correct use are fundamental to effective protection of exposed personnel. The authors summarize the main protective respiratory devices used by health workers, their characteristics and degree of effectiveness, as well as the circumstances under which each device is indicated for use. PMID:14705591

  16. Development of multi-spectral three-dimensional micro particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Tien, Wei-Hsin

    2016-08-01

    The color-coded 3D micro particle tracking velocimetry system (CC3DμPTV) is a volumetric velocimetry technique that uses the defocusing digital particle image velocimetry (DDPIV) approach to reconstruct the 3D location of the particle. It is suited to microscopic flow visualization because of its single-camera configuration. However, several factors limit the performance of the system. In this study, the limitations of the CC3DμPTV are discussed in detail and a new concept of a multi-camera 3D μ-PTV system is proposed to improve the performance of the original CC3DμPTV system. The system utilizes two dichroic beam splitters to separate the incoming light into 3 spectral ranges, each imaged by a monochrome image sensor. The use of a color-matched light source, off-center individual pinholes and monochrome image sensors allows the system to achieve better sensitivity and optical resolution. The use of coherent laser light sources with high-speed cameras improves the velocity measurement dynamic range. The performance of the proposed multi-spectral system is first evaluated with a simulation model based on the finite element method (FEM). The performance is also compared numerically with the CC3DμPTV system. The test results show significant improvements in the signal-to-noise ratio and optical resolution. Originally presented at the 11th International Symposium on Particle Image Velocimetry, Santa Barbara, California, September 14–16, 2015.

  17. Automatic multispectral ultraviolet, visible and near-infrared capturing system for the study of artwork

    NASA Astrophysics Data System (ADS)

    Herrera, Jorge; Vilaseca, Meritxell; Pujol, Jaume

    2011-03-01

    This paper presents simulations of the use of an LED cluster as the illumination source for a multispectral imaging system covering the wavelength range from 350 to 1650 nm. The system can be described as being composed of two modules determined by the spectral range of the imaging sensors' responses, one covering the range from 350-950 nm (CCD camera) and the other covering the wavelengths from 900-1650 nm (InGaAs camera). A well-known method of reflectance estimation, the pseudo-inverse method, together with experimentally measured data on the spectral responses of the cameras and the spectral emission of the LED elements, is used for the simulations. The performance of the system for spectral estimation under ideal conditions and under realistic noise is evaluated through different spectral and colorimetric metrics such as the GFC, RMS error and the CIEDE2000 color difference formula. The results show that rather good performance can be expected from the real setup. However, they also reveal a difference in performance between the modules: the second module performs worse due to its broader spectral emission and the smaller number of LED elements covering the near-infrared spectral range.
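
    A minimal sketch of the pseudo-inverse reflectance estimation mentioned above, with synthetic training data standing in for the measured camera responsivities and LED spectra:

```python
import numpy as np

def train_pseudo_inverse(train_reflectances, train_responses):
    """Learn the pseudo-inverse reconstruction matrix W such that
    reflectance ~= W @ camera_response, from a training set.

    train_reflectances : (n_wavelengths, n_samples)
    train_responses    : (n_channels,    n_samples)
    """
    return train_reflectances @ np.linalg.pinv(train_responses)

# Toy example: 31 wavelengths, 8 illumination/camera channels, 50 training samples.
rng = np.random.default_rng(4)
sensitivities = np.abs(rng.normal(size=(8, 31)))   # synthetic channel responsivities
reflectances = np.abs(rng.normal(size=(31, 50)))
responses = sensitivities @ reflectances           # noiseless camera signals
W = train_pseudo_inverse(reflectances, responses)
estimate = W @ responses[:, 0]
# Residual for the first sample (non-zero: 8 channels cannot fully span 31 wavelengths).
print(np.linalg.norm(estimate - reflectances[:, 0]))
```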

  18. Development of multi-spectral three-dimensional micro particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Tien, Wei-Hsin

    2016-08-01

    The color-coded 3D micro particle tracking velocimetry system (CC3DμPTV) is a volumetric velocimetry technique that uses the defocusing digital particle image velocimetry (DDPIV) approach to reconstruct the 3D location of the particle. It is suited to microscopic flow visualization because of its single-camera configuration. However, several factors limit the performance of the system. In this study, the limitations of the CC3DμPTV are discussed in detail and a new concept of a multi-camera 3D μ-PTV system is proposed to improve the performance of the original CC3DμPTV system. The system utilizes two dichroic beam splitters to separate the incoming light into 3 spectral ranges, each imaged by a monochrome image sensor. The use of a color-matched light source, off-center individual pinholes and monochrome image sensors allows the system to achieve better sensitivity and optical resolution. The use of coherent laser light sources with high-speed cameras improves the velocity measurement dynamic range. The performance of the proposed multi-spectral system is first evaluated with a simulation model based on the finite element method (FEM). The performance is also compared numerically with the CC3DμPTV system. The test results show significant improvements in the signal-to-noise ratio and optical resolution. Originally presented at the 11th International Symposium on Particle Image Velocimetry, Santa Barbara, California, September 14–16, 2015.

  19. Radiometric Characterization of IKONOS Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Kelly, Michelle; Holekamp, Kara; Zanoni, Vicki; Thome, Kurtis; Schiller, Stephen

    2002-01-01

    A radiometric characterization of Space Imaging's IKONOS 4-m multispectral imagery has been performed by a NASA funded team from the John C. Stennis Space Center (SSC), the University of Arizona Remote Sensing Group (UARSG), and South Dakota State University (SDSU). Both intrinsic radiometry and the effects of Space Imaging processing on radiometry were investigated. Relative radiometry was examined with uniform Antarctic and Saharan sites. Absolute radiometric calibration was performed using reflectance-based vicarious calibration methods on several uniform sites imaged by IKONOS, coincident with ground-based surface and atmospheric measurements. Ground-based data and the IKONOS spectral response function served as input to radiative transfer codes to generate a Top-of-Atmosphere radiance estimate. Calibration coefficients derived from each vicarious calibration were combined to generate an IKONOS radiometric gain coefficient for each multispectral band assuming a linear response over the full dynamic range of the instrument. These calibration coefficients were made available to Space Imaging, which subsequently adopted them by updating its initial set of calibration coefficients. IKONOS imagery procured through the NASA Scientific Data Purchase program is processed with or without a Modulation Transfer Function Compensation kernel. The radiometric effects of this kernel on various scene types was also investigated. All imagery characterized was procured through the NASA Scientific Data Purchase program.
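
    A minimal sketch of deriving a single linear gain coefficient (radiance per digital number) from vicarious-calibration points, as the abstract describes; the DN/radiance pairs below are hypothetical, and forcing the fit through zero is an additional simplification.

```python
import numpy as np

def fit_radiometric_gain(dn, toa_radiance):
    """Fit one gain coefficient (radiance per digital number) assuming a
    linear sensor response through zero.  Returns the least-squares gain
    and the per-point residuals."""
    gain = np.sum(dn * toa_radiance) / np.sum(dn ** 2)
    return gain, toa_radiance - gain * dn

# Hypothetical vicarious-calibration points for one multispectral band:
dn = np.array([310.0, 455.0, 520.0, 610.0])       # image digital numbers
radiance = np.array([62.0, 91.5, 104.0, 122.5])   # predicted TOA radiance (W m-2 sr-1 um-1)
gain, residuals = fit_radiometric_gain(dn, radiance)
print(gain, residuals)
```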

  20. [Multispectral image compression algorithms for color reproduction].

    PubMed

    Liang, Wei; Zeng, Ping; Luo, Xue-mei; Wang, Yi-feng; Xie, Kun

    2015-01-01

    In order to improve multispectral images compression efficiency and further facilitate their storage and transmission for the application of color reproduction and so on, in which fields high color accuracy is desired, WF serial methods is proposed, and APWS_RA algorithm is designed. Then the WF_APWS_RA algorithm, which has advantages of low complexity, good illuminant stability and supporting consistent coior reproduction across devices, is presented. The conventional MSE based wavelet embedded coding principle is first studied. And then color perception distortion criterion and visual characteristic matrix W are proposed. Meanwhile, APWS_RA algorithm is formed by optimizing the. rate allocation strategy of APWS. Finally, combined above technologies, a new coding method named WF_APWS_RA is designed. Colorimetric error criterion is used in the algorithm and APWS_RA is applied on visual weighted multispectral image. In WF_APWS_RA, affinity propagation clustering is utilized to exploit spectral correlation of weighted image. Then two-dimensional wavelet transform is used to remove the spatial redundancy. Subsequently, error compensation mechanism and rate pre-allocation are combined to accomplish the embedded wavelet coding. Experimental results show that at the same bit rate, compared with classical coding algorithms, WF serial algorithms have better performance on color retention. APWS_RA preserves least spectral error and WF APWS_RA algorithm has obvious superiority on color accuracy.

  1. Multispectral multisensor image fusion using wavelet transforms

    USGS Publications Warehouse

    Lemeshewsky, George P.

    1999-01-01

    Fusion techniques can be applied to multispectral and higher spatial resolution panchromatic images to create a composite image that is easier to interpret than the individual images. Wavelet transform-based multisensor, multiresolution fusion (a type of band sharpening) was applied to Landsat thematic mapper (TM) multispectral and coregistered higher resolution SPOT panchromatic images. The objective was to obtain increased spatial resolution, false color composite products to support the interpretation of land cover types wherein the spectral characteristics of the imagery are preserved to provide the spectral clues needed for interpretation. Since the fusion process should not introduce artifacts, a shift invariant implementation of the discrete wavelet transform (SIDWT) was used. These results were compared with those using the shift variant, discrete wavelet transform (DWT). Overall, the process includes a hue, saturation, and value color space transform to minimize color changes, and a reported point-wise maximum selection rule to combine transform coefficients. The performance of fusion based on the SIDWT and DWT was evaluated with a simulated TM 30-m spatial resolution test image and a higher resolution reference. Simulated imagery was made by blurring higher resolution color-infrared photography with the TM sensors' point spread function. The SIDWT based technique produced imagery with fewer artifacts and lower error between fused images and the full resolution reference. Image examples with TM and SPOT 10-m panchromatic illustrate the reduction in artifacts due to the SIDWT based fusion.
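
    A minimal sketch of the point-wise maximum coefficient-selection idea, using a single-level shift-variant DWT from the PyWavelets package for brevity; the paper's point is precisely that a shift-invariant transform (SIDWT) reduces the artifacts such a shift-variant version can introduce, and the HSV color handling is omitted here.

```python
import numpy as np
import pywt  # PyWavelets

def fuse_dwt_max(image_a, image_b, wavelet='db2'):
    """Fuse two co-registered images with a single-level DWT and a point-wise
    maximum-magnitude selection rule on the detail coefficients.
    (Shift-variant DWT for brevity; the paper argues for a shift-invariant
    transform to reduce artifacts.)"""
    ca_a, details_a = pywt.dwt2(image_a, wavelet)
    ca_b, details_b = pywt.dwt2(image_b, wavelet)
    fused_details = tuple(
        np.where(np.abs(da) >= np.abs(db), da, db)
        for da, db in zip(details_a, details_b)
    )
    fused_approx = 0.5 * (ca_a + ca_b)
    return pywt.idwt2((fused_approx, fused_details), wavelet)

a = np.random.default_rng(5).normal(size=(128, 128))
b = np.random.default_rng(6).normal(size=(128, 128))
print(fuse_dwt_max(a, b).shape)   # (128, 128)
```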

  2. Multi-spectral imaging of oxygen saturation

    NASA Astrophysics Data System (ADS)

    Savelieva, Tatiana A.; Stratonnikov, Aleksander A.; Loschenov, Victor B.

    2008-06-01

    The system of multi-spectral imaging of oxygen saturation is an instrument that can record both spectral and spatial information about a sample. In this project, the spectral imaging technique is used for monitoring of oxygen saturation of hemoglobin in human tissues. This system can be used for monitoring the spatial distribution of oxygen saturation in photodynamic therapy, surgery or sports medicine. Diffuse reflectance spectroscopy in the visible range is an effective and extensively used technique for the non-invasive study and characterization of various biological tissues. In this article, a short review of modeling techniques currently in use for diffuse reflection from semi-infinite turbid media is presented. A simple and practical model for use with a real-time imaging system is proposed. This model is based on a linear approximation of the dependence of the diffuse reflectance coefficient on the relation between the absorption and reduced scattering coefficients. This dependence was obtained with Monte Carlo simulation of photon propagation in turbid media. Spectra of the oxygenated and deoxygenated forms of hemoglobin differ mostly in the red area (520-600 nm) and have several characteristic points there. Thus four band-pass filters were used for multi-spectral imaging. After the reflectance is measured, the data obtained are used to fit the concentrations of oxygenated and free hemoglobin, and the hemoglobin oxygen saturation.
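
    A minimal sketch of the final fitting step: least-squares unmixing of multi-band absorbance into oxy- and deoxyhemoglobin contributions, then forming the saturation. The extinction coefficients below are placeholders, not tabulated values, and this is not the authors' exact linearized reflectance model.

```python
import numpy as np

def fit_oxygen_saturation(absorbance, eps_hbo2, eps_hb):
    """Least-squares fit of oxy- and deoxyhemoglobin concentrations from
    multi-band absorbance, then SO2 = C_HbO2 / (C_HbO2 + C_Hb).

    absorbance       : measured absorbance in each spectral band
    eps_hbo2, eps_hb : band-averaged extinction coefficients of HbO2 and Hb
                       (placeholder values below, not literature values).
    """
    E = np.column_stack([eps_hbo2, eps_hb])
    (c_hbo2, c_hb), *_ = np.linalg.lstsq(E, absorbance, rcond=None)
    return c_hbo2 / (c_hbo2 + c_hb)

eps_hbo2 = np.array([2.5, 1.0, 0.8, 1.2])    # placeholder coefficients, 4 bands
eps_hb   = np.array([1.8, 1.6, 1.1, 0.7])
absorb   = 0.7 * eps_hbo2 + 0.3 * eps_hb     # synthetic 70%-saturated sample
print(fit_oxygen_saturation(absorb, eps_hbo2, eps_hb))   # ~0.7
```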

  3. MLS airborne antenna research

    NASA Technical Reports Server (NTRS)

    Yu, C. L.; Burnside, W. D.

    1975-01-01

    The geometrical theory of diffraction was used to analyze the elevation plane pattern of on-aircraft antennas. The radiation patterns for basic elements (infinitesimal dipole, circumferential and axial slot) mounted on fuselage of various aircrafts with or without radome included were calculated and compared well with experimental results. Error phase plots were also presented. The effects of radiation patterns and error phase plots on the polarization selection for the MLS airborne antenna are discussed.

  4. Airborne forest fire research

    NASA Technical Reports Server (NTRS)

    Mattingly, G. S.

    1974-01-01

    The research relating to airborne fire fighting systems is reviewed to provide NASA/Langley Research Center with current information on the use of aircraft in forest fire operations, and to identify research requirements for future operations. A literature survey, interview of forest fire service personnel, analysis and synthesis of data from research reports and independent conclusions, and recommendations for future NASA-LRC programs are included.

  5. Snapshot polarimeter fundus camera.

    PubMed

    DeHoog, Edward; Luo, Haitao; Oka, Kazuhiko; Dereniak, Eustace; Schwiegerling, James

    2009-03-20

    A snapshot imaging polarimeter utilizing Savart plates is integrated into a fundus camera for retinal imaging. Acquired retinal images can be processed to reconstruct Stokes vector images, giving insight into the polarization properties of the retina. Results for images from a normal healthy retina and retinas with pathology are examined and compared. PMID:19305463

  6. Spas color camera

    NASA Technical Reports Server (NTRS)

    Toffales, C.

    1983-01-01

    The procedures to be followed in assessing the performance of the MOS color camera are defined. Aspects considered include: horizontal and vertical resolution; value of the video signal; gray scale rendition; environmental (vibration and temperature) tests; signal to noise ratios; and white balance correction.

  7. The LSST Camera Overview

    SciTech Connect

    Gilmore, Kirk; Kahn, Steven A.; Nordby, Martin; Burke, David; O'Connor, Paul; Oliver, John; Radeka, Veljko; Schalk, Terry; Schindler, Rafe; /SLAC

    2007-01-10

    The LSST camera is a wide-field optical (0.35-1 µm) imager designed to provide a 3.5 degree FOV with better than 0.2 arcsecond sampling. The detector format will be a circular mosaic providing approximately 3.2 Gigapixels per image. The camera includes a filter mechanism and shuttering capability. It is positioned in the middle of the telescope, where cross-sectional area is constrained by optical vignetting and heat dissipation must be controlled to limit thermal gradients in the optical beam. The fast, f/1.2 beam will require tight tolerances on the focal plane mechanical assembly. The focal plane array operates at a temperature of approximately -100 °C to achieve the desired detector performance. The focal plane array is contained within an evacuated cryostat, which incorporates detector front-end electronics and thermal control. The cryostat lens serves as an entrance window and vacuum seal for the cryostat. Similarly, the camera body lens serves as an entrance window and gas seal for the camera housing, which is filled with a suitable gas to provide the operating environment for the shutter and filter change mechanisms. The filter carousel can accommodate 5 filters, each 75 cm in diameter, for rapid exchange without external intervention.

  8. Jack & the Video Camera

    ERIC Educational Resources Information Center

    Charlan, Nathan

    2010-01-01

    This article narrates how the use of video camera has transformed the life of Jack Williams, a 10-year-old boy from Colorado Springs, Colorado, who has autism. The way autism affected Jack was unique. For the first nine years of his life, Jack remained in his world, alone. Functionally non-verbal and with motor skill problems that affected his…

  9. Photogrammetric camera calibration

    USGS Publications Warehouse

    Tayman, W.P.; Ziemann, H.

    1984-01-01

    Section 2 (Calibration) of the document "Recommended Procedures for Calibrating Photogrammetric Cameras and Related Optical Tests" from the International Archives of Photogrammetry, Vol. XIII, Part 4, is reviewed in the light of recent practical work, and suggestions for changes are made. These suggestions are intended as a basis for a further discussion. © 1984.

  10. Communities, Cameras, and Conservation

    ERIC Educational Resources Information Center

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  11. Anger Camera Firmware

    2010-11-19

    The firmware is responsible for the operation of the Anger Camera Electronics, calculation of position, time of flight and digital communications. It provides a first-stage analysis of 48 analog signals that have been converted to digital values using A/D converters.

  12. Make a Pinhole Camera

    ERIC Educational Resources Information Center

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  13. Advanced Virgo phase cameras

    NASA Astrophysics Data System (ADS)

    van der Schaaf, L.; Agatsuma, K.; van Beuzekom, M.; Gebyehu, M.; van den Brand, J.

    2016-05-01

    A century after the prediction of gravitational waves, detectors have reached the sensitivity needed to prove their existence. One of them, the Virgo interferometer in Pisa, is presently being upgraded to Advanced Virgo (AdV) and will come into operation in 2016. The power stored in the interferometer arms rises from 20 to 700 kW. This increase is expected to introduce higher order modes in the beam, which could reduce the circulating power in the interferometer, limiting the sensitivity of the instrument. To suppress these higher-order modes, the core optics of Advanced Virgo is equipped with a thermal compensation system. Phase cameras, monitoring the real-time status of the beam, constitute a critical component of this compensation system. These cameras measure the phases and amplitudes of the laser-light fields at the frequencies selected to control the interferometer. The measurement combines heterodyne detection with a scan of the wave front over a photodetector with a pin-hole aperture. Three cameras observe the phase front of these laser sidebands. Two of them monitor the input and output of the interferometer arms and the third one is used in the control of the aberrations introduced by the power recycling cavity. In this paper the working principle of the phase cameras is explained and some characteristic parameters are described.
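
    As a minimal illustration of heterodyne read-out at a single pinhole position, the sketch below performs digital I/Q demodulation of a photodetector time trace at a known sideband beat frequency to recover an amplitude and phase. The function name and parameters are assumptions for illustration, not details of the Virgo implementation.

    ```python
    import numpy as np

    def demodulate(signal, fs, f_beat):
        """Return (amplitude, phase) of the beat note at f_beat in a time trace.

        signal: sampled photodetector output; fs: sample rate in Hz;
        f_beat: known sideband beat frequency in Hz (illustrative parameters).
        """
        t = np.arange(signal.size) / fs
        i = 2.0 * np.mean(signal * np.cos(2 * np.pi * f_beat * t))   # in-phase
        q = 2.0 * np.mean(signal * np.sin(2 * np.pi * f_beat * t))   # quadrature
        return np.hypot(i, q), np.arctan2(q, i)
    ```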

  14. Imaging phoswich anger camera

    NASA Astrophysics Data System (ADS)

    Manchanda, R. K.; Sood, R. K.

    1991-08-01

    High angular resolution and low background are the primary requisites for detectors in future astronomy experiments in the low energy gamma-ray region. Scintillation counters are still the only available large area detectors for studies in this energy range. Preliminary details of a large area phoswich Anger camera designed for coded aperture imaging are described, and its background and position characteristics are discussed.

  15. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, Mark; McCurnin, Thomas W.; Stradling, Gary L.

    1993-01-01

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 X 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.

  16. Millisecond readout CCD camera

    NASA Astrophysics Data System (ADS)

    Prokop, M.; McCurnin, T. W.; Stradling, G.

    We have developed a prototype of a fast-scanning CCD readout system to record a 1024 x 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range of greater than 1000 with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel plate image intensifier. This readout camera is intended for recording images from oscilloscopes, streak, and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD is designed as an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.

  17. Mutagenicity of airborne particles.

    PubMed

    Chrisp, C E; Fisher, G L

    1980-09-01

    The physical and chemical properties of airborne particles are important for the interpretation of their potential biologic significance as genotoxic hazards. For polydisperse particle size distributions, the smallest, most respirable particles are generally the most mutagenic. Particulate collection for testing purposes should be designed to reduce artifact formation and allow condensation of mutagenic compounds. Other critical factors such as UV irradiation, wind direction, chemical reactivity, humidity, sample storage, and temperature of combustion are important. Application of chemical extraction methods and subsequent class fractionation techniques influence the observed mutagenic activity. Particles from urban air, coal fly ash, automobile and diesel exhaust, agricultural burning and welding fumes contain primarily direct-acting mutagens. Cigarette smoke condensate, smoke from charred meat and protein pyrolysates, kerosene soot and cigarette smoke condensates contain primarily mutagens which require metabolic activation. Fractionation coupled with mutagenicity testing indicates that the most potent mutagens are found in the acidic fractions of urban air, coal fly ash, and automobile diesel exhaust, whereas mutagens in rice straw smoke and cigarette smoke condensate are found primarily in the basic fractions. The interaction of the many chemical compounds in complex mixtures from airborne particles is likely to be important in determining mutagenic or comutagenic potentials. Because the mode of exposure is generally frequent and prolonged, the presence of tumor-promoting agents in complex mixtures may be a major factor in evaluation of the carcinogenic potential of airborne particles.

  18. Mammalian airborne allergens.

    PubMed

    Aalberse, Rob C

    2014-01-01

    Historically, horse dandruff was a favorite allergen source material. Today, however, allergic symptoms due to airborne mammalian allergens are mostly a result of indoor exposure, be it at home, at work or even at school. The relevance of mammalian allergens in relation to the allergenic activity of house dust extract is briefly discussed in the historical context of two other proposed sources of house dust allergenic activity: mites and Maillard-type lysine-sugar conjugates. Mammalian proteins involved in allergic reactions to airborne dust are largely found in only 2 protein families: lipocalins and secretoglobins (Fel d 1-like proteins), with a relatively minor contribution of serum albumins, cystatins and latherins. Both the lipocalin and the secretoglobin family are very complex. In some instances this results in a blurred separation between important and less important allergenic family members. The past 50 years have provided us with much detailed information on the genomic organization and protein structure of many of these allergens. However, the complex family relations, combined with the wide range of post-translational enzymatic and non-enzymatic modifications, make a proper qualitative and quantitative description of the important mammalian indoor airborne allergens still a significant proteomic challenge. PMID:24925404

  19. Airborne wireless communication systems, airborne communication methods, and communication methods

    DOEpatents

    Deaton, Juan D.; Schmitt, Michael J.; Jones, Warren F.

    2011-12-13

    An airborne wireless communication system includes circuitry configured to access information describing a configuration of a terrestrial wireless communication base station that has become disabled. The terrestrial base station is configured to implement wireless communication between wireless devices located within a geographical area and a network when the terrestrial base station is not disabled. The circuitry is further configured, based on the information, to configure the airborne station to have the configuration of the terrestrial base station. An airborne communication method includes answering a 911 call from a terrestrial cellular wireless phone using an airborne wireless communication system.

  20. Rapid algal culture diagnostics for open ponds using multispectral image analysis.

    PubMed

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2014-01-01

    This article presents a multispectral image analysis approach for probing the spectral backscattered irradiance from algal cultures. It was demonstrated how this spectral information can be used to measure algal biomass concentration, detect invasive species, and monitor culture health in real time. To accomplish this, a conventional RGB camera was used as a three band photodetector for imaging cultures of the green alga Chlorella sp. and the cyanobacterium Anabaena variabilis. A novel floating reference platform was placed in the culture, which enhanced the sensitivity of image color intensity to biomass concentration. Correlations were generated between the RGB color vector of culture images and the biomass concentrations for monocultures of each strain. These correlations predicted the biomass concentrations of independently prepared cultures with average errors of 22 and 14%, respectively. Moreover, the difference in spectral signatures between the two strains was exploited to detect the invasion of Chlorella sp. cultures by A. variabilis. Invasion was successfully detected for A. variabilis to Chlorella sp. mass ratios as small as 0.08. Finally, a method was presented for using multispectral imaging to detect thermal stress in A. variabilis. These methods can be extended to field applications to provide delay free process control feedback for efficient operation of large scale algae cultivation systems.
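
    A minimal sketch of the calibration step implied above: fit an affine model from mean RGB color vectors to measured biomass concentration and use it to predict new samples. The data arrays are purely illustrative and the model form is an assumption; the paper's actual correlations are not reproduced here.

    ```python
    import numpy as np

    # Illustrative calibration data: mean (R, G, B) of each culture image and the
    # corresponding measured biomass concentration in g/L (assumed values).
    rgb = np.array([[120, 180, 90],
                    [108, 170, 86],
                    [95, 158, 80],
                    [84, 147, 74],
                    [70, 135, 66]], dtype=float)
    biomass = np.array([0.15, 0.30, 0.50, 0.70, 0.95])

    # Affine model: biomass ~ a*R + b*G + c*B + d, fitted by least squares.
    X = np.column_stack([rgb, np.ones(len(rgb))])
    coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)

    def predict_biomass(rgb_vector):
        """Predict biomass (g/L) from the mean RGB color vector of a new image."""
        return float(np.append(rgb_vector, 1.0) @ coef)
    ```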

  1. A multispectral study of an extratropical cyclone with Nimbus 3 medium resolution infrared radiometer data

    NASA Technical Reports Server (NTRS)

    Holub, R.; Shenk, W. E.

    1973-01-01

    Four registered channels (0.2 to 4, 6.5 to 7, 10 to 11, and 20 to 23 microns) of the Nimbus 3 Medium Resolution Infrared Radiometer (MRIR) were used to study 24-hr changes in the structure of an extratropical cyclone during a 6-day period in May 1969. Use of a stereographic-horizon map projection ensured that the storm was mapped with a single perspective throughout the series and allowed the convenient preparation of 24-hr difference maps of the infrared radiation fields. Single-channel and multispectral analysis techniques were employed to establish the positions and vertical slopes of jetstreams, large cloud systems, and major features of middle and upper tropospheric circulation. Use of these techniques plus the difference maps and continuity of observation allowed the early detection of secondary cyclones developing within the circulation of the primary cyclone. An automated, multispectral cloud-type identification technique was developed, and comparisons made with conventional ship reports and with high-resolution visual data from the image dissector camera system showed good agreement.

  2. Thresholding for biological material detection in real-time multispectral imaging

    NASA Astrophysics Data System (ADS)

    Yoon, Seung Chul; Park, Bosoon; Lawrence, Kurt C.; Windham, William R.

    2005-09-01

    Recently, hyperspectral image analysis has proved successful for target detection problems encountered in remote sensing as well as near sensing utilizing in situ instrumentation. Conventional global bi-level thresholding for target detection, such as the clustering-based Otsu's method, has been inadequate for the detection of biologically harmful material on foods that has a large degree of variability in size, location, color, shape, texture, and occurrence time. This paper presents multistep-like thresholding based on kernel density estimation for the real-time detection of harmful contaminants on a food product presented in multispectral images. We are particularly concerned with the detection of fecal contaminants on poultry carcasses in real time. In the past, we identified 2 optimal wavelength bands and developed a real-time multispectral imaging system using a common aperture camera and a globally optimized thresholding method applied to a ratio of the optimal bands. This work extends our previous study by introducing a new decision rule to detect fecal contaminants at the single-bird level. The underlying idea is to search for statistical separability along the two directions defined by the global optimal threshold vector and its orthogonal vector. Experimental results with real birds and fecal samples in different amounts are provided.
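
    The sketch below is a generic stand-in for the band-ratio thresholding idea: it forms a two-band ratio image and picks a threshold at a valley of a kernel-density estimate of the ratio values. The valley-selection rule and function names are assumptions and do not reproduce the paper's multistep decision rule.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    def ratio_threshold(band_a, band_b, grid_points=512):
        """Threshold a two-band ratio image at a valley of its estimated density.

        Returns (detection_mask, threshold). For large images the ratio values
        may need to be subsampled before the kernel density estimate.
        """
        ratio = band_a.astype(float) / np.clip(band_b.astype(float), 1e-6, None)
        values = ratio.ravel()
        grid = np.linspace(values.min(), values.max(), grid_points)
        density = gaussian_kde(values)(grid)
        # Interior local minima of the smoothed density are candidate valleys.
        minima = np.where((density[1:-1] < density[:-2]) &
                          (density[1:-1] < density[2:]))[0] + 1
        if minima.size:
            threshold = grid[minima[np.argmin(density[minima])]]
        else:
            threshold = float(np.median(values))   # fallback for unimodal densities
        return ratio > threshold, threshold
    ```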

  3. Rapid algal culture diagnostics for open ponds using multispectral image analysis.

    PubMed

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2014-01-01

    This article presents a multispectral image analysis approach for probing the spectral backscattered irradiance from algal cultures. It was demonstrated how this spectral information can be used to measure algal biomass concentration, detect invasive species, and monitor culture health in real time. To accomplish this, a conventional RGB camera was used as a three band photodetector for imaging cultures of the green alga Chlorella sp. and the cyanobacterium Anabaena variabilis. A novel floating reference platform was placed in the culture, which enhanced the sensitivity of image color intensity to biomass concentration. Correlations were generated between the RGB color vector of culture images and the biomass concentrations for monocultures of each strain. These correlations predicted the biomass concentrations of independently prepared cultures with average errors of 22 and 14%, respectively. Moreover, the difference in spectral signatures between the two strains was exploited to detect the invasion of Chlorella sp. cultures by A. variabilis. Invasion was successfully detected for A. variabilis to Chlorella sp. mass ratios as small as 0.08. Finally, a method was presented for using multispectral imaging to detect thermal stress in A. variabilis. These methods can be extended to field applications to provide delay free process control feedback for efficient operation of large scale algae cultivation systems. PMID:24265121

  4. Skin Parameter Map Retrieval from a Dedicated Multispectral Imaging System Applied to Dermatology/Cosmetology

    PubMed Central

    2013-01-01

    In vivo quantitative assessment of skin lesions is an important step in the evaluation of skin condition. An objective measurement device can serve as a valuable tool for skin analysis. We propose an explorative new multispectral camera specifically developed for dermatology/cosmetology applications. The multispectral imaging system provides images of skin reflectance at different wavebands covering the visible and near-infrared domain. It is coupled with a neural network-based algorithm for the reconstruction of a reflectance cube of cutaneous data. This cube contains only the skin optical reflectance spectrum in each pixel of the bidimensional spatial information. The reflectance cube is analyzed by an algorithm based on a Kubelka-Munk model combined with an evolutionary algorithm. The technique allows quantitative measurement of cutaneous tissue and retrieves five skin parameter maps: melanin concentration, epidermis thickness, dermis thickness, haemoglobin concentration, and oxygenated hemoglobin. The results retrieved on healthy participants by the algorithm are in good agreement with data from the literature. The usefulness of the developed technique was demonstrated in two experiments: a clinical study based on vitiligo and melasma skin lesions, and a skin oxygenation experiment (induced ischemia) with healthy participants in which normal tissue was recorded both at its normal state and when temporary ischemia was induced. PMID:24159326
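
    For orientation, the Kubelka-Munk remission function that underlies models of this kind relates diffuse reflectance R to the absorption-to-scattering ratio K/S; a minimal sketch is given below. The full per-pixel inversion with an evolutionary algorithm, as used in the paper, is not reproduced here.

    ```python
    import numpy as np

    def km_ratio(reflectance):
        """Kubelka-Munk remission function: K/S = (1 - R)^2 / (2 R), for 0 < R <= 1."""
        R = np.clip(np.asarray(reflectance, float), 1e-6, 1.0)
        return (1.0 - R) ** 2 / (2.0 * R)
    ```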

  5. Detection of contamination on selected apple cultivars using reflectance hyperspectral and multispectral analysis

    NASA Astrophysics Data System (ADS)

    Mehl, Patrick M.; Chao, Kevin; Kim, Moon S.; Chen, Yud-Ren

    2001-03-01

    The presence of natural or exogenous contamination on apple cultivars is a food safety and quality concern for the general public and one that strongly affects this commodity market. Accumulations of human pathogens are usually observed on surface lesions of commodities. Detection of either the lesions or the pathogens themselves is essential for assuring the quality and safety of commodities. We present the application of hyperspectral image analysis towards the development of multispectral techniques for the detection of defects on chosen apple cultivars, such as Golden Delicious, Red Delicious, and Gala apples. Different apple cultivars possess different spectral characteristics, leading to different approaches for analysis. General preprocessing with morphological treatments is followed by different image treatments and condition analysis for highlighting lesions and contamination on the apple cultivars. Good isolation of scabs, fungal and soil contamination, and bruises is observed with hyperspectral image processing, either using principal component analysis or utilizing the chlorophyll absorption peak. Application of the hyperspectral results to multispectral detection is limited by the spectral capabilities of our RGB camera using either specific band-pass filters or direct neutral filters. Good separation of defects is obtained for Golden Delicious apples; it is, however, limited for the other cultivars. Adding an extra near-infrared channel would increase the detection level by utilizing the chlorophyll absorption band, as demonstrated by the present hyperspectral imaging analysis.

  6. Detection of soil properties with airborne hyperspectral measurements of bare fields.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Airborne remote sensing data, using a hyperspectral (HSI) camera, were collected for a flight over two fields with a total of 128 ha. of recently seeded and nearly bare soil. The within-field spatial distribution of several soil properties was found by using multiple linear regression to select the ...

  7. Quantitative evaluation of lipid concentration in atherosclerotic plaque phantom by near-infrared multispectral angioscope at wavelengths around 1200 nm

    NASA Astrophysics Data System (ADS)

    Matsui, Daichi; Ishii, Katsunori; Awazu, Kunio

    2015-07-01

    Atherosclerosis is a primary cause of critical ischemic diseases such as myocardial infarction and stroke. A method that can provide detailed information about the stability of atherosclerotic plaques is required. We focused on spectroscopic techniques that can evaluate the chemical composition of lipid in plaques. A novel angioscope using multispectral imaging at wavelengths around 1200 nm for quantitative evaluation of atherosclerotic plaques was developed. The angioscope consists of a halogen lamp, an indium gallium arsenide (InGaAs) camera, 3 optical band-pass filters transmitting wavelengths of 1150, 1200, and 1300 nm, an image fiber having a 0.7 mm outer diameter, and an irradiation fiber which consists of 7 multimode fibers. Atherosclerotic plaque phantoms with 100, 60, and 20 vol.% of lipid were prepared and measured by the multispectral angioscope. The acquired datasets were processed by the spectral angle mapper (SAM) method. As a result, simulated plaque areas in atherosclerotic plaque phantoms that could not be detected in an angioscopic visible image could be clearly enhanced. In addition, quantitative evaluation of atherosclerotic plaque phantoms based on the lipid volume fractions was performed down to 20 vol.%. These results show the potential of a multispectral angioscope at wavelengths around 1200 nm for quantitative evaluation of the stability of atherosclerotic plaques.
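
    A minimal sketch of the spectral angle mapper (SAM) step mentioned above: compute the angle between each pixel's band vector and a reference lipid spectrum, with small angles indicating spectral similarity. The array shapes and the choice of reference spectrum are assumptions for illustration.

    ```python
    import numpy as np

    def spectral_angle(cube, reference):
        """Per-pixel spectral angle (radians) between a cube and a reference spectrum.

        cube: H x W x B array of band intensities (e.g. the 1150, 1200, 1300 nm bands);
        reference: length-B spectrum of the target material (assumed known).
        """
        dot = np.tensordot(cube, reference, axes=([2], [0]))
        norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
        cos_angle = dot / np.clip(norms, 1e-12, None)
        return np.arccos(np.clip(cos_angle, -1.0, 1.0))
    ```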

  8. [Soil Salinity Estimation Based on Near-Ground Multispectral Imagery in Typical Area of the Yellow River Delta].

    PubMed

    Zhang, Tong-rui; Zhao, Geng-xing; Gao, Ming-xiu; Wang, Zhuo-ran; Jia, Ji-chao; Li, Ping; An, De-yu

    2016-01-01

    This study chose the core demonstration area of the 'Bohai Barn' project, located in Wudi, Shandong Province, as the study area. We first collected near-ground multispectral images and surface soil salinity data using an ADC portable multispectral camera and an EC110 portable salinometer. Three vegetation indices, namely NDVI, SAVI and GNDVI, were then each used to build models against the measured soil salinity, 18 in total. These models included linear, exponential, logarithmic, power, quadratic and cubic functions, from which the best model was selected and used to invert and analyze the soil salinity status of the study area. Results indicated that all of the models could effectively estimate soil salinity, and that models using SAVI as the independent variable were more effective than the others. Among the SAVI models, the linear model (Y = -0.524x + 0.663, n = 70) performed best, with the highest F value (141.347) in the significance test, an estimated R2 of 0.797, and an accuracy of 93.36%. Soil salinity of the study area is mainly in the range of 2.5-3.5 per thousand and gradually increases from southwest to northeast. The study has probed soil salinity estimation methods based on near-ground multispectral data and will provide a quick and effective soil salinity estimation approach for the coastal saline soil of the study area and the whole Yellow River Delta. PMID:27228776
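
    A minimal sketch, assuming red and near-infrared bands are available from the multispectral camera: compute SAVI and apply the reported linear model Y = -0.524x + 0.663. The soil adjustment factor L = 0.5 is a common default and an assumption here, as are the band names and units.

    ```python
    import numpy as np

    def savi(nir, red, L=0.5):
        """Soil-adjusted vegetation index; L = 0.5 is a common default (assumed)."""
        nir = nir.astype(float)
        red = red.astype(float)
        return (1.0 + L) * (nir - red) / (nir + red + L)

    def estimate_salinity(nir, red):
        """Apply the reported linear model Y = -0.524*SAVI + 0.663 (study's units)."""
        return -0.524 * savi(nir, red) + 0.663
    ```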

  9. Innovative Airborne Sensors for Disaster Management

    NASA Astrophysics Data System (ADS)

    Altan, M. O.; Kemper, G.

    2016-06-01

    Lidar supports disaster management by analyzing changes in the DSM before and after the "event". An advantage of Lidar is that, apart from rain and clouds, no other weather conditions limit its use; as an active sensor, nighttime missions are possible. New mid-format cameras that make use of CMOS sensors (e.g. Phase One IXU1000) can capture data even under poor and difficult light conditions and may become the first choice for remotely sensed data acquisition from aircraft and UAVs. UAVs will surely become an ever larger part of disaster management at the detailed level. Equipped today with live video cameras in RGB and thermal IR, they assist in looking inside and behind buildings, and can thus continue the aerial survey where airborne anomalies have been detected.

  10. Multispectral Imaging Systems for Airborne Remote Sensing to Support Agricultural Production Management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Remote sensing has shown promise as a tool for managing agricultural application and production. Earth-observing satellite systems have an advantage for large-scale analysis at regional levels but are limited in spatial resolution. High-resolution satellite systems have been available in recent year...

  11. Evaluating spectral measures derived from airborne multispectral imagery for detecting cotton root rot

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the soilborne fungus Phymatotrichopsis omnivore, is one of the most destructive plant diseases occurring throughout the southwestern United States. This disease has plagued the cotton industry for more than 100 years, but effective practices for its control are still lacki...

  12. Mapping cotton root rot infestations over a 10-year interval with airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the pathogen Phymatotrichopsis omnivora, is a very serious and destructive disease of cotton grown in the southwestern and south central U.S. Accurate information regarding temporal changes of cotton root rot infestations within fields is important for the management and c...

  13. Monitoring cotton root rot progression within a growing season using airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot, caused by the fungus Phymatotrichopsis omnivora, is a serious and destructive disease affecting cotton production in the southwestern United States. Accurate delineation of cotton root rot infections is important for cost-effective management of the disease. The objective of this st...

  14. Airborne multi-spectral remote sensing with ground truth for areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and researchers have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology are...

  15. Use of Airborne Multi-Spectral Imagery in Pest Management Systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and researchers have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with Global Positioning Systems, Geographic Information Systems, and variable rate technology are...

  16. Airborne multispectral remote sensing with ground truth for areawide pest management

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Scientists and engineers in areawide pest management programs have been developing, integrating, and evaluating multiple strategies and technologies into a systems approach for management of field crop insect pests. Remote sensing along with global positioning systems, geographic information system...

  17. Change detection of cotton root rot infection over a 10-year interval using airborne multispectral imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Cotton root rot is a very serious and destructive disease of cotton grown in the southwestern and south central United States. Accurate information regarding the spatial and temporal infections of the disease within fields is important for effective management and control of the disease. The objecti...

  18. Analysis of airborne multi-spectral imagery of an oil spill field trial

    SciTech Connect

    Kalnins, V.J.; Freemantle, J.R.; Brown, C.E.

    1996-12-31

    A field trial was conducted at Canadian Forces Base Petawawa in May 1993 by the Emergencies Science Division of Environment Canada to test the effectiveness of remote sensing systems to detect oil spills. Shallow test pools covered with various thicknesses and types of oil were overflown by a number of sensors. Imagery from one of the sensors used, the Multi-element Electro-optical Imaging Scanner (MEIS), has recently been transcribed from high density digital tape and analyzed. The MEIS sensor was flown on a Falcon 20 jet and collected data at 7 different wavelengths from 518 nm to 873 nm. Preliminary results show that one of the slicks, Hydraulic Fluid, can be readily identified by its distinctive color in the visible region. The oil slicks, at least under these very controlled conditions, presented unique spectral signatures which could be identified using normal image processing classification techniques.

  19. Estimating hourly crop ET using a two-source energy balance model and multispectral airborne imagery

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Efficient water use through improved irrigation scheduling is expected to moderate fast declining groundwater levels and improve sustainability of the Ogallala Aquifer. Thus, an accurate estimation of spatial actual evapotranspiration (ET) is needed for this purpose. Therefore, during 2007, the Bush...

  20. Mapping forest stand complexity for woodland caribou habitat assessment using multispectral airborne imagery

    NASA Astrophysics Data System (ADS)

    Zhang, W.; Hu, B.; Woods, M.

    2014-11-01

    The decline of the woodland caribou population is a result of habitat loss. To conserve the habitat of the woodland caribou and protect the species from extinction, it is critical to accurately characterize and monitor its habitat. Conventionally, products derived from low to medium spatial resolution remote sensing data, such as land cover classifications and vegetation indices, are used for wildlife habitat assessment. These products fail to provide information on the structural complexity of forest canopies, which reflects important characteristics of caribou habitat. Recent studies have employed LiDAR (Light Detection And Ranging) systems to directly retrieve three-dimensional forest attributes. Although promising results have been achieved, the acquisition cost of LiDAR data is very high. In this study, the use of very high spatial resolution imagery for characterizing the structural development of forest canopies was exploited. A stand-based image texture analysis was performed to predict forest succession stages. The results were demonstrated to be consistent with those derived from LiDAR data.

  1. Airborne seeker evaluation and test system

    NASA Astrophysics Data System (ADS)

    Jollie, William B.

    1991-08-01

    The Airborne Seeker Evaluation Test System (ASETS) is an airborne platform for development, test, and evaluation of air-to-ground seekers and sensors. ASETS consists of approximately 10,000 pounds of equipment, including sixteen racks of control, display, and recording electronics, and a very large stabilized airborne turret, all carried by a modified C-130A aircraft. The turret measures 50 in. in diameter and extends over 50 in. below the aircraft. Because of the low ground clearance of the C-130, a unique retractor mechanism was designed to raise the turret inside the aircraft for take-offs and landings, and deploy the turret outside the aircraft for testing. The turret has over 7 cubic feet of payload space and can accommodate up to 300 pounds of instrumentation, including missile seekers, thermal imagers, infrared mapping systems, laser systems, millimeter wave radar units, television cameras, and laser rangers. It contains a 5-axis gyro-stabilized gimbal system that will maintain a line of sight in the pitch, roll, and yaw axes to an accuracy better than ±125 μrad. The rack-mounted electronics in the aircraft cargo bay can be interchanged to operate any type of sensor and record the data. Six microcomputer subsystems operate and maintain all of the system components during a test mission. ASETS is capable of flying at altitudes between 200 and 20,000 feet, and at airspeeds ranging from 100 to 250 knots. Mission scenarios can include air-to-surface seeker testing, terrain mapping, surface target measurement, air-to-air testing, atmospheric transmission studies, weather data collection, aircraft or missile tracking, background signature measurements, and surveillance. ASETS is fully developed and available to support test programs.

  2. Image Sensors Enhance Camera Technologies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  3. Airborne Submillimeter Spectroscopy

    NASA Technical Reports Server (NTRS)

    Zmuidzinas, J.

    1998-01-01

    This is the final technical report for NASA-Ames grant NAG2-1068 to Caltech, entitled "Airborne Submillimeter Spectroscopy", which extended over the period May 1, 1996 through January 31, 1998. The grant was funded by the NASA airborne astronomy program, during a period of time after the Kuiper Airborne Observatory was no longer operational. Instead, this funding program was intended to help develop instrument concepts and technology for the upcoming SOFIA (Stratospheric Observatory for Infrared Astronomy) project. SOFIA, which is funded by NASA and is now being carried out by a consortium led by USRA (Universities Space Research Association), will be a 747 aircraft carrying a 2.5 meter diameter telescope. The purpose of our grant was to fund the ongoing development of sensitive heterodyne receivers for the submillimeter band (500-1200 GHz), using sensitive superconducting (SIS) detectors. In 1997 July we submitted a proposal to USRA to construct a heterodyne instrument for SOFIA. Our proposal was successful [1], and we are now continuing our airborne astronomy effort with funding from USRA. A secondary purpose of the NAG2-1068 grant was to continue the analysis of astronomical data collected with an earlier instrument which was flown on the NASA Kuiper Airborne Observatory (KAO). The KAO instrument and the astronomical studies which were carried out with it were supported primarily under another grant, NAG2-744, which extended over October 1, 1991 through January 31, 1997. For a complete description of the astronomical data and its analysis, we refer the reader to the final technical report for NAG2-744, which was submitted to NASA on December 1, 1997. Here we report on the SIS detector development effort for SOFIA carried out under NAG2-1068. The main result of this effort has been the demonstration of SIS mixers using a new superconducting material, niobium titanium nitride (NbTiN), which promises to deliver dramatic improvements in sensitivity in the 700

  4. Combination of RGB and Multispectral Imagery for Discrimination of Cabernet Sauvignon Grapevine Elements

    PubMed Central

    Fernández, Roemi; Montes, Héctor; Salinas, Carlota; Sarria, Javier; Armada, Manuel

    2013-01-01

    This paper proposes a sequential masking algorithm based on the K-means method that combines RGB and multispectral imagery for discrimination of Cabernet Sauvignon grapevine elements in unstructured natural environments, without placing any screen behind the canopy and without any previous preparation of the vineyard. In this way, image pixels are classified into five clusters corresponding to leaves, stems, branches, fruit and background. A custom-made sensory rig that integrates a CCD camera and a servo-controlled filter wheel has been specially designed and manufactured for the acquisition of images during the experimental stage. The proposed algorithm is extremely simple and efficient, and provides a satisfactory rate of classification success. All these features make the proposed algorithm an appropriate candidate for numerous precision viticulture tasks, such as yield estimation, water and nutrient needs estimation, spraying and harvesting. PMID:23783736
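
    The sketch below is a simplified stand-in for the sequential masking algorithm: it stacks the RGB and multispectral channels and clusters pixels into five classes with ordinary K-means (scikit-learn assumed available). The cluster-to-class assignment and the paper's sequential masking logic are not reproduced.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def classify_pixels(rgb, multispectral, n_clusters=5, seed=0):
        """Cluster stacked RGB + multispectral pixels into n_clusters classes.

        rgb: H x W x 3 and multispectral: H x W x B, co-registered arrays.
        Returns an H x W label image; which label corresponds to leaves, stems,
        branches, fruit or background is not fixed by the clustering itself.
        """
        features = np.concatenate([rgb.astype(float), multispectral.astype(float)], axis=2)
        h, w, c = features.shape
        labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(
            features.reshape(-1, c)
        )
        return labels.reshape(h, w)
    ```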

  5. Multispectral imaging approach for simplified non-invasive in-vivo evaluation of gingival erythema

    NASA Astrophysics Data System (ADS)

    Eckhard, Timo; Valero, Eva M.; Nieves, Juan L.; Gallegos-Rueda, José M.; Mesa, Francisco

    2012-03-01

    Erythema is a common visual sign of gingivitis. In this work, a new and simple low-cost image capture and analysis method for erythema assessment is proposed. The method is based on digital still images of gingivae and is applied on a pixel-by-pixel basis. Multispectral images are acquired with a conventional digital camera and multiplexed LED illumination panels at 460 nm and 630 nm peak wavelengths. An automatic work-flow segments teeth from gingiva regions in the images and creates a map of local blood oxygenation levels, which relates to the presence of erythema. The map is computed from the ratio of the two spectral images. An advantage of the proposed approach is that the whole process is easy to manage by dental health care professionals in a clinical environment.
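
    A minimal sketch of the ratio map described above, assuming co-registered 630 nm and 460 nm images and a precomputed gingiva mask; the segmentation and any normalization steps from the paper are not reproduced.

    ```python
    import numpy as np

    def erythema_map(img_630, img_460, gingiva_mask):
        """Per-pixel 630 nm / 460 nm ratio restricted to the gingiva mask."""
        ratio = img_630.astype(float) / np.clip(img_460.astype(float), 1e-6, None)
        return np.where(gingiva_mask, ratio, np.nan)
    ```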

  6. A Near-Infrared (NIR) Global Multispectral Map of the Moon from Clementine

    NASA Technical Reports Server (NTRS)

    Eliason, E. M.; Lee, E. M.; Becker, T. L.; Weller, L. A.; Isbell, C. E.; Staid, M. I.; Gaddis, L. R.; McEwen, A. S.; Robinson, M. S.; Duxbury, T.

    2003-01-01

    In May and June of 1994, the NASA/DoD Clementine Mission acquired global, 11- band, multispectral observations of the lunar surface using the ultraviolet-visible (UVVIS) and near-infrared (NIR) camera systems. The global 5-band UVVIS Digital Image Model (DIM) of the Moon at 100 m/pixel was released to the Planetary Data System (PDS) in 2000. The corresponding NIR DIM has been compiled by the U.S. Geological Survey for distribution to the lunar science community. The recently released NIR DIM has six spectral bands (1100, 1250, 1500, 2000, 2600, and 2780 nm) and is delivered in 996 quads at 100 m/pixel (303 pixels/degree). The NIR data were radiometrically corrected, geometrically controlled, and photometrically normalized to form seamless, uniformly illuminated mosaics of the lunar surface.

  7. Multispectral hypercolorimetry and automatic guided pigment identification: some masterpieces case studies

    NASA Astrophysics Data System (ADS)

    Melis, Marcello; Miccoli, Matteo; Quarta, Donato

    2013-05-01

    A couple of years ago we proposed, in this same session, an extension to standard colorimetry (CIE '31) that we called Hypercolorimetry. It was based on an even sampling of the 300-1000 nm wavelength range, with the definition of 7 hypercolor matching functions optimally shaped to minimize metamerism. Since then we have consolidated the approach through a large number of multispectral analyses and specialized the system for non-invasive diagnosis of paintings and frescos. In this paper we describe the whole process, from the multispectral image acquisition to the final 7-band computation, and we show the results on paintings from masters of colour. We describe and propose a systematic approach to non-invasive diagnosis that is able to turn a subjective analysis into a repeatable measurement independent of the specific lighting conditions and of the specific acquisition system. Along with Hypercolorimetry and its consolidation in the field of non-invasive diagnosis, we also developed a standard spectral reflectance database of pure pigments and pigments painted with different bindings. As we will see, this database can be compared to the reflectances of the painting to help the diagnostician in identifying the proper matter. We used a Nikon D800FR (Full Range) camera. This is a 36-megapixel reflex camera, modified under a Nikon/Profilocolore common project, to achieve 300-1000 nm range sensitivity. The large amount of data allowed us to perform very accurate pixel comparisons, based on their spectral reflectance. All the original pigments and their bindings were provided by the Opificio delle Pietre Dure, Firenze, Italy, while the analyzed masterpieces belong to the collection of the Pinacoteca Nazionale of Bologna, Italy.

  8. Modular multispectral imaging system for multiple missions and applications

    NASA Astrophysics Data System (ADS)

    Schoonmaker, Jon; Podobna, Yuliya; Boucher, Cynthia; Saggese, Steve; Oakley, Daniel; Medeiros, Dustin

    2011-05-01

    The Navy recently began investing in the design of mission-specific payloads for the Small Tactical Unmanned Aircraft System (STUAS). STUAS is a Tier II size UAS with a roughly 35 pound mission payload and a gimbaled general-purpose electro optical/infrared (EO/IR) system. The EO/IR system is likely composed of a video camera in the visible, a mid-wave infrared (MWIR) and/or a long-wave infrared (LWIR) for night operations, and an infrared marker and laser range finder. Advanced Coherent Technologies, LLC (ACT), in a series of SBIR efforts, has developed a modular, multi-channel imaging system for deployment on airborne and UAV platforms. ACT's system, called EYE5, demonstrates how an EO/IR system combined with an on-board, real-time processor can be tailored for specific applications to produce real-time actionable data. The EYE5 sensor head and modular real-time processor descriptions are presented in this work. Examples of the system's abilities in various Navy-relevant applications are reviewed.

  9. Lattice algebra approach to multispectral analysis of ancient documents.

    PubMed

    Valdiviezo-N, Juan C; Urcid, Gonzalo

    2013-02-01

    This paper introduces a lattice algebra procedure that can be used for the multispectral analysis of historical documents and artworks. Assuming the presence of linearly mixed spectral pixels captured in a multispectral scene, the proposed method computes the scaled min- and max-lattice associative memories to determine the purest pixels that best represent the spectra of single pigments. The estimation of fractional proportions of pure spectra at each image pixel is used to build pigment abundance maps that can be used for subsequent restoration of damaged parts. Application examples include multispectral images acquired from the Archimedes Palimpsest and a Mexican pre-Hispanic codex.
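
    The sketch below covers only the abundance-map step described above: once endmember (pure pigment) spectra have been chosen, per-pixel fractional abundances can be estimated, here with non-negative least squares as a generic stand-in for the lattice associative-memory formulation; the array shapes and solver choice are assumptions.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def abundance_maps(cube, endmembers):
        """Estimate per-pixel fractional abundances by non-negative least squares.

        cube: H x W x B multispectral image; endmembers: B x M matrix whose
        columns are the selected pure-pigment spectra.
        """
        h, w, b = cube.shape
        m = endmembers.shape[1]
        maps = np.zeros((h, w, m))
        for i in range(h):
            for j in range(w):
                maps[i, j], _ = nnls(endmembers, cube[i, j].astype(float))
        return maps
    ```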

  10. Multispectral Filter Arrays: Recent Advances and Practical Implementation

    PubMed Central

    Lapray, Pierre-Jean; Wang, Xingbo; Thomas, Jean-Baptiste; Gouton, Pierre

    2014-01-01

    Thanks to technical progress in interference filter design based on different technologies, we can finally successfully implement the concept of multispectral filter array-based sensors. This article provides the relevant state of the art for multispectral imaging systems and presents the characteristics of the elements of our multispectral sensor as a case study. The spectral characteristics are based on two different spatial arrangements that distribute eight different bandpass filters in the visible and near-infrared area of the spectrum. We demonstrate that the system is viable and evaluate its performance through sensor spectral simulation. PMID:25407904

  11. Eliminate background interference from latent fingerprints using ultraviolet multispectral imaging

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Xu, Xiaojing; Wang, Guiqiang

    2014-02-01

    Fingerprints are the most important evidence at a crime scene. The technology of developing latent fingerprints is one of the hottest research areas in forensic science. Recently, multispectral imaging, which has shown great capability in fingerprint development, questioned document detection and trace evidence examination, has been used in detecting material evidence. This paper studied how to eliminate background interference from latent fingerprints on non-porous and porous surfaces by rotating-filter-wheel ultraviolet multispectral imaging. The results showed that background interference could be clearly removed from latent fingerprints by using multispectral imaging in the ultraviolet band.

  12. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needed to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation had to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software program was written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.

  13. Evaluation of 0.46- to 2.36-micrometre multispectral scanner images of the East Tintic mining district, Utah, for mapping hydrothermally altered rocks.

    USGS Publications Warehouse

    Rowan, L.C.; Kahle, A.B.

    1982-01-01

    Airborne multispectral scanner images recorded in the 0.46 to 2.36 micrometre region for the E Tintic mining district, Utah, were evaluated to determine their usefulness for distinguishing six types of hydrothermally altered rocks from a wide range of sedimentary and igneous rock types. The laboratory and field evaluation of a color ratio composite image, supported by in situ spectral reflectance measurements and an alteration map compiled from a published map, shows that silicified, argillized, and pyritized rocks can be mapped in detail utilizing an intense OH absorption band centered near 2.2 micrometre. This absorption band is absent or weak in most of the unaltered rocks. -from Authors

  14. Streak camera receiver definition study

    NASA Technical Reports Server (NTRS)

    Johnson, C. B.; Hunkler, L. T., Sr.; Letzring, S. A.; Jaanimagi, P.

    1990-01-01

    Detailed streak camera definition studies were made as a first step toward full flight qualification of a dual channel picosecond resolution streak camera receiver for the Geoscience Laser Altimeter and Ranging System (GLRS). The streak camera receiver requirements are discussed as they pertain specifically to the GLRS system, and estimates of the characteristics of the streak camera are given, based upon existing and near-term technological capabilities. Important problem areas are highlighted, and possible corresponding solutions are discussed.

  15. Automated Camera Array Fine Calibration

    NASA Technical Reports Server (NTRS)

    Clouse, Daniel; Padgett, Curtis; Ansar, Adnan; Cheng, Yang

    2008-01-01

    Using aerial imagery, the JPL FineCalibration (JPL FineCal) software automatically tunes a set of existing CAHVOR camera models for an array of cameras. The software finds matching features in the overlap region between images from adjacent cameras, and uses these features to refine the camera models. It is not necessary to take special imagery of a known target and no surveying is required. JPL FineCal was developed for use with an aerial, persistent surveillance platform.

  16. Combustion pinhole camera system

    DOEpatents

    Witte, A.B.

    1984-02-21

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external, variable-density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor. 2 figs.

  17. Combustion pinhole camera system

    DOEpatents

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external, variable-density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  18. LSST Camera Optics

    SciTech Connect

    Olivier, S S; Seppala, L; Gilmore, K; Hale, L; Whistler, W

    2006-06-05

    The Large Synoptic Survey Telescope (LSST) is a unique, three-mirror, modified Paul-Baker design with an 8.4m primary, a 3.4m secondary, and a 5.0m tertiary feeding a camera system that includes corrector optics to produce a 3.5 degree field of view with excellent image quality (<0.3 arcsecond 80% encircled diffracted energy) over the entire field from blue to near infra-red wavelengths. We describe the design of the LSST camera optics, consisting of three refractive lenses with diameters of 1.6m, 1.0m and 0.7m, along with a set of interchangeable, broad-band, interference filters with diameters of 0.75m. We also describe current plans for fabricating, coating, mounting and testing these lenses and filters.

  19. NSTX Tangential Divertor Camera

    SciTech Connect

    A.L. Roquemore; Ted Biewer; D. Johnson; S.J. Zweben; Nobuhiro Nishino; V.A. Soukhanovskii

    2004-07-16

    Strong magnetic field shear around the divertor x-point is numerically predicted to lead to strong spatial asymmetries in turbulence-driven particle fluxes. To visualize the turbulence and associated impurity line emission near the lower x-point region, a new tangential observation port has been recently installed on NSTX. A reentrant sapphire window with a moveable in-vessel mirror images the divertor region from the center stack out to R ~ 80 cm and views the x-point for most plasma configurations. A coherent fiber optic bundle transmits the image through a remotely selected filter to a fast camera, for example a 40,500 frames/sec Photron CCD camera. A gas puffer located in the lower inboard divertor will localize the turbulence in the region near the x-point. Edge fluid and turbulence codes UEDGE and BOUT will be used to interpret impurity and deuterium emission fluctuation measurements in the divertor.

  20. PHARUS airborne SAR concept

    NASA Astrophysics Data System (ADS)

    Snoeij, Paul; Pouwels, Henk; Koomen, Peter J.; Hoogeboom, Peter

    1995-11-01

    PHARUS (phased array universal SAR) is an airborne SAR concept which is being developed in the Netherlands. The PHARUS system differs from other airborne SARs by the use of a phased array antenna, which provides both flexibility in the design and a compact, light-weight instrument that can be carried on small aircraft. The concept allows for the construction of airborne SAR systems on a common generic basis but tailored to specific user needs, and can be seen as a preparation for future spaceborne SAR systems using solid state transmitters with electronically steerable phased array antennas. The whole approach is aimed at providing an economic and yet technically sophisticated solution to the remote sensing or surveying needs of a specific user. The solid state phased array antenna consists of a collection of radiating patches; the design flexibility for a large part resides in the freedom to choose the number of patches, and thereby the essential radar performance parameters such as resolution and swath width. Another consequence of the use of the phased array antenna is the system's compactness and the possibility to rigidly mount it on a small aircraft. The use of small aircraft of course considerably improves the cost/benefit ratio of the use of airborne SAR. The flight altitude of the system is flexible between about 7,000 and 40,000 feet, giving much operational freedom within the meteo and airspace control limits. In the PHARUS concept the airborne segment is complemented by a ground segment, which consists of a SAR processor, possibly extended by a matching image processing package. (A quick look image is available in real-time on board the aircraft.) The SAR processor is UNIX based and runs on easily available hardware (SUN station). Although the additional image processing software is available, the SAR processing software is nevertheless designed to be able to interface with commercially available image processing software, as well as being able